People's Democracy (Weekly Organ of the Communist Party of India (Marxist))
Vol. XXVI No. 49, December 15, 2002
Poll Khul Gayee?

Nachiketa
THE phrase in Hindi, Poll Khul Gayee [which means the cat is out of the bag], is perhaps most appropriate when applied to the various pollsters sticking their necks out to give their numbers on Gujarat. Whilst India Today [ORG-Marg] announced that it was Narendra Modi who was the clear winner, Outlook was singing a different tune. The polling agency, C fore, said that the BJP would manage only 80-85 of the total 182 seats. Another agency, CSDS, through Frontline, agreed with India Today, and if one were to go by what they say, the BJP has a clear 15-percentage-point advantage! The Week is trying its best to hedge its bets, and gives exactly 85-90 seats to both the BJP and the Congress.
If the Election Commission were buying all these magazines, and J M Lyngdoh weren't so busy, they might well feel inclined to simply do away with the polls! What's the point, when it has all been settled and sorted by media gurus and psephologists? The thing to note is that the different polls have given completely divergent results, and maybe that's why the Election Commission has decided to go ahead with the polls!
This is not the first time that opinion polls have been carried out. It has been the trend for at least the past ten years to use them to try and lend authenticity to assessments and reportage. One does not even have to dig way back into the past; just look at how the pre-poll surveys dished out in the run-up to the assembly elections in West Bengal and Tamil Nadu fared. Surveys by the same old agencies suggested a "photo-finish", a "close call". But the real results were completely out of line with the pre-poll estimates.
Psephologists and agencies in the pre-poll survey business defend their surveys vociferously. It is all scientific; random samples of a suitable size are picked, and enthusiastic youngsters get busy trying to gauge preferences. But just think about this. For an electorate of 3 crore and 28 lakhs, The Week claims to have spoken to 3,204 voters across 24 constituencies, and CSDS to just 1,775 voters in 27 constituencies. Is that enough? You might argue that speaking to all these voters is a darned sight better than randomly trying to get a feel of the land and making rough estimates. But is it?
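The usual statistical defence of such sample sizes rests on the textbook margin-of-error formula, which depends on the size of the sample and hardly at all on the size of the electorate. A minimal sketch of that arithmetic, using the two sample sizes quoted above, follows (the function name and figures are purely illustrative); the tidy precision it promises holds only if the sample is genuinely random and every respondent answers honestly, which is exactly what the rest of this piece calls into question.

    import math

    def margin_of_error(n, p=0.5, z=1.96):
        # Textbook 95 per cent margin of error for a simple random sample of
        # size n: z * sqrt(p * (1 - p) / n), taken at the worst case p = 0.5.
        return z * math.sqrt(p * (1 - p) / n)

    # Sample sizes cited above.
    for agency, n in [("The Week", 3204), ("CSDS", 1775)]:
        print(f"{agency}: n = {n}, margin of error = +/- {100 * margin_of_error(n):.1f} points")

On paper that works out to roughly plus or minus 2 percentage points on vote share. Whether such a figure says anything about 182 individual seats, or survives voters who tell interviewers what they want to hear, is another matter.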
CHOOSING
The business of journalism, what the trade involves and means: all this comes under review when pre-poll methodologies are discussed. The reporter, with conversations at tea stalls, knocks on doors and an attempt to get at the real issues, is of course likely to go wrong, and sometimes seriously wrong.
And anybody who has reported before on elections in India would vouch for how discerning the Indian voter is, and how well he/she has read the Indian constitution, zealously guarding the right to a secret ballot. Reporters have learnt by burning their fingers how little people disclose and how they are tempted to tell you what they believe you wish to hear! More interestingly, the person who talks the most may be least indicative of the general mood in that particular area.
Factoring all this into articles written about election trends is very tedious and difficult, and it is this that perhaps first made media czars look to the 'safe', easy and seemingly more fool-proof statistical route.
But quantifying social trends and trying to make a pre-poll assessment seem very precise, and therefore more credible, appears to be doing exactly the opposite. It is worth remembering that even in a far less complex, much more homogeneous environment, with a two-party system like that in Britain, opinion polls have gone hopelessly wrong.
Even the otherwise very cautious BBC, on the basis of opinion polls, had declared Labour victorious in two consecutive general elections, and ended up with egg on its face when Margaret Thatcher romped home with a huge majority! Experience taught the BBC to not allow or commission opinion polls in the future, and also to report very cautiously on polls done by others. The lessons learnt there haven't put anyone on notice here.
It is easy to dismiss the cynicism about the pre-poll survey fever as old-fashioned, inspired by a general hesitation to adapt to new technology. But fighting this dependence on pre-poll surveys is not a negation of science. It is, in fact, an acknowledgement of the potency of scientific and mathematical tools, and a recognition of how hopelessly things can go wrong when a small error creeps into the sample.
When ordinary voters speak to pollsters exactly as they would to journalists, the formulae used for pre-poll surveys cannot pick up the nuances, or discount for the complex calculations that the voter makes before he/she commits himself/herself to the journalist; instead, they run the risk of building on the errors, exponentially at times. The result is a completely incorrect prediction.
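As a rough illustration of that last point, consider the uniform-swing style of seat projection, in which the same estimated shift in vote share is applied to every constituency to arrive at a seat count. The sketch below is hypothetical and is not any agency's actual model: in a state full of close contests, a response bias of just three percentage points is enough to move the projection by several dozen seats.

    import random

    random.seed(1)

    # A hypothetical state of 182 seats, most of them close contests:
    # party A's true lead in each seat, in percentage points.
    true_leads = [random.gauss(0, 4) for _ in range(182)]
    true_seats = sum(lead > 0 for lead in true_leads)

    # Suppose respondents overstate their support for party A by 3 points,
    # telling the interviewer what they think he/she wants to hear.
    bias = 3.0
    projected_seats = sum(lead + bias > 0 for lead in true_leads)

    print(f"Seats party A actually wins:  {true_seats} of 182")
    print(f"Seats the survey projects:    {projected_seats} of 182")

A vote-share error of barely three points, of the same order as the claimed margin of error, is thus converted by the projection formula into a seat forecast that is wildly off.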