PollingShortForm

a guest
Mar 20th, 2017
164
0
Never
Not a member of Pastebin yet? Sign Up, it unlocks many cool features!
text 3.35 KB | None | 0 0
purpose of the poll,
sponsor of the poll,
polling organization,
questions asked,
order of the questions,
who was polled,
how the interviews were conducted,
date of the poll,
statistics offered to substantiate accuracy
Pollsters – Who conducted the poll? Was it conducted by a disinterested group, like a news organization, or by an advocacy organization? Who paid for it? Why was it conducted?
Sample Size – How many people need to be interviewed for the sample size to produce valid, representative results? For any particular poll, how many people were interviewed? What population are the results said to represent? Is the size adequate to represent that population? How can you tell?
Sample Randomization – Was the sample self-selected or chosen at random (“scientific”)? If the latter, was the methodology of finding respondents sufficient to ensure randomness? Why or why not?
Sampling Error – What do “sampling error,” “margin of error” and “confidence level” mean? How is the margin of error determined? How can errors be minimized and confidence level maximized? (A short worked sketch follows this list.)
Demographics – What area(s) or region(s) were respondents from? Did the survey focus on a specific population, like drivers, voters, African-Americans, etc.? If the poll is meant to represent a large, diverse population, like Americans, do you think it does so, and why or why not?
Methodology – Was this poll done over the telephone, in person, on the Internet, or some other way? Was it done by experienced poll-takers, or by inexperienced volunteers or students? How might the methodology affect or shape the results?
Question Wording – Does the question wording seem neutral, biased, or perhaps confusing or misleading in some way? How about the question sequencing? Might any of these factors, as well as the poll-taker’s inflection, have influenced how respondents answered the questions? Do you think respondents expressed their true views?
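A minimal sketch in Python of how sampling error, margin of error, and confidence level fit together, assuming a simple random sample and the usual normal approximation for a reported proportion; the function name and the 52%-of-1,000 figures are purely illustrative:

import math
from statistics import NormalDist

def margin_of_error(p, n, confidence=0.95):
    # Half-width of the normal-approximation confidence interval
    # for a sample proportion p measured on n respondents.
    z = NormalDist().inv_cdf(0.5 + confidence / 2)  # ~1.96 at 95% confidence
    return z * math.sqrt(p * (1 - p) / n)

moe = margin_of_error(0.52, 1000)              # hypothetical: 52% support among 1,000 people
print(f"52% +/- {moe:.1%} at 95% confidence")  # about +/- 3.1 points

Because the margin shrinks only with the square root of the sample size, quadrupling the number of respondents roughly halves it.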
1. How does the poll compare to other recent surveys of the state?
2. How does the poll compare to the polling firm’s previous surveys in the state?
3. How does the survey compare with the polling firm’s surveys in other states?
4. How does the poll compare with the national trend?
5. How does the poll compare with the historical trend in the state?
6. How does the poll relate to the electoral calendar?
7. How does the number of undecided voters compare with prior rounds of the survey?
Who did the poll?
Who paid for the poll and why was it done?
How many people were interviewed for the survey?
How were those people chosen?
What area (nation, state, or region) or what group (teachers, lawyers, Democratic voters, etc.) were these people chosen from?
Are the results based on the answers of all the people interviewed? (See the weighting sketch after this list.)
Who should have been interviewed and was not? Or do response rates matter?
When was the poll done?
How were the interviews conducted?
What about polls on the Internet or World Wide Web?
What is the sampling error for the poll results? (See the sample-size sketch after this list.)
Who’s on first?
What other kinds of factors can skew poll results?
What questions were asked?
In what order were the questions asked?
What about "push polls"?
What other polls have been done on this topic? Do they say the same thing? If they are different, why are they different?
What about exit polls?
What else needs to be included in the report of the poll?
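For the sample-size and sampling-error questions above, a minimal sketch of the usual back-of-the-envelope calculation: how many respondents a simple random sample needs to hit a target margin of error, assuming a proportion near 50% (the worst case). The function name and targets are illustrative, not from any particular poll:

import math
from statistics import NormalDist

def required_sample_size(target_moe, p=0.5, confidence=0.95):
    # Smallest n whose normal-approximation margin of error is <= target_moe.
    z = NormalDist().inv_cdf(0.5 + confidence / 2)
    return math.ceil(z ** 2 * p * (1 - p) / target_moe ** 2)

print(required_sample_size(0.03))   # about 1,068 respondents for +/- 3 points
print(required_sample_size(0.015))  # about 4,269 respondents for +/- 1.5 points

Halving the margin of error takes roughly four times as many interviews, which is one reason sample sizes near 1,000 are so common.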
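And for the questions about whether the results reflect everyone interviewed and whether response rates matter, a minimal sketch of demographic weighting, one common way pollsters compensate when some groups respond at rates different from their share of the population. Every number below is made up for illustration:

# Hypothetical population shares, sample shares, and candidate support by age group.
population_share = {"18-34": 0.30, "35-64": 0.50, "65+": 0.20}
sample_share     = {"18-34": 0.20, "35-64": 0.50, "65+": 0.30}  # younger respondents under-represented
support_by_group = {"18-34": 0.60, "35-64": 0.50, "65+": 0.40}

unweighted = sum(sample_share[g] * support_by_group[g] for g in support_by_group)
weighted   = sum(population_share[g] * support_by_group[g] for g in support_by_group)
print(f"unweighted: {unweighted:.0%}  weighted: {weighted:.0%}")  # 49% vs. 51%

Weighting can correct for uneven response, but only on characteristics the pollster actually measures and adjusts for.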