Iraq dominated the headlines throughout the fall of 2002 and into the winter of 2003. Public opinion on the wisdom of war, however, stabilized relatively early and slightly in favor of war. Gallup found that from August 2002 through early March 2003 the share of Americans favoring war hovered in a relatively narrow range between a low of 52 percent and a high of 59 percent. By contrast, the share of the public opposed to war fluctuated between 35 percent and 43 percent.
Looks like Americans are even happier with murdering people if it's done by a puppet.
https://www.brookings.edu/articles/rally-round-the-flag-opinion-in-the-united-states-before-and-after-the-iraq-war/
Ok cool. 1000 people who will answer a text message from an unknown number. Would you? Again, I know literally zero people who support Israel in this conflict. Got friends up and down the age spectrum, in and outside of the States.
So my poll says 100% of everyone I know wants this shit to end. Facebook, Twitter, insta, all on the same page here.
Somehow American polls always seem to be, ahem, skewed.
Look at those 538 predictions based on 'polling data' for the last several elections.
This is not how statistics works
Look up i.i.d. sampling please for the love of God.
This wasn't a random sample though
Maybe not but I don't see anything in their brief methodology section quoted above to indicate it wasn't a random sample of mobile phone users. What makes you think it wasn't?
A random sample of mobile phone users isn't a random sample, because you're only going to get the people who answer texts from strangers.
It's called "non-response bias" and it's a huge part of the reason political polling doesn't work. It's strong enough to render almost any sample from a phone survey non-representative.
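Rough toy simulation of that mechanism (all numbers invented, nothing from the actual SMS poll): if willingness to answer a stranger's text is correlated with the opinion being asked about, the poll number drifts away from the true population share.

```python
import numpy as np

# Hypothetical illustration of non-response bias, not real poll data.
rng = np.random.default_rng(0)

n = 1_000_000                       # population size
true_support = 0.50                 # true share holding opinion A
supports = rng.random(n) < true_support

# Assume people who hold opinion A are twice as likely to answer
# a text from an unknown number (made-up response rates).
p_respond = np.where(supports, 0.10, 0.05)
responds = rng.random(n) < p_respond

sample = supports[responds][:1000]  # a "random" sample of 1000 responders
print(f"true support:  {true_support:.2f}")
print(f"poll estimate: {sample.mean():.2f}")  # comes out around 0.67, not 0.50
```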
Yeah, I understand that there's a difference between the sampled population and the actual population of interest, but you can't discount the results on account of that unless you can meaningfully show a non-zero covariance between the response variable and the likelihood of non-response.
By all means it's a caveat but it doesn't make these results entirely non-informative.
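To put the covariance point in code (again, invented parameters): the responders-only estimate is roughly unbiased when response propensity is independent of the opinion, and the skew only shows up when the two are correlated.

```python
import numpy as np

# Toy illustration (invented parameters): bias appears only when the
# opinion and the propensity to respond are correlated.
rng = np.random.default_rng(1)
n, true_support = 1_000_000, 0.55

def poll(corr_with_response):
    supports = rng.random(n) < true_support
    base = 0.08
    # shift the response probability for supporters by corr_with_response
    p_respond = base + corr_with_response * np.where(supports, 0.04, -0.04)
    responds = rng.random(n) < p_respond
    return supports[responds].mean()

print("zero covariance:    ", round(poll(0.0), 3))  # close to 0.55
print("positive covariance:", round(poll(1.0), 3))  # noticeably higher
```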
In any case I cited i.i.d. conditions to explain why asking all their friends is certain to produce a useless estimate of the population proportion.
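For the friends case, a toy homophily model (made-up numbers) shows why it's useless: your friends aren't independent draws from the population, so a friend-group "poll" mostly reflects which cluster you happen to sit in, not the population proportion.

```python
import numpy as np

# Toy homophily model (invented numbers): opinions cluster within
# social circles, so your friends are not i.i.d. draws from the country.
rng = np.random.default_rng(2)
true_support = 0.55                # population share holding opinion A

def friend_group_poll(group_size=50):
    leaning = rng.random() < true_support    # the circle's dominant view
    agree_prob = 0.95 if leaning else 0.05   # strong homophily
    friends = rng.random(group_size) < agree_prob
    return friends.mean()

estimates = [friend_group_poll() for _ in range(10)]
print([round(e, 2) for e in estimates])  # mostly ~0.95 or ~0.05, rarely ~0.55
```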
They know that asking their friends is useless. They were trying to make a point about sampling bias.
To me they were saying these numbers are totally made up.
If it's not representative, it basically is
A biased estimator is not a meaningless or non-informative estimator.
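Toy illustration of that (invented numbers): a poll with a fixed non-response skew runs consistently high, but its estimates still rise with the true share, so it's biased yet still informative about direction and rough magnitude.

```python
import numpy as np

# Toy example (invented numbers): a poll with a fixed non-response skew
# is biased, but its estimates still track the true proportion.
rng = np.random.default_rng(3)

def skewed_poll(true_p, n=1000):
    # responders over-represent supporters: response probs 0.10 vs 0.05
    supports = rng.random(200_000) < true_p
    responds = rng.random(200_000) < np.where(supports, 0.10, 0.05)
    return supports[responds][:n].mean()

for true_p in (0.3, 0.4, 0.5, 0.6, 0.7):
    print(f"true {true_p:.1f} -> biased poll {skewed_poll(true_p):.2f}")
# Estimates run high, but they rise monotonically with the truth:
# biased, yet far from non-informative.
```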
Look at it this way. When the US wants to prosecute a suspect in federal court, where do they generally hold the trials? In the whitest, most affluent part of Virginia. So the jury, while random, is still taken from a specific pool of people who lean a certain way politically.
If you think polling data isn't politically motivated and influenced by sample location, age, and the way the questions are formulated, you're deceiving yourself. What's that saying? There are lies, damned lies, and statistics.