It's not exactly the same question. Wouldn't that response include much more than Facebook and Twitter? E.g. Pinterest, LinkedIn, Reddit, Google+, a long tail of social media sites?
It's a great point. To compare benchmarks, we'd have to consider: six months of behavioral change since the Pew survey; the likelihood that long-tail users also use the main social properties; what we actually classify as a social network; medium effects on answer bias; and more.
I couldn't agree more with your points about the importance of survey design. Our hope with GCS is to provide a real-time, affordable, iterative mechanism for conducting this type of research.
The challenges faced when benchmarking against Pew and others are, I believe, an indication of the value of continually collecting and updating data.
That is correct. However, Google Consumer Surveys is accessible to laypeople for a relatively low price. I would assume that when a company commissions a Gallup Poll, Gallup ensures the poll is properly designed. On the other hand, people without the proper knowledge could spend money on Google polls only to draw wrong/useless conclusions.
We had two professionally conducted polls in Germany by one company, asking more-or-less the same thing (internet censorship as means to combat child pornography), commissioned by different stakeholders (one in favor of internet censorship, one against), with different phrasing.
Each got >90% approval for their preferred point of view.
That company took no issue with running both polls (telephone, n=1030, "representative of people living in Germany"), and merely pointed out the differences in the questions when asked for comment afterwards. The second poll was mostly run to prove exactly that point.
It seems they take money for whatever poll they're asked to do, no matter how useless. They probably even optimize it to confirm the bias of their client - since polls are usually used for (internal) marketing purposes, I'd assume that's usually desired, too.
I wouldn't expect Gallup to be any different. That's their business.
Suppose I want to run a survey, and I want the responses to be as unbiased as possible (not like in the video). How can I learn to do it "the right way"?
Also, if you're close to a university, check out their social science departments for courses on survey/questionnaire design that you might be able to audit. There will usually be something in that area in social psych courses, for example.
The biggest mistakes I see are:
* Surveys being too long. Every additional question you have makes it less likely that people will bother to complete it.
* Not tracking abandonment rates. High abandonment rates mean that people cannot complete (because you've messed up a question's design), or don't want to complete because of a perceived bias, or because it's too long, etc. It's a sign of a bug you need to fix (there's a sketch of measuring this after the list).
* Trying to do too much in a single survey. If you have eight assumptions in your product that you're trying to explore do eight small surveys rather than one large one.
* Questions that assume the answer. You're often asking a survey because you hope people answer in a particular way ("Yes! People are going to be interested in my product!"). Get somebody else to read the survey and see if they can figure out from the questions what answers you want. If they can - rephrase the questions.
* Questions with no correct answer. For example, "Do you prefer to log in with Facebook or LinkedIn? yes/no". I can't answer that truthfully (real answer: "it depends"). People who can't answer questions truthfully either "lie" or abandon the survey. Both bias your results. Look at all your questions and think about whether there's a way of answering them that you haven't included.
* A preference for quantitative rather than qualitative data. Checkboxes and radio buttons and hard numbers make it easy to draw pretty graphs and fool yourself into thinking you're being scientific. The key info you need is often in the "soft" written answers.
* Doing surveys too early when you should actually be talking to people.
* More of a form design issue - but people tend to fit their answers to the box size. If you have a small text entry box people are more likely to give small answers, which may be less useful than the longer answer you would get from a bigger text area.
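On the abandonment point: here's a minimal sketch of how you might compute per-question abandonment rates in Python. It assumes you log the index of the last question each respondent answered; the field name last_question_answered and the function itself are hypothetical, so adapt them to whatever your survey tool actually records.

    # Minimal sketch of abandonment tracking. Assumes each response record
    # stores the index of the last question the respondent answered
    # (hypothetical field name: "last_question_answered").
    from collections import Counter

    def abandonment_by_question(responses, num_questions):
        """For each question q, the share of respondents who answered q
        but then quit (completers count as finishing, not quitting)."""
        last_answered = Counter(r["last_question_answered"] for r in responses)
        rates = {}
        reached = len(responses)  # respondents still in the survey
        for q in range(1, num_questions + 1):
            # Respondents whose last answer was q abandoned after it,
            # except at the final question, where they simply finished.
            dropped = last_answered.get(q, 0) if q < num_questions else 0
            rates[q] = dropped / reached if reached else 0.0
            reached -= dropped
        return rates

    # Example: 5-question survey. A spike at question 3 suggests a buggy or
    # biased-looking question rather than sheer survey length.
    responses = (
        [{"last_question_answered": 3}] * 40    # quit after Q3
        + [{"last_question_answered": 5}] * 55  # completed
        + [{"last_question_answered": 1}] * 5   # quit after Q1
    )
    print(abandonment_by_question(responses, 5))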
I'll shut up now since I should be doing actual work :-)
I would posit that the issue with the original post referenced by Diego is, in fact, that the conclusion he draws is poorly stated. Reading the full post by Jamie, it's clear that the title "34.5% of US Internet Population not using Facebook/Twitter" should actually read "34.5% of US Internet Population not using Facebook/Twitter to manage their identities on third-party sites."
34.5% chose the answer "No - I'm not on Facebook/Twitter", but there were other ways to answer "no" that didn't imply you weren't on Facebook or Twitter, and in total the poll showed 77% responding that they didn't use Facebook/Twitter to manage their online identities with third parties.
The only other ways to choose "no" are "no - I don't understand how it works" and "no - I'm scared of scams". I know how it works and I'm not scared of scams, but I still don't use Facebook to log in to third-party sites. What do I choose?
I'm not interested in what percentage of the population uses social media. I'm interested in increased sales/subscriptions.
It's the difference between speculating and actually making money. It's the difference between making a survey (the beginning, perhaps, of an indirect path to something... who knows what) and making a sale (direct path to booking revenue).
Programmers have a particular love for indirection. And the web has a particular love for speculation. Can we sell surveys? Yes, I think we can.
Here is a February 2012 study from Pew Internet, reporting that 66% of the American internet population uses social media.
http://pewinternet.org/Commentary/2012/March/Pew-Internet-So...
Our data, in addition to the survey shared on HN earlier, shows very similar statistics: if 66% of the US internet population uses social media, roughly 34% does not, which is close to the 34.5% figure discussed above.