Note: I sent him a rough draft of what follows (yes, it may seem like everything I write is rough); the content is the same, just tightened up a little. Professor Binder's responses are in bold.
Usually these polls give confidence intervals; they say things like "we are 95 percent confident that the answers are within plus or minus 4 percentage points." It gives them some wiggle room in case something is totally wrong.
The margin of error for the entire survey was 4.38 percentage points. It's not "wiggle room"; it's the statistics of the sampling procedure.
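For reference, a 4.38-point figure is consistent with the standard 95 percent margin-of-error formula for a simple random sample of roughly 500 respondents. A minimal sketch, where the sample size of 500 is my back-of-envelope inference rather than a figure reported by the survey, and the usual conservative assumption p = 0.5 is used:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a simple random sample.

    Uses the textbook formula MOE = z * sqrt(p * (1 - p) / n).
    p = 0.5 maximizes the margin, which is the standard
    conservative choice when reporting a survey-wide figure.
    Real surveys may adjust for weighting and design effects.
    """
    return z * math.sqrt(p * (1 - p) / n)

# n = 500 is an illustrative assumption, not from the survey.
moe = margin_of_error(500)
print(f"{moe * 100:.2f} percentage points")  # prints "4.38 percentage points"
```

Working the formula backward from the quoted 4.38 points gives n = (1.96² × 0.25) / 0.0438² ≈ 500, which is why that sample size is assumed above.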
There are two pieces to this. First, you are correct: there is likely a social desirability bias inflating the reported number of voters. Second, there is also likely a bit of a selection issue. Folks who are more likely to vote are also more likely to pick up their phone and agree to take the survey. These numbers aren't very different from any other survey you'll see.
My response to this is again two-fold. It is possible that folks like Common Core more here than elsewhere, but I'd suggest another explanation. Question wording in surveys matters, and it matters even more when asking about issues that adults aren't well versed in. Common Core is one of those issues people have limited information about. People like being able to compare their students to other students, and that was explained in our survey question. I suspect if we had simply said "Common Core" without explaining that it was comparative standards, the support would have been markedly lower.
That is interesting conjecture. Those aren't questions I know the answer to. As the Faculty Director for the Public Opinion Research Laboratory at UNF, we worked with JPEF in constructing the survey, including question wording and question order. The content they wanted to ask about had to be balanced against the number of questions we could actually ask — the survey ran about 15 minutes as it was. I would direct those questions to JPEF.
Again, I would direct that question to JPEF. I could venture guesses, none of which are as Machiavellian as you're implying, but they would simply be guesses.
We executed the survey and sent them the data without any interference or even a hint that they had any preference for what the results actually said.
If you have any more technical survey questions I'd be happy to answer them for you.
Michael Binder, Ph.D.
Assistant Professor, Political Science