Well, it’s another slow news day in the Six Californias signature verification world. There was no update from the SoS Wednesday. The only news in Thursday’s update was that the County of Santa Barbara finished their random sample, with a validity rate of 54.1%. This brings the overall validity rate down from 66.7% to 65.4%. Still no word from Alameda, Amador, Inyo, or Trinity counties as to their raw counts.
In my previous report I opined that the projected numbers made it seem unlikely that Six Californias would qualify for the ballot. It occurred to me, though, that a random sample is subject to, well, randomness, and even if the projected number is below the number needed to qualify, a full count can reverse that. That is exactly what happened with the “State Fees on Hospitals” initiative, which has qualified for the November 2016 ballot, so I thought a review of that initiative’s process might be educational.
Initiative 1613 (as it is known to the SoS) was filed late last April. By May 6th enough counties had submitted their raw counts to the SoS that she was able to declare on May 7th that more than 807,615 signatures had been filed and so the counties should begin their random sampling and report back no later than June 19th.
On June 19th, despite no projected numbers from Inyo, Mariposa, or Trinity counties, she reported that the initiative had a projected validity rate of 64.6% and a projected count of 787,693 signatures, not enough to qualify by random sample (which would have required a projected count of 888,377 signatures, 10% over the 807,615 minimum), but enough to require a full count of each and every signature. The full count was to be completed by August 1st.
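The threshold logic above can be sketched in a few lines of Python. The 110% figure comes straight from the numbers in this post; the 95% lower cutoff (below which an initiative fails outright without a full count) is my reading of the statute, so treat it as an assumption.

```python
MINIMUM = 807_615  # valid signatures required to qualify

def sample_decision(projected_valid: int, minimum: int = MINIMUM) -> str:
    """Classify a random-sample projection against the statutory thresholds
    (95% cutoff is my assumption about the rule, not stated in this post)."""
    if projected_valid >= 1.10 * minimum:   # at or above 110%: qualifies outright
        return "qualified by random sample"
    if projected_valid < 0.95 * minimum:    # below 95%: fails outright
        return "failed"
    return "full count required"            # in between: count every signature

print(sample_decision(787_693))  # the State Fees on Hospitals projection
```

Plugging in the 787,693 projected count lands in the middle band, which is why every signature had to be counted.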
On August 1st she reported that, even without a full count from Kings County, the initiative had received either 807,950 or 807,984 valid signatures, enough (barely) to qualify for the ballot. (For some reason the spreadsheet shows different numbers in the “Valid Sigs.” and “Valid” columns for Humboldt and Imperial counties. The total in the “Valid” column is also off by one, making me think someone doesn’t understand how to create a spreadsheet that adds the numbers for you.) The actual validity rate was 66.4%, almost two percentage points higher than projected.
When one does sampling, one should also compute the margin of error. To be rigorous, you have to compute the margin of error separately for each county, and then combine them by squaring each one, adding them together, and taking the square root. I’m not going to do the complete calculation right now (it’s late and I’m tired; I might do it for Six Californias when they finish the random sampling), but an oversimplified estimate gives an overall margin of error on the order of 5%. Thus the actual validity rate of 66.4% is within the margin of error of the estimated one, which is why a full count is done even when an initiative is projected to fall short by as much as 5%.
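The combining step described above (square each county's margin, add, take the square root) can be sketched as follows. The county figures here are made up for illustration, not the actual Six Californias sample data, and the per-county margin uses the standard normal approximation for a sampled proportion.

```python
import math

def county_margin(sample_size: int, validity_rate: float, raw_count: int,
                  z: float = 1.96) -> float:
    """Margin of error, in signatures, for one county's random sample,
    using the normal approximation for a sampled proportion (z=1.96 => 95%)."""
    se = math.sqrt(validity_rate * (1 - validity_rate) / sample_size)
    return z * se * raw_count

def combined_margin(margins: list[float]) -> float:
    """Combine independent per-county margins: root of the sum of squares."""
    return math.sqrt(sum(m * m for m in margins))

# Hypothetical counties: (sample size, observed validity rate, raw signatures)
counties = [(500, 0.66, 40_000), (500, 0.54, 25_000), (500, 0.70, 60_000)]
margins = [county_margin(n, p, raw) for n, p, raw in counties]
print(round(combined_margin(margins)), "signatures, plus or minus")
```

Note that the combined margin is always smaller than the simple sum of the county margins (independent errors partially cancel), which is the whole point of doing the root-sum-of-squares rather than just adding them up.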