Campaigns are turning to text messaging as an effective means to reach voters—and for good reason. The tool is inexpensive, relatively simple, and the message is almost sure to be read. (Studies show texting open rates to be 96% or better.)
In Eric Johnson’s successful campaign for Dallas Mayor, we sent more than 178,000 text messages in eighteen waves over seven weeks spanning the initial election and the runoff. Our texting targets were supporters previously identified through canvassing and live phone calling programs and, late in the runoff, voters modeled as very likely to turn out and very likely to support Eric Johnson.
Here are some useful observations.
Too Much of a Good Thing?
Campaign professionals are wise to ask when texts are most effective, how much is too much, and what the downsides of text messaging in political campaigns are.
- Capturing the Low-Hanging Fruit in Early ID Programs. Well-known incumbents seeking higher office have found that texting voters in their base district with a simple Yes/No request for support can quickly identify a core group of loyalists to enlist for the larger effort.
- Persuasion vs. Supporter Mobilization. The best evidence to date is that text messaging does not perform well as a persuasion tool. With a limit of 160 characters, it’s hard to convince an undecided voter. And there is no conclusive evidence that including webpage links that nudge voters toward lengthier arguments, facts, and figures does the job. However, reminding committed supporters about upcoming events, advising them of polling locations, and even asking them to convince a family member or neighbor to join them have all proven to work.
- Limits to The Effectiveness of Text GOTV Reminders. The best evidence to date says that more than two GOTV text reminders aren’t worth the costs and could be counterproductive.
List Quality, Voter Push Back and Other Concerns
We know of no meta-analyses that address the overall issue of list quality and quantify pushback from targeted voters. But the statistics from Dallas are reassuring.
- <4% Bad Numbers. Combining the initial bounce-back rates with written responses from those who chose to inform us of wrong numbers or a voter’s ineligibility, our text programs averaged just under 4% “bad numbers”.
- >9% Response rate to requests for feedback.
- 4% Response rate to simple reminder texts.
We drilled down into the individual responses to each wave of texting for more useful information:
- Some Respondents Don’t Follow Instructions. In the texts (>50,000) where we asked for a numerical response (1/Yes, 2/No, 3/Undecided, 9/Stop), only about 55% of those who replied responded as directed. The remaining 45% sent various written messages (see below).
- The Concern About Pushback For Texting Is Unwarranted.
- Opting Out. We found that on average 3.0% of voters who took the time to respond opted out of the exchanges (either by pressing 9 to stop texting or by asking in writing to be removed from the lists).
- Angry Responses. Less than four-tenths of one percent of those getting a text used profanity in their responses.
- Texting Doesn’t Create a Lot of Additional Work for Campaign Staffers. Only 1.14% of responses (0.06% of total texts) asked for candidate issue positions or actions by the campaign that would require staff follow-up.
- No Immediate Evidence That List Quality Is an Issue. About 2.31% of the responses (0.13% of total texts) were notifications that voters had moved or that the persons receiving the text were ineligible to vote.
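The paired percentages above (share of responses vs. share of total texts) also imply an overall response rate. A minimal sketch of that arithmetic, using the figures reported in this article; the 178,000 total is taken from the program described earlier, and the rounded counts are our own back-of-the-envelope estimates:

```python
# Sanity check of the derived percentages: the ratio between
# "% of responses" and "% of total texts" implies the overall
# share of texts that drew any reply.

TOTAL_TEXTS = 178_000  # total texts sent across both election phases

# Issue/action requests: 1.14% of responses = 0.06% of total texts
implied_response_rate = 0.06 / 1.14       # ~5.3% of all texts drew a reply
staff_followups = TOTAL_TEXTS * 0.0006    # texts needing staff follow-up

# Moved/ineligible notices: 2.31% of responses = 0.13% of total texts
list_quality_flags = TOTAL_TEXTS * 0.0013

print(f"Implied response rate: {implied_response_rate:.1%}")
print(f"Staff follow-ups: ~{staff_followups:.0f}")
print(f"List-quality flags: ~{list_quality_flags:.0f}")
```

In other words, roughly a hundred texts out of 178,000 created staff work, and roughly two hundred flagged a list problem, which is why the stats above are framed as reassuring.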
What About the 90% Who Didn’t Respond?
We need more case studies to determine whether this “silent majority” behaves like those who actively responded to the text programs. In the meantime, Eric Johnson’s resounding double-digit victory is reassuring. And we are reminded that only 0.84% of more than 174,000 texts generated any type of negative response.