Optimization was the name of the game for the Obama Digital team. We optimized just about everything from web pages to emails. Overall we executed about 500 a/b tests on our web pages in a 20-month period, which increased donation conversions by 49% and sign-up conversions by 161%. As you might imagine, this yielded some fascinating findings on how user behavior is influenced by variables like design, copy, usability, imagery and page speed.
What we did on the optimization team was some of the most exciting work I've ever done. I still remember the incredible traffic surge we got the day the Supreme Court upheld Obamacare. We had a queue of about 5 ready-to-go a/b tests that would normally take a couple days to get through, yet we finished them in just a couple hours. We had never expected a traffic surge like that. We quickly huddled behind Manik Rathee—who happened to be the frontend engineer implementing experiments that day—and thought up new tests on the fly. We had enough traffic to get results on each test within minutes. Soon our colleagues from other teams gathered around us to see what the excitement was about. It was captivating to say the least.
How we a/b tested
Optimization was a science for us. We started off with a hypothesis and then came up with several tests to prove (or disprove) it. For example, our hypothesis might be "less copy is better," and to prove it we would choose 5 areas of the site to remove copy from. We used several tools to measure the effect: Optimizely for a/b tests, Google Analytics for general data gathering, and the Blue State Digital tools to enhance or gut-check our data. Sticking to a hypothesis was beneficial because it allowed us to retain focus on our goals. Of course not every a/b test followed this strategy, but we kept with it for the most part.
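To give a sense of the measurement side of this process, here is a minimal sketch (not campaign code; the visitor and conversion numbers are made up) of how a variation's lift over the control can be checked for statistical significance with a standard two-proportion z-test:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test for an a/b test.

    conv_a/n_a: conversions and visitors for the control.
    conv_b/n_b: conversions and visitors for the variation.
    Returns (relative lift, two-sided p-value).
    """
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference.
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal approximation.
    p_value = math.erfc(abs(z) / math.sqrt(2))
    lift = (p_b - p_a) / p_a
    return lift, p_value

# Hypothetical numbers: 4.8% control conversion vs. 5.52% variation.
lift, p = two_proportion_z_test(480, 10000, 552, 10000)
print(f"lift: {lift:+.1%}, p-value: {p:.4f}")
```

With these made-up numbers the variation shows a +15% relative lift at p ≈ 0.02, which is the kind of readout that tells you a variation beat the control rather than benefiting from noise.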
Design and Interaction
By June of 2012 our donate pages had undergone nearly 14 months of optimization. The low hanging fruit had been picked and it was difficult for variations to beat the control. We were working with a page that was engaging and had a low error rate, but it still looked like a long form. To solve that problem we started work on a variation that made the form look easier to complete. Our plan was to separate the field groups into four smaller steps so that users did not feel overwhelmed by the length of the form. Essentially the idea was to get users to the top of the mountain by showing them a small incline rather than a steep slope. We called this project Sequential because it turned our donate form into a sequenced process.
We had no idea if it would work. It was a gamble because it took a decent amount of development time, but we put our best foot forward. We placed the fields into four groups: amount, personal information, billing information and occupation/employer. We considered a number of factors to determine the order of the field groups, but the most persuasive was error rates. For months we had been tracking validation errors, which occurred when users submitted an invalid value in a form field (e.g. nothing in a required field or an improperly formatted email address). The occupation/employer fields generated the most errors because users would leave them blank even though they were required—people don't like giving out information they think is unnecessary. The billing field group produced the second most errors because it is hard for users to enter a 15- or 16-digit credit card number correctly.
Using this information we determined the field group order: 1. Donation amount, 2. Personal information, 3. Billing information and 4. Occupation/employer. By putting the easier field groups first we not only lowered the engagement barrier (all you had to do was click a donation amount button to get started vs. typing your first name), but also produced a sense of investment before users reached the difficult parts of the form.
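The ordering decision itself can be sketched in a few lines: tally validation errors by field group and put the least error-prone groups first. This is an illustrative sketch with hypothetical field names and made-up error events, not the campaign's actual tracking code:

```python
from collections import Counter

# Hypothetical validation-error events: one field name per failed
# submission (illustrative data, not real campaign numbers).
error_events = [
    "occupation", "occupation", "occupation", "employer", "employer",
    "card_number", "card_number", "email",
]

# Map each form field to its field group on the donate page.
field_groups = {
    "amount": "amount",
    "first_name": "personal", "email": "personal", "zip": "personal",
    "card_number": "billing", "expiration": "billing",
    "occupation": "occupation/employer", "employer": "occupation/employer",
}

# Start every group at zero so groups with no errors still rank first.
group_errors = Counter({group: 0 for group in field_groups.values()})
group_errors.update(field_groups[field] for field in error_events)

# Order the steps from fewest to most errors: easy groups first.
step_order = sorted(group_errors, key=group_errors.get)
print(step_order)
# → ['amount', 'personal', 'billing', 'occupation/employer']
```

With this sample data the ranking reproduces the order described above: the one-click amount step first, occupation/employer last.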
We were very happy with the finished product because we felt like we had achieved our goal to make the donation form simpler, but how did it fare in a/b testing?
By turning the long donation form into 4 smaller steps we increased the conversion rate by more than 5%. Turns out you can get more users to the top of the mountain if you show them a gradual incline instead of a steep slope.
We began a/b testing the first iteration of Sequential on July 26th, 2012, and it replaced our standard donation form on August 7th. After rigorous optimization we ended up with what would be nearly the final version of Sequential. On November 1st we were delighted to see that our friends at the Romney campaign liked it enough to adopt the same approach.
It probably comes as no surprise that copy affects conversions. Lots of people are familiar with classic copy tests like this one. Like 37signals, we had lots of success with altering the copy on our web pages. About halfway through the campaign we figured out that of all the variables that affect user behavior (design, usability, imagery, etc.), copy has the highest ROI. This is because copy adjustments are just about the easiest change to make on a web page, yet they can produce some of the biggest gains.
In late 2011 we launched a product called Quick Donate, which made donating extremely fast and easy. Users who had Quick Donate could donate with a single click through email, on the web and even through SMS. The program was cutting edge because nobody had engineered donations through email before, and at the time the Federal Election Commission did not allow political campaigns to use cell phone carrier short codes to raise money through text messages (the FEC reversed this decision after Quick Donate launched). The program was so successful that the stats behind it are kind of overwhelming. By the end of the campaign more than 1.5 million Quick Donate users donated $115 million. Quick Donate users donated four times as often and gave three times as much money as other donors. The program received a lot of optimization simply because of its success.
There were two ways to sign up for Quick Donate. First, you could create a BarackObama.com account and then, in your account settings, save your billing information. The second way we designed to be much easier. After submitting a donation we gave users the option to create an account and save their credit card using the information they had just submitted with the donation. We called this the Quick Donate opt-in page and it received a lot of traffic since the campaign brought in tens of millions of donations. The page itself was very simple: Users with an existing account only needed to enter a password and users without an account only needed to create one. Underneath the password field was the option to enable SMS donations.
We tested many variations of this page, but one of my favorites was when we adjusted the headline. The original headline of the page—which had not been tested at this point—read "Save your payment information for next time." That is pretty simple and straightforward and we definitely didn't want to make it longer. Our idea was to make the headline seem more connected to the donation that users had just made. Our new headline read "Now, save your payment information." The first headline made the Quick Donate opt-in seem disconnected from the donation while the second did exactly the opposite.
By making the follow-up ask feel more connected to the first ask, we increased conversions by 21%. As with Sequential, we were also delighted to see that the Romney campaign loved Quick Donate.
Photography was a huge part of the Obama brand. We had several photographers who took lots of amazing photos of the President, the First Lady and everyone else. We took advantage of this by testing a ton of images. We tested photos just about everywhere, from donate pages to sign-up forms and everywhere else you can imagine. As with layout and usability, we learned a lot about how users react to different kinds of imagery. We found that there are many variables in photos that can affect conversions, but possibly the biggest impact had to do with the context in which the photo was used.
Similar to the 2008 campaign, our splash page was the subject of a/b tests with different photos. Optimizing the splash page with a/b tests was a lot of fun because it received so much traffic that results came in quickly. One of the splash pages we ran was for a contest called Dinner with Barack. If you won, you got a free trip for yourself and a guest to have dinner with the President. To sweeten the deal even more the First Lady would be at the dinner as well. If you want to see what you missed out on, you can still watch video from several of the dinners.
We had so many great pictures of the previous Dinner with Barack that we wanted to see which performed the best. In the following test we had two photos. The first was a medium shot of the President at the dinner table, but it didn't have much context because nothing else was in view or in focus. Previous tests had shown that large photos with focus on the President increased conversions. The second photo had a wider frame that revealed the First Lady and two dinner guests. We hoped that users would be more likely to convert if they could see just how close they would be sitting to the President of the United States during dinner.
By changing the photo on the splash page we lifted conversions by more than 19%.
We knew from the beginning how valuable a/b testing could be in helping us achieve our goals and we took it seriously. We spent countless hours thinking critically about user psychology and implementing our ideas with a/b tests. We had developers working around the clock to ensure that we always had an a/b test running. In looking at the overall results I think you could say our efforts paid off. We increased donation conversions by 49%, sign up conversions by 161% and we were able to apply our findings to other areas and products.
However, the effect our optimization efforts had on conversion rates was not the only benefit. Along the way we uncovered lots of interesting ways in which design, imagery, copy, usability and page speed affect user behavior. We were able to answer very specific questions, like what kind of form input and label alignments are best for conversions and error rates. We were able to second-guess our assumptions about how a web page should look and behave. We learned how to answer questions and we ended up with a treasure trove of best practices.