2010 Online Testing Awards
Entries for the 2010 testing awards were free, compliments of Awards Sponsor Vertster. Our goal was to encourage as many marketers as possible to enter their tests in order to inspire others. We received dozens of entries in late November 2009. Judging took place the first week in December. (See below for the judges’ profiles.)
Judging was unbiased. We did not favor any particular agency, testing platform, or brand. Our goal was to spotlight tests that would inspire the marketing, analytics, and Web design communities. Each submission was judged on the following criteria:
#1. Technical & Tactical Basics:
Was the test conducted properly? Were the results conclusive? Did the test meet the stated objective?
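One common way to judge whether results are "conclusive" is a statistical significance check on the two conversion rates. The sketch below is illustrative only (the sample sizes are hypothetical, and the judges' actual method is not described in the article); it uses a standard two-proportion z-test built from the Python standard library.

```python
from math import erf, sqrt

def z_test_two_proportions(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)              # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided p from normal CDF
    return z, p_value

# Hypothetical traffic: control converts 200/10,000; challenger 260/10,000.
z, p = z_test_two_proportions(200, 10_000, 260, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A p-value below the usual 0.05 threshold suggests the observed lift is unlikely to be random noise; a test stopped before reaching significance would not meet this criterion.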
#2. What was measured?
Was the conversion activity measured by the test directly relevant to the objective? Did the measurements consider KPIs (key performance indicators) beyond immediate clicks?
#3. Results Data
What were the actual results? Did the test produce a substantial conversion gain? Note: All entries had to reveal their data to the judges, but we agreed to keep some data private. Also, judges did not automatically reward the tests with the largest gains, because gain depends on how well the control creative was previously optimized. It is not fair to judge a poor original with huge gains against a great original with smaller gains.
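The point about control quality can be made concrete with relative lift. In this hypothetical sketch (the rates are invented, not from any entry), a weak control doubles easily while a well-optimized one moves far less, even though the second result may reflect more skill:

```python
def relative_lift(control_rate, variant_rate):
    """Percentage improvement of the variant over the control."""
    return (variant_rate - control_rate) / control_rate * 100

# A poorly optimized page: 1% -> 2% looks like a spectacular +100% lift.
weak = relative_lift(0.01, 0.02)
# A well-optimized page: 10% -> 11% is "only" +10%, but harder to achieve.
strong = relative_lift(0.10, 0.11)
print(f"weak control: +{weak:.0f}%  strong control: +{strong:.0f}%")
```

This is why the judges weighed gains against the starting point rather than ranking raw percentages.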
#4. Inspirational due to counter-intuitive results?
Were the results surprising, proving that you can’t depend on your intuition or “best practices” to guarantee best conversions?
#5. Inspirational because widely applicable?
Was the test one that many marketers could and should copy on their own sites or landing pages? Would the lessons learned help marketers convince the powers-that-be to let them run similar tests?