Rewiring your mindset about marketing experimentation, part 2



Three savvy marketing experiments that will inspire you

Of all the tools in the marketer’s arsenal, methodically testing new ideas and following the data is one of the most powerful ways to drive business growth. Yet many marketers still rely on the traditional seat-of-the-pants, gut-feel approach to marketing.

In part 1 of this blog series, we defined several best practices to consider as you plan your marketing experiments. In this blog, part 2, we will explore the objectives, hypotheses and results of three real-world experiments, ranging from simple to complex.

Experiment #1: Will CTA placement on the page improve response?

Hypothesis: TrustRadius discovered that click-throughs from a CTA on its home page were lower than anticipated. Its marketing team hypothesized that moving the CTA button from a hero position on a page that displays vendor reviews to the website header would make it stand out more and thus increase click-throughs.

How many of us, as marketers, have adjusted something on a web page in the hope of better results? Chances are, we did it on an ad hoc basis. No hypothesis. No measurement. What’s the difference here? Being intentional and structuring this as an experiment brought clarity to the objective: we did A, with the expectation that B would happen.

Results: Moving the “Request a Demo” CTA button to the right side of the website header resulted in twice the number of click-throughs to the TrustRadius “Request Demo” page.

Original layout (control group)


Experimental layout (winning layout with 2x the number of click-throughs)


How to: To do this type of experiment, use your website development platform to set up an A/B test. The original version acts as the control; half of your visitors see it, while the other half see the experimental layout. Soon, you should have enough data to determine which layout is performing better. One caution: to ensure that you get valid data, change only one variable at a time.
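Under the hood, an A/B testing platform boils down to two pieces: deterministically assigning each visitor to a variant, and checking whether the difference in click-through rates is statistically meaningful. Here is a minimal sketch in Python; the visitor IDs and click counts are hypothetical, and your testing platform would normally handle all of this for you:

```python
# Minimal A/B test mechanics: deterministic bucketing + a two-proportion z-test.
# All IDs and counts below are illustrative, not real TrustRadius data.
import hashlib
import math

def assign_variant(visitor_id: str) -> str:
    """Bucket a visitor into control (A) or experiment (B).

    Hashing the ID makes the assignment deterministic: the same visitor
    always sees the same layout on every return visit.
    """
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

def z_score(clicks_a: int, visitors_a: int,
            clicks_b: int, visitors_b: int) -> float:
    """Two-proportion z-test: how confidently does variant B beat A?"""
    p_a = clicks_a / visitors_a
    p_b = clicks_b / visitors_b
    # Pooled click-through rate under the null hypothesis (no difference).
    p_pool = (clicks_a + clicks_b) / (visitors_a + visitors_b)
    std_err = math.sqrt(p_pool * (1 - p_pool)
                        * (1 / visitors_a + 1 / visitors_b))
    return (p_b - p_a) / std_err

# Hypothetical example: 5,000 visitors per arm, CTA in hero (A) vs. header (B).
z = z_score(clicks_a=100, visitors_a=5000, clicks_b=200, visitors_b=5000)
print(f"z = {z:.2f}")  # |z| > 1.96 means significant at roughly the 95% level
```

The practical takeaway: don’t declare a winner after a handful of visits. Let the test run until the difference clears a significance threshold, or you may simply be reading noise.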

Lesson learned: “You can never anticipate how your target audience will consume your online content. That’s why you shouldn’t be afraid to experiment with minor changes to your website, email messaging and design elements,” explains Cultivate managing director Tina Deprez. “Each experiment produces valuable data that can help you iterate your way to better engagement and results.”

Experiment #2: Will bundling products be more convenient for customers?

Hypothesis: Atlassian, a developer of enterprise productivity tools like Jira and Confluence, wanted to know if bundling its products would be more convenient for users. Its hypothesis was that the more products it bundled together, the more its users would evaluate and then adopt them.

Experiment: To test this theory, Atlassian offered several bundling options to prospective customers. This experiment is a little more complex and likely involved coordinating details across departments. The point is that experimentation can benefit multiple functions in your business (marketing, sales, product marketing, etc.). In this case, Atlassian creatively bundled products and services.

Results: The company discovered that if it bundled more than two products together, prospects became confused. Trying to learn three new applications at once made them less engaged and resulted in lower conversions to sales. Bundling two products together provided optimum opportunities to upsell and cross-sell its services, and led to higher retention rates after customers completed their trials of its collaboration tools.

Lesson learned: “Not every experiment will support your hypothesis,” Deprez cautions. “Sometimes the data may surprise you. Follow where it leads, even if it conflicts with your ideas of how customers should react. They’re telling you by their behavior what they prefer. Listen to them.”

Experiment #3: Will personalized segmentation of the target audience improve sales?


Dakota Supply Group’s (DSG) e-commerce system was generating less than 0.2% of the company’s sales. DSG wanted to increase that share to 10% over three years. To meet this aggressive goal, it needed to revamp its system to provide a better customer experience.

Hypothesis: The DSG team hypothesized that by providing a personalized experience to each market segment, running targeted promotions, offering customer-specific pricing and empowering more sophisticated searches of its extensive online catalog, it could increase e-commerce sales.

This experiment required extensive orchestration and collaboration across the organization, spanning people, processes and technology.

Results: DSG’s upgraded e-commerce platform delivered a 13x growth in online sales, quadrupled the number of orders it received and resulted in a 25% increase in the number of line items per order.

Lesson learned: Upgrading your technology stack can help you conduct more sophisticated experiments, learn more and iterate your results faster. DSG selected a technology platform that combined sophisticated e-commerce capabilities with highly personalized marketing and sales capabilities in a single solution.

“Our clients are finding that investing in a marketing automation platform is a must-do. It supports a data-centric approach that is ideal for conducting experiments like these. It can become a powerful catalyst that drives greater engagement and more sales leads,” Deprez emphasizes.

The bottom line is that marketing can impact business outcomes. The best way to ensure it does is to methodically structure experiments that solve business problems.

In addition, conducting experiments gives you a distinct competitive advantage. Consistent experimentation gives you ongoing intelligence about your marketplace that your competitors may not have.

Contact us if you have any questions about how to structure your first or next experiment.

Source: The Big Book of Experimentation by Optimizely.