
When it comes to optimizing your purchase funnel, simplicity often wins. Studies consistently show that the more steps you include in the buying process—such as extra forms, pages, or confirmation screens—the higher the likelihood of cart abandonment. Each additional step creates friction, offering users more opportunities to second-guess their decision or get distracted, ultimately leading to fewer completed purchases.

That said, while research strongly suggests that a shorter funnel improves conversion rates, what works best can vary depending on your specific audience and product. For some users, additional steps like comparing product options, reviewing their order, or confirming details might provide reassurance, increasing their confidence in the purchase. This is why it’s crucial to test changes before fully implementing them.

By experimenting with a streamlined approach, such as enabling purchases directly from the landing page of your product, you can gather data to see how your audience responds. Running an A/B test between your current funnel and a shorter version will provide actionable insights, helping you make data-driven decisions that cater to your users’ preferences while optimizing your sales process.

Test Hypothesis

When optimizing a landing page, every element plays a crucial role in guiding visitors toward conversion. At Nelio A/B Testing, our current landing page design follows a traditional approach: the first fold prominently features a compelling headline and a call-to-action (CTA) button that directs visitors to our pricing page. It’s on that page where visitors can explore our subscription options and ultimately purchase the service. While this structure has worked for us, we believe there’s room for improvement—and that’s where our latest A/B test comes in.

We hypothesize that shortening the purchase journey could boost conversions. Specifically, we’re testing whether allowing visitors to subscribe directly from the landing page itself will yield better results. To test this, we’ve created a variant of the landing page that includes a pricing box right in the first fold. This box highlights the price of our Professional plan, our intermediate option, showcasing its monthly rate alongside a convenient “Subscribe” button. For visitors who want to explore other plans, there’s also a clear link directing them to the full pricing page.

The idea behind this variant is simple: reduce friction. By eliminating the need for an additional click and showing pricing information upfront, we’re aiming to streamline the decision-making process for potential customers who are already inclined to subscribe. However, we’re mindful that some visitors may still prefer to explore all available options before committing, which is why the link to view more plans remains a key element in the new design.

The results of this test will provide valuable insights into our audience’s behavior. If the variant performs better, it could indicate that presenting the pricing and subscription options earlier in the funnel aligns more closely with our visitors’ expectations. Conversely, if the original design holds its ground, it would reinforce the importance of a more gradual funnel, where visitors take time to learn about our product before reaching a purchase decision. Either way, we’re committed to using data-driven experimentation to enhance the user experience and maximize conversions.

Definition of the A/B test

To conduct this experiment, we’re leveraging our very own WordPress plugin, Nelio A/B Testing. This tool allows us to seamlessly define and run A/B tests on our website. For this particular test, we’ve selected the landing page for Nelio A/B Testing as the control version. The variant version of the page introduces a key change: the inclusion of a pricing box in the first fold, as described earlier. Importantly, the rest of the page remains identical between the two versions to ensure that any differences in performance can be attributed solely to this adjustment.
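Under the hood, split tests like this one typically assign each visitor to a version deterministically, so that a returning visitor always sees the same page. A common approach is hash-based bucketing. The sketch below is a generic illustration of that technique, not Nelio A/B Testing’s actual implementation, and the test name is made up:

```python
import hashlib

def assign_variant(visitor_id: str, test_name: str, n_variants: int = 2) -> int:
    """Deterministically bucket a visitor into a variant.

    Hashing the visitor ID together with the test name yields a stable,
    roughly uniform assignment: the same visitor always lands in the
    same bucket, and different tests bucket visitors independently.
    """
    digest = hashlib.sha256(f"{test_name}:{visitor_id}".encode()).hexdigest()
    return int(digest, 16) % n_variants

# Example: bucket a few (hypothetical) visitors for this test.
for visitor in ["alice", "bob", "carol"]:
    bucket = assign_variant(visitor, "landing-pricing-box")
    print(visitor, "control" if bucket == 0 else "variant")
```

Because the assignment is a pure function of the visitor ID, no per-visitor state needs to be stored to keep the experience consistent across visits.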

For this test, we’ve established two critical conversion goals to measure success:

  1. Purchases of Any Subscription Plan: Our primary goal is to track the number of visitors who complete a subscription. When a visitor successfully subscribes, they are redirected to a thank-you page, and this visit is recorded as a conversion. This metric will give us a direct understanding of how each page version performs in driving actual purchases.
  2. Visits to the Pricing Page: While the variant includes a pricing box in the first fold, we recognize that some visitors may still want more details before making a decision. That’s why our secondary goal is to track how often users click the “See More Plans” link and navigate to the pricing page. This will help us understand the information-seeking behavior of our audience. If the variant shows fewer visits to the pricing page but maintains or increases purchases, it could indicate that providing pricing information upfront simplifies the decision process.

By analyzing these two goals, we aim to answer key questions: Will the new variant reduce the need for visitors to explore the pricing page, suggesting that upfront information is sufficient? More importantly, will it drive more direct purchases, proving that a shorter funnel leads to higher conversions?

This test exemplifies how A/B testing can help fine-tune the balance between providing enough information and simplifying the user journey. We’re eager to see the results and use the insights to further optimize our landing page strategy.


Analysis of the A/B testing results

After running the A/B test for one month and 12 days, we have gathered a considerable amount of data to analyze the performance of both versions of the landing page. Let’s examine how each version fared in terms of our defined conversion goals and user behavior.

Goal 1: Purchase of Any Subscription Plan

The primary goal of this test was to evaluate how the changes in the first fold impacted overall purchases. According to the results, the B version—which includes a pricing box—resulted in 18.6% fewer purchases compared to the control. However, this result is not statistically significant: the sample size and the observed difference are not large enough to rule out random variation as the cause of the disparity. In short, while there’s an apparent negative trend, we cannot draw any definitive conclusions from this particular result.
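To make the significance point concrete, the gap between two conversion rates can be checked with a standard two-proportion z-test. The visitor and purchase counts below are purely hypothetical, chosen only to reproduce an 18.6% relative drop; the article does not publish the actual figures:

```python
from math import erf, sqrt

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both versions convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical data: 43 vs. 35 purchases (an 18.6% relative drop)
# out of 5,000 visitors per version.
z, p = two_proportion_z_test(conv_a=43, n_a=5000, conv_b=35, n_b=5000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With low baseline conversion rates like these, even a large relative drop produces a p-value well above 0.05, which is exactly why an 18.6% difference can still be inconclusive.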

Goal 2: Visits to the Pricing Page

The second goal measured the frequency of visits to the pricing page, which provides more comprehensive information on the different plans. The B version resulted in a 13.1% reduction in visits to this page compared to the control. Unlike the first goal, this result has a 99.17% confidence level, so we can be highly confident the reduction is real rather than random variation.

This means that the introduction of the pricing box in the first fold did succeed in reducing the need for users to visit the dedicated pricing page for further details. Essentially, fewer people were inclined to click through because they were already presented with key pricing information directly on the landing page.

Heatmap Analysis

To further understand user behavior, we analyzed heatmaps for both versions of the landing page.

On the control version, the heatmap revealed that the hottest areas were the links to the pricing page. This suggests that visitors who landed on the page naturally gravitated towards these links as their next step in the funnel, seeking more detailed pricing and plan information before making a purchase decision.

Interestingly, the heatmap for the B version followed a similar pattern. The most-clicked element in the pricing box was the “See More Plans” link, which also directed users to the pricing page. However, the “Subscribe” button—intended to streamline the purchase process by allowing users to directly subscribe to the Professional plan—received significantly less attention.

These findings suggest that users still prefer to review the full range of pricing options before committing, even when presented with a simplified, upfront option.

Conclusions

The results from this experiment provide valuable insights into user behavior and preferences on our landing page. Here’s what we’ve learned:

  1. Reduced Pricing Page Visits, But Not Conversions
    The presence of a pricing box in the first fold effectively reduced the number of visitors who felt the need to explore the pricing page. This indicates that some users found the upfront pricing information sufficient. However, this did not translate into increased conversions, as the B version performed worse in driving actual subscriptions, though not significantly.
  2. User Preference for Comprehensive Information
    The heatmap analysis highlights that visitors still prefer exploring the full pricing options on the dedicated pricing page. Even when presented with a direct “Subscribe” button, most users clicked on the “See More Plans” link instead. This behavior suggests a cautious decision-making process, where users seek additional context before committing to a subscription.
  3. No Definitive Impact on Purchases
    While the B version showed a decline in purchases, the lack of statistical significance for this goal means we cannot conclusively state that the pricing box harms conversions. Further testing with larger sample sizes might provide clearer insights.
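How much larger would the sample need to be? A standard power calculation gives a rough answer. The sketch below uses the normal-approximation formula for comparing two proportions; the 1% baseline conversion rate is purely illustrative, not our actual rate:

```python
from math import ceil, sqrt

def sample_size_per_variant(p_base: float, rel_effect: float) -> int:
    """Approximate visitors needed per variant to detect a relative change
    in conversion rate with a two-sided test (alpha=0.05, power=0.8)."""
    z_alpha, z_beta = 1.96, 0.84  # z-values for alpha=0.05 and 80% power
    p_var = p_base * (1 + rel_effect)
    p_bar = (p_base + p_var) / 2
    n = ((z_alpha * sqrt(2 * p_bar * (1 - p_bar))
          + z_beta * sqrt(p_base * (1 - p_base) + p_var * (1 - p_var))) ** 2
         / (p_var - p_base) ** 2)
    return ceil(n)

# Hypothetical: 1% baseline conversion, detecting an 18.6% relative drop.
print(sample_size_per_variant(0.01, -0.186))
```

At a 1% baseline, reliably detecting an effect of this size requires tens of thousands of visitors per variant, which explains why a test of this length can end without a significant result on the purchase goal.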

Given these findings, our recommendation is to stick with the current control version for now. The control version’s design aligns better with user behavior, providing them with a natural path to explore all available options on the pricing page. However, this doesn’t mean the experiment was a failure. Instead, it highlights the importance of user preferences and the need to provide comprehensive information when asking for a commitment.

Moving forward, it may be worth exploring other strategies to optimize conversions. For instance, testing different designs for the pricing page itself, tweaking the call-to-action buttons, or even experimenting with other first-fold content to better capture user interest could provide further opportunities for improvement.

One concrete next step would be a test where the landing page displays the complete pricing plans in the first fold rather than focusing on a single option. This approach could provide users with the full range of choices upfront, potentially increasing their confidence and likelihood to convert without needing to navigate to another page.

Featured Image by Tim Mossholder on Unsplash.
