One-line definition
A/B testing is an experiment where you show two versions of a page, feature, or message to different user groups and measure which performs better.
Formula: Split traffic 50/50 between control and variant. Run until you reach statistical significance (typically 95% confidence, ~1,000 conversions per variant).
tl;dr
Most solo founders don't have enough traffic for statistically valid A/B tests. If you're under 10K monthly visitors, make bold changes and compare week-over-week instead. Save formal A/B testing for high-traffic pages.
Simple definition
A/B testing splits your users into two groups: one sees the current version (control) and the other sees a modified version (variant). You measure a specific outcome — signups, clicks, purchases — and determine which version wins. It removes guesswork from product decisions. But it only works with enough traffic. For solo founders, the most important thing to understand about A/B testing is when not to use it.
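The split itself is usually done deterministically, so a returning user always lands in the same group. A minimal sketch of hash-based bucketing (the function name and experiment label are illustrative, not from any particular tool):

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "pricing-test") -> str:
    """Deterministically bucket a user 50/50: same id always gets the same variant."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # uniform bucket in 0-99
    return "variant" if bucket < 50 else "control"

print(assign_variant("user-42"))  # stable across calls for the same user
```

Hashing on the experiment name as well as the user id means two concurrent experiments split users independently of each other.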
How to calculate it
Split traffic 50/50 between control (A) and variant (B). Define your success metric before starting — conversion rate, click-through rate, or revenue per visitor. Run the test until both variants have enough conversions for statistical significance.
Use a sample size calculator (Evan Miller's is popular) to determine how long you need to run. A page converting at 3% needs about 9,000 visitors per variant to detect a 25% relative improvement at 95% confidence and 80% power. At 100 visitors/day, that's roughly six months, far too slow for most bootstrapped products.
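If you'd rather see where the calculator numbers come from, here is a sketch of the standard two-proportion sample-size formula (assuming a two-sided test at 80% power; calculators configured with different power settings will report different totals):

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_variant(baseline, relative_lift, alpha=0.05, power=0.80):
    """Visitors needed per variant for a two-sided two-proportion z-test."""
    p1 = baseline
    p2 = baseline * (1 + relative_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # 1.96 for 95% confidence
    z_beta = NormalDist().inv_cdf(power)           # 0.84 for 80% power
    delta = p2 - p1
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * variance / delta ** 2)

n = sample_size_per_variant(0.03, 0.25)  # 3% baseline, 25% relative lift
print(n)                                 # about 9,100 visitors per variant
print(2 * n / 100)                       # days needed at 100 visitors/day, split 50/50
```

Notice how the required sample grows as the baseline rate or the lift you want to detect shrinks; that relationship is the whole reason low-traffic sites struggle with formal tests.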
Example
You sell a CLI tool for developers. Your pricing page gets 4,000 visits/month and converts at 4% (160 signups). You want to test whether showing annual pricing first (instead of monthly) increases conversions. At 4% baseline, you'd need about 7 weeks to reach significance. Instead of running a formal test, you switch to annual-first for two weeks and compare: signups jump from 40/week to 52/week. It's not statistically rigorous, but a 30% lift over two weeks is a strong enough signal to keep the change. Formal A/B testing would have taken almost two months to tell you the same thing.
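You can sanity-check a week-over-week comparison like this with a quick two-proportion z-test. A sketch, assuming roughly 1,000 visits/week (derived from the page's 4,000 visits/month; the exact weekly traffic is not stated):

```python
from math import sqrt
from statistics import NormalDist

def lift_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for a difference in two conversion rates (pooled z-test)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# 40 signups before vs 52 after, on ~1,000 visits each week (assumed traffic)
print(lift_p_value(40, 1000, 52, 1000))  # p ~ 0.20: suggestive, not significant
```

With these numbers the p-value comes out around 0.2, well short of the 95% bar, which is exactly why the comparison is a signal to act on rather than statistical proof.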
Related terms
- Conversion Rate
- CTR
- Feature Adoption
FAQ
How much traffic do I need to run a valid A/B test?
For a page converting at 5%, you need roughly 8,000 visitors per variant to detect a 20% relative improvement at 95% confidence and 80% power. That's about 16,000 total visitors minimum, which can take months for small products.
What should solo founders A/B test first?
Your pricing page and signup CTA. These are high-leverage, high-traffic pages where small conversion improvements directly impact revenue. Don't waste time testing button colors.