Drove 9% YoY growth in deposit accounts through funnel optimization alone (no new product investment), increased application starts 13% (expanding the addressable pipeline), and reduced support calls 25% (saving more than one FTE in support operations).
+13%
Increase in application starts
Storefront pages underperformed because value propositions were unclear at the moment of decision. Deposit account experiences lacked clarity across onboarding and day-to-day usage. Product decisions were driven by assumptions rather than validated behavior -- and there was no structured framework for testing and learning.
By 2020, I had spent four years building the behavioral foundation -- the onboarding journey, the Learning Center, the direct deposit activation system. The next challenge was to systematize the learning. Individual product improvements had produced measurable outcomes, but the organization lacked a framework for continuous optimization. Each test was a one-off; each insight lived in a specific team's memory rather than in a shared system.
The A/B testing framework was the answer to this problem. Not just a tool for running experiments, but a systematic approach to connecting behavioral data to product decisions -- a feedback loop that could operate continuously rather than episodically.
The most counterintuitive finding from the testing program was the value of delayed value reveal. The conventional wisdom in conversion optimization is to surface the value proposition as early as possible -- tell the user what they get before they have to do anything. My testing program discovered the opposite: for OEM discounts and certain financial product features, delaying the reveal until after the user had taken an initial action drove significantly higher activation than front-loading it. The user who had already started an application was more receptive to the value proposition than the user who hadn't yet committed.
This finding generalized across the funnel: the timing of information delivery matters as much as the information itself. A decision system that delivers the right information at the wrong moment is no more effective than one that delivers the wrong information. The A/B testing framework was the mechanism for discovering the right moments -- empirically, at scale.
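Discovering "the right moments" empirically comes down to reading test results correctly. A minimal sketch of how a delayed-reveal variant might be compared against a front-loaded control using a standard two-proportion z-test — the function name and the counts below are illustrative, not USAA data or tooling:

```python
from math import sqrt, erfc

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test comparing conversion rates.

    conv_a / n_a: conversions and sample size for the control
    conv_b / n_b: conversions and sample size for the variant
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)           # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))                   # two-sided p-value
    return z, p_value

# Hypothetical counts: control front-loads the value proposition,
# variant delays the reveal until after application start.
z, p = two_proportion_ztest(conv_a=480, n_a=6000, conv_b=560, n_b=6000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these illustrative counts the variant's lift clears conventional significance thresholds; in practice a framework like this would also enforce minimum sample sizes and pre-registered stopping rules before a test is called.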
A systematic A/B testing framework across the deposit and car buying funnels, targeted UX improvements to the highest-friction storefront experiences, and a mobile deposit feature that eliminated the top three sources of support calls -- direct deposit confusion, transaction clarity, and basic account action friction.
Analyzed customer support data to isolate the highest-frequency call drivers in deposit accounts and prioritize UX improvements
Introduced direct deposit status visibility, step-by-step setup guidance, and real-time updates to eliminate the top support call drivers
Rewrote value propositions and restructured page hierarchy to surface key decision information at the right moment in the funnel
Ran A/B tests on OEM discount placement and value reveal timing -- discovering that delayed reveal drove higher activation
Implemented iterative testing across onboarding flows, feature placement, and contextual guidance to build a continuous optimization loop
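A continuous optimization loop also depends on deterministic variant assignment, so each member sees a consistent experience across sessions. One common approach — an illustrative sketch, not the actual USAA implementation; the experiment and variant names are hypothetical — hashes a stable user ID together with an experiment name:

```python
import hashlib

def assign_variant(user_id: str, experiment: str,
                   variants=("control", "delayed_reveal")):
    """Deterministically bucket a user into a variant.

    Hashing experiment + user ID gives a stable, uniform assignment:
    the same user always lands in the same bucket for a given test,
    and different experiments bucket independently.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

print(assign_variant("member-1042", "oem-discount-timing"))
```

Because assignment is a pure function of the inputs, no per-user state needs to be stored, and adding a new experiment never reshuffles an existing one.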
9% YoY growth in deposit accounts. 13% increase in application starts. 25% reduction in deposit-related support calls. 10% increase in self-service adoption. $200K increase in quarterly revenue from A/B testing optimization.
Funnel optimization without a testing framework is guesswork. Building the infrastructure to validate behavior -- and connecting those learnings to revenue -- is what separates product management from project management. The $200K quarterly revenue increase is the direct output of a system that makes evidence-based decisions instead of assumption-based ones.
Product decisions would have continued to be made on assumptions rather than validated behavior -- with no feedback loop to correct them.
The 25% of support calls driven by UX friction would have continued, with no systematic mechanism for identifying and eliminating the root causes.
The $200K quarterly revenue increase from A/B testing would have remained unrealized -- the conventional wisdom about front-loading value propositions would have gone unchallenged.
The delayed value reveal finding is the most important insight from this work. It challenges the conventional wisdom that value propositions should be front-loaded. In high-commitment decisions (financial products, large purchases), the user who has already taken an initial action is more receptive to value framing than the user who hasn't yet committed. My testing framework discovered this empirically -- and the discovery changed the design approach across the entire funnel.
"I discovered that delayed value reveal drove higher activation than front-loading -- counterintuitive but empirically validated"
"The 25% support call reduction came from making the product self-explanatory, not from adding more support content"
"The testing framework was the infrastructure -- the $200K quarterly revenue increase was the output of systematic learning"
What the data says
“Only 1 out of 26 dissatisfied customers actually raises the issue with the company. The rest churn silently.”
The 25% support call reduction is the visible outcome of eliminating friction. But the more important outcome is the silent churn that was prevented -- the 25 out of 26 dissatisfied customers who would have churned without raising the issue.
“For SaaS and digital subscription products, every extra minute added to the onboarding flow lowers trial-to-paid conversion by approximately 3%.”
The storefront and onboarding UX improvements were designed to reduce time-to-value. The 13% increase in application starts reflects the conversion impact of removing friction from the decision-making flow.
“Approximately 65% of a company's revenue comes from just 8% of its most loyal customers.”
The funnel optimization work was designed to increase the number of members who became highly engaged -- the 8% who generate 65% of revenue. The 9% YoY growth and 10% self-service adoption increase are indicators of movement toward that high-value cohort.
Proprietary Framework Applied
"You have to market to three visits, not one. This is the part everyone misses."
Jon Taffer, Bar Rescue -- 40% return probability after Visit 1 · 42% after Visit 2 · 70%+ after Visit 3
40% return probability
The Red Napkin
User arrives at the storefront or application start page
SaaS Translation
Internal signal: this is a first-session user. Route to the high-clarity storefront track with restructured page hierarchy. The A/B testing framework discovered that the conventional wisdom -- front-load the value proposition -- was wrong for high-commitment financial decisions. The 'red napkin' for Visit 1 was a clean, friction-free entry point that didn't ask for commitment before delivering value.
42% return probability
The Chicken Discount
User returns after initial exploration; delayed value reveal trigger fires
SaaS Translation
The most counterintuitive finding from the testing program: for OEM discounts and financial product features, delaying the reveal until after the user had taken an initial action drove significantly higher activation. The 'chicken discount' is the value proposition delivered at peak receptivity -- after the user has already committed to the first step, not before.
70%+ return probability
The Free Cheesecake
User completes application start; mobile deposit and self-service features surface
SaaS Translation
The mobile deposit feature and direct deposit status visibility were the 'free cheesecake' -- features that made the product self-explanatory and reduced the need for support. The 25% support call reduction was the outcome of a Visit 3 experience that locked in the habit of self-service over phone support.
Framework Insight
The A/B testing framework was the mechanism for discovering the right Visit 1/2/3 sequence empirically. The delayed value reveal finding -- that Visit 2 receptivity is higher than Visit 1 receptivity for high-commitment decisions -- is the Red Napkin Protocol validated at scale.
White Paper Thread: The Decision Layer
The A/B testing framework is the white paper's argument about empirical validation made concrete. A decision system is only as good as the quality of its inputs. The testing framework was the mechanism for improving input quality -- replacing assumptions with validated behavioral data. The delayed value reveal finding is a specific example of how empirical testing can overturn conventional wisdom and produce better decisions at scale.
Read the White Paper →
Connective Tissue
The A/B testing framework built in this initiative extended the Learning Center testing work. The OEM discount placement and value reveal timing tests were the foundation for the broader funnel optimization program.
Read case study
The onboarding journey and the A/B testing framework were built in sequence. The journey established the behavioral baseline; the testing framework optimized it. Both are part of the same system.
Read case study
Both cases use high-intent moments (checkout, application start) as the primary conversion surface. The upsell logic at iPROMOTEu and the A/B-tested value propositions at USAA are the same pattern applied to different contexts.
Read case study
The Operating System
ibuildsystems.io
Four frameworks. One repeatable system. Applied across banking, fintech, government, and B2B SaaS to turn broken workflows into scalable revenue engines.