Project Type
Product Design, UX/UI Design, Front-End Development
Role
Product Designer (Lead)
Deliverables
Product Design, Interaction Design, User Research, Strategy, Implementation
Timeline
4 weeks
Team composition
Growth Specialist, Project Manager, and Marketer.
The Challenge
Eventbrite launched a new pricing model with expanded services (email marketing, ads, premium support), but event organizers couldn't understand which plan fit their business. Fee complexity and unclear value propositions caused sign-up drop-off, blocking revenue growth during a critical launch period.
Core problems
Decision paralysis when comparing plans
Hidden fees eroded trust
Unclear ROI for new premium features
The opportunity
Help organizers quickly find the right plan while transparently showing Eventbrite's competitive value.

Impact
17%
increase in user acquisition
$6M
revenue increase
40+
touchpoints integrated
6
markets covered

My Approach
What I did
Analyzed funnel drop-off data, reviewed support tickets about pricing confusion, and interviewed the Growth team to understand sales conversation patterns.
Key insight
We didn't need novel research—internal data already showed where and why users were getting stuck. The new pricing rollout created urgency: we had four weeks before Marketing needed this live.
Strategic question
How might we help organizers understand which plan fits their business while proving Eventbrite's value?
I made four key choices given our constraints (tight timeline, no engineering resources, urgent business need):
Leverage familiar patterns
Researched pricing tools from HubSpot, Stripe, and similar SaaS leaders. Users already understood calculator interfaces—I adapted proven patterns rather than inventing new ones. This reduced risk and eliminated the need for extensive pre-launch testing.
Code it myself
No dev bandwidth meant months of delay. I used AI-assisted coding to build the tool, keeping full control over iteration speed and interaction details.
Validate post-launch
Given the low-risk UI pattern, I validated internally, launched in the US, then iterated based on real usage. This got us live faster while learning from actual behavior.
Scale intentionally
Started with one market to monitor performance and refine before rolling out to five more markets.

Building the logic (FigJam)
Created a visual model mapping user inputs (event count, attendees, ticket price) to plan recommendations. Ran a workshop with Growth, Marketing, and PM to align the logic with business strategy and ensure recommendations matched what Sales would suggest.
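To make that mapping concrete, here is a minimal sketch of the kind of logic we aligned on in the workshop. The plan names, thresholds, and rationale strings are hypothetical stand-ins, not Eventbrite's actual pricing tiers.

// Hypothetical sketch of the plan-recommendation logic mapped in FigJam.
// Plan names, thresholds, and reasons are illustrative, not real pricing.
function recommendPlan({ eventsPerYear, avgAttendees, avgTicketPrice }) {
  const annualTicketVolume = eventsPerYear * avgAttendees;
  const annualGrossSales = annualTicketVolume * avgTicketPrice;

  // Occasional organizers: low volume, pay-as-you-go stays cheapest.
  if (annualTicketVolume < 500) {
    return { plan: "Essentials", reason: "Low ticket volume; per-ticket fees stay cheapest." };
  }
  // Mid-size organizers who benefit from the new marketing tools.
  if (annualGrossSales < 100_000) {
    return { plan: "Professional", reason: "Email marketing and lower fees offset the subscription." };
  }
  // High-volume organizers: negotiated rates and premium support.
  return { plan: "Premium", reason: "Volume justifies custom pricing and dedicated support." };
}

console.log(recommendPlan({ eventsPerYear: 12, avgAttendees: 150, avgTicketPrice: 40 }));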
Designing the interface (Figma)
Built a simple calculator: input fields, real-time calculations, clear recommendations, and CTAs to sign up or contact sales. Used Eventbrite's design system for consistency and focused on trustworthy presentation—clear labels, transparent math, no surprises.
Coding the prototype
Partnered with AI tools to write the JavaScript recommendation engine and build the responsive interface. Building it myself let me iterate immediately when something felt off, rather than waiting for dev cycles.
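For illustration, a simplified sketch of how the inputs could be wired to recalculate on every keystroke. The element IDs and the recommendPlan() helper from the earlier sketch are assumptions, not the shipped code.

// Hypothetical wiring for the real-time calculator.
const inputs = ["eventsPerYear", "avgAttendees", "avgTicketPrice"];

function readInputs() {
  // Coerce each field to a number, defaulting to 0 so partial input never breaks the math.
  return Object.fromEntries(
    inputs.map((id) => [id, Number(document.getElementById(id).value) || 0])
  );
}

function render() {
  const { plan, reason } = recommendPlan(readInputs());
  document.getElementById("recommendation").textContent = plan;
  document.getElementById("rationale").textContent = reason;
}

// Recalculate on every keystroke so the recommendation updates as organizers type.
inputs.forEach((id) =>
  document.getElementById(id).addEventListener("input", render)
);
render();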

Internal testing
Ran QA with the team, testing edge cases and validating recommendations against real customer profiles. Fixed microcopy issues and improved error handling.
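A rough illustration of what those checks could look like, reusing the hypothetical recommendPlan() from the earlier sketch; the profiles and expected plans are invented examples, not real customer data.

// Illustrative QA checks against representative organizer profiles.
const testProfiles = [
  { name: "Community meetup", input: { eventsPerYear: 6, avgAttendees: 30, avgTicketPrice: 0 }, expected: "Essentials" },
  { name: "Regional conference", input: { eventsPerYear: 4, avgAttendees: 800, avgTicketPrice: 25 }, expected: "Professional" },
  { name: "Festival organizer", input: { eventsPerYear: 10, avgAttendees: 5000, avgTicketPrice: 60 }, expected: "Premium" },
];

for (const { name, input, expected } of testProfiles) {
  const { plan } = recommendPlan(input);
  console.assert(plan === expected, `${name}: expected ${expected}, got ${plan}`);
}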
US launch
Monitored usage, drop-off points, and conversion rates. Early signals were strong, with more time spent on pricing pages and increased sign-ups, but the "contact sales" CTA caused confusion.
Iteration
Based on data, I refined microcopy, adjusted visual hierarchy to emphasize recommended plans, and simplified input labels.
Six-market rollout
Scaled to Canada, UK, Ireland, Australia, and New Zealand with localized currency and terminology. Core tool remained unchanged, validating the initial design decisions.
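As a sketch of how per-market localization could be handled, one option is JavaScript's built-in Intl.NumberFormat; the market-to-locale map below is illustrative, not the exact configuration that shipped.

// Sketch of currency localization across the six markets.
const markets = {
  US: { locale: "en-US", currency: "USD" },
  CA: { locale: "en-CA", currency: "CAD" },
  GB: { locale: "en-GB", currency: "GBP" },
  IE: { locale: "en-IE", currency: "EUR" },
  AU: { locale: "en-AU", currency: "AUD" },
  NZ: { locale: "en-NZ", currency: "NZD" },
};

function formatPrice(amount, marketCode) {
  const { locale, currency } = markets[marketCode];
  return new Intl.NumberFormat(locale, { style: "currency", currency }).format(amount);
}

console.log(formatPrice(49.99, "GB")); // "£49.99"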

Reflections and Learnings
This project taught me the value of resourcefulness under constraints. Coding the tool myself accelerated delivery and deepened my technical fluency, making me a better partner to engineering teams.
Working with Growth, Marketing, and Leadership clarified how pricing strategy connects to design decisions and revenue outcomes. The four-week timeline forced discipline—leaning on familiar patterns reduced risk and let us ship confidently, then iterate with real data.
Launching in one market first, then scaling to five others, proved we'd built something genuinely useful. The positive team feedback reinforced that cross-functional trust and scrappy execution matter as much as craft. Taking full ownership of outcomes—not just deliverables—drives meaningful results while solving real problems.