“The VP of Marketing wants to know if the new landing page is performing better than the old one. Something that used to take 30 minutes took 30 seconds. When event tracking showed that a specific CTA placement drove 3x more conversions, that was the aha.”
When the VP of Marketing wants to know if the new landing page is performing better than the old one, I want to know what's working and what isn't without needing an analyst to tell me, so I can attribute conversions to the campaigns that actually drove them.
A marketing manager or digital marketer at a company of 10–200 people who is responsible for understanding how the website is performing and why. They are not a data person. They've been through the GA4 migration and have not recovered emotionally. They know enough to navigate the interface but not enough to build custom reports without three tabs of documentation open. They check analytics several times a week and leave most sessions with more questions than answers.
To make google-analytics the system of record for knowing what's working and what isn't, without needing an analyst to tell them. Not aspirationally, but operationally. The kind of intention that shows up as a daily habit, not a quarterly goal.
The tangible result: knowing what's working and what isn't happens on schedule, without manual intervention, and without the anxiety of GA4's interface, which feels like it was designed for people who already know the answer. google-analytics has earned a place in the daily workflow rather than being tolerated in it.
The VP of Marketing wants to know if the new landing page is performing better than the old one. They're in GA4. They know this should be answerable. They've been in the interface for 22 minutes. They've found three different numbers for "conversions" that don't match. The Explorations tab is open. They've created and deleted two custom reports. They know the answer exists in here. They are going to find it. It will take longer than it should and they will not be sure it's right.
Uses GA4 as their primary analytics tool. Also has data in Google Search Console, Google Ads, and Meta Ads Manager — none of which agree with each other. Checks analytics 3–4 times per week. Uses the standard reports more than Explorations because Explorations requires knowing what to ask. Has set up Goals (now "Key Events") once, with help from a developer. Has UTM parameters on most campaigns — not all. Sends a monthly traffic report to leadership using screenshots from GA4.
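The "UTM parameters on most campaigns — not all" gap above is usually a consistency problem, not a knowledge problem. A minimal sketch of a link-tagging helper makes the fix concrete; the parameter names (`utm_source`, `utm_medium`, `utm_campaign`) are the standard ones GA4 recognizes, while the function name and campaign values here are illustrative.

```python
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

def tag_url(url: str, source: str, medium: str, campaign: str) -> str:
    """Append standard UTM parameters so GA4 can attribute the visit."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))  # preserve any existing params
    query.update({
        "utm_source": source,      # e.g. "newsletter", "google"
        "utm_medium": medium,      # e.g. "email", "cpc"
        "utm_campaign": campaign,  # e.g. "spring-launch" (illustrative)
    })
    return urlunparse(parts._replace(query=urlencode(query)))

# Usage: every outbound campaign link goes through the same helper,
# so no campaign ships untagged.
tagged = tag_url("https://example.com/landing", "newsletter", "email", "spring-launch")
```

Running every campaign link through one helper (or a shared spreadsheet that does the same thing) is what turns "most campaigns" into "all campaigns", which is the difference between attribution that holds up and the three mismatched conversion numbers described above.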
They've stopped comparing alternatives. google-analytics is open before their first meeting. Exploration reports are saved and shared across the marketing team. The strongest signal: they've started onboarding teammates into their setup unprompted.
The trigger is specific: attribution that changes depending on the model and the date range, combined with a high-stakes deadline. google-analytics fails them at exactly the wrong moment. Data sampling on the free tier meant their reports were estimates, not facts. What makes it irreversible: they fundamentally believe data is only useful if you can act on it — a dashboard nobody understands is decoration, and google-analytics just proved it doesn't share that belief.
Pairs with `hubspot-primary-user` for the full marketing stack attribution and lead tracking workflow. Contrast with `data-analyst` to map the sophistication gap and the tools that serve each. Use with `canva-primary-user` for the small marketing team building and measuring their own content.