“Something that used to take 30 minutes took 30 seconds. They looked at the old way and couldn't believe they'd tolerated it. That was the aha.”
When the checkout conversion rate drops and analytics can't tell me why, I want to see what users do on specific pages and flows, not just that they left but what they did before leaving, so I can find usability problems that don't surface in surveys or analytics.
A product manager, growth marketer, or UX designer at a company of 20–500 people who uses Hotjar to answer questions that quantitative analytics can't. They know their funnel. They know where users drop off. What they don't know is why. Session recordings are how they find out. They've watched hundreds of sessions. They've seen users rage-click on things that aren't buttons, scroll past CTAs without seeing them, and get confused by flows the team thought were obvious. Each one of these is a design decision waiting to happen.
To see what users do on specific pages and flows (not just that they left, but what they did before leaving) reliably, without workarounds, and without becoming the team's single point of failure for Hotjar.
A product manager, growth marketer, or UX designer who trusts their setup. Seeing what users do on specific pages and flows (not just that they left, but what they did before leaving) is reliable enough that they've stopped checking whether it's working. Session filtering by rage click, u-turn, or error event removes the noise and surfaces the sessions worth watching. They've moved from configuring Hotjar to using it.
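Rage clicks and u-turns are frustration signals Hotjar detects on its own; plain JavaScript errors can be surfaced the same way by forwarding them through the Events API. A minimal TypeScript sketch, assuming only the documented `hj('event', name)` call; the `js_error` event name is a placeholder, not a Hotjar built-in:

```ts
// Narrow typing for Hotjar's variadic hj() function, covering only
// the one command this sketch uses.
declare global {
  interface Window {
    hj?: (command: 'event', eventName: string) => void;
  }
}

window.addEventListener('error', () => {
  // Keep event names simple: letters, numbers, underscores, dashes.
  // 'js_error' is an assumed name you would choose yourself.
  window.hj?.('event', 'js_error');
});

export {}; // module context so `declare global` is valid
```

Sessions carrying the event can then be filtered alongside the built-in rage-click and u-turn signals.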
The checkout conversion rate dropped 2.3% after last week's redesign. The developer says nothing changed that would affect checkout. The PM is in Hotjar filtering for sessions from the past 7 days that include the checkout page and end without a purchase. They're watching session 4 of 12. A user gets to the payment step, tries to click something that doesn't respond, clicks it four more times, and leaves. The PM screenshots the rage-click cluster. This is the bug. This is how they found it.
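The filter in that scenario is assembled in Hotjar's UI, but "ends without a purchase" needs a signal to cut on. One common pattern (a sketch under assumptions, not necessarily how this PM set it up) is tagging completed purchases with the Events API; `trackPurchase` and the `purchase_completed` event name are invented for illustration:

```ts
declare global {
  interface Window {
    hj?: (command: 'event', eventName: string) => void;
  }
}

// Hypothetical hook, called from your own order-confirmation code.
export function trackPurchase(): void {
  // Sessions that fired this event are the buyers; the filter panel
  // can then be used to isolate sessions that reached checkout but
  // never produced it.
  window.hj?.('event', 'purchase_completed');
}
```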
Uses Hotjar Business or Scale. Has recordings running on 3–8 key pages. Watches 10–30 sessions per week — more during active experiments or post-launch. Uses heatmaps for landing pages and key conversion pages. Uses Hotjar Surveys for exit-intent and post-purchase feedback. Sends Hotjar share links in Slack when a session illustrates a specific problem. Has presented session recordings in product reviews — always effective. Has filters set up for segments: paid users, mobile only, specific entry pages.
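The segment filters in that list (paid users, specific entry pages) work off data the product sends in. For the paid-user case, Hotjar's documented `hj('identify', userId, attributes)` call attaches User Attributes to sessions; the `User` shape and the `plan` and `signup_date` attribute names below are assumptions standing in for your own user model:

```ts
declare global {
  interface Window {
    hj?: (
      command: 'identify',
      userId: string,
      attributes: Record<string, string | number | boolean>
    ) => void;
  }
}

// Assumed shape of the application's own user record.
interface User {
  id: string;
  plan: 'free' | 'paid';
  signupDate: string; // ISO date string
}

export function identifyForHotjar(user: User): void {
  window.hj?.('identify', user.id, {
    plan: user.plan,              // powers a "paid users" segment filter
    signup_date: user.signupDate, // handy for cohort-style slicing
  });
}
```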
Two things you'd notice: they reference Hotjar in conversation without being asked, and they've built workflows on top of it that weren't in the original plan. Seeing what users do on specific pages and flows is consistent and expanding to new parts of the product. They're now focused on finding usability problems that don't surface in surveys or analytics, a sign the basics are solved.
Session noise (recordings of bots, internal traffic, and edge cases that waste review time) keeps recurring despite updates and workarounds. They start tracking how much time they spend fighting Hotjar versus using it. The switching cost was the only thing keeping them, and it's starting to look like an investment in the alternative.
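Part of why that noise recurs: Hotjar's settings can block known office IPs, but IP lists go stale and user-agent checks only catch honest bots. A defensive sketch that skips loading the tracking snippet for internal users and obvious bots; the loader mirrors Hotjar's standard async snippet, while `INTERNAL_EMAIL_DOMAIN`, the bot regex, and the site ID are placeholders to adapt:

```ts
const INTERNAL_EMAIL_DOMAIN = '@example.com'; // assumption: your company domain

function isInternalUser(email: string | null): boolean {
  return email !== null && email.endsWith(INTERNAL_EMAIL_DOMAIN);
}

function isLikelyBot(userAgent: string): boolean {
  // Catches only self-identifying bots; spoofed agents still need
  // server-side or dashboard-level filtering.
  return /bot|crawler|spider|headless/i.test(userAgent);
}

// Mirrors Hotjar's standard async loader.
function loadHotjar(siteId: number): void {
  const w = window as any;
  // Queue stub: hj(...) calls made before the script loads are buffered.
  w.hj = w.hj || function (...args: unknown[]) { (w.hj.q = w.hj.q || []).push(args); };
  w._hjSettings = { hjid: siteId, hjsv: 6 };
  const s = document.createElement('script');
  s.async = true;
  s.src = `https://static.hotjar.com/c/hotjar-${siteId}.js?sv=6`;
  document.head.appendChild(s);
}

const currentUserEmail: string | null = null; // assumption: wire to your auth state
if (!isInternalUser(currentUserEmail) && !isLikelyBot(navigator.userAgent)) {
  loadHotjar(1234567); // placeholder site ID
}
```

Gating the snippet this way means internal and bot sessions are never recorded at all, rather than recorded and filtered out later.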
Pairs with `fullstory-primary-user` to map the lightweight-heatmap vs. deep-behavioral-analytics session tool philosophy. Contrast with `mixpanel-primary-user` for the qualitative session replay vs. quantitative funnel analysis approach. Use with `maze-primary-user` for teams combining unmoderated testing with passive session observation.