“What was the moment this product clicked?” —
A product manager, growth marketer, or UX designer at a company of 20–500 people who uses Hotjar to answer questions that quantitative analytics can't. They know their funnel. They know where users drop off. What they don't know is why. Session recordings are how they find out. They've watched hundreds of sessions. They've seen users rage-click on things that aren't buttons, scroll past CTAs without seeing them, and get confused by flows the team thought were obvious. Each one of these is a design decision waiting to happen.
What are they trying to do? —
Close the gap between what the funnel numbers show and what actually happened on the page: understand why users drop off where they do, and turn each confusing session into a concrete design or copy change.
What do they produce? —
Evidence the rest of the team can act on: screenshots of rage-click clusters, Hotjar share links posted in Slack, recordings and heatmaps presented in product reviews, and survey responses that explain why someone left or bought.
The checkout conversion rate dropped 2.3% after last week's redesign. The developer says nothing changed that would affect checkout. The PM is in Hotjar filtering for sessions from the past 7 days that include the checkout page and end without a purchase. They're watching session 4 of 12. A user gets to the payment step, tries to click something that doesn't respond, clicks it four more times, and leaves. The PM screenshots the rage-click cluster. This is the bug. This is how they found it.
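A filter like "sessions that end without a purchase" only works if the site tells Hotjar when a purchase happens. Below is a minimal sketch of that instrumentation, assuming the standard Hotjar tracking snippet is already installed and the Events API is available on the plan; the event name `purchase_completed`, the helper name, and the `window.hj` typing are illustrative assumptions, not Hotjar built-ins.

```ts
// A minimal sketch, not Hotjar's official typings: the real tracking snippet
// defines window.hj as a global queueing function. The event name
// 'purchase_completed' and this helper are illustrative, not Hotjar built-ins.
declare global {
  interface Window {
    hj?: (command: string, ...args: unknown[]) => void;
  }
}

export function markPurchaseCompleted(): void {
  // Hotjar events are bare names with no payload. Once this fires on the
  // order-confirmation step, recordings can be filtered on whether a session
  // did or did not include the event.
  if (typeof window.hj === 'function') {
    window.hj('event', 'purchase_completed');
  }
}
```

The guard matters because the Hotjar snippet is frequently blocked by ad blockers or held back by consent tooling, so `window.hj` may never exist on a given visit.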
Uses Hotjar Business or Scale. Has recordings running on 3–8 key pages. Watches 10–30 sessions per week — more during active experiments or post-launch. Uses heatmaps for landing pages and key conversion pages. Uses Hotjar Surveys for exit-intent and post-purchase feedback. Sends Hotjar share links in Slack when a session illustrates a specific problem. Has presented session recordings in product reviews — always effective. Has filters set up for segments: paid users, mobile only, specific entry pages.
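Segment filters like "paid users" depend on the site passing user attributes to Hotjar in the first place. Here is a sketch of how that might look with the Identify API, under the same `window.hj` assumption as the previous snippet; the attribute names `plan` and `signed_up_at` are illustrative choices, not required Hotjar fields.

```ts
// A sketch of the Hotjar Identify API under the same window.hj assumption as
// the previous snippet. The attribute names ('plan', 'signed_up_at') are
// illustrative; they are whatever key/value attributes the site chooses to send.
declare global {
  interface Window {
    hj?: (command: string, ...args: unknown[]) => void;
  }
}

export function identifyHotjarUser(
  userId: string,
  plan: 'free' | 'paid',
  signedUpAt: Date,
): void {
  // Attributes sent here are what the segment filters above can key on,
  // e.g. plan is paid, combined with a device or entry-page filter.
  window.hj?.('identify', userId, {
    plan,
    signed_up_at: signedUpAt.toISOString(),
  });
}
```

Keeping the attribute set small and stable is the point: these values exist to back the recurring filters named above (paid users, mobile only, specific entry pages), not to mirror the whole user record.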
Pairs with `fullstory-primary-user` to contrast the lightweight-heatmap and deep-behavioral-analytics approaches to session tooling. Contrast with `mixpanel-primary-user` to set qualitative session replay against quantitative funnel analysis. Use with `maze-primary-user` for teams combining unmoderated testing with passive session observation.