Persona Library

The Dovetail Research Operations Manager

#dovetail #user-research #research-ops #insights #qualitative
Aha Moment

A teammate asked how they manage tagging and coding qualitative data (transcripts, notes, videos) with a consistent taxonomy. They started explaining and realized every step ran through Dovetail. It had become the spine of the process without a formal decision to make it so.

Job Story (JTBD)

When a PM asks the research team what we already know about a topic, I want to tag and code qualitative data (transcripts, notes, videos) with a consistent taxonomy, so I can build a searchable insights repository that surfaces relevant findings across studies.

Identity

A UX research lead or research operations manager at a product company who uses Dovetail to turn the chaos of qualitative research — interview transcripts, survey responses, usability test recordings — into a structured, searchable insights repository. They tag, code, and synthesize findings so that when a PM asks "what do we know about onboarding friction?" the answer is a link, not a 3-week research project. They are the librarian of user insights, and they've learned that research nobody can find is research that didn't happen.

Intention

To make Dovetail the system of record for tagging and coding qualitative data (transcripts, notes, videos) with a consistent taxonomy. Not aspirationally, but operationally. The kind of intention that shows up as a daily habit, not a quarterly goal.

Outcome

The tangible result: qualitative data (transcripts, notes, videos) gets tagged and coded with a consistent taxonomy on schedule, without manual intervention, and without the recurring anxiety that manual transcript coding is the bottleneck. Dovetail has earned a place in the daily workflow rather than being tolerated in it.

Goals
  • Tag and code qualitative data (transcripts, notes, videos) with consistent taxonomy
  • Build a searchable insights repository that surfaces relevant findings across studies
  • Track research coverage — which personas, features, and topics have been researched and which haven't
  • Share research findings in formats that PMs and designers actually use in their decision-making
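The first two goals hinge on every tag coming from a controlled vocabulary. As an illustrative sketch only (the taxonomy values and function below are hypothetical, not part of Dovetail's product or API), a validator can accept tags already in the taxonomy and surface everything else for review instead of silently creating new tags:

```python
# Hypothetical taxonomy grouped by dimension, mirroring the
# theme/persona/feature organization described in Context below.
TAXONOMY = {
    "theme": {"onboarding-friction", "feature-discovery", "settings-navigation"},
    "persona": {"power-user", "new-user", "admin"},
    "feature": {"settings-page", "search", "billing"},
}

# Flatten the taxonomy into one allowed-tag set for fast membership checks.
ALLOWED = {tag for tags in TAXONOMY.values() for tag in tags}

def validate_tags(proposed):
    """Split a proposed tag list into (accepted, rejected) against the taxonomy.

    Rejected tags go to a review queue rather than into the repository --
    the governance step that keeps the taxonomy from sprawling.
    """
    accepted = [t for t in proposed if t in ALLOWED]
    rejected = [t for t in proposed if t not in ALLOWED]
    return accepted, rejected
```

For example, `validate_tags(["power-user", "setings-page"])` accepts the known tag and flags the misspelled one for review.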
Frustrations
  • Tagging is time-consuming — even with AI assistance, coding transcripts manually is the bottleneck
  • The taxonomy grows unwieldy without governance — tags proliferate and overlap
  • Adoption by non-researchers is low — PMs and designers don't search the repository because they don't know it exists
  • Connecting qualitative insights to quantitative data requires switching tools and manual correlation
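The taxonomy-sprawl frustration is partly automatable: near-duplicate tags can be flagged as merge candidates before a human approves each merge. A minimal sketch using Python's difflib (the threshold and example tags are assumptions for illustration, not Dovetail functionality):

```python
import difflib

def suggest_merges(tags, threshold=0.85):
    """Flag near-duplicate tag pairs (e.g. 'on-boarding' vs 'onboarding')
    as merge candidates. A human still approves each merge."""
    candidates = []
    tags = sorted(tags)
    for i, a in enumerate(tags):
        for b in tags[i + 1:]:
            # SequenceMatcher.ratio() is 1.0 for identical strings,
            # near 0.0 for unrelated ones.
            ratio = difflib.SequenceMatcher(None, a, b).ratio()
            if ratio >= threshold:
                candidates.append((a, b, round(ratio, 2)))
    return candidates
```

Running this monthly over the full tag list gives the governance pass a short, reviewable queue instead of a 500-tag audit.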
Worldview
  • Research has a half-life — insights that aren't findable decay into opinions within months
  • The biggest waste in UX research is re-researching questions that were already answered
  • A research repository is only valuable if the team uses it — discoverability is as important as depth
Scenario

A PM is redesigning the settings page and asks the research team: "Do we know anything about how users currently navigate settings?" The research ops manager searches Dovetail and finds 3 relevant studies from the past 18 months: a usability test with 8 participants, an interview study about power user workflows, and a survey about feature discovery. They compile the relevant tags and insights into a summary with video clips. The PM has the answer in 2 hours instead of commissioning a 3-week study. They use the existing insights to inform the redesign and commission targeted testing only for the new patterns they've introduced.

Context

Manages Dovetail for a research team of 2–8 researchers serving 3–10 product teams. Has cataloged 50–300 research studies. Maintains a taxonomy of 100–500 tags organized by theme, persona, and feature area. Codes 10–30 transcripts per month. Creates research summaries and insight reports for stakeholders. Trains new researchers on the tagging taxonomy and workflows. Spends 40–60% of their time on research operations and 40–60% on conducting research. Has built templates for different research study types.
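The coverage-tracking part of this role can be sketched as a cross-tab over an exported study catalog: which persona/feature pairs have at least one study, and which are gaps. Everything here (the study records and field names) is a hypothetical illustration, not a Dovetail export format:

```python
from collections import defaultdict

# Illustrative catalog; in practice this would come from a repository export.
STUDIES = [
    {"title": "Settings usability test", "personas": ["power-user"], "features": ["settings-page"]},
    {"title": "Power user interviews", "personas": ["power-user"], "features": ["search"]},
    {"title": "Feature discovery survey", "personas": ["new-user"], "features": ["search"]},
]

def coverage_gaps(studies, personas, features):
    """Return (persona, feature) pairs no study covers --
    the 'what haven't we researched yet' view."""
    covered = defaultdict(int)
    for study in studies:
        for p in study["personas"]:
            for f in study["features"]:
                covered[(p, f)] += 1
    return [(p, f) for p in personas for f in features if covered[(p, f)] == 0]
```

With the sample catalog, `coverage_gaps(STUDIES, ["power-user", "new-user"], ["settings-page", "search"])` reports that new users have never been studied on the settings page, which is exactly the kind of gap the Goals section asks this persona to track.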

Success Signal

They've stopped comparing alternatives. Dovetail is open before their first meeting. Tagging and coding qualitative data (transcripts, notes, videos) with a consistent taxonomy runs on a cadence they didn't have to enforce. The strongest signal: they've started onboarding teammates into their setup unprompted.

Churn Trigger

The core frustration keeps recurring despite updates and workarounds: tagging is time-consuming, and even with AI assistance, coding transcripts manually is the bottleneck. They start tracking how much time they spend fighting Dovetail versus using it. The switching cost was the only thing keeping them, and it's starting to look like an investment in the alternative.

Impact
  • AI-assisted tagging that suggests codes based on content, reducing manual coding time by 50%+
  • Better taxonomy management with merge, rename, and hierarchy tools that keep tags clean
  • Discovery features for non-researchers — recommendations, search, and topic pages that make the repository accessible
  • Integration with product analytics tools to connect qualitative insights with quantitative behavior data
Composability Notes

Pairs with dovetail-primary-user for the standard research repository perspective. Use with hotjar-ux-researcher for combining qualitative research with behavioral analytics. Contrast with maze-ux-tester for the moderated vs. unmoderated research comparison.