Finding the right UX research platform for your team is a tall order. It requires considering your organization's current needs and budget, as well as looking to the future.
Which processes, tools, and frameworks are necessary for both long-term and short-term solutions?
This shopping list will help you ask the right questions so that you can find a tool that fits your organization's unique specializations, budget, and needs. Plus, we've created a full Buyer's Guide to Experience Research Platforms to make the final decision even easier.
What problems are you looking to solve?
With ever-tightening budgets and uncertain headcount, having a clear idea in mind when shopping for a research and insights tool is critical.
- Missing a key data point from your customers?
- Noticing insights gaps on other teams?
- Wishing stakeholders had more access to customers?
Whatever the reason, jot it down and keep it handy as you begin perusing the countless research tool options.
Narrowing in on these key questions not only helps you, the decision-maker, but can also drive buy-in from other stakeholders and leaders that this is a problem worth solving. If you’re not sure where to start, take a week to conduct informal interviews with your collaborators, stakeholders, and team to identify the drivers of tooling.
And if you're stumbling over how to even conceptualize and operationalize "ROI," metrics like acquisition, activation, retention, referral, and revenue (the so-called "pirate metrics") are a good place to start. Developing business knowledge—how a business "works" and what impact research can have—is another core skill to build before you start shopping.
Having an unclear driver—or no rationale at all—can lead to aimlessly wandering the aisles of research tools, wondering if you're even hungry at all. More seriously, it can lead to poor or unmeasurable ROI, a loss of trust, and the potential devaluing of research broadly.
Without metrics, it will be very difficult to create a case for securing ongoing funding. You'll likely want to keep using a research tool you and your team find impactful.
So really stop and think—why are you in the market for a tool? What do you hope to achieve once one has been onboarded?
Now you can start perusing those aisles...
This is likely the first item on your list. What exactly does the tool or platform do? Most platforms have an angle or reason for existing. Ideally, that will be spelled out on their homepage. "Usability testing for all" or "The only recruitment tool you'll ever need."
Before you start combing search results or databases, create a list of the kinds of research projects you and your team will need. Don't get too narrow—staying broad and considering the complement of approaches, methods, and features you and your team will eventually need is helpful when evaluating.
Here are some example questions:
- Do I need moderated (e.g., interviews), unmoderated (e.g., task-based), or both kinds of research?
- Does the platform have a panel? If so, who comprises it? If I need to bring my own participants, can I? Will they be marketed to or automatically added to the tool's panel?
- How are projects displayed to participants? Are there mobile and desktop options? Do participants have to create an account or download an app?
- Are we well-equipped for foundational, generative, and evaluative research?
Most user research platforms have similar question libraries (e.g., open ends and scales), so note if you need something more specific, such as mobile view or desktop screen recording.
This is a perfect place to start because these elements can usually be intuited from a marketing website or help center, avoiding the need for a sales call just yet.
Equally important to the core research functionality is the operations or ops elements. For scrappy or small teams (or those facing uncertain budget futures), having built-in ops features can maximize time, reduce rework, and improve research output quality.
To build your ops list, think of all the preparatory, setup, design, and admin-related tasks any one research project requires. This list should really help narrow in on the best tool fits.
Questions to suss out operations capabilities:
- If recruitment is available, how are incentives processed? Is there a fee?
- Can I create my own panel or participant pool?
- What kinds of account access or seats are available?
  - e.g., Can stakeholders view sessions or data? Can I limit that access?
- Are there seat types offering limited research design access?
- Is there an account-level view displaying usage, seats, and settings?
Depending on the kinds of projects you seek to run (see above) you might also have more specific ops needs, like:
- Is there transcription for audio or video data?
- What analysis capabilities are on-tool (e.g., tagging, note taking)?
- Are there ways to message or follow-up with participants?
- For interviewing, can I schedule sessions directly from the platform?
This is a key shopping bucket to consider, because operations misses can grow over time, limiting your team's impact. When ops are considered, a tool can free up more time for those knotty strategic and foundational questions that help make research invaluable to an org.
✔ User experience
Despite this being a list of considerations for experience software, not all platforms will leverage design, UI, and engineering power similarly. This shopping item might require a formal demo or signing up for a sandbox account, but will go a LONG way toward helping you determine long-term fit.
As an experience thinker, you'll likely have your own list of preferences, but be sure to also consider junior and non-research colleagues.
- How many steps (roughly) does it take to go from idea to project?
- How are tools organized? Can a user move from recruitment to design easily?
- Are features named in sensible or intelligible ways? Could you explain them?
- What does data look like when participants submit? How is it presented?
- What will the experience be like for my non-research stakeholders? Will they be set up for success on the platform?
User interface and experience are key for teams asked to democratize research or who want to loop stakeholders more actively into the research process. If you find the tool difficult to intuit and explain, consider how much time your product, design, and engineering colleagues might spend struggling with it.
As an experience pro, you should have a high bar for your tooling:
- Have you noticed (thoughtful) updates to the UI? Can you find an updates log?
- Does the design leverage best practices related to color, contrast, sizing, and structure?
- For tools with mobile components, are the mobile and desktop interfaces consistent?
After getting a handle on a research tool's core elements—what it does—it's time to consider how it does it.
These shopping list items are key for tools you plan to purchase subscriptions for—those that you hope to integrate into your team as extensions of it.
Privacy and security
✔ Data governance
The research tool itself is only half of the story. The other half is how your participants’ data is managed. This might require a message to your security or trust team, but these are critical questions, especially if you work in a regulated industry, on a remote team, or have international aspirations.
Frankly, even if those don't apply to you or your org, privacy and security issues should still be top of mind for you and the providers you partner with.
Some questions to get you going:
- Is data encrypted? Where is data stored?
- Can data be exported from the platform? Can data be deleted from the platform?
- Does the tool have a dedicated privacy, security, or trust team?
- Has the tool earned any certifications (e.g., HIPAA or HITRUST)?
- Can informed consent documents be created and uploaded?
An engineering partner might have more specific questions about penetration tests, up/down time, and other specific technical elements, but don't skip questions about your participants’ data—that is your relationship to maintain and sustain.
Some teams may need facial anonymization, per-person data scrubbing, or even client-side invisibility. Any and all of these questions should be fair game. If a salesperson hesitates or waffles at all, consider another provider. They should be as serious about it as you are.
With more info on the what and how, it's time to turn to how much—pricing and offerings. There might not be much variation to evaluate in pricing as much of the industry shifts to subscriptions, but that creates openings for vendors to differentiate themselves on other items, like support and research skills.
✔ Pricing model
In general, research tooling is priced in a few ways: freemium, ad-hoc, and subscription.
With freemium, access is free, but certain features are gated. Ad-hoc pricing usually offers the complete platform experience on a per-project or per-engagement basis (although this can be more expensive over the long run). Finally, subscriptions are the most common model and are typically segmented by tier.
Here is another moment to reflect on your team, both in its current form and how you hope it will grow. Will your team grow in the next 12 months? Will stakeholders want to view projects? Could other teams be interested in running research?
Jotting down these nuances will help you better evaluate the myriad tiers, terms, and types of subscriptions.
Here are some things to consider:
- Does the platform offer a trial? What is the access and duration?
- Are the core features available across all subscription tiers?
- How flexible are contract lengths?
- Are there limits on concurrent studies or recruitment?
- Can you "see" your consumables or usage at any time or only at specific times?
- What are the pricing elements (e.g., seats, participants, projects)?
- Can you adjust (e.g., add onto) your contract once it has been finalized?
A good shopping tip is to reflect on the past 12 months of research you ran: What did each project require in terms of access, features, etc.? Now imagine you had to run ALL of those projects with this tool. Could you?
Some platforms gate concurrent studies or limit the number of builders versus viewers. There is no "right" answer, only better or worse fits.
A dollar spent on one tool is not equal to a dollar spent on another—support is often the difference. Regardless of your team's size, the availability and kinds of support a platform provides are strong predictors of whether you see ROI or misspend your budget.
Once again, reflect on your team and your own bandwidth: Do you have a principal researcher who folks can route questions to? Are you available to help if/when it's needed?
If not, make sure to dig into the platform's on-staff support. Better yet, ask for a conversation with a support person.
Here are some questions to guide you:
- Is there a dedicated support person assigned to your team?
- How is support communicated? Are there specific hours?
- What kinds of support are available (e.g., design, fielding, or analysis)?
- Are support staff technical/platform experts only, or can they help uplevel my work?
- Are there opportunities for alpha or beta testing of new features?
Support can go a long way, especially if you or your team are newer to remote experience research tooling. Not only can they help with bugs, participant problems, and the day-to-day of operations, but some can help mentor junior colleagues and serve as extensions of your team. This further boosts your ROI.
A final few points
You might have already found your tool of choice, but if you're still debating, here are some other elements and wish list items to consider:
- Are they reviewed well? Check sites like G2 for verified feedback from others or industry-specific analysts like Forrester and Gartner.
- Have they earned independent awards? Many firms conduct evaluations and comparisons.
- What is their selling process? Do you feel seen and heard? Are you informed of next steps? Do reps leave time for you to ask questions? How a firm sells can speak volumes as to how they will consider you if you become a subscriber.
- To what extent are they involved in the broader community? Do they host or attend events, create programming, or otherwise demonstrate a commitment to the space beyond selling software?
The market for research software has never been as active and diverse as it is now. Experience and insights teams have hundreds of options before them.
The best choice is likely out there, especially for those who stop to consider their needs—both now and in the future—and carefully compare those to what a platform offers.
When it works, a research tool shifts from a platform to a partner. This helps teams scale, speed up, and smooth their research practice—further building empathy and insight impact org-wide.
Ensure that your experience research platform checks all of your boxes and uses your team's time and budget efficiently.
By downloading A Buyer's Guide to Experience Research Platforms, you'll get...
- Advice on how to find a quality and engaged participant pool
- Tips on how to streamline your research ops/what features your tool should provide
- A list of analysis features to look out for
...and so much more with a bookmarkable checklist of 25 Questions to Ask Your Experience Research Provider.
Download the guide below and feel more confident throughout the shopping process.
Ben is the product evangelist at dscout, where he spreads the “good news” of contextual research, helps customers understand how to get the most from dscout, and impersonates everyone in the office. He has a doctorate in communication studies from Arizona State University, studying “nonverbal courtship signals”, a.k.a. flirting. No, he doesn’t have dating advice for you.