
From Collecting to Connecting: Interoperability, Openness, and Research Ops

Casey Gollan @ IBM

ResearchOps practitioners extend the reach and impact of research across the company and make it possible for every design decision to be informed by research and data. Some of the ways this is accomplished are by building user research libraries and voice of the customer platforms. Casey Gollan advocates for smoother data flow between systems and tooling.

Transcript

Casey Gollan:

Imagine you're a firefighter. You're running to a burning building, rushing to a fire hydrant with your wrench and your hose. But when you start to turn the wrench, you realize it's the wrong shape, and you can't get the water to flow.

Is this a bad dream? How is this possible? Lucky for you, you have a hundred other wrenches that you can try, and you finally find one that works. Only to realize that now your hose doesn't fit. So water is just splashing to the ground, and everyone starts scooping it up with their hands and throwing it into the flames. You're powerless to do your job. Your expertise is wasted, and the building burns down.

In this imaginary example, fire hydrants are a super hot industry, kind of like tech is today, and hundreds of different kinds of proprietary hydrants and wrenches have been created because of competition between companies. So fire hydrant innovation abounds, but fire safety not so much.

In the real world, this would never happen because of agreements between the makers of hydrants, wrenches, and hoses. This is thanks to professional associations that maintain fire hydrant specifications, city councils that enact regulation, and laws that require standards in the name of safety.

This is interoperability. It's what it looks like when systems and products are designed to work together. Interoperability is so common that you might not even realize how much it's part of our everyday lives.

Take emailing, for example. On the left, we see a person using Microsoft Outlook, and on the right we see a person using Gmail by Google. Email software is built on open and interoperable protocols like SMTP and IMAP. Just like the pipes, hoses, and wrenches designed to interlock, shared protocols are what make it possible for us to communicate across tools made by different companies.
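To make that concrete, here's a minimal sketch in Python of handing a message to a mail server over SMTP. The server address and credentials are placeholders, but any standards-compliant client or provider could stand in for either end of the exchange, which is exactly the point of a shared protocol.

```python
# A minimal sketch of sending a message over SMTP, the shared protocol
# that lets any email client talk to any email server. The server
# address and credentials below are placeholders, not real services.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "researcher@example.com"
msg["To"] = "stakeholder@example.org"
msg["Subject"] = "Latest usability findings"
msg.set_content("Sharing the top three insights from this week's sessions.")

with smtplib.SMTP("smtp.example.com", 587) as server:
    server.starttls()  # upgrade the connection to TLS
    server.login("researcher@example.com", "app-password")  # placeholder credentials
    server.send_message(msg)
```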

Today I'm super excited to talk to you all about what interoperability has to do with user research. My name is Casey Gollan, I'm a board member of the global research ops community, and I work in research ops at IBM with a focus on insights, data, and technology.

My mission as a ReOps practitioner is to extend the reach and the impact of research across the company, and make it possible for every design decision to be informed by research.

In order for research to make a business impact, research and data need to reach the right stakeholders at the right time in the right place. That's what got me looking closely at how data flows between the platforms used by researchers, product teams, and collaborators.

I've found that on a technical level, research platforms are locking in data and effectively siloing insights, instead of integrating them into product planning. We can send emails between Outlook and Gmail. So why can't we do research across platforms? And why is research such a fragmented and oftentimes manual process, full of copying and pasting between tools?

If you're like most researchers and ReOps teams, I know you may just be starting the process of adopting purpose-built research tools, and that's something to celebrate. This talk is about the future of research ops, and some of the problems that as a field we're increasingly going to run into down the line when it comes to managing our data.

Later in the talk, I'll share a really simple framework for you to take back to your team and your vendors to start the conversation about truly owning your research data. But first, we'll start by talking about, what is research data?

Researchers produce, consume, and generally just juggle a lot of data, not to mention many different kinds of data. So interoperability for research data is no small feat. What might be possible if all this research data were interoperable? If we could push, pull, and sync between all these different data sources?

Imagine a ReOps manager trying to figure out who participated in research this year, and how much they were paid. This is really important for compliance, and for creating strong relationships by not overtaxing participants. Almost every research tool has a list of participants somewhere in it, so right now you might have participants in three or more tools and your list of internal users in another. But they're all siloed.

If these tools were open platforms, we could automatically sync participants across tools, and make it easy to answer simple questions like this. We could even do math across tools. Right now, this looks like downloading a spreadsheet from here, there, and the other place. And figuring out how to put all the pieces together each time.
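As a rough illustration of the kind of cross-tool math this would make routine, here's a minimal Python sketch that merges participant exports from a few hypothetical tools and totals what each person was paid. Every file name and column name here is made up; the point is only that the join is trivial once the data can get out of each platform.

```python
# A minimal sketch, assuming each tool can at least export a CSV of
# participants. The file names and column names are hypothetical.
import csv
from collections import defaultdict

EXPORTS = {
    "interview_tool.csv": ("email", "incentive_usd"),
    "survey_tool.csv": ("participant_email", "payout"),
    "panel_tool.csv": ("Email Address", "Amount Paid"),
}

totals = defaultdict(float)     # email -> total paid this year
appearances = defaultdict(set)  # email -> tools the person appears in

for filename, (email_col, amount_col) in EXPORTS.items():
    with open(filename, newline="") as f:
        for row in csv.DictReader(f):
            email = row[email_col].strip().lower()
            totals[email] += float(row.get(amount_col) or 0)
            appearances[email].add(filename)

for email, paid in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{email}: ${paid:.2f} across {len(appearances[email])} tool(s)")
```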

Imagine a research manager at a super large company trying to figure out which departments are engaging with research. You might have sources of research data, a reporting tool, and an org chart. But right now nothing really brings them together, except manually creating presentations for each department.

With open platforms, research data could flow into a reporting tool that stakeholders access, and we could use our org chart to show not only who's interacting with research but, at a higher level, which departments are showing an uptick in engagement and are worth following up with.
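Here's a sketch of what that roll-up might look like, assuming the engagement log and the org chart can each be exported. Every field name and value below is hypothetical.

```python
# A minimal sketch: roll up research-engagement events by department
# using an org-chart lookup. All of the data here is hypothetical.
from collections import Counter

# e.g. exported from a reporting tool: who viewed or commented on research
engagement_events = [
    {"user": "amara@example.com", "action": "viewed_report"},
    {"user": "li@example.com", "action": "commented"},
    {"user": "amara@example.com", "action": "shared_insight"},
]

# e.g. exported from an HR system or org chart
org_chart = {
    "amara@example.com": "Product Management",
    "li@example.com": "Engineering",
}

by_department = Counter(
    org_chart.get(event["user"], "Unknown") for event in engagement_events
)

for department, count in by_department.most_common():
    print(f"{department}: {count} engagement event(s)")
```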

Imagine a product manager who just spoke with a customer and heard something about adoption that they want to validate. But there are a hundred tools and they can't remember, do I even have a login for the place that research lives? With open platforms, we could connect research insights directly into the company-wide search engine so research is meeting people where they're already working.
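As a sketch of meeting people where they already work, here's what pushing an insight into an internal search index might look like over HTTP. The endpoint, token, and field names are placeholders, since the real shape would depend on whatever search service your company runs.

```python
# A minimal sketch: push a research insight into a company-wide search
# index so it shows up alongside other internal content. The URL,
# token, and field names below are placeholders, not a real service.
import requests

insight = {
    "id": "insight-1234",
    "title": "New admins struggle to find the adoption dashboard",
    "summary": "6 of 8 participants looked in Settings before Reports.",
    "source": "Q3 onboarding study",
    "url": "https://research.example.com/studies/q3-onboarding/insight-1234",
    "tags": ["adoption", "onboarding"],
}

response = requests.post(
    "https://search.internal.example.com/api/documents",
    json=insight,
    headers={"Authorization": "Bearer <placeholder-token>"},
    timeout=10,
)
response.raise_for_status()
print("Indexed:", insight["id"])
```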

And for all the researchers here today, when I see an infographic like this about research, part of me thinks, wow, this is so clarifying. The other part of me is reminded of a romcom with an unrealistic, predictable straight line of a plot.

Research, when I've seen it, is almost never tidy like this. In real life, research is non-linear and iterative. And it looks a lot more like jumping between a doc, a whiteboard, a database, and a slide deck. When you do this with copy and paste, you're losing metadata and context each time.

What if, instead of having to create static artifacts like a research report, research could be created and navigated in full fidelity, across platforms, through space and time and with meaningful links between tools?

So these are just a few of the possibilities of what I would call open research platforms, and they're mostly not possible today. So to understand why we lack openness in research platforms, let's look at where these tools came from, and also the markets that have shaped up around them.

Before research platforms as we know them today, you might have done your research in rooms like these. Maybe you had shelves for storing files and folders, a Rolodex of participants, notebooks, scissors and tape, and even a two-way mirror.

Desktop research software started coming into the picture in the eighties. It was originally developed out of universities, oftentimes by researchers themselves, before it started becoming commercialized in the nineties. As the usability and user experience of software became increasingly important to the tech industry, web-based product research platforms took off too.

Fast forwarding to today, we've got over 400 platforms in our research ops toolbox. On this annual UX research tools map, common features like surveys or scheduling are shown as their own subway lines, spanning multiple tools.

Where the subway map metaphor falls apart is that navigating UX research tools is actually a lot harder than taking the train. I can swipe my metro card and move through the entire subway system, transferring lines, changing directions. But trying to change research platforms or work between tools can feel impossible.

A better analogy for UX research tools might be traveling by airplane. Tickets are expensive, there are upsells at every turn, and you have to take everything out of your bag and put it all back together again. But there's a bigger problem here.

In a tangled and disconnected ecosystem like this, there's no single source of truth for research, and information becomes fragmented between platforms. Data can even get lost or discarded as we move between platforms.

On every marketing website that you visit, you'll find most of these same features. But the way that research happens, and how the data is actually organized within each platform, can be frustratingly different. In the category of insight repositories, for example, you're often doing the same kind of analysis that's been happening in desktop software since the eighties. But there's no agreement between today's platform makers on what constitutes a research finding, a story, or an insight. I've seen researchers, and even ReOps teams who evaluate tools every day, struggle to parse the differences between platforms and which tool to use for what.

So how did we get here? Over the past decade, investors have pumped billions of dollars of venture capital into companies creating software for user research. This has created a big bang proliferation and consolidation of research software. Startups have to grow fast in an attempt to bite off a chunk of the market, or they'll be swallowed by an even larger platform. Through acquisitions, a few mega platforms have emerged, and they're now competing to be the one research platform to rule them all.

So in this market, instead of having simple well-crafted tools that can all work together, companies are incentivized to create big, bloated, all-purpose mega platforms that claim to do a little bit of everything. And they're avoiding solving for interoperability by encouraging teams to keep all their research activities under one roof. Maintaining subscriptions to a bunch of tools that pretty much do all the same things is expensive, confusing, and it creates a lot of admin overhead. So teams have to make hard choices between this or that all-inclusive vendor.

Imagine that you finally made the perfect choice, only to find out that the platform you chose is shutting down next month. The market is volatile, and even successful platforms disappear. In a recent audit of the ReOps toolbox, Caro Morgan discovered that over 50 of the listed tools were deprecated.

User research platforms are increasingly critical infrastructure on which companies are creating and managing knowledge. They're the data repository, the collaborative workspace, and the organizational memory. But unlike the past, where you might have owned your office building, your notebook, or your files, organizations today are renting their research workspaces. And these platform companies are the ones who control how and where we can work with our own research data, and whether or not we truly own it.

So research platform companies are driving innovation and making research so much more intuitive. But they're also adding these challenging new layers of organizational complexity and fragility.

So what does an open research platform look like in practice? There are three key areas you'll want to pay special attention to: portability, integrations, and extensibility. And if you can only ask three questions when procuring a new tool, make it these. Can I move my data in and out in standard formats? Can this platform connect with others? And can I build on this platform?

We'll go through these categories in a bit more detail, but there's one more criterion. A lot of evaluations focus entirely on features. But when you're evaluating a platform, take a look at the company too. And ask, is this platform here to stay? You can use a rubric like this, and I'll link you to a copy of it at the end of the talk. So try it out on some of the tools you use, and let me know what you find out.
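If it helps to make the rubric concrete, here's one way the four screening questions could be sketched as a tiny checklist score in Python. The categories and questions come straight from the talk; the scoring itself is illustrative, not the official evaluation cheat sheet.

```python
# A sketch of the four screening questions as a simple checklist score.
# The questions come from the talk; the scoring is just an illustration,
# not the official evaluation cheat sheet.
QUESTIONS = {
    "Portability": "Can I move my data in and out in standard formats?",
    "Integrations": "Can this platform connect with others?",
    "Extensibility": "Can I build on this platform?",
    "Sustainability": "Is this platform here to stay?",
}

def score_platform(name: str, answers: dict[str, bool]) -> None:
    """Print which screening questions a platform passes."""
    passed = 0
    print(name)
    for category, question in QUESTIONS.items():
        ok = answers.get(category, False)
        passed += ok
        print(f"  [{'x' if ok else ' '}] {category}: {question}")
    print(f"  Openness score: {passed}/{len(QUESTIONS)}")

# Example: evaluating a hypothetical platform
score_platform("Hypothetical Research Hub", {
    "Portability": True,
    "Integrations": False,
    "Extensibility": False,
    "Sustainability": True,
})
```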

When you're researching a research platform, you have to go deeper than the marketing website. If there's a help center, dig around in there. Sign up for an account and go straight to the settings page to see if you can download a full backup.

You may also want to look at parts of the site that are usually in the footer, like the about page or the careers page to see, has this company been around for a while? Is it growing?

For information on the business, check out Crunchbase, which is a handy site with a lot of investment information. It can also be fun and informative to use the Wayback Machine to look at what a company's website looked like two or three years ago. Have there been major pivots?

When it comes to data portability, look for these things. Can you export everything in one click, or can you only get bits and pieces out of the system? Does the platform have migration tools to help you move your data into and out of the platform?

I saw some big applause for integrations this morning, and integrations are super important as a researcher for controlling the flow of your data between platforms. You'll find integrations advertised a lot, and pretty vaguely, on marketing sites, with huge lists of tool logos. So it's important to be able to break this category down and to spot the limitations.

Do the integrations work in both directions, or are they just one way? Are they built in or are they third-party? Does this platform have webhooks? Webhooks are the most flexible and powerful way that a platform can enable integration, by giving you full control to send data between tools in real time.
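For a sense of what webhook-driven integration looks like in practice, here's a minimal sketch of a receiver that listens for an event from one research tool and forwards it to another in real time. The route, payload fields, and destination URL are all hypothetical.

```python
# A minimal sketch of a webhook receiver: one research tool POSTs an
# event here, and we immediately forward it to another tool.
# The route, payload fields, and destination URL are hypothetical.
from flask import Flask, request
import requests

app = Flask(__name__)

DESTINATION = "https://insights-repo.example.com/api/notes"

@app.post("/webhooks/new-highlight")
def forward_highlight():
    event = request.get_json(force=True)
    # Reshape the incoming event into whatever the destination expects.
    note = {
        "title": event.get("highlight_title", "Untitled highlight"),
        "body": event.get("transcript_excerpt", ""),
        "source": event.get("session_url", ""),
    }
    requests.post(DESTINATION, json=note, timeout=10)
    return {"status": "forwarded"}, 200

if __name__ == "__main__":
    app.run(port=5000)
```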

For extensibility, the key feature is having an API. If you think of a user interface as what connects a computer to a person, an API, or application programming interface, is what connects computers to each other. So when you're looking out for extensibility, make sure your platform has an API. And also check, is there a marketplace for plug-ins or add-ons? These allow for extending what a platform can do.
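As a sketch of what an API unlocks, here's how a ReOps team might pull every insight out of a platform programmatically. The base URL, endpoint, parameters, and response shape are hypothetical, since few research platforms expose anything like this today.

```python
# A minimal sketch of pulling insights out of a research platform
# through an API, page by page. The base URL, endpoint, parameters,
# and response shape are all hypothetical.
import requests

BASE_URL = "https://api.research-platform.example.com/v1"
HEADERS = {"Authorization": "Bearer <placeholder-token>"}

def fetch_all_insights():
    """Yield every insight, following hypothetical page-based pagination."""
    page = 1
    while True:
        resp = requests.get(
            f"{BASE_URL}/insights",
            headers=HEADERS,
            params={"page": page, "per_page": 100},
            timeout=10,
        )
        resp.raise_for_status()
        data = resp.json()
        yield from data["items"]
        if not data.get("has_more"):
            break
        page += 1

for insight in fetch_all_insights():
    print(insight["title"])
```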

Some platforms also have built-in scripting, which allows you to write and run code directly inside the platform. If there's a developer portal or a community, that's also always a good sign.

Finally, consider sustainability. The sudden disappearance of software, or sunsetting as it's known in the tech industry, is so common that there's a whole blog dedicated to these announcements, with hundreds of posts. Venture capital, the type of investment that's driving today's booming market for research platforms, is a make-or-break kind of business model. And at the end of the day, there need to be returns for the investors.

But VC is not the only way to create and maintain software. There are other models, like open source. If you've used the Android operating system, visited a WordPress site (WordPress powers around two-thirds of sites built with a content management system), or used Firefox to browse the web, you've used open-source technology. Even if the maintainers of an open-source tool were to close up shop and walk away, you can continue to run your own copy. And you can even add features to the tool yourself.

So if platforms could pass these four criteria, they'd be more like building blocks instead of monolithic, walled gardens. And we might be able to do away with some of these gigantic evaluation spreadsheets. We could open up our toolbox, pick the right tools for the job, and work seamlessly between them.

We'd also unlock all kinds of new possibilities about how we do research. Having evaluated the openness of many research platforms over the past year, I have a few anecdotal findings. There are far more ways to push data into platforms than to get your own data out. And while most tools have some form of data exports, they often have to be manually requested one at a time, on the level of individual projects or pages. Almost all research tools advertise integrations, but they're so limited that you'll still find yourself better off copying and pasting.

And finally, APIs are almost non-existent in the user research space. Where they do exist, they're limited in scope, not allowing for a lot of possibility in terms of building on the ecosystem.

So is your user research platform locking in your data? If you use purpose-built research tools, the answer is almost certainly yes. But it doesn't have to be this way. Design tools are ahead of the curve in terms of openness. Whether you're using Figma or Sketch, designers aren't using fake lorem ipsum copy anymore. They're piping in real product data to design tools. They're creating design systems that are generating code, like color tokens, which are exported directly into developer environments. And there are plug-in ecosystems. On the left is a content library made by Microsoft inside of Figma, and shared for any design team to use.
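As a small illustration of that token pipeline, here's a sketch that turns a design-token export into CSS custom properties ready for a developer environment. The JSON structure is a simplified stand-in, not any particular design tool's real export format.

```python
# A sketch of the "design tokens -> developer environment" hand-off:
# convert a JSON export of color tokens into CSS custom properties.
# The token file structure is a simplified stand-in, not any specific
# design tool's real export format.
import json

tokens_json = """
{
  "color": {
    "background": "#ffffff",
    "text-primary": "#161616",
    "interactive": "#0f62fe"
  }
}
"""

tokens = json.loads(tokens_json)

css_lines = [":root {"]
for name, value in tokens["color"].items():
    css_lines.append(f"  --color-{name}: {value};")
css_lines.append("}")

print("\n".join(css_lines))
```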

So interoperability may be technically challenging, but it's not impossible or unprecedented. There are so many amazing examples, from academia, to government, to industry. In 2016, desktop research software makers, the kind still used mostly in academia, came together to create the REFI-QDA standard, so that research projects can be opened between seven different analysis tools.

In 2018, Google, Facebook, and other major platforms launched the Data Transfer Project, a shared data portability framework that, among other things, lets users move all their photos between services with one click. And in the research ops community earlier this year, the minimum viable taxonomy project shared their work towards a common language for knowledge management.

So what might an ecosystem of open and interconnected research platforms look like?

Researchers would move fluidly between tools. Research workspaces would be deeply interconnected. Migrating to a new platform would be a one-click process. Information would flow to meet collaborators wherever they work. ReOps teams would build new ways of conducting research on top of cross-platform APIs. Competition between vendors would be based on innovation instead of lock-in, and nonprofit and open-source research platforms would be vibrant alternatives to today's venture-backed offerings.

Kate Towsey describes research ops as a lot less like administration and a lot more like service design. That, to me, is the heart of research ops. It's about understanding the social life of research within an organization, removing barriers to research success, and creating new kinds of possibilities. Especially in parts of the research process, like data management, that are so dreary in the present day that it's hard to even imagine the possibilities.

I would add to this that research ops is also designing dark matter, to quote the designer Dan Hill. He describes dark matter as, "The imperceptible yet fundamental facets of design; organizational cultures, regulatory environments, business models and ideologies."

In the example of the fire hydrants, you can think of safety regulations as the dark matter, the literal code that writes the city, enabling or inhibiting design patterns like interoperability. Hill proposes that architects could spend less time designing buildings and more time designing building codes to have a greater impact on the city.

I would propose something similar for research ops. Next time you're looking at what tools you use across the organization, go beyond comparing features, and consider the dark matter. How is data governed? How do these tools interconnect with each other? As research ops practitioners who are responsible for procuring tools, we have incredible leverage to work with vendors and push them towards greater interoperability.

Designing this dark matter might start by opening up your team's sense of what's possible, thinking big, beyond today's limitations, about research and operations. And it may extend even to advocating to your vendors that our field needs open platforms, and partnering with them to help realize that.

So with collaboration and a bit of rallying, I believe that as researchers and ReOps practitioners, we're in the right place at the right time to shape a more open future.

I will leave you with a few quick resources and takeaways. If you visit the open research platforms site, you can download the evaluation cheat sheet and share your experience of trying to integrate platforms. I'd also really encourage you to take part in the ReOps tools census. This will help our community continue to benchmark what kinds of tooling teams are using.

And finally, you can join the research ops community Slack, and check out the Tools channel where we have an ongoing conversation about topics like these. If you're interested in these ideas, I would love to connect. My contact info is at the bottom of this slide, and it would be great to continue the conversation. Thanks again to dscout for organizing this great event.
