
The CHANGE Canvas: A Plug-and-Play Brag Sheet for Your UX Accomplishments

Use this six-step framework to track and prove the value of your research.

Words by Devin Harold, Visuals by Allison Corr

Tracking the real-world impact of your UX research is important to your career. But how do you do it in a convincing way?

Most of us have been here before: after a year (or years) of hard work, documenting the real value and impact of that work suddenly feels overwhelming.

Your accomplishments don't land with the impact they should in your performance review. As a result, you miss out on proving your value and earning that desired raise or promotion.

In a previous article, I shared my IMPACTS framework, which includes practical steps to activate your insights and keep your research from being shelved.

But without being able to identify and track meaningful changes resulting from your insights work, you’ll still be unable to articulate impacts to your leader, get credit for your work, and take your career to the next level.


This article will walk you through the six parts of a one-page canvas for tracking CHANGE being made as a result of your research. Using the canvas, you will not only know what to look out for, but you’ll be able to capture everything in one place. This serves as the foundation for a project case study and a brag sheet for your next performance review.



Prefer a PDF? Download this article, alongside the canvas you can use to note your progress.

What does CHANGE look like?

As we did before with the definition of IMPACT, let’s make sure we know what we mean when we say ‘changes.’ Dictionaries define change as to “make (someone or something) different; alter or modify.”

When we put that into the context of user research, this means that the change we’re looking for is any alteration or modification to people, processes, designs, roadmaps, budgets, or work plans made as a direct result of our findings.

The scale of trackable change also varies. In some cases you may change the entire direction of a product or improve the organization’s UX maturity, while in others you simply inform changes to the UI of a web page.

Both are solid examples of impact from great insights work, and both should be tracked. Others have coined the terms global and local impact: global impact refers to data or metrics from launched products, while local impact covers internal decision-, process-, or infrastructure-based changes.

Why use a canvas?

There are great pieces of advice out there for tracking the impact of UX Research work. But a canvas (like the one shown below) creates a tangible artifact for describing, visualizing, and assessing something all in one place.

A canvas is great to leverage when you expect to use its contents as a tool you can:

  • Ground conversations in
  • Refer back to frequently
  • Add to over time

One great use case for the CHANGE canvas is when you want to build rapport with a new department or stakeholder. You can pull up a case study for generative research and point to the impacts made across multiple levels in order to demonstrate UXR value.

Another great time to use the CHANGE canvas is during career conversations or performance reviews, where you have a chance to highlight your recent accomplishments. For research professionals, performance is often tied to the impacts that their research has made.

Capturing impacts in one place for every major project or initiative will make it easy to pull up prior to (and during) those conversations, so that you have everything you need to get credit for your hard work.

The scope of the canvas is flexible. You may use it to reflect on and gather impacts for a single recently completed project, or for an entire quarter’s worth of work. You can even use it to document multiple projects with varying impact, which lets you capture the impact of an entire career at a company.


The Canvas

To make things simple, the canvas for identifying meaningful action taken on our research is built around an acronym that doubles as the ultimate goal: seeing CHANGE through our work. So what does C.H.A.N.G.E. stand for?

  • Creative direction
  • Hard data and metrics
  • Advocacy
  • New priorities
  • Generative focus
  • Enterprise-wide
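If it helps to keep canvas entries in a lightweight structured log alongside the one-pager, here's a minimal sketch in Python. The class and field names are my own shorthand for the six letters, not part of the framework itself:

```python
from dataclasses import dataclass, field

@dataclass
class ChangeCanvasEntry:
    """One project's worth of CHANGE evidence, one field per letter."""
    project: str
    creative_direction: list[str] = field(default_factory=list)  # C: before/after artifacts
    hard_data: list[str] = field(default_factory=list)           # H: metric movements
    advocacy: list[str] = field(default_factory=list)            # A: who/what/when observations
    new_priorities: list[str] = field(default_factory=list)      # N: roadmap or backlog shifts
    generative_focus: list[str] = field(default_factory=list)    # G: discovery-phase notes
    enterprise_wide: list[str] = field(default_factory=list)     # E: maturity or culture shifts

# Example: log an impact as you notice it (hypothetical project name)
entry = ChangeCanvasEntry(project="Checkout redesign")
entry.hard_data.append("+12% on-page engagement after launch")
```

Because each field is an append-only list, the entry can grow over time, matching the living-document spirit of the canvas.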

Let’s continue to unpack why these matter and how you can apply this canvas in your own day-to-day projects.

Creative direction

Creative direction is typically the most obvious and often the most direct form of change observed when others leverage our research.

For example, changes to creative direction are seen when our fellow designers…

  • Move a button which was unnoticeable to users
  • Reformat the layout of a page to better merchandise products
  • Improve a task flow by removing an unnecessary step in the process

A benefit of this type of impact is that these changes are easy to track when designers work in common tools. Creative changes in the context of this canvas are not limited to interaction design, either.

Changes in creative direction are also observed when a content strategist updates the language and descriptions that were once vague or confusing to users, or even when graphic designers and marketing professionals leverage user-centered terms in advertising campaigns.

Anyone can compare the work from before and after research was conducted, which helps identify meaningful changes that resulted from user observations or feedback. In short, it clearly exemplifies your insights’ impact.

Updating the canvas

When documenting changes to creative direction on the canvas, it helps to have tangible artifacts that convey the nature of the change. Include screenshots, excerpts, and even links to files from both before and after user feedback was incorporated, and make the linkage between insight and action clear.

Motivating questions

To get started, ask yourself the following:

  • What visible, tangible creative changes have you seen after completing research?
  • Do you have clear designs/artifacts before-and-after research?
  • Are you sure the changes made are a result of your insights?

Hard data

Hard data and metrics may be the most revered form of changes we all wish to see from our work because it clearly ties business value to our efforts. Improved satisfaction, increased conversion rates, positive click-through rates, and reduction in call-in volume are all gold—especially to our business and product partners.

If you’re not used to thinking about these business metrics as a practicing UX professional, consider how each of them translates from our own language.

UX professionals use terms like:

  • Cognitive load
  • Task completion
  • Error rates
  • Time-on-task
  • System usability scale ratings
  • Usability

Because the translation from our UX-centric language to business-centric language can feel unfamiliar—and because there are many different metrics we could possibly measure—identifying hard data resulting from our insights work can be a challenge.

While this is in no way comprehensive, here’s a cheat sheet of a few UX terms you can generally translate to or correlate with business terms:

  • “Users” → “Customers” / “Prospects” / “Leads”
  • “Usability” → “Click-Through Rates” (CTR) / “Engagement”
  • “User journey” → “Funnel”
  • “Error/Failure rate” → “Exit rate”
  • “Success rate” → “Click-Through Rates” (CTR) / “Conversion”
  • “Time-on-task” → “Sessions”
  • “Satisfaction” → “Decreased churn” / “NPS” / “Retention”
  • “Findability” → “Click-Through Rates” (CTR) / “Conversion”
  • “Learnability” → “Engagement” / “Adoption”

Observing improved metrics and tying them to UXR contributions might require a healthy dose of inference. Unless you’re running a controlled multivariate test, it can be tricky to know for sure whether improvements in sales were a true result of an improved design based on user feedback.

Still, it’s important to track and celebrate improvements to critical business KPIs when your solid user research played an active role throughout the process.

Updating the canvas

Documenting changes to hard data and metrics within the canvas is simple: jot down each metric improvement one after another (e.g., +12% on-page engagement; +43% CVR). When coupled with the changes in creative direction, you already have a solid case for your insights work.
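If you're deriving those percentages from raw before/after numbers, the arithmetic is simply relative change. A small helper (hypothetical, not part of the canvas itself) keeps the formatting consistent:

```python
def metric_delta(before: float, after: float) -> str:
    """Format a before/after pair as a signed percent change."""
    pct = (after - before) / before * 100
    return f"{pct:+.1f}%"

print(metric_delta(100, 112))     # +12.0%  (e.g., on-page engagement events)
print(metric_delta(0.035, 0.05))  # +42.9%  (e.g., conversion rate 3.5% -> 5.0%)
```

Note that a conversion rate moving from 3.5% to 5.0% is a roughly +43% relative improvement, even though it is only +1.5 percentage points; being explicit about which one you mean avoids confusion with business partners.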

Motivating questions

To get started, ask yourself the following:

  • What are the team’s measures of success?
  • Do you have clear metrics from before and after a launch which you contributed to?
  • If you don’t know the measures of success, who can help you get them?

Advocacy

Advocacy for user research is a precious, underrated impact that can often be difficult to identify. Unlike hard data, which is quantifiably measured, this is an internal observation of:

  • How receptive other departments or stakeholders are to research efforts
  • How much they’re going to bat to ensure it’s an integral part of the collaborative process

Advocacy also refers to how many more people are willing to listen to—and fight for—user needs. For example…

  • Is that one executive who was once difficult to find calendar time with now inviting you to staff meetings to present findings?
  • Is that product manager who was giving you trouble fitting into their release schedule now engaging you to proactively embed research into the process?
  • Do you have new stakeholder groups or teams requesting research?
  • Are you showing up in more forums with wider reach?

These are all signs of advocacy. Unless you’re diligently tracking every meeting you have with certain groups, or exactly how many times a single insight makes it into a product release, you probably won’t have anything actually measurable in terms of Advocacy.

Instead, you have to keep an eye out for a few tell-tale signs:

  • Hard-to-reach stakeholders are no longer hard to reach
  • Skeptical partners are questioning you less
  • Current allies are introducing you to new allies
  • You’re receiving ever-increasing requests for research
  • User research is assumed, or treated as table stakes, in the process
  • You’re being invited to more forums and/or more senior forums
  • You have to convince less and prioritize more

Updating the canvas

To document changes to Advocacy within the canvas, it’s best to summarize observations that reflect the examples above (among others). Clearly state the who, the what, and the when.

Including when each change took place is important because it may not be a single project that influences partners to engage more in our work. Some of these impacts can take weeks, or even months, to surface! That’s why the canvas is a living document you should feel free to update as you gain new intelligence.

Motivating questions

To get started, ask yourself the following:

  • Do you have emails, chats, or memories of a current ally who used to be skeptical?
  • If you’re unsure where to start, identify current allies. What are their origin stories?
  • What stakeholders should you begin monitoring for allyship?

New priorities

New priorities can show up in several ways. For example, new priorities may include…

  • Prioritizing the backlog
  • Overhauling an entire release schedule
  • Generating new quarterly or yearly roadmaps
  • Allocating more departmental budget for more research
  • Supporting new or refreshed initiatives as a result of your insights

Capturing changes in priorities can sometimes be relatively straightforward, but other times quite tough.

If you’re familiar with the backlog of the product or experience you’re working on, then it should be straightforward. If insights you’ve delivered clearly favor a newly prioritized feature in the next release, then that’s a pretty clear change! Some high-functioning teams even take individual usability findings, rank them, and work together to prioritize them within the backlog for maximum impact.

In other cases, it can be difficult to get a hold of other departments’ roadmaps, or to know when they’re about to begin next year’s budgeting exercises. This is in part due to partners wanting to avoid too many inputs and opinions while their own thinking is only half-baked.

So how do you influence and track changes in priorities? From our IMPACTS framework, it helps to involve these partners from the beginning and motivate action by asking to be involved in their planning process.

Hopefully over time, the exposure to user needs and your strong POV will bolster their priority-planning process. If you’re lucky, they’ll directly involve you as a primary contributor. To gain visibility into the roadmap, simply ask for it!

Then, you’ll be able to clearly see the changes in the team’s priorities, which may manifest as an improved UX roadmap. If you’re able to identify a few features, bullet points, or budget requests which tie to user needs you helped uncover, that’s trackable impact.

Updating the canvas

To document changes to New Priorities within the canvas, it’s ideal to use the same before-and-after view we demonstrated with changes in creative direction. If you don’t have the “before” view, that's not a problem! Just include screen captures, artifacts, or bullets of the current-state roadmap with clear highlights on the pieces your research directly influenced.

You can do this by circling things in bright red, highlighting in obnoxious yellow, or double underlining in lime green. Don’t be afraid to boast about which aspects your work helped put on the map.

Motivating questions

To get started, ask yourself the following:

  • Do you have access to partner roadmaps? If not, how might you get access to them?
  • If you have an old roadmap, who can you follow up with to get an up-to-date view?
  • What priorities/roadmaps should you begin tracking?

Generative focus

Generative focus is about seeing early shifts in the type of research we do, as a result of the perceived value it provides.

You may prove value through impact like:

  • Conducting foundational research
  • Shifting priorities
  • Getting the cross-functional team closer to deep customer needs

Tracking the changes in how much generative research your team conducts over time is a great indirect way to build momentum with stakeholders and promote a user-centered culture. While this is a more operational and indirect measure of impact, it can be powerful when tied to other changes reflected within the canvas.

For instance, let’s say you had a project that was grounded in foundational research. It identified key unmet needs, which later led to improved creative direction, new work plans, and improved KPIs. You not only have a lot to celebrate, but you can leverage this as a case study for increasing this type of generative work within the team.

Generative focus means more than just conducting customer interviews or diary studies at the start of the project. It also means starting with competitive intelligence or existing insights as a jumping off point.

Changes in generative focus could also mean increased awareness and utility of leveraging frameworks to identify unmet needs, such as…

  • Personas
  • Journey maps
  • Service blueprints

This type of change can (and should) be tracked per project to identify the ones which begin with a discovery phase. However, true value is unlocked when looking at this over a period of time.

Are you seeing quarter-over-quarter growth in foundational studies run by you or the team as a whole? Are you seeing more and more projects begin with a dedicated discovery phase in the past six months vs. the prior six?
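One lightweight way to answer those questions is to tally, per quarter, what share of projects began with a discovery phase. A sketch (the sample data is invented for illustration):

```python
from collections import defaultdict

# (quarter, started_with_discovery) pairs -- invented sample data
projects = [
    ("2023-Q1", False), ("2023-Q1", True),
    ("2023-Q2", True),  ("2023-Q2", True), ("2023-Q2", False),
]

totals, generative = defaultdict(int), defaultdict(int)
for quarter, discovery in projects:
    totals[quarter] += 1
    if discovery:
        generative[quarter] += 1

for quarter in sorted(totals):
    share = generative[quarter] / totals[quarter]
    print(f"{quarter}: {generative[quarter]}/{totals[quarter]} generative ({share:.0%})")
```

Even a rough tally like this turns "we're doing more discovery work" into a concrete trend line you can point to.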

The hard work you’ve been putting into building allies, presenting your work, and demonstrating value may be paying off.

Updating the canvas

When documenting changes to Generative Focus, try capturing answers to two questions:

  1. Did the project or initiative you’re focusing on within the canvas start with generative exercises?
  2. Did this jumpstart or continue an upward trend you’ve observed over time?

If yes to number one, mark it in the canvas with a few bullet points around what the foundational phase was, and what it enabled the team to focus on for subsequent phases.

If yes to number two, try to capture where this project sits on the trend line, in addition to what the long-tail impact has been so far. For example: “The first study built true cross-functional personas, leading to a twofold increase in foundational work over the past four months and closing gaps in our knowledge of customers.”

Motivating questions

To get started, ask yourself the following:

  • Are you currently tracking what types of research your team does?
  • How many projects begin with generative research today? How many of them needed one?

Enterprise-wide

Enterprise-wide is the macro view of…

  • How your organization values human-centered decision making
  • How embedded a learning culture is in the company

When you start seeing signs of enterprise-wide changes as a result of your work, you’ve begun improving the UX maturity of your organization, and your seat at the table is being built out of stone.

Be patient though, as enterprise-wide impacts won’t be seen on many of your project-specific canvases. These macro-trends often take more than a single project to bolster and a long time—sometimes years—to truly manifest.

Enterprise-wide impacts come in a few forms, including…

  • Advancements up the UX maturity scale
  • Shifts in company-wide strategic direction
  • Product development process changes that influence how teams operate to deliver core products and services

All of these are signs of monumental change, though they may be difficult to spot. Monitor your organization’s UX maturity every six months. Remember, this canvas is about how UXR contributions led to impacts, so be specific about how your research has led to advancements in maturity.

Some examples of enterprise-wide achievements include…

  • Building a democratization effort leading to more user centricity
  • Revamping the design-thinking process to include more research
  • Building a model for agile research within cross-functional sprints
  • Noticing more executives consistently asking about research when making decisions

The closer and more directly you can tie a continual, strong research POV to these shifts, the more confident you’ll be.

Updating the canvas

When documenting UX maturity or organization-wide changes from your work, single-artifact examples may be hard to come by. Of course, if you have them, they’re best to include.

Otherwise, a simple bulleted documentation of your observations should suffice. To paint an accurate picture, include what type of enterprise change it is, and what continued or longitudinal efforts led to that change.

Because it may take many studies to amount to such large-scale changes, try not to list every one. Instead, reference topics, programs, or milestones as your evidence.

Motivating questions

To get started, ask yourself the following:

  • Are you aware of your organization’s current UX maturity?
  • How do you find out about large organizational shifts in priorities?
  • Which senior executives matter most when observing user-centricity in high-level decision making?

How to get started

We’ve covered the most important CHANGE to observe when driving impact within any organization:

  • Creative direction
  • Hard data and metrics
  • Advocacy
  • New priorities
  • Generative focus
  • Enterprise-wide strategy or maturity

In order to get credit for your contributions, you’ll need to track these changes consistently and across multiple projects.

Using this canvas, you’ll be able to document your impacts all in one place, instead of needing to remember everything or track things down. With clear, concise evidence of how you influence your team, process, and even your organization, you’ll have a strong artifact for socialization (and you’ll get credit for your hard work).

Remember this canvas when completing your next project, brushing up your resume, or preparing for a performance review with your boss.

Now, go track down the meaningful CHANGE resulting from your hard work.



Devin Harold is a Sr. Manager—Head of UX Research for Consumer Digital Channels at Verizon. He leads a team focused on delivering foundational research to drive strategy within Sales, Assist, Account & Platform experiences across dotcom, native app, and conversational channels.

Outside of Verizon, Devin loves being involved in the UX community. He is a guest lecturer and critic for Carnegie Mellon University and a member of the Design Leadership Forum by InVision.
