In the EdTech industry, data drives business decisions — and that’s a good thing. Leaning into facts and metrics prevents you from being sidelined by assumptions and blinded by bias. Plus, data is available for your whole team to use. Shared data points can keep your team aligned and create ways for them to collaborate and connect.
But data can be incredibly difficult to sort through; it’s not automatically helpful and instructive.
You want to build a product powered by premium data fuel that goes the distance. When you partner with an external UX research team, you gain a more nuanced understanding of your data. You’ll also know which data is trustworthy and relevant — and which isn’t — as you move forward with product initiatives.
Do You Understand the Story Your Data Tells?
Data is a valuable resource. There’s no debate about that. But it can be difficult to know how to use this resource responsibly and beneficially — especially when there’s an abundance of it, as tends to be the case.
You may have many research reports in a given year, and each report tells a story that can yield actionable insights. If you conduct all your research in-house, well, squeezing the right insights out of each data story can be tricky because:
- The data can be overwhelming. The sheer volume of data may leave your researchers wondering where to start.
- Your researchers might be stretched thin and not have the bandwidth to apply statistical rigor to each data set.
- People are inherently biased and subject to internal pressures, including your well-intentioned in-house researchers. Bias skews the quality and usefulness of your research findings.
Volume, bandwidth, and bias: These are formidable obstacles for your team. So, too, is discerning what data matters — and how to best measure it.
Beware of Making Product Decisions Based on Vanity Metrics
Focusing on the wrong data can set your product back. You can waste time, energy, and budget heading in the wrong direction because you haven’t prioritized the right research findings. Ideally, data collection tools should help keep you on the right track. Sophisticated platforms like Google Analytics and Pendo provide opportunities to know how your users really interact with your product — at least in theory.
These tools can be avenues to key insights, but if used incorrectly, they can easily become dead-end streets. Why? Because of the allure of vanity metrics — statistics that look impressive on the surface but do little (if anything) for your business objectives. Vanity metrics aren’t relegated to social media likes and shares. They’re alive and well in the EdTech space, too.
Vanity metrics like the ones listed below lead to assumptions, shallow exploration, and rushed conclusions:
- Subscriptions and adoptions. Subscriptions and adoptions create revenue, and the numbers might sound good to stakeholders and board members. But subscription metrics don’t tell you whether individual user needs are consistently being met.
- Feature usage. Engagement metrics can be misleading. For example, new users may spend time exploring, but as they get familiar with your product, their engagement may drop sharply.
- Net promoter score (NPS). The standard NPS question (“How likely is it that you would recommend this product to a friend or colleague?”) is not an optimal one to ask EdTech product users. How many students do you truly believe will highly recommend a required class product to a friend? (The sketch below shows how the score is computed and how little a single number can reveal.)
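To make that last point concrete, here’s a minimal sketch in Python of how NPS is conventionally computed: respondents who answer 9 or 10 count as promoters, 0 through 6 as detractors, and the score is the percentage of promoters minus the percentage of detractors. The survey responses below are made up for illustration: two hypothetical cohorts land on the same score despite very different experiences.

```python
def nps(scores: list[int]) -> float:
    """Net Promoter Score from 0-10 survey responses.

    Promoters answer 9-10, detractors 0-6; passives (7-8) are
    counted in the denominator but cancel out of the numerator.
    The result ranges from -100 to 100.
    """
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Hypothetical cohorts (illustrative data, not real survey results):
polarized = [10, 10, 10, 10, 9, 0, 1, 2, 7, 8]  # devoted fans plus angry users
lukewarm  = [9, 9, 9, 9, 9, 6, 6, 6, 7, 8]      # mild approval, mild frustration

print(nps(polarized))  # 20.0
print(nps(lukewarm))   # 20.0
```

Identical scores, very different user realities. That’s the trouble with treating NPS as more than a conversation starter: the single number hides exactly the context a product team needs.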
Another glaring issue with vanity metrics is that they measure past performance. Checking the rearview mirror when making product decisions is smart. After all, you have to take time to evaluate what you’ve already done. But even if you experienced success, you can’t keep your eyes in the rearview.
Past success doesn’t mean rinsing and repeating those same processes will lead to product growth.
Strive for a More Balanced Data View
When you work with Openfield, we’ll help you keep your eyes on the road ahead. We pay close attention to both lagging and leading indicators and conduct both quantitative and qualitative research.
Lagging and Leading Indicators
Lagging indicators are outputs that confirm an in-progress or past pattern. Leading indicators are inputs that point to future events. Both are pertinent to your EdTech product decisions.
For example, if instructors are not visiting the reports page, your data will show low page views (lagging indicator). It’s good to know the page isn’t being used as intended. But based on this data alone, you might assume the reports page isn’t valuable to instructors and be tempted to throw it out.
After conducting appropriate research on the instructor reports page, we may determine that certain actions are necessary to drive instructor engagement in the future (leading indicator). Instead of throwing out a potentially powerful resource for instructors, we can work toward solutions that support and encourage their use of it.
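To illustrate how thin a lagging indicator is on its own, here’s a minimal sketch in Python. The event log, page names, and instructor IDs are all hypothetical stand-ins for what an analytics export might contain, not any particular tool’s format. The tally can flag the pattern; it cannot explain it.

```python
from collections import Counter
from datetime import date

# Hypothetical page-view events (user_id, page, day) standing in for
# an analytics export. All names and values are illustrative.
events = [
    ("instr_01", "dashboard", date(2023, 9, 4)),
    ("instr_01", "gradebook", date(2023, 9, 4)),
    ("instr_02", "dashboard", date(2023, 9, 5)),
    ("instr_03", "reports",   date(2023, 9, 6)),
    ("instr_02", "gradebook", date(2023, 9, 7)),
]

views = Counter(page for _, page, _ in events)
total = sum(views.values())

for page, count in views.most_common():
    print(f"{page:<10} {count:>2} views ({100 * count / total:.0f}%)")

# Output shows 'reports' trailing at 20% of traffic -- the lagging
# indicator. Nothing here says *why*: discoverability, redundancy,
# or unmet expectations. Answering that takes qualitative research.
```

The counts confirm the pattern; qualitative research, covered next, picks up where they leave off.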
Quantitative and Qualitative Data
Just as you need to be attentive to both past and future indicators, you need to conduct both quantitative and qualitative research. One without the other gives you an incomplete view of the reality of your product.
Quantitative research collects hard numbers. Numbers can be very compelling, but they need context. As far as UX research is concerned, observed numbers are just a starting point. Knowing a product page has low page views is critical. But more critical is our follow-up question: “Why?”
Our UX researchers carry out qualitative research in order to give proper context to the numbers and dig deeper. Qualitative research strives to understand user behavior, desires, preferences, and thought processes. Because of that, it yields deeper, richer insights that have the power to meaningfully shape successful products.
In the case of the instructor reports page, our researchers might ask:
- Does the reports page follow UX best practices?
- Is the reports page in the optimal location?
- What do instructors expect to be able to do on the page?
- Is the page redundant? Are instructors doing the same activities elsewhere?
Our researchers answer these questions through direct user research. Surveys, interviews, and focus groups are all part of understanding the numbers and driving toward an appropriate solution.
Your UX Research Team Separates the Data Wheat From the Chaff
Your data tells a story. An external partner like Openfield ensures you are able to listen to the story it tells. We’ll focus on the data that matters, trim out what doesn’t, and help you understand the difference.
Let’s talk about your product’s data story — and how to use it as fuel to drive your product into the future.