
Balancing quantitative and qualitative user engagement

Written by Frederick O'Brien

For as long as there are things happening, someone, somewhere will want to quantify them — and someone else standing nearby can be relied upon to share their opinion that the first person is wasting their time. This is a fact of life.

The tension between qualitative and quantitative analysis is age-old. There are cautionary tales from both camps to be found in every walk of life, from tourism to theaters of war. In the digital age, we all stand to benefit from their lessons.

As account managers and frontend engineers alike are deluged by unprecedented amounts of data, the temptation to follow the numbers is stronger than ever. It requires great vigilance, but if we’re able to step back and harness that power for our own ends, the potential for improvement is incredible.

U.S. Secretary of Defense Robert McNamara famously went all in on a quantitative approach to the Vietnam War. It went badly.

Online, there is no series of metrics more revered than user engagement. It is the golden ticket to understanding how people interact with your site, product, or service. The metrics take many forms — traffic, conversion rate, time on page, downloads, shares, email signups, and so on.

To an extent, the metrics are shaped by what you do. User engagement on a music streaming app like Spotify would look very different from that of an affiliate marketing website, for example, but there is plenty of overlap.

Getting the most out of user engagement's myriad data points is a balancing act. At a time when developers have more tools than ever at their disposal, it is just as essential to know when not to use them and what questions to ask. More often than not, targeted use — combined with experience and expertise — is far more valuable.

TL;DR

  • Raw data is deceptively slippery. Put metrics in their proper context and never let them run the show.
  • User engagement data is only as good as the scrutiny it is subjected to.
  • Whenever in doubt, remember Goodhart’s Law: when a measure becomes a target, it ceases to be a good measure.


The age of big data

Data processing power has grown exponentially since the turn of the millennium. Today, all manner of sophisticated analytical tools are available, oftentimes for free. This, by and large, is a really good thing. Developers can find and fix problems more easily, editorial types can see what content is clicking with readers, and KPI aficionados have more numbers to put into more spreadsheets. Everyone’s happy.

Only it’s never that simple, is it? As the stature of big data has grown, so, too, has wariness about its potential to stifle good decision-making. As Kenneth Cukier and Viktor Mayer-Schönberger summarize in “The Dictatorship of Data,” a 2013 essay for MIT Technology Review, “The threat is that we will let ourselves be mindlessly bound by the output of our analyses even when we have reasonable grounds for suspecting that something is amiss.”

This is something most of us will be familiar with. I’ve worked with otherwise brilliant professionals who have slipped so deep into the numbers that they stop seeing users as people. Instead, they become a kind of vague mathematical challenge.

You would do well to top the example set by Marissa Mayer, who, at one point during her time at Google, decreed that 41 shades of blue be tested to find out which one users preferred. Granted, Google blue is a pretty nice blue, but there comes a point where you're wasting your own time and everyone else's.

Painstakingly optimized blue is the best blue, I guess. (#4285f4, if you were wondering.)

It has always been an easy mindset to fall into, and the accessibility of user engagement metrics multiplies that risk several times over. The potential of big data is (ironically) immeasurable, but that does not mean it deserves unconditional deference. As Jerry Muller summarizes in his 2018 book The Tyranny of Metrics, “Not everything that is important is measurable, and much that is measurable is unimportant.”

Be the dog that wags the tail

Believe it or not, I’m not here to bash quantitative data. It is an invaluable resource, and in the realm of user engagement, the breadth and quality of tools available are unparalleled. Just don’t let them be the tail that wags the dog. Below, we’ll break down two common examples of user engagement data and the wariness they require.

Traffic

Anyone who’s worked on the web knows this one. Traffic is king. Page views and unique users are key to sales, subscriptions, advertisers, conversions, and all else that is great and good. On the surface, this seems like a no-brainer. Lots of traffic is good, right? Having more traffic month over month is good, right?

Well, it depends. I don’t think anyone would want to see their traffic in decline, but treating it as a purely quantitative metric can lead to bizarrely inhuman choices. For example, you know those tiny articles inexplicably spread across multiple pages? They’re textbook examples of quantitative user engagement dictating behavior. The practice may lead to more page views in the short term, but it’s bad for UX, it’s bad for writers, and, eventually, it’s even bad for advertisers.

This particular imbalance is nothing new. For as long as publications have sold advertising space, they have desperately sought to pad their circulation numbers. More readers means you can charge more, after all. Is high circulation still a good thing when 41 percent of it is phony, as was the case with the Wall Street Journal Europe between 2009 and 2011? Probably not. Traffic for traffic’s sake can lead to thin content, misleading SEO, and threadbare development.

So what’s the right balance? Oftentimes, it can be as simple as cross-referencing data points and putting each in their proper context. If traffic skyrockets but return rates plummet, maybe things aren’t as rosy as you’d like to think. Beyond that, you have to be willing to cut yourself adrift from data altogether and ask questions that can’t be measured. Is there a community around your brand that trusts you? What external factors may be driving people to you?
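To make that concrete, here is a minimal sketch of what cross-referencing might look like in practice. The MonthlyStats shape and the numbers are hypothetical stand-ins for whatever your analytics export actually provides; the point is that the two trends only tell a story when read together.

```typescript
// Hypothetical monthly rollup; substitute your real analytics fields
interface MonthlyStats {
  month: string;
  pageViews: number;
  uniqueUsers: number;
  returningUsers: number;
}

// Print traffic alongside return rate so neither is read in isolation
function summarize(stats: MonthlyStats[]): void {
  for (const s of stats) {
    const returnRate = (s.returningUsers / s.uniqueUsers) * 100;
    console.log(`${s.month}: ${s.pageViews} views, ${returnRate.toFixed(1)}% returning`);
  }
}

// Views double while the share of returning users halves: growth a raw
// traffic chart would celebrate and a return-rate chart would question
summarize([
  { month: "2020-01", pageViews: 50_000, uniqueUsers: 20_000, returningUsers: 8_000 },
  { month: "2020-02", pageViews: 100_000, uniqueUsers: 48_000, returningUsers: 9_600 },
]);
```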

Increased page views and unique users are worthwhile things to aim for. Just remember, they’re not numbers, they’re people.

Heat maps

Visual analytics are an altogether different game. Tools like heat maps have a much more natural qualitative leaning because they allow you to see how people behave. They can remove the conjecture that comes with rawer forms of data. As with any analytical tool, though, they’re not the be-all and end-all. To get the most out of them, you must stay alert to their limitations.

Just as with surveys, low sample sizes usually make the results dubious at best. Before you even get into the nitty-gritty of the results, know where they’ve come from. Are they the result of 200 sessions or 200,000? What devices were they on? Where did the visitors come from? Someone coming to read your blog will likely have different browsing behavior than someone keen to demo your software.
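As a sketch of that vetting step, the snippet below segments sessions by device and flags any segment too small to trust before you read its heat map. The Session shape and the 500-session threshold are assumptions; pick numbers you can defend.

```typescript
// Hypothetical session record from your analytics tool
interface Session {
  device: "desktop" | "mobile" | "tablet";
  referrer: string; // e.g. "blog", "search", "demo-page"
}

const MIN_SAMPLE = 500; // arbitrary cutoff; choose one you can justify

// Count sessions per value of a given field
function countBy(sessions: Session[], key: keyof Session): Map<string, number> {
  const counts = new Map<string, number>();
  for (const s of sessions) {
    const k = String(s[key]);
    counts.set(k, (counts.get(k) ?? 0) + 1);
  }
  return counts;
}

// Flag segments whose heat maps are built on too few sessions
function vetHeatmapSegments(sessions: Session[]): void {
  for (const [device, n] of countBy(sessions, "device")) {
    const verdict = n >= MIN_SAMPLE ? "large enough to read" : "too small to trust";
    console.log(`${device}: ${n} sessions (${verdict})`);
  }
}
```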

Eye tracking is another measurement that benefits from wider knowledge. People may be drawn to the top left of this page because it’s super interesting, but more likely, they’re just following the Gutenberg Principle.

As far as the heat maps themselves go, it’s again a case of digging that little bit deeper. Lots of clicks may mean people want to engage with your product, or it might mean your site isn’t as easy to navigate as you think it is. Your CTAs are having a torrid time, but is that a problem with their placement or with the copy leading up to them? Good questions have a knack for producing good answers.
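One illustrative way to pull those two readings apart: treat clicks on elements that actually respond as engagement, and clicks on inert elements as possible confusion. The Click shape here is hypothetical, a sketch rather than anything a heat map tool exports.

```typescript
// Hypothetical click record; "interactive" marks elements that respond to clicks
interface Click {
  selector: string;
  interactive: boolean;
}

// Split raw click volume into engagement vs. possible navigation confusion
function classify(clicks: Click[]): { engagement: number; confusion: number } {
  let engagement = 0;
  let confusion = 0;
  for (const c of clicks) {
    if (c.interactive) {
      engagement++;
    } else {
      confusion++;
    }
  }
  return { engagement, confusion };
}

// Two of three clicks land on a decorative image: a hot spot on the
// heat map, but hardly a sign of a healthy UI
console.log(
  classify([
    { selector: "#signup-cta", interactive: true },
    { selector: ".hero-image", interactive: false },
    { selector: ".hero-image", interactive: false },
  ])
); // { engagement: 1, confusion: 2 }
```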

Also understand that not everyone involved is necessarily pulling in the same direction. A sales manager and a writer can look at the exact same heat map and come away with very different conclusions. There’s no malice to it; it’s just a fact of people working on different things. To get the most out of visual analytics, different teams need to communicate with each other and agree on what the priorities are. If they don’t, then heat maps risk becoming colorful Rorschach tests.

Again, depending on your goals, these variables may not matter so much. A glitch is a glitch is a glitch, and if all you need is one replay to find it, then all the better. It all comes back to making the data work for you, and not the other way around. As Jerry Muller writes:

“I can’t see how competent experts could ignore metrics. The question is their ability to evaluate the significance of the metrics, and to recognize the role of the unmeasured.”

Eternal vigilance

With these and any other user engagement data, there is no magic formula for getting the balance right. As much as anything else, it entails a frame of mind. There are times when you will be better served by The Elements of Style than by any spreadsheet.

User engagement metrics are incredibly powerful tools, but we remain the craftspeople. If in doubt, a good rule of thumb can be found in Goodhart’s Law: when a measure becomes a target, it ceases to be a good measure. The tension is never going away, so our vigilance shouldn’t either.

Ask questions, combine data points, ignore data points, step outside accepted parameters, and see what happens. Communicate with other departments to find common ground and be clear on where you differ. Master data; don’t let it master you.

Plug: LogRocket, a DVR for web apps

 
 
LogRocket is a frontend logging tool that lets you replay problems as if they happened in your own browser. Instead of guessing why errors happen, or asking users for screenshots and log dumps, LogRocket lets you replay the session to quickly understand what went wrong. It works perfectly with any app, regardless of framework, and has plugins to log additional context from Redux, Vuex, and @ngrx/store.
 
In addition to logging Redux actions and state, LogRocket records console logs, JavaScript errors, stacktraces, network requests/responses with headers + bodies, browser metadata, and custom logs. It also instruments the DOM to record the HTML and CSS on the page, recreating pixel-perfect videos of even the most complex single-page apps.
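For reference, a minimal setup might look something like the sketch below, following LogRocket's documented Redux integration. The app ID is a placeholder for the one in your dashboard, and the counter reducer exists only to keep the example self-contained.

```typescript
import { createStore, applyMiddleware } from "redux";
import LogRocket from "logrocket";

// Placeholder app ID; use the one from your LogRocket dashboard
LogRocket.init("your-org/your-app");

// A trivial reducer so the example stands alone
const counter = (state = 0, action: { type: string }) =>
  action.type === "INCREMENT" ? state + 1 : state;

// Attach the Redux plugin so dispatched actions and state changes
// show up alongside the session replay
const store = createStore(counter, applyMiddleware(LogRocket.reduxMiddleware()));

store.dispatch({ type: "INCREMENT" });
```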
 
Try it for free.


