Knowing the difference between these terms might be elementary to some. Admittedly, the root words pretty much sum up their meanings, but understanding the role each plays in telling your data's story can make your analysis powerful.

To Quantify Means to Measure

If I needed to buy a new sofa for my living room, I would measure the space with a tape measure. No matter how many times I measure, and no matter who does the measuring, the data would never change. The dimensions are what they are, as long as everyone uses the same standard of measurement. This is an example of quantifiable data.

I could argue with my family about whether the room is big or small, depending on how each of us perceives the quantifiable data, but the data itself is unaffected by my family's opinions. That is pretty simple.

When Feelings Get in the Way – Qualitative Data

Qualitative data gets a little sticky. Determining the quality of something can be completely subjective. As an example, my friend Barbara loves silk shirts. She thinks the feel of the material on her skin is second to none. She says that one of the qualities of silk is its luxurious texture. She would say that silk clothing is the best kind you can buy.

I do not like silk. To me, it is a slippery material that rubs my skin the wrong way. I would not say that it is a great material to wear.

Arguing the point between the two of us would be completely pointless; we will never agree. Why? Because the quality of what we are each experiencing depends on the individual "user."

Think of quantitative data as the “what” and qualitative data as the “why.”


Now That We Have Qualitative and Quantitative Data Settled

When conducting user experience (UX) research, both qualitative and quantitative research are needed to understand where the problems are, how to empathize with the end users, and propose solutions that ease their pain or create moments of delight and surprise.

Most organizations, on the sales end, would simply say that their ideal customer falls into the top three pools of job titles and industries that typically buy from them. Based on that assumption, they dive headlong into crafting communication that "speaks their language." Does it really? Have they asked their customers? Are there hard numbers and metrics that prove the theory of who the ideal customer is and which business issues they need resolved?

Without quantitative data that establishes baseline metrics and shows improvement (or the lack of it), most organizations go down a rabbit hole of educated guesses about where users are having issues, missing many opportunities to engage along the way. And without a qualitative understanding of how users feel — that is, what they are experiencing — we cannot discuss the problem in real terms. Once we have both, we are best positioned to develop the innovations or incremental improvements needed to address our customers' needs.

The issue is that needs and pain points may be unspoken; it is not always easy for research subjects to articulate their concerns, which is why it takes specialists with backgrounds in the cognitive sciences and the creative arts to uncover them.

Neglecting to gather that information from your constituents leaves the door wide open for customer loyalty to erode, because customers do not feel that your product, service, or communications are speaking to them.

Why Does this Matter for Client Acquisition and Retention? A Story from Sales

The old adage, "Seek first to understand, then to be understood," is sage wisdom that should be carried out in every aspect of your business for the best possible client acquisition and retention.

In my tenure as a sales professional, I had the opportunity to champion an online learning platform that my company sold to its existing customer base as part of an overall IT consulting solution. It was my job to promote this valuable tool to our sales representatives and to help evangelize the new solution to our existing customers.

This online platform theoretically had everything you would need, as an IT director, to assess, plan, and prescribe the correct IT training for your direct reports so that their proficiencies would increase and future IT migrations would have a greater chance of success. Sounds great, doesn't it? The solution promised to save our customers thousands of dollars in IT budget, and we scrambled to get them on board. It was so comprehensive and so robust that I was flown to the solution provider's training headquarters in D.C. to receive the proper training to support the program both internally and for our customers.

There was one small problem with this gigantic online platform: it was the most cumbersome and confusing thing I had ever worked with. Despite my seeming inability to "get it," I was still in charge of selling it, so we made appointments and I did my very best Ginger Rogers impression, tap-dancing around the platform's glaring human-centered holes in an effort to garner interest.

Sure, we got customers on board, but at a great cost. Customers flooded my department with questions and frustrations as they tried to use the product to assess their staff. Future consulting business with these customers depended on the successful use of this platform. After six months of trying to make it work and to find a simple workflow for our customers to follow, we abandoned the program. Business was lost, customer retention was damaged, and our credibility in the marketplace was diminished.

The provider of the solution, after spending millions of dollars, scrapped the project altogether. I shudder to think of the enormous cost of this failed project, all for one simple reason: no one did their research, and nothing about the solution was based on qualitative or quantitative data from the customer base.

Guessing loses market share and kills customer loyalty. Ask customers what they want, and take the further step of finding out why they feel those things will alleviate their pain. You will likely be pleasantly (or alarmingly) surprised by what you hear and by how much of a difference you can make for your customers. Qualitative and quantitative research gives teams what they need to pinpoint the issues and formulate a plan, so your project team can swoop in and save the day.

Resources for Further Reading


Research Design: Qualitative, Quantitative and Mixed Methods Approaches – John W. Creswell

Qualitative Inquiry and Research Design: Choosing Among Five Approaches – John W. Creswell

Lean Analytics – Alistair Croll & Ben Yoskovitz

Search Analytics for Your Site: Conversations with Your Customers – Lou Rosenfeld

Interviewing Users: How to Uncover Compelling Insights – Steve Portigal

Measuring the User Experience – Tom Tullis & Bill Albert


Qualitative versus Quantitative Research: Key Points in a Classic Debate

3 Surprises from Combining Qualitative and Quantitative Research

What Testing Won’t Tell You – UX Booth

Interaction Design Foundation: Ethnography – Dave Randall and Mark Rouncefield

Techniques for Data Gathering – Ann Blandford

An Ethnography Primer – AIGA

Scholarly Articles

National Science Foundation: Qualitative and Quantitative Data Collection Methods

University of Wisconsin Eau Claire: Data Collection Methods