You know that feeling when you spend a lot of money on a nice shirt, you wear it once, wash it, and then it’s never the same? I know, we’ve all been there. While we could say a lot about actual fast fashion, let us instead spend the next three minutes using fast fashion as one giant, extended metaphor to explain the importance of vetting the underlying data used for market research and analysis.

Good data creates good reports

Regardless of who is doing the analysis, we can all agree that any data visualization is better when it’s built on better data. It sounds obvious: good data generates good reports. The concept isn’t new (here’s looking at you, “Better ingredients, better pizza, Papa John’s”!), but it’s a newer idea when applied to price transparency data. We’ve yammered on about how complex price transparency data can be, but what about when you just want that data to tell you a couple of straightforward things? How do you know the data that went into your easy-to-read report was any good in the first place? Well, if data were a t-shirt, you could check the label inside to find out precisely what fabric it was made from. You probably know what fabric is generally good quality (100% cotton, for instance) and what is generally poor quality (like 100% polyester).

Sadly, it’s not that easy with data. Beyond egregious errors, how do you gut-check the data foundational to the reports you’re reviewing?

Work with your data partner to ensure the data is as good as it can be

The best way to know the health of the data informing your reports is to ask how your data partner works to enrich and clean their data. Take Turquoise, for instance: anyone can come to Rate Analytics, choose a few providers and a few payers, provide their codes of interest, and within an hour, have a report in hand.

But how do you know that the data informing those snazzy visualizations is worth visualizing? How do you know if you put good in and got good out? (Thanks to Minute Maid for that lil phrase!) We can’t speak for every other hotshot price transparency company, but we’ve worked really hard to ensure our data is in the best possible shape.

In addition to being transparent (ha, puns!) about how we work to enrich and clean our data, we also give users direct access to their visualizations’ source data. This is cool for many reasons, but for the sake of our metaphor, think of it like being able to both read the t-shirt’s label and feel the fabric yourself.

Then, make sure it will last more than a couple of washes

So now you have a better idea of the quality of the data informing your reports. Great! Lastly, make sure the capabilities of those reports will work as a lasting solution. If you’re working with a visualization tool that doesn’t allow you to upload utilization data, it likely won’t be a tool you can use long-term; eventually, you’ll end up supplementing it with something else. So make a list of everything you need this tool to do for you. For example, with Rate Analytics, you can quickly run high-level analysis across entire hospitals, look through inpatient and outpatient rates, or dive into code categories and individual codes for more granularity. You don’t need technical expertise to run reports and get the answers you need, which makes it an ideal solution for organizations that want a wide variety of visualization options without first consulting an analytics team. And we want you to confirm for yourself that our data is enhanced with outlier flags, network mapping, Medicare reference pricing, clear network labeling, and provider demographics (aka, feel the fabric and read the label).

So, there you have it! Fast-fashioned reports be gone and long live sustainable means of getting the job done.