By Emily Anthony
Part 2 of our 3-part Data Integrity Series
ORI was recently at the Capital Data Summit in Washington, DC. The summit brought over 300 attendees from the DC metro region and beyond to discuss innovations in data and analytics. We heard from true “big data” gurus, who not only shared their findings and insights on the impact of the data explosion over the last decade but also helped us look ahead to see where data and analytics are headed over the next several years.
If you missed part one of our three-part data integrity series, we looked at some of the ways data integrity impacts a firm and how to identify missing data elements that matter to your organization. Now we want to explore how to successfully build a culture of quality data, drawing on some important insights from the summit’s keynote speakers.
Here are some things they had to say:
You Need Employees Who Understand Data
(even if they’re not data scientists)
According to the Harvard Business Review, companies are hiring Chief Data Officers (CDOs) now more than ever in order to remain competitive in the marketplace. That said, it’s not always easy for an organization to get used to the idea of investing heavily in data analysis and carrying that extra overhead.
If a company believes a dedicated “data person” is unnecessary or too costly for its business model, key decision makers still need a data strategy. Outsourcing data analysis can help match limited resources with intermittent data requirements. Regardless of the model, when executives (and their employees) know how to leverage their organization’s data to make better decisions, success follows.
Besides knowing about big data, there are a few important things to consider when using data to make decisions. We’ll start with the “V”s (rather than our “ps and qs”):
When in Doubt, Follow the 5 “V”s
If you’re not sure where to begin when it comes to working with and understanding big data, don’t stress. According to Bernard Marr, a leading expert on data and analytics, big data can best be described with 5 “V”s (the last being the most important):
- Volume – This refers to the sheer amount of data created every day. IBM estimated that we create about 2.5 quintillion bytes of data daily.
- Velocity – This is the speed at which data is generated, and how quickly it moves from one point to another. For example, Google now processes roughly 40,000 queries every second of every day. That’s fast.
- Variety – This refers to the many forms data now takes, such as structured versus unstructured data.
- Veracity – This is how reliable the data is. Is your data “messy” or “clean”?
- Value – This is the most important one. Are you able to take the data you’ve gathered and turn it into useful information? (For more info on this, check out a video we’ve posted about it here.)
These five “V”s can help you and your team pinpoint the data challenge you are actually trying to solve, which in turn leads to a stronger data strategy. Do we truly have “big data,” or is our data simply “big enough” that we cannot analyze it easily or efficiently? Perhaps the issue isn’t volume at all, but veracity. ORI often hears from organizations that have implemented a new data visualization tool like Tableau, only to discover that the data they can now see so clearly is simply wrong.
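To make the veracity point concrete, here is a minimal sketch of what a first-pass data quality spot-check might look like before any data reaches a visualization tool. The field names (`customer_id`, `email`) and the sample records are hypothetical examples, not from any real dataset; a real check would be tailored to your own schema.

```python
def veracity_report(records, required_fields):
    """Summarize missing values and exact duplicate rows in a list of record dicts."""
    missing = {field: 0 for field in required_fields}
    seen, duplicates = set(), 0
    for row in records:
        for field in required_fields:
            if not row.get(field):  # None or empty string counts as missing
                missing[field] += 1
        key = tuple(sorted(row.items()))  # hashable fingerprint of the row
        if key in seen:
            duplicates += 1
        else:
            seen.add(key)
    return {"rows": len(records), "missing": missing, "duplicates": duplicates}

# Hypothetical sample data with one missing email and one exact duplicate
sample = [
    {"customer_id": "1", "email": "a@example.com"},
    {"customer_id": "2", "email": ""},
    {"customer_id": "1", "email": "a@example.com"},
]
print(veracity_report(sample, ["customer_id", "email"]))
# → {'rows': 3, 'missing': {'customer_id': 0, 'email': 1}, 'duplicates': 1}
```

A report like this won’t catch data that is plausible but wrong, yet it surfaces the “messy versus clean” basics (gaps and duplicates) before anyone draws conclusions from a dashboard.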
Building a culture of quality data is a key step towards a customer-centric culture where every decision incorporates a customer perspective – stay tuned for the final blog in this series on Overcoming Silos to Strengthen Customer Outcomes.