Nov 16, 2016 | Pia Opulencia
5 Reminders When Riding the Smart Data Discovery Wave
As originally seen on CIO Dive.
If you’ve spent any time in the Big Data, analytics, or BI space, you’ve probably heard the analogy, “Data is the new oil.” The phrase is meant to communicate the idea that data can be incredibly valuable, but only if used in the right way. Simply possessing oil, or data in this case, means little to your business - it’s what you do with it that counts.
Accordingly, a slew of smart data discovery tools has been introduced to the enterprise in an effort to monetize data. According to Gartner, smart data discovery is defined as “a next-generation data discovery capability that makes insights from advanced analytics accessible to business users or citizen data scientists.”
Essentially, these tools make it easy for the average user to find and understand the insights hidden in their data. Assorted tools offer a range of benefits, including streamlining data prep and automatically highlighting important findings via visualizations or narratives. These tools can do all of this without a user ever building a model or writing an algorithm.
Also interesting is that although new algorithms are introduced (and replaced) frequently, the general methodology used to analyze data and drive organizational change has not changed much over the years. The biggest change has been the trend toward automating these methodologies through smart data discovery tools, enabling organizations to scale their analytics practices.
From my perspective, informed business analysis practices are even more important with the introduction of smart data discovery tools for the enterprise. Formalizing, or even revisiting, best practices while adopting new tools is a must. Per our oil analogy, best practices may mean the difference between monetizing a new pipeline and suffering an offshore disaster.
Here are 5 reminders for any organization when adopting smart data discovery tools:
1) Look for Transparency and Auditability
As the focus of many smart data discovery tools is to automate parts of the extraction, cleansing, and analysis process, customers should look for tools that provide governance and auditability over that process.
The ability to identify who changed the data, what changed, and why - or to explain data inconsistencies - is paramount to trust in, and adoption of, tools within the enterprise. This is especially true in industries where this information is highly regulated, such as Financial Services and Healthcare.
Similarly, transparency into the analyses being run allows users to build trust in a system, which may ease adoption of emerging tools now that advanced techniques are hidden behind a simplified experience for the business user. An added benefit of transparency is a tool’s ability to describe the steps it took to arrive at a specific outcome, providing organizations with actionable insights rather than a “black box” they struggle to understand.
2) Domain Expertise Continues to Add Value
As more and more business users are enabled to use smart data discovery tools in their day-to-day decision-making process, it’s ever more important that domain expertise continues to be honed and applied. In traditional models, data teams worked with their business counterparts to define what, if anything, the data team should consider when applying various analytics to their data.
The emergence of smart data discovery tools changes this dynamic, potentially letting the tools do all the work. In reality, the opposite is true: the business expert is freed up to apply his or her domain expertise to the results of the analysis, answering questions like, “Are there active marketing campaigns impacting the analysis?” “What outliers should be removed from the data?” or “Is seasonality the reason for the outcome surfaced in the results?”
3) Document Your Assumptions (and Questions!)
Smart data discovery involves interactive exploration of different business questions. Whether findings are communicated via charts or narratives, the audience often has follow-on questions that deviate from those originally posed. Initial questions often branch out into many more as users explore findings and discover new questions to ask of their data.
A best practice in business reporting has been to document your assumptions to add context to results or action items. It is equally important to document what questions were asked of the data. Communicating the path to your outcomes lets the organization participate in analyzing those outcomes accordingly.
4) Track Actionability of the Results
As a follow-on to the importance of documentation, business outcomes should be tracked so that the analysis process can be adjusted as the business changes. But what about gauging whether the insights provided were actually actionable? Were the predictions the tool made accurate? Did the output provide a level of granularity that allowed for organizational change management?
Gathering feedback from the people consuming the analytical results is critical for ongoing improvement. Tracking the actionability of a tool’s output, in addition to the outcomes of any actions taken, enables a tight feedback loop for enhancing analytic practices.
5) Involve Your Data Scientists or Seasoned Analytics Professionals in Evaluations
Before a tool is brought into the organization, it is worth involving data scientists in the evaluation, not only to build trust and ease adoption but, importantly, to ensure the promised “smarts” are sound. Their expertise in data can help an organization avoid “bad math” or techniques that are “too magical,” and get a head start on any best practices or data training the organization may need before adopting a new tool.
The data science team can also use this time to evaluate how their own processes may change, looking for integrations that output to advanced tools so analytics can be augmented or expanded (and pushed back to production) as needed.
As the market changes and tool adoption widens, analytic methodologies and the respective skillsets need to be augmented and realigned with organizational objectives. By ensuring your organization has analytic best practices in place, regardless of the toolsets in use, teams have a better chance of deriving meaningful insights from their data.
Learn more about how Natural Language Generation helps enhance your ability to provide smart data discovery in our white paper, "The Automated Analyst: Transforming Data into Stories."