I got to hear James Richardson speak on “The end of self-serve analytics” at the Data Changemakers event recently. I’m writing this recap because I agree wholeheartedly with James’ POV and there’s no recording available of the talk. James is an Analyst at Gartner in the Business Intelligence (BI) + analytics space, and gave a great overview of his perspective on the future of BI + analytics. (Plus he got to talk a lot about Kurt Vonnegut, which is always a nice bonus!)
There were three insights that really jumped out at me, and I’ll try to summarize James’ POV below:
- It’s not enough to make data available to your people, you must tell them data stories. And then even data stories aren’t enough; the real value of data stories is triggering conversation that leads to action.
- It’s hard for people to write good data stories, and it’s impossible for them to do it at scale. AI is necessary.
- Engagement with data is crucial, and users have high expectations for engagement and consumption experiences from their consumer lives.
What’s really exciting to me is how well these insights align with our design for our product, Lexio. Today, everyone is seeing disappointing returns on their BI initiatives because their internal users hate looking at dashboards. This is a huge problem for organizations making massive investments but seeing very little progress towards becoming a more “data-driven” organization. I’m a big believer in the size of this problem, as well as in our approach to solving it with Lexio (automated data storytelling AI + a modern UX that cribs from consumer experiences like Reddit, Twitter, and Apple News). But I’m also aware that I’m a biased observer! So it’s always nice to get such validation from an expert like James.
Most people prioritize their gut instincts over data, and sometimes that’s ok. But particularly when the environment is volatile and changing fast, our gut and instincts can lead us astray. “Volatile and changing fast” is basically the tagline for 2020, and it’s going to remain true for the business world probably forever.
So we really need to be making more of our decisions based on data, but today that’s not happening nearly as much as it should be. We can tell ourselves it’s because of low data quality, or not having a data culture, but as James said, “I think we’re letting ourselves off the hook a bit.”
There’s almost always a deeper issue at play: how the data is presented to a user. It’s irrelevant, it’s confusing, it looks bad, it’s hard to interpret, it’s got no insight, it’s stuck in a dashboard somewhere.
If the status quo for communicating data isn’t working, then what does work? As Daniel Kahneman says, “No one ever made a decision because of a number. They need a story.” In James’ metaphor, data stories can penetrate the “brain / data” barrier and lodge in our minds, where they can spark conversation and action.
So what are data stories? James used the definition from Brent Dykes, author of Effective Data Storytelling, which identifies three components of a great data story:
- visualization – the kind of chart or graph you might see in a dashboard today
- narrative – a written-out story that explains what’s happening and why. James made a great point about how no visualization ever goes unexplained in a management meeting. Folks say “Tell me what I’m looking at” even though they can see the viz themselves.
- context – the information your people have that isn’t captured by your data.
But, of course, even if a data story has a great narrative, a great viz, and the right context, it can’t actually do anything for your business. It’s just a thing. It’s your people that have conversations, that make decisions, and that take action.
What we all want is for those conversations, decisions, and actions to be informed by a deep understanding of what’s in the data. And it turns out that data stories are the best solution we have to make that happen.
Scaling with AI
If data stories are so great, then we should just do them all the time, right? They should be the main way our business understands and communicates data? The short answer is “Yes”, but James identified two big challenges that are keeping that from happening today.
The first is that it’s hard and time-consuming. James identified six(!) different steps that need to happen to create a useful data story, beginning with detecting an anomaly and ending with finally pulling the viz, story, and context together. It’s a lengthy process, and it requires analytical skills, domain knowledge, and the ability to communicate effectively. Because of this, great data storytellers are hard to find.
Even when you do have a person pull together and tell that story, they’re still affected by the cognitive biases we all have. Even when everyone is acting in good faith and doing their best to accurately tell the story, we’re all still inadvertently prone to decontextualizing information, confusing correlation with causation, and simply missing things.
James came to the same conclusion that we did at Narrative Science: turn to AI to scale data storytelling within the organization.
Delivering on engagement
Assume we now have AI generating the data stories that lead to data-driven conversations and actions across your organization. The next question is, “What’s the experience around these data stories?” Is it a dashboard, or is it something else?
James often says that the number one metric for success in analytics for your ordinary users is sustained adoption, and I couldn’t agree more. So that’s what the data storytelling experience should strive for: consistent, sustained adoption by users, so that day-to-day decisions are driven by data.
So, no, the overall experience can’t be dashboards. If there’s one thing we’ve learned about dashboards in the last 20 years, it’s that they’re really poor at driving consistent, sustained adoption.
But we have learned a ton about engagement in the last 20 years from consumer apps. And as James said, applying these lessons is super important. Everyone uses incredibly engaging experiences all day long, so their bar is very high. If any data experience really wants to engage its users, then it’s going to have to do the things we know work. Recommendations. Newsfeeds. Following. Sharing. Commenting. Notifications. Beautiful. Mobile.
You need these features because you need engagement. You need engagement because you need your people on top of the data stories that impact your org. You need them on top of data stories, because data stories are the best vessel we have to actually get what’s happening in the data into someone’s head. You need what’s happening in people’s heads, because that’s how you get data-driven conversation and action. You need data-driven action so that you actually get some value out of the huge investments you’ve made in your BI and data stacks. And those data stories need to be written by AI, because there’s no way you’ll be able to hit the scale you need with hiring or training.
James’ talk was energizing for me for two reasons. The first is that Lexio stacks up really well to James’ view of the future. It’s the world’s most advanced automated data storytelling system, and today we deliver on all three components of a data story: visualization, narrative, and context. We’ve also taken the engagement side seriously and delivered a user experience that is closer to Apple News or Twitter than Power BI or Tableau, with mobile, notifications, newsfeed, recommendations, etc.
But I was also really struck by the confidence that led him to say things like, “It is an inevitability that we move to far higher levels of automation in analytics, and we move away from the current dominant self-serve model.” With Gartner making predictions like “By 2025, data stories will be the most widespread way of consuming analytics, and 75% of stories will be automatically generated using augmented analytics techniques,” the future of Lexio is very bright!