Why Lexio Doesn’t Use Deep Learning to Write Stories

At Narrative Science, we’re super proud of the AI technology we’ve built. Lexio is the world’s only automated data storytelling system, and it works like this: based on an initial high-level question and what it knows about you, the system begins exploring the data on your behalf, chasing down leads, looking for drivers, and understanding possible impacts for the future. Then it pulls all of that information together into a well-structured story, writes it out in language, and chooses suitable visualizations to complete the story; Lexio then makes that story available for the user to read, share, and act on. The whole process takes a few seconds.

When people hear what our tech can do, they often assume that it must be built with some of that newfangled machine learning (ML) or deep learning (DL) that everybody’s talking about.

Nope! While Lexio is definitely powered by advanced AI (we’ve got 45 patents on it, with 40 more pending), it doesn’t use a bit of ML to write its stories. This was an intentional decision we made early in development, and one we continue to feel great about.

In this post, I’m going to talk about why Lexio doesn’t use machine learning or deep learning to write its stories, and what we use instead. Then I’ll explain how we leverage ML pervasively throughout the broader Lexio experience to deliver personalization and user value without sacrificing our commitment to accurate, reliable stories.

Our approach for story generation

Story generation falls under a branch of artificial intelligence called Natural Language Processing, or NLP. NLP is one of the hottest areas in AI right now and has seen massive progress in the last few years. In particular, OpenAI made a huge splash when it released GPT-3 in 2020. GPT-3 is the reigning king of a class of models called “Transformers”. (Not the Optimus Prime kind, sadly!)

These models all work similarly, and they all use deep learning. At the highest level, GPT-3 and similar models are trained on huge amounts of text pulled from the internet. Once training is complete, you give the model a “prompt” and it fills in the rest, producing text that statistically looks like everything else the model has read online.

For instance, you could start with a prompt of “Once upon a time”, which GPT-3 might complete with the rest of a story: “… there was a brilliant princess who lived in a magic castle. One day, she…” You’ve experienced this kind of interaction whenever Gmail suggests the rest of the sentence you’re typing. Behind the scenes, you’re giving a prompt of the first part of your sentence, and a Transformer is predicting the rest of the sentence based on all the other text it’s seen.
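To make this concrete, here’s a minimal sketch of that interaction using the open-source GPT-2 model through Hugging Face’s transformers library (GPT-3 itself is only reachable through OpenAI’s API, but the behavior is the same in spirit):

    from transformers import pipeline

    # Load a small open-source Transformer for text generation.
    generator = pipeline("text-generation", model="gpt2")

    # Give the model a prompt; it statistically continues the text.
    result = generator("Once upon a time", max_new_tokens=20)
    print(result[0]["generated_text"])
    # Possible output: "Once upon a time there was a brilliant princess..."
    # The continuation looks plausible, but nothing guarantees it is true.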

Transformers are just like any other technology: they have tradeoffs that make them appropriate for some use cases and inappropriate for others.

In particular, Transformer-based systems like GPT-3 work great for use cases where breadth and coverage are critical, and control and accuracy are not. These systems can produce basically any kind of text in any kind of context, so they can cover almost any use case. But there are only limited ways to control the output, and there’s absolutely no expectation that the stories are accurate or true.

You can’t expect systems like GPT-3 to be accurate because they have no access to a collection of knowledge about the world; they don’t know anything about calendars, time, mathematics, data, or anything else that isn’t text from the internet.

If you ask a person to complete the sentence “Today’s date is…”, they would look to a calendar for the answer. But GPT-3 doesn’t know about calendars. So while it’s happy to complete that sentence for you, it will do so with an essentially random date. It’s just as likely to say, “Today’s date is August 13, 2008”, “Today’s date is September 26, 1973”, or “Today’s date is July 26, 2021”.
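Contrast that with how a grounded program would complete the sentence: by consulting the system clock, not a statistical model of text. A two-line sketch:

    from datetime import date

    # Completing "Today's date is..." by looking it up, not predicting it.
    print(f"Today's date is {date.today():%B %d, %Y}.")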

You see the same screwiness in the BI context. GPT-3 will complete the sentence, “Last week, total sales were…”. But the number it fills in won’t come from a database or any analysis; it’ll just be made up by the system. GPT-3 might say “Last week, total sales were $42k”, “Last week, total sales were $42b”, or even “Last week, total sales were much higher than expected.”

The text output by these Transformer-based systems usually “looks right” (in the sense that it’s statistically similar to most text seen on the internet), but just looking right and actually being right are very different.

For Lexio, being right is absolutely critical, because the whole point of Lexio is giving users a better sense of what’s happening in the data so they can make better decisions day-to-day. For that to happen, our readers need to trust that Lexio’s stories always accurately reflect the data. If Transformers generated data stories that were 90% accurate, it would be really impressive; if Lexio generated stories that were 90% accurate, it would be a complete deal breaker for our customers.

So we’ve made the commitment that Lexio’s stories always 100% accurately reflect what’s in the data. This requirement has a huge impact on the approaches we can take to generate Lexio’s stories. Rather than building out a Transformer-based system, we’ve taken an algorithmic approach that relies heavily on knowledge and semantics.

In short, Lexio knows a lot about BI; it knows a lot about what’s in the customer’s data; and it knows a lot about how to tell a data story. Because of everything Lexio knows, we can guarantee the accuracy of Lexio’s stories: the numbers in stories don’t just look right, they actually are right.
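To illustrate the difference with a toy sketch (our simplification here, not Lexio’s actual implementation), consider the “last week’s sales” sentence again. In a knowledge-driven approach, the number in the sentence is computed directly from the data, so it can’t help but be right:

    from datetime import date, timedelta

    # Hypothetical sales records: (date, amount in dollars).
    sales = [
        (date(2021, 7, 19), 18_000),
        (date(2021, 7, 21), 15_500),
        (date(2021, 7, 23), 8_500),
    ]

    def last_week_total(records, today):
        """Sum the amounts that fall in the 7 days before `today`."""
        start = today - timedelta(days=7)
        return sum(amount for day, amount in records if start <= day < today)

    total = last_week_total(sales, today=date(2021, 7, 26))
    print(f"Last week, total sales were ${total / 1000:.0f}k.")
    # -> "Last week, total sales were $42k." -- computed, never guessed.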

Delivering customer value through ML

But Lexio is much more than a piece of technology that can generate accurate data stories: it’s an entire consumer-facing app that was built to engage users and win a spot on their home screen. To deliver on this goal, we’ve built a simple, proactive, and personalized user experience that uses ML in a variety of ways to deliver the modern experience that consumers expect.

In this way, we’re able to have our machine learning cake and eat it too. Our stories are generated with an ML-free, knowledge-heavy approach that guarantees their accuracy and reliability. But we still use ML throughout the rest of the Lexio experience to deliver big user value.

Best of all, Lexio’s knowledge-heavy approach allows us to scale our ML in a way that other BI companies can’t, because Lexio’s knowledge is global: we can aggregate ML learnings across Lexio, above the vagaries of any particular customer’s data. Customer data is never shared with or exposed to other customers, but the necessary configuration and user interactions are used to train models that make Lexio better for all customers and users in the future.

Lexio learns probabilities and patterns from how customers and users interact with it, and applies what it learns to two main aspects of the user experience.

1) Focusing the user

The first aspect is using ML to ensure that our readers are aware of the data and stories that matter most to them. Lexio is not a search-based experience; users are busy and distracted, and we don’t think they should have to come up with just the right question to ask. Metaphorically speaking, we’re building Lexio to be a friendly and knowledgeable sommelier, not a huge list of expensive, intimidating wines.

So, like a good human analyst, Lexio “leans in” and proactively focuses the user’s attention on what is most important. This idea of focusing the user applies at both the macro and micro level: from triggering a push alert when a story needs their attention, to including a short personalized sentence in a broader story so the reader can understand how their efforts contributed to the company’s success.

These are all essentially optimization problems, which are often a perfect fit for machine learning. As we continue building out Lexio, we will use ML in the near term for capabilities like the following (a toy sketch of the first appears after the list):

  • Ordering stories in the Newsfeed based on what stories have been read in the past
  • Triggering mobile push alerts when Lexio has high confidence in a user’s interest in a story
  • Surfacing follow-up questions based on common exploration “paths” of the data
  • Predicting which analytics (and which results) are most likely to be valuable to a reader
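To give a flavor of that first capability (a toy framing of ours, not Lexio’s actual model), Newsfeed ordering can be cast as a learned ranking problem: score each story by the predicted likelihood that this reader will read it, then sort:

    from dataclasses import dataclass

    @dataclass
    class Story:
        metric: str          # e.g. "bookings"
        about_reader: bool   # does the story mention the reader's own work?
        age_hours: float

    # Hypothetical affinities that would, in practice, be learned from
    # read/skip interactions aggregated across users and customers.
    affinity = {"bookings": 0.8, "pipeline": 0.5, "churn": 0.3}

    def read_score(story: Story) -> float:
        """Predict how likely this user is to read the story."""
        score = affinity.get(story.metric, 0.1)
        if story.about_reader:
            score += 0.5                 # personal relevance boost
        score -= 0.02 * story.age_hours  # fresher stories rank higher
        return score

    feed = [
        Story("churn", False, 2.0),
        Story("bookings", True, 6.0),
        Story("pipeline", False, 1.0),
    ]
    feed.sort(key=read_score, reverse=True)
    print([s.metric for s in feed])  # -> ['bookings', 'pipeline', 'churn']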

By using ML to adapt to a user’s behavior, as well as the behavior of their coworkers and peers at other companies, we ensure that Lexio is viewed as both useful and respectful of a user’s time and attention.

2) Lowering time-to-value

The second big aspect of the UX where ML is invaluable is lowering the configuration effort required. Before Lexio can write insightful data stories for a particular customer, Lexio needs to know some things about the organization and its data. This configuration work is typically done by an analyst with knowledge of the data, and includes information like:

  • What do different columns in the data represent?
    • E.g. the “sp_name” column represents Salespeople.
  • What are the different relationships between columns in the data?
    • E.g. revenue is an input to profit.
  • What are different areas of the business and how do people evaluate them?
    • E.g. An area is “West Coast sales” and people primarily evaluate it by # of deals, total bookings, and open pipeline.

It’s our goal to automate as much of this kind of configuration as possible. While the analyst will always have ultimate control, we want them to be reviewing Lexio’s best guesses and correcting where appropriate.

These kinds of challenges are a perfect fit for machine learning, and we’re already using ML to provide strong “first guesses” from Lexio that an analyst can overrule if necessary. For instance, Lexio will automatically match columns in the customer’s data to elements of our global Knowledge Base, and will suggest preferred metrics to new users based on the preferred metrics of existing users within the organization. We will continue to invest in using ML to drive down the amount of upfront configuration required so that our customers can begin reading insightful data stories as quickly as possible.
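As a rough sketch of what that column-matching step might look like (the aliases and matcher below are our illustration, not Lexio’s actual Knowledge Base), fuzzy string matching against known aliases yields a confidence-scored first guess for the analyst to review:

    from difflib import SequenceMatcher

    # Hypothetical Knowledge Base entries: entity types with aliases
    # accumulated from how previous customers mapped their columns.
    KNOWLEDGE_BASE = {
        "Salesperson": ["sp_name", "sales_rep", "account_exec"],
        "Revenue": ["revenue", "rev", "total_sales", "amount"],
        "Region": ["region", "territory", "geo"],
    }

    def suggest_entity_type(column):
        """Return the best-guess entity type and a similarity score."""
        best, best_score = None, 0.0
        for entity, aliases in KNOWLEDGE_BASE.items():
            for alias in aliases:
                score = SequenceMatcher(None, column.lower(), alias).ratio()
                if score > best_score:
                    best, best_score = entity, score
        return best, best_score

    print(suggest_entity_type("sp_name"))    # ('Salesperson', 1.0)
    print(suggest_entity_type("total_rev"))  # ('Revenue', 0.7)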

Because of Lexio’s global knowledge, our machine learning efforts in the UX are able to learn from one customer and apply those learnings to the next. That means when Lexio is ordering your Newsfeed, it’s applying what it’s learned from you, from other users in your org, and from every other user in Lexio about what stories and insights matter most. When Lexio is suggesting Entity Types for the columns in your data, or the Relationships that hold between them, it’s leveraging what it’s learned from how other customers have mapped the columns in their data. When a new customer signs up with Lexio, they’re not getting a blank slate or a new system they have to build out and configure from scratch; they’re buying an intelligent system that has already learned a great deal from all of our current customers and users, and gets smarter whenever someone interacts with it.

Our vision for Lexio

At its core, Lexio uses an algorithmic, knowledge-heavy, non-ML approach to consistently generate accurate data stories. Surrounding that core is a modern UX with a ton of optimization problems that we are tackling with ML. And it’s all backed by a global knowledge base that lets Lexio think, write, and learn with commonly understood ideas (Revenue, Sales Pipelines, Products, etc.) instead of the quirks of whatever data schema a customer is using.

And that is ultimately our vision for Lexio as an intelligent system: rock-solid, reliable, and trustworthy data stories, with a thick layer of scalable ML optimization on top to make configuration simple and the experience super valuable and engaging for our customers. We’ve laid the foundation, we’re making rapid progress, and we’re super excited about the accurate insights and experiences Lexio will be providing our readers in the coming months and years!

Learn more about Lexio.
