Data Storytelling 101

Ready to become a better data storyteller?

This is your guide to leveling up and delivering data insights in a way that everyone can understand and act on. No frills. No fluff. Just actionable advice and lessons from 12 leaders in the data & analytics space.

Download your digital copy to learn how to tell a better story, sustain a data culture, differentiate yourself as an analyst, and more. 


Introduction

Amid the coronavirus (COVID-19) outbreak, many tradeshows and events were canceled. We wanted to do our part and continue to cultivate education for our friends, customers, and partners.

 

Thus the Data Storytelling Virtual Summit was born.

 

Data storytelling is the fastest and easiest way to empower your team to both understand and act on data through the power of stories. Data storytelling software takes your data, analyzes it, and turns it into plain-English stories. Then, those stories are sent to you in a newsfeed-style experience that ensures you get the information you need to run your business at all times.

 

We held a virtual event for analytics leaders and Revenue Operations teams. It was two days, jam-packed with 18 different speakers, all aiming to bring innovators and analytics leaders together for an event focused on all things data storytelling. The Data Storytelling Virtual Summit gave attendees the opportunity to learn about industry trends, share input, and build relationships with their peers virtually. They learned how to become effective data storytellers by taking their data and turning it into insights that anyone can understand.

 

We found the sessions so inspiring that we decided to turn them into an ebook. And voila, here it is. The chapters in this book cover everything from how to tell a better story, to what the future of work will look like, to how to distinguish yourself as an analyst.

Table of Contents

  1. How to Tell A Compelling Data Story with Zack Mazzoncini, Founder of Data Story Academy and Co-founder of Decisive Data
  2. Spreading Facts, Not Fear! How Data Storytelling Helps Us Understand COVID with Dan Platt, Senior Principal of Market Innovation at Narrative Science
  3. The Future of Work: Applying Artificial Intelligence to Your Data Leads to a Culture of Openness & Empowerment with Nate Nichols, Chief Scientist at Narrative Science
  4. Telling Your Data Story with the 3Vs: Vocabulary, Voice, & Vision with Scott Taylor, The Data Whisperer
  5. The Science of Video & the Metrics That Matter with Ethan Beute, Chief Evangelist at BombBomb
  6. Sustaining a Data Culture with a Distributed Team with Donald Farmer, Principal at Treehive Strategy
  7. Promoting & Governing BI Tools in your Organization with Tommy Puglia, Senior Manager BI at CompTIA
  8. What Disney Can Teach Us About Data Storytelling with Chris Wagner, Analytics Architect at Rockwell Automation
  9. Language and The Machine: Words vs Meaning with Kris Hammond, Chief Scientist & Co-Founder of Narrative Science
  10. How To Tell A Story with Larry Birnbaum, Chief Scientific Advisor and Co-Founder of Narrative Science
  11. Differentiate Yourself as a Data Analyst: The Story-First Approach to Better Dashboards with Keelin McDonell, Principal Product Manager at Convoy Inc.
  12. Maximizing the Impact of Your Data with Charles Holive, Head Monetization Strategist at Sisense

How To Tell A Compelling Data Story

Zack Mazzoncini is the Founder of Data Story Academy and a Co-Founder of Seattle-based data and analytics firm Decisive Data. Over the years, Zack has helped hundreds of organizations and individuals develop data-driven cultures centered around data storytelling. Zack graduated from the University of Washington with a degree in communications, rhetoric, and public speaking. He is considered one of the most entertaining and informative speakers in the analytics industry. His audiences are inspired to find the most important stories in their data and lean into their creativity.

Most people use data in their jobs, but not everyone knows how to communicate and present their insights clearly. Knowing how to tell a data story is critical to ensuring people will take action on your findings. Why a story? Our brains are wired to be drawn in and connect to stories, so people remember them and act on them.

However, there are a couple challenges when it comes to creating these stories:

• Identifying executive priorities is difficult. How do you know what is most important?

• Clear visualizations are an art. How do you design the right stories?

• Telling an effective data story is essential. How do you deliver the insights?

There are three tools Zack recommends to transform confusing data into stories that create clarity and inspire action.

1. The Blueprint

Find the right data stories

Ask yourself the following questions before even beginning the story: What questions will my story answer? What actions will be taken as a result of my story? What results does the reader expect?

2. The Canvas

Design the right data stories

First, you need a purposeful layout for your story:

• Follow the intuitive path. Ask yourself where the eyes naturally lead.

• Use intuitive visuals. Aim to create quick cognitive insights with your visuals.

• Use color purposefully. Color choice should draw attention to what is important and lead the eye to those areas.

• Allow for flexibility in your story. Be prepared for additional questions based on what you present; you must be able to stick and twist until the most useful story is being presented.

• Simplify and eliminate distractions. Allow the audience to focus on what matters and remove unnecessary distractions like borders or images.

• Never underestimate the power of concise, clear text to propel a story; it is pivotal to getting the insights across.

• Optimize your stories for sharing. Understand how the story will be used after the presentation. Do they need it on mobile? Do they want to email it?

3. The Story

Deliver the insights

The first 30 seconds of your story matter the most. Studies show we gain or lose the audience in the first 30 seconds, so you must do something to grab their attention! Once you have their attention, you have to be able to keep it. Bring emotion and passion: show the group why you are interested in this data story, and your passion will spread into the room. Finally, practice! Presenting and delivering compelling data stories is hard, and practice will ease those nerves and result in a better delivery.

 

Following this three-step process will help anyone become a better data storyteller. The instinct to be a storyteller is already inside all of us. It is time to take those inherent skills and combine them with the data that is driving our world today.

Spreading Facts, Not Fear! How Data Storytelling Helps Us Understand COVID

Dan Platt is the Senior Principal of Market Innovation at Narrative Science, which creates software that writes stories from data to drive understanding and results. He is the leader of Narrative Science’s Data Storytelling for Good initiative, which aims to drive positive social outcomes with the company’s technology and products. Since joining Narrative Science in 2010, he has served in a variety of roles including product management, professional services and customer education.

 

Data storytelling is not just for understanding business data. It can affect how we understand the whole world. Our lives are full of data: every time we go to the grocery store, we do data analysis to compare prices. Some data is easy to understand and act on; other times it can be extremely difficult.

 

Coronavirus data is an extremely relevant example of heavy data. There is simply so much of it that it is impossible for humans to understand alone. It can be hard to even know where to start. Do you want to look at the bend or flattening of the curve, the number of beds available, or the Johns Hopkins live map?

 

Data storytelling can help take this dense information and make it understandable and manageable to readers. It makes data consumption easier by providing easy to read stories with the important information up front.

 

Data storytelling can also help ensure the data you are reading is not being skewed or given an angle. Our perception of data is easily changed by the way it is presented to us. Design choices affect much of what we infer from visualizations. Colors, shapes or sizes can change the apparent meaning of the data.

 

This is an uncomfortable truth as we attempt to gain an objective understanding of data that is critical to our survival and our daily lives. It can make people even more uncomfortable with data, making them feel like they're being duped. We believe that the data we receive should be objective so that the narrative can be determined by each individual for themselves.

 

At Narrative Science we believe in the power of storytelling and that stories allow us to connect with information on a higher level. However, most of the stories surrounding COVID-19 are filled with masses of data that are not always easy to understand. It can also be incredibly difficult to know which stories are reliable. Numbers can always be fudged or spun; there is always an excuse or a story behind them. With stories generated directly from data, that doesn't happen: the story becomes the neutral arbiter.

 

There is a wealth of information circulating around the internet about the coronavirus and the toll it is taking on the world. However, it is not always easy to find up-to-date, non-editorialized data. New information about the virus and its spread comes out frequently, and it can be hard to know where to look to stay informed. We provide a data story based solely on the facts, with no agenda. Our corporate mission is to get people to use data every day in a way that makes sense for them. That is what we are trying to do with our coronavirus dashboards. They update every day so there is always fresh data for people to use to make informed and confident decisions.

The Future of Work: Applying AI to Your Data Leads to a Culture of Openness & Empowerment

Nathan Nichols received his Ph.D. from the Intelligent Information Laboratory at Northwestern University in 2010. His research focused on “machine-generated content”, hyper-personalized stories and content created automatically for an audience of one. Since then he has worked at Narrative Science, building early versions of our authoring engine as a Principal Engineer before moving into a Product Architect role.

 

The way we work is changing and is going to continue to see significant changes for a long time to come. These changes come from a multitude of places, one of the largest being advancements in technology. Many people have theories and gut reactions when they think about how technology will change the way we work. They think of jobs becoming obsolete and mass unemployment. I understand these fears, but I have a very different perspective.

 

What is the workplace going to look like in the future? It is going to be an environment characterized by creativity, openness and empowerment of workers. Employees will have infinitely more time to spend on tasks and projects they are passionate about. Brainstorming, collaboration and creativity will abound in the new workplace.

 

Artificial intelligence (AI) is the way toward a workplace environment like the one I just described. AI is uniquely situated to free up our time so we can focus on big picture tasks. Many of our repetitive tasks and menial work, which we often hate doing, can be done by machines.

 

There have been intense advances in technology, and in AI specifically, in recent years. Machines are learning how to learn; they are becoming more advanced in human communication and in empathy toward humans. Let's back up and talk about what AI is, specifically.

 

AI, machine learning, and deep learning are three terms that are often used interchangeably to mean similar things. There are important distinctions between the three, however. AI is the overarching category that encompasses the other two concepts. When a machine or computer does something that a human could do themselves, that is AI. This could be anything from research to the computer you play against in video games.

 

Machine learning is the next, more specific level. Machine learning is when a system gets better at its job the more data it gets and the more it "practices." Think of when you mark an email as spam: the machine learning system behind your email uses that data to predict which other emails are spam. The more you mark as spam, the more accurate the machine will get.
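
As a rough illustration of the spam example above, here is a minimal sketch in Python. The use of scikit-learn and the sample emails are assumptions for illustration only; this is not a description of any particular production spam filter.

```python
# Minimal sketch: a text classifier whose predictions improve as more
# labeled examples arrive (the "practice" described above).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Hypothetical labeled emails: 1 = marked as spam, 0 = not spam.
emails = [
    "win a free prize now",
    "limited time offer, click here",
    "meeting moved to 3pm",
    "quarterly report attached",
]
labels = [1, 1, 0, 0]

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(emails, labels)

# Each time you mark another message as spam, it joins the training set
# and the model is refit, so its predictions get more accurate.
emails.append("claim your free reward today")
labels.append(1)
model.fit(emails, labels)

print(model.predict(["free reward, click now", "see you at the meeting"]))
```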

 

Deep learning is the most specific level of AI currently, and it is also the one that sounds the scariest and frightens people the most. Deep learning is a computer system modeled loosely on the human brain. It is good for things we do unconsciously, like recognizing faces and simple speech. Siri and Alexa use deep learning models to speak to us in a way we recognize as human. The problem with deep learning is that neither the computer nor the developers can explain why it made a decision. This is fine for some scenarios, but really problematic in others. We've already seen deep learning job application systems that can't justify why a particular applicant was accepted or rejected.

 

Now, back to how these systems will change the way we work. The main change is that we will no longer have to go to machines and work with what they give us and what they are capable of. Machines are now becoming capable of understanding and predicting what we need, when we need it. We are not adapting to them, they are adapting to us.

 

As I said before, it is not a question of if our menial tasks will be automated; it is a question of when. Without the time we spend on these tasks, we are left with time to do exploratory and creative work. We can collaborate and discover. To quote Hal Varian, "Automation typically does not eliminate entire job categories, but it eliminates the dull, tedious, repetitive tasks that go into making up that job."

 

The world of work, fortified by advancements in technology, is going to turn every one of us workers into something like Iron Man: an individual using his own morals, experiences, and thoughts to guide the actions of superior technology that allows him to be a superhero.

Telling Your Data Story with the 3Vs: Vocabulary, Voice & Vision

Scott Taylor is “The Data Whisperer.” He has spent over two decades solving master data challenges with large global enterprises as well as guiding Tech Brand owners to leverage their reference data and taxonomy assets. In a variety of strategic marketing, GTM, innovation and consulting roles, he has worked with some of the world’s most iconic business data brands including Dun & Bradstreet, Nielsen, Microsoft, Kantar, NPD as well as start-ups such as Qoints and Spiceworks.

 

Data is unruly. It is complex, difficult, unstructured, and big. It can be a beast all of its own. We can't live without it, though. So how can we make this beast manageable for the everyday worker? How can we make it so that the data isn't overwhelming?

 

We have to figure out a way to calm down our data. We also have to figure out how to calm down the people who need to use the data. The way to do this is through a comprehensive data management program. So what is your data management story? Let’s start by talking about how we talk about data. How do we describe data? How do we explain how we use data? How do we describe why our data is important?

 

What we put into our data is what we get out of it. Just as we have to feed our bodies good food to get good outputs, we have to put good data in to get good analytics. There is simply no such thing as good analytics with bad data. This is where data management comes into play. Data management ensures that you are only ever working with good data.

 

Data management has a golden rule just like the one we learned in kindergarten: do unto your data as you would have it do unto you. That is exactly what will always happen. Your analytics and actions can only be as good as the data you feed them. What you put in is what you get out. An acronym to remember this is "GIGO": garbage in, garbage out.
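
To make "garbage in, garbage out" concrete, here is a minimal sketch in Python of screening a dataset for obvious problems before it reaches the analytics layer. The use of pandas and the column names are assumptions for illustration; this is not a full data management program.

```python
# Minimal sketch: surface basic data-quality issues before analytics.
import pandas as pd

def quality_report(df: pd.DataFrame) -> pd.DataFrame:
    """Summarize simple data-quality signals for each column."""
    return pd.DataFrame({
        "missing": df.isna().sum(),
        "duplicate_rows": [df.duplicated().sum()] * len(df.columns),
        "dtype": df.dtypes.astype(str),
    })

# Hypothetical sales data with a duplicate row and missing values.
sales = pd.DataFrame({
    "account": ["Acme", "Acme", None, "Globex"],
    "revenue": [1200, 1200, 950, None],
})

print(quality_report(sales))

# Fix (or reject) the garbage before any analytics sees it.
clean = sales.drop_duplicates().dropna()
```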

 

Two Types of Data Storytelling

The two types of data storytelling are data driven stories and analytics driven stories. Data driven stories typically fall into the category of data management while analytics based stories tend to be in the business intelligence category. You must determine the truth of the story before you can derive meaning from the story. The characters in the story come from the data management side of data storytelling.

 

Data

• Data management

• About the data

• Determine truth

• Horizontal orientation

 

Analytics

• Business intelligence

• With the data

• Derive meaning

• Vertical orientation

 

The ideal narrative in a data story balances sizzle and steak, which also means balancing the why and the how. You need to tell a compelling story that is also technically accurate and true.

 

3 Vs of Data Storytelling

The three Vs of data storytelling help to create an accessible data story for any audience. The 3 Vs are vocabulary, voice and vision. Vocabulary helps to establish common definitions and terms for the story. Voice is about creating a common tone and writing style within the story. Vision illuminates the needs, problems and desired outcomes of the business which will come from the story.

The Science of Video and the Metrics That Matter Most

Chief Evangelist at BombBomb, bestselling co-author of Rehumanize Your Business, and host of The Customer Experience Podcast, Ethan has collected and told personal video success stories in a variety of formats over the past decade. He’s sent more than 10,000 videos himself. Prior to joining BombBomb, he spent a dozen years leading marketing teams inside local television stations in Chicago, Grand Rapids, and Colorado Springs. He holds undergraduate and graduate degrees from the University of Michigan and UCCS in communication, psychology, and marketing. He lives in Colorado Springs with his wife and son.

 

Do you come across better in person? I can answer that for you: yes, you do. We are all better face to face. It is a more natural way of communicating, it is better for getting your message across, and it is more comfortable for the person you are communicating with. The best way to leverage these facts is to add videos to your marketing and sales outreach.

 

Face-to-face communication is increasingly rare in our remote, digital world. It shouldn't be. Face-to-face communication leads to better communication across the board. The addition of nonverbal cues, emotion, tone, and pace makes for an entirely different experience for the recipient. It also allows for genuine human connection, which you can't get from an email.

 

There is also a link between video communication and higher conversion. This does not just mean sales conversion. There are many micro-yesses needed to lead up to higher-level conversion. You need people to say, "Yes, I will open this email; yes, I will take a survey." Basically, any type of interaction is a micro-yes that will lead to conversion down the road.

 

These are not just messages; there are relationships on the other side of every interaction. Humans have been speaking face to face for 150,000 years. We have only had a phonetic written language for 5,000 years. That is roughly 30 times more experience talking face to face than with any other type of communication. There is no reason to abandon that history.

 

The type of video I am referring to is not Zoom calls or highly produced marketing videos. The optimal type of video to send out is an imperfect, casual video of yourself talking directly to your audience. It can be shot on an iPhone or a laptop. These videos should be made with asynchronous communication in mind.

 

There are many ways to measure the effectiveness of including videos in your outreach: traditional measurable outcomes and a new metric that is specific to video. The traditional measurable outcomes are open rate, click-through rate, email reply rate, interaction levels, and many more. The outcomes I have seen and measured show that video has an incredible impact on outreach. Some examples: a 56% increase in cold email replies, 24% more appointments set and held, 68% higher close rates, and 90% of people who tried using videos citing a greater ability to stay in touch effectively.
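
As a quick illustration of how these outcomes are measured, here is a minimal sketch in Python that computes a reply rate and the lift from adding video. All of the counts are hypothetical placeholders, not figures from the chapter.

```python
# Minimal sketch: compute reply rates and the percentage lift from video.
def rate(events: int, sends: int) -> float:
    """Share of sends that produced the tracked event (reply, open, click)."""
    return events / sends if sends else 0.0

# Hypothetical campaign counts.
plain_replies, plain_sends = 40, 1000   # cold email without video
video_replies, video_sends = 62, 1000   # cold email with video

plain_rate = rate(plain_replies, plain_sends)
video_rate = rate(video_replies, video_sends)
lift = (video_rate - plain_rate) / plain_rate * 100

print(f"reply rate without video: {plain_rate:.1%}")
print(f"reply rate with video:    {video_rate:.1%}")
print(f"lift from adding video:   {lift:.0f}%")  # about 55% with these numbers
```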

 

The new metric that relates specifically to video is called "face-to-face time." This metric is a precursor to the more measurable metrics above: you cannot achieve them if people do not experience you face to face. The implication of this metric is that people are more engaged with your content. You have shared emotion with the viewer, and they often perceive you as having some authority.

 

Adding videos to your cold outreach in particular yields huge benefits. The addition of videos makes your cold outreach much warmer. It adds a level of social reciprocity for your audience: once they have seen your face and heard your voice, they feel more obligated to respond to your emails.

 

Your people are your most valuable asset, don’t forget to leverage them, their faces and their voices. Let people get to know you, your employees and your company. Get up close and personal.

Sustaining a Data Culture with a Distributed Team

Donald Farmer is an internationally respected speaker and writer with over 30 years of experience in data management and analytics. He advises investors, software vendors, and enterprises on data and analytics strategy. In addition to working with startups, Donald led teams at Microsoft and Qlik, building some of the most important and innovative analytics platforms on the market.

 

In today’s business we see a new type of data analyst emerging, quite different from the analysts I knew when I started working with data 30 years ago. This new analyst is diverse, creative and engaged, evolving their role with the developing culture of data and data analytics.

 

Analytics today is both a creative task and a technical one. The new analyst follows suit, having high levels of technical knowledge but also creativity and innovation in the way they leverage their data and how they communicate their findings. And in some ways, the role of data analyst feels more informal now. They may not even have the job title, being just the individual who loves data and can make good use of it wherever they sit in the organization.

 

Today, four characteristics distinguish the best data analysts: they have skills, tools, access and ownership. The skills are the hard skills of working with data, but today that expertise may be learned on the job informally as much as formally. Analysts have the tools to work with data, sometimes provisioned by IT, but often tools of choice, which they personally prefer to work with. They also have access, not only to data, but to people, a network of leaders, specialists and other data folks who both provide a demand and recognition for analytics. Finally, analysts have ownership. They understand their responsibility for the data, they often initiate their projects and they own the results that come out of them.

 

This ownership becomes more complicated when you work in a distributed team, with different channels of communication. Instead of the chat around the watercooler, all communication feels a bit more formal, but still needs to happen. This is especially true when we come to conversations about data. Such exchanges are critical to our work, because more than the data itself, informed conversations will make a difference.

 

And yet … people don't trust data. They trust other people. They trust those who provide them with information, those who audit and govern the data, those who analyze, visualize, and report. Conversations matter, in part, because talking through the issues improves the trust levels of everyone in the data supply chain.

 

We need to start thinking about data on an organizational level rather than an individual level. Analytics is not just a personal skill, it’s an organizational skill: indeed how we do analytics represents a critical component of our corporate culture. Having one great analyst may prove merely incidental if the rest of the organization has no way to use their output.

 

Instead of the individual analyst, build a community of practice, where different aspects of our skills and commitment come together. For a true community, we need these four elements:

 

Craft

Craft is simply your work in hand, whatever you do: your specific area of specialization. We all have our craft, whether we are analysts, accountants, or administrators. A community of practice, such as a visualization community, grows up around a shared craft.

 

Elaboration

Elaboration is a commitment to excellence in our shared craft: to do things better beyond the bare minimum. This helps push a community forward: the urge of people always wanting to be better.

 

Belonging

Belonging is having a commitment to the organization, the craft, and those around you. Belonging doesn't come from your place on the org chart; it comes from participation and commitment. If you are active and committed, you belong to the community, regardless of organizational boundaries. If you are assigned to a role but not committed, you may be in the user group, but you'll not find yourself at the heart of the community.

 

Sharing

Sharing is a critical element, but often overlooked. Communities share: materials, resources, learning and insights. In the community of practice, sharing helps us grow by leaning on each other.

Promoting & Governing BI Tools in your Organization

Tommy Puglia is passionate about Power BI and how data analysis can impact decisions. He has used Power BI since it was known as "Power BI Designer," is a Microsoft MVP, and consumes whatever he can to constantly learn and apply BI for marketing and sales. He is a leader of the Chicago Power BI User Group and is MCSA certified in BI Reporting by Microsoft.

 

BI tool adoption is critical for any organization. That is not news at this point. Nearly every company is using some form of BI tool with a regular cadence. The new hurdle for an organization after implementation is promoting and governing its chosen BI tools.

 

The way I go about addressing this hurdle is by using a very simple BI framework model. This model helps companies do a status check on the use of BI in their organization currently and determine any weak spots they may have. It also provides a roadmap for making their usage and adoption of BI simple and comprehensive.

 

Before I get into detail about the framework model, I have one other tip for BI adoption: always start with a maturity model. No matter where you are in the process, take the time to do one, even if you have been using BI for years. A maturity statement ensures a few key things, but the most important is extensive documentation. The statement will include measurable goals for the project and identify the metrics that matter. It will also lay out the roles and responsibilities of those interacting with the BI.

 

Now for the framework, the components are:

• Components of an adoption strategy

• Roadmaps for departments and teams

• Governance and standardization

• Rollout and support strategies

 

Let’s go in depth on each of these steps one by one.

 

Adoption Strategy

This step begins with the discovery and feedback process. This can be for a new system or for evaluating a current system. Questions are the name of the game in this phase. Asking questions and building your BI strategy around key answers is what leads to successful BI programs later on. “Where are we now with reporting?” “Where do we want to be?” “Who uses reporting today?” “Who could benefit but isn’t a current user?” These are all good fundamental questions to ask yourself and others in your organization to get a baseline feel for the culture of reporting in your organization.

 

Roadmaps

Creating roadmaps is about generating good systems and trust in those systems. One important item to note here is, again, documentation. Documentation allows you to have a well-oiled BI machine. To have a complex yet usable system, everything needs to have a distinct purpose. Every form and report should be unique, with a clear purpose and a clear audience. Documenting the purpose and uses of reports means that down the line people won't be questioning why different reports exist.

 

This purpose driven mindset also applies to individuals within the BI system. Every person should have a clear understanding of what is happening, what the goals are and what their roles are. This goes for anyone in the organization who may come in contact with BI systems. This allows for smooth communication, action and trust that the system works as it needs to.

 

Governance

Governance is the biggest hurdle to overcome in the process. If your organization loses trust in the data, there is no easy way to gain that trust back; it can be fatal to the BI adoption process. One way to avoid this is to build a team of data champions: a cross-functional team of people who like and trust data. They will instill confidence in those on their native teams. Trust comes from people, and data champions earn that trust. The higher up the org chart you can find these data champions, the better.

 

Internal team governance is also a critical component of success. There need to be clear answers and structures in place so the operation runs smoothly, including clarity around the structure of files and the storage of data. Make your KPIs widely known, as well as who is in charge of them. Finally, have a plan for how your BI team will adjust to change, which happens quite often in this space.

 

Rollout and Support

Rollout and support can be one of the most important steps in the framework. This step comes down to education and training of others in the organization. Create a data dictionary to go along with reports and dashboards.

 

This would be a place to document common terms, phrases and metrics that not everyone might be familiar with when using dashboards and BI visualizations.
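
Here is a minimal sketch in Python of what a data dictionary entry might look like. The terms, sources, and owners are hypothetical placeholders; the point is simply to pair every dashboard term with a plain-language definition and an owner.

```python
# Minimal sketch: a data dictionary that travels with reports and dashboards.
data_dictionary = [
    {
        "term": "MQL",
        "definition": "Marketing-qualified lead: a contact who met the lead-scoring threshold.",
        "source": "CRM lead table",
        "owner": "Marketing Ops",
    },
    {
        "term": "Win rate",
        "definition": "Closed-won opportunities divided by all closed opportunities, per quarter.",
        "source": "Sales pipeline report",
        "owner": "Revenue Operations",
    },
]

def lookup(term: str) -> str:
    """Return the plain-language definition for a dashboard term."""
    for entry in data_dictionary:
        if entry["term"].lower() == term.lower():
            return entry["definition"]
    return "Not documented yet. Add it to the dictionary."

print(lookup("win rate"))
```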

 

Try to get people excited about your systems along with you. You should be excited about the BI system you are rolling out; try to get your coworkers excited too. Show them how they can be empowered by the data they will soon have access to. Prove that the BI team will be a center of excellence within the company, then set up that center of excellence as a place where people can ask questions and make requests.

What Disney Can Teach Us About Data Storytelling

Chris Wagner is the founder of Kratos BI, an Analytics Architect at Rockwell Automation, and a Microsoft Power BI MVP. He has experience in a wide variety of business applications and business processes, with an emphasis on insurance, banking, risk, finance, marketing, and driving ROI through Agile development best practices.

 

When the Avengers movies came out, they grossed roughly $8 billion at the box office. When Justice League premiered, it grossed only about $660 million. The Avengers had sequel after sequel and prequel after prequel made; Justice League fell off after one movie. Why do the Avengers movies outsell and outperform the Justice League series in every category?

 

The answer is the story behind the films. The MCU (Avengers) characters had character arcs. They were flawed, multidimensional, well-rounded characters people could relate to. The Justice League had characters who were heroes in name only. We knew who they were, but they didn't have backstories or personality, so they didn't resonate with people.

 

Believe it or not, the same concepts are valid for data stories and data analytics. Data will be called on in meetings to solve problems the same way superheroes are called on to save the city. If there is too much going on in your data or data presentations, it can cause sensory overload. You can lose your audience, just like the Justice League.

 

Instead, we want to approach our data like the creators of the Marvel movies approached their characters. We want to have clearly defined characters, character arcs, and time to develop.

 

Clear Characters with Flaws

 

We have to gradually and clearly define our data, including metrics, targets, and general terminology. Additionally, we need to make sure our audience is familiar with the most important things for them.

 

This means we cannot introduce all of our data at once. We should start with a simple overview and then build to the complexity needed to answer the questions.

 

We also need to be honest when it comes to the characters in our data stories. We need to articulate the strengths, and more importantly, the weaknesses of our data. It does no one any good to hide the faults in the data. Be upfront about the limitations of the data. The questions the data can’t answer are just as important as the questions it can answer.

 

Character Arcs

Every character in an Avengers movie has a distinct character arc. This covers where they come from, who they are, and where they will go after the movie. The same goes for our data.

 

Phase 1: Leading Indicators

In this step, we have to determine both what led you, the analyst, to ask for this specific data and what happened in the data leading up to the particular numbers you pulled. This requires high familiarity with your metrics and an intense understanding of how they all work together.

 

Phase 2: Transformation

Here our story becomes more complicated. The more we tell our story, the more complex and refined our data will become. We don’t focus on just one singular metric or KPI in this phase. We expand on our knowledge of the numbers and metrics and build out the story around our metrics. This is where the details of the story are discovered and pulled together.

 

Phase 3: Anticipation of trailing indicators

In phase three, we determine what our data will do after this exact moment in time. Data will always change and progress. This is where we must predict what comes next in our story. Additionally, you must determine what the result or outcome of your work will be. You found the story in the data, now what? Where does it go from here?

 

Time to Develop

Finally, we have to give our data time to develop. When new initiatives or projects begin, it is essential to monitor the data over time. See how it changes, reacts, and adapts to changes in your business and in the world. No character has their full story told in one day, and neither does your data. Every analyst knows it is essential to look at data over time. Now the task is to articulate this to others and get them to keep a watchful eye on the story's progression. Don't forget about the sequel to your data story.

Language and the Machine: Words vs Meaning

Kris is the Chief Scientist at Narrative Science and professor of Computer Science and Journalism at Northwestern University. His areas of research include human-machine interaction, context-driven information systems and artificial intelligence.

 

Machines are our new partners, personally and professionally. Machines control immense amounts of data and need to be able to tell us what they find within that data. The name of the game now is bridging the gap between what machines know and what we need to understand. But how? First we need to understand the basics of how we communicate with machines and how they communicate with us.

 

The difference between how we communicate and how machines communicate is rooted in truths and deltas. When humans communicate, we are usually communicating the truths we have uncovered, whether on purpose or by accident. Machines only speak in terms of facts and patterns. If asked what the weather is like, humans can say, "It's a nice day out, a good day to be outside," because they know human truths. Machines can only present the facts: "It is 80 degrees and sunny." Then you are left to make your own inferences. Both styles of communication have their positives and negatives, which is why combining the two is the optimal solution.

 

Let’s look closer at the concept of truths in machine versus human communication. Machines are not designed to tell the truth, even machines which are designed for language. They are designed to read well and create words. They do not necessarily say things which are true or meaningful.

 

Now let's explore the concept of deltas. Machines operate by recognizing patterns in numbers and data sets, and they can be programmed to use common phrases to express what they have found. The problem is that things that stay the same are not usually what we talk about; they are rarely interesting or consequential. We talk about changes: in weather, traffic, stock prices, and so on. Machines do not inherently recognize that deltas, or differences, are the significant elements within data. They look for patterns, and breaks from a pattern do not fit the mold.
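
As a toy illustration of this point, here is a minimal sketch in Python that finds the delta in a metric and expresses it as a plain-English sentence rather than a raw number. It is not a description of Narrative Science's technology; the metric name and figures are hypothetical.

```python
# Minimal sketch: report the delta, not just the facts.
def describe_delta(metric: str, previous: float, current: float,
                   threshold: float = 0.05) -> str:
    """Turn a change between two values into a plain-English sentence."""
    change = (current - previous) / previous
    if abs(change) < threshold:
        return f"{metric} held steady at {current:,.0f}."
    direction = "rose" if change > 0 else "fell"
    return f"{metric} {direction} {abs(change):.0%} to {current:,.0f}."

# Hypothetical values: the machine has the facts, the sentence carries the delta.
print(describe_delta("Daily ridership", previous=18200, current=15900))
# "Daily ridership fell 13% to 15,900."
```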

 

Machines and humans solve problems in very similar ways. We follow the same process:

 

• Take data

• Analyze it

• Figure out the facts

• Make inferences

• Create conceptual outlines

 

Where machines and humans differ is the creation of narratives based on the inferences and facts that have been discerned. The next step to bridge the communication gap is to work to humanize machines, so that humans can be less mechanized. If we achieve this, we can act in ways that are uniquely human and we can free the information that is currently trapped within our data.

How to Tell A Story

Larry Birnbaum is a Co-Founder of Narrative Science and serves as the company’s Chief Scientific Advisor. He focuses on next-generation architecture, advanced applications and IP. In addition, Larry is Professor of Computer Science and of Journalism at Northwestern University, where he also serves as Head of the Computer Science Division.

 

Data runs our lives. It affects nearly every choice we make, whether we are aware of it or not. The amount of data in the world is growing exponentially, all the time. The sheer amounts of data out there make analysis hard for humans to do.

 

Realistically, even a little bit of data is a lot of data for a human to manage alone. It is a lot of information to take in, and that can be challenging. Eventually we can get results from this data, either by hand or by machine. However, results are not easily understood by everyone, which presents another problem.

 

The solution is data storytelling. Telling the results of complex analysis through stories and the written word makes data easier for the masses to understand. This is becoming a more widely accepted concept; people like the idea of having information relayed as stories. But this is not the end of the journey, or of the questions.

 

How to tell a story is a big question when it comes to data storytelling. It is also a huge question with many layers and dimensions to it. One of the biggest elements of how to model a story is determining how to ask the right questions to get insight from our data, which can then be made into a story.

 

So, how do we model questions? It starts with asking other questions. You need to determine what the listener or reader needs to know. These models are based on the questions the listener or user has asked and the questions that will be asked down the line. It is typical for more questions to arise based on the answers given, so the models need to predict those next questions.

 

There needs to be a logical and structured series of questions and answers, determined by the engineer and given to the model or machine. This series of questions is what makes a data story coherent. It puts the data and the sentences in an order that is understandable and useful to the reader.

 

Another thing of note when modeling questions is the difference between characteristics and metrics. Characteristics are not metrics. They can overlap, but they are not always the same. "Highest" and "best" are often the same, whether it is sales this quarter, new opportunities, or deals closed. They are not always the same, however, and this distinction can be problematic for machine learning. Highest is an objective fact. Best, on the other hand, must be related to a performance goal, and the computer must know the goal and understand the relationship.
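
A small sketch in Python can make this distinction concrete. The region names, sales figures, and goals below are hypothetical; the point is that "highest" is a plain fact, while "best" only exists once a human supplies the goal.

```python
# Minimal sketch: "highest" is objective, "best" is relative to a goal.
quarterly_sales = {"North": 120_000, "South": 95_000, "West": 140_000}
quarterly_goals = {"North": 100_000, "South": 80_000, "West": 160_000}

# Highest: a fact the machine can state with no extra context.
highest = max(quarterly_sales, key=quarterly_sales.get)

# Best: only meaningful against the performance goal a human defined.
best = max(quarterly_sales, key=lambda r: quarterly_sales[r] / quarterly_goals[r])

print(f"Highest sales: {highest}")  # West
print(f"Best vs. goal: {best}")     # North, at 120% of its goal
```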

 

Finally, a list is not a story. A list generated by a machine does not answer the question of "why." It conveys information, and we can understand it, but it lacks the reasoning and depth of a story; it only gives us surface-level information. There are a lot of questions when it comes to data storytelling, and as the concept becomes more widely accepted and sought after in the workplace, the questions become more technical: how to tell the story, how to ask the right questions, and so on. There is still work to be done in answering these questions, but the main thing is to always be thinking about the story at the end.

Differentiate Yourself as a Data Analyst: The Story-First Approach to Better Dashboards

Keelin McDonell is the former Manager of BI and Integrations at Narrative Science and presently a Principal Product Manager at Convoy Inc. She believes a dashboard with a story is simply better than a dashboard without one. Prior to joining Narrative Science in 2011, she was on the Kindle team at Amazon. Keelin also held editorial roles at The Washington Post and The New Republic after earning her BA in English from Columbia University.

 

We are entering a new era for data analysts. So much has changed for analysts over the past few years. There has been a move toward self-service analytics: people are tired of waiting on analysts to get the data they need. The whole process of analytics is becoming a more independent operation. It is a smoother and more seamless process for individuals to get data and reports for themselves.

 

Data analysts now have the time to add value to their companies in new ways. They are not tied down by mundane reporting and analysis. They can spread their data wings and discover new things. The move toward AI and machine learning also aids them in this journey. They have time to spend on pure data science, not maintenance of data. They can scale their work and their message.

 

This can also be a challenge for data analysts: they are now free from their repetitive tasks but must find a new niche for themselves in the company and in the data world. It also means they have to prove their value to the company as they make these changes. They must educate the organization on what they do and the value they add.

 

So how do data analysts optimize their roles around these new trends?

 

Start with Decisions

A normal day for a data analyst is filled with requests to "show me this data, this number, this metric." People are often asking for only a single number or metric. It is up to the data analyst to make people step back and look at the bigger picture. Figure out what decisions the person is going to try to make with the data. Working backwards from there, you can determine the best metrics or other information to look at.

 

Give Context

For a lot of people, being dropped into the middle of a pile of quantitative data can be disorienting and confusing. It also tends not to give the full picture; data alone cannot always paint an accurate one. Analysts need to leave space in their dashboards to add notes or information that tells the story around the data. Data leads organizations, but there is often plenty of qualitative information that also must be considered. Don't be afraid to add that information as well.

 

Different Formats for Different Folks

Be empathetic to your audience and how they best receive data. They likely aren't data geniuses, so what works for you may not work for them. This is not to say they aren't passionate about data or don't know how important it is; they may just need something extra. Putting data into language is comfortable for everyone. We already use language to communicate everything else, so data should be no different, and we already do so more often than you may realize. Chatbots and voice assistants are perfect examples of using language to express data and information. The expectation is that, with advancing technology, communication from machines will become easier for us than ever before. Nothing is simpler than language.

 

Order Counts

Consider the theme of your dashboard when putting it together. A haphazardly assembled dashboard rarely communicates what is actually in the data. Your dashboard should tell a story with the data. It should flow in a way that makes sense; order is key to showing how one piece of data affects another. Cost of materials affects net income, but there is a lot of data in between, and it should be shown in order. Data overload is also a huge risk with haphazard dashboards. Information comes at a great cost, and you don't want the insights to get lost. Paying a little extra attention to the organization of your dashboard can make a world of difference.

Maximizing the Impact of Your Data

Charles Holive is the Managing Director of the Data Monetization and Strategy Consulting Business at Sisense. He is a recognized industry thought leader in data and insights monetization, business innovation, analytics, and AI. Over the last 10 years, he has specialized in creating, scaling, and running new analytics business units that generate tens of millions in incremental yearly revenue.

 

In my estimation, by 2030, 20% of CEOs will have a CDO background. In the past few years the position of CDO has evolved. The whole world of data as a business department has changed. In the past, analytics and data had been seen as a cost center.

 

It was critical to business but it did not have direct value. In recent years, the world of the CDO has become a profit center. The reason is digital transformation. Every company in recent years has gone through a digital transformation.

 

CDOs have become the leaders of those movements. Being data-led was the only way to have a truly successful digital transformation. Now the key to success for CDOs and data analysts is telling data stories: more specifically, telling the right data story in a way that is compelling to the reader. How do we ensure we are telling not only a good data story, but the right one?

 

There are four components to telling the right data story. You should be able to cover all of them in 60 seconds or less, like an elevator pitch for your story. Obviously you will want a more in-depth story as well, but if you can't answer these questions in 60 seconds, you aren't telling the right story.

 

Identify What is Good or Bad

Look in the data and determine a baseline of what the positive and negative trends are. Present these as state of the union data points. Be objective and align to known goals, benchmarks or objectives.

 

Identify the Costs of Inaction

Your data story should have a purpose: an action plan, something that needs to be done as a result. Use this time to show organization leaders the negative consequences of not taking action. Doing nothing is the easiest option; show them why it isn't a viable one.

 

Tell What the Future Holds

You should then quickly be able to do trend analysis or forecasting of what the future holds based on current metrics. What will the future hold on our current path? Is that acceptable or not? What would happen if we change course?
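
As a rough illustration of the quick trend analysis described above, here is a minimal sketch in Python that fits a straight line to recent values and projects the next few periods. The use of numpy and the churn figures are assumptions for illustration only.

```python
# Minimal sketch: project "what the future holds" from a simple trend line.
import numpy as np

monthly_churn = [2.1, 2.3, 2.6, 2.8, 3.1, 3.4]  # hypothetical churn % per month
months = np.arange(len(monthly_churn))

# Fit a straight line to the history.
slope, intercept = np.polyfit(months, monthly_churn, deg=1)

# What will the future hold on our current path?
future_months = np.arange(len(monthly_churn), len(monthly_churn) + 3)
projection = slope * future_months + intercept

print([round(p, 2) for p in projection])  # churn keeps climbing on the current path
```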

 

Prescribe What to Do Next

Based on everything you have discovered in your analysis, you should be able to give a few sentences of recommendations for what to do next. Root them in data, but make them punchy and decisive. The final and most important thing to remember is where data gets its value. Data only has value if it is driving change and decision-making. Only when there is a result is data valuable. We are not selling data analytics, visualizations, or dashboards; we are selling results. There must be an outcome for the analytics to have been worthwhile. Always keep the results at the front of your mind.
