Edward Tufte’s data-ink ratio principle has prevailed in data visualization since it was introduced in the 1980s. His theory has pushed the field toward a minimalistic style, defining excellence as clarity, precision, efficiency, and a minimal amount of time needed for users to perceive information.
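For reference, Tufte defines the ratio as the share of a graphic’s ink devoted to the non-redundant display of data:

$$\text{data-ink ratio} = \frac{\text{data-ink}}{\text{total ink used to print the graphic}}$$

A ratio close to 1 means that nearly every mark on the page encodes data, and his advice is to erase, within reason, whatever ink does not.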
Meanwhile, academic research that has put the pioneering American statistician’s teachings to the test has not found a linear relationship between the data-ink ratio and a visualization’s excellence. Further research shed light on other important criteria that Tufte overlooked, such as a visualization’s ability to attract attention, be memorable, and engage viewers. Overall, the academic body of literature strongly suggests that no simple rule like the data-ink ratio suffices for designing data visualizations.
Debates among practitioners have been ongoing about the often-repeated notion of “less is more,” which leans on Tufte’s teachings. Some believe that simplicity and quick perception should be the goals of all visualizations at all times. Others support embracing complexity and slower viewing times in some circumstances.
As a response to these debates, two interesting frameworks have emerged to suggest more criteria that should be considered. The first is “Levers of Chart-Making” by Andy Cotgreave, a senior data evangelist at Tableau, and the second is “Cognitive Load as a Guide” by Eva Sibinga and Erin Waldron, data science and visualization specialists.
Cotgreave suggested this framework, which is still under formulation, in the November 2022 edition of his newsletter, The Sweet Spot. He put forward five levers, each a scale that “chart producers can use to enlighten, not bamboozle.” They are as follows:
Speed to primary insight – How quickly or slowly insight is intended to be extracted from a graph. According to him, “it is ok to make charts that take time to understand.”
Granularity – How sparse or granular is the data that a chart intends to show?
Explore or explain – Whether a visualization is intended to give users the opportunity to explore the data themselves (like self-service dashboards) or to be accompanied by an explanatory presentation
Dry or emotional – How serious the presentation of the data is versus how informal and relatable it is to non-data people. According to Cotgreave, an example of the serious approach is a standard column chart, while an emotional counterpart could be a necklace whose bead sizes represent the same underlying data.
Ambiguity vs. accuracy – For Cotgreave, a chart can be intentionally ambiguous rather than precisely accurate.
Cognitive load is a more detailed and rigorous framework that takes its inspiration from the psychology of instructional design. Suggested by Sibinga and Waldron, the framework was published in Nightingale, the journal of the Data Visualization Society, in September 2021.
The cognitive load framework proposes 12 spectra, offering “an alternative to one-size-fits-all rules” and aiming to “encourage a more nuanced strategy” for data visualization. Divided into three categories, the spectra are supposed to “gauge the complexity of our data on one side, identify the needs of our audience on the other, and then calibrate our visualization to successfully bridge the gap between the two.”
Intrinsic load – This is the first group of spectra, concerned with the data itself. It considers the inherent level of complexity in the data that a designer is tasked with explaining through a visualization. The included spectra are:
Measurement (quantitative vs. qualitative) – According to the authors, quantitative data carries a lighter cognitive load (it is easier to perceive) than qualitative data. That is because the former usually has obvious measuring units, like dollars or miles, while the latter usually needs a conceptual rating scale, like a satisfaction rating from 1 to 5.
Knowability (certain vs. uncertain) – Data collected from a whole population is easier to perceive than data estimated from a sample or predicted for the future, because the former comes with a high level of certainty, while the latter carries uncertainty and inevitable statistical margins of error.
Specificity (precise vs. ambiguous) – Undebated data categories, like blood type or zip codes, tend to be easier to perceive than socially determined concepts, like gender, race, and social class.
Relatability (concrete vs. abstract) – How relatable is the data to what humans see in everyday life? Concrete data would be small numbers like the cost of lunch and one’s age, while abstract data would be conceptual ones like GDP and the age of the earth.
Germane load – The second group of spectra is concerned with the audience and how ready they are to process the new information shown by a visualization. The included spectra are:
Connection (intentional vs. coincidental) – How will the audience first encounter the visualization? Intentional viewers are likely better prepared to perceive the visualization than viewers who stumble upon it by accident.
Pace (slow vs. fast) – Slow viewers are those who have more time on their hands and therefore a greater ability to perceive a visualization (translating into a lighter cognitive load).
Knowledge (expert vs. novice) – Expert viewers are those who are already familiar with the subject and therefore bear a lighter cognitive load when viewing a visualization.
Confidence (confident vs. anxious) – This spectrum addresses the intersection of the audience and the data reporting format. An audience familiar with the reporting format, such as an interactive dashboard or a data-based report, will bear a lighter cognitive load than one encountering such a channel for the first time.
Extraneous load – The final group addresses how new information is presented. The authors believe these are the criteria over which a designer has the most control and that they should therefore be considered last. Their advice for determining a visualization’s place on the following spectra is to answer the question: “Given the existing intrinsic and germane loads, how much more cognitive load are we comfortable adding to the mix?” (A rough illustrative sketch of this tally follows the list below.)
Chart type (common vs. rare) – Common chart types, like bar charts, require a lighter cognitive load than uncommon ones, like violin plots or rose diagrams, or more innovative formats.
Interpretation (accurate vs. approximate) – Does the chart aim to deliver precise values or paint a broad picture? According to the authors, charts delivering specific values tend to require a lighter cognitive load than those aiming to convey an overall picture.
Composition (concise vs. detailed) – This spectrum assumes that a high data-ink ratio and the absence of chartjunk (Tufte’s concepts) are already in place and then asks: how dense is the information on the page? Less dense visualizations require a lighter cognitive load.
Delivery (explanatory vs. exploratory) – Does the data report explain itself, or is it built to be explored? Exploration naturally takes more cognitive load than a self-explanatory visualization.
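Sibinga and Waldron do not prescribe any numbers, but the calibration idea can be sketched in code. In the hypothetical Python snippet below, each intrinsic and germane spectrum is given an illustrative score between 0 (light load) and 1 (heavy load); the spectrum names come from the framework, while the scores, the averaging, and the notion of a remaining “budget” for extraneous load are assumptions made purely for illustration.

```python
# Hypothetical illustration of the cognitive load framework by Sibinga and Waldron.
# Scores between 0 (light load) and 1 (heavy load) are illustrative assumptions,
# not part of the published framework.

intrinsic = {            # complexity inherent in the data
    "measurement": 0.3,  # mostly quantitative
    "knowability": 0.7,  # forecasted values, hence uncertain
    "specificity": 0.2,  # precise, undebated categories
    "relatability": 0.6, # fairly abstract figures, e.g. GDP
}

germane = {              # how ready the audience is
    "connection": 0.2,   # intentional viewers
    "pace": 0.3,         # viewers have time on their hands
    "knowledge": 0.8,    # mostly novices on the subject
    "confidence": 0.5,   # mixed familiarity with the reporting format
}

def average(load: dict) -> float:
    """Average the scores within one load category."""
    return sum(load.values()) / len(load)

# The load already imposed by the data and the audience...
baseline = average(intrinsic) + average(germane)

# ...suggests how much extraneous load (chart type, interpretation,
# composition, delivery) the designer can still comfortably add.
budget = max(0.0, 2.0 - baseline)

print(f"Baseline load: {baseline:.2f} out of 2.00")
print(f"Remaining budget for design choices: {budget:.2f}")
```

The point of the sketch is simply that the data-side and audience-side loads are assessed first, and only then does the designer decide how much additional complexity the presentation itself can afford.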
How to make sense of all the previous discussions
Levers of Chart-Making and Cognitive Load as a Guide are two recently suggested frameworks that offer a more complex approach to the task of data visualization. The two have similarities, like their consideration of complexity, granularity, and way of delivery. They differ from Tufte’s approach mainly in accepting that, in some circumstances, designs need to be perceived slowly. The cognitive load framework, however, still deliberately assumes that data-ink ratio principles have been applied beforehand.
Therefore, no framework is likely to totally replace the others. At best, they tend to complement each other to cover the vast territory of the data visualization domain.
Data-ink ratio principles remain a good starting point, as they fit most business contexts best. They can also help designers keep the point of their design in mind and avoid getting distracted amid all the software tools available today. However, considering the emerging frameworks can make the practice more nuanced in tackling different needs, messages, and audiences.
The final determinant of how to incorporate the three frameworks (and any other emerging ones) in practice will remain the context of the visualization. A better understanding of the audience, the message, and the medium is key before using the different frameworks to decide how information should be delivered.
Today’s organizations are drowning in waves of information. When leveraging data for the actionable insights needed to drive strategic decision-making and sound performance measurement, visualization makes that data comprehensible and accessible. Specifically, key performance indicator (KPI) data visualization aims to communicate key performance metrics and trends in a way that is clear, concise, and impactful.
KPI data visualization benefits for organizations
KPI data visualization offers a multitude of benefits for organizations seeking to make data-driven decisions:
Enhanced data understanding: Visualizing KPIs makes it easier and faster to grasp complex data sets, identify patterns, and uncover hidden trends that would otherwise remain obscured in numerical form. KPI visualization provides insights into the entity’s current situation and supports a better understanding of the market.
Improved decision-making: By providing a clear and concise overview of key performance metrics, KPI data visualization empowers decision-makers to rely on evidence rather than intuition.
Effective communication and collaboration: Visual representations of KPIs facilitate effective communication and collaboration across teams by enabling stakeholders to share insights, align strategies, and achieve common goals. Additionally, KPI data visualization fosters accountability by transparently tracking performance against established goals, motivating individuals and teams to take ownership of their results, and promoting a data-driven culture that encourages data-informed decision-making at all levels.
Popular formats for KPI data visualization
The art of data visualization lies in presenting complex information in an informative and engaging way for all stakeholders. The most popular and effective techniques are as follows:
Charts and graphs: Bar charts and line graphs are effective ways to show trends and comparisons. Bar charts work well for comparing categories within a single measure, while line graphs are mostly used to visualize changes in one value relative to another (a brief code sketch of both follows Figure 1).
Maps and heatmaps: These visual tools are perfect for showcasing geographical data and identifying areas of concentration or dispersion.
Dashboards: Combining multiple visualizations on a single screen provides a comprehensive overview of KPIs (see Figure 1).
Figure 1. An example of a medical center management performance dashboard | Source: The KPI Institute (2023), Medical Practice Dashboard
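The following minimal Python sketch illustrates the first of these formats: a bar chart comparing categories within a single measure and a line graph tracking one value over time. The sample figures are invented, and matplotlib is assumed as the plotting library.

```python
# Minimal sketch: a bar chart for category comparison and a line graph
# for change over time. All values are made-up sample data.
import matplotlib.pyplot as plt

departments = ["Sales", "Marketing", "Support", "R&D"]
headcount = [42, 18, 25, 31]                      # single measure per category

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
revenue = [1.2, 1.4, 1.3, 1.7, 1.9, 2.1]          # value changing over time ($M)

fig, (ax_bar, ax_line) = plt.subplots(1, 2, figsize=(10, 4))

# Bar chart: comparing categories within one measure
ax_bar.bar(departments, headcount)
ax_bar.set_title("Headcount by department")
ax_bar.set_ylabel("Employees")

# Line graph: one value relative to another (here, time)
ax_line.plot(months, revenue, marker="o")
ax_line.set_title("Monthly revenue")
ax_line.set_ylabel("Revenue ($M)")

fig.tight_layout()
plt.show()
```

Arranging several such panels in one figure is, in essence, what a dashboard like the one in Figure 1 does on a larger scale.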
Major principles for effective KPI data visualization
Clarity and simplicity: Prioritize clarity and simplicity in data visualizations by avoiding cluttered charts and excessive complexity that may obscure insights.
Contextualization: Provide context for visualized KPIs by including relevant information, such as benchmarks, targets, and historical trends (illustrated in the sketch after this list).
Visual hierarchy: Establish a clear visual hierarchy to guide the viewer’s attention towards the most important KPIs and trends.
Storytelling: Utilize data visualizations to tell a compelling story, highlighting key insights and communicating performance trends effectively.
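To make the contextualization and visual hierarchy principles tangible, here is a hypothetical Python sketch that plots a KPI trend against a target line and emphasizes the latest value; the KPI, the 92% target, and all figures are invented for illustration.

```python
# Hypothetical sketch: contextualizing a KPI trend with a target line and
# using emphasis to guide the viewer's attention. All figures are invented.
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
on_time_delivery = [88, 90, 91, 89, 93, 95]   # % of orders delivered on time
target = 92                                   # agreed benchmark

fig, ax = plt.subplots(figsize=(7, 4))
ax.plot(months, on_time_delivery, marker="o", color="steelblue")
ax.set_ylim(85, 100)

# Contextualization: show the target so the trend can be judged against it
ax.axhline(target, linestyle="--", color="gray")
ax.text(0, target + 0.4, f"Target: {target}%", color="gray")

# Visual hierarchy: emphasize the latest, most decision-relevant value
ax.annotate(f"{on_time_delivery[-1]}%", xy=(5, on_time_delivery[-1]),
            xytext=(4.0, 97.5), fontweight="bold",
            arrowprops=dict(arrowstyle="->"))

ax.set_title("On-time delivery rate vs. target")
ax.set_ylabel("%")
fig.tight_layout()
plt.show()
```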
KPI data visualization has emerged as a transformative tool that supports organizations in extracting meaningful insights from their vast data repositories. The first step toward effective KPI data visualization is to embrace data culture across all organizational levels. The second is to determine data constraints, such as the type of data, the number of variables, and the type of pattern one is trying to show (comparison, part-to-whole, hierarchy, etc.).
If you want to achieve effective KPI visual representations to support the decision-making process, sign up for The KPI Institute’s Certified Data Visualization Professional course.
Gone are the days when analyzing and visualizing data to get information was a job that was limited to the IT and business intelligence (BI) divisions. Gone also are the days when the sole possession of knowledge, skills, and tools for data processing was in the hands of the “data guy.”
Data is becoming more and more abundant and essential for various business operations. This makes centralizing data processing in one or two divisions an inevitable bottleneck. On the other hand, analytics and visualization tools are becoming easier to use, with more intuitive, user-friendly interfaces that require less and less technical expertise.
What SSBI Is About
Self-service business intelligence (SSBI), also called self-service data exploration, has become an important approach to data-driven insights in business. It means giving the wide range of employees who are not experienced with data the ability to derive insights from relevant datasets and create exploratory visualizations that help them better understand the data and use it in reports. It is also part of what is called data democratization, if you would like another fancy term on the plate.
SSBI should, however, be distinguished from a second approach called dashboarding: turning large amounts of data into finely curated reports on the most important KPIs within a well-developed narrative. Dashboarding should remain the responsibility of the experienced BI team. The SSBI approach aims to:
Avoid the delays in data-driven decision-making among low- and mid-level teams that may result from centralizing analytics responsibilities.
Minimize the intuition-based decisions that low- and mid-level teams make on a daily basis due to a lack of analytical capabilities.
Enhance internal communication within teams by making data-driven insights and visualizations easier to generate and therefore easier to integrate into reports more frequently.
Enhance the organization’s external communication, as the insights and visualizations can also easily be used in publications, such as blog posts.
Google Sheets and Datawrapper
There are tons of visualization tools out there that can enable you to create an SSBI system for your organization. Some are technologically advanced, but each has its best uses and drawbacks.
Google Sheets and Datawrapper are a case in point. The advantages of using these tools are the following:
– Businesses without experienced teams or dedicated infrastructure can implement the system.
– Anyone can use them, as they require little to no technical expertise.
– Visualizations can be easily duplicated and edited, suiting fast-paced work routines.
– Visualizations can easily be well formatted and laid out, leading to efficient reporting.
– Both interactive and static visualizations can be generated, suitable for embedding in various forms of reports, from web-based all the way to paper-based.
Using a self-service BI solution can help streamline operations and support critical decisions. It also encourages collaboration, simplifies daily business needs, and increases one’s competitive advantage. With the efficiency brought by SSBI, businesses can focus on what matters most to them.
Want to understand how visual representations can support the decision making process and allow quick transmission of information? Sign up for The KPI Institute’s Data Visualization Certification course.
We deal with data every day, especially at work. It can fuel our decisions and change the way we work. At the same time, if we’re surrounded by a huge amount of data, we may not find it easy to arrive at an optimal decision. This is where data visualization comes in.
Data visualization refers to the graphical representation of data. It makes large amounts of information easier to understand and helps identify patterns and trends. Through data visualization, people can easily comprehend information and draw conclusions.
“Graphical excellence is that which gives to the viewer the greatest number of ideas in the shortest time with the least ink in the smallest space,” wrote American statistician Edward R. Tufte, author of the book “The Visual Display of Quantitative Information.”
Understanding how to approach data visualization allows people to equip themselves with the right tools, approaches, and strategies as they gather data and present it visually. This is important to businesses that want to understand consumer behavior patterns and to governments seeking data-backed insights on a crisis.
Data visualization may be considered a science because it follows a process and represents data methodically and accurately. It begins with volumes of information; undergoes an intensive process of cleaning, classification, statistical and mathematical modeling, analysis, and design; and ends with a visualization.
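As a rough, hypothetical sketch of that process, the Python snippet below takes a small invented dataset through cleaning, a simple analysis step, and a final chart; the column names, figures, and choice of pandas and matplotlib are assumptions made purely for illustration.

```python
# Hypothetical end-to-end sketch: raw data -> cleaning -> analysis -> visualization.
# The dataset and column names are invented for illustration.
import pandas as pd
import matplotlib.pyplot as plt

raw = pd.DataFrame({
    "region":  ["North", "North", "South", "South", "East", None],
    "quarter": ["Q1", "Q2", "Q1", "Q2", "Q1", "Q2"],
    "sales":   [120, 135, None, 150, 98, 110],
})

# Cleaning: drop incomplete records
clean = raw.dropna()

# Analysis: aggregate sales per region
summary = clean.groupby("region", as_index=False)["sales"].sum()

# Design / visualization: a simple bar chart of the result
fig, ax = plt.subplots()
ax.bar(summary["region"], summary["sales"])
ax.set_title("Total sales by region (sample data)")
ax.set_ylabel("Sales")
plt.show()
```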
On the other hand, many argue that data visualization is a language because it uses diagrams to convey meaning. Data is encoded into symbology and semiology. The syntax and conventions of these diagrams are not inherent and must be learned.
Data visualization helps communicate analytics results in pictures. In simple words, data visualization is the language of images, on par with the language of words, both written and spoken, and the language of numbers and statistics.
Merging science and language
Science and language do not have to invalidate each other. Their elements can go hand in hand in data visualization.
In data visualization, the challenge is how to get more people interested in scientifically processed data. Presenting appropriate and relevant information in an engaging format through design is what makes data visualization successful. Science processes and provides information based on certain objectives, while design is a form of communication shaped by visual elements.
Combined, scientific data and design can generate meaning out of raw data. The end result of data visualization is almost always a story. In storytelling, the plot (design) won’t be able to progress without the characters (scientific data) and vice versa.
Ensuring that graphs and charts present meaningful results is important now more than ever. In MicroStrategy’s “2018 Global State of Enterprise Analytics,” 63% of data-driven organizations said that implementing analytics initiatives led to high efficiency and productivity while 57% said they became more effective in decision making.
With this, the challenge for organizations is to know how to structure, format, and present their graphical data that will allow them to make faster business decisions. Sign up for The KPI Institute’s Certified Data Visualization Professional course to learn the fundamentals of creating visual representations, the most effective layouts, channel selection, and reporting best practices.
When it comes to our bodies, there is a rich and complex wealth of data available to us today, which allows us to make better decisions about our health.