Posted by Andrew Sumner
29 Nov 2017

By nature we are visual creatures: we find it much easier to perceive patterns in images than in raw data - and I’m not just talking about the Marketing team.


Most corporations have many years of detailed information stored away in fortress-like databases, often never to be looked at again. It makes sense that Data Visualisation is used to bring this data to the foreground, unlock its secrets and make informed decisions off the back of it.


At the risk of using more buzzwords, utilising "big data" to create actionable "business intelligence" is most certainly harder than it looks.


Complex Loss Prevention Solution


We all know how effective Data Visualisation can be at accomplishing this, but over many years of working with a variety of businesses and government departments, my experience is that very few come close to analysing their data as effectively as they should. Some don’t visualise their data at all, while others create charts in programs such as Excel, usually in an isolated and sporadic way that only scratches the surface of what’s possible.


The result is that all too often, data keeps its secrets hidden away from view and key executive decisions are made in the absence of vital business intelligence.


Why is this so? Put simply, it’s because visualising data effectively is much harder than most people expect. I’ve met many managers who say the equivalent of “come on, it can’t be that hard”, believing that the data just needs to be thrown into a charting software package and out spill all manner of graphs telling you everything you need to know.


Our data visualisation analyst






Unfortunately, the reality is quite different. There are several stumbling blocks that can easily derail a Data Visualisation project. Let’s imagine a project where a senior manager has appointed a staff member - I’ll call her “Alison” - to perform a deep dive into company data to identify opportunities or risks.



Firstly, she will need to decide which charting software package to use. There are many such packages available now, and working out which provides the best functionality is not easy. Alison will also need to choose software that matches her computer skill-set; otherwise, considerable time could be wasted learning how to use a new product. Balanced against this is the danger of choosing simpler software with limited capabilities. Mistakes made at this stage could jeopardise the entire project.



Next, the data has to come from somewhere - usually one, two, three or more corporate databases with different file structures. To get access to this data, Alison will nearly always need to involve the IT department, who of course have a busy schedule of their own and little understanding or appreciation of the type of analysis to be performed or which data would be appropriate. Usually, the best IT will offer is a simple extract of the data, hopefully with nothing missing or corrupted, in a portable format such as CSV.


At this point, Alison will need to display a significant level of technical skill. The data from IT may need to be joined with data from other sources, such as spreadsheets. It may also need to be manipulated in various ways, such as cleaning text values so that they match up properly, or calculating statistics to be displayed in the charting tool. To do this, she may need to use a staging database and write SQL scripts.
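To make that concrete, here is a minimal sketch of the kind of staging work involved, using Python’s built-in sqlite3 module as a stand-in staging database. The table names, store codes and amounts are invented for illustration - the point is the clean-then-join pattern, not any particular dataset:

```python
import sqlite3

# In-memory staging database; tables and values are hypothetical.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (store_code TEXT, amount REAL)")
con.execute("CREATE TABLE stores (code TEXT, region TEXT)")

# The raw extract: store codes arrive with inconsistent case and whitespace.
con.executemany("INSERT INTO sales VALUES (?, ?)",
                [(" syd01", 120.0), ("SYD01 ", 80.0), ("mel02", 45.0)])
con.executemany("INSERT INTO stores VALUES (?, ?)",
                [("SYD01", "NSW"), ("MEL02", "VIC")])

# Clean the text values so the join keys match, then join and aggregate.
con.execute("UPDATE sales SET store_code = UPPER(TRIM(store_code))")
rows = con.execute("""
    SELECT s.region, SUM(l.amount)
    FROM sales AS l JOIN stores AS s ON l.store_code = s.code
    GROUP BY s.region ORDER BY s.region
""").fetchall()
print(rows)  # [('NSW', 200.0), ('VIC', 45.0)]
```

Without the UPPER(TRIM(...)) cleaning step, two of the three sales rows would silently drop out of the join - exactly the kind of quiet data loss that undermines a chart later on.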


Basic heat map for loss prevention








Next up, the charting software will need to be told the structure of the data in her staging database, including the type of each field and how multiple files are linked together. Alison will actually need the skill level of a Database Administrator to pull this off.
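To give a feel for what “telling the software the structure of the data” means, here is a hypothetical sketch of the metadata involved - field types, keys, and how tables link. The layout of this mapping is illustrative only, not any real product’s API:

```python
# Hypothetical schema metadata a charting tool might require.
schema = {
    "sales": {
        "fields": {"store_code": "text", "sale_date": "date", "amount": "decimal"},
        "links": [{"to": "stores", "on": ("store_code", "code")}],
    },
    "stores": {
        "fields": {"code": "text", "region": "text"},
        "primary_key": "code",
    },
}

# The DBA-style part of the job: checking that every declared link
# points at a table and field that actually exist.
for table, spec in schema.items():
    for link in spec.get("links", []):
        target = schema[link["to"]]
        local_field, remote_field = link["on"]
        assert local_field in spec["fields"], (table, local_field)
        assert remote_field in target["fields"], (link["to"], remote_field)
print("schema links are consistent")
```

Getting any of this wrong - a mistyped field, a link to the wrong key - tends to surface later as charts that are subtly, rather than obviously, incorrect.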


Lastly, the charting software itself will often need to perform complex calculations in order to reveal answers to the questions asked of the data. This will come as no surprise to anyone who’s pushed the boundaries of tools like Microsoft Excel. If the selected charting package fails to provide pivot-table, join, filtering, aggregation and summary calculations, its graphs will fall short of the mark.
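As a small illustration of the pivot-table calculation named above, here it is in plain Python - the regions, quarters and sales figures are made up:

```python
from collections import defaultdict

# Hypothetical sales rows: (region, quarter, amount).
rows = [("NSW", "Q1", 120.0), ("NSW", "Q2", 80.0),
        ("VIC", "Q1", 45.0), ("NSW", "Q1", 30.0)]

# Pivot: regions down the side, quarters across the top, summed amounts.
pivot = defaultdict(lambda: defaultdict(float))
for region, quarter, amount in rows:
    pivot[region][quarter] += amount

print(dict(pivot["NSW"]))  # {'Q1': 150.0, 'Q2': 80.0}
```

Trivial at this scale, but a charting package that can’t do the equivalent across millions of rows, combined with joins and filters, forces the analyst back into hand-built workarounds.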


Along with strong technical skills, Alison will need a thorough knowledge of her organisation’s business requirements and an imaginative analytical mind. If she doesn’t know what questions to ask of the data, or how to use the charting tool to pose those questions, her charts will ultimately be irrelevant and the whole project will have accomplished nothing. On the other hand, if she has a thorough knowledge of the business but lacks the ability to work with the data and program complex queries, she’ll end up frustrated, able to produce only simplistic charts that fail to unlock anything of value.


Simple Graph - But what does it mean for Loss Prevention?



Our analyst needs a mix of technical skills, business knowledge and analytical ability. The success or failure of Data Visualisation really does come down to the quality and experience of the people working on the project. The fact is that most organisations don’t have many people with that mix of skills and experience, or when they do, those people are often already far too busy to work on such a project.


I’ve come to the conclusion that the best way to succeed at visual data analysis is to involve consultants who specialise in the field. At NetMap Analytics, we not only offer powerful charting software; we also have decades of experience in solving the technical issues and in working alongside our clients to design customised chart dashboards that directly address business concerns across a wide range of applications, including insurance, marketing, loss prevention, risk management, logistics, retail sales analysis and more. If your business could be made more productive, more efficient and more competitive through better analysis of your own data, why not drop us a line?

One thought on “Data Visualisation – It’s Harder Than It Looks!”

    Michael Hammer

    Great article, totally matches what I see in the business world. However I think it should be noted that the “getting the data” step always makes people think raw files are enough. Real data integration takes significant time, maybe 3-8 workdays of a very skilled ETL Engineer to really automate in a reliable manner that will establish trust. The other steps combined might take half this time. ETL and Data Integration are truly where the “rubber hits the road”.
