5 tips to improve data quality in aviation, and why it is important
Data quality: a vague term that always throws sand into the tight, seemingly perfect schedule of your data-heavy projects. It is commonly quantified by the amount of missing data or by the number of data consistency issues, another of those vague terms.
In plain language, the consensus is to focus on visible errors such as the ones just mentioned: issues that are easily identifiable.
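As a rough illustration, a few lines of Python are enough to surface these visible errors. The record layout, field names and rules below are invented for the sake of the example and are not taken from any particular M&E system:

```python
from datetime import date

# Hypothetical extract of component records; field names are illustrative only.
records = [
    {"part_no": "1234-56", "serial_no": "SN001", "installed_on": "PH-ABC", "due_date": date(2024, 5, 1)},
    {"part_no": "1234-56", "serial_no": None,    "installed_on": "PH-ABC", "due_date": None},
    {"part_no": "7890-12", "serial_no": "SN045", "installed_on": None,     "due_date": date(2023, 1, 15)},
]

# Completeness: share of records with at least one missing mandatory field.
mandatory = ("part_no", "serial_no", "installed_on", "due_date")
incomplete = [r for r in records if any(r[f] is None for f in mandatory)]
print(f"incomplete records: {len(incomplete)} / {len(records)}")

# Consistency: a simple rule-based check, e.g. a requirement that is already
# past due on a component that is still shown as installed.
overdue = [
    r for r in records
    if r["due_date"] and r["installed_on"] and r["due_date"] < date.today()
]
print(f"installed components with an overdue requirement: {len(overdue)}")
```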
However, such visible errors are only the tip of the metaphorical iceberg of risks that can endanger your system's data quality.
In our view, data quality is not purely an exact science. There is no predefined threshold for the number and type of issues a data set can have before data quality becomes a concern, nor is there a fixed ranking of issue severity; these are mere guidelines and fluctuate from company to company and from project to project. It also depends very much on company processes, culture and the overall “aura/vibe” you pick up from the people using the system. More on that a little further on.
So why is data quality important?
Of course there can be huge safety implications, as I am sure you are well aware. These mainly revolve around issues with rotables (missing parts, discrepancies between paperwork and what is actually installed, requirements coming due), modification status and any other controlled event that is overdue. As we all say in aviation, safety is paramount, and that alone should be reason enough to take data quality seriously. But if we look beyond the safety aspect that is always held up as the one reason, good data quality also means huge efficiency gains and saves money. A reliable, clean system prevents double work by employees, avoids costly “rescue” operations, keeps your financial books tidy and creates room for further automation and innovation.
One of the immediate signs of bad data quality emerges when you talk to the people who work with the system day in, day out, and ask them to show how they do their job. These conversations often reveal alternative working methods: you are introduced to Excel sheets, or to another system used to double-check whether the main system is correct. That is the first sign that something is not as it should be; apparently they do not trust the system, and this suspicion must have come from somewhere.
There are always multiple reasons for bad data quality, but these are the top 3:
During the introduction of the system the data was never cleaned properly, causing problems from day one because the source was never correct to begin with.
Employees know that procedures are not followed properly across the chain; many dirty workarounds have been applied over the history of the system, making it unreliable or, at best, a source of nagging doubt for its users.
Multiple systems are used in parallel, which will inevitably lead to synchronisation issues at some point, creating doubt within the company about which system should be used and, more importantly, which system is correct (a minimal reconciliation sketch follows this list).
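To make that parallel-systems risk concrete, here is a small Python sketch that reconciles extracts from two systems. The serial-number key, the record contents and the sample values are purely hypothetical:

```python
# Reconciliation sketch for two systems running in parallel; in practice the
# dictionaries would be built from exports of the legacy and the new system.
legacy = {
    "SN001": {"part_no": "1234-56", "aircraft": "PH-ABC"},
    "SN045": {"part_no": "7890-12", "aircraft": "PH-DEF"},
}
new_system = {
    "SN001": {"part_no": "1234-56", "aircraft": "PH-ABC"},
    "SN045": {"part_no": "7890-12", "aircraft": "PH-XYZ"},  # drifted apart
    "SN099": {"part_no": "5555-00", "aircraft": "PH-DEF"},  # only in one system
}

only_in_new = new_system.keys() - legacy.keys()
only_in_legacy = legacy.keys() - new_system.keys()
mismatched = {
    sn for sn in legacy.keys() & new_system.keys()
    if legacy[sn] != new_system[sn]
}

print("only in new system:", sorted(only_in_new))
print("only in legacy system:", sorted(only_in_legacy))
print("present in both but different:", sorted(mismatched))
```

Every serial number that ends up in one of these three buckets is a seed of exactly the doubt described above: nobody knows any more which system to believe.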
Some of these problems look innocent, and they are often dismissed as insignificant compared to the sheer size of modern datasets. But be aware: down the line, small issues can and will have an exponential impact as the dataset grows, eventually leading to full inventory audits to verify your stock and the value it represents, or labour-intensive dirty-fingerprint checks because the system status is in serious doubt. Or, in the worst case, keeping an aircraft AOG because critical items are discovered to be overdue, making the aircraft instantly non-airworthy and potentially shortening its economic life. We have seen all of these examples materialise during our work in the field.
5 tips to ensure data quality:
Although issues and abnormal situations are inherent to human work, there are numerous safety nets that can be put in place to minimize the risk and severity of data corruption. A quick overview:
#1: Data Cleansing
During the introduction of a new system, do a full data cleansing and verify the expected and known problematic areas. If this is skipped, or not done properly, it is guaranteed to cause regrets and costly issues in due course, as discussed above.
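As an illustration of what such a cleansing pass can look like, here is a small Python sketch that normalises part numbers and sets duplicates aside before loading. The normalisation rule and the sample records are assumptions made for the example, not a standard:

```python
import re

# Illustrative cleansing step before loading legacy data into a new system.
raw_parts = [
    {"part_no": " 1234-56 ", "description": "Fuel pump"},
    {"part_no": "1234-56",   "description": "FUEL PUMP"},
    {"part_no": "78 90 12",  "description": "Hydraulic actuator"},
]

def normalise(part_no: str) -> str:
    """Strip whitespace and upper-case the part number so duplicates become visible."""
    return re.sub(r"\s+", "", part_no).upper()

cleaned, seen, duplicates = [], set(), []
for rec in raw_parts:
    key = normalise(rec["part_no"])
    if key in seen:
        duplicates.append(rec)  # set aside for manual review instead of loading twice
        continue
    seen.add(key)
    cleaned.append({**rec, "part_no": key})

print(f"loaded {len(cleaned)} records, {len(duplicates)} duplicates set aside for review")
```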
#2: Do not run two systems in parallel
Many companies tend to run their old system in parallel for months after introducing a new one, as a back-up or, in reality, because they do not feel sure enough about the new system. This is a risky set-up and should never be done. If you are not sure you are ready, do not switch the new system on; do more testing, verification and cleansing until you reach a high confidence level. If you are ready, flip the switch immediately, retire the legacy system, block all user access to it and get everybody working in your state-of-the-art new system.
#3: Policies & Processes
Enforce policies and processes. Proper training across the whole organization is key to making staff thoroughly understand them and aware of why they matter.
#4: Tackle problems immediately
As soon as problems are spotted, do not work around them; verify and adjust immediately where necessary. This should be encouraged throughout all layers of the company, and again, training will show employees how important this is. Without proper training there is no understanding.
#5: System Competence Center
Introduce a system competence center within your organization to govern and control all facets of the system. This competence center should NOT consist only of IT people, but also of experienced system users from all disciplines of the organization. A good competence center will safeguard data quality and reliability while driving continuous improvement and innovation to make full use of the system's capabilities.
Data and its quality are the backbone of a modern organization. Remember, however, that it is (still) people who create the data, and mistakes will be made. It requires uninterrupted attention: like an aircraft, your data needs continuous maintenance to keep it at a high-quality, safe operating level.
While serious data corruption can severely disrupt your company's performance and may require specialists such as EXSYN to come by and sort it out, a sophisticated and well-controlled dataset can truly push your company to the next level: setting you ahead of the competition and enabling the roll-out of new technologies such as predictive capabilities and even artificial intelligence, making you the next big digital pioneer in aviation.