The Data Paradox: How traditional infrastructure is causing digital distress
Traditional models of data management have concentrated on keeping data tightly secured and contained within silos, buried under layers of complex tools and processes aimed at preventing it from reaching would-be malicious parties. Clearly, as the number of high-profile data breaches rises, something isn’t working.
In the digital age, employees need to use and share data in order to achieve their business goals and keep their organisations competitive. Supply chains, strategic partnerships, logistics and any number of processes in most industry sectors rely on efficient collaboration. Having accurate, up-to-date information enables these processes to be streamlined and leveraged to maximise operational effectiveness right across the ecosystem.
However, viewing any sharing of data as an inherent risk to the business, CIOs have for the last few decades pursued a fevered agenda of preventing data from being released outside the local confines where it resides. This has largely resulted in data silos.
The rise of shadow IT
Neither the CIO nor the employee has anything but good intentions, but these polarising aims create a paradox within the enterprise that can, ironically, make it more vulnerable to misuse or loss of data. If employees can access the data they need by getting around the restrictions put in place, they will likely do so. And in order to ‘do a good job’ they will then share it in all manner of ways.
Once that data has left the perimeter it is no longer in their control. Even if a secure method of transfer is used, the data itself has been shared and is now exportable and untraceable. This ‘ways-around-the-system’ approach of using unapproved programs, apps and sharing platforms is referred to as shadow IT.
Humans – forgetful, lazy, error-prone creatures that we are – cannot really be trusted.
Useless bunkered data
Conversely, if data is bunkered under silos, it is either so time-consuming to retrieve that it ceases to be useful, or it is never utilised at all. Furthermore, these traditional hub-and-spoke, centralised methods of data security inevitably degrade over time. As they spread across large ecosystems, they create out-of-date systems of record and areas of the business that haven’t kept up with the latest patches or password security.
Creating trusted data
In order to succeed, therefore, all areas of the business need to be thinking the same way, and to be encouraging their partners to think the same way: how can we best utilise the data we hold or generate without leaving ourselves open to damaging loss or corruption of that data, and without being handcuffed by overly complex and costly processes?
…around three quarters of projects have stalled because large organisations are simply unable to share data securely. They are ‘digitally distressed’
In a recent discussion held by Gospel with a number of data industry experts, Duncan Brown, Associate Vice President of European Infrastructure and Security at IDC, said: “Even though the drive to become digitally transformed is there, around three quarters of projects have stalled because large organisations are simply unable to share data securely. They are ‘digitally distressed’. Not only is this inability to create trusted data costly for their businesses, it presents a real existential threat to their very existence.”
Security at the data layer itself
All is not lost. Decentralised technologies have emerged in recent years that can overcome this problem.
Here at Gospel, the distributed ledger technology (DLT) that we have developed enforces an environment of trust in which nobody is ever trusted by default. To view or append any record on our private, permissioned blockchain platform, any party or system must first prove itself to the other parties in its extended network via a rigorous consensus agreement method. This maintains a single source of trusted data and an immutable record of all attempted or approved transactions on that data. In this way, security is enforced at the data layer itself.
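Gospel’s platform is proprietary, so the following is only an illustrative sketch with made-up names: each request is put to a set of validating parties, a quorum of approvals is required before anything is seen or written, and every attempt, whether approved or denied, is chained into a tamper-evident audit log.

```python
import hashlib
import json

class PermissionedLedger:
    """Toy model (not Gospel's actual implementation): every read or
    write request needs a quorum of validator approvals, and every
    attempt, approved or not, is chained into an immutable audit log."""

    def __init__(self, validators, quorum):
        self.validators = validators  # party name -> policy function
        self.quorum = quorum
        self.audit_log = []           # hash-chained record of all attempts

    def request(self, party, action, record):
        # Each validator independently votes on the request.
        votes = sum(1 for policy in self.validators.values()
                    if policy(party, action, record))
        approved = votes >= self.quorum
        # Log the attempt either way, chaining each entry to the last
        # so the history cannot be rewritten without detection.
        prev_hash = self.audit_log[-1]["hash"] if self.audit_log else "0" * 64
        entry = {"party": party, "action": action, "record": record,
                 "approved": approved, "votes": votes, "prev_hash": prev_hash}
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        self.audit_log.append(entry)
        return approved

# Hypothetical three-party network: two approvals needed for any request.
validators = {
    "acme":     lambda p, a, r: p in {"acme", "supplier"},
    "supplier": lambda p, a, r: a == "read" or p == "supplier",
    "auditor":  lambda p, a, r: True,
}
ledger = PermissionedLedger(validators, quorum=2)
ok = ledger.request("acme", "read", "part-123")        # approved (3 votes)
bad = ledger.request("intruder", "write", "part-123")  # denied, still logged
```

Note that the denied request still appears in the log: recording attempted as well as approved transactions is what gives the network its single, trusted view of who tried to do what.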
Expanding the perimeter of control
In effect, the silo boundaries are broken down, and the time and effort previously spent trying to stop data being seen by those who shouldn’t see it is instead handled by the technology itself. Right down to the granular level, no one sees what they are not authorised to see, yet what they are allowed to view is at their fingertips.
The context and granularity are important. Even with a private blockchain, in many systems the control parameters are binary: you are either allowed to see all the data or you are not. Much like secure sharing tools, all the data is exposed instead of only what is authorised and required for the particular request. With Gospel, everyone maintains control of their own data, whilst allowing granular access only to relevant information for other parties in the value chain.
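The difference between binary and granular control can be sketched in a few lines. This is a hypothetical model, not Gospel’s API: the owner grants a partner access to specific fields of a record, and any request returns only those fields rather than the whole record.

```python
class DataOwner:
    """Toy model of per-field grants: a partner sees only the fields
    the owner has explicitly shared with them, never the whole record."""

    def __init__(self, records):
        self.records = records  # record_id -> {field: value}
        self.grants = {}        # (partner, record_id) -> set of granted fields

    def grant(self, partner, record_id, fields):
        # The owner decides, field by field, what each partner may see.
        self.grants.setdefault((partner, record_id), set()).update(fields)

    def view(self, partner, record_id):
        # Return only the granted fields; ungranted fields never leave.
        allowed = self.grants.get((partner, record_id), set())
        record = self.records.get(record_id, {})
        return {f: v for f, v in record.items() if f in allowed}

# A logistics partner needs the part and quantity, but not the unit cost.
owner = DataOwner({"order-42": {"part": "gearbox", "qty": 12, "unit_cost": 90.0}})
owner.grant("logistics-partner", "order-42", {"part", "qty"})
view = owner.view("logistics-partner", "order-42")
```

A binary system would have had to hand over the whole record, unit cost included; here the commercially sensitive field simply never leaves the owner’s control.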
Absolutely trusted data
The possibilities of this absolutely trusted data are enormous. One of our early customers has proved it can reduce the time taken for a parts recall across its supply chain from weeks to a matter of minutes, by removing the hugely disparate and often manual processes currently employed to maintain records, and instead putting all relevant data on Gospel as a single source of truth. Because of the strict and carefully controlled requirements (maintained by consensus) to access or write anything to the data, all actors and systems know the data is accurate and can be trusted, so processes become frictionless and far more streamlined.
Supply chains, systems of record, personal data protection and regulatory compliance – all can benefit hugely from an environment of trusted data.
Solving the data paradox is one of our key aims here at Gospel Technology – revolutionising the way companies can allow access to their sensitive data and truly creating Trust in Your Enterprise.