Gartner’s 2022 Predictions – How are they affecting cybersecurity?
By Austin Miller
Last year, technology researchers Gartner released Top Strategic Technology Trends for 2022, a list that predicted the twelve most important technological advancements that we can expect to see before the end of 2025. If you’re working in any field of IT, you’re probably already familiar with at least some of the entries contained within.
But how are these technologies being integrated into the cybersecurity industry? In an effort to show you exactly what is at the cutting edge of security technology, the SecPro team has been investigating how each of these technologies will make their mark on the lives of people defending against the adversary.
The report itself is broken up into three sections: Engineering Trust, Sculpting Change, and Accelerating Growth. Obviously, all these areas are of interest to any business. And just like the full Gartner report, we’ll be starting with the predictions for protecting trust in the IT world.
Engineering Trust – Data Fabric
The first time that I heard the phrase “data fabric”, I thought it was going to be another buzzword that disappeared from the mainstream psyche within a few weeks. But progress marches on and now more and more organizations are adopting data fabric or providing services which help others build fabric architectures.
Understanding Data Fabric
Careful data use is an essential part of all modern business, but traditional data structures can make it difficult to actually access the information. As more and more people started to work remotely, this became a growing problem for some organizations – how do we offer access to sensitive data without opening ourselves up to threat actors?
Instead of creating multiple secured access points – each of them a potential point of failure – or various data hubs, data fabric gives users a single point through which to work with information, stored in a way better suited to the decentralized, globalized world we now live in. As IBM puts it, data fabric is the “virtual connective tissue between data endpoints”.
This connective tissue is platform-agnostic and uses automation and augmented data integration to create a system that not only allows users to access data from anywhere through a central point, but also creates an automated, intelligent system that identifies holes and weaknesses in the data set. The Finnish city of Turku was probably the first large-scale example of this in the real world.
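To make the “connective tissue” idea concrete, here is a minimal sketch in Python. The backend names, keys, and returned fields are all hypothetical – real fabrics sit over databases, APIs, and file stores – but the shape is the same: callers go through one entry point and never talk to a backend directly.

```python
# Hypothetical backends: stand-ins for the separate systems a fabric stitches
# together. Each maps a lookup key to a record.
BACKENDS = {
    "hr":    lambda key: {"employee": key, "dept": "security"},
    "sales": lambda key: {"account": key, "tier": "gold"},
}

def fabric_get(source, key):
    """Single entry point: route the request to the right backend."""
    if source not in BACKENDS:
        raise KeyError(f"unknown data source: {source}")
    return BACKENDS[source](key)

print(fabric_get("hr", "jane"))  # {'employee': 'jane', 'dept': 'security'}
```

Because every request funnels through `fabric_get`, this is also the natural place to hang logging, masking, and access checks – which is exactly why the architecture matters to security teams.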
A single access point? Is that not more dangerous?
A single access point sounds like a single point of failure – and that should be setting off alarm bells, right? What happens when it goes down? Does this really stop the adversary from accessing our hard-earned and sensitive data?
According to industry experts, this single point of failure isn’t as much of a problem as it first seems because of the way the supporting architecture must be structured. Moving from a data mess (i.e., how most data is structured now) to a data fabric creates failsafes which protect the data in question. By integrating data masking and encryption as well as intelligent, automated data access recognition into the basic structure of the fabric, the defensive position is stronger than in the wilderness of the data mess.
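The “data masking baked into the structure” point is easier to see in code. The sketch below shows field-level masking applied at the access layer, using an entirely hypothetical policy (the field names and rules are illustrative, not any vendor’s API); encryption at rest would sit alongside this but is omitted for brevity.

```python
import hashlib

# Hypothetical per-field policy: which fields leave the fabric untouched,
# tokenized, or redacted. Unknown fields default to redaction (deny by default).
POLICY = {"email": "hash", "ssn": "redact", "city": "allow"}

def mask_record(record, policy):
    """Apply masking rules before any data leaves the access layer."""
    masked = {}
    for field, value in record.items():
        rule = policy.get(field, "redact")
        if rule == "allow":
            masked[field] = value
        elif rule == "hash":
            # Deterministic token: joins and lookups still work,
            # but the raw value never leaves the fabric.
            masked[field] = hashlib.sha256(value.encode()).hexdigest()[:12]
        else:
            masked[field] = "***"
    return masked

record = {"email": "jane@example.com", "ssn": "123-45-6789", "city": "Turku"}
print(mask_record(record, POLICY))
```

The point is structural: because there is one access layer, there is one place to define and audit these rules, rather than reimplementing them for every database in the data mess.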
Of course, that’s not to say that data fabric is perfect and unhackable. But as the world of big data marches remorselessly towards more comprehensive systems, security professionals should understand the ways in which the emerging technology should be managed.
How can cybersecurity teams use Data Fabric?
Because data fabric is very much an architectural development in the world of big data, many cybersecurity experts may not play an active role in how the fabric is created. But there are a number of ways in which a fabric architecture provides greater data security.
- As previously mentioned, data masking and encryption are central to the success of data fabric architectures. In essence, the way we approach data changes – instead of creating numerous security measures which are potentially designed and maintained separately, data fabric has a single, strengthened approach which enforces security policy consistently across the entire architecture.
- Data stays in one place, meaning that managing who has access to data is a much simpler task. This is the central criticism of the ‘data mess’ approach – after the data has been added to a database, there are potential security issues associated with misconfigurations. When you only have one access point, these misconfigurations are less likely to occur due to the streamlined approach to access control lists (ACLs).
- Because all information is held in a central repository that can be accessed through self-service portals, governance, risk, and compliance (GRC) are all far easier to manage than in a traditional data system. Remember, data access is decentralized, but the data itself is centralized and managed through a unified policy.
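The ACL point above can be sketched in a few lines. The roles, dataset names, and records below are hypothetical – a real fabric would pull policy from a central store, not a dict – but it shows the streamlined model: one gateway, one check to configure and audit.

```python
# Hypothetical role-to-dataset ACL enforced at the single access point.
ACL = {
    "analyst": {"sales", "marketing"},
    "auditor": {"sales", "marketing", "finance"},
}

# Illustrative centralized data; access is decentralized via query().
DATASETS = {
    "sales":   [{"region": "EU", "total": 1200}],
    "finance": [{"quarter": "Q1", "revenue": 50000}],
}

def query(role, dataset):
    """One gateway, one check: deny unless the role's ACL names the dataset."""
    if dataset not in ACL.get(role, set()):
        raise PermissionError(f"{role} may not read {dataset}")
    return DATASETS.get(dataset, [])

print(query("analyst", "sales"))  # allowed
# query("analyst", "finance")     # would raise PermissionError
```

Every denied request surfaces in one place, which is also what makes the GRC story easier: auditors review a single policy and a single log, not one per database.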
When data fabric is in place, data security is managed in a unified way and this creates fewer points of weakness for the adversary to exploit. As an emerging technology, we should be on the lookout for the vulnerabilities that threat actors will find – no one is saying that adopting data fabric is an “end of history” moment! But structurally, data fabric offers an improved approach to data management and several boons for security professionals as well. The question is how successful we can be in capitalizing on these architectural advances.