“The physicality of data and the path to intrinsically more secure authentication” was originally published by Forbes (October 8, 2021). David Kruger is co-founder and vice president of strategy for Absio Corporation and co-inventor of Absio’s Software-Defined Distributed Key Cryptography (SDKC).
Two different classes of identifiers must be checked to reliably authenticate things and people: assigned identifiers, such as names, addresses and social security numbers, and physical characteristics. For example, driver’s licenses list assigned identifiers (name, address and driver’s license number) and physical characteristics (photo, age, height, eye and hair color, and scanned fingerprints). Authentication requires examining both the license and the person to verify that they match. Identical objects are distinguished by unique assigned identifiers such as serial numbers. For particularly dangerous or valuable things, we supplement authentication by verifying provenance – proof of origin and evidence that no falsification has taken place.
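The two-class check described above can be sketched in code. This is a minimal illustration, not anything from the article itself: the record fields, the names and the ±3 cm height tolerance are all assumptions chosen for the example.

```python
from dataclasses import dataclass

@dataclass
class LicenseRecord:
    # Assigned identifiers (what the document asserts)
    name: str
    license_number: str
    # Physical characteristics (what the examiner can observe)
    eye_color: str
    height_cm: int

def authenticate(presented: LicenseRecord, observed: LicenseRecord) -> bool:
    """Both classes must match: the assigned identifiers on the license
    and the physical characteristics of the person holding it."""
    assigned_ok = (presented.name == observed.name
                   and presented.license_number == observed.license_number)
    physical_ok = (presented.eye_color == observed.eye_color
                   and abs(presented.height_cm - observed.height_cm) <= 3)
    return assigned_ok and physical_ok
```

A forged license fails the assigned-identifier check; a stolen one fails the physical check – either failure denies authentication.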
In previous articles in this series, we established that computers are miniaturized manufacturing plants that produce physical data objects. Data objects representing confidential information were likened to highly hazardous chemicals (HHCs): if control of HHCs or physical data objects is lost, serious harm can result. Poor computer authentication, in turn, was likened to letting people of unknown intent into your shipping, processing and storage areas. Are these equivalences correct?
Imagine that HHC manufacturing plants were everywhere. Employee badges at these plants are opaque: guards admit people based on the employee name and number the badge asserts, but cannot match the person to the badge. The plants have many shipping docks. Incoming raw material shipments arrive with cursory, partial authentication and no provenance checks. Finished HHCs are routinely shipped to a name and address without verifying who the recipient actually is or what they plan to do with the HHCs.
Given this scenario, would you be surprised if thieves and terrorists often masqueraded as employees? If HHCs were regularly stolen or held for ransom? If sabotage via contaminated raw materials or compromised process controls were common? If criminals used bought and stolen HHCs for bad ends? Of course not.
Now imagine such HHC plants were reality. Hordes of angry citizens, lawyers and lawmakers would rightly demand that plant operators and owners “fix the problem or bear the consequences.”
To test the equivalence, in the scenario above substitute uncontrolled data for HHCs, computers for manufacturing plants, usernames and passwords for employee names and numbers, logins for badge checks, and the internet and portable media for shipping methods. I have just described the quality of authentication in most of computing, as evidenced by the following:
• Cyber attackers frequently use stolen credentials to gain control of software, computers or networks.
• Cyber attackers frequently exploit poor or missing authentication of incoming data such as links, patches, updates, and new applications to embed malware.
• Uncontrolled data stolen by malware or malicious insiders can be sent, copied and resent to anyone, anywhere, and used for any purpose.
To function, computers need three physical actors working in unison: an operating user, an instance of executing software, and an instance of hardware. Each of these actors, like all physical things, can be authenticated by a combination of assigned identifiers, physical characteristics and, when necessary, provenance. Since co-opting any single actor can result in loss of control over data, you would assume that authenticating all three is the norm – yet, incredibly, from a security perspective, it is rare.
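The all-three requirement can be sketched as a simple gate. The verifier callbacks here are hypothetical placeholders – in practice each would check its actor’s assigned identifiers, physical characteristics and provenance – but the control logic is the point: any single failure denies the session.

```python
from typing import Callable

# Hypothetical verifier type: each callback returns True only when its
# actor (user, software instance, or hardware instance) authenticates.
Verifier = Callable[[], bool]

def authorize(user: Verifier, software: Verifier, hardware: Verifier) -> bool:
    """Grant a session only when all three physical actors authenticate.
    Co-opting any one actor is enough to lose control of data, so a
    single failed check denies access."""
    return all(check() for check in (user, software, hardware))
```

For example, a stolen credential might make the user check pass while the hardware check still fails, and the session is refused.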
Intrinsically more secure authentication
Inherently safer design (ISD) prescribes building safety in rather than bolting it on. Why? To ensure safe, continuous operation and to reduce costs. Under ISD, software (including applications, operating systems and firmware) and hardware would provide authenticatable identities and provenance as needed – and would require the users, hardware and software they interact with to reciprocate. What if that were the norm?
• Remote attackers with stolen credentials could not log in because the software and hardware on the computer originating the attack could not authenticate. Attackers with a stolen computer and stolen credentials could not log in because the biometrics would not match. Stolen credentials alone would not grant access.
• When “operational” data – data capable of acting independently, or of modifying or instructing existing software (that is, applications, patches or malware) – is received, it cannot function unless it authenticates. Malware, if present, cannot execute.
• Confidential data that is self-protecting and self-controlling carries its own authentication requirements when sent; it is unusable unless the recipient authenticates. Stolen data objects could not yield usable information.
Is it possible? Of course it is. Biometric user authentication is widely available, and calls to use it are frequent. Authenticatable identities for individual instances of software and hardware are less common but do exist. There are software license management tools that authenticate instances of software, and vendors that prevent malware insertion by authenticating and verifying the provenance of patches and upgrades. There are hardware manufacturers that provide authenticatable identities. So the technology for inherently more secure authentication exists, but its application is patchy and inconsistent. Why?
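Patch and upgrade provenance checking of the kind mentioned above can be sketched as a sign-then-verify handshake. Real code signing uses asymmetric signatures and certificate chains; this sketch substitutes a shared-key HMAC to stay self-contained, and the key and function names are assumptions for illustration.

```python
import hashlib
import hmac

# Stand-in for a real publisher signing key / certificate chain
PUBLISHER_KEY = b"shared-secret-with-publisher"

def sign_patch(patch: bytes, key: bytes = PUBLISHER_KEY) -> str:
    """Publisher side: produce an authenticity tag over the patch bytes."""
    return hmac.new(key, patch, hashlib.sha256).hexdigest()

def verify_patch(patch: bytes, tag: str, key: bytes = PUBLISHER_KEY) -> bool:
    """Recipient side: refuse to install unless the tag authenticates,
    proving both origin and that the bytes were not altered in transit."""
    expected = hmac.new(key, patch, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)
```

An update that arrives without a valid tag – whether tampered with or from an unknown source – simply never gets installed, which is the “incoming raw material” check the HHC analogy calls for.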
Not my job
The first reason is that people don’t demand authenticatable identities from hardware and software manufacturers because they don’t know to ask. The second reason is that, unlike HHC plants, software and hardware manufacturers are generally not held liable for harm caused by missing or poorly designed authentication. How do they get away with that? First, their license agreements typically require users to waive recovery of damages as a condition of use – something no HHC plant could get away with.
Not newsworthy
Second, uncontrolled data objects are dangerous and harmful (around $1 trillion in losses in 2020), but they are invisible. There are no compelling videos of toxic clouds of uncontrolled data, or of its victims filling hospitals, and computers don’t explode when their data is stolen, ransomed, destroyed or corrupted. Third, breaches are so common that we all suffer from breach fatigue. So while the world is arguably far more threatened by uncontrolled data than by uncontrolled HHCs, there are not yet hordes of people demanding solutions.
Be the Horde
If you have been reading this series, you will have noticed that each article calls leaders to action. Why? Because the cybersecurity pandemic is curable – but it won’t be cured without business and government leaders gaining new knowledge (hence these articles) and acting on it.
You can also read:
The physicality of data and the path to personal data ownership. When data was first digitized in the 1950s, no controls were built in to protect it from unauthorized use or misuse. Confusing the information in our minds with physical data, as stored on computers, hinders efforts to control how our data is used.
The physicality of data and the path to cybersecurity. With cyber attacks on the rise, remember that the potential for cyber offense always exceeds the potential for cyber defense. Data objects can defend themselves with encryption, rendering them unusable if captured by cyber attackers. Unfortunately, little data is encrypted today.
The physicality of data and the path to inherently more secure computing. Today’s software industry is precisely where the chemical industry was in 1978; risk control is an afterthought. Most software leaks like a sieve and doesn’t systematically keep strangers (cyber attackers) out. This is not hyperbole; read the news. – David Kruger