From airgap to zero-trust: Enhancing cybersecurity in the testing space
Story | August 04, 2023
The Cybersecurity Maturity Model Certification (CMMC), a unified standard for security introduced by the U.S. Department of Defense (DoD), is bringing about a cultural shift within engineering and test organizations.
Until recently, engineering and test organizations doing business with the DoD and other critical agencies attempted to bypass information technology (IT) cybersecurity requirements for their operational technology (OT) through:
- Exceptions (“You don’t need to check this; I’ve already secured this device.”)
- Skirting requirements (“You don’t need to check this; this component is not a desktop, laptop, or phone.”)
- Relying on airgaps (“You don’t need to check this; you can’t access this device remotely.”)
Yet significant, highly damaging cybersecurity breaches have been occurring with frequency for quite some time in systems that were supposedly already secured, were not conventional computers, and had no physical or wireless connection to a network.
These breaches led the DoD to introduce CMMC several years ago. With CMMC, IT departments are now stepping up to audit and manage all OT, shifting the culture from “exception-based” to “zero-trust.” Defense industrial base (DIB) contractors – the companies and laboratories that enable research and development of military weapons systems, subsystems, and components or parts – must follow the new certification model to ensure that sensitive information is properly protected. The certifications themselves must be conducted by third-party assessors.
While this change may appear daunting and potentially expensive, a holistic approach to security that involves collaboration among engineers, testing teams, IT professionals, and testing providers is essential to control costs and maintain efficiency.
A plane is not a plane
The F-35 fighter jet, at $80 million a pop, is one of the top weapons in the U.S. arsenal … and a typical example of a complex system designed for military use. (Figure 1.)
[Figure 1 | An F-35C Lightning II sits on the flight deck of the Nimitz-class aircraft carrier USS Carl Vinson (CVN 70). U.S. Navy photo/Mass Communication Specialist 3rd Class Erin C. Zorich.]
The aircraft contains a sizable number of “computing” elements, and each system and component within it is interconnected with the others, giving rise to complex vulnerabilities.
The plane is only as secure in the sky as it is in the hangar; every system that makes up the plane needs to work with every other, so every single one is a potential point of failure.
The U.S. military has deep experience ensuring that every part works, is not counterfeit, and operates with all the other components how and when it needs to. It has been less adept at identifying cyberwarfare threats.
Such threats tend to be far more serious. A broken tailhook can create a dangerous situation for a pilot, but that tailhook is just an object – it is not malicious. In contrast, a sophisticated radar system on a jet is not conscious or capable of emotion, but once hacked, it can be turned against the pilot in any number of ways.
From a cybersecurity perspective, the plane is not a plane – it is a stack of individual components, each at risk for a severe breach at any time. Every component is at risk; by extension, every contractor’s product and production processes are also at risk.
Put it this way: Every F-35 is only as secure as its least-secure contractor.
Much bigger than planes
Of course, as frightening as a hacked fighter jet would be, it does not compare in scope or destructiveness with other real-world hacking that has recently taken place.
The Stuxnet incident, which occurred from 2005 to 2010, is a notable example of a sophisticated cyberattack that targeted Iran’s nuclear program. Stuxnet, an extraordinarily complex and malicious computer worm, was designed to sabotage the uranium-enrichment process in Iranian nuclear facilities.
The attack targeted programmable logic controllers (PLCs) used in centrifuge cascades, causing them to malfunction and spin at speeds that destroyed both the centrifuges and their output. For one particular democracy in the region, Stuxnet’s successful infiltration and manipulation of Iranian nuclear infrastructure, which significantly derailed Iran’s nuclear program, might count as a happy ending. Just as significant, though, it showed the destructive potential of malware in an OT setting where nuclear materials are processed.
Here on home soil, the Colonial Pipeline hacking incident occurred in May 2021, when a cybercriminal group known as DarkSide launched a ransomware attack on the Colonial Pipeline, one of the largest U.S. fuel pipeline systems. The attack disrupted the pipeline’s operations, leading to a temporary shutdown and causing significant fuel shortages along the East Coast.
The hackers gained unauthorized access to Colonial Pipeline’s systems, encrypting critical data and demanding a ransom payment. The incident highlighted the vulnerability of critical infrastructure to cyberthreats and demonstrated the potential impact on essential services and the economy. It also underscored the urgent need for robust cybersecurity measures and proactive defense against cyberattacks on critical infrastructure systems.
It is important to note that, in both cases, the affected systems were “air gapped,” with the hacks achieved through ingenious social engineering – getting operators to inadvertently introduce malware onto the targeted systems. Air gapping is the practice of keeping a computing device physically disconnected from any network and unable to connect wirelessly, creating a “gap” of “air” between the device and any source of malware.
Air gapping was once considered the defense of last resort for test engineers looking to keep their computing devices free of malware (and their testing teams free of IT input). As these two incidents illustrate, this approach was simple, elegant, and wrong … and far from enough to prevent bad actors from causing catastrophic damage.
A zero-trust world
The threat posed by cyberattacks had been clear to many since the Stuxnet attacks, which began more than 15 years ago. The DoD finally acted by introducing a unified cybersecurity policy in the form of the CMMC in January 2020, and incidents like Colonial Pipeline have only underscored the urgency of seeing it through.
CMMC sets a framework and certification process to enhance cybersecurity standards across industries. The certification is based on a new culture of “zero trust,” in which every component – and its relationship to every other component – is validated at every step of assembly and use.
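To make “validate at every step” a bit more concrete on the test floor, here is a minimal sketch in Python. It is purely illustrative, not NI’s or the DoD’s actual tooling: the component names, the manifest format, and the digests are hypothetical stand-ins for whatever an IT department would actually maintain.

import hashlib

# Hypothetical IT-maintained allowlist of approved firmware digests (SHA-256).
# In practice this manifest would itself be signed and distributed by IT.
APPROVED_FIRMWARE_DIGESTS = {
    "daq-module-01": "<sha256-of-approved-daq-image>",
    "rf-controller-02": "<sha256-of-approved-controller-image>",
}

def firmware_digest(firmware_image: bytes) -> str:
    """Compute the SHA-256 digest of a firmware image read back from a device."""
    return hashlib.sha256(firmware_image).hexdigest()

def verify_component(component_id: str, firmware_image: bytes) -> bool:
    """Zero-trust check: a component is untrusted until its identity and firmware
    digest match the approved manifest -- no exceptions for devices that are
    'already secured,' 'not a computer,' or air gapped."""
    expected = APPROVED_FIRMWARE_DIGESTS.get(component_id)
    return expected is not None and firmware_digest(firmware_image) == expected

def station_may_operate(components: dict) -> bool:
    """A test station comes online only if every attached component verifies."""
    return all(verify_component(cid, image) for cid, image in components.items())

Real zero-trust deployments go much further – signed manifests, hardware roots of trust, continuous re-verification – but the posture is the same: verification replaces exceptions.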
Companies must prepare for the associated costs and complexities and consider who will bear the financial burden of certification to obtain an authorization to operate (ATO) from the DoD. To be sure, whatever those efforts cost, an ATO is necessary for any contractor looking to sell to government agencies. Each company must assess whether it is cheaper to meet the new standards or to get out of the business of selling to government agencies altogether.
Contractors that choose to rise to the new standards will need to embrace an environment where engineers and test teams no longer have any shortcuts to securing their computing devices. That means they must collaborate with new organizational stakeholders to meet CMMC standards efficiently.
Why is IT knocking on my door?
This new integration of IT into testing will entail a significant culture shift. Many customers are getting their first taste of this shift and note that IT managers are intruding on the testing process in ways they never have before.
Most testing teams at defense and aerospace firms have already adopted best practices for ensuring quality and authenticity at every step in the supply chain. Like the U.S. government, however, they have been less diligent about ensuring cybersecurity throughout the process.
These realities mean that IT must be involved in the testing stages to ensure that contractors build cybersecurity into chips, components, and systems right from the beginning. The emphasis on cybersecurity also needs to extend up the supply chain: contractors must ensure that their testing hardware and software are entirely CMMC-compliant.
To be sure, fully compliant testing solutions do not guarantee fully secure devices. There are inherent vulnerabilities built into specific devices, such as a data port that bad actors could exploit. Understanding and responding to those vulnerabilities is another reason IT needs to be more involved in testing.
Embracing collaboration for a secure supply chain
Industry experts estimate the cost of CMMC compliance at between $50,000 and $75,000 per product – with some estimates approaching $100,000 – and with no economies of scale or scope. The companies that have achieved compliance most quickly and cost-effectively are those that have embraced greater collaboration between their IT departments and their testing teams.
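To put those per-product figures in perspective: a contractor with, say, 20 products requiring certification could be looking at roughly $1 million to $2 million in compliance costs, because the absence of economies of scale means each product bears the full expense on its own.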
IT’s experience in the cybersecurity space enables testing teams to identify multiple areas of potential insecurity early in the process. Testing teams could discover those vulnerabilities on their own, but addressing them piecemeal is not sustainable, given the costs.
In this new zero-trust world, engineers, testing teams, IT professionals, and testing providers must all be connected to ensure cybersecurity.
It’s about time.
Steve Summers is Director, Offering Management, at NI, focusing on mechanical systems and structural test for aerospace and defense customers. He earned a degree in physics from Brigham Young University and has worked in roles as an application engineer, sales engineer, account owner, and product manager. He has worked in the test and measurement industry for more than 25 years and is passionate about providing a path to success for engineers driving the technologies of tomorrow.
NI (formerly National Instruments) https://www.ni.com/