AI driving major changes in DoD test and measurement for radar/EW
Story | September 6, 2022
Radar and electronic warfare (EW) are challenging applications for designers of test systems. The only constant seems to be change in technologies, tactics, and countermeasures. Industry players believe that artificial intelligence (AI) may hold the key to improving the effectiveness of radar and EW systems.
Constant software evolution, open standards, and higher bandwidth are all key drivers of test and measurement designs in defense applications. Each also creates an environment where AI techniques can enable performance improvements that in turn enhance battlefield effectiveness.
The top trends for radar and EW test and measurement systems are more advanced antijamming modes, more adaptable waveforms, and the use of AI to produce more effective systems, says Haydn Nelson, principal solutions manager for radar, EW, wireless, and EO/IR test solutions at National Instruments (NI – Austin, Texas).
“On the sensor side, they use AI as a way to pull out threats from the chaos,” he continues. “They’re building systems that may automatically do something based on a learned behavior and adapt to a threat. It will realize, ‘Hey, I’m trying to sense something and I realize I’m being jammed. I can be cognitively aware of that attempt to deny access and I can change my waveform or some parameter of the radar to avoid someone messing with me.’”
Testing the effectiveness of that AI is easier said than done, however. One of the challenges testers run into is how to train an AI model against threats it has never seen, since the threat environment contains both known and unknown signals.
“Algorithmic testing is a big challenge because of the complexity,” Nelson notes. “We’re seeing a lot more people trying to find ways to do that in the digital realm like with simulation, and then take a radar subsystem and hook it up to a digital environment emulator so you can do testing in the lab before you take it on the open-air range.”
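One way to picture the known/unknown problem Nelson describes is a classifier that can admit it does not recognize something. Below is a minimal Python sketch of a nearest-centroid match over simple pulse-descriptor features, with a distance threshold that routes unfamiliar emissions to an "unknown" bin; the catalog entries, feature scaling, and threshold are illustrative assumptions, not any vendor's fielded method.

import numpy as np

# Illustrative pulse-descriptor features: [RF (GHz), pulse width (us), PRI (us)].
# Known emitter classes and made-up catalog centroids for the sketch.
CATALOG = {
    "search_radar": np.array([3.1, 10.0, 1000.0]),
    "track_radar":  np.array([9.4,  1.0,  100.0]),
    "nav_radar":    np.array([9.4,  0.2,  500.0]),
}
SCALE = np.array([1.0, 5.0, 500.0])   # rough per-feature scaling
THRESHOLD = 1.0                       # normalized-distance cutoff for "unknown"

def classify(pulse: np.ndarray) -> str:
    """Label a pulse descriptor, or return 'unknown' if nothing in the catalog is close."""
    best_name, best_dist = None, np.inf
    for name, centroid in CATALOG.items():
        dist = np.linalg.norm((pulse - centroid) / SCALE)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist < THRESHOLD else "unknown"

print(classify(np.array([9.4, 1.1, 105.0])))   # near the track_radar centroid
print(classify(np.array([16.0, 0.05, 20.0])))  # far from everything: flagged unknown

The same structure also hints at why testing is hard: the quality of the system depends entirely on how representative that catalog and threshold are of what will actually be encountered.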
Growing opportunities in AI
Chris Johnston, director of radar and EW solutions for Keysight Technologies (Santa Rosa, California), believes that the sheer volume of data collected in modern testing requires computer-assisted analysis of the raw data.
“It is not feasible to have engineers and analysts scrub terabytes of data to find patterns, signals of interest, or interference,” Johnston says. “AI can play a major role in reducing big data down to something actionable. Also, when possible, having open architectures provides end users a way to include their algorithms into the test system, since much of the research is still proprietary research.” (Figure 1.)
[Figure 1 | Keysight’s Z9500A simulation view for dynamic scenarios. (Illustration courtesy Keysight Technologies.)]
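One way to picture the data-reduction role Johnston describes is a first-pass triage over a recorded IQ capture: estimate power in time/frequency tiles and keep only the tiles that sit well above the noise floor for an analyst, or a downstream model, to examine. The Python sketch below is a minimal illustration with made-up signal parameters, not Keysight's approach.

import numpy as np

def triage(iq: np.ndarray, fft_len: int = 1024, margin_db: float = 10.0):
    """Return (frame, bin) indices of spectrogram tiles well above the estimated noise floor."""
    n_frames = len(iq) // fft_len
    frames = iq[: n_frames * fft_len].reshape(n_frames, fft_len)
    spec = np.fft.fftshift(np.fft.fft(frames, axis=1), axes=1)
    power_db = 10.0 * np.log10(np.abs(spec) ** 2 + 1e-12)
    noise_floor = np.median(power_db)            # crude noise estimate
    return np.argwhere(power_db > noise_floor + margin_db)

# Synthetic capture: complex noise with a short pulsed tone buried in it.
rng = np.random.default_rng(0)
iq = (rng.standard_normal(1 << 16) + 1j * rng.standard_normal(1 << 16)) * 0.1
iq[20000:24000] += np.exp(2j * np.pi * 0.05 * np.arange(4000))
hits = triage(iq)
print(f"flagged {len(hits)} spectrogram tiles for analyst review")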
Tim Fountain, radar and EW market segment manager at Rohde & Schwarz (Columbia, Maryland), notes that the testing community is “actually getting there” when it comes to AI, although challenges remain.
“Now the challenge is everything has to be cognitive,” he says. “Take spectrum sensing: The systems need to be able to look at the spectrum and make decisions in real time about where they can operate given geopolitical boundaries and where adversaries are operating. On top of that, how do you actually create realistic environments for these cognitive systems?
“And that’s been one of the challenges,” he continues. “Because in the majority of these situations, you have to create real environments that look like what the thing is going to see when it gets out there – like congestion. The challenge becomes, how do you create data sets that realistically model what systems are going to see such that we can train it?”
Fountain likens a new algorithm to a baby that doesn’t understand much at first: “You need to train the baby through real-world examples – don’t do that, don’t touch that, eat this,” he says. “It’s a similar situation with these AI algorithms.”
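Fountain's spectrum-sensing example reduces, at its simplest, to a decision loop: measure occupancy across the bands the mission and geography allow, then pick where to operate. The sketch below is a minimal illustration with assumed channel boundaries and a fabricated sweep; a real cognitive system would also weigh policy, geolocation, and threat data.

import numpy as np

# Assumed allowed operating channels (MHz) after policy and geographic filtering.
ALLOWED_CHANNELS = [(2700, 2800), (2800, 2900), (2900, 3000), (3000, 3100)]

def pick_channel(freqs_mhz: np.ndarray, psd_dbm: np.ndarray):
    """Choose the allowed channel with the lowest average measured power."""
    occupancy = []
    for lo, hi in ALLOWED_CHANNELS:
        mask = (freqs_mhz >= lo) & (freqs_mhz < hi)
        occupancy.append(psd_dbm[mask].mean())
    best = int(np.argmin(occupancy))
    return ALLOWED_CHANNELS[best], occupancy[best]

# Fake sweep: quiet everywhere except a strong occupant around 2850 MHz.
freqs = np.linspace(2700, 3100, 4000)
psd = np.full_like(freqs, -100.0)
psd[(freqs > 2840) & (freqs < 2860)] = -40.0
print(pick_channel(freqs, psd))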
Digital testing
Testing radar and EW equipment at an actual open-air range – such as Naval Air Weapons Station China Lake in California – is a critical step for new technology to be accepted, since it is the only opportunity to test it against real-world systems and weapon stations, Nelson says. These are expensive tests, though, so there are very few opportunities, and each test runs in one specific weather scenario, one threat scenario, and one environment. That reality creates a problem for testers who want to understand how the system will perform in another part of the world against a potential adversary, often in challenging conditions, he explains. “You’re going to need to do some of that testing digitally,” Nelson points out. “There are a lot of threat emulators out there, so we see a lot of people looking to do more of that in the lab.” (Figure 2.)
[Figure 2 | NI’s Radar Target Generation Driver built on the vector signal transceiver. (Illustration courtesy NI.)]
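At the signal level, a digital target or environment emulator of the kind Nelson describes hands the radar's receiver what it would have seen over the air: a delayed, Doppler-shifted, attenuated copy of the transmitted waveform for each simulated target. The single-target baseband sketch below uses illustrative numbers; products like the one shown in Figure 2 do this in real time across many targets and channels.

import numpy as np

def point_target_return(tx: np.ndarray, fs: float, delay_s: float,
                        doppler_hz: float, gain: float) -> np.ndarray:
    """Emulate one point target: delay, Doppler-shift, and attenuate the transmit waveform."""
    delay_samples = int(round(delay_s * fs))
    rx = np.zeros(len(tx) + delay_samples, dtype=complex)
    t = np.arange(len(tx)) / fs
    rx[delay_samples:] = gain * tx * np.exp(2j * np.pi * doppler_hz * t)
    return rx

# Illustrative numbers: 100 MHz sample rate, 10 us LFM chirp, target at roughly
# 15 km with a 20 kHz Doppler shift and heavy path attenuation.
fs = 100e6
t = np.arange(int(10e-6 * fs)) / fs
chirp = np.exp(1j * np.pi * (20e6 / 10e-6) * t**2)   # 20 MHz linear FM sweep
echo = point_target_return(chirp, fs, delay_s=2 * 15e3 / 3e8,
                           doppler_hz=20e3, gain=1e-3)
print(len(chirp), len(echo))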
An ongoing challenge surrounds testing more technologies digitally and in the lab in realistic ways, such as conducting multiple layers of testing and trying out different environments.
The ideal, Nelson says, is “being able to come up with digital models and then you can have a digital model of a certain threat scenario you can test over and over.”
A key problem in those models and scenarios, however, is the unknown-unknown threats, Fountain says. For example, if you want to test how a radar detects a certain type of missile, you know the adversary will probably modify some of the missile’s operating parameters in an actual wartime situation so that the radar has more trouble spotting it.
“As a result, the system doesn’t fit a static library model,” he observes. “You can’t go out in the real world and capture these new modes because adversaries are not operating them in those modes absent a real conflict. But that’s the real challenge: You’ve got to predict how something is going to work without knowing how something is really going to work.” (Figure 3.)
[Figure 3 | Rohde & Schwarz’s Integrated, Record, Analysis, Playback System (IRAPS). (Illustration courtesy Rohde & Schwarz.)]
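A common workaround for those unknown unknowns is to perturb the parameters of known threat models within plausible engineering bounds, so the system under test faces modes that are near, but not identical to, anything in its library. The Python sketch below uses made-up baseline parameters and spreads and is not drawn from any threat library.

import random

# Baseline (made-up) parameters for a known emitter mode.
BASELINE = {"rf_ghz": 9.3, "pri_us": 250.0, "pulse_width_us": 1.5}
# Plausible fractional spreads an adversary might apply to each parameter.
SPREAD = {"rf_ghz": 0.05, "pri_us": 0.30, "pulse_width_us": 0.50}

def synthesize_unknown_mode(rng: random.Random) -> dict:
    """Perturb each baseline parameter by a random fraction within its spread."""
    return {k: v * (1.0 + rng.uniform(-SPREAD[k], SPREAD[k])) for k, v in BASELINE.items()}

rng = random.Random(42)
for _ in range(5):
    case = synthesize_unknown_mode(rng)
    print({k: round(v, 3) for k, v in case.items()})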
Software driving the industry
Systems are becoming more software-driven than ever before, Nelson notes, meaning that radar systems will have the ability to change and adapt waveforms, frequencies, and modes depending on the threat. Open standards have made it easier to develop these types of software for radar systems, he says.
“If I’m an engineer and I’m designing a new mode on a radar where I’m trying to train for a new threat in the threat space, I want to do that in software and do it quicker,” he explains. “I think that’s another reason why they’ve pushed for open standards: because they can quickly upgrade hardware as software evolves. ‘Oh, we’ve got to apply a larger model and our four-year-old processor is going to run out of steam.’ The benefit of open standards is we can do tech insertion.” All of that has to be mirrored on the test side, Nelson says: “You have to have a software-defined test system.”
Keysight’s Johnston adds that software is key to the testing industry as it tries to find better ways to do modeling.
“There is a rapidly growing need for digital test beds, so the software and firmware algorithms are test[ed] in up-to-date software development paradigms – like continuous integration,” he says. “This requires that the test and measurement test beds run as digital twins, so they are integrated and regression-tested many times in automated software loops. Ideally these same tests are migrated to full [hardware] test beds when the software is ready to test in actual deployments, particularly for congested and contested environmental modeling in EW scenarios.”
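The automated loop Johnston describes looks much like ordinary software CI, except the test fixture is a signal model rather than a unit stub. The sketch below shows the shape of such a regression check: a synthetic scenario is generated, a placeholder detector runs against it, and the test asserts the algorithm still finds the echo it is supposed to find. The detector, scenario, and thresholds are stand-ins, not Keysight’s.

import numpy as np

def detect(rx: np.ndarray, threshold: float = 6.0) -> np.ndarray:
    """Placeholder detector: return sample indices whose magnitude exceeds threshold * noise estimate."""
    noise_est = np.median(np.abs(rx)) / 0.6745   # crude robust noise estimate
    return np.flatnonzero(np.abs(rx) > threshold * noise_est)

def make_scenario(seed: int) -> tuple[np.ndarray, int]:
    """Digital-twin stand-in: complex noise plus one strong echo at a known index."""
    rng = np.random.default_rng(seed)
    rx = rng.standard_normal(4096) + 1j * rng.standard_normal(4096)
    truth = 1234
    rx[truth] += 40.0
    return rx, truth

def test_detector_finds_known_echo():
    """Regression check run on every commit in the automated loop."""
    for seed in range(10):                       # sweep several scenario seeds
        rx, truth = make_scenario(seed)
        assert truth in detect(rx), f"missed echo in scenario {seed}"

test_detector_finds_known_echo()
print("regression suite passed")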
5G’s future role
Another area of interest is the challenge of adopting 5G technology for military applications. While 5G offers tantalizing opportunities to the military to pass information on the battlefield more effectively and with less expensive hardware, it also carries security issues that testers will need to resolve before it can be widely adopted in the defense arena.
“If you build anything that’s networked, it can be hacked,” Nelson says.
He also says that he supports such initiatives as the Air Force Research Laboratory (AFRL) software-defined radio (SDR) competition. In last year’s challenge, a team of undergraduates from Worcester Polytechnic Institute (WPI – Worcester, Massachusetts) used a $1,200 radio to jam just one specific user, which exposed a surprising vulnerability in the 5G standard.
“5G is great for commercial, but I think DoD [the U.S. Department of Defense] will have to have their own version of the standard,” he maintains. “They may reuse some of the terminal hardware, but for tactical applications it needs to be hardened.”
Other complications arise from using 5G, he adds: “They use some bands that get really close to radar bands,” he observes. “As 5G goes global, how does that impact radar and sensor systems out there? When we fly this specific radar system off the coast and they have a 5G network, is it impacting our sensor systems?”
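At its simplest, that coexistence question is bookkeeping: how much guard band separates a deployed 5G carrier from the band a radar occupies? The sketch below uses approximate edges for 3GPP band n77 and an illustrative radar band; a real interference analysis would also account for emission masks, power levels, and geometry.

# Illustrative band edges in MHz.
N77_5G = (3300, 4200)        # 3GPP band n77 (approximate)
RADAR_BAND = (4200, 4400)    # illustrative radar operating band

def separation_mhz(a: tuple, b: tuple) -> float:
    """Guard band between two bands; zero or negative means they touch or overlap."""
    return max(a[0], b[0]) - min(a[1], b[1])

gap = separation_mhz(N77_5G, RADAR_BAND)
if gap <= 0:
    print(f"bands touch or overlap by {-gap} MHz; coexistence testing needed")
else:
    print(f"{gap} MHz guard band; check adjacent-channel emissions anyway")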
Darren McCarthy, aerospace and defense industry segment manager at Rohde & Schwarz (Columbia, Maryland), says that the commercial space is outspending the DoD 45-to-1 in satellite technology research, so the military has a lot of catching up to do in this area. The DoD and the defense industry will have to find some way to make use of the commercial 5G technology already available.
“There’s no longer going to be a 20-year program to put up a satellite,” McCarthy asserts. “[The military’s] going to use what’s up there and carve out a piece of the commercial network to utilize for themselves.”