Moore’s Law – or the law of more?
May 07, 2021
By Todd Prouty
The insatiable demand for instantaneous information extends to the tactical edge and into the realm of unmanned aerial vehicles. But can underpinning technologies match that reach?
The Global War on Terror – post-September 11, 2001 – highlighted battlefields riddled with unprecedented threats and blind spots, including concealed hideouts into which violent enemies could quickly disappear after attacks. The U.S. military had drone capabilities, but using them to track enemy movement was, as frequently described, like looking through a soda straw. A decade ago, the U.S. military rolled out sweeping unmanned video capture technology via Gorgon Stare, a wide-area persistent surveillance sensor system attached to an unmanned aerial vehicle (UAV) capable of capturing entire cities’ worth of motion imagery.
Ten years ago, the technology was revolutionary in augmenting surveillance and reconnaissance, but for actionable intelligence, troops still lacked the high-speed connectivity needed to move the imagery and full-motion video (FMV) quickly enough to inform decision-making. Over time, forces gained faster processing for rapidly collecting and disseminating visual data.
The continued evolution of high-speed technology in theater, mounted sensors, and data exploitation has unleashed a new era in the tactical deployment of UAVs. This transformation has given rise to UAVs equipped to track enemy forces with live FMV feeds, the emergence of open-source video streaming, and other high-fidelity tools for real-time situational awareness and reliable decision-making.
This transformation has also created an urgent demand for high-capacity, high-speed data processing, intelligence mining, and information sharing, placing extraordinary pressure on deployed IT networks. The U.S. Department of Defense (DoD) and industry are partnering to expand high-speed data transmission to meet the growing demand for artificial intelligence (AI) and integration at the tactical edge. Likewise, the Army is currently two years into its integrated tactical network modernization effort aimed at boosting resilience and advancing functionality. Are these efforts enough to keep up with near-peer adversaries?
Much of this work is coming together under the Joint All-Domain Command and Control (JADC2) approach. JADC2 – the DoD’s concept to connect sensors from all of the military services into a single network – aims to interconnect troops and technologies across platforms and multidomain operations. By bringing the cutting edge to the tactical edge, joint forces can collaboratively and effectively confront amorphous and increasingly sophisticated adversaries. (Figure 1.)
[Figure 1 | Tech. Sgt. John Rodriguez provides security with a Ghost Robotics Vision 60 prototype at a simulated austere base during the Advanced Battle Management System (ABMS) exercise at Nellis Air Force Base, Nevada, during September 2020. The ABMS is an interconnected battle network – the digital architecture or foundation – that collects, processes, and shares data relevant to warfighters in order to make better decisions faster. The Air Force is developing the ABMS to implement the Joint All-Domain Command and Control (JADC2) approach. U.S. Air Force photo by Tech. Sgt. Cory D. Payne.]
“We are leading a continuous Army modernization effort to ensure our future warfighters possess the concepts, capabilities, and organizational structures they need to dominate a future battlefield,” says Chief Warrant Officer 5 Chris Westbrook, senior technical advisor for the Army’s Network-Cross Functional Team (N-CFT). Westbrook stresses how testing and prototyping will help enhance network capacity, resilience, and convergence to support the compute demand on interconnected systems and platforms.
“From the numerous white papers down to the few selected prototypes, we are providing a rigorous and agile method for pivotal and innovative technologies to transition into fully fielded capabilities,” he says.
Brig. Gen. Rob Collins, Army program executive officer for Command, Control, Communications-Tactical, notes his emphasis on the multidomain integration of sensor intelligence, particularly from airborne platforms – military and commercial – including UAVs. To him, this is part and parcel of the trajectory toward an effective JADC2 implementation. Integrating AI can provide algorithms that, for example, harness computer vision and machine learning to comb through UAV feeds, automatically keying in on and targeting enemy movement.
“Deep sensing needs to be a big focus for [multidomain operations], leveraging both national and commercial capabilities, specifically in space, and manned and unmanned aerial ISR [intelligence, surveillance, and reconnaissance], and collectively taking that synthesizing [of data] to inform mission command,” he says.
Is the crucial steppingstone getting overlooked?
The reality is that few of JADC2’s many ambitious and expansive goals can be achieved without the computing power that’s fundamental to delivering real-time data processing, AI algorithms, high-speed data relay, and intelligence fusion from multiple domains.
Even the promise of what’s being tested today for tomorrow’s application won’t be possible without a behind-the-scenes computing evolution – especially compute solutions that are available for, and can reliably perform at, the tactical edge. Evolving from central processing units (CPUs) to graphics processing units (GPUs) and now to data processing units (DPUs) is driving a commensurate transformation in mission capability.
“It used to be just a demand to get the information quickly; now we’re seeing a move toward taking decision-making out to the edge, in the air – instantaneous and actionable intelligence,” says Chris Cargin, technical director at Crystal Group.
That technology must undergo a transformation of its own before it can be deployed into the harsh conditions of the battlefield environment. From individual sensors to interconnect mechanisms to GPUs/DPUs, the often-fragile components must be ruggedized to perform regardless of temperatures, precipitation, shock, vibration, exposure, or other potentially damaging conditions – including altitude for UAVs.
“Today’s models are being optimized and improved so algorithmic capability can fit into a smaller footprint and mounted onto a UAV – not just to collect data like Gorgon Stare or watch video streams and make decisions after the fact,” Cargin notes. “Now, the goal is to execute next-generation, on-UAV decisions in real time using detailed information being detected in-theater.”
Next-generation abilities will harness and apply the processing capabilities of GPUs and DPUs deployed on UAVs to execute cloud-based intelligence operations, real-time decisions, and AI algorithms. They must also achieve a delicate but critical balance between compute capacity and size, weight, and power.
“Our vision is to keep putting heavy processing as high and as forward as possible without compromising compute,” Cargin says. “It’s really about giving commanders options.”
That’s the clear mandate: tactical flexibility delivered through high-speed processing and data sharing anytime, anywhere. Ongoing platform integration and proliferation of the “Internet of Battlefield Things” will require machine-speed processing and exploitation, adding to the overall complexity. Multipurposing UAVs for a range of missions and applications will further add to this massive data influx. (Figure 2.)
[Figure 2 | Members of the 6th Special Operations Squadron use a tablet running Tactical Assault Kit (TAK) to upload coordinates during an exercise showcasing the capabilities of the Advanced Battle Management System (ABMS) in late 2019. U.S. Air Force photo by Tech. Sgt. Joshua J. Garcia.]
Industry is partnering with the military to understand the requirements and help develop open-source, modular solutions that provide forward-edge interoperability, network processing, commercial cloud capabilities, and AI tools. It’s a race against an accelerating version of Moore’s Law, which states that the number of transistors on a computer chip doubles every two years. Today, according to AI research group OpenAI, AI capabilities are doubling every 3.5 months – a pace that leaves a software-defined system only about 10% as capable as the state of the art one year after it was designed, and only about 1% as capable after two years.
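The 10% and 1% figures follow directly from compounding the 3.5-month doubling cadence; a quick back-of-the-envelope calculation (an illustrative sketch, not an official DoD or OpenAI metric) bears them out:

```python
# Relative capability of a frozen (unupgraded) system, assuming AI
# capability in the field doubles every 3.5 months.

DOUBLING_PERIOD_MONTHS = 3.5  # doubling cadence cited by OpenAI

def relative_capability(months_elapsed: float) -> float:
    """Fraction of state-of-the-art capability a frozen system retains."""
    doublings = months_elapsed / DOUBLING_PERIOD_MONTHS
    return 1.0 / (2.0 ** doublings)

print(f"After 1 year:  {relative_capability(12):.1%}")  # prints "After 1 year:  9.3%"
print(f"After 2 years: {relative_capability(24):.1%}")  # prints "After 2 years: 0.9%"
```

Twelve months is roughly 3.4 doublings (about an 11x gap, hence ~10%), and 24 months is roughly 6.9 doublings (about a 116x gap, hence ~1%).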
This is where the evolution in DPUs, modular capabilities, and tactical-edge processing will be crucial. Integrating DPUs – along with open-source, continuously upgraded components developed through agile processes – will catalyze real-time sorting and calculating. This approach will also bring the network forward to the edge and help realize the high-powered goals of a data-driven, dominant military.
Like the rest of the battlefield technology transformation elements, though, it will take time to transition. New network switches, processors, and the emerging wave of CPUs, GPUs, smart network interface cards (SmartNICs), and DPUs will all help execute both AI deep learning and inference at the edge, while also ensuring the modules and their functions are secure, rugged, and optimized for integration and collaboration.
Last fall in a virtual briefing held by the Air Force Association’s Mitchell Institute for Aerospace Studies, Preston Dunlap, chief data architect for the Air Force, compared those modules to Lego pieces, noting everything must fit into place to operate cohesively and effectively.
Dunlap stated during the briefing: “Underpinning all this is digital engineering: Open architecture and open standards that ensure those Lego blocks – just like real Legos – actually snap together and work.”
Todd Prouty is a business-development manager at Crystal Group. He’s spent more than 15 years directing product development in avionics and communications systems, with particular focus on the military market.
Crystal Group https://www.crystalrugged.com/