The big data battlefield
August 09, 2019
Market-intelligence firm IDC predicts that the world’s collective data, currently estimated at 33 zettabytes, will eclipse 175 zettabytes by 2025. That’s equivalent to roughly forty billion pounds of common one-terabyte storage disks. Intelligence and military applications rely on massive data pipelines to drive intelligence gathering and mission-critical decision-making. The speed at which the warfighter can collect, process, analyze, and understand data directly impacts mission success.
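For a quick back-of-envelope check on that comparison (the weight assumed below for a commodity one-terabyte drive is our assumption, not an IDC figure), the arithmetic works out as follows:

```python
# Back-of-envelope check: what would 175 zettabytes of 1 TB drives weigh?
TB_PER_ZETTABYTE = 1_000_000_000     # 1 zettabyte = 10^9 terabytes
DRIVE_WEIGHT_LB = 0.23               # assumed weight of a 2.5-inch 1 TB drive (~105 g)

drives_needed = 175 * TB_PER_ZETTABYTE            # one drive per terabyte of data
total_weight_lb = drives_needed * DRIVE_WEIGHT_LB

print(f"{drives_needed:,} drives, roughly {total_weight_lb / 1e9:.0f} billion pounds")
# -> 175,000,000,000 drives, roughly 40 billion pounds
```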
What is big data?
“Big data” refers to large datasets so complex that transforming them into useful information cannot be achieved by traditional means. The challenges of big data break down into five fundamental areas – volume, variety, velocity, veracity and value, also known as the five Vs. (Figure 1.)
Figure 1 | The five fundamentals of big data drive analysis, results, and decision-making at the tactical edge whether on land, at sea, or in the air.
- Volume: Datasets are often massive. Without the proper hardware, storing and moving this data can overwhelm existing IT infrastructure. Data volumes are growing exponentially, necessitating scalable solutions.
- Velocity: Analysis is most useful when it’s timely, driving real-time critical thinking and decisions. Important factors that can hamper data processing include insufficient bandwidth, improper communications infrastructure, weather, and outdated hardware.
- Variety: Data comes from a variety of sources and arrives in both structured and unstructured forms. Unstructured data – such as surveillance imagery, sensor readings, and human-generated content – is the most challenging to analyze. Without the proper software tools and analysis techniques, critical information may never surface from the chaotic mix of collected data.
- Veracity: Collected data must be clean and accurate. The hardware that safeguards data contributes to veracity by ensuring all data is reliably and securely stored. Information warfare (IW) and cyberattacks represent growing threats to veracity because they pose the risk of lost or altered mission-critical data.
- Value: The most important of the five Vs. Data is useless unless it can be turned into actionable insight; users cannot deploy resources or make key operational decisions without understanding the risks, costs, and benefits to the mission. (A brief sketch of how these five attributes might be tracked for each collected record follows this list.)
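As a purely illustrative sketch (the field names and the value score are hypothetical and not drawn from any fielded DoD system), the five Vs can be thought of as metadata carried alongside every collected record:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class SensorRecord:
    """Hypothetical record showing how the five Vs map to per-record metadata."""
    payload: bytes          # raw sensor output; its size contributes to volume
    source: str             # originating platform or sensor type (variety)
    structured: bool        # structured vs. unstructured content (variety)
    collected_at: datetime  # UTC collection timestamp, used to gauge velocity
    checksum: str           # integrity hash verified on ingest (veracity)
    mission_value: float    # analyst- or model-assigned priority score (value)

    def age_seconds(self) -> float:
        """Velocity check: how stale is this record?"""
        return (datetime.now(timezone.utc) - self.collected_at).total_seconds()
```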
Data management (DM) has become one of the key drivers in IW. Year after year, the U.S. military faces increased operational commitments and budget constraints, forcing it to do more with fewer resources. The adaptability of the Department of Defense (DoD) has been critical to its ability to make rapid and intelligent strategic decisions, and data analytics has been a key enabler of that adaptability.
The 9/11 attacks changed our world in almost every aspect, including the future of how data would be stored and managed. Sept. 11, 2001 confirmed the need for military and intelligence communities to expand their use of data and analysis tool sets in order to protect the public from terrorist threats. Drawing timely insights from big data and understanding its challenges quickly became paramount to strengthening national security.
What is the tactical edge?
Since 2001, U.S. warfighters have come to operate in an ever more technologically augmented arena, where sensors, wearable computers, Internet of Things (IoT)-enabled devices, and artificial intelligence (AI) systems all contribute to mission success. These devices produce enormous amounts of data (volume) stored in varying formats (variety) that must be transmitted rapidly (velocity) and reliably (veracity) into downstream systems to drive critical decisions (value).
Today’s warfighter often operates in remote, environmentally hostile, and actively contested regions. Far removed from conventional IT infrastructure, warfighters must process big data on-site – at the “tactical edge.” Tactical edge devices can be found on drones, aircraft, land vehicles, and maritime vessels. On these platforms, hardware constraints such as size, weight, and power (SWaP) and extreme environmental conditions must be considered. To optimize space, edge-computing platforms strive to pack the latest processing, memory, storage, and I/O features into compact form factors without compromising reliability. To survive harsh environmental exposure, edge devices are subjected to stringent testing standards such as MIL-STD-810, MIL-STD-167, and MIL-S-901, among others. Deploying a reliable processing solution with a failure rate as close to zero as possible is critical to ensuring mission success.
A warfighter’s daily operations are increasingly dependent on analyzing data quickly to make critical decisions and respond to potential threats. The DoD has developed an informational architecture – the Department of Defense Architecture Framework (DoDAF) v2.0 – aimed at modernizing the warfighter and their infrastructure by providing guidelines on collecting, analyzing, and categorizing data.
AI applications rely on vast stores of data and computing resources. Supporting such capabilities at the tactical edge requires “built-to-purpose” hardware that fulfills SWaP requirements, can manage the five V demands of big data, and can handle device interoperability. Deployed hardware must also provide configurability and scalability for future deployments while leveraging the latest in commercial off-the-shelf (COTS) technologies. COTS technologies are central to powering such AI elements as general-purpose graphics processing units (GPGPUs), which aggregate thousands of compute cores on a single PCIe adapter module. This architecture enables the simultaneous processing of massive datasets, delivering the extreme parallelism and memory bandwidth that AI applications demand.
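As a minimal sketch of that parallelism – using the open-source CuPy library and assuming a CUDA-capable GPU, neither of which is tied to any particular deployed system – the same batch operation written for the CPU can be pushed out to thousands of GPU cores with almost no code change:

```python
import numpy as np
import cupy as cp   # NumPy-compatible GPU array library; requires a CUDA-capable GPU

# Simulated batch of sensor frames: 4,096 frames of 1,024 samples each
frames_cpu = np.random.rand(4096, 1024).astype(np.float32)

# CPU version: normalize each frame to zero mean, unit variance
norms_cpu = (frames_cpu - frames_cpu.mean(axis=1, keepdims=True)) \
            / frames_cpu.std(axis=1, keepdims=True)

# GPU version: identical expression, but the frames are processed in parallel on the GPU
frames_gpu = cp.asarray(frames_cpu)   # copy the batch into GPU memory
norms_gpu = (frames_gpu - frames_gpu.mean(axis=1, keepdims=True)) \
            / frames_gpu.std(axis=1, keepdims=True)

# Copy results back to host memory and confirm both paths agree
assert np.allclose(norms_cpu, cp.asnumpy(norms_gpu), atol=1e-4)
```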
Combining these technologies poses its own challenges, however: Electronic components such as GPGPUs are complex and delicate, so integrating them successfully into military systems requires special attention to system design, where understanding the warfighter’s operating environment, the associated thermal dynamics, and the necessary system ruggedization is crucial.
In 2018, the Defense Advanced Research Projects Agency (DARPA) announced a $2 billion investment in accelerating AI integration into U.S. warfare platforms. AI-enabled military systems are smarter because they can extract insights from big data, which both enhances system autonomy and reduces reliance on error-prone human input. The ability to effectively process “big” combat data at the tactical edge can support new autonomous capabilities that identify threats, predict enemy behavior, optimize logistics, and protect military networks from cyberattacks. (Figure 2.)
Figure 2 | Mercury Systems’ EnterpriseSeries servers are currently deployed on numerous airborne, naval, and land platforms to handle computing at the tactical edge.
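As one hedged illustration of the network-protection case above (the traffic model and threshold here are hypothetical teaching values, not a fielded algorithm), even a lightweight streaming statistic can flag anomalous behavior locally at the edge, without a reach-back link:

```python
import math

class StreamingAnomalyDetector:
    """Flags samples that deviate sharply from a running mean/variance (Welford's method)."""

    def __init__(self, threshold_sigmas: float = 4.0, warmup: int = 30):
        self.threshold = threshold_sigmas
        self.warmup = warmup   # samples to observe before alerting
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0          # running sum of squared deviations from the mean

    def update(self, value: float) -> bool:
        """Return True if `value` looks anomalous relative to traffic seen so far."""
        anomalous = False
        if self.n >= self.warmup:
            std = math.sqrt(self.m2 / (self.n - 1))
            if std > 0 and abs(value - self.mean) > self.threshold * std:
                anomalous = True
        # Welford's online update of the running mean and variance
        self.n += 1
        delta = value - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (value - self.mean)
        return anomalous

# Example: bytes-per-second samples from a network tap, with one injected spike
detector = StreamingAnomalyDetector()
traffic = [1_000 + i % 50 for i in range(200)] + [250_000]
print([s for s in traffic if detector.update(s)])   # -> [250000]
```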
Moving forward with big data
Modern warfare on the big data battlefield relies on insights extracted from ever-growing volumes of unstructured, time-critical data. The processing systems that support big data and AI applications must fulfill unique computing requirements while ensuring reliability at the tactical edge, and they must guarantee veracity by building security into the hardware rather than simply bolting it on.
Richard Whaley is Director of Systems Engineering, Trusted Mission Solutions, at Mercury Systems.
Mercury Systems
www.mrcy.com