
PERSPECTIVE: Beyond the 5G Battlespace as DoD Works to Enable the Edge in a Disconnected World

5G hasn’t proved to be a comprehensive solution to connecting the tactical edge, but new data management strategies are overcoming the connectivity challenges.

It’s no surprise the Department of Defense (DoD) is investing in 5G as part of its modernization efforts. 5G, the current standard for broadband cellular networks, has transformed edge communication with higher data rates, lower latency, and greater traffic capacity than 4G. Those capabilities are essential in edge environments, which promise to decentralize data processing and storage and empower devices and applications to operate autonomously.

But the technology isn’t a panacea, especially in tactical-edge environments. Mission teams relying on edge data to support real-time decision-making still struggle with high data volumes, intermittent connectivity, and cybersecurity concerns.

Of course, no single technology, even one as useful as 5G, can solve all those problems. Fortunately, new data management strategies being deployed in real-world use cases can enable the DoD to analyze and exchange timely information – reliably and securely – at the tactical edge.

New Connectivity Standard, Old Connectivity Challenges

As mission teams deploy 5G in the battlespace and beyond, they face familiar data management challenges. First is the sheer volume of edge data. Data streams from edge sensors, especially high-def video cameras, quickly outstrip 5G bandwidth. Teams struggle to transmit that data to centralized datacenters for analysis and back to the edge for decision-making.

Second is intermittent connectivity. Sensors on vehicles or transported equipment can lose connectivity when terrain blocks 5G transmissions. It’s a similar situation in environments like shipyards, where enormous sheets of steel being moved around can block signals. Sensors can cache data and then broadcast it when connections are restored, but the result is data bursts that overwhelm bandwidth.

A third challenge is cybersecurity. 5G transmissions are encrypted, but that doesn’t solve all security issues at the edge. For one thing, encryption keys need to be shared among authorized entities in a safe way. For another, edge devices must protect against spoofing, or malicious actors falsifying data, by attesting that they’re trustworthy. Last, edge devices – and the data on them – remain vulnerable to physical theft.

But effective data management strategies can address these challenges head-on. They can ensure the resilience and reliability of edge infrastructures and enable a trustworthy technology ecosystem to support missions at the edge.

Reducing Data Transmission with Edge Analytics

The solution to high data volumes isn’t to generate less data but to transmit less data from the edge to the datacenter. Mission teams can achieve this goal by analyzing data as close to its point of production as possible. New technology makes this strategy possible.

A new generation of computer processors is faster, uses less power, and comes with built-in workload-specific accelerators for data analytics and artificial intelligence (AI). That means sophisticated data processing and transformation are now viable at the edge.

For data processing, edge devices can run AI algorithms that filter data feeds, automatically identifying the most valuable data and sending only that information to the datacenter for analysis. For example, rather than transmitting an entire 4K video feed, a device could send only a few kilobytes of object-detection data.
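
As a rough illustration of that filtering pattern, the sketch below runs a placeholder detector on a frame and forwards only a compact record of what it found. The model, confidence threshold, and camera identifier are illustrative assumptions, not details of any specific DoD deployment:

    # Sketch of edge-side filtering: analyze each frame locally and transmit only
    # a small detection record instead of the raw 4K stream.
    import json
    import time
    from typing import Dict, List

    def detect_objects(frame: bytes) -> List[Dict]:
        """Placeholder for an on-device AI model; a real system would run an
        accelerated object detector here."""
        return [{"label": "vehicle", "confidence": 0.91, "bbox": [120, 64, 310, 200]}]

    def summarize(frame: bytes, camera_id: str, min_conf: float = 0.8) -> bytes:
        detections = [d for d in detect_objects(frame) if d["confidence"] >= min_conf]
        record = {"camera": camera_id, "ts": time.time(), "detections": detections}
        return json.dumps(record).encode()   # a few hundred bytes to transmit

    if __name__ == "__main__":
        fake_frame = b"\x00" * (3840 * 2160 * 3)   # stand-in for one uncompressed 4K frame
        payload = summarize(fake_frame, camera_id="cam-017")
        print(f"{len(fake_frame):,} bytes of pixels -> {len(payload)} bytes sent upstream")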

For data transformation, edge devices can perform AI or other analytics as close to the mission space as possible. AI models can be deployed as microservices residing in container environments, which group the application with the other software it needs into a single package, or container. Outputs become immediately available right where decision-makers need them, and relevant insights can easily be shared among edge devices.
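
A minimal sketch of that pattern, assuming a plain HTTP interface and a placeholder model call (the port and payload shape are illustrative, not a specific product’s API), might expose the analytic as a small service that can then be packaged into a container image:

    # Sketch of an AI model exposed as a microservice suitable for containerization.
    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    def run_model(payload: dict) -> dict:
        """Placeholder inference call; a real service would load a trained model."""
        return {"score": 0.87, "input_fields": sorted(payload)}

    class InferenceHandler(BaseHTTPRequestHandler):
        def do_POST(self):
            body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
            out = json.dumps(run_model(json.loads(body or b"{}"))).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(out)))
            self.end_headers()
            self.wfile.write(out)

    if __name__ == "__main__":
        # Other edge devices subscribe to this endpoint to share the insights.
        HTTPServer(("0.0.0.0", 8080), InferenceHandler).serve_forever()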

Conquering Connectivity with Data Channels

Mission teams still need to transmit data, however, despite the constraints of low bandwidth and denied, degraded, intermittent, or limited (DDIL) edge environments. The solution is to create policy-defined, algorithm-controlled data channels to take optimum advantage of available bandwidth. This innovative approach is already being used by the DoD in a real-world edge environment.

Policy dictates how each data channel behaves. For example, a Real-Time Channel might transmit all analyzed data as it’s generated. A Historical Channel might aggregate data and then send a statistical model of the data once an hour. A Summary Channel might send only what has changed in the past eight hours. With this approach, the amount of data transmitted at any time is comparatively small, and an algorithm can prioritize channels based on available bandwidth.
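
A simplified sketch of how such channels might be expressed in policy and prioritized under a shrinking link budget appears below. The channel names follow the examples above, while the policy fields and scheduling logic are assumptions made for illustration:

    # Sketch of policy-defined data channels with bandwidth-aware prioritization.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class ChannelPolicy:
        name: str
        priority: int       # lower number = more important
        interval_s: int     # how often the channel is allowed to transmit
        est_kbits: float    # rough size of one transmission

    POLICIES = [
        ChannelPolicy("real-time",  priority=0, interval_s=1,     est_kbits=64),
        ChannelPolicy("historical", priority=1, interval_s=3600,  est_kbits=512),
        ChannelPolicy("summary",    priority=2, interval_s=28800, est_kbits=128),
    ]

    def schedule(policies: List[ChannelPolicy], available_kbps: float) -> List[str]:
        """Pick which channels may transmit under the current link budget."""
        budget, selected = available_kbps, []
        for p in sorted(policies, key=lambda p: p.priority):
            avg_load = p.est_kbits / p.interval_s   # average bandwidth the channel needs
            if avg_load <= budget:
                selected.append(p.name)
                budget -= avg_load
        return selected

    print(schedule(POLICIES, available_kbps=1000))  # healthy link: all three channels fit
    print(schedule(POLICIES, available_kbps=10))    # degraded link: real-time is deferred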

Decision-makers subscribe to the channels they need. It’s straightforward to create new channels as new AI models are deployed. Channel policies are applied throughout the edge ecosystem, and security for each new channel is applied automatically.

Management of the data channels can run on cost-effective, lightweight processors with very low memory requirements. But it can also scale to take advantage of high-performance cloud environments with ample RAM. Requirements are set by the amount of data being analyzed and the complexity of the analytics being conducted.

The DoD organization currently using this approach is applying an AI inference model to data generated by thousands of 4K cameras. The organization periodically needs to update that model. To do so, it created a Learning Channel that connects all the devices running the analytics in a consistent way. It then needs to send only a delta of the model through that channel to sync the updated model across devices.
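
As a rough sketch of that delta-sync idea, the snippet below records only the weights that changed between the deployed and retrained models and lets each device patch its local copy; the flat weight lists and tolerance are illustrative stand-ins for a real model’s parameters:

    # Sketch of syncing an updated model by sending only a delta of its weights.
    from typing import Dict, List

    def compute_delta(old: List[float], new: List[float], eps: float = 1e-6) -> Dict[int, float]:
        """Record only the weights that actually changed."""
        return {i: n - o for i, (o, n) in enumerate(zip(old, new)) if abs(n - o) > eps}

    def apply_delta(weights: List[float], delta: Dict[int, float]) -> List[float]:
        """Reconstruct the updated model on each edge device."""
        patched = list(weights)
        for i, change in delta.items():
            patched[i] += change
        return patched

    deployed = [0.10, 0.25, -0.40, 0.05]     # weights currently on every device
    retrained = [0.10, 0.27, -0.40, 0.01]    # updated model in the datacenter

    delta = compute_delta(deployed, retrained)
    print(delta)                              # only the changed entries are transmitted
    print(apply_delta(deployed, delta))       # devices reconstruct the new model locally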

Ensuring Edge Security

5G connections are encrypted by default. All other data streams also require encryption at the tactical edge. Encryption can protect against more than just data theft. With effective attestation, it can also protect against spoofing, where an adversary hijacks a device and generates bogus data. Edge devices should leverage attestation keys to prove their identity to one another and to the datacenter so that all devices – and the data they generate – can be trusted.
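
The sketch below illustrates the basic challenge-response pattern behind attestation, using the open-source Python cryptography package; in practice the device key would be rooted in hardware (for example, a TPM) rather than generated in software, and the enrollment step is assumed:

    # Sketch of challenge-response attestation: a device proves its identity by
    # signing a fresh nonce with a key it holds.
    import os
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    device_key = Ed25519PrivateKey.generate()   # stand-in for a hardware-protected key
    enrolled_pubkey = device_key.public_key()   # registered with the datacenter at enrollment

    nonce = os.urandom(32)                      # verifier issues a fresh challenge
    signature = device_key.sign(nonce)          # device signs the challenge

    try:
        enrolled_pubkey.verify(signature, nonce)
        print("device attested: data from this sensor can be trusted")
    except InvalidSignature:
        print("attestation failed: treat the device and its data as suspect")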

That level of encryption can protect data at rest and in transit. For data in use, a new generation of processors can reserve a portion of the CPU as a secure enclave. With this approach, known as “confidential computing,” the memory in the enclave is encrypted with a key that’s unique to that CPU. Even if adversaries gained root access to the device, they couldn’t read the data.

5G can help teams transcend the limitations of traditional connectivity by enabling autonomous operations, real-time analysis, and localized decision-making in mission environments. But while 5G will remain integral to communication at the tactical edge, the technology has inherent connectivity shortcomings. By taking advantage of innovative new data management strategies, mission teams can overcome those challenges and enable the reliable and secure analysis and sharing of information at the edge.

The views expressed here are the writer’s and are not necessarily endorsed by Homeland Security Today, which welcomes a broad range of viewpoints in support of securing our homeland. To submit a piece for consideration, email editor @ hstoday.us.

Darren Pulsipher
Darren Pulsipher is Chief Solutions Architect, Public Sector, for Intel. He works directly with federal, state, and local government organizations to help them modernize their IT architectures. Through executive and engineering positions, Darren has developed a unique ability to bring technology, people, and process together to enable transformational change. He focuses on data architecture, workload migration, cloud-native development, service orchestration, and multi-hybrid cloud architectures. His research has resulted in eight patents in cloud and grid computing. Darren is author of three books on technology and more than 100 articles in various industry trade publications. He shares his passion for digital transformation on his weekly podcast, “Embracing Digital Transformation.”
