A group of drones with red lights fly in formation against a twilight sky.
Seamless AI Model Updates: How ADI Enables AI on a 1000-Robot Swarm

By Sergiu Cucuirean, Embedded Software Engineer

Sep 24, 2025

Smarter Updates for Edge AI

Embedded engineers know that updating AI models on edge devices is tough. Neural networks are typically flashed directly into the firmware in the hope that the model will hold up over time, and updating a model often requires physical access to the device and complete reprogramming.

Now, imagine the challenge of updating models that are already deployed on hundreds, if not thousands, of devices. That would be time-consuming and labor-intensive. But thanks to recent breakthroughs from Analog Devices, Inc. (ADI), model updates will no longer be a barrier you need to overcome.

Meet the Robot Swarm: OpenSwarm

Swarm-scale model updates were one of the challenges ADI confronted as part of our participation in the EU-funded OpenSwarm initiative (openswarm.eu). OpenSwarm aims to demonstrate and facilitate collaborative AI in swarms of small, autonomous robots. The requirements were challenging:

  • The robots’ small size limits battery capacity and memory. Model updates therefore had to be very efficient—consuming minimal power, network bandwidth, and payload size.
  • If a model fails to install after an update, the robot must be able to recover from the error on its own.
  • The robots are also highly mobile and may not be reachable by a central network.

Harnessing ADI Innovations

Leveraging 60 years of expertise in delivering ultra-efficient computing, ADI rose to the challenge with a groundbreaking solution. Together, our ADI teams based in Limerick, Ireland, and Cluj-Napoca, Romania, developed a novel model update technique for over-the-air (OTA) convolutional neural network (CNN) deployment and updates. Our solution leverages ADI’s MAX78000 processor and SmartMesh network technology. Both are components of the OpenSwarm robots.

Why MAX78000?

The MAX78000 has hardware designed for exactly these constraints: a low-power Arm® Cortex®-M4 processor and a dedicated CNN accelerator with 64 parallel processing elements. The MAX78000 SDK quantizes trained PyTorch models and packs them into the firmware.

Block diagram showing three labeled sections: Cortex-M4, SRAM, and CNN Engine.
Figure 1. Hardware-accelerated CNN platform on MAX78000.

Models as Data: Unlocking Model Updates

Developers typically embed quantized models as static C arrays. We took a different path by creating a firmware-based architecture that treats neural networks as loadable data structures. We packaged this quantized model data into a structured firmware format. This firmware contains everything the CNN needs to run: layer configurations, quantized weights, bias values, and architectural metadata.
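A minimal sketch of what such a loadable model container could look like. The field names, layout, and magic value below are illustrative assumptions, not ADI's actual format:

```c
#include <stdint.h>
#include <string.h>

/* Hypothetical on-flash model container: a fixed header followed by
 * per-layer records and the quantized weight/bias blobs. */
#define MODEL_MAGIC 0x4D4F444Cu /* spells "MODL" */

typedef struct {
    uint32_t magic;        /* identifies a valid model image        */
    uint16_t format_ver;   /* container format version              */
    uint16_t num_layers;   /* number of layer records that follow   */
    uint32_t weights_size; /* total size of the quantized weights   */
    uint32_t crc32;        /* integrity check over the payload      */
} model_header_t;

typedef struct {
    uint8_t  op;           /* operator id: conv2d, pool, fc, ...    */
    uint8_t  quant_bits;   /* weight precision for this layer       */
    uint16_t in_ch, out_ch;
    uint32_t weight_off;   /* offset of this layer's weights        */
} layer_record_t;

/* Parse and sanity-check a header from a raw byte buffer. */
int model_parse_header(const uint8_t *buf, size_t len, model_header_t *out)
{
    if (len < sizeof(model_header_t))
        return -1;
    memcpy(out, buf, sizeof(*out));
    if (out->magic != MODEL_MAGIC || out->num_layers == 0)
        return -1;
    return 0;
}
```

Because the header and layer records are plain data rather than compiled-in C arrays, a loader can inspect and validate a model before ever touching the accelerator.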

A block diagram showing components of a software system and parts of a Model.

At the heart of this system is a CNN engine driver that abstracts the complexity of the MAX78000’s CNN accelerator. It handles complicated tasks like memory management, operator dispatch, and hardware synchronization.
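The operator-dispatch part of such a driver can be pictured as a table of handlers keyed by operator id. This is a sketch, not the actual MAX78000 driver; a real handler would program the accelerator's registers, and the stubs here stand in for that:

```c
#include <stdint.h>
#include <stddef.h>

/* Illustrative operator ids and dispatch table. */
typedef enum { OP_CONV2D, OP_MAXPOOL, OP_FC, OP_COUNT } op_id_t;

typedef int (*op_handler_t)(const void *layer_cfg);

static int run_conv2d(const void *cfg)  { (void)cfg; return 0; }
static int run_maxpool(const void *cfg) { (void)cfg; return 0; }
static int run_fc(const void *cfg)      { (void)cfg; return 0; }

static const op_handler_t op_table[OP_COUNT] = {
    [OP_CONV2D]  = run_conv2d,
    [OP_MAXPOOL] = run_maxpool,
    [OP_FC]      = run_fc,
};

/* Dispatch one layer; unknown operators are rejected, which is also how
 * a loader can detect that a model needs an unsupported layer type. */
int cnn_run_layer(uint8_t op, const void *layer_cfg)
{
    if (op >= OP_COUNT || op_table[op] == NULL)
        return -1;
    return op_table[op](layer_cfg);
}
```

The key property is that the table is fixed at build time while the sequence of layers it executes comes from the loaded model data.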

The result of our approach? One set of code runs many models. This toolset reduces development time and unlocks easy updates. Models become data that can be loaded, swapped, and updated independently of the core application logic. The same CNN engine driver can execute vastly different models without modification.

Mesh and Deliver: How OTA Updates Reach Every Bot

The team used ADI’s SmartMesh wireless networking technology to deliver the OTA model updates. SmartMesh operates independently of Wi-Fi or 5G infrastructure and provides reliable, low-power communication for large, dynamic networks. Each node in our swarm acts as both a sensor and a router, so updates can hop across the mesh network even in complex or obstructed environments. SmartMesh ensures that every device, even those beyond the direct range of a central gateway, receives the latest AI models securely and efficiently.
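A highly simplified, receiver-side view of such chunked delivery: small packets may arrive out of order and duplicated across different mesh routes, so the image is rebuilt from a bitmap of received chunks. The chunk sizes and bitmap scheme here are assumptions for illustration, not SmartMesh internals:

```c
#include <stdint.h>
#include <string.h>

#define NUM_CHUNKS  16
#define CHUNK_SIZE  64

/* Reassembly buffer for one incoming model image, plus one bit per
 * chunk marking which pieces have been received so far. */
static uint8_t  image[NUM_CHUNKS * CHUNK_SIZE];
static uint16_t received_mask = 0;

/* Called once per arriving packet; order does not matter. */
void ota_on_chunk(uint8_t idx, const uint8_t *data, size_t len)
{
    if (idx >= NUM_CHUNKS || len > CHUNK_SIZE)
        return;
    memcpy(&image[idx * CHUNK_SIZE], data, len);
    received_mask |= (uint16_t)(1u << idx);
}

/* Complete once every chunk has been seen at least once; duplicates
 * arriving via different mesh hops are harmless. */
int ota_is_complete(void)
{
    return received_mask == (uint16_t)((1u << NUM_CHUNKS) - 1u);
}
```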

How OTA Model Deployment Works

With models now treated as data, OTA updates follow simple firmware update steps:

  • Firmware is transmitted wirelessly over SmartMesh
  • Upon receipt, each robot checks compatibility, size, and supported layers
  • If valid, it installs the model in a single atomic step
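Under some simplifying assumptions (a fixed maximum model size, and a one-byte check standing in for the real magic/size/layer validation), the validate-then-install flow can be sketched with a double-buffer scheme, so the running model is never half-overwritten:

```c
#include <stdint.h>
#include <string.h>

#define MODEL_SLOT_SIZE 4096 /* assumed maximum model size */

/* Two model slots plus an index of the active one: installation copies
 * into the inactive slot, then flips the index in a single step. */
static uint8_t slots[2][MODEL_SLOT_SIZE];
static volatile uint8_t active_slot = 0;

/* Stand-in for the real compatibility/size/layer checks. */
static int model_is_valid(const uint8_t *img, size_t len)
{
    return len > 4 && len <= MODEL_SLOT_SIZE && img[0] == 0x4D; /* 'M' */
}

int model_install(const uint8_t *img, size_t len)
{
    if (!model_is_valid(img, len))
        return -1;                      /* reject: nothing changes   */
    uint8_t target = (uint8_t)(1u - active_slot);
    memcpy(slots[target], img, len);    /* stage into inactive slot  */
    active_slot = target;               /* atomic switch-over        */
    return 0;
}

const uint8_t *model_active(void) { return slots[active_slot]; }
```

A rejected image never touches the active slot, which is what makes the install step safe to attempt over an unreliable link.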

If something goes wrong, the system reverts to the previous model, keeping robots safe and reducing the risk of downtime. Devices store multiple models on an SD card using FatFS, a FAT filesystem module, which makes it easy to test, switch, or roll back versions.
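Keeping several versions on storage makes rollback a matter of re-loading an older file. A sketch using standard C file I/O; on the device itself these calls would go through FatFS equivalents such as f_open/f_read/f_write instead, and the versioned filename scheme is an illustrative assumption:

```c
#include <stdio.h>
#include <stdint.h>

/* Save a model image under a versioned filename (e.g. "model_v3.bin")
 * so that older versions remain available for rollback. */
int model_save(const char *path, const uint8_t *img, size_t len)
{
    FILE *f = fopen(path, "wb");
    if (!f)
        return -1;
    size_t n = fwrite(img, 1, len, f);
    fclose(f);
    return n == len ? 0 : -1;
}

/* Load a stored model back into RAM; returns bytes read, or -1. */
long model_load(const char *path, uint8_t *buf, size_t cap)
{
    FILE *f = fopen(path, "rb");
    if (!f)
        return -1;
    long n = (long)fread(buf, 1, cap, f);
    fclose(f);
    return n;
}
```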

Simplifying Deployment: Visual Model Management

Large swarms make model management and delivery difficult. To address that, we developed a simple, web-based management tool. Users drag and drop model firmware into a browser. The tool checks the file, loads the model for distribution, and shows update progress through the swarm in real time.

Firmware screen with update details on the left, flowcharts on the right.

The tool also lets users:

  • See the structure of the CNN
  • Compare old vs. new models side by side
  • Monitor memory usage and parameters

The management tool lowers the barrier to entry and helps engineers and data scientists collaborate on updates without requiring expertise in embedded systems.

Looking Ahead

Over-the-air model deployment changes how we think about edge AI. It separates intelligence from firmware, allowing models to evolve without reprogramming the entire device, and companies can use it to deliver ongoing upgrades to deployed edge devices. Although we’ve demonstrated this technique using the MAX78000 and SmartMesh, the approach is broadly applicable and can shape the future of AI-powered systems.

Learn More

Want to modernize your edge AI workflows? OTA CNN deployment brings flexibility and control to embedded intelligence. Speak with our team or get started learning more about the ADI MAX78000 and SmartMesh networking technology.

This project has received funding from the European Union’s Horizon Europe Framework Programme under Grant Agreement No. 101093046.