TinyML is machine learning on microcontrollers — models under 1 MB running inference in milliseconds on devices costing $2-25. Start with an Arduino Nano 33 BLE Sense and Edge Impulse, or an ESP32 with TFLite Micro.
Published 2026-04-01
TinyML is machine learning that runs on microcontrollers — chips with kilobytes of RAM, megahertz-level clock speeds, and milliwatt power budgets. The “tiny” refers to the model size (typically 10 KB to 500 KB) and the hardware, not the impact.
Practical applications are already in production: keyword detection in smart speakers, gesture recognition in wearables, anomaly detection in industrial motors, and wildlife monitoring in remote sensors.
The key constraint is memory. Where a cloud ML model might use gigabytes of RAM, a TinyML model must fit its weights, activations, and runtime into 64-512 KB of SRAM. This requires aggressive model optimization — quantization, pruning, and architecture choices designed for small footprints.
For your first TinyML project, pick one of these proven starting platforms:
Arduino Nano 33 BLE Sense

Why: Built-in 9-axis IMU, microphone, and temperature, humidity, pressure, light, and proximity sensors. No external wiring needed. The Arduino IDE is familiar to most hobbyists, and Edge Impulse has direct Arduino integration.
Best first project: Gesture recognition. Wave the board in a pattern, classify the motion. The IMU provides the input data, no additional hardware required.
Specs: ARM Cortex-M4F at 64 MHz, 256 KB SRAM, 1 MB flash. Enough for keyword spotting, gesture recognition, and simple anomaly detection. Not enough for vision.
ESP32 / ESP32-S3

Why: More RAM (512 KB), Wi-Fi for sending results to a dashboard, and a large community. The ESP32-S3 adds SIMD instructions and a camera interface for vision tasks.
Best first project: Anomaly detection on ESP32. Read an analog sensor (vibration, current, temperature), detect when readings deviate from normal patterns.
Specs: Dual-core Xtensa at 240 MHz, 512-520 KB SRAM, up to 8 MB PSRAM (S3). Handles all TinyML use cases including basic image classification.
STM32

Why: ARM Cortex-M ecosystem with professional tooling (STM32CubeIDE, STM32Cube.AI). Better for teams already in the ST ecosystem or building toward production hardware.
Best first project: Predictive maintenance on STM32F4. Connect a vibration sensor, detect bearing degradation patterns.
Specs: Ranges from Cortex-M4F at 80 MHz (STM32L4) to Cortex-M7 at 480 MHz (STM32H7). The F4 at 168 MHz with 192 KB RAM is a solid mid-range choice.
Edge Impulse is a cloud platform that handles the entire TinyML pipeline: data collection, signal processing, model training, testing, and deployment to the target device.
The free tier covers individual developers, with projects capped at 4 hours of training data. No Python or ML knowledge is required to start.
TFLite Micro is Google’s open-source runtime for ML on microcontrollers. Models are trained in TensorFlow, converted to the .tflite format with quantization, and compiled into the firmware. You manage the entire pipeline yourself, which gives full control over model architecture, quantization parameters, and memory layout. It requires Python for training and C++ for the firmware.
This walkthrough uses the Arduino Nano 33 BLE Sense with Edge Impulse for gesture recognition. The same concepts apply to any MCU and framework combination.
Install the Arduino IDE and add the Arduino Mbed OS Nano Boards package. Connect the Nano 33 BLE Sense via USB. Verify the connection:
arduino-cli board list
Install the Edge Impulse CLI:
npm install -g edge-impulse-cli
Run the daemon to connect your board:
edge-impulse-daemon
This opens a serial connection and registers your board with your Edge Impulse project.
In the Edge Impulse web UI, go to Data Acquisition and record labeled samples for each gesture class you want to recognize.
Aim for at least 3 minutes of data per class. More data improves accuracy.
In the Edge Impulse UI, create an impulse: a signal-processing block (spectral analysis for IMU data) feeding a classification learning block. Generate features, then train the model.
Check the confusion matrix. If accuracy is below 85%, collect more data for the confused classes.
Go to Deployment, select Arduino library, and download the .zip. In the Arduino IDE, add it via Sketch > Include Library > Add .ZIP Library, open the example sketch bundled with the library, and upload it to the board.
The serial monitor shows real-time predictions: gesture class and confidence score.
Test with gestures that were not in the training set. Common issues: low confidence across all classes (collect more varied training data), one class dominating predictions (balance the number of samples per class), and misclassification when gestures are performed at a different speed or orientation than during recording.
After your first project, try scaling complexity:
| Next Step | Hardware | Guide |
|---|---|---|
| Audio keyword detection | Arduino Nano 33 BLE Sense | Voice recognition |
| Vibration anomaly detection | ESP32 + accelerometer | Anomaly detection |
| Visual inspection | ESP32-S3 + camera | Object detection |
| Predictive maintenance | STM32F4 + vibration sensor | Predictive maintenance |
Each step increases model complexity and hardware requirements. The workflow stays the same: collect data, train, quantize, deploy, evaluate.
- Run gesture recognition on Arduino Nano 33 BLE with TFLite Micro. Built-in IMU, Arduino IDE, and the official TFLite gesture tutorial.
- Run anomaly detection on ESP32 with TFLite Micro. Autoencoder setup, sensor integration, and real-time monitoring for industrial applications.
- Build and deploy object detection on ESP32-S3 using Edge Impulse. End-to-end pipeline from data collection to on-device inference.
- Classify gestures on STM32F4 with Edge Impulse. IMU data collection, model training, and real-time classification on the Cortex-M4.
- Deploy anomaly detection on Arduino Nano 33 BLE with Edge Impulse. Built-in sensors, beginner-friendly pipeline, and BLE connectivity.