LiteRT for Microcontrollers is designed to run machine learning models
on microcontrollers and other devices with only a few kilobytes of memory. The
core runtime just fits in 16 KB on an Arm Cortex-M3 and can run many basic
models. It doesn't require operating system support, any standard C or C++
libraries, or dynamic memory allocation.
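To make the memory model concrete, here is a minimal sketch (not taken from the
official documentation) of the contract the runtime expects: the application
hands the interpreter a fixed-size buffer, the tensor arena, and all tensors and
scratch memory are planned inside it, so nothing is allocated on the heap at
runtime. The arena size and alignment below are illustrative assumptions; the
right size depends on your model.

    #include <cstdint>

    // A fixed-size working buffer owned by the application, typically sized by
    // trial and error for the model in use. The 16-byte alignment matches common
    // practice in the tflite-micro examples.
    constexpr int kTensorArenaSize = 10 * 1024;
    alignas(16) static uint8_t tensor_arena[kTensorArenaSize];

    // The interpreter is later constructed with this buffer and plans every
    // tensor and piece of scratch memory inside it, for example:
    //   tflite::MicroInterpreter interpreter(model, resolver,
    //                                        tensor_arena, kTensorArenaSize);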
Why microcontrollers are important
Microcontrollers are typically small, low-powered computing devices that are
embedded within hardware that requires basic computation. By bringing machine
learning to tiny microcontrollers, we can boost the intelligence of billions of
devices that we use in our lives, including household appliances and Internet of
Things devices, without relying on expensive hardware or reliable internet
connections, which are often subject to bandwidth and power constraints and
result in high latency. This can also help preserve privacy, since no data
leaves the device. Imagine smart appliances that can adapt to your daily
routine, intelligent industrial sensors that understand the difference between
problems and normal operation, and magical toys that can help kids learn in fun
and delightful ways.
Supported platforms
LiteRT for Microcontrollers is written in C++17 and requires a 32-bit
platform. It has been tested extensively with many processors based on the
Arm Cortex-M Series
architecture, and has been ported to other architectures including
ESP32. The
framework is available as an Arduino library. It can also generate projects for
development environments such as Mbed. It is open source and can be included in
any C++17 project.
The following development boards are supported:
- Arduino Nano 33 BLE Sense
- SparkFun Edge
- STM32F746 Discovery kit
- Adafruit EdgeBadge
- Adafruit LiteRT for Microcontrollers Kit
- Adafruit Circuit Playground Bluefruit
- Espressif ESP32-DevKitC
- Espressif ESP-EYE
- Wio Terminal: ATSAMD51
- Himax WE-I Plus EVB Endpoint AI Development Board
- Synopsys DesignWare ARC EM Software Development Platform
- Sony Spresense
Explore the examples
Each example application is on GitHub and has a README.md file that explains
how it can be deployed to its supported platforms. Some examples also have
end-to-end tutorials using a specific platform, as given below:
- Hello World: Demonstrates the absolute basics of using LiteRT for
  Microcontrollers. A tutorial is available for any supported device.
- Micro speech: Captures audio with a microphone to detect the words "yes"
  and "no". A tutorial is available for the SparkFun Edge.
- Person detection: Captures camera data with an image sensor to detect the
  presence or absence of a person.
Workflow
The following steps are required to deploy and run a TensorFlow model on a
microcontroller:
1. Train a model:
   - Generate a small TensorFlow model that can fit your target device and
     contains supported operations.
   - Convert to a LiteRT model using the LiteRT converter.
   - Convert to a C byte array using standard tools to store it in read-only
     program memory on device.
2. Run inference on device using the C++ library and process the results, as
   shown in the sketch after this list.
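To make step 2 concrete, here is a minimal sketch of on-device inference with
the C++ library. It is not taken from the example code: it assumes a recent
tflite-micro release, a single-input, single-output float model that uses only
the fully connected operation, and a hypothetical g_model byte array produced
in step 1 (for example with xxd -i). The registered operations, arena size, and
tensor shapes all depend on your model, so treat this as a starting point
rather than a drop-in implementation.

    #include <cstdint>

    #include "tensorflow/lite/micro/micro_interpreter.h"
    #include "tensorflow/lite/micro/micro_mutable_op_resolver.h"
    #include "tensorflow/lite/schema/schema_generated.h"

    // The C byte array produced in step 1 (hypothetical name); it is usually
    // emitted as a source file and lives in read-only program memory.
    extern const unsigned char g_model[];

    namespace {
    // Fixed working buffer; all tensors and scratch memory are planned inside it.
    constexpr int kTensorArenaSize = 2 * 1024;
    alignas(16) uint8_t tensor_arena[kTensorArenaSize];

    tflite::MicroInterpreter* interpreter = nullptr;
    }  // namespace

    // Called once at startup.
    bool SetUp() {
      // Map the byte array into the runtime's model representation.
      const tflite::Model* model = tflite::GetModel(g_model);

      // Register only the operations this model uses, to keep the binary small.
      static tflite::MicroMutableOpResolver<1> resolver;
      if (resolver.AddFullyConnected() != kTfLiteOk) return false;

      // The interpreter carves all of its working memory out of the arena.
      static tflite::MicroInterpreter static_interpreter(
          model, resolver, tensor_arena, kTensorArenaSize);
      interpreter = &static_interpreter;

      // Plan tensor memory; this fails if the arena is too small for the model.
      return interpreter->AllocateTensors() == kTfLiteOk;
    }

    // Called for each new input sample.
    float Predict(float x) {
      interpreter->input(0)->data.f[0] = x;      // Fill the input tensor.
      if (interpreter->Invoke() != kTfLiteOk) {  // Run the model.
        return 0.0f;                             // Handle errors as appropriate.
      }
      return interpreter->output(0)->data.f[0];  // Read the output tensor.
    }

Registering operations one at a time through MicroMutableOpResolver, rather
than linking every available kernel, is a large part of how the binary stays
within the footprint described at the top of this page.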
Limitations
LiteRT for Microcontrollers is designed for the specific constraints of
microcontroller development. If you are working on more powerful devices (for
example, an embedded Linux device like the Raspberry Pi), the standard
LiteRT framework might be easier to integrate.
The following limitations should be considered:
- Support for a limited subset of TensorFlow operations
- Support for a limited set of devices
- Low-level C++ API requiring manual memory management
- On-device training is not supported
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Missing the information I need","missingTheInformationINeed","thumb-down"],["Too complicated / too many steps","tooComplicatedTooManySteps","thumb-down"],["Out of date","outOfDate","thumb-down"],["Samples / code issue","samplesCodeIssue","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2024-08-30 UTC."],[],[],null,["LiteRT for Microcontrollers is designed to run machine learning models\non microcontrollers and other devices with only a few kilobytes of memory. The\ncore runtime just fits in 16 KB on an Arm Cortex M3 and can run many basic\nmodels. It doesn't require operating system support, any standard C or C++\nlibraries, or dynamic memory allocation.\n| **Note:** The [LiteRT for Microcontrollers Experiments](https://experiments.withgoogle.com/collection/tfliteformicrocontrollers) features work by developers combining Arduino and TensorFlow to create awesome experiences and tools. Check out the site for inspiration to create your own TinyML projects.\n\nWhy microcontrollers are important\n\nMicrocontrollers are typically small, low-powered computing devices that are\nembedded within hardware that requires basic computation. By bringing machine\nlearning to tiny microcontrollers, we can boost the intelligence of billions of\ndevices that we use in our lives, including household appliances and Internet of\nThings devices, without relying on expensive hardware or reliable internet\nconnections, which is often subject to bandwidth and power constraints and\nresults in high latency. This can also help preserve privacy, since no data\nleaves the device. Imagine smart appliances that can adapt to your daily\nroutine, intelligent industrial sensors that understand the difference between\nproblems and normal operation, and magical toys that can help kids learn in fun\nand delightful ways.\n\nSupported platforms\n\nLiteRT for Microcontrollers is written in C++ 17 and requires a 32-bit\nplatform. It has been tested extensively with many processors based on the\n[Arm Cortex-M Series](https://developer.arm.com/ip-products/processors/cortex-m)\narchitecture, and has been ported to other architectures including\n[ESP32](https://www.espressif.com/en/products/hardware/esp32/overview). The\nframework is available as an Arduino library. It can also generate projects for\ndevelopment environments such as Mbed. 
It is open source and can be included in\nany C++ 17 project.\n\nThe following development boards are supported:\n\n- [Arduino Nano 33 BLE Sense](https://store-usa.arduino.cc/products/arduino-nano-33-ble-sense-with-headers)\n- [SparkFun Edge](https://www.sparkfun.com/products/15170)\n- [STM32F746 Discovery kit](https://www.st.com/en/evaluation-tools/32f746gdiscovery.html)\n- [Adafruit EdgeBadge](https://www.adafruit.com/product/4400)\n- [Adafruit LiteRT for Microcontrollers Kit](https://www.adafruit.com/product/4317)\n- [Adafruit Circuit Playground Bluefruit](https://learn.adafruit.com/tensorflow-lite-for-circuit-playground-bluefruit-quickstart?view=all)\n- [Espressif ESP32-DevKitC](https://www.espressif.com/en/products/hardware/esp32-devkitc/overview)\n- [Espressif ESP-EYE](https://www.espressif.com/en/products/hardware/esp-eye/overview)\n- [Wio Terminal: ATSAMD51](https://www.seeedstudio.com/Wio-Terminal-p-4509.html)\n- [Himax WE-I Plus EVB Endpoint AI Development Board](https://www.sparkfun.com/products/17256)\n- [Synopsys DesignWare ARC EM Software Development Platform](https://www.synopsys.com/dw/ipdir.php?ds=arc-em-software-development-platform)\n- [Sony Spresense](https://developer.sony.com/develop/spresense/)\n\nExplore the examples\n\nEach example application is on\n[GitHub](https://github.com/tensorflow/tflite-micro/blob/main/tensorflow/lite/micro/examples)\nand has a `README.md` file that explains how it can be deployed to its supported\nplatforms. Some examples also have end-to-end tutorials using a specific\nplatform, as given below:\n\n- [Hello World](https://github.com/tensorflow/tflite-micro/blob/main/tensorflow/lite/micro/examples/hello_world) - Demonstrates the absolute basics of using LiteRT for Microcontrollers\n - [Tutorial using any supported device](./get_started)\n- [Micro speech](https://github.com/tensorflow/tflite-micro/blob/main/tensorflow/lite/micro/examples/micro_speech) - Captures audio with a microphone to detect the words \"yes\" and \"no\"\n - [Tutorial using SparkFun Edge](https://codelabs.developers.google.com/codelabs/sparkfun-tensorflow/#0)\n- [Person detection](https://github.com/tensorflow/tflite-micro/blob/main/tensorflow/lite/micro/examples/person_detection) - Captures camera data with an image sensor to detect the presence or absence of a person\n\nWorkflow\n\nThe following steps are required to deploy and run a TensorFlow model on a\nmicrocontroller:\n\n1. 
**Train a model** :\n - *Generate a small TensorFlow model* that can fit your target device and contains [supported operations](./build_convert#operation_support).\n - *Convert to a LiteRT model* using the [LiteRT converter](./build_convert#model_conversion).\n - *Convert to a C byte array* using [standard tools](./build_convert#convert_to_a_c_array) to store it in a read-only program memory on device.\n2. **Run inference** on device using the [C++ library](./library) and process the results.\n\nLimitations\n\nLiteRT for Microcontrollers is designed for the specific constraints of\nmicrocontroller development. If you are working on more powerful devices (for\nexample, an embedded Linux device like the Raspberry Pi), the standard\nLiteRT framework might be easier to integrate.\n\nThe following limitations should be considered:\n\n- Support for a [limited subset](./build_convert#operation_support) of TensorFlow operations\n- Support for a limited set of devices\n- Low-level C++ API requiring manual memory management\n- On device training is not supported\n\nNext steps\n\n- [Get started with microcontrollers](./get_started) to try the example application and learn how to use the API.\n- [Understand the C++ library](./library) to learn how to use the library in your own project.\n- [Build and convert models](./build_convert) to learn more about training and converting models for deployment on microcontrollers."]]