
At CES 2020, Acconeer showed a speaker that uses radar-based gesture control to play and pause music, change songs, and adjust the volume. On this page you can find everything you need to build your own implementation, whether you want to evaluate a similar use case, learn more about combining radar with AI, or simply do it because it is fun.
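To make the control loop concrete, here is a minimal sketch of how classified gestures could be mapped to playback actions. The gesture labels and the `Player` class are illustrative assumptions, not Acconeer's actual API; in the real demo, the labels would come from the trained radar gesture classifier.

```python
# Hypothetical sketch: map classified gesture labels to playback actions.
# The labels ("tap", "swipe", ...) and the Player interface are
# assumptions for illustration, not Acconeer's actual API.

class Player:
    """Minimal stand-in for a media player controlled by gestures."""

    def __init__(self):
        self.playing = False
        self.track = 0
        self.volume = 50  # percent

    def handle_gesture(self, gesture):
        if gesture == "tap":        # toggle play/pause
            self.playing = not self.playing
        elif gesture == "swipe":    # skip to next track
            self.track += 1
        elif gesture == "raise":    # volume up, capped at 100
            self.volume = min(100, self.volume + 10)
        elif gesture == "lower":    # volume down, floored at 0
            self.volume = max(0, self.volume - 10)


p = Player()
for g in ["tap", "swipe", "raise", "raise"]:
    p.handle_gesture(g)
print(p.playing, p.track, p.volume)  # True 1 70
```

In the demo setup, each classifier prediction from the radar pipeline would feed `handle_gesture` in place of the hard-coded list above.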



For this implementation, two radar sensors are mounted on an ordinary consumer speaker with a Raspberry Pi fitted inside. A full list of the required hardware is available on GitHub.



All the source code for the project is available as open source on Acconeer's GitHub page. Clone it, adapt it to your own implementation, and learn how to train your own AI model using the deep learning support integrated into Exploration Tool.
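The training workflow boils down to collecting labeled radar sweeps per gesture, fitting a model, and classifying new sweeps. Below is a hedged sketch of that idea on synthetic data, using a simple nearest-centroid classifier as a stand-in for the deep learning support in Exploration Tool; the sweep shapes, class names, and helper functions are all illustrative assumptions.

```python
# Hypothetical sketch of the training idea: collect labeled radar
# sweeps per gesture, learn a model, classify new sweeps. A nearest-
# centroid classifier on synthetic envelope profiles stands in for
# the deep learning support in Exploration Tool.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic "radar sweeps": one idealized envelope profile per gesture.
N_SAMPLES, SWEEP_LEN = 50, 128
prototypes = {
    "swipe": np.sin(np.linspace(0, np.pi, SWEEP_LEN)),       # broad motion
    "tap": np.exp(-np.linspace(-3, 3, SWEEP_LEN) ** 2),      # sharp peak
}


def make_batch(proto):
    """Simulate a batch of noisy labeled sweeps for one gesture class."""
    return proto + 0.1 * rng.standard_normal((N_SAMPLES, SWEEP_LEN))


train = {label: make_batch(p) for label, p in prototypes.items()}

# "Training": store the mean sweep (centroid) of each class.
centroids = {label: x.mean(axis=0) for label, x in train.items()}


def classify(sweep):
    """Assign a new sweep to the class with the nearest centroid."""
    return min(centroids, key=lambda lbl: np.linalg.norm(sweep - centroids[lbl]))


# Classify a noisy, previously unseen "tap" sweep.
test_sweep = prototypes["tap"] + 0.1 * rng.standard_normal(SWEEP_LEN)
print(classify(test_sweep))  # tap
```

In the real project, the noisy synthetic batches would be replaced by recorded sensor data, and the centroid model by a neural network trained through Exploration Tool's deep learning interface.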



Tell us about your project! We'd love to hear from you. Get in touch on