If you are interested in artificial intelligence and gesture recognition, you may be interested in a new article published to the official Arduino blog this week by the Arduino Team. Using an Arduino Nano 33 BLE Sense, an untethered gesture recognition controller has been created with the board's onboard 9-axis motion sensor. When a button is pressed, the user draws a number in the air, and corresponding commands are wirelessly sent to peripherals, in this case a robotic arm. Check out the video below to learn more about gesture recognition and artificial intelligence.
“One AI will do any task we ask of it. But in reality, even when AI reaches the advanced levels we envision, it won’t automatically be able to do everything. The Fraunhofer Institute for Microelectronic Circuits and Systems has been giving this a lot of thought. As a test case, an Arduino Nano 33 BLE Sense was employed to build a demonstration device. Using only the onboard 9-axis motion sensor, the team built an untethered gesture recognition controller. When a button is pressed, the user draws a number in the air, and corresponding commands are wirelessly sent to peripherals. In this case, a robotic arm.”
“Obviously this is just an example use case. It’s easy to see the massive potential that this kind of compact, learning AI could have. Whether it’s in edge control, industrial applications, wearables or maker projects. If you can train a device to do the job you want, it can offer amazing embedded intelligence with very few resources.”
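The Fraunhofer team's actual model and training pipeline are not detailed in the article, but the capture-and-classify idea described above, recording motion-sensor samples while a button is held, then matching the recording against learned digit gestures, can be sketched roughly. The feature choice and the nearest-centroid classifier below are illustrative assumptions, not the published method:

```python
import math

def features(samples):
    """Reduce a variable-length IMU recording (one gesture) to a fixed
    feature vector: mean and peak magnitude of acceleration and of
    angular rate. Each sample is (ax, ay, az, gx, gy, gz)."""
    acc = [math.sqrt(ax * ax + ay * ay + az * az) for ax, ay, az, *_ in samples]
    gyr = [math.sqrt(gx * gx + gy * gy + gz * gz) for *_, gx, gy, gz in samples]
    return (sum(acc) / len(acc), max(acc), sum(gyr) / len(gyr), max(gyr))

def classify(samples, centroids):
    """Nearest-centroid classifier: return the digit label whose stored
    feature centroid is closest (squared Euclidean distance) to the
    features of this recording."""
    f = features(samples)
    return min(
        centroids,
        key=lambda d: sum((a - b) ** 2 for a, b in zip(f, centroids[d])),
    )

# Hypothetical usage: centroids would come from training recordings of
# each digit; the values here are made up for illustration.
centroids = {"1": (9.75, 9.85, 0.6, 0.7), "5": (20.0, 25.0, 5.0, 6.0)}
recording = [(0.1, 0.2, 9.8, 0.5, 0.1, 0.0),
             (0.0, 0.3, 9.7, 0.6, 0.2, 0.1)]
print(classify(recording, centroids))
```

On the real device this classification would run on the Nano 33 BLE Sense itself, with the resulting digit sent over Bluetooth Low Energy to the peripheral rather than printed.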
For more information on the customizable artificial intelligence and gesture recognition project, jump over to the official Arduino blog by following the link below.