This demo showcases Motion Gestures' embedded machine learning solution for touch gesture recognition. The demo runs on the SAMC21 Xplained Pro and uses the QT8 water tolerant touch surface for gesture input.
How it Works
When a finger touches down on the touch surface, all of the touch coordinates are collected until the finger is lifted. The collected coordinates are then passed to the Motion Gestures Library for recognition. If the library recognizes the gesture, it returns the label associated with that gesture; if it does not, it returns the "rejected" label.
Recognition is path-dependent. Refer to the gesture definitions to see how each gesture should be drawn. Keep in mind that the finger should not be lifted until the gesture is complete.
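The collect-then-recognize flow described above can be sketched as follows. The Motion Gestures API is proprietary, so every name here (the point type, the buffer size, and the mg_recognize call with its stubbed result) is an illustrative assumption, not the library's actual interface:

```c
/* Hypothetical sketch: buffer touch samples while the finger is down,
 * then run recognition once on finger lift. All names are assumptions. */
#define MAX_POINTS 256

typedef struct { int x, y; } touch_point_t;

static touch_point_t path[MAX_POINTS];
static int path_len = 0;

/* Stub for the library call: the real recognizer matches the collected
 * path against the trained model and returns a gesture label. */
static const char *mg_recognize(const touch_point_t *pts, int n)
{
    (void)pts;
    return (n > 0) ? "M" : "rejected"; /* placeholder result */
}

/* Called for each touch sample while the finger is on the surface. */
static void on_touch_move(int x, int y)
{
    if (path_len < MAX_POINTS) {
        path[path_len].x = x;
        path[path_len].y = y;
        path_len++;
    }
}

/* Called once when the finger lifts: hand the whole path to the library. */
static const char *on_touch_release(void)
{
    const char *label = mg_recognize(path, path_len);
    path_len = 0; /* reset the buffer for the next gesture */
    return label;
}
```

Because recognition runs only in on_touch_release, lifting the finger early submits an incomplete path, which is why a gesture must be drawn in a single stroke.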
Gesture Definitions
The nine distinct gestures that the machine learning model is trained to recognize are shown below. The start of each gesture is marked with a dot and the end with an arrow. These nine gestures map to six labels: M, Check Mark, S, 2, Alpha, and Star.
There are two variations each of the M, Check Mark, and 2 gestures. These variations map to the same predicted label, so a capital M and a lowercase M are both recognized as an M gesture.
The star must begin at the top point, trace down to the left first, and then trace around until reaching the top again. This is the only variation that is recognized by this demo.
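The nine-gestures-to-six-labels mapping can be illustrated with a small lookup function. The enum constants below are invented for this sketch; the real library reports labels internally in a way the demo does not expose:

```c
/* Hypothetical mapping from the nine trained gesture classes to the six
 * reported labels. Class names are illustrative, not the library's own. */
typedef enum {
    GESTURE_M_UPPER, GESTURE_M_LOWER,   /* two M variants      */
    GESTURE_CHECK_A, GESTURE_CHECK_B,   /* two Check variants  */
    GESTURE_S,
    GESTURE_2_A, GESTURE_2_B,           /* two 2 variants      */
    GESTURE_ALPHA,
    GESTURE_STAR,
    GESTURE_COUNT
} gesture_class_t;

static const char *gesture_label(gesture_class_t g)
{
    switch (g) {
    case GESTURE_M_UPPER:
    case GESTURE_M_LOWER:  return "M";
    case GESTURE_CHECK_A:
    case GESTURE_CHECK_B:  return "Check Mark";
    case GESTURE_S:        return "S";
    case GESTURE_2_A:
    case GESTURE_2_B:      return "2";
    case GESTURE_ALPHA:    return "Alpha";
    case GESTURE_STAR:     return "Star";
    default:               return "rejected";
    }
}
```

Collapsing variants at the label level is why the demo GUI never distinguishes, for example, an uppercase M from a lowercase one.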
Hardware Setup
The QT8 Xplained Pro Touch Surface should be connected to EXT1 on the SAMC21 Xplained Pro as seen in the image below.
Using the Motion Gestures Touch Demo
First, download the demo materials, which include the Motion_Gestures_Touch_Demo.hex file and the demo GUI installer. Next, install the demo GUI, which is used to display the recognition results returned from the embedded Motion Gestures Library.
Programming the Demo Board
Use MPLAB® X IDE to program the SAMC21 Xplained Pro with the demo HEX file.
If you are not familiar with MPLAB X IDE, please visit the "MPLAB X IDE" Developer Help page.
Using the Motion Gestures Demo GUI
To use the GUI, first connect the SAMC21 Xplained Pro to your PC with a USB cable. Next, use the drop-down box at the top to select the correct device (EDBG), then click the Connect button at the top left.
In the Motion Gestures Demo GUI, the most recent gesture input from the QT8 Touch Surface is displayed on the right side, and the predicted label returned from the embedded Motion Gestures Library is displayed on the left side with the most recent prediction at the top.
Motion Gestures Demo Example Project
The example project shows how the Motion Gestures Library can be integrated with the Microchip Touch Gesture Library within a single application. The Motion Gestures Library can be used to detect complex gestures while the Microchip Touch Gesture Library handles the detection of fundamental gestures such as taps, swipes, and wheels. The integrated demo uses the Microchip 2D Touch Surface Utility to display the recognition results returned from each library.
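One way to picture the integration is a simple event dispatcher: fundamental gestures are resolved by the Microchip Touch Gesture Library as they occur, while the Motion Gestures Library runs only when the finger lifts. Both stand-in functions and the event type below are assumptions for this sketch, not either library's real API:

```c
/* Illustrative dispatcher combining the two libraries in one application.
 * Event names and both gesture functions are assumptions. */
typedef enum { EVT_TAP, EVT_SWIPE, EVT_WHEEL, EVT_RELEASE } touch_event_t;

/* Stand-in for the Microchip Touch Gesture Library: fundamental gestures. */
static const char *basic_gesture_name(touch_event_t e)
{
    switch (e) {
    case EVT_TAP:   return "tap";
    case EVT_SWIPE: return "swipe";
    case EVT_WHEEL: return "wheel";
    default:        return "none";
    }
}

/* Stand-in for the Motion Gestures Library: complex path-based gestures,
 * evaluated only once the full path is available. */
static const char *complex_gesture_name(void)
{
    return "Star"; /* placeholder for the model's prediction */
}

/* Route each event: fundamental gestures come from the touch library;
 * on finger lift, the buffered path goes to Motion Gestures. */
static const char *handle_event(touch_event_t e)
{
    if (e == EVT_RELEASE)
        return complex_gesture_name();
    return basic_gesture_name(e);
}
```

The key design point is that the two libraries do not compete: the touch library reacts to low-level events in real time, while the machine learning recognizer is invoked exactly once per completed stroke.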
If you are not familiar with Microchip's 2D Touch Surface Utility, please visit the "Guide to Connect to Touch Surface Utility" Developer Help page.
The 2D Touch Surface Utility can simultaneously display gesture recognition results from the Motion Gestures Library and the Microchip Touch Gesture Library. The current gesture window of the utility displays the recognized gesture: results from the Microchip Touch Gesture Library are shown in blue and results from the Motion Gestures Library in green.
Summary
With the Motion Gestures solution, complex touch gesture recognition can be implemented in a matter of minutes with the power of machine learning. With just one example of each desired gesture, the model can be trained to recognize them with very high accuracy. To learn more, visit Motion Gestures' website.