This guide covers usage of the ML Plugin with our design partner, Motion Gestures. From the ML Plugin, it is possible to upload new gestures to the Motion Gestures SDK, where they can be used to train and test new gesture recognition models that can be deployed back to the embedded application.
Touch gesture data can be gathered from any platform that supports 2D touch sensing. One example is the Integrated Graphics and Touch (IGAT) Curiosity Development Kit, which also demonstrates the Motion Gestures solution in the Legato Showcase Demo Firmware.
The ML Plugin and MPLAB® Data Visualizer can both be installed as plugins to MPLAB® X via the Plugins Manager; alternatively, the ML Plugin can be installed as a plugin to the standalone MPLAB® Data Visualizer.
- SAMC21 QT8 Data Logger Firmware - 2D touch position data for touch and gesture-based solutions
This section contains instructions for using the ML Plugin with Motion Gestures. The first part of the guide covers capturing 2D touch position data streamed over a serial connection from the SAMC21 Xplained Pro and viewing it within the XY Plot of Data Visualizer. The guide then covers using the ML Plugin to upload captured gestures to the Motion Gestures SDK for training new models and testing their performance. To learn more about Motion Gestures' touch gesture recognition solution, check out the "Motion Gestures Touch Demo".
Capturing sensor data with MPLAB® Data Visualizer
Program the Kit with Data Logger Firmware
Use MPLAB® X IDE to program the SAMC21 Xplained Pro with the provided example project. This firmware utilizes the on-chip Peripheral Touch Controller (PTC) to detect and track touch contacts on the QT8 Xplained Pro Touch Surface. The resultant touch coordinates, calculated by the QTouch Modular Library (QTML), are streamed out over serial connection to MPLAB® Data Visualizer.
The touch configuration parameters, such as the sensor sampling rate, can be found in touch.h. These settings can also be viewed or reconfigured within MPLAB® Harmony v3. Note the sampling rate, as it will be needed later in the project configuration menu of the Motion Gestures SDK. The example firmware uses a sampling rate of 200 Hz, and the library is configured for self-capacitance sensing to use the QT8 Xplained Pro as the touch sensor.
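As a quick sanity check, the sampling rate determines how many position samples a gesture of a given duration will produce. The sketch below uses the 200 Hz rate from the example firmware; the half-second gesture duration is an arbitrary illustration:

```python
# Approximate number of touch samples produced by a gesture, given the
# firmware's sampling rate (200 Hz in the example project's configuration).
SAMPLING_RATE_HZ = 200

def samples_for_gesture(duration_s: float, rate_hz: int = SAMPLING_RATE_HZ) -> int:
    """Approximate sample count for a gesture of the given duration."""
    return round(duration_s * rate_hz)

# A half-second swipe yields about 100 (x, y) coordinate pairs.
print(samples_for_gesture(0.5))  # -> 100
```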
Once the kit is programmed with the desired configuration, you are ready to move on to collecting the serial data stream with MPLAB® Data Visualizer.
Configure MPLAB® Data Visualizer
Leave the board connected to the computer and open MPLAB® Data Visualizer. Load the Data Visualizer workspace file 2d-touch-position.dvws found in the example firmware repository. This workspace already contains the variable streamer required to parse the X/Y position data, and it will plot each variable once the serial port is configured.
After loading the Data Visualizer workspace file, select the Serial/CDC Connection that corresponds to the SAMC21 Kit. Adjust the baud rate to 115200 and click Apply. The DGI connection can also be disabled since we will not use any debug data.
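Conceptually, the variable streamer's job is to turn the raw serial stream into per-variable samples. The actual wire format is binary and is defined by the workspace's variable streamer configuration; the sketch below assumes a hypothetical plain-text `x,y` line format purely to illustrate the parsing step:

```python
from typing import List, Tuple

def parse_touch_stream(raw: str) -> List[Tuple[int, int]]:
    """Parse a text stream of touch positions into (x, y) pairs.

    Assumes a hypothetical 'x,y' line format for illustration; the real
    firmware uses framing defined by the Data Visualizer variable streamer.
    """
    points = []
    for line in raw.splitlines():
        line = line.strip()
        if not line:
            continue
        x_str, y_str = line.split(",")
        points.append((int(x_str), int(y_str)))
    return points

stream = "12,40\n13,42\n15,45\n"
print(parse_touch_stream(stream))  # -> [(12, 40), (13, 42), (15, 45)]
```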
Use the Play button on the Serial/CDC Connection to start data collection from the kit. Once data is streaming, it is available for use with the variable streamer.
Now select the same Serial/CDC Connection as the input data source for the Touch Position variable streamer, so that the data axes can be parsed from the stream.
The X/Y position data is now available in the time plot. Two-dimensional touch data is also visualized in the XY Plot, which plots the data visible in the Time Plot in two dimensions. Double click anywhere within the time plot to start/stop scrolling of the time axis, and zoom in or out to decrease or increase the length of the plotting window.
Select Data Region and Mark Time Window
Once the touch position data is visualized in the XY Plot, gesture data can be marked for use in the ML Plugin. To capture a new gesture, simply draw it on the QT8 Touch Surface and then focus the time window on the specific time region where the gesture was performed. Confirm that the gesture is properly segmented by checking the XY Plot. When ready, click the Mark button in the Time Axis menu.
Pressing Mark places the cursors at the bounds of the visible window. To select a new region of data, first reposition the desired data within the Time Plot, then press Mark again.
After marking the time window, the gesture data displayed in the XY Plot is ready to be used within the ML Plugin. Repeat this process as needed to capture new gestures for use with Motion Gestures.
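Conceptually, marking the time window selects only the samples whose timestamps fall between the two cursors. A minimal sketch of that selection (the timestamps and cursor positions below are illustrative, not taken from real captured data):

```python
def select_marked_window(samples, t_start, t_end):
    """Keep only (timestamp, x, y) samples inside the marked cursor bounds."""
    return [s for s in samples if t_start <= s[0] <= t_end]

# Four samples; only the middle two fall inside the marked window.
samples = [(0.00, 10, 10), (0.10, 12, 14), (0.20, 15, 19), (0.90, 40, 5)]
print(select_marked_window(samples, 0.05, 0.25))
# -> [(0.1, 12, 14), (0.2, 15, 19)]
```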
Uploading data to the Motion Gestures SDK
Log in With Your Motion Gestures Credentials
After selecting Motion Gestures within the ML Plugin, you will be prompted to log in. This enables the plugin to connect with your account on the Motion Gestures SDK for uploading the captured gesture data. If you do not yet have an account on the Motion Gestures SDK, you can create one for free on the registration page.
Upload to Gesture Library
To upload the new gesture to your Gesture Library, enter a name for the gesture and click Upload. Once the gesture is available in the Motion Gestures SDK library, you can add it to one of your projects and then train a model that can recognize it. To learn more about using the Motion Gestures SDK, see the "Motion Gestures User's Guide".
Upload to Project for Model Testing
Once you have trained a model in the Motion Gestures SDK you can test the model performance by uploading new gestures from the ML Plugin. The recognition results are displayed in the ML Plugin. This allows for validating model performance, and if tuning is needed, the test results can be used to iterate on and improve the project configuration.
To find your project's API key within the Motion Gestures SDK, open the project settings menu, and select Details. Then copy and paste the API key into the ML Plugin to upload new gestures for classification.
After uploading a gesture for testing, the response will be displayed in the ML Plugin along with the confidence rating associated with the recognition. This way the solution performance can be assessed with data directly from the target hardware.
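One common way to use the confidence rating during validation is to accept a recognition only when it clears a chosen threshold. The sketch below assumes a hypothetical response shape with `gesture` and `confidence` fields; the actual fields returned by the Motion Gestures SDK may differ:

```python
def accept_recognition(result, threshold=0.8):
    """Accept a recognition only if its confidence meets a chosen threshold.

    `result` mimics a hypothetical response shape ({'gesture', 'confidence'});
    the threshold of 0.8 is an arbitrary illustration, not an SDK default.
    """
    return result["gesture"] if result["confidence"] >= threshold else None

print(accept_recognition({"gesture": "circle", "confidence": 0.93}))  # -> circle
print(accept_recognition({"gesture": "swipe", "confidence": 0.41}))   # -> None
```

Low-confidence results flagged this way point at gestures worth re-recording or at project settings worth tuning.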
You should now understand how to collect live touch gesture data from your target device and send it to the Motion Gestures SDK for developing gesture recognition solutions. To deploy your own model as a static C library on a Microchip Arm® Cortex®-based 32-bit device, please contact Motion Gestures. If you would like to try out a demo library running in an embedded application then check out the "Motion Gestures Touch Demo".