ML Plugin User's Guide

 Objective

This guide shows you how to use the ML plugin to capture your data and transmit it to our partner platforms, where you can develop machine learning solutions for deployment in your embedded application. The ML plugin works within MPLAB Data Visualizer, so you can capture a live data stream from your target device, select the desired data region, and then upload it to the platform of your choice.

MLPlugin_0_LandingPage.png

Our Machine Learning Partners


cartesiam-logo.png

Cartesiam

With the ML plugin, you can log your data to a file that can be imported into NanoEdge AI Studio, where you can develop anomaly detection solutions that learn at the edge. To learn more, visit the Cartesiam website.


ei-logo-rgb.png

Edge Impulse

With the ML plugin, you can upload your edge data to the Edge Impulse Studio, where you can create the next generation of intelligent devices with embedded machine learning. To learn more, visit the Edge Impulse website.


Motion-Gestures-logo.png

Motion Gestures

With the ML plugin, you can define new touch gestures on your target hardware and then upload them to the MG SDK, where you can train a machine learning model to recognize your custom gestures. To learn more, visit the Motion Gestures website.


Partner solutions are suitable for deployment on Microchip Arm® Cortex®-based 32-bit microcontrollers and microprocessors.

 Materials

Hardware Tools

Cartesiam & Edge Impulse

SAMD21 ML Evaluation Kit with Bosch IMU

Motion Gestures

SAMC21 Xplained Pro & QT8 Xplained Pro

Software Tools

The ML plugin and MPLAB Data Visualizer are both plugins for MPLAB X IDE and can be installed from the Plugins Manager.

Exercise Files

Cartesiam & Edge Impulse

Motion Gestures

 Connection Diagram

SAMD21 ML Evaluation Kit with Bosch IMU

SAMD21-ML-Eval-Kit-Bosch

SAMC21 Xplained Pro & QT8 Xplained Pro

SAMC21+QT8

 Procedure

This section contains instructions for using the ML plugin with all three partners. The first part covers capturing live sensor data streamed from the target hardware. This can be done with the SAMD21 ML Evaluation Kit for six-axis IMU data or with the SAMC21 Xplained Pro and QT8 Xplained Pro for two-axis touch position data. After the data has been captured in MPLAB Data Visualizer, it can be imported or uploaded to our partner platforms. This guide covers using the six-axis IMU data with Cartesiam and Edge Impulse, and the two-axis touch position data with Motion Gestures.


Capturing sensor data with MPLAB Data Visualizer

1

Program the Kit with Data Streamer Firmware

Use MPLAB X IDE to program the desired kit with the corresponding example project.

If you are not familiar with MPLAB X IDE, please visit the "MPLAB X IDE" Developer Help page.

2

Connect Target Device to MPLAB Data Visualizer

Leave the board connected to your computer and open the MPLAB Data Visualizer plugin. On the left, select the Serial/CDC Connection that corresponds to the kit you are using. Then adjust the baud rate to 115200 and click Apply.

MLPlugin_port_baud.png

If you are not familiar with MPLAB Data Visualizer, please see the "MPLAB Data Visualizer User's Guide".
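If you are building or modifying your own firmware, the target's UART settings must match what you select in Data Visualizer (115200 baud, 8 data bits, no parity, 1 stop bit). The sketch below is only a hedged illustration of that idea: usart_config_t and usart_configure() are hypothetical stand-ins for whatever USART driver your project actually uses, not functions from the example firmware.

/*
 * Hedged sketch, not taken from the example projects: the firmware's UART
 * must use the same settings applied in Data Visualizer.
 */
#include <stdint.h>

typedef struct {
    uint32_t baudrate;
    uint8_t  data_bits;
    uint8_t  parity;      /* 0 = none */
    uint8_t  stop_bits;
} usart_config_t;

extern void usart_configure(const usart_config_t *cfg);   /* hypothetical driver call */

void debug_port_init(void)
{
    const usart_config_t cfg = {
        .baudrate  = 115200,   /* must match the baud rate selected in Data Visualizer */
        .data_bits = 8,
        .parity    = 0,
        .stop_bits = 1,
    };
    usart_configure(&cfg);
}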

3

Configure the Variable Streamer

On the Serial/CDC Connection you are using, click the new variable streamer button to configure how variables are parsed from the data stream.

MLPlugin_port_var_streamer.png

SAMD21 ML Evaluation Kit (6-Axis IMU Data)

The IMU data types are Int16 and they are output from the kit in the following order: aX, aY, aZ, gX, gY, gZ.

MLPlugin_imu_streamer.png
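For context on what the variable streamer is parsing, the following is a minimal firmware-side sketch, not the kit's actual data streamer code. It assumes the streamer is configured for framed data in which a fixed start-of-frame byte opens each packet and its one's complement closes it, and that uart_write() is a hypothetical blocking transmit routine from your board support code. The payload is simply the six Int16 values in the order listed above, little-endian as on a Cortex-M device; if your streamer configuration uses different framing, adjust the start and end bytes accordingly.

/*
 * Hedged sketch (not the kit's firmware): stream one frame of six-axis IMU
 * data in the order the variable streamer expects (aX, aY, aZ, gX, gY, gZ).
 */
#include <stdint.h>
#include <string.h>

#define START_OF_FRAME 0xA5u                       /* example value; must match the streamer config */
#define END_OF_FRAME   ((uint8_t)~START_OF_FRAME)  /* assumed one's-complement end-of-frame */

extern void uart_write(const uint8_t *buf, uint32_t len);  /* hypothetical BSP transmit routine */

typedef struct {
    int16_t ax, ay, az;   /* accelerometer, raw counts */
    int16_t gx, gy, gz;   /* gyroscope, raw counts */
} imu_sample_t;

static void stream_imu_sample(const imu_sample_t *s)
{
    uint8_t frame[2 + sizeof(imu_sample_t)];

    frame[0] = START_OF_FRAME;
    memcpy(&frame[1], s, sizeof(*s));      /* Cortex-M is little-endian, matching Int16 parsing */
    frame[1 + sizeof(*s)] = END_OF_FRAME;

    uart_write(frame, sizeof(frame));
}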

Next, plot the variables for viewing in the Time Plot.

ML_Plugin_time_plot.png

SAMC21 + QT8 (2-Axis Touch Position Data)

The touch position data types are UInt16 and they are output from the kit in the following order: x, y.

MLPlugin_touch_streamer.png
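As with the IMU case, here is a hedged sketch of how the firmware side might emit one touch frame. It is an illustration under the same assumptions (hypothetical uart_write(), start byte followed by its one's complement), this time packing the two UInt16 values byte by byte in little-endian order.

/*
 * Hedged sketch (not the kit's firmware): pack one frame of 2-axis touch
 * position data (x, y) as little-endian UInt16 values.
 */
#include <stdint.h>

#define START_OF_FRAME 0xA5u                       /* example value; must match the streamer config */
#define END_OF_FRAME   ((uint8_t)~START_OF_FRAME)

extern void uart_write(const uint8_t *buf, uint32_t len);  /* hypothetical BSP transmit routine */

static void stream_touch_position(uint16_t x, uint16_t y)
{
    uint8_t frame[6];

    frame[0] = START_OF_FRAME;
    frame[1] = (uint8_t)(x & 0xFF);   /* x, low byte first */
    frame[2] = (uint8_t)(x >> 8);
    frame[3] = (uint8_t)(y & 0xFF);   /* y, low byte first */
    frame[4] = (uint8_t)(y >> 8);
    frame[5] = END_OF_FRAME;

    uart_write(frame, sizeof(frame));
}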

Next, plot the variables for viewing in the Time Plot.

MLPlugin_time_plot_touch.png

To use the touch position data for capturing 2D gestures, open the XY Plot window and select sources for the X and Y axes.

MLPlugin_xyplot.png

4

Select Data Region and Mark Time Window

Use the play/stop button on the Serial/CDC Connection to control the collection of data from the kit.

MLPlugin_port_start_stream.png

Next, select a region of interest by focusing the Time Plot on it. Drag the plot to the desired region of data, scrolling to zoom in or out as needed. Once you are satisfied with the data visible in the Time Plot, click the Mark button to mark this region for use in the ML plugin.

MLPlugin_mark_window.png

After marking the time window, you can move on to the ML plugin, where you can send the selected data to one of our ML partner platforms.

Jump to a section: Cartesiam, Edge Impulse, Motion Gestures

You will need to have the ML plugin installed to have access to the ML plugin window within MPLAB Data Visualizer.


Saving data for import to Cartesiam's NEAI Studio

1

Save to CSV File

After selecting Cartesiam within the ML plugin, you will have two file format options to choose from: Mono-Sensor is used for any data source with three axes or fewer, and Multi-Sensor is used for any data source with more than three axes. To learn more about formatting signal files for import into NanoEdge AI Studio, check out Cartesiam's documentation.

MLPlugin_Cartesiam.png

For this example, which uses six-axis IMU data, we will use the Multi-Sensor format. After selecting the proper format, click Save Data. This opens the MPLAB Data Visualizer dialog for saving to a CSV file, with the formatting required by Cartesiam pre-selected. Check that the parameters are set as in the image below and then click Save.

Cartesiam-multi.png
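The plugin writes this file for you, so the following is purely an illustration of the layout: a hedged C sketch that writes one signal per CSV line with the six axis values interleaved sample by sample, which is the general shape of a NanoEdge AI multi-axis signal file. The buffer length, delimiter, and names below are assumptions made for the example; treat Cartesiam's documentation as the authority on the exact format.

/*
 * Illustration only (the ML plugin generates the CSV for you): write one
 * interleaved six-axis signal per line, e.g. aX1,aY1,...,gZ1,aX2,...,gZn.
 */
#include <stdio.h>
#include <stdint.h>

#define AXES        6      /* aX, aY, aZ, gX, gY, gZ */
#define BUFFER_LEN  256    /* samples per signal line; assumed value for illustration */

static void write_signal_line(FILE *f, const int16_t samples[BUFFER_LEN][AXES])
{
    for (int i = 0; i < BUFFER_LEN; i++) {
        for (int a = 0; a < AXES; a++) {
            fprintf(f, "%d%s", samples[i][a],
                    (i == BUFFER_LEN - 1 && a == AXES - 1) ? "\n" : ",");
        }
    }
}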

2

Import CSV to the NEAI Studio

For a detailed guide on importing CSV files into your NEAI Studio project see Cartesiam's documentation.


Uploading data to the Edge Impulse Studio

1

Log in With Your Edge Impulse Credentials

After selecting Edge Impulse within the ML plugin you will be prompted to log in. This will enable the plugin to retrieve your projects from the Edge Impulse Studio and allow you to upload data to any of your projects.

2

Upload Your Data

Once you have logged in, configure the upload by selecting the data sources you would like to upload and the project to upload them to. You can also specify the endpoint within the selected project (Training, Testing, or Anomaly). The device name tags the uploaded data to show which device it came from, and the data label is used to generate the file name and to mark the class of the data sample.

MLPlugin_EI_upload.png

Sensor names should remain consistent within each project in the Edge Impulse Studio. If you are adding more data to a project that already contains data, then be sure to use the same sensor names.
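The plugin performs the upload for you, but as a point of reference, the sketch below shows roughly what an upload to the Edge Impulse ingestion service looks like from C with libcurl. The ingestion URL, the header names, and the requirement that the body follow Edge Impulse's data acquisition format are assumptions drawn from Edge Impulse's public documentation rather than from this guide, and the API key, label, and file name are placeholders; the Training, Testing, or Anomaly selection in the plugin corresponds to the matching path segment of the URL.

/*
 * Hedged illustration, not part of the ML plugin: upload one sample to the
 * Edge Impulse ingestion service with libcurl.
 */
#include <curl/curl.h>

int upload_sample(const char *json_body)   /* body assumed to follow the EI data acquisition format */
{
    CURL *curl = curl_easy_init();
    if (!curl) return -1;

    struct curl_slist *headers = NULL;
    headers = curl_slist_append(headers, "x-api-key: ei_0123456789abcdef");  /* placeholder project key */
    headers = curl_slist_append(headers, "x-label: idle");                   /* data label */
    headers = curl_slist_append(headers, "x-file-name: idle.01.json");       /* placeholder file name */
    headers = curl_slist_append(headers, "Content-Type: application/json");

    curl_easy_setopt(curl, CURLOPT_URL,
                     "https://ingestion.edgeimpulse.com/api/training/data"); /* or testing/anomaly */
    curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headers);
    curl_easy_setopt(curl, CURLOPT_POSTFIELDS, json_body);

    CURLcode res = curl_easy_perform(curl);

    curl_slist_free_all(headers);
    curl_easy_cleanup(curl);
    return (res == CURLE_OK) ? 0 : -1;
}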

3

Creating an Impulse

Once you have uploaded various classes of data to the Edge Impulse Studio you can train an Impulse to recognize these classes and to detect anomalous data. To learn more about using the Edge Impulse Studio see the Edge Impulse Docs.

4

Live Classification

To test a trained Impulse with new data from the target device, navigate to the Live Classification tab within the Edge Impulse Studio and upload new data from the ML plugin to the Testing endpoint within the desired project. After uploading, the Edge Impulse Studio will refresh to show you the classification results for the new data.

EI_testing.png

Uploading data to the Motion Gestures SDK

1

Log in With Your Motion Gestures Credentials

After selecting Motion Gestures within the ML plugin you will be prompted to log in. This will enable the plugin to connect with your account on the Motion Gestures SDK for uploading the captured gesture data.

2

Upload Your Gesture Data

New gestures can be uploaded for training new models and for testing model performance.

a

Upload to Gesture Library

When uploading to your library, you must enter a name for your new gesture. Once the gesture is available in the Motion Gestures SDK library, you can add it to one of your projects to train a model that can recognize it. To learn more about using the Motion Gestures SDK, see the "Motion Gestures User's Guide".

MLPlugin_MG_library.png

b

Upload to Project for Model Testing

Once you have trained a model in the Motion Gestures SDK you can test the model performance by uploading new gestures from the ML plugin. The recognition result will be displayed in the ML plugin. This allows for model verification and tuning.

MLPlugin_MG_testing.png

To find your project's API key within the Motion Gestures SDK, open the project settings menu and select Details.

MG_project_details.png

Learn more about Motion Gestures' touch gesture recognition solution by checking out the guide for the Motion Gestures Touch Demo.

 Results

You should now understand how to collect live sensor data from your target device and send it to one of our ML partner platforms for developing ML applications that can be used in your embedded application.


© 2020 Microchip Technology, Inc.
Notice: ARM and Cortex are the registered trademarks of ARM Limited in the EU and other countries.
Information contained on this site regarding device applications and the like is provided only for your convenience and may be superseded by updates. It is your responsibility to ensure that your application meets with your specifications. MICROCHIP MAKES NO REPRESENTATIONS OR WARRANTIES OF ANY KIND WHETHER EXPRESS OR IMPLIED, WRITTEN OR ORAL, STATUTORY OR OTHERWISE, RELATED TO THE INFORMATION, INCLUDING BUT NOT LIMITED TO ITS CONDITION, QUALITY, PERFORMANCE, MERCHANTABILITY OR FITNESS FOR PURPOSE. Microchip disclaims all liability arising from this information and its use. Use of Microchip devices in life support and/or safety applications is entirely at the buyer's risk, and the buyer agrees to defend, indemnify and hold harmless Microchip from any and all damages, claims, suits, or expenses resulting from such use. No licenses are conveyed, implicitly or otherwise, under any Microchip intellectual property rights.