
Basic Gesture Recognition Using mmWave Sensor - TI AWR1642

This setup collects data from the TI AWR1642 via its serial port and lets the user choose one of several neural network architectures - convolutional, ResNet, LSTM, or Transformer. The selected network is then used to recognize and classify specific gestures:

  • None (random non-gestures)
  • Swipe Up
  • Swipe Down
  • Swipe Right
  • Swipe Left
  • Spin Clockwise
  • Spin Counterclockwise
  • Letter Z
  • Letter S
  • Letter X
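
For reference, these classes correspond to the subdirectories of mmwave_gesture/data/ shown later in this README. The mapping below is an illustrative sketch only; the authoritative label encoding lives in the repository code:

# Assumed mapping from data/ subdirectory names to gesture classes;
# the actual label order is defined by the repository's loader code.
GESTURES = {
    'none': 'None (random non-gestures)',
    'up': 'Swipe Up',
    'down': 'Swipe Down',
    'right': 'Swipe Right',
    'left': 'Swipe Left',
    'cw': 'Spin Clockwise',
    'ccw': 'Spin Counterclockwise',
    'z': 'Letter Z',
    's': 'Letter S',
    'x': 'Letter X',
}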

Demo

Getting Started

Deps:

  • python 3.8+
  • unzip (optional)
  • curl (optional)

unzip and curl are used by the fetch script.

Installation

Install the mmwave_gesture package locally:

git clone https://github.com/vilari-mickopf/mmwave-gesture-recognition.git
cd mmwave-gesture-recognition
pip install -e .
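
An optional sanity check that the editable install resolved correctly (a minimal sketch; it simply prints where Python found the package, which should be inside the cloned repo):

import mmwave_gesture

# Should point at the cloned repository, not site-packages.
print(mmwave_gesture.__file__)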

Data and models

You can run the ./fetch script to download and extract:

  • data (20k samples - 2k per class) ~120MB

  • models (Conv1D, Conv2D, ResNet1D, ResNet2D, LSTM and Transformer models) ~320MB

Note: Models are generated using TensorFlow 2.15, which is why the tf version is pinned in setup.py. All models except the ResNets can be loaded with tf versions from 2.12 to 2.15. Training should work with all tf versions up to 2.15; tf 2.16 introduced changes that break nested models.
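
A defensive loading sketch (not part of the repo) that checks the running tf version before loading one of the fetched models; compile=False is an assumption to sidestep any custom training objects, and custom layers may still require custom_objects:

import tensorflow as tf

# Models were saved with TF 2.15; ResNets load only on 2.15,
# the rest on 2.12-2.15 (per the note above).
major, minor = (int(v) for v in tf.__version__.split('.')[:2])
assert (2, 12) <= (major, minor) <= (2, 15), f'tf {tf.__version__} may not load these models'

model = tf.keras.models.load_model('mmwave_gesture/models/Conv1DModel/model.h5', compile=False)
model.summary()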

To get the required files manually, follow the provided links to download them, then extract the contents into mmwave_gesture/data/ and mmwave_gesture/models/ as appropriate.

The end result should look like this:

mmwave_gesture/
│ communication/
│ data/
│ │ ccw/
│ │ cw/
│ │ down/
│ │ │ sample_1.npz
│ │ │ sample_2.npz
│ │ │ ...
│ │ └ sample_2000.npz
│ │ left/
│ │ none/
│ │ right/
│ │ s/
│ │ up/
│ │ x/
│ │ z/
│ │ __init__.py
│ │ formats.py
│ │ generator.py
│ │ loader.py
│ │ logger.py
│ └ preprocessor.py
│ models/
│ │ Conv1DModel/
│ │ │ confusion_matrix.png
│ │ │ history
│ │ │ model.h5
│ │ │ model.png
│ │ └ preprocessor
│ │ Conv2DModel/
│ │ LstmModel/
│ │ ResNet1DModel/
│ │ ResNet2DModel/
│ └ TransModel/
│ utils/
│ __init__.py
│ model.py
...
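
Each sample is a .npz archive. A minimal sketch for inspecting one (the array names inside the archive are not documented here, so the sketch prints them rather than assuming any):

import numpy as np

# Hypothetical sample path; point this at any fetched sample.
sample = np.load('mmwave_gesture/data/up/sample_1.npz', allow_pickle=True)
for name in sample.files:
    arr = sample[name]
    print(name, getattr(arr, 'shape', type(arr)))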

Serial permissions

The group name can differ from distribution to distribution.

Arch

gpasswd -a <username> uucp

Ubuntu

gpasswd -a <username> dialout

The change will take effect on the next login.

The group name can be obtained by running:

stat /dev/ttyACM* | grep Gid

One time only (permissions will be reset after unplugging):

chmod 666 /dev/ttyACM*
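
Once permissions are in place, a quick pyserial check can confirm the port is accessible (a minimal sketch; the port name and the 115200 baud rate for the CLI port are assumptions based on TI demo defaults):

import serial

# Opening the port read-write; a PermissionError here usually means
# the group change above has not taken effect yet (log out and back in).
with serial.Serial('/dev/ttyACM0', baudrate=115200, timeout=1) as port:
    print('opened', port.name)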

Flashing

The firmware used for the AWR1642 is the mmWaveSDK demo provided with version 02.00.00.04. The bin file is located in the firmware directory.

  1. Close SOP0 and SOP2, and reset the power.
  2. Start the console and run the flash command:
python mmwave-console.py
>> flash xwr16xx_mmw_demo.bin
  3. Remove SOP0 and reset the power again.

Running

If the board was connected before starting the console, the script should automatically find the ports and connect to them (this applies only to boards with XDS). If the board is connected after starting the console, run the autoconnect command. If for some reason this does not work, manual connection is available via the connect command; manual connection can also be used for boards without XDS. Type help connect or help autoconnect for more info.

If the board is connected, the prompt will be green; otherwise, it will be red.

After connecting, run the plotter and prediction with the following commands:

python mmwave-console.py
>> plot
>> predict

Use Ctrl-C to stop this command.

Collecting data

The console can be used for easy data collection. Use the log command to save gesture samples in .npz format in the mmwave_gesture/data/ directory (or a custom directory specified with the set_data_dir command). If nothing is captured for more than half a second, the command stops automatically. The redraw/remove commands will redraw/remove the last captured sample.

python mmwave-console.py
>> listen
>> plot
>> set_data_dir /path/to/custom/data/dir
>> log up
>> log up
>> redraw up
>> remove up
>> log down
>> log ccw
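
After a collection session, a short sketch like the following can verify how many samples each class has (assumes the default data directory and the layout shown earlier):

from pathlib import Path

# Count logged .npz samples per gesture class.
data_dir = Path('mmwave_gesture/data')
for gesture in sorted(p for p in data_dir.iterdir() if p.is_dir()):
    print(f'{gesture.name}: {len(list(gesture.glob("*.npz")))} samples')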

Training

python mmwave-console.py
>> set_data_dir /path/to/custom/data/dir
>> train

or

python mmwave_gesture/model.py

Note: Default data dir is mmwave_gesture/data.

Selecting model

By default, the conv2d model is used. Other models can be selected using the set_model command.

python mmwave-console.py
>> set_model conv1d
>> set_model lstm
>> set_model trans

Help

Use the help command to list all available commands and get documentation on them.

python mmwave-console.py
>> help
>> help flash
>> help listen

Acknowledgments

  • Thanks to NOVELIC for providing me with sensors

Authors

  • Filip Markovic
