As demonstrated in class, you can train your own object detector on your own dataset. This part mainly uses MobileNet and Yolo2. You can also design the network or formulate the task by yourself. Here, I follow this tutorial to train a raccoon detector.
Anaconda
Python3.6
Tensorflow 1.14.0
Keras 2.3.0
Opencv 3.3.0
Git
Ubuntu
GPU (for quicker training)
conda create -n yolo python=3.6
conda activate yolo
conda install -c conda-forge tensorflow
conda install -c conda-forge keras
conda install -c anaconda opencv
git clone https://github.com/AIWintermuteAI/Yolo-digit-detector.git
git clone https://github.com/datitran/raccoon_dataset.git
mv raccoon_dataset Yolo-digit-detector/
cd Yolo-digit-detector/
vim configs/raccoon.json
//inside the configs/raccoon.json
...
...
"train" : {
"actual_epoch": 20,
"train_image_folder": "raccoon_dataset/images"
"train_annot_folder": "raccoon_dataset/annotations",
...
"valid_image_folder": "raccoon_dataset/images_valid",
"valid_annot_folder": "raccoon_dataset/annotations_valid",
...
...
//end of file
python train.py -c configs/raccoon.json
You can play with some hyperparameters in the config file, such as actual_epoch, batch_size, learning_rate, or anchors. They influence the final performance of your model. For example, if you use a larger dataset containing more training images and more detection classes, you should train your model for more epochs to get better convergence. Or, if the target objects in your images are typically small, you can use smaller anchor parameters (they define the prior anchors for the detector; you can find more detail in Fig. 2 of the Yolo2 paper).
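If you prefer to adjust these values programmatically rather than editing the JSON by hand, a minimal Python sketch like the one below works. It assumes the hyperparameters sit under the "train" block of configs/raccoon.json as shown above; the exact key names (e.g. batch_size, learning_rate) should be verified against your copy of the config:
import json

# load the training config used by train.py
with open("configs/raccoon.json") as f:
    config = json.load(f)

# tweak a few hyperparameters under the "train" block
# (key names assumed from the config file above; verify them in your copy)
config["train"]["actual_epoch"] = 50      # more epochs for a larger dataset
config["train"]["batch_size"] = 8         # smaller batches if you run out of GPU memory
config["train"]["learning_rate"] = 1e-4   # lower it if training is unstable

with open("configs/raccoon.json", "w") as f:
    json.dump(config, f, indent=4)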
In this part, we use the Maix toolbox provided by Sipeed to convert the .tflite model into a .kmodel which can be run on the board.
Maix toolbox
Imagemagick
First of all, let's get the Maix toolbox from their GitHub repo:
git clone https://github.com/sipeed/Maix_Toolbox.git
Then, fix a bug:
cd Maix_Toolbox
vim get_nncase.sh
//inside the get_nncase.sh
...
...
tar -Jxf ncc-linux-x86_64.tar.xz
rm ncc-linux-x86_64.tar.xz
...
//end of file
Download the library:
bash get_nncase.sh
In order to convert the model, we need some example images for inference. These example images have to have the same resolution as your model's input. In this case, we define the input size in the configuration file raccoon.json:
{
"model" : {
"architecture": "MobileNet",
"input_size": 224,
...
...
}
As a result, we need some 224x224 images for inference. I directly use the training images we used to train our raccoon detector with Keras. To make sure those images are 224x224, I use the Imagemagick package to resize them directly:
cd ..
convert -resize 224x224! raccoon_dataset/images/raccoon-1.jpg Maix_Toolbox/images/raccoon-1.jpg
convert -resize 224x224! raccoon_dataset/images/raccoon-2.jpg Maix_Toolbox/images/raccoon-2.jpg
convert -resize 224x224! raccoon_dataset/images/raccoon-3.jpg Maix_Toolbox/images/raccoon-3.jpg
//3 images are enough
You can use your preferred way to do the resizing.
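For instance, if you do not want to install Imagemagick, a minimal Python sketch with Pillow does the same thing (the folder paths follow the layout used above; adjust them if yours differ):
import os
from PIL import Image   # pip install Pillow

src_dir = "raccoon_dataset/images"
dst_dir = "Maix_Toolbox/images"
os.makedirs(dst_dir, exist_ok=True)

# pick a few training images and force them to the model's 224x224 input size
names = [n for n in sorted(os.listdir(src_dir)) if n.lower().endswith((".jpg", ".png"))]
for name in names[:3]:   # 3 images are enough
    img = Image.open(os.path.join(src_dir, name)).convert("RGB")
    img.resize((224, 224)).save(os.path.join(dst_dir, name))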
Then, do the conversion:
cp model.tflite Maix_Toolbox/
cd Maix_Toolbox
./tflite2kmodel.sh model.tflite
If you successfully convert the model, you will see:
It shows you the details of your model, such as the I/O dimensions of the InputLayer, OutputLayer, and each Conv2d layer, as well as the memory usage of the converted model.
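You can also double-check those I/O dimensions yourself from the .tflite file, using the TensorFlow Lite interpreter that ships with the Tensorflow version installed above. A minimal sketch, assuming model.tflite is the file produced by train.py:
import tensorflow as tf

# load the .tflite model exported by the training step
interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

# print input/output tensor shapes, e.g. [1, 224, 224, 3] for the 224x224 RGB input
for detail in interpreter.get_input_details():
    print("input :", detail["name"], detail["shape"])
for detail in interpreter.get_output_details():
    print("output:", detail["name"], detail["shape"])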
In this part, we burn the Maixpy Firmware onto the board so that it can run micropython scripts. Here, I use the GUI package provided by Sipeed to burn the firmware. In order to use my laptop to demo this board anywhere, I switch back to my MacBook; since macOS is pretty similar to Ubuntu, it should be fine to keep using Ubuntu for this part as well.
Maixpy Firmware v.0.4.0_52
Kflash-GUI
First, get the firmware:
curl -L -O http://dl.sipeed.com/MAIX/MaixPy/release/master/maixpy_v0.4.0_52_g3b8c18b/maixpy_v0.4.0_52_g3b8c18b.bin
Or download from here.
Then, clone the Kflash GUI and install the requirements and packages:
git clone https://github.com/sipeed/kflash_gui.git
cd kflash_gui
curl -L -O https://github.com/sipeed/kflash_gui/releases/download/v1.5.3/kflash_gui_v1.5.3_linux.tar.xz
tar xf kflash_gui_v1.5.3_linux.tar.xz
rm -rf kflash_py
cp -r kflash_gui/kflash_py .
Run the Kflash GUI:
python3 kflash_gui.py
Let's burn the firmware:
If you successfully burn the firmware, you will see:
Instead of using this GUI, you can also use kflash.py to burn the firmware. It is also inside the kflash_gui folder. However, you have to make sure you use the correct arguments for the process function.
In some cases, you may want to use a lightweight Maixpy Firmware; Sipeed also provides a minimum version:
curl -L -O http://dl.sipeed.com/MAIX/MaixPy/release/master/maixpy_v0.4.0_52_g3b8c18b/maixpy_v0.4.0_52_g3b8c18b_minimum.bin
For example, if you want to burn your .kmodel into the Flash memory together with the firmware, you have to burn a lightweight firmware. With this lightweight firmware you cannot use the provided MaixPy IDE (which I will introduce later) to run a micropython script; instead, you have to use Serial I/O to execute it (I will also introduce this in the next part). Otherwise, either some parts of the firmware or your model would be erased. Either case will cause the model to fail to run, and there is no error or warning message. Thus, I suggest using an SD card to store the model and plugging it into the board when you need it.
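For reference, the two storage options show up in the micropython script as two different ways of loading the model. This is a minimal sketch using the MaixPy KPU module; the SD card path and the Flash address are examples and must match where you actually put the .kmodel:
import KPU as kpu

# option 1: load the model from an SD card (the approach suggested above)
task = kpu.load("/sd/model.kmodel")

# option 2: load the model from a Flash address burned together with the minimum firmware
# (example offset; it must match the address you used when burning the .kmodel)
# task = kpu.load(0x500000)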
You can find different firmware versions here. But in order to use the MaixPy IDE later, I suggest using v0.4.0_52.
In this part, I use the provided MaixPy IDE and Serial I/O to execute the micropython script that runs the .kmodel on the board. Instead of running the .kmodel with micropython, you can also run it with C; in that case, you have to use the Arduino IDE or PlatformIO to burn your code onto the board.
MaixPy IDE v0.2.4
Get the IDE:
//Download one of them based on your OS
For Windows:
http://dl.sipeed.com/MAIX/MaixPy/ide/_/v0.2.4/maixpy-ide-windows-0.2.4-installer-archive.7z
or
http://dl.sipeed.com/MAIX/MaixPy/ide/_/v0.2.4/maixpy-ide-windows-0.2.4.exe
For OSX:
http://dl.sipeed.com/MAIX/MaixPy/ide/_/v0.2.4/maixpy-ide-mac-0.2.4.dmg
For Linux:
http://dl.sipeed.com/MAIX/MaixPy/ide/_/v0.2.4/maixpy-ide-linux-x86_64-0.2.4-installer-archive.7z
or
http://dl.sipeed.com/MAIX/MaixPy/ide/_/v0.2.4/maixpy-ide-linux-x86_64-0.2.4.run
Install it after downloading, then run it. The first thing we need to check is the board version:
Then, check if it can connect to the board:
If you cannot connect to your board, you may have (1) selected the wrong board version or (2) burned the wrong Maixpy Firmware. For example, you cannot use this IDE to connect to the board when you have burned the minimum version firmware. Double-check that you burned the correct version of the Maixpy Firmware.
Open the Yolo-digit-detector/micropython_code/racoon_detector.py in this IDE and run it:
If you successfully run it, you will see:
The right part of this IDE shows the captured image and its RGB values in real time.
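For reference, such a detector script typically follows the standard MaixPy YOLO2 structure. The sketch below is only an illustration of that structure, not the exact racoon_detector.py; the anchors, thresholds, and model location are placeholders you should take from the repo's script:
import sensor, image, lcd, KPU as kpu

lcd.init()
sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QVGA)
sensor.set_windowing((224, 224))   # crop the frame to the model's 224x224 input
sensor.run(1)

anchors = (1.0, 1.2, 2.0, 2.4, 3.0, 3.6, 4.0, 4.8, 5.0, 6.0)   # placeholder anchors
task = kpu.load("/sd/model.kmodel")                            # or a Flash address, e.g. 0x500000
kpu.init_yolo2(task, 0.5, 0.3, 5, anchors)                     # probability threshold, NMS, anchor count, anchors

while True:
    img = sensor.snapshot()
    boxes = kpu.run_yolo2(task, img)
    if boxes:
        for b in boxes:
            img.draw_rectangle(b.rect())   # one box per detected raccoon
    lcd.display(img)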
Finally, you can test your raccoon detector on some random raccoon images from Google Images:
Let's see how to execute the micropython script over Serial I/O when you want to use the minimum version of the Maixpy firmware. In this example, I run another .kmodel, a tiny-yolo2 trained on the PASCAL dataset. Basically, use screen to connect to the board:
screen -L /dev/cu.usbserial-xel_sipeed0 115200
And, you will see:
Then press Ctrl+E, copy-paste the whole micropython script into the terminal, and press Ctrl+D to execute it. This basically puts all the code in the script into one line and sends it to the board:
From the script, you can find there are 20 classes (you can download the model from here): aeroplane, bicycle, bird, boat, bottle, bus, car, cat, chair, cow, diningtable, dog, horse, motorbike, person, pottedplant, sheep, sofa, train, and tvmonitor, which are defined in the PASCAL dataset. And I put the .kmodel in the Flash memory starting at 0x500000.
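The pasted script ties those two pieces together roughly like this (a sketch, not the exact script from the link; the detection loop itself is the same as in the earlier sketch):
import KPU as kpu

classes = ["aeroplane", "bicycle", "bird", "boat", "bottle", "bus", "car", "cat",
           "chair", "cow", "diningtable", "dog", "horse", "motorbike", "person",
           "pottedplant", "sheep", "sofa", "train", "tvmonitor"]

task = kpu.load(0x500000)   # the tiny-yolo2 .kmodel burned into Flash at 0x500000

# inside the detection loop, each result carries the class index and confidence:
#     for b in kpu.run_yolo2(task, img):
#         print(classes[b.classid()], b.value())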
Finally, let's do some tests:
MaixPy IDE: http://dl.sipeed.com/MAIX/MaixPy/ide
MaixPy Firmware: http://dl.sipeed.com/MAIX/MaixPy/release/master/
MaixPy DOC: https://maixpy.sipeed.com/en/ (https://github.com/sipeed/MaixPy)
MaixPy micropython example scripts: https://github.com/sipeed/MaixPy_scripts
Kflash toolbox: https://github.com/sipeed/kflash_gui
Sipeed BBS: https://bbs.sipeed.com
Train a flower classifier: https://iotdiary.blogspot.com/2019/07/maixpy-go-mobilenet-transfer-learning.html
Train a raccoon detector: https://www.instructables.com/id/Object-Detection-With-Sipeed-MaiX-BoardsKendryte-K/
Train a MobileNet classifier: https://www.instructables.com/id/Transfer-Learning-With-Sipeed-MaiX-and-Arduino-IDE/
C code for running MobileNet classifier: https://github.com/AIWintermuteAI/transfer_learning_sipeed/tree/master/mobilenet_v1_transfer_learning
Run 20 classes TinyYolo2 on board: https://bbs.sipeed.com/t/topic/683
Yolo: https://pjreddie.com/yolo/
Yolo2: https://arxiv.org/pdf/1612.08242.pdf
MobileNet: https://arxiv.org/pdf/1704.04861.pdf
CSE599 G1 18au: https://courses.cs.washington.edu/courses/cse599g1/18au/
CS231n: http://cs231n.github.io