adds further instructions

This commit is contained in:
Raphael Maenle 2022-03-21 09:56:09 +01:00
parent 0dbabe328d
commit baefc3c761

You'll need a way to train your network (yielding a .pb file from it) and the hailo software suite to generate the Hailo File from that.
- always download the newest version of the [hailo software suite](https://hailo.ai/developer-zone/sw-downloads) from the [hailo.ai software download zone](https://hailo.ai/developer-zone/sw-downloads/),
which is getting monthly updates at the time of writing. This of course means that these notes are rapidly aging into inaccuracy.
- clone the [hailo model zoo](https://github.com/hailo-ai/hailo_model_zoo), which has some convenient yolov5
Docker containers.
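
For example, a minimal way to get the model zoo next to the unzipped suite (the target directory is up to you):

```bash
# clone the model zoo; it ships the yolov5 retraining Docker setup used below
git clone https://github.com/hailo-ai/hailo_model_zoo.git
```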
### Setup
- unzip the `hailo software suite` and run `./hailo_sw_suite_docker_run.sh` from inside the extracted directory. If this is your first time running it, the container will be set up.
Otherwise, you'll have to pass either `--resume` or `--override`
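
A minimal sketch of that, assuming the suite was downloaded as `hailo_sw_suite.zip` (the actual archive name depends on the release you grabbed):

```bash
# hypothetical archive name -- use whatever the download zone gives you
unzip hailo_sw_suite.zip -d hailo_sw_suite
cd hailo_sw_suite
./hailo_sw_suite_docker_run.sh            # first run: sets up the container
./hailo_sw_suite_docker_run.sh --resume   # later runs: re-enter the existing container
```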
## For your Device with a Hailo Chip attached
- download the newest hailo rt from the [hailo.ai software download zone](https://hailo.ai/developer-zone/sw-downloads/)
- extract the file and run the installer. Once you're done, reboot.
- Source the virtual environment under `/path/to/hailo_rt/Installer/hailo_platform_venv/bin/activate`
- test the virtual environment by running `hailo`
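
A rough sketch of those steps; the archive and installer names are assumptions, so adjust them to whatever your download actually contains:

```bash
# hypothetical archive and installer names
unzip hailort.zip -d hailo_rt
cd hailo_rt/Installer
./install.sh            # run whatever installer script the package ships
sudo reboot

# after the reboot, on the device:
source /path/to/hailo_rt/Installer/hailo_platform_venv/bin/activate
hailo                   # should print the CLI usage if everything is set up
```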
# Setup Example for a custom Yolov5m
Here you'll train, quantize and compile your network (on a GPU if possible) and then run inference (on the hailo chip).
## Train your own Yolov5m
- For yolov5 we'll be using hailo Docker containers, which are based on the [ultralytics yolov5 containers](https://github.com/ultralytics/yolov5)
- the hailo model zoo now has a [guide on how to train yolov5 for hailo](https://github.com/hailo-ai/hailo_model_zoo/blob/master/docs/RETRAIN_ON_CUSTOM_DATASET.md); follow that one. Some notes:
- You'll need to create your own dataset structure for the training process. [This guide](https://github.com/ultralytics/yolov5/wiki/Train-Custom-Data)
explains well how to create that.
- There's a minimal example dataset in this repository under `/dataset`
- To mount this, use e.g.: `docker run -it --gpus all --ipc=host -v /path/to/dataset/:/dataset yolov5:v0` (see the sketch after this list)
- For training, make sure you target the correct model config (`--cfg`) and use the correct `--weights` (which are now conveniently already in the hailo docker)
- once you've exported the best.onnx file, you can exit this docker container
- once you are done with the steps 'training and exporting to ONNX', move on to the next step.
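
A hedged sketch of that training flow, assuming the container exposes the standard ultralytics `train.py` interface (the image tag, dataset paths and hyper-parameters below are placeholders):

```bash
# mount the dataset into the hailo yolov5 container (placeholder paths and image tag)
docker run -it --gpus all --ipc=host -v /path/to/dataset/:/dataset yolov5:v0

# inside the container: train with the standard ultralytics interface
python train.py --img 640 --batch 16 --epochs 100 \
    --data /dataset/dataset.yaml --cfg models/yolov5m.yaml --weights yolov5m.pt

# then export the best checkpoint to ONNX as described in the hailo retraining guide;
# that best.onnx is what the next section consumes
```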
## Create Hailo representation (hef)
- in `hailo_sw_suite_docker_run.sh` add a volume so you can access your best.onnx file; for example,
in the `DOCKER_ARGS` you could add `-v /home/user/files:/files` (see the sketch at the end of this section)
- run `./hailo_sw_suite_docker_run.sh --resume` to get into the docker container
- follow the rest of the tutorial from [the hailo model zoo](https://github.com/hailo-ai/hailo_model_zoo/blob/master/docs/RETRAIN_ON_CUSTOM_DATASET.md): `cd hailo_model_zoo` and continue from there.
- Note that for yolov5m, Hailo provides a configuration yaml which defines the quantization levels for the various networks. If you have a custom network
you will have to define your own yaml file. I liked using [netron](https://github.com/lutzroeder/Netron) to visualize the network for that.

Further Notes and Examples:
- Another way to compile the model is to run: `python hailo_model_zoo/main.py compile yolov5m --ckpt /files/my_dataset/best.onnx --calib-path /files/my_dataset/dataset/dataset/images/ --yaml ./hailo_model_zoo/cfg/networks/yolov5m.yaml`
- note that this quantizes and compiles to a HEF file in one go. If you just want to quantize into a _har_ file, run `python hailo_model_zoo/main.py quantize yolov5m --ckpt /files/my_dataset/best.onnx --calib-path /files/my_dataset/dataset/dataset/images/ --yaml ./hailo_model_zoo/cfg/networks/yolov5m.yaml`, which yields the _.har_ representation of the network
- You have to provide a list of images for quantization (`--calib-path`)
- also provide a quantization scheme (`--yaml`), for which hailo\_model\_zoo provides a variety
- **From what I know, there's no real reason not to use _compile_ directly.**
- Note that the file locations in these examples differ from Hailo's. These worked, while Hailo's didn't.
- follow the getting started guide from [hailo model zoo / GETTING STARTED](https://github.com/hailo-ai/hailo_model_zoo/blob/master/docs/GETTING_STARTED.md), specifically the
_optimizing_ and _compile_ steps, if you want more information on quantization parameters and how to go on from there.
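
Putting the pieces above together, a sketch of the whole compile step (the `/files/...` paths, the image folder layout and the exact way of extending `DOCKER_ARGS` are assumptions based on the examples above):

```bash
# 1) in hailo_sw_suite_docker_run.sh, extend the existing DOCKER_ARGS with a bind mount,
#    e.g. something along the lines of:
#    DOCKER_ARGS="${DOCKER_ARGS} -v /home/user/files:/files"

# 2) re-enter the suite container and compile
./hailo_sw_suite_docker_run.sh --resume
cd hailo_model_zoo
python hailo_model_zoo/main.py compile yolov5m \
    --ckpt /files/my_dataset/best.onnx \
    --calib-path /files/my_dataset/dataset/dataset/images/ \
    --yaml ./hailo_model_zoo/cfg/networks/yolov5m.yaml
# the resulting .hef file is what you copy over to the device with the Hailo chip
```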
## run inference
- this git repository includes an _inference.py_ file, which loads a specified _.hef_ file. Change that to your _.hef_ file location.
- If you have anything other than yolov5m, you will have to change the pre/post-processing steps, otherwise it won't work. Check out the Hailo Handler class and go from there.
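
For completeness, a sketch of running it on the device (paths are placeholders; the virtual environment is the one set up in the HailoRT section above):

```bash
# on the device, inside the hailo_platform virtualenv
source /path/to/hailo_rt/Installer/hailo_platform_venv/bin/activate
# after pointing inference.py at your compiled .hef file:
python inference.py
```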
# Other Networks
- what other networks can I deploy on Hailo? Anything the hailo\_model\_zoo provides a configuration file for (under `hailo_model_zoo/cfg/networks/`) can be easily trained and deployed using the process
described above.
- Also check out the [Hailo Tappas](https://hailo.ai/developer-zone/tappas-apps-toolkit/) which supply a variety of pre-trained _.hef_ files, which you can run
without having to compile anything.