Merge branch 'main' of ssh://192.168.0.63:2290/raphael.maenle/hailo_inference into main

Thomas Hamböck 2022-03-21 16:54:09 +01:00
commit dd99e3570f


@@ -18,7 +18,7 @@ For deployment
- Ubuntu 18.04 or 20.04
-For taining
+For training
- GPU enabled device (recommended)
- [hailo.ai](https://www.hailo.ai) developer account
- [docker](https://docs.docker.com/engine/install/ubuntu/)
@@ -58,6 +58,7 @@ Here you'll train, quantize, compile (on a gpu if possible) and infer (on the ha
explains well how to create that.
- There's a minimal example dataset in this repository under `/dataset`
- To mount this, use e.g.: `docker run -it --gpus all --ipc=host -v /path/to/dataset/:/dataset yolov5:v0`
- `python train.py --img 640 --batch 16 --epochs 3 --data /dataset/dataset/dataset.yaml --weights yolov5m.pt --cfg models/yolov5m.yaml`
- For training, make sure you target the correct `--cfg` and use the correct `--weights` (which are now conveniently already in the hailo docker)
- Once you've saved the best.pb ONNX file, you can exit this docker container
- Once you are done with the 'training and exporting to ONNX' steps, move on to the next step (a sketch of the export command follows below).
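As a minimal sketch of that flow inside the training container, assuming the standard ultralytics/yolov5 scripts (`train.py`, `export.py`) and their default output path `runs/train/exp/weights/`; the scripts bundled in the Hailo docker may use slightly different names or flags:

```bash
# Train on the mounted example dataset (same command as above)
python train.py --img 640 --batch 16 --epochs 3 \
    --data /dataset/dataset/dataset.yaml \
    --weights yolov5m.pt --cfg models/yolov5m.yaml

# Export the best checkpoint to ONNX for the Hailo toolchain
# (export.py with --include onnx is the upstream yolov5 way; the Hailo
#  docker may ship its own export script instead)
python export.py --weights runs/train/exp/weights/best.pt --include onnx --imgsz 640

# Copy the result onto the mounted volume so it survives exiting the container
cp runs/train/exp/weights/best.onnx /dataset/
```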
@@ -89,7 +90,7 @@ Further Notes and Examples:
# Other Networks
- For yolov5m, Hailo provides a configuration yaml which defines the quantization levels for the various networks. If you have a custom network, you will have to define your own yaml file. I liked using [netron](https://github.com/lutzroeder/Netron) to visualize the network for that (see the snippet after this list).
-- Anything the hailo\_model\_zoo provides a configuration file for (under `hailo_odel_zoo/cfg/networks/`) can be easily trained and deployed using the process
+- Anything the hailo\_model\_zoo provides a configuration file for (under `hailo_model_zoo/cfg/networks/`) can be easily trained and deployed using the process
described above.
- Also check out the [Hailo Tappas](https://hailo.ai/developer-zone/tappas-apps-toolkit/), which supply a variety of pre-trained _.hef_ files that you can run
without having to compile anything.
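
As a minimal sketch of the netron inspection mentioned in the list above, assuming netron's pip package and its command-line entry point, with `best.onnx` standing in for your own export:

```bash
# Install netron and open the exported ONNX graph in a local browser tab,
# e.g. to read off the end-node names needed for a custom network yaml.
# `best.onnx` is a placeholder for your own exported model.
pip install netron
netron best.onnx
```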