Compare commits
5 commits: baefc3c761 ... dd99e3570f

| SHA1 |
|---|
| dd99e3570f |
| e652578793 |
| cce1dd1c84 |
| d8809ba7e7 |
| fbdd0e87cd |
@@ -18,7 +18,7 @@ For deployment

- Ubuntu 18.04 or 20.04

-For taining
+For training

- GPU enabled device (recommended)
- [hailo.ai](https://hailo.ai) developer account
- [docker](https://docs.docker.com/engine/install/ubuntu/)
@@ -58,6 +58,7 @@ Here you'll train, quantize, compile (on a gpu if possible) and infer (on the ha

explains well on how to create that.
- There's a minimal example dataset in this repository under `/dataset`
- To mount this, use e.g.: `docker run -it --gpus all --ipc=host -v /path/to/dataset/:/dataset yolov5:v0`
- `python train.py --img 640 --batch 16 --epochs 3 --data /dataset/dataset/dataset.yaml --weights yolov5m.pt --cfg models/yolov5m.yaml`
- For training, make sure you target the correct model `--cfg` and use the correct `--weights` (which are now conveniently already in the hailo docker)
- Once you've saved the exported `best.onnx` file, you can exit this docker container
- Once you are done with the steps 'training and exporting to ONNX', move on to the next step (a condensed sketch of these commands follows below).
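To summarize the training workflow above, here is a condensed sketch of the commands on the host and inside the training container. The dataset path is a placeholder, and the export step is only referenced in a comment, since the exact export script depends on the yolov5 release bundled in the Hailo docker.

```bash
# On the host: start the yolov5 training container with the dataset mounted
# (/path/to/dataset/ is a placeholder for your local dataset checkout).
docker run -it --gpus all --ipc=host \
    -v /path/to/dataset/:/dataset \
    yolov5:v0

# Inside the container: train against the mounted dataset
# (3 epochs is only a smoke test; ~100 epochs is more realistic).
python train.py --img 640 --batch 16 --epochs 3 \
    --data /dataset/dataset/dataset.yaml \
    --weights yolov5m.pt --cfg models/yolov5m.yaml

# Inside the container: export the best checkpoint (runs/.../best.pt) to ONNX,
# following the RETRAIN_ON_CUSTOM_DATASET.md guide for your yolov5 version.
```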
@@ -89,7 +90,7 @@ Further Notes and Examples:

# Other Networks

- For yolov5m, Hailo provides a configuration yaml which defines the quantization levels for the various networks. If you have a custom network you will have to define your custom yaml file. I liked using [netron](https://github.com/lutzroeder/Netron) to visualize the network for that (see the short example below).
-- Anything the hailo\_model\_zoo provides a configuration file for (under `hailo_odel_zoo/cfg/networks/`) can be easily trained and deployed using the process
+- Anything the hailo\_model\_zoo provides a configuration file for (under `hailo_model_zoo/cfg/networks/`) can be easily trained and deployed using the process
  described above.
- Also check out the [Hailo Tappas](https://hailo.ai/developer-zone/tappas-apps-toolkit/), which supply a variety of pre-trained _.hef_ files that you can run
  without having to compile anything.
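As a small illustration of the netron suggestion above (the file name `best.onnx` is just an example of an exported network):

```bash
# Install netron and open the exported ONNX model in a local web viewer
# to inspect layers, shapes and output nodes.
pip install netron
netron best.onnx
```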
Uebergabe.md (new file, 41 lines)

@@ -0,0 +1,41 @@
# Übergabe Hailo (handover)

* Raphael Maehnle
* Thomas Hamböck

## Agenda

1. CVAT: Login, Data Import, Labeling, Data export
   1. Create a new Project; important: enter the labels, ideally in an order consistent with other projects
      1. Previously there were only Tasks => keeping labels consistent was more tedious; note: Projects are currently not yet fully stable
      2. Note: there used to be a memory problem; images stayed in RAM => reloading the page more frequently is advisable
   2. Jobs: splitting the images among the annotating users (user assignment)
   3. Annotating images: keyboard shortcuts ("n" > new bounding box, ...): https://openvinotoolkit.github.io/cvat/docs/manual/advanced/shortcuts/
   4. "..." > export as dataset > YOLO 1.1
2. Hailo setup for development (training) (a setup sketch follows after this agenda)
   1. prerequisite: nvidia-docker
   2. clone the hailo model zoo: https://github.com/hailo-ai/hailo_model_zoo
   3. download & run the docker container "Hailo Software Suite (for Ubuntu 20.04)"
      1. note: the volume mount is missing in the setup guide => it will be required for importing data
3. Retraining Yolov5m
   1. Tutorial for yolov5 training:
      https://github.com/hailo-ai/hailo_model_zoo/blob/master/docs/RETRAIN_ON_CUSTOM_DATASET.md
   2. dataset meta setup:
      http://192.168.0.63:8930/raphael.maenle/hailo_inference/-/blob/main/dataset/dataset/dataset.yaml
      copy the labels from CVAT into the names list again, nc == len(names) (a dataset.yaml sketch follows after this agenda)
   3. from the readme: `docker run -it --gpus all --ipc=host -v /path/to/dataset/:/dataset yolov5:v0`
      based on https://github.com/hailo-ai/hailo_model_zoo/blob/master/training/yolov5/Dockerfile
   4. https://github.com/hailo-ai/hailo_model_zoo/blob/master/docs/RETRAIN_ON_CUSTOM_DATASET.md?plain=1#L183
      3 epochs is rather few; ~100 would be better
      `--data` => point it at the dataset
   5. Export to ONNX: https://github.com/hailo-ai/hailo_model_zoo/blob/master/docs/RETRAIN_ON_CUSTOM_DATASET.md?plain=1#L192
      For visualisation: ONNX viewer, e.g.: https://github.com/lutzroeder/Netron
4. Yolov5 hailo quantisation + compile (a compile sketch follows after this agenda)
   1. `hailo_sw_suite_docker_run.sh` => add `-v` as `DOCKER_ARGS` for the volume mount in the script
   2. network config: `yolov5m.yaml`
   3. do `compile` => includes `quantize`
      `.hef` => network bitstream
5. Hailo setup for deployment (inference)
   1. `HailoRT` => setup for the runtime
6. Inference
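For agenda item 2, a minimal sketch of the development setup. It assumes the NVIDIA container runtime is already installed; the CUDA image tag is only an example for the GPU check.

```bash
# Check that docker can see the GPU via the NVIDIA container runtime
# (the CUDA image tag is just an example; any recent nvidia/cuda base image works).
docker run --rm --gpus all nvidia/cuda:11.0.3-base-ubuntu20.04 nvidia-smi

# Clone the Hailo model zoo, which contains the yolov5 retraining Dockerfile and docs.
git clone https://github.com/hailo-ai/hailo_model_zoo.git
```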
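For agenda item 3.2, a sketch of what the yolov5 dataset meta file typically looks like. The paths and class names are placeholders; what matters is copying the label list from CVAT into `names` and keeping `nc` equal to `len(names)`.

```bash
# Write a minimal yolov5 dataset.yaml (placeholder paths and class names).
cat > /dataset/dataset/dataset.yaml <<'EOF'
train: /dataset/images/train
val: /dataset/images/val

nc: 2                          # number of classes, must equal len(names)
names: ['class_a', 'class_b']  # copy the label list from CVAT, same order
EOF
```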
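For agenda item 4, a rough sketch of the quantisation/compile step inside the Hailo Software Suite container. The model zoo command line changes between releases, so treat the `hailomz` flags below as an assumption paraphrased from the retraining guide linked in item 3.1 and check that guide for the exact invocation.

```bash
# On the host: edit hailo_sw_suite_docker_run.sh and add the volume mount,
# e.g. '-v /path/to/workdir/:/workdir', to its DOCKER_ARGS, then start it:
./hailo_sw_suite_docker_run.sh

# Inside the container: quantize + compile the retrained network into a .hef
# bitstream (flags are an assumption -- see RETRAIN_ON_CUSTOM_DATASET.md).
hailomz compile \
    --ckpt /workdir/best.onnx \
    --calib-path /workdir/calibration_images/ \
    --yaml hailo_model_zoo/cfg/networks/yolov5m.yaml
```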