which are getting monthly updates at the time of writing. This of course means that these notes are rapidly aging into inaccuracy.
- clone the [hailo model zoo](https://github.com/hailo-ai/hailo_model_zoo), which has some convenient yolov5 Docker containers (a clone command is sketched below).
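
A quick sketch of the clone step (the URL is the one linked above; where you put the checkout is up to you):

```bash
# grab the model zoo sources, e.g. next to the hailo software suite
git clone https://github.com/hailo-ai/hailo_model_zoo.git
```
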
### Setup
- unzip the `hailo software suite` and, inside it, run `./hailo_sw_suite_docker_run.sh`. If this is your first time running it, the container will be set up. Otherwise, you'll have to pass either `--resume` or `--override` (see the sketch below).
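
As a rough usage sketch (my reading of the script's flags, not verified against every suite version):

```bash
# first run: the container gets set up
./hailo_sw_suite_docker_run.sh

# later runs: re-enter the container that was already set up ...
./hailo_sw_suite_docker_run.sh --resume

# ... or override the existing container and set it up again
./hailo_sw_suite_docker_run.sh --override
```
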
# Setup Example for a custom Yolov5m
Here you'll train, quantize, and compile the network (on a GPU if possible) and run inference (on the Hailo chip).

## Train your own Yolov5m
- in `hailo_sw_suite_docker_run.sh`, add a volume so you can access your exported _best.onnx_ file; for example, in the `DOCKER_ARGS` you could add `-v /home/user/files:/files` (see the sketch after this list)
- run `./hailo_sw_suite_docker_run.sh --resume` to get into the docker container
- follow the rest of the tutorial from [the hailo model zoo](https://github.com/hailo-ai/hailo_model_zoo/blob/master/docs/RETRAIN_ON_CUSTOM_DATASET.md)
- Note that for yolov5m, Hailo provides a configuration yaml which defines the quantization levels for the various networks. If you have a custom network, you will have to define your own yaml file. I liked using [netron](https://github.com/lutzroeder/Netron) to visualize the network for that.
- `cd hailo_model_zoo` and continue the tutorial.
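
A minimal sketch of the volume mount from the first bullet; the exact way `DOCKER_ARGS` is assembled inside `hailo_sw_suite_docker_run.sh` may differ between suite versions, so treat this as an illustration:

```bash
# inside hailo_sw_suite_docker_run.sh: extend the docker arguments with a bind
# mount so the exported onnx file is reachable inside the container as /files
DOCKER_ARGS="${DOCKER_ARGS} -v /home/user/files:/files"
```
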
Further Notes and Examples:
- Another way to compile the model is to run the following (repeated as a block after this list): `python hailo_model_zoo/main.py compile yolov5m --ckpt /files/my_dataset/best.onnx --calib-path /files/watt_dataset/dataset/dataset/images/ --yaml ./hailo_model_zoo/cfg/networks/yolov5m.yaml`
- note that here you're quantizing and compiling into an HEF file in one go. If you just want to quantize into a _har_ file, run `python hailo_model_zoo/main.py quantize yolov5m --ckpt /files/my_dataset/best.onnx --calib-path /files/my_dataset/dataset/dataset/images/ --yaml ./hailo_model_zoo/cfg/networks/yolov5m.yaml`, which yields the _.har_ representation of the model
- You have to provide a set of images for calibration during quantization (`--calib-path`)
- also provide a quantization scheme (`--yaml`); hailo\_model\_zoo provides a variety of them
- **As far as I know, there's no real reason not to use _compile_ directly.**
- Note that the file locations in these examples differ from Hailo's. These worked, while Hailo's didn't.
- follow the getting started guide from [hailo model zoo / GETTING STARTED](https://github.com/hailo-ai/hailo_model_zoo/blob/master/docs/GETTING_STARTED.md), specifically the _optimizing_ and _compile_ steps, if you want more information on quantization parameters and how to go on from there.
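
For readability, the two invocations from the notes above as a block (paths are the example locations used above, replace them with your own):

```bash
# quantize and compile straight to a .hef
python hailo_model_zoo/main.py compile yolov5m \
    --ckpt /files/my_dataset/best.onnx \
    --calib-path /files/watt_dataset/dataset/dataset/images/ \
    --yaml ./hailo_model_zoo/cfg/networks/yolov5m.yaml

# quantize only, yielding the .har representation
python hailo_model_zoo/main.py quantize yolov5m \
    --ckpt /files/my_dataset/best.onnx \
    --calib-path /files/my_dataset/dataset/dataset/images/ \
    --yaml ./hailo_model_zoo/cfg/networks/yolov5m.yaml
```
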
## Run inference
- this git repository includes an _inference.py_ file, which loads a specified _.hef_ file. Change that to your _.hef_ file location.
- If you have anything other than yolov5m, you will have to change the pre-/post-processing steps, otherwise it won't work. Check out the Hailo Handler class and go from there.

# Other Networks
- what other networks can I deploy on Hailo?
- For yolov5m, Hailo provides a configuration yaml which defines the quantization levels for the various networks. If you have a custom network, you will have to define your own yaml file. I liked using [netron](https://github.com/lutzroeder/Netron) to visualize the network for that.
- Anything the hailo\_model\_zoo provides a configuration file for (under `hailo_model_zoo/cfg/networks/`, see the listing command after this list) can be easily trained and deployed using the process described above.
- Also check out the [Hailo Tappas](https://hailo.ai/developer-zone/tappas-apps-toolkit/), which supply a variety of pre-trained _.hef_ files that you can run without having to compile anything.
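
To see which networks the model zoo ships configuration files for, you can simply list the directory mentioned above (assuming you're in the root of the cloned hailo_model_zoo repository):

```bash
# every yaml here is a network the model zoo can quantize and compile
ls hailo_model_zoo/cfg/networks/
```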