# Introduction

These installation and setup instructions cover yolov5 deployment on a PCIe-connected Hailo device. If you want to deploy a different network, check the [Other Networks](#other-networks) section.

# Preemptive Note

The Hailo pipeline is constantly updating and changing, and these notes are not updated regularly, so following them verbatim may not be ideal. I would recommend following the Hailo tutorials from the [hailo model zoo](https://github.com/hailo-ai/hailo_model_zoo/blob/master/docs/RETRAIN_ON_CUSTOM_DATASET.md) and going through the docs on hailo.ai.

# Requirements

For deployment

- access to the [hailo.ai developer zone](https://hailo.ai/developer-zone)
- Hailo PCIe device
- Ubuntu 18.04 or 20.04

For training

- GPU-enabled device (recommended)
- [hailo.ai](https://hailo.ai) developer account
- [docker](https://docs.docker.com/engine/install/ubuntu/)
- [nvidia docker](https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/install-guide.html)

# Installation Tutorial

This is split into the 'deploy device' (which has your Hailo PCIe device connected) and the 'training device', which you'll use for network training and for quantizing/compiling the network for Hailo. Skip this if you already have a `.hef` file you would like to deploy.

## For your (GPU enabled) training Device

You'll need a way to train your network (yielding a trained model you then export to ONNX) and the Hailo software suite to generate the Hailo file from that.

- always download the newest version of the [hailo software suite](https://hailo.ai/developer-zone/sw-downloads) from the [hailo.ai software download zone](https://hailo.ai/developer-zone/sw-downloads/), which is getting monthly updates at the time of writing. This of course means that these notes are rapidly aging into inaccuracy.
- clone the [hailo model zoo](https://github.com/hailo-ai/hailo_model_zoo), which has some convenient yolov5 Docker containers.

### Setup

- unzip the hailo software suite and run `./hailo_sw_suite_docker_run.sh` inside it. If this is your first time running it, the container will be set up. Otherwise, you'll have to pass either `--resume` or `--override`.

## For your Device with a Hailo Chip attached

- download the newest HailoRT from the [hailo.ai software download zone](https://hailo.ai/developer-zone/sw-downloads/)
- extract the file and run the installer. Once you're done, reboot.
- source the virtual environment under `/path/to/hailo_rt/Installer/hailo_platform_venv/bin/activate`
- test the virtual environment by running `hailo`

# Setup Example for a custom Yolov5m

## Train your own Yolov5m

- For yolov5 we'll be using Hailo's Docker containers, which are based on the [ultralytics yolov5 containers](https://github.com/ultralytics/yolov5).
- The hailo model zoo now has a [guide on how to train yolov5 for hailo](https://github.com/hailo-ai/hailo_model_zoo/blob/master/docs/RETRAIN_ON_CUSTOM_DATASET.md); follow that one. Some notes:
  - You'll need to create your own dataset structure for the training process. [This guide](https://github.com/ultralytics/yolov5/wiki/Train-Custom-Data) explains how to create it.
  - There's a minimal example dataset in this repository under `/dataset`.
  - To mount it, use e.g. `docker run -it --gpus all --ipc=host -v /path/to/dataset/:/dataset yolov5:v0` (see the sketch below this list).
  - For training, make sure you target the correct `--model` and use the correct `--weights` (which are now conveniently already in the hailo docker).
  - Once you've exported and saved the ONNX file (e.g. `best.onnx`), you can exit this docker container.
  - Once you are done with the 'training and exporting to ONNX' steps, move on to the next section.
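A minimal sketch of the training round-trip described above. The dataset layout, the `data.yaml` contents, and the image/epoch/batch numbers are illustrative assumptions; the container tag `yolov5:v0` and the mount path come from the notes above, and the exact model/weights flags should follow the Hailo retraining guide.

```bash
# On the training device, outside the container.
# Assumed (hypothetical) dataset layout:
#   /path/to/dataset/images/{train,val}/*.jpg
#   /path/to/dataset/labels/{train,val}/*.txt   # one YOLO-format label file per image

# A minimal data config for the ultralytics trainer (paths as seen inside the container):
cat > /path/to/dataset/data.yaml <<'EOF'
train: /dataset/images/train
val: /dataset/images/val
nc: 1                # number of classes in your dataset
names: ['my_class']  # class names, in label-index order
EOF

# Start the yolov5 training container with the dataset mounted:
docker run -it --gpus all --ipc=host -v /path/to/dataset/:/dataset yolov5:v0

# Inside the container: train against the mounted data config.
# Add the model/weights options the Hailo retraining guide specifies for yolov5m.
python train.py --img 640 --batch 16 --epochs 100 \
    --data /dataset/data.yaml --weights yolov5m.pt

# Afterwards, export to ONNX following the 'export to ONNX' step of the Hailo guide
# and copy the resulting .onnx file somewhere reachable from the host.
```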
## Create Hailo representation (hef)

- in `hailo_sw_suite_docker_run.sh`, add a volume so you can access your exported ONNX file; for example, in the `DOCKER_ARGS` you could add `-v /home/user/files:/files`
- run `./hailo_sw_suite_docker_run.sh --resume` to get into the docker container
- follow the rest of the tutorial from [the hailo model zoo](https://github.com/hailo-ai/hailo_model_zoo/blob/master/docs/RETRAIN_ON_CUSTOM_DATASET.md)
- Note that for yolov5m, Hailo provides a configuration yaml which defines the quantization levels for the various networks. If you have a custom network, you will have to define your own yaml file. I liked using [netron](https://github.com/lutzroeder/Netron) to visualize the network for that.

## Run inference

# Other Networks

- what other networks can I deploy on Hailo?
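One way to get a rough answer is to look at the network configurations shipped with the hailo model zoo checkout from the installation steps above. The path below is an assumption based on the repository layout at the time of writing.

```bash
# From the root of the cloned hailo_model_zoo repository
# (directory layout is an assumption and may change between releases):
ls hailo_model_zoo/cfg/networks/
# each <network>.yaml listed here corresponds to a network the model zoo
# knows how to parse, quantize, and compile for Hailo devices
```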