-
I am currently using an N4100 soft router to run Frigate. When there is no one around, the page shows an inference speed of 20-30 ms, but when there are people it becomes 70-80 ms. In practice it takes two to three seconds for Home Assistant to get the information, which is too slow. I plan to purchase an Orange Pi 5, so I am particularly interested in the following two points: 1. Is an Orange Pi 5 with 4GB of memory enough to run Frigate?
-
Are these models really 320x320? I thought the YOLO NAS models were 640x640.
-
Frigate version: 0.14.0-b97e274
-
Frigate version: 0.14.0-b97e274
-
Frigate version: frigate:dev-d646338-rk
-
I have tried all the deci-fp16-yolonas_* models. I'm trying to move from a Coral-based system. All of these models seem to weigh heavily toward "cat". Same config (except for model and detectors). On the 0.13.2 Coral system, dogs were dogs and people were people; now dogs are always cats and people are sometimes cats. Rarely are there any actual cats. Are there any other models to try with the Rockchip detector? Should I just remove cat from object detection and see if that improves classification?
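For reference, dropping cat from the tracked objects is a one-line change in the Frigate config; a minimal sketch (the label list is just an example):

```yaml
# Only the labels listed under objects.track are reported;
# leaving out "cat" means cat detections are simply ignored.
objects:
  track:
    - person
    - dog
    - car
```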
-
With the known issue of 8GB and 16GB Rockchip boards having crashes, isn't this really a non-starter for Frigate until this is fixed?
-
With the beta v3 image, when restreaming a 1024x768 MJPEG stream from an ESP32 camera module to H.264, I'm getting about 20% CPU usage on my RK3588. Feels like go2rtc's ffmpeg is not using hardware acceleration yet.
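For reference, a rough sketch of forcing hardware transcoding in go2rtc (camera name and URL are placeholders; whether the `-rk` build's bundled ffmpeg maps `#hardware` to the rkmpp encoders is an assumption worth verifying):

```yaml
# go2rtc section of the Frigate config (illustrative only).
# "#video=h264" asks go2rtc to transcode the MJPEG source to H.264,
# "#hardware" asks it to pick a hardware encoder if one is available.
go2rtc:
  streams:
    esp32_cam:
      - "ffmpeg:http://192.168.1.50:81/stream#video=h264#hardware"
```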
-
I'm not sure if anyone knows, but should the RKNN detector just keep incrementally increasing memory usage? It seems to go from 1% to about 8% usage after a day and then flush back down to 4%. Is this set somewhere in Frigate? If I use a Coral USB it stays around 2%, so I wasn't sure if there is a reason it spikes so much higher than the Coral USB.
-
Frigate version: 0.13.2-69d9a261
-
Frigate version: 0.13.2-69d9a261
-
Frigate version: 0.14.0-da913d8
-
It seems that there is an incompatibility between the yolov8 models from Frigate v13 and newer kernels. This results in this error:
I recommend upgrading to Frigate v14, since yolov8 is not supported anymore. If you encounter this problem in Frigate v14, please reply to this comment. This issue was initially reported here: nyanmisaka/ffmpeg-rockchip#95
-
Hi @MarcA711, @ALL, FYI: I tried the 0.14.0 release on the 6.1 image from here. Linux ubuntu 6.1.0-1021-rockchip #21-Ubuntu SMP Mon Jul 29 03:52:32 UTC 2024 aarch64 aarch64 aarch64 GNU/Linux. Good news: inference is working on one camera with good speed. I did not try with more cameras yet; this is planned. Frigate version: 0.14.0-da913d8. Still, I was expecting better detector performance. My other setup is running on a pretty old Intel Core i7-6700K and inference there on the ov detector is 10-20 ms. Given that the RK3588 has a hardware accelerator with 3 NPU cores and also hardware decoding of the h.265 stream, I was expecting it to be at least on par with the old Core i7. Many thanks for your help and advice!
-
Frigate version: 0.14.0-da913d8
-
Is anyone else having problems with the 0.15 release and all the earlier betas? Any recent version of the 0.15 prerelease, and the release itself, will constantly build up zombie processes over the course of a day, and various camera streams will drop off until the container is restarted. I have found one image that doesn't give me this trouble, and it's a month or two old: ghcr.io/blakeblackshear/frigate:7b65bcf-rk
-
After getting really annoyed at NanoPi's overlayfs, I decided to switch back to mainline with Armbian. Am I correct in seeing that V4L2 leverages the RK hardware decoders now? Should I be using a different image base than the stable-rk one now? Will try.
-
So I've been using the bleeding-edge v0.16 and lately it's been working great. Face detection is nice to have, but I think even the large model (which the docs say will use a GPU, but don't specify which ones are supported) uses only the CPU on the rk3588, not the GPU or NPU. The following model files exist under config/model_cache:
Would it be possible to just use one of the onnx2rknn converter scripts on these to make them use the NPU? I'm still really interested in using some different models, or even just yolo_nas trained on a more appropriate image set for my purposes, but I can't seem to get even the yolo_nas models that I convert to rknn to function :( Would love to hear about any success anyone has with such things.
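For what it's worth, the basic ONNX-to-RKNN conversion with rknn-toolkit2 is only a few lines. A minimal sketch (file names are placeholders, skipping quantization is an assumption, and the converted model's inputs/outputs still have to match what Frigate's post-processing expects):

```python
# Minimal ONNX -> RKNN conversion sketch using rknn-toolkit2.
# Adjust target_platform and (if needed) mean/std values for your model.
from rknn.api import RKNN

rknn = RKNN(verbose=True)
rknn.config(target_platform="rk3588")   # rk3566 / rk3568 / rk3576 are also valid targets
rknn.load_onnx(model="model.onnx")      # placeholder input file
rknn.build(do_quantization=False)       # keep fp16 weights, no int8 calibration dataset needed
rknn.export_rknn("model.rknn")          # placeholder output file
rknn.release()
```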
-
Frigate version: 0.15.0-cea210d. I found a little issue while configuring my PTZ camera with ONVIF: "No appropriate Onvif profiles found for camera", while the same camera is working on another instance of Frigate running the stable main branch (same version, but not -rk). Maybe ONVIF is not supported in this branch? Thanks for the excellent work!
-
So with this version I can use Frigate+ to train my own model? With an RKNN chip?
-
Frigate version: 0.15.0-cea210d
-
Anyone else using face detection on 0.16 with RKNN and finding that the 'training' page has gone from reasonably snappy, to increasingly unresponsive, to now no longer usable or loadable at all? Any fixes?
-
@MarcA711 I have a yolov9 onnx model per https://deploy-preview-16390--frigate-docs.netlify.app/configuration/object_detectors#downloading-yolo-models, and I have attempted to convert it to rknn format. Everything seems to be working with the existing YOLO post-processing function in 0.16, except the bounding boxes. I'd be happy to share these with you; perhaps we can get it working. Inference time (without quantization) is 25 ms on the 3588.
-
I noticed RKNN Toolkit 2.3.2 dropped a few weeks ago. Not sure if anything here improves the performance of Frigate? The lite toolkit includes support for … The non-lite toolkit includes more features by the looks of it.
-
I have an RK3588 chipset, config:
-
Frigate version: 0.16.0-5c3ac75
-
RK3566 single-core RKNN device.
-
Is it possible to run LPR on the RKNN? If yes, what model should be used?
-
Am I reading the 0.16-b4 release notes correctly, and there is a patched FFmpeg trying to fix the 4GB-limit crashing?
-
This is intended as an exchange for Rockchip users. We can talk about problems setting up hardware acceleration, optimizing performance, which distribution works best, etc. Moreover, I want to collect some information, such as which OS + board combinations work or not, as well as inference times on different SoCs. I hope that as many users as possible help and submit data. Finally, I would also like to compile a list of FAQs, common pitfalls and known issues.
Latest news
RKNN now supports yolov9 and yolox thanks to the contributions of @NickM-27. They are available in the latest dev builds. If you test them, it would be awesome if you reported any bugs you encounter and shared the inference time on your platform. For more details see #17788 and #17791.
System compatibility
You can help complete this table by commenting with the following information:
Inference times
All times are in ms.
You can help complete this table by commenting with the following information:
FAQ
Is there a Frigate add-on for Home Assistant with support for Rockchip hardware?
No. There are (at least) 3 ways to install Home Assistant: their OS (HAOS), their supervised script (HA Supervised) and their docker image (HA Container). Only HAOS and HA Supervised support add-ons (see this comparison for more details).
HAOS uses the mainline Linux kernel, which currently lacks the necessary drivers for Rockchip hardware acceleration (see this table for the progress of mainlining the rk3588). HA Supervised currently lacks the option to unmask paths (`--security-opt systempaths=unconfined`, see home-assistant/supervisor#4863). So there will be no add-on for Rockchip hardware until either HAOS or HA Supervised works. The only way to install Home Assistant and Frigate with Rockchip support is to use HA Container; for details see the next FAQ "How do I install Frigate and Home Assistant?". Also note that you don't need Home Assistant to use Frigate: Frigate already has a UI that received a major overhaul in v0.14. Using Home Assistant with Frigate only makes sense if you already use HA for other services and prefer an all-in-one solution, or if you want to trigger other actions for certain events.
How do I install Frigate and Home Assistant?
This is currently only possible using HA Container (see the previous FAQ "Is there a Frigate add-on for Home Assistant with support for Rockchip hardware?" for details). Note that you can't install add-ons in HA Container (see this comparison for more details). However, most add-ons are also available as docker images; you just need to configure them manually.
Please note:
The instructions below assume that you are in an empty directory with read and write permissions.
Now, create the files `docker-compose.yml` and `mosquitto-data/config/mosquitto.conf` and paste the contents below. The `docker-compose.yml` creates a docker network and adds all containers to it. This way the containers can communicate with each other: instead of an IP address you can use the `container_name` of each container.
e.g. `nano docker-compose.yml`
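A minimal sketch of what such a compose file can look like (image tags, ports, volume paths and the security/privilege settings are examples to adapt to your board and setup):

```yaml
# docker-compose.yml sketch: one shared network, three containers.
networks:
  smarthome:

services:
  mosquitto:
    container_name: mosquitto
    image: eclipse-mosquitto:2
    networks: [smarthome]
    volumes:
      - ./mosquitto-data/config:/mosquitto/config
      - ./mosquitto-data/data:/mosquitto/data
    restart: unless-stopped

  frigate:
    container_name: frigate
    image: ghcr.io/blakeblackshear/frigate:stable-rk
    networks: [smarthome]
    privileged: true                 # simplest way to expose the NPU/VPU devices
    security_opt:
      - systempaths=unconfined
      - apparmor=unconfined
    shm_size: "256mb"
    volumes:
      - ./frigate-data/config:/config
      - ./frigate-data/media:/media/frigate
    ports:
      - "5000:5000"                  # web UI
      - "8554:8554"                  # RTSP restreams
    restart: unless-stopped

  homeassistant:
    container_name: homeassistant
    image: ghcr.io/home-assistant/home-assistant:stable
    networks: [smarthome]
    volumes:
      - ./homeassistant-data:/config
      - /etc/localtime:/etc/localtime:ro
    ports:
      - "8123:8123"
    restart: unless-stopped
```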
e.g. `nano mosquitto-data/config/mosquitto.conf`
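A minimal `mosquitto.conf` with password authentication might look like this (sketch; the paths assume the volume layout from the compose file above):

```
# Plain MQTT listener with password authentication and persistence.
listener 1883
allow_anonymous false
password_file /mosquitto/config/passwd
persistence true
persistence_location /mosquitto/data/
```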
Now start just the mosquitto container and add two users, one for Frigate and one for Home Assistant. Remember the passwords that you choose.
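A sketch of those steps (container and path names follow the compose file above; `mosquitto_passwd` ships inside the eclipse-mosquitto image):

```sh
# Start only the broker, create the two users, then restart it so the password file is loaded.
docker compose up -d mosquitto
docker exec -it mosquitto mosquitto_passwd -c /mosquitto/config/passwd frigate
docker exec -it mosquitto mosquitto_passwd /mosquitto/config/passwd homeassistant
docker compose restart mosquitto
```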
Afterwards, create the `frigate-data/config/config.yml`. I marked some lines that you might need to change.
e.g. `nano frigate-data/config/config.yml`
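A stripped-down sketch of such a config (camera URL, credentials, detector and hwaccel settings are examples; check the detector and hardware acceleration docs for what your Frigate version and SoC support):

```yaml
# frigate-data/config/config.yml sketch
mqtt:
  host: mosquitto                  # container_name works thanks to the shared docker network
  user: frigate                    # CHANGE: user created with mosquitto_passwd
  password: your-password          # CHANGE

detectors:
  rknn:
    type: rknn
    num_cores: 3                   # rk3588 has 3 NPU cores; smaller SoCs have fewer

ffmpeg:
  hwaccel_args: preset-rk-h264     # Rockchip hardware decoding preset

cameras:
  front_door:                      # CHANGE: your camera name
    ffmpeg:
      inputs:
        - path: rtsp://user:pass@192.168.1.60:554/stream1   # CHANGE: your stream URL
          roles:
            - detect
    detect:
      width: 1280
      height: 720
```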
Now start Home Assistant and install HACS, afterwards start all containers:
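For example (the HACS one-liner is the documented install script for container setups; adjust container names if your compose file differs):

```sh
# Start only Home Assistant, install HACS inside the container, then bring up everything.
docker compose up -d homeassistant
docker exec -it homeassistant bash -c "wget -O - https://get.hacs.xyz | bash -"
docker compose up -d
```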
Finally, you can open Home Assistant in your browser using `http://IP-of-your-server:8123`. Go to `Settings --> Integrations --> Add integration`. There you can set up HACS, afterwards MQTT, afterwards Frigate (download it from HACS first). After installing HACS you should restart Home Assistant using `docker compose restart` in your terminal. During the MQTT setup you are asked to enter the broker address `mosquitto` as well as the username `homeassistant` and the password that you chose earlier. During the Frigate setup you need to enter the address `http://frigate:5000`.
You can stop your setup using `docker compose down`, start it using `docker compose up -d` and restart it using `docker compose restart`, as long as you are in the directory where you set everything up. Now you should study the docs of each project and adapt everything to your needs.
Known issues
Streams crash or recordings are missing
This is due to a hardware limitation that prevents the video processing hardware from accessing memory outside the first 4GB. This is potentially fixable, see rockchip-linux/mpp#560.