
Basic PTZ object autotracking functionality #6913

Merged
merged 45 commits into blakeblackshear:dev from the autotracking branch on Jul 8, 2023

Conversation

hawkeye217
Collaborator

@hawkeye217 hawkeye217 commented Jun 24, 2023

This draft PR is a basic implementation of using a pan/tilt/zoom capable IP camera to automatically track an object and keep it in the center of the frame as it moves. The idea came from playing around with an expensive Dahua PTZ camera that has autotracking built into its firmware (see example videos here). I thought “well if Frigate can track objects quickly already, why can’t I make a poor man’s open source autotracker with Frigate?!”

This PR assumes the user has:

  • A PTZ camera capable of relative movement within the field of view (specified in the ONVIF spec as a RelativePanTiltTranslationSpace with a TranslationSpaceFov entry). The camera I’m using is an EmpireTech/Dahua SD1A404XB-GNR. I also have a cheap, older Amcrest IP2M-841 indoor PTZ that does NOT support RelativeMove within the FOV. I would assume better quality PTZ cameras (> 100 USD) would support ONVIF field of view translation, but I could be wrong. I created this quick Python script to see if a camera supports the correct ONVIF space (see the sketch after this list).
  • A non-CPU detector with Frigate. A better detector is a necessity for this to function even close to acceptably.
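
For reference, here is a minimal sketch of that kind of capability check, assuming the onvif-zeep package, Python 3, and a camera with a single media profile; the FOV space URI and field names follow my reading of the ONVIF PTZ spec, so treat this as an illustration rather than the exact gist:

# Sketch: check whether a camera advertises RelativeMove in the FOV translation space.
# Assumes onvif-zeep (pip install onvif-zeep); host and credentials are placeholders.
from onvif import ONVIFCamera

FOV_SPACE = "http://www.onvif.org/ver10/tptz/PanTiltSpaces/TranslationSpaceFov"

cam = ONVIFCamera("192.168.1.100", 80, "onvifuser", "password")
media = cam.create_media_service()
ptz = cam.create_ptz_service()
profile = media.GetProfiles()[0]

request = ptz.create_type("GetConfigurationOptions")
request.ConfigurationToken = profile.PTZConfiguration.token
options = ptz.GetConfigurationOptions(request)

spaces = options.Spaces.RelativePanTiltTranslationSpace or []
supported = any(space.URI == FOV_SPACE for space in spaces)
print("RelativeMove in FOV space supported:", supported)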

As this draft stands right now, panning and tilting is working for me. Slower objects (like people walking up and down a street) are tracked and followed by the camera (but not without issues, as mentioned below).
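
For anyone curious about the mechanics, the basic idea is to map the offset of the tracked object's centroid from the frame center onto a RelativeMove translation in the FOV space (roughly [-1, 1] on each axis). The sketch below only illustrates that idea using the onvif-zeep API; it is not the exact code in this PR, and the sign convention assumes a camera oriented like mine (see the caveat about orientation further down):

# Sketch: nudge the camera so the tracked object's centroid drifts toward the frame center.
# Assumes onvif-zeep with an already-connected ONVIFCamera as `cam`; a differently
# oriented camera may need the pan/tilt signs flipped.
FOV_SPACE = "http://www.onvif.org/ver10/tptz/PanTiltSpaces/TranslationSpaceFov"

media = cam.create_media_service()
ptz = cam.create_ptz_service()
profile = media.GetProfiles()[0]

def move_toward(centroid_x, centroid_y, frame_width, frame_height):
    # Normalized offset from the frame center, roughly in [-1, 1].
    pan = (centroid_x / frame_width - 0.5) * 2
    tilt = (0.5 - centroid_y / frame_height) * 2  # image y grows downward

    request = ptz.create_type("RelativeMove")
    request.ProfileToken = profile.token
    request.Translation = {
        "PanTilt": {"x": pan, "y": tilt, "space": FOV_SPACE},
        "Zoom": {"x": 0.0},
    }
    ptz.RelativeMove(request)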

A basic config entry would be:

cameras:
  ptzcamera:
    ...
    onvif:
      host: 192.168.1.100
      port: 80
      user: onvifuser
      password: password
      autotracking:
        enabled: True
        motion_estimator: True  # use Norfair motion estimator (slow - see below)
        required_zones:
          - street  # the zone the object must enter before autotracking begins
        track:
          - person
        return_preset: street  # return to this camera preset when autotracking ends
        timeout: 15  # return to the above preset 15 seconds after no more objects are detected to track

I tried to draft this PR with as little intrusion into the existing codebase as possible, but changes will need to be made. Below are some of the things that need community contribution before this should even be considered for inclusion into Frigate. I didn’t want to make huge changes to the existing code without input from @blakeblackshear and @NickM-27 (and besides, I feel like I’m already in over my head anyways!). I’m sure there will be plenty more we can add to this list, too:

  • Motion - the new lightning detection code causes motion to be recalibrated every time the camera moves. This then causes the tracker to make assumptions and lose the tracked object(s).
  • The Norfair motion estimation code is very slow when using a homography transformation (it nearly pins my older Intel i5 CPU at 5 fps). A translation transformation is available and is much faster, but won’t work with zooming (see the sketch after this list). There may also be underlying issues with how Frigate’s tracking works that I’m not taking into account, so the motion estimator may not be the only problem with tracking.
  • Cameras may be oriented differently than mine, and so the pan and tilt values may move the camera in the wrong direction.
  • ONVIF calls may need to be executed from a queue.
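
For context on that second item, here is roughly how the two Norfair transformation options plug into a tracking loop. This is a minimal sketch of the Norfair API as I understand it, not Frigate's actual integration; swapping the transformation getter is the homography-vs-translation trade-off described above:

# Sketch: Norfair motion estimation feeding the tracker, with both transformation options.
# Assumes norfair is installed; `frame` is a numpy image and `detections` are Norfair Detection objects.
from norfair import Tracker
from norfair.camera_motion import (
    MotionEstimator,
    HomographyTransformationGetter,   # handles pan/tilt/zoom, but CPU-heavy
    TranslationTransformationGetter,  # much cheaper, but cannot model zoom
)

tracker = Tracker(distance_function="euclidean", distance_threshold=30)
motion_estimator = MotionEstimator(
    transformations_getter=HomographyTransformationGetter()
    # transformations_getter=TranslationTransformationGetter()  # faster alternative
)

def process_frame(frame, detections):
    # Estimate how the camera itself moved between frames...
    coord_transformations = motion_estimator.update(frame)
    # ...so the tracker can compensate for camera motion when matching detections.
    return tracker.update(detections=detections, coord_transformations=coord_transformations)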

Features to discuss and/or implement:

  • Zooming in/out on the object
  • Behavior when multiple trackable objects are in the camera’s FOV
  • Using ONVIF absolute positioning (most PTZ cameras support this). I messed around a little with the Amcrest I mentioned earlier, and I was able to specify some absolute measurements that made it track me pretty well. But this would need lots of thinking if Frigate is going to support it (see the sketch after this list).
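
As a rough illustration of what absolute positioning looks like over ONVIF (again assuming onvif-zeep, with ptz and profile set up as in the earlier sketches; the target values are hypothetical and the usable ranges depend on the camera's advertised spaces):

# Sketch: read the current absolute position, then move to a new one.
status = ptz.GetStatus({"ProfileToken": profile.token})
position = status.Position  # PanTilt (x, y) and Zoom (x)

request = ptz.create_type("AbsoluteMove")
request.ProfileToken = profile.token
request.Position = position
request.Position.PanTilt.x = 0.25   # hypothetical target pan
request.Position.PanTilt.y = -0.10  # hypothetical target tilt
ptz.AbsoluteMove(request)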

Thanks again @blakeblackshear for this great project. I'm not the most skilled developer so please comment, write code, show me how to do things better, etc. I hope this little contribution can be something the community can help with and make Frigate even better.

@netlify

netlify bot commented Jun 24, 2023

Deploy Preview for frigate-docs ready!

🔨 Latest commit: 51357cc
🔍 Latest deploy log: https://app.netlify.com/sites/frigate-docs/deploys/64a6d2a52031d60007b72cf1
😎 Deploy Preview: https://deploy-preview-6913--frigate-docs.netlify.app

@hawkeye217 hawkeye217 marked this pull request as draft June 24, 2023 23:41
@NickM-27
Sponsor Collaborator

I'll take a closer look some other time, but wanted to say the lightning tracker is specifically designed to ignore large motion like IR changes and PTZ movement, so that would need to be adjusted if autotracking is enabled.

@hawkeye217
Collaborator Author

I'll take a closer look some other time, but wanted to say the lightning tracker is specifically designed to ignore large motion like IR changes and PTZ movement, so that would need to be adjusted if autotracking is enabled.

Right. I didn't want to make so many sweeping changes without first asking for ideas from Blake, you, and any other contributors first. I know just enough to be dangerous 😂

@spacebares
Contributor

spacebares commented Jun 25, 2023

This looks great!

I think each object needs to have a minimum tracking time, so that when multiple objects appear, it only switches over once an object has been tracked for a certain number of seconds. Maybe even different minimum tracking times for different types of objects. I suppose you would have to remember where each object was before the camera's FOV changed; this might be more complicated than I thought.
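
A minimal sketch of how such a switching policy might look, purely as an illustration (the per-label thresholds and object fields are hypothetical, not anything in this PR):

# Sketch: only allow switching the autotracked object once a minimum tracking time has elapsed.
# `current` is a hypothetical record for the currently tracked object with an `entered_at`
# timestamp and a `label`; the per-label minimums are made-up values.
import time

MIN_TRACK_SECONDS = {"person": 5, "car": 2}

def should_switch(current, candidate, now=None):
    now = now if now is not None else time.time()
    tracked_for = now - current["entered_at"]
    min_needed = MIN_TRACK_SECONDS.get(current["label"], 5)
    # Keep following the current object until it has had its minimum time.
    return candidate is not None and tracked_for >= min_needed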

@hawkeye217
Collaborator Author

This looks great!

I think each object needs to have a minimum tracking time, so that when multiple objects appear, it only switches over once an object has been tracked for a certain number of seconds. Maybe even different minimum tracking times for different types of objects. I suppose you would have to remember where each object was before the camera's FOV changed; this might be more complicated than I thought.

Take a look at the 2nd video example here. The nice thing about the Norfair tracker (with homography transformation in the motion estimator) is that it doesn't so quickly lose track of objects that leave the camera FOV.

@hawkeye217
Collaborator Author

hawkeye217 commented Jun 29, 2023

I can't quite figure out why the Norfair tracker loses detections sometimes when the PTZ moves. The estimator function seems to be dishing out transformation vectors that are being passed to the tracker, but looking at the debug frames (change "False" to "True" on line 949 of video.py) and matching up the frame time from the debug logging, it seems like it takes a handful of frames for the vector to show that there was movement. By that time, the tracker has assumed the object is gone. It might have something to do with the distance function, or something else I'm missing with the way Frigate is dealing with the Norfair detections. I'm a bit out of my league on this one, so if anyone has any ideas, chime in!

EDIT: after playing with Kalman filter settings and finding a few minor issues with the motion estimator, it seems like the best solution is just to increase the camera FPS to 10. The motion estimator needs a few frames to stabilize, and this seems to mitigate the issue of the Norfair tracker losing the object.
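
For anyone wanting to try the same workaround, one place to raise the frame rate is Frigate's normal detect setting for that camera (shown here as a sketch; the camera name matches the example config above, and the camera's own stream FPS may need to match):

cameras:
  ptzcamera:
    detect:
      fps: 10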

@kirsch33
Contributor

kirsch33 commented Jul 6, 2023

I created this quick python script to see if a camera supports the correct ONVIF space.

@hawkeye217 FYI, I had to modify lines 15 and 27 from .token to ._token for my camera.

@hawkeye217
Collaborator Author

I created this quick python script to see if a camera supports the correct ONVIF space.

@hawkeye217 FYI, I had to modify lines 15 and 27 from .token to ._token for my camera.

Interesting. Let's bring the conversation over to the gist.

@kirsch33
Contributor

kirsch33 commented Jul 6, 2023

@hawkeye217

Ignore me; I was using the python-onvif package with Python 2.x.

onvif-zeep and Python 3.x work.

False alarm.

@hawkeye217
Collaborator Author

@hawkeye217

Ignore me; I was using the python-onvif package with Python 2.x.

onvif-zeep and Python 3.x work.

False alarm.

Great! I updated the gist for anyone else with the issue in the future.

@blakeblackshear blakeblackshear merged commit 88fc0fa into blakeblackshear:dev Jul 8, 2023
11 checks passed
@kirsch33
Contributor

kirsch33 commented Jul 8, 2023

Are PTZ presets meant to be defined within the camera firmware or in Frigate? I'm having trouble understanding how to set those up.

@hawkeye217
Collaborator Author

hawkeye217 commented Jul 8, 2023

Are PTZ presets meant to be defined within the camera firmware or in Frigate? I'm having trouble understanding how to set those up.

In the camera firmware. I'll update the docs for clarity.

EDIT: Here are the current docs if you haven't seen them.

@hawkeye217 hawkeye217 deleted the autotracking branch July 11, 2023 12:31
@kennethgomez01

How does this work, and what version of Frigate?

@hawkeye217
Collaborator Author

This PR was committed to the dev branch. It should be in the official 0.13 release at some point in the future.

@kennethgomez01

kennethgomez01 commented Aug 8, 2023

Great. I did a similar project before using ONVIF and a PTZ camera, with auto zoom when a person is detected in the frame. Would this be a feature at some point in this development, or just pan/tilt?

@hawkeye217
Collaborator Author

Zoom is on the list of things I'd like to get working, but I'm busy with other projects at the moment. PRs are welcome, though, especially if you've already done a similar project.

@kennethgomez01

kennethgomez01 commented Aug 10, 2023

Is it possible to support a device that does not support RelativeMove within the FOV? I have an expensive speed dome PTZ camera that has built-in auto tracking and zooming on people, and it does a great job, but it does not support RelativeMove within the FOV, according to the Python script.

The main reason I am excited about this PTZ tracking for Frigate is that I want to be able to track cars and zoom in on them to capture license plates.

@hawkeye217
Collaborator Author

No, it's not possible to use the code I wrote without a camera that supports ONVIF RelativeMove.

In security applications, LPR is usually done with a dedicated camera so that shutter speed can be set high enough to capture license plates. This is often difficult to achieve with a single camera that is used for other applications at the same time.

What camera do you have? If it already does autotracking, I'm surprised it doesn't have robust ONVIF support.

@kennethgomez01

I got Vikylin Hikvision 4K 8MP Outdoor Auto Tracking PTZ PoE IP Camera, 20x Optical Zoom 30x.

@hawkeye217
Collaborator Author

It looks like your camera is a Hikvision clone. If the camera's firmware doesn't support the ONVIF standard, you should contact the manufacturer and ask them for ONVIF support. Frigate can't support any kind of autotracking for cameras without it.

If you have any other questions, feel free to open a support issue rather than commenting on this closed PR.

@Someguitarist

Hey Hawkeye,

Quick question for you! How do you define the return_preset? Is it just 'home' by default, and home is automatically the coordinates where it started? My bigger question, even unrelated to tracking, is does the return_preset work with autotracking off as well? I'd like to be able to use the PTZ controls to manually pan around, but have the camera return to the correct coordinates with or without autotracking so that my motion masks remain intact.

Thoughts?

@NickM-27
Sponsor Collaborator

NickM-27 commented Aug 22, 2023

Hey Hawkeye,

Quick question for you! How do you define the return_preset? Is it just 'home' by default, and home is automatically the coordinates where it started?

No, that is the name of the preset that you have created in the camera's settings.

My bigger question, even unrelated to tracking, is does the return_preset work with autotracking off as well? I'd like to be able to use the PTZ controls to manually pan around, but have the camera return to the correct coordinates with or without autotracking so that my motion masks remain intact.

It does not automatically return to home, but you can do this easily via the webUI.

@Someguitarist

I'm sorry, I think I added this in two places, here and in a Q&A since that seemed more appropriate. To follow up here, by the preset created in the camera settings, do you mean in the software for that specific camera, or somewhere in the Frigate camera settings in the config file? How do you return to home via the webUI? I'm on the latest beta and only see pan/tilt/zoom controls.

I must totally be missing something. So sorry to bug you guys! Amazing project though, and great response time, dang!

@NickM-27
Sponsor Collaborator

To follow up here, by the preset created in the camera settings, do you mean in the software for that specific camera, or somewhere in the Frigate camera settings in the config file?

You create the preset on the camera itself

I'm on the latest beta and only see pan/tilt/zoom controls.

Because you haven't created any presets on the camera

@Someguitarist

I marked the other one as an answer just to close it. Thanks, this looks like exactly what I'm looking for, but how do I create the home preset on the camera?

@NickM-27
Sponsor Collaborator

but how do I create the home preset on the camera?

That depends on the type of camera; you should search that for yourself.

@Someguitarist

Oh, gotcha! Okay, so just to be super clear because I think I confused the two: you don't mean creating a 'home preset' inside of Frigate->config.yml->cameras. You mean each camera has its own way of defining the preset, like using the Amcrest app or whatever to set it, and then Frigate will pick it up in its webUI?

@NickM-27
Sponsor Collaborator

Yes, you simply create the presets in the camera's settings and Frigate will pull those presets automatically. There is no way to create a preset in the Frigate config.

@Someguitarist

Interesting, thanks! Looks like neither of my camera types supports having a home preset, unfortunately. I'm using a few Amcrest ASH41-W's inside and LaView bulb cameras outside. Interestingly enough, both support PTZ, but only LaView supports ONVIF.

The LaView onvif controls work within Frigate really well, and I'm able to name and save coordinate locations as well as return to them within the LaView app, but the locations themselves never show in Frigate. This is likely due to both of these types of cameras being some cheap bottom-dollar stuff, but they work really well for everything else so far.

This may be more of a feature request than anything at this point, but would it at all be possible in the future to backtrack the movements done by autotracking? Essentially if it knows it moved ~6 degrees to the right, know to walk it back 6 degrees to the left after 30 seconds?

Either way, looks like it just doesn't work on my cameras or populate any of the presets saved on the LaView app. Totally fine, this is just a minor nitpick I wanted to check into! Thanks for making Frigate and working on all this stuff, as all the rest of it works mostly flawlessly!

@NickM-27
Sponsor Collaborator

but the locations themselves never show in Frigate. This is likely due to both of these types of cameras being some cheap bottom-dollar stuff, but they work really well for everything else so far.

Did you restart Frigate after adding them? Frigate does not check for new ONVIF info after starting.

This may be more of a feature request than anything at this point, but would it at all be possible in the future to backtrack the movements done by autotracking? Essentially if it knows it moved ~6 degrees to the right, know to walk it back 6 degrees to the left after 30 seconds?

Autotracking requires specific ONVIF features, so if a camera supports those, it will also support presets due to ONVIF standard requirements.

@Someguitarist

I did restart it, but no luck. It's okay, these bulb cameras required some beta firmware and their ONVIF support was listed as 'experimental'. Honestly, I don't know if I really need autotracking that much, as they're currently pointed where I'd need them ~95% of the time. I was just hoping that the autotracking would zoom and get me better facial recognition for Double Take, which is honestly a pretty minor concern.

Thanks for your help though!!

@NickM-27
Sponsor Collaborator

Going to lock this PR as closed PRs are not the right place to have general discussions

Repository owner locked and limited conversation to collaborators Aug 22, 2023
Successfully merging this pull request may close these issues.

PTZ support
7 participants