diff --git a/.github/workflows/publish-dockerhub.yaml b/.github/workflows/publish-dockerhub.yaml
index a9c82d5..48933a8 100644
--- a/.github/workflows/publish-dockerhub.yaml
+++ b/.github/workflows/publish-dockerhub.yaml
@@ -57,3 +57,12 @@ jobs:
           tags: ${{ steps.meta.outputs.tags }}
           labels: ${{ steps.meta.outputs.labels }}
+      - name: Update Docker Hub description
+        if: github.event_name != 'pull_request'
+        uses: peter-evans/dockerhub-description@v2
+        with:
+          username: ${{ secrets.DOCKER_HUB_USERNAME }}
+          password: ${{ secrets.DOCKER_HUB_PASSWORD }}
+          repository: groundlight/stream
+          readme-file: README.md
+
diff --git a/CAMERAS.md b/CAMERAS.md
new file mode 100644
index 0000000..77aefcc
--- /dev/null
+++ b/CAMERAS.md
@@ -0,0 +1,83 @@
+# RTSP Camera Stream Setup Guide
+
+Real-Time Streaming Protocol (RTSP) is a network control protocol used for streaming media across IP networks. Many WiFi and Ethernet cameras support RTSP streams for remote viewing and integration with third-party applications. This guide provides step-by-step instructions for setting up RTSP streams on several popular camera brands.
+
+## Hikvision Cameras
+
+To get an RTSP stream from a Hikvision camera, follow these steps:
+
+1. Open the camera's web interface by entering its IP address in your web browser.
+2. Log in using your camera's username and password.
+3. Go to Configuration > Network > Advanced Settings > Integration Protocol.
+4. Enable the RTSP protocol and set the Authentication method to "digest/basic".
+5. Construct the RTSP URL using the following format:
+
+```
+rtsp://<username>:<password>@<ip_address>:554/Streaming/Channels/<channel>
+```
+
+   Replace `<username>`, `<password>`, `<ip_address>`, and `<channel>` with the appropriate values. The channel is typically 101 for the main stream and 102 for the substream.
+
+## Axis Cameras
+
+To get an RTSP stream from an Axis camera, follow these steps:
+
+1. Open the camera's web interface by entering its IP address in your web browser.
+2. Log in using your camera's username and password.
+3. Go to Setup > Video > Stream Profiles.
+4. Click "Add" to create a new profile, or edit an existing profile.
+5. Configure the desired video and audio settings for the stream, and enable the RTSP protocol.
+6. Click "Save" to save the settings.
+7. Construct the RTSP URL using the following format:
+
+```
+rtsp://<username>:<password>@<ip_address>/axis-media/media.amp?videocodec=h264&streamprofile=<profile_name>
+```
+
+   Replace `<username>`, `<password>`, `<ip_address>`, and `<profile_name>` with the appropriate values.
+
+## Foscam Cameras
+
+To get an RTSP stream from a Foscam camera, follow these steps:
+
+1. Open the camera's web interface by entering its IP address in your web browser.
+2. Log in using your camera's username and password.
+3. Go to Settings > Network > IP Configuration.
+4. Note the camera's IP address, HTTP port, and RTSP port.
+5. Construct the RTSP URL using the following format:
+
+```
+rtsp://<username>:<password>@<ip_address>:<rtsp_port>/videoMain
+```
+
+   Replace `<username>`, `<password>`, `<ip_address>`, and `<rtsp_port>` with the appropriate values.
+
+## Amcrest Cameras
+
+To get an RTSP stream from an Amcrest camera, follow these steps:
+
+1. Open the camera's web interface by entering its IP address in your web browser.
+2. Log in using your camera's username and password.
+3. Go to Setup > Network > Connection.
+4. Note the camera's RTSP port.
+5. Construct the RTSP URL using the following format:
+
+```
+rtsp://<username>:<password>@<ip_address>:<rtsp_port>/cam/realmonitor?channel=1&subtype=<stream_type>
+```
+
+   Replace `<username>`, `<password>`, `<ip_address>`, `<rtsp_port>`, and `<stream_type>` with the appropriate values. The stream type is typically 0 for the main stream and 1 for the substream.
+
+## Unifi Protect Cameras
+
+To obtain an RTSP stream from a Unifi Protect camera, follow these steps:
+
+1. Open the Unifi Protect web application by entering its IP address in your web browser.
+2. Log in using your Unifi Protect account credentials.
+3. Click on "Unifi Devices" and locate the device you want to connect to.
+4. In the right panel, select "Settings" to access the camera's settings.
+5. Expand the "Advanced" section, where you will find the RTSP settings.
+6. Choose a resolution for the stream and enable it. An RTSP URL will be displayed below your selection.
+
+Use the generated RTSP URL to connect to the camera stream in third-party applications or services.
+
diff --git a/DEVELOPING.md b/DEVELOPING.md
new file mode 100644
index 0000000..078d4dd
--- /dev/null
+++ b/DEVELOPING.md
@@ -0,0 +1,24 @@
+# Groundlight Stream Processor
+
+A containerized Python application that uses the Groundlight SDK to
+process frames from a video stream. It can connect to a camera over RTSP, a local video file, or a YouTube URL.
+
+## Releases
+
+Releases created in GitHub will automatically get built and pushed to [Docker Hub](https://hub.docker.com/r/groundlight/stream/tags). These are multi-architecture builds including x86 and ARM.
+
+## Test Builds
+
+To build and test locally:
+
+``` shell
+docker build -t stream:local .
+```
+
+## Run
+Now you can run it:
+
+``` shell
+docker run stream:local -h
+```
+
diff --git a/README.md b/README.md
index adf1148..aec76ea 100644
--- a/README.md
+++ b/README.md
@@ -1,32 +1,104 @@
 # Groundlight Stream Processor
-A containerized python application that uses the groundlight sdk to
-process frames from a video stream
+A containerized Python application that uses the [Groundlight](https://www.groundlight.ai/) [Python SDK](https://github.com/groundlight/python-sdk) to
+process frames from a video file, device, or stream.
-## Releases
+## Usage
-Releases created in Github will automatically get built and pushed to [dockerhub](https://hub.docker.com/r/groundlight/stream/tags). These are multi-architecture builds including x86 and ARM.
+The stream processor is easy to use on any system with Docker installed. Command line options are displayed like this:
-## Test Builds
+``` shell
+$ docker run groundlight/stream --help
+
+Captures frames from a video device, file or stream and sends frames as
+image queries to a configured detector using the Groundlight API
+
+usage: stream [options] -t TOKEN -d DETECTOR
+
+options:
+  -d, --detector=ID            detector id to which the image queries are sent
+  -e, --endpoint=URL           api endpoint [default: https://api.groundlight.ai/device-api]
+  -f, --fps=FPS                number of frames to capture per second. 0 to use maximum rate possible. [default: 5]
+  -h, --help                   show this message.
+  -s, --stream=STREAM          id, filename or URL of a video stream (e.g. rtsp://host:port/script?params OR video.mp4 OR *.jpg) [default: 0]
+  -t, --token=TOKEN            API token to authenticate with the Groundlight API
+  -v, --verbose                enable debug logs
+  -w, --width=WIDTH            resize images to w pixels wide (and scale height proportionately if not set explicitly)
+  -y, --height=HEIGHT          resize images to y pixels high (and scale width proportionately if not set explicitly)
+  -m, --motion                 enable motion detection with pixel change threshold percentage (disabled by default)
+  -r, --threshold=THRESHOLD    set detection threshold for motion detection [default: 1]
+  -p, --postmotion=POSTMOTION  minimum number of seconds to capture for every motion detection [default: 1]
+  -i, --maxinterval=MAXINT     maximum number of seconds before sending frames even without motion [default: 1000]
+```
+
+Start sending frames and getting predictions and labels using your own API token and detector ID:
+
+``` shell
+docker run groundlight/stream \
+    -t api_29imEXAMPLE \
+    -d det_2MiD5Elu8bza7sil9l7KPpr694a \
+    -s https://www.youtube.com/watch?v=210EXAMPLE \
+    -f 1
+```
+
+## Running with a Local MP4 File
+
+To process frames from a local MP4 file, you need to mount the file from your host machine into the Docker container. Here's how to do it:
+
+1. Place your MP4 file (e.g., `video.mp4`) in a directory on your host machine, such as `/path/to/video`.
+2. Run the Docker container, mounting the directory containing the video file:
+
+``` shell
+docker run -v /path/to/video:/videos groundlight/stream \
+    -t api_29imEXAMPLE \
+    -d det_2MiD5Elu8bza7sil9l7KPpr694a \
+    -s /videos/video.mp4 \
+    -f 1
+```
+
+This command mounts the `/path/to/video` directory on your host machine to the `/videos` directory inside the Docker container. The `-s` parameter is then set to the path of the MP4 file inside the container (`/videos/video.mp4`).
+
+## Using a YouTube URL
-To build and test locally:
+To use a YouTube video as a source, you first need to obtain a direct URL to the video stream. YouTube does not provide RTSP URLs, so you'll need a tool like `youtube-dl` to extract the direct video URL. Install `youtube-dl` following the instructions on its [GitHub page](https://github.com/ytdl-org/youtube-dl#installation).
+
+Once you have `youtube-dl` installed, extract the direct video URL:
 
 ``` shell
-docker build -t stream:local .
+youtube-dl -g "https://www.youtube.com/watch?v=210EXAMPLE"
 ```
 
-## run
-Now you can run it
+Replace the YouTube URL with that of the video you want to use. The output will be a direct video URL that you can pass to the `-s` parameter:
 
 ``` shell
-docker run stream:local -h
+docker run groundlight/stream \
+    -t api_29imEXAMPLE \
+    -d det_2MiD5Elu8bza7sil9l7KPpr694a \
+    -s "<direct_video_url>" \
+    -f 1
 ```
-# Video Stream types
+Replace `<direct_video_url>` with the URL obtained from `youtube-dl`.
+
+## Connecting an RTSP Stream
+
+To connect an RTSP stream from a camera or other source, you'll need the RTSP URL specific to your device. See [CAMERAS.md](CAMERAS.md) for instructions on obtaining the RTSP URL for your camera.
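If the camera's username or password contains characters such as `@`, `:`, or `/`, they must be percent-encoded before being placed in the RTSP URL, or the URL will not parse correctly. As a minimal sketch (the `rtsp_url` helper below is illustrative, not part of this repository), Python's standard library can handle the encoding:

```python
from urllib.parse import quote

def rtsp_url(username: str, password: str, host: str, path: str, port: int = 554) -> str:
    """Build an RTSP URL, percent-encoding the credentials so special
    characters in a username or password don't break URL parsing."""
    user = quote(username, safe="")
    pw = quote(password, safe="")
    return f"rtsp://{user}:{pw}@{host}:{port}/{path.lstrip('/')}"

# A password containing '@' and ':' is safely encoded:
print(rtsp_url("admin", "p@ss:word", "192.168.1.64", "/Streaming/Channels/101"))
# rtsp://admin:p%40ss%3Aword@192.168.1.64:554/Streaming/Channels/101
```

The resulting URL can then be passed directly to the `-s` option.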
+
+Once you have the RTSP URL, pass it to the `-s` parameter:
+
+``` shell
+docker run groundlight/stream \
+    -t api_29imEXAMPLE \
+    -d det_2MiD5Elu8bza7sil9l7KPpr694a \
+    -s "rtsp://username:password@camera_ip_address:554/path/to/stream" \
+    -f 1
+```
+
+Replace the RTSP URL with the one specific to your camera or streaming device.
+
-## Unifi Protect Cameras
+## Further Reading
-To get an RTSP stream from a Unifi Protect camera, first open the Unifi Protect web application.
-Then select "Unifi Devices", and find the device you want to connect to. In the right-panel, select "Settings"
-Open the Advanced section, and you will find an RTSP section. Pick a resolution to stream at, and enable the stream. Then an RTSP url will appear below your selection.
+* [Camera types](https://github.com/groundlight/stream/blob/main/CAMERAS.md) shows how to get RTSP stream URLs for many popular camera brands.
+* [Developing](https://github.com/groundlight/stream/blob/main/DEVELOPING.md) discusses how this code is built and maintained.
diff --git a/USERGUIDE.md b/USERGUIDE.md
deleted file mode 100644
index 11a6787..0000000
--- a/USERGUIDE.md
+++ /dev/null
@@ -1,42 +0,0 @@
-# Groundlight Stream Processor
-
-A containerized python application that uses the [groundlight](https://www.groundlight.ai/) SDK to
-process frames from a video file, device, or stream.
-
-## Useage
-
-The system is easy to use on any system with Docker installed. Command line options are displayed like:
-
-``` shell
-$ docker run groundlight/stream --help
-
-Captures frames from a video device, file or stream and sends frames as
-image queries to a configured detector using the Groundlight API
-
-usage: stream [options] -t TOKEN -d DETECTOR
-
-options:
-  -d, --detector=ID            detector id to which the image queries are sent
-  -e, --endpoint=URL           api endpoint [default: https://api.groundlight.ai/device-api]
-  -f, --fps=FPS                number of frames to capture per second. 0 to use maximum rate possible. [default: 5]
-  -h, --help                   show this message.
-  -s, --stream=STREAM          id, filename or URL of a video stream (e.g. rtsp://host:port/script?params OR movie.mp4 OR *.jpg) [default: 0]
-  -t, --token=TOKEN            api token to authenticate with the groundlight api
-  -v, --verbose                enable debug logs
-  -w, --width=WIDTH            resize images to w pixels wide (and scale height proportionately if not set explicitly)
-  -y, --height=HEIGHT          resize images to y pixels high (and scale width proportionately if not set explicitly)
-  -m, --motion                 enable motion detection with pixel change threshold percentage (disabled by default)
-  -r, --threshold=THRESHOLD    set detection threshold for motion detection [default: 1]
-  -p, --postmotion=POSTMOTION  minimum number of seconds to capture for every motion detection [default: 1]
-  -i, --maxinterval=MAXINT     maximum number of seconds before sending frames even without motion [default: 1000]
-```
-
-Start sending frames and getting predictions and labels using your own API token and detector ID
-
-``` shell
-docker run groundlight/stream \
-    -t api_29imEXAMPLE \
-    -d det_2MiD5Elu8bza7sil9l7KPpr694a \
-    -s https://www.youtube.com/watch?v=210EXAMPLE \
-    -f 1
-```
diff --git a/requirements.txt b/requirements.txt
index 6578474..915efce 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -1,4 +1,4 @@
-groundlight
+groundlight==0.7.7
 docopt==0.6.2
 jsonformatter==0.3.1
 numpy==1.21.2