
feat: ipcamera support #93

Open · wants to merge 1 commit into main

Conversation

kstasik

@kstasik kstasik commented Aug 5, 2024

I needed this crate to set up some home automation and got tired of Homebridge. A few things changed:

  • ifaddr had to be updated
  • basic TLV support had to be added
  • I made the pointer aliases public because they are used by public traits anyway; I didn't want to copy them
  • basic camera support was added

For the IP camera I created a trait which can also be implemented for ffmpeg if needed.
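As a rough illustration, such a backend trait might look like the sketch below. This is purely hypothetical: the trait name, method names, and signatures are my own assumptions for illustration, not the PR's actual API.

```rust
// Illustrative only: names and signatures here are assumptions for the
// sketch, not the PR's actual trait in hap::accessory::camera::media.
trait MediaBackend {
    /// Start streaming with the negotiated parameters.
    fn start(&mut self, width: u32, height: u32, fps: u32) -> Result<(), String>;
    /// Stop the active stream.
    fn stop(&mut self);
}

// The PR ships a gstreamer-backed implementation; an ffmpeg-based one
// could implement the same trait, e.g.:
struct FfmpegBackend {
    running: bool,
}

impl MediaBackend for FfmpegBackend {
    fn start(&mut self, width: u32, height: u32, fps: u32) -> Result<(), String> {
        // A real implementation would spawn an ffmpeg process here.
        println!("would run: ffmpeg ... -s {width}x{height} -r {fps} ...");
        self.running = true;
        Ok(())
    }

    fn stop(&mut self) {
        self.running = false;
    }
}

fn main() {
    let mut backend = FfmpegBackend { running: false };
    backend.start(640, 480, 30).unwrap();
    assert!(backend.running);
    backend.stop();
    assert!(!backend.running);
}
```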

I decided to use the gstreamer bindings. They worked well with my Hikvision IP camera. IMO the code is not perfect, but better than nothing.

gstreamer18 support is needed to build it for the Raspberry Pi (glib dependency).

Collaborator

@simlay simlay left a comment


I'm impressed! This is some great work! I've tried to get started with this feature a few different times and got discouraged because it felt like a large mountain to climb.

Just a heads up, I think ewilken might not have a lot of extra time to maintain features.

I added some comments in places I think are useful. I wrote a simple examples/ip_camera.rs to help me review this. What do you think about adding one?

I tried to test this against an RTSP stream (I think this should be supported?) but was unsuccessful. It's unclear how to craft the video/audio options and pipeline, or to specify rtsp/port for the IP camera.

Comment on lines +24 to +25
camera-gstreamer18 = ["dep:gstreamer18"]
camera-gstreamer = ["dep:gstreamer"]
Collaborator


Interesting. I don't know much about gstreamer. Any particular reason why gstreamer 18 support specifically?

gstreamer18 support is needed to build it for raspberrypi (glib dependency).

Gotcha.

Comment on lines +39 to +45
// todo: consider types to be more restricted
#[derive(Clone, Default, Debug)]
pub struct StreamOptions {
pub video: Value,
pub audio: Value,
pub srtp: bool,
}
Collaborator


I was about to suggest changing this from a serde_json::Value to more structured/enumerated types, because I wasn't sure what to use for the video/audio fields without looking at the tests.

/// setup_endpoints to create proper handshake with iOS device
/// Current accessory contract doesnt allow to self modify Characteristic and locks
/// are created when adding accessory to the server
pub async fn build<S>(self, server: S) -> Result<StreamManager<MP>, StreamManagerError>
Collaborator


Hmm. How does one call server.run_handle() after passing the server into the builder? I did a bit of debugging and was able to start it by using a reference to the server (server: &S) and stream_builder.build(&server), but it's unclear what the intended use is.

@ewilken
Owner

ewilken commented Aug 6, 2024

Wow, thank you so much for this huge contribution and @simlay thank you so much for helping with the review! 🙏

Super happy to merge and publish this once you guys are on the same page about it.

@kstasik do you think it makes sense to check in a version of your code actually using this as an example, or would that be too complicated? Having a basic example of using it with GStreamer right in here might be pretty valuable to users.

@kstasik
Author

kstasik commented Aug 6, 2024

Any help is much appreciated; the one-evening project turned out to be a project for a week of evenings.

I didn't find any good documentation of the TLV contract for the IP camera accessory. I used https://github.com/ikalchev/HAP-python/blob/dev/pyhap/camera.py as a reference (I think we should add it to a block comment), and I ran an IP camera Node.js Homebridge integration to get some data for the unit tests.

I have a slightly complicated setup: I have a garden plot and an IPsec tunnel on MikroTik routers between the garden (LTE) and home. I wanted a remote camera and sprinklers accessible via HomeKit. At home I have a Raspberry Pi with hap-rs running on it.

use std::net::{IpAddr, Ipv4Addr};

use hap::accessory::{
    camera::{
        manager::{StreamManager, StreamManagerBuilder, StreamOptions},
        media::gstreamer::{Gstreamer, StreamConfig},
        protocol::{
            VIDEO_CODEC_PARAM_LEVEL_TYPES_TYPE3_1, VIDEO_CODEC_PARAM_PROFILE_ID_TYPES_BASELINE,
        },
        CameraAccessory,
    },
    AccessoryInformation,
};
use serde_json::{json, Value};


// note: `address` (an IpAddr) and `server` (the running hap server) are assumed to be defined elsewhere
let camera = CameraAccessory::new(
	1,
	1,
	AccessoryInformation {
		name: "Garden Front".into(),
		manufacturer: "test".into(),
		model: "1".into(),
		serial_number: "asd".into(),
		..Default::default()
	},
)
.unwrap();

let manager = StreamManagerBuilder::new(
	camera,
	Gstreamer::new(vec![StreamConfig {
		address,
		pipeline: "rtspsrc onvif-mode=true location=rtsp://user:pass@ip/Streaming/Channels/111/ ! rtpjitterbuffer ! decodebin".into(),
		options: StreamOptions {
			video: json!({
				"codec": {
					"profiles": [
						VIDEO_CODEC_PARAM_PROFILE_ID_TYPES_BASELINE,
					],
					"levels": [
						VIDEO_CODEC_PARAM_LEVEL_TYPES_TYPE3_1
					],
				},
				"resolutions": [
					[2560, 1440, 20],
					[1024, 768, 20],
					[640, 480, 20],
				]
			}),
			audio: json!({
				"codecs": [
					/* {
						"type": "OPUS",
						"samplerate": 24,
					},
					{
						"type": "AAC-eld",
						"samplerate": 16
					} */
				]
			}),
			srtp: true,
		},
	}])
	.unwrap(),
)
.build(server)
.await
.unwrap();

// then start the server, e.g. with server.run_handle()

Cross.toml (another thing I wasted a few hours on for the Raspberry Pi; maybe this information will save someone some time):

[target.armv7-unknown-linux-gnueabihf]
pre-build = [
    "dpkg --add-architecture armhf",
    "apt-get update && apt-get upgrade",
    "apt-get install --assume-yes curl zip unzip",
    "apt-get install -y libglib2.0-dev:armhf",
    "apt-get install -y libgstreamer1.0-dev:armhf libgstreamer-plugins-base1.0-dev:armhf gstreamer1.0-plugins-base:armhf gstreamer1.0-plugins-good:armhf gstreamer1.0-plugins-bad:armhf gstreamer1.0-plugins-ugly:armhf gstreamer1.0-libav:armhf libgstrtspserver-1.0-dev:armhf libges-1.0-dev:armhf",
    "curl -OL https://github.com/google/protobuf/releases/download/v3.2.0/protoc-3.2.0-linux-x86_64.zip",
    "unzip protoc-3.2.0-linux-x86_64.zip -d protoc3",
    "mv protoc3/bin/* /usr/local/bin/",
    "mv protoc3/include/* /usr/local/include/",
    "chmod 777 /usr/local/bin/protoc",
    "chmod -R 777 /usr/local/include/google"
]

The current solution is sufficient for my use case, but I think it may have some bugs. I have a low-quality connection to the camera (possibly because of the tunnel over LTE with cheap internet), and iOS sometimes sends a premature signal to close the connection. Or I did something wrong with the audio stream: my camera doesn't have a microphone, and I wasn't able to extend the gstreamer pipeline with audio support. I'm also not happy with the video quality, but that may be caused by my lack of knowledge about gstreamer/rtspsrc.

The reason I had to introduce a builder pattern and accept the Server trait instead of a normal constructor is that there is no way to get interior mutability on characteristics. The current accessory contract (trait) has to return mutable references, which is why I decided that .add_accessory should be called by the builder itself; we need an Arc<Mutex<_>> of the entire accessory. I had to use a channel to avoid deadlocks in callbacks (a setValue callback has to change a different characteristic's state).

One more thing about the library: it has some pairing issues. I have multiple Apple devices at home, and very often it's difficult to pair a hap-rs server. I'm not sure if it's a race condition, but I know it has some trouble with IDs, File not found errors (https://github.com/ewilken/hap-rs/blob/main/src/transport/http/handler/pair_verify.rs#L226), and UUID mismatches. If I have time I want to dig into it.

@kstasik
Author

kstasik commented Aug 6, 2024

One-liner to test gstreamer + rtspsrc:

gst-launch-1.0 -e rtspsrc onvif-mode=true location=rtsp://user:pass@host/Streaming/Channels/xxx/ ! decodebin ! videorate ! video/x-raw,framerate=30/1 ! videoconvert ! videoscale ! video/x-raw,width=640,height=360 ! x264enc tune=zerolatency bitrate=132 ! video/x-h264,profile=baseline ! mp4mux ! filesink location=test.mp4

The pipeline is built this way (it's generated/regenerated on request from the iOS device):

Ok(format!(
            "{pipeline} \
            ! videorate \
            ! video/x-raw,framerate={fps}/1 \
            ! videoconvert \
            ! videoscale ! video/x-raw,width={width},height={height} \
            ! x264enc tune=zerolatency bitrate={max_bitrate} \
            ! video/x-h264,profile=baseline \
            ! rtph264pay config-interval=1 pt=99 ssrc={ssrc} \
            ! srtpenc key={key} \
            ! udpsink host={host} port={port} async=false"
))

RUST_LOG="info,hap=debug" should display generated gstreamer pipeline as well.

As you may notice, it doesn't have udpsink + srtpenc for audio; I wanted to add it, but my Hikvision camera doesn't have a mic.

@kstasik
Author

kstasik commented Aug 10, 2024

It looks like the audio configuration is required, and the device capability characteristics are only fetched during pairing. So,

audio: json!({
    "codecs": [
        {
            "type": "OPUS",
            "samplerate": 24,
        },
        {
            "type": "AAC-eld",
            "samplerate": 16
        }
    ]
}),

has to be enabled. Without it, a video stream won't be requested by the iOS device.

@simlay
Collaborator

simlay commented Aug 10, 2024

@kstasik here is my examples/ip_camera.rs. I changed the IP address in the pipeline, but otherwise it's the same. I'm not sure why I'm not seeing any logs from the StreamManagerBuilder. I'm confused about how you have your server starting and passing to the StreamManagerBuilder without passing the server by reference.

@teovoinea

I'd love to see a solution for the UUID mismatch. I tried the temporary solution outlined in another issue but it didn't work.

Are you able to run the lightbulb example using your branch?

@kstasik
Author

kstasik commented Aug 18, 2024

simlay#1

IMO during pairing, and from time to time, Home fetches the characteristics responsible for device capabilities. If it decides that the current device can't provide video with the required resolution, it will just silently fail. I think in your case it was either the frame rate or the available resolutions; I fixed it in the MR.

Remember that the device will negotiate the needed streaming resolution with the accessory. Gstreamer creates a dynamic video/audio conversion pipeline which will convert your camera stream to the proper format/framerate/resolution. So list all the resolutions that fit your device, probably caring only about the aspect ratio, starting from the best one.
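As a toy illustration of that advice, one could imagine the negotiation as picking the first advertised resolution whose aspect ratio matches the request. This is a simplified sketch under my own assumptions; in reality the iOS device performs the selection, not the accessory.

```rust
// Simplified sketch: advertise resolutions best-first and model negotiation
// as picking the first one with the requested aspect ratio. Illustrative only.
fn pick_resolution(
    advertised: &[(u32, u32, u32)], // (width, height, fps), best first
    want_w: u32,
    want_h: u32,
) -> Option<(u32, u32, u32)> {
    advertised
        .iter()
        .copied()
        // Cross-multiply to compare aspect ratios without floats.
        .find(|&(w, h, _fps)| w * want_h == h * want_w)
}

fn main() {
    // Best-first list, mirroring the 16:9 and 4:3 entries from the example.
    let advertised = [(2560, 1440, 20), (1024, 768, 20), (640, 480, 20)];
    // A 4:3 request skips the 16:9 entry and picks 1024x768.
    assert_eq!(pick_resolution(&advertised, 4, 3), Some((1024, 768, 20)));
    // A 16:9 request gets the best entry.
    assert_eq!(pick_resolution(&advertised, 16, 9), Some((2560, 1440, 20)));
    println!("4:3 -> 1024x768@20, 16:9 -> 2560x1440@20");
}
```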
