
pipeline shows error with splitmuxsink #119

Open
dhaval-zala-aivid opened this issue Nov 10, 2022 · 4 comments

dhaval-zala-aivid commented Nov 10, 2022

@nnshah1
I'm running yolov5 on pipeline-server without splitmuxsink and it shows the correct results. But if I add splitmuxsink to the pipeline, it fails with this error:


{"levelname": "DEBUG", "asctime": "2022-11-10 07:10:05,093", "message": "Setting model-proc to /home/pipeline-server/models/object_detection/coco_yolov5_tiny_608to416_FP32/head_yolov5_tiny_608to416_FP32.json for element detection", "module": "gstreamer_pipeline"}
{"levelname": "DEBUG", "asctime": "2022-11-10 07:10:05,095", "message": "Gst launch string is only for debugging purposes, may not be accurate", "module": "gstreamer_pipeline"}
{"levelname": "DEBUG", "asctime": "2022-11-10 07:10:05,095", "message": "gst-launch-1.0 urisourcebin  uri=rtsp://aivid:[email protected]:554/cam/realmonitor?channel=6&subtype=0 ! tee name=t ! decodebin ! video/x-raw ! videoconvert  qos=True ! gvadetect  model=/home/pipeline-server/models/object_detection/coco_yolov5_tiny_608to416_FP32/FP32/yolov5s.xml model-instance-id=detection_e4e789cc60b511edad0a0242ac110002 model-proc=/home/pipeline-server/models/object_detection/coco_yolov5_tiny_608to416_FP32/head_yolov5_tiny_608to416_FP32.json threshold=0.10000000149011612 ! gvametaconvert  tags={\"sensor\": \"sensor\", \"location\": \"location\", \"algorithm\": \"algorithmName\"} ! gvametapublish  file-path=/tmp/results_objects.txt file-format=<enum each line is valid JSON with inference results per frame of type gstreamer_pipeline.GvaMetaPublishFileFormat> ! gvapython  module=/home/pipeline-server/server/pplcount_copy.py class=PeopleCount ! appsink  sync=False emit-signals=True", "module": "gstreamer_pipeline"}
{"levelname": "DEBUG", "asctime": "2022-11-10 07:10:05,185", "message": "Called Status", "module": "gstreamer_pipeline"}
{"levelname": "DEBUG", "asctime": "2022-11-10 07:10:08,185", "message": "Called Status", "module": "gstreamer_pipeline"}
{"levelname": "DEBUG", "asctime": "2022-11-10 07:10:11,187", "message": "Called Status", "module": "gstreamer_pipeline"}
{"levelname": "ERROR", "asctime": "2022-11-10 07:10:11,753", "message": "**Error on Pipeline** b392bf4860c611ed97990242ac110002: **gst-stream-error-quark**: Internal data stream error. (1): ../gst/rtsp/gstrtspsrc.c(6252): gst_rtspsrc_loop (): /GstPipeline:pipeline1/GstURISourceBin:source/GstRTSPSrc:rtspsrc0:\nstreaming stopped, reason not-linked (-1)", "module": "gstreamer_pipeline"}
{"levelname": "DEBUG", "asctime": "2022-11-10 07:10:11,753", "message": "Setting Pipeline b392bf4860c611ed97990242ac110002 State to ERROR", "module": "gstreamer_pipeline"}
{"levelname": "DEBUG", "asctime": "2022-11-10 07:10:14,187", "message": "Called Status", "module": "gstreamer_pipeline"}
{"levelname": "DEBUG", "asctime": "2022-11-10 07:10:17,188", "message": "Called Status", "module": "gstreamer_pipeline"}
{"levelname": "DEBUG", "asctime": "2022-11-10 07:10:20,188", "message": "Called Status", "module": "gstreamer_pipeline"}
{"levelname": "DEBUG", "asctime": "2022-11-10 07:10:23,189", "message": "Called Status", "module": "gstreamer_pipeline"}

Here is the pipeline:

{
	"type": "GStreamer",
	"template": ["{auto_source} ! tee name=t ! decodebin ! video/x-raw ! videoconvert name=videoconvert",
				" ! gvadetect model={models[object_detection][coco_yolov5_tiny_608to416_FP32][network]} name=detection",
				" ! gvametaconvert name=metaconvert ! gvametapublish name=destination",
				" ! gvapython class=PeopleCount module=/home/pipeline-server/server/pplcount.py name=object-counter",
				" ! appsink name=appsink",
				" t. ! queue ! qtdemux ! splitmuxsink name=splitmuxsink"
			],
	"description": "Person Vehicle Bike Detection based on person-vehicle-bike-detection-crossroad-0078",
	"parameters": {
		"type": "object",
		"properties": {
			"detection-properties": {
				"element": {
					"name": "detection",
					"format": "element-properties"
				}
			},
			"detection-device": {
				"element": {
					"name": "detection",
					"property": "device"
				},
				"type": "string",
				"default": "{env[DETECTION_DEVICE]}"
			},
			"detection-model-instance-id": {
				"element": {
					"name": "detection",
					"property": "model-instance-id"
				},
				"type": "string"
			},
			"inference-interval": {
				"element": "detection",
				"type": "integer",
				"minimum": 1,
				"maximum": 4294967295,
				"default": 1
			},
			"max-size-time": {
				"element": "splitmuxsink",
				"type": "integer",
				"minimum": 1,
				"maximum": 200000000000,
				"default": 2000000000
			},
			"recording_prefix": {
				"type": "string",
				"element": {
					"name": "splitmuxsink",
					"property": "location"
				},
				"default": "/home/pipeline-server"
			},
			"threshold": {
				"element": "detection",
				"type": "number"
			}
		}
	}
}

nnshah1 commented Nov 10, 2022

Can you confirm the RTSP stream is compatible with MP4 (qtdemux / qtmux)? splitmuxsink uses qtmux by default; if the incoming stream is not compatible with qtmux / qtdemux, additional modifications would be needed.
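
For example, one possible modification (a sketch only, assuming the camera delivers H.264 over RTP to the tee branch; the depayloader differs for other codecs and may need adjusting depending on what {auto_source} actually exposes) is to replace the recording branch of the template with a depayload/parse step instead of qtdemux, since the default muxer accepts parsed H.264 directly:

" t. ! queue ! rtph264depay ! h264parse ! splitmuxsink name=splitmuxsink"

Because the tee is placed before decodebin, this branch still carries the compressed stream, so recording would not require a re-encode.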

whbruce commented Nov 10, 2022

Another experiment you could do to narrow down the problem is to check that the record part of the record_playback sample works. Then change it to use your media and let us know what happens.

@dhaval-zala-aivid
Author

@nnshah1 @whbruce

With the pipeline below I can run with splitmuxsink, and it also generates the videos. But how can I get the timestamp into the generated video names?
Working pipeline:

{
	"type": "GStreamer",
	"template": ["urisourcebin name=source ! decodebin ",
		" ! gvadetect model={models[object_detection][coco_yolov5_tiny_608to416_FP32][network]} name=detection",
		" ! gvametaconvert name=metaconvert ! queue ! gvametapublish name=destination",
		" ! gvapython class=PeopleCount module=/home/pipeline-server/server/pplcount.py name=object-counter",
		" ! gvawatermark ! videoconvert ! x264enc ! splitmuxsink muxer=avimux location=\"/tmp/temp-%d.mp4\" max-size-time=30000000000"
		],

	"description": "Person Vehicle Bike Detection based on person-vehicle-bike-detection-crossroad-0078",
	"parameters": {
		"type": "object",
		"properties": {
			"detection-properties": {
				"element": {
					"name": "detection",
					"format": "element-properties"
				}
			},
			"detection-device": {
				"element": {
					"name": "detection",
					"property": "device"
				},
				"type": "string",
				"default": "{env[DETECTION_DEVICE]}"
			},
			"detection-model-instance-id": {
				"element": {
					"name": "detection",
					"property": "model-instance-id"
				},
				"type": "string"
			},
			"inference-interval": {
				"element": "detection",
				"type": "integer",
				"minimum": 1,
				"maximum": 4294967295,
				"default": 1
			},
			
			"recording_prefix": {
				"type": "string",
				"element": {
					"name": "splitmuxsink",
					"property": "location"
				},
				"default": "/home/pipeline-server"
			},
			"threshold": {
				"element": "detection",
				"type": "number"
			}
		}
	}
}

nnshah1 commented Dec 1, 2022

To get the timestamp in the video segment filenames:

  1. Remove the element block from the recording_prefix parameter.
  2. Add name=splitmuxsink to the splitmuxsink element.

{
	"type": "GStreamer",
	"template": ["urisourcebin name=source ! decodebin ",
		" ! gvadetect model={models[object_detection][coco_yolov5_tiny_608to416_FP32][network]} name=detection",
		" ! gvametaconvert name=metaconvert ! queue ! gvametapublish name=destination",
		" ! gvapython class=PeopleCount module=/home/pipeline-server/server/pplcount.py name=object-counter",
		" ! gvawatermark ! videoconvert ! x264enc ! splitmuxsink muxer=avimux name=splitmuxsink max-size-time=30000000000"
		],

	"description": "Person Vehicle Bike Detection based on person-vehicle-bike-detection-crossroad-0078",
	"parameters": {
		"type": "object",
		"properties": {
			"detection-properties": {
				"element": {
					"name": "detection",
					"format": "element-properties"
				}
			},
			"detection-device": {
				"element": {
					"name": "detection",
					"property": "device"
				},
				"type": "string",
				"default": "{env[DETECTION_DEVICE]}"
			},
			"detection-model-instance-id": {
				"element": {
					"name": "detection",
					"property": "model-instance-id"
				},
				"type": "string"
			},
			"inference-interval": {
				"element": "detection",
				"type": "integer",
				"minimum": 1,
				"maximum": 4294967295,
				"default": 1
			},
			
			"recording_prefix": {
				"type": "string",
				"default": "/home/pipeline-server"
			},
			"threshold": {
				"element": "detection",
				"type": "number"
			}
		}
	}
}
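
For completeness, a minimal request sketch showing how recording_prefix could then be supplied when starting the pipeline, assuming the standard pipeline-server REST request body (the host, port, and pipeline name/version in the URL are placeholders for your deployment):

POST http://localhost:8080/pipelines/<pipeline-name>/<version>

{
    "source": {
        "uri": "rtsp://<user>:<password>@<camera-ip>:554/<stream-path>",
        "type": "uri"
    },
    "parameters": {
        "detection-device": "CPU",
        "recording_prefix": "/tmp/recordings"
    }
}

This only illustrates passing the parameter; with the element block removed, the expansion of recording_prefix into timestamped segment names is presumably handled by pipeline-server itself rather than mapped directly onto the splitmuxsink location property.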
