{{ApplicableFor
|MPUs list=STM32MP25x
|MPUs checklist=STM32MP13x, STM32MP15x, STM32MP25x
}}
== Overview ==
This article explains how to stream camera content over the network using a [[GStreamer overview|GStreamer]] application on top of the [[V4L2_camera_overview|V4L2 Linux<sup>®</sup> kernel framework]].
This article focuses on camera sensors that do not output compressed image or video content but raw content such as YUV, RGB, or Raw-Bayer images.
Sending images remotely at a decent framerate requires compression in order to keep the network bandwidth reasonable.
This can be achieved by compressing to the JPEG image format or to video formats such as VP8 or H264.
The examples below show command lines that capture a continuous stream of raw images, compress it to a JPEG, VP8 or H264 stream, and play it with various multimedia players, either locally or remotely.
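Before building a pipeline, it can be useful to check which raw formats and resolutions the camera device exposes. As an example, assuming the v4l-utils package is installed and the camera is exposed as /dev/video3 (as in the pipelines below), the supported formats can be listed with:
{{Board$}} v4l2-ctl --device /dev/video3 --list-formats-ext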
== Local streaming ==
Here is an example of a local preview loopback using [https://gstreamer.freedesktop.org/documentation/tools/gst-launch.html gst-launch] to capture raw pictures, compress them to {{STPurple|JPEG}}, then decode and display them.
{{WestonLaunch}}
<br>
{| class="st-table mw-collapsible mw-collapsed"
| For {{MicroprocessorDevice | device=255}}, set up the camera subsystem first:
|-
|
|}
<br>
{{Board$}} gst-launch-1.0 v4l2src device=/dev/video3 ! video/x-raw, format=YUY2, width=640, height=480, framerate=30/1 ! encodebin profile="{{STPurple|'''image/jpeg'''}}" ! decodebin ! autovideosink
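As a variation of this loopback, the generic encodebin and decodebin elements can be replaced by explicit JPEG elements. The sketch below assumes that the jpegenc and jpegdec plugins (part of the GStreamer "good" plugins set) are present in the image:
{{Board$}} gst-launch-1.0 v4l2src device=/dev/video3 ! video/x-raw, format=YUY2, width=640, height=480, framerate=30/1 ! jpegenc ! jpegdec ! videoconvert ! autovideosink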
== UDP streaming ==
A network connection is required, for example by plugging an Ethernet cable into the:
{| class="st-table"
| {{Board | type=257x-EV1}}
| CN17 Ethernet connector 1
|}
<br>
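Optionally, check on the board side that an IP address has been obtained on the Ethernet interface. The command below is given as a generic example; the interface name and address depend on the board and network configuration:
{{Board$}} ip addr show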
{| class="st-table mw-collapsible mw-collapsed"
| For {{MicroprocessorDevice | device=255}}, set up the camera subsystem first:
|-
|
|}
<br>
Get the IP address {{highlight|'''aa.bb.cc.dd'''}} of the host PC using the [[Ifconfig|ifconfig]] command:
{{PC$}} ifconfig | grep "inet"
inet addr:{{highlight|'''aa.bb.cc.dd'''}} Bcast:10.201.23.255 Mask:255.255.252.0
inet addr:127.0.0.1 Mask:255.0.0.0
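On host PCs where the ifconfig command is not available, the same information can be obtained with the ip command (shown here as an alternative, assuming the iproute2 package is installed):
{{PC$}} ip -4 addr show | grep "inet"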
Then fill the {{HighlightParam|'''host{{=}}'''}} udpsink property with this IP address on the board side to send the UDP {{STPurple|'''JPEG'''}} stream:
{{Board$}} gst-launch-1.0 v4l2src device=/dev/video3 ! video/x-raw, format=YUY2, width=640, height=480, framerate=30/1 ! encodebin profile="{{STPurple|'''image/jpeg'''}}" ! {{STPurple|'''rtpjpegpay'''}} ! udpsink {{HighlightParam|'''host{{=}}'''}}{{highlight|'''aa.bb.cc.dd'''}} port=5000
Then play the UDP {{STPurple|'''JPEG'''}} stream on the host PC:
{{PC$}} gst-launch-1.0 udpsrc port=5000 ! application/x-rtp, encoding-name={{STPurple|'''JPEG'''}} ! {{STPurple|'''rtpjpegdepay'''}} ! {{STPurple|'''jpegparse'''}} ! decodebin ! videoconvert ! autovideosink sync=false
A new window appears on the host PC, displaying the camera content.
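Instead of displaying the stream, the received JPEG frames can also be written to disk on the host PC for later inspection. This is a sketch assuming the multifilesink element is available; the frame_%05d.jpg pattern is only an example file name:
{{PC$}} gst-launch-1.0 udpsrc port=5000 ! application/x-rtp, encoding-name={{STPurple|'''JPEG'''}} ! {{STPurple|'''rtpjpegdepay'''}} ! {{STPurple|'''jpegparse'''}} ! multifilesink location=frame_%05d.jpg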
The same can be done with a lower bandwidth thanks to video compression. Here is an example with {{STDarkGreen|'''VP8'''}}:
{{Board$}} gst-launch-1.0 v4l2src device=/dev/video3 ! video/x-raw, format=YUY2, width=640, height=480, framerate=30/1 ! encodebin profile="{{STDarkGreen|'''video/x-vp8'''}}" ! {{STDarkGreen|'''rtpvp8pay'''}} ! udpsink host{{=}}aa.bb.cc.dd port=5000
Then play the UDP {{STDarkGreen|'''VP8'''}} stream on the host PC:
{{PC$}} gst-launch-1.0 udpsrc port=5000 ! application/x-rtp, encoding-name={{STDarkGreen|'''VP8'''}} ! {{STDarkGreen|'''rtpvp8depay'''}} ! decodebin ! videoconvert ! autovideosink sync=false
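The received VP8 stream can also be recorded into a WebM file instead of being displayed. This is a sketch assuming the webmmux element is available on the host PC; the -e option makes gst-launch send an EOS event on Ctrl+C so that the file is properly finalized:
{{PC$}} gst-launch-1.0 -e udpsrc port=5000 ! application/x-rtp, encoding-name={{STDarkGreen|'''VP8'''}} ! {{STDarkGreen|'''rtpvp8depay'''}} ! webmmux ! filesink location=received.webm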
Another example with {{STDarkBlue|'''H264'''}}:
{{Board$}} gst-launch-1.0 v4l2src device=/dev/video3 ! video/x-raw, format=YUY2, width=640, height=480, framerate=30/1 ! encodebin profile="{{STDarkBlue|'''video/x-h264'''}}" ! {{STDarkBlue|'''h264parse config-interval{{=}}1'''}} ! {{STDarkBlue|'''rtph264pay'''}} ! udpsink host{{=}}aa.bb.cc.dd port=5000
Then play the UDP {{STDarkBlue|'''H264'''}} stream on the host PC:
{{PC$}} gst-launch-1.0 udpsrc port=5000 ! application/x-rtp, encoding-name={{STDarkBlue|'''H264'''}} ! {{STDarkBlue|'''rtph264depay'''}} ! decodebin ! videoconvert ! autovideosink sync=false
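Similarly, the received H264 stream can be saved into an MP4 file for later playback. This is a sketch assuming the mp4mux element is available on the host PC; the -e option is required so that the MP4 container is properly finalized when stopping with Ctrl+C:
{{PC$}} gst-launch-1.0 -e udpsrc port=5000 ! application/x-rtp, encoding-name={{STDarkBlue|'''H264'''}} ! {{STDarkBlue|'''rtph264depay'''}} ! h264parse ! mp4mux ! filesink location=received.mp4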
Another H264 streaming example, applying a {{Highlight|200 kbit/s bitrate constraint}} to limit the network bandwidth:
{{Board$}} gst-launch-1.0 v4l2src device=/dev/video3 ! video/x-raw, format=YUY2, width=640, height=480, framerate=30/1 ! encodebin profile="video/x-h264|element-properties,{{Highlight|'''rate-control<nowiki>=</nowiki>1,bitrate<nowiki>=</nowiki>200000'''}}" ! h264parse config-interval=1 ! rtph264pay ! udpsink host{{=}}aa.bb.cc.dd port=5000
{{Info|Refer to the [[V4L2_video_codec_overview#Controlling_encoder]] section for more details about the available video encoder controls}}
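The controls actually exposed by the encoder can be listed directly on the board with v4l2-ctl. The device node used below (/dev/video0) is only an example and depends on the platform; refer to the article above to identify the video encoder device node:
{{Board$}} v4l2-ctl --device /dev/video0 --list-ctrls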
<noinclude>
{{ArticleBasedOnModel | How to article model}}
{{PublicationRequestId |32068 | 2024-08-22 | }}
[[Category:How to run use cases]]
[[Category:V4L2]]
[[Category:GStreamer]]
</noinclude>