WebRTC with a Jetson Nano + CSI camera + momo
Download the Jetson Nano build of momo.
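A rough sketch of fetching it (the release asset name below is an assumption based on the 2021.3 version shown next; check https://github.com/shiguredo/momo/releases for the actual file):
code:bash
# Assumed asset name for the Jetson Nano (Ubuntu 18.04 / aarch64) build of momo 2021.3;
# adjust to whatever the release page actually lists.
wget https://github.com/shiguredo/momo/releases/download/2021.3/momo-2021.3_ubuntu-18.04_armv8_jetson_nano.tar.gz
tar xzf momo-2021.3_ubuntu-18.04_armv8_jetson_nano.tar.gz
cd momo-2021.3_ubuntu-18.04_armv8_jetson_nano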
code:bash
./momo --version
WebRTC Native Client Momo 2021.3 (1334a266)
WebRTC: Shiguredo-Build M90.4430@{#3} (90.4430.3.1 dee77cf2)
Environment: aarch64 Ubuntu 18.04.5 LTS (nvidia-l4t-core 32.5.1-20210519110732)
USE_MMAL_ENCODER=0
USE_JETSON_ENCODER=1
USE_NVCODEC_ENCODER=0
USE_SDL2=1
Running it as-is fails with an error.
code:bash
$ ./momo --log-level 1 --video-device /dev/video0 test
000:00010712 (v4l2_video_capturer.cpp:76): GetDeviceName(0): device_name=vi-output, imx219 6-0010, unique_name=platform:54080000.vi:0
000:00110712 (v4l2_video_capturer.cpp:76): GetDeviceName(1): device_name=Dummy video device (0x0000), unique_name=platform:v4l2loopback-000
000:00210712 (v4l2_video_capturer.cpp:230): Video Capture enumerats supported image formats:
000:00210712 (v4l2_video_capturer.cpp:232): { pixelformat = RG10, description = 'f9c24bbc' }
000:00210712 (v4l2_video_capturer.cpp:245): no supporting video formats found
000:00210712 (jetson_v4l2_capturer.cpp:65): Failed to start JetsonV4L2Capturer(w = 640, h = 480, fps = 30)
000:00310712 (v4l2_video_capturer.cpp:170): no matching device found
000:00310712 (jetson_v4l2_capturer.cpp:59): Failed to create JetsonV4L2Capturer(platform:v4l2loopback-000)
000:00410712 (jetson_v4l2_capturer.cpp:33): Failed to create JetsonV4L2Capturer
failed to create capturer
Looking at momo's source code, the error comes from the camera not exposing any pixel format momo can handle.
code:bash
$ v4l2-ctl -d /dev/video0 --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
    Index       : 0
    Type        : Video Capture
    Pixel Format: 'RG10'
    Name        : 10-bit Bayer RGRG/GBGB
        Size: Discrete 3264x2464
            Interval: Discrete 0.048s (21.000 fps)
        Size: Discrete 3264x1848
            Interval: Discrete 0.036s (28.000 fps)
        Size: Discrete 1920x1080
            Interval: Discrete 0.033s (30.000 fps)
        Size: Discrete 1640x1232
            Interval: Discrete 0.033s (30.000 fps)
        Size: Discrete 1280x720
            Interval: Discrete 0.017s (60.000 fps)
So RG10 won't do. Googling around turned up a helpful tweet:
create a v4l2 virtual device and feed the converted video into it, and momo can then consume it.
Create the v4l2 virtual device
code:bash
sudo apt install v4l2loopback-dkms
sudo modprobe v4l2loopback
This creates /dev/video1.
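If you want a fixed device number, or an app refuses to pick the loopback device up, v4l2loopback takes module parameters; a sketch (video_nr, exclusive_caps and card_label are standard v4l2loopback options, whether you need them here is situational):
code:bash
# Reload the module with a fixed device number and capture-only caps
sudo modprobe -r v4l2loopback
sudo modprobe v4l2loopback video_nr=1 exclusive_caps=1 card_label="CSI loopback"
v4l2-ctl --list-devices   # confirm /dev/video1 is there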
Convert the camera feed via gstreamer and push it into the loopback device.
code:bash
gst-launch-1.0 -v nvarguscamerasrc ! 'video/x-raw(memory:NVMM), format=NV12, width=3264, height=2464, framerate=21/1' ! nvvidconv ! 'video/x-raw, width=3264, height=2464, format=I420, framerate=21/1' ! videoconvert ! identity drop-allocation=1 ! 'video/x-raw, width=3264, height=2464, format=I420, framerate=21/1' ! v4l2sink device=/dev/video1
code:bash
$ v4l2-ctl -d /dev/video1 --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
    Index       : 0
    Type        : Video Capture
    Pixel Format: 'YU12'
    Name        : Planar YUV 4:2:0
        Size: Discrete 3264x2464
            Interval: Discrete 0.033s (30.000 fps)
The conversion works.
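The full 3264x2464@21 feed is heavy for the Nano; a lighter pipeline (a sketch assuming the 1280x720@60 sensor mode listed by v4l2-ctl above behaves the same way, untested):
code:bash
gst-launch-1.0 -v nvarguscamerasrc ! 'video/x-raw(memory:NVMM), format=NV12, width=1280, height=720, framerate=60/1' ! nvvidconv ! 'video/x-raw, width=1280, height=720, format=I420, framerate=60/1' ! videoconvert ! identity drop-allocation=1 ! 'video/x-raw, width=1280, height=720, format=I420, framerate=60/1' ! v4l2sink device=/dev/video1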
Stream it with momo. Note that, perhaps because the video goes through gstreamer, hardware encoding doesn't seem to be usable?
code:bash
./momo --log-level 1 --hw-mjpeg-decoder=false --no-audio-device --video-device /dev/video1 test
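(For reference: if I remember the momo docs right, test mode serves a built-in page on port 8080; the port and path below are assumptions, check the momo documentation.)
code:bash
# Assumed defaults for momo's test mode: port 8080, page at html/test.html.
# Open in a browser, replacing <jetson-ip> with the Nano's address:
#   http://<jetson-ip>:8080/html/test.html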
Opening the local test server page: yay, it works!