
ros_rtsp's People

Contributors

circusmonkey, lucasw


ros_rtsp's Issues

nvidia codec

Hi,
I am working on using a GPU-accelerated encoder instead, as x264enc causes considerably high CPU usage. What I am currently doing is simply replacing x264enc with nvh264enc in the pipeline string and removing the properties nvh264enc does not support, like tune=zerolatency. I see a substantial drop in CPU usage on the sender side, but the receiver side gets higher latency. How could I reconstruct the GStreamer pipeline properly to achieve the same low latency as the original x264enc? There seem to be far fewer options available to tune nvh264enc.
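As a sketch only (every property name here is an assumption; availability varies across GStreamer and nvcodec versions, so verify each with gst-inspect-1.0 nvh264enc), an encoder fragment along these lines may claw back some of the latency, since B-frames and default rate control are common sources of receiver-side buffering:

```
... ! videoconvert ! nvh264enc preset=low-latency-hq rc-mode=cbr bframes=0 gop-size=30 bitrate=800 ! h264parse ! rtph264pay name=pay0 pt=96
```

The bitrate and gop-size values are placeholders; newer releases also expose a boolean zerolatency property on nvh264enc that is worth trying.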

Synchronization and timestamping problem

Hi,
Thanks for the great work.

I am currently using the node to stream ROS images, which I later receive as an RTSP stream in GStreamer. For my use case the timestamps from the ROS images are very important. After a bit of digging I found that the server pipeline clock is not synced with the image timestamps, and the timestamps are not attached to the buffers as metadata.

thanks in advance

Lag/Latency

When I stream RTSP to my VLC player there is almost 20-40 s of lag in the stream.
I've tried almost everything.
VLC doesn't play the RTSP stream at all if I change the config file (e.g. bitrate, width, height, etc.).

What's the best solution to this?

How to support CompressedImage?

I made the following changes to the code:

  1. In image2rtsp.h:
// GstCaps* gst_caps_new_from_image(const sensor_msgs::Image::ConstPtr &msg);
GstCaps* gst_caps_new_from_image(const sensor_msgs::CompressedImageConstPtr &msg); //my_test
//void imageCallback(const sensor_msgs::Image::ConstPtr& msg, const std::string& topic);
void CompressedImageCallback(const sensor_msgs::CompressedImageConstPtr& msg, const std::string& topic); //my_test
  2. In image2rtsp.cpp:
    2.1 Use fixed parameters:
GstCaps* Image2RTSPNodelet::gst_caps_new_from_image(const sensor_msgs::CompressedImageConstPtr &msg)
{
    // http://gstreamer.freedesktop.org/data/doc/gstreamer/head/pwg/html/section-types-definitions.html

    // static const ros::M_string known_formats = {{
    //     {sensor_msgs::image_encodings::RGB8, "RGB"},
    //     {sensor_msgs::image_encodings::RGB16, "RGB16"},
    //     {sensor_msgs::image_encodings::RGBA8, "RGBA"},
    //     {sensor_msgs::image_encodings::RGBA16, "RGBA16"},
    //     {sensor_msgs::image_encodings::BGR8, "BGR"},
    //     {sensor_msgs::image_encodings::BGR16, "BGR16"},
    //     {sensor_msgs::image_encodings::BGRA8, "BGRA"},
    //     {sensor_msgs::image_encodings::BGRA16, "BGRA16"},
    //     {sensor_msgs::image_encodings::MONO8, "GRAY8"},
    //     {sensor_msgs::image_encodings::MONO16, "GRAY16_LE"},
    // }};

    // if (msg->is_bigendian) {
    //     ROS_ERROR("GST: big endian image format is not supported");
    //     return nullptr;
    // }

    // auto format = known_formats.find(msg->encoding);
    // if (format == known_formats.end()) {
    //     ROS_ERROR("GST: image format '%s' unknown", msg->encoding.c_str());
    //     return nullptr;
    // }


    // return gst_caps_new_simple("video/x-raw",
    //         "format", G_TYPE_STRING, format->second.c_str(),
    //         "width", G_TYPE_INT, msg->width,
    //         "height", G_TYPE_INT, msg->height,
    //         "framerate", GST_TYPE_FRACTION, 10, 1,
    //         nullptr);
    return gst_caps_new_simple("video/x-raw",
            "format", G_TYPE_STRING, "RGB",
            "width", G_TYPE_INT, 1280,
            "height", G_TYPE_INT, 720,
            "framerate", GST_TYPE_FRACTION, 10, 1,
            nullptr);
}

2.2 Change the buffer size to a fixed 1280*720*3 bytes:

// my_test start
void Image2RTSPNodelet::CompressedImageCallback(const sensor_msgs::CompressedImageConstPtr& msg, const std::string& topic) {
    GstBuffer *buf;

    GstCaps *caps;
    char *gst_type, *gst_format=(char *)"";
    // g_print("Image encoding: %s\n", msg->encoding.c_str());
    if (appsrc[topic] != NULL) {
        // Set caps from message
        caps = gst_caps_new_from_image(msg);
        gst_app_src_set_caps(appsrc[topic], caps);

        // buf = gst_buffer_new_allocate(nullptr, msg->data.size(), nullptr);
        //gst_buffer_fill(buf, 0, msg->data.data(), msg->data.size());
        //std::cout<< msg->data.size()<<std::endl;
        gsize data_size = 2764800;
        buf = gst_buffer_new_allocate(nullptr, data_size, nullptr);
        gst_buffer_fill(buf, 0, msg->data.data(), 2764800);
        GST_BUFFER_FLAG_SET(buf, GST_BUFFER_FLAG_LIVE);

        gst_app_src_push_buffer(appsrc[topic], buf);
    }
}
// my_test end

Without the change in 2.2, I get Error: 80000 < 2764800; I guess 80000 is the size of the CompressedImage data.
With all the changes applied, the process gets killed.
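The numbers in that error are consistent with a caps/data mismatch rather than a tuning problem: the hardcoded 2764800 is exactly one raw 1280x720 RGB frame (3 bytes per pixel), while a CompressedImage carries encoded (e.g. JPEG) bytes, here about 80000 of them. Copying 2764800 bytes out of an 80000-byte vector reads far past the end of the message data, which would explain the kill. A quick check of the arithmetic (the 80000 figure is taken from the error above):

```python
# One raw RGB frame at the resolution hardcoded in the caps above.
width, height, bytes_per_pixel = 1280, 720, 3
raw_frame_size = width * height * bytes_per_pixel
print(raw_frame_size)           # → 2764800, the hardcoded buffer size

# The compressed payload is much smaller, so the copy overruns it badly.
compressed_size = 80000         # size reported in the error message
print(raw_frame_size - compressed_size)  # → 2684800 bytes past the end
```

A sketch of a real fix (untested against this codebase): keep msg->data.size() as the buffer size and advertise compressed caps such as image/jpeg, letting a jpegdec element in the pipeline decode to raw video, instead of putting raw-video caps over compressed bytes.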

Licensing

Hi,

I noticed there wasn't a license attached to this project. We would like to use this package, what license do you intend for it to be under?

GST_PIPELINE grammar.y:816:priv_gst_parse_yyparse: no element "x264enc"

I installed this node on Ubuntu 18.04 with Melodic. When I use VLC to connect to the RTSP server, the node prints the following errors:

0:07:41.178902242 11218 0x5581d8cc4f70 ERROR GST_PIPELINE grammar.y:816:priv_gst_parse_yyparse: no element "x264enc"
0:07:41.178911425 11218 0x5581d8cc4f70 ERROR GST_PIPELINE grammar.y:901:priv_gst_parse_yyparse: link has no sink [source=@0x7f94cc006d80]
0:07:41.178933565 11218 0x5581d8cc4f70 ERROR GST_PIPELINE grammar.y:901:priv_gst_parse_yyparse: link has no source [sink=@0x7f94cc00a270]
0:07:41.270150981 11218 0x5581d8cc4f70 ERROR rtspclient rtsp-client.c:1054:find_media: client 0x7f94f000a5a0: can't prepare media
0:07:41.270270203 11218 0x5581d8cc4f70 ERROR rtspclient rtsp-client.c:2910:handle_describe_request: client 0x7f94f000a5a0: no media
[ INFO] [1629166812.940669430]: New RTSP client
0:07:46.000311784 11218 0x7f94dc054c50 ERROR GST_PIPELINE grammar.y:816:priv_gst_parse_yyparse: no element "x264enc"
0:07:46.000328290 11218 0x7f94dc054c50 ERROR GST_PIPELINE grammar.y:901:priv_gst_parse_yyparse: link has no sink [source=@0x7f94e0009580]
0:07:46.000354271 11218 0x7f94dc054c50 ERROR GST_PIPELINE grammar.y:901:priv_gst_parse_yyparse: link has no source [sink=@0x7f94e003c070]
0:07:46.061032460 11218 0x7f94dc054c50 ERROR rtspclient rtsp-client.c:1054:find_media: client 0x7f94f000a490: can't prepare media
0:07:46.061180635 11218 0x7f94dc054c50 ERROR rtspclient rtsp-client.c:2639:handle_setup_request: client 0x7f94f000a490: media '/back' not found

The issue can be solved by
sudo apt install gstreamer1.0-libav gstreamer1.0-plugins-ugly

Also, when I check the stream like this:

gst-launch-1.0 -v rtspsrc location=rtsp://127.0.0.1:8554/back drop-on-latency=true use-pipeline-clock=true do-retransmission=false latency=0 protocols=GST_RTSP_LOWER_TRANS_UDP ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink sync=true

I get:

WARNING: erroneous pipeline: no element "h264parse"

Solve it by:
sudo apt install gstreamer1.0-plugins-bad

node doesn't subscribe camera image

Hello,
I guess I'm too stupid to use your node properly.

I'm running some ROS cameras from stemmer-imaging.
The output is a ROS topic called "/MonoLeft/image_raw".

My config file looks like this:

    port: "8554"
    streams:
      stream2:
        type: topic
        source: /MonoLeft/image_raw
        mountpoint: /back
        caps: video/x-raw,framerate=10/1,width=640,height=480
        bitrate: 800

When I check who subscribes to this topic with 'rostopic info /MonoLeft/image_raw', I can't see the rtsp node:

    Type: sensor_msgs/Image

    Publishers:

    Subscribers:
      • /rqt_gui_cpp_node_10987 (http://nano:45731/)

As you can see, there is a video output on the screen in a GUI window, which works.

'rostopic info /standalone_nodelet/bond' says:

    Type: bond/Status

    Publishers:

    Subscribers:

ROS is running on a jetson nano (arm64 / aarch64)

Do you have any idea what I'm doing wrong? It seems to be working for some others. The image format is RAW.
Maybe you can help me out?
Thanks a lot
Greetings
Markus

Roslaunch shuts down after a while

Here's the error I'm getting:

[ INFO] [1642740727.979080745]: Client connected: /back
[ERROR] [1642740930.640270278]: Exception thrown when deserializing message of length [921652] from [/usb_cam]: std::bad_alloc
[ERROR] [1642740930.705612519]: Exception thrown when deserializing message of length [921652] from [/usb_cam]: std::bad_alloc
[ERROR] [1642740930.738311305]: Exception thrown when deserializing message of length [921652] from [/usb_cam]: std::bad_alloc
[ERROR] [1642740930.805840695]: Exception thrown when deserializing message of length [921652] from [/usb_cam]: std::bad_alloc
[ERROR] [1642740930.878347705]: Exception thrown when deserializing message of length [921652] from [/usb_cam]: std::bad_alloc
[ERROR] [1642740930.909162431]: Exception thrown when deserializing message of length [921652] from [/usb_cam]: std::bad_alloc
[ERROR] [1642740930.938718362]: Exception thrown when deserializing message of length [921652] from [/usb_cam]: std::bad_alloc

(nodelet:5055): GLib-ERROR : /build/glib2.0-4CLeJI/glib2.0-2.48.2/./glib/gmem.c:100: failed to allocate 921683 bytes
[standalone_nodelet-1] process has died [pid 5055, exit code -5, cmd /opt/ros/kinetic/lib/nodelet/nodelet manager __name:=standalone_nodelet __log:=/home/ubuntu/.ros/log/67abd8b2-935e-11eb-b968-438b55e11b08/standalone_nodelet-1.log].
log file: /home/ubuntu/.ros/log/67abd8b2-935e-11eb-b968-438b55e11b08/standalone_nodelet-1
.log
[ INFO] [1642740934.303974307]: Bond broken, exiting
[Image2RTSPNodelet-2] process has finished cleanly
log file: /home/ubuntu/.ros/log/67abd8b2-935e-11eb-b968-438b55e11b08/Image2RTSPNodelet-2
.log
all processes on machine have died, roslaunch will exit
shutting down processing monitor...
... shutting down processing monitor complete
done
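For what it's worth, the failing message length fits an uncompressed camera frame, and the GLib abort that follows is an out-of-memory failure rather than a malformed message (a sketch of the arithmetic; the 640x480 RGB8 resolution is an assumption based on the configs shown elsewhere on this page):

```python
# One raw RGB8 frame at 640x480:
frame_bytes = 640 * 480 * 3
print(frame_bytes)            # → 921600

# The failing messages are 921652 bytes: one frame plus ~52 bytes of
# sensor_msgs/Image header fields (stamp, frame_id, encoding, step, ...).
print(921652 - frame_bytes)   # → 52
```

std::bad_alloc on a ~0.9 MB request means the process had already exhausted memory, which fits the unreleased-allocation leak reported in the "memory leak problem" issue below.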

Setting up the stream_setup.yaml file for GMSL camera with Nvidia Xavier NX based device

Hi,

I am currently able to successfully launch the camera streaming display with the below command line:
gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM),width=1900,height=1080,format=NV12' ! nvvidconv ! xvimagesink
I now want to use this package to launch the same stream with ROS. Below is my stream_setup.yaml file. With this yaml file I'm able to launch the node, although I do not see the image topic. I'm guessing there is an issue with the way I'm setting certain parameters.

port: "8554"
streams: # Cannot rename - must leave this as is.

stream-x:
  type: cam
  source: "nvarguscamerasrc sensor-id=0 ! nvvidconv ! xvimagesink ! video/x-raw(memory:NVMM),width=1900,height=1080,format=NV12"
  mountpoint: /front
  bitrate: 500

stream-yay:
  type: topic
  source: /usb_cam1/image_raw
  mountpoint: /back
  caps: video/x-raw,framerate=10/1,width=640,height=480
  bitrate: 500

Hope to resolve this issue.
Thanks!
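A possible problem with the stream-x entry above: xvimagesink is a display sink, so nothing can be linked after it, and the caps filter sits after the sink rather than directly after the camera. A sketch of a corrected entry (untested on this hardware; it assumes ros_rtsp appends its own encoder and payloader after the source pipeline, so the source should end in raw system-memory video and must not contain a sink):

```yaml
stream-x:
  type: cam
  source: "nvarguscamerasrc sensor-id=0 ! video/x-raw(memory:NVMM),width=1900,height=1080,format=NV12 ! nvvidconv ! video/x-raw,format=I420"
  mountpoint: /front
  bitrate: 500
```

The format=I420 output caps are an assumption about what the downstream encoder accepts.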

VLC error

  1. I run "roslaunch ros_rtsp rtsp_streams.launch" successfully, following the readme.
     (screenshot)

  2. I use the local VLC player to connect.
     (screenshot)

  3. VLC errors and cannot play.
     (screenshot)

Same source to different mountpoints

Hello, first of all, I'm very thankful to you; this has helped me a lot!

I wanted to ask you if it is possible to have different mountpoints that use the same webcam as video source.
Thanks again Sam, you rock.

taking source as RTSP client

Hey, I am trying to take the source from another RTSP camera connected over ethernet (it runs its own rtsp-server) and re-stream it over the wlan interface, but I'm facing a problem: it is unable to prepare the media to stream.

Config file:

    stream-x:
      type: cam
      source: "rtspsrc rtsp://192.168.144.25:8554/main.264 ! decodebin videoconvert ! videoscale ! video/x-raw,width=1280,height=780 "
      mountpoint: /front
      bitrate: 5000

though
gst-launch-1.0 rtspsrc location=rtsp://192.168.144.25:8554/main.264 ! decodebin ! videoconvert ! autovideosink sync=false
works to see the stream.

Public IP

How to make this rtsp link available on public IP?

memory leak problem with gst_caps_new_simple and gst_buffer_new_allocate call.

@CircusMonkey Hi, I am trying to use this project with 4 video streams, each 1280x720. While the project is running, it fills memory very fast: I have 16 GB of memory, and in about 10 minutes it uses all of it.
I checked the code and found two function calls: gst_caps_new_simple and gst_buffer_new_allocate. As the documentation says, these two functions allocate memory for the code to use, but I cannot find any code that releases that memory.
So, is that the problem here? Can you fix it?
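The reported fill rate is plausible for exactly the leak described. Each image callback in the code shown earlier on this page creates a GstCaps (gst_caps_new_from_image) and a full raw-frame GstBuffer per message; the buffer is handed over to gst_app_src_push_buffer, but the caps are never unreffed. A rough growth estimate (assumptions: 10 fps, as in the configs on this page, and that every per-frame allocation is retained):

```python
# Rough unbounded-growth estimate for 4 streams of 1280x720 RGB at 10 fps.
frame_bytes = 1280 * 720 * 3               # 2764800 bytes per raw frame
streams, fps = 4, 10
growth_rate = frame_bytes * fps * streams  # bytes per second if never freed
print(growth_rate)                         # → 110592000, roughly 110 MB/s

seconds_to_fill_16g = 16 * 1024**3 / growth_rate
print(round(seconds_to_fill_16g))          # → 155
```

That is a couple of minutes at 10 fps, or about the reported 10 minutes at a lower frame rate, so the order of magnitude fits. An obvious first fix candidate: call gst_caps_unref(caps) after gst_app_src_set_caps in the callback, since gst_app_src_set_caps does not take ownership of the caps.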

Inter-client Connection Dependency

I am seeing a strange and repeatable behavior: if I install this ros node in a docker container running Melodic, a totally separate machine, also running Melodic, is only able to consume the stream after I have first consumed it locally from within the container using a gst-launch-1.0 client. If I do not first run the gst-launch-1.0 client, the other client can never successfully connect.

Steps:

  1. Install this ros node exactly following your readme
  2. run with the following config:
port: "8554"
streams:
  mock-stream:
    type: topic
    source: /zed/zed_nodelet/left/image_rect_color
    mountpoint: /mock
    caps: video/x-raw,framerate=10/1,width=1280,height=720
    bitrate: 500
  3. Try to consume from another machine using python (or even VLC) but get an error:
capture = cv2.VideoCapture(rtsp_url, cv2.CAP_FFMPEG)

503 Service Not Available

  4. On the host, this ros node prints the error:

0:00:21.117989831 1454 0x5623304bba80 ERROR GST_PIPELINE grammar.y:816:priv_gst_parse_yyparse: no element "x264enc"
0:00:21.118087772 1454 0x5623304bba80 ERROR GST_PIPELINE grammar.y:901:priv_gst_parse_yyparse: link has no sink [source=@0x7f0e3801dcd0]
0:00:21.124276372 1454 0x5623304bba80 ERROR GST_PIPELINE grammar.y:901:priv_gst_parse_yyparse: link has no source [sink=@0x7f0e38030240]
0:00:21.314472980 1454 0x5623304bba80 ERROR rtspclient rtsp-client.c:1054:find_media: client 0x7f0e4c00b0f0: can't prepare media
0:00:21.314845959 1454 0x5623304bba80 ERROR rtspclient rtsp-client.c:2910:handle_describe_request: client 0x7f0e4c00b0f0: no media

  5. Restart this ROS node
  6. From within the same container (local), run: gst-launch-1.0 -v rtspsrc location=rtsp://0.0.0.0:8554/mock drop-on-latency=true use-pipeline-clock=true do-retransmission=false latency=0 protocols=GST_RTSP_LOWER_TRANS_UDP ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink sync=true
  7. Stop running gst-launch-1.0
  8. Again run my python code, and it works

So how is the local gst-launch-1.0 client enabling the python (OpenCV) client to connect correctly? Shouldn't clients be independent of each other?

Codec configuration?

Hi! first of all, thanks for the great work!

I'm taking the stream from a ROS topic and trying to stream to an Android device.
When using VLC, I am able to see the video, but Android is unable to display it. After some research, I think it is because Android does not support the codec used (Android supported formats).

Is there any way to change the codec used to one of the supported ones?
The Android app is able to play other streams (such as rtsp://wowzaec2demo.streamlock.net/vod/mp4:BigBuckBunny_115k.mov), so that should not be the problem.

Thank you in advance!
