
Sebastian


ROS with Raspberry Pi: Improving Image Streaming Performance

When using ROS on a small single-board computer like the Raspberry Pi, performance optimizations are very important. In the last article, I concluded an experiment about network connectivity with a clear strategy: use a dedicated 5 GHz Wi-Fi access point to connect your Raspberry Pi to your workstation, and start the roscore node on the workstation. This setup yields the best data streaming throughput, measured with rostopic. This follow-up article continues the optimization for one area in which special constraints apply: the transportation of image data from camera and point cloud sensors. It covers four optimization aspects: the USB connection, ROS node parametrization, traffic shaping, and using compressed data.

This article originally appeared at my blog admantium.com.

Step 1: Camera Connection

To connect a camera to the Raspberry Pi, you can use two different physical interfaces. CSI, an acronym for Camera Serial Interface, is a dedicated port on the Raspberry Pi. According to the technical specification, it can transfer between 5 and 5.8 Gbit/s. Your camera model must support this interface, as is the case, for example, with the Raspberry Pi Camera.

Source: https://picamera.readthedocs.io

The other option is USB. USB comes in different standards, and on a Raspberry Pi 4 you can use either USB 2.0 (480 Mbit/s) or USB 3.0 (5 Gbit/s). Note that USB 3.1 and USB 3.2 also exist, but these versions are not yet supported on the Raspberry Pi. USB cameras are quite common, but check that your particular model is supported by ROS.

Note: For my robot project, I'm using the Intel RealSense D435 camera. It has a USB 3.0 connection, is actively supported by Intel to run on the Raspberry Pi, and has ROS support.

A particular pitfall I encountered is a USB 3.0 camera silently connecting at USB 2.0 speed. When starting the relevant ROS node, you might see a message like this:

Oct 12 18:21:00 ubuntu ROS[3633]: [ INFO] [1634062852.905999083]: Device with port number 2-1 was found.
Oct 12 18:21:00 ubuntu ROS[3633]: [ INFO] [1634062852.906064100]: Device USB type: 2.0
Oct 12 18:21:00 ubuntu ROS[3633]: [ INFO] [1634062852.906121599]: Warning, expect lower performance

It seems that the camera connected only at USB 2.0 speed. Checking the Linux kernel messages does not yield anything suspicious...

[  382.569703] usb 1-1.1: new high-speed USB device number 3 using xhci_hcd
[  382.671057] usb 1-1.1: New USB device found, idVendor=8086, idProduct=0b07, bcdDevice=50.ce
[  382.671077] usb 1-1.1: New USB device strings: Mfr=1, Product=2, SerialNumber=0
[  382.671096] usb 1-1.1: Product: Intel(R) RealSense(TM) Depth Camera 435 
[  382.671112] usb 1-1.1: Manufacturer: Intel(R) RealSense(TM) Depth Camera 435 
[  382.742153] uvcvideo: Unknown video format 00000050-0000-0010-8000-00aa00389b71
[  382.742349] uvcvideo: Found UVC 1.50 device Intel(R) RealSense(TM) Depth Camera 435  (8086:0b07)
[  382.744851] input: Intel(R) RealSense(TM) Depth Ca as /devices/platform/scb/fd500000.pcie/pci0000:00/0000:00:00.0/0000:01:00.0/usb1/1-1/1-1.1/1-1.1:1.0/input/input0
[  382.745253] uvcvideo: Found UVC 1.50 device Intel(R) RealSense(TM) Depth Camera 435  (8086:0b07)
[  382.747894] usbcore: registered new interface driver uvcvideo
[  382.747901] USB Video Class driver (1.1.1)

... and checking that the USB 3.0 hub is correctly connected did not reveal any error either.

Bus 002 Device 001: ID 1d6b:0003 Linux Foundation 3.0 root hub
Bus 001 Device 003: ID 8086:0b07 Intel Corp. 
Bus 001 Device 002: ID 2109:3431 VIA Labs, Inc. Hub
Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub

After some research, I found the mistake. For some reason, I had assumed that all USB-C cables support USB 3.0. But no! With the correct cable connected, the output looks very different:

Oct 12 18:21:00 ubuntu ROS[3633]: [ INFO] [1634062852.325630073]: Initializing nodelet with 4 worker threads.
Oct 12 18:21:00 ubuntu ROS[3633]: [ INFO] [1634062852.644655742]: RealSense ROS v2.2.21
...
Oct 12 18:21:00 ubuntu ROS[3633]: [ INFO] [1634062852.904735999]: Device with physical ID 2-1-4 was found.
Oct 12 18:21:00 ubuntu ROS[3633]: [ INFO] [1634062852.904770313]: Device with name Intel RealSense D435 was found.
Oct 12 18:21:00 ubuntu ROS[3633]: [ INFO] [1634062852.905999083]: Device with port number 2-1 was found.
Oct 12 18:21:00 ubuntu ROS[3633]: [ INFO] [1634062852.906064100]: Device USB type: 3.2
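To rule out the cable before starting any ROS node, you can read the negotiated link speed directly from sysfs. This is a minimal sketch; the device path is an example from my setup and will differ on your system (find yours with `lsusb -t`):

```shell
#!/bin/sh
# Sketch: check the negotiated USB link speed of the camera.
# The sysfs path is an example - locate your device under /sys/bus/usb/devices/.
# USB 2.0 High-Speed negotiates 480 Mbit/s, USB 3.0 SuperSpeed 5000 Mbit/s.
check_usb_speed() {
  speed=$(cat "$1" 2>/dev/null || echo 0)
  if [ "$speed" -ge 5000 ]; then
    echo "USB 3.0 link ($speed Mbit/s)"
  else
    echo "USB 2.0 or slower ($speed Mbit/s) - check cable and port"
  fi
}

check_usb_speed /sys/bus/usb/devices/2-1/speed
```

If this prints 480 Mbit/s for a camera you expect to run at USB 3.0 speed, swap the cable or port before digging into ROS-level diagnostics.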

Step 2: Node Parametrization

The next step is to consider the particular ROS nodes that publish the data. Get to know the libraries that you are working with, check their configuration options, and systematically try different settings to improve the performance step by step.

As a blueprint, you can follow the approach outlined in my previous article. In this experiment, I systematically tested the topic frequency for images when connecting a Linux workstation and the Raspberry Pi wirelessly. The following excerpt shows how much influence the startup parameters have on the overall performance.

# Hardware Setup
Sender: Raspberry Pi 4 
Receiver: Linux Workstation 

# Software & Frameworks
OS: Ubuntu Server 20.04 (Focal)
ROS: Noetic 1.5.0-1focal.20210922.213755
RealSense SDK: v2.41
RealSense ROS: 2.2.21

By trying different parameters, I could drastically increase the amount of data received.

Parameters

| Parameter          | Run 1   | Run 2   | Run 3   | Run 4 |
| ------------------ | ------- | ------- | ------- | ----- |
| depth/color width  | 640     | 640     | 640     | –     |
| depth/color height | 480     | 480     | 480     | –     |
| depth_fps          | 5       | 30      | 30      | –     |
| color_fps          | 5       | 30      | 30      | –     |
| initial_reset      | FALSE   | TRUE    | TRUE    | –     |
| enable_sync        | TRUE    | TRUE    | TRUE    | TRUE  |
| align_depth        | FALSE   | TRUE    | TRUE    | –     |
| filters            | –       | –       | –       | point |

Topic Hz (Sender)

| Topic              | Run 1   | Run 2   | Run 3   | Run 4 |
| ------------------ | ------- | ------- | ------- | ----- |
| color/image_raw    | 15      | 15      | 15      | 25    |
| depth/color/points | no data | no data | no data | 15    |
| depth/image-rect   | 15      | 30      | 30      | 30    |

Topic Hz (Receiver)

| Topic              | Run 1   | Run 2   | Run 3   | Run 4 |
| ------------------ | ------- | ------- | ------- | ----- |
| color/image_raw    | 8       | 10      | 11      | 10    |
| depth/color/points | no data | no data | no data | 15    |
| depth/image-rect   | 8       | 15      | 15      | 14    |
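The best-performing run corresponds to a launch configuration along these lines. This is a sketch only; the argument names follow the realsense2_camera package's rs_camera.launch and should be checked against your installed version:

```xml
<launch>
  <!-- Wrap rs_camera.launch from realsense2_camera with the tuned values.
       Argument names are assumptions based on realsense-ros 2.2.x. -->
  <include file="$(find realsense2_camera)/launch/rs_camera.launch">
    <arg name="depth_width"   value="640"/>
    <arg name="depth_height"  value="480"/>
    <arg name="color_width"   value="640"/>
    <arg name="color_height"  value="480"/>
    <arg name="depth_fps"     value="30"/>
    <arg name="color_fps"     value="30"/>
    <arg name="initial_reset" value="true"/>
    <arg name="enable_sync"   value="true"/>
    <arg name="align_depth"   value="true"/>
  </include>
</launch>
```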

Step 3: Traffic Shaping with Topic Tools

When using ROS across multiple computers in the same network, an additional challenge is managing the bandwidth intelligently. With Wi-Fi, the available bandwidth is drastically reduced: in my previous article measuring Wi-Fi upload speed, I could transport merely 9 MB/s with the onboard Wi-Fi, and by using an external USB Wi-Fi adapter with an antenna, as described in improving Raspberry Pi 4 wireless performance, this improved to 20 MB/s.

Still, this bandwidth might not be enough for your particular use case. For these situations, the ROS package topic_tools provides several tools to shape traffic so that it fits your network characteristics.

To be honest, I found it hard to dig up a single, complete documentation of the essential network characteristics of ROS 1, but by combining several sources I figured out the following:

  • Nodes publish topics with the full specified frequency and as much bandwidth as possible, even if no subscriber exists
  • Each node has a maximum bandwidth available, and if it publishes too many topics, the bandwidth per topic drops accordingly

With these characteristics, it becomes apparent why the topic tools are handy. For example, if you want to maximize the bandwidth of one node, start a relay node that listens exclusively to one topic and publishes it as another topic. Or if you have a node that publishes several topics and you need to subscribe to all of them, but this would exceed the available bandwidth, you can use a throttle node.

Let’s see both types in action.

Relay Nodes

These nodes are started with two arguments: The name of the topic to subscribe to, and the name of the topic to which it should publish, like this:

rosrun topic_tools relay /camera/depth/color/points /camera/depth/color/points_relay

Relay nodes themselves have minimal overhead: the relayed topic achieves the same bandwidth as the original.

| Topic                           | Bandwidth (Sender) | Bandwidth (Receiver) |
| ------------------------------- | ------------------ | -------------------- |
| camera/depth/color/points       | 25.47 MB/s         | 10.15 MB/s           |
| camera/depth/color/points_relay | 25.46 MB/s         | 10.16 MB/s           |

Throttle Nodes

Throttling can limit either just the frequency, or both the frequency and the bandwidth of a topic. Its syntax is:

rosrun topic_tools throttle messages $TOPIC 3.0
rosrun topic_tools throttle bytes $TOPIC 2048 1.0

The throttled topics are then published at no more than the specified rate.
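Both tools can also be started from a launch file instead of individual rosrun calls. A sketch, using the topic names from this article (node names are arbitrary):

```xml
<launch>
  <!-- Relay the point cloud to a dedicated topic. -->
  <node pkg="topic_tools" type="relay" name="points_relay"
        args="/camera/depth/color/points /camera/depth/color/points_relay"/>
  <!-- Limit the color image to 3 messages per second. -->
  <node pkg="topic_tools" type="throttle" name="image_throttle"
        args="messages /camera/color/image_raw 3.0"/>
</launch>
```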

Step 4: Use Compressed Data

The final option is to use compressed instead of raw data.

This option is limited in applicability. First, both the producer and the consumer need to be able to work with compressed data. Second, the producer and the consumer need to provide the data in exactly the same message format and version.

Here are the topic bandwidths of the compressed image data:

| Topic                                  | Bandwidth (Sender) |
| -------------------------------------- | ------------------ |
| camera/color/image_raw                 | 15.33 MB/s         |
| camera/color/image_raw/compressed      | 465.93 KB/s        |
| camera/depth/image_rect_raw            | 18.37 MB/s         |
| camera/depth/image_rect_raw/compressed | 742.83 B/s         |
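To put these numbers into perspective, a quick back-of-the-envelope calculation with the measured values for the color stream shows the reduction factor:

```shell
#!/bin/sh
# Compare raw vs. compressed bandwidth of the color image topic,
# using the measured values from the table above.
raw_kb=15698   # 15.33 MB/s expressed in KB/s (15.33 * 1024)
comp_kb=466    # 465.93 KB/s, rounded
echo "$((raw_kb / comp_kb))x less bandwidth"   # prints "33x less bandwidth"
```

In other words, the compressed color stream needs roughly 3% of the raw bandwidth, which fits comfortably even into the 9 MB/s of the onboard Wi-Fi.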

Conclusion

Using ROS on single-board computers introduces performance obstacles when transporting data over Wi-Fi. This article showed four ways to optimize streaming image data: 1) favor USB 3.0 or the native CSI interface for the camera connection, 2) fine-tune the ROS node parameters, 3) use ROS topic tools to shape the network traffic, and finally 4) use compressed data. These hints should help you utilize the limited bandwidth of Wi-Fi connections as well as possible.
