How to Visualize Live-Streaming Frames from Intel RealSense Depth Sensor


This example requires a connected RealSense depth sensor.

The Intel RealSense depth sensor can stream live depth and color data. To visualize this output, we use Rerun.
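Before any logging can happen, the sensor and the Rerun SDK have to be initialized. The original example handles this in its setup code; below is a minimal sketch, assuming the pyrealsense2 Python bindings and a hypothetical application id, of how the streams and the Rerun recording could be configured.

import numpy as np
import pyrealsense2 as rs
import rerun as rr

# Start a Rerun recording and spawn the viewer (application id is illustrative).
rr.init("rerun_example_live_depth_sensor", spawn=True)

# Configure the RealSense pipeline to stream depth and color.
pipe = rs.pipeline()
cfg = rs.config()
cfg.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
cfg.enable_stream(rs.stream.color, 640, 480, rs.format.rgb8, 30)
profile = pipe.start(cfg)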

Logging and visualizing with Rerun

The RealSense sensor captures data in both RGB and depth formats, which are logged using the Image and DepthImage archetypes, respectively. Additionally, to provide a 3D view, the visualization includes a pinhole camera using the Pinhole and Transform3D archetypes.
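Each frame then has to be pulled from the sensor and converted to a NumPy array before it can be logged with those archetypes. A minimal sketch of such a capture loop, assuming the pyrealsense2 pipeline from the setup above:

import itertools

for frame_nr in itertools.count():
    frames = pipe.wait_for_frames()
    depth_frame = frames.get_depth_frame()
    color_frame = frames.get_color_frame()
    if not depth_frame or not color_frame:
        continue

    # Convert to NumPy arrays; these become the color_image and depth_image
    # arrays logged in the snippets below.
    color_image = np.asanyarray(color_frame.get_data())
    depth_image = np.asanyarray(depth_frame.get_data())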

The visualizations in this example were created with the following Rerun code.


RDF

To orient the 3D view, the view coordinates are set to RDF (X=Right, Y=Down, Z=Forward), matching the camera convention:

rr.log("realsense", rr.ViewCoordinates.RDF, timeless=True)



Image

First, the pinhole camera is set up using the Pinhole and Transform3D archetypes: Transform3D logs the extrinsics between the depth and color streams, and Pinhole logs the color camera intrinsics. Then, the images captured by the RealSense sensor are logged as an Image and associated with the frame at which they were taken.

rgb_from_depth = depth_profile.get_extrinsics_to(rgb_profile)
rr.log(
    "realsense/rgb",
    rr.Transform3D(
        translation=rgb_from_depth.translation,
        mat3x3=np.reshape(rgb_from_depth.rotation, (3, 3)),
        from_parent=True,
    ),
    timeless=True,
)

rr.log(
    "realsense/rgb/image",
    rr.Pinhole(
        resolution=[rgb_intr.width, rgb_intr.height],
        focal_length=[rgb_intr.fx, rgb_intr.fy],
        principal_point=[rgb_intr.ppx, rgb_intr.ppy],
    ),
    timeless=True,
)

rr.set_time_sequence("frame_nr", frame_nr)
rr.log("realsense/rgb/image", rr.Image(color_image))
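The snippet above uses depth_profile, rgb_profile, and rgb_intr without defining them. As a rough sketch, assuming the pyrealsense2 pipeline from the setup step, they could be read from the active stream profiles like this:

# Assumes `profile` is the pipeline profile returned by pipe.start(cfg).
depth_profile = profile.get_stream(rs.stream.depth)
rgb_profile = profile.get_stream(rs.stream.color)
rgb_intr = rgb_profile.as_video_stream_profile().get_intrinsics()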



Depth image

Just like the RGB images, the RealSense sensor also captures depth data. The depth images are logged as DepthImage objects and are linked with the time they were captured.

rr.log(
    "realsense/depth/image",
    rr.Pinhole(
        resolution=[depth_intr.width, depth_intr.height],
        focal_length=[depth_intr.fx, depth_intr.fy],
        principal_point=[depth_intr.ppx, depth_intr.ppy],
    ),
    timeless=True,
)

rr.set_time_sequence("frame_nr", frame_nr)
rr.log("realsense/depth/image", rr.DepthImage(depth_image, meter=1.0 / depth_units))



Join us on GitHub

rerun-io / rerun

Visualize streams of multimodal data. Fast, easy to use, and simple to integrate. Built in Rust using egui.

Build time-aware visualizations of multimodal data

Use the Rerun SDK (available for C++, Python and Rust) to log data like images, tensors, point clouds, and text. Logs are streamed to the Rerun Viewer for live visualization or to file for later use.

A short taste

import rerun as rr  # pip install rerun-sdk
rr.init("rerun_example_app")

rr.connect()  # Connect to a remote viewer
# rr.spawn()  # Spawn a child process with a viewer and connect
# rr.save("recording.rrd")  # Stream all logs to disk

# Associate subsequent data with 42 on the “frame” timeline
rr.set_time_sequence("frame", 42)

# Log colored 3D points to the entity at `path/to/points`
rr.log("path/to/points", rr.Points3D(positions, colors=colors

View on GitHub

Original article: How to Visualize Live-Streaming Frames from Intel RealSense Depth Sensor
