This example shows how to fuse wheel odometry measurements (in the form of 3D translational velocity measurements) on the T265 tracking camera, so that they are used together with the internal visual and inertial measurements. The functionality relies on two API calls:
1. Configuring wheel odometry by providing a json calibration file (in the format of the accompanying calibration file). Please refer to the description of the calibration file format here: https://github.com/IntelRealSense/librealsense/blob/master/doc/t265.md#wheel-odometry-calibration-file-format.
2. Sending wheel odometry measurements (one call per measurement) to the camera.
For a static camera, the pose output is expected to move in the direction of the (artificial) wheel odometry measurements, taking into account the extrinsics in the calibration file. The measurements are given a high weight/confidence, i.e. a low measurement noise covariance, in the calibration file to make the effect visible. If the camera is partially occluded the effect is even more visible (also for a smaller wheel odometry confidence / higher measurement noise covariance) because of the lack of visual feedback. Please note that if the camera is *fully* occluded, the pose estimation switches to 3DOF, estimates only orientation, and prevents any changes in position.
import pyrealsense2 as rs

pipe = rs.pipeline()
cfg = rs.config()
profile = cfg.resolve(pipe)
dev = profile.get_device()
tm2 = dev.as_tm2()  # wheel odometry is exposed on the T265 (TM2) device
pose_sensor = tm2.first_pose_sensor()
wheel_odometer = pose_sensor.as_wheel_odometer()

# 1. configure wheel odometry: pass the json calibration file as a list of uint8
f = open("calibration_odometry.json")
wheel_odometer.load_wheel_odometery_config([ord(c) for c in f.read()])  # (sic) spelling per the librealsense API

pipe.start(cfg)
frames = pipe.wait_for_frames()
pose = frames.get_pose_frame()
data = pose.get_pose_data()

# 2. send a measurement: 0.1 m/s along x for wheel-odometry sensor 0
v = rs.vector()
v.x = 0.1
wheel_odometer.send_wheel_odometry(0, 0, v)  # (wo_sensor_id, frame_num, velocity)
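The calibration_odometry.json file follows the wheel odometry calibration format described in the t265.md document linked above. As a rough sketch, with purely illustrative values (identity scale/alignment, zero extrinsics, placeholder variances — not a real calibration), one velocimeter entry might look like:

```json
{
    "velocimeters": [
        {
            "scale_and_alignment": [1.0, 0.0, 0.0,
                                    0.0, 1.0, 0.0,
                                    0.0, 0.0, 1.0],
            "noise_variance": 0.004,
            "extrinsics": {
                "T": [0.0, 0.0, 0.0],
                "T_variance": [1e-6, 1e-6, 1e-6],
                "W": [0.0, 0.0, 0.0],
                "W_variance": [1e-6, 1e-6, 1e-6]
            }
        }
    ]
}
```

The index of each entry in the "velocimeters" array is the sensor id passed to send_wheel_odometry; a small "noise_variance" corresponds to the high measurement confidence mentioned above.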
Author(s): Sergey Dorodnicov, Doron Hirshberg, Mark Horn, Reagan Lopez, Itay Carpis
autogenerated on Mon May 3 2021 02:50:11