08:06 available SLAM solutions, 25:33 global shutter for rapidly moving platforms
@ChicagoBob123
5 years ago
The issue is the extremely limited range. If it worked well at 30 meters it would be a lot more useful. It also has issues with sunlight reflections.
@gustavovelascoh
3 years ago
Nice video. At first I thought it was "Realsense with Jake Gyllenhaal"
@Hasan...
5 years ago
I don't get it. The tracker was used separately (additionally) to provide IMU data to the visual odometry. But this presentation is from 28th May 2019. Why didn't he use the D435i, which has a built-in IMU? Shouldn't that provide both IMU and visual odometry together as a single unit?
@patrickpoirier1877
5 years ago
Adding the T265 provides a VIO-based odometry topic to the system, making tracking much more accurate than IMU alone, which is prone to drift as shown in the first video.
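To see why IMU-only dead reckoning drifts while VIO stays bounded, here is a minimal synthetic sketch (not RealSense code; the 0.02 m/s² bias is an assumed, typical-order-of-magnitude MEMS accelerometer bias): double-integrating accelerometer readings with even a tiny constant bias produces position error that grows roughly quadratically with time.

```python
# Synthetic illustration (not RealSense code): a stationary IMU with a
# small constant accelerometer bias, double-integrated into position.
dt = 0.005          # 200 Hz sample rate (assumed)
bias = 0.02         # 0.02 m/s^2 accelerometer bias (assumed, typical MEMS order)

def drift_after(seconds):
    """Position error from double-integrating a constant-bias accelerometer."""
    v = p = 0.0
    for _ in range(int(seconds / dt)):
        v += bias * dt      # integrate acceleration -> velocity
        p += v * dt         # integrate velocity -> position
    return p

# Error grows ~0.5 * bias * t^2:
print(round(drift_after(10), 2))   # ~1.0 m of drift after just 10 s
print(round(drift_after(60), 1))   # ~36 m after a minute
```

Visual odometry corrects the estimate against camera observations every frame, which is why the T265's fused VIO output does not accumulate error this way.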
@mattanimation
5 years ago
The tracker outputs visual odometry computed on its onboard ASIC, whereas the D435i outputs IMU data that would then need to be fed into an algorithm running on the host device's CPU or GPU.
@jamesdavidson11
3 years ago
SLAM = Simultaneous Localization And Mapping
@natethegreatest1000
2 years ago
How does this software account for moving objects, such as other people walking around in that office?
@TheRockeyAllen
4 years ago
Can I use these products for commercial applications? Can I make a robot with an Intel T265 and a Jetson Nano and just sell it? Or will that require licenses and such?
@robnierreyes7197
3 years ago
depends on who you sell it to :)
@TheRockeyAllen
3 years ago
@@robnierreyes7197 Oh... where can I find more information on this?
@robnierreyes7197
3 years ago
@@TheRockeyAllen You need to know the regulations for the field where you are deploying your robotics solution, so look for ISO standards and similar regulatory documents. If you are selling a surgical robot to a hospital, you need to follow medical standards; if you are selling a drone to consumers, you need to follow aviation regulations, etc.
@kappapride6226
11 months ago
Does it work with the D435 and L515 cameras?
@KK-fh1ds
3 years ago
Good presentation
@smtabatabaie
5 years ago
That's great. Where can I find the source code or a sample?
@froimv
5 years ago
I second the motion.
@yankoaleksandrov
4 years ago
Can I run a drone with a Jetson Nano and the stack of T265 and D400 cameras, and use them for indoor navigation?
@ChicagoBob123
4 years ago
Where is the link to the next speaker?
@tienhoangngoc7867
4 years ago
Can I ask what kind of SLAM this project used?
@darthmop2293
4 years ago
Hi, I have some questions. How did you build the sensor system? Which hardware did you use? Can I build it with a Jetson Nano and a battery pack? Is there a German-speaking person I can talk to?
@TheBigCrag123
3 years ago
Hey, I'm planning something similar to you. Have you been able to finish your project in the meantime?
@laisan86
4 years ago
I would like to have the complete documents and source code files to read. Where can I get them?
@olawlor
4 years ago
A T265 overview is here, basically a paper version of this talk, with links to the T265 ROS node source code: dev.intelrealsense.com/docs/intel-realsensetm-visual-slam-and-the-t265-tracking-camera
SLAM with D435i setup guide for ROS: github.com/IntelRealSense/realsense-ros/wiki/SLAM-with-D435i
RTAB-Map (the underlying SLAM library) source code: github.com/introlab/rtabmap
This also needs the Point Cloud Library: github.com/PointCloudLibrary/pcl
@laisan86
4 years ago
@@olawlor Thanks a lot!
@omnigeddon
2 years ago
Basically, buy a used Xbox Kinect and get color and depth instead of just depth. You can buy a Kinect for 3 dollars.
@OEFarredondo
5 years ago
Shukran habibi (thank you, my friend)
@_aawawaa_
4 years ago
I tried this with my RealSense D435i on a Jetson Nano. It lost tracking easily, even when I moved it slowly by hand. Does anyone have a suggestion to fix the problem?
@Frankx520
4 years ago
I just bought a D415 for my Jetson Nano robot, and I saw your comment... So, did your tracking get any better?
@_aawawaa_
4 years ago
@@Frankx520 It got better after I set the parameters recommended here: wiki.ros.org/rtabmap_ros/Tutorials/HandHeldMapping Don't use the launch file included in the RealSense sample folder; read the RTAB-Map wiki and GitHub so you can tune it better. Good luck~
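For reference, the hand-held-mapping setup from that tutorial looks roughly like the following launch invocation (a sketch based on the linked wiki page; topic names assume the stock realsense2_camera launch with aligned depth enabled, so adjust them to your setup):

```shell
# Start the D435i driver with depth aligned to the color stream
# (the topic names below depend on this).
roslaunch realsense2_camera rs_camera.launch align_depth:=true

# In a second terminal: RTAB-Map pointed at the camera topics,
# with exact (not approximate) RGB/depth synchronization.
roslaunch rtabmap_ros rtabmap.launch \
    rtabmap_args:="--delete_db_on_start" \
    depth_topic:=/camera/aligned_depth_to_color/image_raw \
    rgb_topic:=/camera/color/image_raw \
    camera_info_topic:=/camera/color/camera_info \
    approx_sync:=false
```

Slow, smooth motion and a well-textured, well-lit scene also matter a lot for keeping visual tracking locked.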
@Frankx520
4 years ago
@@_aawawaa_ COOL! I am also trying to add two encoders and an IMU to the robot car. I hope I can get a good result :) Thanks for the link!!!
@ChicagoBob123
5 years ago
Also, the current camera software has no build for the Pi 4.
@somatosthatine
3 years ago
Steve Carell explaining SLAM...
@RowdyCoders
5 years ago
👏👏👍
@goofybits8248
4 years ago
👌👍❤
@EliSpizzichino
1 year ago
3 years later... are there any better solutions on the market? I can't believe Intel or someone else hasn't come out with a single camera that does everything.
@TULEYDIGITAL
5 months ago
The closest would be the HTC Vive Ultimate tracker: calibrate it, run it in SteamVR in headless mode, then use the OpenXR package in Python (installed via pip) to read the pose.
@ericpham8205
4 years ago
The sensor was not correctly designed. Use spatial sensing without a camera.
Comments: 42