I love everything Kinemetrix is doing! Automation that makes sense, measured at the bottom line in dollars and cents. Thank you for keeping American manufacturing competitive in a global economy.
@bobvoigt2714 6 years ago
@Jeremy This system is using a single "sheet of light" camera to determine the 3D location and orientation of the workpiece. As stated above, this was built in 2013. Kinemetrix has developed even more impressive solutions since then.
@JeremyCook 6 years ago
That's incredible. Are you able to get that kind of 3D orientation using only a single camera? I'm seeing a single Sick unit, but perhaps there's more going on here?
@aleksandersuur9475 7 years ago
That's pretty neat
@ZeroSins 6 years ago
Can this application be adapted to a welding robot?
@xiaoxingdong7158 8 years ago
Really nice job!
@chiragpatel5940 6 years ago
Which type of 3D vision sensor is used in this?
@moto5315 6 years ago
SICK Ranger
@williamhuang5329 3 years ago
Hanzhen harmonic drive gear, robot gear, over 30 years of experience.
@jasoncreech4486 5 years ago
Is the red line scanner generating the robot offsets?
@Kinemetrix 5 years ago
Yes. The laser line generator, the imaging hardware, and a bunch of software.
@jasoncreech4486 5 years ago
@Kinemetrix Does the software come with the sensor? I have an application where I would like to try this system. How can I reach a representative?
@Kinemetrix 5 years ago
@jasoncreech4486 The part-finding hardware and software are not packaged as a standalone product that can be integrated by others. Our solution was a comprehensively engineered system with the 3D vision robot guidance tightly integrated into the system controls. Please check our website www.kinemetrix.com for contact information if you would like to discuss your needs.
@Bart1DotNet 7 years ago
I can't believe the 3D system has so much recycle.
@jimlpeyton1 7 years ago
Great observation. There are a few non-obvious constraints that determine whether a part is "pickable". For example, the parts shown at 0:42 can be picked by vacuum cup only when their large flat surface is unobstructed. That means: facing up, EOAT pick point not obstructed by neighboring part, not out of the robot's reach, and robot must not collide with conveyor guides or sensor frame. This particular system also images in discrete frames. Parts on the conveyor that span multiple frames are not fully imaged or matched. Since then, we've removed that limitation. During the development of this process, we were surprised to learn that solving the real-time 3D imaging and surface matching was only about 30% of the problem... As a result, all of our 3D guidance applications are optimized as a system: part presentation, imaging, gripping, and path planning. Thanks for your comment.
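The pick/recycle decision described above boils down to a conjunction of independent constraints. Here is a minimal sketch of that filter; all names, fields, and the data are hypothetical illustrations, not Kinemetrix's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class PartCandidate:
    """One matched part from a 3D frame (hypothetical structure)."""
    facing_up: bool          # large flat surface oriented toward the vacuum cup
    pick_point_clear: bool   # EOAT pick point not obstructed by a neighboring part
    in_reach: bool           # within the robot's work envelope
    collision_free: bool     # path avoids conveyor guides and the sensor frame
    fully_imaged: bool       # part did not span a frame boundary

def pickable(p: PartCandidate) -> bool:
    # A part is pickable only when every constraint is satisfied;
    # anything that fails a check recycles for another pass.
    return (p.facing_up and p.pick_point_clear and p.in_reach
            and p.collision_free and p.fully_imaged)

parts = [
    PartCandidate(True, True, True, True, True),
    PartCandidate(True, False, True, True, True),   # blocked by a neighbor
    PartCandidate(False, True, True, True, True),   # facing down
]
picks = [p for p in parts if pickable(p)]
print(len(picks))  # → 1
```

Because the constraints multiply, even modest per-constraint failure rates compound into significant recycle, which is why the comment above stresses optimizing part presentation, imaging, gripping, and path planning as a system rather than in isolation.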
@hakimka 4 years ago
The variety of parts introduces an additional level of complexity. If the cell were to run one part, the solution could have been more rigid and the pickability percentage would be higher. The trick is to balance the pick rate across a dozen combinations of parts. At that point, the solution becomes somewhat universal and one needs to establish probabilities for the pick rate. Another thing to observe: the robot picks parts that can be oriented through 360 degrees, which limits the reach of the tool point. It is not a lights-out operation, but it is close. Dump the sub-assembly parts and pick up the assembled ones later.