November 4, 2021
GelSight, which develops tactile imaging and sensing technologies, has announced a partnership with Meta AI to commercialize and manufacture the DIGIT tactile sensor and expand the field of tactile sensing research.
The DIGIT is a reliable, low-cost, compact, high-resolution tactile sensor built for robotic in-hand manipulation. It was designed and open-sourced in 2020 by Meta (then Facebook) AI researchers to enable AI and robotics researchers to work with touch. GelSight will manufacture and sell DIGIT as part of the company’s offering of digital tactile sensors, making the sensor even more accessible to the global research community.
The company said that touch is the next sense to be digitized, enabling robots to perform tasks they were previously unable to do. GelSight develops and sells technology for digital tactile sensing with the sensitivity and resolution of human touch. The company’s elastomeric, imaging-based tactile sensing technology enables robotics engineers to develop solutions for complex object manipulation and other dexterous tasks.
In addition to enabling new applications in robotics, the company said digitizing touch can benefit industrial applications that require an understanding of surfaces and textures. These include robotic inspection for quality assurance, material property identification, and improving the efficiency of manufacturing and assembly processes.
“GelSight and the Meta AI research team share the same vision around making tactile sensing technology more accessible,” said Youssef Benmokhtar, CEO of GelSight. “Our technology and this partnership will help generate new applications for robotics as engineers, hobbyists and researchers will have access to a small form factor, imaging-based technology at a more accessible price point than ever before.”
“Since we first open-sourced DIGIT last year, it’s been exciting to see researchers use the sensor in their work,” said Roberto Calandra, a research scientist at Meta AI. “GelSight making DIGIT commercially available will help make touch readily accessible to many more researchers and practitioners, and we hope that it will accelerate further progress advancing AI and robotics.”