Show HN: Put this touch sensor on a robot and learn super precise tasks

We're excited to release a new touch sensor that finally simplifies touch sensing for robotics.

Our most exciting result: learned visuotactile policies for precise tasks like USB insertion and credit card swiping work out-of-the-box when you replace skins! To the best of our knowledge, this has never been shown with any existing tactile sensor.

Why is this important? For the first time, you can collect data and train models on one sensor and expect them to generalize to new copies of the sensor -- opening the door to the kind of large foundation models that have revolutionized vision and language.

Would love to hear the community's questions, thoughts and comments!


Comments URL: https://news.ycombinator.com/item?id=41603865

Points: 290

# Comments: 54

https://any-skin.github.io

Created: Sep 21, 2024, 10:30:23 AM
