Show HN: Put this touch sensor on a robot and learn super precise tasks

We just released a touch sensor we're very excited about, one that finally simplifies touch sensing for robotics.

Our most exciting result: learned visuotactile policies for precise tasks like USB insertion and credit card swiping that work out of the box when you replace skins! To the best of our knowledge, this has never been shown with any existing tactile sensor.

Why is this important? For the first time, you can collect data and train models on one sensor and expect them to generalize to new copies of the sensor -- opening the door to the kind of large foundation models that have revolutionized vision and language reasoning.

Would love to hear the community's questions, thoughts and comments!


Comments URL: https://news.ycombinator.com/item?id=41603865

Points: 290

# Comments: 54

https://any-skin.github.io

Posted 6 months ago | Sept 21, 2024, 10:30:23

