Facebook CEO Mark Zuckerberg fences in the metaverse with an Olympic gold medalist during a livestreamed virtual- and augmented-reality conference announcing the rebranding of Facebook as Meta, in this screenshot taken from a video released on October 28, 2021.
Facebook | via Reuters
Facebook co-founder Mark Zuckerberg, now Meta’s CEO, said Monday that the new sensor and plastic skin could work together to help support development of the so-called metaverse.
Together with scientists from Carnegie Mellon University, artificial intelligence researchers at Meta have created a deformable plastic “skin” less than 3 millimeters thick.
The relatively cheap material, known as ReSkin, contains embedded magnetic particles that produce a magnetic field.
When the skin comes into contact with another surface, the magnetic field from the embedded particles changes. The sensor detects the change in magnetic flux and passes the data to AI software, which tries to infer the force or touch that has been applied.
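The sensing pipeline described above can be sketched in a few lines. Everything below is an illustrative assumption, not Meta's actual code: the baseline field, the linear calibration matrix, and the function names are invented to show the idea of mapping a change in magnetic flux to an estimated contact force.

```python
import numpy as np

# Hypothetical resting magnetometer reading (arbitrary units).
BASELINE_FIELD = np.array([0.0, 0.0, 50.0])

# Stand-in for a learned flux-change -> force mapping. A real system
# would learn this from data; a simple linear map is assumed here.
CALIBRATION = np.array([[0.02, 0.0, 0.0],
                        [0.0, 0.02, 0.0],
                        [0.0, 0.0, 0.05]])

def estimate_force(field_reading: np.ndarray) -> np.ndarray:
    """Estimate a 3-axis contact force from the change in magnetic flux."""
    delta_b = field_reading - BASELINE_FIELD  # deformation shifts the particles
    return CALIBRATION @ delta_b

# A press normal to the skin mostly changes the z-component of the field.
force = estimate_force(np.array([0.5, -0.2, 48.0]))
print(force)
```

The point of the sketch is only the data flow: contact deforms the skin, the deformation shifts the magnetic particles, and software converts the resulting flux change into a force estimate.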
“We developed a high-resolution touch sensor and worked with Carnegie Mellon to create a thin robot skin,” Zuckerberg wrote on Facebook on Monday. “This brings us one step closer to realistic virtual objects and physical interactions in the metaverse.”
The skin was tested on robot arms handling soft fruits, including grapes and blueberries. It was also placed inside a rubber glove while a human hand formed a bao bun.
The artificial intelligence system had to be trained on 100 human touches to have enough data to learn how changes in the magnetic field relate to touch.
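A minimal sketch of what such training could look like, under loud assumptions: the article only says the model was trained on roughly 100 labeled touches, so the synthetic data, the linear ground-truth map, and the use of ordinary least squares below are all stand-ins for whatever data and model Meta actually used.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented ground-truth mapping used only to generate synthetic labels.
true_map = np.array([[0.02, 0.0, 0.0],
                     [0.0, 0.02, 0.0],
                     [0.0, 0.0, 0.05]])

# 100 synthetic "touches": each is a measured 3-axis flux change
# paired with the force that produced it.
delta_flux = rng.normal(size=(100, 3))
forces = delta_flux @ true_map.T

# Fit a linear map from flux change to force via least squares.
learned_map, *_ = np.linalg.lstsq(delta_flux, forces, rcond=None)

# After training, new flux readings can be converted to force estimates.
predictions = delta_flux @ learned_map
print(np.allclose(predictions, forces, atol=1e-6))
```

The design point is simply that a modest number of labeled touches can be enough to calibrate the flux-to-force relationship when the sensor response is well behaved.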
The paper is scheduled to be published in an academic journal later this month but has not yet been peer-reviewed.
Artificial intelligence researchers have largely neglected touch because sensors were either too expensive or too flimsy to yield reliable data, Abhinav Gupta, a research scientist at Meta, said on a media call Friday.
“If you think about how people or babies learn, rich multimodal data is very important for developing an understanding of the world,” Gupta said. “We learn from pixels, sound, touch, taste, smell, etc.”
“But if you look at how artificial intelligence has progressed over the last decade, we’ve made huge strides in pixels (computer vision), and we’ve made strides in sound: audio, speech and so on. But touch has been missing from that progress, even though it is very critical.”
Giving machines and helper robots a sense of touch will allow them to understand how people interact with objects, Gupta said, adding that Meta’s ReSkin can detect forces as small as 0.1 newtons from objects less than 1 millimeter wide.
“For the first time, we can try to better understand the physics of objects,” Gupta said, adding that this would help Meta build the metaverse.
The metaverse is either the next evolution of the internet or the latest corporate buzzword, one that has investors excited about nebulous innovations that may not even arrive within the next decade.
In any case, technology companies, especially Meta, are increasingly building toward the concept of a metaverse, a term for a virtual world in which people can live, work and play. If you’ve seen the movie “Ready Player One,” you have a good idea of what the metaverse is: put on a computerized headset and you are transported into a digital universe where anything is possible.
If Meta’s metaverse ambitions come true, it may be possible to interact with virtual objects and get some physical feedback from the equipment.
“When you’re wearing a headset, you want users to have an even richer experience,” Gupta said.
“How can you provide tactile feedback if you don’t know how people feel, what the properties of the material are, and so on?”
– Additional reporting by CNBC’s Steve Kovach.