Archive for the ‘virtual reality’ category: Page 14

May 23, 2023

Meta’s new AI models can recognize and produce speech for more than 1,000 languages

Posted by in categories: robotics/AI, virtual reality

They could help lead to speech apps for many more languages than exist now.

Meta has built AI models that can recognize and produce speech for more than 1,000 languages—a tenfold increase on what’s currently available. It’s a significant step toward preserving languages that are at risk of disappearing, the company says.

Meta is releasing its models to the public via the code hosting service GitHub. It claims that making them open source will help developers working in different languages to build new speech applications—like messaging services that understand everyone, or virtual-reality systems that can be used in any language.

May 23, 2023

Apple readies launch of $3,000 headset: Will it succeed where others have failed?

Posted by in categories: electronics, virtual reality

Apple Inc.’s annual WWDC developers’ conference is fast approaching — and this one promises to be a bit more eventful than most, as the consumer-electronics giant is expected to debut its long-awaited mixed-reality headset.

The company hasn’t made a major product introduction since it rolled out the Apple Watch in 2015, but with an expensive headset supporting augmented and virtual reality, Apple will be entering a market that has so far failed to catch on in a mainstream way. Meta Platforms Inc. has bet big on virtual reality with its Oculus headsets, with only limited traction.

May 19, 2023

Exploring the Relationship Between Artificial Intelligence (AI) and the Metaverse

Posted by in categories: augmented reality, internet, robotics/AI, virtual reality

Artificial intelligence (AI) and the metaverse are some of the most captivating technologies of the 21st century so far. Both are believed to have the potential to change many aspects of our lives, disrupt different industries, and enhance the efficiency of traditional workflows. While these two technologies are often looked at separately, they’re more connected than we may think. Before we explore the relationship between AI and the metaverse, let’s start by defining both terms.

The metaverse is a concept describing a hypothetical future design of the internet. It features an immersive, 3D online world where users are represented by custom avatars and access information with the help of virtual reality (VR), augmented reality (AR), and similar technologies. Instead of accessing the internet via their screens, users access the metaverse via a combination of the physical and digital. The metaverse will enable people to socialize, play, and work alongside others in different 3D virtual spaces.

A similar arrangement was described in Neal Stephenson’s 1992 science-fiction novel Snow Crash. While it was perceived as pure fantasy a mere three decades ago, it seems it could become reality sooner rather than later. Although the metaverse doesn’t fully exist yet, some online platforms incorporate elements of it. For example, video games like Fortnite and Horizon Worlds port multiple elements of our day-to-day lives into the online world.

May 19, 2023

Is buzzy startup Humane’s big idea a wearable camera?

Posted by in categories: augmented reality, food, health, mobile phones, robotics/AI, virtual reality, wearables

The demo is clever, questionably real, and prompts a lot of questions about how this device will actually work.

Buzz has been building around the secretive tech startup Humane for over a year, and now the company is finally offering a look at what it’s been building. At TED last month, Humane co-founder Imran Chaudhri gave a demonstration of the AI-powered wearable the company is building as a replacement for smartphones. Bits of the video leaked online after the event, but the full video is now available to watch.

The device appears to be a small black puck that slips into your breast pocket, with a camera, projector, and speaker sticking out the top. Throughout the 13-minute presentation, Chaudhri walks through a handful of use cases for Humane’s gadget:

* The device rings when Chaudhri receives a phone call. He holds his hand up, and the device projects the caller’s name along with icons to answer or ignore the call. He then has a brief conversation. (Around 1:48 in the video)
* He presses and holds one finger on the device, then asks a question about where he can buy a gift. The device responds with the name of a shopping district. (Around 6:20)
* He taps two fingers on the device, says a sentence, and the device translates it into another language, speaking it back in an AI-generated clone of his voice. (Around 6:55)
* He presses and holds one finger on the device, says, “Catch me up,” and it reads out a summary of recent emails, calendar events, and messages. (At 9:45)
* He holds a chocolate bar in front of the device, then presses and holds one finger while asking, “Can I eat this?” The device recommends he not eat it because of a food allergy he has. He presses down one finger again and tells the device he’s ignoring its advice. (Around 10:55)
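At its core, the demo describes a gesture-to-action mapping: each touch or camera input selects one device behavior. As a rough illustration only (all gesture names and actions here are hypothetical, not Humane’s actual software), the dispatch could look like this:

```python
# Hypothetical sketch of the gesture-based interaction model described above.
# Gesture names and actions are illustrative, not Humane's real API.

GESTURE_ACTIONS = {
    "raise_hand":      "project caller info onto palm",
    "hold_one_finger": "voice query to assistant",
    "tap_two_fingers": "translate spoken sentence",
    "show_object":     "identify object with camera",
}

def dispatch(gesture):
    """Map a recognized gesture to the action the device should take."""
    return GESTURE_ACTIONS.get(gesture, "ignore")
```

A table like this is only the surface, of course; the hard part is the recognition layer that decides which gesture just happened.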


May 19, 2023

Tensor Holography: MIT student uses AI to advance holograms

Posted by in categories: 3D printing, biotech/medical, holograms, media & arts, mobile phones, robotics/AI, virtual reality

From 2021

A new method called tensor holography could enable the creation of holograms for virtual reality, 3D printing, medical imaging, and more — and it can run on a smartphone.


May 15, 2023

Robotic proxy brings remote users to life in real time

Posted by in categories: robotics/AI, virtual reality

Cornell University researchers have developed a robot, called ReMotion, that occupies physical space on a remote user’s behalf, automatically mirroring the user’s movements in real time and conveying key body language that is lost in standard virtual environments.

“Pointing gestures, the perception of another’s gaze, intuitively knowing where someone’s attention is—in remote settings, we lose these nonverbal, implicit cues that are very important for carrying out design activities,” said Mose Sakashita, a doctoral student of information science.

Sakashita is the lead author of “ReMotion: Supporting Remote Collaboration in Open Space with Automatic Robotic Embodiment,” which he presented at the Association for Computing Machinery CHI Conference on Human Factors in Computing Systems in Hamburg, Germany. “With ReMotion, we show that we can enable rapid, dynamic interactions through the help of a mobile, automated robot.”

May 13, 2023

A moment’s silence, please, for the death of the Metaverse

Posted by in categories: robotics/AI, virtual reality

And imagine if every penny sunk into this were available for AI research right now.


Meta sank tens of billions into its CEO’s virtual reality dream, but what will he do next?


May 13, 2023

Meta says new study shows the metaverse could boost the global economy

Posted by in categories: augmented reality, computing, economics, mobile phones, virtual reality

Yeah, feels a bit harder to take it seriously when the company paying for the study has so much skin in the game.

May 10, 2023

How to build cheap VR Haptic Gloves to FEEL VR

Posted by in categories: 3D printing, virtual reality

How to build VR Haptic gloves to feel in VR, for really cheap.


Here’s a step-by-step guide on how to build your own budget VR Haptic Gloves! (Prototype 4)
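A core step in builds like this is converting each finger’s flex-sensor reading into a servo angle for force feedback. Here is a minimal sketch of that mapping (the calibration values are hypothetical, and the video’s prototype may do this differently):

```python
# Hypothetical sketch: map a raw flex-sensor ADC reading to a servo angle,
# a typical step in DIY haptic-glove builds (the video's method may differ).

def sensor_to_angle(raw, raw_min=200, raw_max=800, angle_max=180):
    """Clamp the ADC reading to its calibrated range, then scale it
    linearly to a servo angle in degrees (0 = open hand, angle_max = full curl)."""
    raw = max(raw_min, min(raw, raw_max))
    fraction = (raw - raw_min) / (raw_max - raw_min)
    return round(fraction * angle_max)
```

In a real glove, `raw_min` and `raw_max` would come from a per-finger calibration pass, since no two sensors bend identically.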


May 8, 2023

Study presents large brain-like neural networks for AI

Posted by in categories: augmented reality, biological, mobile phones, robotics/AI, virtual reality, wearables

In a new study in Nature Machine Intelligence, researchers Bojian Yin and Sander Bohté from the HBP partner Dutch National Research Institute for Mathematics and Computer Science (CWI) demonstrate a significant step towards artificial intelligence that can be used in local devices like smartphones and in VR-like applications, while protecting privacy.

They show how brain-like neurons combined with novel learning methods make it possible to train fast, energy-efficient spiking neural networks at a large scale. Potential applications range from wearable AI to augmented reality.

While modern artificial neural networks are the backbone of the current AI revolution, they are only loosely inspired by networks of real, biological neurons such as those in our brain. The brain, however, is a much larger network, is much more energy-efficient, and can respond ultra-fast when triggered by external events. Spiking neural networks are special types of neural networks that more closely mimic the workings of biological neurons: the neurons of our nervous system communicate by exchanging electrical pulses, and they do so only sparingly.
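The spiking behavior described above can be illustrated with the classic leaky integrate-and-fire model: a neuron accumulates input, leaks potential over time, and emits a pulse only when a threshold is crossed. This is a generic textbook sketch, not the CWI authors’ actual model or training method:

```python
# Minimal leaky integrate-and-fire (LIF) neuron: a generic illustration of
# spiking dynamics, not the model from the Nature Machine Intelligence study.

def simulate_lif(inputs, tau=0.9, threshold=1.0):
    """Integrate a sequence of input currents with leak factor tau; emit a
    spike (1) when the membrane potential crosses threshold, then reset."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = tau * potential + current  # leaky integration
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0                    # reset after spiking
        else:
            spikes.append(0)
    return spikes
```

The sparsity is visible in the output: most timesteps produce no spike at all, which is exactly what makes such networks attractive for low-power local devices.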
