November 21, 2022
How Meta is capturing human inputs to control the virtual world
Despite the growth of the augmented reality (AR) marketplace, the smart glasses segment has been slower to take off. Smart glasses remain a luxury product, like virtual reality (VR) headsets, and a lack of system-selling software has stunted the growth of AR hardware. However, demand for AR hardware and software is making immersive firms scramble to create relevant services.
There are many fantastic smart glasses products on the market. A range of global firms and distributors are releasing hardware for frontline workers.
With the debut of Nreal and Ray-Ban devices, consumers are encountering smart glasses with increasingly sophisticated components and software. Estimates suggest that consumer-grade AR smart glasses vendors will ship roughly 14.19 million units in 2022.
With the growth of AR solutions, Meta is exploring ways to evolve user input to match the emergent hardware. Meta is investing in neural biofeedback technology to transform the AR glasses industry.
Defining Electromyography or EMG in the Context of AR
Electromyography (EMG) is a hardware technology that uses sensors to detect and record electrical activity from the muscles and convert it into input information for AR wearables.
EMG has been used as a medical technique for many years to detect anomalies in human or animal movement. Healthcare professionals such as neurologists, physiotherapists, and biomedical engineers typically employ EMG technology to diagnose and treat medical conditions.
AR- and EMG-ready immersive devices enable a form of human-computer interaction (HCI) that removes the need for traditional input methods like a mouse or keyboard. The EMG device user doesn’t need to move their wrists, hands, or other limbs.

Instead, users only need to intend a movement, which sends an electrical signal to the limb. EMG-ready devices detect this signal and transform it into computer input for navigation and selection.
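To make this concrete, the general signal chain behind wearable EMG input can be sketched as: rectify the raw bipolar signal, smooth it into an activation envelope, and threshold that envelope into an on/off input event. The function names, window size, and threshold below are illustrative assumptions for demonstration, not details of Meta's implementation:

```python
# Illustrative EMG-to-input pipeline: rectify the raw signal, smooth it
# into an envelope, and threshold it into a discrete activation event.
# All numeric values here are made-up examples, not Meta's parameters.

def moving_average(signal, window=5):
    """Smooth a rectified signal with a simple moving average."""
    out = []
    for i in range(len(signal)):
        chunk = signal[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def detect_activation(raw_emg, threshold=0.5):
    """Return True for samples where estimated muscle activity exceeds the threshold."""
    rectified = [abs(x) for x in raw_emg]   # EMG swings positive and negative; rectify first
    envelope = moving_average(rectified)     # estimate the activation level over time
    return [level > threshold for level in envelope]

# Example: a quiet baseline followed by a burst of muscle activity
samples = [0.02, -0.03, 0.01, 0.9, -1.1, 1.0, 0.95, -0.04, 0.03]
events = detect_activation(samples)
```

A production system would run a trained model over multiple sensor channels rather than a single threshold, but the rectify-envelope-decide structure is the common starting point.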
There are many applications of EMG in AR and VR. For example, ongoing research proposes that people with cerebral palsy may be able to use an EMG-enabled interface to recreate neuromuscular control and coordinate movements in the virtual world.
Meta’s research also attempts to leverage EMG to make user movements in the virtual world more seamless. The idea is that you’ll wear a pair of AR glasses and an EMG wristband.
What Is Meta Doing with EMG?
Meta’s EMG journey started in 2019 when it acquired CTRL Labs for an undisclosed amount. Estimates put the purchase price for the Menlo Park-based firm somewhere between $500 million and $1 billion.
At that time, CTRL Labs was among the few companies approaching EMG from a consumer perspective, the other being Thalmic Labs. Thalmic Labs has since rebranded as North, and Google acquired the firm in June 2020.
When Meta acquired CTRL Labs, the firm took over the company’s critical IP in well-trained EMG models. CTRL Labs had already created custom virtual keyboards that would adapt to a user’s unique typing patterns, quirks, and speed based on electrical activity from the muscles.
Moreover, Meta acquired haptics startup Lofelt in September 2022. The purchase, which came amidst an FTC investigation into Meta’s practices, enables the Menlo Park-based firm to design advanced touch-and-feel feedback systems to coincide with its HCI and EMG roadmaps.
Meta R&D Round-Up
This work formed the foundation of what Meta is doing with EMG under its Reality Labs division for immersive hardware and software research and development (R&D). Here is a quick roundup:
Facebook started work on an input device for AR glasses in 2015. It decided that a wrist-based wearable would be the most ergonomic and efficient solution.
Before acquiring EMG capabilities through CTRL Labs, it had explored other techniques, such as contextualised AI, but these would have required several more years of effort.
Meta’s initial versions of EMG will include a “click” gesture. The click is a pinch-and-release action, which will be Meta’s AR equivalent of clicking a mouse button.
Future versions of Meta’s EMG will have more advanced controls. For instance, you’ll be able to touch and move virtual objects, like dragging and dropping with a mouse.
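Meta has not published how its click gesture is detected, but a pinch-and-release action naturally maps to a tiny two-state machine over a stream of pinch-activation flags: arm on pinch, fire on release. The `PinchClick` class below is a hypothetical sketch to illustrate that idea, not Meta's API:

```python
# Hypothetical sketch: modelling a pinch-and-release "click" as a
# two-state machine fed one pinch-state flag per frame.

class PinchClick:
    def __init__(self):
        self.pinched = False  # are we currently mid-pinch?

    def update(self, pinch_active):
        """Feed one frame of pinch state; return True when a click completes."""
        clicked = self.pinched and not pinch_active  # release after a pinch = click
        self.pinched = pinch_active
        return clicked

detector = PinchClick()
frames = [False, True, True, False, False]  # idle, pinch, hold, release, idle
clicks = [detector.update(f) for f in frames]
```

Firing on release rather than on the initial pinch mirrors how mouse clicks work, and gives the user a chance to cancel by holding the pinch, which is one plausible reason to describe the gesture as pinch-and-release.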
EMG will work together with Meta’s underlying AI. Let’s say that you want to exercise in a virtual world. The AI will surface personalised playlists, and you can choose a playlist through intent.
When you provide input via EMG, you should be able to get haptic feedback. Meta is working on several prototypes to study wristband haptics, including Bellowband and Tasbi.
Reality Labs Reshuffle
Despite Reality Labs’ consistent innovation, the subdivision is costing Meta a lot of money. The Menlo Park-based firm significantly reduced support for its immersive R&D lab in July 2022 following a major internal reshuffle.
Meta’s reshuffle led to the firm cancelling and delaying several internal extended reality (XR) projects, including its XR smartwatch, Orion smart glasses, and Project Nazare.
However, Meta did not mention EMG R&D during its July reshuffle, putting the project’s future into question.
Key Features of EMG for Meta’s AR Vision
Meta has selected EMG, a relatively new technology for the company, over alternatives such as contextualised AI for several reasons.
Most neurotech is extraordinarily complex and challenging to prepare for consumer-grade commercialisation. In that respect, EMG is more viable. Unlike direct brain-computer interfaces, EMG does not require the insertion of a chip or the need to break the skin barrier.
Users can put EMG wearables on or take them off at will, and the device can gradually “learn” from the user’s habits through prolonged use.
Also, EMG and wrist-based wearables are a perfect match. You can easily fit compute resources, antennas, batteries, and multiple sensors into an EMG-enabled smart-watch-like device.
It helps that CTRL Labs’ exceptional work in this space has placed Meta head and shoulders above the competition. As the Menlo Park-based firm works on projects like the Meta Quest Pro, which is changing immersive input with hand and eye tracking, technology like EMG may emerge as another option for user input.
Challenges and Opportunities on the Road Ahead
A few concerns remain on the road towards mainstream EMG development and deployment for AR. Meta and other companies must adapt these devices to suit a wide range of accessibility needs.
Privacy and security are other challenges, as EMG-enabled devices can read your most personal and private electrical impulses. To that end, FRL Research runs a neuro-ethics program to identify and address these issues early on.
Thanks to its recent EMG advancements, Meta has positioned itself to gain from this opportunity as it innovates and explores new input methodologies.