A generic noninvasive neuromotor interface for human-computer interaction

In February 2024, the CTRL org within Meta Reality Labs posted a paper providing a broad overview of much of the research the org has done over the past several years.

My role in the org was broadly to develop software and infrastructure for real-time data collection, prompted EMG studies, model training, and analysis.


EMG Hands in VR (Facebook Connect 7)

Facebook Connect was today, and it offered a glimpse of one of the projects I’ve contributed to this year.

Most of this work remains confidential, but you can see one shot of EMG neural interface devices embedded in VR alongside CV-tracked hands (starring my hand 😛, at 19:56) highlighted during Mark’s intro.

This segment showed EMG data from our team’s wristbands visualized as lines “flowing” through the wrist in VR.

The project was highlighted again in Michael Abrash’s segment, alongside some of my colleagues’ work on interface design and typing.

That segment includes a demonstration of “one-bit” pinch-and-release models built on EMG data, integrated with virtual reality object interactions; in this case, a chess board.

Additionally, it showed pose-based control of virtual objects at a distance.
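To give a rough feel for what “one-bit” means here: the model reduces a continuous EMG signal to a single binary pinch/release state. The actual models are learned from data; the sketch below is a purely hypothetical hand-tuned version using an amplitude envelope and hysteresis thresholding, with all function names and thresholds invented for illustration.

```python
# Hypothetical sketch of a "one-bit" pinch detector: turn a continuous
# EMG signal into a binary pinch/release state. Thresholds and the
# envelope method are illustrative only; the real models are ML-based.

def moving_average_envelope(signal, window=16):
    """Smooth a rectified EMG signal into an amplitude envelope."""
    out, acc = [], 0.0
    for i, x in enumerate(signal):
        acc += abs(x)
        if i >= window:
            acc -= abs(signal[i - window])
        out.append(acc / min(i + 1, window))
    return out

def one_bit_pinch(envelope, on_threshold=0.6, off_threshold=0.3):
    """Hysteresis thresholding: emit 1 while pinched, 0 otherwise.

    Two thresholds prevent flickering when the envelope hovers
    near a single cutoff value.
    """
    pinched = False
    states = []
    for v in envelope:
        if not pinched and v > on_threshold:
            pinched = True      # pinch onset
        elif pinched and v < off_threshold:
            pinched = False     # release
        states.append(1 if pinched else 0)
    return states
```

A downstream VR interaction layer could then treat the 0→1 transition as "grab" and 1→0 as "drop", e.g. to pick up and place a chess piece.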

Software engineering was always intended to be a stepping stone for me to get into hand-modeling, so I’m excited to have made the leap into the next phase of my career 😉


Observability At Scale

Observability at Scale: Building Uber’s Alerting Ecosystem

November 20, 2018

Uber’s Observability team built a robust, scalable metrics and alerting pipeline to detect, mitigate, and notify engineers of issues as they occur.

Check out the link above for a great engineering blog post written by my former colleague, Shreyas Srivatsan, covering many of the systems and tools I worked on while on the Uber Observability team in New York.

I was one of the lead engineers working on uMonitor, and contributed to integrations with related services like M3, Blackbox, and PagerDuty.
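The core pattern behind this kind of alerting pipeline is simple: periodically evaluate alert rules against metric time series and fire a notification when a condition holds. The sketch below is a generic, minimal version of that idea; the rule fields and names are hypothetical and do not reflect uMonitor’s actual design (see the linked post for that).

```python
# Generic sketch of threshold-based metric alerting. Not uMonitor's
# implementation; AlertRule and its fields are invented for illustration.

from dataclasses import dataclass

@dataclass
class AlertRule:
    name: str
    threshold: float
    sustain_points: int  # consecutive datapoints above threshold to fire

def should_fire(rule: AlertRule, datapoints: list) -> bool:
    """Fire only if the threshold is breached for `sustain_points`
    consecutive datapoints, which filters out transient spikes."""
    streak = 0
    for value in datapoints:
        streak = streak + 1 if value > rule.threshold else 0
        if streak >= rule.sustain_points:
            return True
    return False

rule = AlertRule(name="high_error_rate", threshold=0.05, sustain_points=3)
```

Requiring several consecutive breaches before firing is a common way to trade a little detection latency for far fewer false pages.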
