When we interact with computers today, we move the mouse, scroll the trackpad and tap the screen, but there is so much that the machines don’t pick up on. What about where we’re looking, the subtle gestures we make and what we’re thinking?
Asteroid is looking to get developers comfortable with the idea that future interfaces are going to take in much more biosensory data. The team has built a node-based human-machine interface engine for macOS and iOS that allows developers to build interactions that can be imported into Swift applications.
“What’s interesting about emerging human-machine interface tech is the hope that the user may be able to ‘upload’ as much as they can ‘download’ today,” Asteroid founder Saku Panditharatne wrote in a Medium post.
To bring attention to their development environment, they’ve launched a crowdfunding campaign that gives a decent snapshot of the depth of experiences that today’s commercially available biosensors can enable. Asteroid definitely doesn’t want to be a hardware startup; their campaign is largely serving as a way to expose developers to what tools could be in their interaction design arsenal.
There are dev kits and then there are dev kits, and this is a dev kit. Developers jumping on board for the total package get a collection of open hardware: gear and cases to build out hacked-together interface solutions. The $450 kit brings capabilities like eye-tracking, brain-computer interface electrodes and some gear to piece together a motion controller. Backers can also just buy the $200 eye-tracking kit alone. It’s all very utility-minded and clearly not designed to make Asteroid those big hardware bucks.
“The long-term goal is to support as much AR hardware as we can, we just made our own kit because I don’t think there is that much good stuff out there outside of labs,” Panditharatne told TechCrunch.
The crazy hardware seems to be a bit of a labor of love for the time being. While a couple of AR/VR devices have eye-tracking baked in, the tech is still a generation away from most consumer VR devices, and you’re certainly not going to find much hardware with brain-computer interface systems built in. The startup says their engine will do plenty with just a smartphone camera and a microphone, but the broader sell with the dev kit is that you’re not building for a specific piece of hardware; you’re experimenting on the bet that interfaces are going to grow more closely intertwined with how we process the world as humans.
Panditharatne founded the company after stints at Oculus and Andreessen Horowitz where she spent a lot of time focusing on the future of AR and VR. Panditharatne tells us that Asteroid has raised over $2 million in funding already but that they’re not detailing the sources of that cash quite yet.
The company is looking to raise $20,000 through its Indiegogo campaign, but the software platform is the clear sell here, exposing people to the company’s human-machine interaction engine. Asteroid is taking sign-ups to join the waiting list for the product on its site.