I really wish it were easier for artists and small operations to get hold of these. Ultraleap acquired Leap Motion a while back[1] and made it against their development TOS[2] to give public demonstrations, which killed most of the energy I saw in the interactive arts space.
A year ago, I looked into using Ultrahaptics for interactive exhibits, and the yearly licensing they quoted was really high for a single exhibit, so my company ended up dropping that pitch.
I understand they want to keep their tech close to the chest, fend off competitors, and focus on large distribution deals, but it feels like they're holding it back from the people who might build the most interesting examples.
I got burned by Leap Motion; I won't touch a thing they create.
I think my most fundamental complaint is that when Leap Motion launched, instead of being prescriptive about how it should work, they deferred everything to app-specific drivers, when they could have at least shipped a reference generic driver with consistent behavior across apps that could then be modified.
The device was a hog on system resources and only worked in super niche areas, and I think I'm mostly disappointed they didn't go another route with a more generic input device. I've got no indication this device takes a different approach, but tbh I don't have the energy to do the work to check.
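To illustrate the kind of generic driver I mean: something that just maps hand pose to ordinary OS input events, so every app gets consistent behavior for free. A rough sketch using the old Leap SDK v2 Python bindings plus pynput; the scale factors and screen offsets are made up, and the old bindings are Python 2 era, so treat this as pseudocode for the concept:

    import sys
    import Leap
    from pynput.mouse import Controller as Mouse

    mouse = Mouse()

    class CursorListener(Leap.Listener):
        def on_frame(self, controller):
            frame = controller.frame()
            if not frame.hands.is_empty:
                pos = frame.hands[0].palm_position  # millimeters above the device
                # Naive mapping from Leap space to screen coordinates.
                mouse.position = (960 + pos.x * 5, 1080 - pos.y * 5)

    listener = CursorListener()
    controller = Leap.Controller()
    controller.add_listener(listener)
    sys.stdin.readline()                 # keep the process alive
    controller.remove_listener(listener)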
I actually reverse engineered most of the driver code for the Leap Motion for my undergraduate project. Turns out it violated their TOS, so my professor had me quit the project and take it offline.
The biggest problem I ran into back then was a packet with a certain signature that had to be transmitted to turn on the cameras, which I could not reverse for the life of me (it had something to do with the app version).
Other than that: basically two IR cameras that stream in an interleaved pixel format, (2 x 640) x 480 iirc. It was a fun device to use and hack, but it was the first time I ran into the brick wall that is a closed format.
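For what it's worth, here's a toy de-interleave of one raw frame in Python/numpy, assuming the left/right pixels alternate across each row (dimensions are from memory, so treat them as guesses):

    import numpy as np

    # One raw frame dumped from the device: 480 rows, 2 x 640 interleaved columns.
    raw = np.fromfile("frame.bin", dtype=np.uint8).reshape(480, 2 * 640)

    # Even columns from one camera, odd columns from the other.
    left, right = raw[:, 0::2], raw[:, 1::2]   # each is 480 x 640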
> LeapUVC gives you access to the Leap Motion Controller image data through the industry standard UVC (Universal Video Class) interface. This gives you low level controls such as LED brightness, gamma, exposure, gain, resolution, and more.
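Since LeapUVC makes the device a standard UVC camera, plain OpenCV should be able to talk to it. A hedged sketch, where the device index and the exposure/gain values are assumptions (UVC property support also varies by OpenCV backend):

    import cv2

    cap = cv2.VideoCapture(0)                   # whichever index the Leap enumerates as
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 2 * 640)  # interleaved stereo pair
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)
    cap.set(cv2.CAP_PROP_EXPOSURE, -6)          # UVC exposure/gain knobs
    cap.set(cv2.CAP_PROP_GAIN, 16)

    ok, frame = cap.read()
    if ok:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        left, right = gray[:, 0::2], gray[:, 1::2]  # de-interleave as above
    cap.release()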
Interestingly enough, there is a device from Intel called RealSense that looks very much like a Leap Motion, the D435, sold for $275, which purports to add a finger tracking interface to any current UI with a simple driver install and no changes to the UI code. That would appear to be a step in the right direction.
Furthermore, the RealSense SDK is apparently open source, as opposed to what Ultraleap has, which seems rather closed and/or expensive.
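The open-source SDK (librealsense) does make it easy to get started, though note that the SDK itself hands you raw depth; the finger tracking mentioned above would be a layer on top. A minimal sketch with the Python wrapper (pip install pyrealsense2):

    import pyrealsense2 as rs

    # Stream 640x480 depth at 30 fps from a RealSense camera.
    pipeline = rs.pipeline()
    config = rs.config()
    config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
    pipeline.start(config)
    try:
        frames = pipeline.wait_for_frames()
        depth = frames.get_depth_frame()
        print(depth.get_distance(320, 240))  # distance in meters at the center pixel
    finally:
        pipeline.stop()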
I had to scroll down the page to figure out whether Ultraleap is a product, a company, or some kind of open source project, and whether it has anything to do with the Leap Motion Controller. For those as confused as I was: Ultrahaptics bought Leap Motion and renamed the combined company Ultraleap.
Man, I've been harping on haptic holograms being the next UI mode since 2014, and this takes us half-way there. Congrats on getting some products to market in this space! I feel like there are a couple more years of VR stagnation ahead, but by 2023-25, when 3nm SoCs are pretty common and affordable, I think it will all hit at once and haptics will be a huge mode of interaction.
Incredibly low power draw at the low end is what allows for more mobility and diverse sources of energy. If you could wear meaningful compute powered by body kinetic energy or some flexible solar thing, I think that unlocks computing happening in many places, with many physical and virtual things working in concert. An ESP32 in BLE sleep can sip tiny amounts of energy, but it can't do any meaningful compute and has a low top-end speed.
I want a chip that can burst to 5 GHz and rest at 0.05 mA. That would let a wearable generate or stream a massive bitrate. Then you could do sick-ass offline-first edge compute and drive one of these haptics devices, projectors, VR, whatever, with a tiny machine or local cluster. Wearable k3s, if you will. Imagine driving any display from your person, with local and remote indexes and objects of everything you want or need, interfaces to people and things, replay of everything, as offline as you want or need.
If it were fast enough, and didn't require a whole backpack to drive and power it, then I think a computer like that would feel like a superpower.
Hang on, is that actually true though? I was under the impression that shrinking process nodes represented a trade-off in terms of power consumption on the low end. There must be reasons why nobody makes a 14nm microcontroller, right?
Shrinking the process node means that you get less power consumption per transistor flip, but it can also increase the amount of static leakage current, which hurts designs that aim for current budgets under, say, 100 µA.
I could be wrong, and I think that leakage can be mitigated by the lower operating voltages on smaller process nodes, but I don't believe it is as simple as "smaller process = more power efficient". If you're talking about GHz-scale application processors it holds true, but getting that sort of chip to idle at 0.05mA might be hard.
RAM can also consume a lot of power, if you have gigabytes of it. So until we have cheap high-density NVRAM, you might need a sort of 'hibernate' mode to get really low power consumption. And if you did that, you'd need to burn a bunch of energy to wake up and go back to sleep...busy, busy, busy.
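Some back-of-the-envelope math on that burst-then-sleep model (every figure here is hypothetical) shows how the bursts and wake-ups, not the sleep floor, end up setting the average:

    # Duty-cycled average current; all numbers are made-up examples.
    sleep_mA    = 0.05   # the idle target from the comment above
    active_mA   = 300.0  # current during a compute burst
    duty        = 0.01   # active 1% of the time
    wake_mJ     = 5.0    # energy to wake up and go back to sleep
    wakes_per_s = 0.1    # one wake every ten seconds
    volts       = 3.3

    avg_mA = (sleep_mA * (1 - duty)
              + active_mA * duty
              + wakes_per_s * wake_mJ / volts)  # mW / V = mA
    print(f"average draw: {avg_mA:.2f} mA")     # ~3.2 mA, dominated by the bursts

Even with a 0.05 mA floor, once you actually use the chip the sleep current stops being the thing that matters.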
I really am out of my depth speaking to the specifics of 3nm or how the hell physics even works at that scale. I can just assert that mass adoption of ubiquitous and tactile computing depends on speed, locality, and energy consumption.
The chief reason no one makes a 14nm MCU is that the cost of designing on leading edge nodes is staggering and MCUs are a small market that doesn't make that much money. Just as importantly, the MCU often tends to have a lot of analog and RF content on it, which actually gets worse as the nodes shrink.
I'm curious if the people excited about this have tried it. The sensations are IMO not compelling. They definitely don't transform Minority Report-style air gestures into a viable mode of interaction. I think the cooler trick that ultrasonic phased arrays can perform is levitating 3D patterns of tiny styrofoam spheres.
For those wanting to experiment with hand controlled UI, an Oculus Quest is probably the best entry point. Development is very accessible; you can access the hand tracking in the browser from JavaScript and go to town. And if you need more than hand tracking can provide, you get controllers too.
I used to work there. You're right: the sensations are too weak and indistinct to be useful. People who haven't tried it probably think it's like touching an object. It isn't. It makes a patch of your skin lightly vibrate, which is a very different sensation, and it only really works on your palm.
It's a really, really cool tech demo, but it kind of blows my mind that it went further than that. The device is always going to be super expensive, and it just isn't anywhere near as good as actual buttons. Buttons work really well!
There are some situations where you can't use buttons, like VR. But they're reaaaally niche. I can't see Ultraleap surviving.
I got the chance to try something like this a few years ago at GDC. It was barely even detectable.
But I still hope that the technology gets researched and built out as far as it can go, because _something_ is necessary for virtual interfaces to work.
I have similar feelings about the technology. As much as I'd love to see someone apply ultrasonic haptics to an interesting use case, I don't know if useful haptic feedback can occur within the limits of safe ultrasonic SPLs.
Even with relatively large arrays, the prototypes I've seen are so subtle that you almost have to be expecting the sensation to get anything out of it. It's not on the same level as the haptic feedback of an Apple trackpad.
I've been super excited about this space for a long time. Novel interaction ideas in the digital age seem to have a hard time breaking through (the mouse and keyboard really was the killer application).
I'm wondering if a matrix of wires that could run very small currents through gloves onto specific sections and points of the hands could work? (Please be gentle, I haven't really researched this, it's just a fun thought :) )
Mouse and keyboard killed it because they are physical objects. A touch screen or even something like this will always be a poor interface because they aren’t physically manipulable. Honestly I think we’re more likely to see a major advance from the Teledildonics people since they are actually working with real hardware and incorporating feedback mechanisms.
Yes, I owned multiple Leap Motion devices before their acquisition and have tried multiple demonstrations of ultrasonic phased arrays at SIGGRAPH and GDC, including Ultraleap's specifically. It's true that my experience is somewhat limited, since I haven't owned an ultrasonic phased array to play with for longer periods of time.
I'm currently toying with making an "air" MIDI controller that uses ultrasonic distance sensors (HC-SR04).
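In case anyone wants to try the same thing, here's roughly the idea on a Raspberry Pi, with one sensor mapped to a MIDI CC via RPi.GPIO and mido; the pin numbers and the 50 cm playing zone are just my choices:

    import time
    import mido
    import RPi.GPIO as GPIO

    TRIG, ECHO = 23, 24               # BCM pin numbers, pick your own
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(TRIG, GPIO.OUT)
    GPIO.setup(ECHO, GPIO.IN)
    port = mido.open_output()         # first available MIDI output

    def read_cm():
        GPIO.output(TRIG, True)       # 10 microsecond trigger pulse
        time.sleep(0.00001)
        GPIO.output(TRIG, False)
        start = end = time.time()
        while GPIO.input(ECHO) == 0:  # wait for the echo pulse to start...
            start = time.time()
        while GPIO.input(ECHO) == 1:  # ...and to end
            end = time.time()
        return (end - start) * 34300 / 2   # speed of sound, there and back

    while True:
        cm = min(read_cm(), 50.0)          # clamp to a 50 cm playing zone
        cc = int(127 * (1 - cm / 50.0))    # closer hand -> higher CC value
        port.send(mido.Message('control_change', control=1, value=cc))
        time.sleep(0.05)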
Those sensors work at around 40 kHz, well beyond normal human hearing (the human range is 20 Hz to 20 kHz and narrows with age), and I certainly don't hear anything. Yet my children, who are between 10 and 15, swear they can hear them (and I have at least verified they can tell by ear when the sensors are on or off).
I wonder how an array of 256 of these would sound to them? (From the product image: 16x16=256).
Yes, it's impossible that they're hearing anything at 40 kHz itself. Either the generated sound isn't really at 40 kHz, or something is vibrating or resonating and producing something they can hear.
But the question about running 256 of those simultaneously remains.
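One plausible mechanism with many transducers: any two that are slightly detuned produce an amplitude envelope at the difference frequency, and any nonlinearity (driver clipping, the air at high SPL, even the ear) can demodulate that into an audible tone. A toy model in Python, with made-up detuning values:

    import numpy as np

    fs = 400_000                       # sample rate well above the carriers
    t = np.arange(0, 0.01, 1 / fs)
    f1, f2 = 40_000, 40_500            # two transducers, hypothetically 500 Hz apart
    s = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

    demod = s ** 2                     # squaring is the simplest nonlinearity model
    spectrum = np.abs(np.fft.rfft(demod))
    freqs = np.fft.rfftfreq(len(demod), 1 / fs)

    band = (freqs > 20) & (freqs < 20_000)        # the audible range
    peak = freqs[band][np.argmax(spectrum[band])]
    print(f"strongest audible component: {peak:.0f} Hz")   # ~500 Hz

With 256 elements that are deliberately amplitude-modulated (which is how these haptic arrays create the tactile sensation), there's plenty of opportunity for audible byproducts, so young ears hearing something wouldn't surprise me.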
I have one of these sitting in my closet, the pre-acquisition development module I got off ebay for a few students to work with. The sensation is very strong, but the novelty wore off quickly.
I tried putting my finger into a humidifier's ultrasound port once, exposing it to high levels of vibration. After a second or so, it caused a sharp pain in my finger. I wonder about the safety of this technology at various power levels, to skin as well as ears.
I'm still disappointed in the Leap Motion, mostly because of the drivers. They are so locked down that it's basically impossible to play with the hardware without going through their SDK, which, last time I checked, did not work on ARM devices, and I never had any luck getting it to work on Ubuntu.
I always thought the thing we’d need to simulate in a simulation is physical touch. I don’t know of any chemicals that can set off a pressure sensation. So if we are in a simulation, there’s got to be some kind of tech that creates physical touch sensation as we interact with the virtual world.
I am looking forward to the day when car manufacturers shove this stuff into cars to add tactile feedback to touch screens and solve the pressing problem of getting rid of physical controls.
The only mainstream UX for haptics I see making sense is direct neural input of some kind. Ultrasonic approaches are cool, but probably limited to art installations, museums, and the like.
I feel like this is a dead end. Every year, haptic force-feedback gloves are getting better and better. I don't think tactile feedback without force feedback is the future.
I guess this could be enhanced to also serve as an input device. Combined with microphones reacting to the reflections of the ultrasound, or a high-res theremin, perhaps?
[1]: https://www.theverge.com/2019/5/30/18645604/leap-motion-vr-h...
[2]: https://www.ultraleap.com/licensing/#licensingforhaptics