
I get frustrated seeing this go into the iPad and knowing that we can't get a shell and run our own binaries there. Not even as a VM like [UserLAnd](https://userland.tech). I could effectively travel with one fewer device in my backpack, but instead I have to carry two M chips, two displays, two batteries, and so on...

It's great to see this tech moving forward, but it's frustrating that it doesn't translate into a more significant impact on the way we work, travel, and develop software.



> instead I have to carry two M chips

What's the incentive for Apple to unify them, since you've already given them the money twice?


> Not even as a VM

WWDC is next month. There's still a chance of iPadOS 18 including a Hypervisor API for macOS/Linux VMs on M4 iPads.


I hope for this every single year. I just don’t see it happening. But I hope I am wrong.


2022, https://appleinsider.com/articles/22/10/20/apple-rumored-to-...

> A leaker has claimed that Apple is working on a version of macOS exclusive for the M2 iPad Pro ... the exclusivity to M2 iPad Pro could be a marketing push. If the feature is only available on that iPad, more people would buy it.

Based on the M4 announcement, vMacOS could be exclusive to the 1TB/2TB iPad Pro, whose 16GB of RAM would be helpful for VMs.


At this point, you would have a better chance of running your own apps by relocating to the EU ;)


Yup - I'm honestly tired of the Apple ~~~jail~~~ ecosystem.

I love the lower power usage/high efficiency of ARM chips but the locked down ecosystem is a drag.

Just the other day, I was trying to get GPU acceleration to work within a VM on my M1 Mac. I think it’s working? But compared to native it’s slow.

I think it’s just a misconfig somewhere (i.e., the hypervisor, QEMU, UTM, or maybe the emulation layer in the VM).

On other systems (Intel/AMD + NVIDIA/Radeon) this is more or less a “pass-through”, but on a Mac it’s a different beast.
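FWIW, the usual non-passthrough route on an M-series Mac is QEMU's paravirtualized virtio-gpu device with HVF acceleration, which is roughly what UTM drives under the hood. A sketch of the flags involved (the disk image name is a placeholder, and whether `virtio-gpu-gl-pci` and `gl=es` work depends on how your QEMU build was configured - treat the exact combination as an assumption):

```shell
# Hedged sketch: aarch64 Linux guest on an Apple Silicon host using
# HVF acceleration and paravirtualized (not passed-through) graphics.
# virtio-gpu-gl-pci forwards guest GL calls to the host via virglrenderer,
# so it is expected to be slower than native Metal rendering.
qemu-system-aarch64 \
  -machine virt,accel=hvf \
  -cpu host -smp 4 -m 4096 \
  -device virtio-gpu-gl-pci \
  -display cocoa,gl=es \
  -drive file=disk.qcow2,if=virtio   # disk.qcow2 is a hypothetical image
```

If the guest falls back to llvmpipe (software rendering), that alone would explain the "slow compared to native" symptom.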


GPU passthrough for VMs is not supported on Apple Silicon, period, afaik. There may be some "native" renderer built on top of Metal, but Apple doesn't support SR-IOV or "headless passthrough".

https://chariotsolutions.com/blog/post/apple-silicon-gpus-do...

OTOH no, it is not "more or less [automatic]" on other hardware either; SR-IOV has been on the enthusiast wishlist for a ridiculously long time now because basically nobody implements it (or they restrict it to the most datacenter-y of products).
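You can check whether any of your PCI devices actually advertise SR-IOV, since the Linux kernel exposes it per-device in sysfs (`sriov_totalvfs` = max virtual functions the device supports). A minimal probe - the sysfs root is a parameter so the function is easy to exercise anywhere; on a real box the default path applies:

```shell
# Hedged sketch: scan a sysfs-style PCI device tree for SR-IOV capability.
# On most consumer GPUs this prints nothing, which is exactly the complaint.
list_sriov_devs() {
  root="${1:-/sys/bus/pci/devices}"
  for dev in "$root"/*; do
    # Only SR-IOV-capable physical functions expose sriov_totalvfs
    if [ -e "$dev/sriov_totalvfs" ]; then
      echo "$(basename "$dev"): $(cat "$dev/sriov_totalvfs") VFs"
    fi
  done
}
```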

Intel iGPUs from the HD/UHD Intel Graphics Technology era have a concept called GVT-g, which isn't quite SR-IOV but generally does the thing. Newer Xe-based iGPUs do not support this, nor do the discrete graphics cards.

AMD's iGPUs do not have anything at all, afaik. Their dGPUs don't even implement reset properly, which is becoming a big problem for people trying to set up GPU clouds for AI stuff - a lot of the time the AMD machines need a hard power reset to come back.

NVIDIA GPUs do work properly, and do implement SR-IOV properly... but they only started letting you do passthrough recently, and only 1 VM instance per card (so, 1 real + 1 virtual).

Curious what you're using (I'm guessing an Intel iGPU or NVIDIA dGPU), but generally this is still the kind of thing that gets Wendell from Level1Techs hot and bothered - the mere possibility of this feature showing up in something without a five-figure subscription attached.

https://www.youtube.com/watch?v=tLK_i-TQ3kQ

It does suck that Apple refuses to implement Vulkan support (or sign graphics drivers) - I think that's de facto how people get "hardware accelerated graphics" in VMware or VirtualBox - but SR-IOV is actually quite a rare feature, and "passthrough" is not sufficient here since the outer machine still needs to use the GPU as well. The key feature is SR-IOV, not just passthrough.


UTM can be built for iOS.


Hypervisor.framework is not exposed without a jailbreak which makes this quite limited in terms of usability and functionality.


Best you can hope for is CPU passthrough. Good luck using the rest of the chip.


I think the play is "consumer AI". Would you really write code on an iPad? And if you do, do you use an external keyboard?


Tablets are the perfect form factor for coding because you can easily mount them in an ergonomic position like this: https://mgsloan.com/posts/comfortable-airplane-computing/

Most laptops have terrible keyboards so I'd be using an external one either way.


Those keyboards are absolutely ridiculous, sorry.


Yes. If I’m plugging it into a Thunderbolt dock, I’d expect it to work like a MacBook Air.



