Since the iPhone 4 I have been saying Apple should take over the office by replacing all but power users' desktops with an iPhone that, when at the desk, runs a Microsoft Office/Exchange-compatible environment (web, email, word processing, presentations, spreadsheets, etc.) connected to a Bluetooth keyboard and a Wi-Fi display. Documents and state would autosync whenever the phone is within proximity, and it would integrate seamlessly with the existing office telephony.
Anyway, this is the best thing I have seen in a while, but the demo video would have been more impressive with display sharing (with the VM fullscreen) and a Bluetooth keyboard and mouse, because it would visually prove that the pocket light workstation is viable.
Part of my PhD dissertation was process migration for the desktop. I argued that Microsoft or Apple should make low-powered portable devices (say, phones/tablets) that were your computer, but could also be docked to more powerful machines. When docked, processes migrate off and run on the powerful hardware; when you want to undock, they migrate back.
When docked, the running GUI apps switch to a keyboard/mouse-optimized mode, and when undocked, they switch to a touch-optimized mode.
I thought Microsoft might have been heading in this direction with Windows 8 and the RT-style apps, but instead of seeing it through they got gun-shy and went back to a more traditional style.
I did this back in 2013, I think. I had a desktop and a laptop that both supported IOMMU device passthrough. I used the Xen hypervisor at the time, and had a VM that I could live-migrate between the systems and then hotplug all the PCI devices into.
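Roughly, the dance looked like this (a from-memory sketch using Xen's xl toolstack; the domain name, target host, and PCI addresses are placeholders, and the exact commands varied a bit by Xen version):

    # Detach passthrough devices, live-migrate the domain, re-attach on the target.
    # Domain name, host, and PCI BDF addresses are placeholders.
    import subprocess

    DOMAIN = "workstation"
    TARGET_HOST = "desktop.local"
    PCI_DEVICES = ["0000:01:00.0"]  # e.g. the GPU on the source machine

    def run(*args):
        print("+", " ".join(args))
        subprocess.run(args, check=True)

    # 1. Hot-unplug the passthrough PCI devices; they can't follow a migration.
    for bdf in PCI_DEVICES:
        run("xl", "pci-detach", DOMAIN, bdf)

    # 2. Live-migrate the VM's memory and state to the other machine (over SSH).
    run("xl", "migrate", DOMAIN, TARGET_HOST)

    # 3. On the target, hot-plug that machine's own devices back into the VM,
    #    e.g.: xl pci-attach workstation 0000:02:00.0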
I never got it past the "playing around with it" stage, but I was just _so_ sure that the market was going to move that way... Imagine your phone and laptop just migrating out to the cloud every time you lock it, then live-migrating back in when you wake it up... Or at least between your tiny laptop and your beefy desktop. Because nobody would have just one computer!
It sure sounds cool to me, but the reality is that it's too clunky. Virtualization and live migration are absurd solutions to consumer-level problems.
The Surface Book was a step in this direction. The base of the computer had a discrete Nvidia GPU, so you could run heavy GPU loads from the tablet section of the computer while it was connected to the base. If the base was removed, only the integrated Intel GPU was available.
Samsung DeX feels like a POC. Sure, it works, and I used it for about a week while my laptop was being repaired, but it had lots of paper cuts; most applications felt out of place, even some of Samsung's own.
I've used 3 different wireless VR implementations, and all of them require absurd amounts of bandwidth (50-130 Mbps) and still send a highly compressed image. How bad it looks depends on the display you're using; on the Quest's OLED display, the image ended up looking particularly terrible.
Even with proper display stream compression, I doubt you could build a network robust enough to host more than 2 or 3 wireless monitors.
VR requires something like 90 fps at resolutions above 1080p, right?
Office work is fine at 30 fps (and can realistically drop lower). I also suspect office workloads compress more easily: think of a text editor, with lots of whitespace and only a couple of characters changing from frame to frame.
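A toy illustration of that intuition (the frame and glyph sizes below are made-up, just to show the ratio):

    # How little of a 1080p text-editor frame changes per keystroke.
    # Frame/glyph sizes are made-up; the point is the ratio.
    frame_w, frame_h = 1920, 1080
    glyph_w, glyph_h = 10, 20      # one character cell redrawn
    cursor_w, cursor_h = 2, 20     # plus the cursor moving

    total_pixels = frame_w * frame_h
    changed_pixels = glyph_w * glyph_h + cursor_w * cursor_h

    print(f"{changed_pixels} of {total_pixels} pixels changed "
          f"({100 * changed_pixels / total_pixels:.4f}%)")
    # -> roughly 0.01% of the frame, which is why this kind of content
    #    compresses far better than video or 3D rendering.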
And peer-to-peer rather than through the router would probably be the way to go. Since 5GHz drops off pretty quickly, it might be possible to exploit this if people aren't packed in too tightly. Which gives a nice, hard technological backing to requests not to be crammed in too tightly.
Even on 5 GHz, your office router would quickly run out of bandwidth. A compressed 1080p stream is something like 40 Mbps, so even if nobody in the office is using a HiDPI display, you'd connect maybe 5 or 6 displays before hitting serious local congestion. You might be able to resolve it over Ethernet, but if 'wireless' is the name of the game, then I'd suggest looking elsewhere.
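The back-of-the-envelope math, using the ~40 Mbps per-stream figure above and assuming a single AP realistically delivers around 250 Mbps of usable aggregate throughput (that number is a rough guess):

    # Display budget for one office AP. The AP throughput figure is a rough guess.
    stream_mbps = 40          # one compressed 1080p display stream
    ap_usable_mbps = 250      # assumed usable aggregate throughput of the AP

    print(ap_usable_mbps // stream_mbps, "x 1080p displays before saturation")   # 6

    hidpi_mbps = 4 * stream_mbps   # a 4K/HiDPI stream is ~4x the pixels
    print(ap_usable_mbps // hidpi_mbps, "x HiDPI displays before saturation")    # 1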
But it is a fundamentally different type of problem. Going through a router imposes hard bandwidth limitations at a central point.
With direct wifi, each monitor only needs to serve a single user, so the devices can be lower powered -- the broadcast only needs to cover ~a single desk. And they are centrally controlled by the IT department, so unlike a wifi router in the RF mad-max anarchy zone that is a modern apartment building, there shouldn't be concerns about noisy neighbors.
I'm not sure there's a point to all this -- the phone being used as a computer is probably going to be plugged in, so why not push everything through USB-C/thunderbolt nowadays? -- but it is almost certainly possible.
Sharing limited spectrum and AP resources in a room versus sharing only limited spectrum resources in a room are not fundamentally different problems. In addition, you'd have loads of noisy neighbors: at least everyone close by in the room.
I agree the phone ought to be plugged in anyway. Even if it could survive a full day of being your workstation, you wouldn't want a half-dead (or worse) battery at the end of your work day, and regularly draining it heavily while stationary needlessly shortens the battery's total lifespan. In addition, if it were docked into something with a bit of active cooling, the phone could perform better and, again, preserve the battery.
I think they are fundamentally different. In the first case, everyone must broadcast with enough power to reach the AP, which introduces a bottleneck.
For the second case, everyone only needs to broadcast with enough power to reach their device. Overlapping coverage areas are essentially a side effect.
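To put a rough number on it (the distances are made-up examples, and free-space path loss ignores walls, bodies, and reflections):

    # Free-space path loss: FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44
    # Distances are made-up examples; real offices add walls, bodies, reflections.
    from math import log10

    def fspl_db(distance_m, freq_mhz=5800):
        return 20 * log10(distance_m / 1000) + 20 * log10(freq_mhz) + 32.44

    desk = fspl_db(2)    # phone -> display on the same desk
    ap = fspl_db(20)     # phone -> AP across the office

    print(f"2 m: {desk:.1f} dB, 20 m: {ap:.1f} dB, "
          f"difference: {ap - desk:.1f} dB (~{10 ** ((ap - desk) / 10):.0f}x the power)")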
I was thinking more along the lines of 1024x768. Non-power users. And if it were an Apple environment, the display would be a computer, and the connection would be VNC or NX.
My laptop now connects to my monitor with a single USB-C cable, providing power, driving the 4K monitor and connecting to its built-in USB hub where the other peripherals are connected. It doesn't even have to be wireless because you'd want the phone connected to power anyway.
I feel like any recent iPhone would be brilliant for this, though the RAM is on the short side; with slightly more RAM in the recent Pros, it could be a Pro-only feature.
It's a shame that providing this kind of value seems like such an un-Apple move; it will likely stay that way.
I’ve just given up on my M1 Pro MBP in favour of my M1 iPad Pro with detachable keyboard case.
I’m very excited to see if the rumours about a proper DE for iPadOS when attached to a monitor come true.
I don’t want to carry my laptop around. It’s too big and heavy. I want to carry one device that I can use as a portable device, but also use as a full fat device when I have appropriate peripherals, without having to worry about some kind of Cloud sync instead.
I don’t want to have to wait for state to sync between multiple devices. I want a single device that can do literally everything. The M1 iPad Pro with detachable keyboard (Logitech Folio) and Pencil is dangerously close.
Now if it could run a Linux environment that wasn’t horrifically slow due to Apple’s obscene limitations (see iSH, which is incredible but, unfortunately, slow), that would solve everything.
For now, I depend on Cloud VMs instead, which falls over once my connectivity becomes poor.
You can’t run a computer on an iPhone. If you could, it wouldn’t be iPhone-shaped. The fact that you think you can comes from the incredible amount of work put into making them fast when being used as iPhones.
How is a current iPhone significantly different from an iMac architecturally? Sure, they are currently running different variants of Apple’s operating system, but they share similar processors, similar wireless connectivity, etc.
Edit: a slightly different way to look at it: the iPhone has always been a computer. Since iPhone 8 or so, it’s also had desktop class performance.
Current iPhones actually use the exact same chips as the MacBook Air series: the base-tier M1 (although some laptops have the more powerful Pro/Ultra).
The crossover between "high-end phone" and "ultra-portable laptop" has been observed for a number of years now; that's really the space where Chromebooks thrived even with inferior ARM reference cores. If you throw an actually good M1 into it, you get an extremely solid ultra-portable. And the bigger laptops are just more of everything: more cores, bigger GPU, more memory.
What? I run desktop Linux on a Pinephone, which is (unfortunately) iPhone-shaped. The actual computer inside these devices is often just a small board about 2/3 the size of a credit card anyway.
The above post mentions web browsing, and I can easily find you common websites that’ll take 1GB per Chrome tab on a desktop sized screen. That doesn’t leave much of a 4GB device for computing.
There are other issues, but that's the main one. The range of workloads you'll enjoy it with is a lot smaller than you'd expect unless you're mainly a terminal user.
Yeah, you can't open many tabs for large pages at once. I'll happily trade that for the computing freedom and for git, xmpp, etc. actually working. iOS automatically kills your background tabs anyway, so it's not like you'd really be using them.
Which xmpp client do you use on the pinephone? How is the support for OMEMO and A/V calls? If there is a Linux phone with great XMPP support there wouldn't be a lot left that keeps me on Android.
Why not? An iPhone 13 has a similar single core score to all of the M1 processors and a similar multi core score to a MacBook Pro (13-inch Mid 2020) with an Intel Core i7-1068NG7 @ 2.3 GHz (4 cores). However, iPhones only have 4GB or 6GB of RAM which may be limiting.
Because the SoC is not the only component of a computer and neither is one benchmark score. Other components are the power system, (lack of) cooling, flash storage…
iPhones have had external display support for a _long_ time. Does it require an adapter? Sure. The USB-C only MacBooks and iPads do too, unless you have a USB-C display. It’s entirely possible to display different content on the internal display and external display. That’s an app specific feature in iOS, but not a hardware limitation.
Have you seen a current iMac motherboard? Nearly all of the I/O is on a breakout board with a thunderbolt controller. Does an iPhone expose PCI-E lanes for interconnect? No, of course not, but there were desktops long before PCI-E and as others have mentioned, the first USB-C MacBook had exactly the same number of external ports as the iPhone.
> iPhones have had external display support for a _long_ time. Does it require an adapter? Sure.
And it’s actually a compressed video output reusing the screen recording hardware. The adapter runs a miniature iOS that decompresses the video again. It’s not raw pixels.
The most interesting bit for me is that iPads currently support virtualization, both in the chipset and the kernel.
All that wasted compute power could be used to run a developer sandbox if Apple wanted to (iSH is great, but it's sandboxed x86 emulation and not quite all there yet for back-end development).
"All that wasted compute power" has been the story of the iPad for the last 5 years. Very powerful hardware that simply isnt used by the software on it.
> If the hardware of the iPad Air 2 demonstrates the overwhelming power of small iterative improvements, then the software represents the failings of that approach. The overall experience of using the iPad Air 2 in 2014 is a case study in missed opportunities and untapped potential. Apple has all but stopped adding tablet-specific features to iOS — the minor two-paned mode for landscape apps on the iPhone 6 Plus is a more significant rethinking of how to manage a larger screen size than anything added to the iPad Air 2 this year.
> Just consider something as simple as browsing the web. On raw benchmarks the iPad Air 2 is comparable to a 2011 MacBook Air — which, again, is crazy — but the MacBook’s version of Safari is vastly more feature-rich and flexible. That MacBook will also allow me to run multiple apps alongside Safari and be far more productive than the iPad; we’re well past the point when Apple needs to figure out proper multitasking on its tablet.
Both of these pretty much make my point. To the first complaint (no iPad-specific improvements): here we are 8 years later, 3 years into enough iPad-specific improvements to rename the OS, and with enough computational resources to make use of them without buying a new device.
To the second (poor multitasking): this has been constantly improved since the days of iOS 8, and again, thanks to the spec of the device, those software improvements are available with no waste or additional purchase needed.
I’m not sure what alternative you would want here - a device half the initial cost that you have to replace 2-3 times between 2014 and now to use the same software updates? Or is the argument just that 2022 software should’ve already existed in 2014?
I'm not sure that other side is a positive if it means we need more power to give the user the same functionality as before.
I imagine that the basic uses haven't really changed since 2014: write to others, video calls, read stuff on the web, watch videos, use web apps where expensive logic is outsourced to a remote server.
I think it might be coming, but of course on Apple's terms. Swift Playgrounds has been a thing for a while, now you can make native iPad apps on an iPad. It's not going to be the hacker's wet dream, but I wouldn't be surprised if a future update makes use of the hardware.
This is exactly why I'm hesitant about going back to the Apple ecosystem. I like the M1 stuff and the recent hardware changes, but the walled garden and the App Store lockdown to protect their cash cow come at the expense of functionality, and I just can't get behind that.
Hoping the next gen of x86 chips finally hit that sweet spot of performance/low power/heat.
> Hoping the next gen of x86 chips finally hit that sweet spot of performance/low power/heat.
Well I'm hoping we're finally getting the shove to move off x86, there's nothing particularly exciting about the architecture itself, and we're finally getting a tangible proof that the compatibility layer has a huge price tag. I'm excited because displacing x86 is also a stepping stone for RISC-V, and might push various vendors to consider making their software more portable - which would be a win for literally everyone.
> [...] the walled garden and the Appstore lockdown to protect their cash cow are coming at the expense of functionality [...]
Emphasis mine - what is the functionality trade-off that you're referring to? The App Store has never stopped me from running any app from outside the App Store.
I would agree that macOS is continuously getting a bit less hackable over time (how do you authorise cron to access the downloads folder...), but this has very clearly everything to do with the half-assed attempts at making the system more secure (a very noble and respectable goal), and nothing to do with restricting user freedom (after you factor in incompetence, carelessness, and ignorance).
OpenBSD has always been making broadly similar moves: securelevel, signing packages, pledge/unveil, removing support for loadable kernel modules, removing unmaintained/insecure subsystems (Linux emulation, Bluetooth). Some of these have been a bit annoying, but reading this as removing user freedom seems like mis-interpreting the intent.
> Well I'm hoping we're finally getting the shove to move off x86, there's nothing particularly exciting about the architecture itself...
I actually largely agree with you here. We need to be transitioning to open ISAs, so getting off x86 isn't something I disagree with. However, jumping from one proprietary ISA to another, slower one doesn't exactly make sense to me. No matter how you frame it, x86 is still the performance and compatibility king. I'd love a RISC-V laptop as much as the next guy, I think this is more about Apple increasing their profit margins by re-using their ISA licensing from their phones/tablets.
> Emphasis mine - what is the functionality trade-off that you're referring to? The App Store has never stopped me from running any app from outside the App Store.
Well, the discussion is the Apple ecosystem, not just Mac, and in that sense the App Store does prevent you from ingesting third-party software or running arbitrary code on your own device.
I also don't think this is about security as much as it's about control. Apple now directly competes with companies like Spotify, Netflix, Hulu, Microsoft, Meta, the list goes on for days. If they control their hardware and software experience end-to-end, then they control the users too.
> ...and nothing to do with restricting user freedom
But how exactly can we prove that? Apple says one thing and does another. They say they're protecting user privacy by hashing your cloud photos with a neural network. They say they've built the fastest GPU and then backpedal when it gets benchmarked. If they give you fewer capabilities with each update and say it's making you safer, I don't think I'd take their word entirely at face value; maybe security is part of the motivation, but it's almost certainly not the entire story.
As for OpenBSD? I actually like their approach for the most part, but there's a reason it's still a fairly niche OS. People like having options, and taking away your Linuxulator and Bluetooth modules doesn't make for a super enticing desktop experience. MacOS suffers the same issue, but more for the development crowd. There are so many footguns in MacOS for developing standard *NIX software that it makes my head spin. Want the standard GNU coreutils? Of course they're not built in, go grab a package manager and download them posthaste! Want to use Git? Here, download 750mb of random utilities onto your system so the 20mb program can execute. Oh, you wanted Bash to be up-to-date? We sorta forgot about that one...
...but their MacOS, iPad and iPhone development tools get fixed just fine. Gotta tend to the breadwinners before you take care of the intrinsic issues in your OS, I guess.
In closing, I don't actually disagree with a lot of your rhetoric, but I don't think you're addressing the parent's statement. Apple is flighty, they do crazy things in the blink of an eye and everyone else is forced to just go along. What happens if I go back to the Apple ecosystem and they remove all the iPhone and Macbook ports? There could be so many "gotchas" that many people simply don't feel safe going back to that lifestyle anymore. There is no roadmap, and that scares people who rely on stable software for a living. It's why I end up developing more on Linux than I do on Mac.
> No matter how you frame it, x86 is still the performance and compatibility king.
I'm not up to date with the latest benchmarks, but last time I checked, the M1 family is near the top in raw performance, and completely destroying everything else in perf per watt (CPU and GPU) - at competitive prices.
The compatibility angle is pretty interesting. I've been using an M1 Mac for about a year, and the only time I notice that I'm not on an x86 system, is when I run Electron apps compiled for x86 only. (I think it shows just how bad of an idea it is to ship JIT kitchen sinks as "native" apps, but it's an entirely different topic.) Surprisingly enough, the ARM build of Windows 11 in Parallels also does a pretty good job of running x86 software, including even some pretty dated games (like Soldat). GPU works really well too - Edge in Parallels beats native Safari on shadertoy.com.
Of course I'd rather see software that is actually portable and built natively, but honestly, I'm just impressed with what these companies have pulled.
> I think this is more about Apple increasing their profit margins by re-using their ISA licensing from their phones/tablets.
I don't mind Apple increasing their profit margins, if other parties benefit as well. Consider the impact on the wider software ecosystem. Software is usually difficult to port to a second platform, because the developer was hardcoding some assumptions. Often porting to a secondary architecture, OS, or platform will lay the groundwork to remove many of these assumptions, and enable porting to other systems. Portability tends to contribute to software quality, and having more software available benefits the users of the secondary platforms. As I said: everyone wins.
> [...] in that sense the App Store does prevent you from ingesting third-party software or running arbitrary code on your own device.
This is technically incorrect. You don't need the App Store to run arbitrary code on your iPad/iPhone - you only need a Mac and Xcode. Which is either better or worse, depending on your POV.
And again - you can now write iPad apps on an iPad. You still need to jump all the hoops to distribute it on the App Store, but if you consider the work it takes to e.g. have it included in Debian repositories, or OpenBSD ports, it's roughly equivalent. "You can bring your toys to my playground, but we still play by my rules."
> They say they're protecting user privacy by hashing your cloud photos with a neural network.
I 100% agree with you on this one. The premise looks innocent: iCloud Photos is opt-in, and Apple is doing Apple by moving their CPU-bound workloads away from clouds and onto user devices. Technically this is fine: nothing is being scanned that wasn't already being scanned server-side. But just having the mechanism at all is scary, it has a lot of potential to bring harm, and it's a very dangerous precedent.
> If they give you less capabilities with each update and say it's making you safer, I don't think I'd take their word entirely at face value [...]
I'm not taking that at face value, but I will blame their "honest" incompetence, carelessness, and ignorance before I will accuse them of malice. They often rewrite stuff for the sake of rewriting it, and break "edge" cases that thousands of people rely on. The cries of the thousands drown in a billion sales, though.
> Want the standard GNU coreutils? Of course they're not built in, go grab a package manager and download them posthaste!
Same on every BSD. macOS is just staying true to its roots ;)
> Oh, you wanted Bash to be up-to-date?
Notice that macOS' Bash is the final GPL-2 release. This is entirely on FSF working overtime to make redistributing their software more annoying. One of the reasons why I'm a big fan of the BSDs, is that they take this no-bullshit approach to redistribution: the license has to answer one simple question, "can I use this?", preferably in a few very short sentences.
Personally, I'd stay away from Bash. If an otherwise POSIXly kosher shell script becomes hairy enough to call for Bash's extensions, I will label it a monster and rewrite it in Python. ZSH and (OpenBSD) KSH both make much nicer interactive shells too.
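For example (an entirely made-up case: "find the newest log per service and count the errors in it" is the kind of job that drags a POSIX script into Bash arrays and process substitution), the Python version stays readable:

    # Made-up example: newest log per service, count its ERROR lines.
    # The /var/log/myapp path is hypothetical.
    from pathlib import Path

    logs = Path("/var/log/myapp")
    for service_dir in sorted(p for p in logs.iterdir() if p.is_dir()):
        files = sorted(service_dir.glob("*.log"),
                       key=lambda p: p.stat().st_mtime, reverse=True)
        if not files:
            continue
        newest = files[0]
        errors = sum("ERROR" in line for line in newest.read_text().splitlines())
        print(f"{service_dir.name}: {errors} errors in {newest.name}")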
> Apple is flighty, they do crazy things in the blink of an eye and everyone else is forced to just go along.
Agree on that. "Our playground, our rules", often taken to an extreme. Which is also why I like voting with my wallet, and keeping an alternative around (OpenBSD on my laptop). If I don't like an Apple toy, I simply don't buy it.
> It's why I end up developing more on Linux than I do on Mac.
Which explains why you're having such a hard time with systems that don't include the non-standard GNU or Linux extensions. I develop and/or test across five different OS+arch combinations, which often catches bugs and hardcoded assumptions. I often wonder if it's worth it, but then remember how monocultures tend to just rot.
>This is technically incorrect. You don't need the App Store to run arbitrary code on your iPad/iPhone - you only need a Mac and Xcode. Which is either better or worse, depending on your POV.
This doesn't work practically, since you need to re-sign the app every week. You also don't get push notifications, since those must go through app-specific infrastructure that talks to Apple's service and is cryptographically verified.
> the only time I notice that I'm not on an x86 system, is when I run Electron apps compiled for x86 only.
Maybe it's just from working in devops at a Mac shop, but getting dev environments working on M1 with deployment parity is nearly impossible. It requires vast redesigns of preexisting architecture and forces a lot of software off the table. It's wholly a YMMV situation, but I've seen things on both sides of the fence; I still reach for x86 when I'm getting work done.
> Portability tends to contribute to software quality [...] everyone wins.
I don't really mind that aspect. What I don't appreciate is Apple raising the walls around their garden. Sure, you can claim it's being done to make people safer, but as someone without much skin in the game, this just makes Apple look worse to me.
> You don't need the App Store to run arbitrary code on your iPad/iPhone - you only need a Mac and Xcode.
I mean, at that point it's no better than a microcontroller.
"Want to run your code? No problem, all you have to do is plug it into the ROM programmer!"
My gripe is that Apple goes out of their way to prevent people from distributing software anywhere that they might not profit from/control. If I buy a computer, I ought to be able to run the software I want. That logic extends to phones and every other device in their lineup, for that matter.
> But just having the mechanism at all is scary, it has a lot of potential to bring harm, and it's a very dangerous precedent.
Sure, I don't think many people will disagree here. The posturing here is what scares me, and it's the sort of sociopathic "we know what's right for you" behavior that turns me right off.
> I will blame their "honest" incompetence, carelessness, and ignorance before I will accuse them of malice.
I would too. One problem though: they seem to have a hard time admitting when they're wrong, and insist that everything they do is deliberate. Remember "you're holding it wrong"?
> Same on every BSD. macOS is just staying true to its roots ;)
I would have preferred an "it just works" solution, but I'm sure that's the answer most Mac users would give me, so I'll take it.
> Personally, I'd stay away from Bash. If an otherwise POSIXly kosher shell script becomes hairy enough to call for Bash's extensions
I do; I use Fish as an interactive shell. But running a recent Bash is a requirement for maintaining servers, cross-platform *NIX development, and a huge number of toolchains. I don't like it, but the ease of installation partly dictates how difficult it is to set up my dev environment.
> If I don't like an Apple toy, I simply don't buy it.
I mean, I just take this ideology to its logical conclusion: if I presently think Apple is being scummy, I don't even give them my money. Explains why I haven't paid for an app since 2013...
> I develop and/or test across five different OS+arch combinations, which often catches bugs and hardcoded assumptions. I often wonder if it's worth it, but then remember how monocultures tend to just rot.
I write/test across multiple arches as well, I just end up writing the code on Linux. It's really not worth it to use Mac when you're only actually deploying to x86_64-Linux.
> I mean, at that point it's no better than a microcontroller.
Agreed, but what's the state of the art for the alternatives? Android has Termux, which is basically a Debian chroot - can you get it to talk to the rest of the OS sensibly? Can you write a shell script to act on an incoming SMS[1]? Can you run Android Studio on Android? What about the base system - can it be made self-hosting? SailfishOS is dead - I take my Jolla out of the drawer once a year to check how many kernel or Firefox releases they're lagging behind. Does PostmarketOS have at least one device where all hardware is supported?
Meanwhile yes, all options for the iPad are either Apple-sanctioned or - as TFA shows - pure homebrew. This is not a PC, but neither is any of the alternatives. However the sanctioned options are relatively decent from the "getting stuff done" PoV: Shortcuts is basically shell scripts with a GUI; Playgrounds[2] can build native iPad apps. It's not freedom by any definition, but you can't call it a microcontroller. You could argue a shell would be more practical than Shortcuts... But iSH[3] exists.
> But running recent bash is a requirement for maintaining servers, cross-platform *NIX development and a huge number of toolchains.
Interesting, I guess YMMV again. I have the stock, ancient Bash on my Mac, and no Bash at all on my OpenBSD box. But I can easily imagine some random toolchain having a 10000-line Bash monster as a launcher/updater, I was probably lucky so far.
> I would have preferred an "it just works" solution, but I'm sure that's the answer most Mac users would issue me so I'll take it.
Not dissing the technical achievement, I just think the form factor itself doesn't lend itself too well to self-hosted hacking. The quality of the tooling (how well it's adapted to the tiny screen) defines the experience, i.e. the actual usability.
It's great for my side projects. I take the train a lot so that's where I use it the most. I use a normal X11 DE and all my normal apps just work, no futzing with "mobile" crap.
> I use a normal X11 DE and all my normal apps just work, no futzing with "mobile" crap.
Honestly that's exactly my problem with the "normal" X11 stuff. I have a 7" WaveShare touchscreen, I hook it up to a RasPi, load up this or that DE, and the experience is... subpar
No. The user experience on iOS when you want to use any kind of FOSS (xmpp and irc clients, compilers, git without some closed wrapper, etc.) is shit. You're often stuck without notifications, and you might even have to use iSH, which runs at ~1/10th to 1/100th the speed of the original Pinephone.
It also doesn't stop malware since the contractors doing app reviews don't instrument the apps, they just play with them for a few minutes to see if something looks wrong. The App Store is full of malware but it makes you feel safe.
FOSS on Apple products that only have an app store is doomed because it costs money to have a developer account.
FOSS activists always claim that it's free as in free speech and not free beer, but somehow these FOSS apps everyone claims to need can't even reach the $100/y necessary to keep them on the app store.
> FOSS on Apple products that only have an app store is doomed because it costs money to have a developer account.
It also costs money to buy a computer and pay the electricity bills. It also costs money (indirectly, through man-hours and infrastructure costs) to maintain and distribute FOSS apps through any other channels.
The idea behind the fee is basically as a CAPTCHA. Whether it's achieving that goal is very arguable though.
Apple definitely could do a better job accommodating FOSS apps. The developer account fee could be waived for verified FOSS developers. They've recently launched a CI service - it should have a free tier for FOSS. Verifying that an app is FOSS should be very simple - the App Store should link to the source code and a public CI dashboard. A part of Apple's 15/30% cut from FOSS app sales should directly go to fund FOSS development.
Not just the developer account; the developers are also required to maintain app-specific infrastructure to do things like push notifications. That's why FOSS apps rarely have them.
All of this, on top of the hardware cost and needing someone to run the iOS API treadmill and deal with App Store reviews, makes FOSS on iOS prohibitively expensive for everyone but the most committed. It pretty much filters out everyone who isn't a corporation or irrationally passionate.
I mean, I would suppose people who want IRC on their phone must be pretty senior. They can't spare 2 dollars per year to pay for the developers to keep the app afloat?
Exactly. It's all "kind of okay" at best. OS X used to have the best XMPP and IRC clients and some were ported to iOS; it's not a lack of effort on the developers' part, it's a critical problem with the platform.
Not my experience. Maybe for specific stuff for HN-type users, but in general the functionality (aka options) has gone up in recent years, not down. Most apps are very high quality compared to Android especially (and yes, I use both ecosystems).
There is a reason it is a cash cow. People like buying it.
Sorry if this sounds dismissive, but Ferraris are a cash cow too.
People like to buy them.
Research shows that there are 80+ apps installed on the average smartphone. That said, people aren't using all of those apps. The average person uses 9 mobile apps per day and 30 apps per month.
Most of them are games; the apps people can't live without number around 15 and are made by the largest social media and e-commerce brands.
They have the resources to make the same high quality apps on every platform.
In my view, Apple users will keep using Apple out of habit, and they will perceive the Android ecosystem as lower quality because there's more choice.
Also, on Android there are a lot of open source apps that are uglier but work exactly as intended: no ads, no phoning home, etc.
If you go to a Bang & Olufsen shop and then to RadioShack, the perception is of lower quality, which is probably true, but who needs Bang & Olufsen headphones to listen to a 64 kbps podcast while driving to work?
My "user experience" includes using hardware and software to their full potential. If I am limited to a kindergarten state for marketing reasons, I simply do not join the ecosystem.
It tells me that Apple is big enough to have a team of people who developed a feature which seems to work fine (hardware virtualization) but was never launched.
The post does state that it's explicitly disabled for iPhone builds of iOS, while it's basically enabled for iPad builds - you just need the entitlements.
Maybe some Apple engineers are running around with an internal VM app running Fedora because they were tired of only using their powerhouse iPads for Netflix and scribbles?
I'd guess that it made sense to use iPads as a cheap testing ground for stuff before they went all-in on M1. Maybe they found bugs in hardware that they could fix in time for the M1 release but made virtualization impossible or a security hole on iPads.
This could have been Plan B for Apple if governments forced their hand in side-loading apps.
This makes me think of the way Intel shoehorned x86-64 into Prescott Pentium 4 chips, enabled with a BIOS update when the Athlon 64 released back in 2005.
It could presumably also be used to run a completely sandboxed copy of macOS on the phone while simultaneously running iOS (for use when docked with a mouse/keyboard/screen).
I think it’s more likely that they want developers to create touch-native versions of apps; allowing macOS apps to run directly would result in fewer iPad apps being made, since telling the user to connect a mouse and keyboard is workable.
Think of the power that is thrown away yearly by Apple. All these locked down computers capable of so much being used to run what, a camera app? Lifetimes of wasted potential. It's a crime against the future, if you ask me.
The real crime is the damage to the next generation's minds. Their expectation that all meaningful software must have an organization and online infrastructure behind it and the general infantilization of computer users is honestly kind of terrifying.
But I wonder if you could use the modified version the author posted with AltStore (or plain old Xcode signing) to get virtualization support on the M1 iPads which don't require kernel modifications. I'd pay $100 a year for a developer account if I knew I could run macOS and Linux at near native speed on my iPad.
Does anyone know if you can install apps that use private entitlements without a jailbreak if you compile/sign them yourself?
I assume the performance improvements come from having JIT memory enabled? If so, this needs to be done by starting the app in 'Debug' mode from Xcode, which loads a specific entitlement.
Can someone please explain to me why a jailbreak is needed for running a VM?
I’m just wondering why it wouldn’t be possible to run a virtualized OS as a native app, and translate all OS calls into calls that the native app could make to the underlying iOS.
It's not needed, but as you can read in the article, Fedora takes half an hour to boot without virtualization or JIT. You cannot mark memory executable in iOS apps, preventing you from actually letting the host CPU execute any guest code.
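In a nutshell, a JIT (or any emulator that wants reasonable speed) has to map a page of memory, write freshly generated machine code into it, and then jump into it. A rough sketch of that pattern using Python's mmap on a desktop Unix system; on iOS, the equivalent request for writable+executable memory is refused unless the app carries the JIT/debugging entitlement:

    # The "write code, then execute it" pattern a JIT depends on.
    # This mapping typically succeeds on desktop Linux; a stock iOS app is
    # denied writable+executable memory without the JIT/debugging entitlement.
    import mmap

    PAGE = 4096
    buf = mmap.mmap(-1, PAGE,
                    prot=mmap.PROT_READ | mmap.PROT_WRITE | mmap.PROT_EXEC)
    buf.write(b"\xc3")  # e.g. a freshly emitted x86 'ret' instruction
    # A real emulator would now jump into this buffer. Without executable
    # memory, every guest instruction has to be interpreted instead, which
    # is where the orders-of-magnitude slowdown comes from.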
Not related to the video, and I know this may read as a not-so-polite thing to write, but the person in the video should probably clip their nails. And it is not just about appearance, but also about health risks [1].