I wish Apple would make a Raspberry Pi-like board with an A-series chip, Ethernet, HDMI, GPIO, SD card, etc... for $50-$100 that runs Linux.
I would also like a flying unicorn.
Seriously, a high power SBC at a reasonable price point would be insanely useful for projects like portable MAME cabinets, video transcoders, scene displays, etc... I love the Pi but it's so old tech, especially the video cores, that it's annoyingly limited for some applications. Worse, the competitors with faster cpus often have worse video acceleration support, making them worse for this than the underpowered Pi.
Apple has been targeting the general public, filling their needs with an easy-to-use device/software stack, and while part of that is starting to fail (dropping the AirMac series), I would consider it a loss of focus if they started building tiny machines targeted only at geeks.
But I do wonder what they're going to do with their massive pile of cash just sitting around doing nothing if they don't extend their business.
Doing nothing is far better than making a dumb acquisition, and most acquisitions are dumb. Just buy back stock; they have plenty of cash flow to fund reasonable R&D.
A USB-C Thunderbolt Display that delivers power over the cable and has a USB hub built into it. These exist, though they're expensive.
Or, an eGPU enclosure, which also delivers power over the cable and has a USB hub built into it. (You just need to plug a monitor into it.) These exist, and are surprisingly cheap if you buy them bundled with a graphics card.
For most of the day my MacBook Pro is basically a one-port computer plugged into a dock with its lid shut, since I like using my LG 21:9 screen more than the built-in screen.
1. Make sure an external keyboard and mouse are available, wired or wireless is fine
2. Make sure your MacBook is plugged into power and an external display
3. Close lid
4. Your main desktop should show up on the external display. If your computer goes to sleep you can wake it up by clicking the mouse or typing on the keyboard.
If the external display wakes up as your main display with the same desktop picture and icons as your laptop it means the internal display is asleep.
If your mouse is laggy and the display looks fuzzy, your external display is mirroring your internal one and the internal display is still on. Turn mirroring off if that happens.
If you’re worried about heat (lid shut mode blocks the vent that runs the width of the keyboard) you can leave your lid open and use your external display normally.
I have not noticed a heat problem using my 2013 MBP Retina in lid-shut mode, which I've done ever since my two-year-old threw a phone at the screen (kids! Lol)
Remove the battery and screen and you get the Mac equivalent of the Intel Compute Stick. Depending on price, I could very much see buying a lot of them for the students (with a USB-C Digital AV Multiport Adapter or equivalent for their home use).
[Edit] I would then stock the labs with docks, something from CalDigit, and LG displays.
The odds of Apple releasing an ARM development board are probably roughly equal to the odds of them releasing complete documentation and Linux kernel driver support to go along with it.
The vendor kernels are worthless, tainted, piles of crap. Vendors rarely, if ever, maintain them once they are posted publicly. The GPU still requires proprietary blobs to be even remotely useful.
Of course. But they contain non-zero amounts of information on the chip so you can do a proper implementation or, much less ambitious, update it for newer kernel versions.
All the reverse-engineered GPU drivers we have now (freedreno, etnaviv, nouveau, lima) started out with intercepting kernel ioctls, being able to observe the blobs at work. Many of them at one point ran with the vendor kernel interface but open-source userspace.
In many ways the kernel part of a GPU driver is arguably the least important part; it's in userland where the shader compiler sits and produces the magic. But if you start from nothing, it's very useful to have a working kernel interface that initializes the hardware and accepts command streams to run.
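For a flavor of how that interception works, here's a minimal sketch of an LD_PRELOAD shim in C that logs every ioctl before forwarding it to libc. (Illustrative only; the real tracing tools behind those projects also decode the driver-specific command structures the blob passes in.)

    /* ioctl_spy.c - log ioctl calls made by a closed GPU blob.
     * Build: gcc -shared -fPIC -o ioctl_spy.so ioctl_spy.c -ldl
     * Run:   LD_PRELOAD=./ioctl_spy.so some_gl_app
     */
    #define _GNU_SOURCE
    #include <dlfcn.h>
    #include <stdarg.h>
    #include <stdio.h>
    #include <sys/ioctl.h>

    int ioctl(int fd, unsigned long request, ...)
    {
        static int (*real_ioctl)(int, unsigned long, ...);
        va_list ap;
        void *arg;

        if (!real_ioctl)
            real_ioctl = (int (*)(int, unsigned long, ...))
                dlsym(RTLD_NEXT, "ioctl");

        va_start(ap, request);
        arg = va_arg(ap, void *);
        va_end(ap);

        /* Log the raw request number; a real tracer would decode the
         * command structs behind arg using the vendor's UAPI headers. */
        fprintf(stderr, "ioctl(fd=%d, req=0x%lx, arg=%p)\n", fd, request, arg);

        return real_ioctl(fd, request, arg);
    }

Point it at the blob's GL test app and you get a transcript of the kernel interface in action, which is exactly the starting material those reverse-engineering efforts built on.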
Apple’s commits to LLVM (where the shader gets compiled into GPU-executable code) state that the A11 GPU target is the “PowerVR Rogue” ISA, which is identical to how it compiles for the A10.
So it’s still using foreign IP for the instruction set. Maybe Apple moved things around / optimized the internals? Better path optimization, or improved IPC?
But it’s weird they didn’t add optimizations specifically for the A11, so the A10 and A11 GPUs just run the same opcodes.
Maybe I misunderstood "portable MAME cabinets, video transcoders, scene displays" and the comparison to the RPi but surely the intention is to run your own code on it from the bootloader up?
Worse is the array of smart devices that are forced to use Android-type hardware and the Android OS. Take the Peloton bike I have, specifically, which clearly runs Android. It could be so much better if it ran on faster hardware and could link up to an Apple Watch for stats.
Make a version of iOS that companies can build off of; maintain the OS patch cycles but let them put their own app / front end on it.
Tons of smart devices exist nowadays: TVs, refrigerators, bikes, etc... There's a market for an Apple-based system for sure.
But yes - I'll take a flying unicorn in the interim, too.
That's an interestingly spun perspective. So you're asking for Apple to produce an open source ecosystem around their hardware and OS, while simultaneously refusing to use the products of the company that did?
No devices are "forced" to use "Android type hardware / Android OS". They use it because it's free. That's... important. Right?
Android isn’t “free” in the FOSS or Beer sense of the word.
The important bits (hardware drivers and core applications) aren’t open source (AOSP doesn’t include the new default apps or standard libraries).
Also, the important stuff is tied to google services that spy on you.
I think it’s completely reasonable to reject that stack, and lament that no one offers a high quality, supported alternative software stack at a reasonable price.
I hope the market and Android will force Apple to make this decision within the next 3 to 4 years.
They are still not making an Apple TV set.
watchOS and the Apple S3 chip already include 90%+ of the features needed. I bet the S3 costs less than $20, and Apple could have sold it at $50 along with watchOS into specific markets Apple doesn't intend to compete in: fridges, fans, lamps, bikes, TV sets, radios, kitchen appliances, whatever.
Apple currently charges $1 for the controller in Lightning cables, regardless of volume. They could just offer a similar program.
The CPU is an Intel Atom which turns out to perform very well. The internal storage is blazingly fast eMMC. It has an Intel GPU which means complete open source driver support. (I can't comment on the GPU performance as I use the serial console.)
They also put a 40-pin connector on it that is hardware compatible with the Raspberry Pi - i.e. the same pins, voltages, and functions. You do need kernel patches for full support of the pins, because the pins are specced at 3.3V while the Atom is 1.8V. There is level-shifter hardware that fixes that, which the patches control. The patches are public from Emutex (on GitHub), and there is also a PPA if you just want to play with Ubuntu and not worry about it.
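Once those patches are in, the header shows up as an ordinary Linux GPIO character device, so the standard libgpiod API works against it. A minimal sketch in C (the chip name "gpiochip0" and line offset 17 are assumptions - check gpioinfo on your board):

    /* blink.c - toggle one header pin via the GPIO character device.
     * Build: gcc -o blink blink.c -lgpiod   (libgpiod v1 API)
     */
    #include <gpiod.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        /* Chip name and line offset are board-specific assumptions. */
        struct gpiod_chip *chip = gpiod_chip_open_by_name("gpiochip0");
        if (!chip) { perror("open chip"); return 1; }

        struct gpiod_line *line = gpiod_chip_get_line(chip, 17);
        if (!line || gpiod_line_request_output(line, "blink", 0) < 0) {
            perror("request line");
            gpiod_chip_close(chip);
            return 1;
        }

        for (int i = 0; i < 10; i++) {   /* five on/off cycles */
            gpiod_line_set_value(line, i & 1);
            sleep(1);
        }

        gpiod_line_release(line);
        gpiod_chip_close(chip);
        return 0;
    }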
It uses an AMI UEFI BIOS. You can compile the kernel with EFI stub support and the Up board can boot straight into your code.
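For reference, the EFI stub is just a kernel config choice; something like the following in your .config (these are the mainline Kconfig symbols, though the exact set you need depends on kernel version and how you pass your command line):

    CONFIG_EFI=y
    CONFIG_EFI_STUB=y

The UEFI firmware can then load the bzImage directly as an EFI application, with no separate bootloader required.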
Your best bet would be an SBC with the i.MX8M - the same SoC that will be in the Librem 5. That's because it's one of the SoCs best supported by the mainline Linux kernel.
With the Etnaviv driver for the GPU, you're pretty much covered. And I think it will only get better.
I'm hoping that someone will make a netbook with the i.MX8M. Or maybe even a Chromebook - it would then be easily repurposable.
Releasing a chip like that would be close to useless without also developing a user-friendly hobbyist API like the ones the Raspberry Pi and Arduino already have.
Releasing the chip plus a nice user's manual of all of its registers and assembly instructions might also open the door to device jailbreaks, which they probably aren't interested in either.
It seems to take about 4-5 years from release for the depreciation and upgrade curves to bring an iPhone under $100. At that point, the main things keeping them from being what you're asking for are:
1) No one has been particularly dedicated to developing an open-source OS for these things, despite the fact that used phones are steadily becoming some of the most common devices in the world, to the point where we have to treat them like waste.
2) Apple's treatment of ports of any kind as a blemish.
Solve these problems, and you have the board in a slick package. :)
I'm less than half kidding. #2 is obviously solvable, as there are lightning dongles/docks for just about everything on your list. #1 gets harder over time with security measures but it has to be possible, it's a question of resources.
There is no video story at all (except a serial console), but the apu2 AMD-based boards are nice, stable targets, and much, much faster than an RPi for networking applications:
Still, only 1.5Tflops (roughly 750Ti level), which makes it insufficient for bleeding-edge robotics, but 8GB is nice for inference. NVidia has some high-performing board for automotive, but that one is 500W for a change...
I'm right there with you. There are some Pi-likes, such as the Espresso Bin, that are much more powerful, but they just don't have the community support that the Pi does.
http://espressobin.net/
Still, at $49 it's cheap enough that I'm going to see if it'll work as a wall-of-Ceph appliance. I would love to have an SBC designed to be an OSD node, but Western Digital never put theirs in the channel and there's not much else comparable AFAIK.
Apple is so in love with walled gardens. I can't imagine them ever making truly open hardware. Perhaps they are uniquely qualified to make something open because of the amount of effort they have put into making things that are closed.
It's not that they love walled gardens, it's that they have different priorities than other companies. Users and usability are their primary concerns. Everything else is done to support that.
They have open-source initiatives, they like to be open like that when it doesn't interfere with those mandates. It's just that usability is such a huge thing that it impacts every step of their process.
> Seriously, a high power SBC at a reasonable price point would be insanely useful for projects like portable MAME cabinets
Right now one can buy an Exynos5422 board, which CPU-wise (I've never tested the GPU) is 4-5+ times as fast as an RPi 3B, for the price point mentioned (<$100, board only).
For me, this is effectively a "high power SBC at a reasonable price". At the same time, though, users interested in SBCs (or micro-PCs) seem to have wildly different arrays of requirements (I've read virtually everything: (very) low price, GPU, kernel support, open source drivers, arch, ports, form factor, etc. etc.), so a universally desirable SBC seems like a pipe dream due to the fuzziness of the requirements.
(For reference, the CPU performance ratio is calculated from the single-core performance of a generic RPi 3B core vs. a fast Exynos5422 core, on maximal OpenVPN bandwidth.)
It's hard to assess a generic ratio, since the performance increase is wildly variable. I think a "base" 3x performance ratio against the RPi 3B (with an outlier of 8x) is a reasonable assessment.
Someone in an article recommended an ODROID-C2 over a Raspberry Pi. They said you get quite a bit more performance for spending just a little more ($40). Here's a comparison:
Surely there's got to be something like that available already. I'm mostly dealing with low powered M0/M4 boards, but the selection in this range is massive.
There's an absolute ocean of old obsolete ARM cores clocked slowly. The difference in performance is so vast it would be hard to compare with an A11 chip, especially once you include graphics acceleration in the mix.
Apple has been doubling down every year on "we can make it go faster", and the rest of the ARM chip makers have responded with a "meh". It's so bad that people get really excited for a Samsung chip that merely matches a two-generation-old A-series chip.
Performance is highly dependent on task and graphics is extra tricky to compare, but looking at geekbench, it seems the Samsung S9 is about halfway between iPhone 7 and 8/X. So half a generation behind is more accurate.
Jetson CPU performance is on par with chips like the RK3399 or the Exynos5422, which are 4-5+ times as fast as the RPi 3B (I've owned the RPi 2/3B series and the Exynos).
Note that the first benchmark in the mentioned article is an outlier.
No, it's not. Having worked with the Jetson platform extensively, the TK1 was equivalent to a 5422 (because they're both built on the vanilla A15 architecture). However, the TX1 and TX2 ("Denver" cores) are closer to (though still slightly behind) the current SD 835:
I'll correct to "comparable", as it's more appropriate.
However, it's not a "clearcut no" or "very powerful in comparison", due to the number of cores (no doubt about the GPU, I had made this explicit in the parent post).
I've run the Phoronix test suite (which uses a set of real-world applications)[⁰], and, excluding the outliers on both sides (Redis/OpenSSL), we're talking about a 10 to 40% advantage for the TX1, and a little more for the TX2; the XU4 even has the edge in one case.
They're definitely faster for the majority of use cases, however, I wouldn't classify them as "very powerful in comparison", at least, when considering the performance of an RPi 3B.
The TX2 is a dual-core chip (ignoring the 4 "littles", since they're unused in benchmarks). The 5422 is a quad-core. And even then, it's outperforming the 5422 by a decent margin in most tests. Even c-ray, an intrinsically multithreaded application, is close.
So you're correct, they're "comparable" in that if you take twice as many 40%-as-powerful cores, you might shrink the gap. But for any fundamentally single threaded tasks, a massive gap exists. And if/when you leverage the GPU (which can't be ignored, despite your handwaving), it becomes a canyon.
I didn't even know these existed! From some googling I can only find hats/daughter boards for existing Raspberry Pis. Is that what you mean, or are there all-in-one ones?
I'm not really impressed by this article. I don't see much of a performance improvement in desktop CPUs, so I don't expect one in mobile CPUs either. Mobile was catching up to desktop CPUs until recently, but now they are playing on an even field, so I doubt it will be 25% in single-thread. Sure, they could add more cores, but that's it, and only for specific workloads. Also, I'm not even sure I need that performance in my phone; I don't play games or run IDEA there.

The battery life speculation is very unconvincing as well. The CPU is only part of the battery drain; other parts are the display and, most importantly, the radio chips. So if my CPU eats 10% of my battery, that 25% improvement becomes 2.5%, which is not that impressive. They have a terrible battery anyway; I have no idea how they managed to make a phone that turns off at -30C. How am I supposed to use it in winter?
I'm really impressed by how long apps remain in RAM on my 8 Plus. I'll play a game before bed and find it still where I left off during my commute the next morning, and that's with some browsing/podcasts in between.
It feels like CPU advancements in recent memory have brought nothing at all. It used to be, in the early days of smartphones, every generation brought something very new, something that can truly be considered an improvement. I remember the first dual core phone, the first quad core, etc. Nowadays, there is no noticeable difference at all.
Even now, where phones have indeed caught up to computer CPUs in speed, we have yet to see a phone really do anything more with that speed. The one thing I want to see, but nobody seems to be doing it, is a phone that converts into a desktop. Your laptop doesn't have to be separate from your phone. Your laptop could be a dock for your super-fast phone that converts to a desktop operating system. Of course, there have been a few products on the market that have attempted this, but none that lasted.
Too true. Selecting text is still as bad as I remember it always being. But battery life could stretch into days, and processing capabilities may be similarly improved. It's not a nothing upgrade.
Ubuntu was working on this, and to be honest, I would very much prefer Ubuntu over Android in order to work or do development.
But it never was profitable enough and it died. Nowadays there are some Android and Windows phones with that ability, and I don't think they are very successful either.
In fact, you claim you want this, but if the technology were successful, you would already have one of these phones and one or more docking stations.
Re:cannibalization, Apple is more than willing to cannibalize their products - just see iPod Nano v Mini, iPhone v iPod, iPad v Mac, iPad 2018 v iPad Pro. It's part of their playbook to cannibalize themselves, so that competitors don't wind up sneaking in market segments. This is enabled by organizing the company functionally rather than by product line, which lets them avoid the 'strategy tax' of pre-existing divisions that own products wanting to keep those products alive (bureaucracies self-perpetuate and all that).
That said, a phone that converts to a desktop (at least as I think OP and others who bring this up imagine it) is not part of the playbook, because it's making the device do double duty in UI/UX. See the iPad not having a mouse.
I think Apple's perspective is that the glue that ties mobile UX to seated/desktop UX together is the cloud, and to your point, that involves multiple devices. The exception is non-interactive content (AirPlay), which third parties can license.
I guess I'm not really talking about it from a business-centric perspective, but rather one of progress. This is where I feel the next step in mobile computing lies, given these advancements in mobile CPU. But you're right, perhaps it does not make business sense, and that's why we're not seeing it.
7nm is not going to be unique to Apple. TSMC has said that they have 50 customers using their new 7nm tech this year. That most likely includes Qualcomm whose Snapdragon chips power most Android phones.
The dearth of WWDC rumors this year is really amazing to me. Apple really is taking leaks very seriously. We're 10 days away from the keynote, and absolutely nothing is getting out.
This speculation is the best that Macworld can do? No offense to them, but I mean, the rumors are slim pickings.
I really hope that Apple fixes its laptop lineup at WWDC. It's a complete and total shitshow right now. And that's coming from a huge Apple fan.
Even before you factor in the keyboard issues, it's a mess.
The Air line maxes out at 8 gigs of RAM, but you can configure a better Core i7 processor than in the MacBooks. Of course, the screen is awful.
The actual MacBook line has an anemic processor, but you can get 16 gigs of RAM and a decent screen. By that point, though, you're paying MacBook Pro money. And at 13 inches, you can't get the top specs without the Touch Bar, and the midrange specs on the non-Touch Bar model are stupidly overpriced, even for someone like me who's willing to pay a premium for Apple kit. And it's still capped at 16 gigs of RAM.
The 15-inch model suffers from the same problems as the 13-inch: you can only get the maxed specs with the Touch Bar, and the mid-range specs are a joke for the price.
The only pro laptop worth buying right now is a 3-year-old mid-2015 MacBook Pro. I was in the market for a new Mac a couple of months ago and literally could not find anything from Apple worth spending money on to upgrade. I picked up two 11-inch MacBook Airs for cheap on eBay because I adore that tiny form factor, and I use them mostly as dumb terminals for remote stuff. They are surprisingly useful and have great battery life.
Apple really needs to fix the laptops and give up the goods on the Mac Pro machine. I don't care if it's throwing a bevy of these chips in an ultra thin MacBook Pro-Air X Plus or what. But this situation is embarrassing. Reminds me of the mid-90s, when there were a ton of options if you wanted a mac, but nothing really worth buying.
My work ThinkPad is an ugly brick that gets the job fucking done. I'm about to buy one for my home if I don't just stockpile all the 11" macbook airs I can find and build a home lab cluster to remote into for doing real work instead.
You're right -- this has been a very rumorless year. Interesting given how many leaks we've seen over the last couple of years; I guess someone got serious about tightening up. Or, of course, there's nothing much coming.
You're dead right on the laptop lineup. Touchbar, whatever, I don't think that's an enormous issue one way or the other -- but the incoherency of the tech specs matters a bunch.
They did tighten up leaks. There was a big internal meeting about how Apple is actually not only firing people but also prosecuting them--which was promptly leaked.
And yeah, it's also possible that there's nothing really exciting coming that's worth leaking.
It could be a big software year. Most leaks we get tend to be from the supply chain, whereas software may be known to only a handful of people until the day it’s unveiled.
>We're 10 days away from the keynote, and absolutely nothing is getting out.
I'm OK with this. I'm not so locked into a feedback loop that I need to know every little thing happening at Apple before it actually happens. I think the death of Think Secret did that for me.
The downside is that all of the Apple fansites (9to5Mac, MacRumors, etc...) are all full of crap "wish list", "deals" and contests disguised as articles because they have to fill their quotas.
Couldn't agree more. You can get a 6-core i9 Dell Precision with a 4K screen, 32GB of RAM, and a 5x better GPU that weighs 4 lb, for 2/3 of the maxed-out MacBook price.
Sure, the build quality won't be as good, but with this new keyboard I'm not so sure anymore. And I get fucking ports.
The only reason I'm with MacBook is vendor lock-in with my work environment but that isn't going to last forever.
I paid for the storage upgrade, not the RAM one, and I regret it weekly :( I saved 475€ on the whole machine thanks to the exchange rate, but the bill was still steep.
I emphatically agree with every point. I am in the exact same position and feel exactly the same. I bought a 2016 13” MacBook Pro and got rid of it because it just wasn’t... better. Still using a laptop and a desktop from 2013. And waiting...
> Apple really needs to fix the laptops and give up the goods on the Mac Pro machine.
Apple just needs to give up on Macs. They obviously don't care about the product much anymore. License the OS and let OEMs and regular people build machines that can deliver the value that Apple can't or won't.
It's still a blip in their bottom line compared to their mobile devices. It's been pretty clear for years where their priorities are, and I say this as someone who's been an Apple fan since grade school. They've been resting on their laurels for the last five years when it comes to PCs.
I disagree. I think they are doing a full about-face after this latest generation of garbage MBPs, and I am hopeful for the refreshes. A lesson many will learn the hard way, eventually, is that you don't scorn your power users, because they are an endless source of evangelism and free marketing for your product.
iOS is not particularly different from OSX. That's why Apple developers get to use a super fast simulator instead of the dog slow emulator we Android developers are stuck with.
Similarities between iOS and macOS don't have much to do with it.
The reason the Android emulator is slow is that it is actually emulating an ARM CPU.
The iOS simulator, on the other hand, is running an iOS natively built for x86, and Xcode targets the x86 architecture when you build your app for the simulator.
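You can see the split at compile time. Apple's TargetConditionals.h header (shipped with the SDKs) defines macros that say whether you're building for the simulator or for a device; a minimal sketch in C:

    /* target_check.c - where does this Apple build run?
     * The TARGET_OS_* macros come from Apple's TargetConditionals.h. */
    #include <stdio.h>
    #include <TargetConditionals.h>

    int main(void)
    {
    #if TARGET_OS_SIMULATOR
        /* Simulator build: host-architecture code running on macOS. */
        puts("simulator: native code on the Mac");
    #elif TARGET_OS_IPHONE
        /* Device build: an ARM binary for real iOS hardware. */
        puts("device: ARM binary for iOS");
    #else
        puts("plain macOS build");
    #endif
        return 0;
    }

When Xcode builds for the simulator, it compiles your whole app for the host architecture, which is why it runs at native speed with no CPU emulation involved.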
> It certainly doesn’t hurt that the simulator only needs to run the iOS front end on top of macOS Darwin instead of a full copy of iOS, though.
Uh, no. The simulator literally boots an entire copy of iOS, with its own libraries, frameworks, and utilities, that's entirely separate from the host OS.
That is incorrect. The iOS simulator is not a virtualized environment. It's not running its own kernel; rather, it's a set of iOS libraries and frameworks built for x86 plus a thin compatibility layer so that they run on macOS.
You can actually see your running iOS applications, and all of the iOS daemons and so on, show up as first-class processes on your Mac if you run "ps".
iOS apps running in the simulator are not separate from the host OS at all - they're running on the host OS, just like any other process. Try killing "SpringBoard" from your Mac command line and watch what happens inside the simulator...
Does anyone really use ARM emulation? I think everyone just uses the x86 Android image, which should be the same as Apple's iOS Simulator. And if you need to test on ARM, you should just use a real phone.
I've used both. ARM emulation is slow, as you'd expect from a CPU emulation. The default, x86, is blazingly fast. Of course, this is assuming you remember to turn on virtualization features on your CPU.
The default x86 is a fair bit faster than my phone....
Except in multitasking UX. Yes, technically there isn't anything stopping you from running a compiler on iOS, but there's so much more to having a relevant development platform than "does llvm technically run".
> Yes, technically there isn't anything stopping you from running a compiler on iOS, but there's so much more to having a relevant development platform than "does llvm technically run"
You can't run unsigned code on iOS (with a few exceptions), and I don't believe that anyone has come up with a viable solution to codesign code on the fly on-device.
Sure, but my point is that's easy. The hard part is converting iOS to a point where you can get real multitasking work done. Even just switching back and forth between an IDE and a web browser is super painful compared to a desktop/laptop.
iOS won’t ever be a development platform. That would necessitate bringing down the AppStore walled garden, which would be a very stupid move from Apple.
Not sure why you say this. There are already various kinds of developer tools on iOS (see Codea, for example). They don't bring down the App Store walls.
An Xcode for iOS would be designed to make it straightforward to develop iOS apps and then sign them and upload them to the App Store when you're ready.
Sure, you can have toys like Swift Playgrounds in the AppStore, but real application development takes more than just Xcode. E.g. you need a shell and various command line tools, like otool, lipo, nm.
No one should pay an "Apple premium" for their computers anymore. Their recent MBPs, and their lack of response to the many complaints, show they don't care about reliability and quality in 2018. I am so happy now that I'm using a Pixelbook as my dev machine (with the new official Linux support).
Apple forces developers to use macOS to develop for iOS. Compiling code on iOS is banned (except for WebKit). You'd be banning all developers from having a portable development platform.
I don’t see how that could be a problem. If Apple the hardware manufacturer asked Apple the App Store owner to change that rule, I don’t see why they would not do so.
I think it was, but if most people who use them for work are sticking keyboards back on them then it might make sense to offer a clamshell or hybrid design.
How about a DualPad - it's a clamshell with an upper and a lower iPad. When you separate them, they're just iPads, but when combined they become something more powerful with different features.
Perhaps Apple could introduce a Mac Mini with an iPad-class version of the A12 for all people that need to develop for iOS but don't want to spend a ton on hardware. The basic tools like Xcode and some Apple productivity software could already be ported, most developer toolchains would follow. The Simulator would have realistic CPU characteristics. And Apple would ease in on the transition to macOS-ARM.
If they did that it would need to have the profit margin of an iPhone or iPad, at least. But frankly, I don’t know why they seem to hate the traditional form factors so much. The last time they upgraded the Mac Pro they said “we’ll totally never make you wait this long for a Mac Pro upgrade again.” Yet here we are.
In previous discussions about sub 10nm chips I kept reading that the only company which is close to achieving this feature size is Intel - and for all other companies it's just a marketing term. Does this still hold?
What are the chances the next gen iPhone gets a L1/L5-capable GNSS chip? That'd arguably be one of the better features too. There's some speculation that Pixel 3 will get it (BCM47755).
OT, but with the whole GDPR drama, this website has made a better effort than most to comply. However, I doubt that all the default opt-ins would be ruled GDPR-compliant.
I got pretty confused by their popup, because there are three options:
- "Update Privacy Settings"
- "Sounds Good, Thanks"
- "Not Now"
Apparently "Sounds Good, Thanks" is their wording for expressing consent, which is already odd, but what does "Not Now" mean? Does it mean no consent, so no tracking for now? In any case, pretty poor wording as well.
Chips are the enabler of all sorts of critical features and user experiences. It's not just the chip - hardware architecture and software are there too - but here are three examples:
- maintaining device responsiveness for more and more complex software and higher resolution displays
- 120fps UI (iPad Pro)
- 4k60 video
Re: 4k60, try to find another device besides the iPhone that can shoot 4k60 that isn't professional equipment. The best you'll find is the new Samsung S9, which can do it for a whopping 5 minutes. That's primarily the chip.
They used to be critical features, but I just don't think the incremental improvements are as noticeable anymore. My iPhone 6s still feels quite snappy.
The iPhone 3G to the 3Gs was an amazing leap in usability though. Everything just worked so much better, mostly just due to faster chips as far as I'm aware.
Yes. Apple’s chips have enabled amazing performance (far above Android on a single core) and very low power draw, letting them get by with less battery/heat output for equivalent performance.
The Snapdragon 808 in my Nexus 5X is not quite quick enough to be smooth all the time, so I can definitely see an improvement in jumping to a more modern SoC. But the improvements have slowed down as mobile hit the same sort of limits that desktop had already exposed, and now it is on the same very slow progression of improvement.
2. Real news from Mark Gurman, out for a few days now.
3. Extensions of previous years' changes ("the camera improved with the iPhone X, which had a new processor; the camera might thus improve this year for the same reason!").
Also... it's a bad article if it's meant to be hyping the iPhone.
The worst scenario for Apple is if the CPU of the new phone is the “best feature”.
Phones are crazy fast right now. You know what they aren't? Crazy tough, crazy efficient, crazy helpful if you don't want to be distracted, etc... Phones are really stale.
Typed on an iPhone 8 that I just can't tell apart from my older iPhone 7, which was almost identical to my older iPhone 6.
I love my AirPods too but I'm still waiting for the wireless charging mat that works with the iPhone 8 and Apple Watch. Remember that? They announced it a year ago...