Apple's 7nm A12 chip could be the best feature in the 2018 iPhones (macworld.com)
163 points by rbanffy on May 25, 2018 | 183 comments



I wish Apple would make a sort of Raspberry Pi like board with an A series chip, ethernet, HDMI, GPIO, SD Card, etc... for $50-$100 that runs Linux.

I would also like a flying unicorn.

Seriously, a high power SBC at a reasonable price point would be insanely useful for projects like portable MAME cabinets, video transcoders, scene displays, etc... I love the Pi but it's so old tech, especially the video cores, that it's annoyingly limited for some applications. Worse, the competitors with faster cpus often have worse video acceleration support, making them worse for this than the underpowered Pi.


Not a flying unicorn, but still a winged equine: the Mac Nano.

A tiny white puck with a single usb-c plug for power and data. Inside, an A12 running OS X.

Education and “enthusiast” market. GPIO as a dongle, of course.

They can release the A13 version in colors the next year :)


Apple has been targeting the general public, filling their needs with an easy-to-use device/software stack, and while part of that is starting to fail (dropping the AirMac series), I would consider it a loss of focus if they started building tiny machines targeted only at geeks.

But I do wonder, without extending their business, what they're going to do with their massive pile of cash just sitting around doing nothing.


Doing nothing is far better than making a dumb acquisition, and most acquisitions are dumb. Just buy back stock; they have plenty of cash flow to fund reasonable R&D.


I think you mean Airport


It’s called AirMac in Japan


Take my money


Have you tried developing apps for the Apple TV? It's basically like that; just too low-powered to actually run macOS.


It most definitely is not. Not anymore. It's just as capable as recent Mac minis. A10X Fusion is a very decent chip.


I'm confused as to how you have a computer with only a single port... what do you imagine plugging the one port into?


A USB-C Thunderbolt Display that delivers power over the cable and has a USB hub built into it. These exist, though they're expensive.

Or, an eGPU enclosure, which also delivers power over the cable and has a USB hub built into it. (You just need to plug a monitor into it.) These exist, and are surprisingly cheap if you buy them bundled with a graphics card.


Not all Mac USB-C ports are Thunderbolt, though...for instance, in the MacBooks.


Previous MacBook 12” user here. Either power, or peripherals. Just not at the same time.


MacBooks have large batteries. It wouldn't make sense to have one in this if you're going for the smallest possible form factor.


For most of the day my MacBook Pro is basically a computer with one port: it sits plugged into a dock with the lid shut, since I like using my LG 21:9 screen more than the built-in one.


Do you have a link to a good guide on how you can use a MacBook Pro with the lid closed all the time? (Obviously with external display[s].)

I tried twice -- granted, they weren't very serious attempts; I hoped to have it done in 5 minutes, but failed.

I am looking for a no-BS quick guide if you have one lying around.


1. Make sure an external keyboard and mouse are available, wired or wireless is fine

2. Make sure your MacBook is plugged into power and an external display

3. Close lid

4. Your main desktop should show up on the external display. If your computer goes to sleep you can wake it up by clicking the mouse or typing on the keyboard.


Thanks.

Will that wake the internal display though? That would warm it up and that can be pretty bad when the lid is closed.


If the external display wakes up as your main display with the same desktop picture and icons as your laptop it means the internal display is asleep.

If your mouse is laggy and display looks fuzzy it means your external is mirroring your internal display and that your internal display is on. Turn mirroring off if that happens.

If you’re worried about heat (lid shut mode blocks the vent that runs the width of the keyboard) you can leave your lid open and use your external display normally.

I have not noticed a heat problem using my 2013 MBP Retina in lid-shut mode after my two-year-old threw a phone at the screen (kids! lol).


Right but your Macbook has a large battery. Something it wouldn't really make sense to have if you were going for a very small form factor.


Remove the battery and screen and you get the Mac equivalent of the Intel Compute Stick. Depending on price, I could very much see buying a lot of them for students (with a USB-C Digital AV Multiport Adapter or equivalent for their home use).

[Edit] I would then stock the labs with docks (something from CalDigit) and LG displays.


Power I'd guess, since he didn't specify a battery. The display would be wirelessly beamed directly into your brain.


Apple chips famously have Imagination GPUs with zero open-source support, far behind even Nvidia. They are the absolute worst.

That's before you consider that all the stuff Apple did to that poor ARM probably means you can't even boot Linux into a terminal on it today.


>Apple chips famously have Imagination GPUs

Not anymore, Apple switched to in-house GPUs recently.


That makes it worse? The Imagination crap is shipped on Android phones, too, so there are at least vendor kernel drops to work from.


The odds of Apple releasing an ARM development board are probably roughly equal to the odds of them releasing complete documentation and Linux kernel driver support to go along with it.


The vendor kernels are worthless, tainted piles of crap. Vendors rarely, if ever, maintain them once they are posted publicly. The GPU still requires proprietary blobs to be even remotely useful.


Of course. But they contain non-zero amounts of information on the chip, so you can do a proper implementation or, much less ambitiously, just update them for newer kernel versions.


Not in my experience. The code in the kernel is usually just a shim for loading the blobs that do the real work.


All the reverse-engineered GPU drivers we have now (freedreno, etnaviv, nouveau, lima) started out with intercepting kernel ioctls, being able to observe the blobs at work. Many of them at one point ran with the vendor kernel interface but open-source userspace.

The kernel part of a GPU driver is arguably the least important part; the userland is where the shader compiler sits and produces the magic. But if you start from nothing, it's very useful to have a working kernel interface that initializes the hardware and accepts command streams to run.
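
To make the "observing the blobs at work" part concrete, the usual first step looks roughly like this: an LD_PRELOAD shim that wraps ioctl() and logs every call the proprietary userspace driver makes to the GPU device node. This is just a sketch of the technique, not any particular project's code; the single-pointer-argument assumption holds for most GPU ioctls but not all.

    /* ioctl_spy.c -- hypothetical LD_PRELOAD interposer for watching a GPU blob.
       Build: gcc -shared -fPIC -o ioctl_spy.so ioctl_spy.c -ldl
       Run:   LD_PRELOAD=./ioctl_spy.so ./some_gles_demo */
    #define _GNU_SOURCE
    #include <dlfcn.h>
    #include <stdarg.h>
    #include <stdio.h>

    int ioctl(int fd, unsigned long request, ...)
    {
        static int (*real_ioctl)(int, unsigned long, ...);
        if (!real_ioctl)
            real_ioctl = (int (*)(int, unsigned long, ...))dlsym(RTLD_NEXT, "ioctl");

        /* Most GPU drivers pass a single pointer as the third argument. */
        va_list ap;
        va_start(ap, request);
        void *arg = va_arg(ap, void *);
        va_end(ap);

        int ret = real_ioctl(fd, request, arg);
        fprintf(stderr, "ioctl(fd=%d, req=0x%lx, arg=%p) = %d\n", fd, request, arg, ret);
        return ret;
    }

Dump the structs behind those pointers and you can start mapping out what the kernel interface expects, which is exactly the information a clean-room userspace needs.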


This is marketing hype.

Apple's commits to LLVM (where the shader gets compiled into GPU-executable code) state that the A11 GPU target is the "PowerVR Rogue" ISA, which is identical to how it compiles for the A10.

So it's still using foreign IP for the instruction set. Maybe Apple moved things around or optimized the internals? Better path optimization, or improved IPC?

But it's weird they didn't add optimizations specifically for the A11, so the A10 and A11 GPUs just run the same opcodes.


Citation for those commits?


Apple's GPU is designed just down the road from Imagination's HQ. Guess where the designers previously worked...


It's a completely locked down system - does it matter if the GPU isn't open source?


Maybe I misunderstood "portable MAME cabinets, video transcoders, scene displays" and the comparison to the RPi but surely the intention is to run your own code on it from the bootloader up?


Ok, but Apple devotes considerable resources to make all of that impossible. Why pick on the GPU specifically?


Worse is the array of smart devices that are forced to use Android-style hardware and the Android OS. I'm looking specifically at the Peloton bike I own, which clearly runs Android. It could be so much better if it ran on faster hardware and could link up to an Apple Watch for stats.

Make a version of iOS that companies can build off of; maintain the OS patch cycles but let them put their own app / front end on it.

Tons of smart devices exist nowadays: TVs, refrigerators, bikes, etc. There's a market for an Apple-based system for sure.

But yes - I'll take a flying unicorn too in the interim.


That's an interestingly spun perspective. So you're asking for Apple to produce an open source ecosystem around their hardware and OS, while simultaneously refusing to use the products of the company that did?

No devices are "forced" to use "Android type hardware / Android OS". They use it because it's free. That's... important. Right?


Android isn’t “free” in the FOSS or Beer sense of the word.

The important bits (hardware drivers and core applications) aren't open source (AOSP doesn't include the new default apps or standard libraries).

Also, the important stuff is tied to google services that spy on you.

I think it’s completely reasonable to reject that stack, and lament that no one offers a high quality, supported alternative software stack at a reasonable price.


I hope and wish the market and Android will force Apple to make this decision within the next 3 to 4 years.

They are still not making an Apple TV set.

WatchOS and the Apple S3 chip already include 90%+ of the features needed. I bet the S3 costs less than $20, and Apple could have sold it at $50 along with watchOS into specific markets Apple doesn't intend to compete in: fridges, fans, lamps, bikes, TV sets, radios, kitchen appliances, whatever.

Apple currently charges $1 for the controller in Lightning cables regardless of volume. They could just offer a similar program.


I would love this. They could also push Swift with it, so I'm surprised they haven't done it already.


Why would that be good for Apple?


It would be good (and bad) for the same reasons it was for Microsoft + Intel back in the '80s.


> Raspberry Pi like board with an A series chip, ethernet, HDMI, GPIO, SD Card, etc... for $50-$100 that runs Linux.

The upboard fits that description perfectly - http://www.up-board.org/up/

The CPU is an Intel Atom which turns out to perform very well. The internal storage is blazingly fast eMMC. It has an Intel GPU which means complete open source driver support. (I can't comment on the GPU performance as I use the serial console.)

They also put a 40 pin connector on it that is hardware compatible with the raspberry pi - ie the same pins, voltages, and functions. You do need kernel patches for full support of the pins because the pins are specced at 3.3V while the Atom is 1.8V. There is level shifter hardware that fixes that, which the patches control. The patches are public from Emutex (in github) and there is also a PPA if you just want to play with Ubuntu and not worry about it.

It uses an AMI UEFI BIOS. You can compile the kernel with efi stub support and the upboard can boot straight into your code.
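
For what it's worth, once those pin patches are in place, the header can be driven through the standard Linux GPIO character device with no board-specific userspace at all. A minimal sketch in C, assuming the v1 GPIO uAPI; the chip path and line offset are placeholders for whatever the board actually exposes:

    /* blink_once.c -- drive one header pin via the GPIO character device. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/gpio.h>

    int main(void)
    {
        int chip = open("/dev/gpiochip0", O_RDWR);      /* placeholder chip */
        if (chip < 0) { perror("open gpiochip"); return 1; }

        struct gpiohandle_request req;
        memset(&req, 0, sizeof(req));
        req.lineoffsets[0] = 17;                        /* placeholder line */
        req.lines = 1;
        req.flags = GPIOHANDLE_REQUEST_OUTPUT;
        strcpy(req.consumer_label, "blink-demo");
        if (ioctl(chip, GPIO_GET_LINEHANDLE_IOCTL, &req) < 0) {
            perror("get linehandle");
            return 1;
        }

        struct gpiohandle_data data = { .values = { 1 } };
        ioctl(req.fd, GPIOHANDLE_SET_LINE_VALUES_IOCTL, &data);  /* drive high */
        sleep(1);
        data.values[0] = 0;
        ioctl(req.fd, GPIOHANDLE_SET_LINE_VALUES_IOCTL, &data);  /* drive low */

        close(req.fd);
        close(chip);
        return 0;
    }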


Your best bet would be an SBC with the i.MX8M - the same SoC that will be in the Librem 5. That's because it's one of the SoCs best supported by the mainline Linux kernel.

With Etnaviv drivers for GPU you can be pretty much covered. And I think it will get better.

I'm hoping that someone will make a netbook with the i.MX8M. Or maybe even a Chromebook - it would then be easily repurposable.

I'm eyeing WandPi 8M [0], armstone mx8m [1] and Nitrogen8M [2].

[0] https://www.wandboard.org/products/

[1] https://www.fs-net.de/en/products/armstone/armstonemx8m-with...

[2] https://boundarydevices.com/product/nitrogen8m-imx8/


Releasing a chip like that would be close to useless without also developing a user-friendly hobbyist API like the ones Raspberry Pi and Arduino already have.
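
For reference, this is roughly the level of friendliness Pi hobbyists are used to (a sketch using the wiringPi C library, with the pin number as a placeholder):

    /* blink.c -- the classic Pi "hello world", via wiringPi.
       Build: gcc -o blink blink.c -lwiringPi */
    #include <wiringPi.h>

    int main(void)
    {
        wiringPiSetup();            /* initialise the library */
        pinMode(0, OUTPUT);         /* wiringPi pin 0 (placeholder) as output */
        for (;;) {
            digitalWrite(0, HIGH);
            delay(500);             /* milliseconds */
            digitalWrite(0, LOW);
            delay(500);
        }
        return 0;
    }

An Apple board would need an equivalent library, plus docs and community examples, before the same audience could do anything useful with it.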

Releasing the chip plus a nice user's manual of all its registers and assembly instructions might also pose a security risk for device jailbreaks, which they probably aren't interested in either.


Seems to take about 4-5 years from release for the depreciation and upgrade curves to bring an iPhone under $100. At that point, the main things keeping them from being what you're asking for are:

1) No one has been particularly dedicated to developing an open-source OS for these things, despite the fact that used phones are steadily becoming some of the most common devices in the world, to the point where we have to treat them like waste.

2) Apple's treatment of ports of any kind as a blemish.

Solve these problems, and you have the board in a slick package. :)

I'm less than half kidding. #2 is obviously solvable, as there are lightning dongles/docks for just about everything on your list. #1 gets harder over time with security measures but it has to be possible, it's a question of resources.


So basically a PC with GPIO?

NVIDIA Jetson is the closest you can get in terms of performance, but the price is closer to $500.

DX2 specs:

GPU: NVIDIA Pascal™, 256 CUDA cores (1302MHz)

CPU: HMP Dual Denver 2/2 MB L2 + Quad ARM® A57/2 MB L2 (1.4Ghz/2Ghz)

Video: 4K x 2K 60 Hz Encode (HEVC) and 4K x 2K 60 Hz Decode (12-Bit Support)

Memory: 8 GB 128 bit LPDDR4 (59.7 GB/s)

Display: 2x DSI, 2x DP 1.2 / HDMI 2.0 / eDP 1.4

CSI: Up to 6 Cameras (2 Lane) CSI2 D-PHY 1.2 (2.5 Gbps/Lane)

PCIE: Gen 2 | 1x4 + 1x1 OR 2x1 + 1x2

Data Storage: 32 GB eMMC, SDIO, SATA

Other: CAN, UART, SPI, I2C, I2S, GPIOs


There is no video story at all (except a serial console), but the apu2 AMD-based boards are nice, stable targets, and much, much faster than an RPi for networking applications:

https://pcengines.ch/about.htm


*TX2

Also it has a student discount to only $300 which imo makes it much more compelling if you're eligible.


Still, only 1.5Tflops (roughly 750Ti level), which makes it insufficient for bleeding-edge robotics, but 8GB is nice for inference. NVidia has some high-performing board for automotive, but that one is 500W for a change...


I'm right there with you. There are some Pi-likes, such as the EspressoBin, that are much more powerful, but they just don't have the community support that the Pi does. http://espressobin.net/

Still, at $49 it's cheap enough that I'm going to see if it'll work as a wall-of-Ceph appliance. I would love an SBC designed to be an OSD node, but Western Digital never put theirs in the channel and there's not much else comparable AFAIK.


Apple is so in love with walled gardens. I can't imagine them ever making truly open hardware. Perhaps they are uniquely qualified to make something open because of the amount of effort they have put into making things that are closed.


It's not that they love walled gardens, it's that they have different priorities than other companies. Users and usability are their primary concerns. Everything else is done to support that.

They have open-source initiatives, they like to be open like that when it doesn't interfere with those mandates. It's just that usability is such a huge thing that it impacts every step of their process.


Maybe something targeting schools/education?


> Seriously, a high power SBC at a reasonable price point would be insanely useful for projects like portable MAME cabinets

Right now one can buy an Exynos 5422 board, which CPU-wise (I've never tested the GPU) is 4-5+ times as fast as an RPi 3B, for the price point mentioned (<$100, board only).

For me, this is effectively a "high power SBC at a reasonable price", however, at the same time, users interested in SBCs (or micro-PCs) seem to have a wildly different array of requirements (I've read virtually everything: (very) low price, GPU, kernel support, open source drivers, arch, ports, form factor, etc.etc.), so a desirable SBC seems like a pipe dream due to the fuzziness of the requirements.

(For reference, the CPU performance ratio is calculated on single-core performance of a generic RPi 3b core vs. a fast Exynos5422 core, on OpenVPN maximal bandwidth).


Link/source ?

Btw, imho the worst part about the RPi is its poor ethernet performance. This improved a bit with the RPi 3+, but it's still meh.


Here is a Phoronix test suite run:

https://openbenchmarking.org/result/1805276-FO-1703199RI52

It's hard to assess a generic ratio, since the performance increase is wildly variable. I think a "base" 3x performance ratio against the RPi 3B is a reasonable assessment (with an outlier of 8x).


They'll fix it in the RPi 4 and then you can complain they don't have 10Gbit support, so at least you have those two things to look forward to.


> I wish Apple would make a sort of Raspberry Pi like board with an A series chip

Likely outcome of the unlikely discovery of some lost ancient document that somehow makes The Woz indisputable CEO for life.


Someone in an article recommended an ODROID-C2 over a Raspberry Pi. They said you get quite a bit more performance for spending just a little more ($40). Here's a comparison:

https://www.phoronix.com/scan.php?page=article&item=raspberr...

I might buy one depending on what feedback I get on it. Anyone here tried those out for Linux or BSD projects?


Surely there's got to be something like that available already. I'm mostly dealing with low powered M0/M4 boards, but the selection in this range is massive.


There's an absolute ocean of old obsolete ARM cores clocked slowly. The difference in performance is so vast it would be hard to compare with an A11 chip, especially once you include graphics acceleration in the mix.

Apple has been doubling down every year on "we can make it go faster", and the rest of the ARM chip makers have responded with a "meh". It's so bad that people get really excited for a Samsung chip that is no slower than a two-generation-old A-series chip.


Performance is highly dependent on task and graphics is extra tricky to compare, but looking at geekbench, it seems the Samsung S9 is about halfway between iPhone 7 and 8/X. So half a generation behind is more accurate.


Well, the price is wrong, but nvidia's Jetson platform is very powerful compared to the *board SBCs:

https://www.phoronix.com/scan.php?page=article&item=march-20...


Jetson CPU performance is on par with chips like the RK3399 or the Exynos5422, which are 4/5+ times as fast as the RPi 3b (I owned RPi 2/3b series, and the Exynos).

Note that the first benchmark in the mentioned article is an outlier.


No, it's not. Having worked with the Jetson platform extensively: the TK1 was equivalent to a 5422 (because they're both built on the vanilla A15 architecture). However, the TX1 and TX2 ("Denver" cores) are closer to (though still slightly behind) the current SD 835:

https://www.youtube.com/watch?v=l6y4atVl-mc

And when it comes to GPU compute, it's much more powerful than all the alternatives.


I'll correct to "comparable", as it's more appropriate.

However, it's not a clear-cut "no" or "very powerful in comparison", given the number of cores (no doubt about the GPU; I made that explicit in the parent post).

I've run the Phoronix test suite (which uses a set of real-world applications)[⁰], and, excluding the outliers on both sides (Redis/OpenSSL), we're talking about a 10 to 40% advantage for the TX1, and a little more for the TX2; the XU4 even has the edge in one case.

They're definitely faster for the majority of use cases, however, I wouldn't classify them as "very powerful in comparison", at least, when considering the performance of an RPi 3B.

[⁰] https://openbenchmarking.org/result/1805276-FO-1703199RI52


The TX2 is a dual-core chip (ignoring the 4 "littles" since they're unused in benchmarks). The 5422 is a quad core. And even then, it's outperforming the 5422 by a decent margin in most tests. Even c-ray, an intrinsically multithreaded application is close.

So you're correct, they're "comparable" in that if you take twice as many 40%-as-powerful cores, you might shrink the gap. But for any fundamentally single threaded tasks, a massive gap exists. And if/when you leverage the GPU (which can't be ignored, despite your handwaving), it becomes a canyon.


Let's call this product "iPhone mini". With a good name for it, I'm sure we can convince Apple to do it.


I find it a pity there's no Mac mini anymore, at least not a powerful one with a dedicated GPU, a kind of NUC Skull Canyon.

I still run a 2008 Mini with Manjaro Linux... still good for Kodi.


Have you looked at raspberry pi clones built around FPGAs?


I didn't even know these existed! From a googling I can only find hats/daughter boards for existing raspberry pis. Is that what you mean or are there ones that are all in one?


I own a Parallella, with a Zynq 7020 FPGA and a form factor similar to a Raspberry Pi.

Here is something that costs around a hundred dollars (a bit higher), called the ZynqBerry:

https://shop.trenz-electronic.de/en/TE0726-03M-ZynqBerry-Zyn...



It's not a Pi clone, but google the snickerdoodle board.


Like Apple TV?


The rock64 and variants should do what you're looking for.


I said I wanted a working GPU.


It does have a working GPU. There is the RK3328 specific libmali: https://github.com/rockchip-linux/libmali



Odroid N1 is getting close.


I'm not really impressed by this article. I don't see much performance improvement in desktop CPUs, so I don't expect it in mobile CPUs either. They were catching up to desktop CPUs until recently, but now they're playing on an even field, so I doubt it will be 25% in single-thread. Sure, they could add more cores, but that's it, and only for specific workloads. Also, I'm not even sure I need that performance in my phone; I don't play games or run IDEA on it.

The battery life speculation is very unconvincing as well. The CPU is only part of the battery drain; other parts are the display and, most importantly, the radio chips. So if my CPU eats 10% of my battery, that 25% improvement works out to 2.5%, which is not that impressive. They have a terrible battery anyway; I have no idea how they managed to make a phone that turns off at -30C. How am I supposed to use it in winter?


Yeah. All I really want is more RAM so that I can tab between Safari and Facebook without one of them getting unloaded and losing my place.


Which model do you have? No way this should be happening on a recent one just between two applications.

On the X (3GB) this doesn't seem to be an issue for me at all.


I'm really impressed by how long apps remain in RAM on my 8 Plus. I'll play a game before bed and find it still where I left off during my commute the next morning, and that's with some browsing/podcasts in between.


As soon as one of those apps gets access to more RAM, it will use all of it, and you will have the same experience. Just look at the desktop.


We will have at least a good month of multitasking bliss ;)


You shouldn’t lose your place even if the apps get unloaded.


You would want the performance for new deep learning applications (for example, better images in low-light conditions).


Can someone explain to me why this matters?

It feels like CPU advancements in recent memory have brought nothing at all. It used to be, in the early days of smartphones, every generation brought something very new, something that can truly be considered an improvement. I remember the first dual core phone, the first quad core, etc. Nowadays, there is no noticeable difference at all.

Even now, where phones have indeed caught up to computer CPUs in speed, we have yet to see a phone really do anything more with that speed. The one thing I want to see, but nobody seems to be doing it, is a phone that converts into a desktop. Your laptop doesn't have to be separate from your phone. Your laptop could be a dock for your super-fast phone that converts to a desktop operating system. Of course, there have been a few products on the market that have attempted this, but none that lasted.


Games! More polygons, better frame rates, the usual...

Except for the fact that we're still limited by an imprecise and rather limited input device (touchscreen) that has barely changed in a decade...


Too true. Selecting text is still as bad as I remember it always being. But battery life could stretch into days, and processing capabilities may be similarly improved. It's not a nothing upgrade.


"The one thing I want to see, but nobody seems to be doing it, is a phone that converts into a desktop."

Samsung has its "DeX Dock" which lets you turn your Samsung phone into an Android desktop, and use the phone as a trackpad.


Ubuntu was working on this, and to be honest, I would very much prefer Ubuntu over Android in order to work or do development.

But it never was profitable enough, and it died. Nowadays there are some Android and Windows phones with that ability, and I think they are not very successful either.

In fact, you claim you want this, but if the technology were successful, you would already have one of these phones and one or more docking stations.

The whole idea could die in a few years.


Apple would be in a good position to create a dockable phone, however the x86 & ARM gap needs to be bridged in order for this to happen.

Perhaps that is their long game:

https://www.extremetech.com/computing/266773-apple-may-dump-...


Why would Apple cannibalize their desktop products?

Apple wants to sell more devices, not less. And it also helps their iCloud business (which wouldn't exist if everybody had only one device).


Agree with your second point.

Re:cannibalization, Apple is more than willing to cannibalize their products - just see iPod Nano v Mini, iPhone v iPod, iPad v Mac, iPad 2018 v iPad Pro. It's part of their playbook to cannibalize themselves, so that competitors don't wind up sneaking in market segments. This is enabled by organizing the company functionally rather than by product line, which lets them avoid the 'strategy tax' of pre-existing divisions that own products wanting to keep those products alive (bureaucracies self-perpetuate and all that).

That said, a phone that converts to a desktop (at least as I think OP and others who bring this up imagine it) is not part of the playbook, because it's making the device do double duty in UI/UX. See the iPad not having a mouse.

I think Apple's perspective is that the glue that ties mobile UX to seated/desktop UX together is the cloud, and to your point, that involves multiple devices. The exception is non-interactive content (AirPlay), which third parties can license.


I guess I'm not really talking about it from a business-centric perspective, but rather one of progress. This is where I feel the next step in mobile computing lies, given these advancements in mobile CPU. But you're right, perhaps it does not make business sense, and that's why we're not seeing it.


I guess that too little competition means that progress has to give in to the business side of things.


I suppose a fast chip could always be underclocked, and maybe that would get you less heat and power consumption.


TLDR : it does not really matter that much.


According to the public information from TSMC on the new process node, it looks like we're in for significant power draw savings.

>Compared to its 10nm FinFET process, TSMC's 7nm FinFET features 1.6X logic density, ~20% speed improvement, and ~40% power reduction.

http://www.tsmc.com/english/dedicatedFoundry/technology/7nm....

Also, any speed increases due to redesign of the cores would be on top of the ~20% speed bump from the process node change.


7nm is not going to be unique to Apple. TSMC has said that they have 50 customers using their new 7nm tech this year. That most likely includes Qualcomm whose Snapdragon chips power most Android phones.


It only seems to be unique in that Apple may be the first there.

Obviously everyone is going to move to it as its availability increases.


AMD is launching Vega 20 at 7nm in 2H. Also, Nvidia is expected to launch a 7nm video card this year.


The dearth of WWDC rumors this year is really amazing to me. Apple really is taking leaks very seriously. We're 10 days away from the keynote, and absolutely nothing is getting out.

This speculation is the best that Macworld can do? No offense to them, but I mean, the rumors are slim pickings.

I really hope that Apple fixes its laptop lineup at WWDC. It's a complete and total shitshow right now. And that's coming from a huge Apple fan.

Even before you factor in the keyboard issues, it's a mess. The Air line maxes out at 8 gigs of RAM, but you can configure a better Core i7 processor than in the MacBooks. Of course, the screen is awful.

The actual MacBook line has an anemic processor, but you can get 16 gigs of RAM and a decent screen. By that point, though, you're paying MacBook Pro money. And at 13 inches, you can't get the top specs without the Touch Bar, while the midrange specs on the non-Touch Bar model are stupidly overpriced even for someone like me who's willing to pay a premium for Apple kit. And it's still capped at 16 gigs of RAM.

The 15-inch model suffers from the same problems as the 13-inch: you can only get the maxed specs with the Touch Bar, and the mid-range specs are a joke for the price.

The only pro laptop worth buying right now is a 3-year-old mid-2015 MacBook pro. I was in the market for a new Mac a couple of months ago and literally could not find anything worth spending money to upgrade on from Apple. I picked up two 11-inch macbook airs for cheap on eBay because I adore that tiny form factor and use them as mostly dumb terminals for remote stuff. They are surprisingly useful and have great battery life.

Apple really needs to fix the laptops and give up the goods on the Mac Pro machine. I don't care if it's throwing a bevy of these chips in an ultra thin MacBook Pro-Air X Plus or what. But this situation is embarrassing. Reminds me of the mid-90s, when there were a ton of options if you wanted a mac, but nothing really worth buying.

My work ThinkPad is an ugly brick that gets the job fucking done. I'm about to buy one for my home if I don't just stockpile all the 11" macbook airs I can find and build a home lab cluster to remote into for doing real work instead.


You're right -- this has been a very rumorless year. Interesting given how many leaks we've seen over the last couple of years; I guess someone got serious about tightening up. Or, of course, there's nothing much coming.

You're dead right on the laptop lineup. Touchbar, whatever, I don't think that's an enormous issue one way or the other -- but the incoherency of the tech specs matters a bunch.


They did tighten up leaks. There was a big internal meeting about how Apple is actually not only firing people but also prosecuting them--which was promptly leaked.

And yeah, it's also possible that there's nothing really exciting coming that's worth leaking.


It could be a big software year. Most leaks we get tend to come from the supply chain, whereas software may be known to only a handful of people until the day it's unveiled.


>We're 10 days away from the keynote, and absolutely nothing is getting out.

I'm OK with this. I'm not so locked into a feedback loop that I need to know every little thing happening at Apple before it actually happens. I think the death of Think Secret did that for me.

The downside is that all of the Apple fansites (9to5Mac, MacRumors, etc...) are all full of crap "wish list", "deals" and contests disguised as articles because they have to fill their quotas.


Cannot agree more. You can get a 6-core i9 Dell Precision with a 4K screen, 32GB of RAM, and a 5x better GPU that weighs 4 lb, for 2/3 the price of a maxed-out MacBook. Sure, the build won't be as good, but with this new keyboard I'm not so sure anymore. And I get fucking ports.

The only reason I'm with MacBook is vendor lock-in with my work environment but that isn't going to last forever.


Only getting 8gb RAM in my 2015 13” Macbook Pro is one of my greatest regrets.


Same for me, getting 128 GB of storage.


I paid for the storage upgrade, not the RAM one, and I regret it weekly :( I saved 475€ on the whole machine thanks to the exchange rate, but the bill was still steep.


I emphatically agree with every point. I am in the exact same position and feel exactly the same. I bought a 2016 13” MacBook Pro and got rid of it because it just wasn’t... better. Still using a laptop and a desktop from 2013. And waiting...


The lack of rumors may also be because they don't have anything exciting to show?


> Apple really needs to fix the laptops and give up the goods on the Mac Pro machine.

Apple just needs to give up on Macs. They obviously don't care about the product much anymore. License the OS and let OEMs and regular people build machines that can deliver the value that Apple can't or won't.


Apple’s Mac lineup is the most profitable PC business in the world, AINEC.


It's still a blip in their bottom line compared to their mobile devices. It's been pretty clear for years where their priorities are, and I say this as someone who's been an Apple fan since grade school. They've been resting on their laurels for the last five years when it comes to PCs.


This article can be summed up as "a lot of things could be way better... maybe".


I hope at some point these chips make their way into an Apple laptop.


I think it's only a matter of time before Apple unveils what's basically a high-spec clamshell iPad Pro and stops making new MacBooks.


I disagree; I think they are doing a full about-face after this latest generation of garbage MBPs, and I am hopeful for the refreshes. A lesson many will learn the hard way, eventually, is that you don't scorn your power users, because they are an endless source of evangelism and free marketing for your product.


> a full about-face

I'd expect them to double down on that, in fact.


Also, they're literally the developers for the mobile platform. iOS has a very long way to go before you can develop on it.


iOS is not particularly different from OSX. That's why Apple developers get to use a super fast simulator instead of the dog slow emulator we Android developers are stuck with.


Similarities between iOS and macOS don't have much to do with it.

The reason the Android emulator is slow is that it is actually emulating an ARM CPU.

The iOS simulator, on the other hand, is running an iOS natively built for x86, and Xcode targets the x86 architecture when you build your app for the simulator.


It certainly doesn’t hurt that the simulator only needs to run the iOS front end on top of macOS Darwin instead of a full copy of iOS, though.

Couldn’t something similar be done with Android on Linux, running x86 Android and using the host system’s kernel?


There are a lot of Android-specific kernel interfaces that would have to be added to the desktop, stuff like the binder IPC. It's doable, but it's work.
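
As a concrete example, Android's userspace expects to open /dev/binder and drive it with a handful of ioctls, which a stock desktop kernel typically doesn't provide. A minimal probe, assuming kernel headers from a build with CONFIG_ANDROID_BINDER_IPC enabled:

    /* binder_probe.c -- check whether the kernel offers Android's binder IPC. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <sys/mman.h>
    #include <linux/android/binder.h>

    int main(void)
    {
        int fd = open("/dev/binder", O_RDWR | O_CLOEXEC);
        if (fd < 0) {
            perror("open /dev/binder");   /* the usual result on a desktop distro */
            return 1;
        }

        struct binder_version v;
        if (ioctl(fd, BINDER_VERSION, &v) == 0)
            printf("binder protocol version %d\n", v.protocol_version);

        /* libbinder then mmaps a buffer the driver fills with incoming transactions. */
        void *map = mmap(NULL, 1 << 20, PROT_READ, MAP_PRIVATE, fd, 0);
        if (map == MAP_FAILED)
            perror("mmap");

        close(fd);
        return 0;
    }

Binder is only one of several such interfaces (ashmem is another), which is why "x86 Android on the host kernel" is more work than it sounds.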


> It certainly doesn’t hurt that the simulator only needs to run the iOS front end on top of macOS Darwin instead of a full copy of iOS, though.

Uh, no. The simulator literally boots an entire copy of iOS, with its own libraries, frameworks, and utilities, that's entirely separate from the host OS.


That is incorrect. The iOS simulator is not a virtualized environment. It's not running its own kernel; rather, it's just a set of iOS libraries and frameworks built for x86, plus a thin compatibility layer so that they run on macOS.

You can actually see this: your running iOS applications, and all of the iOS daemons etc., show up as first-class processes on your Mac if you run "ps".

iOS apps running in the simulator are not separate from the host OS at all - they're running on the host OS, just like any other process. Try killing "Springboard" from your Mac command line and watch what happens inside the simulator...


Does anyone really use ARM emulation? I think everyone just uses the x86 Android image, which should be comparable to Apple's iOS Simulator. And if you need to test on ARM, you should just use a real phone.


I've used both. ARM emulation is slow, as you'd expect from a CPU emulation. The default, x86, is blazingly fast. Of course, this is assuming you remember to turn on virtualization features on your CPU.

The default x86 is a fair bit faster than my phone....
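
On a Linux host that speed comes from KVM, so the quick sanity check (roughly what the emulator's own acceleration check does) is whether /dev/kvm is usable. A small C probe, as a sketch:

    /* kvm_probe.c -- verify hardware virtualization is available to the
       Android emulator's x86 images on a Linux host. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>
    #include <sys/ioctl.h>
    #include <linux/kvm.h>

    int main(void)
    {
        int kvm = open("/dev/kvm", O_RDWR | O_CLOEXEC);
        if (kvm < 0) {
            perror("open /dev/kvm");   /* VT-x/AMD-V disabled, or module missing */
            return 1;
        }
        printf("KVM API version: %d\n", ioctl(kvm, KVM_GET_API_VERSION, 0));
        close(kvm);
        return 0;
    }

On macOS and Windows the emulator leans on HAXM (or Apple's Hypervisor.framework) for the same job.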


Except in multitasking UX. Yes, technically there isn't anything stopping you from running a compiler on iOS, but there's so much more to having a relevant development platform than "does llvm technically run".


> Yes, technically there isn't anything stopping you from running a compiler on iOS, but there's so much more to having a relevant development platform than "does llvm technically run"

You can't run unsigned code on iOS (with a few exceptions), and I don't believe that anyone has come up with a viable solution to codesign code on the fly on-device.


Sure, but that's easy for Apple to fix if they want. The lack of true, useful multitasking on iOS is the much bigger issue.


iOS development builds actually have the compiler installed by default. It can also be installed on jailbroken devices via Cydia too.


Sure, but my point is that's easy. The hard part is converting iOS to a point where you can get real multitasking work done. Even just switching back and forth between an IDE and a web browser is super painful compared to a desktop/laptop.


Not really. The main issue is software.

Add a Bluetooth keyboard and you can code just fine on iOS. The problem is that there are no native tools for it (that we have access to).

That’s actually a very minor hurdle, not a long way at all.


iOS won’t ever be a development platform. That would necessitate bringing down the AppStore walled garden, which would be a very stupid move from Apple.


Not sure why you say this. There are already various kinds of developer tools on iOS (see Codea, for example). They don't bring down the App Store walls.

An Xcode for iOS would be designed to make it straightforward to develop iOS apps and then sign them and upload them to the App Store when you're ready.


Sure, you can have toys like Swift Playgrounds in the AppStore, but real application development takes more than just Xcode. E.g. you need a shell and various command line tools, like otool, lipo, nm.


Lacking multitasking is a productivity killer for a developer, typically juggling multiple balls at once.


Good thing iOS has had multitasking since the beginning then.


I think he/she was suggesting macbooks would move to iOS/ARM and macbook pros would stay macOS/Intel, at least in the near future.


Yes, pretty much that.


No one should pay an "Apple premium" anymore for their computer. Their recent MBPs, and their lack of response to the many complaints, show they don't care about reliability and quality in 2018. I am so happy now that I'm using a Pixelbook as my dev machine (with the new official Linux support).


I could see them doing something like the Surface, where you have the detachable iPad but a more powerful rig when it's attached.


Apple forces developers to use macOS to develop for iOS. Compiling code on iOS is banned (except for WebKit). You'd ban all developers from having a portable development platform.


I don’t see how that could be a problem. If Apple the hardware manufacturer asked Apple the App Store owner to change that rule, I don’t see why they would not do so.


So a 12" MacBook with a touchscreen?


I thought the point of ipad was to ditch the clamshell design?


I think it was, but if most people who use them for work are sticking keyboards back on them then it might make sense to offer a clamshell or hybrid design.


How about a DualPad - it's a clamshell with an upper and a lower iPad. When you separate them, they're just iPads, but when combined they become something more powerful with different features.


Sounds like something Asus would make... https://youtu.be/Z2ANnpHnUrc


To be honest, I would think an A12X with a few extra GPU cores would be better.


As we are more and more able to make use of all the parallelism of our hardware from our code, that's more or less unavoidable.


Perhaps Apple could introduce a Mac Mini with an iPad-class version of the A12 for all people that need to develop for iOS but don't want to spend a ton on hardware. The basic tools like Xcode and some Apple productivity software could already be ported, most developer toolchains would follow. The Simulator would have realistic CPU characteristics. And Apple would ease in on the transition to macOS-ARM.


If they did that it would need to have the profit margin of an iPhone or iPad, at least. But frankly, I don’t know why they seem to hate the traditional form factors so much. The last time they upgraded the Mac Pro they said “we’ll totally never make you wait this long for a Mac Pro upgrade again.” Yet here we are.


In previous discussions about sub 10nm chips I kept reading that the only company which is close to achieving this feature size is Intel - and for all other companies it's just a marketing term. Does this still hold?


No. Intel is still on 14 nm [1] while TSMC/Samsung/GloFo "7 nm" should be roughly equivalent to Intel 10 nm.

[1] Don't believe the hype.


What are the chances the next gen iPhone gets a L1/L5-capable GNSS chip? That'd arguably be one of the better features too. There's some speculation that Pixel 3 will get it (BCM47755).


OT, but with the whole GDPR drama, this website has made a better effort than most to comply. However, I doubt that all the default opt-ins would be ruled GDPR-compliant.


I got pretty confused by their popup, because there are three options:

- "Update Privacy Settings"

- "Sounds Good, Thanks"

- "Not Now"

Apparently "Sounds Good, Thanks" is their wording for expressing consent, which is already odd, but what does "Not Now" mean? Does it mean no consent, so no tracking for now? In any case, pretty poor wording as well.


Have chips ever been critical features? I mean the ones that add features like NFC ok, but doesn't every chip make phones "faster"?


Chips are the enabler for all sorts of critical features and user experiences. It's not just the chip (hardware architecture and software matter too), but here are three examples:

- maintaining device responsiveness for more and more complex software and higher resolution displays

- 120fps UI (iPad Pro)

- 4k60 video

Re:4k60, try to find another device besides the iPhone that can shoot 4k60 that isn't professional equipment. Best you have is the new Samsung S9 which can do so for a whopping 5 minutes. That's primarily the chip.


It used to be; now most phones (except for some bad ones, mainly at the low-cost end) have mostly satisfactory performance.

Sure, all of them drop a frame or two here and there and that's very annoying when you work on graphics, but nothing essential.

If we could push the SoCs to at least one week of battery life, that would be pretty awesome, but that's not going to happen any time soon.


They used to be critical features, but I just don't think the incremental improvements are as noticeable anymore. My iPhone 6s still feels quite snappy.

The iPhone 3G to the 3Gs was an amazing leap in usability though. Everything just worked so much better, mostly just due to faster chips as far as I'm aware.


It's very noticeable on an iPad, where it has a lot more pixels to draw.


Yes. Apple's chips have enabled amazing performance (far above Android on a single core) and very low power draw, letting them get by with less battery and heat output for equivalent performance.


The 508 in my Nexus 5X is not quite quick enough to be smooth all the time, so I can definitely see an improvement jumping to a more modern SoC. But the improvements have slowed down as mobile hit the same limits that desktop had already exposed, and now it's on the same very slow progression of improvement.


Think you mean Snapdragon 805


High quality 6dof tracking please!! With leap motion-style hand tracking in the front facing camera! This is all we need for AR to take off!


Don't waste your time; this article is rife with speculation, all based off a rumor that the new chip will be 7nm.



A much more informative article, thank you for sharing!


This. The article in 3 parts:

1. A clickbait title.

2. Real news from Mark Gurman out since a few days now.

3. Extensions of previous years changes ("the camera improved with the iPhone X which had a new processor, the camera might thus improve this year for the same reason!").


Also... bad article if it’s meant to be hyping iPhone.

The worst scenario for Apple is if the CPU of the new phone is the “best feature”.

Phones are crazy fast right now. You know what they aren't? Crazy tough, crazy efficient, crazy helpful if you don't want to be distracted, etc. Phones are really stale.

Typed on an iPhone 8 that I just can't tell apart from my older 7, which was almost identical to my older iPhone 6.


Meanwhile I’m on the iPhone X and it’s the best phone I’ve used since the iPhone 4.

You're using the model that was explicitly designed to be like the previous models. It's not a shock that it's much like the previous models then, is it?


I wasn't going to read the article just based off the source, thanks for confirming my inclination.


[flagged]


I love my AirPods. The Apple Watch is green with jealousy ...


I love my AirPods too but I'm still waiting for the wireless charging mat that works with the iPhone 8 and Apple Watch. Remember that? They announced it a year ago...




