I use the optical digital out from the headphone port about 8 hours a day. It's connected to my Audio-Technica DAC/headphone amp.
I'll only be interested in wireless headphones once they actually have decent quality. And even then, I already have very good Beyerdynamic headphones (the T90 and the DT880) and I don't see why I'd spend money to change them.
I'd be rather annoyed at not having HDMI or an SD card slot, mostly because carrying dongles is a pain.
It's kind of sad, but Apple products just don't excite me anymore. I remember when I used to follow the keynote to see the new iPhone, or looked forward to seeing the first Intel MacBook unveiled... Now I look at the iPhone 7 and there's really nothing in it that makes me want to buy it. There hasn't really been anything new with their laptops either (okay, the new small MacBook isn't bad; it's a good travel notebook)...
Genuine question: is there any particular reason the DAC is connected via optical rather than USB? Or is it because AT's DACs don't have USB support?
I ask because I initially thought anyone with high-end headphones would be least affected if Apple removed the headphone port from a Mac (the built-in headphone output usually isn't powerful enough to drive some of the high-end models anyway, and many would likely replace the DAC too, in addition to adding an amp).
Optical just feels like a cleaner interface. USB requires drivers, and introduces timing and electrical interference issues.
USB also feels like overkill for an audio interface. It allows transfer of data in both directions, and introduces security risks if you don't fully trust the DAC you are connecting to, or worse still if the DAC is connected to a network/the Internet.
I'm not suggesting for a moment there is any audible difference between the two, but optical audio has always struck me as a 'cleaner' way to output digital audio.
I would be extremely annoyed if it were removed from the MacBook Pro.
"USB requires drivers, and introduces timing and electrical interference issues"
On multi-tasking OSes with memory protection, everything that accesses hardware requires drivers; it's 'just' the amount of driver code that differs (which, for USB vs optical, probably is quite a big difference)
Also, originally USB didn't support isochronous channels at all, so one couldn't guarantee that one could send audio out continuously.
Most USB DACs do not require special drivers. The drivers are included with the OS. If you don't trust the device to be plugged in via USB then you shouldn't buy it. Most DACs are fairly simple hardware and tear downs are common. Check Head-Fi and other audiophile sites.
S/PDIF also combines the clock and data in a single signal, so it has more or less the same jitter issues that USB does. Not that it matters, since I'm not aware of a single proper study indicating that jitter is audible.
Often there is a receiver IC that will perform clock recovery and convert from S/PDIF to an internal digital audio format, I2S most commonly, to send it to a DAC IC. In this simple setup there generally is no more buffering than needed for operation of the pieces.
In more complex situations, either where the S/PDIF signal is being re-clocked or where the data on the S/PDIF line needs to be decoded, there will be buffering.
To clarify some terms:
Re-clocking - the system detects the clock rate of the signal but does not use that clock as the reference; instead it generates its own clock at the proper frequency. Some buffering is needed here to offset drift between the clocks, which would otherwise cause sample starvation.
Encoded data - S/PDIF (really AES3 for the most part) has a framed block transmission format not entirely dissimilar to a TCP packet. There is metadata, frames, sub-frames, etc.; it's not just 'pure audio'. Various formats can be stuffed into these data frames, like compressed audio data (5.1 Dolby Digital, for example).
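To make the "not just pure audio" point concrete, here's a rough sketch of the subframe layout in Python. Field positions follow the usual description of the AES3/S-PDIF format (32 time slots per subframe); the function names and dict keys are mine, not from any library.

```python
# Rough sketch of the AES3/S-PDIF subframe layout: a 4-slot preamble,
# 24 bits of audio/aux data, then validity, user-data, channel-status,
# and parity bits. Illustrative only.

def parse_subframe(bits):
    """Split a 32-element list of 0/1 bits into subframe fields."""
    assert len(bits) == 32
    return {
        "preamble":       bits[0:4],    # X/Y/Z sync pattern (biphase violation)
        "audio":          bits[4:28],   # up to 24-bit sample, LSB first
        "validity":       bits[28],     # 0 = sample is fit for conversion
        "user_data":      bits[29],     # one bit of the user-data stream
        "channel_status": bits[30],     # one bit of the 192-bit status block
        "parity":         bits[31],     # even parity over slots 4..31
    }

def parity_ok(bits):
    # Even parity across time slots 4-31 (preamble excluded).
    return sum(bits[4:32]) % 2 == 0
```

The channel-status and user-data bits accumulate across 192 consecutive frames into the metadata blocks that carry things like sample rate and copy-protection flags, which is why a receiver has to do real decoding work before any audio comes out.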
I have a similar setup; I use optical anywhere and everywhere I can because it isolates electrical noise. No mains hum, no ground loops, no 'chainsaw' noise from poor switched-mode power supplies. No copper path from source to amp at all.
I try to steer clear of the more 'woo-woo' audiophile stuff, but if a $10 AmazonBasics optical S/PDIF cable can do more to reduce noise than any thousand-dollar unicorn-dust copper cable on the market, I'm in.
It's because the AT DAC/amp that I use doesn't have USB support, but I do have a USB DAC and it's proven more flaky. The advantage of an optical connection is that it doesn't need drivers, so you only need to rely on the manufacturer being good at hardware, not on them being good at software (which most Japanese companies can't do).
In this case I use a DAC combined with the amp, but honestly, while I can hear the difference an amp brings, I can't hear the difference between another DAC and Apple's DAC (I've tried doing blind tests). So I could see myself just using a headphone amp without a separate DAC.
It needs drivers; they're just the same as for the audio jack. At least on my computer (not Apple), the same chip that drives the audio jack also provides output for the optical plug, without any special software.
I've never used an external DAC, but it seems the parent is wary of companies fucking it up with the USB drivers (or the firmware in the device) and introducing delays or artefacts in the audio stream. An optical-to-audio converter is easy; probably any third-year EE could produce a working prototype.
OK, to be clearer: an optical connection uses Apple's drivers and will always work well, whereas I've seen some USB DACs that used custom drivers and would fail (the AT-HA90USB, for example).
Yes, that's some pretty old information, but I used to think the same thing. The main cause of latency is audio buffer size. At the same buffer size, the difference between FireWire, USB, and optical is not really enough to care about. The main reason you'd use a higher buffer size is if your software can't handle half as much data in half as much time.
There were some problems with old software and hardware that have been resolved for the past ten years or so, but USB audio could be problematic in the early 2000s. Optical was never really any better than FireWire in terms of latency.
Right. I'm really not an expert on this, but my understanding is that with S/PDIF you supposedly have control over the buffers on either side, whereas with USB audio you are limited by the USB spec.
Eh, that's not a very accurate way to describe things. S/PDIF itself transmits samples of audio one at a time, whereas USB transmits frames encoded in packets, but in practice there are going to be hardware buffers of some size, and the software will process one buffer at a time. The end result means that there are no real advantages to S/PDIF, and a number of serious disadvantages.
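For what it's worth, the buffer-size latency math the earlier comment describes is the same regardless of transport. A quick illustrative calculation (the numbers are mine, not from any spec):

```python
# Back-of-the-envelope latency from audio buffer size, independent of
# the transport (USB, FireWire, or S/PDIF).

def buffer_latency_ms(frames, sample_rate_hz):
    """One buffer's worth of audio, in milliseconds."""
    return 1000.0 * frames / sample_rate_hz

# A typical 256-frame buffer at 48 kHz is ~5.33 ms.
# Doubling the buffer doubles the latency, whatever the cable is.
```

This is why the buffer size your software can sustain without dropouts dominates end-to-end latency; the per-packet framing differences between the transports are small by comparison.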
There's at least one really nice benefit: because there are no conductive wires in optical cables, electrical noise isn't directly coupled from the source to your DAC. This can make a big difference if your source is simultaneously charging from a noisy source like a car adapter [1]. Similarly, you won't create a ground loop by connecting your source to a wall-powered DAC.
[1] I have literally never seen an isolated [2] USB charger for a car, and I've looked. You can fudge it using an inverter, an isolation transformer, and a 120V charger. I'm not talking about "my golden ears hurt" noise; I'm talking about "egads this music sounds awful and crackly" noise.
[2] By isolated, I mean a charger that provides galvanic isolation between its inputs and outputs. Some switching power supplies are isolated and some are not. Any charger that plugs into a wall plug and isn't polarized MUST be isolated because, if it's not, it'll zap you. The upshot is that your car's electrical ground ends up connected to some piece of copper in or near your amp, and car's electrical systems are noisy as hell. [3]
I have a similar setup with a consumer-level tube DAC (highly recommended with a pair of studio monitors, btw).
1. I used all the USB ports on my workstation and don't want a hub.
2. I use the USB and line-in analog inputs on my DAC for portable devices; the optical one is permanently connected to my workstation.
3. Optical feels 'native' in the operating system; it's just another output on the soundcard and doesn't require anything other than native drivers. This is probably wrong, but whatever.
Things might be interesting again if Apple ever tells the world what they're doing with the PrimeSense (designer of the original Kinect) assets they bought. Consumer depth-sensing cameras took a major step backward when they pulled the plug on PrimeSense.
I've been messing around with a RealSense depth cam lately and it's been somewhat interesting. In the past I played around with an old gen1 Kinect using stuff like TouchDesigner and ZVector but had to hold off on picking up a gen2 Kinect when the adapter needed for PC hookup was somewhat pricey and always on back order.
I haven't gotten into the gritty details yet and have only been using it with TouchDesigner and ZVector Touch so far but it's fairly promising as a low cost, decent quality RGB+depth camera.
I presume these assets were bought so they would have a competitor to Google's Project Tango if it takes off.
The first Tango phone from LG is far too big and underpowered but future Android phones could have this stuff built in unobtrusively. Couple that with better AR games (a new version of Pokemon GO, for instance) and Apple will badly need something similar to compete.
Here is my list of things your laptop should have:
* A headphone jack that supports the built-in microphone (conferencing, anyone?)
* The magsafe power connector - my kids are rough on stuff, even my stuff.
* USB ports, perhaps at least one of each old and new plug. As many as you can fit on the sides, cuteness be damned.
* an ethernet plug.
* function keys on the keyboard. E.g., I use F1 and F6 A LOT in JetBrains IDEA.
Here are things I could live without:
* an SD card reader
* any particular video plug form factor, as long as I can hook a monitor up with an inexpensive adapter.
* CD/DVD drive - nice to have, but I understand about space and cost.
This is for work, not a fashion show. It's not supposed to be a tablet. I already have an iPad mini, and want something more, a lot more, in my work computer.
Oh god, why would you trade an SD card slot or video out for Ethernet? I actually have the Ethernet adapter as well as a VGA adapter, as I'm a traveling consultant who deals with many different environments, and not all of them are state of the art. I've never used the Ethernet adapter, but I use the VGA adapter several times per year. I use the SD slot more than all of them combined, though, as my Time Machine disk goes in there.
Really though, this just highlights how differently people use things. I cannot imagine why in the world you would want Ethernet; it never gets used. You probably have a hard time understanding why they include an SD card reader. There are plenty of professionals out there who never use any of them. I've only used the 3.5mm plug on my MacBook Pro once or twice.
For as much hate as adapters get, professional users are the least likely to lose them and the most likely to buy them. It doesn't make sense to make consumers buy adapters for things they use every day, but how many device-specific things do each of us have? How many of us have a box of cables, dongles, and plugs in our closet? All of us. There are a ton of different use cases for a pro device, and modularity really does make sense. I don't need Ethernet. I don't need a 3.5mm jack. I don't need HDMI. If I do need them (rarely), Apple makes a Thunderbolt adapter for it. It's the same reason many of us build our own PCs, so we can pick exactly what we want or do not want in our computers.
Note that the parent comment was emphatically speaking for himself, not all people. He is not pretending his list is universal. If the parent comment says he'd rather have Ethernet, well that's just his personal use.
The parent comment was a nice start for a conversation: rather than trying to argue over the perfect computer for everyone, the parent starts a conversation with a list of things which are true for him.
Right. If you read my entire comment, you'd see the first part was my (intentionally heavy-handed) reaction to his personal preferences and the second part was my contribution to that broader conversation. My opinion is that modular machines are inherently professional. Anything that needs an adapter for daily use is not, in my opinion, consumer oriented. Professionals put up with modularity all the time, because "professional" is a very broad term and we all have wildly different needs.
I'm agreeing that his needs are different from mine, and offering the opinion that this is exactly the reason I'm not upset about needing adapters for a Macbook Pro. I'd rather need an adapter for the few times I use VGA than sacrifice portability just to have a (for me) rarely-used port. I don't need a professional-level DAC integrated. What I do need is a serial port, but it would be crazy for me to demand Apple put in a serial port when a USB adapter works just fine for the very few people who actually need serial.
I like the idea of a large SD card for Time Machine. I have external USB connected drives, though. I use wired networking every day at work (alas, with a USB adapter now, since the hardware that hooked everything up to thunderbolt broke) - wifi is frowned upon in that environment.
Sorry to break this news to you: you are not Apple's target market.
I agree with your points, but ask your average consumer when they last used a Cat5 Ethernet connection and you'll probably get a blank stare. Likewise function keys: anyone who's not a developer likely doesn't even care about them.
This should pretty much be a solved problem. They have a non-Pro line where small, sexy, and fashionable are the order of the day (let's assume the 'new MacBook' will absorb the Air line), and a Pro line that's verging on desktop replacement.
One thing I always loved was that apple's lineup was so clear-cut that I never had to put a moment of thought into which model I needed.
If they go this direction, what's the differentiator between the Pro and non-Pro lines? non-pro doesn't get a 15" option and Pro doesn't get a 12" option? That's it?
I think the 'new MacBook' is going to absorb the Air line and become "the perfect Starbucks machine" in a couple of iterations (everything that currently holds it back is everything that held the first-gen Air back). They have the 'average consumer' machine right there. Don't try to make the Pro the 'average consumer' machine too.
Or, better for marketing, a 'Pro' SKU which is actually targeted towards their bread and butter 'I don't know what an ethernet port is, but want "the best"', which removes anything and everything in the name of weight, power, and industrial design. _And_ an innocuous 'Developer Essentials' or something SKU that actually unashamedly focuses on being a work horse.
I'm curious, what would you put in a developer SKU? To my mind it'd just be the option to put a quad-core i7 in the consumer model, but perhaps we have different priorities?
Keep the Ethernet port, and USB-A ports (I'm sure those are next on the chopping block). If you're not trying to shave every gram and millimeter off, you can have a bigger battery. Function keys, and a 10 key would be nice too.
Actually, an Ethernet port isn't needed. I have a MacBookPro11,3 and it doesn't have one. I can live with that. At my company it's plugged into a Thunderbolt docking station with Ethernet, and at home I probably don't need it.
I also have a MacBookPro12,1 and the Thunderbolt adapter was the first accessory I bought. You're right that you don't need it at home, but many customers don't have wifi at all, and the only way to connect at their site is Ethernet.
I need it at customers' places and also in the office - the WiFi is unreliable (but only on Macs; Thinkpads don't seem to have that problem, go figure). I keep losing the connection, and the only way to reconnect is to toggle the WiFi off and on. It quickly gets tiring, and the Thunderbolt adapter solves that.
Thanks for the tip. Now here is the funny part: the router is Netgear WNDR3700(v1) running OpenWrt. I have exactly the same router with exactly the same software at home - and it doesn't behave the same. It runs great at home, no dropouts.
That's awesome for your use case, but I've come across tons of times where it'd be useful, I'd forgotten the Thunderbolt adapter, and I was wary of Thunderbolt adapters just lying around.
Think not relying on WiFi while giving a presentation.
Yeah, it's useful to have the adapter, but do you need it so often that your Ethernet cable is 100% attached to your notebook?
I mean, I actually don't want only four USB-C ports.
But I'm fine with having MagSafe, 1-2 Thunderbolt, 1-2 USB-C (depending on how many Thunderbolt), an SD card slot (I have a 256 GB extension card), and maybe HDMI.
Ethernet falls short, though, because the port is so big. The smallness of the Mac is both a pro and a con (it runs hot; I actually had to clean my MacBook Pro because it got slower after too much dust built up inside). But making the whole machine thicker because of one port would be bad.
I mean, yes, I agree, the average consumer is not going to use ethernet nor function keys. The average consumer is going to use the laptop to browse, write documents, watch videos and, maybe, do some basic media editing.
And that precisely is who the macbook line is for: slim, portable, affordable (in apple terms) and, let's be honest here, more than powerful enough to do anything you might want to do with it.
The pro line, as I understand it, was always oriented to professionals; people that maybe needed more from their computers. That usually means much more horsepower: More powerful chipsets, definitely more memory, in many cases a better/dedicated graphics card, better and larger drives... And yes, sometimes, function keys.
Usually, it also means having the ability to connect external stuff: a standard media output for meetings and presentations, a network connection (even today, plenty of companies don't have wifi, or if they have, it's a guest one), external storage, graphic tablets and plenty of other stuff. Sometimes you need all this at the same time.
Personally, I've found the SD card reader useful a few times, but I could do without it; the HDMI port has always been very useful; I'd like more than two USB ports, but they're usually enough; and the only times I've used the Thunderbolt ports have been with adapters for other stuff (like an Ethernet cable).
He must be in some part of their target market, considering that Apple has offered the MacBook Pro 13-inch form factor essentially unchanged for the last 10 years. (...with non-retina display, superdrive, ethernet jack, kensington lock slot, etc.)
As a counterpoint, I use my MBP for work (both coding and photo editing) and my list would be:
Should have:
* Headphone jack
* SD Reader
* Two USB ports, one on each side
Don't care:
* Video output ports - the few times I've needed a second/larger monitor AirPlay worked fine for me
* Function keys - I don't mind having to use fn to use the function keys
Don't want:
* Ethernet
* CD/DVD drive
* A dozen built in USB ports - if I need more than two I'll buy a hub
Not including a headphone jack would be really annoying, but I would probably still buy one. I could buy a cheap pair of Bluetooth headphones and be fairly sure they'd last a while.
On the other hand, no headphone jack on the iPhone 7 means I absolutely will not buy one.
Getting rid of video out would be the single dumbest idea they could ever have. You will literally lose every professional on the planet who needs to present. I don't know of a projector in a corporate environment that supports AirPlay; 90% of them are still stuck with VGA.
I agree, but aren't there several USB-to-VGA/HDMI/what-have-you adapters on the market? I'm assuming most people who need video out need a dongle anyway, but maybe HDMI is enough these days.
You can use options in Keyboard preferences to default the combo-function keys to actual function keys (where you'd press fn to use them as hotkeys rather than the other way around)
The built-in mic can be set as the audio input in Audio & MIDI settings.
I love MagSafe. Very skeptical about USB-C. We'll see.
I love Lightning. I'd happily swap Lightning for all the USBs. I'd happily use lightning headphones.
I don't remember ever using Thunderbolt, so those can go.
I've never had a second (or third) screen. I've always just used Teleport span to my iMac. I'd probably feel different if I was doing video or something. So I'm ambivalent about HDMI.
I bought one of those flush Mini SD adapters. Which is cool. But I could live without it.
In short, for my work, I just need Retina, keyboard & touchpad, SSD, power, wifi, bluetooth, and some lightning ports.
I like MagSafe as well. Apparently the cable that goes from the charger brick to the computer should be user-replaceable, as some have issues with fraying and replacement is very costly.
Thunderbolt I'm kind of meh on; I don't use external drives, and only use it for Ethernet and external displays (Mini DisplayPort to X), but I'd like to keep two of said ports... For USB I'd prefer 2x A and 2x C if possible, as an intermediate migration step... and I ABSOLUTELY use the headphone jack for analog headphones.
The rest: SD/Mini-SD is sometimes useful, but I could make do with a USB adapter... Retina, keyboard, touchpad, and SSD are essential... in fact, I want the older touchpad travel back. User-serviceable RAM and a standard SSD interface (m.2) and size would be nice. I was so pissed my current MBP is stuck with the RAM soldered on. I don't know that I'll purchase another MBP if I can't upgrade the RAM and drive later with standard parts.
Just to clarify, my old MBP has all of these things (and then some). I want its eventual replacement to as well, rather than being minified/brutalized into an aluminum brick with a dinky keyboard, ZERO plugs, induction charging, and wifi/Bluetooth-only connections.
The Air I was given at work has no ethernet plug, even though I use wired networking every day, and the "College Humor" iPhone 7 review hits too close to home.
Headphone jack with microphone support? Yes. MagSafe? Yes. Multiple USB and Thunderbolt/Mini DisplayPort ports, and the function keys can be set to work that way by default.
I just got a MacBook Pro and I don't think anything is missing from your list. Not 100% on the headphones.
He sounds exactly like the target market for the MacBook Pro to me. That's the one Mac Laptop with the horsepower for programming, photo editing, video editing, and other professionals. The average user you are imagining can save money by using the Air or MacBook-1.
I think getting rid of the magsafe connector is a much bigger deal.
I love my macbook charger, but it is expensive as hell to replace.
I really like the idea of a universal charger as well. So it seems weird that they're on the front line of the USB-C open standard on laptops, but use an incompatible plug on their phones.
Plug-wise, it will make Android a better match than the iPhone.
To me, using the same charger for your phone and your laptop is weird - after all, your average phone charger will only deliver 10-18W, while a MacBook charger usually needs to supply 45-65W. So if you use your phone charger on your laptop, it will be too weak to charge it while it's being used (and will take hours and hours to charge from 0% to full), and if you use your laptop charger on your phone it's massive overkill. My point is, it's all about what customers expect. If you tell them they can use the same chargers for both phones and laptops, they won't be happy, because their laptops won't perform as expected.
As an anecdote, I can offer the example of the Asus Transformer T100, which my mum has; she constantly complains to me that it actually discharges while she is using it. I go and check, and of course she is using a 1A micro USB charger instead of the 3A micro USB charger supplied in the box. Customers can't really be expected to know about this stuff (I mean, they can, but a certain group of them certainly won't).
Those are mild inconveniences — but how is it not better than what we have today?
Today you can't use your laptop charger to charge anything but your laptop (you can pass through your laptop's USB, but that's another cable) and if you have a dead laptop and no laptop charger you're out of luck.
With USB-C you can just carry around your laptop charger and use it to charge your phone if needed. If you forget your laptop charger you can borrow a USB-C cable from just about anyone (because they'll hopefully be universal), or spend like $10 on one if you're traveling.
That seems like something that could be solved by color coding or otherwise marking the connectors. I know Apple is allergic to color coding (and colors, generally), but there's opportunity to expose more information to the user other than simply by the physical shape of the connector itself.
IMO, the connector should prevent you from doing something dangerous and/or harmful (i.e. you mustn't be able to plug an unregulated charger into a device that's going to turn into a flaming ball of slag) but the fact that a device is going to charge slowly is sort of a "SHOULD" rather than "MUST". Marking the connectors -- put a different symbol or color on high-current ones -- would go a long way towards alleviating the unpleasant surprise of not having a device charge like you expect it to.
And while I know that designers and developers like to laugh at the stupidity of non-technical users, I don't think the average person is incapable of discerning that (say) "hotter" colors mean more available charging capacity, provided that it was standardized. Hell, they're probably fine actually reading numbers as well, if the number of amps of available current was clearly marked in the same place on various chargers and not in minuscule print.
Couldn't you build a charger with high capacity that detects the device and delivers the appropriate output? I can imagine it's not that much more expensive than a standard charger and would be a great selling point.
That's not a problem - you can safely charge your phone even with a 100W charger, your phone is smart enough not to draw more power than it needs.
The problem is, that a more powerful charger needs to be physically bigger. 65W Mac charger is definitely a lot bigger and heavier than a 5W iphone charger. So you can't make iphone chargers which are still very small, yet can charge MacBooks when needed.
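The arithmetic behind both situations (slow charging, or discharging while in use) is simple enough to sketch. The wattage and battery figures below are illustrative, not from any spec sheet, and the model ignores conversion losses and charge-rate tapering.

```python
# Rough charge-time estimate from charger power and battery capacity.
# Ignores conversion losses and tapering; figures are illustrative.

def hours_to_charge(battery_wh, charger_w, draw_w=0.0):
    """Hours to fill the battery; draw_w is power used while charging.
    Returns float('inf') if the charger can't keep up with the draw."""
    net = charger_w - draw_w
    if net <= 0:
        return float("inf")   # the device discharges (or holds) while in use
    return battery_wh / net

# A ~54 Wh laptop battery on a 10 W phone charger while idle:
#   hours_to_charge(54, 10)      -> 5.4 hours
# Same battery and charger with the machine drawing 15 W:
#   hours_to_charge(54, 10, 15)  -> inf (it discharges in use)
```

That `inf` case is exactly the T100 anecdote above: a 1A charger supplies less power than the tablet draws, so the battery drains even while plugged in.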
I can charge my Chromebook Pixel 2 with my LG G5 charger. It will take forever to charge fully and will drain faster than I'm using it, but it absolutely works.
This was still infinitely more useful than my SO's MacBook/iPhone combo during the hurricane that knocked out power to my city for about 5 days. (90% lost power for some amount of time, while around 20% still didn't have electricity 5 days later, my SO included. I didn't follow the recovery schedule after that.) While I was able to charge my laptop completely on a mobile power bank, they were relegated to their phone until we could find a hotel with restored power to charge it at. Again, it's not like people will be able to carry around just one cable "to rule them all", but they'll get a lot more utility out of that one cable than they used to.
Did you just stop reading the comment after that sentence?
I was able to fully charge my laptop from a power bank using a phone's USB cable while everyone else was relegated to simply using their phones and waiting for power restoration. What is not "working" about that?
Well, while I absolutely agree that being able to charge your laptop from a power bank is cool, I still think it's a usability problem - like in my original example, where my mum just doesn't understand how one micro USB charger can charge slower than another. Devices should be seamless and you shouldn't have to worry about it - if you can charge your laptop with a charger, it should provide enough power to keep using it while it's charging - yet most phone chargers will not do that with laptops. Tech-savvy people won't have a problem with it, obviously, but a lot of consumers will.
I think you're understating how big of a difference there is between a laptop charger and a phone charger. Of course two identical AC adapters for phones are difficult to differentiate; the cables are the same gauge, the bricks are about the same size, they both came with phones.
I'm not sure your mother would have as much difficulty differentiating something the size of a laptop charger from something the size of a phone charger... It's not like laptops are going to start shipping with phone-sized chargers all of a sudden.
Sorry, what do tablet chargers have to do with laptop chargers?
Like you said, "case in point", phone and laptop chargers aren't really going to be getting confused. I think the point you made about your mother is very valid, because a QuickCharge supported charger and a typical standard USB charger will look almost identical, not just in connector shape, but also in absolute size. I agree that adding tablet chargers to the mix confuses that even more. I don't believe that is the case when it comes to laptop chargers vs phone chargers. They are very different in size (both cable and adapter size). They also ship with different products; it's easy to think phone charger, same shape, same size, they do the same thing. I don't know if that will hold true when the chargers are less similar.
I would imagine people will understand "small charger, long charge - big charger, short charge", kinda like they do now, regardless of connector shape. You're right in implying that there is a big difference between tech savvy users and the typical consumer, so maybe I'm completely wrong.
Although I generally dislike custom connectors, I have to admit to liking the Magsafe connector on my Macbook Air. But the idea of a universal charger holds appeal too.
I miss the fat MagSafe. Thin MagSafe doesn't really work for me. And the replacement cost! Why can't just the MagSafe cable (to the brick) be replaceable?
We'll see about USB-C. I'll probably hate it.
As for the audio jack, meh. I won't miss it.
I love the lightning jack. I want everything everywhere to be lightning.
If they didn't care, why would they bother with a survey? Asking 'do you use this feature?' is almost the opposite of saying 'Lol, I'm removing this feature, I don't care if you use it'
Most people who buy MacBook Pros already aren't Pros. That's fine but if you then tailor the machine to people who aren't Pros ("most of our customers"), what are the Pros going to buy?
The Mac Pro was aimed at actual Pros. You can see how well that went.
The previous slots-and-bays Mac Pro was aimed at pros, generally. The current one is aimed at one specific niche. (I'm not actually sure what that niche is, only that I'm not in it.)
I'm kind of convinced the nMP was aimed at medium to large scale graphics video production houses. The kind of place where assets are kept on a SAN, accessed over a 10Gbit Ethernet, and no local storage is used at all.
They basically walked away from the single-user/boutique studios who don't need the complexity and structure of running a large datacenter. It was a completely mind-boggling choice.
I mean, yeah, at least they're asking... but look at the wording and response choices - they ask "...do you ever use..." and don't seem interested in how heavily you use a feature or how important you feel it is.
But you already have to connect a bunch of dongles and wires if you're using it at a desk...
Personally I find myself almost always using my laptop in one of 3 ways:
- on my lap, in front of the TV, on dining table, etc. - no dongles or wires
- away from my desk, but working - 1 wire for charger (maybe), 1 dongle for wireless full size keyboard and wireless mouse
- at my desk - 2 wires for monitors, 1 wire for charger, 1 wire for speakers, 1 wire for USB hub... then from that, wires for keyboard and mouse and USB hard drive and all the rest
It's not obvious to me that USB-C is going to make much of a difference to this one way or another. The lack of a MagSafe-type magnetic connector is a backwards step, in my view, and I can't say I'm overjoyed (in fact I'm quite the opposite) at the prospect of having to buy a whole new set of hubs, chargers and adapters. But I don't think my desk will be any more or less messy afterwards, and it's not like they're doing a Macbook and giving you only one port (assuming all 4 ports are equal, of course).
My use case is similar, and I'm bullish on USB-C improving this mess.
At your desk, you just need a dock. Everything you currently connect to your laptop can be connected to that dock, and the cables hidden away, then all you need to connect to your laptop is a single USB cable.
Away from your desk, why is the dongle for your keyboard and mouse a concern? Most dongles are tiny, and can be left plugged in permanently.
It isn't, the point is just that you need 1 dongle for this now, and you'll need 1 dongle for it with USB-C. This isn't making stuff better, but it isn't making stuff worse.
(Well, OK... it is making stuff worse, because you'll need to throw away all your old stuff and buy new stuff... oh, wait, you were thinking of keeping your old laptop? Well then, you'll need two sets of stuff! But the original point was solely about the cables and dongles.)
Not really. Now you need to charge the Macbook with MagSafe and connect the dock to it. It's still two cables. USB-C allows you to charge your mac through the same cord it transfers data to/from.
Not like it matters too much, but I see some point in it — I come to my desk, plug in only one cable from the dock and have it all set. Also I can easily see how demoing it from the stage falls under what Tim Cook calls "innovation" nowadays. Which seems to become more and more important to him.
> It's not obvious to me that USB-C is going to make much of a difference to this one way or another
If there is one cable for charging, displays, keyboards, speakers for EVERY kind of device rather than proprietary chargers /docking stations or a mess of thousand HDMI/DVI/VGA/sound/power cables, then within 2-3 years every good hotel room and every meeting room at your firm will be equipped with USB-C that is connected to displays, charging, sound and keyboard.
The nicest thing I can say is that it's getting better. Before WWDC/Xcode 8 this list would be twice as long, two years ago three times as long.
The first IDE I used was Eclipse, and it kind of set my expectations for what an IDE should do. My main complaints about Eclipse were "it's slow (lol Java)" and "it's hard to find settings in all the 100 pages of options".
Here's some stuff that comes to mind (it's 10 PM here so bear with me):
* Buggy. Infuriating stuff like sometimes not being able to expand compiler errors. xcassets go weird all the time (orphaned assets). It'll lose connection to the Simulator and I just need to quit everything and re-open. Speaking of the Sim, not even sure if they ever fixed testing Today Extensions, that was super-broken for ages.
* Crashy - this was under control for most of Xcode 7, but it's back since 8 beta 6 and not fixed in the GM. How it handles crashes is also horrible. It restores the windows out of order, so a tiny window I had for documentation gets the whole project opened in it and vice versa.
* Code-completion is not very intelligent. It got a lot better last year but not good enough.
* Refactoring tools are extremely weak for Objective-C, and 100% non-existent for Swift. This bothers me a lot. In a young project I like to rename and refactor things a lot, and doing it with regexes or "change it and see what breaks" feels like the stone age.
* Interface Builder is still not a pleasant experience. Xcode 8 helped a lot. I wish it had a better UI for AutoLayout stuff but I'm hesitant to complain about this since I have no concrete suggestions.
* The iTunes-inspired UI is infuriating for stuff like seeing progress when there are multiple things going on.
* Source/version control integration is useless, I don't know anyone who uses it. Do it right or don't waste your time. Only thing I use is blame view but even that is excruciatingly slow.
* Project file format is version control-unfriendly.
* Doesn't have niceties that Eclipse spoiled me with, such as a TODOs view that lists //TODO: comments across the project.
edit to add I also have a lot of trouble with the debugger, where a regular old NSException crash will send me to UIApplicationMain with zero context (aside from the exception details that get printed to the console), but I'm not sure where to place the blame there, lldb? the runtime?
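For what it's worth, you can approximate Eclipse's TODOs view from the terminal with grep. A minimal sketch; the demo tree under /tmp and the specific file extensions here are just illustrative:

```shell
# Crude stand-in for a TODOs view: list //TODO:/FIXME comments across
# a source tree, with file names and line numbers.
# Create a tiny demo tree (illustrative only).
mkdir -p /tmp/todo-demo
printf '// TODO: handle the error case\nlet x = 1\n' > /tmp/todo-demo/Example.swift
printf '// FIXME: leaks\n' > /tmp/todo-demo/Legacy.m

# Recursively search Swift/Obj-C sources for TODO/FIXME comments.
grep -rn --include='*.swift' --include='*.m' --include='*.h' \
  -E '//[[:space:]]*(TODO|FIXME)' /tmp/todo-demo
```

It obviously isn't a live, clickable view, but bound to a keyboard shortcut or run in a watch loop it covers most of what the Eclipse panel gave you.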
Have you used Intellij IDE or any of the related IDEs? I use it every day for various stacks and languages but have no experience with Xcode. I'm a little curious about how they compare.
I should try vanilla IntelliJ someday, because my experience with Android Studio was abysmal (it seems to be getting better now, but it's too late; it left me thoroughly disgusted with Android development).
I'm not the same person, but I'll never turn down a chance to moan about Xcode ;)
The main problems I had with it when I was using it, mainly Xcode 4/5/a bit of 6:
- super-lame extensibility support - there was basically none. You could write addins for it, but you had to use undocumented everything. To figure out the APIs, one would run classdump on Xcode's private frameworks, guess based on the method and class names, and experiment in conjunction with gdb. Then every other version, stuff would break
- inflexible window layout - yes, I know, Apple knows best, but, seriously, why do I have to have grep output share the same panel as compiler errors? And why does that panel have to be so tall and thin?
- stupid missing functionality - no M-x back-to-indentation, no M-x shell-command-on-region, no keyboard shortcut for jumping to the matching closing brace, no function for jumping to the next function, etc. (Xcode 4 shipped without even search and replace in selection! - and it took them about a year to get it back in)
- confusing rules about window layout - I used Xcode for 2 years and never figured out what was going to happen when I pressed Cmd+Shift+Y. I tried to set up a separate tab for the debugger, but it was 50-50 whether Xcode would use it. In the end I just gave up and let Xcode do what it wanted
- no memory dump window in the debugger as I recall?
- lame assembly language support - you can't step instruction by instruction using the standard keys (you have to dive into the lldb command line and do it the hard way), and the register display is shared with the locals, so you keep having to scroll (and thanks to the inflexible window layout you can't split them apart and have them separate)
- seemingly no thought put into which files are per-project (and so live in version control) and which are per-user (and so want to be gitignored or whatever). The default is for per-user build configs, which is... well, not a crazy thing to support, but it's a poor default
I've used numerous packages over the years, and I've ended up getting used to every single one... except Xcode 4.x and later.
Xcode 3.x, after a fairly ordinary period of adjustment, I had no problem with.
(I'll give Xcode 8 a quick spin this month; I used Xcode 7 a bit a couple of months ago and it suffered from many of the same problems. But is it my job to try using stuff that was awful in the past, just on the off-chance that it might have been improved, or is it Apple's to make software that doesn't suck so much that after using it every day for 2 years I basically resolve never to use it again and indeed to this day actively turn down work just on the chance I might have to use it? There are, of course, arguments for both... but I'm going to go for the latter myself.)
Honestly, I would be completely fine with just about everyone doing away with the 3.5mm jack, but only if there is a good plan to move to something just as universal (say, USB-C) AND there are already compatible headphones out there for it. Yes, it will suck for many who have older headphones, but if it truly helps the miniaturization of technology I would be willing to bite the bullet, as long as it's universal.
Wireless right now kinda sucks. I've gone through 5 different pairs of wireless headphones ranging from cheap $30 Amazon ones to fairly expensive $150 ones* and pretty much all of them have connectivity issues regardless of the phone I use even when they're in my pocket. Plus battery life isn't very good. So we need a good, wired solution here.
Now moving it to lightning? Absolutely no way.
* Surprisingly the cheaper the bluetooth headphones the better audio and connectivity I've had. Go figure. My latest $70 LG headphones are MILES above the $150 ones I bought and broke a few years ago (can't even remember the brand at this point).
I think most technology is already small enough. I don't really care whether my laptop is a centimeter thick or seven millimeters; phones getting thinner doesn't really help me either. What's the point of making everything thinner? Saving weight is useful, but you quickly hit diminishing returns.
You're speaking of the comparisons of yesterday's computer with today's. These are minimal changes that add up. If you looked at computers 8 years ago to today the difference is absolutely astounding. But 1 year ago to today? Underwhelming.
Making things smaller has a lot of benefits:
- Axing analog components allows for smaller, digital components that can contain more capabilities
- Lighter weight
- Typically better battery life / efficiency. You can fit more battery in it and the signals have less to travel.
- Research / betterment of humankind. The more we can shrink things, the better, easier and faster we'll be able to use the same type of technology in smaller devices, possibly even ones that embed into your body.
You have a point, but when the way to make the device smaller is to just leave out useful parts because they're too big, that seems like a step backwards.
I wouldn't mind my mac being inside of my phone though, with just (properly working) wireless connections to an external screen, keyboard, touchpad and wireless charging.
Bluetooth audio is often criticized for inferior quality, but the truth is, many of the people I talk to just have experience with cheap (or older) Bluetooth audio devices - often with inferior hardware and poor audio codecs. I've owned some really bad Bluetooth audio devices in the past, so I can understand the criticism.
Still, I believe Bluetooth audio has come a long way. I personally use a pair of Sony Bluetooth headphones almost daily (with aptX on my MBP), and I've had absolutely zero issues with pairing or sound quality for consuming content - music, video, voice, etc. Granted, I don't use them for pro audio, but when I do occasionally need the additional benefits (lossless quality, low latency) for audio recording/mixing, I use what pretty much every other pro audio person uses: an external DAC with wired studio monitor headphones or speakers.
The point is, I believe most people would be happy with decent Bluetooth headphones. Those who wouldn't be happy with Bluetooth are likely to be using external audio hardware already.
I don't want to deal with charging my headphones, managing remaining battery life, keeping track of another charging cable at home and work and travel, etc.
That's a fair point. Managing headphone battery life has been pretty easy for me, but of course not everyone is the same.
Over-/around-/on-ear headphones tend to have pretty good batteries (the bigger size usually lends itself to longer-lasting batteries). In my case, with daily use of a few hours per day, I normally go for about two weeks before needing to charge my headphones, and even then, it's just a standard micro-USB for charging (I use the same cable for several other things anyway).
The limitation/hassle with batteries is obviously more pronounced with earbuds / lightweight headphones since they'll require more frequent charging. Apple seems to be alleviating that frustration - at least somewhat - by combining charging with the carrying case. As charging becomes more transparent to the user (or integrated into their routine), the less of an issue it becomes.
>many of the people I talk to just have experience with cheap (or older) Bluetooth audio devices
Dollar for dollar, you're going to get better-sounding headphones with a 3.5mm connection than with Bluetooth. So with price being equal, it will be the Bluetooth cans that are inferior quality, every time. This is significant to me, as a person for whom value per dollar is of utmost importance (and being wireless isn't).
That might be their marketing message, but I think you'd be hard pressed to tell me how, exactly, Apple products are in any way more luxurious than other products on the market. Higher quality in some cases maybe, but I don't know about luxury.
That said, it doesn't challenge the argument that at any price point, be it $100 or $10,000, you'll get better sound from hard-wired cans. Is that not part of the luxury equation?
Completely agree here. Anyone that is using a Mac for Pro Audio is using USB, Thunderbolt, or on older platforms, even Firewire to an external DAC.
I have a pair of Jaybird X2 bluetooth headphones which are fine for consuming content, but when producing music I use an Apogee Duet connected to a pair of Yamaha HS-8 studio monitors.
Here's a concept: things that don't need to be charged. If Apple removes the headphone jacks from their computers I won't be buying their computers anymore.
I charge a phone and two tablets, one of them a Kindle Fire. And I charge a watch. The "nightstand full of charging things" train has left the station for the kinds of customers who will adopt AirPods.
The only headphones I have that connect via a headphone jack are active and need batteries. Which you could argue need charging as I use rechargeable batteries.
You must be pretty unhappy with Apple for such a thing to be the breaking point, or audio on your computer via headphones is super important to you?
I only use it for conferencing. I use my phone for music, etc.
Of course I could never be an audiophile, I have some hearing loss, so I have some built in tolerance to quality deficiencies simply because I couldn't discern them anyway.
I'm actually afraid to use the audio jack on my MBPr. It grabs the adapter so hard I'm afraid something in the jack will break. This happened on my last MBP where the jack got stuck in audio mode due to some crappy little mechanical switch getting stuck or breaking in the jack.
You're right - I was mistaken (and will update my comment); the iPhone doesn't support aptX. Admittedly, I use the headphones more often with my MBP, but I haven't noticed any quality degradation when using my phone (as long as the source is a higher bitrate/quality).
I agree. I'm using a brand new pair of Beats (I know, I know, but they came with the MBP) and they sound really good. A little bass heavy but that is not because of the Bluetooth connection. Now, maybe a BT connection won't satisfy the most professional audiophile, but they should probably be using an optical link anyway.
I am completely different on this one. I go to work in various places and shove a pair of high end wired headphones in my pack, and love the quality. I have been very unhappy with the best bluetooth out there. I realize I'm probably not representative of most consumers, but that's what the 'Pro' part of MacBook Pro stands for. Mainly, I'm not going to be shoving a DAC in my pack. The idea of not including such a tiny port is absurd, and I have to start wondering if it's not just all a business scheme. Either way, if they go this way, I'll most likely just bail on the entire ecosystem, and I think many people are teetering in that direction anyway at this point. In some ways, it will be wonderful, as there will be so many more people making linux work even better on Lenovo's and whatnot.
I'm kind of particular about headphones; there's only a handful I actually find comfortable, and charging is always painful for wireless, so I'd actually prefer analog, low-tech. The cost difference isn't worth it, and there's a greater chance of failure in my experience. Keep it simple already.
For me, personally, anywhere that I want to plug in my ATH-M50x is somewhere that I'm not moving, so I don't mind a dongle. USB-C provides enough wattage to power an external DAC, and if Apple does get rid of the headphone port on the next MacBook Pro, then that's what I'll buy. (Currently I plug the headphones into the headphone port at work, but at home I have a DAC/amplifier plugged into my monitor's digital audio out, so the Mac's audio goes over HDMI.)
I care enough about sound quality that these headphones are the only ones I'm interested in using for awhile.
Apple has enough of a track record of getting rid of things, with people thinking it's absurd for them to do so, only for it to be obviously the right move within a few years. So I trust them a fair bit.
One of the biggest examples of this is releasing a phone without physical keyboard. How absurd!
There were several years of people insisting the iPhone was going to fail, or was inferior because it didn't have a physical keyboard like the Blackberry.
I use the same set of headphones with an Aune T1 DAC (I'm sure there are better ones out there, but I bought it a few years ago) and it works great. The thing I really like about this setup is that the DAC is class compliant, so it will work if I connect it to any iOS or MacOS device, although it requires a powered USB hub when I connect it to my iPad.
It's kind of cool to have a lossless, high quality audio setup you can listen to with a mobile device, although I'll be the first to admit that you could spend a lot more money if you wanted to get a better setup.
After dropping the VGA connector, making MacBooks clumsy for presentations, and the ethernet connector, making anything networking-related annoying, killing the headphone jack, SD card and USB-A will turn those laptops into nothing more than iPads with keyboards.
Oh no please. The fact that Macbooks do NOT have the old bulky VGA connection is a win in my book. The DisplayPort/Thunderbolt connection is much more versatile. I can hook up the macbook to whatever connection that I need, be it VGA, DisplayPort (full sized or mini), DVI or HDMI.
The old, antiquated VGA connector is only good for older projectors. Plus, when the notebook gets older, it is among the first things to corrode.
Is it such a bother to add an adaptor? Or a cable. Those can be found for cheap today.
I can understand that the Ethernet port can be more useful, but I have plugged an Anker adapter to the monitor in my office. I just leave it there. Or take it with me, if travelling.
You may call it 'older projectors', but it is the de facto standard. Any conference will make sure that at least VGA is available. It doesn't help if you show up with DisplayPort and the conference offers DVI.
I never had a female VGA connector fail. I wonder what you have to do to make that happen.
Apple's thunderbolt-ethernet dongles are from a mechanical point of view an extremely poor design. If you expect problems with a VGA connector, expect those dongles and the port on your MacBook to fail way earlier.
Don't get me wrong, it is perfectly fine if Apple offers modern connectors. It is just that some connectors, like VGA, RJ45, USB-A and 3.5mm jack are so universal that not offering them is just making lots of people suffer for no good reason.
And as for those connectors being bulky, just use the opportunity to add a larger battery. It's a MacBook Pro, it can use all the juice it gets.
Any conference will make sure that at least VGA is available
I can't remember the last time I was presenting at a conference and they didn't have a Mini DisplayPort cable available. Macs are popular enough for presenting that it's a non-issue everywhere I've been.
While I agree with the sentiment, presenting via Google Hangouts/screen sharing has been absolutely wonderful. It's cross-platform, and in our experience it usually "just works" and doesn't require any wires!
If you have no confidentiality concerns at all in any way in your work, sure... My clients would not be happy with me blasting their stuff over the interwebs.
If you have a decent internet connection. If the screen you want to display on is connected to the network. If nobody in the room is playing around with the wifi just for kicks. If you want to present data that you do not mind sharing with Google. If your client doesn't care one iota about security. If all the equipment involved was purchased in the last few years. If you are on an even-numbered street.
Wait until you want to display a breakdown of a merger bid to a board of directors who each had to lock their cellphones in a box before entering the room. Sometimes Google hangouts isn't an option. Sometimes that physical wire is needed.
Totally agree. It's been wonderful to have but I know it's not a fix-all.
We're very close to a world where we only need to plug in the USB type-C connector and we'll get power, internet, and display all in one cable. To me that's the end-game.
Up until this moment I actually believed Android/ChromeOS was ahead of the curve with USB type-C. I think we'll reach that moment as soon as we have Type-C laptops from Apple and Google.
Not a criticism... but when the directors locked their cellphones in a box, did anyone check that they didn't have a secondary phone or other recording device on their person?
It's a ritual, not an absolute safety. The worry isn't that the board members may record something but that someone may have hacked one of the phones. Accounts of hacked mp3 players or cameras are extraordinarily rare. Also, no ringing phones during an important meeting. Only if you are into national security stuff are people physically searched.
This. After owning four generations of MacBook Pros, I just bought a ThinkPad. Not having to carry around half a dozen dongles will probably offset the weight disadvantage, and definitely reduce my aggravation.
Thunderbolt to VGA is only a $29 adapter ($19 if you want to use a non-Apple branded one) that I'm happy to carry to avoid having a thick, ugly notebook computer.
The DB15 adapter is insanely thick for a modern computing device. It should have died years ago, but unfortunately it's still required for most aging conference room video equipment.
> PC manufacturers stuck with PS/2 keyboard connectors for another decade after Apple switched.
PC manufacturers still sell PCs with PS/2 keyboard connectors. Because once you lay out a thousand bucks for a KVM switch, you're in no hurry to throw it away. Because people still love those mechanical IBM keyboards. Because you can't plug a thumb drive into the keyboard port of a machine at a secure site. And PS/2 to USB converters are full of bugs.
Meanwhile PCs have had USB support since USB has been a standard. Apple "dropped" PS/2 because they never had it -- Macs used ADB. Adopting USB (from Intel et al) was a bid to stay relevant by becoming compatible with PC peripherals.
Dropping ADB (and VGA and RJ45 and so on) are nothing but Apple saving a buck at the expense of the people who still use those things. You can have the new thing without destroying the old thing.
You can have the new thing without destroying the old thing.
OK. So every computer needs to have, at minimum, multiple USB ports, probably at least one type C USB port, a PS/2 port (really at least two PS/2 ports), a serial port, an ethernet jack, a CD-ROM drive, a floppy drive, a VGA port, a DVI port, probably an optical audio port, an HDMI port, a pair of audio ports for stereo out, an SD slot and a 3.5mm headphone/microphone jack.
After all, it's wrong to destroy the old things just because there are new things. So I'm sure you'll get busy right now starting up a company to make computers that come with all of these and they'll sell like hotcakes, right?
There are a bunch of PC makers who already do. Take a look at the back of most workstation-class PCs.
About the only thing you're going to have trouble finding is the floppy drive, and that's because they're legitimately defunct. A PS/2 keyboard is just as good as a USB one; a floppy disk on a new PC is a useless anachronism.
Well, after VGA we got DVI; then there's HDMI, DisplayPort, Mini DisplayPort, and I've probably forgotten some.
All the while, VGA (and headphone jacks) just work. It is not the most brilliant solution, but it works.
I don't know the details, but some combinations, like going from HDMI on a MacBook to DVI, I think actually don't work. I guess due to HDCP getting in the way.
As for adaptors: nice if you use the laptop only at your desk. In the field I managed to break a Thunderbolt-to-ethernet dongle, and it also took out a Thunderbolt port on the MacBook. It used to be that only a cheap RJ45 patch cable would get old; you'd toss it in the bin and get a new one. I guess that's progress.
HDMI to DVI works without issues on the MacBook Pro. Just make sure your display accepts digital input. Some older displays/projectors had DVI without actually supporting the digital in pins and only had the analog (VGA equivalent) pins.
I know VGA is on its way out, but it's a sad thing. It's the last open-spec video monitor connector that we have. You need an annual licence to make anything that talks HDMI, and IIRC the org behind DVI closed the spec a few years ago.
I love the optical output embedded in the headphone jack. I think it's awesome and extremely useful for playing movies on my older home theater which doesn't have the HDMI switching capability to get DD 5.1 that way. And it's a lot more reliable than trying to 'cast' the video, particularly when the codec may be unsupported by the receiver as well.
I gotta be honest, I had no idea there was an optical output in the MBP headphone jack. Been using OS X since 10.3 with almost exclusively MBP's the whole time. Shows how much I use audio stuff. >.<
Every day I put my MacBook Pro on my desk at work and plug in power, two Thunderbolt connections, HDMI, USB, and my headphones. Sometimes I use the SD card reader. There isn't a free port on my computer. I would actually really like an extra Thunderbolt or USB port. So I'm hoping they don't reduce the port count on these things any more.
May I ask what you connect via Thunderbolt? I connect the two Thunderbolt ports to two monitors. I might actually connect a third now I think about it, since I still have an HDMI out that is not used.
I have one constantly connected to an ethernet adapter because my MBP's wifi cuts out randomly throughout the day (which kills all my remote sessions of course).
This helps only with laggy connections with the occasional lost packet. Not with hard network drops (not to mention, there is more than just the terminal impacted by hard network drops).
No, mosh doesn't work like that. You can turn off the network connection in one of the nodes (both the server and the client) for hours and when they come back you still have your session like it was before the connection dropped.
I do this everyday on the client side (taking my laptop to work, opening the lid and having my session going on), and sometime on the server side as well.
* One port dedicated to a monitor
* The second has to do double duty, swapping between ethernet ports and another monitor as needed.
I also have a USB->DVI adapter (https://www.engadget.com/2013/05/11/kensington-usb-3-0-multi...) when I need both the monitor and the ethernet port, but it's crappy. (Flickering cursor, occasionally cutting out, some applications like IntelliJ and Sublime Text have issues rendering on that monitor)
FYI, unless you have a dedicated graphics card (if you have the 13" you don't, for sure) you can't use both Thunderbolt and HDMI to power 3 external monitors. Source: I tried and expected it to work based on various news articles that said you could, but they left off the fact that you need a dedicated graphics card. That said, 2 external monitors plus my MacBook open is good enough until Apple releases an upgrade to the MBPr that I think is worth buying (and I upgrade to the 15" with dedicated graphics).
I don't want to get bothered with batteries or similar hassles when I'm using my notebook (as a desktop)
Yes, please let me add another short-life battery powered gizmo to my workflow
I use the old trackpad (with AA batteries), I have no interest in some rechargeable crap that will bother me as much as other (mandatory) battery powered devices
That's the case for me as well. I love the old wireless keyboard & trackpad that take the AA batteries, because I can swap them over so quickly, no hours of downtime due to recharging. Same with my Bose QC25 headphones, which last for weeks on a single AAA battery - I just keep a couple of spare AAAs in my backpack, and if I'm travelling it's easy to buy AAA batteries anywhere.
I do have a pair of Bluetooth headphones I use with my TV, but I never use them with my MacBook Pro, I always use the QC25s (and the media control cable would be awesome, if Apple hadn't hard-wired play/pause to only work with iTunes).
> I use the old trackpad (with AA batteries), I have no interest in some rechargeable crap that will bother me as much as other (mandatory) battery powered devices
Honestly, I think Apple is trying to figure out just how much stupid stuff they can pull off and still have people praising them left and right, just because they're Apple.
I don't get it. They have the Macbook Air and the "Macbook" (12") that target the casual-ish consumer-ish market for people that want a fancy, shiny Facebook machine. So why do they have to dumb down the Macbook Pro. It is used, as the name suggests, by professionals.
The MBP is the most widely used pro audio laptop. Some people I know do simple mixing without plugging in an external audio interface, just using the built-in audio. It seems that Apple does not care about those people anymore.
I do pro audio. Nobody serious is using the built-in audio interface. If you're recording, you're probably using some hardware from Apogee. If you're DJing, you're probably using some hardware from Native Instruments.
Why is asking a question to existing customers "Apple does not care about those people anymore"? I don't understand this sentiment as their actions show the exact opposite of your statement. Is this meant to just be a flippant comment to garner upvotes on HN? I don't get it.
It's also a reasonable quality DAC and more than good enough to do work on the move with. Your statement makes no sense, unless you measure "proness" by the girth of a connector?
Dear Apple: I abandoned my unlimited-data-for-life AT&T plan and jumped on Google Fi the last day they were offering phone discounts, just to avoid buying an iPhone 7. Yes, I actually use my SD card slot and my headphone jack. And my USB ports. The retina macbook is as close to perfection as I have ever imagined a laptop could be. Please, stop.
"Please remove all USB ports, headphone jacks, and other extraneous ports. It's more important to build for a functional ideal of simplistic beauty than for any kind of real world usability.
That is what I really want as an Apple consumer."
/sarcasm
I wonder if there truly exist people like this who are ruining Apple products for the rest of us.
For developers, the MacBook Pro is becoming less and less useful as Apple increasingly configures it for casual and business users. I've been hearing increasing grumbling about all the adapters and dongles you have to buy and carry. The prospect of no 1/8" jack is another step in this direction. I've noticed devs moving increasingly toward the Lenovo/Linux combo, or equivalent, as a result. I'm not sure Apple realizes how important it is to their business to have developers choosing their laptops, or the loss when the tide starts pulling away. It's been a difficult line for Apple to walk, between offering what developers want and what consumers want, but you actually need both for a fully functioning ecosystem. They are now beginning to make it all a consumer and business line, and I think they don't realize the importance of the ecosystem. Very worrisome.
While true, USB-C is slowly becoming the standard connector, so eventually there will be plenty of USB-C sticks and you'll no longer see USB-A at all. The dongle would be annoying, but I would at least feel like it's moving in the right, universal direction [in this case].
I am not looking forward to it. I just looked around my office, and there are 55 USB ports within arm's reach. (both host and target ports.) I will never ever get around to replacing all those devices with USB-C, so I'll be carrying USB-C dongles for a decade.
Standardizing on a single port will eliminate a lot of issues like this, but the transition is never very easy. Still, you likely won't need dongles. Just as you can buy USB-C to USB-C cables today, you can also buy USB-A to USB-A; likely we'll just have many mixed cables like that. The only exception would be something like connecting 3.5mm to USB-C. But as devices keep moving toward USB-C, you'll need fewer, different cords.
That's it! All my music is on my iPhone, so I never listen to music on my MBP and I've never used the audio jack. Having said that, unlike the iPhone, I don't really see what's to be gained by removing the audio jack. Especially as some folks have pointed out they use the digital optical out to go into a DAC and then into their headphones. That's not how I work, but having the audio jack there to support them doesn't hurt me any. In fact, I would argue that's the case for all the ports I don't use: it's a PRO machine. It needs to be able to handle all kinds of diverse scenarios not applying to me. Let the MBA be the razor-thin, portless machine and leave the MBP alone.
For phones, I'm an Android user, but for laptops, I've loved MacBook Pros (and personally bought three of them beyond the work-provided ones) for at least the last 10 years, for their premier hardware and solid, Unix-based OS. In between, I've taken cursory looks at premier Chromebooks or Ubuntu-based laptops as possible replacements, but have found them short on one pretext or another. But if the minimalism fundamentalists at Apple win out and start stealing required ports (and yes, I need the mini port, the HDMI port, and the SD card slot, along with USB), I'm gone. No more MacBooks for me.
I'm using an MBP I recovered from a wine spill. It also has a headphone jack broken off in the headphone socket. I'm currently using a bluetooth receiver connected to my audio source switch to connect it. It's not an ideal solution because the audio management in OS X appears to not want to reconnect to Bluetooth audio after sleep, etc. But it will prefer a monitor with HDMI over the internal speakers. I'm not sure why, and I don't know how to change it so that it reconnects to and prefers a Bluetooth audio device. So, for now, I'll be connecting the audio-out on the monitor to my source switch.
For unhappy reasons, I feel like at least a minor expert on the lack of audio jacks: I bought a Note 4 a little over two years ago, and mine was part of a batch they sold without an audio jack; there is a bolt in the hole. Verizon quickly stopped selling the jack-less Note 4. My wife's, which she bought after mine, has an audio jack.
So, I have over two years of experience with a phone without an audio jack. It sucks. I dislike using Bluetooth headphones.
I also use audio jacks on my laptops.
This seems like a profit thing for Apple, selling their own expensive wireless headphones.
I listen to music on my Mac using bluetooth headphones, but I still leave a pair of ordinary wired headphones plugged in just in case the bluetooth pair aren't connected for some reason, otherwise the sound will play over the built-in speakers by default. I'd be happy to give up the headphone jack but only on the proviso that I have complete control over what happens in situations where the headphones aren't there.
I'm sure I remember there being different volume settings for speakers vs. headphones. If there's anything playing out of the speakers, it's at the volume you last used them at, before you plugged in your headphones, no?
Yep, it’s possible to set different volumes for all sound output devices connected to the Mac (including the built-in speakers) in System Preferences → Sound → Output.
I use an iPhone headset with my Mac for Skype: while I've been using Bluetooth headphones exclusively for music with my phones for a decade now, the headset profile hasn't kept up with audio codecs, and it still sounds a lot worse for microphone voice.
Ugh, yes please! I think they'll end up licensing out the W1 chip technology, making it easier to pair devices so the device doesn't drop out. My Bluetooth speakers used to drop out and re-pair at max volume. It used to freak me out... to this day, I'm always very wary.
I use the headphone port on my rMBP (2013) daily. Frequently use it at the same time as both USB ports and the mag-safe charging port. HDMI port used often enough that I wouldn't buy a laptop without one. I've never used either Thunderbolt port and would actually prefer to swap those out for more USB ports.
[edit: Thunderbolt, not lightning. See, I never use it.]
I'm a weird use case: I don't use the audio jack on my Macbook Pro. As someone who still relishes their 10-year-old Apogee Duet audio interface on their 2015 Macbook Pro (Firewire 400 -> Firewire 800 -> Thunderbolt), I wonder if I can convert Thunderbolt now to USB-C? :)
USB does have DMA; in fact, the OHCI standard mandates it. The hub topology is a problem, as some devices ignore bandwidth allocation for isochronous transfers.
It's only displacement of wires, though. You'll need more wires to charge all your wireless devices.
For example, I still use a wired mouse because it never stops working. I use wired headphones with my work laptop and home desktop because I never have to hear those fatal words "Battery Low!" just when I want to settle in and focus on something for the next few hours.
This isn't Microsoft, they don't upload your usage habits to the mothership. Even on iOS which does some of that it's a super-clear opt-in on first launch.
I use wired devices on my desktop, where possible. The fewer batteries going to the landfill, the better. Rechargeable batteries are a real crapshoot, since I'm prone to recharging them overnight, when not in use, thereby shortening their lifespan (and it's a major PITA to figure out which ones suffer from that and which ones don't).
That doesn't make any sense, because Apple selling an adapter doesn't mean you have to buy theirs; them not selling a first-party adapter can only be a bad thing (and in fact it is a pretty horrible thing, as all the third-party adapters are crap :/ the Apple adapters tend to last forever, while the third-party HDMI ones I am forced to buy tend to literally crack into pieces after a rather short period of use). But the point is that Apple doesn't always go "yay, an opportunity to make money on an adapter": one of the most common adapters I see in the wild (to plug all but the largest MacBook into most modern projectors) is one Apple doesn't even bother selling.
I was about to call you on this, but checking the store it seems you're right: they're reselling another adapter, not an Apple-branded one.
However, I do disagree on the quality of third-party adapters; there are many fine ones out there, and the cheap Mini DisplayPort to HDMI adapter I bought last time I was in the US works picture-perfectly. I think it was PNY, but I'm not at home so I don't remember.
Not to say all are good, but there are quite a few good adapters.
Yeah, try finding a decent one on Amazon that will actually work with a Mac Mini or MacBook Pro. I had to buy 3-4 of them to find one that worked, and the build quality was absolute garbage on all of them.
I suspect the survey is just to show that they supposedly polled the community. It seems any decisions this big would have been resolved quite a while ago in order to be able to ship when it seems they will.
To analyze and understand Apple's move (and it's not just Apple; they are probably the one people complain about most because they tend to have a far smaller product range used by a higher number of people), one must think 5-10 years ahead.
By getting rid of all these different ports, they are making their product simpler, while still capable of interacting with devices that use them (via dongles). This means they have to build less into them making them also lighter, more portable and requiring a standard way to connect other devices.
Dongles are a pain. Yes, they are. But one can also argue that in a lot of situations, maybe the majority, if you need to use external devices, your laptop is probably sitting on a desk, in which case maybe even using some kind of hub makes more sense (this applies to home, office, and meeting rooms).
So why not wait 5-10 years to do this? Well, because by doing it now (Apple and others), they'll speed things up. They'll create more need for devices compatible with USB-C, which means companies will start making peripherals for that market.
However, I think by now we've all understood that Apple is aiming for a wireless future. The last compact camera I bought, in mid-2015, already had Wi-Fi. Headphones are moving wireless (hopefully in a safe way, since there is a reasonable concern about the health effects). Wireless peripherals already exist. The reason they still have USB-C is probably mostly charging, with the nice side effect of keeping other devices able to connect via a simple dongle (rather than a wireless hub).
I’m also not a fan of wireless keyboards and mice. Mostly because of the hassle of managing batteries/running out of juice and not finding other batteries. A good move from Apple would be to solve that problem: wireless charging becoming the norm instead of an expensive add-on, with longer lasting batteries, with the dream being on-the-fly wireless power feeding.
tl;dr - Apple wants to make things simpler, lighter, more portable and they are not afraid of putting the weight of their whole range of products behind it and taking the plunge. There will be some pain. We’ll hopefully end up with a better experience after the transition period.
The RF power levels in headphones are much much smaller than in mobile phones. Also, there is no evidence that the microwave frequencies used have any association with cancer. There are certainly problems (not cancer related, mainly heating and burns) with high power RF, but no consumer product could be released that would output those levels.
I use the headphone jack fairly often, but I'm fine with them removing it. I seem to buy new semi-expensive headphones every other year, and if my next purchase must be a pair of wireless ones, so be it.
I don't get all the angst over this. If you have a set of wired headphones that are irreplaceable to you, get an adapter. If not, get a set of new ones.
> I seem to buy new semi-expensive headphones every other year
Why? Good headphones last tens of years.
The alternative to a headphone jack is not wireless. Bluetooth audio and having more unnecessary batteries to charge suck. The alternative is a USB DAC + amp you can use with any standard pair of headphones.
Sometimes they break, by wear or by accident. Other times I find that I made a mistake in buying them because they are uncomfortable or badly designed (cord too long, clicker placed so low I can't use it if I'm wearing a coat). I don't buy new headphones every other year as a matter of principle; it just seems to have been the case for a good number of years.
> Good headphones last tens of years.
You must use yours differently than I. Or the ones you buy are more than just semi-expensive.
You know, you could just pay a technician 20 bucks for a good headphone cable (whatever Sennheiser uses in its top lines) and get it replaced. It is quite easy.
The 3.5mm port on the MacBook Pro is not just for headphones.
It doubles as an S/PDIF connector for outputting digital optical audio, so can be used to feed external DACs or other audio equipment with lossless audio.
I don't want to have to carry around a bag full of adapters for my laptop to be useful, just for the sake of making the machine a couple of millimetres thinner.
As with the iPhone 7, removing the 3.5mm port would free up a fair amount of internal space, which could be used for other purposes, and ultimately contribute to making the laptop thinner.
If there is not already a cable you can buy with Lightning at one end and whatever is needed to connect to your audio equipment at the other, surely it is only a matter of time before there is.
The gap between the surface of the screen and the bottom of the laptop is so tiny that when you close the lid, any dust particle, even so small you can barely see them, gets squashed against the screen. When you leave the laptop charging for a while, static seems to build up on the screen, and it attracts dust particles.
You can tell this is not an incidental thing if you examine the surface of a 2 year old screen. Even if you don't have visible scratches, you can see the contours of the trackpad on the screen, where the gap is a fraction of an inch wider, and the screen stays more pristine.
> You can tell this is not an incidental thing if you examine the surface of a 2 year old screen
I'm looking at a 6-year-old screen, a 3-year-old screen, and a year-old screen right now: no scratches on any of them, and I don't do anything special to take care of them.
Yup mine too, one grain of sand is all it seems to take. You could follow the path the single grain must have taken during a single commute. When the macbook is charging, a bit of static charge seems to build up on the screen, which can attract and hold small particles.
Well, animals with paws can scratch the screen. Cats and a bird video ("videos for cats" on YouTube) are a particular vector for getting your screen scratched...
If you've got a problem with cat claws scratching glass, maybe you should return Battle Cat to He-Man. Normal earth cat claws are nowhere near hard enough to scratch glass...
Glass? This is not the iPhone we're talking about, it's the Macbook Pro. The top layer of the screen is the polarizing filter, it's plastic and really not very scratch resistant.
I rarely use the headphone ports on any device now. Wireless bluetooth headphones with active noise cancelling are one of the best tech-purchases I've made. Would never go back.
If you don't mind crappy audio (music that skips, plays fast/slow depending on interference, etc) and terrible recording (handsfree car-like 8bit audio), Bluetooth is fine.
The audio recording hasn't been bad for a long time, though with BT it really is a case of you get what you pay for. I had a fairly nice BT headset from Jabra from around 2009, and it sounded good for playback and for talking to people on Skype/Vent/game chats. My cheapo Chinese knockoff headset that I bought for running has a horrible mic (but good audio).
I'm not saying it's perfect and everyone should immediately love it, but wireless audio is good enough for most people right now. The fact that I have the option of getting a $10 pair of BT headphones for running is pretty cool, and it's a purchase I do not regret in the slightest - they've survived Russian winters, so they're good enough in my book.
<sarcasm>iPad sales are falling so Apple thinks the way to solve that is by making the MacBook more iPad like so that people don't feel temped to choose one over the other.</sarcasm>