Yes, the 10th generation of the Facebook/email CPU
Nobody would use that for real work. Apple won the laptop market when developers started using it; now they're killing laptops in favor of tablets, which are gadgets, not work machines.
I recently got an upgrade, but I spent quite a long time working as a professional data scientist and data engineer on a 2013 MacBook with one of those Facebook/email CPUs. Quite happily, too. It was never going to hack it training any deep neural nets, of course. But that is what our data center is for. I wouldn't want that kind of stuff running locally anyway, for a whole host of reasons. And it turns out that analyzing data in Jupyter or R can easily be a lighter workload than displaying Facebook ads or doing whatever it is that Google has recently done to Gmail.
I will admit that our front-end developers all have nice high-end machines, multiple high-resolution monitors, all the good stuff that Jeff Atwood tells us we should buy for all developers because They Deserve the Best. I attribute our site's sub-par (in my opinion) UX on the Facebook/email CPUs and 13-15" monitors that our users typically have, in part, to my colleagues' belief that high-end kit is necessary for getting real work done; it's a bit of a self-fulfilling prophecy. There's not much intrinsic incentive to worry about user experience when your employer is willing to spend thousands of dollars on insulating you from anything resembling a user experience.
It always amuses me how warped many HN commentators' perspectives are on what is and isn't "real work". At times it borders on the "no true Scotsman" fallacy.
I assure you that plenty of people are using those computers and getting paid for the work they do on them.
Most of these people do their work without writing any JS too, enabling them to cope without 32 TB of RAM to manage their 16 tabs of JavaScript, each running in its own Chromium instance.
Some day, we’ll re-learn some old lessons about the need for efficient utilisation of resources. In the meantime, check out the latest old game re-implemented at a fraction of the frame rate in your browser.
To be fair, my laptop at work has 8 cores and 64 GB of RAM, and it's already 3 years old.
I'm going to replace it soon, not because it's not good anymore, but because some of the hardware has become too slow for the things that are necessary nowadays.
Mind you, it's not only my choice; things are more complex now. Deadlines aren't any longer, quarterly reports still come every 3 months, but the things we have to do have become more complex and computationally heavy and need new hardware to keep up.
I'm a hobbyist musician; a 300-dollar laptop is good enough, and even if I were a pro it would be enough.
Truth is, the computational power of a 100-dollar smartphone would be enough.
So a new chip from Apple doesn't change the fact that their mobile line is already overpowered for the average use case, while the laptop line is underpowered and overpriced.
I think the assumption is that users on HN are devs. I mainly do ops and have a huge vCenter installation available to me, as well as AWS and Azure test accounts. My specced-out work MacBook Pro mainly runs iTerm, nothing that my personal MacBook Pro from 2013 can’t do.
There are only two things I use at home that could benefit from an upgrade: AppleTV hobby projects and YouTube. Not the video playback, that’s fine, but the pages load slowly.
> I think the assumption is that users on HN are devs.
That's a fair assumption. My problem is with the other poster using the term "real work" to imply that Apple's devices are underpowered or useless. And even then, if they are, there's a lot of dev work that can still be done on machines a decade old performance-wise.
I'm starting to think I'm in an episode of The Twilight Zone: we are in an alternate world where people suffer from severe cognitive dissonance and can't argue properly.
"Real work", when referring to computers, means heavy load.
The original post said "Apple is killing laptops to sell more tablets, which are gadgets, and you can't do real work on gadgets".
Which is true
There are a lot of people riding push scooters; you can't do real work with them. You need a proper vehicle if your job requires moving things and/or people all day.
Paper and pen have nothing to do with laptops, and, usually, the brain is more powerful than an i5.
> "Real work", when referring to computers, means heavy load
I think this is perhaps the source of your problem. You are assuming that others interpret the phrase "real work" to mean exactly what you think of when you hear the phrase.
For many people, when you say someone is not doing "real work", you are implying that their work is not important, or it's not valid. If someone says to you "why don't you go get a real job" - it's the same kind of thing. There are plenty of jobs in our industry writing CRUD apps for businesses, for example. Those ARE "real work", no matter how common or unglamorous they might be. However many of those jobs can easily be done on a machine with very modest resources.
Yes, there are jobs where the demand on computer hardware is much more resource-intensive. But it is a mistake to assume that those scenarios are what people will think of when you use the phrase "real work".
Here on HN, "real work" is not something an i5 can handle.
And if an i5 can handle your workload, then the A13 won't make a difference in either price or performance (because you already don't care about performance); it only matters to Apple's profits.
So no, an i5 is not enough for doing real work in technology
Any developer here on HN will tell you that for $1k you can find much better deals for the money.
If your job is not technology-related, you can do it with a 5-year-old laptop of any brand.
We buy new machines because we always need more power. To do everything else that is not real work, I own a 3-year-old 12-inch Chinese laptop with an i3 that runs Ubuntu and is just perfect.
You might be amused, but you wouldn't accept it as a work laptop if your company gave you one
> If your job is not technology-related, you can do it with a 5-year-old laptop of any brand
That really depends on what technology you're working with. In general, if you're not working with a bloated JS project or a multi-million line C++ codebase, a computer from the last decade will do just fine as long as it has 8-16 GB of RAM.
I mean, these days the difference between an i5 and an i7 is almost non-existent to me, as when possible I disable hyperthreading out of an abundance of caution.
There's a lot of "real work" in tech that can be handled on an i5.
Most embedded programming work could easily be done on an i5 from a decade ago.
> We buy new machines because we always need more power
We need more power because people keep developing bloated software for newer machines.
---
Can you define to me what real work is? Without simply saying "it's work that needs more than an i5 to handle", that is.
> computer from the last decade will do just fine as long as it has 8-16 GB of RAM.
That was my point as well: if you aren't working on anything that requires power, you don't need a "working machine".
But if you do, the A13 is not a solution
> We need more power because people keep developing
I'm not the one putting hundreds of ML/AI models in production
But I do enjoy having a system that makes it possible to test an entire stack on a single machine, something that just a few years ago required multiple VMs and complex setups
Even if you're developing puzzle games for low-end Android phones, an i5 is not enough.
You may not believe it, but it's the truth.
> Can you define to me what real work is?
Of course I can, even though you can't define what it is that can be done with a baseline i5 that qualifies as "real work"
A typical dev will do some, all, or more than these things:
- open up an editor with multiple files open, an integrated language server, a linter/formatter, and a background error checker
- open up an IDE (JetBrains, Android Studio, Xcode) on a mid-sized project with a few tens of dependencies and start working on it
- launch a Maven/Gradle compile
- launch a Docker build
- launch a docker-compose up with a main application and 2 or 3 services (DB, Redis, API backend)
- launch training on any of the available ML/AI frameworks; of course you'll run it on a very limited subset, and it's going to be slow anyway
- process gigabytes (not even in the tens of gigabytes) of data
- on my i3, even apt upgrade is slow. That's why I use it as a media player and not as a work machine.
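For concreteness, the docker-compose scenario above might look something like this minimal sketch (the service names, images, and ports here are illustrative assumptions, not something from the thread):

```yaml
# Hypothetical docker-compose.yml: a main application
# plus a database, Redis, and an API backend.
services:
  app:
    build: .              # the main application, built from a local Dockerfile
    ports:
      - "8080:8080"
    depends_on:
      - db
      - redis
      - api
  db:
    image: postgres:15    # illustrative database choice
    environment:
      POSTGRES_PASSWORD: example
  redis:
    image: redis:7
  api:
    image: example/api-backend:latest  # placeholder image name
```

Running `docker-compose up` on a stack like this means four containers competing for cores and RAM before the editor or IDE even enters the picture, which is the point being made about baseline hardware.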
I really doubt they are; I'm an average programmer.
My laptop is a working tool for professionals; if I used it as a dumb typewriter I wouldn't really need modern-generation CPUs, and a Pentium II would be enough.
RAM is a more pressing issue these days, given the amount of bloated software one has to run just to page a colleague about something (yes, Slack, I'm talking about you, but not only you...).
When I am at my computer working, I want it to do things for me while I do something else, effortlessly, without noticing that something else is going on.
If I have to watch it while it finishes the job, it's just a glorified washing machine.
And it means it is underpowered for my workload
That's why people usually need more power than the baseline, because the baseline is the human using it; the computer's job is not just to display pixels on command, it's much more than that.
Imagine you are an administrative employee typing a report on your laptop. You're doing real work, BUT you're not doing real work on your laptop. Or, to put it better, your laptop is sitting idle most of the time, which is not real work for it.
Work is a physics concept: real work means there is force involved and energy consumed.
If the energy is minimal or the force applied is almost zero, there is almost zero work done.
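For what it's worth, the physics definition being invoked here, in the constant-force case, is:

```latex
W = \vec{F} \cdot \vec{d} = F\,d\,\cos\theta
```

so with near-zero force (or near-zero displacement), the work done really is near zero; the analogy maps "force" to computational load.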
> real work means there is force involved and energy consumed
See my reply earlier in the thread. I think the primary source of contention here is that you assume everyone should only think of the definition you give for the phrase "real work".
Yup, I bought that one within days of the October 2016 announcement, because the new ones were outrageously expensive compared to what I was used to and I was not willing to give up the inverted-T ;)
There are tools that work and tools for professionals. Tools work; professional tools are for people who use them as part of their daily job and whose job depends on them.
Your opinion of "what works" is not universally shared by everyone.
You don't know the details of saagarjha's work developing on Android (unless, perhaps, you actually know them in real life, which seems unlikely given the way you responded), and neither do I. saagarjha is the one best able to determine what works for them.
If your point is that having more resources available to you can make you more productive, that's fine. It's always nice to have as beefy a machine as possible.
However, not everyone has that luxury. Businesses have budgets, and most developers I know don't get to set their own budget for equipment. Sometimes you are lucky, and the money is there, and your management is willing to spend it. Sometimes that is not the case. Regardless, my primary development machine right now is a 5-year-old laptop, and I get plenty of development work done with it.
The way you worded this latest response makes it sound as if you are saying that I am not a professional, and my tools are just "toys", because I don't work on an 8-core machine with 64 GB of RAM. I don't know if that is your intention, but if it is, it is both inaccurate and insulting.
> Your opinion of "what works" is not universally shared by everyone.
Earth orbiting around the Sun wasn't either.
See, the problem is not whether you are a professional, but whether the tool is.
If I do the laundry and the washing machine takes 4 hours to complete a cycle, I'm still washing my clothes, but I'm not doing it using a professional tool.
There's no place where I implied that people using less-than-optimal tools are not professionals; I'm talking exclusively about tools.