Hacker News | penglish1's comments

A few of these examples are great reasons to Google - but also reminders that for many new-to-you activities, you'll be very bad at first. The feeling of being bad often causes people to drop an activity before getting past the initial pain and reaching the point where they are competent enough to actually enjoy it.

But getting past that initial time hurdle isn't enough to make you basically competent either - you might no longer be put off by the overwhelm of a new thing, but you might still be really, really lousy at it. A more methodical approach is called for, hence:

https://lifehacker.com/learn-anything-in-20-hours-with-this-...

1. Deconstruct the skill: break down the parts and find the most important things to practice first. If you were learning to play a musical instrument, for example, knowing just a few chords gives you access to tons of songs. If you want to learn a new language, learn the most common 2,000 words and you'll have 80% text coverage.

2. Self-correct: use reference materials to learn enough that you know when you make a mistake, so you can correct yourself.

3. Remove barriers to learning: identify and remove anything that distracts you from focusing on the skill you want to learn.

4. Practice at least 20 hours.

Personally - I tend to get lost in over-analysis without ever actually starting! Googling the thing isn't step 1 for me, it's all the steps - an end in itself. I've got to constantly monitor myself on that one.


Googling is great, but I'd like to highlight that two distinct examples weren't necessarily about Google at all. Sure - The Wirecutter appears on the first page of Google results for "best dishwasher", but you could also simply start on The Wirecutter page. Same for the humidifier.

The Wirecutter is great, and as the article points out, not just because of the review, but because of all the detail about what went into it, and the meta-information (e.g.: dishwashers need to be cleaned! Who knew? You can pre-clean your dishes so much that the dishwasher won't work right! Humidifiers can over-humidify!?)

For a specific subset of The Stuff in the article, starting at The Wirecutter would be a fine thing, as it would be very difficult to find that info and meta-info through Google alone!

It is also worth noting - and well worth the purchase price (or often free online through your local library) - that Consumer Reports deserves a look. The OG of consumer product reviews, and, unlike The Wirecutter, funded entirely by subscriptions rather than referrals. Depending on the specific product, The Wirecutter or Consumer Reports may be more up to date, have better coverage, a different take, or any coverage at all.

One small difference between CR and The Wirecutter? CR has created a formal standard and is actively trying to review for, and push for, additional security and privacy in consumer devices: https://www.consumerreports.org/privacy/consumer-reports-to-...


Q: 1a. How come today, for home use, WiFi seems to exceed the price/performance of hard-wired? Is the Ethernet standard lagging, or lack of interest/demand, or some more physical barrier?

A: WiFi does not exceed the price/performance of hard-wired Ethernet. They are different enough that it is difficult to compare properly - for instance, for wired Ethernet, do you include the cost of the wire installation, termination, etc.? I don't think it is reasonable to include it - complex wiring gets more expensive, potentially by an order of magnitude! Fiber can be much more expensive than copper.. or cheaper. Fiber transceivers get very expensive for very long distances (miles), etc.

So to do a simple comparison - 2.5GbE vs. WiFi. Let's assume you are adding these to a computer which doesn't have them. The WiFi adapter is $26.90; a 2.5GbE adapter is $30.

What about the network switch (or WiFi access point)? Typically with wired Ethernet, you figure this as a "cost per port" - currently about $25/port on the low end (rated at 10Gbps, by the way, but able to step down to 2.5Gbps).

This (top-performing "high end") WiFi router runs $250. It may officially support 200+ clients, but that just means it can theoretically talk to them at all. Good luck pushing actual data to/from them! In practice, to match the cost/performance of a 10-port Ethernet switch, it would need to drive the full 2.5Gbps for 10 clients simultaneously! I guarantee you it can't.

And the price equation only gets better if you go full 10Gbps - which is probably the "sweet spot" right now for price/performance, buying in SOHO-sized quantities.
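To make that concrete, here's a back-of-the-envelope sketch in Python using the prices quoted above. The assumption that a whole WiFi cell tops out around 2.5Gbps of real-world aggregate throughput is mine - and a generous one:

    # Back-of-the-envelope cost per usable Gbps for 10 clients.
    # Prices are the ones quoted above; the aggregate WiFi throughput
    # figure is an assumption (and a generous one).
    clients = 10

    wired_cost = clients * (30.00 + 25.00)  # 2.5GbE adapter + switch port, per client
    wired_gbps = clients * 2.5              # every client gets its own full 2.5Gbps

    wifi_cost = clients * 26.90 + 250.00    # WiFi adapter per client + one shared router
    wifi_gbps = 2.5                         # all clients share one cell's airtime

    print(f"wired: ${wired_cost / wired_gbps:.0f} per usable Gbps")  # ~$22
    print(f"wifi:  ${wifi_cost / wifi_gbps:.0f} per usable Gbps")    # ~$208

Even with those generous assumptions for WiFi, wired comes out roughly an order of magnitude ahead on cost per usable Gbps.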

Q1b: The Ethernet standard is pretty excellent and currently goes to 400Gbps, and if you have the cash to buy 400Gbps equipment, you can basically expect it to work to spec - if the switch fabric says it will support X Tbps concurrently, it will. Good luck with that on WiFi.

There is a TON of demand - you just don't see it in your house. Or your neighbor's house. Or your friend's house.

You should though! Ethernet is AWESOME - and back when we had wired landlines, people ran those wires all over their house. It isn't that hard, or that expensive, and you'll get MUCH better, more predictable performance from doing it and moving as many devices as you can to wired Ethernet. Even your WiFi access point benefits from an Ethernet cable running to the location where its signal works best, rather than some closet off in the corner of your house.

Q2: Related, for somebody who wants reliability, is Ethernet/wired still a sane choice, or does that just make them an old geezer? In an urban setting, with overlapping WiFi cards all blasting full strength at their neighbours, is real-life WiFi performance actually anywhere near as good as wired performance?

A: No, real-life WiFi is nowhere near as good as Ethernet. Switch to Ethernet as much as you can! Of course, it doesn't make much sense to run an Ethernet cable to the tablet you read sitting on your couch. But wire every desk, printer, your WAP(s), etc.


And what happens when the 2 computers disagree? I thought 3 was standard practice for this sort of thing.


You need 3 for things where, if they stop working, the plane crashes.

This is more of an "if it stops working, the pilots have to fly more conservatively until it is fixed" thing. It is apparently considered acceptable for such things not to have triple redundancy, as long as a failure will be detected so the pilots can deal with it. Two computers are good enough for that.
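A toy sketch of the difference (illustration only, nothing to do with real avionics code): with two units you can detect a disagreement but not resolve it, so you hand the problem to the pilots; with three you can out-vote a single bad unit and keep producing an answer.

    # Toy illustration only - not real avionics logic.
    def two_channel(a, b):
        # Two units: a mismatch can be detected, but not resolved automatically.
        return a if a == b else None  # None => fault flagged, pilots take over

    def three_channel(a, b, c):
        # Three units: a single faulty reading is simply out-voted.
        readings = [a, b, c]
        return max(set(readings), key=readings.count)

    print(two_channel(101, 103))         # None - disagreement detected
    print(three_channel(101, 103, 101))  # 101  - bad channel masked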


The internet may get better, but that is not guaranteed. And I predict that if it does, it will be despite Facebook, not because of or with it.

Social networking was NOT an enormous pain in the ass. Everybody had, and still has, email - it has been the gateway to the internet roughly since the beginning, for everyone who joined. Email got used in almost all of the ways that Facebook is now.

In addition to email, there were ways to do "roughly" everything Facebook did, with varying levels of difficulty - some practically unobtainium for the "later billions" of people that Facebook drew - not just email-based listservs, but USENET before that, self-hosted web forums starting in the 90's, and Craigslist in the commercial space.

The problem IS Facebook (and in a different but related way, Google).

Unlike Craigslist (for example), Facebook is a public company. But it is not accountable to the public, as we've seen over and over again. Its users are a commodity and are not stakeholders in any meaningful way. But even the shareholders are not - Zuck owns AND controls it almost entirely. And unlike Craig Newmark (who isn't perfect), Zuck appears to have been deeply flawed from the very beginning, and the poison of unearthly money and power has only made it worse.


Email addresses were treated as disposable and temporary, from how I remember it, before Gmail went mainstream.

Providers came and went, and custom domains for email were extremely rare... at least in my circles.


> Email addresses were treated as disposable and temporary, from how I remember it, before Gmail went mainstream.

Not at all. On the consumer level they were typically associated with your ISP account, so you'd get to be <version-of-your-name>@<isp-net.com>.

My parents still have the email they set up with the minuscule local cable company in the early 2000's. That's only the second email address they've ever had - the first was the dialup provider email they had in the late 90's (when internet finally came to our small town).


Works fine until you move and change providers, or, as in the case of many of my friends/colleagues, change email addresses due to spam - which is often related to signing up for services which eventually get hacked. I regularly get spam addressed to custom email addresses; one of the first times was adobe28593@mydomain.com.


What was so special about Gmail? Except for the user interface and mailbox size, Hotmail and Yahoo were just as good and people used them for a long time.


Not sure, to be honest.

That's how it was in my circles until ~'05-'06. Everyone kept switching emails and used a different address on each service/forum - until Gmail came along. Suddenly everyone used their Gmail address for everything. And even that got phased out when Facebook came along.


The big difference that I remember with Gmail was the superior spam filter (other inboxes filled with spam quickly) and better security (everyone with a Yahoo Mail account had it compromised and started sending spam / virus-laden links after a while).


I really thought this would be about something else, given the title, but oh well.

As others have pointed out, this has been going on since.. the beginning of IT.

I got my start in the 90's, working for an employer that sold tech software on: AIX, Digital UNIX, Ultrix, SunOS, Solaris and IRIX. SunOS and Ultrix were on their way out, but still supported. Windows and Linux were relatively new additions. 64-bit was just starting to become a thing, so we needed to support both 32 and 64 bit versions.

Supporting dev, stage and prod environments for all this was a huge job, and took a fairly high level of technical skill as well as a huge amount of domain specific knowledge. Really - far more domain specific knowledge than technical skill, on the balance.

It did require building a lot of software, and autoconf was a (new) blessing for cross-platform builds.

My CS degree was helpful, but honestly there just wasn't much programming needed, outside of shell scripts.

At the time, every few years someone would say how all of this "IT management stuff" would be automated away. I distinctly recall Sun executives banging on about it in the media a bunch just as I was starting to work full time after graduation.

And I distinctly remember thinking - they are completely wrong. And they were.

Commoditization was a big shift - now all those UNIXes are gone and we're left with just Linux.

"The stuff" that cannot be automated was shifting then, has shifted a lot since then, and continues to shift.

At the time, lots of knowledge and skill was needed to build sendmail for every OS, configure and install it everywhere. Now we've basically just got Linux, and just about every package you can think of is available via apt and yum.

Configuration is done through a DSL like Ansible, Chef or Puppet.

And now we're shifting such that we'll just use SES or some other cloud service for sending, and we won't manage mail servers at all - or any other commoditizable service: SQL databases, NoSQL, NFS, block storage, etc.

Or we will, we just won't be tweaking many knobs and buttons on it - and we'll still be managing it primarily with a DSL rather than bash scripts.

And perhaps writing a fair bit more "real" code as well.
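For instance, the whole "build and configure sendmail on six UNIXes" problem above collapses into an API call. A minimal sketch - boto3, configured AWS credentials, an SES-verified sender, and the example.com addresses are all assumptions for illustration:

    # Minimal sketch: "sending mail" becomes an API call instead of a sendmail build.
    # Assumes boto3 is installed, AWS credentials are configured, and the Source
    # address has been verified in SES - all assumptions for illustration.
    import boto3

    ses = boto3.client("ses", region_name="us-east-1")

    ses.send_email(
        Source="alerts@example.com",
        Destination={"ToAddresses": ["ops@example.com"]},
        Message={
            "Subject": {"Data": "nightly backup finished"},
            "Body": {"Text": {"Data": "No sendmail.cf was involved."}},
        },
    )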

But - somebody's got to stitch all that together - as the article says, the key is providing value that is specific to the company/product/service. It always has been!!

Creeping - sure.

But is it an apocalypse if a bunch of DBAs and Windows Administrators have to learn some new skills, or retire, or lose their jobs? People who basically have had to be continually learning and adapting all along?

Was it an apocalypse when I "lost" the career value of all the skill and knowledge I had related to IRIX, Digital UNIX, Solaris, etc?

There is a REAL creeping apocalypse, but it isn't this. It is security. Software is eating the world, and for every line of code written, X new security bugs are introduced. In this, I'm including social engineering bugs.

That is creeping.

And the apocalypse will be when some combination of those bugs leads to something truly horrific. If it hasn't already - like the end of democracy.


The article says that Americans never really regained trust in American-manufactured sedans after the 60s/70s. But then it doesn't explain.. who exactly was buying all those sedans.. while not trusting them? There were definitely a LOT of American sedan sales for a while there.


I grew up in rural Iowa, and the concept of buying something not made by the Big Three was pretty exotic. The dealerships weren't around, so people didn't even consider them as an option.

Now most of the small-town dealerships have gone under as part of the restructuring from the recession, so people are going to cities to buy cars now anyway. When they get to the city, there are dealerships for the foreign carmakers, so I'd guess people end up more likely to buy one.


Federal, state, and local governments. Taxis. The Ford Focus did well too.


The companies appear to be complaining that they can't hire skilled people.

Paying more is one solution - hire the skilled people away from their current job.

But so is training. Training could be considered a benefit, but at one time it was the standard assumption - all jobs include training, particularly jobs that are in any way skilled. And not just training at the beginning, but ongoing. Every year. Like a salary, vacation and sick time. As part of the standard budget. Not the first thing to be cut - because it is just as necessary as salary, vacation and sick time.

Unless, of course, the job isn't really skilled and companies don't actually need skilled people.


Thank you for this public service!


I could use an overview that includes an update to the Computer Architecture class I took in the early 90's. This is good - for "general purpose" microprocessors.

At that time, nothing at all was said about GPUs - they basically didn't count at all. I don't really recall anything about DSPs either. And FPGAs were considered neat and exotic but a little useless, particularly given their cost, and more of a topic for EE majors.

Now I've seen a great update (posted to HN) about how FPGAs are basically.. no longer FPGAs and include discrete microprocessors, GPUs and DSPs.. often many (low powered) of each!

This statement: "The programmable shaders in graphics processors (GPUs) are sometimes VLIW designs, as are many digital signal processors (DSPs),"

is about as far as it goes. Can someone point me to a 90-minute guide that expands on that?

* What about the GPUs and DSPs that are not VLIW designs?

* What is the architecture of some of the more common GPUs and DSPs in general use today (as the article covers common Intel, AMD and ARM designs)? E.g.: differences between current AMD and NVIDIA designs? I don't even know what "common 2018 DSPs" might be!

* How does anything change in FPGAs now, and where is that heading? (The FPGAs-aren't-FPGAs article was a few years old.)

