Hacker News | mv1's comments

Yes, but with 5G, I now have high speed internet on your block too, and so does everyone else.


Except that my devices don't have 5G or fiber ports. I have Wi-Fi and Ethernet ports.


The fiber line goes to your house and then gets converted to Ethernet by a modem, just like cable does now. In fact, when I had FiOS, the fiber was converted to coax before it even came into my apartment, then went to the modem and cable box. So from a user perspective, it wasn't any different from cable.


Unfortunately, I think import-by-URL, by itself, is a bad idea. You ideally want reproducible builds, and npm provides a form of that with its enforcement of semver and hash checks in the lock files.

If you are really pedantic, you should probably also check in node_modules with every release you care about.

That said, a system that can source modules from a URL and then enforce hashing and versioning could be a win. I'd just hate to see a repeat of the early golang build system that led to the rise of gb.
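For reference, this is the form of reproducibility npm gives you: the lock file pins each dependency to a resolved URL plus a content hash. A minimal, illustrative package-lock.json entry (the hash is a placeholder, not a real digest) looks roughly like:

```json
{
  "node_modules/left-pad": {
    "version": "1.3.0",
    "resolved": "https://registry.npmjs.org/left-pad/-/left-pad-1.3.0.tgz",
    "integrity": "sha512-..."
  }
}
```

A URL-import system could get the same guarantee by recording an equivalent hash alongside each imported URL.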


This is a trademark issue and I believe laws vary by country. No idea how to quantify the risk though.


I think the problem is that building any new abstraction is a pretty big lift today, so innovation is slow-moving. Something like this can make that innovation easier, eventually giving way to some simple YAML files.

Unfortunately, it seems like every app develops its own pet requirements as it matures, meaning that the ability to build new abstractions might need to be part of the UI.

But generally, I agree, tools like this are needed because existing abstractions aren't good enough. I'm just not sure they will ever be good enough when everyone has custom requirements and app-specific architectures.


I'm optimistic: we might have to take a few steps back every now and then, but overall I think things will improve. Whether that will happen fast enough is anybody's guess...


Umm, the correct answer is to switch doors. It has been discussed ad nauseam, as you mention. The simplest way to see that switching doors gives you a 2/3 chance of winning is as follows.

Without loss of generality, assume that you always pick door 1.

If the car is placed randomly, you have a 1/3 chance of getting the car. To see this, note that when not switching, you only win if your initial guess has the car behind it, which happens 1/3 of the time.

When you switch doors, you win exactly when not switching loses. Since not switching wins 1/3 of the time, switching wins 2/3 of the time.

You can write a simple program to simulate the game and see that switching wins 2/3 times.


I love this comment: "The reason why we put it in is not physical data loss, but once in a blue moon you will have a bug that destroys all copies of the online data and your only protection is to have something that is not connected to the same software system." I think that is often overlooked when designing HA storage systems.


The purpose of double blind review is not to anonymize the work of well known authors - conferences, informal conversations, etc. guarantee that the work of established members of the community will be known, even before Google.

What double blind review does is give the unknown author a fair chance as there is always the outside possibility that it is the new work of a well known author that the referee has simply not seen yet.


A "fun" exercise in C++ is to take a line of code that makes extensive use of advanced features (templates, smart pointers, etc.) and step through a -O0 build with gdb. I was shocked, in some cases, at how many different functions were called for what looks like a simple expression.

This can lead to some interesting errors. For example, writing

    void foo(const std::string s)

by mistake, instead of

    void foo(const std::string &s)

won't result in a compiler error, but will add an unusually large, unexpected overhead to every call of foo, since the string is copied on each call.

On the other hand, the STL is great (error messages aside). I sorely miss it when programming in C.


> I sorely miss [the STL] when programming in C.

Well, C++ got at least one thing right: deciding to support parametric polymorphism (or "genericity"). Templates are quite ugly, but at least they do the job.


Historically, publishers provided at least two valuable services: distribution and marketing. They captured the value they were providing by "taxing" distribution (i.e., adding a profit margin for themselves). Ebooks make distribution cheap and easy, making publishers unnecessary in that role.

Marketing is still valuable, but publishers never directly charged for that (as far as I know), hence their dilemma.


I like Richard Hamming's quote, which goes something like: "What you do is luck, that you do something is not." (Where "something" refers to great work in this context.)

