
Or they'll force the compiler writers' hand. See: EGCS.


"consensual slavery" is a tautology more ridiculous than anything the Soviet Union ever wrote in their propaganda. Nobody would sell themselves into slavery if they had a choice.

Which brings up an interesting angle: if every company is asking for an NCA, can't you argue that you signed it under duress, i.e. you had no choice in the matter, and that part of the contract is therefore invalid?


>"consensual slavery" is a tautology more ridiculous than anything the Soviet Union ever wrote in their propaganda. Nobody would sell themselves into slavery if they had a choice.

It might seem contradictory but the world is not always nice and tidy.

Consensual here just means "the other person agreed/signed to it".

But the forces that pressured them to do so are seldom examined -- and lots of people assume that as long as it's not a "gun to the head", it's OK.

Of course from:

a) "total freedom to evaluate and pick among many options -- or even ignore them completely" to

b) "tons of factors pressuring enormously for a quick decision, and very limited options"

there's a huge scale of "consent" before we get to "gun to the head" non-consent -- but it's seldom acknowledged. In other words, consent is seldom black and white.

If we want to make the term less ridiculous, consent should only be used when there's actual and considerable choice on the matter. Not when it's "that or immediate hunger/homelessness/etc."

Of course, in the long run anybody could end up homeless or hungry without a job, but there's a difference between a person choosing to work for company X after evaluating what they do, the salary, etc., and someone in an Asian factory town who has to take the nearby factory job or else, or a non-college-educated single mom in Alabama a week away from eviction, both of whom latch on to whatever they can get to survive.


>Nobody would sell themselves into slavery if they had a choice.

It actually used to be common in the Roman Empire. It was effectively how you declared bankruptcy.

This is something to bear in mind any time anybody advocates strengthening creditors' rights (e.g. removing bankruptcy protections or making them more onerous). The logical end point of this process is actual, literal slavery.


You don't need to go all the way to the end point. And companies may be legal persons, but they are not natural persons.

Strengthening creditors' rights against companies can be done without strengthening them against people.


Go start your own company?


No, he means like if you write the FLAC to a flash drive. Just write it to a digital medium, rather than going to the trouble of pressing discs.


Still, it held out for 4 years. That's pretty much the usual lifetime of a console generation, so I'd call the system a success.


It was broken much, much earlier. At first it required an external device (a special cartridge) or hardware mods, but there has been a software exploit via the web browser for over a year now.


In addition to being broken earlier, it's been trivial for even a casual user to pick up the exploits and use them to run arbitrary software on the device. There's a long-running website out there you can load in the 3DS web browser to exploit your device with a single click.


It held out for 4 years before being completely, irreparably broken.

Many smaller breaks happened in those 4 years.


Well, the first break that enabled piracy happened about two years after launch. Also, the Vita, which launched around the same time, is still pretty much unhacked. It's a shame there aren't as many good games on it for it to be a commercial success. In fact, the Vita does many things right (security-wise) that the 3DS fails at.


It's no worse than checking return codes, though. And I really don't want integer overflow checks turned on for all integers. I'm writing stuff that has to keep running, even if it occasionally produces a wrong answer. Having hidden checks inserted by the compiler that just crash the program would be really bad. Yes, I should check the inputs, but if I ever forget one, it's better that some ranges of input data produce nonsense results than crashes.


This fact should be drilled into the heads of managers everywhere: Henry Ford, one of America's great entrepreneurs and businessmen, set his workers' shifts to 8 hours not out of some love or sympathy, but because 8 hours is the maximum work you can extract from a human being in a day. It's not some average or middling amount; it's the maximum sustained amount. If they could do 9, he would have had them do 9.


Lol, that poem. That poem is meant to be taken literally: the poet wrote it for a friend who wouldn't shut up about nature hikes.

You had a much better saying at your disposal "The grass is always greener on the other side". Alas, you chose not to use it and now we don't know how your comment would have turned out, had it made sense.


Jesus Fucking Christ, now we're exploiting undefined behaviour to remove one compare instruction? You don't even need the jump, just use a conditional move. Just..

    test rax, rax
    add rax, $offset
    cmovz rax, 0

Or keep the damn jump and expect the cpu to profile it correctly.

Sorry for sounding angry, but this kind of thing isn't what makes my programs faster; it's what makes it harder to write correct programs.

Edit: clang does exactly this at -O1: https://godbolt.org/g/o6gh0M


A small nitpick: your code is incorrect because the add overwrites ZF, so the cmovz tests whether the sum is zero rather than whether the original pointer was. (Also, cmov can't take an immediate operand.)
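For what it's worth, a flags-safe variant (a sketch, untested) would compute the offset with lea, which leaves the flags alone, and select from a register source:

    test rax, rax
    lea rcx, [rax + $offset]
    cmovnz rax, rcx

If rax was null it stays 0; otherwise it gets the offset pointer from rcx.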


That you need to invoke undefined behavior to remove the compare instruction is only the symptom. The actual disease is nullable pointers themselves, which create the need for a compare instruction at every pointer dereference in the first place.


No. Java has nullable pointers and almost all JVMs implement them with a pointer to address 0 and unmap the first couple of pages of the address space. That will cause a fault, leading to a signal, which is then handled to materialize and throw the exception. No compare. Just naked loads/stores with fixed offsets. The disease is multiple (implementation) inheritance.


> No. Java has nullable pointers and almost all JVMs implement them with a pointer to address 0 and unmap the first couple of pages of the address space.

This is just plain ugly. Why should the normal operation of a program written in a high-level language trigger page faults? But, leaving aesthetic concerns aside, I'm not even sure it works. How do you guarantee that the OS won't give you back the same pages you unmapped when you try to map new pages (say, to grow the heap)?


> This is just plain ugly.

Yeah. Unfortunately the only really efficient protection mechanism that modern processors have is virtual memory. C++ runtimes generally unmap the first few pages for exactly the same reason: to catch nullptr derefs.

> Why should the normal operation of a program written in a high-level language trigger page faults?

NullPointerExceptions are not considered normal operation. They are safety violations with controlled, well-defined semantics. BTW, page faults happen all the time; the OS transparently handles them and maps in new pages as necessary. The problem you are referring to occurs when a page fault happens and the OS knows there is no mapping for that address.

> How do you guarantee that the OS won't give you back the same pages you unmapped

Because the mapping is for an address-space range (e.g. 0..N), and the OS does not overlap new requests with existing mappings unless the request specifically asks for it.


> NullPointerExceptions are not considered normal operation. They are safety violations that have a controlled, well-defined semantics.

I'm not really buying this. My definition of “normal operation” is very simple: Everything but FFI calls. Normal operation in a safe language is supposed to be safe.


> and the OS does not overlap new request with existing mappings unless specified in the request.

Ah, so by “unmap”, you actually mean something like POSIX's `mprotect()`, rather than `munmap()`?


Sorry, yes. You can do this just through segment declarations in both the ELF and Mach-O binary formats, to prevent anything from accidentally being mapped there before startup.


Garbage collectors (and various other parts of the JVM) use similar tricks to coordinate threads without memory barriers. The compiler just inserts TEST instructions against a fixed memory address at strategic places in the code.

When the garbage collector needs to run it protects the page and waits until all threads have segfaulted and transferred control to the collector.


Malloc never fails, but you might die if you touch the memory. In general, modern OSes don't have a good story about exhausting available memory beyond "let's kill a bunch of processes to free up memory".


> Malloc never fails

malloc can fail even on default Linux (overcommit enabled), for instance if you go above the process's vmem limit (because you're on 32-bit, or rlimit-constrained). And of course not all OSes overcommit; Windows famously does not.


It's a way to mitigate SQL injection attacks when you don't want to pay for a rewrite of a data layer that communicates with the database by blindly mashing strings together.

And don't get me started on what happens if your name can't be written in US-ASCII, i.e. if it has weird squiggles above or below the characters, or if it's written in, gasp, Cyrillic.

