
Both situations involve a huge amount of water flowing downstream. That needs to be replenished either through rainfall or through pumping.


See this >15-year-old video, "How to get featured on YouTube": https://www.youtube.com/watch?v=-uzXeP4g_qA. I remember it being originally uploaded to the official YouTube channel, but that upload looks like it's been removed now; this reupload is from October 2008.


Speaking of mathematical missteps relating to bases, I've always been baffled by why we refer to a base system by the number above the highest representable single digit. Every base is "base 10" in that case! Why is binary referred to as "base 2", when the number 2 doesn't even appear? Wouldn't it make infinitely more sense to refer to our conventional number system as "base 9", binary as "base 1", unary as "base 0", and hexadecimal as "base F"? Or we could have used a more sensical word like "ceiling" or "roof" in that case, to convey that it's referring to the highest single-digit value in the system.


It's the base, as in "base and exponent", of the value of a digit position. If the lowest digit is marked digit zero, then each digit of a number contributes (digit x base^position) to the total value. E.g. 1567(base 10) = 7x10^0 + 6x10^1 + 5x10^2 + 1x10^3.
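To put the same rule in illustrative Python (the function name and example values are mine, just to show the digit weighting):

    def digits_value(digits, base):
        # Sum digit * base**position, counting position 0 from the rightmost digit
        return sum(d * base ** i for i, d in enumerate(reversed(digits)))

    print(digits_value([1, 5, 6, 7], 10))  # 1567
    print(digits_value([1, 0, 1, 1], 2))   # 11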


Because it refers to the number of digits in the system.

Base 10 comprises ten digits, viz. count(0, 1, 2, 3, … 9) = 10

Base 2 is count(0, 1) = 2

Base 16 (hexadecimal) is count(0, 1, 2, 3, … 9, a, b, … f) = 16


But your counts are all using base 10! If we were to count the number of digits in binary using binary, we would get 10 digits. If we were to count the number of digits in hexadecimal in hexadecimal, we would still get 10 digits. This is true for all bases.


Changing the representation doesn't change the number. Ten, two, sixteen...


Try this:

Base 10 is count(0, 1, 2, 3… 9) = ||||||||||

Base 16 is count(0, 1, 2,… a, b, …f) = ||||||||||||||||


Straight and true.

The other view is fine too: Base-9 would mean "the highest single digit is 9".


The base b is actually the multiplier for the value in each digit: the first digit is b^0, the next to the left is b^1, the next left is b^2. Similar for right of the dot: b^-1, b^-2....


Steve Mann resolved this notational dilemma by using the term "crown", as in binary = crown 1, octal = crown 7, decimal = crown 9, hex = crown F, and so on.


What would you call base -2? Or base φ? Or base i-1?
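(For what it's worth, base -2 gets by with just the digits 0 and 1, same as binary, which is another way a "name it after the largest digit" scheme falls apart. A rough, purely illustrative Python sketch of the conversion, names mine:)

    def to_negabinary(n):
        # Repeatedly divide by -2, forcing the remainder into {0, 1}
        if n == 0:
            return "0"
        digits = []
        while n != 0:
            n, r = divmod(n, -2)
            if r < 0:
                n, r = n + 1, r + 2
            digits.append(str(r))
        return "".join(reversed(digits))

    print(to_negabinary(7))  # "11011" = 16 - 8 + 0 - 2 + 1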


>But day after day, over decades, surely the minuscule amount of heat being generated by activity on the surface has SOME cumulative effect.

No it doesn't, because the heat escapes out into space, so the minuscule effect is only an immediate one, not a cumulative one. That's why the greenhouse effect, being cumulative, is much stronger. If I put on an extra coat every day, it's not the heat output from my muscles in putting the coats on that is making me feel hot, it's the increasing number of coats that I'm wearing. Likewise, if I burn a tonne of coal, then that heat will have essentially disappeared overnight, but the global warming-causing CO2 will stick around in the atmosphere for another [very big number] years.


This very confident sentiment comes up in every comment section about IPv4/v6.

- "Updating the standard" is making a new protocol

- "forced suppliers to patch updates" - how?

- "USB backwards compatability that shiz. The fact I can take a modern usb device and plug it in a 1.1 gen port and it still just works. Why the hell isn't ipv4 like that for upgrades?" - because you're changing the address space of the protocol. If the new standard can address more than 2^32 things, then it won't be backwards compatible with v4.

- "Seriously is there any real technical hurdle why we didn't do it this way?" - Assuming you're talking about having a variable-length address from the start in IPV4, because I assume having a non-fixed packet header size would be much more computationally expensive and violate a lot of assumptions that you can make when the header is fixed (having a fixed region of the buffer that is known to always be the full header). You'd be much better having a fixed-length address that is enough to cover all possible nodes in the network - exactly what IPV6 has done.

- "Astounding they baked it in just xxx.xxx.xxx.xxx" - IPV4 was first deployed in 1982. Wikipedia tells me that the year before, there were just over 200 nodes on the ARPANET. I think you're doing a bit of a disservice to the people who designed this stuff by castigating them for not factoring a 20'000'000x increase in network size into their protocol.


>His stance on compiler optimizations is another example: only add optimization passes if they improve the compiler's self-compilation time.

What an elegant metric! Condensing a multivariate optimisation between compiler execution speed and compiler codebase complexity into a single self-contained meta-metric is (aptly) pleasingly simple.

I'd be interested to know how the self-build times of other compilers have changed by release (obviously pretty safe to say, generally increasing).


Hmm, but what if the compiler doesn't use the optimized constructs, e.g. floating point optimizations targeting numerical algorithms?


Life was different in the '80s. Oberon targeted the NS32000, which didn't have a floating point unit, let alone most of the other modern niceties that could lead to a large difference between the CPU features used by the compiler itself and the CPU features used by other programs written with the compiler.

That said, even if the exact heuristic Wirth used is no longer tenable, there's still a lot of wisdom in the pragmatic way of thinking that inspired it.


Speaking of that, if you were ever curious how computers do floating point math, I think the first Oberon book explains it in a couple of pages. It’s very succinct and, for me, one of the clearest explanations I’ve found.
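I don't have the book to hand, so separate from however Wirth presents it, here's a minimal Python sketch of the standard IEEE 754 double layout (sign, exponent, fraction); the function name and example are mine:

    import struct

    def float_parts(x):
        # Reinterpret the 64-bit IEEE 754 pattern of a double as an integer
        bits = struct.unpack(">Q", struct.pack(">d", x))[0]
        sign = bits >> 63
        exponent = (bits >> 52) & 0x7FF
        fraction = bits & ((1 << 52) - 1)
        # For normal numbers: value = (-1)**sign * (1 + fraction/2**52) * 2**(exponent - 1023)
        return sign, exponent - 1023, 1 + fraction / 2 ** 52

    print(float_parts(6.5))  # (0, 2, 1.625)  ->  +1.625 * 2**2 == 6.5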


Rewrite the compiler to use an LLM for compilation. I'm only half joking! The biggest remaining technical problem is the context length, which severely limits the input size right now. Also, the humongous model size that's required.


Simple fix: floating-point indexes to all your tries. Or switch to base π or increment every counter by e.


That’s not a simple fix in this context. Try making it without slowing down the compiler.

You could try to game the system by combining a change that slows down compilation with another that compensates for it, but I think code reviewers of the time wouldn't have accepted that.


Probably use a Fortran compiler for that instead of Oberon.


His stance should be adopted by all language authors and designers, but apparently it's not. The older generation of programming language gurus like Wirth and Hoare were religiously focused on simplicity, hence they never took compilation time for granted, unlike the authors of most popular modern languages. C++, Scala, Julia and Rust are all behemoths in terms of language-design complexity and consequently have very slow compilation times. Popular modern languages like Go and D are a breath of fresh air with their lightning-fast compilation, owing to the inherent simplicity of their design. This is to be expected, since Go is essentially a modern version of Modula and Oberon, and D was designed by a former aircraft engineer, a field where simplicity is mandatory, not optional.


You cannot add a loop skew optimization to the compiler before the compiler itself needs a loop skew optimization. Which it never would, because it is the loop skew optimization itself (which requires matrix operations) that would need a loop skew optimization.

In short, the compiler is not an ideal representation of the user programs it needs to optimize.


Perhaps Wirth would say that compilers are _close enough_ to user programs to be a decent enough representation in most cases. And of course he was sensible enough to also recognize that there are special cases, like matrix operations, where it might be wirthwhile.

EDIT: typo in the last word but I'm leaving it in for obvious reasons.


Wirth ran an OS research lab. For that, the compiler likely is a fairly typical workload.

But yes, it wouldn’t work well in a general context. For example, auto-vectorization likely doesn’t speed up a compiler much at all, while adding the code to detect where it’s possible will slow it down.

So, that feature never can be added.

On the other hand, the constraint may lead to better designs. If, instead, you add language features that make it easier for programmers to write vectorized code explicitly, that might end up serving them better. They would have to write more code, but they would also have to guess less about whether their code would end up being vectorized.


Perhaps you could write the compiler using the data structures used by co-dfns (which I still don't understand), so that vectorization would speed it up, auto- or otherwise.


>Most users just want something that gets things done with as little friction as possible.

It's funny to read this as someone who always dreads having to get a new phone, or install some proprietary app, or reinstall a Windows VM, or sign up to some service, exactly because I know how much friction is going to be deliberately put in my way from every party involved.

"No you can't just change your phone's service provider", "No you can't unlock your phone's bootloader", "No you can't just boot into a new copy of the OS without signing into a bunch of things", "No you can't install this Windows image on a device without a trusted platform module", "No you can't just install a program from the web", "No you can't just look at all the files on your phone", "No you can't just have an app sync photos from the SD card", "No you can't create a login without giving us your phone number", "No you can't opt out of our 'telemetry'", "No you can't view the video you're paying for without the right OS/browser/monitor/cable".

Coming soon: "No you can't view the URL of the page you're on", "No you can't install an ad-blocker", "No you can't access this site without an attested, locked-down OS-browser stack", "No you can't install a different OS on this PC", "No you can't use a local, unlicensed generative AI model".

I think it's time we reframed the discussion and stopped dumbing users down and pretending that extremely anti-user behaviour is "user-friendly". Using technology used to be challenging because the hardware and software were still being bootstrapped to a point where it was fast, simple and bug-free to use. Now the experience of using technology is to have to navigate some corporate bureaucracy in every direction, which is totally independent of tech limitations.


In a better world, all schools would have computer literacy classes that did more than teach kids how to use Microsoft Office. Actually teaching them how computers work instead of siphoning them into learned helplessness just so big tech can keep on raking in billions.


IMO "Solves iPhone addiction" is more or less a rephrasing of "people will quickly get bored of this".

It's just a smartphone, except you can't run third-party software, can't directly interface with it, and can't connect it to other machines. And instead of holding an N-million-pixel, M-million-colour, extremely high-contrast display directly in your hand, you have to indirectly project (meaning extremely LOW contrast) a single-colour display onto your hand from a projector that's shaking around, clipped to your clothes.

The only hypothetical upside I can see to this tech is that it might remove the two-second delay of reaching into my pocket before raising my hand to look at my phone, but you could argue that that goes against the goal of solving phone addiction.


Because people find the concept of a person talking to themselves in public a bit weird, and talking into a completely unthinking machine is basically that. Maybe perceptions could change when low-latency conversational AI is very widespread but I think for the medium term unless there's a second human involved, people will still instinctively see it as talking to yourself, not talking to "someone".


The "laser ink display" looks a bit like the totally bunk display tech of the Cicret Bracelet "product" that VFX videomaker Captain Disillusion did a comprehensive takedown of a couple of years ago https://www.youtube.com/watch?v=KbgvSi35n6o.

While it looks like there are a few videos of apparent actual demos, I haven't seen one yet where the device (and more importantly, the recording camera's settings) are controlled by an impartial reviewer, and I'm extremely sceptical that this is usable in the real world. There's a demo by the founder where one of the inputs is to tilt your palm up, and even in the demo the projection struggles to compete with the indoor lights, nevermind the sun https://youtu.be/CwSeUV3RaIA?t=205.

The pitch of this seems to be "no more distracting screens, and no need to download and manage lots of apps and services". Except there is a (very poor) screen, it's your hand. And you're limited to just one service and set of apps, the one that comes with the device.

It's all well and good saying that the AI can do everything you want, but the real world (sadly) has copyright restrictions and content licensing agreements which an out-of-the-box service by a legit company will have to abide by. If the song I want to listen to isn't available on whatever music service this product is partnered with, could I transfer music files from my computer to this device? There's a lot of use cases like this where you very quickly start to want an actual screen, and actual methods of input more precise and domain-specific than conversational voice commands.


> If the song I want to listen to isn't available on whatever music service this product is partnered with, could I transfer music files from my computer to this device?

What a weird example. They say they've partnered with Tidal, which would have 999 out of 1000 songs people look for, maybe more.


Even Spotify is missing probably more than 5% of what I'm looking for, which makes me strongly doubt that claim.


> If the song I want to listen to isn't available on whatever music service this product is partnered with, could I transfer music files from my computer to this device?

Unfortunately, "nobody" has music files any more. Spotify forever.

(Of course readers here are the exception.)

