I'm such a user. Been mostly running on debian/stable since the '90s. At work and privately. I cheated when I got a new computer at the beginning of August this year and installed Trixie a couple of weeks before release.
My reasoning is quite simple: I really don't need the latest versions of everything. Were computers useful two years ago? Yeah? OK, then a computer is obviously useful today with software that is two years old. I'll get the new software eventually, with most of the kinks ironed out. And I'll have had time to read up on the changes before they just hit me in the face.
Sure, it was a bit painful with hardware support some twenty years ago or so, but I can barely remember the last time that was an issue.
For the very few select pieces of software where stable doesn't quite cut it, there are backports, fasttrack and other side channels.
Interesting. I consider Excel the worst of Microsoft's misdeeds. Not that there's not an abundance to pick from, but Excel may very well top the list.
It's perhaps the single worst database in the world, with no type control, no relationship management, no data safety whatsoever to speak of (it even actively mangles your data), and an interface that is utter madness, and yet - it's the most used database in the world.
It's perhaps the single worst development and runtime environment in the world, obscuring code, making reasoning about code and relations between code almost impossible, using a very obscure macro language that even morphs between different computers, and yet - it's the most used development and runtime environment in the world.
It's perhaps the single worst protocol/data exchange format in the world, with dozens of intentionally obscure, undocumented versions, an insane format with surprising limitations (did I mention it actively mangles your data? - it's worth repeating anyway), supremely inefficient, and yet - it's the most used protocol/data exchange format in the world.
I can't really think of anything in the computing world that has done as much damage as Excel.
What you fail to realize is that (nearly) everything you think of as a flaw here is a key feature.
Excel allows norm(al users)ies to scale Mt Impossible from the bottom, where they don't care about types or relationships, and don't want to (because it's too abstract). They want to solve a problem. So they start with simple data given meaning by physical space, and work up from there.
It's genius. It's computing for people that will never care about pointers.
> It's computing for people that will never care about pointers.
That's a bingo, although I'd phrase it even more glowingly as "It allows people to solve many common problems with computing, without knowing about pointers."
Everything you say is not wrong. But despite being so horrible, the business world still runs on Excel. Finance, underwriting, accounting, engineering tools, fantasy football leagues… Excel is a highly used tool, possibly the most used tool, and it enables many users who do not consider themselves programmers to be productive with their PCs. It's timeless and hated by many for valid reasons, but its impact is vast.
But that's just path dependency. If Excel didn't exist, everything would run on something or somethings else. And it's not clear whether this timeline is better or worse than the average timeline in that respect.
Without a doubt, if Excel didn't exist, someone would have created it.
It's the lowest-barrier programmable logic: a coordinate system where arithmetic can be applied to the contents of any given coordinates.
And it likely would have grown into the same exact mess as Excel, with continuous expansion of the arithmetic part, as people kept reaching the limits of it but wouldn't go back and recreate everything in a DB...
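That core model fits in a few lines of Python - a toy sketch of the coordinate idea, not of Excel's actual implementation:

    # Values live at coordinates; formulas are just functions of other
    # coordinates. (Toy illustration only.)
    cells = {
        ("A", 1): 2,
        ("A", 2): 3,
        ("A", 3): lambda get: get("A", 1) + get("A", 2),  # like =A1+A2
    }

    def get(col, row):
        v = cells[(col, row)]
        return v(get) if callable(v) else v

    print(get("A", 3))  # 5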
I'm told there was better spreadsheet software back in the day, but that Excel basically won accounting/finance by allowing itself to be shareware (i.e. effectively free), in a similar way to how Microsoft has at times turned a blind eye to piracy of its other products (e.g. Windows).
Not so much... I mean, if Word Perfect and Lotus 123 had merged, they might still have been competitive, as neither was really better than its MS Office counterpart, but as a combo they would have had more entrenchment to work from.
IBM buying Lotus and not Word Perfect was probably a mistake, had they really wanted to take it seriously... but they seemed more interested in Lotus Notes (think Outlook + Access in a self-hosted cloud environment), which was imho nasty af.
Not really. Once Windows came in, Excel was pretty much the best game in town. Lotus didn't really do a great job on Windows. There were some attempts at more integrated office suites, but they didn't really take off. There were also some attempts at different spreadsheet models, but people were probably too used to essentially the original Visicalc model. Not sure that Excel was any more effectively shareware than any of its competitors.
> Excel is a highly used tool possibly the most used tool and enables many users who do not consider themselves programmers to be productive with their PCs.
What frustrates me the most about this is that I've seen some insane Excel wizardry from the accounting department at various jobs over the years that is effectively programming, and that if these people had put just as much effort into learning Python and using a database, they'd be better off and might actually make good developers. In my view, Excel ends up becoming sort of an artificial barrier to departments outside of IT being able to make business software.
Also a good point - but there is no Python runtime on accounting and PMs' computers. And it's also a huge mess to try to support. Imagine some Python code from 10 years ago: juggling the versions, then god forbid any module dependencies. It's simply not portable. Meanwhile, the VBA written in 2000 is still working, all contained in an Excel workbook.
I would dare to say that all business apps start as an Excel sheet (or Google Sheet), and after the usefulness of the data collection and data arrangement/presentation is validated (often long after), they eventually become a full-fledged business web app.
And as a casual Excel user (getting data from CSV, removing some rows, moving a few things around, etc.), I find it isn't even great. You can't open two files with the same name, because Excel seems to have some "global state" between windows, to the point where you might be hitting Control+Z to undo some changes and it's undoing stuff in the other spreadsheet without you noticing.
Doing something as "simple" as a LEFT JOIN of data requires having two separate documents (or one, but saved on your system), opening them in the Power Query editor (if it's the same document, you do it twice, once per table), which creates two "queries", and then you can either use one to join against the other, or create a third one "joining" them. In the end, you get three new sheets in your document: the original tables and the merged one.
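For contrast, the equivalent left join outside Excel is a couple of lines; a minimal pandas sketch (file and column names are made up):

    import pandas as pd

    # Two tables in, one joined table out; "customer_id" is a made-up key.
    orders = pd.read_csv("orders.csv")
    customers = pd.read_csv("customers.csv")
    merged = orders.merge(customers, on="customer_id", how="left")
    merged.to_csv("merged.csv", index=False)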
Then there are the annoyances: if you use Excel in English (US at least), you apparently get a CSV separated by actual commas "," (ASCII 0x2C), but using it in Spanish (Spain) you get it separated by semicolons ";", because commas separate the decimals in numbers. Meaning whenever I build a program that parses/writes CSV, I need to consider the chance that it's using semicolons and commas instead of commas and dots. Not that this is non-standard (CSV doesn't specify a delimiter), but you could stick to the same format everywhere, or give an option to customise it, or create "Tab-Separated Values" (essentially CSV with tabs separating values).
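On the parsing side, the Python stdlib can at least guess the delimiter for me; a minimal sketch ("export.csv" is a made-up filename):

    import csv

    with open("export.csv", newline="", encoding="utf-8-sig") as f:
        # Sniff whether this particular export uses "," or ";"
        dialect = csv.Sniffer().sniff(f.read(4096), delimiters=",;")
        f.seek(0)
        for row in csv.reader(f, dialect):
            print(row)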
Another one is formulae, which also change based on language, and whose argument separator changes too. In en_US you'd use `=SUBTOTAL(109,B2:B7)` while in Spanish it's `=SUBTOTALES(109;B2:B7)` (plural instead of singular, and semicolon instead of comma). Meaning any guide, documentation or tutorial in English requires me to "guess" how the function is translated, and to manually change commas to semicolons.
With all this, I mean to say: Excel isn't even that great for the "normal" user. Or perhaps I'm too much of a "power user" for this, and just lazy enough to put up with it instead of using "proper" tools like Python or R.
CSV literally stands for Comma-Separated Values, so I don't know what you expect. For the most part, you should have (double) quotes around values that contain commas, and doubled double-quotes for literal quote characters.
UTF-8 is now pretty much the de facto standard for the files, whereas historically you'd have a number of different code pages, and/or UTF-16 (BE/LE, with or without BOM) and a lot of other variances that were much harder to deal with.
Pretty much any software library for CSV handles these things for you. As for localization of input/language parameters, can't really speak to that aspect of things. And I'm not generally using multiple spreadsheets, etc... at most I'll have a database source connected to work against queried data.
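Python's stdlib csv module is one example; it applies exactly the quoting rules above. A minimal sketch:

    import csv

    # A value containing the delimiter and a literal quote character:
    rows = [["name", "note"], ["Alice", 'said "hi", then left']]

    with open("out.csv", "w", newline="", encoding="utf-8") as f:
        csv.writer(f).writerows(rows)   # quotes the field, doubles the quotes

    # utf-8-sig transparently strips a BOM if one is present.
    with open("out.csv", newline="", encoding="utf-8-sig") as f:
        print(list(csv.reader(f)))      # round-trips the value intact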
The Galactic Standard Calendar or Galactic Standard Time was the standard measurement of time in the galaxy. It was based on the Coruscant solar cycle. The Coruscant solar cycle was 368 days long with a day consisting of 24 standard hours.
60 standard minutes = 1 standard hour
24 standard hours = 1 standard day
5 standard days = 1 standard week
7 standard weeks = 1 standard month
10 standard months + 3 festival weeks + 3 holidays = 368 standard days = 1 standard year
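The arithmetic works out: (5 × 7 × 10) + (3 × 5) + 3 = 350 + 15 + 3 = 368 standard days.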
This follows a convention that was well established and felt pretty ancient when I learned about environment variables in the nineties (i.e. 30 years ago). Variables that are flags enabling/disabling something use 1 to enable, and 0 to disable. I'd not be surprised if this has been pretty much standard behavior since the seventies.
I always thought that an unset boolean env var should give the default behavior for a production environment, and that setting any of these to a value of length > 0 flips it (AUTH_DISABLED, MOCK_ENABLED, etc.). I thought env vars were always considered optional by convention.
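Side by side, as a minimal Python sketch (AUTH_DISABLED and MOCK_ENABLED as above; MYAPP_DEBUG is a made-up name):

    import os

    # "0/1" convention: the value decides.
    debug = os.environ.get("MYAPP_DEBUG", "0") == "1"

    # "set at all" convention: any non-empty value flips the safe default,
    # so unset means production behavior.
    auth_disabled = bool(os.environ.get("AUTH_DISABLED"))
    mock_enabled = bool(os.environ.get("MOCK_ENABLED"))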
I don't doubt any of that, but why stick to such old conventions when there are explicit and immediately clear options?
I don't think me writing an if condition

    if boolean != true

instead of

    if boolean == false
should pass code review. I don't think my pet peeve is necessarily different from that. I understand there's a historical convention, but I don't think there's any real reason for having to stick to it.
Hell, some of the other compiler options are flags with no 0 or 1; why couldn't this have been --static or some such flag? I'm genuinely curious.
Moreover, 0 here maps to false, but in program exit codes it maps to success, which in my mind maps to true; so we have this discrepancy, and it does not appear to be the right mental model.
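The clash is easy to demonstrate; a minimal Python sketch:

    import subprocess
    import sys

    # An exit status of 0 means success...
    result = subprocess.run([sys.executable, "-c", "pass"])
    print(result.returncode == 0)  # True: 0 is "success" here

    # ...while the integer 0 is falsy, hence the colliding mental models.
    print(bool(0))                 # False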
I owe my career in programming to a MUD. That's really where I learned to code (mainly by staring at and trying to debug tonnes of really bad code other clueless newbies like myself had written). That, in turn, got me a spot as a sort of hang-around at a local ISP/consultancy shop whose staff intersected a lot with the people running the MUD. They eventually decided to hire me when a suitable contract showed up.
All in all, I'd say the MUD was a terrific place to learn to code. You could literally write a few lines of code, and see their effect immediately. "I want to code an orc." Inherit stdmonster, call a few API functions to set name and description, and BAM! - you've got an orc! And so on. Motivation never ran dry because - hey, I was adding features I wanted to a game I loved! Feedback (of varying quality, sure) was immediately available in the built-in chat channel. Code was hot loaded/reloaded, so iteration cycle time was approximately zero. With Emacs + ange-ftp (later replaced by tramp) to the host machine, you were literally editing the live code all the time (who needs pull requests when you have C-x C-s, eh?), so there were lots of instructive oops moments. It was amazing.
Have a whole bunch of friends with a similar story.
I'm the same, but from the other side. I learned to code by writing clients/bots that would play the MUD. I had a fantastic fighting script that basically fought optimally, one that used two characters at the same time to solve a complicated maze, and I even wrote a headless bot runner that was compatible with the files of the MUD client I used.
It was all great fun, and I also owe my extensive regex experience to it.
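For anyone who never saw one of these bots: the core is just a read loop plus regex triggers. A minimal Python sketch (host, port and the trigger pattern are made up):

    import re
    import socket

    conn = socket.create_connection(("mud.example.com", 4000))
    attack = re.compile(rb"^(\w+) attacks you!")

    buf = b""
    while True:
        data = conn.recv(4096)
        if not data:
            break                      # server closed the connection
        buf += data
        lines = buf.split(b"\n")
        buf = lines.pop()              # keep any partial line for later
        for line in lines:
            m = attack.match(line.strip())
            if m:                      # fight back automatically
                conn.sendall(b"kill " + m.group(1) + b"\n")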
Same here. I got addicted to a MUD called Realms of Despair (SMAUG codebase, a derivative of DikuMUD), and I then ended up helping run a pretty popular server on the same codebase with some friends I met on the server. Bought a "learn C++ for dummies" book (even though the server was programmed in C) and started modifying it, and the rest was history ;) I guess I was 14 at the time, and I haven't stopped programming since. Gaming is definitely the gateway drug to programming, and text-based games are in some ways the most interesting form for learning, due to not having to worry about graphics and immediately seeing results, like you mention.
I've often thought about implementing "Claude plays" for some open source MUD. Seems like a much purer form of the experiment, since it's all text.
Just last Friday I was doing some cut&paste proof of concept of "Gemini plays Genesis MUD".
I gave Gemini a little prompt and just started pasting the game output into Gemini and the commands from Gemini back into the MUD. It managed to create a character, do the tutorial, and start some initial skill training before I got tired of all the cut&paste.
Would be fun and interesting to let it completely loose in a similar environment.
I had ChatGPT play The Two Towers and have screenshots. It was surprisingly good at it, but after the first session (max message limit reached), it was unable to pick up where it left off. It started describing rooms TO me instead of responding to rooms I was describing to it...
> tonnes of really bad code other clueless newbies like myself had written...
When I first looked at MUD code, I had not yet learned to code. I thought that the folks who wrote the code must be so smart, and felt intimidated by it. Fast forward a few decades, and I recently looked at MUD code again. I spent a week porting ROT 1.4 to a node server, mostly just as a personal coding exercise, and found myself realizing just how bad that code actually was.
Yet we need to be fair. As you said, it was written by newbies, mostly students. It was written before modern tech stacks, before modern practices. And despite all the critique we could throw at it... it worked. It still works. It was shared, copied, modified, and kept on working for many people, over many years. And it definitely inspired people to learn and try new things.
Absolutely! Every single line (my own included) was truly a work of passion. We were all there coding because we loved the game, and wanted to make it even better (although opinions varied, of course, on exactly what would make it better). We all did our best, and we all just wanted things to be (even more) awesome.
We didn't even have any version control. Everybody was logged in, editing the same files over ftp, and reloaded the code on the running instance. It was chaotic and hilarious. I eventually ended up in charge of the "mudlib" (essentially the standard library). I learned so much from reading, debugging and trying to improve that code and, eventually, being a sort of mentor to a slightly younger batch of newbies.
I've never been a guru when it comes to clever algorithms, and I'm pretty shit at math, so I've had to find some other role fit on a team. I believe my main strength to this day is debugging weird and messy code, and much of the reason for that is the years I spent trying to get that wonderful mess of a code base to work.
When I was starting out with Python, I found a library that implemented a Python version of MOO. It was brilliantly named "POO" (although in later versions it lamentably changed its name to "MOOP"). The cool thing about it was that the in-world coding language was also Python, so when you coded custom rooms and objects, etc., it was all Python. I had a lot of fun with it.
I suspect that in another 10 years or so - or probably today already - we'll see similar stories coming from people who started with e.g. Minecraft or Roblox, which are in a sense just as much programming and user-generated-content games as MUDs are. Maybe a bit more visual.
So, here are the first three sentences in the linked article:
I recently changed jobs and found myself in a position where I would need to do a lot of work on remote machines. Since I am an Emacs user, the most common way to do this is using TRAMP (Transparent Remote Access, Multiple Protocol). TRAMP is an Emacs package that lets you treat a remote host like a local system, similar to the VSCode Remote Development extension.
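(For reference, that workflow amounts to opening paths like `/ssh:user@host:/path/to/file` with `C-x C-f`; TRAMP handles the remote access transparently from there.)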