I read through several pages before giving up and jumping to the end ... but what I saw gave no objective reason that the war will be won; this is pure hope, or maybe a "call to arms". I actually don't see any convincing reason why we'll "win" this war, and in fact I feel like we are on the precipice of becoming permanently locked out from it ever being possible. The main reason is the complex web of laws, interacting with established platforms, that makes it effectively illegal for any new competitor to ever become established. All over the world, governments are starting to impose complex and specific requirements around security, surveillance, encryption, etc. that are fundamentally incompatible with true "general purpose" computing. For example, if your computer can encrypt things without a backdoor, then authorities cannot listen. But if it can't, then by definition it is not a general purpose computer. Which is it going to be? I think governments will win and we will lose general purpose computing.
The thing that has always defanged the authorities of the past is their own organizational inability to see where the game is changing. And the game has gone on for a long time - villages would do all sorts of things to operate outside the vision of the local lords.
The probable source of disruption comes from people one step removed from the top who see an opportunity to shake up the system and turn an activist message into opportunistic gain. This is why we often see waves of "anti-corruption" campaigns, sudden policy shifts, etc. The politicians see a trend forming and jump onto it. When they get into power they walk some of it back, but they can't turn back the clock all the way.
The source of a trend towards GPC is a series of "small wins" like the recent breakthroughs in Right to Repair, IP that has recently expired, and nationalistic competition ("world's freest country" will always be a title up for grabs).
It only takes one little country that's a "hacker haven" to jump ahead of the rest for a clamor to erupt. The dominant players will conclude that the answer is to strongarm that country into the hegemonic framework; others in weaker positions will see opportunities in jumping on. Then the fight is waged economically, and if the resulting products and services are desirable, concessions are made.
I think we will win because locked down platforms are fundamentally less powerful and less suitable for innovation than open platforms.
We are reaching a turning point where even the brightest minds struggle to generate major innovations on the locked down web. You can't build "the next Facebook" on the web as it is today because the incumbent powers suffocate you so effectively.
Conversely, the dweb is flush right now with innovation and new ideas, with an ecosystem of builders that are excited to share and compound off of each other's ideas.
I believe that at maturity, the dweb will run absurdly agile circles around the locked-down web.
I want to believe in dweb too, but I've been around since the truly distributed days of email/ftp/nntp/gopher and the birth of the web, and while I believe in open protocols, no distributed system has outcompeted a centralized one in terms of practical value to users. Even email, the most powerful protocol of them all, still requires hosting; people are never going to run their own email servers at scale. So inevitably you have centralized nodes, and at some point economies of scale kick in and there is a qualitative difference once consolidation achieves a certain critical mass. For instance, it's now very hard to run your own email server even if you want to, because if you get blacklisted by Gmail as a small operator you are unable to interact with a huge percentage of humans, with essentially no recourse.
Again, I truly want to be inspired here and will work to suppress my cynicism if you can just tell me why you think dweb will be different from Indie Web, Mastodon, or App.net.
The big step that dweb has taken in the past year is that it's now possible to build all the same protocols without requiring users or community managers to run servers themselves. Everything can be done in the web browser, and nothing requires users to have uptime anymore.
With "dweb" do you refer to getdweb.net or to one of the many other alternatives that have sprung up?
Can you explain how it works? Is it available via the regular browser by typing in a URL? What's the selling point? Sorry, I know I could do this research myself, but I've done that several times now with various alternatives and it just doesn't seem like any of them are good enough. Does dweb help against DDoS?
We already have a technology where people don't need to run servers: torrents. That's because everyone that uses it is simultaneously a server.
A big problem of torrents (and IPFS, and most p2p technologies) is how to give proper incentive for people to contribute. There are "cheating" clients that don't upload back for example. And there are people that close the torrent as soon as it is downloaded. Well, you can sometimes rely on people's sense of belonging to a community; or you can sometimes shun noncooperating people from the community (as in private trackers).
I think dweb's main contribution is giving the option of monetary compensation for participating in the network. Perhaps such an incentive wasn't needed when people were just seeding things they were passionate about, like movies and music, but it may enable more boring stuff to be decentralized as well. Hence, blockchain.
Most current p2p protocols are about sharing data in a decentralized way, but you can also execute algorithms on this data. That's what's being done with blockchain today (technically you wouldn't need a blockchain for this, but we already established we might want a way to give monetary incentive to peers, so, there's that).
This scheme may or may not work. I think it may have adverse effects, like when the web was taken over by advertising and commercial interests. But it also opens a path to a new kind of web.
About DDoS: existing protocols have a big problem with scaling; an actual denial of service attack would make things much worse. That's why LBRY (which is a decentralized YouTube of sorts) is mostly accessed through a centralized site, Odysee - funnily enough, Odysee doesn't use any form of decentralization to serve the videos. It could make peers serve each other using WebTorrent or another protocol on top of WebRTC, and it might some day do just that, but it doesn't at this time.
And yet more people use email than Slack. My parents don't know what Slack is but both have email. So does my teenage daughter. VCs like to see growth of course, so Slack until recently was the winner from a VC's point of view, and I suspect here at HN we are a bit biased towards that viewpoint. But growth of one single product doesn't tell you much about the grand scheme of things. Email is a very good example because no other messaging product has more adoption.
Honestly, a lot more people use WhatsApp/FB Messenger/iMessage for personal communication. Slack/Teams are taking away a huge amount of intra-office communication. For professional and inter-corp communication, email is still popular because LinkedIn product managers aren't smart enough to build a de facto comms platform over their network.
I hate new features in familiar applications. I think it’s marketing you are referring to, not consumer demand. While I’m only positing an anecdotal personal reaction, I’m also not seeing data behind the premise you proposed.
Slack is puzzling to me, as it's a terrible product, IMO, which I feel got bandwagoned into place by a few top-down decisions by some larger companies to risk their essential internal communications on a third-party proprietary app. It makes no sense to me and I would never specify its use on any team I lead.
Counterpoint: closed platforms will continue to account for the majority of users because the innovation available to open platforms is counterbalanced by the massive amount of capital available to closed platforms.
> You can't build "the next Facebook" on the web as it is today because the incumbent powers suffocate you so effectively.
I don't doubt this suffocation is real, but isn't it really the strong network effects that Facebook benefits from which stops somebody from "building the next FB"?
Facebook refuses to let anyone else tap those network effects, and that's what's suffocating. Remember when Facebook and Twitter and everyone else had these amazing, robust APIs you could build entire startups on top of? And then when they killed hundreds of companies overnight by turning them off?
In the dweb, those APIs can't be shut off. Those hundreds of innovative companies would still be alive and adding value to the world.
People will install 20 social media apps for 20 different psychological contexts that have subtly different needs, and use a UI distinguisher to key their brain in on which set is active.
People won't install another social media app if there isn't sufficient incoming social credit encouraging them to use it and the need falls into an existing category. Something sufficiently different can gain an "I couldn't have done this with my existing apps" credit to overcome some of that stickiness.
… maybe. Sort of. I think.
(I wonder what this means for e.g. Facebook's acquisitions of Instagram and WhatsApp, if it's true.)
With the dweb, you don't need to install 20 social media apps. There can be 20 different social media apps that all interoperate. I can be using one app, and you can be using another app, and we can still message each other and see each other's posts because the data is openly available across applications.
That's why it's going to win. You don't need every platform to implement every interesting feature; the open nature of the data allows the innovation to happen much more effortlessly.
"Take WhatsApp: the technology behind it is ridiculously simple. "
No, it is not, when you serve over 100 billion e2e encrypted messages every day all around the world, and groups and chat, ... and all of it in a way that people think is instant.
That's a problem with Signal. It is really, really sluggish compared to WhatsApp or Telegram.
I worked at WhatsApp for almost eight years. It's actually pretty simple behind the covers.
Yes, there's a lot of attention to detail and a lot of edge cases that WA covers better than most others, but there's not a lot of truly complex things. It's mostly doing a lot of simple things well (and some amount of picking the right structures to keep things as simple as possible). Often doing things simply means skipping current computing trends. And of course, sometimes doing things simply doesn't actually meet the needs, and may make changes harder later.
Of course, Signal has much harder-to-meet privacy goals and a smaller staff, and they didn't build on Erlang, so they're playing on hard mode.
Acquiring WhatsApp was not cheap and required significant effort. Why bother?
The answer to your question is no, IMO. Network effects may discourage competition from people with defeatist attitudes (who would give up before even trying), but network effects are not exclusive to Facebook. They can happen elsewhere.
One theory why Apple is letting iPhone users stop Facebook from tracking them is that Apple believes that iMessage is or will be competing with WhatsApp. It aligns with the theory that Facebook's move to change the WhatsApp ToS at this time is because usage of so-called "E2E" messaging apps is growing significantly.
Right and that's one of the ways that we can recognize it's an interesting area. Once it's understood, the problems well-bounded, the window has generally closed. Of course there is also risk, sometimes we've wandered too far afield and we're just out in the weeds.
I always see everything from national laws of various stripes to human apathy blamed for the slow decline of general purpose computing, but I think the answer no one wants to admit is that the dream simply hinged on figuring out how to overcome certain practical hurdles that we have failed to clear.
There are all sorts of fingers that could be pointed in different directions across different industries, of course (some being the same industry now due to consolidation of the stack), but the most fundamental failure of judgement, in my opinion, was the romantic notion once upon a time that computers would be this great equalizer that could even allow clever enough eggheads to take on the world (literally, not with some fancy high-value business idea) while simultaneously bringing real opportunity and equality to the common man ("anyone who can use a computer will be more valuable than a CEO").
The former idea never quite managed to explain why these eggheads wouldn't be beholden to the limitations of the hardware at their disposal (turns out computing hardware multiplies the effectiveness of the boogeymen just as readily as it does the eggheads, and the boogeymen do a lot more hiring). The latter notion was abandoned by the same people who dreamed it up in the first place, not that they'll admit it (of course they wanted computing to empower the common man; it's surely a coincidence that every step of the way they focused on expanding only an egghead's ability to command computers, with wider accessibility being an afterthought motivated primarily by profit incentives).
I think the ugly truth is that general purpose computing was more of a philosophical goal than a realistic one. It seems a lot more likely that from a technological perspective it's just a lot more efficient to have an expansive commons of innovative collaboration available for dueling giants to draw from on an as-needed basis.
> "I actually don't see any convincing reason why we'll "win" this war and in fact I feel like we are on the precipice of becoming permanently locked out from it ever being possible. ... For example, if your computer can encrypt things without a backdoor then authorities cannot listen. ..."
Why? Buy an off the shelf processor chip, buy memory chips, buy some interface chips to provide access to peripherals and storage, solder them to a board and you have a general purpose computer that will boot any code you write for it. The Raspberry Pi, Arduino, etc. are examples of this. If you want an absolute guarantee of no third-party code not under your control, use an FPGA to custom implement your processor the way that Bunnie Huang's Precursor project does. FOSS fills the gap for software.
Sure, you might not get the performance you want for the price you want, you might not be able to connect to the sites you want, or run a specific proprietary software package but that has nothing to do with general purpose computing. In no way, shape, or form is general purpose computing endangered nor has it ever been.
Sure, in a free market you could do that, but we aren't living in a free market. Both corporations and politicians want you to use locked-in devices where you are easier to monetize and control. And since those are the two powers controlling regulations, chances are you will no longer be able to buy those chips legally without an active software engineering license and a business document stating what you will use them for.
It won't happen tomorrow, but what about 20-40 years? Compare today's market with 20 years ago and how much it changed; extrapolate 20 years into the future and it doesn't seem improbable at all.
> Buy an off the shelf processor chip, buy memory chips, buy some interface chips to provide access to peripherals and storage, solder them to a board
I suspect those things will become regulated and require either built-in constraints or a license to purchase. Then the likely avenues for connection (cellular companies, ISPs) will be required to verify that devices are trusted, and they will do this by associating devices that try to communicate with government-issued cryptographic signatures that are built into the chips, etc.
And yeah, you could bypass everything, but it will still be illegal. So a bit like you could build your own car, but you won't be legally allowed to drive it on the road and you will incur huge liability if you try.
> For example, if your computer can encrypt things without a backdoor then authorities cannot listen. But if it can't then by definition it is not a general purpose computer.
No. The user could still deliberately choose to run different back-doored encryption software on the same computer. The definition still holds.
They can only choose to do that if they can run that software. We are pretty close to the situation already where users no longer have choice over what software they can run.
The iPhone went there years ago. The situation on Android is tenuous. OSX and Windows now both start from a secure boot mechanism. At this point it is still possible for a user to work around these things, but this is now "at the pleasure" of the OS maker and not something the user actually controls. Removal of a couple of checkboxes is how far we are from losing the user-controlled right to install software on any mainstream consumer supported OS.
So I am arguing that now that operating system manufacturers have forfeited their ability to say they cannot prevent the user from installing software, it is only a matter of time before it becomes legally mandated that they close off this capability, and deliver only software that is delivered from a verified (probably government licensed) channel.
Hah. Linux. It lives at the mercy of the hardware majors. Which live at the mercy of the software major (there's only one left, M$. They aren't multiple).
I upgraded my home server a few months ago. It had been running Arch Linux for 13 years, up-to-date, not some ancient codebase. Yet the "new" 2-year-old-tech motherboard/processor/memory combo did not even start / recognize the SSDs. Took me a week of debugging, and replacing the motherboard with another one. Exactly one checkbox in the BIOS setup, yes or no, and you're gone. The second motherboard did have it. The first did not anymore.
The general public will lose access to cheap general purpose computers, to the extent that they even exist now. Maybe because it poses a threat, but probably mostly because the general public doesn't care about general purpose computing. There will be no outcry when it becomes inaccessible to them.
The government can never effectively ban general purpose computing in an authoritative way because it's just too easy to build a computer. You can build a computer using a bunch of FPGAs connected together. If they ban FPGAs, hackers will start using 80s-style 8-bit micros.
That is all to say, the people who care about general purpose computing will never lose it and the people who don’t care won’t even notice.
Despite the cheapness of those devices, they are increasingly irrelevant and inaccessible to the general public. They are instead being passed over for locked down devices like iPads and iPhones. To a normal person the mere thought of figuring out how to use a raspberry pi (or increasingly even a windows PC) induces a stress response.
How relevant and accessible to the public were general purpose computers in the past?
> To a normal person the mere thought of figuring out how to use a raspberry pi (or increasingly even a windows PC) induces a stress response.
I’d argue that this has always been true—the only difference is that now, if you’re willing to push through that stress, you can get one for dirt cheap. And I’d also argue that the reasons that this is true are completely ignored by the article at hand.
> How relevant and accessible to the public were general purpose computers in the past?
There were no alternatives in the past. Now that locked-down alternatives have been invented, general purpose computers will slowly but surely fall out of use.
I see, there was a misread of my original comment. The bit about “to the extent that they exist now” is a comment on the state of current so called general purpose computers like the ones based on intel platforms. In appearance they are “general purpose” but there are various locked down subsystems that are not under the control of the user.
I never claimed they were expensive, not sure what you’re referring to there.
If they don't have a motivation to do so, sure, because it's extra work. Introduce a motivation, and that quickly changes given the wide range of introductory material out there.
On several points I agree with the OP on the desirability of "universal computing". For more, I've long thought about most of the concerns in the OP and have not been, and still am not, much concerned. Why not?
=== No Smartphones
For all the threats of smartphones, I avoid them -- I don't have a smartphone. My phone is an old Bell touch tone desk set connected to essentially a land line.
I saw some threats of smartphones and didn't like the cost to buy, cost of usage, bad keyboard, small screen, and the general inability to write and run my software, old and new.
=== Digital Appliances
For threats of digital appliances, devices, from, say, Amazon, I don't have any. No way do I want some digital appliance listening to everything I say.
=== The Cloud
I don't trust the cloud. I make no direct use of the cloud. So the cloud is not a threat or cost to me. And I don't have to do mud wrestling with their poor or missing documentation on how their services work. What they are offering, no thanks. Not even for free. And whatever they say about reliability, security, or functionality, I don't believe it.
=== Encryption
For threats of encryption, if the situation gets serious, say, for my email, then I will make use of PGP (pretty good privacy). So, I will have encryption under my control that Apple, etc. can't do anything about. And I won't have to worry about back doors.
There was a reason PGP was open source -- to keep a big power, government, company, from getting control over encryption, putting in back doors, etc. I like the idea of being able to control my own encryption.
=== Text and Console Windows
To me, the main data I work with is just text, the standard ASCII character set. And my main user interface is text in console windows. A big reason is that it is easy to automate the use of text in console windows.
So, in particular, I make minimal use of the Microsoft user interface idea of on-screen direct manipulation graphical user interfaces (GUI). I never liked the idea of a GUI -- insults me as a user; is an interface I essentially can't program; gives output tough to process further.
In one sense, important to me, a GUI is nearly always a big step backwards; it has me do something once, but to do it 200 times I have to repeat it by hand 200 times. Instead, I want to automate doing that 200 times. E.g., I had a list of about 300 URLs and wanted to download them all. So I used my text editor KEDIT to develop a REXX script to call the program CURL 300 times, then ran the script in a console window, and then processed the 300 downloaded files with KEDIT. No use of GUIs. For such work, usually GUIs are useless.
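For illustration, here is a minimal sketch of the same batch-download idea (shown in Python rather than my actual KEDIT/KEXX/REXX tooling; the file name urls.txt and the downloads directory are just placeholders):

    import os
    import subprocess

    # Read the list of URLs (one per line) and fetch each one with curl.
    os.makedirs("downloads", exist_ok=True)
    with open("urls.txt") as f:
        urls = [line.strip() for line in f if line.strip()]

    for i, url in enumerate(urls):
        out_path = os.path.join("downloads", f"{i:03d}.dat")
        # -L follows redirects, -s silences progress, -o writes to a file
        subprocess.run(["curl", "-L", "-s", "-o", out_path, url], check=False)

The point is not the particular tools; it is that a console window plus a scriptable editor lets the 300-file job be automated once, while a GUI would have me do it 300 times by hand.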
=== Files
I'm totally in love with Microsoft's NTFS (new technology file system or some such) file system. I would like better documentation on the file and directory (I HATE Apple introducing the word folder) attributes such as system, hidden, archive, etc. and on locking and concurrency.
E.g., for handling those 300 files, do that in a subdirectory -- don't let other files get in the way, and then I can copy, delete, etc. easily.
=== Manipulating the Text
Since I work with text, I need good tools for handling text, and my most important computing tools are the text editor KEDIT, its macro language KEXX, the scripting language REXX, D. Knuth's mathematical word processing program TeX (I write TeX but not LaTeX), a spell checker ASPELL that comes with the TeX distribution I use, the .NET languages and object library, and, for some important work, an old Watcom Fortran compiler (with the very nice IBM linear programming package OSL, optimization subroutine library).
So, I want good tools and if necessary write my own and don't want little apps doing things for and/or to me. Really, so far I have no apps at all.
=== Version of Windows?
Since I like text and console windows so much, the stuff Microsoft added to Windows 7 Professional to get to Windows 10 Home Edition I don't want. I just like that version of Windows 7. I can think of some improvements I would like a lot, but none of those are in Windows 10.
For what Apple wants me to use, no way, not a chance, never.
=== Good Windows
To me, one of the best things about any of the versions of Windows is that they are still good to great places to run old command line software. E.g., KEDIT goes back to PC/DOS, OS/2, Windows 95, ..., Windows 7, Windows 10.
=== Typing for Developing
I used KEDIT with KEXX and REXX to develop the software for the Web site of my startup, 24,000 .NET programming language statements in 100,000 lines of typing (lots of good comments with some little KEDIT macros to ease using documentation, etc.).
Visual Studio seems to be intended to do what I do with KEDIT -- so far I prefer KEDIT to Visual Studio. E.g., Visual Studio is part of the long standing Microsoft idea of GUIs, and I just reject that as a big step backwards. I had no trouble at all using KEDIT, etc. to develop that software. The problems were, e.g., bad documentation for SQL Server that made getting a connection string a solid week of mud wrestling. Finally someone in Microsoft's SQL Server organization solved the problem.
=== Documentation or Experiments
Part of the Microsoft, Apple GUI approach is no real documentation, e.g., nothing like what was written by Mike Cowlishaw for REXX or by D. Knuth for TeX, and, instead, learn just by experimentation. I don't like that experimentation -- e.g., Windows 10 has some huge number of special keystrokes that do things, and I still have no knowledge of what those keystrokes are or do. I encounter those keystrokes by accident; some windows pop up, and I work to close them ASAP since whatever they are I know I want nothing to do with them. I don't like undocumented tools.
=== No, I Don't Want That
Generally what the Apple, Microsoft people and their app developers have in mind to please me just makes me angry. What they are offering me, I don't want. To me their work just gets in the way of my work; I hate their work and their assumptions about my work.
So, on several points I agree with the OP on "universal computing". If Microsoft will keep console windows and let me run old software, I will be happy. To make me happier, they can do better on documentation and tools for common tasks in system management. E.g., I'd like better means of backup and restore. For more, they can have fewer bugs and security problems. For more, I would like some good documentation for Microsoft's PowerShell. For the rest of the industry, I wish I could get a keyboard as good as IBM shipped with their PC/AT.
I agree. I think the reaction to the recent reveal of Windows 11's TPM requirement shows that we have basically no chance of winning this war because the average computer owner of the 2020s is simply not intelligent or educated enough to know when they're being screwed over. They hear "it's for your security!" and immediately roll over like trained dogs.
Not quite sure how TPM and secure boot have gotten the reputation of screwing users over... I utilize both on my ThinkPad running Linux. Personally, I like knowing that if you steal my laptop you just got yourself a very slick paperweight, as the entire boot stack is locked to my keys, and if you try to boot my drive outside of my system or with a different kernel, the TPM is going to stifle both of those attempts.
I think that reputation stems from both TPM and secure boot being a bit difficult to grok, and I would agree about screwing users over if there were no end-user access and everything were cryptographically locked by the vendor, but both are fully accessible.
TPM, secure boot, Pluton, memory encryption... These are all the puzzle pieces of a locked-down system more akin to an Xbox than a PC. One by one they are harmless, but together they can completely take away control from the user. These technologies could be good for everyone, but right now they are good for whoever holds the encryption keys built into them (not you). With the user override feature detailed in the EFF link we could have the best of both worlds (superb security, personal freedom to run anything). Too bad we are heading in the exact opposite direction...
Sure, of course there is tension between security and control, the same systems that allow me to secure my system from tampering or theft can also be used against me by the vendor to stop me from having full access.
I don't think that problem can be attacked by blaming the underlying tech, that quickly heads into the same territory of thought governments use to try and justify weakening encryption because it can be used by bad actors. I think the options (and to be fair all of them are hard) are: competition from vendors that allow full control, regulation from governments forcing vendors to give users full access, or subverting the implementation to regain control.
Also forgot to mention with secure boot you are able to sign with keys you have generated.
>competition from vendors that allow full control,
All major manufacturers of modern processors are part of the "Trusted" Computing Group. There will be no meaningful competition in that space.
>regulation from governments forcing vendors to give users full access
This would be the best but also the most unlikely scenario. The status quo can only benefit from what will happen. Imagine the government standing up to big corporations to benefit the people. Yeah, I can't either.
>subverting the implementation to regain control
I think this is what will happen from time to time. Every 5 years we will have a brief window when people possessing certain hardware will have full control yet again, only for it to be patched out (like in the case of the Nintendo Switch).
>Also forgot to mention with secure boot you are able to sign with keys you have generated.
Oh sure you can. The question is whether it will make you fail hardware-based attestation, like in the case of Android custom ROMs. Google effectively killed the custom ROM scene by giving app developers a bulletproof way to block them.
If you go down the rabbit hole and read about the trusted computing push by MS in the first half of the 00s, you will see that people generated so much negative press even before launch that MS stopped development.
I don't see this same scenario unfolding now... The articles about Pluton are basically the same copy-pasted drivel... All of them fail to mention the extreme danger this poses to freedom of expression in the online world. Something changed in the last ~15 years.
I spend 3-5 years getting the perfect PC setup, only to have it knocked down again every time I get a new PC and all the settings have moved, and half the programs that used to work now either won't or have a replacement that's not quite what I want.
I am not against progress, but I just need to work, so I now specifically keep the last two generations of my PC offline just so I can compile a client's firmware or modify a PCB with the same environment I developed it on. The next generations of development environments are going online, so that may not be an option for me.
At one point I designed complex communications systems from ISO layer 1 to layer 7, but these days I don't have a clue how to use the top layers, they change daily, and I need the guy in the IT dept to fix any issues with my smartphone or connecting to a client's network, so I feel everyone's pain.
I personally don't mind rethinks. I do this myself often. New insights come up all the time, I'm especially enamored with tiling window managers right now.
But what I do hate is taking away choice. A lot of these 'updates' have actually significantly removed configurability. "We removed this option because we don't think you need it" happens way too often. A computer exists to serve us. Not for us to bend to its will (or its manufacturer's).
It gets tiresome at some point. My first PC was in 1993, with MS-DOS and some kooky custom GUI that Packard Bell cooked up on their own. I'm a bit jaded now because I got really good at several of the early UIs I used and they of course are in the dustbin now. Nothing like having your skills thrown in the trash every couple years.
The Unix CLI skills that I learned in the late 90’s in university have had the longest staying power of anything in my career. They have extended their usefulness to Mac and now Windows with WSL. Pretty cool.
My solution is... don't try to get the perfect setup.
Learn to just be happy with the defaults and get on with what matters - the work you're using it for. I change maybe 1 or 2 settings on a fresh macOS install and that's it. I don't even change the wallpaper.
> I just need to work
So don't distract yourself with trying to create the perfect setup! Worse is better.
I don't know if Apple knows best or not - I didn't say I thought that anywhere and I'm not sure who you're quoting - the point is the opposite - I don't care. As long as the system is usable, get on and use it and actually focus on your work rather than tinkering for the sake of tinkering.
The only war I'm fighting against is wasting my time with system setup.
Defaults are designed for common users, who barely know how to copy and paste.
Their setup is really not working for me.
I want my desktop designed for me, by me. And if the OS thinks it knows better than me how to handle things, and regularly rearranges or reconfigures, then screw this OS.
Right but when you write it out like that can you see how it doesn’t make sense to expect that from a company? Not reasonable to complain when you don’t get it.
Or not remove the configuration utility, in some cases. No one here asked for telepathy and amazing defaults, but take away my configurability and I get upset.
Think about it like this - every single extra simple boolean configuration option doubles the configuration state space of the program. Add just ten option checkboxes for a feature and you have made the system three orders of magnitude more complicated (2^10 = 1024 combinations). Everyone pays for that complexity.
In reality this added complexity is not always so dramatic, when you have a solid architectural base.
And I believe Windows for example has one.
And Windows still runs fine for me without Cortana, for example. But I have to resort to hacks to make that change. So I am not demanding more features here.
I just demand more control over which features I want to use. That was there before and is being removed for reasons other than development complexity.
You were an outlier before, in all likelihood. It's vanishingly rare for one person to have the skills to design at both the PHY layer and the application layers above it. It may have been possible in the earliest days, with lower speed and more resilient comms protocols, but I'd wager it's practically impossible now at 1G Ethernet or wifi.
You are basically correct, but even today I can add a 10/100Mb Ethernet PHY to a low-spec microcontroller, understand and compile a basic TCP/IP stack, and add some simple telnet or HTTP server on top; several of my projects do this, and a lot of engineers have that basic understanding. Even humble AVRs and Casio calculators can run a basic TCP/IP stack, but the difference between this and gigabit Ethernet on some massive multithreaded operating system is night and day and, as you say, well beyond most.
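To give a rough sense of how little sits above a basic TCP/IP stack, here is a minimal sketch of a "hello" HTTP responder written against a plain sockets API (shown in Python for brevity, not the actual microcontroller firmware; the port number is arbitrary):

    import socket

    # Minimal HTTP responder on top of a bare sockets API.
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("0.0.0.0", 8080))
    srv.listen(1)

    while True:
        conn, _addr = srv.accept()
        conn.recv(1024)  # read (and ignore) the request
        body = b"hello from a small general purpose device\n"
        conn.sendall(b"HTTP/1.1 200 OK\r\n"
                     b"Content-Type: text/plain\r\n"
                     b"Content-Length: " + str(len(body)).encode() + b"\r\n"
                     b"Connection: close\r\n"
                     b"\r\n" + body)
        conn.close()

The application layer really is that thin at this scale; the hard part is everything a modern multithreaded OS and gigabit networking pile on top.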
I got sick of things changing too, so I developed my own platform which abstracts away from the underlying OS, so I won't have my settings changed randomly or have hard links broken. It may seem like overkill but stability has a price.
What bothers me is the idea that we can only have one type of computing--that for general-purpose computing to exist, we have to kill off every other kind of computing.
This is not a zero-sum game. We can have console-style computers and general purpose computers, and they can both exist simultaneously without one having to win and the other having to lose.
>We can have console-style computers and general purpose computers, and they can both exist simultaneously without one having to win and the other having to lose.
No, we really can't, and you probably didn't read the article if you think we can.
Making general purpose computers and the software to support them is less profitable than making smartphone-ish locked down computers. It's just a fact. The corporations will gravitate towards the most profitable options, as they always do.
"But as long as there's a market, even small, someone will make them" you say. But you're forgetting that computers don't exist in isolation. They just don't work that well on their own. The main reason they're so useful is because of networking. You can be running a computer with 100% free software, but you will probably still use online services that are not free. You need to use them to live a normal life in the world of today.
But thanks to hardware DRM, you might not be able to use these non-free services on a free device for much longer. Do you know what happens when you try to open the McDonald's app on an Android phone running a custom operating system? It doesn't run. The server tells you to fuck off. It sends a message to your device's TrustZone, a black-box security chip that you have absolutely no control over, asking if the device is running an original locked-down OS. The chip signs the response with its own private keys that cannot be extracted and sends it back to the server, which can then decide to reject you if it's not the response it wants. This is the reality of smartphones. It's not a joke.
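To make the flow concrete, here is a self-contained toy model of that check (illustrative only; an HMAC stands in for the asymmetric signatures and vendor certificate chains a real chip uses, and all the names are made up):

    import hashlib, hmac, os, secrets

    # The "security chip": holds a key the owner of the device can never read.
    CHIP_SECRET = secrets.token_bytes(32)

    def chip_attest(nonce, boot_state):
        # The chip signs "nonce + boot state"; the OS cannot forge this.
        sig = hmac.new(CHIP_SECRET, nonce + boot_state.encode(), hashlib.sha256).digest()
        return {"nonce": nonce, "boot_state": boot_state, "sig": sig}

    def server_accepts(statement, expected_nonce):
        # The vendor shares verification material with the app's server,
        # never with the device owner.
        expected_sig = hmac.new(CHIP_SECRET,
                                statement["nonce"] + statement["boot_state"].encode(),
                                hashlib.sha256).digest()
        return (hmac.compare_digest(statement["sig"], expected_sig)
                and statement["nonce"] == expected_nonce
                and statement["boot_state"] == "locked")

    nonce = os.urandom(32)
    print(server_accepts(chip_attest(nonce, "locked"), nonce))    # True: stock OS
    print(server_accepts(chip_attest(nonce, "unlocked"), nonce))  # False: custom OS rejected

The key point is that the signing key lives in hardware you don't control, so there is nothing you can change in software to make the server say yes.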
And now, with Windows 11 requiring a TPM chip which is just TrustZone for x86, this is coming to desktops. And everyone is eating it up, to the point where TPM expansion boards went up in price after the announcement. Nothing will stop it.
10 years from now you will try to open some random popular website on your linux computer and it will not work. It will detect that you are not running an authorized system and reject your request. Want to order food? Too bad, use a locked down device. Want to buy something and have it delivered to you? Too bad, use a locked down device. Want to access your bank's website to check your account, or even just spend money? Too bad, use a locked down device. The bank part is already a reality, many banks today require verification using their app before letting you make online purchases, and the app only works on locked down smartphones.
Eventually, when enough network services stop working on "general purpose computers", 99.9% of the population will not want to use them anymore and they will disappear.
Unfortunately you are exactly right about what DRM/TPM is going to do to computers. Once Windows 11 reaches 50% marketshare, some Western government is going to demand that ISPs in their country not allow anyone online unless they are using a government-approved OS. Then they will require OSes and app stores to ban Tor and E2E encrypted chat apps.
Perhaps they won't go so far as to kick Windows 10 computers off the internet, but they might at least restrict them to certain sites and protocols. They could also say that people running "unsafe" OSes must install a government-issued CA certificate, to allow TLS interception.
Web hosts don't generally need to make outbound connections, but in any case, companies will be allowed to register specific domains / IP addresses with a government regulator, on the condition that they don't support E2E encryption (and, depending on the contents of the site/service, they may have to hand over a copy of their TLS private key, and not use forward secrecy).
Also, of course there will be government-allowed Linux versions, which implement Secure Boot and have a package manager that only installs approved apps. Admittedly, I'm not sure if this theoretical government would stop people writing their own software, but they could demand that OSes only allow "approved" apps to send packets over the internet. This would also prevent people developing "piracy" apps.
It'd be too cumbersome; there will always be a new 'hack' to install a different package, then there's the ability to inject your own messages and send them over these 'approved' apps, and there's little to no way of ensuring such a huge infrastructure is all up to date on the 'security' patches which stop these exploits.
You can start to see the restrictions creeping in. It seems inexpensive tablets don't play videos from the major vendors (Netflix/Amazon/HBO) if the hardware/OS doesn't support the "Widevine" DRM solution.
I genuinely wish I could give more mod points. This is already happening. If there is going to be a war, its outcome is far from a given (and I personally worry general computing will be on the losing side).
The only thing that could stop it is us. We are still creating the building blocks that make it all happen. It is not like we do not stand a chance, but it is hard not to feel pessimistic about the outcome, since just about every communication from the power centers can be summed up with 'moar powah, moar'.
In the old days you couldn't do banking or shop on your PC because the web was young and these services didn't exist.
Perhaps in some future time your hacker computer won't be useful for these things either, so you have to use your phone instead. This seems livable? Maybe we should aim at doing more interesting things than shopping and banking?
Neither TPM, DRM, nor an app store was ever necessary for secure online banking, so I'm not sure what service you're referring to. TLS doesn't require those things.
I'm going to guess that banks didn't offer online check scanning until smartphones got popular because it would have been a tech support nightmare. First scan your check with a flatbed scanner, in the fairly unlikely event you have one, or maybe try your low-resolution webcam, which a website couldn't access through your browser in 2010, then upload it through a web form... how many users have we lost at this point?
Mobile banking apps with check scanning started showing up before hardware DRM was common on Android devices, and not all current apps make use of the hardware DRM. I think it's therefore safe to conclude the UX rather than the DRM was the enabling factor here.
Is there information available on actual breaches involving banking on rooted Android devices? I smell BS when banks and the like claim it's an unreasonable security risk.
Okay, I see what you're saying. I do think not being able to do trusted transactions online with computers I control is a pretty dire and serious regression of individual freedom at this point. It's dystopian having to buy a $500 planned-to-be-obsolete device which spies on me (on behalf of enemies to my freedom) every few years to do basic things in society.
The economics of mass-production don't work as market sizes shrink, you can observe this by comparing MP3 player prices and selection today vs a decade ago.
Looks like the 'usual' capacity went from 8 GB to 32 GB and anything over $50 disappeared, in lieu of dozens of inexpensive devices from Chinese manufacturers?
A decade ago, there were more MP3 player choices with low cost and good quality, even supporting open-source firmware. Now there is low cost/quality and high cost/quality. This thread has more details, https://news.ycombinator.com/item?id=26870648
There was a time when laptops had 4:3 screens. Then the market changed because of demand for HD-video screen ratios. Today, the only portable option for 13" 4:3 screens (vertical real estate for text editing) is iPad Pro.
Hyperscaler cloud vendors can now order custom CPUs that are simply unavailable to retail. There's an endless list of markets which have been altered as demand volume shifts from one segment to another.
The original essay about general purpose computing points out that it's the underpinning of the "special-purpose computing", and that the "general" part will always bubble to the surface, unless users' freedom to own their devices is taken away.
So the meaning of "war on general computing" is closer to "war on ownership". Sure, owned and unowned computers can both coexist, but it's not clearly a good thing to allow someone else to control one's devices.
> unless users' freedom to own their devices is taken away
This is a really easy problem to fix though; we are already quickly moving in that direction. Every big provider is moving towards more and more hardware and software lock-ins, and since everyone is doing it slowly, consumers aren't reacting. I'd be surprised if general purpose computers aren't seen as a niche market in 20 years and illegal in many parts of the world in 40 years.
Here is roughly how it will happen: as more people are locked in, general purpose computing will get an increasing density of bad actors. Big tech will publish studies showing that almost all viruses, malware, scams, child porn, etc. come from general purpose devices, and will therefore push for regulations to ensure everyone uses a locked-down device. Anyone who argues against that will be attacked with arguments such as "Are you really going to prioritize your hobby at the expense of children getting sexually abused? There is no need for anyone but criminals to own a general purpose computer!"
There is an enormous amount of software being written every day, much of it outside of Silicon Valley, much of it outside America; there are far more forces at work here than the SV tech monopolies, and while I absolutely find the picture you're painting believable, I feel (hope) it might not survive reality.
It requires the big powers to work together, and for every other actor to not contest it, for governments to accept and endorse the monopolies, and we're already seeing some fighting back in those fronts. I imagine the monopolistic orchestration will pull itself apart long before general purpose computing dies.
I suspect we'll see much ado about all of this in court at some point, it will be sad to even get to that point, but even if the consumers are happy campers I really doubt other businesses, and other stakeholders (nations included), will be.
>We can have console-style computers and general purpose computers
Up until the time you try to get your non-GP computer to do something the manufacturer didn't want you to, such as retrieving some data locked in your Android phone.
My biggest fear isn't technical, it's cultural. Computing doesn't feel like it's winning hearts & minds. Computing gets further & further away, less and less personal, less intelligible, more mystical every year. We accept more magic into our lives, & the sense of engagement, the sense of ownership, the idea of personal computing feels like it's fading.
I'm a techno-optimist, but there's going to be such a huge lag between the wins we start to make, the re-freeing up of computing, & any significance or adoption. We need to re-liberate computing, make the technical victories, before we can even begin to fight the real general-purpose computing war. The dream of computing needs to be re-kindled.
Perhaps the market for GP computing just is smaller than... the market for the magic smartphone.
We tend to think everyone needs to do computing, and understand the technology they rely on. But I don't understand the magic that goes into the medicine I take. Nor do I have an understanding of how the electricity grid operates. My computer just magically gets power!
The market for commoditized computing is just bigger than general computing. That doesn't mean GP will go away.
Steve Wozniak wanted - and built - a pre-assembled computer for tinkerers and engineers; it also turned out to have some mass market appeal as a game and spreadsheet machine, and Apple made a fair amount of money selling it to hobbyists, gamers, schools, and businesses.
Steve Jobs realized that computing appliances (from computers that you couldn't open up to handheld music/game/app/phone devices) for people who typically had little or no interest in tinkering or engineering ("the rest of us") was a much larger market. Apple claimed the high margin section of that market and became one of the wealthiest companies on the planet.
I recall a story about Jobs being opposed to hardware - and software! - upgrades for the original Mac because "you don't upgrade your toaster." That's precisely the thinking behind the iPod, iPhone, iPad, and Apple watch - except Apple now knows that you'll have to buy a new internet-connected toaster every few years if they stop producing security bug fixes for your old one.
Good point - some 2012 Macs are still getting security patches so that's 9 years and counting! Some iPads are nearing 7 years. It depends on the device though. Apple TV 3 from 2012 is limping along but some popular channels (HBO, Youtube, etc.) no longer work. It remains to be seen what software support will look like for intel Macs as Apple moves to ARM.
Apple's officially supported repair lifetime for hardware lasts for 5 years I believe. I managed to squeak in a few repairs right before some devices became "vintage" by Apple's reckoning, though I think they may still offer repairs for some older devices if parts are available.
I run into this all the time. But medicine doesn't get called a "bicycle for the mind".
The question of market share is uninteresting to me, now. We have zero idea what the market is. The ecosystem is unhealthy, rotting; consumer dissent is skyrocketing (see the Freedom Phone yesterday as a prime example). Fixing this situation is not doable in 18 months though. We don't fix the war on general purpose computing with a product launch (although the PinePhone alone might singlehandedly jump-start the sea change). This market-based mentality I find so reductionist & off base, beside the point. We have so much pioneering to do in computing, so much freeing of people to enable them to begin to think.
I don't think it is, people just don't care. They want to pull the device out of their pocket and take a picture or message a friend. If anything a more general purpose device would have less appeal because it may well get in the way of doing those things.
Maybe you aren't winning hearts and minds because you aren't actually offering something people want. Maybe you aren't going to re-kindle any dreams because you are not offering anything worth dreaming about.
> Maybe you aren't winning hearts and minds because you aren't actually offering something people want.
I feel like most people have no idea what tech is offering.
There have been some special purpose projects (FreedomBox, NextCloud, &c) that have specific ideas, but the relatively new YUNoHost is a newer breed of example, creating an easy-to-run way to get people started hosting their own stuff. https://yunohost.org/
The federated model is interesting in that it means not everyone has to operate their own stuff. Instead, we have protocols of interoperability, and lots of hosts. That gives people the experience faster, but yes, it's still currently sub-par. And most importantly, there's no Competitive Compatibility (ComCom, formerly called Adversarial Interoperability), so most things we write will not interlace with & work with people's existing networks.
> Maybe you aren't going to re-kindle any dreams because you are not offering anything worth dreaming about.
You're welcome to your opinion on that. Indeed right now is a good time to question this. General purpose computing is a very nebulous header, with a lot of different ideas. Certainly there's the negative-liberties we can see slipping away, as DRM, as cloud-computing removes us from power over our systems. The appliance-ization of computing is indeed the forefront of what people think about when they think about general purpose computing, and I tend to agree that a far more revolutionary outlook, with much further-reaching positive-liberties, is what we ought to be dreaming about. The difficulty is that these endless open frontiers are still unsure: each of us has to carry our own little light, try to figure out who else's light to join with, whereas the big huge forces of the world & their snuffing-of-the-light are much more visible, apparent, easy to rally around. So we need to make some traction on the big dreams forwards; we need something encompassing and bold that floods people's imagination with possibility & excitement.
This is just my 2c, but the ubiquitous & pervasive computing world, I think, spoke to a vector of computing as cross-system, as connected, that currently is almost entirely anti-General-Purpose. We don't have good open general systems for working cross-system. This is one hub of capability that we need to encompass in our dreams, that needs to be part of the General, for the General to get far. But it's just one piece, just one aspect. The dream needs to immerse us fully.
I think work like Karli Coss's data-liberation is basically the ground floor of where we need to start. This is still early prototype stage, basically, but we need wide-spanning access to a huge cross-section of the digital (in much easier-to-pull-off manners) to even begin to allow the interesting dreaming to begin, to begin to inspire each other: https://beepb00p.xyz/myinfra.html
I'm not so hopeful. To have some guarantee of rights and freedoms today, some sacrifice in convenience is needed. Most people I know who can understand what is at play are not willing to sacrifice even a bit of convenience.
The purpose-specific computing is more profitable right now. If we make general-purpose computing more attractive, then we may have a chance. But even then, compatibility may be difficult.
> The purpose-specific computing is more profitable right now.
If people paid for open general purpose systems and software, those would be more profitable, because people use them more and use them for more serious things.
I am very close to deciding that the FOSS movement is partly responsible for this dystopia. More specifically it's the substituting of free "as in beer" for free "as in freedom." These two are actually at odds. Free "as in beer" is the bait on the hook for surveillance capitalism.
You can see how easily politicians can conquer minds; it's no surprise trillion-dollar companies are able to sell anti-consumer products at luxury pricing.
No, it isn't. This is intentional ignorance and if people keep believing it we will absolutely lose. This kind of thinking goes all the way back to the 1980s and 1990s when people said "GUIs are for wimps" and became increasingly irrelevant as everyone started using GUIs.
When I am driving and need turn by turn directions, if I have to take extra steps to get my maps app to work I might have to pull over or might try to do it while driving and crash. My map app must "just work."
If I'm about to give a talk and I plug in the projector's HDMI cable and my video driver crashes and I have to load up a config file, I look amateurish. My video subsystem must "just work."
If I'm trying to close a deal and can't share a document, the deal may fail and revenue could be lost. People might even lose their jobs. My collaboration system must "just work."
I could keep going.
Convenience is extremely important in the real world. It saves time, money, and even lives.
I agree; things people don't value themselves are often dismissed as 'marketing'. No, these are just things other people value more than you do, and convenience is massively important. Without a lot of effort put into convenience, there are lots of technologies many, even most, non-technical people would never be able to even use.
This is the kind of thinking causing FOSS or other grassroots movements to fail. Convenience is extremely important. Much like you don't wake up every day taking pride in understanding every aspect of the electrical and water distribution to your house or how your car engine works, most people don't want to understand how their software works. That doesn't mean it shouldn't be easy for folks who _want to understand_, but unfortunately a lot of grassroots software just gatekeeps this way. The result is the slow death of GPC as users use the thing that's easy and there's no privacy or freedom-respecting alternatives that non-technical users can actually use.
I can see the day coming when few people will have general-purpose computers. Those will be the people who make things, and also have a good set of tools and maybe a milling machine.
This has already happened with phones and tablets, after all. And Chromebooks. And Windows 365. And Windows S. And locked-down enterprise machines.
I'm really not sure I see this. Enterprise devices are more portable than they used to be, not less. Gone are the days of science relying almost exclusively on supercomputers that could only run specific proprietary Unixes and basically required proprietary compilers. Connection hubs and DSPs doing signal translation from various industrial devices and military and space communications networks to IP networks have gone from almost exclusively ASICs manufactured by one of two companies to FPGAs, fully reprogrammable blank slates you can do pretty much anything with.
Phones and tablets are certainly less general purpose than desktop and laptop PCs, but much more general purpose than earlier incarnations of phones and tablet-like devices such as Palm Pilots, digital address books, graphing calculators, flip phones, and landlines: devices that could only do one thing and couldn't have any kind of extension application installed at all, from anyone, whether it was part of a walled garden or not.
If the average American teenager today has nothing but an iPad and iPhone, that isn't completely general purpose, but it's a huge improvement on when I was a teenager 25 years ago and the closest thing my family had to a computer at all was a word processor. Not a software suite like Word or Lotus, but a specialized typewriter with some proprietary embedded firmware and no writable memory at all.
You make good points: the barriers to entry have dramatically decreased across a huge portion of the stack. One notable exception: the cost and complexity of modern fabs increase every year while the companies involved consolidate or die out.
tsmc could easily decide “we’re not gonna fab your RISC-V design” at any moment and you’d be more or less SOL. I mean, you could fall back to a 150 nm process, but that kills a lot of opportunities.
And if things continue — if Apple, Amazon and Google move further in the direction of custom silicon — this will happen. Apple already has a pretty big advantage with their custom silicon — which they don’t sell for use in non-Apple products. Now just imagine if they brought tsmc in-house and had exclusive access to the most advanced fabs on the planet. Your ability to compete, and the room for people to build GPC would constrict MUCH further.
We have this enormous, all-pervasive stack built upon a foundation of like 3 companies, in the midst of massive consolidation. I honestly don’t understand how so many people who think about that are comfortable with it.
> Those will be the people who make things, and also have a good set of tools and maybe a milling machine.
One of the chief things I hope home-cloud operators get to, quickly, is multi-tenancy. Given how easy it is to take some Raspberry Pis & build a home Kubernetes cluster (or to spend $1000 & build a radically better version), the next question is: how do we scale that impact? (A tiny sketch of what I mean by multi-tenancy follows below.)
I'd love for my work to scale to my friends! I used to spend so long trying to build LDAP into the FTP, HTTP, XMPP, &c. self-hosted systems I made, thinking one day it might help friends too. And I still think that way, but now that vision is less about building super-tip-top services to serve everyone, and more about building a platform that my friends could run their own services on easily, in a reasonable way. I hope #selfhosted begins to federate somewhat, so that we can self-host for each other via some common, well-known platforms that support these endeavours and help us build together.
Personally, I like to imagine a grade school having a half-dozen servers, and kids getting their own virtual clusters to operate as they might, to learn about & immerse themselves in computing. This feels like a maker-space sort of idea: a collectively owned means of production, an availability of tools that is community owned & operated. Ideally, in my school-server model, the kids themselves get (at some point) the end-to-end experience of bootstrapping their own clusters: take one machine out, format the drive, compile a Linux kernel, install the OS, install the cluster/platform software, join another hardware unit to it. Similar to a RepRap producing another one, sort of: creating the chain of knowledge to reproduce & understand.
There are plenty of semi-interesting existing examples to cite with regard to collective hosting: the SDF cluster, tilde.club, &c. I guess I hope that we can virtualize a little more and give more people something closer to their own sovereign little spaces on shared computing hardware, whereas historically these have been operated more akin to singular shared spaces.
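For what it's worth, here is a minimal sketch of the kind of multi-tenancy I'm picturing, assuming a small cluster (k3s or similar on the Pis) and the official `kubernetes` Python client; the tenant name and quota numbers are made up for illustration. Each friend gets their own namespace plus a resource quota, which is roughly the smallest unit of "your own space" Kubernetes offers:

```python
# Sketch only: carve out a namespace-per-friend "tenant" on a home cluster.
# Assumes a working kubeconfig and the official `kubernetes` Python client;
# the names and limits below are invented for illustration.
from kubernetes import client, config


def create_tenant(name: str, cpu: str = "2", memory: str = "2Gi", pods: str = "10") -> None:
    config.load_kube_config()  # talk to whatever cluster the kubeconfig points at
    core = client.CoreV1Api()

    # One namespace per friend keeps their workloads logically separated.
    core.create_namespace(
        client.V1Namespace(metadata=client.V1ObjectMeta(name=name))
    )

    # A quota keeps any single tenant from eating the whole (tiny) cluster.
    core.create_namespaced_resource_quota(
        namespace=name,
        body=client.V1ResourceQuota(
            metadata=client.V1ObjectMeta(name=f"{name}-quota"),
            spec=client.V1ResourceQuotaSpec(
                hard={"cpu": cpu, "memory": memory, "pods": pods}
            ),
        ),
    )


if __name__ == "__main__":
    create_tenant("friend-alice")
```

Real multi-tenancy would also need per-tenant RBAC and network policies, but namespaces plus quotas feel like the ground floor of "self-hosting each other."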
I don't know anyone who got into programming who didn't do it on a computer they had for other reasons. Nobody buys their kid a $2000 device just in case they might want to be a programmer, especially since the kid isn't going to play with that device -- there won't be any games for it.
No, it's always a side-effect of using GPCs for other things. Even, or really, especially when it's inconvenient.
Anecdotally, it is indeed getting harder to hire good people. I don't know that we've reached 'peak programmer', but the count isn't growing as fast as it used to.
Your "nobody" scenario is basically the whole beginning of the personal computing era. "Useless" machines that cost a fortune, had no available software (to within statistical error) and booted to BASIC. Without that era, we wouldn't have this one.
As long as you have paid for your annual "software development licence" from the government, and they haven't revoked it after finding you breaking "best practices" like producing or using encryption software without a government backdoor.
We are talking about the future here. Restrictions on our freedom will be added until it is no longer profitable or until people start to protest. And so far it has been immensely profitable and among the general public basically nobody complains, so more restrictions will likely continue to be added for a really long time.
There are subscription-only development studios and toolchains. (Always have been, really.)
If there is a new platform where this is the only kind of thing we can run on a walled-garden device, then I'm not sure how general purpose you could call such a device.
If your FOSS-developed app wants to open a network socket to an IP address that isn't on the government whitelist, the OS will perform an OCSP-style check to ensure the (hash of the) binary has been recorded, along with the identity of its developer, in a corporate- or government-run database.
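To make that hypothetical concrete (everything here is invented: the endpoint, the fields, the policy), the gate itself would be trivial to implement; it amounts to little more than "hash the binary and ask a central registry for permission" before the socket is allowed to open:

```python
# Hypothetical sketch only: what an OS-level "is this binary registered?" gate
# might look like. The registry URL, request fields, and response format are
# all invented for illustration; no such API exists.
import hashlib
import json
import urllib.request

REGISTRY_URL = "https://registry.example.gov/v1/check"  # hypothetical endpoint


def binary_is_registered(binary_path: str, developer_id: str) -> bool:
    # Hash the binary exactly as it sits on disk.
    with open(binary_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()

    # Ask the central registry whether this hash + developer pair is on record.
    payload = json.dumps({"sha256": digest, "developer": developer_id}).encode()
    req = urllib.request.Request(
        REGISTRY_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("allowed", False)


# The OS would refuse the outbound connection unless this returned True.
```

The chilling part isn't the technical difficulty (there is none); it's who gets to run the database.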
> And is there really anything wrong with that? Every house doesn't need a milling machine. Why should every house need a general purpose computer?
Why not have general purpose computing? Computers are vastly powerful machines that could be useful over very, very long spans of time. By allowing closed, proprietary, locked-down applianceization of computing, we create expensive high-tech consumer devices which seem to quickly (within 5 years) become unsupported, obsolete, unmaintainable, & unrepairable.
Computers have nearly endless uses & applications, until we artificially restrict them. Society ought to try to keep computing general, because it allows us to adapt & update systems along with the times. Nothing but general purpose computing seems to be renewable. Why should we have other forms of computing?
If every house had a 3D printer, or a milling machine, or a small chip fab, or insert home manufacturing machine here, they would be far less dependent on a handful of established players for things like replacement parts etc. And much more immune to government regulation seeking to control them and what they can buy/own/do.
A 3D printer, chip fab, or CNC mill isn't useful without feedstock, designs/models, and electricity. So even if every household had a bunch of micro-fabrication devices the government could still regulate whatever they pleased by regulating access between your property and the outside world. If everyone had a 3D printer and was manufacturing things the government didn't like then PLA/ABS/resin would quickly become regulated.
Not to mention the "printing cartridge model" that some manufacturers are already trying to play at. Unfortunately this is inevitable as the 3D printing industry is getting more commoditised.
A majority of people don't want to take the time to make their own parts any more than they want to repair their cars or appliances. The article can put convenience in scare quotes all it wants and blame mega-corporations, but it's the reality of consumer preference and specialization in complex societies. People's time and motivation are limited. Spend time fixing your own stuff, or delegate it to someone who does it as a profession?
I don't think these are for every household. Right now a 3D printer is incredibly useful if you're really willing to put in the work to learn, optimise, and even design your own parts.
Joe Soap is never going to do that, or even want to. He doesn't even have a need for parts; after all, he doesn't repair his stuff, he just throws it away and buys new stuff.
Perhaps he'll buy a 3D printer when he can go to Amazon and click "print" instead of "order". But we're a long way from there.
I can see the day coming when owning and operating one of those will require the equivalent of a carry permit.
And just like for weapons, none of the iphone and chromebook users will understand what the fuss is all about, who would want to use one of these anyways.
"You think you have won! What is light without dark? What are you without me? I am a part of you all. You can never defeat me. We are brothers eternal!"
I enjoyed the first half but then it became a bit of a rant. It also went from praising creativity to vilifying developers who make you "require an internet connection". Everything is connected nowadays, especially now that we are forced to offer websites as "apps". And just as websites constantly change, so do apps require regular updates. For basic reasons such as compatibility, UX, etc
It's not all black and white.
But I agree, we will win. And I suspect the big players already know this; that might help explain their obsession with squeezing every cent out of their market dominance.
How many times did good websites need to be updated before they became good? Are you sure a change in the w3c spec won't require further changes? Or that new technology won't require different meta tags?
And how can Apple's practice of deliberately blocking web functionality and then disallowing different browsers be described as anything but forceful?
I don't know why that project died but I'm not surprised. It reminds me of how smart phones used to have removable batteries and were easier to replace parts for but many consumers preferred to buy slim phones that couldn't be opened.
While true, the loss of removable batteries didn't happen in isolation. Apple led the charge with a product offering a lot of other advantages; then, once they had a moat of network effects and a walled garden, it caught on elsewhere.
Yeah, I used to work for BlackBerry and watched the whole thing unwind in slow motion. BB focused on the wrong things and had shit software. (And I say this as a person responsible for writing said shit software)
After reading the text for a while, other GUI elements on my screen started slowly turning cyan. It took me a while to realize it's a weird compensation of my eyes to the website background color. (I used to have a monitor with a VGA connector that occasionally dropped one color channel)
I think the general trend will be to push power users toward portable 'platforms' like Mathematica or, to use another analogy, trading apps like ThinkorSwim.
> "As users see their smartphones weaponized against them, and find few real alternatives, some are expressing fears that Tech Giants are plotting an oblique coup in all but name, and positioning to usurp national governments with their own brands of cybernetic governance. They are building in control and exclusion, disinformation, private digital money and surveillance capitalism into gadgets we seem unable to step away from. I believe this threatens Western liberal democracy, fought for at such cost 80 years ago."
If you stopped there then you missed the parts where the author moved on to ranting about the Soviet and Maoist ideology of Big Tech!
I'm a big supporter of free and open source software but this article is way over the top.
The article misses how much of a golden age we're in. There are tons of devices available where the operating system and full app suites are open source! Desktop Linux and even un-Googled Android smartphones are the best they've ever been. Good dev software stacks are FOSS and competitive! The web is built on open, interoperable, vendor-agnostic, CPU-agnostic, OS-agnostic standards! We no longer face the threat of a closed-source OS (Windows) eating all computing with no alternatives, open or not, as we once did.
Programming becoming a very lucrative career path has ensured that there's a strong "computing culture" and no shortage of resources available to get started. Newer programming languages that are much more beginner-friendly than C yet still powerful have taken off, shortening the on-ramp for people to competently contribute to software they use. The computing world has changed in the last few decades, and some people have failed to notice the many huge positives. Almost every sentence in this paragraph represents a predicted doomsday we avoided.
You don't think that giving the government (or entities even less accountable) complete control over online information, discussion, and commerce, plus an almost perfect 24/7 surveillance system, might weaken liberal democracy?
> You don't think that giving the government (or entities even less accountable) complete control over online information, discussion, and commerce, plus an almost perfect 24/7 surveillance system, might weaken liberal democracy?
The government doesn't have this control, nor do any of the FAANG companies. I know quite a lot of people who are online without the use of any FAANG equipment or resources. If you granted these entities this power, then you did so of your own choosing. Whether that was wise or not is an entirely different question. Whether it might weaken liberal democracy, I argue no. Liberal democracy is being attacked by our long-term enemies. They're using new tools and our ignorance to have us rot from within, but it's a mistake to blame technology for what was done by the hands of our enemy. Lenin turned out to be correct: the capitalists really did sell them the rope they would hang us with. Perhaps I see this from a different perspective due to my being a cold war kid.
> The government doesn't have this control, nor do any of the FAANG companies.
That's correct, they don't have complete control yet, and I agree that right now liberal democracy is being attacked by illiberal authoritarian regimes, as it has been for a long time.
However, I think the point of this article is that we are also in the middle of a war for general-purpose computing, and that war might end with governments and/or corporations in a much stronger position than they are now in terms of censorship and surveillance.
Already we are seeing isolated incidents of people being rendered less free because of the control that modern platforms grant to authorities, and I don't think it is helpful to blame the victims for choosing to use mainstream operating systems or social media sites that end up destroying their livelihoods, for example.
Outside of those on the extreme, market forces ensure that consumers get what they pay for when it comes to computing. The paranoid will always be paranoid, that doesn’t mean the sky is falling.
I think I agree with the article. However it rambles on too much and the colour scheme is extremely hostile to my eyes. Sorry. I think you have a good point but you need a TL;DR. And please, white on hot pink is not a good choice.
> Climate change is in the process of teaching us that mono-cultures built in the service of a few powerful industries are a risk.
Sorry for nitpicking, but... climate change is an indirect consequence of the industrial revolution, and the industrial revolution has saved orders of magnitude more lives and fed more people than climate change is likely to affect.
I would guess 150 years. I think by the beginning of the next century we'll have the engineering prowess to use large-scale geoengineering projects to basically set the climate to whatever we want (as long as we don't suffer a massive worldwide societal or economic collapse in the interim).