Using computers more freely and safely (akkartik.name)
159 points by gstipi on May 29, 2023 | 65 comments


So much can be said in response to points or perspectives that aren't actually part of a person's position. The number of remarks about the unspoken tends to increase with the responder's nitpickiness over the author's choice of words.

One of the strangest things is when people over-interpret modest remarks as declarations of war against the fabric of sensibility and human decency. This seems to be a trend nowadays.

I don't think the author of this article/talk meant to discourage people from using all software other than the kinds he discusses here, and it doesn't read that way. It reads like a modest appeal.


Thank you. Words commenters aren't focusing on:

* Prefer

* "Gravitating away"

* "even one of these suggestions"

* "better than nothing"

* "This isn't always possible. Sometimes we have to use software with millions of users. But I think it helps to rely on small-scale tools as much as possible, and to question tools for large crowds of people at every opportunity."

(I'm the author.)


>But I think it helps to rely on small-scale tools as much as possible,

The problem is that your presentation doesn't cover the various downsides of choosing "small-scale" software. There are tradeoffs, and if you don't explicitly highlight them, it's a disservice to readers.

E.g. you mention that large-scale software for millions is "expensive". But small-scale software is also expensive, in different ways. (Software used by only a few can cost more in time/labor/hassle because of missing features, required workarounds, lack of tutorials, etc.)

I've written several utilities in the "small-scale" software category for my friends to use, and that experience taught me that most people (who are not hackers & techies) should use "software for millions" as the default choice.

If you're one of those who choose small-scale software (e.g. your old Lua v5.1 anecdote), I think you're already part of a self-selected group and don't need blogs suggesting it to you. You're also willing to overlook the downsides.

I like most of Clay Shirky's writings on various topics but his particular essay on "Situated Software" which you cited is incomplete and misleading because it doesn't cover "software rot": https://en.wikipedia.org/wiki/Software_rot


It's quite possible I'm too far captured by my own belief system, because to me it doesn't seem like a contest. Perhaps someone else needs to write the rebuttal, and defend mainstream software with its constant bugs, vulnerabilities that stay unpatched for months, and unaccountable "feature" additions.

I'm not saying we should use situated software everywhere. I'm saying we should _try to_ use situated software everywhere. I think of this as evolutionary adaptation advice. Akin to, "you're evolved to climb trees, avoid plains." My hope is to stimulate demand for situated software, with cascading improvements in availability and convenience.

Situated software has convenience costs. To that I say, "suck it up." It's good for you. With the full knowledge that most people will ignore me.

It's important to weigh the costs on both sides by their impact when they occur. That's a big part of why I think it's not a contest. One 0-day pays for thousands of problems on the other side.


I am not technical. I come from a science background, so I am somewhat limited with regard to software development.

I used to try a lot of apps to produce documents and structure my mind: Notion, Google Docs, Word, Calc, Excel... It was fine, but I agree that sometimes it's a pain. Google Docs runs very slowly on my computer, Windows overheats my laptop, etc.

Finally I discovered Emacs. I am pretty bad with elisp, but I love it. Now I just use Emacs, a couple of web browsers, bash, Python, and some Perl scripts.

While this article oversimplifies some things, it has valid points for specific cases. Btw, I like this web design!


> Finally I discovered Emacs.

Do you mean Emacs with Org-mode? Or just plain Emacs? I'm curious, because I don't know much about Emacs but I heard a lot of good things about Orgmode.


Yes, Emacs with org-mode. In fact, I got into Emacs because I wanted to try org-mode.

Org-mode is just a mode; Emacs has a ton of modes, major and minor. The GNU Emacs distribution, vanilla Emacs, already comes with org-mode.

That was six months ago. Org-mode blew my mind. It was a transcendental experience; I was thrilled. It had everything I wished for and much more. Thanks to it I discovered Emacs, and I love it, it makes my life easier. What's great about Emacs is that you can answer any question you have with the manual. It's awesome, mind-blowing.

But it has a steep learning curve. I got through the vim tutorial in less than an hour. Going through the Emacs tutorial took me 4 afternoons, one hour each day. It took 4 days because it's too much for one sitting. I almost gave up; I am happy I didn't. You have to take it slow, there are a lot of crucial key bindings you have to learn.

If you like writing or organizing ideas, notes, etc., I recommend it. At least try it and then decide for yourself. But you have to put in some time to get familiar with the Emacs ecosystem. Take it like a pianist: slow and steady makes you learn twice as fast.

I am just a novice, but this summer I want to learn more elisp and try to make a package. I hope this is not off topic. Emacs fits the description in the article, right?


And for those who find the Vim approach to most things preferable, Doom Emacs sets up Emacs + Evil-mode in a way that makes Emacs+Vim feel smooth, and doesn't really require any effort on your part. Big recommend.


(bit of a rant here, you've been warned)

As someone who could've developed an IT/programming career, but didn't because I felt things were already bloating back in the '00s, I agree with the majority: "harvesting your own food" can be rewarding but also a tedious and thankless job. It's certainly not for everyone, but if it works for some people then it is (let's put efficiency aside for a moment) perfectly valid. In fact, being more of a H/W guy, I find myself gravitating towards this approach more often than not. Leanness and reproducibility are key for my workflow (I went the RF-world path); I can't afford different end results when a dependency changes/breaks something.

IMHO, keeping up with the modern paradigms for S/W development looks like a never-ending nightmare. Yes it's the modern way, yeah it's the state of the art. Still, I didn't feel it was a wise investment of my time to learn all those "modern dev" ropes, and I still feel that 20 years later. I'm nowhere near antiquated and I'm on top of all things tech (wouldn't read HN otherwise), it's just...

I see former friends/classmates who went this way, and they're in a constant cat-and-mouse game where 50% of the time they're learning/setting up something dev-chain related, the other 50% doing actual work, and 98% of it feeling way too stressed. I see modern Android devices with their multi-MB apps, bloated to hell and beyond for a simple UI that takes ages to open on multi-core, multi-GHz SoCs. I see people advocating that unused RAM is wasted RAM, never satisfied until every byte is put to good use, reluctant to admit that said good use is just leaving the machine there "ready" to do something, but not actually doing anything _productive_.

And yet.

Without that bloat, without the convenience of pre-made libraries and assist tools for almost every function one could desire, we wouldn't be where we are now. Imagine for a moment doing AI work, 3D movie rendering, data science, etc. with a DB-less approach on single-core machines with every resource micro-managed to eke out the most performance. It's simply not feasible; we would still be in the '90s... just a bit more hipster.

This article resonates so well with me. And at the same time, it feels so distant.


I was trying to square the same tension in my mind when I made OP. And the compromise I arrived at was, "try to find people with complementary interests to organize with." That's really what "software with thousands of users" boils down to. If programmers take the lead while software is still small and approachable, and non-programmers coalesce around their forks rather than upstream, we might slowly evolve towards the hazy societal organization I'm vaguely pointing in the direction of.

But an essential component of this plan is for non-programmers to articulate early and often their desire to migrate away from the current monopoly they are forced to use.


Of course we non-programmers want to move away from big-corp environments. Here's my 50 cents on what would be ideal for me: modular software, easy to assemble, no code. And if a module is not available, I'd be happy to pay a (reasonable) amount to get it done. All of this open source.


Why does open source matter to you? Is it to preserve the option to pay someone to make changes to it?


Versioning dependencies and managing dependencies are surely among the hardest problems in software engineering; at least, those are the two I find the most aggravating. It seems like almost nobody can get them right, even though component-based software engineering, SOA, etc. are, I think, generally extremely good ideas. The execution is pretty crummy pretty much everywhere.

With all that said, my sense is that hardware engineering has its own heap of Sisyphean problems and complexities. I definitely would not go back to working on hardware engineering problems like I did super early in my career (a mix of embedded firmware, device drivers, PCB design, and web development). I shudder at the thought of ever working with anything Verilog/VHDL, Xilinx, or SPICE ever again, or debugging PCB designs on the bench top in the lab with an oscilloscope and a logic probe. At least in school I ran more than a few bodge wires to patch a mistake in a PCB design iteration. Maybe in some sense, it's a blessing that those linear systems theory abstractions fall apart utterly in RF engineering problems, and one has to contend with the fact that all circuits radiate. At least circuits that still contain the magic smoke.


I believe nix is the logical solution to dependency management and thus is the future of it.


> Imagine for a moment doing AI work

In many ways it's still the same. Transformers use matrix multiplication as their main operation, and the underlying matrix multiplication libraries have mostly seen incremental performance improvements over the last two decades or so. Most other ops in e.g. core PyTorch are implemented using C++ templates and would be mostly familiar to a 2008 C++ programmer. Most of my work is largely C++/Python/Cython, as it has been for the last 1-2 decades. Sure, the machine learning models have changed, but those are relatively easy to pick up.
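
A minimal sketch of scaled dot-product attention (my own illustration, assuming NumPy; the shapes are made up) shows that the expensive operations are ordinary matrix multiplications:

    import numpy as np

    # Scaled dot-product attention: two matmuls plus a softmax.
    def attention(Q, K, V):
        d = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d)                          # matmul #1
        w = np.exp(scores - scores.max(axis=-1, keepdims=True))
        w /= w.sum(axis=-1, keepdims=True)                     # softmax
        return w @ V                                           # matmul #2

    Q = K = V = np.random.randn(128, 64)   # (sequence length, head dimension)
    out = attention(Q, K, V)               # shape (128, 64)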


But most software is not 3D movie rendering or AI. Look at the App Store. 99% of these apps could have been written with 1970s Pascal. ICQ in 1996 had 90% of the functionality of modern messengers. People just love new things.


I love the sentiments and ideas presented. Thanks for putting these thoughts into words, and sharing. Not sure why some ppl interpret your post as some kind of militant set of demands. It's very thought-provoking and helps think about software and its creation/curation/usage in a different light. The deeply-instilled "corporate vendor -> money-expending user" ecosystem of software and computing comes with many costs, sociological, psychological and so on. Further, the ability to have agency over one's computing experience is basically constantly threatened by software vendors and constantly-lobbied regulators. I think it's wise to persistently question the status quo, especially when it is largely dominated by profit-centric intentions.


This is the "just eat less" dieting advice applied to software.

I understand the desire to grow your own food and bake your own bread, as a response to the onerous, distressing complexity of the systems we live inside, but you have to accept the cost of that being more work and less connection.


The move towards "growing your own food" or "baking your own bread" in the software sense does not necessarily equate to more work and less connection. On the contrary, it can lead to a deeper understanding and mastery of technology, fostering a sense of empowerment and self-sufficiency. Developing one's software or systems can provide tailored solutions to meet specific needs, which could result in less work in the long run because the tools are perfectly designed for their intended use.

Not to mention, platforms like the one we develop make deploying free-software-based, self-hosted solutions push-button, so there really isn't any excuse for sticking with the faceless "crowdware" providers.


>> does not necessarily equate to more work and less connection. On the contrary, it can lead to a deeper understanding and mastery of technology, fostering a sense of empowerment and self-sufficiency.

Those are not alternatives, though. It most certainly does mean more work, and even something as good as achieving mastery & empowerment in this area of your life has an opportunity cost that might not be worth it for everyone.


Upon self-reflection over several decades, I think I would prefer to have not been thinking like this quite so much.

Which isn't to say I think I should have been a front-end webdev chasing after the next new shiny every 3-6 months so I felt like I had my pulse right on the bleeding edge of new technology.

But I use a Jetbrains IDE now quite a bit to write code when I used to be a vi/vim user.


> Humanity didn't get good at building houses by building the same house a million times.

The amount of quality variation that we are willing to accept in our houses is much greater than what we are willing to accept in an automobile. The quality of the automobile is BECAUSE we make millions of the same car over and over again.


But we also went through a period of trying lots of variations that ended up on the road.

I suppose your point is even more valid if we switch to planes. But I'd counter by comparing the levels of regulation between planes and apps. Houses vs apps feels like a fairer comparison on that score.


We still have variation in vehicles and some vehicles are known for being of better quality than others. Dollar for dollar the ones that are produced in the millions are more reliable than the ones that are produced in the thousands.

With software, would you expect better quality from an operating system that was one of 50 options, each with 100 million users, or one of 5,000,000 options, each with 5,000 users? If an operating system is an odd corner case, then substitute "Library to Handle Timezones" or something else.


I think the crux of our disagreement is: is popular software mature enough and regulated enough yet to be worth distributing to 100 million users? I think the answer is overwhelmingly "no". In Carlota Perez's terms, I think we've moved too quickly to the installation phase.


> is popular software mature enough and regulated enough yet to be worth distributing to 100 million users? I think the answer is overwhelmingly "no".

Well it depends on the alternative. If you had to choose between Linux and TempleOS you'd probably get fewer bugs and the ability to do more actual work with Linux. If you had to choose between Windows and TempleOS, you'd probably be able to get more work done with Windows. If you had to choose between MacOS and TempleOS, MacOS would probably let you do more of whatever you were trying to accomplish than TempleOS. Yet TempleOS is only used by a small number of people and only has the resources that you'd expect from something that very few people use.

Now if you are saying that the alternative is to just not use software until it is "mature", you can definitely go back to writing your drafts with a pencil and then typing them on a typewriter, but you probably don't actually believe that, since you seem to find greater value in using the Internet and software for communication instead of the postal service.


> Now if you are saying that the alternative is to just not use software until it is "mature"..

I'm definitely not saying that, right? Hopefully OP conveys what I'm saying.

> If you had to choose between Windows and TempleOS, you'd probably be able to get more work done with Windows.

If you bought a stock and the price climbed 20% a year for 19 years before falling 100% in the 20th, do you care that the stock was doing very well for a long time? If you can do more with Windows but get hit by a virus that steals all your bitcoin, TempleOS probably starts looking quite good at that moment. So for me it's not a quantitative argument of "which side lets me do more work." There are qualitative, almost spiritual questions of "what is this work _for_?" and "what sort of lifestyle can I live with?"
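
A back-of-the-envelope version of that stock analogy, with made-up numbers:

    value = 1.0
    for year in range(19):
        value *= 1.20          # +20% per year for 19 years
    print(round(value, 1))     # ~31.9x the original investment, on paper
    value *= 0.0               # -100% in year 20
    print(value)               # 0.0 -- the 19 good years don't matter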


> The quality of the automobile is BECAUSE we make millions of the same car over and over again.

I guess that explains why things like modern refrigerators are so much more reliable and durable than the ones that were made 20 years ago. /s

By that reckoning, modern cars should be just about perfect, modulo certain modern expectations like not emitting greenhouse gases?


What I've found fascinating about the refrigerators I've had recently is that the refrigeration process is rarely the part that breaks. However, the plastic shelves decompose right about the time the warranty runs out. My guess is that this is by design, to sell more refrigerators or $200 plastic shelves.

The reliability of a modern mass-produced automobile is incredible. If a software company could produce software with as few defects as a modern automobile, they would rule the world.


> Prefer software with thousands rather than millions of users

Yep, guess which software shows up as a toy CTF challenge for the weekend? Just because you can understand how something works doesn’t mean it’s secure.


The concrete thing I'm doing and generalizing from boils down to:

* Starting from Lua which seems to have a decent security story;

* Changing a few lines of _safe_ Lua for yourself without introducing new buffer overflows and so on;

* and limiting the reach of those changes to a few thousand people _at most_. (99% of forks won't have even that, thanks to the tyranny of the power law.)

Your comment is very much something I think about. I don't think it's as cut and dried as you make it sound. It seems worth exploring. It seems analogous to doing controlled burns every year to avoid humongous wildfires.


Excellent advice. I will now look only for things that I have no chance of understanding. Proprietary systems are probably the best bet, since we are, by definition, not allowed to understand them. Only Windows for an OS and Pulse Secure for my internet connection - there couldn't possibly be any major security flaws with that logic, right?



With the mindset described, I'd say most of the software that the world runs on nowadays shouldn't be used. Also:

> My first resolution is just to bring less software into my life. It is still early days, we don't really understand computers yet.

While philosophically kind of true, this approach - especially for things to come - will quickly get you lost in the modern "software-eaten" world, IMHO.



Also posted at these services:

https://spectra.video/w/a1dvx7y6gy1k9pnqDjQBoD (PeerTube)

https://archive.org/details/freewheeling (Internet Archive)


[flagged]


The only thing silly here is your comment, which essentially gives the least charitable interpretation of TFA possible. You're reacting to a position the author doesn't hold by interpreting the writing as if it's far more demanding and extreme than it is.


I'm going by their checklist. None of these things are measures of quality; they are at best non sequiturs and at worst often negative. The degree of demand or extremity doesn't matter; a mild dose of "use software only thousands of other people use" isn't good advice either.

I allow that there may be something philosophically fulfilling here, or an attempt to correct for certain unavoidable ills of complexity (if a poor attempt), but again, that's an incredibly, incredibly narrow thing to talk about, presented as ubiquitously applicable.


"Prefer X" does not mean "Never ever use anything else than X".


It's easy to crap on something, but I'm pretty happy with this system.

I use:

Trisquel GNU/Linux

IntelliJ 2019.3

Perl, PGP, and text files

About 20 different web browsers

My own web-based information manager

And yes, I sometimes use Word 97 for writing.

And I'm pretty happy with it.


I collect old computing equipment and I have a number of machines that run old software. The biggest problem with old operating systems (mostly Windows, but others too) is that the modern web is unusable without modern TLS, and backporting that to old browsers is pretty much impossible...

How many times have I thought "I just wanted to download one file" on a Windows 98 machine. The only "solution" is to maintain a proxy running a modern OS that renders the modern web as image maps.

As for Word 97, I remember when it was new. I also remember when Microsoft went to the "new" ribbon interface (that came later, with Office 2007). I hated it, my users hated it... Fast forward a decade, and the last time I tried to use a pre-ribbon Office product it was extremely inconvenient. I don't know if that is due to high-DPI screens or something else. This, and the lack of good dictionaries, is the main reason why I continue to pay for MS Office instead of using LibreOffice (or one of the ancient office products I own), for example.


Half the browsers I use are still compatible with HN despite being 10+ years old. The other half I mostly use on my own websites, which allow plain HTTP.


Every single thing on your list (except the info manager and maybe the more obscure web browsers) has tens of thousands to hundreds of millions of users.

Except for the web browsers, none of the things you listed are the result of or otherwise benefited from meaningful forks.

You will never understand, much less be able to make meaningful edits to in an afternoon, the entire Linux kernel. Likely much the same for Perl and PGP (150K lines of C and a highly specialized domain, respectively).

You don't have the tools to edit IntelliJ or Word 97 (was that what that bullet point was about?)

Lord help you if you're updating any of those things. I better not see anything higher than 2.X in your uname -a.

You overwhelmingly don't follow this post's advice


It is probably best to say the author's advice is not practical and, for most people, not desirable. It is also worth adding that the solution is problematic, in the sense that being able to understand software from top to bottom is a skill set few developers have, even for the most trivial of applications. Most developers build their software on the shoulders of others.

That said, the author's point about the trustworthiness of software is valid. In most cases software is more complex than individual users need it to be (and it is like that to serve a wider audience since software development is expensive). The complexity makes the software more vulnerable while making it easy to hide malicious intent.


I agree with your first paragraph entirely and it's worded more generously than mine.

I think there's something valid in your second, but on the whole simplicity is often a myth. Hardware is complex, the world is complex, the software stack simply reflects these complexities, it does not invent them.

It feels great to have simple code until your environment no longer mirrors that code's assumptions. You have an MBR bootloader that assumes the A20 line is on the keyboard controller; upgrade hardware, and suddenly you can't boot anymore. Bet you wish you had a more complex bootloader now. You have a graphics renderer that knows no one has more than 4GB of VRAM, so it measures allocatable memory by allocating up to 4GB of textures until it gets E_OUT_OF_MEM. Except on modern cards it overflows its texture counter. Bet you wish you had a more complex renderer.
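
A hypothetical sketch of that second failure mode (assuming the renderer tracks allocated bytes in what is effectively a 32-bit counter):

    CHUNK = 64 * 1024 * 1024            # probe by allocating 64MB textures
    UINT32_MASK = 0xFFFFFFFF

    def probe_vram(card_vram_bytes):
        allocated = 0                   # imagine a uint32_t in the original C
        while allocated + CHUNK <= card_vram_bytes:   # "until E_OUT_OF_MEM"
            # On a card with more than 4GB the counter silently wraps around,
            # so on such hardware this loop never terminates.
            allocated = (allocated + CHUNK) & UINT32_MASK
        return allocated

    print(probe_vram(2 * 1024**3))      # 2GB card: terminates, reports 2GB
    # probe_vram(8 * 1024**3)           # 8GB card: infinite loop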

And this is ignoring inherent complexity, or the nature of handling errors even without changing hardware, but I've ranted enough on this. There is a narrowly valid point here but it is presented without caveat and thus far too broadly.


"You don't have to do all this. Their benefits are additive, but acting on even one of these suggestions is better than nothing."


forgotmypw17 doesn't appear to follow any of these points except incidentally (most modern web browsers are forks of one another for complexity reasons; their preferred document editor is an EOL closed-source product and therefore doesn't get updates).


If you look closely, each of them has a place in my regimen.


I won't use proprietary W97 over Trisquel.

Once you learn Groff+Mom+simple files cat'ted after .FOOBAR tags, you won't go back.


I think you missed the point.


I read the whole thing. They have a section called "the punchline" that restates those bullet points. If the point isn't "the punchline" (and the text certainly supports that it is indeed the point), I'm not sure what is.

What do you think the point is?


Everything is monolithic by accident. Source code used to be shared and compiled directly with little effort. Now we have created tools to create entire systems dedicated to running source code. We have literally codified a poorly designed, middle-manager-riddled organization into our current systems.


> Source code used to be able to be shared and compiled directly with little effort.

At what point was your "little effort" claim true? Compiling random OSS projects has never been particularly easy. You've always needed to hunt down all the compile-time dependencies and get those compiled/installed.


There was a time when source code was shared in magazines and someone had to manually write the source by copying from the magazine.


The point is: systems that are used by smaller numbers of people can have some intrinsic benefits that you may not be aware of.


While the premise of the post is interesting, I stopped reading when I reached "a computer from 2015 is 2-5 times slower than an Apple 2e from 1986 just at reading a keystroke and displaying it on screen" and only scanned the rest.

If oversimplifications are being used to prove a point, the argument becomes weak.

I actually read it some months ago, because I'm interested in the smol net, and I use and like the Gemini protocol quite a lot.

Unfortunately this post rants against perceived software obesity quite unreflectively.


Pity you stopped reading there because that point has some pretty solid support:

https://danluu.com/input-lag/


Danluu's study seems to conflict with

https://pavelfatin.com/typing-with-pleasure/

which shows Gvim on Windows with a maximum latency of 1.2ms - much faster than 30ms on the vintage Apple //e or TI 99/4. Even Eclipse, which usually feels slow to me, came in at a max of 20.8ms. (1.2ms seems very fast to me as I would expect ~2ms of average frame latency even on a 240Hz monitor?)

Which study is correct? Are both of them measuring touch-to-display latency?

It's also worth noting that those vintage 8-bit microcomputers typically used CRT TV monitors at 30Hz, while modern monitors often run at 60Hz or 120Hz or more (possibly with asynchronous refresh), so they will have lower frame latency.

While I haven't tested it myself, I believe newer iPad/Apple Pencil 2 combinations have improved their input latency (they claim 9ms but I'm not sure if that's actually end-to-end touch-to-display latency in something like Notes or Procreate.)


Dan Luu's post, which I also read months ago, is very detailed and has a lot of data, but fails to make sense or come to a helpful conclusion in the end. Like so many of his posts.

By "helpful conclusion" I don't mean just stating the facts or comparing transistor counts or input latency with network latency. As if developers stopped caring and created crappy software on purpose.

The post compares an Apple 2e, a single-tasking system that just displays the pressed key in the BASIC interpreter on the screen, with modern devices, where it is not always clear what kind of app or setup is being used. But we know that there are plenty of layers of GUI and OS code that most people don't want to miss. Not to mention that most of the higher input lag is not detectable by humans under normal working conditions.

Yes, there were years where CPU performance couldn't keep up with added features, like immediate spell checking. I used computers through all those years and know this first-hand.

And I never dismissed the importance of input lag. I pointed out the oversimplification to support the main argument of the linked post, which suffers from this as a result.


> The post compares an Apple 2e, a single-tasking system that just displays the pressed key in the BASIC interpreter on the screen, with modern devices, where it is not always clear what kind of app or setup is being used. But we know that there are plenty of layers of GUI and OS code that most people don't want to miss. Not to mention that most of the higher input lag is not detectable by humans under normal working conditions.

You can do an experiment for yourself along these lines. A few years ago, I pulled out an old laptop that had Gnome 2.x installed as part of an Ubuntu release from circa 2009. In every respect, the hardware from this machine (which itself had been a budget laptop when it was purchased in 2006) was worse than the hardware I was using in my daily life. I was struck (almost startled), however, by the difference in how immediately it responded to my input compared to my daily driver at the time—to the point that it distracted me from the original reason I booted it up in the first place.

A system capable of running Gnome 2.x is no pre-Mac, pre-multitasking system. In fact, it has all the capabilities/affordances that you'd expect and need from systems today. So your characterization is less than kosher (basically a motte-and-bailey).


> But we know that there are plenty of layers of GUI and OS code that most people don't want to miss.

You're missing the main point: you're just assuming that we must trade those features for improved latency, but this is just not true. The properties he measured are a result of particular architectural and abstraction choices, but other choices could be made that don't necessarily require such sacrifices. For instance, the exokernel.


I'm not assuming that we must trade those features for improved latency. It is a possibility, but there are always alternatives.

Unfortunately in IT you always trade one set of problems for another. And clean architectures have to be watered down with time to stay practical.

Nobody is smart enough to predict all pros and cons accurately. We're always smarter after the fact. When we have finished a transition and gained some experience with the new technology. But then it's mostly too late to go back.

On top of that, computing is always a moving target. Now you have to target highly mobile devices with small batteries traveling at high speed in metal tubes connecting to unreliable networks. While more or less related to keyboard input lag, depending on where the action should be registered, you have to be careful where you spend your development resources.

That's why I think this is an oversimplification. True in its deepest form, but neglecting reality.


> Now you have to target highly mobile devices with small batteries traveling at high speed in metal tubes connecting to unreliable networks.

In another comment[1], I called your earlier characterization a motte-and-bailey[2]. I'd wager it was probably not deliberate in that case. I have to think, though, that you're at least aware of how intellectually dishonest this move is.

Compare like for like. The existence of my phone, running a completely different system and set of applications, has _no_ bearing on how responsive to input an entirely separate machine in a traditional laptop/desktop form factor is, nor does it explain why the state of things should have degraded over the last ~10–20 years.

1. <https://news.ycombinator.com/item?id=36115622>

2. <https://en.wikipedia.org/wiki/Motte-and-bailey_argument>


I have recently seen ridiculously bad and slow smartphone apps, which seem to be so because they are programmed to phone home for every single action even though there is no need to do so.


"I pointed out the oversimplification to support the main argument of the linked post, which suffers from this as a result."

I don't understand this at all (along with several other sentences in your comments).

(I'm the author of OP.)



