There's really nothing to debate; the guy I replied to was totally correct about everything except the bit about "AC is much better in the home". There, I pointed out that what he really meant was that a high voltage roughly where our current AC systems are (120V-240V) is much better in the home than some kind of low-voltage DC system, and that with modern technology it would probably actually be better to have a DC system. But realistically, that's not going to happen: the gains (probably very minimal) aren't worth the enormous cost of conversion, given how standardized our current AC system is and how all our infrastructure, point-of-use devices, etc. are designed around it.
Basically, he was assuming practical real-world considerations; I'm going off on a tangent about ideal conditions. His argument is about whether it's better to stick with the AC system your house already has, or to install a low-voltage DC system that supplies 5V, 12V, etc. to all your devices from a single, central, whole-house power supply, as people who don't understand electricity frequently suggest. He's completely correct: low-voltage DC is a terrible way to supply power over any distance beyond a meter or two because of resistive losses, so it would require massive copper cables or busbars. And power supplies are generally very inefficient when operated at low load. So our current approach (separate little optimized power supplies for every device, plugged into a higher-voltage AC supply) is actually optimal.
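To put rough numbers on the resistive-loss point, here's a back-of-the-envelope Python sketch. The wire gauge, run length, and load are my own illustrative assumptions, not anything from the thread:

    # Resistive loss in a wire is P_loss = I^2 * R, and for a fixed load power P
    # the current is I = P / V, so loss scales as 1/V^2: halve the voltage,
    # quadruple the loss.

    def wire_loss(load_watts, volts, wire_ohms):
        """Watts wasted in the wire for a given load, supply voltage, and wire resistance."""
        current = load_watts / volts       # I = P / V
        return current ** 2 * wire_ohms    # P_loss = I^2 * R

    # Assumed: a 10 m run of 14 AWG copper (~0.0083 ohm/m, so ~0.166 ohm for the
    # 20 m round trip) feeding a 600 W load.
    R = 0.166
    for v in (240, 120, 12, 5):
        loss = wire_loss(600, v, R)
        print(f"{v:>4} V: {loss:7.1f} W lost ({100 * loss / 600:.1f}% of the load)")

At 240V the wire wastes about a watt; at 12V the naive figure is hundreds of watts, more than the wire can physically deliver, which is exactly why you'd need those huge cables or busbars.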
I was never arguing that an individual should replace the AC in their house. My argument was that, with current technology, the AC setup can be seen as tech debt.
Which seems compatible with what you are saying, but the parent was specifically claiming I was wrong.
That is, you seem to be echoing my point, but also claiming it is different. What am I missing?
I wouldn't call it "tech debt". Present-day AC systems may not be completely optimal (given current electronics technology), but they do work well.
As I understand it, "tech debt" is something that has to be reckoned with at some point, or else you're going to have real problems in the future (just as refusing to pay off a money debt will generally cause you real problems at some point, when the creditor sues you and gets a judgment). You can't let it go on forever; eventually you need to "pay it down" (by cleaning up the codebase, migrating to newer technologies, etc.), or catastrophe happens (the company is unable to compete and goes under). One common factor cited in these stories is that the code becomes too unmaintainable and unreliable: too many weird changes for customers pile up and introduce serious bugs that keep the product from working properly.
This isn't like that at all. We can go on with our current household AC power systems indefinitely. Maybe we could get a 1% improvement by switching to DC (at an enormous cost, because most of your appliances and devices won't work with it without adapters); I don't know exactly how much better DC would be (not much, really), but what we have now works fine. Furthermore, it's not as though the whole electric grid would need to change: it's entirely possible, for instance, to switch the distribution system to DC and leave household systems AC. Instead of distributing power at 30-something kVAC in your neighborhood and using outdoor transformers to step it down to 240VAC for your house, it could be distributed as DC, with those transformers replaced by modules that convert the 30-something kVDC to 240VAC. In the old days this was hard and expensive to do, but with modern power electronics it's not. Even here, though, the question is: are the gains worth the expense? And the answer is very likely "no". (For reference, I'm not a power engineer; I just studied this in college as a small part of my EE curriculum.)
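To illustrate the kind of gain DC distribution might buy: one commonly cited advantage (the main one I know of, so take this as an assumption, not the whole story) is that insulation is sized for peak voltage, and a DC line can run continuously at the AC peak, i.e. sqrt(2) times the RMS voltage, roughly halving I^2*R loss for the same power. A quick sketch with made-up feeder numbers:

    import math

    # Assumed, illustrative numbers: a 1 MW feeder over a line with 5 ohms of
    # resistance. 34.5 kV is a common medium-voltage distribution class.
    def line_loss(power_w, volts, line_ohms):
        return (power_w / volts) ** 2 * line_ohms   # I^2 * R with I = P / V

    P, R = 1_000_000, 5.0
    v_ac = 34_500                  # AC RMS voltage
    v_dc = v_ac * math.sqrt(2)     # DC run at the same peak/insulation level

    print(f"AC at {v_ac:,} V rms: {line_loss(P, v_ac, R):,.0f} W lost")
    print(f"DC at {v_dc:,.0f} V:  {line_loss(P, v_dc, R):,.0f} W lost")

Roughly half the line loss, but the loss was already well under 1% of the delivered power here, which is the "gains probably aren't worth the expense" point in concrete form.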
So this does not, to me, resemble "tech debt" at all. It's just a system that we use for legacy reasons, which is extremely reliable and works well, even though it might not be the absolute most efficient way to solve the problem. This is no different from many other engineered systems. Perhaps you have a decent and extremely reliable car. Could it be better? Sure: you could build the chassis out of carbon fiber, use forged aluminum wheels instead of cast, etc., all to save weight and improve fuel economy. Are you going to do that? Of course not, because the cost is astronomical. There are cars like that now, and they cost $1M+.
So for the AC systems we're talking about, the question is: what's wrong with them that would make us consider replacing them with something else, instead of just sticking with them even if they're not quite as efficient as they could be? The cost to upgrade them would be enormous, so you need a very good reason.
Most instances of tech debt are things you don't have to deal with. Usually it's a term pulled out for things people don't like, or for generally deprecated methods that have better replacements but still work.
It is this second sense that I was latching onto. It (tech debt) will drive decisions today, but it is not clearly bad. It's just a constraint on current decisions that was made in the past, often for decent or really good reasons.
Bit rot is another term, for things that start to decline in how well they work. That is generally different, though: usually it's a byproduct of replacing implementations without preserving functionality, such that people relying on the old behavior are left out in the cold. (I can see how tech debt can easily turn into bit rot, but it is not required.)
Consider LaTeX: being an old code base, it is often called tech-debt-filled. People want to modernize it, not because it doesn't work, but because they think there are better ways now. And they don't consider all of the documents made with it as infrastructure.
Now, I concede that all of this is just me wanting the terms to have unique and actionable meanings. Elsewhere I was told "tech debt" is a catch-all term now. That seems to rob it of its usefulness.
Edit: I forgot to address the monetary aspect of the analogy. I like that, to an extent. But most debt is taken on under very specific financial terms, unlike colloquially termed debts between friends. That is, there is no notion of interest that works in this metaphor, nor a party you are borrowing from.
>Most instances of tech debt are things you don't have to deal with. Usually it's a term pulled out for things people don't like, or for generally deprecated methods that have better replacements but still work.
I'm not so sure about this. To me, "debt" is something that has to be paid eventually. Otherwise, why use the term "debt" at all?
So if something works fine, why waste your time and energy replacing it with something newer?
Usually, the reason for this is the assumption that sticking with something deprecated will eventually bite you in the ass: something you're depending on won't be supported, will have security holes that won't get fixed, etc., and you're going to wish you had fixed it earlier. So this is a valid use of the term "tech debt" IMO.
But if something is just something someone doesn't like, that isn't "tech debt" at all. I don't like .NET, but it's invalid for me to call all software written in .NET "tech debt". I don't like Apple's ecosystem, but it would be pretty ridiculous for me to call all iOS software and apps "tech debt" when many millions of people use and enjoy that software every day.
So, for your LaTeX example, I don't consider that tech debt at all; instead, it's just like iOS and .NET software to me. If someone doesn't like it, that's their problem; the fact that it isn't brand new isn't a problem for me and all the people who still happily use it.
So personally, I think anyone using the term "tech debt" to just refer to things they don't like is using it incorrectly and in a totally invalid way.
I find this a compelling view. But, I urge you, just google technical debt. You will see the definition: "Technical debt is a concept in programming that reflects the extra development work that arises when code that is easy to implement in the short run is used instead of applying the best overall solution."
So, in this case, AC/DC fits if we agree there is a chance the "best overall" solution is DC. (Which, I fully grant, is not a given.) There is also a bit of playing loose with "short run."
Then, skip back to the top of this thread, where you will find: "products that are written badly by inadequate teams" and "case of unpleasantness" and "A product is replaced (or intended to be replaced) by a new product that does more or less the same thing, only this time with a smart new team, in a hip new language..."
All of this is from the first, most highly voted post. The next post is a highlight of poorly engineered solutions.
My point? Find a case study that has the usage you are referring to here.
Now, the term certainly has rhetorical appeal. But I have never seen it used in a way that fits the metaphor; it's just used to pull the emotional strings of "you must pay back your debt!", while usually claiming that the design, or the lack of some technology, is the debt.
I think we're going off on a tangent here, but even with that definition from Wikipedia, there's no such thing as "the best overall solution". Everyone is going to disagree about that; the best you'll get is a consensus. For instance, back to LaTeX: there are countless academics out there who use TeX/LaTeX/whateverTeX for writing academic papers, getting beautiful results by typing in some simple formatting codes instead of messing around with a WYSIWYG editor like MS Word. That's what *TeX was designed for, and it has worked well for ages. But I'm sure you'll find a few people who say this is bad because it's "old", and that academics should switch to the latest MS Word for everything and rewrite all their papers in it. If you look really hard, you might even find someone who thinks both are bad, and that all academics should rewrite everything in WordStar.
"The best overall solution" is up for debate. It's the same with programming languages; one team will say that C is the best overall solution for a certain problem, another team will say it's Python, another team will say it's one of the .NET languages. I'm sure you can find plenty of engineers who will claim that mission-critical real-time avionics systems or automotive ABS controllers should be redesigned to use x86 CPUs and run Windows and have the code written in C# instead of using C/C++ and running on a small RTOS on an embedded microcontroller.
The implication I see in your Wikipedia definition is that implementing something easy in the short run, instead of something that really is the best overall solution, will eventually lead to more work to fix the shortcomings of the quick-n-easy solution. So, like I said before, a "debt", because it has to be paid back eventually (with work). The problem I see is that not everyone agrees on what the best overall solution is. And unlike a money debt, which is easily seen by looking at a dollar figure, the only way to really know how much "tech debt" you have is through experience: accumulating it, then finding out over time how much work you have to expend when your quick-n-easy solutions start having real, demonstrable problems. If your solution has no actual, demonstrable problem (e.g., you use LaTeX and it continues working great year after year for your use-case), then I don't consider it "tech debt" at all, even if some people don't like it.