I would warn against conflating the concept of an interface with an abstraction just as much as I would against conflating generalizations and abstractions.
An interface often accompanies an abstraction, sometimes even represents one. But if we get down to definitions, an interface is merely something that you interact with: a function, a class, a network endpoint, etc. If you write in machine code, you might not think that you are working with any kind of interface, and it's certainly not a high-level one, but you are interfacing with the machine's native instruction set. You could then argue that the instruction set is an abstraction that hides what the machine is capable of. But now we're splitting hairs and about to ask silly philosophical questions like whether a car's steering wheel qualifies as an abstraction that exists to hide the axles. I would argue not. The interface's primary responsibility is to provide a mechanism of interaction, rather than to reduce complexity (though a good interface is simple and intuitive).
Both you and the author of the article posit a similar definition of 'abstraction'. From the article:
> An abstraction is only as good as its ability to hide the complexity of what lies underneath.
And from your comment:
> Abstraction means removing details from the thing you present.
I would actually argue that both, while very close, are missing the mark.
An abstraction exists to reduce a concept down to its essentials.
This doesn't necessarily contradict the definitions offered, but I think there is a nuance here that, if missed, causes the offered definitions to become useless.
The nuance is in deciding what is essential or necessary. If, definitionally, you choose to dispense with acknowledging the essential... well then you get the problems that the author is writing about. You get shit abstractions because no one bothered to think in terms of what they are INCLUDING rather than what they are DISPENSING with.
Yes, obviously, we abstract in an attempt to simplify. But that simplification needs to come from a positive rather than a negative.
In other words: What is the thing for? What does it do?
"Hides complexity" is the shittiest answer that anyone could ever offer as a response when faced with a given problem. First, what is the complexity that we are trying to reduce? Secondly, why are we trying to reduce it? Thirdly, when we have achieved our primary goal of reducing the complexity, then what does the thing look like? What value does it offer over interfacing directly with the "thing" being abstracted?
Abstractions in computer science are extremely valuable. So valuable, I would offer, that any time we hear engineers decry abstractions or point to abstractions as the root of all evil we ought to ask a very pressing question: "who hurt you?"
A good engineering solution is a simple one, and an abstraction is intended to be a simplification mechanism. But it doesn't necessarily have to simplify the intuitive understanding of the problem at hand. This is a good goal if you can achieve it, don't get me wrong. But that abstraction might exist so you can swap vendors in the future if that's a business priority. Or because you've identified some other element/component in your system that will be difficult to change later. So you stick it behind an abstraction and "Program to Interfaces" rather than "implementations" so that you can simplify the process of change down the road, even if it comes at a more immediate cost of making the code a bit less intuitive today.
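To make the "Program to Interfaces" point concrete, here's a minimal sketch in Go, assuming a hypothetical email-sending abstraction (Mailer, sendgridMailer, and sesMailer are illustrative names, not any real SDK's API):

```go
package main

import "fmt"

// Mailer is the abstraction the rest of the codebase depends on.
// Callers program against this interface, never a concrete vendor SDK.
type Mailer interface {
	Send(to, subject, body string) error
}

// sendgridMailer and sesMailer are stand-ins for real vendor clients.
type sendgridMailer struct{}

func (sendgridMailer) Send(to, subject, body string) error {
	fmt.Printf("[sendgrid] to=%s subject=%q\n", to, subject)
	return nil
}

type sesMailer struct{}

func (sesMailer) Send(to, subject, body string) error {
	fmt.Printf("[ses] to=%s subject=%q\n", to, subject)
	return nil
}

// notifyUser only ever sees the Mailer abstraction, so swapping vendors
// is a one-line change at the composition root, not a codebase-wide edit.
func notifyUser(m Mailer, email string) error {
	return m.Send(email, "Welcome", "Thanks for signing up.")
}

func main() {
	var m Mailer = sendgridMailer{} // swap to sesMailer{} without touching notifyUser
	_ = notifyUser(m, "user@example.com")
}
```

Note the tradeoff described above: the indirection makes today's code slightly less direct, in exchange for making the vendor decision cheap to reverse later.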
Everything in software is tradeoffs. And a good abstraction exists so that we can focus on requirements and programming to those requirements. This is a simplification when done properly. But the focus ought to be on defining those requirements, on asking "what should simple look like?"
I agree with what you're saying, and I would phrase it as "write the abstractions that reflect the essential complexity". The whole program should minimally reflect the essential complexity of the problem. Of course, actually doing that isn't easy, but the result is, almost by definition, a simple solution for the given problem. Maintenance and refactoring then become their own challenge: as the problem's constraints change, can you minimally change the program to match?
Why are ADTs like stacks, queues, and hashmaps so popular? Why are languages like C or Forth so highly praised for their high performance and efficiency ceilings? Because they are usually "about as good as it gets" for solving a problem, "what you would've more or less done anyways". Maybe on a GPU, a language like C isn't quite the right fit, because the problem has changed. Make tools (e.g. CUDA) that reflect that distinct complexity.
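As a small illustration of "about as good as it gets": the obvious slice-backed stack is more or less what you'd write by hand anyway, so the ADT costs you nothing over doing it yourself (a minimal Go sketch, requires Go 1.18+ generics):

```go
package main

import "fmt"

// Stack exposes only the essentials of the ADT (push and pop);
// the backing slice is the implementation detail it abstracts.
type Stack[T any] struct {
	items []T
}

func (s *Stack[T]) Push(v T) { s.items = append(s.items, v) }

// Pop removes and returns the top element; the bool is false
// if the stack was empty.
func (s *Stack[T]) Pop() (T, bool) {
	var zero T
	if len(s.items) == 0 {
		return zero, false
	}
	v := s.items[len(s.items)-1]
	s.items = s.items[:len(s.items)-1]
	return v, true
}

func main() {
	var s Stack[int]
	s.Push(1)
	s.Push(2)
	if v, ok := s.Pop(); ok {
		fmt.Println(v) // prints 2
	}
}
```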