It doesn't make much sense to people who have always heard and believed that "an interface just lists the public methods and properties of a class". In that view, classes always come first and interfaces are unnecessary boilerplate.
However, if you instead assume that "a class is just an implementation of an interface", then everything changes. Now the interfaces come first and classes exist to materialize them. In that paradigm, objects only communicate through interfaces, never (or almost never) through other classes unless a specific implementation is required. That is why an interface is always defined for a class even if there is no plan for other implementations of it.
Rather than a "contract", think of an interface as the description of a job. If you ever need to write a safety procedure, does it make more sense to write "In case of fire, call John; John happens to know how to extinguish a fire." or "In case of fire, call a fireman."? "Fireman" is an interface. "John" is a class implementing that job. It might be the only class in the codebase that implements it, but that is no concern of the caller. They just want that fire dealt with.
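To make the analogy concrete, here is a minimal sketch in TypeScript (all names are illustrative, not taken from any real codebase):

```typescript
// "Fireman" is the job description. Callers depend only on this.
interface Fire {
  putOut(): void;
}

interface Fireman {
  extinguish(fire: Fire): void;
}

// "John" is one person who happens to do that job. He might be the
// only implementation in the codebase; callers neither know nor care.
class John implements Fireman {
  extinguish(fire: Fire): void {
    fire.putOut(); // John's particular way of dealing with it
  }
}

// The safety procedure is written against the job, not the person.
function onFireAlarm(fireman: Fireman, fire: Fire): void {
  fireman.extinguish(fire);
}
```

Note that onFireAlarm compiles against Fireman alone, so swapping John out for a sprinkler system later requires no change to any caller.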
I discovered software programming about 20 years ago during high school by fooling around with my Casio programmable calculator. I spent countless hours writing small utility programs and games in BASIC, then I bought a more powerful calculator (still Casio though) which allowed me to learn and write programs in C, ASM (286) and C++.
Around 2005 I got myself a Casio ClassPad 300, which was like a strange combo of a calculator and a PDA. It had a big B&W touchscreen with a stylus. It was the first Casio handheld that came with an official SDK for building native apps for the device; before that it was only possible through pure reverse engineering and hacking -- good times. :) So there were two options for writing programs for this device: you could either code in BASIC directly on screen, which was easy but very limited and suffered from terrible performance -- or you could use a computer to write and compile a C++ add-on and then transfer it to the device, which was extremely powerful but of course a lot more complicated, especially for beginners.
So here is what I did: I took the C code of the Lua 5.1 interpreter, embedded it into a new C++ add-on, and made it possible to write and execute Lua code directly on the device through this add-on. I even made a custom font for the text editor and a new virtual keyboard for the stylus to maximize the amount of code visible on screen. I wrote my own memory block allocator for the interpreter to provide access to additional RAM space, among many other hacks. More importantly, I made most of the features of the SDK accessible from Lua scripts, which meant it was then possible to write or read persistent files, draw any kind of figure on screen, create user interfaces using native UI components, use advanced math features, send data packets through USB, etc.
The project quickly attracted a lot of interest within our small community: now you had a third option for coding, one offering excellent performance and lots of advanced features while being very easy to start with. It was the first time I built a tool incrementally based on user feedback instead of building it just for me. Even the maintainers of the SDK and official apps followed the project closely. Eventually I moved on and never got the chance to implement everything I had in mind. I learned a lot, had lots of fun, and the experience was key to identifying my ideal career path.
Let's say my startup makes a product featuring some sort of kanban-like board. The user moves a card from one column to another. According to this article, the frontend might just need to perform an UPDATE on the right table in the database and that would be it.
Yet, this is what my backend does:
* make sure the user is allowed to move that particular card.
* update the database.
* add an entry to the board's activity log.
* notify all connected clients listening to events from this board that a change occurred, so they can refresh the UI instantly.
* if some users asked to be notified about changes, generate notifications.
* if some of these users are offline but still want to be alerted, generate and queue some emails to send.
* add an entry in a raw text log for easy debugging.
* register the event in some kind of analytics storage for future stats.
* if the board is integrated with e.g. Slack, call Slack's API.
* if some users registered webhooks through my API, trigger those.
I'm so glad my frontend engineers actually do not have to worry about how to do any of that.
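To make that concrete, here is a rough sketch of what such an endpoint might look like in TypeScript; everything behind the Deps interface is hypothetical, standing in for whatever database, realtime, queue, and integration infrastructure the backend actually uses:

```typescript
// Hypothetical dependencies; none of these are a real framework API.
interface Deps {
  authorize(userId: string, cardId: string): Promise<{ boardId: string }>;
  updateCard(cardId: string, columnId: string): Promise<void>;
  appendActivity(boardId: string, event: object): Promise<void>;
  broadcast(boardId: string, event: object): void;
  subscribers(boardId: string): Promise<{ userId: string; online: boolean }[]>;
  pushNotification(userId: string, event: object): void;
  enqueueEmail(userId: string, event: object): void;
  logDebug(message: string, data: object): void;
  trackEvent(name: string, data: object): void;
  slackWebhookUrl(boardId: string): Promise<string | null>;
  userWebhooks(boardId: string): Promise<string[]>;
  callWebhook(url: string, event: object): Promise<void>;
}

async function moveCard(
  deps: Deps,
  userId: string,
  cardId: string,
  toColumnId: string
): Promise<void> {
  // 1. Make sure the user is allowed to move this particular card.
  const { boardId } = await deps.authorize(userId, cardId);

  // 2. Update the database.
  await deps.updateCard(cardId, toColumnId);

  const event = { type: "card.moved", cardId, toColumnId, userId };

  // 3. Add an entry to the board's activity log.
  await deps.appendActivity(boardId, event);

  // 4. Notify connected clients so they can refresh the UI instantly.
  deps.broadcast(boardId, event);

  // 5./6. Notify subscribed users; queue emails for the offline ones.
  for (const sub of await deps.subscribers(boardId)) {
    if (sub.online) deps.pushNotification(sub.userId, event);
    else deps.enqueueEmail(sub.userId, event);
  }

  // 7. Raw text log for easy debugging.
  deps.logDebug("card moved", event);

  // 8. Analytics storage for future stats.
  deps.trackEvent("card.moved", event);

  // 9. Third-party integrations, e.g. Slack.
  const slackUrl = await deps.slackWebhookUrl(boardId);
  if (slackUrl) await deps.callWebhook(slackUrl, event);

  // 10. User-registered webhooks.
  for (const url of await deps.userWebhooks(boardId)) {
    await deps.callWebhook(url, event);
  }
}
```

Even this sketch glosses over error handling and retries, and none of it belongs in a frontend.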
The article's subtitle literally says "The best backend is no backend at all". The article gives the impression that you have no idea what a backend is supposed to do beyond exposing some data.
After reading the article I'm not entirely sure all these drivers did participate in the race using their own setups while staying home. Especially after this part: "[Pike] and Majors tried to fill up a field that reflected this big tent, letting in fellow NASCAR crew members, as well as some public relations and social media specialists."
Can someone please tell me they did not turn a strong recommendation to avoid public events (which led to the competition being cancelled) into an opportunity to gather dozens of people in one place?
Metaphorically, "field" means the group of people competing, and "big tent" means an inclusive group. So there's a natural interpretation that doesn't mean they all gathered in one place.
FYI, my wife and I signed for a mortgage 4 months ago here in France for our new home: 325k€, 25-year, 1.66% interest rate, 0.8% insurance rate for complete coverage of both of us. Both fixed rates.
The rates were actually going up at that time - if we could have borrowed 6 months earlier we would likely have had even better rates. Also, if you can borrow for a shorter term, the rates get much lower.
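For anyone curious how those numbers translate into a monthly payment, here is a quick back-of-the-envelope sketch using the standard annuity formula (how French lenders actually charge insurance varies by contract, so the split below is an assumption):

```typescript
// Back-of-the-envelope mortgage math; assumes the 1.66% is a nominal
// annual rate compounded monthly, and that the 0.8% insurance is
// charged yearly on the initial capital (contracts vary).
const principal = 325_000; // €
const annualRate = 0.0166;
const insuranceRate = 0.008;
const months = 25 * 12;

const r = annualRate / 12;
const loanPayment = (principal * r) / (1 - (1 + r) ** -months);
const insurancePayment = (principal * insuranceRate) / 12;

console.log(loanPayment.toFixed(2));      // ≈ 1324 €/month, principal + interest
console.log(insurancePayment.toFixed(2)); // ≈ 217 €/month, insurance
```

So roughly €1,540/month all-in, under the assumptions above.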
Interestingly, the French market (and more generally the European market?) seems to have been spared that trend, at least for now. Prices have remained steady over the last year and there is still stock available for a wide choice of card models. See for example https://www.materiel.net/achat/gtx-1080/catNom-cartes+graphi...
Perhaps electricity costs there are higher than in the U.S. so mining is less profitable?
I don't have these issues in Ohio, either. You can find any NVidia card you want easily, and only AMD Vega is in short supply at the moment. Still doable though, if you're willing to put in a bit of effort.
Cards at French retailers are 20 to 40% more expensive than at American retailers.
They are constantly sold out as well. In my experience, only the overclocked/premium editions are left, and those are priced well above MSRP.
Thanks Bob! :-)