What I learned covering computer science in 2024 (mailchi.mp)
135 points by lr0 41 days ago | 12 comments



>A pithy quote attributed to the pioneering researcher Edsger Dijkstra helps me explain: “Computer science is no more about computers than astronomy is about telescopes.”

Reminds me of the MIT lecture "Lecture 1A | MIT 6.001 Structure and Interpretation, 1986" [1], and I still remind myself of that: if you fundamentally understand your problem and its solution, then writing the few characters in your favorite text editor that do the task is easy.

- [1] https://www.youtube.com/watch?v=2Op3QLzMgSY


That's a good rule of thumb, but you should keep in mind that it's impossible to fundamentally understand most of today's challenges. I'd argue this has kinda lost its meaning as we've kept adding more and more leaky abstractions over the decades, making a fundamental understanding of the whole tech stack and what each layer does impossible.

And without these abstractions, you'd struggle to even do a fraction of what today's platforms provide.

But yes, the closer you can get to understanding the challenge you're solving, the better you'll likely be able to solve it. With the caveat that this understanding doesn't necessarily let you truly innovate either. It's just a rule of thumb from a time when computers were much less versatile and ubiquitous.

Totally off topic from the context of the article though


But these abstractions help the developer focus on the actual problem instead of solving computer problems, like how to arrange and shift bits in memory.

Only a subset of problems requires maximum performance.

But I would also argue there are problems that require more than a few characters in a text editor, even when you understand them.


What are "most of today's challenges"?

When I look back over my career, I mostly see increasing levels of abstraction for basically the same set of tasks: here's a database, make a UI to interface with it — sometimes the "database" has been a custom file format or a REST API, but it's still a database.

That's not to say there are no other categories of work besides CRUD apps; of course there are. Games and document editors are mostly about coming up with interesting rules for transforming a state that's fairly easy to display or sonify, and the challenge was one of "make it faster", which often meant throwing away every abstraction above the hardware itself. But even then, before starting my degree I managed to put together a decent raster painting app with just Visual Basic, and I know that code was bad even by the standards I had at the end of the degree.

Knowing MVC and MVVM makes it easier to work with system libraries based on those architectures. But why did we ever get things like VIPER? All I had when working with that was stress: what should have taken one person a few days took a team several weeks. When I've been given free rein to choose the right solution, going with the old one has more than once allowed me, alone, to keep up with an entire team doing the same thing some other way on a different platform.

Even "just" reactive UI is an abstraction that most of us don't really need — which is why there's even a push for "vanilla" JavaScript.

So, what are today's challenges, the ones to which the current abstractions we need to learn in order to work with each other are the solution?

Or did you mean abstractions such as "I can pretend my keyboard doesn't bounce when considering keypresses" and "I don't need to care about OSI levels 1-6 and barely need to think about level 7"?
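
Just to illustrate the first kind (the bounce window and call signature here are invented for the sketch), "the keyboard doesn't bounce" means something below you is already doing roughly this:

    // What the keyboard controller spares you from: collapse rapid repeat
    // transitions of the same key within a bounce window into one keypress.
    // (The 20 ms window is an assumption for this sketch.)
    const BOUNCE_WINDOW_MS = 20;
    const lastAccepted = new Map<string, number>();

    function debouncedKeypress(key: string, timestampMs: number): boolean {
      const previous = lastAccepted.get(key);
      if (previous !== undefined && timestampMs - previous < BOUNCE_WINDOW_MS) {
        return false; // still bouncing: ignore this transition
      }
      lastAccepted.set(key, timestampMs);
      return true; // far enough from the last accepted press: treat it as real
    }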


The quote I was responding to contextualized it to being able to write the few lines necessary to solve the problem.

From this perspective, the fundamental understanding for a CRUD application is quite challenging, as you'd need to understand (see the sketch after this list):

* block storage

* the way the database handles the data (storage, access, permissions, etc.)

* the way your application interfaces with the database, likely over the wire, adding

* the entire TCP/IP stack

* the VM/runtime you're using to create your application, along with every library it uses

* also everything those libraries use, adding the whole OS to the stack

* if it's a web application, you'll also have to add the browser along with its JS ecosystem
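
To make that list concrete, here's a deliberately boring sketch (the connection string and table are hypothetical, using the node-postgres driver): the handful of lines that express "my" problem sit on top of every layer above.

    import { Pool } from "pg"; // node-postgres, but any SQL driver makes the same point

    // Hypothetical connection and table, only here to make the layering visible.
    const pool = new Pool({ connectionString: "postgres://localhost/app" });

    // The "actual problem" is three lines, yet each call crosses the driver's
    // wire protocol, the TCP/IP stack, the server's query planner and storage
    // engine, the OS page cache, and the block storage underneath it all.
    export async function createUser(name: string): Promise<number> {
      const result = await pool.query(
        "INSERT INTO users (name) VALUES ($1) RETURNING id",
        [name]
      );
      return result.rows[0].id;
    }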

You can keep going. Having a fundamental understanding of the problem to the degree that you could easily break it down into working code in an editor implies you understand all the error scenarios that every abstraction layer adds, which I find impossible.

The only project I'm aware of that even attempts something like that is SQLite's proprietary testing suite, and they're at, what, 900+ times as much test code as application code right now?

If you instead think about the tech stack of the '80s, there was a lot less to keep in mind, as the scope ended at a pretty rudimentary layer: the terminal was the main UI, and there was likely nothing between your code and any persistence you might use.

But I wasn't a programmer in the '80s; I've formed this opinion by reading about the tools that programmers used at that time. Maybe there were a lot more layers to software development back then too? I can't say one way or the other with any confidence, as I didn't live through that time period as a working adult.


On the other hand, understanding which part of what you are doing is due to a leak from the lower layer helps you isolate it in case you need to shift what you are building on. And now we have another abstraction!

It is rare that you need to understand more than a few layers down at any given time. As I type on this keyboard, I don't care what the neutrons, protons and electrons in the keys are doing. From time to time, maybe, if I get a nasty static shock, I want it to stop.


May explain why some people are great programmers but average coders . . .


It's a nice recap of recent advancements in theoretical computer science (quantum computing, adaptations to Dijkstra's algorithm, busy beavers).

Though do note there are many more branches of compsci than theoretical computer science. I would've added "theoretical" to the title, and the intro "... Ben Brubaker unpacks what computer science is really about" seems a bit much.


Did the optimal algorithm for Dijkstra's shortest path mentioned in the article get used for something immediately consequential and very time sensitive? My intuition says routing internet traffic could be a good use case, but I may be wrong.


It was part of a tech demo for a research computer, in 1956, intentionally designed to show an easily visualisable problem domain - road connections between cities. That’s over a decade before ARPANET existed.

There’s an interview here with more: https://dl.acm.org/doi/pdf/10.1145/1787234.1787249


Dijkstra's algorithm is used in the OSPF routing protocol, which is used to route traffic around large networks (think an ISP's internal network, or a large enterprise).
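
For reference, the core computation OSPF runs over its link-state database is essentially textbook Dijkstra, with routers as nodes and link costs as weights (this is a generic sketch, not actual router code):

    // Textbook Dijkstra over an adjacency list. A real implementation would
    // use a priority queue instead of the linear scan below.
    type Graph = Map<string, Array<{ to: string; cost: number }>>;

    function dijkstra(graph: Graph, source: string): Map<string, number> {
      const dist = new Map<string, number>();
      for (const node of graph.keys()) dist.set(node, Infinity);
      dist.set(source, 0);

      const unvisited = new Set(graph.keys());
      while (unvisited.size > 0) {
        // Take the unvisited node with the smallest tentative distance.
        let current: string | undefined;
        for (const node of unvisited) {
          if (current === undefined ||
              (dist.get(node) ?? Infinity) < (dist.get(current) ?? Infinity)) {
            current = node;
          }
        }
        if (current === undefined) break;
        unvisited.delete(current);

        // Relax every edge leaving the chosen node.
        for (const edge of graph.get(current) ?? []) {
          const candidate = (dist.get(current) ?? Infinity) + edge.cost;
          if (candidate < (dist.get(edge.to) ?? Infinity)) dist.set(edge.to, candidate);
        }
      }
      return dist;
    }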


The article on it [1] says that, "The new result probably won’t have such practical applications, for which there are many considerations beyond theoretical optimality guarantees."

[1] https://www.quantamagazine.org/computer-scientists-establish...



