Foundations of Software Engineering (cmu-313.github.io)
499 points by charlysl on Oct 31, 2020 | 125 comments



> You can write code. Can you build software?

After 20 years doing this job I can confirm that building software is really hard. I’m not sure how you really give people experience with that. My courses at uni did little to prepare me for building software.

Once you have a codebase in place, things are a whole lot easier. There are far fewer decisions to be made. Really you have little choice in how to do things as the current structure dictates your options. Discussions with customers are easier too, as you’re no longer talking in the abstract.

But it’s really challenging reaching that point.

Edit: having said all that, this does look like an interesting course


I think the problem is you really need to feel the pain of a few badly organized projects to understand why we need things that seem like extra time sinks. It's hard to design a course that deliberately goes about building a project the wrong way without seeming contrived, and yet it's easy to just start writing something that ends up being a big ball of spaghetti.


> I think the problem is you really need to feel the pain of a few badly organized projects

I'd go further than that. I've come across a few programmers who had _only_ built new systems, or maintained systems they had built. You could cajole them about coding standards, documentation, unit tests, clear naming until you were blue in the face. It is pointless.

So you force them to maintain someone else's code base. Be prepared for the rants about how bad the code base is, the whinging about having to learn all the unfamiliar libraries, the total lack of comments, yada yada. But somehow the connection between how that other programmer worked and how they themselves work is never made.

Then six months later, after some of the intimate knowledge of their own code that they were using to make maintaining it tractable has faded, you move them back. Then, finally, it hits home. They improve steadily thereafter.


Yeah, I was surprised there is no mention of drift and technical debt. These are the things that make software engineering hard. Someone new will always suggest a rewrite, which is tempting but is almost always a trap and a massive timesink.


You could have them build a project one requirement at a time. That's how projects usually get out of hand in the real world. But I'm not sure if a semester is long enough.


I had a class in grad school called Database Implementation. We wrote a simple SQL database from the ground up, with a SQL parser as the first assignment. I had worked in industry for a few years, so it was fun seeing that "I have painted myself into a corner" look in some people's eyes. And the "I have to rewrite everything" scramble. :-)

That was also when I learned you could deadlock AIX 3.2.5's NFS filesystem using mmap. (I was also working as one of the dept's sysadmins, so putting 2 and 2 together was easy.)


I think you can do this. You take an existing badly done code base, and ask the students to make significant changes and/or additions. After they've felt that pain for a semester or two, you discuss what causes the pain, and what choices led to that point.


> Really you have little choice in how to do things as the current structure dictates your options.

I really like this framing. The architecture of the code should reflect the flexible and rigid parts of the domain, and how they interact. A good design will let small changes in the domain become small changes in the codebase -- and large changes in the domain will at least not be arbitrarily larger in the codebase.

Software architecture feels like a very code-centric and technical thing -- and it's not like it isn't, exactly -- but it's really driven by a solid understanding of the system into which the software will be placed. Often, the pre-existing system is a bunch of humans, and people who previously interacted with humans will now interact with software. Also often, the pre-existing system is some mix of humans and software. And even if the pre-existing system is pure software, some of the internal or external boundaries will undoubtedly shift.

> But it’s really challenging reaching that point.

We as a profession need to acknowledge that not everything is "solved" by software, and that we really need people with design experience, human factors experience, and the ability to distill the domain into something that can be addressed by software engineering. I think "reaching that point" is predominantly led by these factors, and not technical excellence in the strict sense.


It's amazing they have a screenshot of FreeCol in there! That's a Java game I've submitted bugfixes to, like fixing multiplayer, in just a few hours, despite having barely any Java experience, because it's so debuggable and conventionally architected.

> But it’s really challenging reaching that point.

Choosing opinionated and well supported languages & frameworks is essentially the lesson there, and they'll probably finish that in one week.

The real difference here is that this course does not aspire to cram leetcode, which is what you're actually evaluated on when trying to get into software engineering. In that framing, they are adopting an extremely contrarian and risky position: if their students perform worse in job hunts after taking the course, and entrepreneurial / indie / artisanal software development remains hit-driven, students will conclude in the long run that the course is bad.


> Choosing opinionated and well supported languages & frameworks is essentially the lesson there...

There are strong benefits to opinionated frameworks, and reaching an initial version of a code base is absolutely one of them. But the trade-off comes in the next version, when you get customer feedback, and the devs start saying things like, "No, we can't do it that way because that isn't how the technology works."

That doesn't make it the wrong answer for an MVP or a V1... but this is an academic course. I'm hoping they cover the longer-term consequences of early choices in software development. Because that is what is really hard - not getting a first working version, but maintaining delivery speed over the long haul as you realize the limits of your choices.


> Once you have a codebase in place [...] There are far fewer decisions to be made. Really you have little choice in how to do things as the current structure dictates your options.

And for these exact same reasons working with an existing codebase can make further software development a challenging process. It's all a matter of how well those past decisions line up with the present and near future which may not have been apparent (or simply not considered) at the time.


> Once you have a codebase in place, things are a whole lot easier. There are far fewer decisions to be made. Really you have little choice in how to do things as the current structure dictates your options.

You still have the choice to extend the current code, or build more code around it.

For example, you have a ticket system and are now tasked to build full-text search for it. You could try to do it within the existing code base, or add an external search engine, sync the data to it, and build a shallow integration between the two.

That said, you are right in that deviating from the current philosophy of the code base is a big risk, and so a big decision to make.
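
To make the external-engine option concrete, here's a minimal sketch (the names are invented for illustration; a real engine like Elasticsearch or Solr would replace the toy index):

  from dataclasses import dataclass

  @dataclass
  class Ticket:
      id: int
      title: str
      body: str

  class SearchIndex:
      # Toy stand-in for an external search engine.
      def __init__(self):
          self._docs = {}

      def upsert(self, doc_id, text):
          self._docs[doc_id] = text.lower()

      def search(self, query):
          q = query.lower()
          return [doc_id for doc_id, text in self._docs.items() if q in text]

  def sync_ticket(index, ticket):
      # Called from the ticket system's save path (or a change feed);
      # this is the only coupling point between the two systems.
      index.upsert(ticket.id, f"{ticket.title}\n{ticket.body}")

  index = SearchIndex()
  sync_ticket(index, Ticket(1, "Login fails", "500 error on POST /login"))
  print(index.search("login"))  # -> [1]

The ticket system never learns how search works, and the search side never writes back; that keeps the integration shallow.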


> After 20 years doing this job I can confirm that building software is really hard. I’m not sure how you really give people experience with that. My courses at uni did little to prepare me for building software.

I'm kind of amazed at just how much experience it seems to take to build a robust & generalizable model for making reliably good decisions about code complexity management & problem solving techniques.


The real trick is getting people to not have the same year of experience over and over again.


There is art to it. I wonder if it can be taught.


The Pragmatic Programmer does a decent job of it. Not perfect, but decent enough.

I'm of the school that thinks it's something you need to practice. Many software positions, especially in large companies, are very limited in scope: you basically just maintain and patch existing systems, rarely having to build systems from scratch. Building these systems from the ground up is what is hard, but you learn what works and what doesn't the more you do it.

If you want to learn how to build software, join a startup or small company that has a healthy appreciation for what software rewrites can provide. Run away from companies that follow the never-rewrite mantra. They usually say something like "you'll just repeat the old mistakes" - well, no shit, this thing was built before I was born; if you rebuilt it with me here, I could avoid those mistakes.

I'm not saying you should always rewrite; it's usually better to refactor, but some design failures are hard to refactor out, involve double binds, or are too critical to risk stability problems. Systems that live forever should have unlimited quality assurance budgets for refactoring and repair, yet very few in this industry do. Managers seem to expect systems to live forever all while constantly patching with very small refactoring budgets. It's completely insane.

It's like running a machine in a factory 24/7 and never doing maintenance, but expecting it to work perfectly, like new, forever, with no future cost incurred as debt.


> Managers seem to expect systems to live forever all while constantly patching with very small refactoring budgets. It's completely insane.

Preach it! I don't know where this mentality comes from, but it seems to treat software as something that just magically solves problems and only needs a little upkeep to chug along.

The problem is that problems change, and a different problem is no longer necessarily a solved problem. It takes a lot of design effort to make something that can easily be adapted to changes in the problem space, and it still requires effort to make sure the software doesn't specialize itself over time into a corner.


Yup! And sometimes it's actually advantageous to specialise systems into a corner; it has benefits. But these decisions need to be considered and made with care, and then it needs to be an understood limitation going forward that they can't be rolled back at a later time without a high cost.

Thinking about the above further, I notice the things I've learned over the years also required the ability to link problem to cause, or in my case it came in reverse: cause linked to problem.

I ran into problems for years, learned how to fix them, and then during code reviews (even in self review), discovered the cause behind these problems.

Sometimes these were causes that would only become problems at a later date because of knock on effects.

I don't know how you teach that. Possibly antipatterns / code smells which usually hint at underlying problems that may be elsewhere / at a different level.

Refactoring by Fowler and TDD by Beck cover a lot of these in good detail.


Refactoring is an amazing book, but I think most people would be better served by starting with Working Effectively with Legacy Code, as it shows you how to get to a place where you can refactor confidently.

The examples are kinda dated but the approach is pretty timeless.
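
The core move in that book is the characterization test: pin down what the legacy code currently does, however odd, before changing anything. A minimal sketch, with legacy_price as a made-up stand-in for inherited code:

  def legacy_price(qty, member):
      # Inherited code we dare not touch yet.
      p = qty * 9.99
      if member and qty > 3:
          p = p - p / 10
      return round(p, 2)

  def test_characterize_legacy_price():
      # Expected values come from *running* the code, not from a spec;
      # they document current behaviour so a later refactor can be checked.
      assert legacy_price(1, False) == 9.99
      assert legacy_price(4, False) == 39.96
      assert legacy_price(4, True) == 35.96

  test_characterize_legacy_price()

Once enough of these are in place, you can refactor with some confidence even when nobody remembers the original intent.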


I tend to disagree but this might be my biases at play.

Refactoring starts from first principles, almost all examples include the reverse refactoring as often you need to undo optimisations a few steps before you can move forward in a different direction.

I was recommended Working Effectively with Legacy Code by the same types of managers that expect forever returns off the same code. They like to point out how that book encourages small patching without refactoring, as refactoring is often fruitless in short-term money terms, whereas Fowler tends to advocate refactoring as a healthy and necessary part of software development.

I can't say I hated Working Effectively. I found TDD by Beck more useful, and Working Effectively with Unit Tests to also be worthwhile.

All of these books cover a lot of similar ground.

If you're going to read only one or two, I suggest TDD > Refactoring, and skipping the others.

(Just my opinion based on what I got out of them reading after 10 years on the job, I suspect juniors might have a different take)


So, I love all of the books you've mentioned.

However, I have ended up inheriting a pile of crappy code driving business value a bunch of times in my career (data science is much worse for this), so for me, Working Effectively was really really useful as it provided ways to deal with the craziness.

As soon as I finished WELC, I ordered Refactoring and devoured it - but it is less practical for my situations, where there are no tests and nobody sees why they would even be useful.

In terms of good and impactful reads, TDD is probably the most bang for my buck I've ever gotten, in that it's super short and very very good. Kent Beck is hilarious also, which definitely helps.


That'd be "Working Effectively with Legacy Code" by Feathers -- which I now look fwd to reading; thanks for the rec!


> I wonder if it can be taught.

Of course it can be taught; it has to be. The current model of expecting people to just pick it up on their own by going to school or a bootcamp isn't working.

I think a large part of the problem is that in contemporary workplaces it's becoming rare for senior talent to spend serious time developing juniors and doing so as an integral part of their job rather than merely within the scope of some mandated "knowledge transfer" phase.

Too many places are beset with pushy "deliverables" treadmills and suffocating project management. It's hard, especially in large organizations, to just "think" and be creative. Juniors need to see that, emulate it, and get meaningful feedback beyond a dumb burn-down chart.


Art schools that teach visual art and craft are a thing. So I don’t see why it would be impossible to teach the art of building software. We just need to change our methods.


I would feel significantly more comfortable calling programming an art rather than a science or Engineering.

I was an engineer before switching to programming and despite throwing the word Software Engineer in my title, there is no Engineering.

There is developing only.

'Best practices' come from Authority, not science (see note at bottom). This is a stark contrast to Engineering where everything is proven with math.

(Note) We don't write in logic gates anymore; efficiency is obscure at modern coding levels.


> This is a stark contrast to Engineering where everything is proven with math.

You've just described the field of formal methods. For software with strict requirements of correctness, correctness is sometimes proven mathematically.

It's not that a rigorous approach to software development is impossible, it's more a question of when it's really worth doing. At present, formal methods are very costly to use. I don't think it's always necessary to go all the way to full formal verification for software development to count as rigorous, though.
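
Property-based testing is one such middle ground: you state a specification and check it against many generated inputs, without proving anything. A small sketch using the Hypothesis library, where my_sort is just a placeholder for the code under test:

  from collections import Counter
  from hypothesis import given, strategies as st

  def my_sort(xs):
      return sorted(xs)  # placeholder implementation under test

  @given(st.lists(st.integers()))
  def test_sort_spec(xs):
      out = my_sort(xs)
      # Specification: output is ordered and is a permutation of the input.
      assert all(a <= b for a, b in zip(out, out[1:]))
      assert Counter(out) == Counter(xs)

  test_sort_spec()  # Hypothesis runs this across many generated lists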

Related reading: They Write the Right Stuff, from 1996. [0]

> efficiency is obscure at modern coding levels

Depends on the domain. There are plenty of domains where efficiency doesn't much matter, on modern hardware. Game engines are still tuned for efficiency, and will continue to be.

[0] https://www.fastcompany.com/28121/they-write-right-stuff


To clarify, you can't know what is the most efficient path in Programming.

You can modify and time, but you can't know ahead of time.
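
For example, the "modify and time" loop in practice, using Python's timeit on two candidate implementations (which one wins depends on the machine and runtime, which is exactly the point):

  import timeit

  setup = "data = list(range(10_000))"
  candidates = {
      "sum() builtin": "total = sum(data)",
      "manual loop": "total = 0\nfor x in data: total += x",
  }
  for name, stmt in candidates.items():
      t = timeit.timeit(stmt, setup=setup, number=1_000)
      print(f"{name:14s} {t:.4f}s")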


If you are writing in an assembly language ie for a specific hardware architecture, why couldn’t you know the most efficient path ahead of time?


I don't see the point here. It's quite possible to assess the performance of a game-engine.


Okay, I'll explain one more time; maybe a second write will clarify.

In Engineering, everything is defined with specifications and based on specifications you can calculate if something will work. If you need something to work at 100 degrees C but it will never get to 110C, you don't buy a material that works at 110C, you find the cheapest material that works at 101C. (Factor of safety aside)

You can prove in math your design will work

In Programming, unless you are working at switch levels, how could you prove your code is most optimized?

It's not going to be. Abstraction has removed the possibility of doing that.

The closest thing I've seen to Engineering in the Programming world is Industrial Engineering. This is the way Engineers optimize production environments. You can see this in Programming in various time/load aspects. But industrial engineering is an "after the fact" discipline, and as much as I like optimization, it has a reputation of not being Engineering.

Hope that makes sense. Science and math vs optimization.


> You can prove in math your design will work

It's possible, but extremely labour-intensive, to mathematically prove the correctness of a program. I mentioned this in my previous comment.

> In Programming, unless you are working at switch levels, how could you prove your code is most optimized?

'Most optimised' is an entirely different problem than 'your design will work'.

Complexity theory lets us do something like this, but at a more abstract level, rather than at the level of real-world performance on a particular machine.
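
For instance, the analysis predicts the shape of the curve (O(n) membership tests on a list, roughly O(1) on a set); the constants on a particular machine, which the theory deliberately abstracts away, only come from measuring:

  import timeit

  for n in (1_000, 10_000, 100_000):
      as_list, as_set = list(range(n)), set(range(n))
      t_list = timeit.timeit(lambda: (n - 1) in as_list, number=1_000)
      t_set = timeit.timeit(lambda: (n - 1) in as_set, number=1_000)
      # List lookups grow roughly linearly with n; set lookups stay flat.
      print(f"n={n:>7}: list {t_list:.4f}s  set {t_set:.4f}s")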

As you say, real-world code is almost guaranteed not to be perfectly optimal in terms of performance. We really never need to aim for this, though. Market forces push for high correctness and performance, to varying degrees across different problem domains. Performance matters in game engines and for high-frequency trading, and in those applications the software engineers put in great effort to optimise their systems, using fewer abstractions.

Sometimes the software engineering challenge may be a life-critical hard-real-time system, such as in medicine or aviation. Software engineers are able to deliver such systems. It's not of great concern that these systems aren't perfectly optimal in terms of their performance, provided the programs give the right outputs and always meet their deadlines.

The generation of perfectly optimised code has been researched, branded superoptimisation, [0] but it's little more than an academic curiosity. I doubt it will ever be possible to scale it up to be practical for large programs. (I'm not sure if/how they deal with the way most modern processors are terribly complex and don't have easily predictable performance.)

> Abstraction has removed the possibility of doing that.

It's very often a sound decision to trade off on performance in order to gain on some other dimension, such as development velocity, or maintainability, or indeed correctness, given the project's resource constraints. As tools improve, the Pareto frontier advances, and we get a little closer to being able to 'have it all'.

An example of this might be the use of Rust for web development work, which can apparently greatly improve performance (or greatly reduce the computational resources needed). It can bring the performance of C++, while retaining many of the advantages of safer, higher-level languages like Java and C#.

Whether it does this well enough to really gain traction, we'll have to wait and see, but I think it's a good example.

> The closest thing I've seen to Engineering in the Programming world is Industrial Engineering.

Industrial engineering is an entirely different discipline than software engineering.

For examples of 'proper software engineering', the obvious candidates are avionics (as discussed in the article I linked above), and development methodologies involving formal methods, such as with the Tokeneer project. [1][2]

[0] https://en.wikipedia.org/wiki/Superoptimization

[1] https://blog.adacore.com/tokeneer-fully-verifed-with-spark-2...

[2] [Huge PDF] http://www.adacore.com/uploads/downloads/Tokeneer_Report.pdf


We may be talking past each other as your points aren't relevant to Engineering.

I'm going to highlight this-

Engineering is science and math, versus Programming, which is optimization.


Software engineering is an optimisation problem in the same sense that other engineering fields are optimisation problems. There are tradeoffs to be made on many dimensions, just like in any engineering discipline.

Again, mathematical tools can be used to prove the correctness of programs.


I think you are missing the science and math aspect of this. This isn't a contest. Physicians make lots of money by following tradition and practicing medicine as an art. There is a huge debate that physicians could lose their monopoly power if medicine becomes a science.


This is vague. Please clarify your point. What aspect are you talking about?


I suspect that marketingPro meant that you can't know what is the most efficient path for writing that game engine.


I learned most of it through pair programming. You teach each other.

With solo programming, everyone has to figure everything out by themselves. A deeply inefficient system.


After writing code for 20 years, I found out there are different interpretations of "simplicity", just like in art.


Any thought pattern can be compressed, transferred & applied I'd say.


>Once you have a codebase in place, things are a whole lot easier.

You sound blessed not to have run into huge spaghetti code with no comments and no tests, that violates best practices, doesn't work for its purpose, and doesn't even compile.

After this job doing bug fixes and feature adding, I'll take new code ANY day.


Oh don’t get me wrong, I’ve worked in plenty of bad codebases. At one job, I walked in and on day one the two other senior devs said “congratulations, your share is these 1M lines of code”. It was a total mess. It was back in the day when ColdFusion didn’t have functions; instead you called modules that accessed and updated variables in their parents/grandparents directly. There’s nothing more devilish than code that can touch any other part of the system. By the time I left there I understood the domain well enough to rewrite that whole system away. I feel your pain.

I’m talking more about how to start things right so you don’t end up in that place. It’s really hard.

And my point still stands that in your codebase, it’s done. The ship has sailed. There’s some comfort in knowing that you’re just going to have to make the most of the restrictions you’re working within (small comfort, I know).


Personally I feel like our software development methodologies/technologies/tools are drastically underequipped to handle the complexity of the systems we are attempting to build. I'd estimate we're maybe at the level now to handle the complexity of software from 30-40 years ago (i.e. 80s software), but that is not based on anything solid, because in the world of software there is precious little solid to base anything on. The free lunch we have been enjoying might come to an end now that software is embedding itself more and more into things that really matter, like human rights and safety. Electronic voting and self-driving cars come to mind as examples.

Of course, another problem is that not only is software engineering still pretty immature, its use also is very unevenly distributed. While we might be able to build software somewhat reliably with state-of-the-art tools and methods, the vast majority does not use them. Not that we have much agreement in our field about what constitutes unequivocally good practice. This has led to a situation where even the basic foundational building blocks we use are usually of dubious quality.


I used to think that, but now I don't think we're getting better. The more I think about it, the more I see that people are equally bad at it; progress only happens by building out the tooling beneath them that allows them to build more and more complicated things, without them realizing it.

Developers have the capacity and willingness to work on a small part of a codebase and the rest has to be filled in by others in a similar situation and by tooling.

I'm a bit biased since I run an edtech company but it's very disappointing to see how hard it is to get companies and people to actually invest in education.


I question the need for all this complexity.


I took and TAed this class. I ended up dipping my toes into the open source waters with some Firefox contributions and completing the rest of the minor program including a set of contributions to Chromium. One thing that was heavily emphasized in that curriculum was how open source can be a huge asset to companies. I really took this to heart, and it has played a significant role in shaping open source strategy for containers and Linux stuff at Facebook.


Anywhere we can find the videos? I think they'd be worth their weight in gold.


Is there a textbook you'd recommend for learning the content of this course... for those of us who can't go to CMU to take it?


The lecture slides are available on this page.


Software engineering was life changing for me. I studied cognitive and computer sciences (CMU '03) - and when I graduated I realized I didn't know much about writing software.

Luckily, I found a software engineering course (Berkeley, 2005) after graduating and they let me attend without being enrolled. That professor (Kurt Keutzer) had wisdom! Literally life changing.

These engineering skills are totally different from what you learn in a computer science curriculum. I highly recommend a course like the one OP links.


Hey everyone- I am one of the co-instructors for this course (Michael Hilton). Thanks for all the comments! I’m happy to answer any questions about the class, and I’m always looking for suggestions on ways to make the class better: AMA!


This looks great! Are there plans on making more of this course's content available to non-CMU students (e.g. recorded lecture videos)? Thanks


That is something we would love to do in the future; however, our current videos include students' voices and faces, so we cannot release them for privacy reasons.


My wife's in a remote master's program. What they did was chop up the lecture: when students asked a question, that part was omitted from the online presentation. The result is that an hour of class winds up being, say, four 12-minute videos.

That sounds weird, and it can be a bit choppy, but it has advantages. You get to stop and think at the points where others needed to ask questions. You have natural breaking points if you prefer smaller-sized bites.

So, I don't know if the students are in all of your video, or just when they ask questions. But if it's only when they ask questions, you might consider this approach.


Aww, sad :(. The faces are likely problematic, but the voices themselves shouldn't be (they're not identifiable to a person).


Why not? I can't really see the difference, except that identifying a person by face is easier than by voice.


Yeah, exactly; a generic voice is not personally identifiable.


My suggestion would be to consider safety-critical software engineering: What is the difference between an autonomous driving prototype and product? From there, venture towards process stuff like ISO26262, SPICE, MISRA-C, etc. Students should learn what the purpose is behind these process standards. It might lead to interesting discussions for what use cases you want to employ how much process/tool overhead.


Thanks for sharing this content professor! Any resources (online courses/books/etc) that you can recommend for current CS students that don’t have an opportunity to take a class like this?


I haven’t had time to read it yet, but I have been hearing a lot of good things about the new book from google (full disclosure, I know the authors and they are great) https://books.google.com/books/about/Software_Engineering_at... I also recommend following practicing software engineers on Twitter, to see what people in industry care about.


Regarding lecture 16, Intro to QA testing, my feedback as a practitioner would be:

- Typo - "principle goals" / "principle techniques" instead of "principal"

- Add accessibility and internationalization under non-functional testing types, with at least a brief mention of their importance

- Add mention of exploratory testing under manual testing as a valuable technique

- Most of your students will not be QA staff themselves in their careers, so a brief mention of the benefits and pitfalls of collaborating with dedicated QA personnel may be useful. A mention that the type of QA personnel varies highly by software subspecialty and size of the business (ranging from non-technical to CS degree holding specialists) may also be useful.


I'm glad that the lecture slides were publicly available. I was particularly excited about the slides for architecture and design docs. I just wish I could sit in to listen on lectures (or have the discipline to read through the textbook :PP).

In my undergrad, I felt that my CS classes were divided into "theory" and "systems". But, even the "systems" classes would feel removed from the modern tech industry and commercial software.

The topics that I see mentioned in the syllabus are familiar to me, but only because I surf Medium articles, and not because of my formal CS education. Yet, these topics are incredibly deep and complex. I would consider the people who work on them to be researchers in a hyper-specific domain.


The elephant in the room that 17-313 doesn't address is team dysfunction.

You know, the guy who insists on agile when you're about to automate a nuclear reactor cooling system. Or the one who insists on expensive message queue tech when async/await is more than sufficient. The bro that mansplains directed graphs at a PhD because she committed the cardinal sin of being a woman. The infrastructure manager that micro-manages a developer because that manager was too lazy to stand up the test environment on time.

We all know someone like that, and after 30+ years in this industry it seems it's worse now than it was in the 90s. Back then all we had to contend with was Windows v. Unix and bastard operators from hell (1). Today it's more insidious. Often well-dressed, veiled in charm and confidence, and hiding behind a big paycheck because they're adept at managing upwards. I can deal with it, which has another unfortunate side effect: other developers want me to manage; I just want to write code.

Amusingly (sadly?), these problems (2) were identified in the 80s, if not earlier. Books were written about dealing with them (2 and 3, amongst others). But still we allow the behaviour, and often organisational/industry culture encourages it (fintech, anyone?).

(1) http://bofh.bjash.com/

(2) https://en.wikipedia.org/wiki/Wicked_problem and https://www.amazon.com/dp/013590126X

(3) https://en.wikipedia.org/wiki/Peopleware%3A_Productive_Proje...


It does, in the last lectures, see Nov in the schedule:

https://raw.githubusercontent.com/CMU-313/CMU-313.github.io/...


What's there is valuable for sure. It pushes agile, though. A lot. Another approach may be to provide a way for a team to choose a methodology that's appropriate for what's being built (1). It describes effective teams, but doesn't provide guidance on identifying dysfunctional teams/team members and transforming them into productive ones.

(1) https://www.wittenburg.co.uk/Work/Choosing.aspx


Fun thing is: agile is a set of principles.

Scrum is a method, as are SAFe, LeSS, and Kanban.

And: the complete V-model belongs to the "definition of done" for safety projects, like the example above, control software for a nuclear reactor.

Which gets me to the intuition that you can have dysfunctional teams in any software engineering approach.

The challenge of dysfunctional teams is not addressed by a discussion about agile, lean, "waterfall".


You are absolutely correct. Defining agile as principles, or Scrum as a method, is correct, but doesn't change how the terms are used. Turning dysfunctional teams around is something I'm researching (as identifying them is easy).


Psychological safety seems to be a big part of the puzzle. For example:

> If there's one thing I feel that is absent from the Agile Manifesto, it's psychological safety. We have learned in the past 20 years how important this is to a team's success. https://old.reddit.com/r/agile/comments/jd29lf/the_elephant_...

Just today, I started reading The Culture Code by Daniel Coyle. The first two chapters are great and it addresses exactly this question.


I've been thinking recently about tradeoffs between (agility / resilience / adaptability) and (efficiency). It seems to me that so many orgs' attempts to adopt "Agile" practices are motivated by a desire for increased efficiency per se. Which seems paradoxical. I'm not trying to get into a "no true Scotsman" debate about what Agile really means, but what I keep seeing is an overemphasis on process that runs counter to the heart of the manifesto. IMHO that's the crux of a lot of dysfunction. V. interested in others' thoughts here.


In my opinion [0], Agile improves time to market (latency) at the cost of process efficiency (throughput). If you adopt Scrum to improve efficiency (maybe calling it velocity), then you are using the wrong approach. It will certainly not improve efficiency. Worse, by pushing Scrum to improve efficiency, Scrum loses all its advantages. This leads to the common result that nothing changes.

[0] http://beza1e1.tuxen.de/cost_of_agile.html


Why value efficiency and not quality? Remember Deming (1)?

             Result of work effort
  quality = -----------------------
                  Total cost
When the focus is on quality, quality rises and costs fall. When the focus is on cost, costs rise and quality falls.

(1) https://en.wikipedia.org/wiki/W._Edwards_Deming


It's literally part of the schedule: https://cmu-313.github.io/#schedule


No, it isn't. See my comment addressing this above.


I saw your comment, and it says that the course does not address team dysfunction.

It explicitly does.

Whether it conforms to what you want to consider to be team dysfunction is another point of discussion entirely.


I do not consider dysfunction something other than the course describes. What I take issue with is that the course offers an outcome but no approach. Read it again.


CMU has always been big on this kind of thing. I took a CMM (pre-CMMI) course there, about twenty years ago.

I tend to write stuff I architect from the ground up. This has earned me a number of sneers, and comments about “reinventing the wheel,” but I seem to have picked up a knack for it, from many years of experience (my very first engineering project, in 1987, was a complete design of a microwave routing tool, including chassis milling, electrical design, firmware operating system, and host drivers) [0].

One of my biggest motivations is that I “eat my own dog food.” I am usually the poor bastard that needs to go back through my legacy and figure out what I did (in fact, I’m doing that right now). It has taught me to leave a good legacy [1].

[0] https://littlegreenviper.com/TF30194/TF30194-Manual-1987.pdf (downloads a PDF).

[1] https://littlegreenviper.com/miscellany/leaving-a-legacy/


In my experience there are two types of devs who reinvent the wheel.

Those who think they can do better and those who can do better, haha


I know this is in jest, but honestly there are two kinds of wheels: round things that everyone uses to roll around with in common/universal ways, and specific configurations of wheels like this person mentioned for microwave routing.

You probably don't want to reinvent the wheel that is SSL or UTF-8 or something ubiquitous like that, but there's actually not that much wrong with reinventing the way specific wheels are arranged and designed to work together in a domain-specific context like microwave routing (which I know nothing about). The act of reinventing it will probably teach you a lot about the domain itself, and depending on where you started in your journey, it can be quite fruitful (i.e. if you are a true beginner, then many of the lessons you learn will be things that are already common knowledge; if you are already a journeyman or expert, it's quite likely you might pick up some novel insights in the course of disassembling and reassembling the thing).


If you are able to define that you indeed need a new kind of wheel, it may not be a reinvention :D

Most people also don't even know that the wheel exists and don't realize that they're reinventing it.


In my case, as in most, it depends.

For example, they are absolutely correct about "reinventing" encryption algorithms...unless you are a TLA, and want to have something that no one else has (and have some good nerds on staff).

It may also be good for control freaks (like Yours Truly), that deal with fairly sensitive data, and who don't feel comfortable using a well-known framework with a big presence on mitre.org.

I'm really pretty good at designing frameworks and SDKs/APIs. I've been doing it for over thirty years (since before the terms were coined).

Chances are good, that I'll probably be able to scare something up.

I like how it's OK to write a new coding language, but it's "reinventing the wheel" to write a backend.


You missed the third type: those who don't know if they can do better, but are still early enough in their career to want to do everything themselves because it's exciting and they don't yet know what is worth doing yourself, and what you should trust history to have already done for you.


Yep. Those will become the great programmers of the future.

Honestly, I can’t think of a better way of spending one’s early career than reinventing some kind of wheel.

We should stop pretending that one is able to become a good programmer just by spending time in a chair doing the same thing for years.

Is it a mistake to reinvent the wheel? 99% of the time, yes.

Are those mistakes an amazing learning tool? Definitely.

If you’re a developer just starting out: go and write a web framework, game engine or operating system. That's probably the only way to become a software engineer, since nobody will teach you the lessons you'll learn by doing. Your future self will thank you.


Yep. It unfortunately takes a lot of writing code to understand that you should not write any code and to become good at not writing code.


There's nothing unfortunate about that. You're not going to be a very good software developer unless you've spent a number of years biting off way more than you can chew on the assumption you could succeed, and failing a lot along the way, discovering when the solution to "why is this not working" is "I should find someone else's code to do this for me".

Computer science is a science. Software development is a craft. And like any craft, there's no substitute for experience.


”Good judgement comes from experience. Experience comes from bad judgment.”

-Attributed to Nasrudin, but made popular by Rita Mae Brown and Will Rogers


> There's nothing unfortunate about that.

So true. It’s one of the best things about our career.


Also those who don’t know that “wheel” is a major category with many open source options, and dismiss them after a cursory look when you point this out.


I think we don't learn and codify standards from our mistakes in software engineering the way we have in civil, mechanical or electrical engineering. I guess it could be because we think the consequences of software mistakes are not as serious/permanent as they are in those other engineering fields.

Also, the speed of innovation/change has been super fast compared to those other fields. But I doubt we can keep up this pace of innovation for much longer without the software engineering process becoming the bottleneck.

I came across a startup of 50 people in which there were 4-5 different application stacks for server-side microservices – each team had decided to go its own way – with no standard end-to-end observability or release management tooling, and they were struggling to debug a latency issue – which itself was happening because they didn't have proper parallelization of RPCs, proper timeouts, isolation, etc. This problem of not following well-understood design patterns, or stumbling into known anti-patterns (and, worse yet, failing to recognize them as such even after repeated punishing failures/outages) – I've seen various versions of this at different-sized companies (from multi-million to multi-billion $$ ones).

And in each case, the decision makers didn't know or have standardized terminology to talk about things, so what should have been easy conversations about what's a good design pattern vs. an anti-pattern were unnecessarily hard. I can't help but think that this is uniquely a software engineering discipline problem.
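
For what it's worth, the pattern that team was missing is small and well documented. A hedged sketch with asyncio (the service names are invented): fan the RPCs out in parallel, give each call its own timeout, and degrade instead of stalling:

  import asyncio

  async def call_service(name, delay):
      await asyncio.sleep(delay)  # stand-in for a real RPC
      return f"{name}: ok"

  async def handle_request():
      calls = {
          "users": call_service("users", 0.05),
          "tickets": call_service("tickets", 0.08),
          "search": call_service("search", 5.0),  # misbehaving dependency
      }

      async def bounded(name, coro):
          try:
              # Isolation: one slow service degrades; it can't stall the request.
              return await asyncio.wait_for(coro, timeout=0.2)
          except asyncio.TimeoutError:
              return f"{name}: degraded (timeout)"

      # Fan out in parallel instead of awaiting each call sequentially.
      return await asyncio.gather(*(bounded(n, c) for n, c in calls.items()))

  print(asyncio.run(handle_request()))

Sequentially this request would take over five seconds; with the fan-out and per-call deadline it returns in about 0.2s with a degraded search result.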


A lot of the engineering that goes into microservices infrastructure is explicitly to allow heterogeneous application stacks. Paying all those costs in exchange for benefits you won’t use is just a different kind of bad engineering.

You definitely want standardization for truly undifferentiated concerns but it can go overboard. If you’re going to use the same application framework in every service, you may not need to separate services at all.


In most situations, microservices are for organizing domain-specific functional concerns into decoupled teams/services so that functional domain changes can be made rapidly without risking rippling or interlocking changes across the systems. The technical stack layer-cake is usually the same for all the microservices – that is what allows them to be 'micro' and yet economical. You want less than 20% of your engineering force to be responsible for operating this technical layer-cake, while 80% of the engineering workforce is productive actually solving the functional domain problems of your business.

What you are referring to is old-school SOA, where each SOA service could be built by completely different companies using different stacks and deployed in different environments, and yet they interact. This is applicable in financial/banking ecosystems (say, real-time payment systems across banks/payment networks).


I wish people would make more of a distinction between software engineering and software architecture/whatever is currently fashionable. The former is about customers and the latter about the people who write software, and as a customer I feel like almost no software has any concern for me or my hardware. Huge sizes (both in RAM and disk), constant CPU usage and all of this for no reason other than it's convenient or pleasing for software developers.

How do we build tools that are pleasing for software developers to use, while using resources efficiently and meeting the requirements? Can we start having that discussion instead of docker, microservices, kubernetes, etc?


On the other hand, sometimes a customer could not care less about CPU usage, memory, and in some cases paying twice as much for cloud services can still be a negligible amount for the business, yet the software developers obsess over efficient code wasting weeks on improving a metric nobody cares about.

I don't want to disregard your complaints, they are all valid, but one has to weigh the pros and cons, and in some cases, efficient algorithms are not the most important thing for the business.


Yes! I want tools that do that optimization for us! Better compilers/languages, but with more opportunities for optimizations. Even if it takes a long time to compile I think for most software this would make a lot of sense as it is used much longer than a couple days/weeks.


> software engineering and software architecture/whatever is currently fashionable. The former is about customers and the latter about the people who write software

Huh, that naming seems backwards compared to the construction of physical buildings. Architects work on the design of the building (what is going to be built) and interface with customers. Civil engineers work on implementing the architect's design, figuring out how it is going be to built. They work with both the architect (upwards) and the construction people (downwards), but not with the customer.


> ... [Software engineering] is about customers and the latter about the people who write software

Sounds weird to me, don't think I agree


I'll be contrarian. This is a mixture of:

1) Stuff I'd learn by reading Hacker News, after a few months

2) Stuff I'd learn on the job, after a few months

I'm not downplaying that it's important to know, but it's like converting a set of free blog posts into a $5000 university course.

Is that an efficient use of student time or money? Probably not. They're better off getting a summer internship, and getting paid to learn agile (and learning it better! working on an agile team will teach a lot more than listening to a lecture), or reading the anecdotes while procrastinating real homework, and posting in forums like this one.

It's okay that courses at a university do little to prepare people for some realities. Courses are designed to teach big-O, differential equations, database architectures, the math of machine learning, and similar sorts of things where there's enough theoretical depth that you CAN'T pick them up elsewhere.

If a software engineer enters industry without knowing linear algebra, odds are they'll never be a data engineer in a serious ML project, or even more so, work on serious ML themselves.

TL;DR: The content looks useful, but it's a counterproductive use of tuition dollars and student time.


> If a software engineer enters industry without knowing linear algebra, odds are they'll never be a data engineer in a serious ML project, or even more so, work on serious ML themselves.

Linear algebra is a lot to learn[1], but I think it would be possible to pick up the material on the job, given enough time dedicated to it (like a month, not a week).

Similarly for pre-calculus (high school math) and calculus.

I guess it does add up, though, and if you're missing 5-6 of these fundamental building blocks and have to fit a year's worth of math learning in between work, meetings, and other life obligations, it can be tough.

[1] https://minireference.com/static/conceptmaps/linear_algebra_...


Linear algebra, to the point of being a:

* competent ML engineer;

* competent at doing statistical analysis;

* competent at 3d graphics;

* competent at graph theory / SNA;

* competent at signal processing; or

* competent at control systems

is a few semesters, not a month. There is deep intuition behind the concepts, and it takes a lot of time to pick up. The core problem is there are no short-term gains. Dabbling in linear algebra is not a skillset which improves your employability or broadens the projects you can work on.

Being able to develop TensorFlow, optimize 3d rendering for NVidia, process images for Adobe, or design a control system for a robot IS an employable skillset, and makes for much more interesting work. Picking that up is why I'd send my kid to a school like CMU.

Similarly for precalc and calc. It's not hard to know what a derivative and integral is. There is mathematical depth to being able to do something interesting with that.

If college gets you that background, you can pick up Agile and test infrastructure on-the-job.


> It's not hard to know what a derivative and integral is. There is mathematical depth to being able to do something interesting with that.

That's a very good point. In some sense learning of the material at level N, is only done when you use and consolidate the knowledge at the subsequent level N+1, and maybe even N+2.

Indeed, it would be fair to say that a student passing the final exam on topic X—even with a good grade—only truly understands 30-40% of the material. Only when the student has to apply this knowledge in later courses is the understanding complete.

Perhaps the best strategy for the independent learner is to learn topic X and immediately follow up with applications of X. I mean that's what people recommend anyway, but I always thought of it as a suggestion or a nice to have, but in the light of your comment I'm thinking maybe it's a requirement.


University is important for foundations. You can learn MySQL on the job. You can learn relational databases at university. However, the reverse does not work well. MySQL at university is superficial, since you never experience it in a realistic production scenario. Relational database foundations have no immediate use on the job, so why bother with them instead of delivering customer value?


It's not about software; it's about business understanding. So I prefer a business-to-software course rather than a foundations/principles-of-building-things course.

To me, software is a tool to actually understand the world.


The course's github page also has previous years' materials:

https://github.com/CMU-313/CMU-313.github.io


To explain to my dad (a blue-collar worker) what stage I feel software development is at right now: our collective knowledge is equal to builders discovering that a house has four walls and a roof. We are still experimenting with what sort of materials we need to build the walls and roof.


Looks very interesting, but it doesn’t look like the lessons’ recordings will be made available. Can anyone suggest an alternative resource?


Does anybody know if the associated videos are available?



More 'ethics-free' guidance, thanks! /s

Yes, we should care about writing good software, what is a good process, how to best satisfy requirements, etc.

But I really would like to see ethical considerations play a role. Sometimes (surprisingly often I think) people should not do the work they are being requested to do. Just because one's boss (and one's paycheck) depends on following a set of instructions, if it goes against one's conscience, what is the point?

I mean, someone wrote these 'track and trace' apps. This is to say, they are explicitly enabling the government to track every single move of its citizens. Is that ok? Not to me. They should have resigned rather than play their part in increasing governmental control.

Unfortunately, technologists are building the dystopia of tomorrow. 'Smart' cities, anything that encourages use of a 'smart' phone, more data in the hands of FAANG, etc, etc.

smart == spy

If you are under the panopticon, you are not you any longer. You respond to please others, not yourself. You are in fact a slave.

Anything that enforces the hierarchical system, making it ever more restrictive, has blood on their hands in my view. We are designing a future that is less free, less accepting, more rigid. No one wants it. But all those doing the work are selling out their fellows, and even their own future rights, for a paycheck today.


> If you are under the panopticon, you are not you any longer. You respond to please others, not yourself. You are in fact a slave.

Under this line of reasoning, any level of societal participation is tantamount to slavery. I understand what you're angling at and I agree to some extent; however, can you provide a viable alternative?

> Anything that enforces the hierarchical system, making it ever more restrictive, has blood on their hands in my view.

Society in and of itself is a hierarchy. Anything that contributes to society is therefore something that enforces the 'hierarchical system.'

You don't have to participate in society if you wish otherwise. The world could always use more mountaintop hermits.


> Under this line of reasoning, any level of societal participation is tantamount to slavery. I understand what you're angling at and I agree to some extent; however, can you provide a viable alternative?

I understand why you ask this. I will respond in a way that I know will be unsatisfactory to you.

The way I see it, all the actions that we are expected to take are in service to self (i.e. getting yours). Society is geared up to make that seem perfectly natural. In fact, you/we all need to get acquainted with morality. The first step of that is to understand the world for oneself, to take no one's word for it but instead to apply the scientific method personally. Accept that you do not 'know' much; very little is proven. What you have are beliefs masquerading as knowledge - which is to say you have negative knowledge (aka crap).

What I'm really trying to get at, is that social change is an effect of the actions of all of us in aggregate. To make the world better, one can only attempt to makes oneself better. Be the change you want to see. And that is done by following one's heart/conscience/spirit or whatever you want to term it.

> Society in and of itself is a hierarchy. Anything that contributes to society is therefore something that enforces the 'hierarchical system.'

> You don't have to participate in society if you wish otherwise. The world could always use more mountaintop hermits.

It's not as easy as you might think to not participate in society! To get away from it completely is impossible. People I have never voted for, or agreed with, demand, by force if necessary, that I contribute to and support their system.

Right now, you can see the next-generation system that has been designed for us. Citizen scores, UBI (carrot), access (or not) to 'social' goods, e.g. public transport and loans (stick), free movement, limited use of energy, water, etc. without the say-so of the 'administrators'. Smart water meters, smart electricity meters and 5G are the backbone of that.

Still, I don't discount the mountaintop - but I don't think there will be peace there either!


I have to downplay part of your threats, at the risk of sounding sheepishly naive: access to water and electricity is already under the control of "the administrators", and pretty much always has been. Even where it's delivered as a "public" service, the "administrators" were somehow expecting you to pay some sort of bills.

However, although I know some people get their water and electricity shut down, that's still regarded as extreme, and law tends to make it harder and harder over time.

The Internet itself (once a luxury, less than two decades ago) is slowly becoming considered as a "necessity" that should not be shut down. [1]

I understand your point that software is increasingly making dystopia technologically possible, and that geeks should beware how they use their skill - but I wonder how surprising and underrated law and social norms can be as counter-forces.

(Also, it made me smile to see UBI in the list of "evils" that are now "fashionable" to bash on HN. ;) )

[1]: https://en.wikipedia.org/wiki/Right_to_Internet_access#Ensur...


> I mean, someone wrote these 'track and trace' apps. This is to say, they are explicitly enabling the government to track every single move of its citizens. Is that ok? Not to me. They should have resigned rather than play their part in increasing governmental control.

To put yourself in their shoes: they may argue that it’s not ok to “let” bad actors abuse and injure citizens. Many of the people who have been in these positions and talked about it publicly have talked about how they felt like they were doing the right thing. Whether that be tracking child abusers or terrorists or “threats to national security”, they felt driven to make the tools to track and trace these people.


There is always an edge case to justify the loss of freedoms. In the fullness of time, it is revealed that the case was just a pretext that doesn't really occur. But the genie is out of the bottle by then.

The means is that TV/media play up the edge case as if it were important. Once the justification is accepted, the freedom is taken and does NOT come back.

(Media, in case you haven't been paying attention, are part of the governance system, driving what is acceptable to think about on a daily basis. No, propaganda is not to be found only in China and Russia; in fact, Western societies are far worse: we are propagandised from cradle to grave.)

Have any of the pre-9/11 freedoms we lost come back? Travel restrictions, water bottles on planes, etc.? They have not. So an event occurs, it is played up to the nth degree to justify the loss of control in our lives, and slowly, slowly we tiptoe into overt control and domination by the system. In a way you can say that humanity's well-meaning nature - the wanting to do the right thing - leaves us open to these sorts of attacks, i.e. our good natures are weaponised against us.


For those who apparently can't be bothered to look before criticizing: Ethics is in fact in the course.


Learning ethics for building software is overrated. More often than not, those building it aren't the same ones who decided to implement the unethical behavior. How about we examine the failure of teaching proper ethics to MBAs instead?



Is that a joke? I don't care what Harvard has to say about ethics! Harvard!! The educators of the master administrative class, which ethically outsourced all of the jobs to the rest of the world.

It says:

Code of Ethics: As an ACM member I will ... Contribute to society and human well-being. Avoid harm to others. Be honest and trustworthy. Be fair and take action not to discriminate. Honor property rights including copyrights and patents. Give proper credit for intellectual property. Respect the privacy of others. Honor confidentiality.

Fine words! But take a look at the first: "Contribute to society and human well-being." Contribute to 'society'. 'Society' is a fiction. It's not a real thing - it's an idea in people's heads. You could just as easily say contribute to stones.

People are the only ones doing the experiencing. Individuals. You can say 'society', but this is just a way of avoiding talking about, and to, individuals.

Ethics questions are for the individual to work out. And it takes work! And the worst thing is that everyone thinks they know what they mean. That they are ethical. That it's others who need the educating!


If you haven't worked out by now that governance is more than the political class, we don't have much hope! Educational institutions are as much a part of the problem as media institutions, as politicians, as we are ourselves. In fact, they are probably the worst: taking children and teaching them that the government is good, that it loves them, that they are not slaves.

When government is slavery.

And it takes a lot for an individual to turn around and point the finger at themselves, accept some of the blame and make a change.

Most will take the ethic-free paycheck, rather than start to consider how their effort is creating a dystopian future for us all.


Internet == spy

Computer == spy

Chip == spy


Agreed. The internet, computers and chips, though, can be (more) neutral with care. They can serve you.


This looks truly excellent, I'm glad CS is starting to teach the core skills needed for day to day practice of most graduates.


> You may not copy any part of a solution to a problem that was written by another student, or was developed together with another student, or was copied from another unauthorized source such as the Internet. You may not look at another student's solution, even if you have completed your own, nor may you knowingly give your solution to another student or leave your solution where another student can see it

I don’t see why schools are so against this. 80% of my learning came from comparing notes and solutions with my friends. Many times I’d learn something I never thought of, and vice versa. I get the “don’t copy things you don’t understand”. Imagine a world where you couldn’t review your peers’ PRs and learn/give feedback.

In many subjects the assignments were so disconnected from the actual lecture notes. It was much more efficient to learn from each other.


Yeah. There should be an entire course dedicated to adapting StackOverflow solutions to your problem. Learning from existing solutions and comparing them with what you're trying to do is half of programming.


Note that this is not to be confused with Software Foundations, the course on formal software verification: https://softwarefoundations.cis.upenn.edu/.


What are some MOOCs or books I can read regarding software engineering?


I'm not a software engineer, so this criticism directed at them will probably get downvoted. Why isn't security - and that includes secure code - one of the first priorities in design?


I wish there was a good online course like this from EdX or similar.



