It is frustrating how widely applicable this book is. Universities across the country are struggling to adequately prepare students for software engineering jobs.
It is also frustrating that this book from 2012 is still applicable in 2018. I'm not sure what the solution is or what would drive change here.
In my opinion, it is not the mission of universities to prepare students for jobs. Universities are research institutions that prepare students for research. That's why all professors have to do research. Vocational schools exist for the purpose of job training. Big companies can also train their own workers.
In Australia, we intentionally collapsed most of our vocational schools into our public universities a couple of decades ago. I guess that was fairly unique because it was such a conscious decision. Universities in the modern world _are_ in fact vocational schools for many professions. Class sizes and student numbers alone often show this.
And while big companies can train their own workers, many choose to offload that competency to the universities as well. In Germany, many of the largest export employers have a symbiotic relationship with universities local to their factories. The companies help shape the course structure and syllabus, and in return graduates are offered good living-wage jobs straight out of university, with continued training and certification.
Australia is a long way off the German model, so all I know is what I've read in the Economist and a couple of other publications, but there are certainly murmurs of moving to such a model in Australia as well.
"University" just doesn't mean what it used to mean.
As I say elsewhere, just look at other disciplines taught by universities. This is a totally untrue view of what universities are.
They are learning and research institutions; they've never been just research institutions. And the vast majority of fields don't really bother teaching any R&D to undergrads; they just teach them the subject and its practical application in the real world of work.
In the UK we used to have vocational schools too, that all got changed to universities. It didn't mean no vocational training went on in universities.
In France it's not so clear-cut, but more and more “professional” programmes have been set up in universities. Many programmes are actually not clearly “professional” or “research” oriented but somewhere in between, and give you insight into both worlds. My Master's degree is like that, and I ended up doing an industrial PhD after being sure I wanted to work in a company. In that context the teachers do their best to give some training specific to software engineering, but it has to be balanced.
That's the theory that hasn't been true in years. It ought to be as you describe, but it isn't - universities nowadays are glorified vocational schools with an extra option to go on a research sidetrack if you really don't like money. It's visible across the board - from the reason people choose to go to a university, to the focus universities have on attracting students and teaching, vs. supporting actual research.
The problem isn't with the universities, it's with the lack of vocational schools with any depth or rigour for intellectually-demanding fields like software engineering.
Sure, my local vocational school does offer a certificate and diploma of software development - a two- or three-semester program which teaches a couple of Java and C# courses and how to use a database. Which is to say, it teaches you the bare minimum you need to know to become the most junior level of applications developer in an enterprise IT department - vastly, vastly underqualified for many of the kinds of jobs that expect CS degrees.
The other problem is that, well, I actually liked the theoretical and conceptual side of doing a CS degree, learning from lecturers who were researchers in the field. I, and most of the good software engineers I know, would have regretted just going to a vocational school.
IMHO, there's a third option worth considering. Here in Australia, lawyers start off by obtaining a law degree (a Bachelor of Laws or a Juris Doctor) from a university. However, before they're admitted to practice, they have to obtain a Graduate Diploma of Legal Practice - a 6-month course which focuses on the practical skills of being a lawyer.
I don't see why there couldn't be a similar type of program to teach practical software development workplace skills - obviously, not as some kind of mandatory licensing program, but as an optional extra.
> many of the kinds of jobs that expect CS degrees
... most of which don't actually need CS degrees, but in the absence of the rigorous vocational programs, it's a reasonable filter to get candidates who have a clue.
They are both research and training institutions, and quite obviously so if you look at another discipline.
The architects don't all learn abstract, obscure theory; they spend a (crazy) amount of time doing drafts and models. The engineers don't all learn irrelevant and out-of-date techniques. The pharmacists spend half their time in labs, as do the chemists. The doctors have to do rounds on real wards.
So why in CS is your defence that a university is a research institution?
That's simply not true, and looking at any other discipline shows how wrong you are.
It used to be true for all subjects when universities as we know them today were founded. It is still true for the hard sciences: physicists train to become researchers, not data scientists at hedge funds. Computer Science is not Software Engineering and shouldn't be forced to become that. Computer science is, imho, a branch of mathematics, and the focus should lie on theory. To steal Dijkstra's words, I don't think it's any more about computers than astronomy is about telescopes.
I don't think it's a good idea to teach students whatever is currently en vogue in the industry. If you for some reason need to have an applied subject at a university instead of a vocational school you should call it Software Engineering or whatever.
You're focusing on the theoretical fields while conveniently ignoring the doctors, engineers, pharmacists, chemists, biologists, architects, etc.
All of whom spend all of their degree on practical, applied learning. And have done for decades/centuries.
If you then want to go into research in those fields you do a PhD. The vast majority of their graduates go into industry.
And of course, the vast majority of CS students go into industry, just woefully under-prepared, unlike other disciplines.
There's no reason for CS to stay theoretical only. The field of computer science/engineering/making/whatever you want to call it doesn't need loads of researchers; it needs loads of practical, professionally trained programmers.
The ridiculous defence that it's computer science, not engineering, is dead; that ship sailed decades ago. It's just a name. Just like a PhD in History - a Doctor of Philosophy - doesn't make you an expert in Philosophy; it's just a name.
I'm not against training loads of programmers, I just don't think that we should kill CS as a theoretical subject to do so. I believe vocational schools are better suited for the task.
IMHO computer science degrees should primarily teach computer science, but they should also weave practical tools/skills into course projects, especially testing, debugging, version control, and a few classes of programming languages.
FWIW I think the “software engineering” course was the most useless “CS” course I took.
We were introduced to version control in our first year. The only problem was that till we started managing large code bases, most people found it very difficult to understand why we used Git in the first place.
Yes! Git is not merely version control but a specialized sort of version control. I've run into the opposite problem, when I've tried to use Git for projects that weren't source code.
Without thinking about it, just by reflex, I'd create a project directory for a new project: Web pages that were almost entirely English text, or a Photoshop project, or an e-book project...and init my git to track it and set up my server to push it to.
Before long, usually while trying to come up with a good commit message, I'd wake up and ask myself, "Wait, why am I tracking this? Am I just assuming it's the responsible, proper thing to do?"
It's not enough that the project be something that iteratively improves. Git's usefulness comes mainly from text data with extremely demanding constraints (hard to get it to work, easy to break, harder to repair than to start again, working/broken is not just a matter of taste), especially when you have multiple contributors, each of whom is more likely to break something than to fix it.
If you have non-text projects (ex: photo editing) or text that can usually be "fixed" by just pushing forward rather than starting again, Git still has benefits, but they may not be worth the costs.
You don't really see the value of Git unless you're writing code that you're a little scared to write.
I have to say I don't agree. I find git useful every time I do work that is even just a bit exploratory - which is almost all of it. E.g. doing a mockup in Inkscape - I'll commit, then create a new branch and try out some idea. If I'm not satisfied, I'll just checkout master and keep working on the previous version.
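For anyone who hasn't tried that workflow, here's a minimal sketch (the file and branch names are just made up for illustration):

    # save the current state of the mockup
    git add mockup.svg
    git commit -m "baseline mockup"

    # try out an idea on a throwaway branch
    git checkout -b experiment
    # ...edit in Inkscape, then commit the attempt...
    git commit -am "try a different layout"

    # not happy? go back to where you were
    git checkout master

    # happy after all? bring the experiment in
    git merge experiment

The nice part is that the abandoned attempts are still sitting on their branch if you ever change your mind.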
In fact, I think this is common behavior; almost every PC I've seen has a poor man's version control implemented by its user copying the file and renaming it (e.g. Report_1.doc, Report_2.doc, Report_2_valid.doc, etc.), despite none of those users being programmers or working with code.
But that's my point. I'm not saying that other things don't need version control at all, but that many don't need the specialized, heavyweight features Git provides. For many things, the lightweight "poor man's version control" of saving named copies is fine, so someone doing only that kind of work will find it puzzling why anyone would go to the trouble of learning something as complex as Git when a much easier solution is available.
It's when doing a specialized kind of work with more demanding constraints that you see the benefits that justify the cost of learning and using Git. If you never do that specialized work, the specialized VC of Git might not be worth it. "Poor man's" VC might make more sense.
Git works really well when you are editing LaTeX documents in groups. The merge features usually work pretty well, even when people have been changing the same document at the same time. You can find and check what other people have been doing. You can track updates. There is a central repository. It tracks embedded images too. In total, I would keep doing it, but the group in question is technical and can use basic git without problems.
edit: And you can undo when you find you need something you deleted again, which happens once in a while too.
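In case it's useful to anyone, a rough sketch of that kind of recovery (the file path is invented for the example), assuming the deleted material was committed at some point:

    # find the commits that touched the (now deleted) file
    git log --oneline -- chapters/results.tex

    # restore it from the commit just before the one that deleted it
    git checkout <deleting-commit>~1 -- chapters/results.tex
    git commit -m "restore results chapter"

git log with a path still works after the file is gone, which is exactly what makes this recoverable.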
I couldn't get my team mates to use it, but I did on my end, and it was a godsend. Other teams were merging stuff by hand and losing hours (in total) in the process.
Doing computer science rather than software engineering, I didn't have many. Most group assignments involved us sitting together at a computer, basically doing pair programming.
I've had multiple rising seniors as interns from top-tier universities who knew nothing about git. In one memorable conversation over lunch, I asked one of them how he got by without git; he used Google Sheets with classmates, and they also shared a Dropbox where only one person could work at a time. At first I was dumbfounded that they had no instruction about it, but I realized it's so dependent on your university and your coursework.
Agreed! Version control is not a natural concept that a beginning student would immediately think of. It also helps that a beginning student's programming assignments are typically simple and short enough that doing without VC is tolerable.
They are also typically working alone (or are supposed to be!) in the early classes. Team projects tend to come later.
If you're working by yourself on the sorts of programs that are in introductory classes, you don't lose a lot without VC, and skipping it greatly simplifies things and lets students focus on the actual exercises/assignments.
Git would be horrendous overkill for a freshman programming class and it has so many footgun possibilities that the TAs would be going nuts helping students recover from various disasters.
Did anyone read this book? Any thoughts about it? I skimmed it and it seems good, but I haven't read it and then applied the material. How useful is the knowledge in a practical context?
Also, I would be curious to know if there is a 2017/18 version of it.
I think it helps you gain a quick perspective on what's actually out there, especially in case you missed something in your curriculum. I personally found the Java EE chapter a better introduction than the official starting documentation from Oracle, with fewer buzzwords and more actual information. The Object orientation chapter feels a little out of place to me, though; hardly anyone with a degree in IT would be unfamiliar with it. Overall, not reference material but a good intro, although it's somewhat outdated and the fact that it was put together from individual essays leaks through.
- The very bottom of the 2012 PDF page [1] mentions the related doc generation project, MarkBind [2]
- The MarkBind site mentions two software engineering courses whose materials were written using it.
- One of these is TE3201 [3], which links to the current version of the book [4].
[1] http://www.comp.nus.edu.sg/~seer/book/2e/
[2] https://markbind.github.io/markbind/
[3] https://nus-te3201.github.io/website/admin/index.html
[4] https://se-edu.github.io/se-book/