I graduated from Lambda School and now work in a software position. I don't encourage people to go to... BloomTech, they're calling it now. AMA.
I was earlier in the life of Lambda. I got what I needed out of it, but I watched a lot of people who didn't.
Most of the people I saw succeed were the ones doing, like... extra stuff outside of the classes. They were taking what they learned and applying it. Others organized groups of people and essentially formed a small support network of people in a similar position. I completely understand that not everyone has the time for that.
I think two things happened over Lambda's life span that affected the quality of the teaching. The first was the initial pool they advertised to. I think I saw the post on Hacker News before everywhere else. I wonder if, by being more selective and drawing from already technical people at the beginning and then sorta widening the net as time went on to include more people, you could get an initial big numbers push followed by more OK numbers.
The instructors I met were all great. I think a lot of that flak went more towards later cohorts or alternative programs outside the main full-stack web track.
Did Lambda School have any sort of assessment before accepting students? This really resonates with my observations. The people who got the most out of software bootcamps were those who had tinkered with video game modding, WordPress, etc., and had done some self-taught coding before going to a bootcamp. In other words, people who knew they liked hacking on tech and had a basic ability to code. They approached the bootcamp with the mindset that they already knew they had technical interest and ability, and wanted direction on building marketable skills.
Do you think greater selectivity with applicants would increase the success rate? I had thought the ISA model would incentivize this, but when I learned Lambda was selling ISAs, that seemed like it removed the incentive.
This is generally true but not always. As you can imagine we have loads of data on this.
If you only selected students who had been tinkering with code for 10 years, you'd certainly be successful in doing so, but you'd also eliminate ~90% of those we have seen become software engineers.
The only way we've found that does it well is to have people actually start writing code and see if they enjoy it. That's why we now have multiple free classes, have a free dropout period once you're in the school, and even have a three-week free trial of the school itself.
The notion that financing ISAs removed the incentives isn't really accurate.
First, most of the time ISAs are financed it's in the form of a loan you have to pay back with interest, with the ISA and its repayments as collateral, or it's a sale to a neutral SPV with recourse in case repayments don't hit a certain threshold.
In the rare instances where schools have been able to sell ISAs outright (we've never done that), it's been at extreme discounts or based on discounted predicted likelihoods of future revenue, and if those ISAs don't repay, the buyers bail and the school trying to sell them is out of business.
Edit: It's too late to edit my comment, but I noticed an error: we have sold ISAs with minimal recourse, not at enrollment, but at the point of _graduation_; we would sell half at an extreme discount at graduation (based on likelihood of being hired) and keep half on our books.
They had a pre-bootcamp with a free course that taught the basics. Some people cheated through that, though, and didn't quite process that... you can't really pull that off through the whole program?
I totally saw people who had never coded before succeed. I'm not sure I could pick out in an interview who would do well and who wouldn't. It's a marathon, not a sprint.
For the selectivity thing... I'm really not sure. The selling-ISAs thing was a bit weird when I learned about it. I honestly kinda wonder about the actuarial calcs of it all.
Honestly, I was a little surprised at the cheating too. I think I was a little naive in the beginning, assuming that if you actually wanted a job you would understand that you would have to be able to write code. But some folks are in a school mentality where, if you get a grade/diploma, you're good, regardless of whether you understand the material behind it.
Having tried a number of different ways to do admissions, I can assure you doing interviews is possibly the worst.
As far as ISAs go, it all comes out in the wash. If you create a pool of ISAs and students don't get hired you may have more ISAs but the average ISA is worth less, so the only thing that matters is whether each individual student gets hired. There's no financial wizardry that can let you sell $1 for $2 in the long-run.
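To make the "comes out in the wash" point concrete, here's a toy expected-value sketch (all numbers hypothetical; real ISAs have payment caps, income thresholds, and terms this ignores):

```python
# Toy model: each ISA pays a fixed amount only if the student is hired,
# so a pool's expected value is (students) x (hire rate) x (payout).
# Adding unhired students grows the ISA count but not the pool's value.
def pool_value(num_students, hire_rate, payout_if_hired):
    return num_students * hire_rate * payout_if_hired

# A bigger pool with a worse hire rate isn't worth more:
assert pool_value(100, 0.5, 30_000) == pool_value(50, 1.0, 30_000)
```

Under this simplification, the only lever that moves the pool's value is the hire rate times the headcount, which is the point: there's no structuring trick that makes unhired students' ISAs worth anything.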
Yeah, that's fair enough. I don't think the people cheating ever got far enough to actually affect job stats. It was pretty easy to suss out who actually knew enough to keep going. I don't fault Lambda for that; it's just the reality of any educational goal line.
For the interview thing, that's just what you were doing at the time. At the rate you were iterating, I'm sure that there's a better process now.
Sure, not a problem. Most programmers do, probably.
However, I've seen students use Stack Overflow and other resources as a source of "program stamps":
- They enter their problem in an Internet search engine.
- Click on the first result with code in it.
- "Stamp it" on their own code: I.e., copy/paste it.
- Using the editor/compiler, find any issues by trial and error and fix them until the editor/compiler doesn't complain anymore.
They might try a simple example or two to see if it works. And that's it.
There's not much reflection on their final solution, or on the bits they copy/paste. They don't seem to understand their solution, nor care about that or their problem solving process. They put in effort, they expect a passing grade.
> Most of the people I saw succeed were the ones doing, like... extra stuff outside of the classes. They were taking what they learned and applying it.
Is this a knock on Lambda School, though?
You can go to MIT but if you never apply your learnings you won't be a good engineer.
Going through the motions of a CS degree is probably sufficient to land a decent job if you're getting good grades, not cheating, and you get an internship (or similar).
At least from what I've heard of bootcamps, you need to really go above and beyond to have similar chances with that route.
Not ironically at all! If someone expects to attend programming classes (or any classes, for that matter), not do any extra work, and as a result become a programmer, that someone will have a rude awakening. Or, as one of my university professors said, "Attempting to learn this subject by just listening to these lectures is equivalent to attempting to become a gymnast by watching the Olympics on TV."
If you go on the HN Algolia, there are a lot of posts from students and people involved with Lambda over the years that might paint a clearer picture of what kind of dysfunction was taking place at the company.
I don't think the assertion that "lambda grads weren't able to get good jobs" is true.
Obviously not every single student has been hired, but our hiring rates have always been pretty good (they're better now than they were in the past), and thousands of BloomTech (we had to change our name because of a trademark lawsuit) grads have increased their lifetime earnings by billions of dollars, and work at nearly every major company you can think of.
You can see our 2021 audited outcomes report here with all of the data https://www.bloomtech.com/reports/outcomes-report. (Note: 2021 outcomes report is very recent as you have to get students graduated, give them time to get placed, etc.)
Some of it I'm thrilled about, some of it shows us where we have more work to do (or need to do better in admissions - candidly, it's always a difficult balance between giving folks a chance and certainty of those folks' outcomes.)
High level:
90% of those who are job seeking got hired.
Our median hired grad increased their income by $27,500 (and that's just their first job - obviously software/data science salaries shoot up quickly after a first job).
About half of our students have degrees, and half do not.
This comment got me interested, so I dived deeper into the report [0].
Learners are divided into three groups: graduated (59%), still enrolled (5%), and withdrawn (36%). Graduated learners are further divided into two groups: job seeking (63%: ~37% of all learners) and non-job seeking (37%: ~22% of all learners). Here's the definition of “non-job seeking”:
“A BloomTech graduate who has been unresponsive to outreach, has explicitly indicated they are not pursuing a technical role, or has explicitly indicated they have paused their job search.”
When we apply those base rates to the 90% figure, we conclude that about 33% of everyone who attended the program (all learners) got hired.
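The funnel arithmetic behind that ~33% figure can be checked directly (percentages as quoted from the report; rounding is approximate):

```python
# Multiply out the report's funnel: learners -> graduated -> job seeking -> hired.
graduated = 0.59    # share of all learners who graduated
job_seeking = 0.63  # share of graduates classified as job seeking
hired = 0.90        # reported hire rate among job-seeking graduates

overall = graduated * job_seeking * hired
print(f"{overall:.0%} of all learners hired")  # prints "33% of all learners hired"
```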
I see, that's... extremely unimpressive, especially if you factor in the median salary increase from above. And I think maybe we should take everything else this guy is saying as potentially dishonest. Not including people who stopped trying to get a job is just an absurd way to do this calculation. Imagine a clinical trial that just ignores everyone who discontinues due to adverse events. These stats seem borderline predatory.
We should do a better job of getting more granular on that piece, because it really does matter, but the above isn't the right way to do that math to answer the question prospective students have, and is misleading in the opposite direction. The outcomes report is directed at prospective students who want to understand what will happen to them if they attend the school and look for a job.
You have to remember that (for this outcomes report) nearly every student uses an ISA under which no one is required to pay us unless/until they get a job using the skills they learned. There are a number of people who attend never intending to switch careers, a (large) number who ghost us the day after graduation, and a (large) number who get a job but don't tell us until we get tax returns (so we learn they were hired only after this outcomes report).
Our team works their asses off to work with these students, and is doing everything they possibly can. Slacks, calls, texts, emails, some of which are auto-generated from me personally, and in some cases even physical mail, to try to get them to work with us. If they respond _in any way at all_ with anything other than something that equates to, "I don't want a tech job" they are job-seeking in the outcomes report. We have built tooling to make applying to jobs easier, we find jobs that you should apply to for you, have an outreach generator where our team will write emails to hiring managers for you, and more recently even what we call "job search takeover" where we work with students on resume/portfolio/job criteria in advance, and we will actually do all of the work to fill up your calendar with interviews.
Students who look for a job in any way whatsoever get hired at a very high rate. In my view, if you're a prospective student, that's the information you actually want to understand. The fact that there are a number of students (most of whom are using ISAs) who never intend to look for a job or don't look for a job is a fair indictment of our business model, but not a fair indictment of the quality of the school or the likelihood of getting hired.
So how should we treat that in an outcomes report? If you're a prospective learner do you want to know about the hiring rate of the people who ghost us or don't intend to look for a job, or do you want to know the hiring rate of people who map to the profile of what you expect to do?
If anyone has ideas of a better way to slice that data to convey the best information to a prospective learner, I would love to hear it.
I know someone who went through the Lambda/BloomTech bootcamp. Although they weren't able to transition to a new role, the experience helped them start to code in their job at the time. And IIRC, because they didn't land a SWE role, they didn't have to pay Lambda under the ISA.
Where _would_ you encourage people to go for similar purpose with higher quality? Are there any particular bootcamps/online schools you think are good?
Some of my friends and family are eager (and IMO perfectly capable) to learn CS and enter the Software Engineering field, so I'm curious what would be a good way for them to do so (without spending 4 years on college again).
I do not have this knowledge. There used to be a site that categorized boot camps by results. Maybe try that?
My general advice is to get someone to try https://www.theodinproject.com/ as it's free. If they try it for a bit and find they don't like it, no harm; they can just stop. If they want to continue after a bit and still want to go to a boot camp, they're more likely to succeed at any of them.
It's certainly true that BloomTech isn't for everyone, and that not everyone will want to become a software engineer. We try to filter for that, but it's tough.
A lot of people like the idea of making software engineering salaries, but don't particularly like building software. Then there are others that fall in love. We haven't found a great way to predict other than having people try it out.
> There used to be a site that categorized boot camps by results. Maybe try that?
I think you're talking about either the Council on Integrity in Results Reporting (CIRR) [1], or Course Report [2] (both still up). CIRR is a body run by the code school industry itself to monitor its own results; Course Report has over 50k reviews by students as well as articles with tips about how to pick a code school, "top x code schools for y" lists, etc.
CIRR seems to be reasonably rigorous and honest. Their reports are easily available on the site. I've poked around in their reports, and there's a huge range of results, from less than 50% employment at 6 months to 80%+. There seems to be little to no correlation between the reputation of a given school and the actual outcomes (some of the most reputable schools had employment rates of 50%-60% at 6 months).
A big trend I noticed was that the schools with the highest employment rates were relatively low-profile schools teaching unsexy technologies that are low in SV buzz but nonetheless have high demand, like Java and C#.
I love the idea of CIRR, but it is largely a failed institution. Their measures have changed dramatically over the years (the last CIRR event anyone at BloomTech attended resulted in the notion that anyone who adds anything new on LinkedIn could be considered "hired," even if it was a portfolio project or self-employment), and are applied very differently from school to school, resulting in every major school I know of ceasing to work with them.
For example, we used them for our first outcomes report and paid extra to have them "verify" our outcomes report, but they literally never opened the Google Drive file we sent them.
I think it was a great idea set up by well-meaning people, but the self-governing aspect and the comparisons it created led to weird incentives that resulted in it falling apart.
The review sites are perhaps marginally better, but the positivity of reviews is almost 100% correlated with how hard schools work to farm positive reviews, and their business model is selling leads to the schools, so the incentive isn't for objectivity there either.
Honestly the best way, though it requires more work, is to find a handful of recent grads on LinkedIn and ask them about their experience.
Ah, OK. Now I wish it wasn't too late to edit my original post :)
The stats on the CIRR site across schools did always seem a little... odd to me, with differences in outcomes too big to believe at times. Sounds like I would have found the same thing if I looked at any individual school over time, as the rules and practices changed.
Thank you for pulling these up. I am in fact talking about both of these (admittedly, in my head I had mixed them into one site). I know at one point Course Report was flooded by Lambda students, as they were encouraged to leave positive reviews.
Definitely App Academy Open. App Academy is one of the original bootcamps and they have a solid curriculum. They have their entire course for free online.
Not OP, but I did go to a bootcamp but didn't get much from it. I learned a lot more from Udemy courses.
I would recommend that if your friends are serious about getting into coding, ask them to take a few React basics courses (just go by the most popular ones on Udemy).
I found that overloading beginners with theory doesn't really work. Getting them to build something and figure out the why of it works better - at least that was my experience. I learned about, say, JSON Web Tokens (JWTs) way before I used them, but it wasn't until I built an app of my own with an account authentication feature that I figured out what JWTs actually were.
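For anyone in the same boat: a JWT is just two base64url-encoded JSON segments plus an HMAC signature over them. A minimal HS256 sketch (illustration only; in a real app use a maintained library such as PyJWT, and handle expiry and claims validation):

```python
import base64, hashlib, hmac, json

def b64url(data: bytes) -> str:
    # base64url without padding, per the JWT spec
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_jwt(payload: dict, secret: bytes) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify_jwt(token: str, secret: bytes) -> bool:
    header, body, sig = token.split(".")
    signing_input = f"{header}.{body}".encode()
    expected = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return hmac.compare_digest(sig, expected)

token = make_jwt({"sub": "user-123"}, b"server-secret")
assert verify_jwt(token, b"server-secret")
assert not verify_jwt(token, b"wrong-secret")
```

The payload isn't encrypted, only signed; the server trusts the claims because only it knows the secret, which is the part that tends to click once you've built the auth flow yourself.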
With all the recent layoffs, this might not be the best time to do a coding bootcamp. Not as many companies are hiring, and you'll be competing with people with better credentials and more experience.
A lot can also change in a year, so take this with a grain of salt.
It's a self-paced online curriculum that generally takes 1.5 - 2 years to get through, with an optional 4-month "Capstone" intensive after graduating the core curriculum. Progressing out of each course in the curriculum involves passing rigorous, easily failable assessments; you just can't progress until you've cleared a pretty high bar of knowledge in each domain. There is a good mix of live interview assessments, written assessments, and coding projects.
The job placement statistics are staggering in comparison to the 'quicker' options -- as they should be given the time investment involved. I'm unaware of any institution that has had more success in placing students in high-quality software development positions.
Disclosure: I've been studying for about 14 months now using the Launch School curriculum.
The thing about self-paced programs is that a lot of people will not do anything unless they're poked/pressured into making progress (e.g., via deadlines). Now, you shouldn't have to be held back by non-self-motivated people, but of course the success rate will be higher because it filters out those people.
I was doing help desk, but I was pretty plugged into tech news. I was on Hacker News and saw the initial post here by Austin himself. It seemed like a good idea and much better than what I was doing at the time. I was frugal and had enough saved up to go a year at it. At the time I knew YC was involved, and I think it did lend an air of legitimacy to a super new concept.
Honestly, not much. I'm one of the success stories. I think I was just early enough to have small classes, and I knew enough basic HTML and programming concepts from doing IT work to have the bootcamp work for me. I was also incredibly lucky that the company that hired me had a program where they took on essentially fresh college grads and boot camp grads and funneled them into junior positions with a project before you were assigned a team. It took me a lot of interviews before I actually got a job.
1. Scale. When you're self-taught you can make decisions that only really impact yourself, or maybe 3-4 other people for some of the projects. The company that I walked into has two monoliths (front end and back end), and working in a codebase that's 5 gigs of files is a lot different. There are so many files. My editor is now slow. There's no way for me to understand the details of the entire codebase. Badly documented internal libraries are all over the place. This isn't what a bootcamp or self-made codebase normally looks like for a beginner.
2. Specific practical stuff. I had been coding JS mostly on Linux. I had to learn PHP on macOS. I had never used either. And there were a number of other tools connected to the build process that are internal to this company.
By not dropping you straight into a team, it also let you control where you landed a bit. People who didn't like UI stuff ended up on a deeper backend team; people who preferred UI ended up more towards the front end. Almost everyone in the group I came with (17 people) had to learn PHP. When you have 16 other people learning the same stuff at the same time, you start to work together and organize into groups. We began producing our own docs for the project and other helpful things.