> There used to be a site that categorized boot camps by results. Maybe try that?
I think you're talking about either the Council on Integrity in Results Reporting (CIRR) [1], or Course Report [2] (both still up). CIRR is a body run by the code school industry itself to monitor its own results; Course Report has over 50k reviews by students as well as articles with tips about how to pick a code school, "top x code schools for y" lists, etc.
CIRR seems to be reasonably rigorous and honest. Their reports are easily available on the site. I've poked around in their reports, and there's a huge range of results, from less than 50% employment at 6 months to 80%+. There seems to be little to no correlation between the reputation of a given school and the actual outcomes (some of the most reputable schools had employment rates of 50%-60% at 6 months).
A big trend I noticed was that the schools with the highest employment rates were relatively low-profile schools teaching unsexy technologies that are low in SV buzz but nonetheless have high demand, like Java and C#.
I love the idea of CIRR, but it is largely a failed institution. Their measures have changed dramatically over the years (the last CIRR event anyone at BloomTech attended resulted in the notion that anyone who adds anything new on LinkedIn could be counted as "hired," even if it was a portfolio project or self-employment), and they are applied very differently from school to school. As a result, every major school I know of has stopped working with them.
For example, we used them for our first outcomes report and paid extra to have them "verify" our outcomes report, but they literally never opened the Google Drive file we sent them.
I think it was a great idea set up by well-meaning people, but the self-governing aspect and the comparisons it invited created weird incentives that caused it to fall apart.
The review sites are perhaps marginally better, but the positivity of reviews is almost 100% correlated with how hard schools work to farm positive reviews, and their business model is selling leads to the schools, so the incentives aren't aligned with objectivity there either.
Honestly, the best way, though it requires more work, is to find a handful of recent grads on LinkedIn and ask them about their experience.
Ah, OK. Now I wish it wasn't too late to edit my original post :)
The stats on the CIRR site across schools did always seem a little... odd to me, with differences in outcomes too big to believe at times. Sounds like I would have found the same thing if I looked at any individual school over time, as the rules and practices changed.
Thank you for pulling these up. I am in fact talking about both of these (admittedly, in my head I had mixed them into one site). I know at one point Course Report was flooded by Lambda students who were encouraged to leave positive reviews.
[1] https://cirr.org/
[2] https://www.coursereport.com/