Vermont Medical School Says Goodbye to Lectures (npr.org)
109 points by happy-go-lucky on Aug 4, 2017 | 66 comments



Another one bites the dust. This is a trend/fad that will likely be reversed once properly studied.

My wife went to a medical school that had recently adopted so-called "active learning" techniques. All the students secretly hated them, but wouldn't share this feedback with higher-ups (academic administrators) because med students, as a group, are generally rule-following, adaptable students.

The "non-lecture" format was frequently derided by students, in private settings, as "Dr. Wikipedia". That is, they were learning medicine by using Google and visiting Wikipedia pages. Because they had to collaborate on their own learning materials, even though they had no background in medicine.

About 50% of students I knew (secretly) spent thousands of dollars on recorded medical lectures to supplement the completely ineffective class "group activities", and thus did well on the tests (and USMLEs) anyway. They never reported this to professors or administrators. Again, they were "working around" the broken "active learning" system. One of the hilarious side effects of this: the supposedly "objective peer-reviewed studies" of this learning style won't show score/testing degradation -- because the students are so self-sufficient that they work around the failing format with supplemental lecture materials purchased online! Thus, they still achieve the grades they hoped for, and the administrators think the system is "working"!

There was, however, a very small subset of students who preferred the "active learning" format. Most of the time, these were students who came into med school with a stronger science background than others (e.g. had basically started studying medicine before they got there). One reason they particularly enjoyed the format: they could skip class with little effect on their grades.


You should know that "what students like" and "what constitutes effective instruction" can be very different.

https://fnoschese.wordpress.com/2011/03/17/khan-academy-and-...

The simplest description is this: Consider two videos. One is a clear lecture (e.g. Khan Academy style), while the other is a dialogue between a character who espouses a common misconception and a teacher who points out their mistakes. After watching the videos, you have students evaluate how well they think they know the material and how they feel about the video, and then you test them on how well they learned the concept covered by the video.

The lecture video had the following results:
- highly rated for video clarity
- increased confidence in knowledge
- zero change in knowledge (as tested)

That is to say, students liked the video and thought they learned something, but didn't actually learn anything.

The dialogue video, on the other hand, was almost the opposite:
- the video was considered confusing (not at all clear)
- decreased confidence in knowledge
- large positive change in knowledge (as tested)

Students, it would seem, don't know enough to be able to evaluate what is good instruction. (cf Dunning & Kruger)


I think you are right. But I haven't seen anywhere near a consensus that active learning constitutes an improvement. The reason large academic institutions are moving in this direction is because they are threatened by MOOCs and because administrators are often trained at business schools, which love this learning style, mainly because most of the content in business school is beside the point anyway.


I keep a notebook with a list of references on pedagogy, including active learning:

https://www.refsmmat.com/notebooks/pedagogy.html

The Peer Instruction section features a great deal of research in introductory physics courses, where well-implemented active learning methods can double or triple student learning of core concepts. (I also blogged about this research long ago: https://www.refsmmat.com/posts/2012-10-19-shutup.html)

Students in intro physics classes tend to memorize the formulas but don't have any conceptual understanding, and active learning seems to dramatically improve the conceptual understanding without harming the ability to solve typical calculate-the-force problems.

Happy to discuss or send PDFs if you're not able to obtain them. My email is in my profile.


The physics courses that I have been involved in, both as a student and in an instructional capacity, used several modes of instruction. Active learning was one, but lectures were certainly another.

The point of lectures was to lay the groundwork for the active learning component. Without that foundation, the students would have been forced to find other means to acquire background knowledge.


Like reading a book or watching a video? The point of a lot of "flip the classroom" strategies (roughly a superset of the "active learning" ones) is precisely that there are many ways other than lectures to acquire background knowledge. That's certainly how we pick things up if we aren't in a classroom. So why waste classroom time on lectures if you can find other means of acquiring background knowledge?

tl;dr:

> Without that foundation, the students would have been forced to find other means to acquire background knowledge.

You say that like it's a bad thing.


Yes, like reading a book or watching a video.

While those are fine modes for learning, lectures do have the benefit of being a social mode of learning. Even in the extreme case of a lecturer who is playing the role of a sage on the stage, students still have access to other students. In less extreme cases, students still have access to the lecturer and the lecturer has the opportunity to include different instructional strategies. In other words, the benefits and drawbacks of lectures largely depend upon the attitudes of the instructor and students.


I'm going to have to disagree. Here's my default article I turn to on the matter - http://www.npr.org/2012/01/01/144550920/physicists-seek-to-l... .

The problem, up until the advent of MOOCs, was that there was no incentive for higher education to improve. Prices go up every year, more instruction is dumped on post-docs and adjuncts instead of professors, but the number of people going to college continues to increase. Why improve your product if people keep buying it regardless?

And this idea of active learning is nothing new. Maria Montessori did extensive research on effective teaching methods more than a hundred years ago and moved more and more heavily to small projects and physical interaction as the most effective means of instruction.

http://www.infomontessori.com/sensorial/visual-sense-binomia...

I happen to like the binomial cube as an example, because it takes a confusing, "just memorize FOIL" type of concept and turns it into something you can directly interact with to develop a much clearer understanding.
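(For reference, and this is my own aside rather than anything from the linked page: the algebra the cube makes tangible is

    (a + b)^3 = a^3 + 3a^2 b + 3ab^2 + b^3

and its eight wooden blocks are exactly those terms: one a^3 cube, three a^2 b prisms, three a b^2 prisms, and one b^3 cube.)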

And of course, there are things like this: https://xkcd.com/1356/

Or the explorable explanation that was on the front page the other day about game theory and trust: http://ncase.me/trust/

Or Bret Victor's classic Up and Down the Ladder of Abstraction: http://worrydream.com/LadderOfAbstraction/

The ability to interact with a model blows traditional educational methods out of the water as far as gaining serious understanding (imo), and I expect this type of instruction to only grow in importance as time goes on.


> ability to interact with a model blows traditional educational methods out of the water as far as gaining serious understanding

One nice thing about VR/AR for education is that different media have different "it's natural" styles of use, and for XR, that's hands-on physical manipulation of models.


Both of those are lectures, though. One is just presented in an interesting format. It reminds me of studies on the efficacy of highlighting your textbook as you read it: it only works when you're just starting out, but once the novelty wears off, so does its effectiveness.


This is definitely a trend, but it is one I do not suspect will change any time soon. Just to put things in perspective for people who have not experienced medical school:

> 50% of students I knew (secretly) spent thousands of dollars on recorded medical lectures

While this may indeed be true for the students you know, this is also a common practice in general. The pay-to-play nature of board exam preparation has existed much longer than the active-learning trend. There's no way to know whether these students would have purchased additional materials in another educational system.

> thus did well on the tests (and MCATs) anyway.

You seem to be conflating a couple of things. The MCAT (the Medical College Admission Test) is taken before medical school. There, just as for later examinations, there has been a proliferation of prep materials, for which students pay dearly. This is in no way related to curriculum change in medical schools.

Again, you are correct that students who pay more do better.

> "objective peer-reviewed studies" of this learning style won't show score/testing degradation -- because the students are so self-sufficient that they work around the failing format with supplemental lecture materials purchased online

I understand the sentiment, but if students do better after the adoption of this new system, aren't they learning better? Eventually, medical students move from Wikipedia to more authoritative sources like UpToDate or primary literature. If they, themselves, are motivated to expand their knowledge - and as a result do better on tests - this intervention would have been beneficial.

I think you're implying that the huge industry behind medical test preparation materials is new, but this is decidedly not the case.


I enjoyed your comment. To respond to a couple points:

> MCATs are taken before medical school

Sorry, that was just a brain malfunction on my part. I meant to write "USMLEs" (residency admission exam for match process). Corrected my comment to state that.

> "but if students do better after the adoption of this new system, aren't they learning better?"

They are not doing better under the system. They are doing the same, or marginally worse. But I believe the students are more stressed, and getting less value per student debt dollar, due to the "active learning" trend. They are also less prepared for the USMLE and less prepared for residency than they would otherwise be. But they are substituting personal effort for their loss of solid foundational instruction.

A really good study would give students half their topics in lecture style and half in active learning style, and measure, per topic, the time spent on "supplemental" learning outside of class.


> This is a trend/fad that will likely be reversed once properly studied.

Err, no. It's quite clear that traditional lectures work poorly, even when they are well done and well liked. And that active learning, when well done, is unambiguously better. However, it's also clear that doing active learning well takes a lot of instructor training, preparation, and practice. So deploying it is a known problem, with its own research.

There's also some work on hybrids: lectures with a sort of "sing along" question-and-response, with effectiveness in between, but easier to train for and deploy (at least for big introductory classes).

Which isn't to say that I don't facepalm and think "we need to be doing so much better than this", even when watching active learning at MIT, with its years of instructor and institutional experience across several departments.

And even as VR/AR transforms education, something like active learning seems likely to remain a key component. After all, good ways to learn something include practicing it, discussing it, and teaching it.


"traditional lectures work poorly, even when they are well done and well liked"

Just reflect for a moment on this general statement. You are saying traditional lectures -- a format used for millennia in education of every professional and technical class of society across multiple continents -- "work poorly".

And, we are meant to believe that "active learning" -- a new teaching style that throws away the lecture entirely and replaces it with group participation from students -- is, on the average, better.

Now imagine if I said this... "deep reading of books written by leaders in their field/discipline, even when those books are well-liked, works poorly."

How credible a claim could someone then make that "active reading" -- a new teaching style where we eschew books altogether and instead write and read material generated by students themselves -- is, on the average, better? Especially if in the classrooms where my hypothetical "active reading" is deployed, the students secretly buy well-written books, read them, and pass the tests just as well?


Why are you assuming that today's traditional lectures are exactly like the lectures that have been held thousands of years ago?

Most disciplines have never needed lots of theoretical preparation before someone could be put to work, so lectures weren't even necessary. The master could just show his apprentice how he does something, and then let him practice for a few hours.

The more theoretical fields probably started with small groups of wealthy people, which allowed much more direct interaction with the teacher. And what I have read from ancient Greek philosophers almost always involves some kind of dialogue between teacher and students. I can't really imagine aristocrats quietly listening to a lecturer anyway.

The first lectures to a large audience in an amphitheater were probably public dissections, and even there the medical students had plenty of opportunity to do the cutting themselves, under the direction of the lecturer.

The current "traditional" style of non-interactive lectures is probably actually very recent; together with the emergence of mass-education.

Your "active reading" sounds very much like the descriptions I have read of writing circles in creative writing, where students read each other's work and discuss how to improve it. It seems to work fairly well for them. Of course everyone still reads other books, but that they can do alone, while "active reading" can only be done together.

What this means for active learning is that when classroom presence is used for interaction with other students and the teacher, while students who need lectures watch them on their own, that is no indication of failure but the system working as intended. (Except having to pay for lecture videos. Those should be free.)

The whole point of being in the same room is the possibility of interaction. If the lecturer is just going to rattle off their script while flipping through the slides, I would rather watch that on video where I can set 2x speed until the hard part starts.


Cambridge and Oxford use a mix of lectures and supervisions/tutorials - it's not a new thing: https://en.m.wikipedia.org/wiki/Tutorial_system


> format used for millennia in education of every professional and technical class of society across multiple continents

This is a very misleading summary. The most dominant format for millennia across continents etc. is apprenticeship and direct tutorial (sometimes called “coaching” or the like). Even today serious high-level professional training is done by such means.

Lecture is a low-quality substitute which people turn to when the number of students vastly outstrips the number of teachers, and the teachers can’t afford the time and attention to work directly with the students in small groups.

In universities, early introductory courses get stuck with lectures so that scholars can focus their direct instruction time on graduate students who are serious about becoming scholars themselves.


This is not a reflection of how the term "active learning" is used at my university. The students don't like it, but it's the following: the students watch videos of lectures, then during "lecture" time, they work on worksheets and problems provided by the instructors while the instructor and the TAs walk around and assist them. It's not students lecturing to each other. Maybe there are places where that happens and is called "active learning", but I don't think it's the standard.


I would characterize that experience, instead, as very, very poorly implemented. A properly implemented "active learning" environment simply puts the onus on the students to be proactive. Whether that's buying outside materials, or requesting a lecture from a professor, or simply meeting with a professor to clear up misunderstandings, the students are the ones who start that process. Of course, that's all reliant on the students being informed of this process from the outset.

There should be some kind of structure for the students to follow, and I imagine that if there are exams, there is one. There should be required textbooks as well, so that resources are consistent.

And, of course, there needs to be unbiased feedback to improve the system.

Also, even lecture-based students spend thousands of dollars on outside material. And doing so isn't a "secret". At least $300-400 is spent by each medical student for the arguably "required" board prep materials, completely disconnected from how they learned in the first two years.


You might be right. It's certainly possible there are "great" active learning programs out there, and my anecdotal experience is definitely just that -- anecdotal. (And even, second-hand.)

That said, I've personally experienced great teaching supplemented by great lectures, and I've not personally experienced great active learning.

I studied computer science and philosophy undergrad. I had great professors and not-so-great professors in each. The excellent ones had excellent lectures that I still think back on today, years later. Plus, as a typical programming auto-didact, I learn a ton from recorded lectures/talks on YouTube and similar channels. I wouldn't trade these resources for the world. I can remember a handful of classes that had more of a group/collaborative feel to them. I don't think I'd describe any of them, in retrospect, as "successful" classes.

I remember one, in particular -- an algorithms class -- where the professor had just done a long sabbatical in Japan and was trying to cargo-cult the group/collaborative learning techniques he had witnessed there, yielding hilarious (and definitely not effective) results. That was the one class where I had to pull an all-nighter at the library, cramming from Sedgewick's "Algorithms" textbook (which wasn't assigned by the professor, but which every student passed around as the right way to learn this stuff) and the odd Wikipedia page. All this effort to pass the final exam -- because I felt like I had spent an entire semester witnessing an educational experiment, rather than learning anything.

I imagine many people are in the same boat as me in terms of their experience levels here. Lots of experience with successful lectures, very little experience with successful group learning environments. Certainly, lots of experience with good and bad teaching all-around, where "style" doesn't seem to be the critical issue.

I am afraid too many of these institution-wide shifts to "active learning" are being made on little more than well-intentioned hopeful faith.


See, I've had the opposite experience. I can't remember anything from the vast majority of lectures - I either make notes, which means I'm missing material while I'm writing, or I don't take notes, and don't remember much of anything at all. Lectures are nothing but a huge waste of my time.

Now, recorded lectures, on the other hand, are superior. Combined with the ability to directly contact the lecturer, it would be an infinitely more successful method of learning for me.

Perhaps the way forward is the way my medical school does it - 3 different pathways: lecture-based, problem-based, and self-study. Let each student choose the path best for them.


I agree. Not to be too cute about it, but I think "multi-paradigm" education, rather than single-paradigm, is definitely the way to go, as we've learned even in programming. We all remember what happened when people suggested we write 100% of programs in an object-oriented style (Java, EJBs, etc.).

These days, we are much more mature about "best tool for the job" -- OO design for state/behavior binding APIs or data structures; FP for concurrency/parallel computation; simple imperative style for mission-critical code that needs to be read/reviewed by many, etc. (among other examples.)

I don't think it's controversial to think that in something as complex as human education, there may be multiple learning tools available to us. We should avail ourselves of each of them.

The recorded lecture, the live lecture, the class Q&A, the take-home assignment, the group study, the deep reading, the thoughtful essay, etc. Put it all on the table. Don't ban any one form out of some trend while declaring, "the lecture is dead!"

I also agree with you that it might be the case that by allowing some amount of tailoring/personalization per student (a la Khan), we could probably get very good average results, since students can smooth over some of the rough patches where the more dominant teaching style is failing them individually. For example, I personally found it much easier to learn Linear Algebra by working through my problems in Matlab. I noticed other students did just fine working out the math on pen and paper, but I needed the iterative and visual feedback to really grok what was going on. It feels "obvious" to me that different students learn differently -- and different topics demand different learning environments! Why are we trying to shoehorn every student and every topic into a single dominant style?
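(As a concrete aside on the Linear Algebra point: the kind of iterative, visual poking-around I mean looks roughly like this, sketched here in Python/NumPy rather than Matlab -- a toy example of my own, not from any course.)

    import numpy as np

    # Hypothetical toy: repeatedly apply a matrix to a random vector and watch
    # the direction converge to the dominant eigenvector (power iteration).
    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])

    v = np.random.rand(2)
    for step in range(10):
        v = A @ v
        v = v / np.linalg.norm(v)   # renormalize: only the direction matters
        print(step, v)              # watch it settle, step by step

    # Compare with the eigenvector NumPy computes directly.
    vals, vecs = np.linalg.eig(A)
    print("dominant eigenvector:", vecs[:, np.argmax(vals)])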


I went through an MD program with no lectures. It was horribly inefficient and boring. This experiment has been tried multiple times in various universities already. To my knowledge, it has always failed spectacularly, except in the minds of the curriculum policy makers. Also to my knowledge, this kind of curriculum has been reverted to a lecture-based system in all cases after some time. My personal take is that the "collaborative" methods are so much in contradiction with the ultra-competitive nature of medical school that it simply cannot work. In other words, it is a stupid thing to implement if you want to keep a selective school.


Why do you even need a med school with these types of students?


I'm currently a fourth year medical student, and based on my experience you don't need anything from the med school in the first two years. The goal of the first two years of med school is to perform well on the USMLE Step 1 exam. My classmates and I found that you can do extremely well on this exam by using 3rd party resources exclusively. The consensus across the community of med students is that UWorld (a question bank), First Aid (text book full of relevant Step 1 facts), and Pathoma (Khan Academy style video series made by a pathologist from the University of Chicago) are all you need to succeed. Sure, the first two years of med school could lay a solid foundation to help you study for Step 1 at the end, but that foundation isn't absolutely necessary.

I go to a medical school that heavily emphasizes small group based learning. We have lectures, but they are optional and poorly attended. Most students only attend the small group sessions 3 times per week for the first two years of med school (largely because those sessions are required). At first, the small group sessions seemed helpful since there's more of a chance to have a discussion about difficult concepts and topics. After a while, however, you get the feeling that the small groups are just the blind leading the blind. Often these groups are led by fourth year medical students (such as myself) that are not nearly experienced enough to provide adequate guidance to first and second year medical students. Usually, the main takeaway students get from these sessions is that they should read First Aid, do their question bank, and watch Pathoma. All of which every med student knows to do anyway.


To be completely honest, as someone in medical school right now, you definitely don't "need" the structure and materials that medical school lecturers and courses provide for those first two years. I, however, needed the structure and did not have the work ethic to do it on my own. Additionally, the next two years of clinical learning most definitely require you to be at the hospital.


A professor of medical education once said to us something along the lines of 'The half-life of the facts we tell you is about 6 years, so the actual content isn't necessarily that important. What is important is the language, and that you learn to understand what you hear and read after you graduate.'

5 years after graduation, I feel like I am only starting to truly appreciate that comment. I started medical training 11 years ago, and half of the stuff I read in books during first year was probably already known to be wrong.

The lectures we received were biased and incomplete, but they provided a largely up-to-date picture and, more importantly, a story.


The story definitely matters, I agree. But if you're good at making that story up as you read the book, why bother coming to class? I'd also argue that not everything is completely out of date; only things like cytology, genetics, and autoimmune disorders, from what I can gather. Pathology is largely the same, in that the manifestations don't really change. You can get by studying a First Aid board review book from a few years ago and still pass Step 1. Physiology doesn't really change, only treatments targeting the pathophysiology.

Medical Schools largely understand that everyone studies and learns differently and that's why they don't mandate you come to lectures. They're recorded for your own benefit as well. My professors would also provide several different learning methods throughout the preclinical courses to really hammer in those mainstay concepts regardless of how you learn. Be it quizzes, or team based learning, or straight lecture.


They need corpses to cut up. That's difficult outside a med school.

http://news.nationalgeographic.com/2016/07/body-donation-cad...


> "Well this method of teaching is actually not as good as other methods. Would you do that?"

A university did a trial. Lecture with active learning vs traditional lecture and sections. Simultaneous versions of the same class; random assignment of students; pre-written exams. Inexperienced postdoc vs highly-respected, well-rated, experienced prof. Mid-term results came in. The comment was that if it had been a medical trial, they would have had to stop immediately, rather than finish the term. It would have been clearly unethical to continue to deny the control group the intervention.


Sounds interesting, source?


Story told by Edward Prather (Arizona)[1] at MIT[2]. I failed to quickly find an obvious publication.

My recollection is the course was elsewhere, he was invited to lecture (astronomy education improvement is his thing), and said let's instead set it up as a trial, and use this student instead of me.

[1] https://profiles.arizona.edu/person/eprather [2] http://web.mit.edu/physics/events/colloquia.html May 11, 2017, but this link may rot soon.

EDIT: I should amend "inexperienced postdoc" with "who was trained in active learning instruction, and prepared the course with a highly-experienced astronomy education researcher". Because a key issue with active learning is that doing it well, and thus deploying it rapidly, is hard.


Do you mean traditional lecture performed poorly?


MD here - I remember leaving a lecture theatre after 8 hours of didactic lectures, and I could not remember the topic of any of the lectures other than the one I'd just seen, nor could I remember a single fact from any of them. I persisted with the lectures as that seemed to be what was expected of us, but it generally seemed like lectures are there to suit faculty rather than students.

When it came to actual hardcore study for board and specialty exams, I used 100% active recall testing with flashcards with variable repetition based on performance (Anki mainly). I never had any problems passing anything.
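(For the curious, the "variable repetition based on performance" idea is simple enough to sketch. This is a toy approximation in Python, not Anki's actual SM-2 algorithm:)

    from datetime import date, timedelta

    # Toy sketch of performance-based scheduling: grow the review interval
    # when a card is recalled, reset it when it isn't.
    def next_review(interval_days, ease, recalled):
        if not recalled:
            return 1, max(1.3, ease - 0.2)   # see it again tomorrow, lower the ease
        return max(1, round(interval_days * ease)), ease

    interval, ease = 1, 2.5
    for review, recalled in enumerate([True, True, False, True, True]):
        interval, ease = next_review(interval, ease, recalled)
        print(f"review {review}: recalled={recalled} -> "
              f"next due in {interval} days ({date.today() + timedelta(days=interval)})")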

I believe an excellent, underutilized way medicine could be taught would be computer-based simulation of emergency dept patient presentations, with the player taking a history, performing an investigation and labs, and making decisions +/- referring the patient as appropriate.


Minimum viable US medical school (in theory):

Prerequisites: High school AP bio/Chem/physics/stats. College biochemistry.

Standard interview process.

Preclinical: 1.5 years of online instruction and drilling (UsmleWORLD, firecracker, Pathoma etc) for USMLE 1. Cost ~$500 per year.

1.5-2 years clinical rotations. Cost: at current yearly tuition.

Then graduate, residency.


That's a reasonable approximation of PA school: publichealth.tufts.edu/Academics/Physician-Assistant-Program/Curriculum


Only when looked at from the craziness of medical school. Tuition: $42k/year (similar to most PA schools, though a few have better in-state pricing) http://publichealth.tufts.edu/Admissions/Financing-Your-Educ...

Also, "All academic coursework AND a bachelor's degree from an accredited university/college must be completed before submitting an application to the PA Program". A bachelor's degree is such a painful gatekeeper: four years of opportunity cost in addition to steep tuition. I'm not going to argue that undergrad is strictly vocational in purpose, but many students view it as such and thus get nothing further from it.


If only we lived in this world.


you mean the "dissect a cadaver" thing is not required?


Good! Lectures are terrible for students. It's fun and pretty easy to give a lecture though, which is why they persist.

This is a great example of a "flipped classroom", which is gaining traction across American higher-ed. A lot of research went into developing the SCALE-UP paradigm, which is a great place to start if you're curious:

https://en.wikipedia.org/wiki/SCALE-UP


I am glad I didn't have to study that way. I am notoriously bad at "collaborative learning" in team settings. I probably wouldn't have learned anything in school.


Agreed, learning teamwork and team problem solving is useful, but only after individual mastery of the skills in question first. There's no way to shortcut mastery, it takes individual effort, practice, and time. Do that first, then teamwork later with a group of similarly skilled people.

Also, different people have different individual learning styles (aural, visual, doing) and learning paces, so when you mix a bunch of folks with those differences, before all of them have mastered the material, some are going to get left behind and others held back. Ideally learning environments and methodologies account for and adjust for that.

To be fair, Vermont Med School's methodology in the article does appear to account for this, requiring some degree of individual mastery before group work:

>A lot of the science of pharmacokinetics is simply mathematical equations. If you have a lecture, it's simply presenting those equations and maybe giving examples of how they work.

>In an active learning setting, you expect the students to learn about the equations before they get there. And when you get into the classroom setting, the students work in groups solving pharmacokinetic problems. Cases are presented where the patient gets a drug in a certain dose at a certain time, and you're looking at the action of that over time and the concentration of the drug in the blood.

>So, those are the types of things where you're expecting the student to know the knowledge in order to use the knowledge. And then they don't forget it.
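For readers outside medicine, the math behind the kind of problem described above really is that simple. A minimal sketch, assuming a one-compartment model with first-order elimination (my simplification, not the school's actual exercise):

    import math

    # One-compartment, first-order elimination: C(t) = (dose / Vd) * exp(-k * t),
    # where Vd is the volume of distribution and k = ln(2) / half_life.
    def concentration(dose_mg, vd_liters, half_life_hr, t_hr):
        k = math.log(2) / half_life_hr
        return (dose_mg / vd_liters) * math.exp(-k * t_hr)

    # Hypothetical numbers: 500 mg IV dose, 40 L volume of distribution, 6 h half-life.
    for t in range(0, 25, 6):
        print(f"t = {t:2d} h: {concentration(500, 40, 6, t):.2f} mg/L")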


That's really interesting to me because I'm the opposite. I skipped many of my lectures in college and learned through the book or working with my peers. I didn't feel like going to lectures added much to my learning but they added a ton to burnout. Everyone is a little different I guess.


Collaborative learning means less individual problem solving. The moment there is someone a bit faster than you in the group, you have less occasion to think for yourself.


It's not even about someone being faster, but just different. That, together with some people being dominant, means I would probably contribute nothing to the group, because my approach is often different.


> It's fun and pretty easy to give a lecture though, which is why they persist.

o_O!?

Fun and easy? I like to think I'm pretty good at giving a lecture, and it takes me real work to give a good one. Especially when teaching to beginners.

And "active learning"--that's even MORE work. Teaching a single sophomore CS class with automated testing running on the assignments was about 50 hours per week of time.

In addition, effective active learning requires that the students have a lot more interaction with the teacher and other students. Now it's not a 3 hour lecture, it probably has to be more like 6 hours. And has to be split across multiple days because holding someone's attention for more than about 3 hours is simply not possible (and even that's difficult).

Lectures persist because active learning would cost a lot more.


It takes a lot of work the first time, and less work for each subsequent delivery.

As a performer, I enjoy standing in front of a group of people and commanding their attention. It's fun to make them laugh, after all. That said, lectures are pretty easy. Write a story, speak clearly, and write legibly. Congratulations, you did it. Active learning environments require more work from the instructor, but they actually work better.

Lectures persist for a few reasons:

Once you have a notebook of prepared lectures, provided primary compensation is not based on educational outcomes, most faculty will preferentially reuse the work they've already done.

Most faculty, being faculty, are among the small fraction of the population who could learn effectively from lectures. They are inclined to like lectures, and to see nothing wrong with them.

Most university classrooms were built to deliver lectures, not to foster active learning environments. Fixing this would cost more, but a lot of universities are building active learning classrooms because the results speak for themselves.


Don't forget another important point here: most lecturers at the post-secondary level are not teachers. That is, most of them have little to no formal training in how to educate others. It still kind of blows my mind that, to my knowledge, not one of my university instructors/professors, despite being subject matter experts, is credentialed to teach children - at any grade level - in a public school.


Not only that, but it is likely that they will have received no teaching training at all. When I was hired as lecturer I was just told these are the classes you need to teach and then left on my own to figure it out.


Teaching credentials do not affect teacher quality. Whether at the primary or secondary level, the best predictor is years of teaching experience, with measures of subject matter expertise (a Master's degree in the subject they're teaching, not education) and measures of general intelligence (IQ tests or close-enough proxies like the GRE or SAT) also having large positive effects. The additional effect of more experience ends at around six years.

Teaching qualifications have an effect on teacher performance that is not reliably distinguishable from zero.


Not to put too fine a point on it, but why the fuck is tuition so high if students are doing most of the work?

EDIT: FWIW, I felt the same way about engineering classes years ago when I ended up learning out of the books and Wikipedia myself.


The learner has to do most of the work, no matter how you structure the classroom. It's just a question of having them do productive work.

Giving lectures and having students do problem sets at home still has the students doing a ton of work, but the instructor's expertise isn't utilized efficiently. By having students watch videos/do pre-reading and then work problems in class, the instructor can actually help people in the way they need it.

You aren't paying someone to do the work for you. You're paying someone to structure the material in a way that helps you learn it. Think of it like you're paying a chef to prepare a meal that you have to chew. They aren't paid to baby-bird it to you.

"Why is tuition so high" is a really contentious question. Not many of those dollars go to instructional staff (I assure you their salaries have not grown alongside tuition). Where has the extra money gone? Buildings, administration, athletics, who knows...


Pools, saunas, arenas, orientation activities, facilities management, energy, space, housing, etc have basically nothing to do with teaching style.

College is expensive because of all the extras (teachers don't earn very much). Also basic overhead, but you can still do that and keep it affordable.


> College is expensive because of all the extras (teachers don't earn very much)

One word: tenure


http://compleatlecturer.blogspot.co.uk/2017/06/the-compleat-...

Historically the original mode of lecturing was simply the transcription by the students of the manuscript the lecturer was reading.


At the University I attended most of the lecturers did not know how to teach.

So yea, you'd probably get better results from some system designed by good teachers that could then be replicated without requiring the talent that produced it on a regular basis.

But I'm sure that same training/workshop would be a much better experience when delivered by the minds that created it than a robot that is just repeating the steps.

When I went to University what I wanted was knowledge-sharing by subject matter experts with a passion for teaching. What I got in most cases was researchers that were reading from a script and essentially ticking the boxes of the course outline.

A well designed 'active learning' session is going to have value to students for the same reason a well designed lecture will, it was created by someone with the ability to teach.


It's always interesting how many commenters' intuitions about education are completely opposite of what the research shows.

Active learning has been shown in hundreds of studies to be superior to traditional lecture-based instruction. Here's a recent meta-analysis: http://www.sciencemag.org/news/2014/05/lectures-arent-just-b... http://www.pnas.org/content/111/23/8410

And you don't need to learn from lecture 'first' to 'prepare' you for subsequent group learning or real-world learning. In fact, the opposite has been shown in research studies. Let students get their hands dirty with some field experience, or try out a simulation even though they aren't quite ready for it yet, or try to figure out some data, etc., and then when they receive a lecture on a related topic afterward, they will learn much more from it, because they can relate it back to their previous experience. They have a reason to listen to the lecture, rather than just trying to listen and memorize for some future test or experience they know or care little about. Several studies have compared, for example, giving a lecture first followed by a simulation or other experience vs. the other way around. And lecture-first is always inferior. See for example research on productive failure. Here are some examples: http://ideas.time.com/2012/04/25/why-floundering-is-good/ https://ww2.kqed.org/mindshift/2016/04/19/how-productive-fai... https://bulletproofmusician.com/productive-failure-how-strat... http://manukapur.com/productive-failure/


It depends what you're studying.

I did part of my computer science degree on campus with lectures that I didn't find useful. In one ear, out the other. I have to slow down at some parts but can zip through others, whereas a lecture is steady-paced. A huge number of hours wasted.

I did part of my degree at distance. The profs chose great textbooks and it was much better. It's always there as a reference and I can go at my exact pace.

I can imagine though that history or something might require lectures. You don't just give the kids obscure books with no context. For computer science though, an intro to algorithms textbook is pretty straightforward.


In my own experience learning and teaching Go and Bridge, two very complex subjects, a combination of lectures and interactive examples is best.


This is interesting, but it depends on the proposition that "active learning" is clearly better than more traditional education methods. That's not nearly as plain or uncontroversial as the article implies.

One issue is that when educational techniques are compared, it is troublingly common for the comparison to be between simplistic and homogenous definitions that are only rarely encountered in real classrooms. Educators and non-trivial courses that depend entirely on lectures or entirely on active exercises are so rare they statistically don't exist. Every class I've taken, taught or observed was a blend of multiple types of learning and instruction, distinctions being ones of emphasis, not of type.

The article also seems to assume that there is a best, or at least better, way to teach that is independent of how individual students learn or what particular topic or discipline is being taught. That's, to be frank, ludicrous. It's been clear for a very long time that appropriate and effective educational approaches depend a great deal on what is being taught and who it's being taught to. For example, imagine teaching history without lectures, or programming without hands-on activities.

Good educators tailor the mix of approaches to their particular students and topics. I'm baffled why any educator would support the idea that lectures should be completely eradicated. Why not, instead, take a hard look at the curriculum and mode of instruction with an interest in determining the best approach for the most students and then trust your faculty to make appropriate choices on how to tailor it to a class---with the odd lecture once in a while if it's useful. If they've got a problem with some courses or educators depending on lectures to the detriment of students, the solution is to redevelop the curriculum and train the educator on different techniques---not to just outright ban one approach.

Medical students are generally top students that are highly motivated. Further, cohorts tend to be in all the same classes with each other, encouraging mutual support. Those traits, along with medical education being task focussed, argue for active learning approaches. But that isn't true of, say, freshman English.

Finally, there has always been a circular definition problem with new educational approaches. For example, "active learning" often seems to have an implicit definition along the lines of "self-directed exploration that is successful". If you tried it and it wasn't effective, then what you were doing wasn't active learning, QED.


Addendum: there's also a definition problem with "lecture". Pretty much any educator will tell you that they encourage---actively wish for---questions from and conversations with students during a lecture. Passive lectures without interaction tend to happen mostly in giant lecture theatres. In that case, the problem isn't lectures: it's class size.


Postscript: another issue is time. Very often, you have n hours to cover m topics, and n always seems to be too small. This is unlikely to change. Active learning exercises tend to consume a great deal of time, to prepare, to run, to give feedback on. In general, you have to use the techniques sparingly, often reserving them for key issues or to help integrate knowledge about several topics. Lectures are often there to ensure at least a minimum amount of coverage of other topics, to offer an opportunity to ask questions and to clarify how a topic fits into the rest of the course. It seems unfair to examine and grade students on topics that were never taken up in class at all.


hasn't McMaster (in Hamilton Ont. Canada) done this for ... decades?


Yes, they invented the "problem-based learning" method/style, I do believe.


Note to self: don't get sick in Vermont.


> "OK, if you like doing appendectomies using an old method because you like it, and you're really good at it, but it's really not the best method for the patient, would you do it?" Of course, the answer is always no. And then you turn around and say, "Well this method of teaching is actually not as good as other methods. Would you do that?"

it's just teaching. which method pays more?



