Hacker News
Teacher caught students using ChatGPT on their first assignment. Debate ensues (businessinsider.com)
33 points by thunderbong 10 days ago | 74 comments





This comment reflects my personal opinion and experience [please take it with an open mind]

I really hate these types of questions. Open-ended questions about myself are not useful. Students will usually do them only because they have to, and will come up with anything (even before LLMs). I understand that part of the point is for students to get to know each other and for the professor to learn more about their expectations.

I was once a grader for an Astronomy class offered to the public in the summer (anyone could take it). Part of the assessment was that in each assignment each student would ask one question and answer at least one other student's question. I really questioned my relationship with the world while grading those questions. Astrology was a common topic, alongside random questions about anything you could come up with that was related to Astronomy (or that you thought was related). I no longer teach or grade, and my experience was pre-ChatGPT. ChatGPT will change education, and educators have to adapt.

They were forced to do it, and they just did it for the grade. The purpose was to encourage communication and collaboration, but in my experience this rarely works. People will put in the bare minimum just to get it done. LLMs make this too easy, and it will be too tempting to use them. Some people did a good job both asking and answering, but many were simply not interested and had to do it anyway.

Scientists collaborate out of necessity and because most of them have a genuine interest in their field. They get trained during grad school, and usually they still don't communicate and collaborate well with each other. They are human, after all.

For the case at hand, I can see a student tempted to write a poor draft of an answer to this question without worrying about grammar or typos, then ask an LLM to rewrite it in an academic style. How would they ever practice academic writing and learn to express their thoughts properly? Probably never, but that is them tricking themselves.

Another thing that makes this case unique is that the course is about ethics. So let's set plagiarism and university rules aside: a discussion on using LLMs would be a better start for a class about ethics and technology.


Reminds me of the "discussion" assignments we were given in a couple of college classes I was in. The idea was that students would write a top-level response, then respond to someone else.

In reality, I would write my response, then come up with 2-3 sentences of reply that basically amounted to "I agree with that guy." It never really fostered the type of debate the professor was looking for, because most other students had the same approach, and nobody cared enough to go back and read the replies to their own posts.


This must be the standard distance-learning process, because both my son and my wife (working on her master's) have to write a post and then respond to 2 other posts. It seems like fake discourse to me: all comments are written for the instructor to read, not for the OP. There is never a response to a response, thus no real discussion.

In your second paragraph, you said you didn’t find it useful and then you said why it’s useful.

Sort of. I read it more as "here's why it's done but it's totally effing useless".

> I really hate these types of questions. Open-ended questions about myself are not useful.

Yes, it can be helpful for a professor, who can see what interests the audience and modify their lectures accordingly.

But I believe you underestimate the usefulness of such exercises for students. They prompt students to reflect on themselves: why they are studying and what they want to achieve. That is useful as a general lesson in reflection, in consciousness, in active learning, and even more useful for this particular course.

When you think about your goals and verbalize the questions you hope to find answers to, you direct your mind and your attention. You become a more attentive and active learner instead of a passive one who just listens and tries to remember as much as they can.

One of the recommendations I heard about how to read a book was: "formulate a question that the book can answer, then read the book searching for the answer." I believe it helps. I struggle to read some books if they are not particularly interesting to me, if I'm reading them in a "fishing" mode: maybe I'll find something useful, or at least interesting? But if I have read the title, the introduction, maybe some reviews, to understand what the book talks about, if I have then formulated questions for the book, and then read the book looking for answers, I have no issues focusing my attention on reading. If my attention wanders, it wanders to think about what I have just read, which is a good sign. Reading this way guarantees that the time isn't wasted.

> They were forced to do it, and they just did it for the grade. The purpose was to encourage communication and collaboration, but in my experience this rarely works. People will put in the bare minimum just to get it done.

Yeah, I know. It is sad. I took an undergraduate course at 35, studying alongside people of 18-20, and most of them were just trying to fulfill the formal requirements needed for grades, without trying hard to see what happens. I ended up with a small group of students who really tried. Even when we didn't understand the purpose of some exercise, we tried hard, got some results (grades, yes, but not just them), and then speculated on what the point of the exercise was. Sometimes we asked professors about it. Most students started by speculating about the point of the exercise, and when they didn't see any, they didn't really try to do it; they just chased grades with minimum effort.


There is an inevitability to the tone of this article that I find really depressing (probably because I tend to feel the same way): that the ability of students to think deeply and learn well has been severely diminished over the last decade; that AI use is not even a cause so much as a possible means of escape from the problem. On the one hand this feels true to me given my own experience, but on the other hand I know I still manage to learn and undertake complex tasks, so why should "kids these days" be doomed?

How do you all feel your attention spans changing? Are you able to sit and think or read long passages without distracting yourself? Were you ever able to do this? If you find yourself struggling, how do you manage?

I find myself reaching for audiobooks much more frequently these days. Longer tasks are still possible if I find sufficient motivation. Curiosity is not always enough. I have to craft specific goals for myself in order to maintain motivation, but this takes me quite far in my self-study.


It is not strange that companies now compete for your attention. People who do things and learn stuff need to focus and stay with a task for a long span of time. The world is now full of things trying to grab your attention. Practicing mental self-restraint may prove to be the most important skill anyone can have nowadays.

> Practicing mental self-restraint may prove to be the most important skill anyone can have nowadays.

Which, by definition, will not be possible for the vast majority of people. Human psychology doesn't change so easily, and there are massive amounts of money spent by entities employing people to design experiences that tap into it.

It's a similar issue with obesity: practicing self-restraint sounds logical and plausible until you look at the bigger picture and realise that while companies tap into human psychology/physiology to sell their sugary products, it's not possible at a societal level to solve the issue by putting the responsibility onto individuals.


Why would it be strange? We recognized that we were in an information economy 50-plus years ago, and were quite proud of it (at least back then). It turns out people don't buy information to stick in their storage locker. Without attention, you have no information business.

>> The ability of students to think deeply and learn well has been severely diminished

I think it just reflects that most students are just taking classes in order to fulfill whatever requirements they need to progress to the next stage.

If it was my class I would have just passed out index cards and given the students five minutes to write their responses before handing them back in.

The upside to all of this is that teachers will come to value genuine responses more highly than the clichéd responses students currently think they are supposed to provide (and which they get from ChatGPT).


At some point, schools were filled with students who wanted to go there. As more and more years in school become mandatory, you'll see more and more students doing the least amount of work. They don't want to be there. It's always been like this, we've just scaled it up by a lot (at least in Europe) lately.

The burden of proof of abilities will keep going up as more advanced tools are developed.

Employers will find new metrics to judge people based on. That's probably where coding interviews came from. Perhaps we'll go back to nepotism, or a social networking "achievements" web site; LinkedIn but with verified credentials. Maybe we'll all have to be published in a journal to show our abilities.


I don't think it has anything to do with general population level attention span. It's just a result of growing older. My kids inhale books. I don't find the same joy in them as I used to unless they are very well written and/or contain sufficiently interesting ideas.

As I get older time starts feeling more precious and there are more external pressures. So I've got more selective over everything. Maybe it's the same for you? Audiobooks let you do something else while listening.


> The ability of students to think deeply and learn well has been severely diminished over the last decade.

I don’t like the trend of blaming everything on the student. It feels like blaming the customer for not liking the chef’s food. Is it truly impossible to reach these kids or are teachers just using the wrong approach? Disclaimer: I am fully aware that bad student behaviors can exacerbate the situation and I realize teachers are forbidden from fixing that problem.


I wouldn’t take this as blaming the students. If a whole generation of students is worse at something, it must be systemic in some way.

Isn't the implication here that it is happening externally? To stick with your analogy, imagine something in the environment is altering the perception of taste. The blame is not being placed on the customer or the chef, but on the environmental factor.

IMO this question is different for software people because they may be spending hours per day already focused on specific problems.

This is in no small part due to how broken the system is at providing value to the students.

Why do the teachers care? Let the students not learn and suffer from it.

The only reason they pull stunts like this is because it actually doesn't have as much of an impact.

Why did employers stop asking about grades, and instead focus on actual experience? Because, all things equal, that's what matters more. So can we blame students for trying to speedrun these barriers to get to what's actually important? And not what they're told is important by educators who have an obvious bias, but what they can see in the reality of the job market.


> Why do the teachers care? Let the students not learn and suffer from it.

> The only reason they pull stunts like this is because it actually doesn't have as much of an impact.

> And not what they're told is important by educators who have an obvious bias, but what they can see in the reality of the job market.

Oh, how you've missed the forest for the trees.

These students paid real money to voluntarily attend an undergraduate ethics and technology course and, with the complacent naivete of lemmings, immediately walked right into a philosophy professor's well-laid trap in a way that directly supports her research agenda[1]:

>> Her current projects focus on ways in which emerging technologies threaten to undermine essential conditions for human flourishing.

These students were played just as easily as the commercial tools they blindly leaned into, and if that doesn't sharpen their critical faculties while eliciting a meaningful opportunity to introspect on what the future has in store for them, I don't know what will.

[1] https://ualr.edu/philosophy/faculty/megan-fritts/


> These students paid real money to voluntarily attend an undergraduate ethics and technology course

Did they really? I'm pretty sure mine was required and most of us didn't really care about it. We were paying for the other classes, not that one.


Yeah, really. It's an upper division elective[1] bucketed into PHIL 4380 generic catch-all, capped at 17 seats, and primarily targeting philosophy majors[2]. To be sure, my alma mater taught me a thing or two about independent research towards a due diligence imperative. Did yours?

Moreover, you voluntarily paid for and committed to earning a holistic degree---full stop. How you appraise its normative pieceparts is wholesale irrelevant, and the matter of extracting value from the vested experience is entirely on adult you. Something about leading a horse to water comes vaguely to mind.

A relevant counterfactual is that you could have bypassed lower division altogether and just audited upper division courses which you independently deemed to contain intrinsic market value as a non-degree-seeking student, dramatically reducing financial liabilities and time commitments to academic lethargy. Hell, you could have skipped university altogether and developed the necessary technical skills that you think the market values online for free! And yet you didn't...almost as if you desire the fringe benefits that society extends to formal degree holders without actually putting forth the honest effort expected of your accredited peer equivalents across the nation. Have your cake and eat it too much?

Example of real-world consequences? Every year, my employer swallows an absurd indirect expense explicitly reminding its workforce of statutory regulation, internal policy, and professional obligation towards ethical practice to cover its ass, because so many of my so-called "learned" peers conveniently dismiss the softer elements of their formal education---major election years being particularly egregious---and then these former employees cry when an industry conditioned to anticipate such half-baked behavior catches them in the act, chews them up, and spits them out like expendable cud without regret.

You reap what you sow.

[1] https://a.ualr.edu/classes/index.php?term=202460&campus=M&se...

[2] https://catalog.ualr.edu/preview_program.php?catoid=20&poid=...


Not everything in education is about getting a job. I took electives in college that weren't part of my degree, and they were interesting and enriching. They didn't help me get a job, and that's fine. There is the point of view that education should first be about producing a well-rounded and informed citizenry, not just be a mill for producing endless young grist for corporations.

That's what the parent is saying. If you are in school to try to be a more well-rounded person, but then rely on tools like ChatGPT to do it for you, then the only loss is yours. Which raises the question: who cares? Why not let you waste your own time and money?

But he is also saying that employers have perverted the incentives, giving students reason to lean on such tools to gain an edge in the job market with minimal input. That is what raises alarm with educators, as they know their jobs are on the line if employers start to question the authenticity of the degree. There isn't enough money in people just wanting to be better people. Let's be honest: the vast, vast, vast majority of students are only there because they believe they need to tick a box when job searching.


> Let's be honest: the vast, vast, vast majority of students are only there because they believe they need to tick a box when job searching.

Maybe they -actually- need to tick that box and it doesn’t matter to employers how it’s ticked, just that it is ticked. For some reason, academics and educators really seem to hate the reality that most people go to university not to get a well rounded education, and not for personal enrichment, but because employer gatekeepers require it.

If employers stopped requiring a college degree, I bet college enrollment would drop 90%.


To be fair, modern university was originally sold on the idea of getting more people into university research labs where they could work to develop and take ownership of new capital. This is where the idea that you will make more money if you go to university comes from. As capital is a multiplier on labour, labour * capital = higher income over labour alone.

Of course, people don't listen and the vast majority just went and got jobs working for someone else's existing capital after spending a modicum of time in university classrooms, somehow thinking that is the same thing. As a result, the higher income and other promises never materialized. Incomes have held quite stagnant even as more and more people graduate from university.

I can see why educators are annoyed that the grand hope for a future paved with intensive capital creation ended up being just a bunch of dumb-dumbs who believe they need to tick a box. But it is still their bread and butter. Reality has to be faced.

But perhaps what they are really annoyed by is that they equally failed to create capital themselves, and thus try to soothe the pangs of failure by pretending "Oh, I was just here to become a better person!"?


Yes, that's often the common contrast between higher education and the early years: first, you gain knowledge and learn facts, you memorize by rote, you pass standardized tests.

In college/university, you learn how to think. You learn how to reason. You increase wisdom, solve problems, and reach an intellectual maturity. That's always been the value of college for me; a degree being a means to an end, perhaps, but the quest for wisdom and reason is far greater than that of career.


> In college/university, you learn how to think.

I learned how to think on the farm after school. You don't have much choice but to when the shit gets real. Is the rise in college attainment directly a result of most kids no longer having access to farms (or even anything close to being equivalent)?

> That's always been the value of college for me

The historic value was collaboration on ideas. It is all well and good being able to think in solitude, but one can only think so much. Many people all thinking at once and sharing their thoughts is an incredible force multiplier. College offered a place for minds to come together.

I'd argue that the internet has become that place nowadays. Which is perhaps why you see college as a metaphorical farm these days instead?


Thank you for that cool insight and perspective. Indeed, not all of us need college to reach intellectual maturity. A good family farm is a great way to face the realities of nature and day-to-day human existence. Our retired bishop was raised on a farm, and despite his subsequent education and formation, I daresay it would've been life on the farm that prepared him for life in the church, and he may well agree!

The fact that it's a class on ethics and technology is quite ironic.


.ph ccTLD mirror wouldn't load for me.

https://archive.is/bK8L9


Flip the script: ask the students to record videos instead, and then use AI to extract text from the videos.

That... sounds like a fantastic idea.

Having read the reddit thread linked in the article, I think the main problems would be (depending on the teacher):

1. Not wanting to

2. Needing funding/support to AI-transcribe all the videos for each of their numerous students, but their institution not providing it.

But those seem [eventually] surmountable. I really like the idea.


I found the linked Reddit thread to be particularly interesting (educators discussing use of ChatGPT by students, 1y ago):

https://www.reddit.com/r/Professors/comments/17v6478/chatgpt...


> > a skilled LLM user can generate solid responses to almost anything.

> I don't think my students, who often leave the "as an AI I have no opinions" in their essays, are skilled LLM users

This thread is really funny, or really depressing, depending on your point of view. My conclusion: teachers should start using ChatGPT to grade these assignments, without even reading them. That would be a win-win for everybody.


My takeaway from the thread is that LLM use seems to be a manifestation of students' disinterest, dissatisfaction, and potentially poor attention spans.

There is one comment from a student in the thread (who, from what I glean, is probably more capable than average). Sure, they seem angry, but I also see their point:

> I have used chatgpt in literally all of my essays this semester, and got an average grade of 8-9.

> ...

> But you know why, at least in my case, I always get away with it? Because all of the work I have to do requires ZERO critical thinking from my side. In fact, it is even worse to use it sometimes. Don't you even dare to come up with your own opinions!!! Just quote someone that is already quoting someone and so and so on.

> I know most students don't give a single flick about your class, and that's demotivating, but damn, it is the 5th year you've asked for the same essay and we know you don't even read half of them. Why would I lose so much time doing something literally no one cares about?

> TL;DR: If a machine can do most college essays it's because they were mechanical jobs already.


It's strange, all these professors asking students to generate outlines, when outlines are probably the worst way to write an essay. The kids don't want to write because they have no investment in it: professors ask for systematic, boring, easily interpretable essays, and they get essays generated by a program that writes in the most easily interpretable way possible.

This is just sad.

Part of the foundation a post-knowledge storytelling society can stand on.

I'm part of a local physics meetup where AI is often a topic of conversation. One member is a well-regarded high school physics teacher (his students end up at MIT, Georgia Tech, etc., and some go on to get physics PhDs).

He shared that some teachers now require AI/LLMs for homework assignments, such as writing essays. The actual assignment is to critique the output of the LLM.

As a millennial, even our middle school classes taught information literacy in various forms in the "computer lab". And that was in the nascent days of the web.

AI is a tool, not unlike a calculator or Wikipedia. They were both controversial and even forbidden at times. Students adapted. So did education.


A couple of years ago, one of the GE English courses I took was 100% oriented toward producing exactly one research paper.

The class, therefore, involved laying the foundations, doing the research, constructing footnotes and a bibliography, producing a couple of drafts, and finally submitting a finished work.

And so the instructor was there to shepherd us the entire way through the whole process, with oversight and feedback at every step. This was not simply, "OK you took a class on <X> and it's time to spew out a paper of <N> pages on it." This was learning how to correctly do a research paper from start to finish.

It was fantastic because it really made cheating stupid. Whether you were going to purchase a paper online, or have an LLM write it, or pay a friend to do your work, every cheating method was profoundly irrelevant and useless in the face of this process. At the end, either you've learned something about writing a good college paper, or you haven't.

I took other classes that taught about rhetoric, evaluation of sources, and extra credit was to read and analyze a novel. The instructors were top-notch and highly credentialed, at the top of their game. Especially for community college where I was taking on a FAFSA grant, I thought that the whole process was absolutely rigorous, educational, and extremely edifying even at my advanced age.


I will say that in other classes, we had very short research papers to do in a brief time frame, and the most difficult aspect of those, for me, was paraphrasing.

I could easily look things up in primary/secondary sources and know exactly what I wanted to put into the paper. But because it often involved technical, precise jargon to express my thoughts, I struggled to paraphrase, or put it in layman's terms or something, without losing the essential gist of it all.

Paraphrasing from a source, I believe, may be a very strong suit of an LLM, where you could just paste in a block and say "rewrite it so I'm not plagiarizing", and that's powerfully dangerous.


I imagine the output was something like this...

>> Hi I'm ChatGPT and I can't take this class because I am a chat bot.

The smart ones swapped their name in when they copied and pasted the answer, and went undetected.


"Say what you're hoping to get out of this class" is an invitation for bullshit, something LLMs are good at and at least this human (me) is bad at. This makes a lot of sense. Who's taking that class for a reason other than "it fulfills the required credit"? I'm sure there aren't enough of them to produce the wide range of creative responses the instructor seems to have been expecting.

The teacher notes in the thread (https://x.com/freganmitts/status/1828796730634330593) that this is not a GE class, and since it's peer-conversation based, there was some basis for helping students get to know each other.

I definitely took many classes in college for reasons other than getting credit, and I hope you did / do too!


Exactly, I was CS, but took Psychology classes for my own interest. It was fun bringing a different point of view and only realizing it once we started discussing the reading material.

One time, in a small seminar class on mental models, the professor asked us each to draw a tree. Only once we were done and everyone started sharing their trees did I realize that I had gone straight to a directed acyclic graph. I completely forgot about the green leafy things. Everyone had a good laugh, and it was actually a good example of what we were talking about.

Asking ChatGPT what I should say during a class is just play-acting at education. It really feels like a waste of time and money.


> The teacher notes in the thread (https://x.com/freganmitts/status/1828796730634330593) that this is not a GE class, and since it's peer-conversation based, there was some basis for helping students get to know each other.

Then why make it a written assignment? If it's a peer-conversation based non-GE class, why not ask them to stand up for two minutes and do the awkward intro thing in person on their first day? Then the professor could have had a dialog about what students hope to get from her class, if she really cared.

This was pointless busywork through and through, the kind I'd expect ChatGPT to create for a lazy educator.


I took zero classes that didn't fulfill a requirement. I presumed anything else would cost more in time and money than the value I'd get from it. It would almost certainly have delayed graduation, sucked up free time, and added stress.

If it's a peer-conversation class, perhaps it would have been more appropriate for them to report what another student told them they hoped to get out of the class...

From the article:

> "A lot of students who take philosophy classes, especially if they're not majors, don't really know what philosophy is," she said. "So I like to get an idea of what their expectations are so I can know how to respond to them."

Sounds very reasonable to me.


I'm not sure why you (and, apparently, several students) are assuming that a simple, straightforward question is actually a trick that demands a complex, rambling answer.

I was asked this numerous times in college, and it gave me anxiety every time I had to answer it. I simply could not understand why anyone would be interested in my response, so I couldn't synthesize anything for my audience (the person asking).

It sounds like the trouble is that people are putting a lot of effort into trying to come up with a detailed lie that will flatter the professor's expectations, when (as I read the OP) the professor has no expectations and actually wants the simple truth. So everybody loses.

Maybe teachers need to start putting a "there are no wrong answers" disclaimer on these things.


Experience, I assume; IME, one of the more annoying things about school was that prose questions expected you to ramble on for pages and pages about something that really deserved about a paragraph.

I agree - if I were a student, I would be very tempted to use an LLM for this sort of coursework-irrelevant busywork assignment, especially if I had other work on my plate. It's not so hard to rationalize using it for this type of thing vs. an "actual" assignment, all due respect to the professor, who I'm sure means well.

As an instructor, I suppose I'm happy to tailor the class to the AI's wishes. ;)

If you have the time and the attention span: "ChatGPT is bullshit"

https://link.springer.com/article/10.1007/s10676-024-09775-5

(although they concluded it's "soft" bullshit, as it doesn't aim to 'harm')


Do folks think ChatGPT will soon become a paid-only platform or heavily curtail the free tier, as claude.ai does, to make OpenAI profitable? Just food for thought.

It really has become a de facto flag bearer for all LLMs in the market, and I don't know a single person who doesn't use it, save for my father, who is from the boomer generation.


It's hard to see how they're gonna break even otherwise. The market position is only useful if it's also bringing in enough revenue for them to stay afloat and competitive.

Maybe there's a layer of 'whales' they can access, like a premium tier for businesses that's got more bells and whistles than the living room variety? Other than that, who knows.


tbh even the functional role of a teacher is about to be upended by AI. All I know is I'd rather my children be taught something that will be useful under the assumption that everyone will have access to a tool like ChatGPT at home/work.

ChatGPT is a tool, and it's hard to argue that someone should have to work without a useful tool. This is a great argument for advocating in favor of AI at work. Tools are great.

However, education is not work. The point of education is not the production of a finished output; the point is to learn. If teachers let students defer their understanding to AI by using ChatGPT to write assignments, then education stops being about the thing students are meant to be learning and starts being a test of prompt engineering. That has its uses, sure, but you have to be honest about admitting that's what you're arguing in favor of.

Should education be about getting a real, fundamental understanding of a subject, or should it be about prompting an AI to do work in that subject area? I'd suggest the former is a lot more useful and valuable in the long term. If I was spending money on my education that's what I'd want at the end.


> The point is to learn.

If you've ever been around a growing baby, you'll have noticed that learning happens naturally. And if you have kept watching them as they continue on their life journey, you'll have noticed that they are sponges that take in incredible amounts of information. Point being, you don't have to go out of your way to learn.

No, the point of formal education is to develop the ability to separate thought from emotion. That is what humans, being emotional creatures, won't do naturally, but what is necessary to be able to "think big" about the world.


>the point of formal education is to develop the ability to separate thought from emotion

I doubt that's possible. Thoughts are emotional after all. Now the senses, that's a different story.


What are you on about?

The point of education is a) (before higher levels of education) general education and b) learning skills that help you in your life and work.

For the latter, examples include critical thinking and understanding causality: drawing conclusions that require thinking deeper than the surface, and being patient.

There are ”side products” too, such as emotional growth or learning and coming to accept that sometimes you have to do things you don’t want to do.

At higher levels of education the point is still to learn, but it's less broad and about a more specific subject. You still refine the related skills. The side products also still exist.


Sorry, referring to the internet as "education" is a new one to me. I agree, that is the point of the internet. I thought you were talking about formal education.

I meant to convey a similar sentiment as you with my comment.

I’d like my kids to learn how to research for and write essays so that they are able to discern well researched texts from hallucinations.

Maybe an AI is best able to detect when another AI did someone's homework.

The purpose of an assignment is not to produce a document; it's for the student to learn how to synthesize information.

Having an AI produce the document provides nothing to the student, or to the rest of society, which needs the student to be good for something.

In this case it was even stupider. It wasn't a test, it was an informational question about themselves.

What in the ever loving hell is the defense or value in having an AI fabricate a random answer to a question about what you think about something? At least outside of some marketing data gathering context where there is a reason to give false information.


> Maybe an AI is best able to detect when another AI did someone's homework.

That's a losing battle. The best CAPTCHA solvers are now better than humans.


If you’re gonna run with this fallacy, why not cross the finish line? If AI replaces education, why should taxpayers keep footing the bill for schools? Now we have a society where wealthy children attend private school and the rest do not. The few viable opportunities for advancement are limited to those who were educated in critical thinking. It sounds like slipping backwards as a developed country.

The functional role of a teacher is not going to be upended by AI.


> If AI replaces education, why should taxpayers keep footing the bill for schools?

The parent claims that AI will replace human teachers, not education. If we accept that state-funded education is worthwhile now, why would it stop being worthwhile when the teacher just happens to be AI?


I asked Gemini to add some grammar errors that a 13-year-old might make. It worked very well; it added some teen-typical inflection errors.

English is of course a different thing, because there is no actual grammar; it is just word salad and anything goes.


> They are also using it to word the questions they ask in class

This seems like a straightforward upgrade to the school experience. There truly are dumb questions; many of them get asked in class because speaking to a rubber duck first would be "disruptive." If you're going to take the whole class's attention down a tangent, why is it bad to first get a smart generalist's opinion on the optimal shape for that tangent?


Think of it as debate prep.
