Hacker News | nucrow's comments

Not sure that's true. https://assets.weforum.org/editor/cXaoElD7K4eR-JTqP-ViH1LkcC... https://climateataglance.com/wp-content/uploads/2022/11/lomb... (disclaimer: quickly looked these up from memory, haven't verified their accuracy)


First off, congrats! Cool paper! Q: did you measure the correlation between CodeAid use and test scores? When educational YouTube videos took off, people liked them a lot, but it turned out viewers weren't really retaining much information. You'd think this wouldn't be the case for CodeAid, but it's important to check.


Thanks!

> did you measure the correlation between CodeAid use and test scores?

Unfortunately, we couldn't measure such correlations, because too many external factors can impact students' performance on test scores.

However, our previous research specifically compared students learning Python for the first time with and without LLM code generators. Austin and I wrote a summary of it here: https://austinhenley.com/blog/learningwithai.html

We found that students who performed higher on our Scratch programming pre-tests (taken before starting the Python lessons) performed significantly better if they had access to the code generator. However, the system we used in that study was very different: it showed the exact code solution for the student's query. With CodeAid we were trying to avoid producing direct code solutions!


> Unfortunately, we couldn't measure such correlations due to many external factors that can impact students' performance on test scores.

Fair. Guess we'll have to roll it out to increase that sample size :)

> We found that students who performed higher on our Scratch programming pre-tests (before starting the Python lessons), performed significantly better if they had access to the code generator.

Interesting!

From the paper:

> Additionally, students in the Codex group were more eager and excited to continue learning about programming, and felt much less stressed and discouraged during the training

This by itself could reap great benefits years down the line. My wife and I have felt the same way at work!


Imagine we could, with the snap of a finger, build an AI tutor that is objectively better than human TAs. Better as in: between two groups of 10,000 students, those with AI tutors beat those with human tutors on performance metrics 95% of the time. Would you be opposed to replacing human tutors then?

If your answer is yes (in some flavor of "protecting the jobs of those who teach"), I would argue your ethics are focused on the wrong group. Teaching is for students to learn, not for teachers to have jobs.

We don't have said technology yet, but it's reasonable to think we can get close. If there's a good chance to improve how well students can learn, I don't think "teachers don't appreciate it" is a good reason not to do it.


> Would you be opposed to replacing human tutors then?

Yes.

> If your answer is yes (in some flavor of "protecting the jobs of those who teach"), I would argue your ethics are focused on the wrong group. Teaching is for students to learn, not for teachers to have jobs.

OK. But your framing here projects upon me the idea that I'm solely concerned about replacing jobs, in order for your argument to succeed. (Though again, that is the cold-rationalist AI zeitgeist[0]: why should people have jobs when an AI can do it?)

It elides the possibility that it is inherently better to learn from a real person, who has invested time and effort into teaching you. What is the point of higher education in particular if you are not learning, at some point, from people who are directly adjacent to cutting edge thinking?

> We don't have said technology yet, but it's reasonable to think we can get close. If there's a good chance to improve how well students can learn, I don't think "teachers don't appreciate it" is a good reason not to do it.

Well then. I can't argue with this, if you think it's OK to take humanity out of teaching. I think — perhaps feel — you are so wrong that I can barely even string the words together to explain. And that is an unbridgeable divide.

[0] https://www.theatlantic.com/technology/archive/2024/05/opena... or https://archive.ph/AL81B

'In response to one question about AGI rendering jobs obsolete, Jeff Wu, an engineer for the company, confessed, “It’s kind of deeply unfair that, you know, a group of people can just build AI and take everyone’s jobs away, and in some sense, there’s nothing you can do to stop them right now.” He added, “I don’t know. Raise awareness, get governments to care, get other people to care. Yeah. Or join us and have one of the few remaining jobs. I don’t know; it’s rough.”'


> It elides the possibility that it is inherently better to learn from a real person, who has invested time and effort into teaching you.

I did elide it, for the sake of the argument. If it turns out that human tutoring is in fact fundamentally better, then there's of course no point in using an inferior system (sweeping accessibility and other concerns under the rug). Go humans, if we're better!

> What is the point of higher education in particular if you are not learning, at some point, from people who are directly adjacent to cutting edge thinking?

For the subset who go into research, this matters a lot. But for almost everyone else, who are there to get a better job, it's not really relevant.

> Well then. I can't argue with this, if you think it's OK to take humanity out of teaching. I think — perhaps feel — you are so wrong that I can barely even string the words together to explain. And that is an unbridgeable divide.

I appreciate your candidness, and perhaps it's true that we just won't agree. For what it's worth, my bet is that tutors' quality will improve rather than tutors being displaced. My point, however, is: I want my kid to learn as well as possible. If that turns out to be with a robot, I'm not making my kid worse off to save some guy's job.


I'm curious as to how you would frame it instead.


I wouldn't put myself in that position.

When the question is of the form "what do you think about the ethical implications of using AIs?" and the quick answer is "humans also make mistakes" I think the entire premise is on pretty shaky ground.


I think I may have been unclear.

>... I would still see that this kind of linguistic framing is insensitive and self-defeating if not wholly inappropriate and demeaning if you want to see any co-operation from educators

I think this was spot on, and I agree with you. I meant to ask: have you thought about a way to frame it in which educators feel excited rather than threatened? From my perspective, there's a great chance to enhance both the teaching and learning experience, as well as the results.


I don’t think I can answer because again I think I would not put myself in that position.

I am going to retreat from this argument because this entire topic has me questioning my planned career change away from development towards training.

If this is what people think about human teachers, what at all is the point?


The goal of building washing machines, highways, etc. is not a cleaner environment. The point of wind farms is, but it defeats the purpose if producing them emits more carbon than they save.


It would defeat the purpose if windfarms made more pollution than they removed, but they don’t. By orders of magnitude. With current processes. They don’t.
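To put rough numbers on "orders of magnitude": a back-of-envelope sketch, using illustrative lifecycle emission intensities (approximate published medians in gCO2-eq per kWh; the exact figures are assumptions, not from this thread):

```python
# Illustrative lifecycle CO2 intensities, gCO2-eq per kWh generated,
# including manufacturing and installation. These are rough assumed figures.
WIND = 12   # onshore wind
COAL = 820
GAS = 490

def emissions_ratio(fossil_g_per_kwh, wind_g_per_kwh=WIND):
    """How many times more CO2 the fossil source emits per kWh than wind."""
    return fossil_g_per_kwh / wind_g_per_kwh

print(f"coal vs. wind: {emissions_ratio(COAL):.0f}x")
print(f"gas vs. wind:  {emissions_ratio(GAS):.0f}x")
```

With these assumed figures, coal comes out around 68x and gas around 41x the lifecycle emissions of wind per kWh, i.e. one to two orders of magnitude, manufacturing included.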


Listen, I believe you.

But only because those studies were done and numbers were produced.

It's important that we actually prove claims like this with facts.


Which is great, but the point is that we should "fret" about making sure that's the case.


That's such a common framing.

Green energy sources like wind and solar don't remove emissions. They add less.

> Energy transition aspirations are similar. The goal is powering modernity, not addressing the sixth mass extinction. Sure, it could mitigate the CO2 threat (to modernity), but why does the fox care when its decline ultimately traces primarily to things like deforestation, habitat fragmentation, agricultural runoff, pollution, pesticides, mining, manufacturing, or in short: modernity. Pursuit of a giant energy infrastructure replacement requires tremendous material extraction—directly driving many of these ills—only to then provide the energetic means to keep doing all these same things that abundant evidence warns is a prescription for termination of the community of life.

https://dothemath.ucsd.edu/2024/03/lets-make-a-deal/

