Just watched an ACM webcast on teaching computer science with tools like Copilot.
The profession of software development isn't about to go away, but it's going to change rapidly. What we teach may also need to change (just as we no longer spend much time on assembly language).
I am concerned that we may be on a path to paying a corporation to use stolen code (and burn a bunch of carbon) to solve our problems.
@LouisIngenthron There's a lot of discussion around this point. There's certainly a risk that, if we keep doing things the way we have been, some students will use LLMs to cheat in CS1/CS2 and then get stuck in later classes, where the LLM can't solve the problems and the students haven't developed an understanding of the fundamentals.
That said, learning to read (and then to test and debug) code *could* be a useful step in learning to write code.
@peterdrake With an LLM properly trained for education, I could see the value in it.
The problem is that new coders don't know when the bot is talking out of its ass (which it does frequently). And, worse, the bot learns from the user, which creates negative feedback loops for newer coders. That's why the current iteration of this tech still requires wider domain knowledge (which new coders don't have) to be truly useful.
@peterdrake Maybe specifically teaching how these bots can fail would be a valuable lesson, though. Introduce the tech while showing that it isn't a flawless solution...
@peterdrake That sounds extremely problematic. Entry-level coders should *not* be using tools like Copilot... all that will do is reinforce their mistakes and flaws and make them worse coders in the long run.