It’s always strange to me that educators encourage students to “be critical of the information from x,” where x is a new technology (internet, Wikipedia, ChatGPT), but that skepticism never lasts long.
“Continuous improvement” is based on the idea that we knew what we needed when we started and that the system was the path to it. “Design” is based on the idea that we re-evaluate our goals and our systems as we “improve.”
I don’t disagree that devices are a significant source of student disengagement in school. Split attention, FOMO, the rewards they give: yes, these are all real effects. But I taught long before these devices arrived. Students were disengaged back then too.
I’m increasingly convinced education’s response to generative AI must focus on convincing students that *the skills they hone* are more valuable than the outcomes they achieve. This will require a monumental shift in mindset across the field.
“I don’t believe in science because they keep changing the facts.” Yes, science changes; that’s the whole point. Be careful, however, because many of the surprising discoveries reported in the news are later rejected. Again, that’s exactly what science is designed to do.
OK seriously... folks tell me they save time using AI to write emails. I ask why they don't use templates, since templates are even quicker than AI. The most common response: "What is a template?"
I once told a leader what my goal was in the organization. She said, “It’s not going to happen here.” I thought she was telling me she wouldn’t allow it. Now I realize she was warning me about the culture.
An acquaintance interviewed for a position where I work. Part of the process was an open forum. He declined the offer and told me the primary reason was the toxicity he sensed during that forum.