If these “signs of AI writing” are merely linguistic, good for them. This is about as accurate as a lie detector (i.e., not accurate), and nobody should use it for any real-world decision-making.
The real signs of AI writing are not as easy to fix as just instructing an LLM to “read” an article to avoid them.
As a teacher, all of my grading is now based on in-person performance, no tech allowed. Good luck faking that with an LLM. I don't mind if students use an LLM to better prepare for class and exams, but my impression so far is that any other medium (e.g., books, YouTube explainer videos) leads to better results.
I sucked at oral exams and therefore hated them. Then again, if they had been a regular part of school, they might not have sucked so much.
It doesn't need to be oral; I remember occasionally having exams with essay questions that had to be answered in class.
I do both of these, as well as smaller but more frequent tests, group work, project work over several sessions, etc. The only things I stopped doing are reports written at home, paper summaries, and the like. Those don't make sense anymore.