Teachers must see generative AI as an opportunity and teach how to use it correctly; otherwise, they will be training young people for unemployment.
Generative AI is lastingly changing our job market. This change particularly affects knowledge workers, where the potential for automation is greatest. Generative AI makes it possible to automatically create not only texts, graphics, music, and videos, but also source code, designs, and construction plans. We are currently only scratching the surface of what is possible.
It is only a matter of time before generative AI can, for example, produce sensible circuit diagrams for special-purpose machines or blueprints for individualized pharmaceuticals. In the future, the use of generative AI, and of AI more broadly, will be as natural for employees as the use of word processors is today. Yet it is rarely the AI itself that will replace you across the board: you are far more likely to be replaced by competitors, which may well include the person sitting next to you, who use AI to add value and are therefore many times more productive.
Teaching is changing
Generative AI does not stop at the education system either. Among other things, it offers a great opportunity to individualize teaching content according to learners' prior knowledge and learning progress. Learning foreign languages can also become much more successful through individual suggestions for improvement. Generative AI therefore affects not only the job of the AI professor who teaches the underlying methods, but also the bread-and-butter business of all teachers.
Teachers at all levels, from primary school teachers to professors, must be aware of this change, develop an understanding of (generative) AI, and integrate it sensibly into their work. Learners, in turn, also need to develop an understanding of these tools: how they work, how to use them correctly, and where their limitations lie.
New approaches are required
There is currently much discussion about how the unauthorized use of generative AI by learners can be detected. Questions about whether students have used ChatGPT to write their thesis or homework are omnipresent. Attempts to automatically and reliably detect such use in submissions do not, however, seem particularly promising.
It is therefore clear that some forms of examination, such as take-home exams or term papers, have no future on their own. Teachers must sensibly combine them with other forms of assessment, such as oral examinations or projects, in order to verify learning success.
Instead of prohibiting the use of generative AI, playing detective to prove its use, and then sanctioning it, I have turned the tables in my teaching: I not only allow students to use generative AI, I explicitly require it. This applies to the generation of texts and graphics as well as source code.
Generative AI tools provide, among other things, sensible structural suggestions, formulation aids, and source code, which students then check, adapt, and expand. Of course, there are also rules for their use. Among other things, students must critically examine all generated content and, where necessary, substantiate it with appropriate sources. If whole sentences from the training data nevertheless end up verbatim in the generated texts, this still constitutes plagiarism and must be sanctioned.
Students must also briefly describe exactly how they used these tools. My complete set of rules can be found in the appendix. Initial results show that by using generative AI, students have more time for research and methodology and, overall, submit better theses. In the subsequent oral examinations, the focus is even more on discussing individual passages from the submitted theses. In this way, I ensure that students have really understood these parts.
The AI Act as a brake on innovation
The EU's legislative institutions are currently discussing the "AI Act", the EU's AI regulation, in the trilogue process. In the EU Parliament's current compromise text, automated assessments and admissions fall into the "high-risk" category. In addition, the individualization of content could also be covered, as can be inferred from the wording "... assessing the appropriate level of education for an individual ...". There is also a wealth of further specific rules for large language models such as ChatGPT. Of course, these rules do not per se prohibit the use of generative AI in education. However, the high compliance requirements and the resulting implementation and certification costs effectively prevent many corresponding innovations. We therefore risk unnecessarily falling behind in international competition and leaving the revolution in education to China.
The legislature should refrain from imposing extensive requirements on teachers and educational institutions that use generative AI. Given the dynamics of the AI field, the half-life of today's rules is short. I am personally very curious to see how my own set of rules will change in the coming years. In my opinion, further training for teachers and learners on (generative) AI makes much more sense.
Rules for the use of generative AI in theses
You are expected to use (generative) AI tools, such as ChatGPT for text or DALL-E for images, when writing your thesis. This way, you will be more productive and possibly submit a far better thesis. AI tools will very quickly provide you with a structure or draft that you can then amend, improve, and extend. The following rules apply:
You are expected to substantially amend, improve and extend the output of AI tools.
Critically assess the output of AI tools. They may sound convincing but could be completely wrong.
Add references in order to support the claims generated by AI tools.
Clearly state which AI tools you have used for which parts of your thesis.
Check your thesis for plagiarism. Some AI tools may simply copy entire sentences or paragraphs from other sources.
Read the privacy policy of AI tools before you use them. Do not send confidential data or trade secrets to cloud-based AI tools. [1]
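As a minimal illustration of the plagiarism rule above, verbatim copying can be crudely flagged by checking whether long sentences from a thesis also appear, after normalization, in a reference corpus. This is a hypothetical sketch, not a tool the article prescribes; real checks should rely on a dedicated plagiarism-detection service.

```python
# Hypothetical sketch: flag thesis sentences that occur verbatim in a
# reference corpus. Only a crude check for exact copies, not a substitute
# for a real plagiarism-detection service.

def find_verbatim_overlaps(thesis_sentences, corpus_sentences, min_words=8):
    """Return thesis sentences of at least `min_words` words that also
    occur, case- and whitespace-normalized, in the corpus."""
    def normalize(s):
        return " ".join(s.lower().split())

    corpus = {normalize(s) for s in corpus_sentences}
    return [
        s for s in thesis_sentences
        if len(s.split()) >= min_words and normalize(s) in corpus
    ]

if __name__ == "__main__":
    thesis = [
        "Generative AI offers great opportunities for education.",
        "The quick brown fox jumps over the lazy dog near the river bank.",
    ]
    corpus = [
        "the quick brown fox jumps over the lazy dog near the river bank.",
    ]
    print(find_verbatim_overlaps(thesis, corpus))
```

The `min_words` threshold avoids flagging short, generic sentences; only the long copied sentence is reported here.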
[1] Patrick Glauner: "Wie ChatGPT die Universität verändert" (How ChatGPT is changing the university). Frankfurter Allgemeine Zeitung (online), Frankfurter Allgemeine Zeitung GmbH, Oct 17, 2023.