By Dr Tina Parolin, Interim CEO, The Australian Council of Learned Academies (ACOLA)
The APS Academy is building stronger ties between the academic community and the Australian Public Service. The Academy recently co-hosted a MasterCraft Series event in Canberra with the Department of Education, facilitated by The Australian Council of Learned Academies (ACOLA). The seminar drew an impressive attendance of over 500 people, with lively discussion and insightful questions reflecting widespread interest and engagement in the topic. Here are some of ACOLA’s reflections from the seminar.
Pictured: Secretary of Education Tony Cook, Professor Tama Leaver, Dr Tina Parolin (Interim CEO, ACOLA) and Professor Maarten de Laat
The Australian Council of Learned Academies (ACOLA) is pleased to have collaborated with the APS Academy and the Department of Education to deliver a seminar on AI and its impact on education on 12 June 2024.
The seminar was opened by Department Secretary Tony Cook PSM, who welcomed our two speakers, Professor Tama Leaver (Curtin University), an expert on children’s engagement with generative artificial intelligence, and Professor Maarten de Laat (UniSA) whose research focuses on teachers as the front-line of engagement with AI in an educational environment. The event was facilitated by Dr Tina Parolin, Interim CEO, ACOLA.
Tony Cook outlined the Department of Education’s recent policy work in preparing for the challenges and opportunities of AI, both now and into the future. He noted lessons from international counterparts including Singapore and Canada, and the Department’s leadership in informing a major international OECD collaboration on AI in education.
Professors Leaver and de Laat were set a formidable challenge in covering the broad ranging implications and impacts of new AI technologies across the education system and on its key actors – students, teachers, educational institutions and policymakers.
Prof. Leaver noted that despite the arrival of ChatGPT and its rapid evolution, there is a general misconception about how advanced AI currently is and will be in the near future. The Large Language Models (LLMs) on which much of the technology is based can generate material across a wide range of fields and media, but they are only as effective and useful as the data fed into them. Crucially, these LLMs do not understand what they generate. Despite the name, AI is not ‘intelligent’, and Artificial General Intelligence (that is, human-level capability) may never exist, despite the hype.
The experts pointed to several examples of the positive application of AI in educational settings. These included an early learning or primary educator collaborating with students to create and illustrate a story using generative AI (GenAI), asking students to describe the key features of characters and then using the GenAI to produce the images.
Another example was a university student who, nervous about recording their own voice, used a Generative Voice AI to create an engaging narrator for a mixed-media assignment (with proper acknowledgement).
"Generative AI might create the lowest-level pass version of an essay, or an image, or even a piece of music, but education at all levels is about learning skills to make a better version. GenAI might raise the bar, but education makes our students skilled users of GenAI tools in formal education and future creative and employment settings."
Prof. Tama Leaver, Curtin University
Prof. De Laat drew on his recent research evaluating the use of EdChat, a GenAI tool being piloted in schools by the South Australian Department of Education. The research highlighted positive experiences for both teachers and students, including helping students by giving feedback and prompting them to expand their work, and augmenting the work of teachers by developing lesson plans, generating classroom activities and personalising student learning.
Both experts also highlighted the ongoing challenges and risks of AI in educational settings, including the prevalence of bias in GenAI (due largely to scraping data from the internet and reproducing the inherent bias found there), through to plagiarism, data privacy and the rights of the child, ownership and copyright infringements, and the environmental impacts from the energy required to sustain these systems.
Developing and nurturing critical thinking as a foundational goal for student outcomes was a key takeaway from the experts (and was reinforced by Tony Cook in his survey of international experiences). The experts spoke of the need for underpinning critical thinking and digital literacy skills to meet the unprecedented pace of change driven by new technologies, and to build the skills future workforces will need to ensure the development of secure, responsible and ethical AI.
Teachers will play a vital role in AI adoption. Given the pace of change and the complexities associated with student use of AI, they will need support to integrate ‘traditional’ classroom knowledge practices with AI capabilities. Supporting teachers with practice-based research and through teacher-led support networks was seen as vital to Australia’s effective adoption of AI in education.
There is much research underway in Australia and internationally to develop tools and processes that allow students, teachers and the community at large to realise the benefits of AI, whilst being alert to its potential to do harm. The audience expressed a strong desire to hear more about the results of ongoing research to ensure evidence-informed policy development.