Part Four - Discrimination
Key Takeaways
Discrimination Risks
When deploying AI systems, schools need to consider societal harms such as systemic bias, discrimination and inequality for marginalised groups of students. These harms stem from the risk of algorithmic bias in AI systems, which can systematically and unfairly discriminate against certain individuals or groups in favour of others.
To date, there has been little work on discrimination as it pertains to schools and AI; however, this area of law will continue to evolve as AI becomes increasingly embedded in educational settings, potentially impacting decisions relating to enrolments, grading and access to resources.
Can all students access AI systems?
In an era where at-home learning can be implemented instantly, schools must prioritise the accessibility of AI systems to provide equal opportunities for students. Disparities in technological resources, internet connectivity and digital literacy may hinder equitable access to AI tools, potentially exacerbating existing educational inequalities.
How will algorithmic bias impact teaching and learning?
Algorithmic bias describes systematic and repeatable errors in a computer system that create "unfair" outcomes, such as "privileging" one category over another in ways that diverge from the intended function of the algorithm.
The Australian Framework for Generative Artificial Intelligence in Schools attempts to mitigate the risk of algorithmic bias by guiding teachers and students to attain a level of 'AI literacy'.
AI literacy equips users with the ability to understand how GenAI algorithms find patterns and make connections in data, and how they may disseminate discriminatory content reflecting the inherent biases of the algorithm's developer.
Engaging in AI literacy is crucial to understanding AI technologies and their broader societal impacts.
Does AI accommodate the needs of children with disabilities?
Schools need to ensure that they do not inadvertently discriminate against children by utilising AI systems that perpetuate discriminatory outcomes.
Schools must also actively consider the different needs and experiences of students with disabilities, so that they are prepared to make reasonable adjustments and can ensure children with disabilities are not treated differently on the basis of their disability. Utilising AI systems that children cannot use or enjoy because of their disability may expose schools to indirect disability discrimination liability under the Disability Discrimination Act 1992 (Cth).
What do schools need to do?
Before deploying AI systems, schools must:
How can MCW help?
We can provide up-to-date advice on the development of AI regulations, or assist you in creating or updating your school policies to ensure you are adequately safeguarding against the risks that AI poses. We can also provide training to your organisation so that all staff are aware of their obligations to operate and deploy safe, responsible and ethical AI systems.
Get in touch with us today if you would like to discuss how these changes may impact your school and the steps you should be taking to ensure compliance and to manage risk.