The NASUWT has set out 12 principles for the ethical development and application of artificial intelligence and digital technologies in education.

Introduction
NASUWT principles

  1. A public good and human right
  2. Legal and regulatory frameworks
  3. Legal and regulatory frameworks: data protection and privacy
  4. Legal and regulatory frameworks: equality, equity and inclusion
  5. Consultation, negotiation and agreement
  6. Teacher professionalism and agency
  7. Curriculum, assessment and pedagogy
  8. Good work - protecting jobs and decent working conditions
  9. Good work - training, professional development and learning
  10. Good work - managing performance
  11. Accountability for and governance of digital technologies
  12. Commercial and third-party providers - protecting education interests

Explainers and definitions
Further information

Introduction

Our principles and accompanying advice and guidance are intended to help NASUWT Representatives to judge whether digital technologies, including artificial intelligence (AI) technologies, are being designed, developed, procured and implemented in ways that secure and protect the lawful rights and interests of teachers, school leaders and learners.

This includes their educational and human rights, as well as their rights in relation to privacy and data protection, equality, employment and decent working conditions.

The UK General Data Protection Regulation (UK GDPR) provides an important means for challenging unlawful policies and practices relating to the use of digital technologies.

It is important to identify when personal data is being collected and processed through digital technologies and whether the data is being or could be used to monitor, predict or judge behaviours and performance.

There is also a need to establish:

  • who makes decisions about the design, development, procurement and application of the system or technology;

  • how decisions are made;

  • who controls the use of the system or technology;

  • who has access to the datasets generated by the technology or application; and

  • what happens to those datasets.

Our principles and related advice and guidance will help NASUWT Representatives to play an active role in supporting and challenging the introduction and use of digital technologies in the school or setting.

NASUWT principles

A public good and human right
  1. Education is a public good and a human right; digital technologies must therefore serve the broader goals and objectives of education. This includes ensuring high-quality inclusive and equitable education for all; recruiting, developing and retaining a high-quality teaching workforce; and building a just, sustainable society. [1]

This means that:

  • decisions about how digital technologies, including AI, are designed, developed, procured and applied in education are based on the lawful rights, including the human rights, and interests of learners, teachers and the wider school community;

  • digital technologies provide clear and demonstrable benefits for both learners and teachers;

  • digital technologies support the design and delivery of high-quality inclusive education and those technologies should be accessible. They must not create barriers to learning or disadvantage some groups of learners;

  • there is a strategic plan to address the design, development, procurement and application of the digital infrastructure and the technologies that sit within it. The plan is subject to meaningful consultation with staff and negotiated and agreed with recognised workforce unions. The plan should address the financial implications of developing, procuring and maintaining the technologies;

  • digital technologies are developed and applied in ways that are environmentally friendly, sustainable, support global citizenship and promote social justice;

  • in the case of AI-based technologies, there is human involvement and oversight at all stages of development and application of the technology or system;

  • the NASUWT’s principles are consistent with the principles set out in UNICEF’s Policy guidance on AI for children;

  • the NASUWT’s principles apply to third parties, including those who develop and supply digital technologies or who manage or support their use in schools.

For more support and further information, see UNICEF’s Policy guidance on AI for children.

The guidance draws on the United Nations Convention on the Rights of the Child (UNCRC) and sets out nine principles for upholding children’s rights:

  1. support children’s development and wellbeing;

  2. ensure inclusion of and for children;

  3. prioritise fairness and non-discrimination for children;

  4. protect children’s data and privacy;

  5. ensure safety for children;

  6. provide transparency, explainability and accountability for children;

  7. empower government and businesses with knowledge of AI and children’s rights;

  8. prepare children for present and future developments in AI;

  9. create an enabling environment.

Legal and regulatory frameworks
  2. The design, development, procurement and application of digital technologies comply with legislation, regulations and good practice standards

This means that:

  • digital technologies are designed, developed, procured and implemented in ways that uphold children’s rights including their rights under the UNCRC;

  • digital technologies are designed, developed, procured and implemented in ways that protect teachers and leaders’ rights, including their rights in relation to data protection and privacy, health and safety, equality and human rights, work and working conditions;

  • schools consult teachers, learners and the wider school community about the rationale, design, development, procurement and application of the technology;

  • schools actively engage recognised workforce unions in decisions about the design, development, procurement and application of digital technologies;

  • impact assessments identify risks, including for data protection and privacy, equality and equity, health and safety, workers’ rights and protections including workload, wellbeing, jobs and job roles. The results of assessments are shared with workforce unions and influence decisions about the introduction or use of the technology, including limitations and measures to mitigate adverse impacts;

  • the mechanisms for protecting rights are clearly set out in policies and procedures that are communicated to all relevant parties. Policies and procedures explain how breaches will be addressed;

  • if a technology or system is being used for new or different purposes, its use is suspended until it is clear that it does not have an adverse impact on the lawful rights and interests of learners or staff. Its use for a new or different purpose should also be agreed with recognised workforce unions;

  • staff and learners have the right to ‘opt out’ if they have legitimate concerns about the use of a particular technology or system which cannot be addressed through mitigations.

Legal and regulatory frameworks: data protection and privacy
  3. Digital technologies are designed, developed, procured and implemented in ways that uphold rights to privacy and data protection

This means that:

  • schools comply with the UK GDPR;

  • schools are transparent about the rationale, design, development, procurement and application of digital technologies;

  • schools explain how the technology or system is to be used in ways that teachers, learners and the wider school community understand. The explanation includes the steps to be taken when there are concerns about its use/misuse;

  • schools consult staff and negotiate with recognised trade unions about the rationale, design, development, procurement and application of the technology;

  • schools disclose what personal data is being collected through digital technologies, how the data will be stored and how it will be processed. This includes disclosing when algorithmic management systems are being used. It also includes providing information about the use of data to profile or predict behaviour or to judge performance;

  • schools ensure that the purposes for collecting and processing personal data are lawful and fair. This includes ensuring that:

    • the school only collects the personal data that is necessary for the intended purposes;

    • the intended purposes are stated explicitly;

    • the data is only processed for the intended purposes; and

    • the data is kept securely;

  • schools ensure that teachers, including those working as supply teachers, other staff and learners have access to their data and the right to object to its processing or to have it corrected;

  • data protection impact assessments (DPIAs) are undertaken in consultation with the workforce and others whose personal data is collected, stored and processed. A DPIA must:

    • describe the nature, scope, context and purposes of the processing;

    • assess necessity, proportionality and compliance measures;

    • identify risks to individuals;

    • identify any additional measures to mitigate those risks;

  • a technology is not used if the rights of staff and learners cannot be ensured and risks mitigated;

  • digital technologies are not used to monitor the workforce, for capability or disciplinary purposes, or to monitor communications between workers and NASUWT Representatives, as this would breach data protection and privacy regulations;

  • schools commit to not selling or giving away datasets that include personal information about learners, staff or the wider school community.

Legal and regulatory frameworks: equality, equity and inclusion
  4. Digital technologies comply with equalities legislation and support and enhance equality, equity and inclusion

This means that:

  • decisions about whether and how a digital technology/system is designed, developed, procured and applied are guided by detailed assessments of its equality impact, including its impact on those who share a protected characteristic under equalities legislation:

    • digital technologies are not designed, developed, procured or applied in ways that unlawfully discriminate against or exclude particular groups of staff or learners. AI algorithms do not introduce or replicate existing biases or prejudice;

    • issues that may affect groups of teachers, learners and/or members of the school community, including those who share a protected characteristic under equalities legislation, are identified;

    • particular attention is paid to how a digital technology addresses diversity, including through the stages of the value chain, i.e. design, development, trialling, procurement, application, review and evaluation;

    • assessments consider how digital technologies will support action to eliminate unlawful discrimination, advance equality of opportunity and foster good relations between groups;

    • assessments of a digital technology to support learning consider whether the technology could help to include learners who have an additional learning need. This includes clarifying the pedagogies and practices that support the inclusive use of the technology;

    • the results of assessments are shared with workforce unions and influence decisions about the use of a technology.

  • digital technologies are monitored to ensure that they are not being applied unfairly, in ways that result in unlawful discrimination or bias, or in ways that exclude;

  • consideration is given to using algorithmic systems to monitor and flag up issues relating to the equality impact of policies, procedures and practices, including those relating to pay, promotion and career progression and professional learning and development;

  • periodic reviews evaluate the extent to which digital technologies are supporting and enhancing equality, equity and inclusion;

  • equality impact assessments are undertaken when changes to digital technology policies and practices are proposed.

Consultation, negotiation and agreement
  5. The NASUWT and recognised workforce unions are engaged meaningfully at national and local levels in decisions about the design, development, procurement, implementation, review and continued use of digital technologies

This means that:

  • digital technology agreements are established which clarify the arrangements for consultation, negotiation and decision-making;

  • agreements make it clear that schools will consult staff and negotiate with recognised trade unions about the rationale, design, development, procurement and application of digital technologies. Agreements also clarify that decisions will only be taken forward with the agreement of workforce unions;

  • the NASUWT and recognised workforce unions are kept informed of proposals, plans and decisions relating to the design, development, procurement, application and review of digital technologies;

  • staff and learners who could be affected by a digital technology are consulted as part of the impact assessment process and their views and needs influence decisions about the design, development, procurement, application, review and withdrawal of digital technologies;

  • there is ongoing monitoring of digital technologies and periodic reviews and evaluations of their use and impact and recognised workforce unions are actively engaged in the review and evaluation processes.

Teacher professionalism and agency
  6. Teachers have a voice and influence and their views and needs inform decisions about whether and how digital technologies are designed, developed, procured and applied

This means that:

  • decisions about how a digital technology is designed, developed, procured and/or applied are based on the views of teachers and the needs of teachers and learners;

  • teachers are consulted about how a technology will support or hinder teaching and learning in their classrooms and decisions about whether to use the technology are based on this feedback. A technology is not introduced if it will have a detrimental impact on teaching and learning;

  • where needed, teachers are given appropriate information and technical support so that they can provide informed and meaningful feedback and opinions;

  • measures are in place to enable a technology to be trialled and reviewed;

  • teachers are kept informed of proposals and decisions about the use of digital technologies;

  • digital technologies do not deskill teachers or de-professionalise teaching, for instance by removing tasks that involve the teacher’s professional judgement and expertise. Teachers must maintain their professional autonomy and agency;

  • impact assessments consider whether the introduction of a technology could change the role and core professional responsibilities of the teacher, either directly or indirectly;

  • digital technologies are monitored for their impact on teachers’ and leaders’ roles and responsibilities;

  • periodic reviews are undertaken to ensure that digital technologies are not having an adverse impact on the professional responsibilities and agency of teachers.

Curriculum, assessment and pedagogy
  7. Decisions about whether and how digital technologies are designed, developed, procured and applied are determined by curriculum goals, learning objectives and the purposes of assessment

This means that:

  • digital technologies support and enhance teaching and learning, including in ways that support equality, equity, inclusion and global citizenship;

  • pedagogies support effective learning:

    • new pedagogical approaches may be needed to make effective use of digital technologies;

  • curricula and assessments may need to respond to new digital developments, including those outside of education, which may influence how learners engage with learning and assessments;

  • the curriculum educates learners to function and thrive in a digital world. This includes supporting them to be digitally literate, helping them to distinguish between fact and fiction and enabling them to become active global citizens who make informed decisions about the safe, sustainable and ethical use of AI and digital technologies;

  • teachers have sufficient time to assess the usefulness of digital technologies and to plan and prepare for their incorporation into lessons.

Good work - protecting jobs and decent working conditions
  8. Digital technologies are designed, developed, procured and implemented in ways that protect jobs and workers’ rights and secure good working conditions

This means that:

  • teachers and other staff who will be affected by a technology or system are consulted about the proposals and plans are negotiated and agreed with recognised workforce unions;

  • impact assessments assess the impact of a digital technology or system on workload, stress and wellbeing, jobs and job roles and the results inform and influence decisions about the design, development, procurement, implementation, adjustment and/or withdrawal of digital technologies;

  • digital technologies do not replace or displace teachers, including teachers who do not have a permanent contract or place of work, such as supply teachers;

  • if digital technologies require teachers to undertake new tasks, those tasks replace existing tasks and are manageable and sustainable;

  • digital technologies do not deskill or undermine the professional status of the teacher;

  • particular attention is paid to how digital technologies impact on practice:

    • policies make it clear that teachers have the right to switch off and the timeframe within which this should happen, e.g. 6pm to 8am;

    • steps are taken to prevent teachers feeling under pressure to work longer hours or undertake additional tasks, e.g. introducing digital systems that prevent communications being received outside working hours;

  • action is taken to address any pre-existing generators of workload;

  • digital technologies are monitored for their impact on workload and this feeds into periodic reviews and evaluations of teacher and leader workload and wellbeing;

  • the companies that develop or supply the digital technology respect workers’ rights, including their right to decent pay and working conditions and their right to organise and join a trade union;

  • digital technologies should:

    • reduce bureaucracy and workload burdens;

    • support teachers and leaders to teach and lead teaching and learning;

    • be implemented in ways that ensure the right to a work/life balance, including the right to switch off.

Good work - training, professional development and learning
  9. Teachers have sufficient and equal access to personalised training and continuing professional development and learning (CPDL) to enable them to make informed decisions about the use of digital technologies in their teaching

This means that:

  • teachers have an entitlement to CPDL and dedicated time within the working day to undertake CPDL. This entitlement is fully funded and covers all teachers, including those who work part time and those who have no permanent contract or place of work;

  • CPDL enables teachers to develop a critical understanding of digital technologies. This includes the technical aspects of their use and the benefits and limitations for teaching and learning, including an understanding of the implications for equality, inclusion and human rights;

  • teachers have access to specialist support and to professional learning networks to share ideas and innovative practice relating to the use of digital technologies;

  • teachers have access to experts who can explain how the technology functions and the legal, regulatory and good practice standards relating to its introduction and use. Time is provided to enable any associated training to take place;

  • consideration is given to using digital management systems to map teacher competencies and skills and flag up professional learning and development needs. Such systems would need to be designed to ensure equality and equity of access and must not be applied punitively.

Good work - managing performance
  10. Where digital technologies enable monitoring of a teacher’s practice, this information is controlled by the teacher and is used only for self-reflection and personal development purposes

This means that:

  • staff who will be subject to a performance management system are consulted about the design, development and implementation of the system;

  • workforce unions are actively engaged at all stages of the process to design, develop, procure and implement performance management systems and decisions about their introduction and use are negotiated and agreed with the unions;

  • digital technology is not used for punitive, high-stakes performance management purposes:

    • using digital technologies for high-stakes performance management purposes, including purposes such as pay progression and promotion, may breach data protection regulations.

Accountability for and governance of digital technologies
  11. Digital technologies are inclusively governed, with the employer as the ‘responsible body’, liable for any harms that arise from the deployment of the technologies

This means that:

  • the employer is responsible for ensuring that digital technologies are implemented in ways that comply with legislation, regulations and good practice standards. This includes ensuring that data is kept secure and confidential;

  • the employer is accountable for any errors or biases that arise from implementing the technology or system;

  • a participatory data stewardship approach is adopted to involve teachers, learners and other members of the school community in the use of data. The approach is consistent with the Ada Lovelace Institute’s (ALI’s) framework for participatory data stewardship, which sets out five commitments:

    • inform: a commitment to keep people informed about how their data is being governed;

    • consult: a commitment to listen to, acknowledge, and provide feedback to people on concerns and aspirations for the governance of their data;

    • involve: a commitment to work with those people to ensure that their concerns and aspirations are directly reflected in data governance;

    • collaborate: a commitment to seek their advice and innovation in the design of data governance models and to incorporate their recommendations where possible;

    • empower: a commitment to advise and assist in line with their decisions about their own data governance models;

  • digital technologies are monitored and reviewed and evaluated periodically to assess the ongoing educational, social, ethical benefits and risks associated with their use:

    • evaluations draw on the results of monitoring about privacy and data protection, equality and equity, workload and wellbeing;

    • reviews and evaluations ensure that digital technologies are not being used beyond their original purpose.

Commercial and third-party providers - protecting education interests
  12. Developers and suppliers of digital technology systems and products operate in ways that are consistent with the principles above

This means that:

  • the developer and/or supplier of the technology or system operates fair and sustainable employment practices, including for workers based in other countries. They provide decent work and working conditions and recognise workers’ rights to organise and join a trade union;

  • procurement/supplier contracts include explicit clauses to clarify that the school has joint data access and control. This means that contracts should:

    • include the right to demand amendments to or the withdrawal of digital systems if intended or unintended harms or faults are detected;

    • clarify that the school will not accept updates or amendments to digital technologies/systems that could have an adverse impact on the lawful rights, including human rights, and interests of learners and staff;

    • clarify that the school will have a say in what happens to data when a contract is terminated;

  • developers/suppliers have conducted impact assessments and shared the results of these assessments with the school. Assessments should identify the risks and detail mitigations in relation to:

    • data protection and privacy of staff, learners and the wider school community;

    • equality, equity and inclusion of staff, learners and the wider school community, including those who share a protected characteristic under equalities legislation;

    • workload and wellbeing of staff and learners;

  • the results of impact assessments are shared with recognised workforce unions and made available to staff, learners and the wider school community:

    • there should be clear evidence that the designer/developer/vendor has taken appropriate action to identify and mitigate potential adverse impacts;

  • developers/suppliers monitor and periodically review and evaluate the digital technology/system;

  • datasets that include the personal data of learners, staff and/or other members of the school community should not be sold, given away or transferred to third parties without explicit consent;

  • there is evidence that developers/suppliers are committed to operating in ways that are sustainable and have a positive impact on society and the environment;

  • if the school or setting has no direct influence over the design of a digital technology, e.g. a digital platform by a global company such as Microsoft or Google:

    • the technology is applied in ways that adhere to the principles above;

    • functions that pose risks for data protection and privacy are identified and disabled or rejected or users are informed of the risks and their right to ‘opt out’;

    • there is ongoing monitoring of the technology to identify new and emerging risks and unacceptable practices that are reported to the employer and to the NASUWT and recognised workforce unions:

      • e.g. evidence that user data (teachers, learners, parents) is being used to profile and target products and resources;

      • e.g. evidence that the user data is being repackaged and sold to third parties who then target those users.


Explaining digital technologies, AI, algorithms, generative AI and algorithmic management

Digital technologies are electronic tools, systems, devices and resources that generate, store or process data. [2]

Digital learning refers to any type of learning that uses technology.

Digital technology in education can be categorised as technology that is used for education management and delivery, technologies for learning and assessment, and technologies to support and enhance teaching.

There are many definitions of AI and definitions are likely to change as AI evolves. UNICEF defines AI as:

‘…machine-based systems that can, given a set of human-defined objectives, make predictions, recommendations, or decisions that influence real or virtual environments. AI systems interact with us and act on our environment, either directly or indirectly. Often, they appear to operate autonomously, and can adapt their behaviour by learning about the context.’ (UNICEF 2021)

An algorithm is a mathematical rule or process which is followed to perform a calculation or solve a problem.

Algorithms set out the logical steps that digital technologies follow to process data to make predictions, recommendations or decisions.
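As an illustration only (the function, thresholds and categories below are invented for this example and are not drawn from any real school system), the kind of simple rule-based algorithm described above might look like this in code:

```python
# Hypothetical illustration of a simple rule-based algorithm.
# It follows fixed, human-written steps to turn data (an attendance
# rate) into a recommendation; every threshold is set by a person.

def flag_attendance(attendance_rate: float) -> str:
    """Return a recommendation based on a pupil's attendance rate (0-100)."""
    if attendance_rate >= 95:
        return "no action"
    elif attendance_rate >= 90:
        return "monitor"
    else:
        return "review with pastoral team"

print(flag_attendance(97))  # no action
print(flag_attendance(85))  # review with pastoral team
```

Because every step is explicit, a human can trace exactly why a given recommendation was made. This contrasts with machine-learning systems, whose internal rules are learned from data rather than written by hand.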

Machine learning allows algorithms to extract correlations from data, build models and refine the models as the machine learns. These models can be extremely complex, making it impossible for a human to work out how a particular decision, recommendation or prediction was reached.

Machine learning also means that the way in which the technology operates may change over time.

Generative AI is a type of AI system where the algorithms have been trained to respond to prompts and create new outputs such as images, text and audio. Generative AI deals with huge amounts of data at speed. ChatGPT is one example of a generative AI system.

Algorithmic management is a system which either partially or fully automates a management task. It includes technology-driven surveillance or monitoring, for instance to track and allocate tasks or to set performance targets or measures.

Further information

NASUWT advice and guidance
Data protection and privacy
Remote and hybrid education
TUC
Information Commissioner’s Office
Hubs providing resources and tools for unions on digitalisation and AI
Other sources of advice, support and information

Footnote
[1] This reflects NASUWT policy, e.g. see World Class Schools, but also international agreements such as the Sustainable Development Goal for Education (SDG4) and the purposes of education set out in the United Nations Convention on the Rights of the Child (UNCRC)
[2] Department of Education, Victoria State, Australia - https://www.education.vic.gov.au/school/teachers/teachingresources/digital/Pages/teach.aspx

 


