Introduction
Preparing to negotiate an agreement
Negotiating a digital technology agreement
Information and resources
Appendix - Things to ask a third party provider

Introduction

This guidance is intended to help NASUWT representatives to negotiate collective digital agreements relating to the use of artificial intelligence (AI) and digital technologies. It draws on the TUC’s guidance, People-powered technology: collective agreements and digital management systems and reflects NASUWT principles for the ethical design, development, procurement and application of digital technologies and AI, which can be found on the Artificial Intelligence and Digital Technologies page. This means that the guidance addresses work and working conditions, as well as factors relating to teachers’ and leaders’ professional practice and the impacts on learners, learning and education.

Sections of the guidance may also be useful in supporting work to develop and review a digital strategy and when bargaining around the use of digital technologies and AI.

Preparing to negotiate an agreement

Before you negotiate a collective agreement for digital technologies and AI, you should seek to understand the existing arrangements, strategies and policies that could impact on the agreement and what needs to be negotiated. This might include strategies, policies and arrangements for particular technologies or systems, as well as those that relate to the application and procurement of technologies and AI more generally.

Tip: You might also find it useful to consider the following if your school is developing or reviewing its strategy for digital technologies and AI:

Points to consider:

Does the employer have a digital technology strategy?

If there is a strategy:

What does it cover?

  • Does it cover the design, development, procurement, application and review of technologies?

  • Does it cover matters relating to learners and learning, teachers and teaching and work and working conditions?

  • Does it cover all technology within a school or across a group of schools, or does it just cover some elements, e.g. technical infrastructure, digital management platforms?

  • When was it produced?

  • Who was involved in its development?

  • What, if any, consultation took place with recognised trade unions?

  • Who is responsible for it?

  • When is it due to be reviewed?

  • Does it need updating or replacing?

Are there agreements or policies that cover particular technologies or particular groups within the school(s) or school community?

  • If so, what do they cover and who do they cover?

  • Do they address matters relating to equality, equity and inclusion?

What policies are in place to address the use of digital technologies and AI in teaching and learning?

  • What policies and procedures are in place to address the use of digital technologies and AI in assessments?

  • In the case of formal qualifications and assessments, do policies and procedures adhere to guidance intended to protect the integrity of qualifications?

  • How are matters relating to equality, equity and inclusion addressed?

Do teachers and other staff receive training and professional development regarding the use of digital technologies and AI?

  • Is this fair and equitable?

  • Are the professional learning and support needs of teachers and other staff being met?

Do contracts of employment and policies relating to conditions of service address the use of digital technologies?

  • If these have been updated to include the use of digital technologies, were they agreed with the individuals affected and workforce unions?

  • How is this reflected in the digital strategy, if there is one?

Do contracts for systems and services cover the use of digital technologies?

  • How are these reflected in the digital strategy, if there is one?

How are decisions to adopt or to stop using a particular technology, system or application made?

  • Are teachers and users of the technology, system or application consulted and involved in decision-making?

What assessments of technologies have been undertaken and what do they say?

  • Did they cover data protection and privacy, equality, staff workload and wellbeing, health and safety?

  • Was NASUWT consulted and involved in the assessment(s)?

  • Were teachers and other staff consulted?

  • How are the views of NASUWT, teachers and other staff reflected in assessments and subsequent decisions?

  • Are there gaps and improvements that could be made to assessment processes?

In relation to a particular technology or system, who are the people or groups who may be affected by the technology?

  • How will it impact on them?

  • What risks does it pose?

  • Are there opportunities or potential benefits from adopting the technology?

Are there regular audits of the technologies, systems and applications used across the school or setting and are they catalogued?

  • Are systems, technologies and applications streamlined and coherent?

  • Is there a process for evaluating the use and value of a technology, system or application? If there is, who is involved in this process?

  • What happens when conflicts between systems, technologies and applications are identified?

Negotiating a digital technology agreement

You should address the following areas when you negotiate a collective agreement:

  1. Defining the technology or technologies to be covered by the agreement

  2. Principles that underpin the agreement

  3. Redlines

  4. Mechanisms that will facilitate consultation, participation, negotiation and the provision of information

  5. Arrangements that the agreement should cover:

    1. assessment, monitoring, review and evaluation; and

    2. governance and accountability.

  6. Things to ensure:

    1. human expertise, human review and human contact;

    2. decisions are transparent and explainable;

    3. equality, equity, diversity and inclusion;

    4. good work;

    5. staff and learners have power and control over their data; and

    6. just distribution of rewards.

Defining the technology or technologies to be covered by the agreement

You will need to clarify whether the agreement will apply to all digital technologies and systems or specific ones.

It will be useful to have an agreement that covers different technologies and contexts, particularly where principles are established to guide what should and should not happen. However, it may not always be possible to do this as agreements may already exist that cover particular technologies, groups (e.g. learners) or contexts (e.g. contracts with third parties).

In such instances, you should try to secure an agreement that could be updated in the future to cover the wider use of technology or different contexts or groups. You may also need to address what will happen where agreements overlap or impact on one another.

You should try to anticipate the different ways in which a technology might be used. For instance, video technology such as CCTV might be used for security and behaviour management purposes but could also be used to monitor staff. Similarly, sensors on portable devices and wearables might enable managers to monitor a teacher and use that information to judge their performance. Therefore, it will be important to include limitations and conditions relating to the use and removal of a technology or system under the terms of a collective agreement.

Technologies evolve and may include new features and capabilities. You should try to ensure that any agreement is flexible enough to remain relevant to these changes. It will also be vital to ensure that rights are protected under these new purposes.

An agreement will need to clarify whether it applies to teachers, school leaders, learners or the wider school community.

It will be important to ensure that teachers working as supply teachers or on temporary contracts are covered by a collective agreement and that appropriate time is provided (with pay) for them to read and understand any such agreement prior to undertaking an assignment.

Principles that underpin the agreement

NASUWT principles for the ethical design, development, procurement and application of digital technologies and AI

NASUWT has established 12 principles for the ethical design, development, procurement and application of digital technologies and AI in education, which are reflected throughout this guidance and can be found on the Artificial Intelligence and Digital Technologies page.

The webpage includes an explanation of what each principle means in practice, along with links to additional sources of support and further information. This may be useful if you want more information to support subsequent stages of work to secure an agreement.

Redlines

Tip: The redlines may be useful if you are bargaining around the use of AI and digital technologies.

You should establish redlines to ensure that digital technologies and AI cannot be used to introduce harmful and unacceptable practices into the school or setting.

If the agreement relates to a particular technology or technologies, it may be useful to focus on clarifying redlines that relate to the work or activities that will or could be undertaken in relation to that technology. In some instances, it may be necessary to outlaw specific technologies. In other instances, it may be important to clarify the situations or purposes when a technology should not be used.

Redline principles might include:

  • Explainable and understandable:

Employers must only be permitted to use AI and digital management systems where the purposes of those systems and the decisions made by those systems can be sensibly explained and understood by staff, learners and others using or subject to those systems.

  • Equality, equity, diversity and inclusion:

Employers and third parties delivering or managing services used by schools or settings must not use a technology, process data or operate a system in a way that results in unlawful discrimination or bias. Technologies and systems should also be designed and implemented in ways that support the inclusion of all learners and staff, including those with a protected characteristic, and help to address inequities.

It is important to note that a school or setting’s responsibilities in relation to equalities legislation, including responsibilities relating to the Public Sector Equality Duty (PSED) in Great Britain and Section 75 of the Northern Ireland Act 1998, also apply to a provider carrying out functions on behalf of the school or setting. This means that contracts and agreements with that party will need to cover these responsibilities.

  • Educational benefits:

A technology must not be used simply because it is available or because it is an innovation. A digital technology should support educational goals and objectives and there should be clear educational benefits from using a digital technology. Proposals to introduce a technology must be subject to impact assessments, including on workload. The benefits of using a technology and actions to mitigate any adverse impacts should be clarified before a decision is made to use the technology.

  • Affordable and sustainable:

The costs of purchasing, leasing, implementing, maintaining and updating or replacing a technology must be affordable and sustainable. It will be important to consider what technology already exists and whether a digital technology fits within the existing technical infrastructure. It will also be crucial to clarify the costs of training staff, of providing teachers with time to become familiar with the technology and integrate it into their teaching practice, and of providing any expert support that may be needed.

  • Human expertise and involvement in decision-making:

Teachers, including their union representatives, must be part of the decision-making process when determining whether and how a digital technology is used for teaching and learning and whether and how a digital management system is implemented. Decisions should be informed by the results of impact assessments. A digital technology must not replace the professional responsibilities or judgement of the teacher.

  • Disclosing when and how a digital management system is being used:

Teachers, leaders and their union representatives must know when a digital management system is being used in the school or setting and how it is being used. The information provided must be accessible and intelligible.

  • Openness and transparency:

Sufficient information must be provided about the way in which a digital management system or technology operates in the employment and/or educational contexts so that teachers and leaders can be satisfied that the technology is being used in a way that is accurate, rational, non-discriminatory, proportionate, lawful and ethical. This includes systems and technologies supplied and managed by third parties.

  • Privacy and data protection:

A digital technology or system must comply with data protection and privacy legislation and regulations. It must not be used unless it is clear that the privacy of individuals, including teachers and learners, is ensured. This must address the collection, storage and processing of personal data, including who has access to the data and for what purposes, as well as the deletion of data. It must also address the use of data for profiling purposes.

  • Health and safety:

A digital technology or digital management system must not have a negative impact on the physical or mental health or safety of staff or learners. New technology must comply with health and safety laws and regulations as well as agreed procedures to deal with health and safety issues. New technology risk assessments must be undertaken with the full involvement of all relevant workforce union representatives.

  • Monitoring and surveillance:

Digital technologies must not be used to monitor staff for capability and performance purposes. All digital technologies that are capable of surveillance and monitoring must be identified and the applications of those technologies explicitly agreed with workforce unions and communicated to staff.

  • Workload and wellbeing:

Digital technologies must not add to staff workload. If digital technologies require teachers, leaders and other staff to undertake new tasks, these tasks should replace existing tasks. Workload impact assessments must be undertaken and inform decisions about whether and how a technology is introduced and implemented. Digital technologies must also be monitored for their impact on workload and wellbeing and the risk of workload intensification kept under review.

  • Right to disconnect:

Teachers, leaders and other staff must have the right to a reasonable work-life balance, including the right to disconnect. Staff must not be penalised for exercising this right. Schools and settings must manage parental expectations and clarify that teachers should not respond to queries or requests from parents outside of agreed working hours.

Explicit policies should specify that teachers have a right to a reasonable work-life balance and clarify that this is the employer’s expectation for staff. In having regard to this, leaders and managers should ensure that they adhere to the working limits set out in the Working Time Regulations 1998. A policy might set out when staff should not respond to emails or queries from parents or learners. Systems should be set up in consultation with the recognised trade unions to enable staff to disable access outside these times.

  • Pay progression and promotion:

Staff must not be penalised, including in decisions about pay progression and promotion, for exercising their rights to switch off and to a work-life balance.

  • Access to training, support and expertise:

Teachers must not be expected to use digital technologies in their teaching unless they have received appropriate training and agree to its use. They must have access to expert support, where needed. There must be specific time for training and professional development and learning within the working day. This time should be distinct from and additional to time that is provided for other professional duties such as marking, planning and assessment.

  • Risks to jobs and job security:

The use of a digital technology must not result in job losses. Digital technologies must not be used to replace teaching posts or change the structure of the workforce. Employers must consult and seek agreement with workforce unions where technologies could have implications for jobs, job security, job roles and workforce structures.

  • Commercial and third parties providing technologies and/or managing systems and services:

Procurement arrangements and contracts must comply with this guidance, including these redlines.

It may be difficult to establish redlines that apply across all contracts and technologies provided by third parties, particularly where companies are global players. However, it will be important to try to secure commitments that are consistent with NASUWT’s principles.

This could be done through their inclusion in procurement arrangements and contracts. It might also be appropriate to secure an agreement with the employer about how the technology is applied, including disabling particular functions.

You should try to secure a commitment that personal data relating to staff, learners and members of the wider school community will not be used to profile those users and that the data will not be repurposed and sold.

You should press for an agreement with a third party that commits the third party to work collaboratively with teachers and unions. This should include working collaboratively to monitor and review the use of the technology or system in order to identify and provide solutions to problems, but it might also include collaboration to develop and implement a technology or system.

Mechanisms that will facilitate consultation, participation, negotiation and the provision of information

Data protection regulations mean that the employer is obliged to inform individuals (including staff and learners) when their personal data is being collected and for what purpose(s). This information should be provided in a way that is transparent and understandable. The agreement should clarify the arrangements for ensuring that this happens.

The agreement should address the purposes of consultation and negotiation. It should clarify the arrangements for enabling teachers to contribute to decision-making and ensure that their needs and views are heard. This includes the arrangements for negotiating with union representatives.

The agreement should clarify the arrangements for consulting and negotiating where third parties provide or manage the digital technology or system. It will be important to clarify the role of the third party in providing information about the technology or system and to address issues such as profiling and the repurposing of data and the mechanisms for securing changes where issues are identified.

You may find the Ada Lovelace Institute’s Framework for Participatory Data Stewardship (Inform; Consult; Involve; Collaborate; Empower) useful when negotiating arrangements that will support the inclusive governance of AI and digital technologies.

The agreement should commit to ensuring that teachers and other members of the school community have access to the necessary expertise to enable them to make informed judgements about the use of a digital technology.

It may be appropriate to establish a digital technology committee and to have union representation on that committee. The agreement should clarify that the employer will provide representatives with training so that they can fulfil the role and that union representatives will receive facility time to undertake the role. Members of the committee may also need to have access to technical experts. The agreement should clarify when and how this will be provided.

Arrangements that the agreement should cover

Assessment, monitoring, review and evaluation

The agreement should clarify the arrangements for assessing the potential impact of a technology before decisions are made to procure or introduce the technology. The results of the assessment should inform decisions about whether to procure or use the technology.

The agreement should also set out the arrangements for monitoring, reviewing and evaluating a technology that has been introduced, including a technology that is provided or managed by a third party. This needs to include the arrangements for stopping use of a technology and amending or withdrawing from a contract.

Further, the agreement should set out the arrangements for engaging workforce unions and representatives in the assessment, monitoring, review and evaluation of a technology or system.

Impact assessments should:

  • identify the persons and groups sharing a relevant interest;

  • analyse the risks and impacts of the technology or system, including those relating to:

    • human rights, equality, equity, diversity and inclusion;

    • data protection, safety, privacy and security;

    • good work, including workload and wellbeing;

    • jobs, including job security, job roles and job status;

    • teaching and learning;

    • training and professional development needs;

  • include a technical audit;

  • provide a statement about the process for stakeholder participation, including union involvement and the period for wider consultation; and

  • identify the issues and risks, and the actions that will be taken to mitigate them.

Data Protection Impact Assessment

The UK GDPR requires the employer to undertake a data protection impact assessment (DPIA) where processing is likely to result in a high risk to individuals. High risk includes the processing of special category data (such as data relating to health or ethnic background), the use of profiling or automated decision-making to make judgements about people (e.g. for performance management and pay progression), processing personal data in a way that involves tracking individuals’ online or offline location or behaviour and processing of children’s personal data for profiling or automated decision-making.

The Information Commissioner’s Office (ICO) guidance also clarifies that it is good practice to undertake a DPIA even when one is not required. You should press for a collective agreement to include a commitment to undertake a DPIA when a new technology is being considered.

Equality Impact Assessment

The employer has obligations in relation to equalities and human rights legislation.

These obligations also apply to third parties carrying out public functions on behalf of the employer. The PSED requires schools in England, Wales and Scotland to be proactive in eliminating unlawful discrimination, advancing equality of opportunity and fostering good relations between groups.

Equality impact assessments are a vital means for identifying and mitigating potential risks as well as identifying and promoting positive practice. You should press for the agreement to set out how these responsibilities and requirements will be addressed.

Health and Safety Risk Assessment and Workload and Wellbeing Impact Assessments

Health and safety legislation requires employers to ensure the health, safety and welfare of employees as far as reasonably practicable. It also requires them to consult and work cooperatively with health and safety representatives to achieve this responsibility.

Further, it requires them to assess the risks and take action to prevent exposure to risks. The agreement should clarify that the employer or any third party carrying out functions for the employer will undertake a health and safety risk assessment and take action to mitigate any risks that are identified. This includes identifying and mitigating potential risks to teachers’ workload and wellbeing.

Governance and accountability

The UK GDPR imposes a framework of responsibilities and obligations for data controllers. If unions are playing a greater role in data governance arrangements, you will need to consider any additional responsibilities that may arise from this. The agreement should clarify the nature of union involvement and the responsibilities that arise from their involvement.

The agreement should clarify that the employer is the responsible body and will be liable for any harms that arise from the deployment of a digital technology or AI system. This includes ensuring that the technology or system is implemented in ways that comply with legislation and regulations and that data is kept secure and confidential. Further, the agreement should clarify that the employer is accountable for any errors or biases that arise from implementing the technology or system.

You should try to secure a participatory approach to data stewardship and the oversight of AI and digital technologies or systems. See the Ada Lovelace Institute’s Framework for Participatory Data Stewardship (Inform; Consult; Involve; Collaborate; Empower). The agreement should set out these commitments.

The agreement may need to address governance and accountability arrangements where a third party is providing or managing a digital technology, system or service on behalf of the school or setting. The agreement should clarify that these will be set out in procurement arrangements and contracts.

Things to ensure
Human expertise, human review and human contact

It would not be appropriate for AI or a digital technology (e.g. an adaptive tutoring or assessment system) to be introduced in ways that remove professional responsibilities or judgements about teaching and learning from the teacher. When responding to proposals to introduce AI or a digital technology to support teaching and learning, you might consider questions such as:

  • How does the technology or system interact with the role of the teacher?

  • Is it being used to enhance teaching and learning?

    • If so, does the teacher retain control over decisions about the pedagogical approaches used in the classroom as well as judgements about learners’ learning and progress? Or are these judgements being made by the AI system or programme?

  • If AI or digital technologies are being introduced to reduce teachers’ workload, does the teacher have choice about how they plan and deliver lessons?

  • Are teachers expected to adapt AI-generated lesson plans to ensure that they are inclusive and relevant to the class? And is time factored in to enable teachers to do this?

You should press for the agreement to clarify that AI and digital technologies will not be used to remove professional responsibilities from the teacher and that the teacher will be able to exercise their professional judgement about the appropriate use of AI and digital technologies. The agreement should also clarify that teachers will have sufficient time to plan for and fulfil these responsibilities.

The agreement needs to clarify that teachers have the right to a human review if a technology or AI system is used for digital management purposes. However, performance management should be a constructive and developmental process, based on professional dialogue. You should press for the agreement to clarify that digital management systems will only be used in ways that are consistent with this position.

It will be useful if the agreement includes a commitment stating that data relating to staff will remain confidential to the individual and will not be used to judge or evaluate the performance of individuals.

AI and digital technologies should not be used in ways that could mislead people (e.g. staff, learners, parents) into believing that they are communicating with a person when they are actually communicating with the technology. The agreement should clarify this point and stress the importance of human contact in teaching and learning, particularly in relation to mental health and wellbeing.

Decisions are transparent and explainable

You will need to ensure that the agreement includes a commitment to let staff know when technology is being used to make decisions about them at work. It should also be clear when technology is being used to make decisions about learners. The way in which decisions are made should be easy to explain and to understand.

There should be enough information about the technology to ensure that teachers, leaders and the wider school community can trust that it will operate fairly. This information is also critical in order to challenge unfair or discriminatory decisions made by the technology, or to know when inaccurate or misleading data has been used.

An agreement should clarify that transparent and explainable decision-making applies to decisions that relate to job applicants, supply or agency teachers and staff and prospective learners.

An agreement might distinguish between information that is transparent and explainable for the purposes of general awareness and transparent and explainable information that is targeted at particular groups or individuals.

Information will need to be kept up to date and the agreement should cover the arrangements for ensuring this happens, along with the mechanisms for keeping the staff, union representatives and other members of the school community informed of any changes.

Equality, equity, diversity and inclusion 

The agreement should set out the steps that will be taken to ensure that the introduction, procurement and/or application of a technology or AI system does not result in unlawful discrimination or bias and that the school or setting fulfils its wider responsibilities in relation to equalities legislation.

The agreement should clarify the arrangements for assessing the impact of proposed systems and technologies and for monitoring, reviewing and evaluating systems and technologies when they are in place. You will need to ensure that this includes clarifying the roles of consultation and negotiation in these processes and the steps that will be taken to mitigate adverse impacts.

The agreement will need to cover the role of third parties that design, develop and/or manage a technology or system on behalf of the school or setting. It may be necessary to obtain information from the third party in order to ensure that principles and commitments are being met. It will be helpful if the agreement clarifies that teachers’/workers’ rights and learners’ rights must be recognised and should be prioritised over intellectual property rights.

You might use the school’s or setting’s responsibilities in respect of equalities legislation, data protection, safeguarding and wellbeing to press for this to be included in the agreement and reflected through contracts and procurement arrangements.

Good work

The agreement should address NASUWT’s good work principles that:

  • technologies are designed, developed, procured and implemented in ways that protect jobs and workers’ rights and secure good working conditions;

  • teachers have sufficient and equal access to personalised training and continuing professional development and learning (CPDL) to enable them to make informed decisions about the use of digital technologies in their teaching; and

  • where digital technologies enable monitoring of a teacher’s practice, this information is controlled by the teacher and, where used, is only used for self-reflection and personal development purposes.

It is often claimed that a key benefit of digital technology or AI is that it will reduce teachers’ and leaders’ workload and improve their wellbeing. However, NASUWT evidence finds that the introduction of digital technologies often has the opposite effect.

You should ensure that the agreement addresses the factors that will help to reduce workload; for example, a commitment that new tasks will replace old tasks as well as the broader commitment to tackle the generators of workload.

The agreement should clarify that a workload impact assessment will be undertaken before decisions are made to introduce a technology or make changes to working practices. The agreement should also commit to ongoing monitoring and periodic reviews of the technology’s impact on workload and wellbeing.

If teachers use digital technologies or AI to plan lessons or develop resources for their classrooms, they will need time to critique, contextualise and adapt those resources to meet the needs of the learners they teach. You should press for the agreement to clarify that teachers will be given time within the working day to critique and adapt digital technology and AI-generated resources so that they can be used effectively.

You will need to ensure that the agreement addresses the wellbeing of staff. It should include a commitment to assess the impact of a technology or system on wellbeing and set out the arrangements for monitoring and reviewing wellbeing, including the mechanisms for consulting and obtaining feedback from staff.

The mechanisms for obtaining feedback from staff should include the right to report any unintended or unplanned consequences of a technology or system.

You should press for the agreement to make reference to policies that recognise and seek to mitigate the risks of digital technologies and AI on workload, work intensification and wellbeing.

The agreement should clarify that teachers should not be contactable or required to respond outside of working hours. It might also commit to a technology being designed or implemented in a way that prevents access or responses outside agreed hours.

Staff and learners have power and control over their data

A collective agreement must address the collection and use of data. It will need to cover data protection, privacy and rights under the UK GDPR, including rights relating to transparency and explainability. You should also press for the agreement to address the broader factors that enable staff and learners to have power and control over their data.

The agreement should include a commitment that data relating to staff will remain confidential to the individual and will not be used for individual performance management. It should also include commitments to train and develop staff and learners so that they have the necessary knowledge and understanding to have control and influence over their data.

Further, you should seek a commitment that union representatives will be provided with training on the use of data, digital technologies and digital management systems so that they can carry out their representation and negotiating roles effectively.

You should aim to get commitments that ensure effective procedures and safeguards are in place. These need to cover the collection of data, how it is stored, how it is used, whether and how the data is shared and how the school or setting will ensure that data is fair and accurate.

The commitments also need to address restrictions on the use of data beyond its original agreed use, restrictions on selling the data and provide assurance that individuals or groups will not be subject to any detriment as a result of inaccurate processing of data.

Just distribution of reward

Employers may seek to use digital technologies and AI systems to respond to systemic pressures such as the teacher recruitment and retention crisis and school funding pressures. They might also seek to use them to disrupt or secure changes that serve their interests.

It is vital that digital technologies and AI are introduced and implemented in ways that deliver demonstrable educational benefits, share the rewards, and have been agreed in advance.

You might use factors relating to equality, equity and inclusion, good work and training and development to challenge attempts to introduce or use digital technologies or AI for negative purposes. However, it will also be important to press for the agreement to recognise that the positive benefits of digital technologies and AI should be shared across the workforce and the wider school community.

A collective agreement might include a commitment to reinvest savings from the use of AI and digital technologies in training and professional development, higher pay, measures to reduce workload and improve the wellbeing and work-life balance of all staff, and/or measures to improve the inclusion of all learners, particularly those who have additional needs.

Information and resources

Further information and sources of support

NASUWT

Artificial Intelligence and Digital Technologies - principles for the ethical design, development, procurement and implementation of digital technologies and AI in education

(The webpage provides links to other sources of help and further information relating to digital technologies and AI)

Other NASUWT publications and resources that may be helpful when negotiating a collective digital agreement include:

External resources

The following resources may be particularly useful when negotiating a collective digital agreement:

Appendix – Things to ask a third party provider

Information that might be needed to fully understand an AI system or technology and how it operates:

  • the programme developer and implementer of the system;

  • a description of the objectives of the system;

  • details of the training data and data sources;

  • details of the variables used and how they have been weighted or prioritised;

  • information on whether any impact study and/or independent external audit has been carried out, and the results;

  • assessment of the percentage of false positives and false negatives expected or identified by the developer; and

  • any proposals for evaluations and adjustments if risks or harms are identified.

If AI is being used as part of an automated management system, it will be important to obtain information about the following:

  • the stages of any machine learning taking place;

  • how the problem has been defined;

  • the data being used;

  • a summary statistics review; and

  • data partitioning (including the model used for selection, training and deployment).

Source: TUC, People-powered technology: collective agreements and digital management systems