Author: Simon Lewis
Created: April 2026
To be reviewed: April 2028
Principles:
At JCG we are committed to transforming learning and teaching by using innovative technology. On leaving JCG, we aspire for every student to possess advanced digital skills that will give them the greatest opportunities in a fast-changing world. Central to this vision is that students and staff become informed, innovative and expert users of Artificial Intelligence (AI) tools. Training in using AI to support and deepen learning and teaching, and to reduce workload for all staff (teaching and non-teaching), forms an important part of our digital curriculum and professional development provision. As part of our commitment to encouraging the safe, age-appropriate use of technology, including AI, we will ensure that students continue to engage in the often difficult struggle of learning and are taught to think critically about information presented to them. This policy is intended to ensure that students, parents and staff are clear about what constitutes responsible and ethical use of AI for learning and teaching.
Aims
To provide clarity regarding what use students can and cannot make of AI tools in their homework, classwork, and assessment processes.
To ensure that teaching staff are properly trained in relevant and up-to-date AI tools for lesson preparation and to model exemplary use of AI to students.
To support non-teaching staff in the use of AI tools to reduce workload.
To ensure that parents are informed about how to support students in their use of AI.
Objectives
To provide a clear framework for appropriate student use of AI tools as they progress through College.
To ensure that all staff and students receive training about the ethical use of AI which includes bias, incorrect output, prompt engineering, image creation and copyright.
To ensure that parents are informed of the College’s approach to AI use and given guidance for how they can best support their daughters in the ethical use of AI.
To provide opportunities for staff to receive training on the tools available in order to increase efficiency and reduce workload in teaching, support and operations.
To provide clarity to staff about compliance issues that affect the use of AI, including GDPR and copyright, and about tools (such as AI detection) so that staff are able to assess student work accurately.
Key Principles guiding safe and effective use of AI:
Microsoft Copilot is the only approved Large Language Model (LLM) for use across the States of Jersey public sector. This States of Jersey policy is grounded in the security property of Microsoft Copilot that ensures all data remains within the States of Jersey security boundary. Use of other AI tools risks sharing personal or copyrighted data and is a breach of States of Jersey policy.
The intended use of AI by staff and students should have specified and clear benefits.
Students should only use generative AI in school with appropriate safeguards in place including appropriate filtering and monitoring.
Personal data must be protected in line with data protection legislation.
Staff must be aware of the intellectual property considerations of uploading student work to an AI platform: materials created by students (and by other teachers) may be copyrighted material.
Staff must not cause student work to be used to train an AI model without the specific permission of the student or parent.
Students and staff need to be aware of regulations concerning malpractice in the use of AI for assessments, including coursework. Specifically:
(1) students must not use AI in any form of assessment, and doing so will be considered malpractice;
(2) any use of AI in NEAs or coursework must be acknowledged by students;
(3) AI cannot be used as the sole marker in assessments, including NEAs.
Students should be encouraged to take responsibility for:
Complying with the following age restrictions on AI use:
Years 7 and 8: students are not permitted to use text-based generative AI tools such as ChatGPT, Microsoft Copilot, Claude, Gemini or Perplexity.ai for school work. They may use platforms with embedded AI such as Century Tech, Duolingo, Seneca, Scratch with AI, and Khanmigo.
Years 9–13: students may use Microsoft Copilot for ethical educational use (see Appendix 1), subject to obtaining parental permission.
Following the guidance given in Guidance to students for using AI (Appendix 1)
Ensuring academic openness and honesty in all their interactions with teachers
Complying with the AI requirements in the Acceptable Use Agreement; in particular, the creation or distribution of any false, misleading, or harmful text, image, audio or video material is strictly prohibited
Being familiar with regulations concerning AI use for examinations (including coursework and NEAs)
Parents have responsibility for:
Supporting the College’s ethos in setting expectations for ethical and responsible use of AI tools
Subject Teachers have responsibility for:
Following guidance in the States of Jersey AI Policy SoJ AI Policy 2025
Using only Microsoft Copilot (as an LLM) for all school work
Modelling exemplary use of AI in lesson preparation and delivery, and discussing with students effective and ethical use of AI for learning
Giving students in Years 9–13 opportunities to develop expertise in the innovative use of AI for learning, including with revision
Undertaking any necessary professional development in order to develop greater understanding of AI tools for learning
Not using AI to create or distribute any false, misleading, or harmful text, image, audio or video material
Reporting immediately to the DSL any incidents of harmful AI-related outputs
Not asking students in Years 7 and 8 to use AI for an assignment (they are younger than the 13+ minimum age restriction)
Ensuring that students in Years 9–13 are aware of when, and for what purpose, AI can be used in an assignment, and that they understand that learning should be based on their own efforts.
Being transparent about the use they make of AI, for example by indicating AI used in the creation of a worksheet or other lesson material.
Designing homework tasks that encourage independent thinking, reasoning, practical application or personal reflection.
When identifying potential misuse of AI, using professional judgement and a clear understanding of the student’s typical work rather than relying solely on an AI detection tool (see Appendix 3 “Indications of Student Use of AI”)
Reporting suspected AI misuse to the HOF and/or HOS and taking action according to the Home Learning Policy.
Ensuring that outputs from generative AI used in class are checked for accuracy and appropriateness.
Not using AI to mark or grade student assessments.
Being aware, and raising awareness with students, of the sustainability issues around AI use in general, and ensuring that the prompts they use are necessary (i.e., could not be answered by a search engine) and well-considered.
Should States of Jersey policy change to permit the use of other LLMs in the future, all of the above will continue to apply and, in addition:
Subject teachers must:
Ensure that they do not enter or upload any data in a prompt which identifies a student or their work
Ensure that they do not upload any original creative work into an AI platform which trains on the user prompts (see below). Original creative work includes all student work, school policies or other school documents, and any copyrighted material
The following AI platforms process user-generated prompts and uploads to train the model:
AI | User prompts used to train model by default | May be disabled in Settings | Privacy statement |
ChatGPT | Yes | Yes | https://help.openai.com/en/articles/7730893-data-controls-faq |
Microsoft Copilot for 365 | No | N/A | https://learn.microsoft.com/en-us/copilot/microsoft-365/microsoft-365-copilot-privacy |
Claude | No | N/A | https://privacy.anthropic.com/en/articles/7996885-how-do-you-use-personal-data-in-model-training |
Gemini | Yes | Yes (change setting in Personalisation) | |
Perplexity.ai | No | N/A | https://docs.perplexity.ai/faq/faq#where-are-perplexitys-language-models-hosted |
Snapchat (MyAI) | Yes (and can be shared with other organisations) | No | https://help.snapchat.com/hc/en-us/articles/22535044112532-Learn-About-How-Snap-Uses-Data |
Heads of Department have responsibility for:
Promoting and sharing good practice in the application of AI to both teaching and learning
Ensuring that Schemes of Learning include guidance for students (in Years 9–13) on what use may be made of AI in the completion of tasks, where relevant.
Investigating suspected AI misuse and taking action according to the Home Learning Policy.
Reporting any potential breach of JCQ’s regulations on AI misuse in Assessments to the HOF and Assistant Headteacher with responsibility for Operations (PM)
Heads of Faculty have responsibility for:
Supporting the Heads of Department in promoting and sharing good practice in the use of AI in the preparation and delivery of lesson materials or reducing workload.
Supporting the Heads of Department to ensure that any instances of suspected AI misuse in NEAs/Coursework are identified and acted upon, including reporting to the Assistant Headteacher with responsibility for Operations (PM)
Heads of School have responsibility for:
Supporting subject teachers in ensuring that students comply with ethical use of AI guidelines (see Appendix 1)
Liaising with parents, as appropriate, if there are repeated behaviour marks for AI misuse
Supporting students (including signposting appropriate help) who are affected by harmful AI-related outputs
Supporting all students in the appropriate use of AI within the curriculum
The Bursar has responsibility for:
Ensuring that AI use in College meets the requirements of relevant States of Jersey safeguarding, online harm and GDPR policies and legislation.
Assistant Headteacher (Digital Learning and Curriculum Design) has responsibility for:
Monitoring the implementation of this policy and ensuring that all staff and students remain informed and updated on the responsible use of AI
Encouraging the creative and innovative use of AI by staff in teaching and learning
Leading on the use of AI tools to increase efficiency in Operations and Support roles
Dealing with repeated or serious instances of AI misuse
Creating an information timetable that ensures that students are fully aware of the details of this policy
Seeking and collating student voice on all aspects of their Digital experience including ethical use of AI
Ensuring that parents are advised on details in this policy and are kept informed of any changes
Advising parents on ways in which student use of AI can be monitored at home
Assistant Headteacher (Student Guidance) has responsibility for:
Overseeing the implementation of the College’s Safeguarding Policy to deal with current and emerging AI risks to students, working with the AHT (DLCD) to risk assess current AI tools and practices
Ensuring that Heads of School receive the necessary training to respond to incidents of harmful AI-generated images or videos and their impact on students, including pathways for support
Reviewing annual staff safeguarding training to ensure that staff are aware of emerging safeguarding risks from AI
Ensuring that AI practice in lessons meets appropriate safeguarding thresholds
Keeping abreast of data privacy law as required, and its impact on dealing with harmful AI-generated images on student devices
Assistant Headteacher (Character and Personal Development) has responsibility for:
Ensuring that the PSHE curriculum includes opportunities for students to learn about the opportunities, benefits and risks that AI offers.
The Principal has responsibility for:
Overseeing the application of this policy
Providing appropriate support, and taking necessary action, to ensure the policy has a positive impact on learning, achievement and the quality of teaching