Artificial Intelligence Policy

Author: Simon Lewis
Created: April 2026
To be reviewed: April 2028

Principles: 

 At JCG we are committed to transforming learning and teaching through innovative technology. On leaving JCG, we aspire for every student to possess advanced digital skills that will give them the greatest opportunities in a fast-changing world. Central to this vision is for students and staff to become informed, innovative and expert users of Artificial Intelligence (AI) tools.

 Training in using AI to support and deepen learning and teaching, and to reduce workload for all staff (teaching and non-teaching), forms an important part of our digital curriculum and professional development provision. As part of our commitment to encouraging the safe, age-appropriate use of technology, including AI, we will ensure that students continue to engage in the often difficult struggle of learning and are taught to think critically about information presented to them.

 This policy is intended to ensure that students, parents and staff are clear about what constitutes responsible and ethical use of AI for learning and teaching.

 

Aims 

  • To provide clarity regarding what use students can and cannot make of AI tools in their homework, classwork, and assessment processes. 

  • To ensure that teaching staff are properly trained in relevant and up-to-date AI tools for lesson preparation and to model exemplary use of AI to students. 

  • To support non-teaching staff in the use of AI tools to reduce workload. 

  • To ensure that parents are informed about how to support students in their use of AI. 

 

Objectives  

  • To provide a clear framework for appropriate student use of AI tools as they progress through College. 

  • To ensure that all staff and students receive training about the ethical use of AI which includes bias, incorrect output, prompt engineering, image creation and copyright. 

  • To ensure that parents are informed of the College’s approach to AI use and given guidance for how they can best support their daughters in the ethical use of AI. 

  • To provide opportunities for staff to receive training on the tools available in order to increase efficiency and reduce workload in teaching, support and operations. 

  • To provide clarity to staff about compliance issues that impact the use of AI including GDPR and copyright, and tools (such as AI detection) to ensure that staff are able to assess student work accurately. 

Key Principles guiding safe and effective use of AI¹: 

  • Microsoft Copilot is the only Large Language Model (LLM) approved for use across the States of Jersey public sector. This States of Jersey policy rests on Microsoft Copilot's security guarantee that all data remains within the States of Jersey security boundary. Using other AI tools risks sharing personal or copyrighted data and is a breach of States of Jersey policy. 

  • The intended use of AI by staff and students should have specified and clear benefits. 

  • Students should only use generative AI in school with appropriate safeguards in place including appropriate filtering and monitoring. 

  • Personal data must be protected in line with data protection legislation. 

  • Staff must be aware of the intellectual property considerations of uploading student work to an AI platform. Materials created by students (and other teachers) may be copyright material. 

  • Staff must not cause student work to be used to train an AI model without the specific permission of the student or parent. 

  • Students and staff need to be aware of the regulations concerning malpractice in the use of AI for assessments, including coursework. Specifically,  
    (1) that unacknowledged use of AI in any form of assessment will be considered malpractice 
    (2) that any use of AI in NEAs or coursework must be acknowledged by students 
    (3) that AI cannot be used as the sole marker in assessments, including NEAs. 

 

1 https://www.gov.uk/government/publications/generative-artificial-intelligence-in-education/generative-artificial-intelligence-ai-in-education 
 

 
 

Students should be encouraged to take responsibility for: 

  • Complying with the following age restrictions on AI use: 
     
    Years 7 & 8: students are not permitted to use text-based generative AI tools such as ChatGPT, Microsoft Copilot, Claude, Gemini or Perplexity.ai for school work. They may use platforms with embedded AI such as Century Tech, Duolingo, Seneca, Scratch with AI, and Khanmigo. 
     
    Years 9–13: students may use Microsoft Copilot for ethical educational use (see Appendix 1), subject to obtaining parental permission.  
     

  • Following the guidance given in Guidance to students for using AI (Appendix 1) 

  • Ensuring academic openness and honesty in all their interactions with teachers 

  • Complying with the AI requirements in the Acceptable Use Agreement; in particular, the creation or distribution of any false, misleading, or harmful text, image, audio or video material is strictly prohibited 

  • Being familiar with regulations concerning AI use for examinations (including coursework and NEAs) 
     

Parents have responsibility for:  

  • Supporting the College’s ethos in setting expectations for ethical and responsible use of AI tools 
     

Subject Teachers have responsibility for:  

  • Using only Microsoft Copilot (as an LLM) for all school work 

  • Modelling exemplary use of AI in lesson preparation and delivery, and discussing with students effective and ethical use of AI for learning   

  • Giving students in Years 9–13 opportunities to develop expertise in the innovative use of AI for learning, including for revision 

  • Undertaking any necessary professional development in order to develop greater understanding of AI tools for learning 

  • Not using AI to create or distribute any false, misleading, or harmful text, sound, images or video material 

  • Reporting immediately to the DSL any incidents of harmful AI-related outputs 

  • Not asking students in Years 7 and 8 to use AI for an assignment (these students fall below the 13+ minimum age restriction for such tools). 

  • Ensuring that students in Years 9–13 are aware of when and for what purpose AI can be used in an assignment, and understand that learning should be based on their own efforts. 

  • Being transparent about the use they make of AI, for example by indicating AI used in the creation of a worksheet or other lesson material. 

  • Designing homework tasks that encourage independent thinking, reasoning, practical application or personal reflection. 

  • When identifying potential misuse of AI, using professional judgement and a clear understanding of the student’s typical work rather than relying solely on an AI detection tool (see Appendix 3 “Indications of Student Use of AI”) 

  • Reporting suspected AI misuse to the HOF and/or HOS and taking action according to the Home Learning Policy. 

  • Ensuring that outputs from generative AI used in class are checked for accuracy and appropriateness. 

  • Not using AI to mark or grade student assessments. 

  • Being aware, and raising awareness among students, of the sustainability issues around AI use in general, and ensuring that the prompts they use are well-considered and necessary (i.e. could not be answered by a search engine). 

See: https://assets.publishing.service.gov.uk/media/6842f27f57f3515d9611f067/Module_3_Developing_the_safe_use_of_AI_in_education_-_Transcript.pdf 

 

Should States of Jersey policy change to permit the use of other LLMs in the future, all of the above will continue to apply and, in addition,

Subject teachers must: 

  • Ensure that they do not enter or upload any data in a prompt which identifies a student or their work 

  • Ensure that they do not upload any original creative work into an AI platform which trains on the user prompts (see below). Original creative work includes all student work, school policies or other school documents, and any copyrighted material 
     
    The table below shows whether each AI platform uses user-generated prompts and uploads to train its model:  

| AI platform | User prompts used to train model by default | May be disabled in Settings | Privacy statement |
| --- | --- | --- | --- |
| ChatGPT | Yes | Yes | https://help.openai.com/en/articles/7730893-data-controls-faq |
| Microsoft Copilot for 365 | No | N/A | https://learn.microsoft.com/en-us/copilot/microsoft-365/microsoft-365-copilot-privacy |
| Claude | No | N/A | https://privacy.anthropic.com/en/articles/7996885-how-do-you-use-personal-data-in-model-training |
| Gemini | Yes | Yes (change setting in Personalisation) | https://support.google.com/gemini/answer/13594961?hl=en |
| Perplexity.ai | No | N/A | https://docs.perplexity.ai/faq/faq#where-are-perplexitys-language-models-hosted |
| Snapchat (My AI) | Yes (and can be shared with other organisations) | No | https://help.snapchat.com/hc/en-us/articles/22535044112532-Learn-About-How-Snap-Uses-Data |

 

Heads of Department have responsibility for: 

  • Promoting and sharing good practice in the application of AI to both teaching and learning 

  • Ensuring that Schemes of Learning include guidance for students in Years 9–13 on what use may be made of AI in the completion of tasks, where relevant. 

  • Investigating suspected AI misuse and taking action according to the Home Learning Policy. 

  • Reporting any potential breach of JCQ’s regulations on AI misuse in Assessments to the HOF and Assistant Headteacher with responsibility for Operations (PM) 
     

Heads of Faculty have responsibility for:  

  • Supporting the Heads of Department in promoting and sharing good practice in the use of AI in the preparation and delivery of lesson materials or reducing workload. 

  • Supporting the Heads of Department to ensure that any instances of suspected AI misuse in NEAs/Coursework are identified and acted upon, including reporting to the Assistant Headteacher with responsibility for Operations (PM) 

 

Heads of School have responsibility for: 

  • Supporting subject teachers in ensuring that students comply with ethical use of AI guidelines (see Appendix 1) 

  • Liaising with parents, as appropriate, if there are repeated behaviour marks for AI misuse 

  • Supporting students (including signposting appropriate help) who are affected by harmful AI-related outputs 

  • Supporting all students in the appropriate use of AI within the curriculum 

Bursar has responsibility for: 

  • Ensuring that AI use in College meets the requirements of relevant States of Jersey safeguarding, online harm and GDPR policies and legislation. 

Assistant Headteacher (Digital Learning and Curriculum Design) has responsibility for: 

  • Monitoring the implementation of this policy and ensuring that all staff and students remain informed and updated on the responsible use of AI 

  • Encouraging the creative and innovative use of AI by staff in teaching and learning 

  • Leading on the use of AI tools to increase efficiency in Operations and Support roles 

  • Dealing with repeated or serious instances of AI misuse 

  • Creating a timetable of information sessions to ensure that students are fully aware of the details of this policy 

  • Seeking and collating student voice on all aspects of their Digital experience including ethical use of AI 

  • Ensuring that parents are advised on details in this policy and are kept informed of any changes 

  • Advising parents on ways in which student use of AI can be monitored at home 

 

Assistant Headteacher (Student Guidance) has responsibility for: 

  • Overseeing the implementation of the College’s Safeguarding Policy to deal with current and emerging AI risks to students, working with the AHT (DLCD) to risk assess current AI tools and practices 

  • Ensuring that Heads of School receive necessary training to respond to incidents of harmful AI generated images or videos and their impact on students, including pathways for support 

  • Reviewing annual staff safeguarding training to ensure that staff are aware of emerging safeguarding risks from AI 

  • Ensuring that AI practice in lessons meets appropriate safeguarding thresholds 

  • Exploring data privacy laws as required and their impact on dealing with harmful AI generated images on student devices 

 

Assistant Headteacher (Character and Personal Development) has responsibility for: 

  • Ensuring that the PSHE curriculum includes opportunities for students to learn about the opportunities, benefits and risks that AI offers. 

 

Principal has responsibility for: 

  • Overseeing the application of this policy 

  • Providing appropriate support and necessary action to ensure the policy has a positive impact on learning and achievement and quality of teaching 

Appendix 1 

Guidance to students for using AI 
 

Principles 

 AI can provide useful help in many areas of student learning including getting started on an assignment, self-testing, and creating revision resources. Students need to be aware that: 
1. Over-reliance on AI can reduce a student’s capacity for developing independent thinking, learning, knowledge and understanding. 

2. Outputs from AI platforms may be factually incorrect (such as including quotations from characters in books that are made up or a solution to a Maths problem which is wrong), biased, simplistic, incomplete or misleading. 

3. Free-to-use AI platforms often do not meet data protection requirements and may use student data or prompts for marketing or training purposes. 

4. AI data centres use vast quantities of water and energy and so any AI use has implications for sustainable living. 

 

Guidance 

Students in Years 7 and 8 are not allowed to use AI platforms such as ChatGPT, Microsoft Copilot, Claude, Gemini or Snapchat's My AI. 

Students in Years 9 to 13 may use Microsoft Copilot for educational use. Legitimate and beneficial uses may include: 
 

  • to help generate ideas 

  • to explain difficult concepts or ideas in simpler terms 

  • to explain new or subject specific vocabulary 

  • to generate self-testing questions 

  • to help create notes or other resources for revision 

  • to help plan a piece of work 

  • to create appropriate images  

 

Students are responsible for checking the accuracy and relevance of any information that AI generates.  

 Students must reference any use made of AI in any work or assignment, including example prompts used. 

 Students may not use AI platforms to: 

  • create or distribute any false, misleading, offensive, or harmful text, image, audio or video material 

  • contribute to any part of their work towards an NEA unless it complies with JCQ requirements (see JCQ's "AI and Assessments: A quick guide for students")  

  • directly answer homework questions or assignments 

  • generate or improve all or part of an answer to a written homework or assessment task, for example, by weaving AI generated text into their own answer 

  • improve spelling or punctuation on a task where those are to be explicitly assessed 

 

Students should be aware that submitting work that is not their own is plagiarism and will be taken extremely seriously. Teachers will check student work for potential AI misuse and apply sanctions according to the Home Learning policy. 

Appendix 2 

 

Guidance to staff for using AI 

Principles 

AI is a powerful tool that can enhance and improve many aspects of teaching and administration. It is important that staff receive regular training on AI in order to remain aware of and confident in using it, and to understand its limitations in a fast-changing landscape. Staff are encouraged to explore AI's capabilities, to innovate, and to share good practice. 

 

Guidance 

Staff are encouraged to: 
(the items below draw heavily on North London Collegiate School's Responsible AI Policy): 

  • use AI to improve lesson planning: by creating outline lesson plans, schemes of work, and lesson resources and handouts that include up-to-date geographical, social or political information, which should then be checked, refined and adapted to ensure accuracy and relevance. 
     

  • use AI to support the delivery of lesson activities: these could include questions for discussions, role plays in which the AI takes on the key figure, and the creation of additional questions on particular syllabus areas to extend or deepen thinking. 
     

  • use AI to reduce workload and increase efficiency: helping to draft emails, summarise documents, analyse anonymised data (that does not disclose any personal information) and assist with administrative tasks. 
     

  • use AI to assist in report writing, as long as the generated text is then reviewed and adapted to ensure that it accurately reflects their own judgements, observations and assessments of the students’ performance. 
     

  • use AI to prepare for meetings or potentially difficult conversations, subject to not disclosing any information that could identify a student. 

 

 Restrictions and limitations 

  • Microsoft Copilot is the only LLM approved for work use for States of Jersey employees 
     

Should other LLMs be permitted in the future: 

  • Staff must not upload or disclose to an AI platform any school or student data that can be used to train the model (see the table above) 
     

  • Staff must not upload to AI any student- or staff-created material, or any original material from other sources, that can be used to train the model (see the table above) 

 

 

 

Appendix 3

Indications of AI Use in student work 

 

The following is reproduced with permission from the “Artificial Intelligence” Policy of Clifton High School  

https://cliftonhigh.static.amais.com/Artificial_Intelligence__10_24_v_1pdf-1763.pdf?version=638646709688170000 

 
If you see the following in student work, it may be an indication that they have misused AI:  

  • A default use of American spelling, currency, terms and other localisations; 

  • A default use of language or vocabulary which might not be appropriate to the qualification level; 

  • A lack of direct quotations and/or use of references where these are required/expected;  

  • Inclusion of references which cannot be found or verified (some AI tools have provided false references to books or articles by real authors);  

  • A lack of reference to events occurring after a certain date (reflecting when an AI tool's data source was compiled), which might be notable for some subjects; 

  • Instances of incorrect/inconsistent use of first-person and third-person perspective where generated text is left unaltered; 

  •  A difference in the language style used when compared to that used by a pupil in the classroom or in other previously submitted work;  

  • A variation in the style of language evidenced in a piece of work, if a pupil has taken significant portions of text from AI and then amended this;  

  • A lack of graphs/data tables/visual aids where these would normally be expected;  

  • A lack of specific local or topical knowledge;  

  • Content being more generic in nature rather than relating to the pupil themself, or a specialised task or scenario, if this is required or expected; 

  •  The inadvertent inclusion by pupils of warnings or provisos produced by AI to highlight the limits of its ability, or the hypothetical nature of its output;  

  • The submission of pupil work in a typed format, where their normal output is handwritten; 

  • The unusual use of several concluding statements throughout the text, or several repetitions of an overarching essay structure within a single lengthy essay, which can be a result of AI being asked to produce an essay several times to add depth, variety or to overcome its output limit;  

  • The inclusion of strongly stated non-sequiturs or confidently incorrect statements within otherwise cohesive content;  

  • Overly verbose or hyperbolic language that may not be in keeping with the candidate’s usual style. 

 

Relationship to other policies 

 

  1. External Policies: Data Protection, Keeping Children Safe in Education 

  2. Internal Policies:   

  • Home Learning Policy 

  • Examinations Policy 

  • Acceptable Use Agreement 
