
Age Restrictions and Consent to use Generative AI

Version 1.1 – 3rd Sept 2024. First version published 9th Aug 2024

One key challenge institutions face when looking to adopt generative AI relates to age restrictions and consent for learners under 18. This is not a new problem: age restrictions have applied to online tools for several years, and institutions have long investigated tools and provided learners with approved options.

Unfortunately, two of the most popular options (Microsoft Copilot with commercial data protection and Google Gemini) are not currently available to learners under the age of 18.

In this blog, we explore potential options for learners under 18 to access generative AI tools, including the necessity of obtaining parental or guardian consent for some tools.

It is also useful to understand the perspectives of AI tool providers and their reasons for implementing age restrictions in compliance with data processing laws and regulations, so we also explore the legal situation.

 

Popular options not available to learners under 18

Microsoft Copilot (with commercial data protection)

Users must be over 18 to access the version of Copilot that you log in to with your college account. This is the version that protects users’ data (known as ‘commercial data protection’).

Individual Microsoft account holders under 18 can technically access Copilot with parent or guardian permission. However, this would require users under 18 to access Copilot with personal Microsoft accounts, for which they need consent, while using their education accounts for all other activities. The complexity and administrative burden of having users regularly access two Microsoft accounts makes this an impractical option.

Microsoft Copilot nevertheless remains the best option for staff in colleges, as many institutions are already using Microsoft tools and it provides commercial data protection.

Note: Microsoft has mentioned the possibility of a version of Copilot with commercial data protection for learners aged 13–17. We don’t have any more information about this at the moment but will share it when we do.

Google Gemini

All Google users (work, school and personal account holders) in the UK must be over 18 to access Google Gemini.

Options available to learners under 18 without parental consent

We have picked Grammarly and Canva as the best options here, as they are both already widely used in colleges. They might seem slightly unusual suggestions as they aren’t traditional chatbots but both offer a range of generative AI features.

Grammarly

Individual Grammarly accounts are available without parental consent for users in the UK who are 16 or older. Institutions with Grammarly for Education subscriptions can provide access to learners younger than 16, provided they obtain permission from a parent or guardian.

The downside of Grammarly is that the number of prompts is limited to 100 per month for free users. However, it is an education-focused tool, and has several features to support the responsible use of generative AI including suggesting appropriate prompts and encouraging users to acknowledge their use of AI.

Canva

Individuals who are 13 or older can use Canva without parental consent and it is one of the best options currently available to learners for image generation.

Snapchat

Individuals who are 13 or older can use Snapchat without parental consent. Snapchat’s chatbot, My AI, is based on OpenAI’s GPT models. Although a web version is available, learners will still need a phone to sign up. Given that it is primarily a chat tool, its use will need to be carefully managed. We mention Snapchat due to its popularity with learners, though we acknowledge that there may be concerns about its use in an educational setting.

Options available to learners under 18 with parental consent

There are many options here, but we have picked two that we think are particularly relevant to education: ChatGPT, because of its pervasiveness, and Perplexity, because of its search-focused features.

ChatGPT

Individual personal ChatGPT accounts are available for users aged 13 and older with parent or legal guardian permission.

Perplexity

Individual personal Perplexity accounts are available for users aged 13 and older with parent or legal guardian permission.

You may be less familiar with Perplexity. Its main focus is generative AI search, which is itself useful in education because of the way it provides references, but it has also grown to offer strong chatbot-style features.

Gaining parental consent

We asked the AI in education community to share what institutions are doing to obtain parental consent. So far, we have two examples we can show.

University of the Highlands and Islands

As part of the enrolment process, lecturers distribute parent and guardian consent forms and are responsible for obtaining and retaining them. The consent form includes information on how each tool will be used and the reasoning behind it. Only AI tools approved by the university can be used and listed on the consent form, which has a section at the bottom to address age-related access.

University of the Highlands and Islands – AI Consent Form

Woking College

Woking College has gained consent by sending a letter to all parents and guardians explaining its approach to AI. The college then assumes consent is given unless parents contact it to opt out.

The letter explains that the college is proactively developing strategies to guide the safe, effective, and responsible use of AI tools. It then outlines the college’s AI principles and specifies which AI tools will be used, along with the goals and reasoning for their use.

Woking College – AI Consent Form

As we can see, both consent forms include information about the reasons and justifications, much like existing processes to gain consent for using IT systems.

 

What about running your own Generative AI service?

A number of colleges have opted to run their own generative AI services, either by running an open-source model locally or by developing a service that calls the APIs of a provider such as OpenAI. Both approaches remove the legal need to obtain consent from parents, because learners never create individual accounts with the provider: the institution holds the relationship and processes the data. However, this must be balanced against the overheads of providing such a service, including ensuring it is secure, reliable and continuously maintained. We think that for most colleges the best option is to use a commercially available solution, but requirements and resources will obviously vary between colleges.
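To illustrate the second approach, the sketch below shows a minimal institution-side wrapper around a provider's chat API. This is an assumption-laden example, not a production design: the system prompt, model name and function names are illustrative, and a real service would also need authentication, logging and content filtering. The key point is that the provider API key stays on the college's server, so learners never sign up with the provider themselves.

```python
# Minimal sketch of a college-run wrapper for a provider's chat API.
# The provider key stays server-side; learners interact only with the
# institution's service. Model and system prompt are illustrative.
import json
import os
import urllib.request

OPENAI_URL = "https://api.openai.com/v1/chat/completions"
API_KEY = os.environ.get("OPENAI_API_KEY", "")  # held by the institution

def build_payload(prompt: str, model: str = "gpt-4o-mini") -> dict:
    """Wrap a learner's prompt in the chat-completions request shape."""
    return {
        "model": model,
        "messages": [
            {"role": "system",
             "content": "You are a study assistant for college learners."},
            {"role": "user", "content": prompt},
        ],
    }

def ask(prompt: str) -> str:
    """Forward one prompt to the provider and return the reply text."""
    req = urllib.request.Request(
        OPENAI_URL,
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

In practice a service like this would sit behind the college's own single sign-on, which is what allows the institution, rather than the learner, to act as the account holder.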

 

Why the variations? What does the law say?

Age restrictions on using generative AI are influenced by a combination of platform policies, legal requirements, and ethical considerations. Compliance with laws such as the US Children’s Online Privacy Protection Act (COPPA), the GDPR and the UK’s Children’s Code is essential. Most platforms use self-declaration as the basis for consent to data processing, where a user states their age without providing evidence to confirm it. The minimum age for this type of consent is generally 13.

The use of self-declaration, with 13 as the minimum age, is rooted in COPPA. This US federal law outlines the responsibilities of organisations that collect and use data from children, and it mandates specific protections, including parental consent, when a company collects personal information from a child under the age of 13. Children aged 13 and over, however, can effectively consent for themselves, and it is compliant with COPPA principles to allow children to confirm at sign-up that they are 13 or over without any further checks.

Within Europe, Article 8 of the GDPR sets out the conditions applicable to children’s consent in relation to information society services. Processing of a child’s personal data is lawful where the child is at least 16 years old; if the child is below 16, such processing is lawful only if and to the extent that consent is given or authorised by the holder of parental responsibility over the child. The 16-year limit is not absolute: under Article 8(1) GDPR, Member States can lower this age requirement to anywhere between 13 and 16.

Domestically, the UK GDPR sets 13 as the age at which children can consent to the processing of their personal data in the context of information society services. Children aged 13 and over may lawfully provide their own consent; an adult with parental responsibility must provide consent if the child is under 13.

And finally…

There are many generative AI tools on the market, and their terms and conditions are ever-changing. It is worth regularly checking these regarding age restrictions and tailoring AI literacy training accordingly to ensure responsible use. Many FE colleges have the building blocks in place to utilise generative AI; however, it is crucial to determine which AI tools to use and ensure they align with the institution’s AI strategy and AI literacy guidance for staff and learners.

We hope this blog post has been helpful. We have also published a blog that looks at the pros and cons of different AI tools, which you can read here: Licensing Options for Generative AI. If you would like to share your thoughts and how you have tackled this, please get in touch.

 


Change log

V1.1 – 03 Sept 2024

Options available to learners under 18 without parental consent – Snapchat added.


Find out more by visiting our Artificial Intelligence page to view publications and resources, join us for events and discover what AI has to offer through our range of interactive online demos.

For regular updates from the team sign up to our mailing list.

Get in touch with the team directly at AI@jisc.ac.uk

 

3 replies on “Age Restrictions and Consent to use Generative AI”

Thanks for the update. This is something that we are currently trying to navigate. Is there any possibility that the 2 organisations who have generously shared their consent forms (thank you!) would be able to offer some insight into how they plan to deal with opt-outs or non-consent?

Thanks for your comment. Opting out would work similarly to when a parent or guardian doesn’t give permission for photos to be taken. In this case, the learner wouldn’t be able to use AI tools. A conversation to explain further might be helpful.
