AI Generative Tools: Could Age Restrictions in Licensing Terms be Considered Discriminatory?
The proliferation of AI technologies, particularly generative tools, has brought with it a host of legal and ethical concerns. In Australia, there are laws to protect minors' privacy and data, but these do not automatically prevent children from using AI tools.
AI software providers often implement age restrictions in their licensing terms due to concerns over intellectual property protection and the limited capacity of minors to enter binding contracts. While these concerns are valid, such restrictions can sometimes be overly broad, excluding responsible young users and raising potential issues under anti-discrimination laws.
Specific Protections in Australia
In Australia, there are legal frameworks that protect minors' access to certain technologies, particularly concerning privacy and data collection. The Privacy Act 1988 (Cth) (Privacy Act) and the Australian Privacy Principles (APPs) regulate the collection, use, and disclosure of personal information, including that of minors. These laws ensure that minors' data is protected, particularly in online environments. However, while these privacy protections are strong, they do not outright prohibit minors from accessing generative AI tools or similar technologies. Instead, they tend to focus on how companies handle minors' data, particularly ensuring that any data collection from minors is done with appropriate consent.
That said, privacy laws can influence the licensing terms for generative AI tools. For example, a company offering an AI tool might impose age restrictions in its terms of use to comply with privacy regulations, such as ensuring that users under a certain age are not subject to unnecessary data collection without parental consent. These privacy-driven restrictions differ from those that limit access based solely on a user's age for other reasons, such as protecting intellectual property (IP).
Minors' Limited Capacity to Enter Binding Contracts
One key consideration for software providers when setting age restrictions is the limited capacity of minors to enter legally binding contracts. In most legal systems, including Australia's, contracts entered into by minors are generally not enforceable against them unless they are for essential goods or services. This legal principle is designed to protect minors from exploitation or from decisions that may not be in their best interests. This limited capacity presents a challenge for software providers, as they may be unable to enforce their terms of use against minors in the same way they can against adults.
For instance, if a minor were to enter into a licensing agreement for a generative AI tool, they may later void the contract, leaving the provider in a vulnerable position. This is particularly problematic if the licensing agreement includes provisions that protect the provider's IP or stipulate the ownership of adaptations and developments created using the software. If the minor is not bound by these terms, the provider risks losing control over their IP, which is a key concern for software providers.
This can be seen as a significant barrier for providers trying to create unbiased, fair licensing terms. In an effort to protect themselves legally, providers may implement age restrictions to avoid entering into contracts with minors altogether. This approach, while protecting the provider from the legal ramifications of an unenforceable contract, could also exclude potentially responsible young users from accessing the software, thus limiting opportunities for education, development, or innovation. It may not be that minors are inherently incapable of understanding and abiding by the licensing terms, but rather that the provider is erring on the side of caution, potentially resulting in an overly broad and restrictive licensing policy.
Anti-Discrimination Law
The Anti-Discrimination Act 1991 (Qld) (the Act) prohibits discrimination on various grounds, including age, in areas such as employment, education, and the provision of goods and services. While generative AI tools often have age restrictions in their terms of use, these restrictions must not violate anti-discrimination laws.
Under the Act, a service provider could face scrutiny if it restricts access to its service based solely on age without a legitimate and justifiable reason. Importantly, age discrimination is not automatically unlawful; the Act provides for exemptions. Age-based restrictions are permitted if they serve a legitimate purpose, such as ensuring the safety and wellbeing of users, or if they are necessary to comply with other legal frameworks, such as child protection laws.
Thus, if an AI tool imposes an age restriction to protect younger users or to comply with privacy laws, it is likely to be seen as justified rather than discriminatory. However, if the restriction is arbitrary, overly broad, or lacks a clear rationale, it may constitute unlawful discrimination under the Act. Schools and other institutions enforcing such restrictions should be cautious, as an unjustified restriction could expose them to legal risk or unfairly exclude young users.
Licensing Terms and Discrimination
Licensing agreements for generative AI tools typically outline the conditions under which users can access and use the services. Age restrictions are often included in these terms for several reasons:
- Privacy concerns: To comply with privacy regulations, such as ensuring that data from minors is not collected without proper consent.
- Liability concerns: To ensure that users are capable of understanding and responsibly engaging with the technology.
- Safety and welfare: To protect younger users from potentially harmful content or features that may not be suitable for their age.
While age-based restrictions in licensing terms are often accepted by schools and other license holders, they may not always be reasonable or lawful, particularly if they are arbitrary, excessive, or unfairly exclude a particular age group without a legitimate reason. Restrictions based on age could face scrutiny under anti-discrimination laws if they lack a clear and justifiable purpose.
For example, if a generative AI tool were to impose a blanket age restriction excluding users under a specific age, without any link to privacy, safety, or regulatory concerns, the restriction might be deemed unreasonable and discriminatory. This would be particularly problematic if the restriction disproportionately impacts certain groups without a sound legal or safety-based rationale.
Justifications for Age Restrictions
In many cases, generative AI tools impose age restrictions for clear and legitimate reasons:
- Privacy protection: As noted, the Privacy Act requires specific protections for minors' data, leading companies to restrict access for minors (e.g. users under 13 or 16) to avoid non-compliance with these regulations.
- Safety concerns: Younger users may lack the maturity to understand the implications of using certain generative AI tools, especially those that allow the creation or sharing of content that could be inappropriate or harmful.
- Legal compliance: Age restrictions may be necessary to comply with other laws, such as child protection laws or industry-specific regulations (e.g. for gaming or online content creation).
In these cases, the restrictions are generally considered justified, as they are aimed at protecting users or ensuring compliance with legal requirements. When age restrictions are imposed for privacy, safety, or legal compliance, they are less likely to be seen as discriminatory under the Act.
The Role of IP Protection
Another important consideration for AI tool providers when imposing age restrictions is the protection of IP. Software providers, particularly those offering AI tools, often include terms in their licensing agreements to protect their IP, including prohibiting users from adapting, modifying, or developing the software without the provider's consent. These provisions help ensure that the provider retains control over their software and any derivative works created from it.
Providers are particularly concerned with protecting their IP from potential exploitation by users who might reverse engineer, adapt, or distribute the software, potentially using it to create derivative works. In the case of minors, providers may be wary of allowing access to the software, knowing that a minor may not be fully bound by the terms of use, especially those governing the ownership of adaptations, modifications, or developments. If a minor were to modify the software, the provider may have limited ability to enforce ownership of those derivative works, leaving the provider vulnerable to potential IP losses.
This may lead to overly strict age-based restrictions in licensing agreements, which may unnecessarily limit access to the software for users who are capable of using it responsibly, even if they are minors. Thus, in an effort to protect their IP, providers may impose blanket age restrictions that prevent minors from accessing the software at all. While these restrictions may be justified from an IP protection standpoint, they could also result in bias against younger users who may not present the legal risks the provider anticipates. The fear of losing IP control could unintentionally drive providers to draft terms that are more exclusionary than necessary.
Conclusion
The imposition of age restrictions in the licensing terms for generative AI tools is primarily driven by the need to protect minors' privacy, ensure legal compliance, and safeguard intellectual property. These restrictions are generally justifiable when they are designed to protect users from privacy violations, ensure safety, and comply with relevant laws, such as child protection or data privacy regulations. However, the application of age restrictions must be carefully considered to avoid discrimination and overly broad exclusions of young users who may be capable of responsibly engaging with the technology.
The concerns surrounding minors' limited capacity to enter binding contracts, particularly regarding IP ownership, are legitimate, but they should not lead to overly restrictive policies that unnecessarily exclude responsible young users. To strike the right balance, AI providers must ensure that age restrictions are not arbitrary or excessively restrictive, and that they are clearly tied to legitimate concerns, such as privacy, safety, or legal compliance.
Providers should also be mindful of anti-discrimination laws, ensuring that their age-based restrictions are reasonable, justified, and do not unfairly discriminate against younger users without a clear and necessary rationale. By carefully crafting their licensing terms, AI tool providers can protect both their intellectual property and the rights of young users, fostering a more inclusive and equitable approach to access.
Please reach out to our team if you would like assistance in reviewing the licensing terms for generative AI tools or in identifying and managing the risks associated with age restrictions.