Research Library

DAAA Frequently Asked Questions

2024

Digital Age Assurance Act (DAAA): Frequently Asked Questions

Last Updated: October 25, 2024

Table of Contents:

  1. What is device-based age verification?
  2. Who is required to comply under the Digital Age Assurance Act?
  3. What are the proposed enforcement mechanisms?
  4. Why is 18 U.S.C. 2256(2)(A) referenced?
  5. Is device-based age verification considered Constitutional?
  6. There are already age verification bills; why aren't we relying on those?
  7. How does device-based age verification impact current social media bills?
  8. What types of devices does the Digital Age Assurance Act apply to, and how would this work for laptops?
  9. Does the Digital Age Assurance Act offer protections on the app store level?
  10. Why is a "commercially reasonable" method of age verification so broad?
  11. What would be the fiscal impact of a proposed bill?
  12. What if an individual does not have the proper documentation to verify their age?
  13. What happens if a user does not want to verify their age upon device activation?
  14. If the device is indicated as a child device, how is an application store expected to receive verifiable parental consent?
  1. What is device-based age verification?
    1. Device-based age verification is an age verification method in which a user verifies their age once through their device's operating system. The user's age is then securely stored on the user's device within the operating system.
      • A user would verify their age at the time of device activation, or through Operating System (OS) updates for devices sold prior to the effective date.
      • When a user attempts to access a website, application, application store, or online service that requires age verification, the user's operating system would share a user's verification status with the applicable website, application, application store, or online service through a secure application programming interface (API). The API would then provide the website, application, application store, or online service with a response on whether the user meets the defined age requirement thresholds to access the platform.
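For illustration only, the sketch below shows one shape such a signal exchange could take. The type names, fields, and the requestAgeSignal call are assumptions made for this example; the act does not prescribe a specific interface, and no shipping operating system currently exposes this exact API.

```typescript
// Hypothetical device-side age-signal API, assuming the four age brackets
// defined in the Digital Age Assurance Act. All names are illustrative.

/** The four age brackets the act requires the operating system to signal. */
type AgeBracket = "under13" | "13to15" | "16to17" | "18plus";

interface AgeSignalResponse {
  bracket: AgeBracket; // age range only; no identity or birthdate leaves the device
  verified: boolean;   // whether the OS has completed age verification
}

// Placeholder for the OS-provided call; a real implementation would be
// supplied by the operating system vendor.
declare function requestAgeSignal(): Promise<AgeSignalResponse>;

/** A site or app asks whether the device's user meets its minimum bracket. */
async function meetsThreshold(minBracket: AgeBracket): Promise<boolean> {
  const order: AgeBracket[] = ["under13", "13to15", "16to17", "18plus"];
  const signal = await requestAgeSignal();
  if (!signal.verified) return false; // this sketch treats unverified users as below threshold
  return order.indexOf(signal.bracket) >= order.indexOf(minBracket);
}
```

Note that the response carries only an age bracket, consistent with the premise above that no identity or other personal information is shared with the requesting platform.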
  2. Who is required to comply under the Digital Age Assurance Act?
    1. Under the proposed Digital Age Assurance Act, Covered Manufacturers, websites, applications, and online services are required to comply.
      • Covered Manufacturers are defined as the manufacturer of a device, an operating system for a device, or an application store, and are required to take commercially reasonable and technically feasible steps to determine or estimate the age of the device's primary user. Covered Manufacturers are also required to provide websites, applications, application stores, and online services with a digital signal regarding the device user's age threshold via a real-time application programming interface ("API").
        • If the Covered Manufacturer is an application store, the application store must obtain parental or guardian consent prior to permitting an individual under sixteen years of age to download an application from the application store.
        • The application store must also provide the parent or guardian with the option to connect with the developer of an application for the purpose of facilitating parental supervision tools.
      • Websites, applications, or online services that make available mature content are required to recognize and allow for the receipt of digital age signals.
        • Websites, applications, or online services that make available a substantial portion of mature content are required to block access to individuals indicated as under eighteen years of age.
        • Websites, applications, or online services that knowingly make available less than a substantial portion of mature content are required to block access to known mature content for individuals indicated as under eighteen years of age (see the sketch following this list).
      • Application stores are required to recognize and allow for the receipt of digital age signals to determine whether an individual is under sixteen years of age, and obtain parental or guardian consent as described in the above Covered Manufacturer requirements.
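To make the "substantial portion" distinction concrete, here is a minimal sketch of the gating decision described above. The names and structure are assumptions for illustration; only the one-third threshold comes from the act's definition of "substantial portion."

```typescript
// Illustrative gating logic for a site that receives an under-18 age signal.
// The 33-1/3% threshold mirrors the model legislation's definition of
// "substantial portion"; everything else here is hypothetical.

interface SiteProfile {
  matureContentShare: number; // fraction of total material that is mature content
}

const SUBSTANTIAL_PORTION = 1 / 3;

type Action = "allow" | "blockEntireService" | "blockMatureContentOnly";

function accessPolicy(site: SiteProfile, userIsUnder18: boolean): Action {
  if (!userIsUnder18) return "allow"; // adults retain full access
  return site.matureContentShare > SUBSTANTIAL_PORTION
    ? "blockEntireService"      // substantial portion: block access to the service
    : "blockMatureContentOnly"; // otherwise: block only known mature content
}
```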
  3. What are the proposed enforcement mechanisms?
    1. It's important to understand that the proposed Digital Age Assurance Act language may differ state-by-state, and as such, enforcement actions should be determined within each jurisdiction. The enforcement mechanisms outlined within the Digital Age Assurance Act include the following:
      • If the state believes an entity is in violation of the age verification requirements, including Covered Manufacturers, websites, applications, and online services, the state may provide an entity with written notice of specific violations. If the entity does not respond to the state or continues to violate the act in breach of an express written statement, the state may bring an action and seek damages against the entity.
        • Within the action, the state may collect a civil penalty of up to $10,000 per violation.
        • The state may alternatively seek damages for up to $2,500 per each minor actually harmed in violation of the act.
      • Covered Manufacturers that have taken commercially reasonable and technically feasible steps to determine or estimate the age of the device's user are not subject to liability.
  4. Why is 18 U.S.C. 2256(2)(A) referenced? Within the Digital Age Assurance Act, "mature content" is defined by reference to 18 U.S.C. 2256(2)(A), the federal definition of "sexually explicit conduct," which reads:
    1. Except as provided in subparagraph (B), "sexually explicit conduct" means actual or simulated -
      • Sexual intercourse, including genital-genital, oral-genital, anal-genital, or oral-anal, whether between persons of the same or opposite sex;
      • Bestiality;
      • Masturbation;
      • Sadistic or masochistic abuse; or
      • Lascivious exhibition of the anus, genitals, or pubic area of any person.
    2. This code was specifically included within this legislation to provide a standard definition that can be consistently referenced across states. Additionally, 18 U.S.C. 2256(2)(A) is inclusive and neutral with respect to any individual's or group's sexual orientation or identity.
  5. Is device-based age verification considered Constitutional?
    1. Unlike alternative age verification methods, device-based age verification is considered Constitutional because it neither burdens users nor requires the disclosure of identity or other personal information in order to access mature content, and no Constitutionally protected speech is impeded.
    2. For a full analysis on the Constitutionality of device-based age verification, please refer to the Digital Age Assurance Act Constitutional Analysis.
  6. There are already age verification bills; why aren't we relying on those?
    1. In the United States, 19 states have passed laws requiring age verification for mature content. States with laws currently in effect include: Alabama, Arkansas, Idaho, Indiana, Kansas, Kentucky, Louisiana, Mississippi, Montana, Nebraska, North Carolina, Texas, Utah, and Virginia.
    2. The existing age verification laws pose significant concerns to user data privacy and security, present Constitutional challenges to First Amendment free speech rights, dissuade proper platform compliance and enforcement, and can lead to negative outcomes from unintended user behavior.
    3. Age verification solutions must be properly implemented to achieve the primary goal of protecting children while simultaneously preserving the privacy and Constitutional rights of adult consumers.
    4. As such, there is a significant need for legislation that places the age assurance mandate at the source, on the device, to resolve these challenges. Device-based technology as a method of age verification is technically feasible to implement and relies on elements that are already common practice across industries. Device-based age assurance verifies a user's age through their device's operating system and shares the user's verified age or age range with the application, service, or website the user is attempting to reach.
    5. By standardizing age verification at the device level, users would no longer be required to provide personal information or PII numerous times across multiple platforms, significantly reducing the risk of misuse, data breaches, and overall concerns for data privacy. In turn, device-based age verification encourages compliance across websites, applications, and online services and significantly reduces opportunities for bypass, eliminating circumstances in which a minor circumvents the anticipated protections through simple actions like using a basic VPN service or accessing non-compliant sites. This creates a safer, privacy-preserving approach to age verification that achieves its central goal of protecting children, while also addressing the various Constitutional and compliance challenges posed by existing iterations of current legislation.
  7. How does device-based age verification impact current social media bills?
    1. Existing social media bills are separate from existing age verification bills for adult content; however, both contain a requirement for entities to verify the age of the user attempting to access their website or service.
    2. The verification mechanism proposed by device-based age verification can be leveraged by both social media companies and adult content websites in order to fulfill applicable age verification requirements. Rather than requiring a user to verify their age on a per-platform basis, device-based age verification centralizes the user's point of verification. This allows users to reduce the amount of personal information provided across platforms, including both social media and adult content websites.
  8. What types of devices does the Digital Age Assurance Act apply to, and how would this work for laptops?
    1. The Digital Age Assurance Act would apply to all devices that are designed for and capable of communicating across a computer network for the purpose of transmitting, receiving, or storing data, including, but not limited to, a desktop, laptop, cellular telephone, tablet, or other device designed for such purpose.
    2. Desktop computers and laptops run on an operating system, such as Google (Android), Apple (macOS/iOS), or Microsoft (Windows), and device-based age verification can be implemented through the operating system during profile-level setup. This allows device-based age verification to work with the integrated security processes in place for accessing desktop and laptop computers, including shared devices with multiple accounts or profiles.
  9. Does the Digital Age Assurance Act offer protections on the app store level?
    1. The Digital Age Assurance Act requires application stores to receive the digital age signal regarding whether an individual is under the age of thirteen, at least thirteen years of age and under sixteen years of age, at least sixteen years of age and under eighteen years of age, or at least eighteen years of age. The application store would receive the digital age signal from the device's operating system via the same real-time API integration required for websites, applications, or online services.
    2. If the device's user is under sixteen years of age, application stores would be required to obtain parental or guardian consent prior to permitting an individual to download an application from the application store. Additionally, application stores would be required to provide the parent or guardian with the option to connect to the developer of the downloaded application for the purpose of facilitating parental supervision tools.
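As a rough illustration of how these two requirements could compose, the sketch below gates a download on the age signal and, for users under sixteen, on parental consent. All names are hypothetical; the act deliberately leaves the consent mechanism to the application store.

```typescript
// Hypothetical app-store download check. The DAAA does not prescribe a
// consent mechanism, so obtainParentalConsent is a placeholder for
// whatever flow the store implements (e.g., ID check, card transaction).

type AgeBracket = "under13" | "13to15" | "16to17" | "18plus";

// Placeholder for the store's verifiable-parental-consent flow.
declare function obtainParentalConsent(appId: string): Promise<boolean>;

async function authorizeDownload(bracket: AgeBracket, appId: string): Promise<boolean> {
  const underSixteen = bracket === "under13" || bracket === "13to15";
  if (!underSixteen) return true; // sixteen and over: no consent gate
  // Under sixteen: parental or guardian consent is required before download.
  return obtainParentalConsent(appId);
}
```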
  10. Why is a "commercially reasonable" method of age verification so broad?
    1. Covered manufacturers are required to take commercially reasonable and technically feasible steps to determine or estimate the age of the device's user upon device activation.
    2. Given the ever-evolving state of technology, providing the option for a "commercially reasonable method" of age verification allows Covered Manufacturers to deploy verification methods in line with their existing processes and technology. Many major device manufacturers, such as Google and Apple, have already integrated age verification processes in their standard device setup practices. Rather than outlining stringent, narrowly defined requirements for age verification, "commercially reasonable" methods allow the trusted handful of Covered Manufacturers to implement processes conducive to their existing system, and therefore allows for the seamless implementation of device-based age verification.
  11. What would be the fiscal impact of a proposed bill?
    1. The Digital Age Assurance Act would allow the enforcing body to seek civil damages and issue fines for Covered Manufacturers or websites, applications, application stores, or online services found in violation. These fines may total up to $10,000 per violation of this act, or up to $2,500 per each minor actually harmed in violation of the act.
    2. Enforcement costs are anticipated to be very low, as investigations could be conducted through API implementation checks: scanning websites, applications, application stores, or online services at scale for the appropriate API integration, as sketched below. Penalties collected from violations identified during these investigations would largely fund any associated costs of enforcement. Note that the fiscal impact of a proposed device-based age assurance bill may vary across states, depending on how the enforcing agency pursues violations.
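A minimal sketch of what such an API implementation check might look like follows. The well-known probe path is invented for illustration; no such standard endpoint exists today, and a real enforcement scan would depend on how the integration is actually exposed.

```typescript
// Hypothetical compliance scan: probe each site for evidence of the expected
// age-signal API integration. The probe path is an assumption, not a standard.

async function hasAgeSignalIntegration(origin: string): Promise<boolean> {
  try {
    const res = await fetch(`${origin}/.well-known/age-signal-integration`);
    return res.ok;
  } catch {
    return false; // unreachable sites are treated as lacking the integration
  }
}

/** Scan candidate sites and return those with no detectable integration. */
async function scanForViolations(origins: string[]): Promise<string[]> {
  const flagged: string[] = [];
  for (const origin of origins) {
    if (!(await hasAgeSignalIntegration(origin))) flagged.push(origin);
  }
  return flagged;
}
```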
  12. What if an individual does not have the proper documentation to verify their age?
    1. As with all legislation, accommodations are made for exceptions (limitations for the undocumented also exist with platform-level age verification). The commercially reasonable method of age verification should allow Covered Manufacturers to determine different methods of verification, from government-issued identification cards to age estimation technology. At present, these are potential options for age verification:
      • Documentation: If the Covered Manufacturer allows for government-issued documentation, it can be any government document, including a foreign ID.
      • Technology: If the individual attempting to verify does not have any government-issued identification, a Covered Manufacturer could employ existing technologies, such as age estimation technology.
    2. It's important to note that age verification is not identity verification, and with existing technology capabilities it is possible for Covered Manufacturers to tie an age to an individual's account without knowing the identity of an individual.
  13. What happens if a user does not want to verify their age upon device activation?
    1. The Digital Age Assurance Act would require all users to verify their age upon device activation, or during OS updates for devices sold prior to the effective date. If a user's age is verified as under eighteen or another applicable age threshold, the user would not be permitted to access websites, applications, or online services that require age verification.
  14. If the device is indicated as a child device, how is an application store expected to receive verifiable parental consent?
    1. Application stores would be required to implement a reasonably designed, technically feasible method for receiving verifiable parental consent, similar to how manufacturers of devices or operating systems are required to take commercially reasonable and technically feasible steps to verify a user's age.
    2. The Digital Age Assurance Act does not prescribe a specific method for receiving parental consent so that application stores may design and deploy a mechanism that is best suited to their existing technology. Note that Covered Manufacturers or websites, applications, or online services will still be required to comply with separate, federal regulations regarding parental consent not associated with the Digital Age Assurance Act.
    3. Several existing U.S. federal privacy laws suggest acceptable methods to obtain verifiable parental consent, including:
      • Having the parent provide a copy of a government-issued ID that can be checked against a database.
      • Having the parent use a credit or debit card to make a transaction, which notifies the account holder.
      • Having the parent answer a series of knowledge-based questions.
      • Having the parent call a toll-free number staffed by trained personnel.

The First Line of Defense in Online Child Protection: A Model Framework for Employers of Content Moderators

2024


DAAA Constitutional Analysis

2024

THE DIGITAL AGE ASSURANCE ACT PASSES CONSTITUTIONAL MUSTER

The Digital Age Assurance Act of 2024 (the "Digital Age Assurance Act") rests on firm constitutional grounds. It would require adult-content websites and related providers to utilize existing technology to curb minors' access to sexually explicit, adults-only online content. Because age verification can occur automatically whenever a device attempts to gain access to an adults-only site, and neither burdens users nor requires disclosure of identity or any other personal information, it avoids the constitutional concerns that have been raised against prior age verification schemes or other measures to curb minors' access to online adult content. Users do not have to reveal any personal information or take any affirmative action. Operating systems already have the Application Programming Interfaces ("APIs") and software capable of providing owners' age status whenever a website asks for age verification. The bill would not materially burden users, content providers, or even operating system manufacturers.

From a constitutional perspective, the Digital Age Assurance Act falls well within established constitutional limits. No constitutionally protected speech is impeded, and none is chilled. The Digital Age Assurance Act even addresses the concerns raised in a pending Supreme Court case (Free Speech Coalition v. Paxton) addressing much more intrusive age-verification laws in 19 states. If enacted, it would pass constitutional muster under existing Supreme Court holdings (discussed below), such as United States v. Playboy Ent't Grp., Inc. This is a completely safe, legal, and appropriate way to prevent minors' access to online adult content.

BACKGROUND

The Digital Age Assurance Act is quite simple. Device-based age assurance verifies a user's age through their device's operating system and shares the user's verified age or age range with the application, service, or website the user is attempting to visit. Existing features on all operating systems, such as Apple's Wallet®, can provide the device owner's age without disclosing any personal information. Privacy is fully protected: the only information relayed is age range, which is not constitutionally protected when adults-only content is the subject.

Only three provisions of the Digital Age Assurance Act are constitutionally significant here: it requires (1) manufacturers to take commercially reasonable and technically feasible steps to establish the age of the device's owner or user; (2) manufacturers to provide websites, applications, and online services with a signal indicating whether the individual is under age 13, between 13 and 15, between 16 and 17, or 18 or older; and (3) websites, applications, and online services with statutorily defined "mature content" to recognize these age signals and to block access if the individual is underage. None of these requirements raises a constitutional red flag.

ANALYSIS

1. The Digital Age Assurance Act Meets All Constitutional Tests for Age Verification Mandates.

States have broad authority to enact laws restricting minors' access to material that is "obscene as to youths," Erznoznik v. City of Jacksonville, 422 U.S. 205, 213 (1975), recognizing that what is acceptable for adults may be inappropriate for children. See Ginsberg v. New York, 390 U.S. 629, 634-35 (1968) (holding that a state law requiring distributors to verify customers' age when selling pornographic material did not violate the First Amendment so long as a rational basis existed for finding that the material was harmful to children); Reno v. ACLU, 521 U.S. 844, 875 (1997) (government's valid "interest in protecting children from harmful materials … does not justify an unnecessarily broad suppression of speech addressed to adults"). Minors have no First Amendment right of access to the "mature" content subject to the Digital Age Assurance Act. Indeed, the Digital Age Assurance Act presumes that the content providers are required by various federal and state laws to prohibit minors' access to mature, adults-only content. The rights at issue belong to device owners, manufacturers, websites, applications, online services, and content providers, and none is infringed by the Digital Age Assurance Act.

  • No interference with adults’ access to mature content. Adults would maintain the exact same access to adult content that they presently have. Contrast with United States v. Playboy Ent’t Grp., Inc., 529 U.S. 803, 813 (2000) (finding that federal law restricting hours when sexually explicit television programming could be shown infringes adult viewers’ First Amendment rights of access to non-obscene communication). 
  • No chilling effect because no burden on users. The principal objection to age verification statutes is the burden placed on adults when accessing mature-content sites or services: a requirement to provide government-issued or comparable identification, which, due to privacy concerns, chills their access and implicates First Amendment concerns. Device-based age verification has no such chilling effect. Owners and users are not affected because they are not required to act. The only information revealed is the user/owner's age status.
  • No invasion of user privacy. Because age is the only information transmitted by the signal, the user/owner does not reveal any personal information. There is no risk of disclosure from hacks or other intrusions. 
  • No undue burden on manufacturers. The technology to implement the Digital Age Assurance Act already exists. Requiring an operating system sold or used in a state to transmit the owner's age range based upon the device's registration data is a minimal burden that neither interferes with interstate commerce (i.e., it does not violate the dormant commerce clause despite potentially broad extraterritorial effect outside of the enacting state) (see Nat'l Pork Producers Council v. Ross, 598 U.S. 356 (2023) (upholding California statute requiring all pork sold in state to meet animal-welfare requirements)), nor violates due process (see Exxon Corp. v. Governor of Maryland, 437 U.S. 117, 125 (1978) (holding that state law regulating in-state economic conduct of out-of-state companies does not violate substantive due process when it is rationally related to a legitimate state purpose)).
  • No burden or chilling effect on content providers. Because the Digital Age Assurance Act merely requires content providers to accept and utilize the age verification data that would be provided automatically when a device seeks access to a site with mature content, it does not burden or deter their First Amendment rights to free speech and expression. Requiring an adult site to recognize a signal indicating that the user is a minor is no different than requiring a vendor in the non-digital world to ascertain a customer’s adult status prior to selling adult-only products. 
  • No less restrictive alternatives. Even if the Digital Age Assurance Act burdens adults' right to access constitutionally protected content—and it does not—it still would pass any possible requirement of "strict scrutiny" under the First Amendment, which requires the state to show that the statutory scheme is narrowly tailored such that no less restrictive alternative is available. See Playboy, 529 U.S. at 813. Here, no less intrusive means of age verification is even possible. All other means of restricting minors' access (such as screening software or requiring actual proof of age status) have much greater effect and burden on adult owners/users. Indeed, by avoiding these more intrusive alternatives, the Digital Age Assurance Act enhances adults' access.
The Digital Age Assurance Act thus readily satisfies all applicable constitutional tests.

2. The Digital Age Assurance Act Avoids the Constitutional Issues Raised in Paxton.

The constitutionally benign character of the Digital Age Assurance Act is clear in contrast to the age verification statutes at issue in Paxton and lower-court decisions addressing their potential unconstitutionality. Those laws require online adult-content users to identify themselves by government-issued identification, facial recognition technology, or other commercially reasonable means to establish their age before accessing the content. In Free Speech Coal., Inc. v. Paxton, 95 F.4th 263 (5th Cir. 2024), a divided Fifth Circuit panel declined to enjoin preliminarily a Texas age verification statute, ruling that it was no more intrusive than the verification requirement applied to direct physical access in Ginsberg and thus passed constitutional muster under Ginsberg's minimum-scrutiny test. Id. at 269-71. The U.S. Supreme Court has granted a writ of certiorari to review the Fifth Circuit's decision.

None of the concerns raised by the Paxton dissent, the Supreme Court petitioners, or critics of the Fifth Circuit decision applies to the requirements set forth in the Digital Age Assurance Act. Indeed, device-based age verification is exactly the type of benign measure that First Amendment advocates have been insisting that states pursue in lieu of more intrusive or burdensome means.

First and foremost, no speech for adults is regulated by the Digital Age Assurance Act. By contrast, the Texas age-verification statute in Paxton "necessarily encompasses non-obscene, sexually expressive—and constitutionally protected—speech for adults." Paxton, 95 F.4th at __ (Higginbotham, J., dissenting). Because no adult speech is regulated, the statute is assessed under minimal scrutiny (rational basis), and, since Ginsberg, age verification to restrict minors' access to sexually explicit material has been a universally accepted legislative objective.

Second, as shown above, the Digital Age Assurance Act passes strict scrutiny. Its age verification mechanism is the least restrictive alternative because it is narrowly tailored to protect user anonymity without impeding adult users' access. Indeed, age verification has long been cited as the line separating permissible age verification in the physical world (adult purchasers are visually age verified in most cases) from the means attempted in prior online access cases. Thus, the principal issue under review by the Supreme Court—whether strict scrutiny or minimum scrutiny applies to the Texas statute—is immaterial here.

Third, because anonymity and privacy are completely protected, the Digital Age Assurance Act poses no chilling effect on access to protected material. Contrast with Paxton, 95 F.4th at 303 (Higginbotham, J., dissenting) ("the age verification mandate will chill protected speech").

Finally, the lack of any tangible burden on manufacturers, websites, content providers, or users stands in stark contrast with the Texas statute under review in Paxton. There, users must authenticate their age before accessing the materials; providers must warn of risks from viewing the material; and users must view the warnings before gaining access. The Digital Age Assurance Act's age verification process seamlessly avoids any such barriers between user and content provider.
In sum, the Digital Age Assurance Act deftly avoids all of the constitutional concerns raised in Paxton.

DAAA Technical Whitepaper

2024

Device-Based Age Assurance: A Safer Approach to Ensuring Access to Age-Appropriate Content

Last Updated Date: July 31, 2024

Executive Summary

Age verification has become a priority for lawmakers in their efforts to make the internet a safer space. Unfortunately, current efforts to enact age verification laws to prevent minors from accessing adult content pose significant data privacy and security concerns, present Constitutional challenges to First Amendment free speech rights, increase barriers to proper platform compliance and enforcement by dissuading user retention, and can lead to negative outcomes from unintended user behavior. Age verification solutions must be properly implemented to achieve the primary goal of protecting children, while simultaneously preserving the privacy and Constitutional rights of adult consumers.

As such, there is a significant need for legislation that places the age assurance mandate at the source, on the device, to resolve these challenges. Device-based technology as a method of age assurance is technically feasible to implement and relies on elements that are already common practice across industries. Device-based age assurance verifies a user's age through their device's operating system and shares the user's verified age or age range with the application, service, or website the user is attempting to reach, creating a safer, privacy-preserving approach to age verification, while also addressing the various Constitutional and compliance challenges posed by existing iterations of current legislation.

Implications of Current Online Age Verification Laws

The current age verification requirements under recent legislation are ineffective and pose significant implications for the privacy and Constitutional rights of adults. In June 2022, Louisiana passed an age verification law requiring platforms and websites that contain "a substantial amount of adult material" to implement an age verification method prior to granting users access to the website's content. Since then, eighteen (18) additional states have followed suit. Requirements vary largely across states for what constitutes a reasonable age verification method, ranging from highly invasive methods such as uploading a government-issued identification (ID) card, to vague methods permitted so long as they are "commercially reasonable."

While these age verification laws may be well-intentioned in protecting minors from accessing age-inappropriate content, they fail to do so. Additionally, they require adult users to upload personal information and sensitive data prior to accessing content, posing adverse consequences to user privacy and constitutionally protected speech. As current legislation requires verification to occur on a per-platform basis, adult users are required to upload or provide personal information numerous times across multiple platforms, significantly increasing the risk of misuse, phishing, compromise in data breaches, and widespread identity theft. Additionally, existing age verification mandates burden adult users' access to Constitutionally protected speech. Existing age verification laws, including Texas' H.B. 1181, are actively undergoing challenges in district and appellate courts as well as the Supreme Court for harming the speech rights of adults by creating a government-mandated, restrictive barrier to access.
Though some companies with a genuine interest in protecting children and the privacy of adult users may comply with these regulations and take users' safety and privacy into account, many companies and sites may not have the resources or desire to comply in a comprehensive manner. This results in a patchwork approach to compliance with age verification laws: each of the potentially hundreds of thousands of platforms may have their own systems or third-party vendors, with a high degree of variance in how securely they store information, how much due diligence they perform on third-party vendors, and how strongly they or their third parties uphold data deletion policies.

Additionally, existing age verification laws dissuade compliance. Compliant sites that implement proper verification protocols have experienced a significant exodus of users, since adult users that do not want to share personal information will seek out non-compliant sites, many of which are located outside the jurisdictions of the states. This has the effect of naturally redistributing users to non-compliant platforms and websites. Many smaller platforms and websites that are not compliant continue operating without effective processes to verify the age of users, or without proper safeguards in place to protect the personal information collected from users. In the end, the goal of protecting minors online falls woefully short.

What is Device-Based Age Assurance

The most effective, secure, and equitable solution for protecting all users, minors and adults alike, is to implement a mechanism that verifies a user's age only once and at the point of access to the internet: on the device. The user's age or age range can be shared with the application, online service, or website they are attempting to reach. This approach, otherwise known as device-based age assurance, would require a user's age to be independently verified one time by the device's operating system, and would securely store the user's age locally on the individual device.

When a user attempts to access a website containing adult content, the user's operating system would then share the user's verification status with the applicable website through a secure Application Programming Interface (API), which would provide the website with a response on whether the user meets the defined age thresholds to access the platform. This approach ensures a seamless experience between the user and the platform the user is trying to access, eliminating the need to upload personal information to a third-party verification system or to each adult content platform visited, and removing the barriers to accessing Constitutionally protected speech.

Device-based age assurance is straightforward and effective. The technology already exists and standardizes the age verification process, reducing potential points of failure, including the privacy, Constitutional, and compliance concerns with existing age verification laws. The crux of the approach requires collaboration with operating system companies, such as Apple (iOS), Google (Android), and Microsoft (Windows), to leverage existing infrastructure and technology to deploy a secure method to validate and store a user's age, and to create a secure API through which a user's age or age range can be shared with the adult website in an anonymized and secure manner.
Technical Feasibility of Device-Based Age Assurance

Current hardware and software systems are already beyond the maturity level required to deploy a device-based age assurance solution. Apple, Inc., one of the leading operating system and technology companies, can be examined as a case study to demonstrate the existing technological feasibility of device-based age assurance. More recently, Google has deployed age assurance functionality in the United Kingdom.

Secure data stored by device manufacturers and operating systems can be accessed through readily available, trusted, and developed APIs. An API is a set of protocols that allow software programs to communicate and access specific data points from other operating systems, applications, or services. This API integration provides websites with the functionality to request information, including age information, directly from the device's operating system without requiring the website to authenticate the personal information of the user. This allows platforms and websites to request and access data stored within the device without needing to directly interact with the backend architecture of the device's operating system.

As an example, Apple already maintains a Wallet API that is capable of the functionalities required for device-based age assurance. Verify with Wallet is an example of an API that allows integrations to share verified, authenticated age information with approved third-party applications. The data shared is limited to the integration's use case, ensuring the privacy of the device user. This prevents device manufacturers from oversharing user data beyond the approved use case and allows websites to minimize the amount of data they collect. Though the methods by which device manufacturers and operating systems may perform and collect age verification information can vary, the Verify with Wallet API demonstrates an example of the existing technology entities have in place to share limited personal data with third parties.

A Potential Implementation of Device-Based Age Assurance

By leveraging the above technology to securely store and share verified age information through APIs, it is possible to implement a highly effective device-based age assurance mechanism that addresses the unintended privacy risks of current, mandated platform-level age verification requirements. The user, device/operating system, and platforms/websites are all able to safely interact, verify ages using privacy-preserving approaches, and protect minors from accessing age-inappropriate content.

Step 1: Age Verification of the User

Upon activation of a device, a user will validate their age through commercially reasonable methods put into place by the operating system, such as inputting the required information on the local device.

Once the age information is verified, it can be stored locally on the device or by other secure methods implemented by the operating system. Storage on the device can be done securely, similarly to how government-issued IDs are currently stored on devices.

Step 2: Websites Requiring Age Verification Must Implement Sufficient API Integrations

Any website that is legally required to verify the ages of its users must implement a sufficient API integration with operating systems. The API integration must be reviewed and approved by the operating system before the site can request and receive any age data.
APIs are a common practice and already exist on devices, major operating systems, websites, and applications. Websites and applications use APIs on a daily basis in order to communicate with other services without needing to access the other's codebase or backend architecture. Each API integration use case is tracked by the operating systems as standard procedure to ensure the traceability and accountability of websites using these APIs. Websites are required to provide full transparency into the identity information the app requests.

Step 3: User Attempts to Access Age-Restricted Site

When a user attempts to visit a site using such an API, the site will send a request for the age-verified data through the approved API. The API then receives and processes the request. Based on the agreed-upon terms of the API integration, such as the use case for this information and the age data needed, the API will retrieve the necessary information from the operating system. The operating system could provide either the exact age of the user, or signals based on the legal thresholds (under 13, 13-15, 16-17, or 18+) defined within the state. The device will then provide the verified age data to the site (a site-side sketch of this flow appears at the end of this paper).

After obtaining age data, the site can then allow access or display permitted content to the user as per the site's age restriction policies. If a site is properly compliant, the user will not be able to access the site if the user's age is determined to be below the threshold for access.

Effective, Secure, and Equitable Age Assurance

Device-based age assurance is an effective, easy-to-implement, and technically feasible solution for preventing minors from accessing age-inappropriate material while protecting the privacy and Constitutional rights of adult users. By verifying a user's age through the device's operating system and securely sharing it through an API with approved websites and platforms, device-based age assurance mitigates the inherent privacy risks, Constitutional challenges, and patchwork nature of compliance currently posed by existing age verification laws. In addition, a device-based age assurance mechanism does not dissuade users from visiting compliant platforms and websites. Compliance with device-based age assurance would be considered better for business, reducing the number of non-compliant websites and therefore the opportunities for minors to access age-inappropriate content. Overall, the common goal of protecting minors online would be achieved.

Device-based age assurance is technically feasible to implement and can be securely leveraged across all platforms, apps, and websites. As demonstrated by Apple, one of the three major operating system companies, the innovations and technologies required to implement device-based age assurance are already widely in use and could be easily updated to enable this assurance mechanism globally within a short time horizon. Users would only need to validate and share their personal information with their operating system, which many users already trust with a high level of privacy and security. Device-based age assurance creates a simpler, more transparent, and secure ecosystem for all parties, and fulfills its main purpose of protecting minors from accessing inappropriate content online.
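To ground Steps 2 and 3 in something concrete, the sketch below shows a hypothetical site-side integration: the site presents its approved integration credentials and acts on the returned signal. Every name here is an assumption; real integrations would use whatever interface the operating system vendor publishes.

```typescript
// Hypothetical site-side flow for Steps 2-3. The integrationId stands in for
// whatever credential the operating system issues when it approves an
// integration; requestSignalFromOS stands in for the vendor-provided API.

type AgeSignal = "under13" | "13to15" | "16to17" | "18plus";

interface AgeSignalRequest {
  integrationId: string;          // issued at Step 2, when the OS approves the integration
  useCase: "mature-content-gate"; // the agreed-upon purpose for the data
}

declare function requestSignalFromOS(req: AgeSignalRequest): Promise<AgeSignal>;

async function handleVisit(integrationId: string): Promise<"grant" | "deny"> {
  // Step 3: request the age-verified data through the approved API.
  const signal = await requestSignalFromOS({ integrationId, useCase: "mature-content-gate" });
  // A compliant mature-content site admits only users signaled as 18+.
  return signal === "18plus" ? "grant" : "deny";
}
```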

DAAA

Digital Age Assurance Act Legislation

2024

Model State Legislation

Digital Age Assurance for Mature Content

Be it enacted by the State Legislature --

Section 1. Short title.

This act shall be known as the "Digital Age Assurance Act of 2024."

Section 2. Findings and purposes.

[Points to make:

  • Creating a barrier between minors and mature content and online experiences requires  a solution that protects minors while protecting privacy and safeguarding personal  information. 
  • Current efforts to make online environments safer for children are ineffective.
  • 95% of U.S. teens have access to smartphones, and on average teens in the U.S. use 40 different apps on digital devices. 97% of U.S. teens are daily internet users, with 46% saying they are online almost constantly. 
  • Individual app or website-based age assurance will not protect minors across this  broad online ecosystem. 
  • The most effective and privacy-protective way of achieving age assurance is at the  point of access, on devices themselves. 
  • Parents want more control over their minors’ device use and online experiences. The  best way to empower parents is to create an industry-wide solution where all online  services are held to the same consistent standard.] 
Section 3. Definitions.

As used in this act, the term:

(a) "Application Store" means a publicly available website, software application, or online service that distributes third party platforms' software applications to a computer, a mobile device, or any other general-purpose computing device.

(b) "Covered Manufacturer" means a manufacturer of a device, an operating system for a device, or an application store.

(c) "Department" means the Department of [justice/legal affairs].

(d) "Device" means a device or a portion of a device that is designed for and capable of communicating across a computer network with other computers or devices for the purpose of transmitting, receiving, or storing data, including, but not limited to, a desktop, laptop, cellular telephone, tablet, or other device designed for and capable of communicating with or across a computer network and that is used for such purpose.

(e) "Mature content" means the content defined in 18 U.S.C. 2256(2)(A) [or a state analogue].

(f) "Minor" means an individual under the age of 18.

(g) "Operating System Provider" means an entity that develops, distributes, and/or maintains a device's operating system, and provides common services. This includes but is not limited to the design, programming, and supply of operating systems for various devices such as smartphones, tablets, and other digital equipment.

(h) "Substantial portion" means more than 33-1/3% of total material on a website, application, or online service.

Section 4. Age assurance required.

(a) A Covered Manufacturer shall take commercially reasonable and technically feasible steps to --

(1) upon activation of a device, determine or estimate the age of the device's primary user;

(2) provide websites, applications, application stores, and online services with a digital signal regarding whether an individual is under the age of thirteen, at least thirteen years of age and under sixteen years of age, at least sixteen years of age and under eighteen years of age, or at least eighteen years of age via a real-time application programming interface ("API"); and

(3) if the Covered Manufacturer is an Application Store, obtain parental or guardian consent prior to permitting an individual under sixteen years of age to download an application from the Application Store and provide the parent or guardian with the option to connect the developer of such application with the approving parent or guardian for the purpose of facilitating parental supervision tools.

(b) For devices sold prior to the effective date of this act, Covered Manufacturers shall ensure that the requirements under subsection 4(a) are included in its operating system and app store versions and updates by default after the effective date.

(c) Any website, application, or online service which makes available mature content:

(1) Is required to recognize and allow for the receipt of digital age signals as intended in this act;

(2) Where the website, application, or online service makes available a substantial portion of mature content, it is required to:

(i) block access to the website, application, or online service if an age signal is received indicating an individual as under eighteen years of age;

(ii) provide a disclaimer to users or visitors that it contains mature content; and

(iii) label itself as restricted to adults.
(3) Where the website, application, or online service knowingly makes available less than a substantial portion of mature content, it is required to: (i) block access to known mature content if an age signal is received indicating an individual is not at least eighteen years of age, and (ii) provide a disclaimer to users or visitors prior to displaying known mature content.

(d) A website, application, or online service with actual knowledge that a user is under 18 years of age, including via receipt of a signal regarding an individual's age in accordance with section 4(a)(2), shall, to the extent applicable and technically feasible, provide readily available features for parents or guardians to support a minor with respect to their use of the service, including features to help manage which individuals or accounts are affirmatively linked to the minor, to help manage the delivery of age appropriate content, and to limit the amount of time that the minor spends daily on the website, application, or online service.

(e) A Covered Manufacturer shall comply with this Act in a nondiscriminatory manner, specifically including, but not limited to:

(1) A Covered Manufacturer shall impose at least the same restrictions and obligations on its own websites, applications, and online services as it does on those from third parties;

(2) A Covered Manufacturer shall not use data collected from third parties, or consent mechanisms deployed for third parties, in the course of compliance with this Act to compete against those third parties, give the Covered Manufacturer's services preference relative to those of third parties, or to otherwise use this data or consent mechanism in an anticompetitive manner.

(f) After notice and comment, the department may promulgate such rules and regulations as may be necessary to establish the processes by which entities are to comply with the provisions of this section.

Section 5. Enforcement.

(a) The Department shall have exclusive authority to enforce violations of this act.

(b) Prior to initiating any action under this act, the Department shall provide an entity 45 days' written notice identifying the specific provisions of this act the Department alleges are being violated. If within 45 days the entity cures the noticed violation and provides the Department a written statement that the alleged violations have been cured, no further action shall be initiated against the entity.

(c) If an entity continues to violate this act in breach of an express written statement provided to the Department under this section or fails to provide such written statement, the Department may initiate an action and seek damages for up to $10,000 per violation of this act [or … damages for up to $2,500 per each minor actually harmed in violation of this act]. Damages shall begin accruing after completion of the 45-day cure period in subsection 5(b).

(d) Nothing in this act shall be construed as providing the basis for, or be subject to, a private right of action for violations of this act or under any other law.

(e) A Covered Manufacturer, as defined in Section 3(b), shall not be subject to liability for failure to comply with this statute if that Covered Manufacturer has taken commercially reasonable and technically feasible steps to determine or estimate the age of the device's user as provided in Section 4(a)(1).

Section 6. Severability.
If any provision of this act or its application to any person or circumstances is held invalid, the invalidity does not affect other provisions or applications of this act, which can be given effect without the invalid provision or application, and to this end the provisions of this act are severable.

Section 7. Effective date.

This act shall take effect one year after it shall have become a law.

Section 8. Uniform Standards.

This act is intended to provide for uniformity of law. Any prior state laws, local ordinances, regulations, or policies adopted by a county, municipality, administrative agency, or other political subdivision of this state that are in conflict with the provisions of this act are hereby superseded and shall be deemed null and void to the extent of the conflict with this act.

Child Abuse/Exploitation, Child Sexual Abuse Material (Child Pornography), Missing Children/Child Abduction, Colombia, Ecuador, El Salvador, Guatemala, Honduras, Indonesia, Kenya, Moldova, Romania, Trinidad & Tobago, Uganda

Comparative analysis of ICMEC’s Multisectoral Response and Capacity Assessments (MRCs)

2024

The global fight against Child Sexual Exploitation and Abuse (CSEA) requires that the different actors involved have tools and models that allow them to understand, evaluate, and optimize the response capacities of the countries and the articulation of these capacities at the inter-sectoral, regional, and international levels.

This paper presents a comparative analysis of the Multisectoral Response and Capacity Assessments (MRCs) conducted by ICMEC's National Capacity Development program in 11 countries across 3 different regions of the world between December 2021 and March 2024.


Child Abuse/Exploitation, Child Protection, International Schools, Parents, Schools, Teachers

Protocol – Managing Allegations of Child Abuse by Educators and other Adults

2018

This protocol provides a framework to guide school leaders and strengthen decision-making as they manage allegations of child abuse by educators and other adults currently or previously working in international school settings. Building on the work of the International Task Force on Child Protection (ITFCP) and the Safeguarding Unit at Farrer & Co, we continue our work across professions to develop child protection resources for the international school community. We intend to enable schools to achieve the highest possible standards in child protection and safeguarding, which may exceed those set by local law.

ICMEC Publications

Part I – A statistical analysis of applications made in 2021 under the Hague Convention of 25 October 1980 on the Civil Aspects of International Child Abduction: Global Report

2024

(HCCH) This is the fifth statistical study to look into the operation of the Hague Convention of 25 October 1980 on the Civil Aspects of International Child Abduction. This study concerns all applications received by Central Authorities in 2021. Previous Studies analysed data from applications made in 2015 (Fourth Study), 2008 (Third Study), 2003 (Second Study) and 1999 (First Study).

ICMEC Publications, Missing Children/Child Abduction

Part II – A statistical analysis of applications made in 2021 under the Hague Convention of 25 October 1980 on the Civil Aspects of International Child Abduction: Regional report

2024

(HCCH) This is the fifth statistical study to look into the operation of the Hague Convention of 25 October 1980 on the Civil Aspects of International Child Abduction. This study concerns all applications received by Central Authorities in 2021. The Regional Report is the second part of the Study, following the Global Report. The Regional Report analyses region-specific data from the European Union, Latin America and the Caribbean, and Asia Pacific States.

Child Abuse/Exploitation, Child Pornography, Child Protection, Child Sexual Abuse Material (Child Pornography), Cybercrime, ICMEC Publications, Legislation, Child Protection, online exploitation, Online Safety, Policies and Procedures

Child Sexual Abuse Material: Model Legislation & Global Review (10th Edition)

2023

This groundbreaking report, often referred to as ICMEC's Rule of Law project, analyzes child sexual abuse material (CSAM) legislation in 196 countries around the world, and offers a “menu” of concepts to be considered when drafting anti-CSAM legislation. First released in April 2006, the report is currently in its 10th edition. The latest edition's findings show that since inception 150 countries have refined or implemented new anti-CSAM legislation, 140 countries criminalize simple CSAM possession, 125 countries define CSAM, and only 32 countries require ISP reporting of suspected CSAM.

Child Abuse/Exploitation, Child Protection, ICMEC Publications, Legislation, Brunei, Cambodia, Indonesia, Laos, Malaysia, Myanmar, Philippines, Singapore, Thailand, Vietnam

Protecting Children Against Sexual Offences in ASEAN Member States

2023

February 2023 (ICMEC Report) An overview and analysis of the current legal framework for the protection of children against sexual exploitation and abuse in ASEAN Member States. Published by the International Centre for Missing & Exploited Children and Freshfields Bruckhaus Deringer.

Child Abuse/Exploitation, Child Protection, Child Trafficking, ICMEC Publications, Migration, Missing Children/Child Abduction

Protecting Children on the Move – Understanding and Addressing the Risks of Abuse Exploitation and Going Missing during Migration

2022

April 2022 (ICMEC) Millions of children around the world—accompanied and unaccompanied—have been forced to migrate or displaced within their countries. Their lack of maturity and education, as well as vulnerabilities related to their ongoing physical and psychological development, predispose them to an increased risk of exploitation and to going missing prior to, during, and after their migratory journeys. With this publication, we hope to provide even greater insight into the ongoing and ever-evolving issue of child migration and the risks children face throughout their journeys. ICMEC aims to encourage cross-sector, cross-industry collaboration and engagement; promote ongoing training and education of healthcare professionals, educators, border personnel, and other key stakeholders; and encourage the provision of safe spaces for children regardless of their migrant status to ensure their basic rights are being upheld. With the proper resources, partnerships, and tools in place, migrant children will have a better chance of living healthier and safer lives.

Child Abuse/Exploitation, Child Pornography, Child Protection, Child Sexual Abuse Material (Child Pornography), Cybercrime, ICMEC Publications, Legislation, Philippines

Philippines Legal Review Position Paper

2021

October 2021 (ICMEC, Romulo Mabanta Buenaventura Sayoc & Delos Angeles) The purpose of this paper is to strengthen the child protection legal framework such that it leaves no child in the Philippines vulnerable to sexual exploitation and abuse. The analysis and recommendations reflect the real-world challenges faced by dozens of professionals navigating the admittedly robust child protection legal framework in the Philippines, and recommendations from a diverse stakeholder group that supports the child protection agenda. The paper enumerates several impediments faced by law enforcement, regulators, policymakers, private industry, civil society, and others seeking to advance effective action against this crime. Through practical recommendations, we endeavor to detail the current legal statutes that hinder or collectively challenge operational action by the diverse stakeholder group. We respectfully urge policymakers in the Philippines Congress to take up the cause and support the review, revision, and/or drafting of new legislation that will address the challenges identified and close the gaps that still leave children vulnerable.

FOUND: A Story of Hope

2021

Elieth Samara was abducted when she was just 19 days old. For 33 long hours, her mother Nayelly lived a nightmare until she was reunited with her daughter thanks to the tireless investigation of ICMEC and our allies around the globe. Watch this powerful story of heartbreak and hope. Once Elieth Samara was reported missing, ICMEC served as the on-the-ground advisor in the investigation. For 33 hours, ICMEC coordinated with dozens of partners from the U.S. Embassy, the Civil National Police, the Public Ministry in Guatemala, Alba-Keneth Alerts, and Facebook. These front-line child protection heroes acted swiftly and in coordination thanks to the training, technology, and inter-agency collaboration that have been fostered for more than a decade. While Elieth has been safely reunited with her mother, our work is not done. More than a million children like her are reported missing each year around the world – and many more are victims of abuse and exploitation. That’s why ICMEC remains dedicated to continuing our work, because one child missing, abused, or exploited is one too many.

Child Protection, ICMEC Publications, International Schools, Parents, Mandatory Reporters, Schools, Teachers, Training

Mandatory Reporter Infographic for Youth Serving Professionals

2020

(ICMEC) Your Role as a Mandatory Reporter - published by ICMEC to support youth-serving professionals in identifying signs, indicators, and grooming behavior, and in providing a trauma-informed response to child disclosure. Additional support for training on identifying grooming is available in these additional resources.

Child Abuse/Exploitation, Child Protection, International Schools, Parents, Missing Children/Child Abduction, Special Needs, Autism, IT Safeguarding Curriculum, Schools

Special Education Needs Abuse Prevention and Response

2020

All ICMEC resources are applicable to children with different abilities and needs. Some resources are written to address specific vulnerabilities of children with communication or psychosocial disabilities, in particular Autistic Spectrum Condition (ASC). These research-informed resources were compiled by trusted child protection partners such as NCMEC (US), NAS (US), NSPCC (UK), SWGfL (UK), and INEQE (UK).

Child Abuse/Exploitation, Child Protection, Peer-Peer Abuse, Policies and Procedures, Schools, Teachers

School Based Violence Prevention Handbook

2020

WHO - This handbook addresses the key elements of violence or abuse prevention in schools. It provides guidance for school officials and education authorities on how schools can embed violence prevention within their routine activities and across the points of interaction schools provide with children, parents and other community members. If implemented, the handbook will contribute much to helping achieve the SDGs and other global health and development goals. Produced by the World Health Organization in collaboration with UNESCO and UNICEF. Available in English, Spanish and French.

Safety Precautions for Teachers and Students During COVID

2020

ISPCAN - Infographic to support a healthy return to in-person school.

Child Sexual Abuse in Sports

2020

(IICSA) - Findings of the Independent Inquiry into Child Sexual Abuse in the United Kingdom. The report concludes with suggestions for change from victims and survivors, including continuing to raise awareness of sexual abuse in a sports context, better support and protection for those coming forward, and improving the communication that organisations have with survivors of abuse.

Child Abuse/Exploitation, Child Protection, Cybercrime

ITU Child Online Protection Guidelines 2020

2020

The International Telecommunication Union (ITU) developed its very first set of Child Online Protection (COP) Guidelines in 2009. Since then, the Internet has evolved beyond all recognition. While it has become an infinitely richer resource for children to play and learn, today’s children face many risks online.

The newly updated ITU Guidelines on Child Online Protection are a comprehensive set of recommendations for all relevant stakeholders on how to contribute to the development of a safe and empowering online environment for children and young people. There are four sets of the 2020 Child Online Protection (COP) Guidelines, targeting children, parents and educators, industry, and policymakers.

Education Portal Tour

2020

(ICMEC) View this introductory tour on how to use the Education Portal.

Child Protection, ICMEC Publications, International Schools, Peer-Peer Abuse, Risk Assessment, Schools

Student Safety Survey Samples

2020

Student safety surveys are used to identify child protection risks to be mitigated. They should not be used if staff are not trained to respond to student disclosure, if there is no child protection policy, or if there is no clear reporting pathway for students to disclose abuse. A lack of action following a safety survey could exacerbate existing instances of abuse.

Child Protection, ICMEC Publications, International Schools, Parents

Resources in Simplified Chinese (zh-CN)

2020

We are currently developing Simplified Chinese versions of our resources. Please share applicable resources with EdPortal@ICMEC.org.

Child Protection, ICMEC Publications, International Schools, Parents, Chinese, Coronavirus, COVID-19, Online Safety, Schools, Teachers, Virtual School

Safer Virtual School

2020

(ICMEC) The safeguarding challenges of online school can be overcome with risk-mitigating codes of conduct and child protection policies and procedures. Ensure staff are aware of their professional responsibilities to maintain appropriate boundaries with students. The usual school principles, such as avoiding one-on-one contact with students and keeping contact with vulnerable students supervised, observable, and interruptible, should also apply in online learning environments.

Child Abuse/Exploitation, Child Trafficking, ICMEC Publications

Improving Healthcare Services for Trafficked Persons: The Complete Toolkit

2019

This toolkit is designed to assist medical and mental health professionals, health administrators, government officials, shelter staff, and other care providers in assessing and improving health care services available to trafficked children and adults, either on-site at their own organization, or at one or more local facilities (‘referral network’). The kit contains four sections: (1) an overview of human trafficking, (2) a service-assessment tool for determining strengths and challenges in a given facility’s medical and/or mental health care delivery, (3) guidelines for developing or improving medical and mental health services for trafficked persons, and (4) a template for organizing the names of key local and national partners and their contact information. This list will help ensure comprehensive care for trafficked persons.

Child Protection, International Schools

ITFCP Child Protection Policy Planning Form

2019

(ITFCP) - Form that incorporates the new accreditation standards for child protection into policy planning for educational institutions.

Child Abuse/Exploitation, Child Trafficking, Cybercrime, ICMEC Publications, Legislation

Studies in Child Protection: Technology-Facilitated Child Sex Trafficking

2018

(ICMEC) - The Internet has global reach, which fuels the need for international legal cooperation to develop more stringent, overt laws to protect children from technology-facilitated child sex trafficking. While vast research exists regarding child sex trafficking broadly, this paper focuses specifically on: examining how and why technology is increasingly used to recruit, advertise, and send/receive payments for child sex trafficking; reviewing available international and regional legal instruments; surveying a sampling of relevant national legislation; presenting model legislative language for consideration; and discussing the role of the technology and financial industries in deterring traffickers from misusing their platforms to sexually exploit children.
