
Data Collection: Protecting personal information

Evaluating data collection takes into consideration best practices that limit the type and amount of personal information collected from a user to only the information needed to provide the application or service.

2.1.1: Collect PII (BASIC)

Do the policies clearly indicate whether or not the vendor collects personally identifiable information (PII)?

  • Indicator
    • Discloses Personally Identifiable Information (PII) is collected.
    • Discloses how the product collects personal information.
  • Citation
    • Children's Online Privacy Protection Act: (Personally Identifiable Information under COPPA includes first and last name, photos, videos, audio, geolocation information, persistent identifiers, IP address, cookies, and unique device identifiers) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
    • California Online Privacy Protection Act: (The term "Personally Identifiable Information" under CalOPPA means individually identifiable information about a consumer collected online by the operator from that individual and maintained by the operator in an accessible form, including any of the following: (1) A first and last name; (2) A home or other physical address, including street name and name of a city or town; (3) An e-mail address; (4) A telephone number; (5) A social security number; or (6) Any other identifier that permits the physical or online contacting of a specific individual) See California Online Privacy Protection Act (CalOPPA), Cal. B.&P. Code §22577(a)(1)-(6)
    • Family Educational Rights and Privacy Act: ("Personal Information" under FERPA includes direct identifiers such as a student or family member's name, or indirect identifiers such as a date of birth, or mother's maiden name, or other information that is linkable to a specific student that would allow a reasonable person in the school community to identify the student with reasonable certainty) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.3
    • General Data Protection Regulation: (“personal data” means any information relating to an identified or identifiable natural person ("data subject"); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person) See General Data Protection Regulation (GDPR), Definitions, Art. 4(1)
  • Background
    • FERPA defines the term personally identifiable information (PII) to include direct identifiers (such as a student's or other family member's name) and indirect identifiers (such as a student's date of birth, place of birth, or mother's maiden name). Indirect identifiers include metadata about a student's interaction with an application or service, and even aggregate information can be considered PII under FERPA if a reasonable person in the school community could identify individual students based on the indirect identifiers together with other reasonably available information, including other public information. See PTAC, Responsibilities of Third-Party Service Providers under FERPA, p. 2; See PTAC, Protecting Student Privacy While Using Online Educational Services: Model Terms of Service, p. 2.
    • Companies collect a wide range of personal information from users—from personal details and account profiles to a user’s activities and location. We expect companies to clearly disclose what user information they collect and how they do so. See Ranking Digital Rights, P3.
    • The term “user information” appears in many indicators throughout the Privacy category. An expansive interpretation of user information is defined as: “any data that is connected to an identifiable person, or may be connected to such a person by combining datasets or utilizing data-mining techniques.” As further explanation, user information is any data that documents a user’s characteristics and/or activities. This information may or may not be tied to a specific user account. This information includes, but is not limited to, personal correspondence, user-generated content, account preferences and settings, log and access data, data about a user’s activities or preferences collected from third parties either through behavioral tracking or purchasing of data, and all forms of metadata. See Ranking Digital Rights, P3.

2.1.2: PII Categories

Do the policies clearly indicate what categories of personally identifiable information are collected by the product?

2.4.1: Collection Limitation (BASIC)

Do the policies clearly indicate whether or not the vendor limits the collection or use of information to only data that are specifically required for the product?

2.1.3: Geolocation Data

Do the policies clearly indicate whether or not precise geolocation data are collected?

  • Indicator
    • Discloses location information is collected.
    • Discloses location information is derived from usage information.
  • Citation
    • Children's Online Privacy Protection Act: (Personally Identifiable Information under COPPA includes first and last name, photos, videos, audio, geolocation information, persistent identifiers, IP address, cookies, and unique device identifiers) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
    • Family Educational Rights and Privacy Act: ("Personal Information" under FERPA includes direct identifiers such as a student or family member's name, or indirect identifiers such as a date of birth, or mother's maiden name, or other information that is linkable to a specific student that would allow a reasonable person in the school community to identify the student with reasonable certainty) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.3
    • Student Online Personal Information Protection Act: ("Covered Information" under SOPIPA is personally identifiable information that includes descriptive information or identifies a student that was created or provided by a student, parent, teacher, district staff, or gathered by an operator through the operation of the site) See Student Online Personal Information Protection Act (SOPIPA), Cal. B.&P. Code § 22584(i)(1)-(3)
    • California Online Privacy Protection Act: (The term "Personally Identifiable Information" under CalOPPA means individually identifiable information about a consumer collected online by the operator from that individual and maintained by the operator in an accessible form, including any of the following: (1) A first and last name; (2) A home or other physical address, including street name and name of a city or town; (3) An e-mail address; (4) A telephone number; (5) A social security number; or (6) Any other identifier that permits the physical or online contacting of a specific individual) See California Online Privacy Protection Act (CalOPPA), Cal. B.&P. Code §22577(a)(1)-(6)
    • General Data Protection Regulation: (“personal data” means any information relating to an identified or identifiable natural person ("data subject"); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person) See General Data Protection Regulation (GDPR), Definitions, Art. 4(1)
  • Background
    • Location information collected in the mobile context is considered a persistent identifier that can be used to recognize a user over time and across different websites or online services. Geolocation data includes information sufficient to identify the latitude and longitude coordinates of a user, which can correspond to a specific street address or the name of a city or town. If location data is collected and shared with third parties, companies should work to provide consumers with more prominent notice and choices about their geolocation data collection, transfer, use, and disposal practices. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 33; See also U.S. v. Jones, 132 S. Ct. 945, 955 (2012)("GPS monitoring generates a precise, comprehensive record of a person's public movements that reflects a wealth of detail about her familial, political, professional, religious, and sexual associations").
    • For mobile ecosystems, we expect companies to clearly disclose what options users have to control the collection of their location information. A user’s location changes frequently and many users carry their mobile devices nearly everywhere, making the collection of this type of information particularly sensitive. In addition, the location settings on mobile ecosystems can influence how other products and services access their location information. For instance, mobile apps may enable users to control location information. However, if the device on which those mobile apps run collects geolocation data by default and does not give users a way to turn this off, users may not be able to limit that mobile app's collection of their location information. For these reasons, we expect companies to disclose that users can control how their device interacts with their location information. See Ranking Digital Rights, P7.
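The distinction between "precise" and coarse geolocation in the background above can be made concrete with a small sketch. This is purely illustrative and not drawn from any vendor's actual practice: rounding latitude/longitude coordinates is one simple way a product could avoid retaining street-level location data (the function name and precision figures are the author's assumptions, not part of the source policies).

```python
# Illustrative sketch: rounding coordinates coarsens location precision.
# Approximate precision per decimal place of a degree (near the equator):
#   4 decimals ~ 11 m  (street level)
#   2 decimals ~ 1.1 km (neighborhood level)
#   1 decimal  ~ 11 km  (city level)

def coarsen_location(lat: float, lon: float, decimals: int = 2) -> tuple:
    """Return coordinates rounded to the given number of decimal places."""
    return (round(lat, decimals), round(lon, decimals))

precise = (37.42246, -122.08406)            # hypothetical device GPS fix
coarse = coarsen_location(*precise, decimals=1)
print(coarse)  # (37.4, -122.1) -- roughly city-level precision
```

A policy that discloses collection of only coarsened coordinates describes a materially different practice than one collecting the raw GPS fix, which is why this indicator asks specifically about "precise" geolocation data.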

2.1.4: Health Data

Do the policies clearly indicate whether or not any health or biometric data are collected?

  • Indicator
    • Discloses health or biometric related information is collected.
  • Citation
    • Family Educational Rights and Privacy Act: (A biometric record, as used in the definition of personally identifiable information, means a record of one or more measurable biological or behavioral characteristics that can be used for automated recognition of an individual. Examples include fingerprints; retina and iris patterns; voiceprints; DNA sequence; facial characteristics; and handwriting) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.3
    • Children's Online Privacy Protection Act: (Personally Identifiable Information under COPPA includes first and last name, photos, videos, audio, geolocation information, persistent identifiers, IP address, cookies, and unique device identifiers) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
    • Student Online Personal Information Protection Act: ("Covered Information" under SOPIPA is personally identifiable information that includes descriptive information or identifies a student that was created or provided by a student, parent, teacher, district staff, or gathered by an operator through the operation of the site) See Student Online Personal Information Protection Act (SOPIPA), Cal. B.&P. Code § 22584(i)(1)-(3)
    • General Data Protection Regulation: (“personal data” means any information relating to an identified or identifiable natural person ("data subject"); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person) See General Data Protection Regulation (GDPR), Definitions, Art. 4(1)
    • General Data Protection Regulation: ("genetic data" means personal data relating to the inherited or acquired genetic characteristics of a natural person which give unique information about the physiology or the health of that natural person and which result, in particular, from an analysis of a biological sample from the natural person in question) See General Data Protection Regulation (GDPR), Definitions, Art. 4(13)
    • General Data Protection Regulation: ("biometric data" means personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data) See General Data Protection Regulation (GDPR), Definitions, Art. 4(14)
    • General Data Protection Regulation: ("data concerning health" means personal data related to the physical or mental health of a natural person, including the provision of health care services, which reveal information about his or her health status) See General Data Protection Regulation (GDPR), Definitions, Art. 4(15)
  • Background
    • Biometric data are physical or behavioral characteristics that can be used to identify unique individuals. Biometric technologies measure these unique characteristics electronically and match them against existing records to create a highly accurate identity management system. Fingerprints, retina scans, or voice and facial recognition are examples of physical identification technologies. Facial recognition uses the layout of facial features and their distance from one another for identification against a "gallery" of faces with similar characteristics. See Privacy Best Practice Recommendations For Commercial Biometric Use, NTIA Discussion Draft (July 22, 2015), p. 1.
    • The ability of facial recognition technology to identify consumers based solely on a photograph, create linkages between the offline and online world, and compile highly detailed dossiers of information, makes it especially important for companies using this technology to implement privacy by design concepts with robust choice and transparency policies. Such practices should include reducing the amount of time consumer information is retained, adopting reasonable security measures, and disclosing to consumers that the facial data collected may be used to link them to information from third-parties or publicly available sources. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 46.

2.1.5: Behavioral Data

Do the policies clearly indicate whether or not any behavioral data are collected?

  • Indicator
    • Discloses behavioral or usage information is collected.
  • Citation
    • Children's Online Privacy Protection Act: (An operator is prohibited from including behavioral advertisements or amassing a profile of a child under the age of 13 without parental consent) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
    • Family Educational Rights and Privacy Act: (A biometric record, as used in the definition of personally identifiable information, means a record of one or more measurable biological or behavioral characteristics that can be used for automated recognition of an individual. Examples include fingerprints; retina and iris patterns; voiceprints; DNA sequence; facial characteristics; and handwriting) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.3
    • General Data Protection Regulation: ("biometric data" means personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data) See General Data Protection Regulation (GDPR), Definitions, Art. 4(14)

2.1.6: Sensitive Data

Do the policies clearly indicate whether or not sensitive personal information is collected?

  • Indicator
    • Discloses collection of sensitive information such as ethnic, racial, national origin, cultural, religious, or social personal information.
  • Citation
    • General Data Protection Regulation: (Processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade-union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person's sex life or sexual orientation shall be prohibited unless: (a) the data subject has given explicit consent to the processing of those personal data for one or more specified purposes, except where Union or Member State law provide that the prohibition ... may not be lifted by the data subject) See General Data Protection Regulation (GDPR), Processing of special categories of personal data, Art. 9(1)-(2)(a)

2.1.7: Usage Data

Do the policies clearly indicate whether or not the product automatically collects any information?

  • Indicator
    • Discloses non-personal usage information is collected.
  • Citation
    • Children's Online Privacy Protection Act: (Personally Identifiable Information under COPPA includes first and last name, photos, videos, audio, geolocation information, persistent identifiers, IP address, cookies, and unique device identifiers) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
    • Family Educational Rights and Privacy Act: ("Personal Information" under FERPA includes direct identifiers such as a student or family member's name, or indirect identifiers such as a date of birth, or mother's maiden name, or other information that is linkable to a specific student that would allow a reasonable person in the school community to identify the student with reasonable certainty) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.3
    • Student Online Personal Information Protection Act: ("Covered Information" under SOPIPA is personally identifiable information that includes descriptive information or identifies a student that was created or provided by a student, parent, teacher, district staff, or gathered by an operator through the operation of the site) See Student Online Personal Information Protection Act (SOPIPA), Cal. B.&P. Code § 22584(i)(1)-(3)
    • California Online Privacy Protection Act: (The term "Personally Identifiable Information" under CalOPPA means individually identifiable information about a consumer collected online by the operator from that individual and maintained by the operator in an accessible form, including any of the following: (1) A first and last name; (2) A home or other physical address, including street name and name of a city or town; (3) An e-mail address; (4) A telephone number; (5) A social security number; or (6) Any other identifier that permits the physical or online contacting of a specific individual) See California Online Privacy Protection Act (CalOPPA), Cal. B.&P. Code §22577(a)(1)-(6)
    • General Data Protection Regulation: (“personal data” means any information relating to an identified or identifiable natural person ("data subject"); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person) See General Data Protection Regulation (GDPR), Definitions, Art. 4(1)
  • Background
    • The Children's Online Privacy Protection Act (COPPA) defines "personal information" to include identifiers, such as a customer number held in a cookie, an IP address, a processor or device serial number, or a unique device identifier that can be used to recognize a user over time and across different websites or online services, even where such an identifier is not paired with other items of personal information. Companies should disclose in their privacy policy, and in their direct notice to parents, their collection, use or disclosure practices of persistent identifiers unless: (1) the company collects no other "personal information," and (2) persistent identifiers are collected on or through a company's site or service solely for the purpose of providing "support for the internal operations" of the site or service. See FTC, Complying with COPPA: Frequently Asked Questions, q. 6.
    • Persistent identifiers collected for the sole purpose of providing support for the internal operations of the website or online service do not require parental consent, so long as no other personal information is collected and the persistent identifiers are not used or disclosed to contact a specific individual, including through behavioral advertising; to amass a profile on a specific individual; or for any other purpose. See FTC, Complying with COPPA: Frequently Asked Questions, q. 5.
    • The data on students collected and maintained by Ed Tech can be extremely sensitive, including medical histories, social and emotional assessments, progress reports, and test results. Online services also collect new types of data, which were not contemplated by and may not be protected by federal privacy laws. New data types collected by Ed Tech include "metadata," such as a student’s location, how many attempts a student made to answer a question, and whether a student is using a desktop or a mobile device. Metadata can be put to good use to personalize learning and to improve educational products. It can also be used to influence or market to students or to their parents. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 3.
    • A vendor should describe the types or categories of student information that it acquires from schools, school districts, teachers, parents, or students. Data types may include behavioral data reflecting how a student used the site or service or what content the student has accessed or created through it, and transactional data, such as persistent unique identifiers, collected through the use of the site or service. While unique identifiers are evolving with technology, currently such identifiers include, but are not limited to, cookies, device IDs, IP addresses, and other data elements if used to identify devices or users. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 11.

5.2.1: Collection Consent

Do the policies clearly indicate whether or not the vendor requests opt-in consent from a user at the time information is collected?

3.8.1: Third-Party Collection

Do the policies clearly indicate whether or not a user's personal information is collected by a third party?

Data Sharing: Protecting data from third parties

Evaluating data sharing takes into consideration best practices that protect a user's personal information from disclosure to third parties.

3.1.1: Data Shared (BASIC)

Do the policies clearly indicate if collected information (this includes data collected via automated tracking or usage analytics) is shared with third parties?

  • Indicator
    • Discloses user information is shared with third parties.
    • Discloses the type of user information shared with third parties.
  • Citation
  • Background
    • Online educational services increasingly collect a large amount of contextual or transactional data as part of their operations, often referred to as "metadata." Metadata refer to information that provides meaning and context to other data being collected; for example, information about how long a particular student took to perform an online task has more meaning if the user knows the date and time when the student completed the activity, how many attempts the student made, and how long the student's mouse hovered over an item (potentially indicating indecision). See PTAC, Protecting Student Privacy While Using Online Educational Services: Requirements and Best Practices, pp. 2-3.
    • Metadata that have been stripped of all direct and indirect identifiers are not considered protected information under FERPA, because the data are not PII. A provider that has been granted access to PII from education records under the "school official" exception may use any metadata that are not linked to FERPA-protected information for other purposes, unless otherwise prohibited by the terms of their agreement with the school or district. See PTAC, Protecting Student Privacy While Using Online Educational Services: Requirements and Best Practices, pp. 2-3.
    • Companies collect a wide range of personal information from users—from personal details and account profiles to a user’s activities and location. Companies also often share this information with third parties, such as advertisers, governments, and legal authorities. We expect companies to clearly disclose what user information they share and with whom. Company disclosure should specify if it shares user information with governments and with commercial entities. See Ranking Digital Rights, P4.

3.1.2: Data Categories (BASIC)

Do the policies clearly indicate what categories of information are shared with third parties?

  • Indicator
    • Discloses the categories of information shared with third parties.
  • Citation
  • Background
    • Consumers deserve more transparency about how their data is shared beyond the entities with which they do business directly, including "third-party" data collectors. This means ensuring that consumers are meaningfully aware of the spectrum of information collection and reuse as the number of firms that are involved in mediating their consumer experience or collecting information from them multiplies. The data services industry should follow the lead of the online advertising and credit industries and build a common website or online portal that lists companies, describes their data practices, and provides methods for consumers to better control how their information is collected and used or to opt-out of certain marketing uses. See Exec. Office of the President, Big Data: Seizing Opportunities, Preserving Values (2014), p. 62.
    • What is the "School Official" Exception? In some cases, providers need PII from a student's education records in order to deliver the agreed-upon services. FERPA's school official exception to consent is most likely to apply to the schools' and districts' relationships with service providers. When schools and districts outsource institutional services or functions, FERPA permits the disclosure of PII from education records to contractors, consultants, volunteers, or other third parties provided that the outside party meets specified requirements. See 34 C.F.R. § 99.31(a)(1)(i); See also PTAC, Responsibilities of Third-Party Service Providers under FERPA, p. 2; See also PTAC, Protecting Student Privacy While Using Online Educational Services: Requirements and Best Practices, pp. 3-5.

3.2.1: Sharing Purpose

Do the policies clearly indicate the vendor's intention or purpose for sharing a user's personal information with third parties?

3.11.1: Third-Party Categories

Do the policies clearly indicate the categories of related third parties, such as subsidiaries or affiliates, with whom the vendor shares data?

3.2.2: Third-Party Analytics

Do the policies clearly indicate whether or not collected information is shared with third parties for analytics and tracking purposes?

3.2.3: Third-Party Research

Do the policies clearly indicate whether or not collected information is shared with third parties for research or product improvement purposes?

3.10.1: Third-Party Providers

Do the policies clearly indicate whether or not third-party services are used to support the internal operations of the vendor's product?

  • Indicator
    • Discloses third-party service providers may be used to support the product.
  • Citation
  • Background
    • Disclosure of personal information for the "internal operations" of the website or online service, means activities necessary for the site or service to maintain or analyze its functioning; perform network communications; authenticate users or personalize content; serve contextual advertising or cap the frequency of advertising; protect the security or integrity of the user, website, or online service; ensure legal or regulatory compliance; or fulfill a request of a child. See 16 C.F.R. 312.2; See also FTC, Complying with COPPA: Frequently Asked Questions, q. 5.

3.10.2: Third-Party Roles

Do the policies clearly indicate the role of third-party service providers?

3.14.1: Social Login (BASIC)

Do the policies clearly indicate whether or not social or federated login is supported to use the product?

  • Indicator
    • Discloses social login is supported to authenticate with the product.
  • Citation
    • California Privacy of Pupil Records: (Prohibits schools, school districts, county offices of education, and charter schools from collecting or maintaining information about pupils from social media for any purpose other than school or pupil safety, without notifying each parent or guardian and providing the pupil with access and an opportunity to correct or delete such information) See California Privacy of Pupil Records, Cal. Ed. Code § 49073.6(c)

3.16.1: Third-Party Limits (BASIC)

Do the policies clearly indicate whether or not the vendor imposes contractual limits on how third parties can use personal information that the vendor shares or sells to them?

  • Indicator
    • Discloses contractual obligations or restrictions are placed on third parties who receive user information.
  • Citation
    • Children's Online Privacy Protection Act: (An operator must take reasonable steps to release a child's personal information only to service providers and third parties who are capable of maintaining the confidentiality, security, and integrity of the information, and provide assurances that they contractually maintain the information in the same manner) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.8
    • Family Educational Rights and Privacy Act: (An exception for disclosing personally identifiable information without obtaining parental consent exists for sharing data with a third party who is considered a "school official" with a legitimate educational interest, and under direct control of the school for the use and maintenance of education records) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.31(a)(1)(i)(B)
    • Student Online Personal Information Protection Act: (An operator may disclose student information to a third party service provider, but the third party is prohibited from using the information for any purpose other than providing the service) See Student Online Personal Information Protection Act (SOPIPA), Cal. B.&P. Code § 22584(b)(4)(E)(i)
    • Student Online Personal Information Protection Act: (A third party service provider may not disclose student information to any subsequent third party) See Student Online Personal Information Protection Act (SOPIPA), Cal. B.&P. Code § 22584(b)(4)(E)(ii)
    • General Data Protection Regulation: (The processor shall not engage another processor without prior specific or general written authorisation of the controller. In the case of general written authorisation, the processor shall inform the controller of any intended changes concerning the addition or replacement of other processors, thereby giving the controller the opportunity to object to such changes.) See General Data Protection Regulation (GDPR), Processor, Art. 28(2)
    • General Data Protection Regulation: (Processing by a processor shall be governed by a contract or other legal act under Union or Member State law, that is binding on the processor with regard to the controller and that sets out the subject-matter and duration of the processing, the nature and purpose of the processing, the type of personal data and categories of data subjects and the obligations and rights of the controller.) See General Data Protection Regulation (GDPR), Processor, Art. 28(3)
    • General Data Protection Regulation: (Where a processor engages another processor for carrying out specific processing activities on behalf of the controller, the same data protection obligations as set out in the contract or other legal act between the controller and the processor ... shall be imposed on that other processor by way of a contract or other legal act under Union or Member State law, in particular providing sufficient guarantees to implement appropriate technical and organisational measures in such a manner that the processing will meet the requirements of this Regulation. Where that other processor fails to fulfil its data protection obligations, the initial processor shall remain fully liable to the controller for the performance of that other processor's obligations.) See General Data Protection Regulation (GDPR), Processor, Art. 28(4)
    • General Data Protection Regulation: (The processor and any person acting under the authority of the controller or of the processor, who has access to personal data, shall not process those data except on instructions from the controller) See General Data Protection Regulation (GDPR), Processing under the authority of the controller or processor, Art. 29
  • Background
    • When a company transfers data to another company, the emphasis should be placed not on the disclosures themselves, but on whether a disclosure leads to a use of personal data that is inconsistent with the context of its collection or with a consumer's expressed desire to control the data. Thus, if a company transfers personal data to a third party, it remains accountable and should hold the recipient accountable, through contracts or other legally enforceable instruments. See Exec. Office of the President, Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy (2012), p. 22.
    • A company's data would not be "reasonably linkable" to a particular consumer or device to the extent that the company implements three significant protections for that data: (1) a given data set is not reasonably identifiable, (2) the company publicly commits not to re-identify it, and (3) the company requires any downstream users of the data to keep it in de-identified form. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 21.
    • The ability to re-identify "anonymous" data supports the FTC's framework application to data that can be reasonably linked to a consumer or device, because consumers' privacy interest in data goes beyond what is strictly labeled PII. There exists a legitimate interest for consumers in having control over how companies collect and use aggregated or de-identified data, browser fingerprints, and other types of non-PII. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), pp. 18-19.
    • Properly de-identified data can reduce the risk of a person's sensitive personal information being disclosed, but data de-identification must be done carefully. Simple removal of direct identifiers from the data to be released does not constitute adequate de-identification. Properly performed de-identification involves removing or obscuring all identifiable information until all data that can lead to individual identification have been expunged or masked. Further, when making a determination as to whether the data have been sufficiently de-identified, it is necessary to take into consideration cumulative re-identification risk from all previous data releases and other reasonably available information. See PTC, Data De-identification: An Overview of Basic Terms, p. 3.
    • A vendor should contractually require their service providers who receive covered information acquired through the site or service to use the information only to provide the contracted service, not to further disclose the information, to implement and maintain reasonable security procedures and practices as required by law, and to return or delete covered information at the completion of the contract. A vendor should also include a requirement that any service providers notify the vendor immediately of any unauthorized disclosure of the student information in their custody, and then act promptly to provide proper notice as required by law, and should make clear to service providers that they may separately face liability for the mishandling of student data. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 13.
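The de-identification point above can be made concrete with a small illustration. The records, field names, and values below are entirely hypothetical; the sketch only shows why stripping direct identifiers is not enough: any record whose combination of remaining quasi-identifiers (here ZIP code, birth year, and gender) is unique in the data set can still single out one individual.

```python
from collections import Counter

# Hypothetical "de-identified" records: names and IDs removed, but
# quasi-identifiers (ZIP code, birth year, gender) remain.
records = [
    {"zip": "94301", "birth_year": 2008, "gender": "F"},
    {"zip": "94301", "birth_year": 2008, "gender": "F"},
    {"zip": "94301", "birth_year": 2008, "gender": "M"},
    {"zip": "94301", "birth_year": 2008, "gender": "M"},
    {"zip": "94610", "birth_year": 2009, "gender": "F"},  # a unique combination
]

# Count how many records share each quasi-identifier combination.
combos = Counter((r["zip"], r["birth_year"], r["gender"]) for r in records)

# Any combination appearing only once can be linked back to one person:
# anyone who knows those three facts can re-identify the "anonymous" record.
unique = [combo for combo, n in combos.items() if n == 1]
print(unique)  # → [('94610', 2009, 'F')]
```

This is the intuition behind the "cumulative re-identification risk" caution in the background note: each additional release or external data set adds more quasi-identifiers an attacker can join against.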

Data Security: Protecting against unauthorized access

Evaluating data security takes into consideration best practices that protect the integrity and confidentiality of a user's data.

8.1.1: Verify Identity

Do the policies clearly indicate whether or not the vendor or vendor-authorized third party verifies a user's identity with personal information?

  • Indicator
    • Discloses users are required to verify their identity with government-issued identification or with other forms of identification that could be connected to their offline identity.
    • Discloses users are required to verify their identity with personal information for parental consent purposes.
  • Citation
  • Background
    • The ability to communicate anonymously is essential to freedom of expression both on and offline. The use of a real name online, or requiring users to provide a company with identifying information, provides a link between online activities and a specific person. This presents human rights risks to those who, for example, voice opinions that don't align with a government's views or who engage in activism that a government does not permit. It also presents risks for people who are persecuted for religious beliefs or sexual orientation. We therefore expect companies to disclose whether they might ask users to verify their identities using government-issued ID or other forms of identification that could be connected to their offline identity. We acknowledge that users may have to provide information that could be connected to their offline identity in order to access paid features of various products and services. However, users should be able to access features that don't require payment without needing to provide information that can be tied to their offline identity. See Ranking Digital Rights, F11.

8.2.1: Account Required (BASIC)

Do the policies indicate whether or not the vendor requires a user to create an account with a username and password in order to use the product?

  • Indicator
    • Discloses users are required to create an account to use the product.

8.2.2: Managed Account (BASIC)

Do the policies clearly indicate whether or not the vendor provides user managed accounts for a parent, teacher, school or district?

  • Indicator
    • Discloses managed accounts are provided for parents, teachers, schools, or district staff.
    • Discloses accounts are created for students by parents, teachers, schools, or district staff.

8.2.3: Two-Factor Protection

Do the policies clearly indicate whether or not the security of a user's account is protected by two-factor authentication?

  • Indicator
    • Discloses user accounts can be protected with two-factor authentication.
    • Discloses managed accounts can be protected with two-factor authentication.

8.3.1: Security Agreement

Do the policies clearly indicate whether or not a third party with access to a user's information is contractually required to provide the same level of security protections as the vendor?

8.4.1: Reasonable Security (BASIC)

Do the policies clearly indicate whether or not reasonable security standards are used to protect the confidentiality of a user's personal information?

  • Indicator
    • Discloses security protections in place for users' information are based on industry standards and best practices.
    • Discloses complex passwords and failed login lockouts protect user information.
    • Discloses advanced authentication methods are provided by the company to prevent fraudulent access.
    • Discloses users can view their recent account activity and login information.
    • Discloses users are notified about unusual account activity and possible unauthorized access to their accounts.
  • Citation
  • Background
    • A vendor should provide a general description of the technical, administrative, and physical safeguards it uses to protect student information from unauthorized access, destruction, use, modification, or disclosure. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 14.
    • A vendor should implement and maintain reasonable security measures appropriate to the nature of the student information, including covered information, acquired through its site or service. A vendor should designate and train someone responsible for security and use a risk management process: identify data assets, assess threats and vulnerabilities, apply appropriate controls, monitor their effectiveness, and repeat the process. As discussed in the California Data Breach Report, the Center for Internet Security’s Critical Security Controls is a good starting point for high-priority security controls. The Federal Trade Commission’s Start with Security also offers helpful guidance. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 15.
    • This indicator is applicable to internet and mobile ecosystem companies. Companies hold significant amounts of user information, making them targets for malicious actors. We expect companies to help users protect themselves against such threats. Companies should clearly disclose that they use advanced authentication techniques to prevent unauthorized access to user accounts and information. We also expect companies to provide users with tools that enable them to secure their accounts and to know when their accounts may be compromised. See Ranking Digital Rights, P17.
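One of the indicators above mentions failed-login lockouts. As a purely illustrative sketch, the logic such a control describes might look like the following; the threshold, window, and in-memory storage are assumptions for the example, not any vendor's actual practice.

```python
import time

# Illustrative failed-login lockout policy (thresholds are assumptions).
MAX_ATTEMPTS = 5
LOCKOUT_SECONDS = 900  # 15-minute lockout window

failed = {}  # username -> (failed_attempt_count, first_failure_timestamp)

def record_failed_login(username, now=None):
    """Record a failed attempt; return True if the account is now locked."""
    now = now if now is not None else time.time()
    count, since = failed.get(username, (0, now))
    if now - since > LOCKOUT_SECONDS:
        count, since = 0, now  # previous window expired; start a new one
    count += 1
    failed[username] = (count, since)
    return count >= MAX_ATTEMPTS

def is_locked(username, now=None):
    """True while the account is inside an active lockout window."""
    now = now if now is not None else time.time()
    count, since = failed.get(username, (0, now))
    return count >= MAX_ATTEMPTS and now - since <= LOCKOUT_SECONDS
```

For example, five consecutive failures within the window lock the account, and the lock expires once the window has elapsed. A real system would also persist this state server-side and pair it with the notification indicators listed above (alerting the user to unusual activity).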

8.4.2: Employee Access

Do the policies clearly indicate whether or not the vendor implements physical access controls or limits employee access to user information?

  • Indicator
    • Discloses security processes are used that limit or monitor employee access to users' information.
    • Discloses physical access controls are used to limit employee access to users' information.
  • Citation
    • California AB 1584 - Privacy of Pupil Records: (A local educational agency that enters into a contract with a third party must ensure the contract contains a description of the actions the third party will take, including the designation and training of responsible individuals, to ensure the security and confidentiality of pupil records) See California AB 1584 - Privacy of Pupil Records, Cal. Ed. Code § 49073.1(b)(5)
  • Background

8.5.1: Transit Encryption (BASIC)

Do the policies clearly indicate whether or not all data in transit is encrypted?

  • Indicator
    • Discloses the transmission of user communications is encrypted using Secure Sockets Layer (SSL).
    • Discloses the transmission of user communications is encrypted using unique keys.
    • Discloses users can secure information with their own user-supplied encryption keys.
    • Discloses user communications are encrypted by default.
  • Citation
    • California Data Breach Notification Requirements: (A person or business that owns, licenses, or maintains personal information about a California resident is required to implement and maintain reasonable security procedures and practices appropriate to the nature of the information, and to protect the personal information from unauthorized access, destruction, use, modification, or disclosure) See California Data Breach Notification Requirements, Cal. Civ. Code § 1798.81.5
    • General Data Protection Regulation: (Taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons, the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk, including inter alia as appropriate: (a) the pseudonymisation and encryption of personal data) See General Data Protection Regulation (GDPR), Security of processing, Art. 32(1)(a)
  • Background
    • Encryption is an important tool for protecting freedom of expression and privacy. The UN Special Rapporteur on Freedom of Expression has stated unequivocally that encryption and anonymity are essential for the exercise and protection of human rights. We expect companies to clearly disclose that user communications are encrypted by default, that transmissions are protected by “perfect forward secrecy,” that users have an option to turn on end-to-end encryption, and whether the company offers end-to-end encryption by default. For mobile ecosystems, we expect companies to clearly disclose that they enable full-disk encryption. See Ranking Digital Rights, P16.
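To illustrate what "encrypted in transit" means on the client side, the sketch below configures a TLS context using Python's standard library. It is an example of the kind of control the indicator describes, not a statement about any particular vendor's implementation; note that although policies often still say "SSL," modern transport security uses TLS.

```python
import ssl

# A client-side TLS context that refuses plaintext fallbacks, verifies the
# server's certificate chain, and checks the hostname.
context = ssl.create_default_context()

# The defaults already enforce verification; assert them explicitly so a
# misconfiguration is caught early rather than silently downgraded.
assert context.verify_mode == ssl.CERT_REQUIRED
assert context.check_hostname is True

# Refuse legacy protocol versions that are no longer considered secure.
context.minimum_version = ssl.TLSVersion.TLSv1_2
```

Wrapping a socket with this context (via `context.wrap_socket(...)`) would then fail the connection if the server cannot present a valid certificate, which is the behavioral guarantee behind a "data in transit is encrypted" disclosure.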

8.6.1: Storage Encryption (BASIC)

Do the policies clearly indicate whether or not all data at rest is encrypted?

  • Indicator
    • Discloses user information is encrypted or inaccessible while in storage.
    • Discloses user information on mobile devices is encrypted with full disk encryption.
    • Discloses user information is encrypted if stored with third parties.
    • Discloses user information is encrypted while archived.
  • Citation
    • California Data Breach Notification Requirements: (A person or business that owns, licenses, or maintains personal information about a California resident is required to implement and maintain reasonable security procedures and practices appropriate to the nature of the information, and to protect the personal information from unauthorized access, destruction, use, modification, or disclosure) See California Data Breach Notification Requirements, Cal. Civ. Code § 1798.81.5
    • General Data Protection Regulation: (Taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons, the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk, including inter alia as appropriate: (a) the pseudonymisation and encryption of personal data) See General Data Protection Regulation (GDPR), Security of processing, Art. 32(1)(a)

8.7.1: Breach Notice (BASIC)

Do the policies clearly indicate whether or not the vendor provides notice in the event of a data breach to affected individuals?

  • Indicator
    • Discloses processes for notification of users affected by a data breach.
    • Discloses notification is provided to relevant legal authorities without unreasonable delay when a data breach occurs.
    • Discloses steps taken by the company to remedy the impact of a data breach on users.
  • Citation
  • Background
    • The breach notification laws in California and the 46 other states are similar in many ways, because most are modeled on the original California law. All of them require notifying individuals when their personal information has been breached, prefer written notification but allow using the "substitute method" in certain situations, allow for a law enforcement delay, and provide an exemption from the requirement to notify when data is encrypted and the keys required to decrypt the data are still secure. However, there are some differences, primarily in three areas: (1) the notification trigger, (2) the timing for notification, and (3) the definition of covered information. See CA DOJ, California Data Breach Report (2016).
    • A vendor should develop and describe the process for notifying schools or school districts, parents, legal guardians, or eligible students, as well as any appropriate government agencies, of any unauthorized disclosure of student information. A vendor should also determine whether the incident and the types of data involved require notification under California's breach notification law, and if so, take appropriate action. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 15.
    • When the security of users' data has been compromised due to a data breach, companies should have clearly disclosed processes in place for addressing the security threat and for notifying affected users. Given that data breaches can result in significant threats to an individual's financial or personal security, in addition to exposing private information, companies should make these security processes publicly available. Individuals can then make informed decisions and consider the potential risks before signing up for a service or giving a company their information. Company press releases or blog posts addressing a data breach after it has occurred do not qualify as sufficient disclosure for this indicator. We expect companies to have formal policies in place regarding their handling of data breaches if and when they occur, and companies to make this information about these policies and commitments public. See Ranking Digital Rights, P15.

Data Rights: Controlling rights to data

Evaluating data rights takes into consideration best practices of providing users with the ability to review, access, modify, delete, and export their personal information and content.

5.1.1: User Submission (BASIC)

Do the policies clearly indicate whether or not a user can create or upload content to the product?

  • Indicator
    • Discloses user content may be created or uploaded to the product.

5.6.1: Data Ownership

Do the policies clearly indicate whether or not a student, educator, parent, or the school retains ownership of the intellectual property rights to the data collected or uploaded to the product?

  • Indicator
    • Discloses copyright ownership of content remains with the user who created or uploaded the content to the product.
    • Discloses the company does not retain any control or ownership over the operation, use, inputs, or outputs of the product after it has been purchased by the consumer.
  • Citation
  • Background

6.1.1: Access Data (BASIC)

Do the policies clearly indicate whether or not the vendor provides authorized individuals a method to access a user's personal information?

6.3.1: Data Modification (BASIC)

Do the policies clearly indicate whether or not the vendor provides authorized individuals with the ability to modify a user's inaccurate data?

  • Indicator
    • Discloses processes for the correction or modification of users' information.
  • Citation
    • California Online Privacy Protection Act: (If the operator maintains a process for a consumer to review and request changes to any of their personally identifiable information they must provide a description of that process) See California Online Privacy Protection Act (CalOPPA), Cal. B.&P. Code §22575(b)(2)
    • General Data Protection Regulation: (The data subject shall have the right to obtain from the controller without undue delay the rectification of inaccurate personal data concerning him or her. Taking into account the purposes of the processing, the data subject shall have the right to have incomplete personal data completed, including by means of providing a supplementary statement.) See General Data Protection Regulation (GDPR), Right to rectification, Art. 16

6.4.1: Retention Policy

Do the policies clearly indicate the vendor's data retention policy, including any data sunsets or any time period after which a user's data will be automatically deleted if they are inactive on the product?

  • Indicator
    • Discloses a timeframe in which the company may retain user information.
    • Discloses users' information is automatically deleted after a specified timeframe.
    • Discloses users' information is retained for different timeframes based on the type of data collected.
  • Citation
  • Background
    • A vendor should retain student information acquired through the site or service only as long as allowed or required by the school or district. A vendor should also describe their data retention policy, including how long they retain student information and why. A vendor's default retention period for covered information should not be indefinite. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 12.
    • Companies collect a wide range of personal information from users in exchange for the use of and access to the company's products and services. This information can range from personal details, profiles, and account activities to information about a user's activities and location. We expect companies to clearly disclose how long they retain user information and the extent to which they remove identifiers from user information they retain. Users should also be able to understand what happens when they delete their accounts. Companies that choose to retain user information for extended periods of time should take steps to ensure that data is not tied to a specific user. Acknowledging the ongoing debates about the efficacy of de-identification processes, and the growing sophistication around re-identification practices, we still consider de-identification a positive step that companies can take to protect the privacy of their users. If companies collect multiple types of information, we expect them to provide detail on how they handle each type of information. See Ranking Digital Rights, P6.
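The retention indicators above describe per-data-type timeframes with automatic deletion. A minimal sketch of that idea, with entirely hypothetical categories and periods (no vendor's actual policy), might look like this:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention periods per data type; real values would come from
# the vendor's published policy or the school's contract.
RETENTION = {
    "account_profile": timedelta(days=365),
    "usage_logs": timedelta(days=90),
    "support_tickets": timedelta(days=180),
}

def expired(records, now):
    """Return the records whose retention period has elapsed."""
    return [r for r in records if now - r["created"] > RETENTION[r["type"]]]

now = datetime(2024, 6, 1, tzinfo=timezone.utc)
records = [
    {"id": 1, "type": "usage_logs", "created": now - timedelta(days=120)},
    {"id": 2, "type": "usage_logs", "created": now - timedelta(days=30)},
    {"id": 3, "type": "account_profile", "created": now - timedelta(days=400)},
]
print([r["id"] for r in expired(records, now)])  # → [1, 3]
```

A periodic sweep that deletes (or de-identifies) everything `expired` returns is one concrete way a vendor could honor a "retained for different timeframes based on the type of data collected" disclosure.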

6.5.4: Deletion Process (BASIC)

Do the policies clearly indicate whether or not the vendor provides a process for the school, parent, or eligible student to delete a student's personal information?

6.5.2: Account Deletion

Do the policies clearly indicate whether or not a user's data are deleted upon account cancellation or termination?

6.5.1: Deletion Purpose

Do the policies clearly indicate whether or not the vendor will delete a user's personal information when the data are no longer necessary to fulfill its intended purpose?

  • Indicator
    • Discloses users' information will be deleted when no longer necessary for the purpose for which it was collected.
  • Citation
    • Children's Online Privacy Protection Act: (An operator may retain information collected from a child only as long as necessary to fulfill the purpose for which it was collected and must delete the information using reasonable measures to prevent unauthorized use) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.10
    • California AB 1584 - Privacy of Pupil Records: (A local educational agency that enters into a contract with a third party must ensure the contract contains a certification that a pupil's records shall not be retained or available to the third party upon completion of the terms of the contract and a description of how that certification will be enforced) See California AB 1584 - Privacy of Pupil Records, Cal. Ed. Code § 49073.1(b)(7)
    • General Data Protection Regulation: (The data subject shall have the right to obtain from the controller the erasure of personal data concerning him or her without undue delay and the controller shall have the obligation to erase personal data without undue delay where one of the following grounds applies: (a) the personal data are no longer necessary in relation to the purposes for which they were collected or otherwise processed) See General Data Protection Regulation (GDPR), Right to erasure, Art. 17(1)(a)

6.1.2: Restrict Access

Do the policies clearly indicate whether or not the vendor provides mechanisms (permissions, roles, or access controls, etc.) to restrict what data are accessible to specific users?

6.6.1: User Export

Do the policies clearly indicate whether or not a user can export or download their data, including any user created content on the product?

Individual Control: Controlling data use

Responsible data use practices limit the use of personal information to only what is necessary to provide the application or service, and user controls allow users to change how their data are used.

5.4.1: User Control

Do the policies clearly indicate whether or not a user can control the vendor or third party's use of their information through privacy settings?

  • Indicator
    • Discloses how users can control the collection, use, or disclosure of their information.
  • Background
    • While notice and consent remain fundamental in many contexts, it is important to examine whether a greater focus on how data are used and reused would be a more productive basis for managing privacy rights in a big data environment. Creating mechanisms for individuals to participate in the use and distribution of their information after it is collected may actually be a better and more empowering way to allow people to access the benefits that derive from their information. Privacy protections must also evolve in a way that accommodates the social good that can come of big data use. See Exec. Office of the President, Big Data: Seizing Opportunities, Preserving Values (2014), p. 61.

4.1.1: Purpose Limitation

Do the policies clearly indicate whether or not the vendor limits the use of data collected by the product to the educational purpose for which it was collected?

  • Indicator
    • Discloses use of information is limited to the purpose for which it was collected.
    • Discloses user information is only used if it is directly relevant or necessary for the product.
  • Citation
  • Background
    • Any PII from a student's education record that the provider receives under FERPA's "school official" exception may only be used for the specific purpose for which it was disclosed (i.e., to perform the outsourced institutional service or function), and the school or district must have direct control over the use and maintenance of the PII by the provider receiving it. Further, under FERPA's school official exception, the provider may not share or sell FERPA-protected information, or re-use it for any other purposes, except as directed by the school or district and as permitted by FERPA. See PTAC, Protecting Student Privacy While Using Online Educational Services: Requirements and Best Practices, p. 5.
    • Companies should publicly commit to the principle of use limitation, which is part of the OECD privacy guidelines, among other frameworks. See Ranking Digital Rights, P5.

4.1.2: Data Purpose

Do the policies clearly indicate the context or purpose for which data are collected?

3.13.1: Vendor Combination

Do the policies clearly indicate whether or not data collected or maintained by the vendor can be augmented, extended, or combined with data from third-party sources?

  • Indicator
    • Discloses user information is combined with information from third parties by the vendor.
  • Citation

4.2.1: Combination Type

Do the policies clearly indicate whether or not the vendor would treat personally identifiable information (PII) combined with non-personally identifiable information as PII?

  • Indicator
    • Discloses any collected information combined with personal information is treated as Personally Identifiable Information (PII).
  • Citation
  • Background
    • When data are collected in one context and combined with data from other sources or different contexts, it increases the potential for an individual's privacy to be compromised. Combining data from multiple sources is part of the process of creating a digital profile of a student. Combining data from multiple sources can also be used to re-identify data sets that have been de-identified, or to identify individuals within data sets that have been shared as anonymous aggregated data. A privacy policy that prohibits third-parties from re-identifying anonymous aggregated data provides an additional level of privacy protection for users. See PTC, Data De-identification: An Overview of Basic Terms.

4.3.1: Context Notice

Do the policies clearly indicate whether or not notice is provided to a user if the vendor changes the context in which data are collected?

4.4.1: Context Consent

Do the policies clearly indicate whether or not the vendor will obtain consent if the practices in which data are collected change or are inconsistent with contractual requirements?

  • Indicator
    • Discloses consent will be obtained if the context in which data are collected or used changes.
  • Citation
    • General Data Protection Regulation: (Where the processing for a purpose other than that for which the personal data have been collected is not based on the data subject's consent or on a Union or Member State law which constitutes a necessary and proportionate measure in a democratic society to safeguard the objectives referred to in Article 23(1), the controller shall, in order to ascertain whether processing for another purpose is compatible with the purpose for which the personal data are initially collected, take into account several factors.) See General Data Protection Regulation (GDPR), Lawfulness of Processing, Art. 6(4)(a)-(d)

5.3.1: Complaint Notice

Do the policies clearly indicate whether or not the vendor has a grievance or remedy mechanism for users to file a complaint after the vendor restricts or removes a user's content or account?

  • Indicator
    • Discloses notification is provided to users if their account or content is restricted.
    • Discloses notification is provided to users who attempt to access content that has been restricted.
    • Discloses users can file a complaint if their account or content is restricted.
    • Discloses the reasons why a user's account or content may be restricted.
    • Discloses an appeal process for users to request their account or content be restored.
    • Discloses data about the number of accounts it restricts or closes on its own initiative.
    • Discloses data about the number of accounts it restricts or closes as a result of a government request.
    • Discloses data about the number of accounts it restricts or closes as a result of a request from private third-parties.
  • Citation
  • Background
    • Companies often set boundaries for what content users can post on a service as well as what activities users can engage in on the service. Companies can also restrict a user’s account, meaning that the user is unable to access the service, for violating these rules. For mobile ecosystems, this can include restricting access to an end-user’s account or a developer’s account. See Ranking Digital Rights, F3.
    • We also expect companies to clearly disclose whether they have a policy of granting priority or expedited consideration to any government authorities and/or members of private organizations or other entities that identify their organizational affiliation when they report content or users for allegedly violating the company’s rules. See Ranking Digital Rights, F3.
    • This indicator focuses on whether companies clearly disclose that they notify users when they take these types of actions (whether due to terms of service enforcement or third-party restriction requests). A company's decision to restrict or remove access to content or accounts can have a significant impact on users' freedom of expression and access to information rights. We therefore expect companies to disclose that they notify users when they have removed content, restricted a user's account, or otherwise restricted users' abilities to access a service. If a company removes content that a user has posted, we expect the company to inform that user about its decision. If a different user attempts to access content that the company has restricted, we expect the company to notify that user about the content restriction. We also expect companies to specify reasons for their decisions. This disclosure should be part of companies' explanations of their content and access restriction practices. See Ranking Digital Rights, F8.

5.5.2: Disclosure Request

Do the policies clearly indicate whether or not a user can request the vendor to provide all the personal information the vendor has shared with third parties?

  • Indicator
    • Discloses what types or categories of information users can obtain from a request.
    • Discloses users can obtain a copy of all their information collected by the product.
    • Discloses users can obtain a copy of all their information shared with third parties.
    • Discloses users can obtain their information in a structured data format.
  • Citation
  • Background
    • Users should be able to obtain all information that companies hold about them. We expect companies to clearly disclose what options users have to obtain this information, what data this record contains, and what formats users can obtain it in. See Ranking Digital Rights, P8.

5.5.3: Disclosure Notice

Do the policies clearly indicate whether or not the vendor will provide the affected user, school, parent, or student with notice in the event the vendor receives a government or legal request for their information?

  • Indicator
    • Discloses users are notified when government entities (including courts or other judicial bodies) request their user information.
    • Discloses notification is provided to affected individuals of a government or private request for information.
    • Discloses the number of legal requests for information received.
    • Discloses situations when the company might not notify users, including a description of the types of government requests it is prohibited by law from disclosing to users.
    • Discloses the number of legal requests the company is prohibited by law from disclosing.
    • Discloses commitment to carry out due diligence on requests before deciding how to respond and to deny unlawful requests.
    • Discloses guidance or examples of its process of providing notice.
  • Citation
    • Family Educational Rights and Privacy Act: (An educational agency or institution may disclose information for lawful reasons if they make a reasonable effort to notify the parent or eligible student of the order or subpoena in advance of compliance, so that the parent or eligible student may seek protective action) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.31(a)(9)(ii)
    • California Electronic Communications Privacy Act: (Prohibits a government entity from compelling the production of or access to electronic communication information or electronic device information, without a search warrant, wiretap order, order for electronic reader records, or subpoena issued under specified conditions, except for emergency situations) See California Electronic Communications Privacy Act, Cal. Pen. Code §§ 1546-1546.4
    • General Data Protection Regulation: (The data subject shall have the right to obtain from the controller the erasure of personal data concerning him or her without undue delay and the controller shall have the obligation to erase personal data without undue delay where one of the following grounds applies: ... (d) the personal data have been unlawfully processed) See General Data Protection Regulation (GDPR), Right to erasure, Art. 17(1)(d)
    • General Data Protection Regulation: (The data subject shall have the right to obtain from the controller restriction of processing where one of the following applies: ... (d) the data subject has objected to processing pursuant to Article 21(1) pending the verification whether the legitimate grounds of the controller override those of the data subject.) See General Data Protection Regulation (GDPR), Right to restriction of processing, Art. 18(1)(d)
  • Background
    • We expect companies to clearly disclose a commitment to notifying users when governments and private parties request data about users. We acknowledge that this notice may not be possible in legitimate cases of an ongoing investigation; however, we expect companies to specify what types of government requests they are prohibited by law from disclosing. See Ranking Digital Rights, P12.

Data Sold: Preventing sale of data

Evaluating data selling takes into consideration best practices of not sharing, renting, or selling a user’s personal information to third parties for financial gain.

3.4.1: Sell Data (BASIC)

Do the policies clearly indicate whether or not a user's personal information is sold or rented to third parties?

5.5.1: Opt-Out Consent

Do the policies clearly indicate whether or not a user can opt out from the disclosure or sale of their data to a third party?

7.1.1: Transfer Data (BASIC)

Do the policies clearly indicate whether or not the vendor can transfer a user's data in the event of the vendor's merger, acquisition, or bankruptcy?

7.1.3: Transfer Notice

Do the policies clearly indicate whether or not the vendor will notify users of a data transfer to a third-party successor, in the event of a vendor's bankruptcy, merger, or acquisition?

  • Indicator
    • Discloses notification is provided and consent is obtained from users before data is transferred to a third party.

7.2.1: Delete Transfer

Do the policies clearly indicate whether or not a user can request to delete their data prior to its transfer to a third-party successor in the event of a vendor bankruptcy, merger, or acquisition?

  • Indicator
    • Discloses users may request deletion of their information before data is transferred to a third party.

7.3.1: Contractual Limits

Do the policies clearly indicate whether or not the third-party successor of a data transfer is contractually required to provide the same privacy compliance required of the vendor?

  • Indicator
    • Discloses contractual obligations are imposed on third-party data-transfer successors to provide the same privacy protections as the company.
  • Citation
    • Children's Online Privacy Protection Act: (An operator must take reasonable steps to release a child's personal information only to service providers and third parties who are capable of maintaining the confidentiality, security, and integrity of the information, and provide assurances that they contractually maintain the information in the same manner) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.8
    • Student Online Personal Information Protection Act: (An operator may transfer a student's personal information to a third party in the event of a merger, acquisition, or bankruptcy, but the successor entity is subject to the same onward data privacy and security obligations) See Student Online Personal Information Protection Act (SOPIPA), Cal. B.&P. Code § 22584(b)(3)
    • General Data Protection Regulation: (Any transfer of personal data which are undergoing processing or are intended for processing after transfer to a third country or to an international organisation shall take place only if, subject to the other provisions of this Regulation, the conditions laid down in this Chapter are complied with by the controller and processor, including for onward transfers of personal data from the third country or an international organisation to another third country or to another international organisation. All provisions in this Chapter shall be applied in order to ensure that the level of protection of natural persons guaranteed by this Regulation is not undermined.) See General Data Protection Regulation (GDPR), General principle for transfers, Art. 44
  • Background

3.15.1: Data Deidentified

Do the policies clearly indicate whether or not a user's information that is shared or sold to a third-party is only done so in an anonymous or deidentified format?

  • Indicator
    • Discloses user information is shared in an anonymized or de-identified format.
    • Discloses user information is sold in an anonymized or de-identified format.
  • Citation
  • Background
    • There is nothing wrong with a provider using de-identified data for other purposes, because privacy statutes govern PII, not de-identified data. But because it can be difficult to fully de-identify data, as a best practice, an agreement between a company and a third party should prohibit re-identification and any future data transfers unless the third party also agrees not to attempt re-identification. It is also a best practice to be specific about the de-identification process. De-identification typically requires more than just removing any obvious individual identifiers, as other demographic or contextual information can often be used to re-identify specific individuals. Retaining location and school information can also greatly increase the risk of re-identification. See PTAC, Protecting Student Privacy While Using Online Educational Services: Model Terms of Service, P. 3.
    • Properly de-identified data can reduce the risk of a person's sensitive personal information being disclosed, but data de-identification must be done carefully. Simple removal of direct identifiers from the data to be released does not constitute adequate de-identification. Properly performed de-identification involves removing or obscuring all identifiable information until all data that can lead to individual identification have been expunged or masked. Further, when making a determination as to whether the data have been sufficiently de-identified, it is necessary to take into consideration cumulative re-identification risk from all previous data releases and other reasonably available information. See PTC, Data De-identification: An Overview of Basic Terms, p. 3.
    • FERPA allows properly de-identified data to be used for other purposes, though providers planning to use de-identified student data should be clear about their methodologies for de-identification. If de-identified data will be transferred to another party, it is a best practice to contractually prohibit the third-party from attempting to re-identify any student data. Providers should also acknowledge whether anonymized metadata—a type of deidentified or partially de-identified data—will be used, and for what purposes. See PTAC, Responsibilities of Third-Party Service Providers under FERPA, P. 3.
    • If a vendor shares covered information for the development and improvement of educational sites or services, they should de-identify and aggregate the information first. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 14.
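The record-level de-identification practice described above can be illustrated with a minimal sketch: direct identifiers are dropped outright, while quasi-identifiers such as geography and birth date, which the guidance notes greatly increase re-identification risk, are generalized rather than kept verbatim. The field names and generalization rules below are hypothetical examples, not a prescribed schema.

```python
# Hypothetical field lists for illustration only -- an actual
# de-identification methodology must be defined per data set.
DIRECT_IDENTIFIERS = {"name", "email", "student_id", "ssn"}
QUASI_IDENTIFIERS = {
    "zip_code": lambda z: z[:3] + "**",  # coarsen geography
    "birth_date": lambda d: d[:4],       # keep year only
}

def deidentify(record):
    """Drop direct identifiers and generalize quasi-identifiers."""
    out = {}
    for field, value in record.items():
        if field in DIRECT_IDENTIFIERS:
            continue                                  # remove entirely
        if field in QUASI_IDENTIFIERS:
            value = QUASI_IDENTIFIERS[field](value)   # generalize
        out[field] = value
    return out

record = {"name": "Ada L.", "zip_code": "94105",
          "birth_date": "2012-03-04", "score": 87}
print(deidentify(record))
# → {'zip_code': '941**', 'birth_date': '2012', 'score': 87}
```

Even this is only a starting point: as the PTAC guidance notes, sufficiency must be judged against cumulative re-identification risk from all previous releases, not field-by-field.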

3.15.2: Deidentified Process

Do the policies clearly indicate whether or not the deidentification process is done with a reasonable level of justified confidence, or whether the vendor provides links to any information that describes their deidentification process?

  • Indicator
    • Discloses the process or method in which user information is anonymized or de-identified.
  • Citation
  • Background
    • While data shared in the aggregate can reduce the risk of re-identifying anonymous individuals, it does not completely eliminate the risk, and sharing of aggregate data should be carefully reviewed. The aggregation of student-level data into school-level (or higher) reports removes much of the risk of disclosure, since no direct identifiers (such as a name, Social Security Number, or student ID) are present in the aggregated tables. Some risk of disclosure does remain, however, in circumstances where one or more students possess a unique or uncommon characteristic (or a combination of characteristics) that would allow them to be identified in the data table (this commonly occurs with small ethnic subgroup populations), or where some easily observable characteristic corresponds to an unrelated category in the data table (e.g., if a school reports that 100% of males in grade 11 scored at "Below Proficient" on an assessment). In these cases, some level of disclosure avoidance is necessary to prevent disclosure in the aggregate data table. See PTAC, Frequently Asked Questions—Disclosure Avoidance (Oct 2012), p. 2.
    • FERPA allows properly de-identified data to be used for other purposes, though providers planning to use de-identified student data should be clear about their methodologies for de-identification. If de-identified data will be transferred to another party, it is a best practice to contractually prohibit the third-party from attempting to re-identify any student data. Providers should also acknowledge whether anonymized metadata—a type of deidentified or partially de-identified data—will be used, and for what purposes. See PTAC, Responsibilities of Third-Party Service Providers under FERPA, P. 3.
    • A company must take reasonable measures to ensure that the data is de-identified. This means that the company must achieve a reasonable level of justified confidence that the data cannot reasonably be used to infer information about, or otherwise be linked to, a particular consumer, computer, or other device. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 21.
    • Anonymous data is 'data that is in no way connected to another piece of information that could enable a user to be identified.' This expansive view is necessary to reflect several facts. First, skilled analysts can de-anonymize large data sets. This renders nearly all promises of anonymization unattainable. In essence, any data tied to an 'anonymous identifier' is not anonymous; rather, this is often pseudonymous data that may be tied back to the user's offline identity. Second, metadata may be as revealing, or more revealing, of a user's associations and interests than content data, thus this data is of vital interest. Third, entities that have access to many sources of data, such as data brokers and governments, may be able to pair two or more data sources to reveal information about users. Thus, sophisticated actors can use data that seems anonymous to construct a larger picture of a user. See Ranking Digital Rights, P3.
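The small-cell disclosure risk described in the PTAC aggregation guidance above can be sketched with a minimal suppression rule: any aggregate cell whose count falls below a threshold k is masked so that rare subgroups cannot be singled out. The threshold of 10 and the group labels are illustrative assumptions, not values prescribed by the rubric or by PTAC.

```python
def suppress_small_cells(counts, k=10):
    """Return a copy of an aggregate table with cells under k masked.

    Masking small cells is one basic disclosure-avoidance step; it does
    not address cases such as a 100% cell revealing every group member,
    which need additional review.
    """
    return {group: (n if n >= k else "*") for group, n in counts.items()}

table = {"grade11_male_below_proficient": 3,
         "grade11_male_proficient": 45,
         "grade11_female_proficient": 52}
print(suppress_small_cells(table))
# → {'grade11_male_below_proficient': '*',
#    'grade11_male_proficient': 45,
#    'grade11_female_proficient': 52}
```

In practice, suppression thresholds and complementary suppression (masking a second cell so the first cannot be derived by subtraction) are policy decisions made per release.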

3.2.3: Third-Party Research

Do the policies clearly indicate whether or not collected information is shared with third parties for research or product improvement purposes?

3.16.2: Combination Limits

Do the policies clearly indicate whether or not the vendor imposes contractual limits that prohibit third parties from reidentifying or combining data with other data sources that the vendor shares or sells to them?

  • Indicator
    • Discloses contractual obligations are placed on third parties from re-identification of anonymized or de-identified data.
  • Citation
    • Children's Online Privacy Protection Act: (An operator must take reasonable steps to release a child's personal information only to service providers and third parties who are capable of maintaining the confidentiality, security, and integrity of the information, and provide assurances that they contractually maintain the information in the same manner) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.8
  • Background
    • When data are collected in one context and combined with data from other sources or different contexts, it increases the potential for an individual's privacy to be compromised. Combining data from multiple sources is part of the process of creating a digital profile of a student. Combining data from multiple sources can also be used to re-identify data sets that have been de-identified, or to identify individuals within data sets that have been shared as anonymous aggregated data. A privacy policy that prohibits third-parties from re-identifying anonymous aggregated data provides an additional level of privacy protection for users. See PTC, Data De-identification: An Overview of Basic Terms.
    • The FTC recommends that third-party data brokers take reasonable precautions to ensure that downstream users of their data do not use it for eligibility determinations or for unlawful discriminatory purposes. Of course, the use of race, color, religion, and certain other categories to make credit, insurance, and employment decisions is already against the law, but data brokers should help ensure that the information does not unintentionally go to unscrupulous entities that would be likely to use it for unlawful discriminatory purposes. Similarly, data brokers should conduct due diligence to ensure that data that they intend for marketing or risk mitigation purposes is not used to deny consumers credit, insurance, employment, or the like. See FTC, Data Brokers: A Call For Transparency and Accountability (May 2014), pp. 55-56.
    • A company that transfers data from one company to another should not place emphasis on the disclosures themselves, but on whether a disclosure leads to a use of personal data that is inconsistent within the context of its collection or a consumer's expressed desire to control the data. Thus, if a company transfers personal data to a third party, it remains accountable and thus should hold the recipient accountable—through contracts or other legally enforceable instruments. See Exec. Office of the President, Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy (2012), p. 22.
    • The FTC's framework application applies to data that, while not yet linked to a particular consumer, computer, or device, may reasonably become so. There is significant evidence demonstrating that technological advances and the ability to combine disparate pieces of data can lead to identification of a consumer, computer, or device even if the individual pieces of data do not constitute PII. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 20.

Data Safety: Promoting responsible use

Evaluating safety takes into consideration best practices that protect a user's physical and emotional health.

9.1.1: Safe Interactions (BASIC)

Do the policies clearly indicate whether or not a user can interact with trusted users?

  • Indicator
    • Discloses users can have social interactions with trusted or other known users.
    • Discloses users can have social interactions with students in the same classroom or school.
  • Citation

9.1.2: Unsafe Interactions

Do the policies clearly indicate whether or not a user can interact with untrusted users?

  • Indicator
    • Discloses users can have social interactions with unknown users in the product.
    • Discloses users can have social interactions with unknown individuals outside the product across the Internet.
  • Citation

9.1.3: Share Profile

Do the policies clearly indicate whether or not information must be shared or revealed by a user in order to participate in social interactions?

  • Indicator
    • Discloses what type of user profile information can be shared for social interactions.
    • Discloses user profile information must be shared to use the product.
  • Citation

9.2.1: Visible Data (BASIC)

Do the policies clearly indicate whether or not a user's personal information can be displayed publicly in any way?

  • Indicator
    • Discloses users' personal information can be made publicly visible.
  • Citation

9.2.2: Control Visibility

Do the policies clearly indicate whether or not a user has control over how their personal information is displayed to others?

  • Indicator
    • Discloses users can control how their personal information is displayed to others.

9.3.1: Monitor Content

Do the policies clearly indicate whether or not the vendor reviews, screens, or monitors user-created content?

  • Indicator
    • Discloses processes to review, screen, or monitor user-created content.

9.3.2: Filter Content (BASIC)

Do the policies clearly indicate whether or not the vendor takes reasonable measures to delete all personal information from a user's postings before they are made publicly visible?

  • Indicator
    • Discloses processes to filter and delete users' personal information before it is made publicly visible.
  • Citation
    • Children's Online Privacy Protection Act: (An operator may prevent collection of personal information if it takes reasonable measures to delete all or virtually all personal information from a child's postings before they are made public and also to delete the information from its records) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
  • Background
    • Companies may employ staff to review content and/or user activity or they may rely on community flagging mechanisms that allow users to flag other users’ content and/or activity for company review. See Ranking Digital Rights, F3.

9.3.3: Moderating Interactions (BASIC)

Do the policies clearly indicate whether or not social interactions between users of the product are moderated?

9.3.4: Log Interactions

Do the policies clearly indicate whether or not social interactions are logged by the vendor and are available for review or audit?

  • Indicator
    • Discloses social interactions between users are logged by the company.

9.4.2: Report Abuse

Do the policies clearly indicate whether or not a user can report abusive behavior, or cyberbullying?

  • Indicator
    • Discloses processes for users to report abusive or cyber-bullying conduct.

Ads & Tracking: Prohibiting the exploitation of users' decision making process

Evaluating ads and tracking takes into consideration best practices of not using a user’s personal information for any third-party marketing, behavioral advertising, tracking, or profile generation purposes.

3.2.4: Third-Party Marketing (BASIC)

Do the policies clearly indicate whether or not personal information is shared with third parties for advertising or marketing purposes?

10.2.1: Traditional Ads (BASIC)

Do the policies clearly indicate whether or not traditional advertisements are displayed to a user based on a webpage's content, and not that user's data?

  • Indicator
    • Discloses traditional advertisements are displayed to users on the product.
    • Discloses advertisements are displayed to users without using any collected personal information.
  • Citation

10.3.1: Behavioral Ads (BASIC)

Do the policies clearly indicate whether or not behavioral advertising based on a user's personal information are displayed?

  • Indicator
    • Discloses behavioral advertisements are displayed to users on the product.
    • Discloses advertisements are displayed to users based on their personal or non-personal information.
  • Citation
  • Background
    • Online behavioral or targeted advertising is the practice of collecting information about consumers' online interests in order to deliver targeted advertising to them. This system of advertising revolves around ad networks that can track individual consumers—or at least their devices—across different websites. When organized according to unique identifiers, this data can provide a potentially wide-ranging view of individual use of the Internet. These individual behavioral profiles allow advertisers to target ads based on inferences about individual interests, as revealed by Internet use. Targeted ads are generally more valuable and efficient than purely contextual ads and provide revenue that supports an array of free online content and services. See Exec. Office of the President, Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy (2012), pp. 11-12.
    • The FTC recommends that affirmative express consent is appropriate when a company uses sensitive data for any marketing, whether first or third-party. When health or children's information is involved, for example, the likelihood that data misuse could lead to embarrassment, discrimination, or other harms is increased. This risk exists regardless of whether the entity collecting and using the data is a first-party or a third-party that is unknown to the consumer. In light of the heightened privacy risks associated with sensitive data, first parties should provide a consumer choice mechanism at the time of data collection. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 47.
    • The FTC believes affirmative express consent for first-party marketing using sensitive data should be limited. Certainly, where a company's business model is designed to target consumers based on sensitive data – including data about children, financial and health information, Social Security numbers, and certain geolocation data – the company should seek affirmative express consent before collecting the data from those consumers. On the other hand, the risks to consumers may not justify the potential burdens on general audience businesses that incidentally collect and use sensitive information. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), pp. 47-48.
    • If a vendor displays targeted advertising they should not use any information, including covered information and persistent unique identifiers, acquired through the site or service as a basis for targeting advertising to a specific student or other user. This includes both advertising delivered on the site or service that acquired the information and advertising delivered on any other site or service based on that information. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 12.

10.4.1: Third-Party Tracking (BASIC)

Do the policies clearly indicate whether or not third-party advertising services or tracking technologies collect any information from a user of the product?

10.4.2: Track Users (BASIC)

Do the policies clearly indicate whether or not a user's information is used to track users and display target advertisements on other third-party websites or services?

10.4.3: Data Profile (BASIC)

Do the policies clearly indicate whether or not the vendor allows third parties to use a student's data to create an automated profile, engage in data enhancement, conduct social advertising, or target advertising to students, parents, teachers, or the school?

10.6.1: Marketing Messages

Do the policies clearly indicate whether or not the vendor may send marketing emails, text messages, or other related communications that may be of interest to a user?

10.6.2: Third-Party Promotions

Do the policies clearly indicate whether or not the vendor may ask a user to participate in any sweepstakes, contests, surveys, or other similar promotions?

10.7.1: Unsubscribe Ads

Do the policies clearly indicate whether or not a user can opt out of traditional, contextual, or behavioral advertising?

  • Indicator
    • Discloses users can opt out of having their information used for advertising purposes.
    • Discloses users can contact third-party advertisers to control whether their information is used for advertising purposes.
  • Citation
  • Background
    • We expect companies to enable users to control the use of their information for the purpose of targeted advertising. Targeted advertising requires extensive collection and retention of user information that is tantamount to tracking. Companies should therefore clearly disclose whether users have options to control how their information is being used for these purposes. See Ranking Digital Rights, P7.

10.7.2: Unsubscribe Marketing

Do the policies clearly indicate whether or not a user can opt out of or unsubscribe from vendor or third-party marketing communications?

Parental Consent: Protecting children’s personal information

Evaluating parental consent takes into consideration best practices of protecting children under 13 years of age by requiring a parent’s or guardian's verifiable consent before the collection, use, or disclosure of a child's personal information to an application or service.

1.8.1: Children Intended (BASIC)

Do the policies clearly indicate whether or not the product is intended to be used by children under the age of 13?

  • Indicator
    • Discloses the product is intended to be used by children under the age of 13.
  • Citation

2.2.2: Child Data

Do the policies clearly indicate whether or not the vendor collects personal information online from children under 13 years of age?

  • Indicator
    • Discloses personal information from children under 13 years of age is collected.
  • Citation
  • Background
    • The Children's Online Privacy Protection Act (COPPA) requires a privacy policy to list the kinds of personal information collected from children (for example, name, address, email address, hobbies, etc.), how the information is collected, and how the company uses the personal information. It also requires companies to indicate whether they disclose information collected from children to third-parties. If so, the company must also disclose the kinds of businesses in which the third-parties are engaged, the general purposes for which the information is used, and whether the third-parties have agreed to maintain the confidentiality and security of the information. See 15 U.S.C. § 6502; 16 C.F.R. Part 312.
    • If a company knows that a user of the online website or service is under the age of 13, the Children's Online Privacy Protection Act (COPPA) will impose more stringent requirements on the collection of information from those users. COPPA requires that operators seeking to collect, use, or disclose personal information from children under the age of 13, must first obtain verifiable parental consent. Even where a user is 13 or older, COPPA remains a source of best practices for companies that collect personal information from users, particularly when those users are still minors. See 15 U.S.C. §§ 6501-6506;16 C.F.R. Part 312.
    • COPPA permits the collection of limited personal information from children under 13 for the purposes of: (1) Obtaining verified parental consent; (2) providing parents with a right to opt-out of an operator's use of a child's email address for multiple contacts of the child; and (3) to protect a child's safety on a website or online service. See 15 U.S.C. 6502(b)(2); 16 C.F.R. 312.5(c)(1)–(5).

1.8.4: Parents Intended

Do the policies clearly indicate whether or not the product is intended to be used by parents or guardians?

11.1.1: Actual Knowledge

Do the policies clearly indicate whether or not the vendor has actual knowledge that personal information from children under 13 years of age is collected by the product?

  • Indicator
    • Discloses the company has actual knowledge that users of the product are under the age of 13.
    • Discloses a user's age or birthday is collected upon account registration.
    • Discloses the product utilizes an age-gate or other mechanism to verify the age of a user.
    • Discloses the product is directed or would appeal to children under 13 years of age.
    • Discloses the product provides features intended for children under 13 years of age.
  • Citation
    • Children's Online Privacy Protection Act: (A general audience site is where the operator has no actual knowledge that a child under the age of 13 has registered an account or is using the service, and no age gate or parental consent is required before collection of information) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
    • Children's Online Privacy Protection Act: (A mixed audience site is where the site is directed to children, but does not target children as its "primary audience," but rather teens 13-to-18 years of age or adults. An operator of a mixed audience site is required to obtain age information from a user before collecting any information and if a user identifies themselves as a child under the age of 13, the operator must obtain parental consent before any information is collected) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
    • Children's Online Privacy Protection Act: (A site directed to children is where the operator has actual knowledge the site is collecting information from children under the age of 13 and parental consent is required before any collection or use of information) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
    • Children's Online Privacy Protection Act: (A vendor who may obtain actual knowledge that it is collecting information from a child must not encourage a child to disclose more information than is reasonably necessary through an age verification mechanism. An age gate should: be age-neutral; not encourage falsification; list day, month, and year; give no prior warning that children under 13 will be blocked; and prevent multiple attempts) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.3(d)
  • Background
    • The Children's Online Privacy Protection Act (COPPA) requires an operator to post a link to a notice of its information practices on the homepage of its web site or online service and in each area of its web site where it collects "Personal Information" from children. An operator of a general audience web site with a separate children's area must also post a link to its privacy policy on the homepage of the children's area. See 15 U.S.C. §§ 6501-6506; 16 C.F.R. Part 312
    • COPPA applies anytime an operator of a website or online service has actual knowledge that it collects, maintains, uses, or discloses personal information from a child under 13. In these situations, an operator is generally required to obtain verifiable parental consent.
    • COPPA requires companies to establish and maintain reasonable procedures to protect the confidentiality, security, and integrity of personal information collected from children. Companies should minimize what they collect in the first place and take reasonable steps to release personal information only to service providers and third parties capable of maintaining its confidentiality, security, and integrity. Always obtain assurances that third parties will live up to their contractual privacy responsibilities. Also, companies should hold on to personal information only as long as is reasonably necessary for the purpose for which it was collected. They should securely dispose of it once they no longer have a legitimate reason for retaining it. See FTC, Six-Step Compliance Plan for Your Business.

11.1.2: COPPA Notice

Do the policies clearly indicate whether or not the vendor describes: (1) what information is collected from children under 13 years of age, (2) how that information is used, and (3) its disclosure practices for that information?

  • Indicator
    • Discloses COPPA or children's privacy is applicable to the product.
    • Discloses how the company collects, uses, and discloses information from children under 13 years of age.
  • Citation

11.3.1: Parental Consent (BASIC)

Do the policies clearly indicate whether or not the vendor or third party obtains verifiable parental consent before they collect or disclose personal information?

11.3.2: Limit Consent

Do the policies clearly indicate whether or not a parent can consent to the collection and use of their child's personal information without also consenting to the disclosure of the information to third parties?

  • Indicator
    • Discloses parental consent can be limited with respect to use with third parties.
    • Discloses parental consent can be given for the collection and use of information with the company separate from use with third parties.
  • Citation
    • Children's Online Privacy Protection Act: (An operator cannot condition a child's participation in the service on the sharing of any collected information with third parties. A parent is required to have the ability to consent to the collection and use of their child's personal information without also consenting to the disclosure of the information to third parties) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.5(a)(2)

11.3.3: Withdraw Consent

Do the policies clearly indicate whether or not the vendor responds to a request from a parent or guardian to prevent further collection of their child's information?

11.3.4: Delete Child-PII

Do the policies clearly indicate whether or not the vendor deletes personal information from a student or child under 13 years of age if collected without parental consent?

  • Indicator
    • Discloses the company will delete personal information from a student or child under 13 years of age if collected without parental consent.
  • Citation

11.3.5: Consent Method (BASIC)

Do the policies clearly indicate whether or not the vendor provides notice to parents or guardians of the methods to provide verifiable parental consent under COPPA?

  • Indicator
    • Discloses the parental consent method(s) that are available for submission of consent by a parent or guardian.
  • Citation
    • Children's Online Privacy Protection Act: (An operator is required to provide direct notice to parents describing what information is collected, how information is used, its disclosure practices and exceptions) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.4(b)
    • Children's Online Privacy Protection Act: (Existing methods to obtain verifiable parental consent include: (i) Providing a consent form to be signed by the parent and returned to the operator by postal mail, facsimile, or electronic scan; (ii) Requiring a parent, in connection with a monetary transaction, to use a credit card, debit card, or other online payment system that provides notification of each discrete transaction to the primary account holder; (iii) Having a parent call a toll-free telephone number staffed by trained personnel; (iv) Having a parent connect to trained personnel via video-conference; (v) Verifying a parent's identity by checking a form of government-issued identification against databases of such information, where the parent's identification is deleted by the operator from its records promptly after such verification is complete) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.5(b)(i)-(v)
    • Children's Online Privacy Protection Act: (If an operator does not “disclose” children's personal information, they may use an email coupled with additional steps to provide assurances that the person providing the consent is the parent. Such additional steps include: Sending a confirmatory email to the parent following receipt of consent, or obtaining a postal address or telephone number from the parent and confirming the parent's consent by letter or telephone call. An operator that uses this method must provide notice that the parent can revoke any consent given in response to the earlier email.) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.5(b)(vi)
  • Background
    • Under most circumstances an operator is required to obtain verifiable parental consent before the collection, use, or disclosure of personal information from children under the age of 13. The method used to obtain parental consent must be reasonably calculated (taking into account available technology) to ensure that the person providing consent is actually the child's parent.

School Purpose: Following student data privacy laws

Evaluating school purpose takes into consideration best practices of companies that collect personal information from students or teachers in K-12 and the legal obligations for the privacy and security of that information.

1.8.5: Students Intended (BASIC)

Do the policies clearly indicate whether or not the product is intended to be used by students in preschool or K-12?

2.2.1: Student Data

Do the policies clearly indicate whether or not the vendor collects personal information or education records from preK-12 students?

  • Indicator
    • Discloses education records from preK-12 students are collected.
  • Citation
  • Background
    • The Family Educational Rights and Privacy Act of 1974 (FERPA) provides parents of students the right to access their children's Student Data or education records, and students 18 years of age and older the right to access their own education records. In addition, FERPA provides the right to have the records amended, and the right to have some control over the disclosure of personally identifiable information (PII) in the education records. Furthermore, strict storage guidelines surround Student Data, requiring organizations to maintain accurate and up-to-date records. See 20 U.S.C. § 1232g; 34 C.F.R. Part 99.1.
    • What are Education Records? FERPA defines educational records as records that are: (1) directly related to a student; and (2) maintained by an educational agency or institution or by a party acting for the agency or institution. These records include, but are not limited to, transcripts, class lists, student course schedules, health records, student financial information, and student disciplinary records. It is important to note that any of these records maintained by a third-party acting on behalf of a school or district are also considered education records. 20 U.S.C. § 1232g (a)(4)(A); 34 CFR § 99.3; See PTAC, Responsibilities of Third-Party Service Providers under FERPA, p. 1; See also PTAC, Protecting Student Privacy While Using Online Educational Services: Requirements and Best Practices, p. 2.

1.8.6: Teachers Intended

Do the policies clearly indicate whether or not the product is intended to be used by teachers?

  • Indicator
    • Discloses the product is intended to be used by teachers.
  • Citation

11.2.1: School Purpose (BASIC)

Do the policies clearly indicate whether or not the product is primarily used, designed, and marketed for preschool or K-12 school purposes?

11.2.2: Education Records

Do the policies clearly indicate the process by which education records are entered into the product? For example, are data entered by district staff, school employees, parents, teachers, students, or some other person?

11.2.3: School Contract

Do the policies clearly indicate whether or not the vendor provides a contract to a Local Educational Agency (LEA) or otherwise provides notice to users of additional rights?

  • Indicator
    • Discloses a separate agreement or contract is provided to schools or districts of their rights.
    • Discloses notification is provided to schools or districts of their rights.
  • Citation
    • Family Educational Rights and Privacy Act: (An educational institution must annually notify parents of their rights to inspect and review a student's education records, make corrections, delete, or consent to the disclosure of information) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.7(a)
    • Family Educational Rights and Privacy Act: (Any rights to access, modify, or delete student records may transfer to an "eligible" student who is over 18 years of age) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.5(a)(1)
    • General Data Protection Regulation: (The controller shall, at the time when personal data are obtained, provide the data subject with the following further information necessary to ensure fair and transparent processing: ... (e) whether the provision of personal data is a statutory or contractual requirement, or a requirement necessary to enter into a contract, as well as whether the data subject is obliged to provide the personal data and of the possible consequences of failure to provide such data) See General Data Protection Regulation (GDPR), Information to be provided where personal data are collected from the data subject, Art. 13(2)(e)
    • California AB 1584 - Privacy of Pupil Records: (Authorizes a Local Educational Agency (LEA) to enter into a third party contract for the collection and use of pupil records that must include a statement that the pupil records continue to be the property of and under the control of the local educational agency, a description of the actions the third party will take to ensure the security and confidentiality of pupil records, and a description of how the local educational agency and the third party will jointly ensure compliance with FERPA) See California AB 1584 - Privacy of Pupil Records, Cal. Ed. Code §§ 49073.1
  • Background
    • FERPA is a Federal law that protects personally identifiable information in students' education records from unauthorized disclosure. It affords parents the right to access their child's education records, the right to seek to have the records amended, and the right to have some control over the disclosure of personally identifiable information from the education records. When a student turns 18 or enters a postsecondary institution at any age, the rights under FERPA transfer from the parents to the student ("eligible student"). 20 U.S.C. § 1232g; 34 C.F.R. Part 99; See also PTAC, Responsibilities of Third-Party Service Providers under FERPA, pp. 1-3.
    • FERPA denies federal funding to educational agencies or institutions that have a practice or policy of permitting the release of student information without parental consent. There is an exception where such information is released to "school officials" who have been determined by the educational agency or institution to have a legitimate educational interest.
    • A vendor should describe the procedures for a parent, legal guardian, or eligible student to review and correct covered information. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 14.

11.2.4: School Official

Do the policies clearly indicate whether or not the vendor is under the direct control of the educational institution and designates itself a 'School Official' under FERPA?

11.3.10: School Consent

Do the policies clearly indicate whether or not responsibility or liability for obtaining verifiable parental consent is transferred to the school or district?

  • Indicator
    • Discloses the obligation to obtain verifiable parental consent from a parent or guardian is transferred to the school or district.
    • Discloses the school or district is required to provide verifiable parental consent records to the company upon request.
  • Background
    • Where a school has contracted with an operator to collect personal information from students for the use and benefit of the school, and for no other commercial purpose, the operator is not required to obtain consent directly from parents, and can presume that the school's authorization for the collection of students' personal information is based upon the school having obtained the parents' consent . . . As a best practice, the school should consider providing parents with a notice of the websites and online services whose collection it has consented to on behalf of the parent. Schools can identify, for example, sites and services that have been approved for use district-wide or for the particular school. See FTC, Complying with COPPA: Frequently Asked Questions, M. COPPA AND SCHOOLS, 2-4.

11.3.8: FERPA Exception

Do the policies clearly indicate whether or not the vendor may disclose personal information without verifiable parental consent under a FERPA exception?

11.3.9: Directory Information

Do the policies clearly indicate whether or not the vendor discloses student information as 'Directory Information' under a FERPA exception?

  • Indicator
    • Discloses student information can be shared without parental consent as Directory Information.
    • Discloses what type of student information can be shared as Directory Information under a FERPA exception.
  • Citation
  • Background
    • What is the "Directory Information" Exception? An exception to parental consent that permits the disclosure of PII from education records under FERPA. Information designated by the school or district as directory information may be disclosed without consent and used without restriction in conformity with the policy, unless the parent, guardian, or eligible student opts out. Examples of directory information about students include name, address, telephone number, email address, date and place of birth, grade level, sports participation, and honors or awards received. Before a school or district can disclose directory information, it must first provide public notice to parents and eligible students of the types of information designated as directory information, the intended uses for the information, and the right of parents or eligible students to "opt out" of having their information shared. See PTAC, Responsibilities of Third-Party Service Providers under FERPA, p. 3; See also PTAC, Protecting Student Privacy While Using Online Educational Services: Requirements and Best Practices, pp. 3-4.