AI, Data, and Privacy

FERPA and the Role of Teachers as “In Loco Parentis” 

Under the Family Educational Rights and Privacy Act (FERPA), educators are responsible for protecting students' personally identifiable information (PII). FERPA requires schools that receive federal funding to obtain parental consent before sharing student data with third parties unless the disclosure falls under a recognized exception, such as a legitimate educational purpose. Specifically, the regulation states: "An educational agency or institution may disclose personally identifiable information from an education record of a student without the consent required... if the disclosure is to other school officials, including teachers, within the agency or institution whom the agency or institution has determined to have legitimate educational interests" (34 CFR § 99.31) (Mondaq) (Security Boulevard).

Teachers act in loco parentis ("in place of the parent"), which makes them legally responsible for protecting students' rights and privacy in school settings. This role requires teachers to balance technology use with student safety, ensuring compliance with both legal obligations and institutional policies on privacy and consent.

This responsibility extends to the use of digital tools. Teachers must ensure that any platform they use, whether an AI-driven tool like ChatGPT or a more conventional platform like Adobe or Canva, complies with privacy standards and does not compromise students' data security.

ChatGPT vs. Kami, Adobe, Canva

Many educational platforms collect data from users to improve services, often sharing or selling it under specific agreements. Kami, Adobe, and Canva, for example, require users to create accounts, which allows these companies to collect metadata, usage logs, and even personal data. Some of this data may be shared with third-party partners for marketing or analytics purposes (Education Framework) (CITE).
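
As a concrete, purely hypothetical illustration, the sketch below (in Python) shows the kind of usage-log record such a platform might store for a signed-in student. Every field name here is invented, and actual schemas vary by vendor.

```python
# Purely hypothetical sketch of a usage-log record an account-based
# platform might store for a signed-in student. Field names are
# invented for illustration; real schemas vary by vendor.
usage_event = {
    "user_id": "u-102938",                       # persistent account identifier
    "email": "jdoe@school.example.org",          # PII collected at signup
    "timestamp": "2024-03-15T10:22:04Z",         # when the action occurred
    "ip_address": "203.0.113.42",                # network metadata
    "device": "Chromebook / ChromeOS",           # device fingerprint data
    "action": "document_opened",
    "document_title": "Essay Draft - Period 3",  # titles can themselves contain PII
    "shared_with_analytics_partner": True,       # governed by the vendor's agreements
}

print(usage_event["email"], usage_event["action"])
```

Even a record this small combines direct PII (the email address) with metadata that, accumulated over time, describes a student's habits and schoolwork.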

ChatGPT and other AI tools operate similarly, storing conversations and user inputs to improve the AI model over time. Students using these tools might inadvertently share personal information or academic work that becomes part of the AI's broader dataset, which is why FERPA compliance and careful oversight are essential when integrating AI tools. Educators should compare AI tools with traditional platforms, asking questions like:

- What data does the platform collect, and where and for how long is it stored?
- Is student input used to train models, or shared with or sold to third parties?
- Can students use the tool without creating an account or entering PII?
- Does the company publish an explicit, written privacy policy?

If a company does not have an explicit, written policy about how it handles, secures, or shares data, always assume the data will be sold, tracked, and shared for the company's benefit, and proceed with caution.

How Data is Tagged and Sold: User Profiling and Digital Identity

Educational tools often use persistent identifiers (PIDs), such as email addresses or browser cookies, that tag individual users and track their behavior across platforms. These identifiers help build digital profiles of students, which are sometimes used for targeted advertising. While platforms claim to anonymize data, patterns in behavior and metadata can still be traced back to individual users, especially when combined with other data sources (Future of Privacy Forum) (Security Boulevard).
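
To see why claimed anonymization offers weaker protection than it sounds, here is a minimal sketch in Python of a linkage attack, using entirely invented records and field names: joining an "anonymized" usage export against a separate, identified dataset on shared quasi-identifiers (ZIP code, birth year, gender) can uniquely re-identify a user.

```python
# Hypothetical linkage attack: "anonymized" records are re-identified by
# joining quasi-identifiers (ZIP code, birth year, gender) against a
# separate, identified dataset. All records here are invented.

# An "anonymized" analytics export: names removed, quasi-identifiers kept.
anonymized_usage = [
    {"zip": "14850", "birth_year": 2008, "gender": "F",
     "searches": ["essay help", "counseling services"]},
    {"zip": "14850", "birth_year": 2009, "gender": "M",
     "searches": ["algebra practice"]},
]

# A separate, identified source (e.g., a directory or purchased list).
directory = [
    {"name": "Student A", "zip": "14850", "birth_year": 2008, "gender": "F"},
    {"name": "Student B", "zip": "14850", "birth_year": 2009, "gender": "M"},
]

for record in anonymized_usage:
    key = (record["zip"], record["birth_year"], record["gender"])
    matches = [p for p in directory
               if (p["zip"], p["birth_year"], p["gender"]) == key]
    if len(matches) == 1:  # a unique match re-identifies the "anonymous" user
        print(f"{matches[0]['name']} searched for: {record['searches']}")
```

Real linkage attacks apply the same principle at much larger scale, which is why even "de-identified" student data should still be treated as sensitive.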

Companies like Adobe and Canva may share user data with third-party advertisers. Although educational platforms have policies to prevent direct sales of student data, aggregated and de-identified data is still monetized to inform product development or advertising strategies. For educators, this raises concerns about transparency and control over students' personal information.

Guidelines and Tips to Ensure Student Privacy