Your "For You" feed feels personal, but it's shaped by powerful algorithms and, increasingly, by government rules. In this lesson, we'll explore how different governments are trying to regulate these platforms and what it means for our culture. We'll look at two major examples from 2024 and 2025: one from the United States and one from the European Union.
Who controls your feed?
Our online experience is a mix of personal choice, algorithmic suggestions, and government regulations.
Warm-up (5 mins)
Think about your favorite social media app. In pairs or small groups, discuss this question:
“What is one way algorithms have changed culture in my city or country?”
Think about music, fashion, news, local businesses, or even language. Be ready to share one interesting idea with the class.
Input: Two Approaches to Regulation (10 mins)
Governments around the world are creating new rules for social media platforms. These rules, often called regulations, can have a huge impact on what we see and create online. Let's look at two different approaches.
Briefing 1: The United States’ “Divest-or-Ban” Law
In April 2024, the United States passed a law targeting the popular video app TikTok. The law stated that its parent company, ByteDance, had to sell the app's U.S. operations by January 19, 2025, or the app would face a ban in U.S. app stores. U.S. lawmakers argued that the app posed a national security risk. On the other hand, the company claimed that a ban would violate free speech. After the Supreme Court upheld the law in January 2025, the deadline passed, leading to a series of enforcement delays, legal challenges, and negotiations that continued to unfold throughout the year.
Briefing 2: The European Union’s Digital Services Act (DSA)
The European Union has taken a different path. Instead of targeting one company, its Digital Services Act (DSA) creates a wide set of rules for all major online platforms. The DSA's main goal is to create a safer digital space by protecting users' fundamental rights.
For example, in April 2024, the EU launched an investigation into a new app feature called TikTok Lite, which rewarded users for watching videos. Regulators were concerned that the feature was potentially addictive, especially for younger users, and that it had been launched without a proper risk assessment. The EU alleges that the company failed to provide enough safeguards. This approach appears to be more focused on process and user protection across the entire industry, rather than on the nationality of the company's owners. It is likely that we'll see more actions like this under the DSA.
Language Focus: Reporting, Hedging, and Contrasting
When discussing current events, it's important to be precise about where information comes from and to show different perspectives. We use specific language to do this.
Reporting Verbs
These verbs help us report what someone said or believes without stating it as a fact. The form is typically: Subject + Reporting Verb + (that) + clause.
| Verb | Meaning | Example |
|---|---|---|
| state | to say something formally | The law states that the company must sell its U.S. operations. |
| claim | to say something is true, often without proof | The company claims a ban would violate free speech. |
| argue | to give reasons for an opinion | Lawmakers argue that the app is a security risk. |
| allege | to accuse someone of wrongdoing, without proof | The EU alleges that the company failed to provide safeguards. |
Hedging Language
This language helps us sound less certain and more objective, which is useful when discussing possible outcomes or complex situations.
| Phrase | Function | Example |
|---|---|---|
| appears / seems to be | signals an observation or impression | The EU's approach appears to be more focused on process. |
| potentially / possibly | suggests something is a possibility, not a certainty | The feature was potentially addictive. |
| It is likely that... | shows a strong probability | It is likely that we will see more actions like this. |
Words of Contrast
These words help us introduce an opposing idea or compare two different things.
| Word/Phrase | Function | Example |
|---|---|---|
| On the other hand | introduces a contrasting point of view | The government is concerned about security. On the other hand, creators are concerned about their income. |
| However | introduces a contrasting idea | Many people enjoy the app. However, it faces legal challenges. |
| While / Whereas | used to contrast two ideas in one sentence | While the U.S. law targets one company, the EU's rules apply to many platforms. |
Activity 1: Algorithm Audit (20 mins)
Instructions
Now, let's think about the cultural impact of these platforms. In small groups, choose one social media platform (such as Instagram, YouTube, or TikTok).
Your task is to identify potential risks and their cultural impacts. Use the table below as a guide; you can copy it into a shared document such as a Google Doc. Try to identify at least two risks.
| Platform Feature | Potential Risk to Users | Potential Cultural Impact |
|---|---|---|
| Example: "For You" Feed | Echo Chamber (only seeing content you agree with) | Less exposure to diverse music; increased political division. |
| 1. | | |
| 2. | | |
After 15 minutes, each group will share one key finding with the class.
Activity 2: Policy Micro-Design (20 mins)
Instructions
Imagine you have the power to create one small rule (a safeguard) for social media use in your city or at your school/university. The goal is not to ban anything, but to make the digital space better for everyone.
In your groups, draft a simple, 3-line policy. Your policy should focus on one of these areas:
- Transparency: Helping users understand why they see certain content.
- Labeling: Clearly marking sponsored content or AI-generated images.
- Youth Defaults: Making default settings safer for users under 18.
Examples:
Transparency Rule:
1. All promoted campus posts must include a "Why you're seeing this" button.
2. Clicking it will explain the targeting criteria (e.g., "age 18-25, interest in music").
3. This aims to increase media literacy among students.
Labeling Rule:
1. Any post by a campus club using AI-generated images must be labeled #AIgenerated.
2. This helps viewers distinguish between real and created content.
3. The goal is to promote authentic communication.
After 15 minutes, a representative from each group will present their 3-line policy to the class.
Exit Task (5 mins)
Think about all the ideas discussed today. Turn to a partner and answer this question:
“What is one safeguard or rule you would personally support, and why?”