Discussion about this post

Lyan T.

Companies would need to publish regular public reports about how their platforms affect kids: real data on screen time, the types of content being shown, and how many minors are using their services. If a platform is exposing kids to harmful material, it would have to admit as much. That kind of openness could finally bring some accountability to how these apps operate behind the scenes.

Hannah P.

Tech platforms might soon have to explain how their recommendation systems work (not Substack, duh 😉). If a platform promotes content using a secret algorithm, users (especially kids and their parents) would get the option to opt out and switch to a simple, chronological feed. Transparency is the goal of the forthcoming bill, SB1748, and it could ultimately change how social apps keep young people glued to their screens.
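
To make the opt-out idea concrete, here's a minimal sketch (purely illustrative, not taken from SB1748 or any real platform's code) of how a feed could switch between an opaque engagement ranking and a plain chronological order. The Post fields, the engagement_score signal, and the build_feed helper are all hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    created_at: datetime
    engagement_score: float  # hypothetical opaque ranking signal

def ranked_feed(posts: list[Post]) -> list[Post]:
    # Algorithmic feed: ordered by an engagement score users can't see into.
    return sorted(posts, key=lambda p: p.engagement_score, reverse=True)

def chronological_feed(posts: list[Post]) -> list[Post]:
    # Opt-out feed: newest first, no ranking signal involved.
    return sorted(posts, key=lambda p: p.created_at, reverse=True)

def build_feed(posts: list[Post], opted_out: bool) -> list[Post]:
    # A user (or parent) flipping the opt-out toggle changes which path runs.
    return chronological_feed(posts) if opted_out else ranked_feed(posts)
```

The point of the sketch is just that the two orderings are different code paths, so an opt-out is a toggle the platform has to honor rather than a cosmetic setting.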

