Lawmakers are coming after addictive tech product design, opaque algorithms, and weak safeguards for kids online, and Bill SB1748, now gaining momentum, is their vehicle.
Companies would need to publish regular public reports about how their platforms affect kids. That means real data about screen time, types of content being shown, and how many minors are using their services. If a platform is exposing kids to harmful material, they would need to admit it. This kind of openness could finally bring some accountability to how apps operate behind the scenes.
Tech platforms might soon have to explain how their recommendation systems work (not Substack, duh 😉). If a platform promotes content using a secret algorithm, users (especially kids and their parents) would get the option to opt out and switch to a simple, chronological feed. Transparency is the goal of SB1748, and it could change how social apps keep young people glued to their screens.