staircase@programming.dev to Technology@lemmy.world · English · 1 month ago
Meta found liable in child exploitation case (www.theguardian.com)
1.25K upvotes · 159 comments
Boiglenoight@lemmy.world · 57 points · 1 month ago
Facebook made $200 billion in revenue in 2025 (https://stockanalysis.com/stocks/meta/revenue/). They were fined $375 million. They averaged about $550 million in revenue per day last year.
BarneyPiccolo@lemmy.today · 8 points · 1 month ago
So the fine was basically paid by lunchtime on January 1.
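Taking the thread's figures at face value (a $375 million fine against roughly $550 million of revenue per day), a quick back-of-the-envelope check of the "lunchtime" quip:

```python
# Back-of-the-envelope check of the figures quoted in the comment above.
# Both numbers are as reported in the thread, not independently verified.
daily_revenue = 550_000_000  # approximate average daily revenue, USD
fine = 375_000_000           # reported fine, USD

fraction_of_day = fine / daily_revenue   # share of one day's revenue
hours = fraction_of_day * 24             # hours into January 1

print(f"Fine equals {fraction_of_day:.2f} of one day's revenue (~{hours:.1f} hours)")
```

By these numbers the fine is covered about 16 hours into the day, so mid-afternoon is closer than lunchtime, but the order of magnitude is clearly the point being made.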
Boiglenoight@lemmy.world · 5 points · 1 month ago
Yes. It might matter if they were fined continuously, but unless there's a mechanism that ensures that, this is likely just an annoyance, not a deterrent.
BarneyPiccolo@lemmy.today · 2 points · 1 month ago
Just an itch to be scratched. Done. What’s for lunch?
MyMindIsLikeAnOcean@piefed.world · 1 point · 30 days ago
If social media companies were required to moderate their content, and were held responsible for what’s posted, these problems would go away. As it stands, bad actors use bots to stay one step ahead of automated moderation.