The Supreme Court ruled today in two cases that could have a major impact on how social media platforms operate and on how far the government can go in intervening on behalf of political speech on those platforms.
Replacing independent fact-checkers with social media users to moderate online posts raises many questions about content ...
The Supreme Court punted on cases challenging Texas and Florida laws that regulate social media platforms’ content moderation practices. The cases raised questions of whether Facebook, X/Twitter, ...
Two state laws that could upend the way social media companies handle content moderation are still in limbo after a Supreme Court ruling sent the challenges back to lower courts, vacating previous rulings.
Social media apps operating in New York will be required to disclose their content moderation policies to safeguard users ...
California will no longer enforce key provisions of a law requiring social media companies to disclose details about their content moderation practices after settling a lawsuit with Elon Musk’s X Corp.
New York Attorney General Letitia James announced Thursday that social media companies must start reporting their content moderation policies to her office, as required by the “Stop Hiding Hate” Act.
New York’s Stop Hiding Hate Act requires large social media companies to submit reports about their content moderation policies to the attorney general’s office.
Social media platforms commonly use artificial intelligence for content moderation, with the software relying on algorithms to screen content posted by users. Ultimately, the AI ...
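To make the phrase "algorithms to screen content" concrete, the sketch below is a deliberately simplified, hypothetical Python example: a rule-based scorer that assigns each post a score and routes it to allow, human review, or removal. Real platforms use machine-learning classifiers at far larger scale; the flagged terms, weights, and threshold here are invented for illustration and do not reflect any platform's actual policy.

```python
# Toy, illustrative content-screening pass. All terms, weights, and the
# threshold below are hypothetical; this is not any platform's real system.
from dataclasses import dataclass

FLAGGED_TERMS = {"scam": 0.6, "hate": 0.8, "spam": 0.4}  # hypothetical weights
REMOVAL_THRESHOLD = 0.7                                   # hypothetical cutoff

@dataclass
class ModerationDecision:
    post_id: str
    score: float
    action: str  # "allow", "review", or "remove"

def screen_post(post_id: str, text: str) -> ModerationDecision:
    """Score a post against the flagged-term list and map the score to an action."""
    words = text.lower().split()
    score = min(1.0, sum(FLAGGED_TERMS.get(w, 0.0) for w in words))
    if score >= REMOVAL_THRESHOLD:
        action = "remove"
    elif score > 0:
        action = "review"  # borderline posts go to a human moderator
    else:
        action = "allow"
    return ModerationDecision(post_id, score, action)

if __name__ == "__main__":
    print(screen_post("p1", "This is a harmless post"))
    print(screen_post("p2", "obvious spam and hate"))
```

In practice the keyword lookup would be replaced by a trained classifier, but the allow/review/remove routing step shown here reflects the general shape of automated screening pipelines.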
OpenAI is diving deeper into the world of AI media generation. Its AI video generator Sora is getting its own social media app.