Just as the FDA authorized Pfizer’s COVID-19 vaccine for emergency use in kids ages five to 11, Meta, Facebook’s new corporate name, announced that it’s rolling out stricter policies for vaccine misinformation targeted at children (via Engadget). The platform put restrictions on COVID-19 vaccine misinformation in late 2020, but until now had no policies specific to kids.
Meta says in a new blog post that it’s partnering with the Centers for Disease Control and Prevention (CDC) and the World Health Organization (WHO) to take down harmful content related to children and the COVID-19 vaccine. This includes any posts that imply the COVID-19 vaccine is unsafe, untested, or ineffective for children. Additionally, Meta will provide in-feed reminders in English and Spanish that the vaccine has been approved for kids, and will also provide information about where it’s available.
Meta notes that it’s taken down a total of 20 million pieces of COVID-19 and vaccine misinformation across Facebook and Instagram since the beginning of the pandemic. That figure sits uneasily alongside the leaked internal documents known as the Facebook Papers, which made clear just how unprepared the platform was for misinformation related to the COVID-19 vaccine. Had Facebook been better prepared, it might have rolled out campaigns to combat misinformation earlier in the pandemic, for both children and adults, and possibly removed more false content as a result.