Five billion people around the world use social media, and millions of posts are shared on Meta (Facebook) every minute.
Virtual life can be fun and beneficial, but it can also be harmful. Misinformation about climate science is widespread. So are conspiracy theories and disinformation about vaccine science, as we have witnessed during the COVID-19 pandemic.
This misinformation not only undermines trust in science, but actively threatens public health. The World Health Organization ranks vaccine hesitancy among the top 10 global health threats of our time, along with air pollution and climate change.
Evidence shows that vaccines save lives and that climate change is strongly linked to rising water temperatures, rising sea levels, shrinking polar ice sheets, loss of biodiversity and more extreme weather events. 2023 was the hottest year on record. Greenhouse gas emissions hit an all-time high. Earth is sounding a distress signal.
But ironically, as the scientific evidence becomes more convincing, public opinion seems to be becoming more divided.
This hesitancy and denial appear to be based on a misunderstanding of how science works.
Vaccines save millions of lives each year, and advances in measurement technology are reshaping public understanding of climate change, with clear data showing that rising CO2 levels and temperatures are linked to the burning of fossil fuels.
This is science in action.
But opponents on social media and other news outlets say the scientific evidence on both issues is inconclusive and contradictory. Opposing views are often supported by a few vocal scientists in white coats who spread misinformation, uncertainty, and doubt.
The fossil fuel industry also helps undermine science by emphasizing so-called “dissenting voices” within the scientific community.
This tactic is not new. For decades, the industry has created front organizations and corporations to fund climate science denial and cloud public understanding of climate science.
These days, fossil fuel industries and other major polluters are using social media to reach new audiences. An analysis found that 16 of the world’s biggest polluters were responsible for running more than 1,700 false and misleading ads on Facebook in 2021.
There are always a few scientists who challenge the accepted view. However, after reviewing the existing evidence, more than 97 percent of professional medical and climate scientists agree that vaccines are safe and that human activity is increasing carbon dioxide levels and warming the Earth’s surface.
The key message that the public needs to hear is that “it’s the preponderance of the evidence that counts, not the opinions of any particular individual.”
Facts are considered true because they are based on overwhelming evidence, whereas opinions may or may not be true.
It is easy to see how social media, which platforms opinion and falsehood while excluding traditional gatekeepers of knowledge such as news editors, fosters the impression that the science around climate change and vaccines is unresolved or “controversial.”
This is especially true given that, according to a September 2023 report from the Climate Action Against Disinformation coalition, 80 percent of social media platforms lack content moderation policies with comprehensive, universal definitions of climate misinformation.
Artificial intelligence (AI) threatens to make matters worse, offering new ways to deceive and to sow doubt.
AI algorithms ensure that people who click on anti-vaccine posts are continuously fed anti-vaccine and other anti-science content. Similarly, AI can be used to target misinformation to vulnerable audience sectors. And AI programs trained to write text using vast datasets have automated the generation of more persuasive fake news.
Change starts with education
Education is needed to bridge the gap between those who accept the science and those who deny it.
Learning about the scientific method is important, but in an age of conspiracy theories, misinformation, and “alternative facts,” critical literacy and philosophical inquiry are just as important.
Expanding school curricula to help students navigate complex ideas and virtual digital realities, and to make sense of an increasingly complex world, could help.
Since children engage with social media from an early age, it makes sense for the education system to help them develop new skills for distinguishing truth from falsehood when faced with new information.
Preschool through 12th grade education could include age-appropriate exercises addressing conspiracies commonly discussed on social media, from vaccination scare tactics and climate denialism to flat Earth theory, claims that the moon landing was a hoax, and claims that the Earth is only 6,000 years old.
If started early enough, this education can align with the child’s development and the transition of thinking skills from concrete to more abstract learning, a process that begins around the age of six.
A ‘World Social Media Organization’ could be the solution
Social media platforms have been essentially allowed to self-regulate for decades. This is a problem, because money can be made from companies that pay to promote “fake news” and other forms of anti-science misinformation.
There have been several moves to regulate social media platforms in recent years, with varying degrees of success.
In mid-2023, the Australian government proposed legislation to strengthen the Australian Communications and Media Authority’s powers to pressure tech companies to combat online misinformation. However, the legislation has not yet been passed and Australia is still in the early stages of responding to fake news and disinformation.
Individual platforms have their own mechanisms to reduce misinformation.
For example, X (formerly Twitter) has a “Community Notes” function: below a tweet containing misinformation, readers see “context added by readers” with the corrected facts.
However, the Climate Action Against Disinformation coalition’s 2023 report lists X as the worst offender in spreading climate disinformation. And despite efforts by Meta, TikTok, and YouTube to combat misinformation about climate change on their platforms, policy enforcement is lacking.
Rather than relying on individual governments to legislate or expecting social media platforms to regulate themselves effectively, some form of international regulation is urgently needed to stem the constant barrage of misinformation and harm that pervades the online world.
Founded in 2021, Project Liberty brings together a global alliance of technologists, academics, policymakers, and citizens to build a safer and healthier internet and social media. The organization supports efforts to report misinformation, intervene against deepfakes, and improve online news literacy.
The European Commission has also been tackling online disinformation with a high-level expert group that has made important recommendations, including the development of tools to educate platform users.
In 2018, the International Grand Commission on Disinformation, made up of the United Kingdom, Argentina, Belgium, Brazil, Canada, France, Ireland, Latvia, and Singapore, met for the first time. The committee subsequently met in Ottawa and Dublin and co-hosted a series of seminars in the United States.
These efforts are just the beginning. They show that cross-border cooperation is possible in tackling the global problem of internet disinformation.
But given the scale of the disinformation problem, and especially the growing concerns about AI, the international community needs to go a step further.
Social media companies should be no different from global pharmaceutical companies, for example, which are heavily regulated by bodies such as the US FDA and Australia’s TGA to ensure the medicines they produce are not harmful to humans.
One solution might be to establish a global social media organization made up of global partners from each country to oversee harmful content.
Like the WHO, which supports global health, this organization could focus on promoting the responsible dissemination of knowledge and preventing harm and hate speech in order to promote the well-being of society.
In today’s society, where digital information is ubiquitous and rapidly changing, we need to embrace robust debate, listen to experts, verify what they say, and understand the methods they use, while avoiding being swayed by the ignorant and the naysayers.
We need digital subjects to become digital citizens who protect their rights rather than being manipulated by social media.
If society fails to regulate social media, historians 100 years from now may write: “People in the early 21st century were so overwhelmed with digital information that they failed to develop the skills and systems to adequately process it, to the detriment of society.”
This article was first published by 360info™.
Editor’s note: The opinions expressed here by the authors are their own and not those of Impakter.com. — Cover photo credit: Unsplash.