The governor of California recently signed a bill prohibiting the distribution of AI-generated media, including deceptive audio and visual content about a political candidate, within 60 days of an election unless the content clearly discloses that it was AI-generated. While this represents an attempt to curtail the use of false information to influence elections, it is becoming increasingly difficult to distinguish between misinformation and disinformation. The advent of artificial intelligence has complicated things even further.
Misinformation refers to false content that is inadvertently shared with others, while disinformation involves the deliberate use of falsified material to influence others’ opinions. The COVID-19 pandemic is a recent reminder of the impact this kind of information can have on public perception and behavior. So many people compromised their health by following unverified health advice that in 2021 the U.S. Surgeon General issued an advisory urging people to stop the spread of health misinformation. This raises the question: how can people know which information can be trusted?
While installing a ministry of truth would be too Orwellian, addressing how people consume media and make decisions based on what they see can help. Media literacy has gained increasing attention as a way to help people assess the validity of what they are viewing. Basic media literacy skills include identifying the source of information in order to assess its credibility, recognizing who is sharing the information and what their goal is, and determining who the intended audience is and how the message is likely to affect them. Thinking critically about these aspects of the media makes people less likely to be tricked into believing false or malicious information. Of course, this takes time and energy, but in a media-saturated world, we need to be discerning about what we choose to believe, both for practical decision-making and for maintaining our mental health.
There is a large body of literature indicating that media exposure can make us stressed, anxious, and depressed. This is due in part to the fact that media outlets, in their effort to attract our attention and advertising dollars, spend more time covering negative, sensational stories than benign, solution-oriented content. A steady diet of such information can erode our trust in the people around us, our institutions, and the future. While practicing media literacy doesn’t solve these problems or cure our malaise, it can help us maintain perspective and focus on solutions rather than imminent doom.
This is particularly important during an election year, especially if the amount of AI-generated content online increases. Social media use increased during the 2020 election, pointing to a growing reliance on these platforms to share information and stay informed. Researchers who tracked the emotional tone of online exchanges found that more charged political rhetoric drew greater engagement. Twitter, for example, saw a rise in polarizing comments about the political issues discussed during the debates leading up to the 2020 presidential election. Whether the initial tweets were AI- or user-generated remains uncertain, but engagement with this type of content was heightened during the campaign season.
The rise of AI-generated content on social media platforms may pose significant problems for younger voters. Studies show that younger people are more likely to use platforms like TikTok and Instagram as their primary source of news. Awareness will be a key ingredient in avoiding AI-generated misinformation. It is up to us to do our due diligence before consuming and sharing information online this election season. This can be achieved, in part, by practicing media literacy.
Post co-written with Dr. Joseph Torres, a recent graduate of the doctoral program at the University of Texas at San Antonio, where his research focused on media literacy and mental health.