Social media has fundamentally changed how people interact, exchange information, mobilise, and participate in public life in the twenty-first century. Bypassing traditional media gatekeepers, politicians can interact directly with voters through platforms like Instagram, Facebook, X, and YouTube, often reaching a sizable portion of the electorate (roughly 35–40% of the population) in a matter of seconds. Before social media, political campaigns relied on journalists, television, newspapers, and rallies to shape narratives; today, parties can publish speeches, manifestos, and targeted messages instantly. Voters’ feeds increasingly carry political content, which raises visibility, engagement, and opportunities for participation.
However, the speed and virality of social media also present significant risks, such as hate speech, divisive content, false information and manipulation. Such material can spread in minutes and shift public opinion before fact-checkers can respond. Agenda-setting itself has changed as a result of this shift. For example, former US President Donald Trump directly influenced news cycles through his frequent use of Twitter, a phenomenon that BBC News covered extensively. Further, social media significantly influenced the 2019 and 2024 general elections in India, as political parties used influencer-driven campaigns, hashtags and targeted advertisements to mobilise voters. While these platforms have increased transparency and mobilisation, they also demand stronger safeguards for media literacy, platform accountability and regulation to uphold fair democratic processes and to ensure that rapid, widespread political communication serves the public interest rather than jeopardises it.
How Social Media Shapes Voter Perception
1. Algorithmic Personalisation
Social media platforms use recommendation algorithms to decide what content each user sees. These algorithms closely monitor a person’s online behaviour: which posts they like, which videos they watch, which comments they leave, and what they share and with whom. Based on this behaviour, the platform shows more content similar to what the user has already engaged with. The purpose of this system is to keep users interested and active on the platform. The result is that no two users see the same social media feed; each person receives a customised version of news and opinions.
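The ranking loop described above can be sketched in a few lines of Python. This is purely an illustrative toy: real platform ranking systems are proprietary and vastly more complex, and every name, topic and weight below is invented for the example.

```python
# Toy sketch of engagement-based feed ranking (illustrative only;
# all names, topics and scores here are hypothetical).

def score_post(post, user_interests):
    """Score a post by how strongly its topic matches the user's
    recorded interests (built from likes, watches, comments, shares)."""
    return user_interests.get(post["topic"], 0)

def personalised_feed(posts, user_interests, top_n=3):
    """Return the top_n posts ranked by predicted engagement."""
    return sorted(posts,
                  key=lambda p: score_post(p, user_interests),
                  reverse=True)[:top_n]

# Two users with different interaction histories...
user_a = {"party_x": 5, "sports": 1}
user_b = {"party_y": 4, "music": 2}

posts = [
    {"id": 1, "topic": "party_x"},
    {"id": 2, "topic": "party_y"},
    {"id": 3, "topic": "sports"},
    {"id": 4, "topic": "music"},
]

# ...see differently ordered feeds built from the same pool of posts.
print([p["id"] for p in personalised_feed(posts, user_a)])  # [1, 3, 2]
print([p["id"] for p in personalised_feed(posts, user_b)])  # [2, 4, 1]
```

Even in this minimal form, the same pool of posts produces a different feed for each user, which is the essence of algorithmic personalisation.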
2. Filter Bubble
Over time, algorithmic personalisation creates a filter bubble. Inside a filter bubble, a voter mostly sees political views and opinions that match their own beliefs, while content that challenges or opposes those beliefs is shown less often or not at all. This makes people feel that their view is the most common or the right one, even if many others think differently. This leads to the formation of echo chambers.
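The narrowing effect can be made concrete with a small simulation, again a hypothetical toy rather than any real platform’s algorithm: each time a user engages with a topic, that topic’s weight in their feed grows, so an initially balanced feed drifts toward one topic.

```python
# Toy simulation of how engagement-driven personalisation can narrow
# a feed over time (hypothetical illustration, not a real system).
import random

random.seed(0)

topics = ["party_x", "party_y", "neutral_news"]
interests = {t: 1.0 for t in topics}  # start with equal interest in all

def pick_post(interests):
    """Show one post, chosen with probability proportional to the
    user's current interest weight for each topic."""
    return random.choices(list(interests),
                          weights=list(interests.values()))[0]

for _ in range(200):
    topic = pick_post(interests)
    interests[topic] += 0.5  # each engagement reinforces that topic

# After many rounds the weights are no longer balanced: whichever
# topic got early engagement tends to dominate future feeds.
shares = {t: round(w / sum(interests.values()), 2)
          for t, w in interests.items()}
print(shares)
```

This is a self-reinforcing loop: the more a topic is shown and engaged with, the more it is shown again, which is exactly how a filter bubble forms.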
3. Echo Chambers
An echo chamber is a situation where the same opinion is repeated again and again. When voters repeatedly see posts that support their existing beliefs, those beliefs grow stronger. At the same time, opposing views are either ignored or shown negatively. As a result, people begin seeing those with different opinions as opponents rather than fellow citizens with different perspectives. This can create an ‘us v. them’ mindset, which reduces healthy discussion, debate and understanding.
4. Targeted Political Advertising
Another important way social media influences voters is through targeted political advertising. Political parties and candidates can use social media platforms to deliver ads to specific audiences, selected on the basis of age, location, gender, interests and online behaviour. For example, during the Bihar election campaign, Spotify ran an ad that highlighted previous developments in favour of Bihar. This ad was heard only by people who live in Bihar or whose mobile numbers are registered in Bihar. In some cases, targeted ads can be useful, such as informing local voters about nearby campaign events or voting booths.
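Audience selection of this kind amounts to simple filtering over profile attributes. The sketch below is a hypothetical illustration (the field names and criteria are invented), not any ad platform’s actual API.

```python
# Hypothetical sketch of audience targeting: select users whose
# profile attributes match an advertiser's criteria.

users = [
    {"name": "A", "age": 22, "state": "Bihar"},
    {"name": "B", "age": 45, "state": "Kerala"},
    {"name": "C", "age": 30, "state": "Bihar"},
]

def target_audience(users, state=None, min_age=None, max_age=None):
    """Return only the users who match every criterion supplied."""
    result = []
    for u in users:
        if state is not None and u["state"] != state:
            continue
        if min_age is not None and u["age"] < min_age:
            continue
        if max_age is not None and u["age"] > max_age:
            continue
        result.append(u)
    return result

# Only users registered in Bihar would see this campaign:
audience = target_audience(users, state="Bihar")
print([u["name"] for u in audience])  # ['A', 'C']
```

The same mechanism lets an advertiser show one message to one segment and a different message to another, which is what makes misuse possible.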
However, this system is prone to misuse, as political parties or candidates may present different facts or promises to different groups depending on what they think will influence each group most. When people receive different versions of the truth, it becomes difficult for society to reach a shared understanding of a political issue. Digital political ads also rely heavily on emotions. Instead of explaining policies in detail, many ads use images, music, slogans, and personal stories to evoke feelings such as fear, anger, pride, or hope.
5. Emotional Messages
Emotional messages are consumed quickly and remembered longer, and they are more likely to be shared with others. While emotions can motivate people to participate in elections, they can also distract voters from facts and critical thinking. When voters receive fragmented information, opinions about a particular person or party can form easily and remain unstable. Democracy depends on a shared set of facts and on public debates where claims can be tested or challenged. If the public splits into isolated information groups, each with its own facts, making collective decisions for the betterment of society becomes far more difficult.
Influence, Misinformation, and the Changing Media Landscape
Political opinion-forming has changed dramatically due to social media. Newspapers and journalists used to be the main sources of political information; actors, YouTubers, influencers, and producers of digital content now command similar, and often larger, audiences. Figures like Andrew Tate and Elon Musk show how social media posts can regularly spark international debate, and how quickly personal opinions can shape political discourse. In India, celebrities and influencers use campaign hashtags, reels and live sessions to spread political narratives during elections, practices that have received both strong support and harsh criticism. To increase engagement, influencers frequently use divisive language, poignant narratives, memes and brief videos.
Algorithms reward such content with greater visibility, thereby boosting the influencer’s power and authority. The distinction between political advertising and personal opinion is blurred by sponsored posts, political partnerships, hashtag campaigns and targeted messaging. The larger an influencer’s audience, the greater their ability to normalise opinions, even in the absence of factual verification.
Conclusion: Negative Impacts and Encouraging Responsible Use
Social media has many benefits, but we cannot ignore the adverse effects on voters and democracy. To reduce this harm, it is essential to encourage responsible use of social media by users, influencers and government. Addressing the negative impact of social media on voters’ perceptions of politics requires multifaceted strategies that focus on education, regulation, transparency, and ethical practices.
First, media literacy is critical; people should be taught to evaluate news on social media critically, including how to identify fake news, biased content, and misleading information and headlines. When voters know how to check sources and question information, they are less likely to be misled.
Second, fact-checking and verification should be encouraged. Voters should make a habit of cross-checking political news with reliable sources before believing it to be true and sharing it.
Third, there should be transparency in digital political advertising, meaning that political ads on social media should clearly show who paid for them and why the user is seeing that particular ad. When people know the source of an advertisement, they can judge it more carefully and avoid manipulation.
Fourth, platforms should curb misinformation and fake accounts and present balanced content. Governments and platforms should take steps against fake profiles, and users should not see only one-sided content; exposure to diverse opinions helps people think more openly and understand others’ perspectives. Clear rules should be established, and swift action is needed to curb the spread of harmful and false information.


