A new guide co-authored by Amelia Jamison and Dr. Sandra Quinn of the University of Maryland School of Public Health highlights the types of malicious actors (primarily bots and trolls) on Twitter, describes their behaviors, and introduces strategies to combat them. David Broniatowski of George Washington University's Department of Engineering Management and Systems Engineering was a co-author of the paper, published in the American Journal of Public Health.
Drawing on examples from the team's earlier work on vaccines and weaponized health communication, the researchers detail how two subgroups of malicious actors, automated accounts (various bots) and human-driven accounts (mostly trolls), directly influence vaccine-related Twitter discourse. They describe how these actors can work against public health as a primary goal or exploit the popularity of vaccine debates to accomplish other ends, such as spreading malware or gaining followers. Their tactics ultimately distort the social media data used to gauge public sentiment and erode public confidence in public health communications.
To combat these malicious actors, the researchers underscore the importance of countering the bot-driven messages, not the bots themselves. Promoting accurate messages, improving social media literacy, and focusing on trusted offline relationships are all important. They urge public health researchers, practitioners, and communicators to partner with computer scientists and other technology experts to understand and stop the spread of misinformation online.
The study is part of the Supplementing Survey-Based Analyses of Group Vaccination Narratives and Behaviors Using Social Media project, co-led by Dr. Sandra Quinn and Dr. David Broniatowski.