Stay informed with our Brazilian election interference reports

Election interference and disinformation threats are evolving constantly, with a growing number of cyber incidents and malicious posts aimed at swaying public perception. Organizations and individuals should assess and mitigate the impact of election disinformation and cyber election interference. Read our fortnightly reports and the mitigation measures below.

Mitigation measures

  • Review your passwords and privacy settings on devices, applications, and social media sites. Ensure passwords to your most sensitive and critical accounts are long and sufficiently complex. When able, use application-based two-factor authentication (2FA) to further secure your information (an illustrative sketch of how such codes are derived appears after this list).
  • Make risk-based decisions about what you share on social media so that your most sensitive information stays protected. Maintain situational awareness and keep a healthy level of scepticism.
  • One of the best defences against non-persistent, memory-resident malware on electronic devices in the current threat environment is to regularly power cycle your devices (turn them off and on again). This increases the work for threat actors as it forces them to regain a foothold in the user’s device following each restart.
  • All devices should have reputable anti-virus software installed with regular scanning features enabled.
  • Regularly update your device’s operating system (iOS or Android) to patch known security vulnerabilities. Keep all apps up to date, as updates often include security patches to address newly discovered vulnerabilities, and only download apps from trusted sources such as Google Play or the Apple App Store. Avoid third-party or unknown app stores that might host malicious software.
  • Fact-check social media posts before reposting them, and avoid repeating misinformation without including a correction.
  • Be cautious of unfamiliar social media accounts, websites or domains that seem unreliable. Rely on established and reputable sources, like major news organizations or academic institutions.
  • Scrutinise video content and AI-generated images, as deepfakes are becoming increasingly convincing. Always look for other sources that verify a video's authenticity, and use reverse image search tools to check whether an image has been altered or is being used out of context (see the perceptual-hashing sketch after this list).
  • Organizations should consult our mitigation pages on Seerist for general malware and malware distribution. Phishing detection and response training is also recommended.
  • Be cautious about sharing sensitive information (such as credentials) online or via phone. 
  • Organizations should develop guidelines for identifying, addressing, and debunking misinformation swiftly and designate a crisis response team to handle high-priority cases.
  • Partner with or create dedicated fact-checking teams to verify and debunk election-related claims in real-time.
  • By proactively building a cybersecurity culture, implementing stringent technical defences, and preparing for potential incidents, organizations can significantly reduce the likelihood and impact of cyberattacks.
  • Raising awareness about social engineering tactics among employees is crucial. Regular training sessions and simulated attacks can help employees recognize and respond appropriately to social engineering attempts. 
  • Implementing robust DDoS protection measures is essential, especially during high-risk periods. This includes using DDoS mitigation services and ensuring that critical infrastructure is resilient to such attacks. 
  • Educating employees and stakeholders about disinformation and its potential impact is important. This includes training on how to identify and respond to disinformation campaigns, particularly those targeting the organization or its industry. 
  • Actively monitoring social media platforms for disinformation and other cyber threats can help organizations respond quickly to emerging threats. This includes using tools and services that provide real-time alerts and analysis of social media activity (a simple keyword-monitoring sketch follows this list).
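
The sketches below are illustrative only and are not part of the mitigation guidance above. The first shows, in Python using only the standard library, how an authenticator app derives the time-based one-time passwords (TOTP, RFC 6238) behind application-based 2FA; the 30-second period, 6-digit length, and example secret are assumptions for demonstration.

```python
# Minimal TOTP (RFC 6238) sketch: how an authenticator app derives the
# 6-digit codes used for application-based 2FA. Illustrative only.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    """Derive the current time-based one-time password from a base32 secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // period                 # 30-second time step
    msg = struct.pack(">Q", counter)                     # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()   # HMAC-SHA1 per RFC 4226/6238
    offset = digest[-1] & 0x0F                           # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

if __name__ == "__main__":
    # "JBSWY3DPEHPK3PXP" is a widely used documentation example, not a real credential.
    print(totp("JBSWY3DPEHPK3PXP"))
```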
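
The next sketch illustrates the idea underpinning reverse image search: comparing a suspect image against a known original using a perceptual hash. It assumes the third-party Pillow and imagehash Python packages, and the file names and threshold are placeholders; real verification should still rely on established reverse image search services and editorial judgement.

```python
# Illustrative sketch: comparing a suspect image against a known original
# with a perceptual hash, the same idea reverse image search engines rely on.
# Assumes the third-party packages Pillow and imagehash (pip install pillow imagehash).
from PIL import Image
import imagehash

def looks_like_same_image(original_path: str, suspect_path: str, threshold: int = 8) -> bool:
    """Return True if the two images are perceptually similar (small Hamming distance)."""
    original = imagehash.phash(Image.open(original_path))
    suspect = imagehash.phash(Image.open(suspect_path))
    distance = original - suspect          # Hamming distance between 64-bit pHashes
    return distance <= threshold

if __name__ == "__main__":
    # Placeholder file names for demonstration only.
    match = looks_like_same_image("original_photo.jpg", "viral_post_image.jpg")
    print("likely the same underlying image" if match else "likely different or heavily altered")
```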
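
Finally, a minimal sketch of keyword-based social media monitoring with simple alerts. The feed URL, payload shape, and watch terms are hypothetical placeholders; production monitoring would use a platform's official API or a commercial monitoring service.

```python
# Illustrative monitoring sketch: polling a (hypothetical) JSON feed of public posts
# and flagging items that match simple election-disinformation keywords.
# The feed URL and payload shape are placeholders, not a real platform API.
import json
import time
import urllib.request

FEED_URL = "https://example.com/public-posts.json"   # placeholder endpoint
WATCH_TERMS = ["urnas fraudadas", "voto roubado", "eleição anulada"]  # example terms

def fetch_posts(url: str) -> list[dict]:
    """Download the feed; assumes a list of post dicts with 'id' and 'text' keys."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.loads(resp.read().decode("utf-8"))

def flag_posts(posts: list[dict], terms: list[str]) -> list[dict]:
    """Return posts whose text contains any watched term (case-insensitive match)."""
    return [p for p in posts if any(t.lower() in p.get("text", "").lower() for t in terms)]

if __name__ == "__main__":
    seen: set[str] = set()
    while True:
        for post in flag_posts(fetch_posts(FEED_URL), WATCH_TERMS):
            if post["id"] not in seen:          # alert once per post
                seen.add(post["id"])
                print(f"ALERT: post {post['id']} matches watched terms: {post['text'][:80]}")
        time.sleep(60)                          # poll every minute
```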

Disinformation education

Organisations should use the Actors, Behaviour, Content (ABC) disinformation framework to understand manipulative actors, their behaviour and intent, and the harmful content they post. This will help them develop more effective strategies for countering misinformation.