However, as with any new technology, there will always be those who use it for harm rather than good. Just as the mass adoption of email and everyday Internet browsing opened the floodgates for new and improved fraud attempts, malicious actors have adopted this technology to enhance their operations in several concerning ways.
Phishing emails, once easily identified by their awkward language and generic messages, can now be crafted by AI to closely imitate legitimate communication styles, making them far more convincing and difficult to detect. Voice cloning technology, powered by AI, allows attackers to replicate the voice of an individual with alarming accuracy, facilitating voice phishing (vishing) schemes that prey on individuals’ trust. Additionally, “deepfake” technology enables the creation of highly realistic video or audio content, which can be used to deceive targets by presenting convincing but false evidence. Together, these AI-driven techniques represent a significant escalation in the complexity and effectiveness of phishing attacks, posing a threat to both individuals and organizations.
Fortunately, no matter how advanced these fraud attempts may get, the principles of security awareness will not fail you:
Verify the sender
Is the email sender who they say they are? Look carefully at the sender’s full email address for red flags. If your coworker John Doe’s email is john.doe@company.com, be wary of jdoe@company.com or john.doe@compani.com.
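For readers on the technical side, the same lookalike check can be automated. The sketch below is a minimal illustration, not production tooling: the trusted address list, the company.com domain, and the 0.8 similarity cutoff are assumptions made for the example, and a real mail filter would use far more signals.

```python
import difflib

# Hypothetical known-good sender and domain, for illustration only.
TRUSTED_SENDERS = {"john.doe@company.com"}
TRUSTED_DOMAIN = "company.com"

def flag_suspicious_sender(address: str) -> str:
    """Return a short verdict for an inbound sender address."""
    address = address.strip().lower()
    if address in TRUSTED_SENDERS:
        return "known sender"
    domain = address.rsplit("@", 1)[-1]
    # A near-match domain (e.g. compani.com) is the classic lookalike red flag.
    similarity = difflib.SequenceMatcher(None, domain, TRUSTED_DOMAIN).ratio()
    if domain != TRUSTED_DOMAIN and similarity > 0.8:
        return "suspicious: lookalike domain"
    if domain == TRUSTED_DOMAIN:
        return "unrecognized sender on a trusted domain"
    return "unknown sender"

print(flag_suspicious_sender("john.doe@company.com"))   # known sender
print(flag_suspicious_sender("jdoe@company.com"))       # unrecognized sender on a trusted domain
print(flag_suspicious_sender("john.doe@compani.com"))   # suspicious: lookalike domain
```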
Think before you click
Even if a link or attachment comes from a trusted source, always check the details before clicking. Is the SharePoint link your coworker sent actually taking you to SharePoint? Is that Word file attached to the email actually a Word document?
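Those two questions can also be expressed programmatically. The following is a minimal sketch assuming a hypothetical expected SharePoint host (company.sharepoint.com) and a simple extension check; it only demonstrates the idea of comparing where a link really points and what an attachment really is.

```python
from urllib.parse import urlparse

# Hypothetical expected host and extensions, for illustration only.
EXPECTED_SHAREPOINT_HOST = "company.sharepoint.com"
WORD_EXTENSIONS = {".doc", ".docx"}

def link_goes_to_sharepoint(url: str) -> bool:
    """Check that the link's actual host matches the expected SharePoint host."""
    host = urlparse(url).hostname or ""
    return host.lower() == EXPECTED_SHAREPOINT_HOST

def looks_like_word_document(filename: str) -> bool:
    """Check the final extension; 'invoice.docx.exe' is not a Word file."""
    name = filename.lower()
    return any(name.endswith(ext) for ext in WORD_EXTENSIONS)

print(link_goes_to_sharepoint("https://company.sharepoint.com/sites/hr/doc"))  # True
print(link_goes_to_sharepoint("https://sharepoint.company-files.net/login"))   # False
print(looks_like_word_document("Q3-report.docx"))                              # True
print(looks_like_word_document("Q3-report.docx.exe"))                          # False
```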
Be skeptical
Ask questions whenever a situation seems even slightly abnormal. Would your manager urgently call or email you when you normally communicate via Teams or Slack? With the introduction of voice cloning technology, it’s important to think rationally, even if you hear the voice of an angry boss or a distraught relative. If there is any doubt about the caller or sender’s identity, end the conversation and contact the supposed sender directly through a known phone number or email address to confirm.
Keeping up with new technology is a never-ending race. Luckily, staying ahead of this emerging threat is as simple as reinforcing the basics. If you’d like to read more on voice cloning fraud, McAfee Labs conducted a study on the subject: https://www.mcafee.com/blogs/privacy-identity-protection/artificial-imposters-cybercriminals-turn-to-ai-voice-cloning-for-a-new-breed-of-scam/.
Authored by: Reed Buettner, Security