
LastPass stops deepfake audio from tricking employee in CEO impersonation scam


TLDR:

A LastPass employee was targeted with deepfake audio impersonating the company's CEO but recognized the scam because of its suspicious urgency. Deepfakes are a rising threat in executive fraud campaigns. Industry experts recommend employee awareness training and verification steps to counter deepfake attacks.

Article Summary:

A LastPass employee was recently targeted by a deepfake audio call impersonating LastPass CEO Karim Toubba, but the call's suspicious urgency tipped the employee off and the scam failed. LastPass shared the incident to raise awareness of the growing use of deepfakes in executive impersonation fraud campaigns. Experts note that deepfake technology poses a significant challenge for cybersecurity teams, as threat actors can now automate various stages of the phishing process and engage with potential victims at a much larger scale.

Industry experts emphasize the importance of employee awareness training to counter deepfake attacks. Training should cover verifying caller identities, spotting deepfake audio, and reporting suspicious activity. As deepfake technology continues to improve, deepfake attacks are likely to increase and may bypass traditional security measures such as two-factor authentication. Experts recommend a cautious approach, verification steps, and critical thinking to avoid falling victim to deepfake scams.
