
The Dangers of Deepfake

Oct 12, 2021

Synthetic media technology has been around for a long time. The act of faking content is nothing new; it dates back over a thousand years to when Greek inventors designed machines capable of writing text, creating sounds, and playing music. These machines, however, couldn’t generate original content.


It was the rise of artificial intelligence (AI) that took synthetic media to a whole new level. Innovations in AI and machine learning (ML) have given birth to the most prominent form of synthetic media: the deepfake.


What is deepfake?

Deepfake refers to synthetic media in which artificial intelligence is used to replace a person in an image or video with someone else’s likeness in a way that makes the result look authentic.


It’s easier to remember this way:

A form of AI called deep learning is used to create realistic-looking fake media.


Hence the portmanteau: deepfake.


With the ever-increasing power of ML and AI technologies, one can now manipulate or generate images, videos, and even audio files with a high potential to deceive.


Here are some popular examples

This AI-powered website, for example, churns out deceivingly realistic portraits of people who do not exist. Refresh the page to see a new (fake) person.


This viral TikTok account dedicates itself to creating and posting short videos of a deepfake Tom Cruise. It’s definitely entertaining...in a creepy, unsettling way.


This 2020 video of a speech by a deepfake Queen Elizabeth aptly delivers a warning about how technology is enabling the proliferation of misinformation and fake news in our highly digital age.


Deepfake technology can mean bad news for your practice

As much as we’d like to celebrate the advancement of ML and AI technologies, the US Federal Bureau of Investigation (FBI) urges private organizations to stay vigilant against entities that use deepfake technology for malicious campaigns.


In a Private Industry Notification (PIN) issued by its Cyber Division in March 2021, the FBI anticipates that malicious actors “almost certainly will leverage synthetic content for cyber and foreign influence operations.”


This isn’t really anything new. Since 2019, several campaigns have been found to use ML-generated social media profile images. However, the recent advances in ML and AI give cybercriminals the opportunity to generate and manipulate content that serves their malicious plans. 


How does this affect you?

Healthcare practices, large and small, have long been prime targets for cybercrime. This is because you handle especially sensitive data that could be worth a lot of money to cybercriminals. 


If hackers are working hard to improve the quality and increase the impact of their campaigns, you can be sure that they’re doing this with healthcare practices like yours as their target. Whether it’s tricking your staff into letting them into your network or launching external attacks against your cybersecurity defenses, hackers will definitely use technological advancements to their advantage.


Here’s what the FBI anticipates

In its PIN, the FBI anticipates that malicious actors will employ deepfake technology broadly across their cyber operations. The sophistication of synthetic media will take their existing spearphishing and social engineering campaigns to a different, more potent level.


Besides using the technology on existing campaigns, cybercriminals are also anticipated to employ deepfake tools in a newly defined cyber attack vector called Business Identity Compromise (BIC).


BIC involves the creation of synthetic corporate personas or the imitation of existing employees, likely with the goal of gaining access to your company’s bank account, line of credit, tax refund, or personnel information.


How do we identify deepfakes?

As you may have observed from the examples above, some deepfakes are so realistic that they’re almost impossible to detect. 


Here are a few tips from the FBI on how you can identify and mitigate synthetic media such as deepfakes:


Keep an eye out for visual indicators.

Visual indicators include distortions, warping, and inconsistencies. In photos, check whether the spacing and placement of the person’s eyes are distinct and consistent across several images.


In videos, check for inconsistencies in the movement of the person’s head and torso. You can also check whether the movement of their face and/or lips is consistent with the audio.
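The eye-spacing tip above can be sketched as a toy consistency check. Note that the landmark coordinates, field names, and tolerance below are hypothetical stand-ins for what a real face-landmark detector would return, and real deepfake forensics relies on far richer signals than this:

```python
# Toy sketch: compare the inter-eye distance (normalized by face width)
# across several images supposedly showing the same person, and flag the
# set if the spacing varies too much. Landmark data here is hypothetical.
from math import dist

def eye_spacing_ratio(landmarks):
    """Inter-eye distance divided by face width, so scale doesn't matter."""
    eye_gap = dist(landmarks["left_eye"], landmarks["right_eye"])
    face_width = dist(landmarks["left_cheek"], landmarks["right_cheek"])
    return eye_gap / face_width

def looks_inconsistent(images, tolerance=0.05):
    """True if the normalized eye spacing varies by more than `tolerance`."""
    ratios = [eye_spacing_ratio(lm) for lm in images]
    return max(ratios) - min(ratios) > tolerance

# Hypothetical landmark data for three images of the "same" face:
images = [
    {"left_eye": (80, 100), "right_eye": (120, 100),
     "left_cheek": (60, 110), "right_cheek": (140, 110)},
    {"left_eye": (82, 98), "right_eye": (121, 99),
     "left_cheek": (61, 108), "right_cheek": (139, 109)},
    {"left_eye": (70, 100), "right_eye": (130, 100),   # suspiciously wide
     "left_cheek": (60, 110), "right_cheek": (140, 110)},
]
print(looks_inconsistent(images))  # -> True (the third image stands out)
```

The same idea (measure something that should be stable, then flag outliers) applies to the video checks as well.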


Contact a reputable third-party organization.

If identifying synthetic content is too difficult or too confusing for you, a third-party research and forensic organization can help evaluate the media. A trusted IT and cybersecurity company can also offer valuable insight and advice.


Be familiar with media resiliency frameworks.

An example would be the SIFT methodology, which encourages us to carry out the following steps when taking in information online:


  1. Stop. If you feel the urge to immediately believe or share something on the Internet, don’t. At least not until you’ve carried out the next steps.
  2. Investigate the source. Ask yourself, “Is this source what I thought it was? Is this source credible enough to share without any further checking? Is there anything that might disqualify this as a source?”
  3. Find trusted coverage. If you’re not sure you trust the source of your information, do a search and see who else is publishing or reporting it.
  4. Trace claims, quotes, and media to the original content. See the full context of the information. Read the whole article, check the date the content was published, and verify that the information is up to date.

How do you protect your practice from deepfakes?

Deepfakes may be getting more and more realistic, but you can still protect your practice from getting fooled by cybercriminals who use them. Good cyber hygiene can significantly lower your risks of falling victim to malicious actors.


Here are a few FBI-endorsed security measures that your practice can adopt:


Information and Education

Inform and educate your entire workforce about the risks of deepfakes, making sure to include those in upper management and senior executive positions. Because they have access to the most sensitive and most valuable information in your organization, they are prime targets for cybercrime.


Cybersecurity Awareness

Conduct cybersecurity awareness training among your staff so that they know how to spot, avoid, and report social engineering, phishing, and other cyberattack attempts. You can partner with trusted experts who specialize in healthcare technology and have them share their valuable insight and advice with your team.


Identity Verification

Do not assume an online persona is legitimate. Seek multiple independent sources of information to validate or verify it. If you receive attachments, links, or emails from senders you don’t recognize, do not open them.


Privacy Protection

Never provide personal information in response to unsolicited inquiries. This includes usernames, passwords, birth dates, social security numbers, financial data, and other sensitive information. 


If you receive a request for sensitive or corporate information, be cautious about fulfilling it electronically or over the phone. If you can, verify the request through a secondary communication channel.


Multi-factor Authentication

Use multi-factor authentication (MFA) on all systems to add an extra layer of security to your network. According to Microsoft, MFA can block 99.9% of account compromise attacks.
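To give a sense of why MFA adds real protection, here is a minimal sketch of how the rotating 6-digit code behind many authenticator apps is derived (the TOTP algorithm, RFC 6238): the code is computed from a shared secret and the current time, so a stolen password alone is not enough. This is illustrative only; in production, use a vetted library and an established MFA product:

```python
# Minimal TOTP sketch (RFC 6238, built on HOTP from RFC 4226).
# The 6-digit code is an HMAC of the current 30-second time window,
# keyed by a secret shared between the server and the user's device.
import hmac
import hashlib
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

def totp(secret: bytes, at_time=None, step: int = 30) -> str:
    t = time.time() if at_time is None else at_time
    return hotp(secret, int(t // step))

# RFC 6238 test secret; at t=59 seconds the time-step counter is 1
print(totp(b"12345678901234567890", at_time=59))  # -> 287082
```

Because the code changes every 30 seconds and depends on a secret the attacker doesn’t have, a phished password by itself won’t get them in.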


Read more about MFA and how it can help your practice here...


Continuity Plans

Establish and implement processes that allow your practice to continue operations in the event one of your accounts is compromised and used to spread synthetic content.


Partner With Reputable IT and Cybersecurity Experts

All these tips from the FBI are extremely helpful to healthcare practices everywhere. Unfortunately, dealing with threats involving synthetic media can be highly technical, and you may not be able to handle everything yourself.


For maximum protection and round-the-clock support, partner with a trusted IT company that specializes in protecting and optimizing healthcare practices.


TALK TO A CYBERSECURITY PRO TODAY
