
The Use of Artificial Intelligence by Cyber Criminals

Photo by Lionello DelPiccolo on Unsplash

Artificial Intelligence (AI) is responsible for technological advancements affecting many diverse aspects of society. Most of this development is focused on achieving economic or social benefits for the world’s population. As is often the case with powerful tools, however, there are also individuals and groups that would use the capabilities of AI and the related field of machine learning (ML) for nefarious purposes. We will look at a few of the ways in which this use of the technology has raised new issues and concerns for cybersecurity professionals.

Conducting Spear-Phishing Attacks

Spear-phishing is a widely used method of conducting cyberattacks, and the ability to automate these attacks can make them exponentially more dangerous. IBM has developed DeepLocker, a highly targeted and evasive proof-of-concept attack tool powered by AI. The company deliberately hid malware in a video conferencing application and had DeepLocker unlock the payload only when facial recognition identified a specific individual.

IBM designed DeepLocker to understand how AI and malware can be combined to create new types of spear-phishing attacks. Its success indicates that, in the wrong hands, this technology can be used to wreak havoc in many unexpected and detrimental ways.
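
The key idea IBM has described publicly is that DeepLocker derives the payload’s decryption key from the AI model’s own output, so the malicious code stays encrypted, and effectively invisible to analysis, until the intended target’s face appears. The Python sketch below illustrates that gating concept in a deliberately harmless form; the face_embedding stub, the quantization scheme, and the XOR “encryption” are simplifications of ours, not IBM’s implementation.

```python
import hashlib

# Hypothetical stand-in for a face-recognition model. A real system would
# return an embedding vector for the face seen by the camera; here the
# "frame" is already the embedding so the sketch runs end to end.
def face_embedding(frame):
    return frame

def derive_key(embedding):
    # Quantize the embedding and hash it. Only the intended target's face
    # reproduces the exact key, so the locked bytes reveal nothing.
    quantized = bytes(int(round(x * 10)) % 256 for x in embedding)
    return hashlib.sha256(quantized).digest()

def xor_bytes(data, key):
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# The "payload" is a harmless message, locked at build time with the key
# derived from the target's embedding.
target = [0.12, 0.87, 0.45, 0.33]
locked = xor_bytes(b"payload unlocked", derive_key(target))

for frame in ([0.90, 0.10, 0.20, 0.50], target):  # a stranger, then the target
    print(xor_bytes(locked, derive_key(face_embedding(frame))))
```

The point of the sketch is that a defender inspecting the locked bytes learns nothing about the payload or the trigger; only the correct face reproduces the key.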

Identifying System Vulnerabilities

The United States Department of Defense has purchased software from a private company that possesses self-healing capabilities and is optimized to discover system vulnerabilities using machine learning techniques. While this particular tool is under government control, similar software is almost certainly being developed by cybercriminals intent on finding exploitable vulnerabilities in the systems that society depends on every day.

Software vulnerabilities have always been a prime target for hackers. By delegating the labor- and time-intensive task of identifying weak points to AI-powered programs, hackers can spend more time sharpening the weapons they use to exploit the vulnerabilities their artificially intelligent tools uncover.
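
No technical details of that government tool are public, but a common pattern in ML-assisted vulnerability discovery is to learn which parts of an input are most likely to expose a bug and concentrate the search there. The toy Python fuzzer below invents a trivial crash condition and uses simple per-position reinforcement, a crude stand-in for a real learned model, to bias its mutations toward the bytes that have triggered crashes before.

```python
import random

# Toy "target" standing in for real software under test: it crashes when
# byte 3 is 0xFF and byte 7 is 0x00. The crash rule is our invention.
def target(data):
    if data[3] == 0xFF and data[7] == 0x00:
        raise RuntimeError("crash")

def fuzz(rounds=50000, size=16):
    # Per-position crash counts act as a crude learned model: positions
    # that were mutated in crashing inputs get mutated more often later.
    weights = [1.0] * size
    crashes = 0
    for _ in range(rounds):
        data = bytearray(random.randrange(256) for _ in range(size))
        pos = random.choices(range(size), weights=weights)[0]
        data[pos] = random.choice([0x00, 0xFF, random.randrange(256)])
        try:
            target(data)
        except RuntimeError:
            crashes += 1
            weights[pos] += 5.0  # reinforce the position that paid off
    return crashes, weights

crashes, weights = fuzz()
print("crashes:", crashes)
print("hot positions:", [i for i, w in enumerate(weights) if w > 1.0])
```

Production tools couple far richer models with code-coverage feedback, but the loop is the same: the model spends the machine time so the human doesn’t have to.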

Controlling Autonomous Vehicles

The potential to use AI and ML techniques to hack into autonomous vehicles is a concern that needs to be addressed before these machines are widely adopted. The ability to fool a vehicle’s vision and collision-avoidance systems could lead to catastrophic results as traffic signs are ignored or the rules of the road disregarded.
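
One concrete way to fool a vision system is an adversarial example: a tiny, carefully chosen perturbation that flips a model’s prediction while looking unchanged to a human. The sketch below applies the classic fast gradient sign method (FGSM, Goodfellow et al.) to a toy logistic-regression “classifier”; the random weights and the stop-sign framing are illustrative assumptions, not any vehicle vendor’s actual model.

```python
import numpy as np

# A toy stand-in for a vision model: logistic regression over 100 features.
# Real attacks target deep networks, but FGSM works the same way in spirit.
rng = np.random.default_rng(0)
d = 100
w = rng.normal(size=d)          # "trained" weights (random for the sketch)

def predict(x):
    return 1 / (1 + np.exp(-(x @ w)))   # P(class = "stop sign")

# Build an input the model classifies correctly with a modest margin.
z = rng.normal(size=d)
x = z + (2.0 - z @ w) / (w @ w) * w     # shift so the clean logit is 2.0
print(f"clean:       {predict(x):.3f}")  # ~0.88, correctly "stop sign"

# FGSM: for true label y=1 the gradient of the loss w.r.t. x is (p - 1) * w,
# so stepping each feature by eps in the sign of that gradient maximally
# increases the loss under an L-infinity budget.
eps = 0.05
x_adv = x + eps * np.sign((predict(x) - 1.0) * w)
print(f"adversarial: {predict(x_adv):.3f}")  # falls below 0.5: misclassified
```

Researchers have demonstrated physical versions of this attack using nothing more than stickers on a stop sign, which is what makes the threat to vehicles so concerning.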

Consumer drones can also be weaponized and controlled by applications that use ML to continually improve their ability to evade capture. These techniques could lead to new types of terrorist attacks that are much harder to detect and prevent.

Impersonating Individuals

The ability to use machine learning to impersonate individuals in various ways opens up new avenues through which cybercriminals can exploit their victims. The creation of fake voice recordings and videos using AI is making it increasingly hard to distinguish between reality and an impersonation.

Through these impersonation techniques, victims may be tricked into providing personal details that they would otherwise never make public. This can lead to compromised systems or accounts and can be extremely damaging in many ways.

These are just a few of the ways that AI and ML technologies may be used for malicious purposes. Society must be aware of the double-edged potential of these disciplines to bestow great benefits or cause significant harm.

Robert Agar

I am a freelance writer who graduated from Pace University in New York with a Computer Science degree in 1992. Over the course of a long IT career, I have worked for a number of large service providers in a variety of roles revolving around data storage and protection. I currently reside and write from a home office in northeastern Pennsylvania.