The Controversy Surrounding GPT's Role in Hacking Tools

Brian

ChatGPT has made some major ripples in the tech space, with people using it for just about everything. This AI-powered chatbot can write stories, provide information, translate text, and so much more. But could the capabilities of ChatGPT be leveraged for darker purposes? Can ChatGPT be used to write malware, and, if so, what does this mean for our cybersecurity?

Can ChatGPT Write Malware?

ChatGPT functions by working with the prompts you give it. So, if a user were to ask ChatGPT to write malware for them, would it comply?
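
To make that prompt-driven interaction concrete, here is a minimal, benign sketch of how a prompt is sent to a chat model programmatically. It uses the official OpenAI Python SDK; the model name, the example prompt, and the assumption that an OPENAI_API_KEY environment variable is set are illustrative choices, not details taken from the reporting discussed below.

```python
# A minimal sketch (not from the article): sending a prompt to a chat model
# with the official OpenAI Python SDK. Assumes `pip install openai` and an
# OPENAI_API_KEY environment variable; the model name is illustrative.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model; any chat-capable model works
    messages=[
        {
            "role": "user",
            "content": "At a high level, what is an infostealer and how do defenders detect one?",
        }
    ],
)

# The reply is plain text, shaped entirely by the prompt above.
print(response.choices[0].message.content)
```

The output is driven entirely by the prompt, and a malicious request could be submitted just as easily as a benign one, which is the crux of the concern this article explores; in practice, ChatGPT's safety filters are intended to refuse overtly harmful prompts.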

It has been found that ChatGPT is capable of writing malware. In January 2023, reports emerged of cybercriminals using ChatGPT to create malicious programs. A hacking forum user had posted about a Python-based infostealer they had written using ChatGPT. Infostealers are Trojan malware programs designed specifically to exfiltrate valuable data.

This is undoubtedly concerning, as ChatGPT is now incredibly popular and very easy to use.

A Recorded Future report stated that ChatGPT “lowers the barrier to entry for threat actors with limited programming abilities or technical skills”, essentially making it easier to conduct cybercrime. The report further noted that ChatGPT “can produce effective results with just an elementary level of understanding in the fundamentals of cybersecurity and computer science.”

What’s more, Recorded Future reported that ChatGPT can also help in various other forms of cybercrime, including “social engineering, disinformation, phishing, malvertising, and money-making schemes.”

Giving novice cybercriminals the ability to have malware written for them opens the door to many would-be attackers and essentially automates the process of malware creation.

So is ChatGPT a cybersecurity threat? The answer, sadly, is yes.

ChatGPT’s Limitations

[Image: a ChatGPT conversation open on a smartphone]

While ChatGPT can be coaxed into producing malicious code, it only seems able to write very simple malware at the time of writing. The aforementioned Recorded Future report noted that the ChatGPT-generated malware found on illicit platforms was “buggy but functional”.

This isn’t to say that no risks are posed here. ChatGPT can still give cybercriminals an easier route to scamming and attacking victims, and may allow hackers to discover different avenues alongside their usual methods.

What’s more, there’s no knowing whether ChatGPT, or another AI-powered chatbot, will one day evolve to the point where sophisticated malware can be produced. This is the beginning, not the end.

Are Hackers Using ChatGPT?

It’s clear that ChatGPT is already being abused by malicious actors, but the threat is currently relatively mild. However, as AI advances, we may see more sophisticated chatbots used by cybercriminals to create far more dangerous malware. Unfortunately, only time will tell whether ChatGPT plays a role in the biggest cyberattacks to come.

  • Title: The Controversy Surrounding GPT's Role in Hacking Tools
  • Author: Brian
  • Created at : 2024-11-02 06:14:09
  • Updated at : 2024-11-07 13:39:44
  • Link: https://tech-savvy.techidaily.com/the-controversy-surrounding-gpts-role-in-hacking-tools/
  • License: This work is licensed under CC BY-NC-SA 4.0.