Russian hackers are trying to exploit ChatGPT for malicious purposes

Russian cybercriminals have been found trying to circumvent restrictions on ChatGPT and use the advanced AI-powered chatbot for their nefarious purposes.
Check Point Research (CPR) said it detected discussions on underground forums where hackers discussed various methods, including using stolen payment cards to pay for upgraded OpenAI accounts, bypassing geo-fencing restrictions, and using “Russian semi-legal online SMS services” to register for ChatGPT.
ChatGPT is a new artificial intelligence (AI) chatbot that has gained a lot of attention due to its flexibility and ease of use. Cybersecurity researchers have seen hackers use the tool to create convincing phishing emails, as well as code for Office files containing malicious macros.
A paper barrier
However, abusing the tool is not easy, because OpenAI has put some restrictions in place. Russian hackers, due to the invasion of Ukraine, have even more hurdles to overcome.
For Sergey Shykevich, Director of Threat Intelligence Group at Check Point Software Technologies, the barriers weren’t good enough:
“It shouldn’t be too difficult to bypass OpenAI’s restrictions on specific countries accessing ChatGPT. Right now, we are seeing Russian hackers discussing and examining how to bypass geo-fences to use ChatGPT for their malicious purposes.
“We believe these hackers are most likely trying to implement and test ChatGPT into their everyday criminal activities. Cybercriminals are more and more interested in ChatGPT, because the AI technology behind it can help hackers save money,” Shykevich said.
But hackers aren’t just looking to use ChatGPT – they’re also trying to cash in on the tool’s growing popularity to spread all kinds of malware and steal money. For example, Apple’s App Store hosted an app that pretended to be the chatbot but charged a monthly subscription of about $10. Other apps (some of which were also found on Google Play) charge up to $15 for the “service”.