ChatGPT Hallucinations Open Developers to Supply Chain Malware Attacks

Attackers can exploit ChatGPT’s penchant for returning false information to spread malicious code packages, researchers have found. This poses a significant risk to the software supply chain, as it can allow malicious code and trojans to slip into legitimate applications and code repositories such as npm, PyPI, and GitHub.

By leveraging so-called “AI package hallucinations,” threat actors can create ChatGPT-recommended, yet malicious, code packages that a developer could inadvertently download when using the chatbot, building them into software that is then widely used,…
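The defense implied by the attack described above is to verify a suggested package name against a source you already trust before installing it, rather than trusting the chatbot's recommendation. Below is a minimal sketch of that idea; the allowlist contents and function names are illustrative assumptions, not something from the article. The name normalization itself follows PEP 503, the rule PyPI uses to compare package names:

```python
import re

# Hypothetical allowlist of packages your team has already vetted.
VETTED_PACKAGES = {"requests", "numpy", "flask"}

def normalize(name: str) -> str:
    """PEP 503 normalization, as used by PyPI: lowercase the name
    and collapse runs of '-', '_', '.' into a single '-'."""
    return re.sub(r"[-_.]+", "-", name).lower()

def safe_to_install(name: str) -> bool:
    """Accept a chatbot-suggested package only if, after normalization,
    it appears on the vetted allowlist."""
    return normalize(name) in VETTED_PACKAGES

print(safe_to_install("Requests"))   # vetted name, case-insensitive match
print(safe_to_install("reqeusts"))   # lookalike/hallucinated name is rejected
```

A lookup like this catches exactly the failure mode the researchers describe: a plausible-looking name that no one on the team has ever actually reviewed.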


About bourbiza mohamed
