Google warns criminals are building and selling illicit AI tools – and the market is growing

  • AI tools are being purpose-built for criminals, a new GTIG report finds
  • These tools sidestep the guardrails designed to keep AI safe
  • ‘Just-in-time’ AI malware shows how criminals are evolving their techniques

Google’s Threat Intelligence Group (GTIG) has identified a worrying shift in how AI is being misused: it is no longer just making criminals more productive, but is now being purpose-built and deployed in active operations.

Its research found that large language models (LLMs) are increasingly being embedded in malware itself. One example is the ‘just-in-time’ AI malware PROMPTFLUX, which is written in VBScript and calls Gemini’s API to request ‘specific VBScript obfuscation and evasion techniques to facilitate “just-in-time” self-modification, likely to evade static signature-based detection’.



Source: Techradar
