AI-linked wallet drained via prompt injection in Bankr exploit

fiverr
Changelly


An AI-linked wallet associated with “Grok” was exploited on 4 May after an attacker used a prompt injection technique to trigger an unauthorized token transfer.

The attacker reportedly caused the wallet to send 3 billion DRB tokens, valued at roughly $155K–$180K at the time, via a command the system interpreted as legitimate.

Unlike typical exploits, the incident did not involve a smart contract vulnerability. Instead, it relied on manipulating how the AI interpreted user input.

The X account linked to the suspected attacker was later deleted, a common pattern seen in exploit cases following fund movements.

Binance

NFT unlock enabled full wallet permissions

The attack began when the attacker sent a Bankr Club Membership NFT to the wallet.

This NFT reportedly unlocked advanced tool permissions within the Bankr system, enabling the AI agent to perform actions such as transfers and swaps.

Once these permissions were active, the attacker moved to the next phase — crafting a malicious prompt.

Prompt injection triggered unauthorized transfer

According to available breakdowns, the attacker used a combination of:

  • social engineering
  • obfuscated instructions [including encoded or indirect commands]

The AI interpreted the prompt as a valid instruction and generated a transfer command.

That command was then executed via Bankr’s tooling, resulting in a standard ERC-20 transaction that moved the funds to an attacker-controlled wallet.

Bankr bot responding to the now-deleted malicious promtBankr bot responding to the now-deleted malicious promt
Source: X

The tokens were subsequently transferred again and rapidly sold.

Attack relied on AI behavior, not code flaws

This incident stands out because it did not exploit a vulnerability in smart contracts or blockchain infrastructure.

Instead, it targeted:

  • intent parsing
  • tool permission systems
  • AI decision-making layers

The exploit demonstrates how AI agents with execution capabilities can become vulnerable when user input is not properly constrained.

Funds partially recovered after public pressure

Following the incident, reports suggest that a large portion of the funds, estimated at 80% to 88%, was returned in ETH and USDC under public pressure.

The attacker’s associated social account was later deleted.

However, details around the recovery have not been fully verified through official statements at the time of writing.


Final Summary

  • An AI-linked wallet was drained of ~$170K after a prompt injection attack tricked the system into executing a token transfer via Bankr tools.
  • The incident highlights a new class of risk in crypto, where AI agents with wallet permissions can be exploited through manipulated inputs rather than code vulnerabilities.

 



Source link

BTCC

Be the first to comment

Leave a Reply

Your email address will not be published.


*