On February 21, 2025, Bybit lost 401,347 ETH — approximately $1.46 billion at execution price — in a single transaction. Not a smart contract exploit. Not a bridge attack. Not a flash loan cascade. A JavaScript injection into a web application that served a legitimate custody interface, targeted at a specific multisig address, designed to manipulate one signing ceremony.

The hardware wallets worked exactly as advertised. The Safe{Wallet} multisig contract worked exactly as designed. Three independent signers reviewed the transaction. All three approved it. All three approved the wrong transaction, and none of them had any way to know that.

The Setup

Bybit used Safe{Wallet} — the largest multisig custody platform in the space — for cold wallet management. Their configuration was textbook: a 3-of-N threshold, hardware wallets on every signer, geographic distribution. The setup reflects a genuine understanding of the threat model for custodial operations. No single signer could authorize a transaction. No single location compromise could drain the wallet. The model assumes adversaries attack individual keys or individual signers. It was well-reasoned. It was also wrong about where the actual attack surface was.

Lazarus Group — operating as TraderTraitor, attributed by multiple intelligence agencies — didn’t attack keys or signers. They attacked the interface.

The Compromise

Lazarus gained access to a Safe{Wallet} developer’s machine or credentials. The specific path isn’t fully public, but the outcome is documented: they obtained write access to the AWS S3 bucket serving the production JavaScript for app.safe.global — Safe’s web interface, the application every user runs in their browser to construct and sign transactions.

What makes this work is how web application delivery functions. JavaScript served from a CDN or object storage has no code signing requirement enforced by the browser. The browser fetches whatever the server returns and executes it. There is no mechanism analogous to a mobile app’s cryptographic signature check. An attacker with write access to the serving infrastructure has arbitrary code execution in every session that loads the page.

The injected code was targeted. It didn’t fire for all Safe users — that would have been detected quickly. It checked for Bybit’s specific multisig address. It was built on reconnaissance: Bybit’s wallet address, their signing threshold, their operational patterns. Keeping the detection surface small was deliberate design.

The Transaction Manipulation

Safe constructs transactions using EIP-712 structured data signing. When a transaction is initiated — move ETH to address X, call function Y — the interface builds a typed data structure encoding the destination address, the value, the calldata, the safe address, the chain ID, the nonce. That structure gets hashed and presented to each signer’s hardware wallet for approval.
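The structure above can be sketched in miniature. The field names follow Safe's published SafeTx struct; everything else is a stand-in. In particular, real EIP-712 hashing is keccak256 over ABI-encoded type and struct hashes, not sha3_256 over JSON, and the addresses below are hypothetical placeholders:

```python
import hashlib
import json

# Field layout of Safe's EIP-712 "SafeTx" struct plus its signing domain.
# Hashing here is a stand-in (sha3_256 over canonical JSON); real EIP-712
# hashing is keccak256 over ABI-encoded typeHash and struct data.
def safe_tx_digest(domain: dict, message: dict) -> str:
    canonical = json.dumps({"domain": domain, "message": message},
                           sort_keys=True, separators=(",", ":"))
    return hashlib.sha3_256(canonical.encode()).hexdigest()

domain = {"chainId": 1, "verifyingContract": "0xSafeAddress"}  # placeholder
intended = {
    "to": "0xIntendedDestination",  # placeholder, not a real address
    "value": 10**18,                # 1 ETH in wei
    "data": "0x",
    "operation": 0,                 # 0 = CALL, 1 = DELEGATECALL
    "safeTxGas": 0, "baseGas": 0, "gasPrice": 0,
    "gasToken": "0x" + "00" * 20,
    "refundReceiver": "0x" + "00" * 20,
    "nonce": 42,
}
tampered = dict(intended, to="0xAttackerAddress", operation=1)

# Any field swap changes the digest the hardware wallet is asked to sign.
assert safe_tx_digest(domain, intended) != safe_tx_digest(domain, tampered)
```

The point of the sketch: any field swap, including the destination and operation fields Lazarus changed, produces a different digest, which is why an independently computed hash is a meaningful check.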

The malicious JavaScript intercepted the construction step. When it detected Bybit’s address initiating a transaction, it replaced:

  • The to field: Bybit’s intended destination replaced with a Lazarus-controlled address
  • The calldata: the original call replaced with a delegatecall to an attacker-deployed contract

The delegatecall target was a contract containing a sweepETH() function. Executing it in the context of the Safe via delegatecall meant it ran with the Safe’s identity and permissions — not an external call to a contract, but code executing as if it were the Safe itself. The Safe’s entire balance was the scope of the sweep.
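The call-versus-delegatecall distinction can be modeled in a few lines. This is a toy model, not EVM semantics; Contract, sweep_eth, and every name involved are hypothetical stand-ins, but the context difference is the point: delegatecall runs foreign code against the caller's own state.

```python
# Minimal model of CALL vs DELEGATECALL execution context. "balance" here
# stands in for contract state; real EVM semantics are far richer.
class Contract:
    def __init__(self, name: str, balance: int = 0):
        self.name = name
        self.balance = balance

def sweep_eth(ctx: Contract, attacker: Contract) -> None:
    # Attacker-deployed logic: drain whatever context it executes in.
    stolen, ctx.balance = ctx.balance, 0
    attacker.balance += stolen

safe = Contract("Safe", balance=401_347)      # ETH figure from the incident
attacker_logic = Contract("AttackerLogic")
attacker_eoa = Contract("AttackerEOA")

# CALL: code runs in the target's own context; the Safe is untouched.
sweep_eth(attacker_logic, attacker_eoa)
assert safe.balance == 401_347

# DELEGATECALL: the same code, but the execution context is the Safe itself.
sweep_eth(safe, attacker_eoa)
assert safe.balance == 0 and attacker_eoa.balance == 401_347
```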

The injected JS also controlled the UI display. The address and amount shown in the browser, the confirmation dialogs, the transaction summary — all of that is rendered by the same JavaScript that constructed the manipulated payload. Signers saw a routine ETH transfer. They saw Bybit’s intended destination. They saw the expected amount. They were looking at a fabricated summary backed by a different transaction.

Why the Ledgers Didn’t Help

Each signer used a Ledger hardware wallet. This is where the misconception lives, and it’s worth being precise about what hardware wallets actually do.

A hardware wallet protects the signing key. It ensures the private key never leaves the device. It ensures signing requires physical confirmation. It prevents an attacker who compromises your machine from exfiltrating your key and signing transactions unattended. Within those guarantees, it works.

What it cannot do is validate that the transaction you’re being asked to sign is the transaction you intended to sign. That validation requires trusting something: either the application that constructed the payload, or an independent verification path that doesn’t share dependencies with the application.

Ledger renders EIP-712 structured data. Given a typed payload, it shows the decoded fields. If the payload says to: 0xAttackerAddress, data: delegatecall(sweepETH), that’s what Ledger shows — but signing ceremonies in production environments are routine operations. Signers are verifying that a transaction is happening, confirming an amount and a destination that matches what they were told to expect. They’re not decoding calldata on a hardware wallet’s small display and auditing the call stack. That’s not a failure of discipline; it’s the operational reality of any organization that moves funds more than once a week.

The hardware wallet security boundary is between the key and the network. The Bybit signers were on the wrong side of the boundary that was compromised.

What a Defense-in-Depth Ceremony Looks Like

The structural problem is that all three signers used the same web interface to construct and review the same transaction. The web interface was the single point of failure. Three independent signers, one shared dependency.

Defense-in-depth for multisig ceremonies means eliminating shared dependencies in the critical path:

Offline transaction construction. Build the transaction payload outside the browser. Safe’s CLI tools and the safe-eth-py library allow constructing the EIP-712 payload locally, without loading the web interface. The payload can be exported, reviewed, and distributed out-of-band.
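A minimal sketch of that workflow, stdlib only. The real tooling would be Safe's CLI or safe-eth-py; here the payload is plain data with hypothetical values, exported alongside a short fingerprint that signers can compare over an independent channel:

```python
import hashlib
import json

# Sketch of offline construction: build the SafeTx payload as plain data,
# outside any browser session, and export it for out-of-band review. Field
# names follow Safe's SafeTx struct; the address is a placeholder.
tx = {
    "to": "0xIntendedDestination",   # placeholder, not a real address
    "value": str(10**18),            # 1 ETH in wei
    "data": "0x",
    "operation": 0,                  # 0 = CALL; a 1 here is a delegatecall
    "nonce": 42,
}
blob = json.dumps(tx, sort_keys=True, indent=2)

# A short fingerprint signers can read to each other over an independent
# channel (phone, a chat on a different device) before signing anything.
fingerprint = hashlib.sha256(blob.encode()).hexdigest()[:16]

assert json.loads(blob)["operation"] == 0
```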

Independent calldata verification. Before signing, each signer independently decodes the calldata. Not reading the summary. Not trusting the display. Decoding the hex. A delegatecall in calldata is not ambiguous when you look at the raw encoding — but you have to look. Tools like cast from the Foundry suite or eth-abi can decode it in seconds. The question is whether the ceremony protocol requires it.
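A decoder for that check can be small. The sketch below reads the head of a Safe execTransaction call using the standard library alone; 0x6a761202 is execTransaction's published 4-byte selector, while the calldata blob and addresses are synthetic, built only for the demonstration:

```python
# Decode the head of a Safe execTransaction(...) call from raw calldata.
# After the 4-byte selector, arguments are 32-byte ABI words. A ceremony
# check needs three facts: where is it going, how much, and whether the
# operation word is 1 (delegatecall).
def decode_exec_transaction(calldata_hex: str) -> dict:
    raw = bytes.fromhex(calldata_hex.removeprefix("0x"))
    assert raw[:4] == bytes.fromhex("6a761202"), "not execTransaction"
    word = lambda i: raw[4 + 32 * i : 4 + 32 * (i + 1)]
    return {
        "to": "0x" + word(0)[-20:].hex(),
        "value": int.from_bytes(word(1), "big"),
        "operation": int.from_bytes(word(3), "big"),  # 0=CALL, 1=DELEGATECALL
    }

# Synthetic calldata for demonstration (hypothetical attacker address).
to = bytes.fromhex("aa" * 20)
head = [
    to.rjust(32, b"\x00"),          # to
    (10**18).to_bytes(32, "big"),   # value
    (320).to_bytes(32, "big"),      # offset of `data` (illustrative)
    (1).to_bytes(32, "big"),        # operation = 1 -> delegatecall
]
calldata = "0x6a761202" + b"".join(head).hex()

decoded = decode_exec_transaction(calldata)
assert decoded["operation"] == 1
assert decoded["to"] == "0x" + "aa" * 20
```

Word 3 of the head is the operation byte; a 1 there means delegatecall, which for an exchange's cold wallet should end the ceremony regardless of what the UI shows.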

Blind signing disabled. Most hardware wallet firmware supports disabling blind signing — signing a hash without fully decoding the payload. If EIP-712 data can’t be fully rendered on the device, the device should reject the signing request rather than proceeding with a warning. This trades operational friction for a harder guarantee.

Cross-referencing transaction hashes. Before the third signature, the expected transaction hash should be verified by at least one signer using an independent data source — not the same web session, not the same machine. Recomputing the hash, or simulating the transaction, against a local or independently run node catches payload manipulation before execution.
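The comparison step itself is trivial; the sketch below assumes the two hash strings arrive from paths that share no infrastructure, which is the entire value of the check:

```python
import hmac

# Compare the hash the web session displays against the hash recomputed
# from an independent source (a local node, an offline CLI construction).
# Normalization handles casing and the optional 0x prefix.
def hashes_match(ui_reported: str, independently_computed: str) -> bool:
    norm = lambda h: h.lower().removeprefix("0x")
    return hmac.compare_digest(norm(ui_reported), norm(independently_computed))

assert hashes_match("0xDEADBEEF", "deadbeef")
assert not hashes_match("0xdeadbeef", "0xdeadbeee")
```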

None of this is exotic. All of it adds friction. The friction is the point.

The Structural Argument

The multisig model is designed to eliminate single points of failure in the key management layer. Bybit’s setup did that. What the model doesn’t account for is a compromised common dependency that sits upstream of all signers simultaneously.

Every signing ceremony that runs through a browser-based interface shares that interface as a dependency. If the interface is compromised, every signer in the ceremony is operating on attacker-controlled data regardless of how their keys are stored. The hardware wallet is a very good lock on a door; the attack came through a window nobody was watching.

Lazarus didn’t need a novel cryptographic primitive. They needed write access to an S3 bucket and operational intelligence about one multisig address. The extraordinary thing about this theft isn’t the sophistication. It’s the scale — $1.5 billion moved in a single transaction — and what that scale implies about how many other multisig ceremonies are running the same architecture.

Safe{Wallet} disclosed the compromise, paused the service, and confirmed the S3 bucket modification. The malicious JavaScript was targeted; keeping the blast radius small was deliberate, delaying detection until after the transaction executed. The attack was operationally patient and technically minimal.

That combination — patience, reconnaissance, minimal footprint, single targeted action — is what $1.5B looks like when the hard part isn’t cryptography.


PGP signature: bybit-safe-ui-poisoning-fifteen-hundred-million.md.asc — Key fingerprint: 5FD2 1B4F E7E4 A3CA 7971 CB09 DE66 3978 8E09 1026