
[Bug] debug_trace_call not giving storage on an ERC20 balanceOf call #1347

Closed
NicolasWent opened this issue Sep 24, 2024 · 3 comments · Fixed by #1436
Labels
bug Something isn't working

Comments

@NicolasWent
Contributor

Component

provider, pubsub, rpc

What version of Alloy are you on?

alloy: v0.3.6 | alloy-core: v0.8.3

Operating System

Linux

Describe the bug

The issue

I am doing a debug_trace_call in order to debug an eth_call of balanceOf on an ERC20 token.

The goal is to get a storage slot at the end (I can then use that storage slot to set an ERC20 token balance in Anvil).

The issue I encounter is that no storage slot at all is reported as accessed when I run debug_trace_call on a balanceOf call, which shouldn't happen, as this call must access storage at some point.
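
For context on that end goal: once a slot is recovered from the trace, it can be written directly on an Anvil fork via the anvil_setStorageAt cheat-code RPC. The following is only a rough sketch of that step, assuming a local Anvil endpoint on the default port and a boolean acknowledgement from the node; it is not part of the minimal repro below.

use alloy_primitives::{Address, B256, U256};
use alloy_provider::{Provider, ProviderBuilder};
use eyre::Result;

/// Overwrite the raw storage word holding a token balance on an Anvil fork.
/// Sketch only: the endpoint URL and the boolean acknowledgement are assumptions.
async fn set_balance_slot(token: Address, slot: B256, new_balance: U256) -> Result<()> {
    // Assumes Anvil is listening locally on its default port.
    let anvil = ProviderBuilder::new().on_http("http://127.0.0.1:8545".parse()?);

    // anvil_setStorageAt(token, slot, value) writes the 32-byte word verbatim.
    let _ok: bool = anvil
        .raw_request(
            "anvil_setStorageAt".into(),
            (token, slot, B256::from(new_balance.to_be_bytes::<32>())),
        )
        .await?;

    Ok(())
}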

Minimal example to reproduce

Here is a minimal repo reproducing this bug.

Additionally, here is the code:

main.rs
use std::env;

use alloy_primitives::{address, Address, Bytes, B256};
use alloy_provider::{ext::DebugApi, ProviderBuilder};
use alloy_rpc_types::{
    trace::geth::{
        GethDebugTracerConfig, GethDebugTracingCallOptions, GethDebugTracingOptions,
        GethDefaultTracingOptions, GethTrace,
    },
    BlockId, TransactionRequest,
};
use alloy_sol_types::{sol, SolCall};
use dotenv::dotenv;
use eyre::Result;

// WETH token address
const WETH: Address = address!("C02aaA39b223FE8D0A0e5C4F27eAD9083C756Cc2");
// The address of titan builder (just a random address to get the balance from)
const TITAN_BUILDER: Address = address!("4838B106FCe9647Bdf1E7877BF73cE8B0BAD5f97");

sol!(
    #[sol(rpc)]
    IErc20,
    "src/erc20_abi.json"
);

#[tokio::main]
async fn main() -> Result<()> {
    dotenv().ok();
    let provider = ProviderBuilder::new().on_http(
        env::var("HTTP_RPC")
            .expect("Missing HTTP_RPC environment variable")
            .parse()
            .expect("HTTP_RPC must be a valid HTTP RPC url"),
    );

    let debug_trace = provider
        .debug_trace_call(
            TransactionRequest::default().to(WETH).input(
                Bytes::from(
                    IErc20::balanceOfCall {
                        _owner: TITAN_BUILDER,
                    }
                    .abi_encode(),
                )
                .into(),
            ),
            BlockId::latest(),
            GethDebugTracingCallOptions {
                tracing_options: GethDebugTracingOptions {
                    config: GethDefaultTracingOptions {
                        enable_memory: Some(true),
                        disable_memory: None,
                        disable_stack: Some(false),
                        disable_storage: Some(false),
                        enable_return_data: Some(true),
                        disable_return_data: None,
                        debug: Some(true),
                        limit: Some(0),
                    },
                    tracer: None,
                    tracer_config: GethDebugTracerConfig::default(),
                    timeout: None,
                },
                state_overrides: None,
                block_overrides: None,
            },
        )
        .await?;

    if let GethTrace::Default(default_trace) = debug_trace {
        let mut ret: Option<B256> = None;
        for struct_log in default_trace.struct_logs {
            // Remember the last storage key seen anywhere in the trace.
            if let Some(storage) = struct_log.storage {
                for key in storage.keys() {
                    ret = Some(*key);
                }
            }
        }

        if ret.is_none() {
            panic!("balanceOfCall didn't gave us any right storage slot for token {WETH:?} and wallet {TITAN_BUILDER:?}");
        } else {
            println!("Storage slot is: {}", ret.unwrap());
        }
    } else {
        panic!("Trace should be default, got {debug_trace:#?}");
    }

    Ok(())
}
Cargo.toml
[package]
name = "debug-trace-bug"
version = "0.1.0"
edition = "2021"

[dependencies]
tokio = { version = "1.39", features = ["full"] }
dotenv = "0.15"
eyre = "0.6"

alloy-primitives = { version = "0.8" }
alloy-sol-types = { version = "0.8", features = ["json"] }

alloy-contract = "0.3"
alloy-provider = { version = "0.3", features = [
    "debug-api",
] }
alloy-rpc-types = { version = "0.3", features = ["trace"] }

Minimal example output

Here is the actual output:

thread 'main' panicked at src/main.rs:84:13:
balanceOfCall didn't gave us any right storage slot for token 0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2 and wallet 0x4838b106fce9647bdf1e7877bf73ce8b0bad5f97
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace

It didn't give me any storage.

I have also tried to print all of the opcodes and got these:

Opcodes
PUSH1
PUSH1
MSTORE
PUSH1
CALLDATASIZE
LT
PUSH2
JUMPI
PUSH1
CALLDATALOAD
PUSH29
SWAP1
DIV
PUSH4
AND
DUP1
PUSH4
EQ
PUSH2
JUMPI
DUP1
PUSH4
EQ
PUSH2
JUMPI
DUP1
PUSH4
EQ
PUSH2
JUMPI
DUP1
PUSH4
EQ
PUSH2
JUMPI
DUP1
PUSH4
EQ
PUSH2
JUMPI
DUP1
PUSH4
EQ
PUSH2
JUMPI
DUP1
PUSH4
EQ
PUSH2
JUMPI
JUMPDEST
CALLVALUE
ISZERO
PUSH2
JUMPI
JUMPDEST
PUSH2
PUSH1
DUP1
DUP1
CALLDATALOAD
PUSH20
AND
SWAP1
PUSH1
ADD
SWAP1
SWAP2
SWAP1
POP
POP
PUSH2
JUMP
JUMPDEST
PUSH1
PUSH1
MSTORE
DUP1
PUSH1
MSTORE
PUSH1
PUSH1
KECCAK256
PUSH1
SWAP2
POP
SWAP1
POP
SLOAD
DUP2
JUMP
JUMPDEST
PUSH1
MLOAD
DUP1
DUP3
DUP2
MSTORE
PUSH1
ADD
SWAP2
POP
POP
PUSH1
MLOAD
DUP1
SWAP2
SUB
SWAP1
RETURN

At no point does this show an SLOAD operation, which is very weird, since balanceOf should always access storage at least once.
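
As a sanity check against the trace, the slot that balanceOf should touch can be computed offline: for a Solidity mapping(address => uint256) declared at slot index p, the balance slot of owner is keccak256(pad32(owner) ++ pad32(p)). The sketch below assumes the WETH9 balance mapping sits at slot index 3; that slot index is an assumption about the contract layout, not something taken from the trace.

use alloy_primitives::{address, keccak256, Address, B256, U256};

/// Compute the storage slot of `mapping(address => uint256)[owner]`
/// for a mapping declared at `base_slot`.
fn mapping_slot(owner: Address, base_slot: u64) -> B256 {
    let mut preimage = [0u8; 64];
    // First word: the address key, left-padded to 32 bytes.
    preimage[12..32].copy_from_slice(owner.as_slice());
    // Second word: the mapping's declaration slot.
    preimage[32..64].copy_from_slice(&U256::from(base_slot).to_be_bytes::<32>());
    keccak256(preimage)
}

fn main() {
    let titan_builder = address!("4838B106FCe9647Bdf1E7877BF73cE8B0BAD5f97");
    // Expected WETH balance slot for this wallet, assuming the mapping is at slot 3.
    println!("expected slot: {}", mapping_slot(titan_builder, 3));
}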

Additional details

I would also like to point out that this code is actually a few months old; a few months ago it was working fine and giving me the storage slot, but today I tried to run my tests again and they are no longer passing.

Node version (HTTP_RPC):

reth Version: 1.0.7
Commit SHA: 75b7172cf77eb4fd65fe1a6924f75066fb09fcd1
Build Timestamp: 2024-09-19T18:05:21.642892157Z
Build Features: asm_keccak,jemalloc
Build Profile: maxperf

Lighthouse version:

Lighthouse v5.3.0-d6ba8c3
BLS library: blst-portable
BLS hardware acceleration: true
SHA256 hardware acceleration: true
Allocator: jemalloc
Profile: maxperf
Specs: mainnet (true), minimal (false), gnosis (true)
NicolasWent added the bug (Something isn't working) label on Sep 24, 2024
@DaniPopes
Member

DaniPopes commented Sep 26, 2024

On Alchemy mainnet:

Error: HTTP error 400 with body: {"jsonrpc":"2.0","id":0,"error":{"code":-32602,"message":"invalid 3rd argument: options object was missing 'tracer' element"}}

Serialized request:

{
    "method": "debug_traceCall",
    "params": [
        {
            "to": "0xC02aaA39b223FE8D0A0e5C4F27eAD9083C756Cc2",
            "input": "0x70a082310000000000000000000000004838b106fce9647bdf1e7877bf73ce8b0bad5f97"
        },
        "latest",
        {
            "enableMemory": true,
            "disableStack": false,
            "disableStorage": false,
            "enableReturnData": true,
            "debug": true,
            "limit": 0
        }
    ],
    "id": 0,
    "jsonrpc": "2.0"
}

cc @mattsse

@NicolasWent
Contributor Author

NicolasWent commented Oct 4, 2024

Yes, sometimes it also says that on other nodes, but on reth it doesn't; it is weird :(

It is as if this RPC call behaves differently depending on the node used.

@mattsse
Member

mattsse commented Oct 5, 2024

Which tracer do you want to use?
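
For reference, one option is to pin the built-in prestate tracer explicitly; that both satisfies providers such as Alchemy that reject a missing tracer element and returns the touched accounts and storage slots directly. This is only a sketch based on the alloy-rpc-types 0.3 type names used above, not a statement about what the reporter actually needs.

use alloy_rpc_types::trace::geth::{
    GethDebugBuiltInTracerType, GethDebugTracerType, GethDebugTracingCallOptions,
    GethDebugTracingOptions,
};

/// Build call options that explicitly select the built-in prestate tracer.
fn prestate_call_options() -> GethDebugTracingCallOptions {
    GethDebugTracingCallOptions {
        tracing_options: GethDebugTracingOptions {
            tracer: Some(GethDebugTracerType::BuiltInTracer(
                GethDebugBuiltInTracerType::PreStateTracer,
            )),
            ..Default::default()
        },
        ..Default::default()
    }
}

// With these options, debug_trace_call should come back as
// GethTrace::PreStateTracer(frame), listing the accounts and storage
// slots read during the call.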
