Scroll evm executor/v35 #1

Closed. Wants to merge 33 commits.

Changes from 26 commits

Commits (33)
2ef1c0c
fix modexp and pairing gas
lispc Aug 30, 2023
098f35e
use poseidon
lightsing Mar 20, 2024
ae94add
port changes
lightsing Apr 17, 2024
301846b
add scroll feature
lightsing Apr 17, 2024
391421f
add scroll handler
lightsing Apr 17, 2024
9698f98
Merge branch 'sse/v26-base' into sse/v26
lightsing Apr 17, 2024
03990e9
Merge branch 'see/v27-base' into see/v27
lightsing Apr 17, 2024
66a396a
Merge branch 'see/v28-base' into see/v28
lightsing Apr 17, 2024
63d75b2
Merge branch 'see/v29-base' into see/v29
lightsing Apr 17, 2024
c98e0cb
Merge branch 'see/v30-base' into see/v30
lightsing Apr 17, 2024
131facf
Merge branch 'see/v31-base' into see/v31
lightsing Apr 17, 2024
d0c493f
Merge branch 'see/v32-base' into see/v32
lightsing Apr 17, 2024
66e9e83
Merge branch 'see/v34-base' into see/v34
lightsing Apr 17, 2024
290b5a3
Merge branch 'see/v35-base' into see/v35
lightsing Apr 17, 2024
c9858f7
impl l1fee
lightsing Apr 17, 2024
337a2f8
disable some precompiles
lightsing Apr 17, 2024
591ee51
disable SELFDESTRUCT
lightsing Apr 17, 2024
476d7ed
BLOCKHASH
lightsing Apr 18, 2024
fddc9dd
fix modexp
lightsing Apr 18, 2024
91433b3
fix modexp
lightsing Apr 18, 2024
f6d4bd8
fix poseidon hash
lightsing Apr 18, 2024
d06633a
cleanup
lightsing Apr 18, 2024
c554acf
use keccak code hash for evm
lightsing Apr 22, 2024
06857da
disable BASEFEE
lightsing Apr 22, 2024
61d8ca7
handle l1 tx
lightsing Apr 22, 2024
b57e7d9
add NotImplemented error
lightsing Apr 22, 2024
9ecb479
fix pairing
lightsing Apr 26, 2024
74c5a77
fix bins
lightsing Apr 29, 2024
dae1ae4
remove halo2proofs dep
lightsing Apr 29, 2024
e1e8f7a
update poseidon dep
lightsing Apr 30, 2024
1d44eb0
[fix] ci (#4)
lightsing May 10, 2024
abb93af
add faster-hash gate (#3)
lightsing May 10, 2024
09daf95
fix: extcodesize regression (#5)
lightsing May 26, 2024
249 changes: 226 additions & 23 deletions Cargo.lock

Large diffs are not rendered by default.

3 changes: 3 additions & 0 deletions Cargo.toml
@@ -22,3 +22,6 @@ debug = true
[profile.ethtests]
inherits = "test"
opt-level = 3

[patch."https://github.com/privacy-scaling-explorations/halo2.git"]
halo2_proofs = { git = "https://github.com/scroll-tech/halo2.git", branch = "v1.1" }
Author: hahah so we need scroll-tech/poseidon-circuit#28; then we can use only the poseidon primitive crate and no longer need halo2_proofs. Quite confusing.

Member: Re-exporting a peer dependency is a good pattern.

Author: I mean we should get rid of halo2_proofs in the future. How? Split the poseidon-circuit crate into two: one focused on hash computation, with no need for halo2, and one focused on the hashing circuit, which needs halo2. Here we only need the former.
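
To make the proposed split concrete, here is a minimal, self-contained Rust sketch of the hashing-only half. The names (PoseidonCodeHasher, DummyPoseidon) are invented for illustration and the hash body is a deterministic placeholder, not the real Poseidon permutation; the point is only that the public API needs no halo2 types.

/// The interface revm-primitives would actually need from a hashing-only
/// crate: plain Poseidon hashing of bytecode, with no circuit machinery.
pub trait PoseidonCodeHasher {
    /// Hash arbitrary bytes and return a 32-byte digest.
    fn hash(&self, code: &[u8]) -> [u8; 32];
}

/// Placeholder so the sketch compiles; a real split-out crate would provide
/// the actual Poseidon sponge here.
pub struct DummyPoseidon;

impl PoseidonCodeHasher for DummyPoseidon {
    fn hash(&self, code: &[u8]) -> [u8; 32] {
        // Not a real Poseidon hash -- just a deterministic stand-in.
        let mut out = [0u8; 32];
        for (i, b) in code.iter().enumerate() {
            out[i % 32] ^= *b;
        }
        out
    }
}

fn main() {
    // Hash a few bytes of (fake) bytecode through the hashing-only interface.
    let digest = DummyPoseidon.hash(&[0x60, 0x00, 0x60, 0x00]);
    println!("{:02x?}", &digest[..4]);
}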

Re-exporting peer dependencies might seem like a shortcut to a clean and concise codebase. After all, it keeps your imports tidy and saves users from wrestling with dependency conflicts. But this convenience comes at a hidden cost, potentially leading you down a path of dependency-management nightmares.

  1. Loss of version control: Peer dependencies are meant to be managed by the consumer, not the provider. By re-exporting, you take control of a dependency whose version other packages may need to pin themselves. This can lead to conflicts and unexpected behavior when multiple packages using your library have different version requirements for the peer dependency.
    Imagine this: your library re-exports a peer dependency, say mathjs, at version 5.0.0. Package A, which uses your library, needs mathjs 4.2.1 for specific functionality. Package B, also using your library, needs the latest mathjs (6.0.0). This creates a conflict: Package A might break because it relies on features removed in 5.0.0, while Package B might malfunction due to incompatibilities introduced in 6.0.0.
  2. Fragile ecosystem: Re-exported dependencies become tightly coupled to your library's version. If an update to your library requires a newer version of the peer dependency, all your consumers are forced to upgrade as well, even if their code works perfectly with the older version. This creates a domino effect through your project's ecosystem.
    Think of it this way: you build a house on a foundation designed for a specific weight limit. By re-exporting a peer dependency as part of your house, you change the foundation's specifications. If you later decide the house needs a stronger foundation, every house built on your design (every consumer of your library) needs significant work, even if its original design was perfectly functional.
  3. The illusion of clarity: While re-exporting might seem to simplify your code on the surface, it actually obscures the true dependency picture. Consumers have no way of knowing which version of the peer dependency they are actually getting, or why it is there. This makes debugging and understanding code flow more difficult.
    Consider this analogy: you borrow a toolbox from a friend. They have added their own custom wrench but haven't mentioned it. When you use the wrench on a project and it breaks something, troubleshooting becomes harder because you weren't aware of the extra tool and its potential impact.
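
As a self-contained Rust illustration of the version conflict in point 1 (peer_dep_v1 and peer_dep_v2 are invented module names standing in for two semver-incompatible copies of a re-exported peer dependency that land in one build):

// Two modules simulating two incompatible versions of the same peer
// dependency inside a single dependency graph.
mod peer_dep_v1 {
    pub struct Hasher {
        pub rounds: u32,
    }
}

mod peer_dep_v2 {
    pub struct Hasher {
        pub rounds: u32,
    }
}

// "Library" code compiled against the version it re-exports (v2 here).
fn use_hasher(h: &peer_dep_v2::Hasher) -> u32 {
    h.rounds
}

fn main() {
    let ours = peer_dep_v1::Hasher { rounds: 8 };
    // Uncommenting the next call fails to compile with
    // "expected `peer_dep_v2::Hasher`, found `peer_dep_v1::Hasher`" --
    // the same mismatch a consumer hits when the re-exported version
    // differs from the one it declared itself.
    // use_hasher(&ours);
    let theirs = peer_dep_v2::Hasher { rounds: 8 };
    println!("rounds = {}", use_hasher(&theirs));
    let _ = ours.rounds;
}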

10 changes: 10 additions & 0 deletions crates/interpreter/Cargo.toml
@@ -40,6 +40,16 @@ negate-optimism-default-handler = [
"revm-primitives/negate-optimism-default-handler",
]

scroll = ["revm-primitives/scroll"]
# Scroll default handler enabled. The Scroll handler registers by default in EvmBuilder.
scroll-default-handler = [
"scroll",
"revm-primitives/scroll-default-handler",
]
negate-scroll-default-handler = [
"revm-primitives/negate-scroll-default-handler",
]

dev = [
"memory_limit",
"optional_balance_check",
4 changes: 4 additions & 0 deletions crates/interpreter/src/host.rs
@@ -31,6 +31,10 @@ pub trait Host {
/// Get code hash of `address` and if the account is cold.
fn code_hash(&mut self, address: Address) -> Option<(B256, bool)>;

#[cfg(feature = "scroll")]
/// Get keccak code hash of `address` and if the account is cold.
fn keccak_code_hash(&mut self, address: Address) -> Option<(B256, bool)>;

/// Get storage value of `address` at `index` and if the account is cold.
fn sload(&mut self, address: Address, index: U256) -> Option<(U256, bool)>;

16 changes: 16 additions & 0 deletions crates/interpreter/src/host/dummy.rs
@@ -5,6 +5,9 @@ use crate::{
};
use std::vec::Vec;

#[cfg(feature = "scroll")]
use revm_primitives::POSEIDON_EMPTY;

/// A dummy [Host] implementation.
#[derive(Clone, Debug, Default, PartialEq, Eq)]
pub struct DummyHost {
@@ -64,10 +67,23 @@ impl Host for DummyHost {
}

#[inline]
#[cfg(not(feature = "scroll"))]
fn code_hash(&mut self, __address: Address) -> Option<(B256, bool)> {
Some((KECCAK_EMPTY, false))
}

#[inline]
#[cfg(feature = "scroll")]
fn code_hash(&mut self, __address: Address) -> Option<(B256, bool)> {
Some((POSEIDON_EMPTY, false))
}

#[inline]
#[cfg(feature = "scroll")]
fn keccak_code_hash(&mut self, __address: Address) -> Option<(B256, bool)> {
Some((KECCAK_EMPTY, false))
}

Using both feature and not(feature) seems like an antipattern; it will be a drag on long-term maintenance.

#[inline]
fn code_hash(&mut self, __address: Address) -> Option<(B256, bool)> {
    // Scroll reports the Poseidon hash of empty code; all other builds keep keccak.
    #[cfg(feature = "scroll")]
    return Some((POSEIDON_EMPTY, false));

    #[cfg(not(feature = "scroll"))]
    Some((KECCAK_EMPTY, false))
}

This also removes the unneeded cfg tags on top of the code that calls code_hash.

#[inline]
fn sload(&mut self, __address: Address, index: U256) -> Option<(U256, bool)> {
match self.storage.entry(index) {
25 changes: 23 additions & 2 deletions crates/interpreter/src/instructions/host.rs
@@ -72,7 +72,13 @@ pub fn extcodesize<H: Host, SPEC: Spec>(interpreter: &mut Interpreter, host: &mu
pub fn extcodehash<H: Host, SPEC: Spec>(interpreter: &mut Interpreter, host: &mut H) {
check!(interpreter, CONSTANTINOPLE);
pop_address!(interpreter, address);
let Some((code_hash, is_cold)) = host.code_hash(address) else {

#[cfg(not(feature = "scroll"))]
let result = host.code_hash(address);
#[cfg(feature = "scroll")]
let result = host.keccak_code_hash(address);

let Some((code_hash, is_cold)) = result else {
interpreter.instruction_result = InstructionResult::FatalExternalError;
return;
};
@@ -120,14 +126,23 @@ pub fn extcodecopy<H: Host, SPEC: Spec>(interpreter: &mut Interpreter, host: &mu
.set_data(memory_offset, code_offset, len, code.bytes());
}

pub fn blockhash<H: Host>(interpreter: &mut Interpreter, host: &mut H) {
pub fn blockhash<H: Host, SPEC: Spec>(interpreter: &mut Interpreter, host: &mut H) {
gas!(interpreter, gas::BLOCKHASH);
pop_top!(interpreter, number);

if let Some(diff) = host.env().block.number.checked_sub(*number) {
let diff = as_usize_saturated!(diff);
// blockhash should push zero if number is same as current block number.
if diff <= BLOCK_HASH_HISTORY && diff != 0 {
#[cfg(feature = "scroll")]
if SPEC::enabled(BERNOULLI) {
let number64 = as_usize_or_fail!(interpreter, number);
let mut hasher = crate::primitives::Keccak256::new();
hasher.update(host.env().cfg.chain_id.to_be_bytes());
hasher.update(number64.to_be_bytes());
*number = U256::from_be_bytes(*hasher.finalize());
return;
}
let Some(hash) = host.block_hash(*number) else {
interpreter.instruction_result = InstructionResult::FatalExternalError;
return;
@@ -231,6 +246,12 @@ pub fn selfdestruct<H: Host, SPEC: Spec>(interpreter: &mut Interpreter, host: &m
check_staticcall!(interpreter);
pop_address!(interpreter, target);

#[cfg(feature = "scroll")]
if SPEC::enabled(BERNOULLI) {
interpreter.instruction_result = InstructionResult::NotActivated;
return;
}

let Some(res) = host.selfdestruct(interpreter.contract.address, target) else {
interpreter.instruction_result = InstructionResult::FatalExternalError;
return;
5 changes: 5 additions & 0 deletions crates/interpreter/src/instructions/host_env.rs
@@ -47,6 +47,11 @@ pub fn gasprice<H: Host>(interpreter: &mut Interpreter, host: &mut H) {

/// EIP-3198: BASEFEE opcode
pub fn basefee<H: Host, SPEC: Spec>(interpreter: &mut Interpreter, host: &mut H) {
#[cfg(feature = "scroll")]
if SPEC::enabled(BERNOULLI) {
interpreter.instruction_result = crate::InstructionResult::NotActivated;
return;
}
check!(interpreter, LONDON);
gas!(interpreter, gas::BASE);
push!(interpreter, host.env().block.basefee);
7 changes: 6 additions & 1 deletion crates/interpreter/src/instructions/opcode.rs
@@ -235,7 +235,7 @@ opcodes! {
0x3D => RETURNDATASIZE => system::returndatasize::<H, SPEC>,
0x3E => RETURNDATACOPY => system::returndatacopy::<H, SPEC>,
0x3F => EXTCODEHASH => host::extcodehash::<H, SPEC>,
0x40 => BLOCKHASH => host::blockhash,
0x40 => BLOCKHASH => host::blockhash::<H, SPEC>,
0x41 => COINBASE => host_env::coinbase,
0x42 => TIMESTAMP => host_env::timestamp,
0x43 => NUMBER => host_env::number,
@@ -947,6 +947,11 @@ pub const fn spec_opcode_gas(spec_id: SpecId) -> &'static [OpInfo; 256] {
const TABLE: &[OpInfo;256] = &make_gas_table(SpecId::ECOTONE);
TABLE
}
#[cfg(feature = "scroll")]
SpecId::BERNOULLI => {
const TABLE: &[OpInfo;256] = &make_gas_table(SpecId::BERNOULLI);
TABLE
}
}
};
}
28 changes: 27 additions & 1 deletion crates/interpreter/src/interpreter/analysis.rs
@@ -6,6 +6,9 @@ use crate::primitives::{
use core::fmt;
use std::sync::Arc;

#[cfg(feature = "scroll")]
use crate::primitives::{poseidon, POSEIDON_EMPTY};

/// Perform bytecode analysis.
///
/// The analysis finds and caches valid jump destinations for later execution as an optimization step.
@@ -124,8 +127,9 @@ impl BytecodeLocked {
self.original_len == 0
}

/// Calculate hash of the bytecode.
/// Calculate poseidon hash of the bytecode.
#[inline]
#[cfg(not(feature = "scroll"))]
pub fn hash_slow(&self) -> B256 {
if self.is_empty() {
KECCAK_EMPTY
@@ -134,6 +138,28 @@
}
}

/// Calculate poseidon hash of the bytecode.
#[inline]
#[cfg(feature = "scroll")]
pub fn poseidon_hash_slow(&self) -> B256 {
if self.is_empty() {
POSEIDON_EMPTY
} else {
poseidon(self.original_bytecode_slice())
}
}

/// Calculate keccak hash of the bytecode.
#[inline]
#[cfg(feature = "scroll")]
pub fn keccak_hash_slow(&self) -> B256 {
if self.is_empty() {
KECCAK_EMPTY
} else {
keccak256(self.original_bytecode_slice())
}
}

#[inline]
pub fn unlock(self) -> Bytecode {
Bytecode {
10 changes: 10 additions & 0 deletions crates/precompile/Cargo.toml
@@ -60,6 +60,16 @@ negate-optimism-default-handler = [
"revm-primitives/negate-optimism-default-handler",
]

scroll = ["revm-primitives/scroll"]
# Scroll default handler enabled. The Scroll handler registers by default in EvmBuilder.
scroll-default-handler = [
"scroll",
"revm-primitives/scroll-default-handler",
]
negate-scroll-default-handler = [
"revm-primitives/negate-scroll-default-handler",
]

# These libraries may not work on all no_std platforms as they depend on C.

# Enables the KZG point evaluation precompile.
9 changes: 9 additions & 0 deletions crates/precompile/src/blake2.rs
@@ -1,12 +1,21 @@
use crate::{Error, Precompile, PrecompileResult, PrecompileWithAddress};
use revm_primitives::Bytes;

#[cfg(feature = "scroll")]
use revm_primitives::PrecompileError;

const F_ROUND: u64 = 1;
const INPUT_LENGTH: usize = 213;

pub const FUN: PrecompileWithAddress =
PrecompileWithAddress(crate::u64_to_address(9), Precompile::Standard(run));

#[cfg(feature = "scroll")]
pub const BERNOULLI: PrecompileWithAddress = PrecompileWithAddress(
crate::u64_to_address(9),
Precompile::Standard(|_input: &Bytes, _gas_limit: u64| Err(PrecompileError::NotImplemented)),
);

/// reference: <https://eips.ethereum.org/EIPS/eip-152>
/// input format:
/// [4 bytes for rounds][64 bytes for h][128 bytes for m][8 bytes for t_0][8 bytes for t_1][1 byte for f]
9 changes: 9 additions & 0 deletions crates/precompile/src/hash.rs
@@ -3,6 +3,9 @@ use crate::{Error, Precompile, PrecompileResult, PrecompileWithAddress};
use revm_primitives::Bytes;
use sha2::Digest;

#[cfg(feature = "scroll")]
use revm_primitives::PrecompileError;

pub const SHA256: PrecompileWithAddress =
PrecompileWithAddress(crate::u64_to_address(2), Precompile::Standard(sha256_run));

@@ -11,6 +14,12 @@ pub const RIPEMD160: PrecompileWithAddress = PrecompileWithAddress(
Precompile::Standard(ripemd160_run),
);

#[cfg(feature = "scroll")]
pub const RIPEMD160_BERNOULLI: PrecompileWithAddress = PrecompileWithAddress(
crate::u64_to_address(3),
Precompile::Standard(|_input: &Bytes, _gas_limit: u64| Err(PrecompileError::NotImplemented)),
);

/// See: <https://ethereum.github.io/yellowpaper/paper.pdf>
/// See: <https://docs.soliditylang.org/en/develop/units-and-global-variables.html#mathematical-and-cryptographic-functions>
/// See: <https://etherscan.io/address/0000000000000000000000000000000000000002>
27 changes: 27 additions & 0 deletions crates/precompile/src/lib.rs
@@ -64,6 +64,8 @@ impl Precompiles {
PrecompileSpecId::BYZANTIUM => Self::byzantium(),
PrecompileSpecId::ISTANBUL => Self::istanbul(),
PrecompileSpecId::BERLIN => Self::berlin(),
#[cfg(feature = "scroll")]
PrecompileSpecId::BERNOULLI => Self::bernoulli(),
PrecompileSpecId::CANCUN => Self::cancun(),
PrecompileSpecId::LATEST => Self::latest(),
}
@@ -156,6 +158,27 @@ impl Precompiles {
})
}

/// Returns precompiles for Scroll
#[cfg(feature = "scroll")]
pub fn bernoulli() -> &'static Self {
static INSTANCE: OnceBox<Precompiles> = OnceBox::new();
INSTANCE.get_or_init(|| {
let mut precompiles = Precompiles::default();
precompiles.extend([
secp256k1::ECRECOVER, // 0x01
hash::SHA256, // 0x02
hash::RIPEMD160_BERNOULLI, // 0x03
identity::FUN, // 0x04
modexp::BERNOULLI, // 0x05
bn128::add::ISTANBUL, // 0x06
bn128::mul::ISTANBUL, // 0x07
bn128::pair::ISTANBUL, // 0x08
blake2::BERNOULLI, // 0x09
]);
Box::new(precompiles)
})
}

/// Returns the precompiles for the latest spec.
pub fn latest() -> &'static Self {
Self::cancun()
@@ -230,6 +253,8 @@ pub enum PrecompileSpecId {
BYZANTIUM,
ISTANBUL,
BERLIN,
#[cfg(feature = "scroll")]
BERNOULLI,
CANCUN,
LATEST,
}
@@ -251,6 +276,8 @@ impl PrecompileSpecId {
BEDROCK | REGOLITH | CANYON => Self::BERLIN,
#[cfg(feature = "optimism")]
ECOTONE => Self::CANCUN,
#[cfg(feature = "scroll")]
BERNOULLI => Self::BERNOULLI,
}
}
}
31 changes: 31 additions & 0 deletions crates/precompile/src/modexp.rs
@@ -7,6 +7,9 @@ use aurora_engine_modexp::modexp;
use core::cmp::{max, min};
use revm_primitives::Bytes;

#[cfg(feature = "scroll")]
const SCROLL_LEN_LIMIT: U256 = U256::from_limbs([32, 0, 0, 0]);

pub const BYZANTIUM: PrecompileWithAddress = PrecompileWithAddress(
crate::u64_to_address(5),
Precompile::Standard(byzantium_run),
@@ -15,6 +18,12 @@ pub const BYZANTIUM: PrecompileWithAddress = PrecompileWithAddress(
pub const BERLIN: PrecompileWithAddress =
PrecompileWithAddress(crate::u64_to_address(5), Precompile::Standard(berlin_run));

#[cfg(feature = "scroll")]
pub const BERNOULLI: PrecompileWithAddress = PrecompileWithAddress(
crate::u64_to_address(5),
Precompile::Standard(bernoilli_run),
);

/// See: <https://eips.ethereum.org/EIPS/eip-198>
/// See: <https://etherscan.io/address/0000000000000000000000000000000000000005>
pub fn byzantium_run(input: &Bytes, gas_limit: u64) -> PrecompileResult {
@@ -29,6 +38,28 @@ pub fn berlin_run(input: &Bytes, gas_limit: u64) -> PrecompileResult {
})
}

#[cfg(feature = "scroll")]
pub fn bernoilli_run(input: &Bytes, gas_limit: u64) -> PrecompileResult {
let base_len = U256::from_be_bytes(right_pad_with_offset::<32>(input, 0).into_owned());
let exp_len = U256::from_be_bytes(right_pad_with_offset::<32>(input, 32).into_owned());
let mod_len = U256::from_be_bytes(right_pad_with_offset::<32>(input, 64).into_owned());

// modexp temporarily only accepts inputs of 32 bytes (256 bits) or less
if base_len > SCROLL_LEN_LIMIT {
return Err(Error::ModexpBaseOverflow);
}
if exp_len > SCROLL_LEN_LIMIT {
return Err(Error::ModexpExpOverflow);
}
if mod_len > SCROLL_LEN_LIMIT {
return Err(Error::ModexpModOverflow);
}

run_inner(input, gas_limit, 200, |a, b, c, d| {
berlin_gas_calc(a, b, c, d)
})
}

pub fn calculate_iteration_count(exp_length: u64, exp_highp: &U256) -> u64 {
let mut iteration_count: u64 = 0;

9 changes: 9 additions & 0 deletions crates/primitives/Cargo.toml
@@ -41,6 +41,10 @@ serde = { version = "1.0", default-features = false, features = [
"rc",
], optional = true }

# scroll
halo2_proofs = { git = "https://github.com/scroll-tech/halo2.git", branch = "v1.1", optional = true }
hash-circuit = { package = "poseidon-circuit", git = "https://github.com/scroll-tech/poseidon-circuit.git", branch = "scroll-dev-1201", optional = true }
Author: branch = "main"

Member: hash-circuit or halo2_proofs?

Author: hash-circuit


[build-dependencies]
hex = { version = "0.4", default-features = false }

@@ -71,6 +75,11 @@ optimism = []
optimism-default-handler = ["optimism"]
negate-optimism-default-handler = []

scroll = ["halo2_proofs", "hash-circuit"]
# Scroll default handler enabled. The Scroll handler registers by default in EvmBuilder.
scroll-default-handler = ["scroll"]
negate-scroll-default-handler = []

dev = [
"memory_limit",
"optional_balance_check",