Enables optional verification of Keccak tables #657

Merged · 72 commits · Oct 7, 2024
Commits
662c8db
refactoring
sai-deng Sep 21, 2024
d82b1a8
fix clippy
sai-deng Sep 21, 2024
4498870
fix clippy
sai-deng Sep 21, 2024
b14417f
wip
sai-deng Sep 22, 2024
eabcf5e
wip
sai-deng Sep 22, 2024
2ad3857
Merge branch 'develop' into sai/skip_table_in_root_circuit
sai-deng Sep 23, 2024
2b2e737
wip
sai-deng Sep 23, 2024
f63363b
cleanup
sai-deng Sep 23, 2024
cc9d41b
prove one segment
sai-deng Sep 23, 2024
fe88819
polish the code
sai-deng Sep 23, 2024
7ff2fb1
cleanup
sai-deng Sep 23, 2024
6ce3cbe
Revert "cleanup"
sai-deng Sep 23, 2024
8c940b0
add [ignore]
sai-deng Sep 23, 2024
d6af500
add challenger state check
sai-deng Sep 23, 2024
f5deedb
add more checks
sai-deng Sep 23, 2024
e39894a
cleanup
sai-deng Sep 23, 2024
948d742
add empty keccak tables bool
sai-deng Sep 24, 2024
49bb96b
enable no keccak recursion
sai-deng Sep 24, 2024
bbba820
refactor
sai-deng Sep 24, 2024
8c4a362
fix tests
sai-deng Sep 24, 2024
5cdb37d
rename
sai-deng Sep 24, 2024
2f0129e
fix pis
sai-deng Sep 24, 2024
36ace10
fix
sai-deng Sep 25, 2024
5b7f818
resolve conflicts
sai-deng Sep 25, 2024
56deaa3
done
sai-deng Sep 25, 2024
7a3fe08
polish the comments
sai-deng Sep 25, 2024
cfc6f24
Merge branch 'sai/conditionally_verify_in_root_circuit' into sai/enab…
sai-deng Sep 25, 2024
4d288f8
fix ci
sai-deng Sep 25, 2024
e6bed69
add comments
sai-deng Sep 25, 2024
0bd1448
fix ci
sai-deng Sep 25, 2024
3800986
update tests
sai-deng Sep 25, 2024
674485e
Merge branch 'sai/skip_table_in_root_circuit' into sai/conditionally_…
sai-deng Sep 25, 2024
daaabb6
Merge branch 'sai/conditionally_verify_in_root_circuit' into sai/enab…
sai-deng Sep 25, 2024
91dbeaa
fix ci
sai-deng Sep 25, 2024
bb79080
fix ci
sai-deng Sep 25, 2024
ed9637d
Merge branch 'develop' into sai/skip_table_in_root_circuit
sai-deng Sep 25, 2024
fd55d1a
fix ci
sai-deng Sep 25, 2024
6b27552
Merge branch 'sai/skip_table_in_root_circuit' into sai/conditionally_…
sai-deng Sep 25, 2024
ab2ec50
Merge branch 'sai/conditionally_verify_in_root_circuit' into sai/enab…
sai-deng Sep 25, 2024
4de4c64
empty payload and no keccak op check
sai-deng Sep 26, 2024
e34c59a
refactor
sai-deng Sep 26, 2024
be887e6
Update evm_arithmetization/src/fixed_recursive_verifier.rs
sai-deng Sep 30, 2024
d2ca0cc
Update evm_arithmetization/src/fixed_recursive_verifier.rs
sai-deng Sep 30, 2024
ce88ef8
empty payload and no keccak op check
sai-deng Sep 26, 2024
a23eaf0
fmt
sai-deng Sep 30, 2024
7d72eb9
Merge branch 'develop' into sai/skip_table_in_root_circuit
sai-deng Sep 30, 2024
baba9da
merge
sai-deng Sep 30, 2024
ec64af9
Merge branch 'sai/conditionally_verify_in_root_circuit' into sai/enab…
sai-deng Sep 30, 2024
ac41fa4
Merge branch 'develop' into sai/skip_table_in_root_circuit
sai-deng Oct 1, 2024
f24d028
Merge branch 'sai/skip_table_in_root_circuit' into sai/conditionally_…
sai-deng Oct 1, 2024
58f37e8
Merge branch 'sai/conditionally_verify_in_root_circuit' into sai/enab…
sai-deng Oct 1, 2024
c822a70
fix test
sai-deng Oct 1, 2024
2886fd8
Merge branch 'sai/skip_table_in_root_circuit' into sai/conditionally_…
sai-deng Oct 1, 2024
1540078
Merge branch 'sai/conditionally_verify_in_root_circuit' into sai/enab…
sai-deng Oct 1, 2024
12c61a8
fix ci with "cdk_erigon"
sai-deng Oct 1, 2024
447cfb9
fix ci error with "cdk_erigon"
sai-deng Oct 1, 2024
5f935c2
Merge branch 'sai/skip_table_in_root_circuit' into sai/conditionally_…
sai-deng Oct 1, 2024
42afc90
Merge branch 'sai/conditionally_verify_in_root_circuit' into sai/enab…
sai-deng Oct 1, 2024
40867c0
Merge branch 'develop' into sai/conditionally_verify_in_root_circuit
sai-deng Oct 2, 2024
e7ab9eb
Merge branch 'sai/conditionally_verify_in_root_circuit' into sai/enab…
sai-deng Oct 2, 2024
09a1a4a
enable test
sai-deng Oct 2, 2024
4a69e8a
Update evm_arithmetization/src/fixed_recursive_verifier.rs
sai-deng Oct 2, 2024
81d3681
address comments
sai-deng Oct 2, 2024
79aeb72
sync changes into get_challenges
sai-deng Oct 2, 2024
732020d
wip
sai-deng Oct 2, 2024
9f8ec9e
fix
sai-deng Oct 2, 2024
503085b
fix
sai-deng Oct 2, 2024
a36c5f7
fix tests
sai-deng Oct 3, 2024
640bae1
fix
sai-deng Oct 3, 2024
051702d
address comments
sai-deng Oct 6, 2024
93fff4b
merge develop
sai-deng Oct 7, 2024
7bc7c2c
common_date -> common_data
sai-deng Oct 7, 2024
120 changes: 82 additions & 38 deletions evm_arithmetization/src/fixed_recursive_verifier.rs
@@ -23,7 +23,7 @@ use plonky2::plonk::circuit_data::{
use plonky2::plonk::config::{AlgebraicHasher, GenericConfig, GenericHashOut};
use plonky2::plonk::proof::{ProofWithPublicInputs, ProofWithPublicInputsTarget};
use plonky2::recursion::cyclic_recursion::check_cyclic_proof_verifier_data;
use plonky2::recursion::dummy_circuit::cyclic_base_proof;
use plonky2::recursion::dummy_circuit::{cyclic_base_proof, dummy_circuit, dummy_proof};
use plonky2::util::serialization::{
Buffer, GateSerializer, IoResult, Read, WitnessGeneratorSerializer, Write,
};
@@ -1916,30 +1916,57 @@ where
let mut root_inputs = PartialWitness::new();

for table in 0..NUM_TABLES {
let stark_proof = &all_proof.multi_proof.stark_proofs[table];
let original_degree_bits = stark_proof.proof.recover_degree_bits(config);
let table_circuits = &self.by_table[table];
let shrunk_proof = table_circuits
.by_stark_size
.get(&original_degree_bits)
.ok_or_else(|| {
anyhow!(format!(
"Missing preprocessed circuits for {:?} table with size {}.",
Table::all()[table],
original_degree_bits,
))
})?
.shrink(stark_proof, &all_proof.multi_proof.ctl_challenges)?;
let index_verifier_data = table_circuits
.by_stark_size
.keys()
.position(|&size| size == original_degree_bits)
.unwrap();
root_inputs.set_target(
self.root.index_verifier_data[table],
F::from_canonical_usize(index_verifier_data),
);
root_inputs.set_proof_with_pis_target(&self.root.proof_with_pis[table], &shrunk_proof);
if KECCAK_TABLES_INDICES.contains(&table) && !all_proof.use_keccak_tables {
// Generate and set a dummy `index_verifier_data` and `proof_with_pis`.
let index_verifier_data =
table_circuits.by_stark_size.keys().min().ok_or_else(|| {
anyhow::format_err!("No valid size in shrinking circuits")
})?;
root_inputs.set_target(
self.root.index_verifier_data[table],
F::from_canonical_usize(*index_verifier_data),
);
let table_circuit = table_circuits
.by_stark_size
.get(index_verifier_data)
.ok_or_else(|| anyhow::format_err!("No valid size in shrinking circuits"))?
.shrinking_wrappers
.last()
.ok_or_else(|| anyhow::format_err!("No shrinking circuits"))?;
let dummy_circuit: CircuitData<F, C, D> =
dummy_circuit(&table_circuit.circuit.common);
let dummy_pis = HashMap::new();
let dummy_proof = dummy_proof(&dummy_circuit, dummy_pis)
.expect("Unable to generate dummy proofs");
root_inputs
.set_proof_with_pis_target(&self.root.proof_with_pis[table], &dummy_proof);
} else {
let stark_proof = &all_proof.multi_proof.stark_proofs[table];
let original_degree_bits = stark_proof.proof.recover_degree_bits(config);
let shrunk_proof = table_circuits
.by_stark_size
.get(&original_degree_bits)
.ok_or_else(|| {
anyhow!(format!(
"Missing preprocessed circuits for {:?} table with size {}.",
Table::all()[table],
original_degree_bits,
))
})?
.shrink(stark_proof, &all_proof.multi_proof.ctl_challenges)?;
let index_verifier_data = table_circuits
.by_stark_size
.keys()
.position(|&size| size == original_degree_bits)
.unwrap();
root_inputs.set_target(
self.root.index_verifier_data[table],
F::from_canonical_usize(index_verifier_data),
);
root_inputs
.set_proof_with_pis_target(&self.root.proof_with_pis[table], &shrunk_proof);
}

check_abort_signal(abort_signal.clone())?;
}
@@ ... @@
anyhow::Error::msg("Invalid conversion when setting public values targets.")
})?;

// TODO(sdeng): Set to false when this segment contains no Keccak operations.
root_inputs.set_bool_target(self.root.use_keccak_tables, true);
root_inputs.set_bool_target(self.root.use_keccak_tables, all_proof.use_keccak_tables);

let root_proof = self.root.circuit.prove(root_inputs)?;

@@ -2033,16 +2059,36 @@

for table in 0..NUM_TABLES {
let (table_circuit, index_verifier_data) = &table_circuits[table];
if KECCAK_TABLES_INDICES.contains(&table) && !all_proof.use_keccak_tables {
root_inputs.set_target(
self.root.index_verifier_data[table],
F::from_canonical_u8(*index_verifier_data),
);
// Generate and set a dummy `proof_with_pis`.
let common_data = &table_circuit
.shrinking_wrappers
.last()
.ok_or_else(|| anyhow::format_err!("No shrinking circuits"))?
.circuit
.common;
let dummy_circuit: CircuitData<F, C, D> = dummy_circuit(common_data);
let dummy_pis = HashMap::new();
let dummy_proof = dummy_proof(&dummy_circuit, dummy_pis)
.expect("Unable to generate dummy proofs");
root_inputs
.set_proof_with_pis_target(&self.root.proof_with_pis[table], &dummy_proof);
} else {
let stark_proof = &all_proof.multi_proof.stark_proofs[table];

let stark_proof = &all_proof.multi_proof.stark_proofs[table];

let shrunk_proof =
table_circuit.shrink(stark_proof, &all_proof.multi_proof.ctl_challenges)?;
root_inputs.set_target(
self.root.index_verifier_data[table],
F::from_canonical_u8(*index_verifier_data),
);
root_inputs.set_proof_with_pis_target(&self.root.proof_with_pis[table], &shrunk_proof);
let shrunk_proof =
table_circuit.shrink(stark_proof, &all_proof.multi_proof.ctl_challenges)?;
root_inputs.set_target(
self.root.index_verifier_data[table],
F::from_canonical_u8(*index_verifier_data),
);
root_inputs
.set_proof_with_pis_target(&self.root.proof_with_pis[table], &shrunk_proof);
}

check_abort_signal(abort_signal.clone())?;
}
@@ ... @@
anyhow::Error::msg("Invalid conversion when setting public values targets.")
})?;

// TODO(sdeng): Set to false when this segment contains no Keccak operations.
root_inputs.set_bool_target(self.root.use_keccak_tables, true);
root_inputs.set_bool_target(self.root.use_keccak_tables, all_proof.use_keccak_tables);

let root_proof = self.root.circuit.prove(root_inputs)?;

@@ -3103,7 +3148,6 @@ mod tests {
type C = PoseidonGoldilocksConfig;

#[test]
#[ignore]
fn test_segment_proof_generation_without_keccak() -> anyhow::Result<()> {
init_logger();

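Both branches above fill a skipped Keccak slot with a dummy proof built from the last shrinking wrapper's CommonCircuitData, using plonky2's `dummy_circuit` and `dummy_proof` helpers imported at the top of the file. Below is a minimal, self-contained sketch of that pattern; the tiny stand-in circuit is purely illustrative (in the PR the common data comes from the real shrinking wrapper), so this is a sketch of the mechanism rather than the PR's exact code.

use std::collections::HashMap;

use anyhow::Result;
use plonky2::field::goldilocks_field::GoldilocksField;
use plonky2::plonk::circuit_builder::CircuitBuilder;
use plonky2::plonk::circuit_data::{CircuitConfig, CircuitData};
use plonky2::plonk::config::PoseidonGoldilocksConfig;
use plonky2::recursion::dummy_circuit::{dummy_circuit, dummy_proof};

const D: usize = 2;
type C = PoseidonGoldilocksConfig;
type F = GoldilocksField;

fn main() -> Result<()> {
    // Stand-in for the last shrinking wrapper: any circuit whose
    // `CommonCircuitData` the dummy proof must match.
    let mut builder = CircuitBuilder::<F, D>::new(CircuitConfig::standard_recursion_config());
    let x = builder.add_virtual_target();
    builder.register_public_input(x);
    let wrapper = builder.build::<C>();

    // A dummy circuit with identical common data, and a proof of it with
    // all-zero public inputs; this is what fills a skipped Keccak table slot.
    let dummy: CircuitData<F, C, D> = dummy_circuit(&wrapper.common);
    let proof = dummy_proof(&dummy, HashMap::new())?;
    dummy.verify(proof)?;
    Ok(())
}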
17 changes: 14 additions & 3 deletions evm_arithmetization/src/generation/mod.rs
@@ -486,15 +486,19 @@ fn get_all_memory_address_and_values(memory_before: &MemoryState) -> Vec<(Memory
res
}

type TablesWithPVsAndFinalMem<F> = ([Vec<PolynomialValues<F>>; NUM_TABLES], PublicValues<F>);
pub struct TablesWithPVs<F: RichField> {
pub tables: [Vec<PolynomialValues<F>>; NUM_TABLES],
pub use_keccak_tables: bool,
pub public_values: PublicValues<F>,
}

pub fn generate_traces<F: RichField + Extendable<D>, const D: usize>(
all_stark: &AllStark<F, D>,
inputs: &TrimmedGenerationInputs<F>,
config: &StarkConfig,
segment_data: &mut GenerationSegmentData,
timing: &mut TimingTree,
) -> anyhow::Result<TablesWithPVsAndFinalMem<F>> {
) -> anyhow::Result<TablesWithPVs<F>> {
let mut state = GenerationState::<F>::new_with_segment_data(inputs, segment_data)
.map_err(|err| anyhow!("Failed to parse all the initial prover inputs: {:?}", err))?;

@@ -579,6 +583,8 @@ pub fn generate_traces<F: RichField + Extendable<D>, const D: usize>(
mem_after: MemCap::default(),
};

let use_keccak_tables = !state.traces.keccak_inputs.is_empty();

let tables = timed!(
timing,
"convert trace data to tables",
@@ ... @@
timing
)
);
Ok((tables, public_values))

Ok(TablesWithPVs {
tables,
use_keccak_tables,
public_values,
})
}

fn simulate_cpu<F: RichField>(
37 changes: 27 additions & 10 deletions evm_arithmetization/src/get_challenges.rs
@@ -254,6 +254,7 @@ pub mod testing {
use starky::lookup::{get_grand_product_challenge_set, GrandProductChallengeSet};
use starky::proof::StarkProofChallenges;

use crate::all_stark::KECCAK_TABLES_INDICES;
use crate::get_challenges::observe_public_values;
use crate::proof::*;
use crate::witness::errors::ProgramError;
Expand All @@ -262,7 +263,7 @@ pub mod testing {
/// Randomness for all STARKs.
pub(crate) struct AllProofChallenges<F: RichField + Extendable<D>, const D: usize> {
/// Randomness used in each STARK proof.
pub stark_challenges: [StarkProofChallenges<F, D>; NUM_TABLES],
pub stark_challenges: [Option<StarkProofChallenges<F, D>>; NUM_TABLES],
/// Randomness used for cross-table lookups. It is shared by all STARKs.
pub ctl_challenges: GrandProductChallengeSet<F>,
}
@@ ... @@

let stark_proofs = &self.multi_proof.stark_proofs;

for proof in stark_proofs {
challenger.observe_cap(&proof.proof.trace_cap);
for (i, proof) in stark_proofs.iter().enumerate() {
if KECCAK_TABLES_INDICES.contains(&i) && !self.use_keccak_tables {
// Observe zero Merkle caps when skipping Keccak tables.
let zero_merkle_cap = proof
.proof
.trace_cap
.flatten()
.iter()
.map(|_| F::ZERO)
.collect::<Vec<F>>();
challenger.observe_elements(&zero_merkle_cap);
} else {
challenger.observe_cap(&proof.proof.trace_cap);
}
}

observe_public_values::<F, C, D>(&mut challenger, &self.public_values)?;
@@ ... @@

Ok(AllProofChallenges {
stark_challenges: core::array::from_fn(|i| {
challenger.compact();
stark_proofs[i].proof.get_challenges(
&mut challenger,
Some(&ctl_challenges),
true,
config,
)
if KECCAK_TABLES_INDICES.contains(&i) && !self.use_keccak_tables {
None
} else {
challenger.compact();
Some(stark_proofs[i].proof.get_challenges(
&mut challenger,
Some(&ctl_challenges),
true,
config,
))
}
}),
ctl_challenges,
})
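For the transcript to match between the prover (see prover.rs below) and the verifier loop above, a skipped Keccak table contributes an all-zero cap of the same length as its real trace cap. The following is a minimal sketch of that observation step, assuming plonky2's `Challenger` and `MerkleCap` APIs already used by these crates; the free function and its parameters are illustrative, not part of the PR.

use plonky2::field::goldilocks_field::GoldilocksField as F;
use plonky2::field::types::Field;
use plonky2::hash::merkle_tree::MerkleCap;
use plonky2::hash::poseidon::PoseidonHash;
use plonky2::iop::challenger::Challenger;

/// Observe a table's trace cap, substituting zeros when the table is a
/// skipped Keccak table, so prover and verifier challengers stay in sync.
fn observe_trace_cap(
    challenger: &mut Challenger<F, PoseidonHash>,
    table: usize,
    keccak_tables_indices: &[usize],
    use_keccak_tables: bool,
    cap: &MerkleCap<F, PoseidonHash>,
) {
    if keccak_tables_indices.contains(&table) && !use_keccak_tables {
        // Same number of field elements as the real cap, but all zeros.
        let zero_cap = vec![F::ZERO; cap.flatten().len()];
        challenger.observe_elements(&zero_cap);
    } else {
        challenger.observe_cap(cap);
    }
}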
4 changes: 4 additions & 0 deletions evm_arithmetization/src/proof.rs
@@ -30,6 +30,10 @@ pub struct AllProof<F: RichField + Extendable<D>, C: GenericConfig<D, F = F>, co
pub multi_proof: MultiProof<F, C, D, NUM_TABLES>,
/// Public memory values used for the recursive proofs.
pub public_values: PublicValues<F>,
/// A flag indicating whether the Keccak and KeccakSponge tables are
/// actually used, i.e. contain meaningful data rather than only padding.
/// This is set to false when no Keccak operations were performed.
pub use_keccak_tables: bool,
}

impl<F: RichField + Extendable<D>, C: GenericConfig<D, F = F>, const D: usize> AllProof<F, C, D> {
31 changes: 25 additions & 6 deletions evm_arithmetization/src/prover.rs
@@ -20,7 +20,7 @@ use starky::proof::{MultiProof, StarkProofWithMetadata};
use starky::prover::prove_with_commitment;
use starky::stark::Stark;

use crate::all_stark::{AllStark, Table, NUM_TABLES};
use crate::all_stark::{AllStark, Table, KECCAK_TABLES_INDICES, NUM_TABLES};
use crate::cpu::kernel::aggregator::KERNEL;
use crate::generation::{generate_traces, GenerationInputs, TrimmedGenerationInputs};
use crate::get_challenges::observe_public_values;
@@ ... @@

timed!(timing, "build kernel", Lazy::force(&KERNEL));

let (traces, mut public_values) = timed!(
let mut tables_with_pvs = timed!(
timing,
"generate all traces",
generate_traces(all_stark, &inputs, config, segment_data, timing)?
@@ ... @@
let proof = prove_with_traces(
all_stark,
config,
traces,
&mut public_values,
tables_with_pvs.tables,
tables_with_pvs.use_keccak_tables,
&mut tables_with_pvs.public_values,
timing,
abort_signal,
)?;
@@ ... @@
all_stark: &AllStark<F, D>,
config: &StarkConfig,
trace_poly_values: [Vec<PolynomialValues<F>>; NUM_TABLES],
use_keccak_tables: bool,
public_values: &mut PublicValues<F>,
timing: &mut TimingTree,
abort_signal: Option<Arc<AtomicBool>>,
@@ -114,8 +116,14 @@
.map(|c| c.merkle_tree.cap.clone())
.collect::<Vec<_>>();
let mut challenger = Challenger::<F, C::Hasher>::new();
for cap in &trace_caps {
challenger.observe_cap(cap);
for (i, cap) in trace_caps.iter().enumerate() {
if KECCAK_TABLES_INDICES.contains(&i) && !use_keccak_tables {
// Observe zero Merkle caps when skipping Keccak tables.
let zero_merkle_cap = cap.flatten().iter().map(|_| F::ZERO).collect::<Vec<F>>();
challenger.observe_elements(&zero_merkle_cap);
Review comment on lines +120 to +123 (Contributor):

Can they not just be ignored? (as in only observe if it's not a keccak table)?

Also, has a similar change been applied on the verifier side (in get_challenges.rs)?

Reply (Contributor Author):

It is difficult to implement conditional observation in the root circuits, so I chose to observe zero caps instead.

I was originally planning to implement it in the next PR (which requires changes on the Plonky2 side, and get_challenges.rs is only used for testing). However, I agree with you that we should make this PR complete. I will implement it in this PR instead.

} else {
challenger.observe_cap(cap);
}
}

observe_public_values::<F, C, D>(&mut challenger, public_values)
@@ -143,6 +151,7 @@
config,
&trace_poly_values,
trace_commitments,
use_keccak_tables,
ctl_data_per_table,
&mut challenger,
&ctl_challenges,
@@ -206,6 +215,7 @@
ctl_challenges,
},
public_values: public_values.clone(),
use_keccak_tables,
})
}

@@ ... @@
config: &StarkConfig,
trace_poly_values: &[Vec<PolynomialValues<F>>; NUM_TABLES],
trace_commitments: Vec<PolynomialBatch<F, C, D>>,
use_keccak_tables: bool,
ctl_data_per_table: [CtlData<F>; NUM_TABLES],
challenger: &mut Challenger<F, C::Hasher>,
ctl_challenges: &GrandProductChallengeSet<F>,
@@ -262,8 +273,16 @@
let (arithmetic_proof, _) = prove_table!(arithmetic_stark, Table::Arithmetic);
let (byte_packing_proof, _) = prove_table!(byte_packing_stark, Table::BytePacking);
let (cpu_proof, _) = prove_table!(cpu_stark, Table::Cpu);
let challenger_after_cpu = challenger.clone();
// TODO(sdeng): Keccak proofs are still required for CTLs, etc. Refactor the
// code and remove the unnecessary parts.
let (keccak_proof, _) = prove_table!(keccak_stark, Table::Keccak);
let (keccak_sponge_proof, _) = prove_table!(keccak_sponge_stark, Table::KeccakSponge);
if !use_keccak_tables {
// We need to connect the challenger state of CPU and Logic tables when the
// Keccak tables are not in use.
*challenger = challenger_after_cpu;
}
let (logic_proof, _) = prove_table!(logic_stark, Table::Logic);
let (memory_proof, _) = prove_table!(memory_stark, Table::Memory);
let (mem_before_proof, mem_before_cap) = prove_table!(mem_before_stark, Table::MemBefore);
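The snapshot-and-restore around the Keccak proofs above, discussed in the review thread, keeps the challenger state seen by the Logic table (and by the root circuit) independent of the Keccak tables. The following is a minimal sketch of that pattern, assuming plonky2's `Challenger`; the observed element and the number of challenges drawn are placeholders, not values from the PR.

use plonky2::field::goldilocks_field::GoldilocksField as F;
use plonky2::field::types::Field;
use plonky2::hash::poseidon::PoseidonHash;
use plonky2::iop::challenger::Challenger;

fn challenges_after_optional_keccak(use_keccak_tables: bool) -> Vec<F> {
    let mut challenger = Challenger::<F, PoseidonHash>::new();

    // ... observations for the Arithmetic, BytePacking and Cpu proofs ...
    let challenger_after_cpu = challenger.clone();

    // The Keccak and KeccakSponge proofs are still produced and observed here.
    challenger.observe_element(F::from_canonical_u64(42));

    if !use_keccak_tables {
        // Rewind so that the Logic table (and the root circuit) sees the
        // challenger state as it was right after the CPU proof.
        challenger = challenger_after_cpu;
    }

    // ... Logic, Memory, MemBefore/MemAfter observations continue from here ...
    challenger.get_n_challenges(2)
}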