Distro-agnostic compatibility with GCC 13 #514

Merged · 4 commits · Aug 24, 2023
Changes from 3 commits
4 changes: 3 additions & 1 deletion .gitignore
@@ -17,4 +17,6 @@ config/recursive*/*
config/zkevm/*
config/tmp
runtime/*
!runtime/README.md
!runtime/README.md
v1.1.0-rc.1-fork.4.tgz
v1.1.0-rc.1-fork.4
24 changes: 16 additions & 8 deletions Makefile
@@ -7,15 +7,17 @@ TARGET_TEST := zkProverTest
BUILD_DIR := ./build
SRC_DIRS := ./src ./test ./tools

LIBOMP := $(shell find /usr/lib/llvm-* -name "libomp.so" | sed 's/libomp.so//')
ifndef LIBOMP
$(error LIBOMP is not set, you need to install libomp-dev)
GRPCPP_FLAGS := $(shell pkg-config grpc++ --cflags)
GRPCPP_LIBS := $(shell pkg-config grpc++ --libs) -lgrpc++_reflection
ABSL_LIBS := $(shell pkg-config absl_log_internal_check_op --libs)
ifndef GRPCPP_LIBS
$(error gRPC++ could not be found via pkg-config, you need to install them)
endif

CXX := g++
AS := nasm
CXXFLAGS := -std=c++17 -Wall -pthread -flarge-source-files -Wno-unused-label -rdynamic -mavx2 #-march=native
LDFLAGS := -lprotobuf -lsodium -lgrpc -lgrpc++ -lgrpc++_reflection -lgpr -lpthread -lpqxx -lpq -lgmp -lstdc++ -lomp -lgmpxx -lsecp256k1 -lcrypto -luuid -L$(LIBOMP)
CXXFLAGS := -std=c++17 -Wall -pthread -flarge-source-files -Wno-unused-label -rdynamic -mavx2 $(GRPCPP_FLAGS) #-Wfatal-errors
LDFLAGS := -lprotobuf -lsodium -lgpr -lpthread -lpqxx -lpq -lgmp -lstdc++ -lgmpxx -lsecp256k1 -lcrypto -luuid $(GRPCPP_LIBS) $(ABSL_LIBS)
CFLAGS := -fopenmp
ASFLAGS := -felf64

@@ -39,6 +41,12 @@ INC_FLAGS := $(addprefix -I,$(INC_DIRS))

CPPFLAGS ?= $(INC_FLAGS) -MMD -MP

GRPC_CPP_PLUGIN = grpc_cpp_plugin
GRPC_CPP_PLUGIN_PATH ?= `which $(GRPC_CPP_PLUGIN)`

INC_DIRS := $(shell find $(SRC_DIRS) -type d) $(sort $(dir))
INC_FLAGS := $(addprefix -I,$(INC_DIRS))

SRCS_ZKP := $(shell find $(SRC_DIRS) ! -path "./tools/starkpil/bctree/*" ! -path "./test/prover/*" ! -path "./src/goldilocks/benchs/*" ! -path "./src/goldilocks/benchs/*" ! -path "./src/goldilocks/tests/*" ! -path "./src/main_generator/*" ! -path "./src/pols_generator/*" -name *.cpp -or -name *.c -or -name *.asm -or -name *.cc)
OBJS_ZKP := $(SRCS_ZKP:%=$(BUILD_DIR)/%.o)
DEPS_ZKP := $(OBJS_ZKP:.o=.d)
@@ -58,13 +66,13 @@ bctree: $(BUILD_DIR)/$(TARGET_BCT)
test: $(BUILD_DIR)/$(TARGET_TEST)

$(BUILD_DIR)/$(TARGET_ZKP): $(OBJS_ZKP)
$(CXX) $(OBJS_ZKP) $(CXXFLAGS) -o $@ $(LDFLAGS)
$(CXX) $(OBJS_ZKP) $(CXXFLAGS) -o $@ $(LDFLAGS) $(CFLAGS) $(CPPFLAGS) $(CXXFLAGS) $(LDFLAGS)

$(BUILD_DIR)/$(TARGET_BCT): $(OBJS_BCT)
$(CXX) $(OBJS_BCT) $(CXXFLAGS) -o $@ $(LDFLAGS)
$(CXX) $(OBJS_BCT) $(CXXFLAGS) -o $@ $(LDFLAGS) $(CFLAGS) $(CPPFLAGS) $(CXXFLAGS) $(LDFLAGS)

$(BUILD_DIR)/$(TARGET_TEST): $(OBJS_TEST)
$(CXX) $(OBJS_TEST) $(CXXFLAGS) -o $@ $(LDFLAGS)
$(CXX) $(OBJS_TEST) $(CXXFLAGS) -o $@ $(LDFLAGS) $(CFLAGS) $(CPPFLAGS) $(CXXFLAGS) $(LDFLAGS)

# assembly
$(BUILD_DIR)/%.asm.o: %.asm
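
The updated Makefile resolves the gRPC and Abseil flags through `pkg-config` and aborts if `grpc++` cannot be found. A quick way to preview what the build will pick up (a sketch using the package names from the Makefile above) is:

```sh
# Should print include and linker flags; an error here means the
# gRPC/Abseil development packages are missing or not visible to pkg-config.
pkg-config grpc++ --cflags --libs
pkg-config absl_log_internal_check_op --libs
```
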
188 changes: 116 additions & 72 deletions README.md
@@ -1,108 +1,152 @@
# zkEVM Prover
zkEVM proof generator
## General info
The zkEVM Prover process can provide up to 3 RPC services and clients:

Built to interface with Ethereum Virtual Machines (EVM), the prover provides critical services through three primary RPC components: the Aggregator client, Executor service, and StateDB service. The Aggregator client connects to an Aggregator server and harnesses multiple zkEVM Provers simultaneously, thereby maximizing proof generation efficiency. This involves a process where the Prover component calculates a resulting state by processing EVM transaction batches and subsequently generates a proof based on the PIL polynomials definition and their constraints. The Executor service offers a mechanism to validate the integrity of proposed EVM transaction batches, ensuring they adhere to specific workload requirements. The StateDB service interfaces with a system's state (represented as a Merkle tree) and the corresponding database, thus serving as a centralized state information repository.

## Components

### Aggregator client
- It connects to an Aggregator server.
- Many zkEVM Provers can connect to the Aggregator server at the same time, providing more proof generation power.
- When called by the Aggregator service to generate a batch proof:
- It calls the Prover component that executes the input data (a batch of EVM transactions), calculates the resulting state, and generates the proof of the calculation based on the PIL polynomials definition and their constrains.
- The Executor component combines 14 state machines that process the input data to generate the evaluations of the committed polynomials, required to generate the proof. Every state machine generates their computation evidence data, and the more complex calculus demonstrations are delegated to the next state machine.
- The Prover component calls the Stark component to generate a proof of the Executor state machines committed polynomials.
- When called by the Aggregator service to generate an aggregated proof:
- The Prover component combines the result of 2 previously calculated batch or aggregated proofs, provided by the Aggregator, and generates an aggregated proof.
- When called by the Aggregator service to generate a final proof:
- The Prover component takes the result of a previously calculated aggregated proof, provided by the Aggregator, and generates a final proof that can be verified.
- The interface of the server of this service is defined by the file aggregator.proto.

- It establishes a connection to an Aggregator server.
- Multiple zkEVM Provers can simultaneously connect to the Aggregator server, thereby enhancing the proof generation capability.
- Upon being invoked by the Aggregator service for batch proof generation:
- The Prover component processes the input data (a set of EVM transactions), computes the resulting state, and creates a proof based on the PIL polynomial definitions and their constraints.
- The Executor component integrates 14 state machines to process the input data and produce evaluations of the committed polynomials, essential for proof generation. Each state machine generates its computational evidence, and intricate calculations are passed on to the subsequent state machine.
- The Prover component then invokes the Stark component to produce a proof for the committed polynomials from the Executor's state machines.
- When tasked by the Aggregator service to produce an aggregated proof:
- The Prover component amalgamates the results of two previously computed batch or aggregated proofs, supplied by the Aggregator, to create an aggregated proof.
- When tasked by the Aggregator service to produce a final proof:
- The Prover component uses the outcome of a prior aggregated proof, supplied by the Aggregator, to formulate a conclusive proof that can be validated.
- The server interface for this service is delineated in the file named `aggregator.proto`.
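
As a rough illustration, the gRPC connection behind this client follows the pattern in `src/service/aggregator/aggregator_client.cpp` further down in this diff: open a channel to the Aggregator server and build a stub for the service declared in `aggregator.proto`. The helper name and parameters below are illustrative, not the prover's actual API:

```cpp
#include <memory>
#include <string>
#include <grpcpp/grpcpp.h>
// #include "aggregator.grpc.pb.h"  // generated from aggregator.proto

// Sketch only: the channel/stub pattern mirrors aggregator_client.cpp in this PR.
std::unique_ptr<aggregator::v1::AggregatorService::Stub>
connectToAggregator(const std::string &host, int port)
{
    std::shared_ptr<grpc::Channel> channel = grpc::CreateChannel(
        host + ":" + std::to_string(port), grpc::InsecureChannelCredentials());
    return aggregator::v1::AggregatorService::NewStub(channel);
}
```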

### Executor service
- It calls the Executor component that executes the input data (a batch of EVM transactions) and calculates the resulting state. The proof is not generated.
- It provides a fast way to check if the proposed batch of transactions is properly built and it fits the amount of work that can be proven in one single batch.
- When called by the Executor service, the Executor component only uses the Main state machine, since the committed polynomials are not required as the proof will not be generated.
- The interface of this service is defined by the file executor.proto.

- The Executor component processes the input data, which comprises a batch of EVM transactions, and computes the resulting state. Notably, no proof is produced.
- This service offers a swift method to verify whether a proposed batch of transactions is correctly constructed and if it aligns with the workload that can be proven in a single batch.
- When the Executor service invokes the Executor component, only the Main state machine is utilized. This is because the committed polynomials aren't needed, given that a proof isn't generated.
- The service's interface is outlined in the `executor.proto` file.

### StateDB service
- It provides an interface to access the state of the system (a Merkle tree) and the database where the state is stored.
- It is used by the executor and the prover, as the single source of state. It can be used to get state details, e.g. account balances.
- The interface of this service is defined by the file statedb.proto.

## Setup
- This service provides an interface to access the system's state (represented as a Merkle tree) and the database where this state is stored.
- Both the executor and the prover rely on it as the unified source of state. It can be utilized to retrieve specific state details, such as account balances.
- The interface for this service is described in the `statedb.proto` file.

## Compiling locally

Steps to compile `zkevm-prover` locally:
### Clone repository

```sh
$ git clone git@github.com:0xPolygonHermez/zkevm-prover.git
$ cd zkevm-prover
$ git submodule init
$ git submodule update
git clone git@github.com:0xPolygonHermez/zkevm-prover.git
cd zkevm-prover
git submodule init
git submodule update
```

### Compile
The following packages must be installed.
### Download necessary files

Download this **very large archive (~75GB)**. It's a good idea to start this download now and have it running in the background:

```sh
$ sudo apt update && sudo apt install build-essential libbenchmark-dev libomp-dev libgmp-dev nlohmann-json3-dev postgresql libpqxx-dev libpqxx-doc nasm libsecp256k1-dev grpc-proto libsodium-dev libprotobuf-dev libssl-dev cmake libgrpc++-dev protobuf-compiler protobuf-compiler-grpc uuid-dev
./tools/download_archive.sh
```
To download the files needed to run the prover, you have to execute the following command

The archive will take up an additional 115GB of space once extracted.
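
A hedged sanity check before starting: the download plus the extracted files need roughly 200GB of free space (≈75GB for the archive, ≈115GB once extracted), so it is worth confirming the target disk has room:

```sh
# Check free space on the filesystem you will download and extract into
df -h .
```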

### Install dependencies

The following packages must be installed:

#### Ubuntu/Debian

```sh
$ wget https://de012a78750e59b808d922b39535e862.s3.eu-west-1.amazonaws.com/v1.1.0-rc.1-fork.4.tgz
$ tar -xzvf v1.1.0-rc.1-fork.4.tgz
$ rm -rf config
$ mv v1.1.0-rc.1-fork.4/config .
apt update
apt install build-essential libbenchmark-dev libomp-dev libgmp-dev nlohmann-json3-dev postgresql libpqxx-dev libpqxx-doc nasm libsecp256k1-dev grpc-proto libsodium-dev libprotobuf-dev libssl-dev cmake libgrpc++-dev protobuf-compiler protobuf-compiler-grpc uuid-dev
```

Run `make` to compile the project
#### openSUSE
```sh
$ make clean
$ make -j
zypper addrepo https://download.opensuse.org/repositories/network:cryptocurrencies/openSUSE_Tumbleweed/network:cryptocurrencies.repo
zypper refresh
zypper install -t pattern devel_basis
zypper install libbenchmark1 libomp16-devel libgmp10 nlohmann_json-devel postgresql libpqxx-devel ghc-postgresql-libpq-devel nasm libsecp256k1-devel grpc-devel libsodium-devel libprotobuf-c-devel libssl53 cmake libgrpc++1_57 protobuf-devel uuid-devel llvm llvm-devel libopenssl-devel
```

To run the testvector:
#### Arch
```sh
pacman -S base-devel extra/protobuf community/grpc-cli community/nlohmann-json extra/libpqxx nasm extra/libsodium community/libsecp256k1
```
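
Whichever distribution you use, the build expects a working C++ toolchain (this PR targets compatibility with GCC 13) and `nasm` from the packages above. A quick, non-authoritative way to confirm they are on the `PATH`:

```sh
# Exact versions depend on your distribution's packages
g++ --version
nasm -v
```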

### Compilation

You may first need to recompile the protobufs:
```sh
$ ./build/zkProver -c testvectors/config_runFile_BatchProof.json
cd src/grpc
make
cd ../..
```
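
The build resolves the gRPC C++ plugin with `which grpc_cpp_plugin` (see the Makefile changes above), so both `protoc` and the plugin need to be on the `PATH` before regenerating the protobufs. A quick check, assuming the packages from the previous step are installed:

```sh
which protoc grpc_cpp_plugin
protoc --version
```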

### StateDB service database
Run `make` to compile the main project:

```sh
make clean
make -j
```

To compile in debug mode, run `make -j dbg=1`.

### Test vectors

```sh
./build/zkProver -c testvectors/config_runFile_BatchProof.json
```

## StateDB service database

To use persistence in the StateDB (Merkle-tree) service you must create the database objects needed by the service. To do this run the shell script:

```sh
$ ./tools/statedb/create_db.sh <database> <user> <password>
./tools/statedb/create_db.sh <database> <user> <password>
```

For example:

```sh
$ ./tools/statedb/create_db.sh testdb statedb statedb
./tools/statedb/create_db.sh testdb statedb statedb
```
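
If you want to confirm the objects were created, a hedged check (assuming a local PostgreSQL on the default port and the example credentials above) is to connect with `psql` and list the tables:

```sh
# The connection string format matches the databaseURL parameter described below
psql postgresql://statedb:statedb@127.0.0.1:5432/testdb -c '\dt'
```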

### Build & run docker
## Docker

```sh
$ sudo docker build -t zkprover .
$ sudo docker run --rm --network host -ti -p 50051:50051 -p 50061:50061 -p 50071:50071 -v $PWD/testvectors:/usr/src/app zkprover input_executor.json
sudo docker build -t zkprover .
sudo docker run --rm --network host -ti -p 50051:50051 -p 50061:50061 -p 50071:50071 -v $PWD/testvectors:/usr/src/app zkprover input_executor.json
```

## Usage
To execute the Prover you need to provide a `config.json` file that contains the parameters that allow us to configure the different Prover options. By default, the Prover loads the `config.json`file located in the `testvectors`folder. The most relevant parameters are commented below with the default value for the provided `config.json` file:

| Parameter | Description |
| --------- | ----------- |
| runStateDBServer | Enables StateDB GRPC service, provides SMT (Sparse Merkle Tree) and Database access |
| runExecutorServer | Enables Executor GRPC service, provides a service to process transaction batches |
| runAggregatorClient | Enables Aggregator GRPC client, connects to the Aggregator and process its requests |
| aggregatorClientHost | IP address of the Aggregator server to which the Aggregator client must connect to |
| runProverServer | Enables Prover GRPC service |
| runFileProcessBatch | Processes a batch using as input a JSON file defined in the `"inputFile"` parameter |
| runFileGenProof | Generates a proof using as input a JSON file defined in the `"inputFile"` parameter |
| inputFile | Input JSON file with path relative to the `testvectors` folder |
| outputPath | Output path folder to store the result files, with path relative to the `testvectors` folder |
| databaseURL | Connection string for the PostgreSQL database used by the StateDB service. If the value is `"local"` then the service will not use a database and the data will be stored only in memory (no persistence). The PostgreSQL database connection string has the following format: `"postgresql://<user>:<password>@<ip>:<port>/<database>"`. For example: `"postgresql://statedb:[email protected]:5432/testdb"` |
| stateDBURL | Connection string for the StateDB service. If the value is `"local"` then the GRPC StateDB service will not be used and local StateDB client will be used instead. The StateDB service connection string has the following format: `"<ip>:<port>"`. For example: `"127.0.0.1:50061"` |
| saveRequestToFile | Saves service received requests to a text file |
| saveResponseToFile | Saves service returned responses to a text file |
| saveInputToFile | Saves service received input data to a JSON file |
| saveOutputToFile | Saves service returned output data to a JSON file |

To run a proof test you must perform the following steps:
- Edit the `config.json` file and set the parameter `"runFileGenProof"` to `"true"`. The rest of the parameters must be set to `"false"`. Also set the parameter `"databaseURL` to `"local"` if you don't want to use a postgreSQL database to run the test
- Indicate in the `"inputFile"` parameter the file with the input test data. You can find a test file `input_executor.json` in the `testvectors` folder
- Run the Prover from the `testvectors` folder using the command `$ ../build/zkProver`
- The result files of the proof will be stored in the folder specified in the `"outputPath"` config parameter



To run the Prover, supply a `config.json` file containing the parameters that help customize various Prover settings. By default, the Prover accesses the `config.json` file from the `testvectors` directory. Below are some of the key parameters found in the provided `config.json`:

| Parameter | Description |
| ---------------------- | ----------- |
| `runStateDBServer` | Enables StateDB GRPC service, provides SMT (Sparse Merkle Tree) and Database access |
| `runExecutorServer` | Enables Executor GRPC service, provides a service to process transaction batches |
| `runAggregatorClient` | Enables Aggregator GRPC client, connects to the Aggregator and processes its requests |
| `aggregatorClientHost` | IP address of the Aggregator server to which the Aggregator client must connect |
| `runProverServer` | Enables Prover GRPC service |
| `runFileProcessBatch` | Processes a batch using as input a JSON file defined in the `"inputFile"` parameter |
| `runFileGenProof` | Generates a proof using as input a JSON file defined in the `"inputFile"` parameter |
| `inputFile` | Input JSON file with path relative to the `testvectors` folder |
| `outputPath` | Output path to store the result files, relative to the `testvectors` folder |
| `saveRequestToFile` | Saves service received requests to a text file |
| `saveResponseToFile` | Saves service returned responses to a text file |
| `saveInputToFile` | Saves service received input data to a JSON file |
| `saveOutputToFile` | Saves service returned output data to a JSON file |
| `databaseURL` | For the StateDB service, if the value is `"local"`, data is stored in memory; otherwise, use the PostgreSQL format: `"postgresql://<user>:<password>@<ip>:<port>/<database>"`, e.g., `"postgresql://statedb:[email protected]:5432/testdb"`. |
| `stateDBURL` | For the StateDB service, if the value is `"local"`, a local client replaces the GRPC service. Use the format `"<ip>:<port>"`, e.g., `"127.0.0.1:50061"`. |

To execute a proof test:

1. Modify the `config.json` file, setting the `"runFileGenProof"` parameter to `"true"`. Ensure all other parameters are set to `"false"`. If you prefer not to use a PostgreSQL database for the test, adjust the `"databaseURL"` to `"local"`. A minimal example configuration is sketched after these steps.
2. For the `"inputFile"` parameter, specify the desired input test data file. As an example, the `testvectors` directory contains the `input_executor.json` file.
3. Launch the Prover from the `testvectors` directory using the command: `../build/zkProver`.
4. The proof's result files will be saved in the directory defined by the `"outputPath"` configuration parameter.
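
Putting the steps above together, a minimal `config.json` for a file-based proof test might look like the sketch below. Only parameters mentioned in this README are shown, the file names are illustrative, and the exact value types should follow the `config.json` shipped in `testvectors`:

```json
{
  "runFileGenProof": true,
  "runStateDBServer": false,
  "runExecutorServer": false,
  "runAggregatorClient": false,
  "runProverServer": false,
  "runFileProcessBatch": false,
  "inputFile": "input_executor.json",
  "outputPath": "output",
  "databaseURL": "local",
  "stateDBURL": "local"
}
```
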
2 changes: 1 addition & 1 deletion src/goldilocks
1 change: 1 addition & 0 deletions src/hashdb/multi_write.hpp
@@ -3,6 +3,7 @@

#include "goldilocks_base_field.hpp"
#include "multi_write_data.hpp"
#include <vector>

using namespace std;

1 change: 1 addition & 0 deletions src/main_sm/fork_5/main_exec_c/rlp_decode.hpp
@@ -6,6 +6,7 @@
#include <gmpxx.h>
#include "zkresult.hpp"
#include "main_sm/fork_5/main_exec_c/rlp_data.hpp"
#include <stdint.h>

using namespace std;

2 changes: 1 addition & 1 deletion src/service/aggregator/aggregator_client.cpp
@@ -14,7 +14,7 @@ AggregatorClient::AggregatorClient (Goldilocks &fr, const Config &config, Prover
prover(prover)
{
// Create channel
std::shared_ptr<grpc_impl::Channel> channel = ::grpc::CreateChannel(config.aggregatorClientHost + ":" + to_string(config.aggregatorClientPort), grpc::InsecureChannelCredentials());
std::shared_ptr<grpc::Channel> channel = ::grpc::CreateChannel(config.aggregatorClientHost + ":" + to_string(config.aggregatorClientPort), grpc::InsecureChannelCredentials());

// Create stub (i.e. client)
stub = new aggregator::v1::AggregatorService::Stub(channel);
2 changes: 1 addition & 1 deletion src/service/hashdb/hashdb_remote.cpp
@@ -18,7 +18,7 @@ HashDBRemote::HashDBRemote (Goldilocks &fr, const Config &config) : fr(fr), conf
channelArguments.SetMaxReceiveMessageSize(100*1024*1024);

// Create channel
std::shared_ptr<grpc_impl::Channel> channel = ::grpc::CreateCustomChannel(config.hashDBURL, grpc::InsecureChannelCredentials(), channelArguments);
std::shared_ptr<grpc::Channel> channel = ::grpc::CreateCustomChannel(config.hashDBURL, grpc::InsecureChannelCredentials(), channelArguments);

// Create stub (i.e. client)
stub = new hashdb::v1::HashDBService::Stub(channel);
2 changes: 2 additions & 0 deletions src/sm/storage/storage_rom_line.hpp
@@ -1,6 +1,8 @@
#ifndef STORAGE_ROM_LINE_HPP
#define STORAGE_ROM_LINE_HPP

#include <stdint.h>
#include <string>
#include <vector>

using namespace std;
1 change: 1 addition & 0 deletions src/utils/zklog.hpp
@@ -2,6 +2,7 @@
#define ZKLOG_HPP

#include <string>
#include <pthread.h>

using namespace std;
