forestrie trust level updates supporting confirmation of forestrie events #10

Merged: 2 commits, Jan 15, 2024
5 changes: 1 addition & 4 deletions .github/workflows/ci.yml
@@ -30,10 +30,7 @@ jobs:
- name: Generate, Test and Export
run: |
# Note: it is by design that we don't use the builder
task apis:bootstrap
task apis:generate
task apis:test
task apis:export
task local-all

- name: Show exports
working-directory: exported/
25 changes: 25 additions & 0 deletions .github/workflows/workflow.yml
@@ -0,0 +1,25 @@
name: Developer Workflow Tests

on:
# we only run these on pull request as they significantly hinder developer
# feedback if they are triggered on push
pull_request:
branches:
- main
jobs:
docker-all:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Setup go-task
uses: arduino/setup-task@v1
with:
version: 3.x
repo-token: ${{ secrets.GITHUB_TOKEN }}
- name: docker all
run: |

echo "TODO: permissions on accessing our acr prevent this from working"
exit 0
# ensure the all target (which runs everything in docker) works
# task all
31 changes: 23 additions & 8 deletions README.md
@@ -2,6 +2,26 @@

Common public api definitions for the DataTrails platform

## Finding and including proto files for dependencies

tools/go.mod is the source of truth for all proto-providing dependencies. That file alone specifies the upstream version we are using *and* is used, via `go install`, to make the .proto files available locally.

This corresponds to the practice recommended by grpc-gateway and elsewhere:

1. **ALWAYS** use tools/go.mod to specify the dependency version.
2. Add the package to the `go install` command in the apis:preflight task.
3. If necessary, add a var for the new path in any-api **and** then add a reference to that var in the PROTO_INC var.

Following this practice removes the need for dual maintenance of dependency versions in the builder image. It also produces a build cycle that is significantly faster.
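The three steps above can be sketched as follows. The module path and version here are illustrative only, not the actual pinned dependencies of this repo:

```
// tools/go.mod (illustrative fragment)
module tools

go 1.21

// pins the version of every proto-providing dependency in one place
require github.com/grpc-ecosystem/grpc-gateway/v2 v2.19.0
```

The matching `apis:preflight` step would then run `go install` from inside the tools module (for example `cd tools && go install github.com/grpc-ecosystem/grpc-gateway/v2/protoc-gen-openapiv2`, shown for shape only), so the version resolves from tools/go.mod rather than an `@version` flag, and the module, with its .proto files, lands in the Go module cache where the include vars can reference it.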

Cross-repository builds in docker while using go.work to refer to locally modified sources don't work, and that go.work setup is essential for an efficient workflow.

## bootstrap proto files

The protos for protoc itself, the googleapis, and the grpc_health proxy are needed by almost everything, and they don't appear to be compatible with the tools/go.mod approach.

For this reason we curl the protos and make them available in our aggregate proto-includes archive.

## Workflow for updating common apis

### Ensure the go tool chain is setup on your host
@@ -46,12 +66,11 @@ If you want to iterate on *just* the helper go code and their tests, do one round of

`apis:bootstrap` -> `apis:clean:generated`

Then just iterate using `apis:build`

Then just iterate using `task apis:generate` and `task apis:test`

#### For avid

* task avid:xxx
See the README at avid/src/api/README.md

##### Build one api against locally cloned go-datatrails-common-api

@@ -60,8 +79,4 @@ The protos can be included exactly as they are found from a clone of go-datatrails-common-api
task apis:assetsv2-api \
DATATRAILS_COMMON_API="../../go-datatrails-common-api"

It is necessary however to run `task apis:bootstrap` after cloning go-datatrails-common

#### For forestrie

* task:xxx
It is necessary however to run `task apis:bootstrap` after cloning go-datatrails-common
40 changes: 30 additions & 10 deletions Taskfile.yml
@@ -68,21 +68,41 @@ tasks:
# Primary workflow tasks
### -------------------------
all:
desc: "do everything necessary after clone (or rebase)"
desc: "do everything necessary after clone (or rebase) in the builder"
cmds:
- task: builder-start
- defer: {task: builder-cleanup}
# These steps should exactly mirror the steps in .github/workflows/ci.yml
- docker exec -t {{.BUILD_CONTAINER}} task apis:bootstrap
- docker exec -t {{.BUILD_CONTAINER}} task apis:generate
- docker exec -t {{.BUILD_CONTAINER}} task apis:test
- docker exec -t {{.BUILD_CONTAINER}} task apis:export
- task: local-all

local-all:
desc: |
do everything necessary after clone (or rebase); faster as it RUNS ON THE NATIVE HOST

Note: This requires go and protoc installed on your host. If you are not
comfortable doing this then use the docker-based `all` task.

** The CI (ci.yml) uses this task and provides a clear illustration of the
pre-requisites. It does not use docker **

cmds:
- task: clean
- task: bootstrap
- task: generate
- task: apis:test
- task: apis:export

# 0. clean out generated and imported files
clean:
- rm -rf proto-include
- rm -rf exported
- task: apis:clean:generated

# 1. bootstrap (only needed post clone or rebase)
bootstrap:
- task: apis:bootstrap

# 2. generate
generate:
desc: generates all the artifacts we need pre-build
cmds:
- task: builder-start
- defer: {task: builder-cleanup}
# DO NOT add bootstrap here, use all instead
- docker exec -t {{.BUILD_CONTAINER}} task apis:generate
- task: apis:generate
1 change: 1 addition & 0 deletions datatrails-common-api/.gitignore
@@ -0,0 +1 @@
go.work
9 changes: 8 additions & 1 deletion datatrails-common-api/assets/v2/assets/enums.proto
@@ -5,8 +5,15 @@ option go_package="github.com/datatrails/go-datatrails-common-api-gen/assets/v2/assets;assets";
enum ConfirmationStatus {
CONFIRMATION_STATUS_UNSPECIFIED = 0;
PENDING = 1; // not yet committed
CONFIRMED = 2; // committed
CONFIRMED = 2; // committed. forestrie: "You can easily prove it changed"
FAILED = 3; // permanent failure

// Regarding the new statuses for forestrie, See
// https://github.com/datatrails/epic-8120-scalable-proof-mechanisms/blob/main/event-trust-levels.md
STORED = 4; // forestrie, "it's in the db"
COMMITTED = 5; // forestrie, "you can know if it's changed"
// We re-use the constant for CONFIRMED (above)
UNEQUIVOCAL = 6; // forestrie, "You can easily prove it was publicly available to all"
}

enum TrackedStatus {
20 changes: 18 additions & 2 deletions datatrails-common-api/assets/v2/assets/eventresponse.proto
@@ -7,6 +7,7 @@ import "protoc-gen-openapiv2/options/annotations.proto";
import "google/protobuf/timestamp.proto";
import "datatrails-common-api/assets/v2/assets/enums.proto";
import "datatrails-common-api/assets/v2/assets/principal.proto";
import "datatrails-common-api/assets/v2/assets/merklelogentry.proto";
import "datatrails-common-api/attribute/v2/attribute/attribute.proto";

message EventResponse {
@@ -102,9 +103,9 @@ message EventResponse {
read_only: true
}];

// timestamp operation has been committed on the blockchain
// timestamp for when the event was committed to a verifiable log
google.protobuf.Timestamp timestamp_committed = 7 [ (grpc.gateway.protoc_gen_openapiv2.options.openapiv2_field) = {
description: "time of event as recorded on blockchain"
description: "time of event as recorded in verifiable storage"
read_only: true
}];

@@ -135,6 +136,8 @@ message EventResponse {
type: STRING
}];

// NOTICE: We expect to retire simple hash and then remove all the top level dlt fields.

// hash of transaction committing this operation on blockchain
string transaction_id = 11 [ (grpc.gateway.protoc_gen_openapiv2.options.openapiv2_field) = {
description: "hash of the transaction as a hex string `0x11bf5b37e0b842e08dcfdc8c4aefc000`"
@@ -164,4 +167,17 @@ message EventResponse {
"Identity of the tenant that created this event"
max_length: 1024
}];

// An event has exactly one proof mechanism. This field captures the proof
// mechanism specific details supporting the trustworthiness of the event
// record. We anticipate at least two proof mechanisms: merkle_log and
// verkle_log. We use oneof to avoid repeating the scattering of randomly
// re-purposed fields we currently have for simple hash vs khipu.
oneof proof_details {
MerklLogEntry merklelog_entry = 19 [ (grpc.gateway.protoc_gen_openapiv2.options.openapiv2_field) = {
description:
"verifiable merkle mmr log entry details"
max_length: 1024
}];
};
}
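For orientation on the `proof_details` field added above: protoc-gen-go renders a oneof as an unexported interface plus one wrapper struct per member, and consumers dispatch with a type switch. The sketch below models that shape with hand-written stand-ins; none of these identifiers are the actual generated names:

```go
package main

import "fmt"

// Minimal hand-written model of how protoc-gen-go renders a oneof:
// one unexported interface, one wrapper struct per member.
type isProofDetails interface{ isProofDetails() }

// Stand-in for the merkle log entry message (illustrative field only).
type MerkleLogEntry struct{ LogVersion uint32 }

// Wrapper for the merkle_log member of the oneof.
type ProofDetailsMerkleLog struct{ Entry *MerkleLogEntry }

func (*ProofDetailsMerkleLog) isProofDetails() {}

// describe dispatches on the oneof the way generated-code consumers do.
func describe(d isProofDetails) string {
	switch v := d.(type) {
	case *ProofDetailsMerkleLog:
		return fmt.Sprintf("merkle_log v%d", v.Entry.LogVersion)
	default:
		// events from the simple-hash era carry no proof details
		return "no proof details"
	}
}

func main() {
	fmt.Println(describe(&ProofDetailsMerkleLog{Entry: &MerkleLogEntry{LogVersion: 1}}))
	fmt.Println(describe(nil))
}
```

A future verkle_log member would add one more wrapper type and one more case, leaving existing consumers untouched.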
53 changes: 53 additions & 0 deletions datatrails-common-api/assets/v2/assets/merklelogentry.proto
@@ -0,0 +1,53 @@
// Maintainers, please refer to the style guide here:
// https://developers.google.com/protocol-buffers/docs/style
syntax = "proto3";
package archivist.v2;
option go_package="github.com/datatrails/go-datatrails-common-api-gen/assets/v2/assets;assets";
import "google/protobuf/timestamp.proto";

// MerkleLogCommit provides the log entry details for a single mmr leaf.
message MerkleLogCommit {
/* The mmr index */
uint64 index = 3;
/* The mmr *leaf* index */
uint64 leaf_index = 4; // TBD: this may be redundant.
/* time ordered and strictly unique per tenant. system wide
* unique with very reasonable operational assumptions. */
fixed64 idtimestamp = 5;
}

// The message sent from forestrie to avid notifying that the corresponding
// event is committed to the tenant's log.
message MerkleLogCommitMessage {

// The tenant identity and the event identity for the committed event.
string tenant_identity = 1;
string event_identity = 2;
/* The time portion of idtimestamp that contributed to the hash of the event
 * (the idtimestamp is _also_ included).
 * This must be copied into event.timestamp_committed when the saas db is updated */
google.protobuf.Timestamp timestamp = 6;

uint32 log_version = 3;
uint32 log_epoch = 4;
MerkleLogCommit commit = 5;
}



// The details stored in the SaaS db for a proof mech MERKLE_LOG commitment
message MerklLogEntry {
Review comment (Contributor): should this be MerkleLogEntry? and also can we do same with commited -> commit here?


// The tenant log version and epoch when the log entry was created.
uint32 log_version = 1;
uint32 log_epoch = 2;

// Event trust level committed fields
MerkleLogCommit committed = 3;

// TODO: Event trust level confirmed fields

// signature over tenant mmr root

// TODO: Event trust level unequivocal fields
}
1 change: 0 additions & 1 deletion datatrails-common-api/assets/v2/assets/service.proto
@@ -5,7 +5,6 @@ package archivist.v2;
option go_package="github.com/datatrails/go-datatrails-common-api-gen/assets/v2/assets;assets";

import "google/api/annotations.proto";
import "validate/validate.proto";
import "protoc-gen-openapiv2/options/annotations.proto";

import "datatrails-common-api/assets/v2/assets/assetresponse.proto";
2 changes: 1 addition & 1 deletion datatrails-common-api/go.mod
@@ -8,7 +8,7 @@ go 1.21
// This allows this module to operate as though it were the generated module. This
// allows us to manage the proto tool dependencies via this go.mod. This go.mod
// is also used as the go.mod for the generated package.
replace github.com/datatrails/go-datatrails-common-api-gen => ./
// replace github.com/datatrails/go-datatrails-common-api-gen => ./

require (
github.com/datatrails/go-datatrails-common v0.10.2
3 changes: 0 additions & 3 deletions getproto.sh

This file was deleted.
