diff --git a/CODE_OF_CONDUCT.md b/CODE_OF_CONDUCT.md index ffe4772..81b07b3 100644 --- a/CODE_OF_CONDUCT.md +++ b/CODE_OF_CONDUCT.md @@ -1,4 +1,4 @@ -RKVST observes the [CNCF Community Code of Conduct](https://github.com/cncf/foundation/blob/master/code-of-conduct.md), reproduced below for emphasis. +DataTrails observes the [CNCF Community Code of Conduct](https://github.com/cncf/foundation/blob/master/code-of-conduct.md), reproduced below for emphasis. ### Contributor Code of Conduct @@ -32,7 +32,7 @@ Conduct may be permanently removed from the project team. This code of conduct applies both within project spaces and in public spaces when an individual is representing the project or its community. -Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting an RKVST administrator on . +Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting a DataTrails administrator on . This Code of Conduct is adapted from the Contributor Covenant (http://contributor-covenant.org), version 1.2.0, available at diff --git a/DEVELOPERS.md b/DEVELOPERS.md index c54be1b..6ff577f 100644 --- a/DEVELOPERS.md +++ b/DEVELOPERS.md @@ -1,6 +1,6 @@ -# rkvst-samples - developers +# datatrails-samples - developers -Sample python code that uses the rkvst python SDK to manage particular types of assets +Sample python code that uses the datatrails python SDK to manage particular types of assets such as 'doors', 'cards', 'containers' etc. This document describes how to test any modifications made to the codebase. @@ -12,7 +12,7 @@ Required tools for modifying this repo are task-runner and docker-ce. - Install task runner: https://github.com/go-task/task - Install docker-ce: https://docs.docker.com/get-docker/ -A running rkvst instance which allows creation of arbitrary assets etc. This is usually +A running datatrails instance which allows creation of arbitrary assets etc. 
This is usually a test or demo system and **not** a production system. # Running the samples code @@ -32,10 +32,10 @@ task api ## Authorization Add a token to the file credentials/.auth_token and set some environment vars to -specify the rkvst endpoint: +specify the datatrails endpoint: ```bash -export TEST_ARCHIVIST="https://app.rkvst.io" +export TEST_ARCHIVIST="https://app.datatrails.ai" export TEST_AUTHTOKEN_FILENAME=credentials/.auth_token export TEST_NAMESPACE="unique label" export TEST_VERBOSE=-v diff --git a/LICENSE b/LICENSE index 12f8066..85fa3e3 100644 --- a/LICENSE +++ b/LICENSE @@ -1,6 +1,6 @@ MIT License -Copyright (c) 2019-2022 RKVST +Copyright (c) 2019-2022 DataTrails Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal diff --git a/README.md b/README.md index 0298ba8..cbe5dad 100644 --- a/README.md +++ b/README.md @@ -1,6 +1,6 @@ -# rkvst-samples +# datatrails-samples -Sample python code that uses the rkvst python SDK to manage particular types of assets +Sample python code that uses the datatrails python SDK to manage particular types of assets such as 'doors', 'cards', 'containers' etc. # Installing the samples code @@ -10,7 +10,7 @@ Python 3.8 and later versions are supported. Use the standard python pip utility: ```bash -python3 -m pip install --user rkvst-samples +python3 -m pip install --user datatrails-samples ``` and this will create 7 entry points: @@ -30,7 +30,7 @@ Add a token to the file credentials/.auth_token and set some environment vars to specify the archivist endpoint: ```bash -export TEST_ARCHIVIST="https://app.rkvst.io" +export TEST_ARCHIVIST="https://app.datatrails.ai" export TEST_AUTHTOKEN_FILENAME=credentials/.auth_token export TEST_NAMESPACE="unique label" export TEST_VERBOSE=-v @@ -44,7 +44,7 @@ TEST_PROOF_MECHANISM should be "KHIPU" or "SIMPLE_HASH". 
If unspecified the defa Windows using Powershell - at the command prompt set values for environment variables: ```bash -$Env:TEST_ARCHIVIST="https://app.rkvst.io" +$Env:TEST_ARCHIVIST="https://app.datatrails.ai" $Env:TEST_AUTHTOKEN_FILENAME = '' $Env:TEST_NAMESPACE = Get-Date -UFormat %s $Env:TEST_VERBOSE = '-v' diff --git a/Taskfile.yml b/Taskfile.yml index 07ee452..dd774cb 100644 --- a/Taskfile.yml +++ b/Taskfile.yml @@ -12,27 +12,27 @@ tasks: api: desc: Build a docker environment with the right dependencies and utilities cmds: - - docker build --no-cache --build-arg VERSION=3.8 -f Dockerfile -t rkvst-samples-api . + - docker build --no-cache --build-arg VERSION=3.8 -f Dockerfile -t datatrails-samples-api . api-3.9: desc: Build a docker environment with the right dependencies and utilities cmds: - - docker build --no-cache --build-arg VERSION=3.9 -f Dockerfile -t rkvst-samples-api . + - docker build --no-cache --build-arg VERSION=3.9 -f Dockerfile -t datatrails-samples-api . api-3.10: desc: Build a docker environment with the right dependencies and utilities cmds: - - docker build --no-cache --build-arg VERSION=3.10 -f Dockerfile -t rkvst-samples-api . + - docker build --no-cache --build-arg VERSION=3.10 -f Dockerfile -t datatrails-samples-api . api-3.11: desc: Build a docker environment with the right dependencies and utilities cmds: - - docker build --no-cache --build-arg VERSION=3.11 -f Dockerfile -t rkvst-samples-api . + - docker build --no-cache --build-arg VERSION=3.11 -f Dockerfile -t datatrails-samples-api . api-3.12: desc: Build a docker environment with the right dependencies and utilities cmds: - - docker build --no-cache --build-arg VERSION=3.12 -f Dockerfile -t rkvst-samples-api . + - docker build --no-cache --build-arg VERSION=3.12 -f Dockerfile -t datatrails-samples-api . 
check: desc: Check the style, bug and quality of the code diff --git a/archivist_samples/c2pa/README.md b/archivist_samples/c2pa/README.md index 8138a19..ae58599 100644 --- a/archivist_samples/c2pa/README.md +++ b/archivist_samples/c2pa/README.md @@ -1,6 +1,6 @@ # C2PA Sample -The purpose of this sample is to demonstrate how one can record and trace the lifecycle of an embedded file manifest within RKVST thus providing a historical chain of events of wanted or unwanted changes and/or updates. +The purpose of this sample is to demonstrate how one can record and trace the lifecycle of an embedded file manifest within DataTrails thus providing a historical chain of events of wanted or unwanted changes and/or updates. Note: For clarity and simplicity this sample does not show how to create the C2PA manifest itself, but rather uses pre-prepared files generated by the standard c2patool. @@ -8,9 +8,9 @@ Note: For clarity and simplicity this sample does not show how to create the C2P * Python 3.8 and later versions are supported. -* Install the [RKVST samples Python package](https://pypi.org/project/rkvst-samples/ "PyPi package page") +* Install the [DataTrails samples Python package](https://pypi.org/project/datatrails-samples/ "PyPi package page") -* Get an authorization bearer token and store it in the file `credentials/.auth_token`. If you don't know how to do this, please refer to the [RKVST documentation](https://docs.rkvst.com/docs/rkvst-basics/getting-access-tokens-using-app-registrations/ "Getting an auth token"). Make sure that the `credentials` folder is suitably restricted by disallowing root and group access. +* Get an authorization bearer token and store it in the file `credentials/.auth_token`. If you don't know how to do this, please refer to the [DataTrails documentation](https://docs.datatrails.ai/docs/datatrails-basics/getting-access-tokens-using-app-registrations/ "Getting an auth token"). 
Make sure that the `credentials` folder is suitably restricted by disallowing root and group access. ## Running the sample diff --git a/archivist_samples/c2pa/c2pa.py b/archivist_samples/c2pa/c2pa.py index 899d8be..7c2be82 100644 --- a/archivist_samples/c2pa/c2pa.py +++ b/archivist_samples/c2pa/c2pa.py @@ -27,7 +27,7 @@ def upload_attachment(arch, attachment_description: AttachmentDescription): blob = arch.attachments.upload(fd) attachment = { # sample-specific attr to relay attachment name - "rkvst_samples_display_name": attachment_description.attribute_name, + "datatrails_samples_display_name": attachment_description.attribute_name, "arc_file_name": attachment_description.filename, "arc_attribute_type": "arc_attachment", "arc_blob_identity": blob["identity"], diff --git a/archivist_samples/c2pa/run.py b/archivist_samples/c2pa/run.py index 62d5224..7556e97 100644 --- a/archivist_samples/c2pa/run.py +++ b/archivist_samples/c2pa/run.py @@ -17,7 +17,7 @@ def run(arch, args): """ runs the sample and returns the system error code. """ - LOGGER.info("Using version %s of rkvst-archivist", about.__version__) + LOGGER.info("Using version %s of datatrails-archivist", about.__version__) LOGGER.info("Fetching use case test assets namespace %s", args.namespace) # remove the trailing / on the url if it exists diff --git a/archivist_samples/document/README.md b/archivist_samples/document/README.md index c3d2d49..0735f83 100644 --- a/archivist_samples/document/README.md +++ b/archivist_samples/document/README.md @@ -1,6 +1,6 @@ # Document Lineage Sample -RKVST offers complete document lineage. +DataTrails offers complete document lineage. This sample focuses on an invoice document for a fabricated Asteroid Mining Company. @@ -19,9 +19,9 @@ If the document contains sensitive information, it is also possible to just prov * Python 3.8 and later versions are supported. 
-* Install the [RKVST samples Python package](https://pypi.org/project/rkvst-samples/ "PyPi package page") +* Install the [DataTrails samples Python package](https://pypi.org/project/datatrails-samples/ "PyPi package page") -* Get an authorization bearer token and store it in the file `credentials/.auth_token`. If you don't know how to do this, please refer to the [RKVST documentation](https://docs.rkvst.com/docs/rkvst-basics/getting-access-tokens-using-app-registrations/ "Getting an auth token"). Make sure that the `credentials` folder is suitably restricted by disallowing root and group access. +* Get an authorization bearer token and store it in the file `credentials/.auth_token`. If you don't know how to do this, please refer to the [DataTrails documentation](https://docs.datatrails.ai/docs/datatrails-basics/getting-access-tokens-using-app-registrations/ "Getting an auth token"). Make sure that the `credentials` folder is suitably restricted by disallowing root and group access. ## Running the sample diff --git a/archivist_samples/document/document.py b/archivist_samples/document/document.py index c00f31c..94a8d0e 100644 --- a/archivist_samples/document/document.py +++ b/archivist_samples/document/document.py @@ -27,7 +27,7 @@ def upload_attachment(arch, attachment_description: AttachmentDescription): blob = arch.attachments.upload(fd) attachment = { # sample-specific attr to relay attachment name - "rkvst_samples_display_name": attachment_description.attribute_name, + "datatrails_samples_display_name": attachment_description.attribute_name, "arc_file_name": attachment_description.filename, "arc_attribute_type": "arc_attachment", "arc_blob_identity": blob["identity"], diff --git a/archivist_samples/document/run.py b/archivist_samples/document/run.py index 0961960..9616572 100644 --- a/archivist_samples/document/run.py +++ b/archivist_samples/document/run.py @@ -17,7 +17,7 @@ def run(arch, args): """ runs the sample and returns the system error code. 
""" - LOGGER.info("Using version %s of rkvst-archivist", about.__version__) + LOGGER.info("Using version %s of datatrails-archivist", about.__version__) LOGGER.info("Fetching use case test assets namespace %s", args.namespace) # remove the trailing / on the url if it exists diff --git a/archivist_samples/door_entry/README.md b/archivist_samples/door_entry/README.md index d28d218..ea4b535 100644 --- a/archivist_samples/door_entry/README.md +++ b/archivist_samples/door_entry/README.md @@ -2,23 +2,23 @@ Access control is not just an issue for files and computer systems: when it comes to connected door locks it's a real world issue. Various stakeholders (landlords, building services, police, delivery companies) have legitimate reasons to enter the communal areas of shared buildings but the residents and owners of that building also need to be sure of their safety and privacy. How do you makes sure that privileged access to buildings is enabled whilst also holding the authorities to account and preventing abuses? -RKVST Data Assurance Hub offers a solution to this problem: by ensuring that all stakeholders have a transparent view of privileged access events, abuses are discouraged and quickly discovered. By combining virtual world evidence (cryptographic identities, timestamps) with real world evidence (photographs from the built-in camera from the door access system) a very strong record of when who accessed a building is maintained and made available for audit and dispute resolution. +DataTrails Data Assurance Hub offers a solution to this problem: by ensuring that all stakeholders have a transparent view of privileged access events, abuses are discouraged and quickly discovered. By combining virtual world evidence (cryptographic identities, timestamps) with real world evidence (photographs from the built-in camera of the door access system), a very strong record of who accessed a building, and when, is maintained and made available for audit and dispute resolution. 
-This sample simulates a set of smart connected door locks processing and reporting their privileged accesses through RKVST. Through simulating door access cards it also demonstrates RKVST's principle of building up asset provenance based on trusted Witness Statements rather than direct connection to assets, which provides much greater system visibility than traditional agent-based platforms. +This sample simulates a set of smart connected door locks processing and reporting their privileged accesses through DataTrails. Through simulating door access cards it also demonstrates DataTrails's principle of building up asset provenance based on trusted Witness Statements rather than direct connection to assets, which provides much greater system visibility than traditional agent-based platforms. ## Pre-requisites * Python 3.8 and later versions are supported. -* Install the RKVST samples package. If you are just trying out the pre-made samples you should get the official [RKVST samples Python package](https://pypi.org/project/rkvst-samples/ "PyPi package page") from PyPi. If you are modifying this sample and want to try out your changes then you'll need to rebuild the wheel: please refer to the developer instructions in the top level of this repository to see how to do that. +* Install the DataTrails samples package. If you are just trying out the pre-made samples you should get the official [DataTrails samples Python package](https://pypi.org/project/datatrails-samples/ "PyPi package page") from PyPi. If you are modifying this sample and want to try out your changes then you'll need to rebuild the wheel: please refer to the developer instructions in the top level of this repository to see how to do that. -* Get an authorization bearer token and store it in the file `credentials/.auth_token`. 
If you don't know how to do this, please refer to the [RKVST documentation](https://docs.rkvst.com/docs/rkvst-basics/getting-access-tokens-using-app-registrations/ "Getting an auth token"). Make sure that the `credentials` folder is suitably restricted by disallowing root and group access. +* Get an authorization bearer token and store it in the file `credentials/.auth_token`. If you don't know how to do this, please refer to the [DataTrails documentation](https://docs.datatrails.ai/docs/datatrails-basics/getting-access-tokens-using-app-registrations/ "Getting an auth token"). Make sure that the `credentials` folder is suitably restricted by disallowing root and group access. ## Running the sample -The Taskfile in the top level of this repository includes a pre-packaged run of this sample that creates a number of door access terminals and access cards and simulates privileged access events which can then be viewed and analysed in your RKVST tenancy. +The Taskfile in the top level of this repository includes a pre-packaged run of this sample that creates a number of door access terminals and access cards and simulates privileged access events which can then be viewed and analysed in your DataTrails tenancy. 
-Please refer to the instructions in the [top level README](https://github.com/rkvst/rkvst-samples#door-entry-control "door entry sample") +Please refer to the instructions in the [top level README](https://github.com/datatrails/datatrails-samples#door-entry-control "door entry sample") diff --git a/archivist_samples/door_entry/run.py b/archivist_samples/door_entry/run.py index 5c038b2..41f4b80 100644 --- a/archivist_samples/door_entry/run.py +++ b/archivist_samples/door_entry/run.py @@ -50,24 +50,24 @@ def attachment_create(doors, attachment_description: AttachmentDescription): #################################### -def create_rkvst_paris(doors): +def create_datatrails_paris(doors): # Unlike the others, which feature images of the whole building, # this one is actually a close-up of the connected door terminal return doors_creator( doors, - "RKVST front door", + "DataTrails front door", { "arc_firmware_version": "1.0", "arc_serial_number": "das-j1-01", "arc_description": ( "Electronic door entry system controlling the main " - "staff entrance to RKVST France" + "staff entrance to DataTrails France" ), - "wavestone_asset_id": "paris.france.rkvst.das", + "wavestone_asset_id": "paris.france.datatrails.das", }, location={ "props": { - "display_name": "RKVST Paris", + "display_name": "DataTrails Paris", "description": "Sales and sales support for the French region", "latitude": 48.8339211, "longitude": 2.371345, @@ -247,7 +247,7 @@ def create_gdn_side(doors): def create_doors(doors): LOGGER.info("Creating all doors...") doors_map = { - "rkvst_paris": create_rkvst_paris(doors), + "datatrails_paris": create_datatrails_paris(doors), "cityhall": create_cityhall(doors), "courts": create_courts(doors), "bastille": create_bastille(doors), @@ -575,7 +575,7 @@ def run(arch, args): """ runs the sample and returns the system error code. 
""" - LOGGER.info("Using version %s of rkvst-archivist", about.__version__) + LOGGER.info("Using version %s of datatrails-archivist", about.__version__) LOGGER.info("Fetching use case test assets namespace %s", args.namespace) doors = copy(arch) diff --git a/archivist_samples/estate_info/README.md b/archivist_samples/estate_info/README.md index 6734405..4e70a26 100644 --- a/archivist_samples/estate_info/README.md +++ b/archivist_samples/estate_info/README.md @@ -1,22 +1,22 @@ # Estate Info sample -One of the greatest benefits of RKVST is having a system-wide view of your entire asset estate in one place, enabling better informed, more confident decisions. +One of the greatest benefits of DataTrails is having a system-wide view of your entire asset estate in one place, enabling better informed, more confident decisions. -The `estate-info` sample very simply demonstrates how to read and enumerate Assets and Events from your RKVST tenancy. It also demonstrates various techniques for quickly counting assets and events without fetching the whole data set. +The `estate-info` sample very simply demonstrates how to read and enumerate Assets and Events from your DataTrails tenancy. It also demonstrates various techniques for quickly counting assets and events without fetching the whole data set. ## Pre-requisites * Python 3.8 and later versions are supported. -* Install the RKVST samples package. If you are just trying out the pre-made samples you should get the official [RKVST samples Python package](https://pypi.org/project/rkvst-samples/ "PyPi package page") from PyPi. If you are modifying this sample and want to try out your changes then you'll need to rebuild the wheel: please refer to the developer instructions in the top level of this repository to see how to do that. +* Install the DataTrails samples package. 
If you are just trying out the pre-made samples you should get the official [DataTrails samples Python package](https://pypi.org/project/datatrails-samples/ "PyPi package page") from PyPi. If you are modifying this sample and want to try out your changes then you'll need to rebuild the wheel: please refer to the developer instructions in the top level of this repository to see how to do that. -* Get an authorization bearer token and store it in the file `credentials/.auth_token`. If you don't know how to do this, please refer to the [RKVST documentation](https://docs.rkvst.com/docs/rkvst-basics/getting-access-tokens-using-app-registrations/ "Getting an auth token"). Make sure that the `credentials` folder is suitably restricted by disallowing root and group access. +* Get an authorization bearer token and store it in the file `credentials/.auth_token`. If you don't know how to do this, please refer to the [DataTrails documentation](https://docs.datatrails.ai/docs/datatrails-basics/getting-access-tokens-using-app-registrations/ "Getting an auth token"). Make sure that the `credentials` folder is suitably restricted by disallowing root and group access. ## Running the sample -The Taskfile in the top level of this repository includes a pre-packaged run of this sample that performs both a quick and deep count of the Assets and Events in your RKVST tenancy. +The Taskfile in the top level of this repository includes a pre-packaged run of this sample that performs both a quick and deep count of the Assets and Events in your DataTrails tenancy. 
-Please refer to the instructions in the [top level README](https://github.com/rkvst/rkvst-samples#manage-assets-and-events-and-check-for-any-inconsistencies "estate info sample") +Please refer to the instructions in the [top level README](https://github.com/datatrails/datatrails-samples#manage-assets-and-events-and-check-for-any-inconsistencies "estate info sample") diff --git a/archivist_samples/estate_info/main.py b/archivist_samples/estate_info/main.py index 801c8c4..2ddc9d3 100644 --- a/archivist_samples/estate_info/main.py +++ b/archivist_samples/estate_info/main.py @@ -25,7 +25,7 @@ def run(poc, args): """ runs the sample and returns the system error code. """ - LOGGER.info("Using version %s of rkvst-archivist", about.__version__) + LOGGER.info("Using version %s of datatrails-archivist", about.__version__) if args.quick_count: LOGGER.info("Number of events is %d", poc.events.count()) LOGGER.info("Number of assets is %d", poc.assets.count()) @@ -80,7 +80,7 @@ def run(poc, args): def main(): - parser = common_parser("Get basic information about your RKVST estate") + parser = common_parser("Get basic information about your DataTrails estate") # per example exclusive options here operations = parser.add_mutually_exclusive_group(required=True) diff --git a/archivist_samples/sample_scripts/c2pa/README.md b/archivist_samples/sample_scripts/c2pa/README.md index 7c84807..280461f 100644 --- a/archivist_samples/sample_scripts/c2pa/README.md +++ b/archivist_samples/sample_scripts/c2pa/README.md @@ -1,6 +1,6 @@ # Purpose -The purpose of this sample is to demonstrate how one can record and trace the lifecycle of an embedded file manifest within RKVST thus providing a historical chain of events of wanted or unwanted changes and/or updates. +The purpose of this sample is to demonstrate how one can record and trace the lifecycle of an embedded file manifest within DataTrails thus providing a historical chain of events of wanted or unwanted changes and/or updates. 
It creates 2 separate assets, controlled by different credentials, to show how potentially malicious redaction or stripping of provenance information can be detected and proven. @@ -22,11 +22,11 @@ C2PA Readme sections relevant to this script: There are two app registrations that represent each actor within the script. The information obtained by the app registration (client id and client secret) are referenced by using environment variables. -To create an RKVST App Registrtaion feel free to reference our [documentation](https://docs.rkvst.com/developers/developer-patterns/getting-access-tokens-using-app-registrations/). +To create a DataTrails App Registration, feel free to reference our [documentation](https://docs.datatrails.ai/developers/developer-patterns/getting-access-tokens-using-app-registrations/). -### Note: One does not have to create a JWT token for the REST API, just the app registrations in the RKVST tenant settings. +### Note: One does not have to create a JWT token for the REST API, just the app registrations in the DataTrails tenant settings. -Please set the below environment variables, they represent the client id and location of client secret for RKVST app registrations: +Please set the below environment variables; they represent the client id and the location of the client secret for DataTrails app registrations: ```bash export HONEST_CLIENT_ID=”client id for Honest Abe” @@ -35,14 +35,14 @@ export EVIL_CLIENT_ID=”client id for Evil Eddie” export EVIL_C2_CLIENT_SECRET_FILENAME=”credentials/.evil_secret” ``` -In addition, this script utilizes the RKVST Python3 SDK, located [here](https://github.com/rkvst/rkvst-python). This is not a requirement as RKVST APIs are code agnostic. +In addition, this script utilizes the DataTrails Python3 SDK, located [here](https://github.com/datatrails/datatrails-python). This is not a requirement as DataTrails APIs are code agnostic. 
# Scenario The scenario executed within this script involves two actors, one is good (Honest Abe) and one is nefarious (Evil Eddie). Honest Abe is a music journalist that is creating and recording asset-embedded manifest for digital content to be used for ACL (Austin City Limits) articles. Evil Eddie is a fellow colleague that likes to make changes to digital content that will be used for articles not written by him. Eddie believes his changes will make his co-workers articles better, however they often do not. -When changes are made, they are hard to track and cause timeline delays, as Eddie’s colleagues try to find the original digital content that an individual usually removes. Now that RKVST is being used, all asset-embedded manifests for digital content are recorded including: the original content, any changes/updates and related detail files. +When changes are made, they are hard to track and cause timeline delays, as Eddie’s colleagues try to find the original digital content that an individual usually removes. Now that DataTrails is being used, all asset-embedded manifests for digital content are recorded including: the original content, any changes/updates and related detail files. -There are two digital content assets recorded within RKVST one by Honest Abe and the other by Evil Eddie. Eddie and Abe have both recorded and published the same asset however Eddie has made changes to the manifest and began recording changes in RKVST alongside Abe. Now we have two assets that are the same, with the same journey and the same versions. Which one is the correct one? +There are two digital content assets recorded within DataTrails one by Honest Abe and the other by Evil Eddie. Eddie and Abe have both recorded and published the same asset however Eddie has made changes to the manifest and began recording changes in DataTrails alongside Abe. Now we have two assets that are the same, with the same journey and the same versions. Which one is the correct one? 
-This can be identified by using the Instaproof feature within RKVST and downloading the json files within the details event and locating the redacting information or downloading the images and using the [Verify](https://verify.contentauthenticity.org/inspect) tool, one will see “open” information has been removed from the content credentials. \ No newline at end of file +This can be identified by using the Instaproof feature within DataTrails and downloading the JSON files within the details event to locate the redaction information, or by downloading the images and using the [Verify](https://verify.contentauthenticity.org/inspect) tool; one will see that “open” information has been removed from the content credentials. diff --git a/archivist_samples/sample_scripts/c2pa/c2pa_verify.py b/archivist_samples/sample_scripts/c2pa/c2pa_verify.py index 48726b8..f3efc55 100644 --- a/archivist_samples/sample_scripts/c2pa/c2pa_verify.py +++ b/archivist_samples/sample_scripts/c2pa/c2pa_verify.py @@ -1,7 +1,7 @@ #!/usr/bin/env python3 # List of imports used for this script -# In addition this script uses RKVST Python3 SDK +# In addition this script uses the DataTrails Python3 SDK import os import os.path from os import getenv @@ -30,10 +30,10 @@ import importlib_resources as pkg_resources -# RKVST Connection Parameters -- Honest Abe +# DataTrails Connection Parameters -- Honest Abe # # The below are environment variables that are parameters used to connect -# to the production instance of RKVST. +# to the production instance of DataTrails. 
# # HONEST_CLIENT_ID = represents the client ID from an App Registration # HONEST_CLIENT_SECRET_FILENAME = represents location of client secret from an App Registration @@ -43,15 +43,17 @@ def honest_arch(): with open(client_secret_file, mode="r", encoding="utf-8") as tokenfile: client_secret = tokenfile.read().strip() - arch = Archivist("https://app.rkvst.io", (client_id, client_secret), max_time=300) + arch = Archivist( + "https://app.datatrails.ai", (client_id, client_secret), max_time=300 + ) return arch -# RKVST Connection Parameters -- Evil Eddie +# DataTrails Connection Parameters -- Evil Eddie # # The below are environment variables that are parameters used to connect -# to the production instance of RKVST. +# to the production instance of DataTrails. # # EVIL_CLIENT_ID = represents the client ID from an App Registration # EVIL_CLIENT_SECRET_FILENAME = represents location client secret from an App Registration @@ -61,12 +63,14 @@ def evil_arch(): with open(client_secret_file, mode="r", encoding="utf-8") as tokenfile: client_secret = tokenfile.read().strip() - arch = Archivist("https://app.rkvst.io", (client_id, client_secret), max_time=300) + arch = Archivist( + "https://app.datatrails.ai", (client_id, client_secret), max_time=300 + ) return arch -# Uploads attachments to RKVST +# Uploads attachments to DataTrails def upload_attachment(arch, path, name): with pkg_resources.open_binary(sample, path) as fd: blob = arch.attachments.upload(fd) @@ -81,7 +85,7 @@ def upload_attachment(arch, path, name): return attachment -# Creates a SHA256 hash value for documents that are uploaded to RKVST +# Creates a SHA256 hash value for documents that are uploaded to DataTrails def create_hash(path): with open(path, "rb") as f: data = f.read() @@ -90,13 +94,13 @@ def create_hash(path): return digest -# Creates a public Document Asset with a primary image and related attachments within RKVST +# Creates a public Document Asset with a primary image and related attachments 
within DataTrails # -# arc_primary_image = represents the primary image to be displayed within the RKVST user interface -# document_document = represents the attachments/document to be uploaed to RKVST +# arc_primary_image = represents the primary image to be displayed within the DataTrails user interface +# document_document = represents the attachments/document to be uploaded to DataTrails # -# For additional information regarding RKVST Document Profile see below: -# https://docs.rkvst.com/developers/developer-patterns/document-profile/ +# For additional information regarding DataTrails Document Profile see below: +# https://docs.datatrails.ai/developers/developer-patterns/document-profile/ def create_asset( arch, displayname, @@ -125,11 +129,11 @@ return arch.assets.create(props=props, attrs=attrs, confirm=True) -# Uploads primary image and related attachments to RKVST +# Uploads primary image and related attachments to DataTrails # Creates hash value for related attachments # Passes expected values to create_asset method # -# image = represents the digital content to be recorded within RKVST +# image = represents the digital content to be recorded within DataTrails # serial_num = represents unique Asset attribute (id) that can be referenced def create_c2docasset(arch, image): attachments = upload_attachment(arch, image, "arc_primary_image") @@ -386,7 +390,7 @@ def create_details(arch, id, image): # # Records two public Document Assets by two individuals: Honest Abe and Evil Eddie # -# Honest Abe and Evil Eddie reside in two separate RKVST tenancies with App Registrations that +# Honest Abe and Evil Eddie reside in two separate DataTrails tenancies with App Registrations that # represent each tenancy. 
# # arch = Honest Abe diff --git a/archivist_samples/sbom_document/README.md b/archivist_samples/sbom_document/README.md index 7e8572d..bb3f1fc 100644 --- a/archivist_samples/sbom_document/README.md +++ b/archivist_samples/sbom_document/README.md @@ -4,18 +4,18 @@ Maintaining and publishing an accurate Software Bill of Materials (SBOM) is an e In its [recommendations for the minimum required elements of an SBOM](https://www.ntia.gov/report/2021/minimum-elements-software-bill-materials-sbom "NTIA recommendations"). the NTIA identifies the need to balance transparency with access controls (_"SBOMs should be available in a timely fashion to those who need them and must have appropriate access permissions and roles in place"_), and illustrates in its [NTIA SBOM Proof of Concept](https://www.ntia.doc.gov/files/ntia/publications/ntia_sbom_energy_pocplanning.pdf "NTIA Energy PoC Presentation") the need for strong stakeholder community management and a trusted SBOM data sharing mechanism which protects the interests of all parties. -RKVST Data Assurance Hub offers a solution to this sharing and distribution problem: vendors retain control of their proprietary information and release processes while customers have assured and reliable visibility into their digital supply chain risks with reliable access to current and historical SBOM data for the components they rely on. +DataTrails Data Assurance Hub offers a solution to this sharing and distribution problem: vendors retain control of their proprietary information and release processes while customers have assured and reliable visibility into their digital supply chain risks with reliable access to current and historical SBOM data for the components they rely on. -This sample shows how to quickly get started with integrating your build and SBOM generation process with RKVST Data Assurance Hub. 
+This sample shows how to quickly get started with integrating your build and SBOM generation process with DataTrails Data Assurance Hub. ## Pre-requisites * Python 3.8 and later versions are supported. -* Install the [RKVST samples Python package](https://pypi.org/project/rkvst-samples/ "PyPi package page") +* Install the [DataTrails samples Python package](https://pypi.org/project/datatrails-samples/ "PyPi package page") -* Get an authorization bearer token and store it in the file `credentials/.auth_token`. If you don't know how to do this, please refer to the [RKVST documentation](https://docs.rkvst.com/docs/rkvst-basics/getting-access-tokens-using-app-registrations/ "Getting an auth token"). Make sure that the `credentials` folder is suitably restricted by disallowing root and group access. +* Get an authorization bearer token and store it in the file `credentials/.auth_token`. If you don't know how to do this, please refer to the [DataTrails documentation](https://docs.datatrails.ai/docs/datatrails-basics/getting-access-tokens-using-app-registrations/ "Getting an auth token"). Make sure that the `credentials` folder is suitably restricted by disallowing root and group access. ## Running the sample @@ -32,7 +32,7 @@ archivist_samples_sbom_docment[-v] A SoftwarePackageDocument represents the published version history of the evolving Software Bill of Materials for a product line. -This Python class makes it easy to manage SBOM distribution in RKVST and publish the [NTIA minimum required SBOM information](https://www.ntia.gov/report/2021/minimum-elements-software-bill-materials-sbom "NTIA recommendations") for your own SBOMs, regardless of how you generated them. +This Python class makes it easy to manage SBOM distribution in DataTrails and publish the [NTIA minimum required SBOM information](https://www.ntia.gov/report/2021/minimum-elements-software-bill-materials-sbom "NTIA recommendations") for your own SBOMs, regardless of how you generated them. 
### Creating a new SoftwarePackageDocument @@ -40,7 +40,7 @@ This Python class makes it easy to manage SBOM distribution in RKVST and publish To create a brand new SBOM Asset and begin tracking and sharing the release history of a product line, use `SoftwarePackageDocument.create()`: ```python - # Binaries such as images and SBOM XML need to be uploaded to RKVST first + # Binaries such as images and SBOM XML need to be uploaded to DataTrails first def upload_attachment(arch, path, name): with open(f"sbom_files/{path}", "r") as fd: blob = arch.attachments.upload(fd) @@ -52,7 +52,7 @@ To create a brand new SBOM Asset and begin tracking and sharing the release hist } return attachment - # Instantiate SoftwarePackageDocument object and create an RKVST record to begin + # Instantiate SoftwarePackageDocument object and create a DataTrails record to begin # tracing and publishing its version history package = SoftwarePackageDocument(arch) package.create( @@ -65,7 +65,7 @@ To create a brand new SBOM Asset and begin tracking and sharing the release hist ### Loading an existing SoftwarePackageDocument -If you know the RKVST Asset Identity you can load the SBOM directly as a SoftwarePackageDocument using `SoftwarePackageDocument.read()`: +If you know the DataTrails Asset Identity you can load the SBOM directly as a SoftwarePackageDocument using `SoftwarePackageDocument.read()`: ```python # Assume Archivist connection already initialized in `arch` @@ -73,7 +73,7 @@ package = SoftwarePackageDocument(arch) package.read("assets/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx") ``` -If you do not know the RKVST Asset Identity then you can load the SBOM based on any unique set of attributes using `SoftwarePackageDocument.read_by_signature()`: +If you do not know the DataTrails Asset Identity then you can load the SBOM based on any unique set of attributes using `SoftwarePackageDocument.read_by_signature()`: ```python # Assume Archivist connection already initialized in `arch` @@ -84,7 +84,7
@@ package.read_by_signature({"sbom_uuid": "com.acme.rrd2013-ce-sp1-v4-1-5-0"}) ### Making a release -When a new official release is issued, update the version history in RKVST with `SoftwarePackageDocument.publish()`: +When a new official release is issued, update the version history in DataTrails with `SoftwarePackageDocument.publish()`: ```python # Assume Archivist connection already initialized in `arch` diff --git a/archivist_samples/sbom_document/run.py b/archivist_samples/sbom_document/run.py index 566f718..d1effdc 100644 --- a/archivist_samples/sbom_document/run.py +++ b/archivist_samples/sbom_document/run.py @@ -16,10 +16,10 @@ def run(arch, args): """ runs the sample and returns the system error code. """ - LOGGER.info("Using version %s of rkvst-archivist", about.__version__) + LOGGER.info("Using version %s of datatrails-archivist", about.__version__) LOGGER.info("Fetching use case test assets namespace %s", args.namespace) - # SoftwarePackage class encapsulates SBOM object in RKVST + # SoftwarePackage class encapsulates SBOM object in DataTrails package_name = "ACME Detector Coyote SP1" LOGGER.info("Creating Software Package Asset...: %s", package_name) package = SoftwarePackageDocument(arch) diff --git a/archivist_samples/signed_records/README.md b/archivist_samples/signed_records/README.md index e521d10..8ccea87 100644 --- a/archivist_samples/signed_records/README.md +++ b/archivist_samples/signed_records/README.md @@ -4,22 +4,22 @@ A key aspect of Zero Trust architectures is the removal of inherent trust in any One such strong technology is to use device certificates or cryptographic keys on your IoT devices and use them to authenticate messages, proving that the message came from a particular piece of hardware. This of course is not foolproof: a compromised device with a strong key can send some very secure bad messages! But it is a strong defence against simple network-borne attacks like spoofing and man-in-the-middle forgeries. 
-RKVST already supports high integrity authentication from secure devices through its flexible command authorization scheme (leading to the 'who' in 'who did what when') but this can be added to by also signing the message contents (the 'what') with a device key. This signature is then verifiable independent of RKVST and provides additional proof of authenticity when it's needed. +DataTrails already supports high integrity authentication from secure devices through its flexible command authorization scheme (leading to the 'who' in 'who did what when') but this can be added to by also signing the message contents (the 'what') with a device key. This signature is then verifiable independent of DataTrails and provides additional proof of authenticity when it's needed. -The `signed-records` sample demonstrates how to integrate message-level signatures from a secure-by-default device into RKVST records, providing an independent measure of integrity and provenance on messages. +The `signed-records` sample demonstrates how to integrate message-level signatures from a secure-by-default device into DataTrails records, providing an independent measure of integrity and provenance on messages. ## Pre-requisites * Python 3.8 and later versions are supported. -* Install the RKVST samples package. If you are just trying out the pre-made samples you should get the official [RKVST samples Python package](https://pypi.org/project/rkvst-samples/ "PyPi package page") from PyPi. If you are modifying this sample and want to try out your changes then you'll need to rebuild the wheel: please refer to the developer instructions in the top level of this repository to see how to do that. +* Install the DataTrails samples package. If you are just trying out the pre-made samples you should get the official [DataTrails samples Python package](https://pypi.org/project/datatrails-samples/ "PyPi package page") from PyPi. 
If you are modifying this sample and want to try out your changes then you'll need to rebuild the wheel: please refer to the developer instructions in the top level of this repository to see how to do that. -* Get an authorization bearer token and store it in the file `credentials/.auth_token`. If you don't know how to do this, please refer to the [RKVST documentation](https://docs.rkvst.com/docs/rkvst-basics/getting-access-tokens-using-app-registrations/ "Getting an auth token"). Make sure that the `credentials` folder is suitably restricted by disallowing root and group access. +* Get an authorization bearer token and store it in the file `credentials/.auth_token`. If you don't know how to do this, please refer to the [DataTrails documentation](https://docs.datatrails.ai/docs/datatrails-basics/getting-access-tokens-using-app-registrations/ "Getting an auth token"). Make sure that the `credentials` folder is suitably restricted by disallowing root and group access. ## Running the sample The Taskfile in the top level of this repository includes a pre-packaged run of this sample that creates a simulated secure-by-default IoT device, creates an Event with a good signature and another with a hacked signature, and then shows how to verify them. 
-Please refer to the instructions in the [top level README](https://github.com/rkvst/rkvst-samples#signed-records "signed records sample") +Please refer to the instructions in the [top level README](https://github.com/datatrails/datatrails-samples#signed-records "signed records sample") diff --git a/archivist_samples/signed_records/main.py b/archivist_samples/signed_records/main.py index d557f17..a151ec6 100644 --- a/archivist_samples/signed_records/main.py +++ b/archivist_samples/signed_records/main.py @@ -148,7 +148,7 @@ def generate_crypto_asset(archivist, asset_name): # samples for broader and richer use of asset attributes attrs = { "arc_display_name": asset_name, - "arc_description": "Sample cryptographic asset for RKVST", + "arc_description": "Sample cryptographic asset for DataTrails", "arc_display_type": "Crypto endpoint", "arc_evidence_signing_pubkey": pubkey_pem.decode("utf-8"), } @@ -314,7 +314,7 @@ def run(arch, args): """ runs the sample and returns the system error code. """ - LOGGER.info("Using version %s of rkvst-archivist", about.__version__) + LOGGER.info("Using version %s of datatrails-archivist", about.__version__) if args.namespace: asset_name = "-".join(["signed-records", args.namespace]) else: @@ -325,7 +325,7 @@ def run(arch, args): if args.create_asset: # Don't create if there's already an asset record with this name. - # This is not strictly necessary - the RKVST system + # This is not strictly necessary - the DataTrails system # does not require arc_display_name to be unique - but to keep # things simple we'll avoid duplicates here. 
if asset_exists(arch, asset_name): diff --git a/archivist_samples/software_bill_of_materials/README.md b/archivist_samples/software_bill_of_materials/README.md index 25d2597..cfd3b0c 100644 --- a/archivist_samples/software_bill_of_materials/README.md +++ b/archivist_samples/software_bill_of_materials/README.md @@ -4,18 +4,18 @@ Maintaining and publishing an accurate Software Bill of Materials (SBOM) is an e In its [recommendations for the minimum required elements of an SBOM](https://www.ntia.gov/report/2021/minimum-elements-software-bill-materials-sbom "NTIA recommendations"). the NTIA identifies the need to balance transparency with access controls (_"SBOMs should be available in a timely fashion to those who need them and must have appropriate access permissions and roles in place"_), and illustrates in its [NTIA SBOM Proof of Concept](https://www.ntia.doc.gov/files/ntia/publications/ntia_sbom_energy_pocplanning.pdf "NTIA Energy PoC Presentation") the need for strong stakeholder community management and a trusted SBOM data sharing mechanism which protects the interests of all parties. -RKVST Data Assurance Hub offers a solution to this sharing and distribution problem: vendors retain control of their proprietary information and release processes while customers have assured and reliable visibility into their digital supply chain risks with reliable access to current and historical SBOM data for the components they rely on. +DataTrails Data Assurance Hub offers a solution to this sharing and distribution problem: vendors retain control of their proprietary information and release processes while customers have assured and reliable visibility into their digital supply chain risks with reliable access to current and historical SBOM data for the components they rely on. -This sample shows how to quickly get started with integrating your build and SBOM generation process with RKVST Data Assurance Hub. 
+This sample shows how to quickly get started with integrating your build and SBOM generation process with DataTrails Data Assurance Hub. ## Pre-requisites * Python 3.8 and later versions are supported. -* Install the [RKVST samples Python package](https://pypi.org/project/rkvst-samples/ "PyPi package page") +* Install the [DataTrails samples Python package](https://pypi.org/project/datatrails-samples/ "PyPi package page") -* Get an authorization bearer token and store it in the file `credentials/.auth_token`. If you don't know how to do this, please refer to the [RKVST documentation](https://docs.rkvst.com/docs/rkvst-basics/getting-access-tokens-using-app-registrations/ "Getting an auth token"). Make sure that the `credentials` folder is suitably restricted by disallowing root and group access. +* Get an authorization bearer token and store it in the file `credentials/.auth_token`. If you don't know how to do this, please refer to the [DataTrails documentation](https://docs.datatrails.ai/docs/datatrails-basics/getting-access-tokens-using-app-registrations/ "Getting an auth token"). Make sure that the `credentials` folder is suitably restricted by disallowing root and group access. ## Running the sample @@ -32,7 +32,7 @@ archivist_samples_software_bill_of_materials [-v] A SoftwarePackage represents the published version history of the evolving Software Bill of Materials for a product line. -This Python class makes it easy to manage SBOM distribution in RKVST and publish the [NTIA minimum required SBOM information](https://www.ntia.gov/report/2021/minimum-elements-software-bill-materials-sbom "NTIA recommendations") for your own SBOMs, regardless of how you generated them. +This Python class makes it easy to manage SBOM distribution in DataTrails and publish the [NTIA minimum required SBOM information](https://www.ntia.gov/report/2021/minimum-elements-software-bill-materials-sbom "NTIA recommendations") for your own SBOMs, regardless of how you generated them. 
### Creating a new SoftwarePackage @@ -40,7 +40,7 @@ This Python class makes it easy to manage SBOM distribution in RKVST and publish To create a brand new SBOM Asset and begin tracking and sharing the release history of a product line, use `SoftwarePackage.create()`: ```python - # Binaries such as images and SBOM XML need to be uploaded to RKVST first + # Binaries such as images and SBOM XML need to be uploaded to DataTrails first def attachment_create(sboms, attachment_description: AttachmentDescription): LOGGER.info("sbom attachment creator: %s", attachment_description.filename) with resources.open_binary(sbom_files, attachment_description.filename) as fd: @@ -55,7 +55,7 @@ To create a brand new SBOM Asset and begin tracking and sharing the release hist } return result - # Instantiate SoftwarePackage object and create an RKVST record to begin + # Instantiate SoftwarePackage object and create a DataTrails record to begin # tracing and publishing its version history package_name = "ACME Detector Coyote SP1" LOGGER.info("Creating Software Package Asset...: %s", package_name) @@ -80,7 +80,7 @@ To create a brand new SBOM Asset and begin tracking and sharing the release hist ### Loading an existing SoftwarePackage -If you know the RKVST Asset Identity you can load the SBOM directly as a SoftwarePackage using `SoftwarePackage.read()`: +If you know the DataTrails Asset Identity you can load the SBOM directly as a SoftwarePackage using `SoftwarePackage.read()`: ```python # Assume Archivist connection already initialized in `arch` @@ -88,7 +88,7 @@ package = SoftwarePackage(arch) package.read("assets/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx") ``` -If you do not know the RKVST Asset Identity then you can load the SBOM based on any unique set of attributes using `SoftwarePackage.read_by_signature()`: +If you do not know the DataTrails Asset Identity then you can load the SBOM based on any unique set of attributes using `SoftwarePackage.read_by_signature()`: ```python # Assume
Archivist connection already initialized in `arch` @@ -99,7 +99,7 @@ package.read_by_signature({"sbom_uuid": "com.acme.rrd2013-ce-sp1-v4-1-5-0"}) ### Making a release -When a new official release is issued, update the version history in RKVST with `SoftwarePackage.release()`: +When a new official release is issued, update the version history in DataTrails with `SoftwarePackage.release()`: ```python # Assume Archivist connection already initialized in `arch` diff --git a/archivist_samples/software_bill_of_materials/run.py b/archivist_samples/software_bill_of_materials/run.py index 4909d0a..e1c5188 100644 --- a/archivist_samples/software_bill_of_materials/run.py +++ b/archivist_samples/software_bill_of_materials/run.py @@ -16,10 +16,10 @@ def run(arch, args): """ runs the sample and returns the system error code. """ - LOGGER.info("Using version %s of rkvst-archivist", about.__version__) + LOGGER.info("Using version %s of datatrails-archivist", about.__version__) LOGGER.info("Fetching use case test assets namespace %s", args.namespace) - # SoftwarePackage class encapsulates SBOM object in RKVST + # SoftwarePackage class encapsulates SBOM object in DataTrails package_name = "ACME Detector Coyote SP1" LOGGER.info("Creating Software Package Asset...: %s", package_name) package = SoftwarePackage(arch) diff --git a/archivist_samples/synsation/README.md b/archivist_samples/synsation/README.md index ca55488..6e9d552 100644 --- a/archivist_samples/synsation/README.md +++ b/archivist_samples/synsation/README.md @@ -1,6 +1,6 @@ # Synsation suite -Synsation Corporation is a fictional company used to build storylines and illustrate use cases around industrial and smart cities applications of the RKVST platform. +Synsation Corporation is a fictional company used to build storylines and illustrate use cases around industrial and smart cities applications of the DataTrails platform. 
The suite includes a number of entry points / samples that illustrate different capabilities: @@ -14,14 +14,14 @@ The suite includes a number of entry points / samples that illustrate different * Python 3.8 and later versions are supported. -* Install the RKVST samples package. If you are just trying out the pre-made samples you should get the official [RKVST samples Python package](https://pypi.org/project/rkvst-samples/ "PyPi package page") from PyPi. If you are modifying this sample and want to try out your changes then you'll need to rebuild the wheel: please refer to the developer instructions in the top level of this repository to see how to do that. +* Install the DataTrails samples package. If you are just trying out the pre-made samples you should get the official [DataTrails samples Python package](https://pypi.org/project/datatrails-samples/ "PyPi package page") from PyPi. If you are modifying this sample and want to try out your changes then you'll need to rebuild the wheel: please refer to the developer instructions in the top level of this repository to see how to do that. -* Get an authorization bearer token and store it in the file `credentials/.auth_token`. If you don't know how to do this, please refer to the [RKVST documentation](https://docs.rkvst.com/docs/rkvst-basics/getting-access-tokens-using-app-registrations/ "Getting an auth token"). Make sure that the `credentials` folder is suitably restricted by disallowing root and group access. +* Get an authorization bearer token and store it in the file `credentials/.auth_token`. If you don't know how to do this, please refer to the [DataTrails documentation](https://docs.datatrails.ai/docs/datatrails-basics/getting-access-tokens-using-app-registrations/ "Getting an auth token"). Make sure that the `credentials` folder is suitably restricted by disallowing root and group access. 
## Running the sample -The Taskfile in the top level of this repository includes a pre-packaged run of this sample that performs both a quick and deep count of the Assets and Events in your RKVST tenancy. +The Taskfile in the top level of this repository includes a pre-packaged run of this sample that performs both a quick and deep count of the Assets and Events in your DataTrails tenancy. -Please refer to the instructions in the [top level README](https://github.com/rkvst/rkvst-samples#synsation "synsation suite") +Please refer to the instructions in the [top level README](https://github.com/datatrails/datatrails-samples#synsation "synsation suite") diff --git a/archivist_samples/synsation/analyze.py b/archivist_samples/synsation/analyze.py index 32c8927..1262a46 100755 --- a/archivist_samples/synsation/analyze.py +++ b/archivist_samples/synsation/analyze.py @@ -120,7 +120,7 @@ def analyze_asset(conn, asset): def run(arch, args): - LOGGER.info("Using version %s of rkvst-archivist", about.__version__) + LOGGER.info("Using version %s of datatrails-archivist", about.__version__) LOGGER.info("Fetching use case test assets namespace %s", args.namespace) for asset in arch.assets.list(): analyze_asset(arch, asset) diff --git a/archivist_samples/synsation/charger.py b/archivist_samples/synsation/charger.py index e1f7a79..1adc28d 100755 --- a/archivist_samples/synsation/charger.py +++ b/archivist_samples/synsation/charger.py @@ -75,17 +75,17 @@ def interrupt_listener_run_until(tw, stop): # when the timewarp will hit this value rather than checking it # every time if stop and tw.now() > stop: - LOGGER.info("RKVST EV Charger example reached end time") + LOGGER.info("DataTrails EV Charger example reached end time") break except KeyboardInterrupt: - LOGGER.info("RKVST EV Charger example stopped") + LOGGER.info("DataTrails EV Charger example stopped") def run(arch, args): """logic goes here""" # Stretch the timestamps in logs - LOGGER.info("Using version %s of rkvst-archivist", 
about.__version__) + LOGGER.info("Using version %s of datatrails-archivist", about.__version__) LOGGER.info("Fetching use case test assets namespace %s", args.namespace) LOGGER.info("Creating time warp...") diff --git a/archivist_samples/synsation/initialise.py b/archivist_samples/synsation/initialise.py index 4c304fa..03a6e2b 100755 --- a/archivist_samples/synsation/initialise.py +++ b/archivist_samples/synsation/initialise.py @@ -26,7 +26,7 @@ def run(arch, args): - LOGGER.info("Using version %s of rkvst-archivist", about.__version__) + LOGGER.info("Using version %s of datatrails-archivist", about.__version__) LOGGER.info("Fetching use case test assets namespace %s", args.namespace) if args.create_corporation: @@ -50,7 +50,9 @@ def run(arch, args): def entry(): - parser = common_parser("Populates a clean RKVST tenancy with Synsation test data") + parser = common_parser( + "Populates a clean DataTrails tenancy with Synsation test data" + ) parser.add_argument( "--namespace", type=str, diff --git a/archivist_samples/synsation/simulator.py b/archivist_samples/synsation/simulator.py index 0a23722..79cdbed 100755 --- a/archivist_samples/synsation/simulator.py +++ b/archivist_samples/synsation/simulator.py @@ -175,7 +175,7 @@ def demo_flow(ac, asset_id, asset_type, tw, wait): def run(arch, args): """logic goes here""" - LOGGER.info("Using version %s of rkvst-archivist", about.__version__) + LOGGER.info("Using version %s of datatrails-archivist", about.__version__) LOGGER.info("Fetching use case test assets namespace %s", args.namespace) LOGGER.info("Looking for asset...") diff --git a/archivist_samples/synsation/wanderer.py b/archivist_samples/synsation/wanderer.py index 7b1db2f..2eec585 100755 --- a/archivist_samples/synsation/wanderer.py +++ b/archivist_samples/synsation/wanderer.py @@ -71,7 +71,7 @@ def shipit(ac, crate_id, delay, tw): def run(arch, args): """logic goes here""" - LOGGER.info("Using version %s of rkvst-archivist", about.__version__) + 
LOGGER.info("Using version %s of datatrails-archivist", about.__version__) LOGGER.info("Fetching use case test assets namespace %s", args.namespace) # Find the asset record diff --git a/archivist_samples/testing/archivist_parser.py b/archivist_samples/testing/archivist_parser.py index d174641..29b4acb 100644 --- a/archivist_samples/testing/archivist_parser.py +++ b/archivist_samples/testing/archivist_parser.py @@ -1,7 +1,7 @@ """common parser argument - This is copied from rkvst-python repo. When acceptable this file will - be copied back to the rkvst-python repo. + This is copied from datatrails-python repo. When acceptable this file will + be copied back to the datatrails-python repo. """ # pylint: disable=missing-docstring @@ -69,7 +69,7 @@ def common_parser(description): type=str, dest="url", action="store", - default="https://app.rkvst-poc.io", + default="https://app.datatrails.ai", help="url of Archivist service", ) parser.add_argument( @@ -103,7 +103,7 @@ def endpoint(args): set_logger("INFO") arch = None - LOGGER.info("Initialising connection to RKVST...") + LOGGER.info("Initialising connection to DataTrails...") fixtures = { "assets": { "proof_mechanism": args.proof_mechanism.name, diff --git a/archivist_samples/testing/parser.py b/archivist_samples/testing/parser.py index c7daffb..57bd79e 100644 --- a/archivist_samples/testing/parser.py +++ b/archivist_samples/testing/parser.py @@ -13,7 +13,7 @@ def common_endpoint(label, args): - LOGGER.info("Initialising connection to RKVST ...") + LOGGER.info("Initialising connection to DataTrails ...") arch = endpoint(args) try: diff --git a/archivist_samples/wipp/README.md b/archivist_samples/wipp/README.md index 31b396d..f48c3c8 100644 --- a/archivist_samples/wipp/README.md +++ b/archivist_samples/wipp/README.md @@ -2,18 +2,18 @@ Tracking the Nuclear Waste Management lifecycle is an important aspect to ensure that all safety protocols and procedures have been successfully executed. 
Fragmented communication and manual checks can lead to honest mistakes and redundancy. Digitizing the lifecycle and exposing data to the right parties at the right time can decrease honest mistakes and increase effective communication. -RKVST Continuous Assurance Hub offers a solution to fragmented communication and manual checks. Parties have near real-time access to data increasing seamless and effective communication in addition stakeholders can control the sharing of data ensuring one can view information that is relevant. Policies, procedures and images can be included/attached thus reducing multiple checks and providing persons with the most recent documentation. +DataTrails Continuous Assurance Hub offers a solution to fragmented communication and manual checks. Parties have near real-time access to data, increasing seamless and effective communication; in addition, stakeholders can control the sharing of data, ensuring each party sees the information that is relevant. Policies, procedures and images can be included/attached, thus reducing multiple checks and providing persons with the most recent documentation. -This sample uses publicly-available information about WIPP (Waste Isolation Pilot Plant) and how to quickly get started with integrating Nuclear Waste Management lifecycle with RKVST Continuous Assurance Hub. +This sample uses publicly-available information about WIPP (Waste Isolation Pilot Plant) and shows how to quickly get started with integrating the Nuclear Waste Management lifecycle with DataTrails Continuous Assurance Hub. ## Pre-requisites * Python 3.8 and later versions are supported. -* Install the [RKVST samples Python package](https://pypi.org/project/rkvst-samples/ "PyPi package page") +* Install the [DataTrails samples Python package](https://pypi.org/project/datatrails-samples/ "PyPi package page") -* Get an authorization bearer token and store it in the file `credentials/.auth_token`.
If you don't know how to do this, please refer to the [RKVST documentation](https://docs.rkvst.com/docs/rkvst-basics/getting-access-tokens-using-app-registrations/ "Getting an auth token"). Make sure that the `credentials` folder is suitably restricted by disallowing root and group access. +* Get an authorization bearer token and store it in the file `credentials/.auth_token`. If you don't know how to do this, please refer to the [DataTrails documentation](https://docs.datatrails.ai/docs/datatrails-basics/getting-access-tokens-using-app-registrations/ "Getting an auth token"). Make sure that the `credentials` folder is suitably restricted by disallowing root and group access. ## Running the sample @@ -30,20 +30,20 @@ archivist_samples_wipp [-v] The WIPP class creates two Assets: Drum and Cask. The Drum is an item that contains nuclear waste which is loaded into the Cask for transportation. -This Python class makes it easy to create the above assets and related events in RKVST. Providing an assurance hub with trusted data. +This Python class makes it easy to create the above assets and related events in DataTrails, providing an assurance hub with trusted data.
### Creating a Drum and Cask To create a brand new WIPP Asset and begin tracking and sharing Nuclear Waste lifecycle, use `Wipp.create()`: ```python - # Binaries such as images need to be uploaded to RKVST first + # Binaries such as images need to be uploaded to DataTrails first def upload_attachment(arch, attachment_description: AttachmentDescription): with resources.open_binary(wipp_files, attachment_description.filename) as fd: blob = arch.attachments.upload(fd) attachment = { # sample-specific attr to relay attachment name - "rkvst_samples_display_name": attachment_description.attribute_name, + "datatrails_samples_display_name": attachment_description.attribute_name, "arc_file_name": attachment_description.filename, "arc_attribute_type": "arc_attachment", "arc_blob_identity": blob["identity"], @@ -52,7 +52,7 @@ To create a brand new WIPP Asset and begin tracking and sharing Nuclear Waste li } return attachment - # Instantiate WIPP object and create an RKVST record to begin + # Instantiate WIPP object and create a DataTrails record to begin # tracing and publishing its lifecycle # Drum Asset LOGGER.info("Creating Drum Asset...") @@ -99,7 +99,7 @@ To create a brand new WIPP Asset and begin tracking and sharing Nuclear Waste li ### Loading data for an existing Wipp object -If you know the RKVST Asset Identity you can load data directly +If you know the DataTrails Asset Identity you can load data directly to the Drum and/or Cask Assset using `Wipp.read()`: ```python @@ -108,7 +108,7 @@ Drum = Wipp(arch, "55 gallon drum") drum.read("assets/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx") ``` -If you do not know the RKVST Asset Identity then you can load data based on any unique set of attributes using `Wipp.read_by_signature()`: +If you do not know the DataTrails Asset Identity then you can load data based on any unique set of attributes using `Wipp.read_by_signature()`: ```python # Assume Archivist connection already initialized in `arch` @@ -119,7 +119,7
drum.read_by_signature({"wipp_package_id": "wipp"})

### Loading Characterization

-When adding characterization, update the Drum Asset in RKVST with
+When adding characterization, update the Drum Asset in DataTrails with
`Wipp.characterize()`:

```python
diff --git a/archivist_samples/wipp/run.py b/archivist_samples/wipp/run.py
index 114b8ff..49f0395 100644
--- a/archivist_samples/wipp/run.py
+++ b/archivist_samples/wipp/run.py
@@ -17,7 +17,7 @@ def run_cask(arch, args):
     """
     Run the sample, only creating the cask asset, returns the system error code
     """
-    LOGGER.info("Using version %s of rkvst-archivist", about.__version__)
+    LOGGER.info("Using version %s of datatrails-archivist", about.__version__)
     LOGGER.info("Fetching use case test assets namespace %s", args.namespace)

     # Cask Asset
@@ -175,10 +175,10 @@ def run(arch, args):
     """
     runs the sample and returns the system error code.
     """
-    LOGGER.info("Using version %s of rkvst-archivist", about.__version__)
+    LOGGER.info("Using version %s of datatrails-archivist", about.__version__)
     LOGGER.info("Fetching use case test assets namespace %s", args.namespace)

-    # Wipp class encapsulates wipp object in RKVST
+    # Wipp class encapsulates wipp object in DataTrails
     LOGGER.info("Creating Drum Asset...")
     drum = Wipp(arch, "55 gallon drum")
     drumname = "Drum"
diff --git a/archivist_samples/wipp/wipp.py b/archivist_samples/wipp/wipp.py
index d4005d6..7e142d8 100644
--- a/archivist_samples/wipp/wipp.py
+++ b/archivist_samples/wipp/wipp.py
@@ -26,7 +26,7 @@ def upload_attachment(arch, attachment_description: AttachmentDescription):
         blob = arch.attachments.upload(fd)
         attachment = {
             # sample-specific attr to relay attachment name
-            "rkvst_samples_display_name": attachment_description.attribute_name,
+            "datatrails_samples_display_name": attachment_description.attribute_name,
             "arc_file_name": attachment_description.filename,
             "arc_attribute_type": "arc_attachment",
             "arc_blob_identity": blob["identity"],
@@ -151,7 +151,7 @@ def 
characterize( } safe_attachments = attachments or [] for attachment in safe_attachments: - attrs[attachment["rkvst_samples_display_name"]] = attachment + attrs[attachment["datatrails_samples_display_name"]] = attachment if custom_attrs is not None: attrs.update(custom_attrs) @@ -194,7 +194,7 @@ def tomography( safe_attachments = attachments or [] for attachment in safe_attachments: - attrs[attachment["rkvst_samples_display_name"]] = attachment + attrs[attachment["datatrails_samples_display_name"]] = attachment if custom_attrs is not None: attrs.update(custom_attrs) @@ -237,7 +237,7 @@ def loading( safe_attachments = attachments or [] for attachment in safe_attachments: - attrs[attachment["rkvst_samples_display_name"]] = attachment + attrs[attachment["datatrails_samples_display_name"]] = attachment if custom_attrs is not None: attrs.update(custom_attrs) @@ -276,7 +276,7 @@ def preshipping( safe_attachments = attachments or [] for attachment in safe_attachments: - attrs[attachment["rkvst_samples_display_name"]] = attachment + attrs[attachment["datatrails_samples_display_name"]] = attachment if custom_attrs is not None: attrs.update(custom_attrs) @@ -308,7 +308,7 @@ def departure( safe_attachments = attachments or [] for attachment in safe_attachments: - attrs[attachment["rkvst_samples_display_name"]] = attachment + attrs[attachment["datatrails_samples_display_name"]] = attachment if custom_attrs is not None: attrs.update(custom_attrs) @@ -342,7 +342,7 @@ def waypoint( safe_attachments = attachments or [] for attachment in safe_attachments: - attrs[attachment["rkvst_samples_display_name"]] = attachment + attrs[attachment["datatrails_samples_display_name"]] = attachment if custom_attrs is not None: attrs.update(custom_attrs) @@ -374,7 +374,7 @@ def arrival( safe_attachments = attachments or [] for attachment in safe_attachments: - attrs[attachment["rkvst_samples_display_name"]] = attachment + attrs[attachment["datatrails_samples_display_name"]] = attachment if 
custom_attrs is not None: attrs.update(custom_attrs) @@ -407,7 +407,7 @@ def unloading( safe_attachments = attachments or [] for attachment in safe_attachments: - attrs[attachment["rkvst_samples_display_name"]] = attachment + attrs[attachment["datatrails_samples_display_name"]] = attachment if custom_attrs is not None: attrs.update(custom_attrs) @@ -446,7 +446,7 @@ def emplacement( safe_attachments = attachments or [] for attachment in safe_attachments: - attrs[attachment["rkvst_samples_display_name"]] = attachment + attrs[attachment["datatrails_samples_display_name"]] = attachment if custom_attrs is not None: attrs.update(custom_attrs) diff --git a/requirements.txt b/requirements.txt index 8284bc3..c9d5fe6 100644 --- a/requirements.txt +++ b/requirements.txt @@ -1,3 +1,4 @@ cryptography~=41.0.2 -rkvst-archivist==0.26.0 +# TODO change to datatrails when released +datatrails-archivist==0.27.1 pyyaml~=6.0.1 diff --git a/scripts/api.sh b/scripts/api.sh index ed754b5..168c76e 100755 --- a/scripts/api.sh +++ b/scripts/api.sh @@ -21,5 +21,5 @@ docker run \ -e GITHUB_REF \ -e TWINE_USERNAME \ -e TWINE_PASSWORD \ - rkvst-samples-api \ + datatrails-samples-api \ "$@" diff --git a/scripts/functests.sh b/scripts/functests.sh index 590dea8..c8adba8 100755 --- a/scripts/functests.sh +++ b/scripts/functests.sh @@ -9,7 +9,7 @@ # python3 -m venv samples-venv source samples-venv/bin/activate -python3 -m pip install -q --force-reinstall dist/rkvst_samples-*.whl +python3 -m pip install -q --force-reinstall dist/datatrails_samples-*.whl # do everything in sub directory to ensure that wheel is used and not local code. export TEST_AUTHTOKEN_FILENAME=../${TEST_AUTHTOKEN_FILENAME} diff --git a/setup.cfg b/setup.cfg index 8e12178..416140e 100644 --- a/setup.cfg +++ b/setup.cfg @@ -4,12 +4,12 @@ statistics = True max-line-length = 88 [metadata] -name = rkvst-samples -author = RKVST Inc. 
-author_email = support@rkvst.com -description = RKVST Examples +name = datatrails-samples +author = DataTrails Inc. +author_email = support@datatrails.ai +description = DataTrails Examples long_description = file: README.md -url = https://github.com/rkvst/rkvst-samples +url = https://github.com/datatrails/datatrails-samples license = MIT license_files = LICENSE @@ -23,8 +23,8 @@ classifiers = Topic :: Utilities project_urls = - Source = https://github.com/rkvst/rkvst-samples - Tracker = https://github.com/rkvst/rkvst-samples/issues + Source = https://github.com/datatrails/datatrails-samples + Tracker = https://github.com/datatrails/datatrails-samples/issues [options] install_requires = file: requirements.txt