Migration to Workshop Studio guide format (#98)
Complete re-format of all doc pages to remove references to hugo and format for Workshop Studio.
switch180 authored Nov 27, 2023
1 parent ccdd24b commit 8117c12
Showing 125 changed files with 572 additions and 758 deletions.
47 changes: 46 additions & 1 deletion .github/scripts/build-assets.py
@@ -1,8 +1,11 @@
import os
import re
import sys
import time
import glob
import shutil
import ntpath
import tempfile
import subprocess
from pathlib import Path
from zipfile import ZipFile
@@ -69,4 +72,46 @@
workshop_zip.write(python_script, tail)
shutil.move(os.path.join(os.getcwd(), zip_file_name), os.path.join(dest_root, 'assets', zip_file_name))

exit()
# Check build

preview_build = os.path.join(pkg_root, 'preview_build')
shell_out = tempfile.NamedTemporaryFile(mode='w')
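# Launch preview_build from the package root; fall back to a PATH lookup if the absolute path is missing.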
try:
    proc = subprocess.Popen([preview_build, "-disable-refresh"],
                            stdout=shell_out, stderr=shell_out, cwd=pkg_root)
except FileNotFoundError as err:
    proc = subprocess.Popen(["preview_build", "-disable-refresh"],
                            stdout=shell_out, stderr=shell_out, cwd=pkg_root)


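# Let preview_build run briefly, then stop it and inspect the captured output.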
time.sleep(10)
proc.kill()
build_result_error = r'.*(Build complete with [0-9].*)'
build_result_success = r'.*(Build succeeded.*)'
status = None
status_message = None
count = 0
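# Scan up to 10,000 lines of captured output for the build result and any error details.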
with open(shell_out.name) as f:
    for line in f:
        if count > 10000:
            break
        count += 1
        if status is None:
            match_error = re.search(build_result_error, line)
            match_success = re.search(build_result_success, line)
            if match_error:
                status_message = match_error.group(1)
                status = 1
                print("Discovered an error in the build process.\n{}".format(status_message))
            elif match_success:
                status_message = match_success.group(1)
                status = 0
                print("Success. Build result is: \n{}".format(status_message))
        elif status == 1:
            err_match = re.search(r'^.*ERR(.*)', line)
            err_ignore = re.search(r'^.*Error hosting local preview site.*', line)
            if err_match and err_ignore is None:
                print("{}".format(err_match.group(1)))

shell_out.close()
exit(status)
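For reference, here is a minimal sketch of how the script's two result patterns classify output lines (the sample log lines below are hypothetical, not actual preview_build output):

```python
import re

# Same patterns as in build-assets.py above.
build_result_error = r'.*(Build complete with [0-9].*)'
build_result_success = r'.*(Build succeeded.*)'

# Hypothetical sample lines; real preview_build output may differ.
samples = [
    "INFO Build succeeded in 2.31 seconds",
    "WARN Build complete with 3 errors",
]

for line in samples:
    success = re.search(build_result_success, line)
    error = re.search(build_result_error, line)
    if success:
        print("exit 0:", success.group(1))  # script exits 0 on success
    elif error:
        print("exit 1:", error.group(1))    # script exits 1 and then prints ERR lines
```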
25 changes: 7 additions & 18 deletions .github/workflows/main.yml
@@ -3,47 +3,36 @@ name: PushToProd
permissions:
  id-token: write
on:
  workflow_dispatch:
    inputs:
      website:
        description: 'Name of the S3 bucket aka website to publish to'
        required: true
        default: 'amazon-dynamodb-labs.com'
        options:
          - 'test.amazon-dynamodb-labs.com'
          - 'amazon-dynamodb-labs.com'
  push:
    branches:
      - master

jobs:
  buildAndDeploy:
    runs-on: ubuntu-latest
    env:
      STEP_S3_BUCKET: ${{ github.event.inputs.website }}
      STEP_S3_BUCKET: amazon-dynamodb-labs.com
    steps:
      - name: Checkout
        uses: actions/checkout@v3
        with:
          submodules: 'recursive'
          fetch-depth: '0'
      - name: Setup Hugo
        uses: peaceiris/actions-hugo@v2
        with:
          hugo-version: '0.102.3'
          # extended: true
      - name: Setup Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.10'
      - name: Build Hugo
        run: hugo --buildFuture
      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-region: us-east-1
          role-to-assume: ${{ secrets.AWS_ROLE_TO_ASSUME }}
      - name: Pull preview build
        run: aws s3 sync s3://amazon-dynamodb-labs-static/build/ . && chmod +x preview_build
      - name: Build Assets
        run: python3 ./.github/scripts/build-assets.py
      - name: S3Sync
        run: aws s3 sync public s3://$STEP_S3_BUCKET --delete
        run: aws s3 sync public/assets/ s3://$STEP_S3_BUCKET/assets/ --delete
      - name: SetS3Acl
        run: aws s3api put-object-acl --grant-read uri=http://acs.amazonaws.com/groups/global/AllUsers --bucket $STEP_S3_BUCKET --key assets/lab.yaml
      - name: SetS3Acl
32 changes: 32 additions & 0 deletions .github/workflows/pull-request.yml
@@ -0,0 +1,32 @@
name: ValidatePR

permissions:
  id-token: write
on:
  pull_request:
    branches: [ master ]

jobs:
  buildAndVerify:
    runs-on: ubuntu-latest
    env:
      STEP_S3_BUCKET: 'test.amazon-dynamodb-labs.com'
    steps:
      - name: Checkout
        uses: actions/checkout@v3
        with:
          submodules: 'recursive'
          fetch-depth: '0'
      - name: Setup Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.10'
      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-region: us-east-1
          role-to-assume: ${{ secrets.AWS_ROLE_TO_ASSUME }}
      - name: Pull preview build
        run: aws s3 sync s3://amazon-dynamodb-labs-static/build/ . && chmod +x preview_build
      - name: Build Assets
        run: python3 ./.github/scripts/build-assets.py
2 changes: 1 addition & 1 deletion .gitmodules
@@ -1,4 +1,4 @@
[submodule "themes/learn"]
path = themes/learn
url = https://github.com/switch180/hugo-theme-learn.git
branch = aws
branch = aws
45 changes: 12 additions & 33 deletions README.md
@@ -1,46 +1,25 @@
# Amazon DynamoDB Labs
The repo for https://catalog.workshops.aws/dynamodb-labs/en-US , formerly https://amazon-dynamodb-labs.com

### Setup:
### Dev:

#### Install Hugo:
On a mac:

`brew install hugo`

On Linux:
- Download from the releases page: https://github.com/gohugoio/hugo/releases/tag/v0.102.3
- Extract and save the hugo executable to `/usr/local/bin/`

Note: This workshop is built with [hugo v0.102.3](https://github.com/gohugoio/hugo/releases/tag/v0.102.3). Older versions may produce errors due to the aws theme we use.
#### Local development
You can make code changes and markdown changes, but in order to test the build you need to be an Amazon employee with access to preview_build to compile the documentation and run the site locally. [Amazon employees click here for instructions](https://tiny.amazon.com/16x21plc5).

#### Clone this repo:
From wherever you checkout repos:
We suggest you make a fork. From wherever you keep your repos, you can check out the repo:
`git clone git@github.com:aws-samples/amazon-dynamodb-labs.git` (or your fork)

#### Clone the theme submodule:
`cd amazon-dynamodb-labs`

`git submodule init; git submodule update`


#### Run Hugo locally:
To run hugo in development:
`hugo serve -D`

`hugo` will build your content locally and output to `./public/`


#### View Hugo locally:
Visit http://localhost:1313/ to see the site.

#### Making Edits:
As you save edits to a page, the site will live-reload to show your changes.
#### Making edits:
Amazon employees only: make changes, run preview_build, and check localhost:8080 to see the site locally.
Everyone else: make changes, open a pull request, and wait for the automations to run. They will tell you if your changes contain errors.

#### Auto deploy:
#### Pull requests
Make a pull request with your changes. A GitHub Action automatically checks each PR to make sure its markdown and other files are correct and error-free, and the check runs again on every new commit to the PR.

Within minutes of a commit to the master branch, a build and deploy using the default hugo grav learn theme will kick off. You can review your change at the following address.
#### On merge to master

https://master.amazon-dynamodb-labs.com
On merge to master, a GitHub action will deploy the assets to amazon-dynamodb-labs.com and verify the build to ensure the markdown and other files are correctly formatted. From there, a maintainer must manually pull the changes and push to https://catalog.workshops.aws/dynamodb-labs/en-US

## License
This project is licensed under the Apache-2.0 License.
64 changes: 0 additions & 64 deletions config.toml

This file was deleted.

6 changes: 0 additions & 6 deletions content/all-content.en.md

This file was deleted.

15 changes: 9 additions & 6 deletions content/authors.en.md
@@ -1,8 +1,9 @@
---
title: "Contributors to Amazon DynamoDB Labs"
hidden: true
hidden: false
chapter: true
description: "Our editors and hall of fame."
weight: 100
---


@@ -16,12 +17,14 @@ description: "Our editors and hall of fame."

The serverless event driven architecture lab was added in 2023:

1. Lucas Rettenmeier ([@rettenls](https://github.com/rettenls)) - Workshop creator for re:Invent 2021
1. Kirill Bogdanov - ([@kirillsc](https://github.com/kirillsc)) - Workshop creator for re:Invent 2021
1. Sean Shriver - ([@switch180](https://github.com/switch180)) - Presenter of the workshop at re:Invent 2021. Edited and merged the lab to labs.com.
1. Lucas Rettenmeier ([@rettenls](https://github.com/rettenls)) - Workshop creator for re\:Invent 2021
1. Kirill Bogdanov - ([@kirillsc](https://github.com/kirillsc)) - Workshop creator for re\:Invent 2021
1. Sean Shriver - ([@switch180](https://github.com/switch180)) - Presenter of the workshop at re\:Invent 2021. Edited and merged the lab to labs.com.
1. John Terhune - ([@terhunej](https://github.com/terhunej)) - Prepared the lab guide for publishing to labs.com, editing and updating.

The lab guide was migrated from amazon-dynamodb-labs.com to Workshop Studio in December of 2023:

1. Sean Shriver - ([@switch180](https://github.com/switch180)) - Refactored every documentation page for the new Workshop Studio proprietary format.

### 2021 editors

@@ -45,6 +48,6 @@ The following individuals put in hours of their time to revamp the guide to make

### Original version

This lab was built to run on Qwiklabs in 2018. In 2020 it was rewritten and updated to run outside Qwiklabs.
LADV was built to run on Qwiklabs in 2018. In 2020 it was rewritten and updated to run outside Qwiklabs.

A special thanks goes to Regis Gimenis ([regisgimenis](https://github.com/regisgimenis)) who is the original designer of the advanced design patterns. He did one of the most difficult tasks - creating a lab from scratch. Remnants of Regis' work are found throughout the Python files of the workshop and the lab guide. Without him, this site would not exist.
A special thanks goes to Regis Gimenis ([regisgimenis](https://github.com/regisgimenis)) who is the original designer of the advanced design patterns. He did one of the most difficult tasks - creating a lab from scratch. Remnants of Regis' work are found throughout the Python files of the workshop and the lab guide for LADV. Without him, this site would not exist.
12 changes: 5 additions & 7 deletions content/design-patterns/ex1capacity/Step4.en.md
@@ -9,23 +9,21 @@ To view the Amazon CloudWatch metrics for your table:
1. Navigate to the DynamoDB section of the AWS management console.
2. As shown in the following image, in the navigation pane, choose Tables. Choose the logfile table, and in the right pane, choose the Metrics tab.

![Open the CloudWatch metrics for the table](/images/awsnewconsole3.png)
![Open the CloudWatch metrics for the table](/static/images/awsnewconsole3.png)


<!-- ![Open the CloudWatch metrics for the table version 2](/images/awsconsole3v2.png) -->
<!-- ![Open the CloudWatch metrics for the table version 2](/static/images/awsconsole3v2.png) -->

The CloudWatch metrics will look like what you see in the following image.

![The Cloud Watch metrics for the base table](/images/tablelogfile-stats.png)
![The CloudWatch metrics for the base table](/static/images/tablelogfile-stats.png)

{{% notice note %}}
You might not see provisioned capacity data in your read or write capacity graphs, which are displayed as red lines. It takes time for DynamoDB to generate provisioned capacity CloudWatch metrics, especially for new tables.
{{% /notice %}}
::alert[You might not see provisioned capacity data in your read or write capacity graphs, which are displayed as red lines. It takes time for DynamoDB to generate provisioned capacity CloudWatch metrics, especially for new tables.]

The CloudWatch metrics for the global secondary index will look like what you see in the following image.


![The Cloud Watch metrics for the GSI](/images/GSI-logfile-stats.png)
![The CloudWatch metrics for the GSI](/static/images/GSI-logfile-stats.png)


**You might be wondering:** Why are there throttling events on the table but not on the global secondary index? The reason is a base table receives the writes immediately and consumes write capacity doing so, whereas a global secondary index's capacity is consumed asynchronously some time after the initial write to the base table succeeds. In order for this system to work inside the DynamoDB service, there is a buffer between a given base DynamoDB table and a global secondary index (GSI). A base table will quickly surface a throttle if capacity is exhausted, whereas only an imbalance over an extended period of time on a GSI will cause the buffer to fill, thereby generating a throttle. In short, a GSI is more forgiving in the case of an imbalanced access pattern.
4 changes: 1 addition & 3 deletions content/design-patterns/ex1capacity/Step6.en.md
@@ -21,6 +21,4 @@ row: 2000 in 0.8817043304443359
RowCount: 2000, Total seconds: 17.13607406616211
```

{{% notice note %}}
With the new capacity, the total load time is lower.
{{% /notice %}}
::alert[With the new capacity, the total load time is lower.]
