Topcoder's API that deals with submissions, reviews, review summations and review types on the Topcoder platform
- Production API
- ES Processor - Updates data in ElasticSearch
- Active AWS Account
- Node.js 8.11.x
- npm 5.6.x
- Postman for Verification
- Docker and Docker-Compose (Optional for Local Deployment)
- Download your AWS credentials from the AWS Console. Refer to the AWS Documentation
- Depending on your Operating System, create the AWS credentials file at the path listed below
Linux, Unix, and macOS users: ~/.aws/credentials
Windows users: C:\Users\USER_NAME\.aws\credentials
- The credentials file should look like below
[default]
aws_access_key_id = SOME_ACCESS_KEY_ID
aws_secret_access_key = SOME_SECRET_ACCESS_KEY
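Alternatively, if you have the AWS CLI installed, the credentials file can be created for you by running
aws configure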
- Create an S3 bucket from the AWS Console and note down the bucket name
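If you prefer the command line and have the AWS CLI installed, a bucket can also be created with the command below (the bucket name here is only an example; repeat for the artifact bucket referenced by ARTIFACT_BUCKET if you use one)
aws s3 mb s3://my-submission-dmz-bucket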
- Create an AWS ES domain from the AWS Console and note down the endpoint details. Note: this application also supports ES from other providers, such as Bonsai.
If you are creating the ES domain in AWS, it will take some time for the domain to be created and for the endpoint details to appear.
The endpoint will look something like https://search-submission-xxxxxx.us-east-1.es.amazonaws.com/
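Once the endpoint is available, you can sanity-check it with the standard Elasticsearch cluster health API (assuming your domain's access policy permits unsigned requests from your IP)
curl https://search-submission-xxxxxx.us-east-1.es.amazonaws.com/_cluster/health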
- From the root of the project, install NPM dependencies
npm i
- Refer to config/default.js and set up the environment variables as necessary
e.g.
export AWS_REGION="<AWS Region>"
export DMZ_BUCKET="<S3 Bucket Name>"
export ARTIFACT_BUCKET="<Artifact S3 Bucket Name>"
export ES_HOST="<ES Endpoint>"
export AUTH0_URL="<Auth0 URL>"
export AUTH0_CLIENT_ID="<Auth0 Client ID>"
export AUTH0_CLIENT_SECRET="<Auth0 Client Secret>"
Note: Make sure to set Auth0, AWS and ES related environment variables
- Modify the other configuration variables in config/default.js if necessary
- Create the tables in DynamoDB by running the script
npm run create-tables
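You can confirm that the tables were created with the AWS CLI, for example
aws dynamodb list-tables --region <AWS Region>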
- Import the review types into the database by running the script
npm run init-db
- The index can be created in ES by running the script
npm run create-index
- Since the Submission Processor is still under development, load dummy data into ES by running the following script
npm run init-es
This script will load the data from the scripts/data directory into ES
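To spot-check that the dummy data was indexed, you can query the document count on the submission index (assuming your ES endpoint accepts unsigned requests; adjust the index name if you changed ES_INDEX)
curl "$ES_HOST/submission/_count"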
- Run the application
npm run start
- Make sure to use Node v10+ by running the command
node -v
We recommend using NVM to quickly switch to the right version: nvm use
- 📦 Install npm dependencies
npm install
- ⚙ Local config
In the submissions-api root directory create a .env file with the following environment variables. Values for the Auth0 config should be shared with you on the forum.
# AWS related config
AWS_ACCESS_KEY_ID=
AWS_SECRET_ACCESS_KEY=
AWS_REGION=
DMZ_BUCKET=
ARTIFACT_BUCKET=
# Auth0 config
AUTH0_URL=
AUTH0_PROXY_SERVER_URL=
TOKEN_CACHE_TIME=
AUTH0_AUDIENCE=
AUTH0_CLIENT_ID=
AUTH0_CLIENT_SECRET=
# Locally deployed services (via docker-compose)
ES_HOST=localhost:9200
- Values from this file would be automatically used by many npm commands. ⚠️ Never commit this file or its copy to the repository!
- 🚢 Start docker-compose with the services which are required to start the Topcoder Submissions API locally
npm run services:up
npm run services:down can be used to shut down the docker services
npm run services:logs can be used to view the logs from the docker services
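Once the services are up, you can verify that the local ElasticSearch container is reachable (it is expected on localhost:9200, matching the ES_HOST value in the .env example above)
curl http://localhost:9200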
- ♻ Create tables.
npm run create-tables
- ♻ Create ES index.
npm run create-index
- ♻ Init DB, ES
npm run local:init
This command will do 2 things:
- Import the data into the database
- Index it in ElasticSearch
- Note, to migrate the existing data from DynamoDB to ES, run the following script
npm run db-to-es
- 🚀 Start Topcoder Submissions API
npm run start
The Topcoder Submissions API will be served on
http://localhost:3000
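As a quick smoke test you can hit the server with curl; the exact resource routes (e.g. /reviewTypes) and any version prefix depend on your configuration, and most endpoints expect a JWT such as those in the Postman environment
curl -i http://localhost:3000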
npm run lint to check for lint errors, or npm run lint:fix to fix the lint errors which can be fixed automatically
npm run dev
To run the Submissions API using Docker, follow the steps below
- Navigate to the docker directory
- Rename the file sample.api.env to api.env
- Set the required AWS and Auth0 credentials in the file api.env
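A minimal api.env might look like the following (variable names mirror the local .env example above; all values are placeholders)
AWS_ACCESS_KEY_ID=SOME_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY=SOME_SECRET_ACCESS_KEY
AWS_REGION=us-east-1
DMZ_BUCKET=my-submission-dmz-bucket
ARTIFACT_BUCKET=my-submission-artifact-bucket
AUTH0_URL=<Auth0 URL>
AUTH0_CLIENT_ID=<Auth0 Client ID>
AUTH0_CLIENT_SECRET=<Auth0 Client Secret>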
- Once that is done, run the following command
docker-compose up
- When you are running the application for the first time, it will take some time initially to download the image and install the dependencies
Integration tests use a different index, submission-test, which is not the same as the usual index, submission.
Please ensure to create the submission-test index (or the index specified in the ES_INDEX_TEST environment variable) before running the integration tests. You could re-use the existing scripts to create the index, but you would need to set the environment variable below
export ES_INDEX=submission-test
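For example, after setting the variable you can re-use the existing script to create the test index
npm run create-index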
To run unit tests alone
npm run test
To run unit tests with coverage report
npm run cov
To run integration tests alone
npm run e2e
To run integration tests with coverage report
npm run cov-e2e
To migrate the existing data from DynamoDB to ES, run the following script
npm run db-to-es
The Submission API started off using the legacy challenge ids. With the v5 upgrade to the Challenge API, we now need to make use of the v5 challenge ids. We have thus created a script to update the existing challengeId attribute on submissions to v5 and store the older challenge ids in the legacyChallengeId attribute.
To update the existing challengeId data on submissions in DynamoDB to v5 challengeId, set the following env variables:
SUBMISSION_TABLE_NAME // Table name of the submission records. Defaults to 'Submission'
UPDATE_V5_CHALLENGE_BATCH_SIZE // Number of records that are updated simultaneously. Defaults to 250
FETCH_CREATED_DATE_START // The start date from which to fetch the latest challenges. Defaults to '2021-01-01'
FETCH_PAGE_SIZE // The page size of each api request. Defaults to 500
and then run the following script
npm run update-to-v5-challengeId
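For example (the values below are only illustrative and match the stated defaults; omit any variable to keep its default)
export SUBMISSION_TABLE_NAME=Submission
export UPDATE_V5_CHALLENGE_BATCH_SIZE=250
export FETCH_CREATED_DATE_START=2021-01-01
export FETCH_PAGE_SIZE=500
npm run update-to-v5-challengeId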
- Open Postman
- Import the Postman environment and collection from the docs directory
- Postman API requests are categorized into four parts
- Review Type
- Submission
- Review
- Review Summation
- The Postman collection contains both positive and a few negative test cases
- After creating a submission, the submissionId will be automatically set in the Postman environment to serve future requests
- Please ensure to create a submission using Postman before testing the Review and ReviewSummation endpoints, since the body of a few Review and ReviewSummation requests references the submissionId from the environment, which is set by triggering the POST /submissions request in Postman.
- All JWT tokens provided in the Postman environment file are created in JWT.IO with the secret mysecret
- There are 3 tokens provided in the environment collection, one for each role: Topcoder User, Copilot, Administrator
- DynamoDB performance seems to be slower in my testing