This is the API portion of the Court Administration Scheduling System. The frontend can be found here: Court Administration Scheduling Frontend.
This project is based on the Sheriff Scheduling API. Although this project is not (and cannot be) a direct fork of the Sheriff Scheduling API, the commit history was retained to make it easy (although not quite as easy as a PR) to share changes between the projects.
This project was built as a POC/demo on a very limited budget and timeline; as such, it may still contain references to the term "sheriff".
At its core, this (and the parent project) is a scheduling application that does not need to remain a domain-specific application.
Given more time and budget, the scheduling features and UX of the Sheriff/Court Administration Scheduling System should be developed into a more generic application that can be configured (preferably through deployment, and/or through build configuration) to act as a scheduling system for any domain that would benefit from the features the application provides. Alternatively, it could be developed into a multi-tenant/multi-domain system that services multiple business areas within a single application.
The architecture notes for this project can be found in the Project Docs.
This API includes a TypeScript/JavaScript client within the repo that can be added to your project via `yarn add github:bcgov/cass-api` or `npm install github:bcgov/cass-api`.
If you are working on developing the client API, the typical flow is as follows:

1. Within this project folder, run `yarn link`. This registers the project as a local package.
2. Within your frontend project, run `yarn link cass-api`, which will symlink the registered local package into your `node_modules`.
3. Run this project in dev mode (i.e. `yarn watch:dev`) and start your frontend project in dev mode, which will point the API proxy at this instance.
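The steps above, run from two terminals, look roughly like this (a sketch only; the frontend's dev command depends on that project):

```shell
# Terminal 1 -- in this (cass-api) project folder:
yarn link          # registers this project as a local package
yarn watch:dev     # runs the API in dev mode

# Terminal 2 -- in your frontend project folder:
yarn link cass-api # symlinks the registered local package into node_modules
# ...then start the frontend in dev mode so its API proxy points at this instance.
```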
- Deploy the backend to minishift (see openshift/Readme.md).
- The build of the cass-api will deploy and will migrate the database via `liquibase`, so if you have a branch that has new database changes you will need to build and deploy that branch (in order to migrate the database) before you can run your local development instance against the database.
- Once the development environment is set up, you should be able to use the commands described below to get your development/test instances up and running.
This project was built using VS Code, so the debugging flow relies on mechanisms supported by this specific editor. You can easily attach the debugger to the test runner and to the dev and test instances of the API. Each of these is defined within `launch.json`, and multiple debugging sessions are supported at the same time, which allows you to debug right from a unit test into the API and back.
The scripts can be organized into a few categories:
Launches the backend in dev mode against the development database. Running this command will read and use the environment variables defined in `.env.dev`. See the Setup section for instructions on how to have this file generated for you automatically.
Cleans all compiled files (found in `dist/`), with the exception of `dist/.gitignore`. Then regenerates all TypeScript files and rebuilds. This should be done before issuing or completing pull requests.
The tests have setup and teardown hooks that clear many of the database tables so that the CRUD functionality of the API can be verified (i.e. numbers of records, etc.).
Generally, I run the following two commands in separate terminal windows so that I can clearly watch the output of both.
Launches the backend in dev mode against the testing database. Running this command will read and use the environment variables defined in `.env.testing`. See the Setup section for instructions on how to have this file generated for you automatically.
Runs the Jest tests for the application (should be done in conjunction with the `yarn watch:testing` command described above). This command uses the values in `.env.jest` to override the values within `.env.testing`; this allows the test runner to have admin access to the database so that tables and records can be deleted and cleaned for testing purposes.
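The override relationship between the two env files can be pictured as a simple "later values win" merge (the variable names below are illustrative, not the project's actual configuration keys):

```typescript
// Values as loaded from .env.testing (illustrative names only).
const testingEnv: Record<string, string> = {
  DATABASE_NAME: 'cass_testing',
  DATABASE_USER: 'app_user', // limited privileges for normal API use
};

// Values as loaded from .env.jest -- the test runner's admin credentials.
const jestEnv: Record<string, string> = {
  DATABASE_USER: 'admin_user', // can truncate tables between tests
};

// .env.jest entries override .env.testing entries; everything else passes through.
const effectiveEnv = { ...testingEnv, ...jestEnv };

console.log(effectiveEnv.DATABASE_USER); // 'admin_user'
console.log(effectiveEnv.DATABASE_NAME); // 'cass_testing'
```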
Runs the Jest tests for the API client with code coverage and displays the coverage results in a browser. Note that the coverage covers only the API client code, not the API itself.
These commands typically don't have to be run and are mostly here to support the commands described above.
Starts the production instance of the API server. This is generally not run on dev machines; instead, it is used by the OpenShift container running the application.
Starts the server loading the `.env.dev` environment variables, effectively wiring it up to your development database.
Starts the server loading the `.env.testing` environment variables, effectively wiring it up to your testing database.
Runs the TypeScript compiler to produce the JavaScript that ends up in `dist/`.
Watches for changes to TypeScript files and runs the `build` command described above.
Generates code based on the process described in Generating Code
A development hook to allow debugging within the OpenShift environment. This is typically never used and is largely untested but remains as a reminder of the possibility.
In this project, the Controllers used by the application represent all of the endpoints supported by the API. The Controllers use models to expose the shape of the objects that they accept/return. These Controllers and models are used by TSOA to generate the `swagger.json` and `src/routes.ts`, based on the Routes Template and the Controllers that are imported within the Controller Index, according to the TSOA Config File.
The generated routes include code for authorization, validation, and general error handling; however, the template can be changed to match the needs of the application.
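As a sketch of what such a controller looks like, assuming the standard `tsoa` decorators (the `Shift` model and route names here are illustrative placeholders, not the project's actual endpoints):

```typescript
import { Controller, Get, Route, Path } from 'tsoa';

// Illustrative model only -- TSOA reads this shape to describe the
// response schema in the generated swagger.json.
interface Shift {
  id: number;
  startDate: string;
}

// TSOA picks this controller up (via the Controller Index) and emits a
// matching entry in src/routes.ts and a path in swagger.json.
@Route('Shifts')
export class ShiftController extends Controller {
  // GET /Shifts/{id} -- the {id} path parameter is validated as a number
  // by the generated route code before this method is invoked.
  @Get('{id}')
  public async getShift(@Path() id: number): Promise<Shift> {
    return { id, startDate: new Date().toISOString() };
  }
}
```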
The `swagger.json` generated by TSOA (from the controllers and models) is then used by swagger-ts-client to generate a TypeScript/JavaScript client for the API. This client is created using two templates:
- Used to generate the Model Definitions.
- Used to generate the Base Api Client.
- If the database migration fails, you may need to destroy the current database (by deleting the storage volume in OpenShift) and recreate it. Changesets are sometimes changed during development, and Liquibase keeps hash values for the changesets it has applied, so it will fail if an already-applied changeset is later modified.
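That recovery might look roughly like the following, assuming the `oc` CLI; the resource name is a placeholder, so check `oc get pvc` for the actual volume in your project:

```shell
# Hypothetical sketch -- the PVC name below is a placeholder.
oc get pvc                       # find the database's storage volume
oc delete pvc postgresql-data    # destroy the database storage volume
# Redeploy the database and the cass-api build so liquibase re-runs
# every changeset against the fresh database.
```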
If you would like to contribute, please see our CONTRIBUTING guidelines.
Please note that this project is released with a Contributor Code of Conduct. By participating in this project you agree to abide by its terms.
Copyright 2016 Province of British Columbia
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.