Commit 014a768: Adding Documentation
pranavk1511 committed Dec 15, 2023 (1 parent: 6fefe8d)
Showing 7 changed files with 303 additions and 45 deletions.
Documentation/BuildAMI.md (43 additions, 0 deletions)
# GitHub Actions Workflow: Build AMI and Deploy

This GitHub Actions workflow automates the process of building an Amazon Machine Image (AMI) and deploying a web application. The workflow is triggered on every push to the `main` branch.

## Workflow Structure

The workflow consists of the following jobs:

### 1. build-ami

This job runs on an Ubuntu-latest environment and includes the following steps:

- **Checkout code:** Fetches the latest code using the `actions/checkout` action.
- **Install dependencies:** Installs the required npm dependencies for the web application.
- **Display .env file:** Creates a `.env` file with secrets and displays its content.
- **Configure MySQL:** Updates and installs necessary packages, starts MySQL, and creates a database.
- **Run Tests:** Executes npm tests for the web application.
- **Create Zip Archive:** Zips the application files into `webapp.zip`.
- **Set up Packer:** Downloads and installs Packer, an open-source tool for creating identical machine images for multiple platforms.
- **Initialize Packer:** Initializes the Packer configuration file (`aws-ubuntu.pkr.hcl`).
- **Build AMI in dev account:** Uses Packer to build an AMI in the development AWS account. Sets the resulting AMI ID as an output variable.
- **Configure AWS CLI for Demo Account:** Configures AWS CLI for the demo account using secrets.
- **Instance Refresh Automation / Continuous Delivery:** Installs `jq` and performs an instance refresh for the Auto Scaling Group in the demo account. Monitors the progress until completion.
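
The instance-refresh step above amounts to a small polling loop. The sketch below illustrates it; the Auto Scaling Group name `webapp-asg` is a placeholder, the real AWS CLI calls are shown in comments, and a stub stands in for the status command so the snippet is self-contained:

```shell
#!/bin/sh
# Sketch of the instance-refresh monitoring loop (illustrative only).
# Real call to start the refresh, for reference:
#   aws autoscaling start-instance-refresh \
#     --auto-scaling-group-name webapp-asg \
#     --query 'InstanceRefreshId' --output text

# Poll until the refresh reaches a terminal state. "$@" is any command
# that prints the current status (in the workflow: an aws + jq pipeline
# over `aws autoscaling describe-instance-refreshes`).
wait_for_refresh() {
  while :; do
    status=$("$@")
    echo "refresh status: $status"
    case "$status" in
      Successful|Failed|Cancelled) break ;;
    esac
    sleep 5
  done
  [ "$status" = "Successful" ]
}

# Stub status command so the loop can be exercised locally:
fake_status() { echo "Successful"; }
wait_for_refresh fake_status
```

The loop treats `Successful`, `Failed`, and `Cancelled` as terminal states and returns a non-zero exit code unless the refresh succeeded, which is what lets the workflow step fail the job on a bad rollout.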

## Prerequisites

Before running this workflow, ensure the following:

1. **AWS Credentials:** Add AWS credentials for both the development and demo accounts as secrets in your GitHub repository.
2. **GitHub Repository Secrets:**
- `DB_HOST`, `DB_USER`, `DB_PASSWORD`, `DB_NAME`, `PORT`, `region`, `topicarn`, `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`: Set these as repository secrets for secure access.
- `MY_SQL_PASSWORD`: Set the MySQL password as a secret.
- `AWS_ACCESS_KEY_ID_DEMO`, `AWS_SECRET_ACCESS_KEY_DEMO`, `AWS_REGION_DEMO`: Set these for the demo AWS account.

## Usage

1. Push changes to the `main` branch to trigger the workflow.
2. Monitor the workflow progress and check the logs for each step.
3. Upon successful completion, the web application will be deployed using the updated AMI in the demo account.

## License

This GitHub Actions workflow is licensed under the [MIT License](LICENSE).
Documentation/CI.md (37 additions, 0 deletions)
# GitHub Actions Workflow: Node.js CI

This GitHub Actions workflow is designed for continuous integration (CI) of a Node.js project. It is triggered on every pull request to the `main` branch.

## Workflow Structure

The workflow has a single job named `build` that runs on an Ubuntu-latest environment. Here are the key steps:

1. **Checkout code:** Fetches the latest code using the `actions/checkout` action.
2. **Setup Node.js:** Sets up the Node.js environment with version 14 using the `actions/setup-node` action.
3. **Install Dependencies:** Runs `npm install` to install project dependencies.
4. **Display .env file:** Creates a `.env` file with secrets and displays its content.
5. **Configure MySQL:** Updates and installs necessary packages, starts MySQL, and creates a database (`Assignment3`).
6. **Run Tests:** Executes npm tests for the Node.js project.
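
Step 4 above can be sketched as follows. The variable names mirror the repository secrets listed under Prerequisites; the default values are placeholders for local experimentation only:

```shell
# Sketch of the ".env" step: write the file from environment variables
# (populated from repository secrets in the real workflow), then display it.
# Defaults below are placeholders, not the project's real values.
cat > .env <<EOF
DB_HOST=127.0.0.1
DB_USER=${DB_USER:-root}
DB_PASSWORD=${DB_PASSWORD:-changeme}
DB_NAME=${DB_NAME:-Assignment3}
PORT=${PORT:-3000}
EOF
cat .env
```

Note that echoing a file containing secrets into CI logs is generally discouraged; GitHub masks registered secrets in log output, but caution is still warranted.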

## Prerequisites

Before running this workflow, ensure the following:

1. **GitHub Repository Secrets:**
- `DB_USER`, `DB_PASSWORD`, `DB_NAME`, `PORT`: Set these as repository secrets for secure access.
- `MY_SQL_PASSWORD`: Set the MySQL password as a secret.

## Usage

1. Open a pull request targeting the `main` branch.
2. Monitor the workflow progress and check the logs for each step.
3. The workflow will execute tests and MySQL configuration for the Node.js project.

## Additional Notes

- The workflow includes steps to configure the MySQL database, install dependencies, and run tests, providing a comprehensive CI process.
- Feel free to customize this workflow according to your project's specific requirements.

## License

This GitHub Actions workflow is licensed under the [MIT License](LICENSE).
Documentation/Packer.md (99 additions, 0 deletions)
# Packer Configuration for Creating Amazon Machine Image (AMI)

This Packer configuration is designed to create an Amazon Machine Image (AMI) for a web application. The resulting AMI includes the necessary configurations, files, and scripts for deploying the web application on an EC2 instance.

## Packer Variables

### profile

- Type: `string`
- Default: `dev_cli`
- Description: The AWS CLI profile to be used for building the AMI.

### source_ami

- Type: `string`
- Default: `ami-06db4d78cb1d3bbf9`
- Description: The source Amazon Machine Image (AMI) ID that will be used as a base for creating the custom AMI.

### instance_type

- Type: `string`
- Default: `t2.micro`
- Description: The EC2 instance type to be used for building the AMI.

### vpc_id

- Type: `string`
- Default: `vpc-055b7ed82be744193`
- Description: The ID of the Virtual Private Cloud (VPC) where the EC2 instance will be launched.

### subnet_id

- Type: `string`
- Default: `subnet-0edc53e23cb32476a`
- Description: The ID of the subnet within the VPC where the EC2 instance will be launched.

### region

- Type: `string`
- Default: `us-east-1`
- Description: The AWS region in which the AMI will be created.

### ssh_username

- Type: `string`
- Default: `admin`
- Description: The SSH username for connecting to the EC2 instance.

### ami_users

- Type: `list(string)`
- Default: `["026310524371", "009251910612"]`
- Description: A list of AWS account IDs that will have launch permission for the custom AMI.
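
Any of these variables can be overridden at build time with `-var`. The snippet below only echoes the invocation so it is side-effect-free; the values shown are the documented defaults:

```shell
# Sketch: overriding the declared Packer variables from the command line.
# The template file name matches the build command later in this document.
PACKER_VARS="-var profile=dev_cli -var region=us-east-1 -var instance_type=t2.micro"
echo packer build $PACKER_VARS packer-config.hcl
```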

## Packer Build Process

1. **Amazon Plugin Requirement**: Packer requires the `amazon` plugin at version 1.0.0 or later. Ensure it is installed before running the Packer build:

   ```sh
   packer init .
   ```

2. **Amazon Source Block**: The Packer configuration includes an EBS-backed Amazon source block named "debian". This block specifies the details for creating the custom AMI.

- `ami_name`: The name of the resulting AMI, including a timestamp for uniqueness.
- `profile`: The AWS CLI profile to be used.
- `source_ami`: The source Amazon Machine Image (AMI) ID.
- `instance_type`: The EC2 instance type.
- `vpc_id`: The ID of the Virtual Private Cloud (VPC).
- `subnet_id`: The ID of the subnet.
- `region`: The AWS region.
- `ssh_username`: The SSH username for connecting to the EC2 instance.
- `ami_users`: List of AWS account IDs with launch permissions.

3. **Packer Build Block**: The `build` block defines the build process, including provisioning steps.

- **File Provisioner**: Copies local files (`webapp.zip` and `nodeserver.service`) to the EC2 instance.
- **Shell Provisioner**: Executes shell scripts (`setup.sh`, `autostart.sh`, and `cloudwatch.sh`) on the EC2 instance for configuration.

4. **Post-Processor**: The Packer configuration includes a post-processor that creates a `manifest.json` file, providing information about the created AMI.
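
The manifest written by the post-processor can then be parsed to recover the new AMI ID, for example to hand it to a later deployment step. A sample manifest is inlined below so the sketch is self-contained (with `jq`, the equivalent would be `jq -r '.builds[-1].artifact_id'`):

```shell
# Sketch: extracting the AMI ID from the manifest.json the post-processor
# writes. The sample mirrors the manifest's "region:ami-id" artifact_id
# shape; the ID itself is made up.
cat > manifest.json <<'EOF'
{"builds":[{"artifact_id":"us-east-1:ami-0123456789abcdef0"}]}
EOF
AMI_ID=$(sed -n 's/.*"artifact_id":"[^:]*:\([^"]*\)".*/\1/p' manifest.json)
echo "AMI: $AMI_ID"
```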

## Building the AMI

To build the custom AMI, follow these steps:

1. Install Packer: [Packer Installation Guide](https://www.packer.io/docs/install)
2. Navigate to the directory containing the Packer configuration file.
3. Create a `.env` file with the required environment variables.
4. Run the following command:

   ```sh
   packer build packer-config.hcl
   ```

The resulting AMI will be available in your AWS account.

## License

This Packer configuration is licensed under the [MIT License](LICENSE).
Documentation/PackerValidate.md (37 additions, 0 deletions)
# GitHub Actions Workflow: Packer Template Validation

This GitHub Actions workflow is designed to validate a Packer template on every pull request to the `main` branch.

## Workflow Structure

The workflow has a single job named `Validation` that runs on an Ubuntu-latest environment. Here are the key steps:

1. **Check out code:** Fetches the latest code using the `actions/checkout` action.
2. **Create Zip Archive:** Generates a zip archive of the project files.
3. **Set up Packer:** Downloads and installs Packer, ensuring it's available in the environment.
4. **Initialize Packer:** Runs `packer init` to initialize the Packer configuration.
5. **Validate Packer Template:** Checks the validity of the Packer template using `packer validate`.
6. **Check Formatting:** Ensures the Packer template follows proper formatting using `packer fmt`.
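
Run locally, steps 4 through 6 amount to the commands below. The small wrapper only prints each command and skips it when Packer is absent, so the sketch runs anywhere; the template name follows the Prerequisites section:

```shell
# Sketch of the workflow's Packer checks, runnable locally.
# run_check prints each command and skips it if packer is not installed.
run_check() {
  echo "+ $*"
  if command -v packer >/dev/null 2>&1; then
    "$@" || echo "command failed"
  else
    echo "packer not installed; skipping"
  fi
}
run_check packer init aws-ubuntu.pkr.hcl
run_check packer validate aws-ubuntu.pkr.hcl
run_check packer fmt -check aws-ubuntu.pkr.hcl
```

`packer fmt -check` exits non-zero when the template needs reformatting, which is what lets the workflow fail a pull request on formatting issues.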

## Prerequisites

Before running this workflow, ensure the following:

1. **Packer Template:** Include the Packer template file (`aws-ubuntu.pkr.hcl`) in your repository.
2. **GitHub Repository Structure:** The repository should have a well-structured layout for Packer usage.

## Usage

1. Open a pull request targeting the `main` branch.
2. Monitor the workflow progress and check the logs for each step.
3. The workflow will validate the Packer template, ensuring it meets the required standards.

## Additional Notes

- This workflow is useful for maintaining the quality and correctness of your Packer templates.
- Make sure to address any validation or formatting issues reported by the workflow.
- Customize the workflow to include additional steps or checks as per your project's requirements.

## License

This GitHub Actions workflow is licensed under the [MIT License](LICENSE).
README.md (77 additions, 34 deletions)
# Assignment Management System πŸ“š

This Node.js application is an Assignment Management System that allows users to create, update, and delete assignments. It also supports assignment submissions with validation checks, such as the maximum number of attempts and submission deadlines. The system includes user authentication using basic HTTP authentication with email and password.

## Technologies Used πŸš€

- **Node.js**: The application is built using Node.js for server-side JavaScript programming.
- **Express**: Express is used as the web application framework for handling HTTP requests and responses.
- **Sequelize**: Sequelize is used as the ORM (Object-Relational Mapping) for interacting with the database.
- **bcrypt**: bcrypt is employed for password hashing and verification for user authentication.
- **AWS SDK**: The AWS SDK is used to interact with Amazon Simple Notification Service (SNS) for publishing messages.
- **Winston**: Winston is used for logging, providing both console and file-based logging.
- **CSV Parser**: The application can process CSV files for creating or updating user accounts.
- **StatsD**: StatsD is used for collecting custom application metrics.

## Database πŸ—ƒοΈ

The application uses Sequelize as the ORM to interact with the underlying relational database. The database schema includes tables for assignments, users, and submissions.

## API Endpoints 🌐

### Health Check

- `/healthz`: Performs a health check, verifying the connection to the database.

### Assignment Endpoints

- `GET /v1/assignments`: Retrieve a list of all assignments.
- `GET /v1/assignments/:id`: Retrieve details of a specific assignment by ID.
- `POST /v1/assignments`: Create a new assignment with various validations:
- Name must be a non-empty string.
- Points must be a number between 1 and 10.
- Number of attempts must be a positive integer.
- Deadline must be a valid date.
- `PUT /v1/assignments/:id`: Update an assignment (restricted to the owner) with similar validations as the creation.
- `DELETE /v1/assignments/:id`: Delete an assignment (restricted to the owner).

### Assignment Submission

- `POST /v1/assignments/:id/submission`: Submit an assignment with a submission URL and the following validations:
- Submission URL must be a non-empty string.
- Submission URL must be a valid URL ending with '.zip'.
- Users cannot submit more times than the specified number of attempts.
- Submissions after the deadline are not allowed.

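The URL check described above can be sketched in a few lines. This is an illustrative re-implementation in shell (the application itself performs the check in Node.js), with placeholder URLs:

```shell
# Illustrative version of the submission-URL validation: the URL must be
# non-empty and end with ".zip". A fuller check would validate the whole
# URL, not just the scheme and suffix.
is_valid_submission() {
  case "$1" in
    "") return 1 ;;
    http://*.zip|https://*.zip) return 0 ;;
    *) return 1 ;;
  esac
}
is_valid_submission "https://example.com/solution.zip" && echo "accepted"
is_valid_submission "https://example.com/solution.tar" || echo "rejected"
```
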
## Authentication πŸ”

User authentication is implemented using basic HTTP authentication. Users are required to include an Authorization header with their email and password encoded in base64.
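
For example, the header can be constructed like this (the email and password below are placeholders):

```shell
# Sketch: building the Basic auth header the API expects.
# Credentials are placeholders, not real accounts.
EMAIL="user@example.com"
PASSWORD="secret"
TOKEN=$(printf '%s:%s' "$EMAIL" "$PASSWORD" | base64)
echo "Authorization: Basic $TOKEN"
# With the server running locally:
#   curl -H "Authorization: Basic $TOKEN" http://localhost:3000/v1/assignments
```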

## Assignment Submission and SNS Integration πŸ“€

Users can submit assignments, and upon submission, a message is published to an AWS SNS topic. This can be used for notification purposes or integration with other services.

## Logging πŸ“

The application utilizes Winston for logging. Logs are output to both the console and a file (`app.log`).

## CSV Processing πŸ“Š

The application includes a function (`processCSVFile`) to read and process a CSV file, creating or updating user accounts based on the data.

## Configuration βš™οΈ

Configuration settings, such as database connection details and AWS credentials, are managed using environment variables.

```env
DB_HOST=127.0.0.1
DB_USER=**your-user**
DB_PASSWORD=**your-password**
DB_NAME=**your-dbname**
PORT=3000
CSVPATH="path-to-your-csv"
accessKeyId=**your-aws-access-key-id**
secretAccessKey=**your-secret-access-key**
region=us-east-1
topicarn=**your-topic-arn**
```

## Running the Application πŸš€

1. Install dependencies: `npm install`
2. Set environment variables in a `.env` file.
3. Run the application: `npm start`

The server will run on port 3000 by default.

## Testing πŸ§ͺ

The application includes basic integration tests. Run them with `npm test`.

## Deployment

The application can be deployed to a hosting service or cloud provider. Ensure that the necessary environment variables are configured for production use.

## Contributing

Contributions are welcome. Fork the repository, make changes, and submit a pull request.

## License

This project is licensed under the [MIT License](LICENSE).
