- An LLM that supports the standard OpenAI API
- A `data.pdf` file used as context (see the sketch below)
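The core idea is to extract the text of `data.pdf` and send it along with each user question to any endpoint that speaks the OpenAI chat-completion API. The real implementation lives in the repository's `chat` app, so the following is only a minimal sketch: `pypdf`, the `ask_llm()` helper name, and the model name are assumptions, not taken from the repo.

```python
import os

from openai import OpenAI    # works with any OpenAI-compatible endpoint via base_url
from pypdf import PdfReader  # assumption: pypdf is used for PDF text extraction

# Extract the text of the context document shipped with the bot.
reader = PdfReader("data.pdf")
context = "\n".join(page.extract_text() or "" for page in reader.pages)

# Point the client at any server that speaks the standard OpenAI API.
client = OpenAI(
    api_key=os.environ["OPENAI_API_KEY"],
    base_url=os.environ.get("OPENAI_API_BASE", "https://api.openai.com/v1"),
)

def ask_llm(question: str) -> str:
    """Answer a user question using the PDF text as grounding context."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; substitute the model your endpoint serves
        messages=[
            {"role": "system", "content": f"Answer using only this document:\n{context}"},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content
```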
# Virtualenv
virtualenv opsta-line-bot
source opsta-line-bot/bin/activate
# Install Requirement
pip install -r requirements.txt
# Export LINE channel access token, channel secret, and OpenAI API key
export LINE_CHANNEL_ACCESS_TOKEN=YOURTOKEN
export LINE_CHANNEL_SECRET=YOURSECRET
export OPENAI_API_KEY=YOUROPENAIKEY
# Or if you have .env file
export $(xargs <.env)
# This will run on port 5000
flask --app chat run --host 0.0.0.0 --port 5000
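The `chat` app that Flask serves is the LINE webhook receiver. Its actual code is in the repository; below is a hedged sketch of what such a handler typically looks like with the classic `line-bot-sdk` (v2) API. The `/callback` route path is an assumption, and `ask_llm()` is the hypothetical helper from the previous sketch.

```python
import os

from flask import Flask, request, abort
from linebot import LineBotApi, WebhookHandler
from linebot.exceptions import InvalidSignatureError
from linebot.models import MessageEvent, TextMessage, TextSendMessage

app = Flask(__name__)
line_bot_api = LineBotApi(os.environ["LINE_CHANNEL_ACCESS_TOKEN"])
handler = WebhookHandler(os.environ["LINE_CHANNEL_SECRET"])

@app.route("/callback", methods=["POST"])
def callback():
    # LINE signs every webhook call; reject anything that does not verify.
    signature = request.headers["X-Line-Signature"]
    body = request.get_data(as_text=True)
    try:
        handler.handle(body, signature)
    except InvalidSignatureError:
        abort(400)
    return "OK"

@handler.add(MessageEvent, message=TextMessage)
def handle_text(event):
    # Forward the user's text to the LLM and reply with its answer.
    # ask_llm() is the hypothetical PDF-grounded helper from the sketch above.
    answer = ask_llm(event.message.text)
    line_bot_api.reply_message(event.reply_token, TextSendMessage(text=answer))
```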
# Or run with Docker Compose
docker compose up
apiVersion: v1
kind: Secret
metadata:
  name: opsta-line-bot-secrets-dev
  namespace: demo-opsta-line-bot-dev
type: Opaque
stringData:
  LINE_CHANNEL_SECRET: CHANGEME
  LINE_CHANNEL_ACCESS_TOKEN: CHANGEME
  PDF_FILE: https://storage.googleapis.com/bucket/file.pdf
  OPENAI_API_BASE: https://api.openai.com/v1
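The `PDF_FILE` entry shows that the context document can also be supplied as a URL (for example from a GCS bucket) instead of the bundled `data.pdf`. Below is a minimal sketch of how the app could resolve that variable at startup, assuming the `requests` package; the actual logic in the `chat` app may differ.

```python
import io
import os

import requests
from pypdf import PdfReader

def load_context_text(default_path: str = "data.pdf") -> str:
    """Return the context PDF's text; PDF_FILE may be a local path or an http(s) URL."""
    source = os.environ.get("PDF_FILE", default_path)
    if source.startswith(("http://", "https://")):
        # Fetch the remote document (e.g. the GCS URL from the Secret) into memory,
        # which also avoids writing to the read-only root filesystem configured below.
        response = requests.get(source, timeout=30)
        response.raise_for_status()
        reader = PdfReader(io.BytesIO(response.content))
    else:
        reader = PdfReader(source)
    return "\n".join(page.extract_text() or "" for page in reader.pages)
```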
- Add the following to `iac/helm-values/*` to increase container security:
securityContext:
  readOnlyRootFilesystem: true
  allowPrivilegeEscalation: false
  runAsNonRoot: true
  runAsUser: 65532
  runAsGroup: 65532
  seccompProfile:
    type: RuntimeDefault
  capabilities:
    drop:
      - ALL
We are the DevSecOps Platform Specialist Team, focused on DevSecOps transformation through Platform Engineering. Our product, Opstella, is a DevSecOps Platform Engineering Portal and an Internal Developer Portal (IDP) designed to simplify development and operations. It helps teams accelerate and scale applications quickly while ensuring compliance with security standards. With Opstella, you can significantly reduce costs by centralizing and optimizing DevSecOps and Platform Engineering practices.