How to Serve TensorFlow Model Inference with FastAPI
In essence, our challenge is the container size: our container image comes in at 1.8 GB.
docker build -t simpleapps:latest .
docker run -p 8080:8080 simpleapps:latest
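The build and run commands above assume a Dockerfile in the project root. As a rough sketch of what that Dockerfile might look like (the filenames `requirements.txt` and `main.py`, and the use of a slim Python base image to keep the 1.8 GB image as small as possible, are assumptions, not taken from the original project):

```dockerfile
# Slim base image keeps the final image smaller than the full python image
FROM python:3.11-slim

WORKDIR /app

# Install dependencies first so Docker can cache this layer between builds
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code (FastAPI app assumed to live in main.py)
COPY . .

# Serve on the same port used by the `docker run -p 8080:8080` command above
EXPOSE 8080
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8080"]
```

Note that TensorFlow itself accounts for most of the image size; `--no-cache-dir` and a slim base image help, but the TensorFlow wheel alone is over 500 MB.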
Using FastAPI, our model can classify an uploaded image as either a cat or a dog.