hitting 512 MB memory quota on heroku after just 2 API calls #101
Labels: bug (Something isn't working), bug:performance (bugs related to the performance of the Nimbus system), added by mfekadu on Mar 2, 2020
Additional Problems
For a moment, when the answers were incorrect, I hypothesized that the wrong
a more broad memory profile

POST data

request to the endpoint:

    {
      "question": "Who is the contact for Color Coded?"
    }

response:

    {
      "answer": "Color Coded's advisor is Foaad Khosmood.",
      "session": "SOME_NEW_TOKEN"
    }

memory profile logs just after POST
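To make the reproduction concrete, here is a minimal sketch of building that POST request with the standard library. The base URL and port are assumptions for illustration, not the actual deployment URL; only the /ask path and the JSON body shape come from this issue.

```python
import json
import urllib.request

def build_ask_request(question, base_url="http://localhost:5000"):
    """Build a POST request for the /ask endpoint (base_url is a placeholder)."""
    body = json.dumps({"question": question}).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/ask",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_ask_request("Who is the contact for Color Coded?")
# Sending req with urllib.request.urlopen(req) against a running server
# should return a JSON body with "answer" and "session" keys, per the
# example response above.
```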
Bug Description
The Heroku Standard 1x Dyno has a 512 MB memory quota, and we have hit the limit after just 2 calls to the /ask endpoint.

Screenshots
(screenshots of the Heroku memory metrics)
Proposed solutions
1. switch to GCP and choose an appropriately sized virtual machine
   edit .github/workflows/*.yml to handle GCP deployment
2. scale the heroku dyno to standard-2x or something else
   This would get expensive, quickly...
3. identify memory usage & performance improvements for our code
   maybe avoid the use of pandas in QA and use import csv instead
   only read_csv is imported (api/QA.py, Line 10 in 48336b0)
   reference @zpdeng 's usage of the csv module (api/database_wrapper.py, Lines 426 to 433 in 48336b0)
   it may be a good idea to actually test if pandas is truly hogging memory
   maybe spacy? maybe nltk? maybe SQLAlchemy? maybe Flask? maybe gcloud? maybe another_package_expected_to_be_large?
   consider this output of python3 -m memory_profiler flask_api.py
   (memory profile)
   (git diff)
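For solution 3, a sketch of what the pandas-to-csv swap could look like. The column names and data here are hypothetical stand-ins for the Nimbus QA spreadsheet; the point is that csv.DictReader streams one row at a time and avoids importing the pandas/numpy stack at all, so peak memory stays close to the size of a single row.

```python
import csv
import io

# Hypothetical stand-in for the QA data that is currently loaded with
# pandas.read_csv; a real replacement would open the CSV file instead.
RAW = "question_format,answer_format\nWho is [PROF]?,[PROF] teaches [COURSE]\n"

# DictReader yields plain dicts row by row, with no DataFrame allocation
# and no pandas import cost.
rows = list(csv.DictReader(io.StringIO(RAW)))
print(rows[0]["question_format"])  # → Who is [PROF]?
```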
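And for actually testing whether pandas (or spacy, nltk, etc.) is truly hogging memory: memory_profiler gives line-by-line numbers but must be installed and the target functions decorated, while the stdlib tracemalloc module can answer the coarser question cheaply. A minimal sketch, with a large list allocation standing in for a heavy package import:

```python
import tracemalloc

tracemalloc.start()
before, _ = tracemalloc.get_traced_memory()

# Stand-in for a suspect import/allocation, e.g. `import pandas`.
data = [dict(row=i) for i in range(50_000)]

after, peak = tracemalloc.get_traced_memory()
tracemalloc.stop()
print(f"grew by ~{(after - before) / 1e6:.1f} MB (peak {peak / 1e6:.1f} MB)")
```

Running one such measurement per suspect package would show which of them, if any, dominates the 512 MB quota.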