
Error code 500 #12

Open
sahilmtayade opened this issue Sep 7, 2023 · 3 comments
Labels
bug Something isn't working

Comments

@sahilmtayade

Describe the bug
Successfully started the container on a GPU machine using plain Docker:
docker run -it -d --gpus 0 -p 5000:5000 graykode/ai-docstring
The extension activates, but requests then fail with error code 500.

Versions (please complete the following information):

  • autoDocstring Version:
  • Operating System:
  • Vscode Version:

Original Code (with line to generate on):

def gaussDeriv2D(sigma):
    # generate on this line
    # Your code to generate Gx and Gy here. Use the partial derivatives of the Gaussian equation
    Gy = sigma * 5
    Gx = sigma * 3

    return Gx, Gy
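For context on what the quoted stub is meant to compute, a minimal sketch of Gaussian partial-derivative kernels is below. This is purely illustrative and not from the issue: the function name, the `radius` parameter, and the 3-sigma cutoff are all assumptions.

```python
import numpy as np

def gauss_deriv_2d(sigma, radius=None):
    """Illustrative sketch: first partial derivatives of a 2D Gaussian,
    Gx = dG/dx and Gy = dG/dy, sampled on an integer grid."""
    if radius is None:
        radius = int(np.ceil(3 * sigma))  # cover ~3 standard deviations
    ax = np.arange(-radius, radius + 1)
    x, y = np.meshgrid(ax, ax)
    # 2D Gaussian G(x, y) with standard deviation sigma
    g = np.exp(-(x**2 + y**2) / (2 * sigma**2)) / (2 * np.pi * sigma**2)
    # d/dx of the Gaussian multiplies it by -x / sigma^2 (likewise for y)
    Gx = -x / sigma**2 * g
    Gy = -y / sigma**2 * g
    return Gx, Gy
```

Because the Gaussian is separable and symmetric, `Gx` is the transpose of `Gy`, and each kernel sums to (numerically) zero.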

Expected Result:

"""
Docstring generated
"""

Actual Result:
"""AI is creating summary for gaussDeriv2D

Args:
    sigma ([type]): [description]

Returns:
    [type]: [description]
"""

Stack trace:

[INFO 23:26:14.797] Generating Docstring at line: 1
[INFO 23:26:14.797] Docstring generated:
    """${1:AI is creating summary for gaussDeriv2D}

    Args:
        sigma (${2:[type]}): ${3:[description]}

    Returns:
        ${4:[type]}: ${5:[description]}
    """
[INFO 23:26:14.797] Inserting at position: 1 0
[INFO 23:26:14.811] Successfully inserted docstring
[ERROR 23:26:14.837] Error: Request failed with status code 500

Additional context
Add any other context about the problem here.

@sahilmtayade sahilmtayade added the bug Something isn't working label Sep 7, 2023
@sahilmtayade
Author

On Windows, not WSL

@sahilmtayade
Author

When running in a WSL terminal (Ubuntu), I get a "socket hung up" error instead of code 500.

@sahilmtayade
Author

Installed NVIDIA Docker on WSL Ubuntu 20.04 and started the container. It turns out my GPU is too new for the PyTorch build inside the image. I don't know how to fix this.

2023-09-07 00:21:59 09/07/2023 04:21:59 - INFO - utils -   Namespace(beam_size=10, device=device(type='cuda'), device_name='cuda', host='0.0.0.0', max_source_length=512, max_target_length=128, no_cuda=False, port=5000)
2023-09-07 00:21:59 09/07/2023 04:21:59 - INFO - filelock -   Lock 139890302154512 acquired on /root/.cache/torch/transformers/1b62771d5f5169b34713b0af1ab85d80e11f7b1812fbf3ee7d03a866c5f58e72.06eb31f0a63f4e8a136733ccac422f0abf9ffa87c3e61104b57e7075a704d008.lock
Downloading: 100%|##########| 498/498 [00:00<00:00, 469kB/s]
2023-09-07 00:21:59 09/07/2023 04:21:59 - INFO - filelock -   Lock 139890302154512 released on /root/.cache/torch/transformers/1b62771d5f5169b34713b0af1ab85d80e11f7b1812fbf3ee7d03a866c5f58e72.06eb31f0a63f4e8a136733ccac422f0abf9ffa87c3e61104b57e7075a704d008.lock
2023-09-07 00:22:00 09/07/2023 04:22:00 - INFO - filelock -   Lock 139890302154512 acquired on /root/.cache/torch/transformers/aca4dbdf4f074d4e071c2664901fec33c8aa69c35aa0101bc669ed4b44d1f6c3.6a4061e8fc00057d21d80413635a86fdcf55b6e7594ad9e25257d2f99a02f4be.lock
Downloading: 100%|##########| 899k/899k [00:00<00:00, 2.58MB/s]
2023-09-07 00:22:00 09/07/2023 04:22:00 - INFO - filelock -   Lock 139890302154512 released on /root/.cache/torch/transformers/aca4dbdf4f074d4e071c2664901fec33c8aa69c35aa0101bc669ed4b44d1f6c3.6a4061e8fc00057d21d80413635a86fdcf55b6e7594ad9e25257d2f99a02f4be.lock
2023-09-07 00:22:00 09/07/2023 04:22:00 - INFO - filelock -   Lock 139890830138672 acquired on /root/.cache/torch/transformers/779a2f0c38ba2ff65d9a3ee23e58db9568f44a20865c412365e3dc540f01743f.70bec105b4158ed9a1747fea67a43f5dee97855c64d62b6ec3742f4cfdb5feda.lock
Downloading: 100%|##########| 456k/456k [00:00<00:00, 2.24MB/s]
2023-09-07 00:22:01 09/07/2023 04:22:01 - INFO - filelock -   Lock 139890830138672 released on /root/.cache/torch/transformers/779a2f0c38ba2ff65d9a3ee23e58db9568f44a20865c412365e3dc540f01743f.70bec105b4158ed9a1747fea67a43f5dee97855c64d62b6ec3742f4cfdb5feda.lock
2023-09-07 00:22:01 09/07/2023 04:22:01 - INFO - filelock -   Lock 139890830138672 acquired on /root/.cache/torch/transformers/5a191080da4f00859b5d3d29529f57894583e00ab07b7c940d65c33db4b25d4d.16f949018cf247a2ea7465a74ca9a292212875e5fd72f969e0807011e7f192e4.lock
Downloading: 100%|##########| 150/150 [00:00<00:00, 107kB/s]
2023-09-07 00:22:02 09/07/2023 04:22:02 - INFO - filelock -   Lock 139890830138672 released on /root/.cache/torch/transformers/5a191080da4f00859b5d3d29529f57894583e00ab07b7c940d65c33db4b25d4d.16f949018cf247a2ea7465a74ca9a292212875e5fd72f969e0807011e7f192e4.lock
2023-09-07 00:22:02 09/07/2023 04:22:02 - INFO - filelock -   Lock 139890302154512 acquired on /root/.cache/torch/transformers/1b4723c5fb2d933e11c399450ea233aaf33f093b5cbef3ec864624735380e490.70b5dbd5d3b9b4c9bfb3d1f6464291ff52f6a8d96358899aa3834e173b45092d.lock
Downloading: 100%|##########| 25.0/25.0 [00:00<00:00, 18.7kB/s]
2023-09-07 00:22:02 09/07/2023 04:22:02 - INFO - filelock -   Lock 139890302154512 released on /root/.cache/torch/transformers/1b4723c5fb2d933e11c399450ea233aaf33f093b5cbef3ec864624735380e490.70b5dbd5d3b9b4c9bfb3d1f6464291ff52f6a8d96358899aa3834e173b45092d.lock
2023-09-07 00:22:07 /usr/local/lib/python3.6/dist-packages/torch/cuda/__init__.py:125: UserWarning: 
2023-09-07 00:22:07 NVIDIA GeForce RTX 3060 Laptop GPU with CUDA capability sm_86 is not compatible with the current PyTorch installation.
2023-09-07 00:22:07 The current PyTorch install supports CUDA capabilities sm_37 sm_50 sm_60 sm_70 sm_75.
2023-09-07 00:22:07 If you want to use the NVIDIA GeForce RTX 3060 Laptop GPU GPU with PyTorch, please check the instructions at https://pytorch.org/get-started/locally/
2023-09-07 00:22:07 
2023-09-07 00:22:07   warnings.warn(incompatible_device_warn.format(device_name, capability, " ".join(arch_list), device_name))
2023-09-07 00:22:09  * Serving Flask app "app" (lazy loading)
2023-09-07 00:22:09  * Environment: production
2023-09-07 00:22:09    WARNING: This is a development server. Do not use it in a production deployment.
2023-09-07 00:22:09    Use a production WSGI server instead.
2023-09-07 00:22:09  * Debug mode: on
2023-09-07 00:22:09 09/07/2023 04:22:09 - INFO - werkzeug -    * Running on http://0.0.0.0:5000/ (Press CTRL+C to quit)
2023-09-07 00:22:09 09/07/2023 04:22:09 - INFO - werkzeug -    * Restarting with stat
2023-09-07 00:22:10 09/07/2023 04:22:10 - INFO - utils -   Namespace(beam_size=10, device=device(type='cuda'), device_name='cuda', host='0.0.0.0', max_source_length=512, max_target_length=128, no_cuda=False, port=5000)
2023-09-07 00:22:15 /usr/local/lib/python3.6/dist-packages/torch/cuda/__init__.py:125: UserWarning: 
2023-09-07 00:22:15 NVIDIA GeForce RTX 3060 Laptop GPU with CUDA capability sm_86 is not compatible with the current PyTorch installation.
2023-09-07 00:22:15 The current PyTorch install supports CUDA capabilities sm_37 sm_50 sm_60 sm_70 sm_75.
2023-09-07 00:22:15 If you want to use the NVIDIA GeForce RTX 3060 Laptop GPU GPU with PyTorch, please check the instructions at https://pytorch.org/get-started/locally/
2023-09-07 00:22:15 
2023-09-07 00:22:15   warnings.warn(incompatible_device_warn.format(device_name, capability, " ".join(arch_list), device_name))
2023-09-07 00:22:15 09/07/2023 04:22:15 - WARNING - werkzeug -    * Debugger is active!
2023-09-07 00:22:15 09/07/2023 04:22:15 - INFO - werkzeug -    * Debugger PIN: 195-397-013
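The `UserWarning` in the log above can be confirmed from inside the container with a short diagnostic; this is a sketch that assumes a PyTorch build recent enough to provide `torch.cuda.get_arch_list()` (the image's actual PyTorch version may differ).

```python
import torch

# Architectures this PyTorch build was compiled for, e.g. up to sm_75
# in the image, versus the compute capability of the installed GPU
# (sm_86 for an RTX 3060) -- a capability missing from the build's
# list reproduces the incompatibility warning in the log.
print("built for:", torch.cuda.get_arch_list())
if torch.cuda.is_available():
    major, minor = torch.cuda.get_device_capability(0)
    print(f"GPU capability: sm_{major}{minor}")
```

The usual fixes for this class of mismatch are rebuilding the image with a PyTorch wheel compiled for CUDA 11+ (which includes sm_86 kernels), or falling back to CPU inference; whether this image exposes a CPU option is not confirmed here.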

Repository owner deleted a comment from Azerxim Feb 6, 2024
Repository owner deleted a comment from Vader-WenboZhao Feb 23, 2024