
500 failed request #2

Open
jp555soul opened this issue Dec 27, 2022 · 2 comments
@jp555soul commented Dec 27, 2022

Any suggestions on how to get more information about this error?

AxiosError: Request failed with status code 500
    at settle (/Users/Documents/workspace/adva/node_modules/axios/dist/node/axios.cjs:1855:12)
    at IncomingMessage.handleStreamEnd (/Users/Documents/workspace/adva/node_modules/axios/dist/node/axios.cjs:2712:11)
    at IncomingMessage.emit (node:events:539:35)
    at endReadableNT (node:internal/streams/readable:1345:12)
    at processTicksAndRejections (node:internal/process/task_queues:83:21) {
  code: 'ERR_BAD_RESPONSE',
  config: {
    transitional: {
      silentJSONParsing: true,
      forcedJSONParsing: true,
      clarifyTimeoutError: false
    },
    adapter: [ 'xhr', 'http' ],
    transformRequest: [ [Function: transformRequest] ],
    transformResponse: [ [Function: transformResponse] ],
    timeout: 0,
    xsrfCookieName: 'XSRF-TOKEN',
    xsrfHeaderName: 'X-XSRF-TOKEN',
    maxContentLength: -1,
    maxBodyLength: -1,
    env: { FormData: [Function], Blob: null },
    validateStatus: [Function: validateStatus],
    headers: AxiosHeaders {
      Accept: 'application/json, text/plain, */*',
      'Content-Type': 'application/json',
      'User-Agent': 'axios/1.2.1',
      'Content-Length': '63',
      'Accept-Encoding': 'gzip, compress, deflate, br'
    },
    method: 'post',
    url: 'http://xx.8x.12.10x:200xx/generate',
    data: '{"prompt":"Write an image description of a NASA rocket launch"}'
  },
  request: <ref *1> ClientRequest {
    _events: [Object: null prototype] {
      abort: [Function (anonymous)],
      aborted: [Function (anonymous)],
      connect: [Function (anonymous)],
      error: [Function (anonymous)],
      socket: [Function (anonymous)],
      timeout: [Function (anonymous)],
      prefinish: [Function: requestOnPrefinish]
    },
    _eventsCount: 7,
    _maxListeners: undefined,
    outputData: [],
    outputSize: 0,
    writable: true,
    destroyed: false,
    _last: true,
    chunkedEncoding: false,
    shouldKeepAlive: false,
    maxRequestsOnConnectionReached: false,
    _defaultKeepAlive: true,
    useChunkedEncodingByDefault: true,
    sendDate: false,
    _removedConnection: false,
    _removedContLen: false,
    _removedTE: false,
    _contentLength: null,
    _hasBody: true,
    _trailer: '',
    finished: true,
    _headerSent: true,
    _closed: false,
    socket: Socket {
      connecting: false,
      _hadError: false,
      _parent: null,
      _host: null,
      _readableState: [ReadableState],
      _events: [Object: null prototype],
      _eventsCount: 7,
      _maxListeners: undefined,
      _writableState: [WritableState],
      allowHalfOpen: false,
      _sockname: null,
      _pendingData: null,
      _pendingEncoding: '',
      server: null,
      _server: null,
      parser: null,
      _httpMessage: [Circular *1],
      [Symbol(async_id_symbol)]: 108,
      [Symbol(kHandle)]: [TCP],
      [Symbol(lastWriteQueueSize)]: 0,
      [Symbol(timeout)]: null,
      [Symbol(kBuffer)]: null,
      [Symbol(kBufferCb)]: null,
      [Symbol(kBufferGen)]: null,
      [Symbol(kCapture)]: false,
      [Symbol(kSetNoDelay)]: false,
      [Symbol(kSetKeepAlive)]: true,
      [Symbol(kSetKeepAliveInitialDelay)]: 60,
      [Symbol(kBytesRead)]: 0,
      [Symbol(kBytesWritten)]: 0,
      [Symbol(RequestTimeout)]: undefined
    },
    _header: 'POST /generate HTTP/1.1\r\n' +
      'Accept: application/json, text/plain, */*\r\n' +
      'Content-Type: application/json\r\n' +
      'User-Agent: axios/1.2.1\r\n' +
      'Content-Length: 63\r\n' +
      'Accept-Encoding: gzip, compress, deflate, br\r\n' +
      'Host: xx.8x.1x.10x:200xx\r\n' +
      'Connection: close\r\n' +
      '\r\n',
    _keepAliveTimeout: 0,
    _onPendingData: [Function: nop],
    agent: Agent {
      _events: [Object: null prototype],
      _eventsCount: 2,
      _maxListeners: undefined,
      defaultPort: 80,
      protocol: 'http:',
      options: [Object: null prototype],
      requests: [Object: null prototype] {},
      sockets: [Object: null prototype],
      freeSockets: [Object: null prototype] {},
      keepAliveMsecs: 1000,
      keepAlive: false,
      maxSockets: Infinity,
      maxFreeSockets: 256,
      scheduling: 'lifo',
      maxTotalSockets: Infinity,
      totalSocketCount: 1,
      [Symbol(kCapture)]: false
    },
    socketPath: undefined,
    method: 'POST',
    maxHeaderSize: undefined,
    insecureHTTPParser: undefined,
    path: '/generate',
    _ended: true,
    res: IncomingMessage {
      _readableState: [ReadableState],
      _events: [Object: null prototype],
      _eventsCount: 4,
      _maxListeners: undefined,
      socket: [Socket],
      httpVersionMajor: 1,
      httpVersionMinor: 1,
      httpVersion: '1.1',
      complete: true,
      rawHeaders: [Array],
      rawTrailers: [],
      aborted: false,
      upgrade: false,
      url: '',
      method: null,
      statusCode: 500,
      statusMessage: 'Internal Server Error',
      client: [Socket],
      _consuming: false,
      _dumped: false,
      req: [Circular *1],
      responseUrl: 'http://xx.8x.12.10x:200xx/generate',
      redirects: [],
      [Symbol(kCapture)]: false,
      [Symbol(kHeaders)]: [Object],
      [Symbol(kHeadersCount)]: 10,
      [Symbol(kTrailers)]: null,
      [Symbol(kTrailersCount)]: 0,
      [Symbol(RequestTimeout)]: undefined
    },
    aborted: false,
    timeoutCb: null,
    upgradeOrConnect: false,
    parser: null,
    maxHeadersCount: null,
    reusedSocket: false,
    host: 'xx.8x.12.10x:200xx',
    protocol: 'http:',
    _redirectable: Writable {
      _writableState: [WritableState],
      _events: [Object: null prototype],
      _eventsCount: 3,
      _maxListeners: undefined,
      _options: [Object],
      _ended: true,
      _ending: true,
      _redirectCount: 0,
      _redirects: [],
      _requestBodyLength: 63,
      _requestBodyBuffers: [],
      _onNativeResponse: [Function (anonymous)],
      _currentRequest: [Circular *1],
      _currentUrl: 'http://xx.8x.12.10x:200xx/generate',
      [Symbol(kCapture)]: false
    },
    [Symbol(kCapture)]: false,
    [Symbol(kNeedDrain)]: false,
    [Symbol(corked)]: 0,
    [Symbol(kOutHeaders)]: [Object: null prototype] {
      accept: [Array],
      'content-type': [Array],
      'user-agent': [Array],
      'content-length': [Array],
      'accept-encoding': [Array],
      host: [Array]
    }
  },
  response: {
    status: 500,
    statusText: 'Internal Server Error',
    headers: AxiosHeaders {
      date: 'Tue, 27 Dec 2022 19:08:27 GMT',
      server: 'uvicorn',
      'content-length': '21',
      'content-type': 'text/plain; charset=utf-8',
      connection: 'close'
    },
    config: {
      transitional: [Object],
      adapter: [Array],
      transformRequest: [Array],
      transformResponse: [Array],
      timeout: 0,
      xsrfCookieName: 'XSRF-TOKEN',
      xsrfHeaderName: 'X-XSRF-TOKEN',
      maxContentLength: -1,
      maxBodyLength: -1,
      env: [Object],
      validateStatus: [Function: validateStatus],
      headers: [AxiosHeaders],
      method: 'post',
      url: 'http://xx.8x.12.10x:200xx/generate',
      data: '{"prompt":"Write an image description of a NASA rocket launch"}'
    },
    request: <ref *1> ClientRequest {
      _events: [Object: null prototype],
      _eventsCount: 7,
      _maxListeners: undefined,
      outputData: [],
      outputSize: 0,
      writable: true,
      destroyed: false,
      _last: true,
      chunkedEncoding: false,
      shouldKeepAlive: false,
      maxRequestsOnConnectionReached: false,
      _defaultKeepAlive: true,
      useChunkedEncodingByDefault: true,
      sendDate: false,
      _removedConnection: false,
      _removedContLen: false,
      _removedTE: false,
      _contentLength: null,
      _hasBody: true,
      _trailer: '',
      finished: true,
      _headerSent: true,
      _closed: false,
      socket: [Socket],
      _header: 'POST /generate HTTP/1.1\r\n' +
        'Accept: application/json, text/plain, */*\r\n' +
        'Content-Type: application/json\r\n' +
        'User-Agent: axios/1.2.1\r\n' +
        'Content-Length: 63\r\n' +
        'Accept-Encoding: gzip, compress, deflate, br\r\n' +
        'Host: xx.8x.12.10x:200xx\r\n' +
        'Connection: close\r\n' +
        '\r\n',
      _keepAliveTimeout: 0,
      _onPendingData: [Function: nop],
      agent: [Agent],
      socketPath: undefined,
      method: 'POST',
      maxHeaderSize: undefined,
      insecureHTTPParser: undefined,
      path: '/generate',
      _ended: true,
      res: [IncomingMessage],
      aborted: false,
      timeoutCb: null,
      upgradeOrConnect: false,
      parser: null,
      maxHeadersCount: null,
      reusedSocket: false,
      host: 'xx.8x.12.10x:200xx',
      protocol: 'http:',
      _redirectable: [Writable],
      [Symbol(kCapture)]: false,
      [Symbol(kNeedDrain)]: false,
      [Symbol(corked)]: 0,
      [Symbol(kOutHeaders)]: [Object: null prototype]
    },
    data: 'Internal Server Error'
  }
}
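For getting more detail out of the client side: the useful fields on the error above are response.status and response.data (here the body is just "Internal Server Error", so the real detail lives in the server logs). As a minimal way to reproduce the same POST and print whatever the server returns, here is a sketch using Python's standard-library urllib; the real host/port are redacted in the issue, so BASE_URL below is a placeholder you must replace.

```python
import json
import urllib.error
import urllib.request

# Placeholder address; the real host/port are redacted in the issue.
BASE_URL = "http://203.0.113.10:20000"

def build_request(prompt: str) -> urllib.request.Request:
    # Same body and headers as the Axios request shown in the error dump.
    body = json.dumps({"prompt": prompt}).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/generate",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def post_generate(prompt: str) -> None:
    # Print status and body for both success and error responses;
    # urllib raises HTTPError on a 500, and the error object still
    # carries the status line and the response body the server sent.
    req = build_request(prompt)
    try:
        with urllib.request.urlopen(req) as resp:
            print(resp.status, resp.read().decode())
    except urllib.error.HTTPError as e:
        print(e.code, e.reason, e.read().decode())
```

If the body is as terse as it is here, docker compose logs on the server (as suggested below) is the next place to look.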
@monatis (Owner) commented Dec 28, 2022

Hi @jp555soul, you can check out the logs with the command docker compose logs. If that doesn't give a clue about what's going wrong, share the request you're trying and I'll have a look.

@jp555soul (Author) commented

Hey @monatis, thank you!

Output from the docker logs is below.

Based on some searching I'm not seeing a clear-cut solution. I've also attached a screenshot of the hardware to double-check it's not a memory issue.

[Attached image: Screenshot 2022-12-28 at 10 33 04 AM (hardware/memory usage)]

stable-diffusion-tf-docker-app-1  | 2022-12-27 19:08:37.858046: I tensorflow/core/common_runtime/bfc_allocator.cc:1097] 1 Chunks of size 176947200 totalling 168.75MiB
stable-diffusion-tf-docker-app-1  | 2022-12-27 19:08:37.858144: I tensorflow/core/common_runtime/bfc_allocator.cc:1097] 2 Chunks of size 8589934592 totalling 16.00GiB
stable-diffusion-tf-docker-app-1  | 2022-12-27 19:08:37.858244: I tensorflow/core/common_runtime/bfc_allocator.cc:1101] Sum Total of in-use chunks: 20.46GiB
stable-diffusion-tf-docker-app-1  | 2022-12-27 19:08:37.858337: I tensorflow/core/common_runtime/bfc_allocator.cc:1103] total_region_allocated_bytes_: 23381475328 memory_limit_: 23381475328 available bytes: 0 curr_region_allocation_bytes_: 46762950656
stable-diffusion-tf-docker-app-1  | 2022-12-27 19:08:37.858366: I tensorflow/core/common_runtime/bfc_allocator.cc:1109] Stats:
stable-diffusion-tf-docker-app-1  | Limit:                     23381475328
stable-diffusion-tf-docker-app-1  | InUse:                     21972676864
stable-diffusion-tf-docker-app-1  | MaxInUse:                  21972777472
stable-diffusion-tf-docker-app-1  | NumAllocs:                        6794
stable-diffusion-tf-docker-app-1  | MaxAllocSize:               8589934592
stable-diffusion-tf-docker-app-1  | Reserved:                            0
stable-diffusion-tf-docker-app-1  | PeakReserved:                        0
stable-diffusion-tf-docker-app-1  | LargestFreeBlock:                    0
stable-diffusion-tf-docker-app-1  |
stable-diffusion-tf-docker-app-1  | 2022-12-27 19:08:37.858497: W tensorflow/core/common_runtime/bfc_allocator.cc:491] ***********************************************************************************************_____
stable-diffusion-tf-docker-app-1  | 2022-12-27 19:08:37.858619: W tensorflow/core/framework/op_kernel.cc:1780] OP_REQUIRES failed at softmax_op_gpu.cu.cc:222 : RESOURCE_EXHAUSTED: OOM when allocating tensor with shape[1,8,16384,16384] and type float on /job:localhost/replica:0/task:0/device:GPU:0 by allocator GPU_0_bfc
stable-diffusion-tf-docker-app-1  | INFO:     76.90.83.17:54975 - "POST /generate HTTP/1.1" 500 Internal Server Error
 49 981:   0%|          | 0/50 [00:10<?, ?it/s]
stable-diffusion-tf-docker-app-1  | ERROR:    Exception in ASGI application
stable-diffusion-tf-docker-app-1  | Traceback (most recent call last):
stable-diffusion-tf-docker-app-1  |   File "/usr/local/lib/python3.8/dist-packages/uvicorn/protocols/http/httptools_impl.py", line 419, in run_asgi
stable-diffusion-tf-docker-app-1  |     result = await app(  # type: ignore[func-returns-value]
stable-diffusion-tf-docker-app-1  |   File "/usr/local/lib/python3.8/dist-packages/uvicorn/middleware/proxy_headers.py", line 78, in __call__
stable-diffusion-tf-docker-app-1  |     return await self.app(scope, receive, send)
stable-diffusion-tf-docker-app-1  |   File "/usr/local/lib/python3.8/dist-packages/fastapi/applications.py", line 270, in __call__
stable-diffusion-tf-docker-app-1  |     await super().__call__(scope, receive, send)
stable-diffusion-tf-docker-app-1  |   File "/usr/local/lib/python3.8/dist-packages/starlette/applications.py", line 124, in __call__
stable-diffusion-tf-docker-app-1  |     await self.middleware_stack(scope, receive, send)
stable-diffusion-tf-docker-app-1  |   File "/usr/local/lib/python3.8/dist-packages/starlette/middleware/errors.py", line 184, in __call__
stable-diffusion-tf-docker-app-1  |     raise exc
stable-diffusion-tf-docker-app-1  |   File "/usr/local/lib/python3.8/dist-packages/starlette/middleware/errors.py", line 162, in __call__
stable-diffusion-tf-docker-app-1  |     await self.app(scope, receive, _send)
stable-diffusion-tf-docker-app-1  |   File "/usr/local/lib/python3.8/dist-packages/starlette/middleware/exceptions.py", line 79, in __call__
stable-diffusion-tf-docker-app-1  |     raise exc
stable-diffusion-tf-docker-app-1  |   File "/usr/local/lib/python3.8/dist-packages/starlette/middleware/exceptions.py", line 68, in __call__
stable-diffusion-tf-docker-app-1  |     await self.app(scope, receive, sender)
stable-diffusion-tf-docker-app-1  |   File "/usr/local/lib/python3.8/dist-packages/fastapi/middleware/asyncexitstack.py", line 21, in __call__
stable-diffusion-tf-docker-app-1  |     raise e
stable-diffusion-tf-docker-app-1  |   File "/usr/local/lib/python3.8/dist-packages/fastapi/middleware/asyncexitstack.py", line 18, in __call__
stable-diffusion-tf-docker-app-1  |     await self.app(scope, receive, send)
stable-diffusion-tf-docker-app-1  |   File "/usr/local/lib/python3.8/dist-packages/starlette/routing.py", line 706, in __call__
stable-diffusion-tf-docker-app-1  |     await route.handle(scope, receive, send)
stable-diffusion-tf-docker-app-1  |   File "/usr/local/lib/python3.8/dist-packages/starlette/routing.py", line 276, in handle
stable-diffusion-tf-docker-app-1  |     await self.app(scope, receive, send)
stable-diffusion-tf-docker-app-1  |   File "/usr/local/lib/python3.8/dist-packages/starlette/routing.py", line 66, in app
stable-diffusion-tf-docker-app-1  |     response = await func(request)
stable-diffusion-tf-docker-app-1  |   File "/usr/local/lib/python3.8/dist-packages/fastapi/routing.py", line 235, in app
stable-diffusion-tf-docker-app-1  |     raw_response = await run_endpoint_function(
stable-diffusion-tf-docker-app-1  |   File "/usr/local/lib/python3.8/dist-packages/fastapi/routing.py", line 163, in run_endpoint_function
stable-diffusion-tf-docker-app-1  |     return await run_in_threadpool(dependant.call, **values)
stable-diffusion-tf-docker-app-1  |   File "/usr/local/lib/python3.8/dist-packages/starlette/concurrency.py", line 41, in run_in_threadpool
stable-diffusion-tf-docker-app-1  |     return await anyio.to_thread.run_sync(func, *args)
stable-diffusion-tf-docker-app-1  |   File "/usr/local/lib/python3.8/dist-packages/anyio/to_thread.py", line 31, in run_sync
stable-diffusion-tf-docker-app-1  |     return await get_asynclib().run_sync_in_worker_thread(
stable-diffusion-tf-docker-app-1  |   File "/usr/local/lib/python3.8/dist-packages/anyio/_backends/_asyncio.py", line 937, in run_sync_in_worker_thread
stable-diffusion-tf-docker-app-1  |     return await future
stable-diffusion-tf-docker-app-1  |   File "/usr/local/lib/python3.8/dist-packages/anyio/_backends/_asyncio.py", line 867, in run
stable-diffusion-tf-docker-app-1  |     result = context.run(func, *args)
stable-diffusion-tf-docker-app-1  |   File "/app/./app.py", line 45, in generate
stable-diffusion-tf-docker-app-1  |     img = generator.generate(req.prompt, num_steps=req.steps, unconditional_guidance_scale=req.scale, temperature=1, batch_size=1, seed=req.seed)
stable-diffusion-tf-docker-app-1  |   File "/usr/local/lib/python3.8/dist-packages/stable_diffusion_tf/stable_diffusion.py", line 116, in generate
stable-diffusion-tf-docker-app-1  |     e_t = self.get_model_output(
stable-diffusion-tf-docker-app-1  |   File "/usr/local/lib/python3.8/dist-packages/stable_diffusion_tf/stable_diffusion.py", line 190, in get_model_output
stable-diffusion-tf-docker-app-1  |     unconditional_latent = self.diffusion_model.predict_on_batch(
stable-diffusion-tf-docker-app-1  |   File "/usr/local/lib/python3.8/dist-packages/keras/engine/training.py", line 2474, in predict_on_batch
stable-diffusion-tf-docker-app-1  |     outputs = self.predict_function(iterator)
stable-diffusion-tf-docker-app-1  |   File "/usr/local/lib/python3.8/dist-packages/tensorflow/python/util/traceback_utils.py", line 153, in error_handler
stable-diffusion-tf-docker-app-1  |     raise e.with_traceback(filtered_tb) from None
stable-diffusion-tf-docker-app-1  |   File "/usr/local/lib/python3.8/dist-packages/tensorflow/python/eager/execute.py", line 54, in quick_execute
stable-diffusion-tf-docker-app-1  |     tensors = pywrap_tfe.TFE_Py_Execute(ctx._handle, device_name, op_name,
stable-diffusion-tf-docker-app-1  | tensorflow.python.framework.errors_impl.ResourceExhaustedError: Graph execution error:
stable-diffusion-tf-docker-app-1  |
stable-diffusion-tf-docker-app-1  | Detected at node 'model_1/u_net_model/spatial_transformer/basic_transformer_block/cross_attention/Softmax' defined at (most recent call last):
stable-diffusion-tf-docker-app-1  |     File "/usr/lib/python3.8/threading.py", line 890, in _bootstrap
stable-diffusion-tf-docker-app-1  |       self._bootstrap_inner()
stable-diffusion-tf-docker-app-1  |     File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
stable-diffusion-tf-docker-app-1  |       self.run()
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/anyio/_backends/_asyncio.py", line 867, in run
stable-diffusion-tf-docker-app-1  |       result = context.run(func, *args)
stable-diffusion-tf-docker-app-1  |     File "/app/./app.py", line 45, in generate
stable-diffusion-tf-docker-app-1  |       img = generator.generate(req.prompt, num_steps=req.steps, unconditional_guidance_scale=req.scale, temperature=1, batch_size=1, seed=req.seed)
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/stable_diffusion_tf/stable_diffusion.py", line 116, in generate
stable-diffusion-tf-docker-app-1  |       e_t = self.get_model_output(
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/stable_diffusion_tf/stable_diffusion.py", line 190, in get_model_output
stable-diffusion-tf-docker-app-1  |       unconditional_latent = self.diffusion_model.predict_on_batch(
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/keras/engine/training.py", line 2474, in predict_on_batch
stable-diffusion-tf-docker-app-1  |       outputs = self.predict_function(iterator)
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/keras/engine/training.py", line 2041, in predict_function
stable-diffusion-tf-docker-app-1  |       return step_function(self, iterator)
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/keras/engine/training.py", line 2027, in step_function
stable-diffusion-tf-docker-app-1  |       outputs = model.distribute_strategy.run(run_step, args=(data,))
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/keras/engine/training.py", line 2015, in run_step
stable-diffusion-tf-docker-app-1  |       outputs = model.predict_step(data)
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/keras/engine/training.py", line 1983, in predict_step
stable-diffusion-tf-docker-app-1  |       return self(x, training=False)
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/keras/utils/traceback_utils.py", line 65, in error_handler
stable-diffusion-tf-docker-app-1  |       return fn(*args, **kwargs)
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/keras/engine/training.py", line 557, in __call__
stable-diffusion-tf-docker-app-1  |       return super().__call__(*args, **kwargs)
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/keras/utils/traceback_utils.py", line 65, in error_handler
stable-diffusion-tf-docker-app-1  |       return fn(*args, **kwargs)
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/keras/engine/base_layer.py", line 1097, in __call__
stable-diffusion-tf-docker-app-1  |       outputs = call_fn(inputs, *args, **kwargs)
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/keras/utils/traceback_utils.py", line 96, in error_handler
stable-diffusion-tf-docker-app-1  |       return fn(*args, **kwargs)
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/keras/engine/functional.py", line 510, in call
stable-diffusion-tf-docker-app-1  |       return self._run_internal_graph(inputs, training=training, mask=mask)
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/keras/engine/functional.py", line 667, in _run_internal_graph
stable-diffusion-tf-docker-app-1  |       outputs = node.layer(*args, **kwargs)
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/keras/utils/traceback_utils.py", line 65, in error_handler
stable-diffusion-tf-docker-app-1  |       return fn(*args, **kwargs)
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/keras/engine/training.py", line 557, in __call__
stable-diffusion-tf-docker-app-1  |       return super().__call__(*args, **kwargs)
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/keras/utils/traceback_utils.py", line 65, in error_handler
stable-diffusion-tf-docker-app-1  |       return fn(*args, **kwargs)
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/keras/engine/base_layer.py", line 1097, in __call__
stable-diffusion-tf-docker-app-1  |       outputs = call_fn(inputs, *args, **kwargs)
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/keras/utils/traceback_utils.py", line 96, in error_handler
stable-diffusion-tf-docker-app-1  |       return fn(*args, **kwargs)
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/stable_diffusion_tf/diffusion_model.py", line 207, in call
stable-diffusion-tf-docker-app-1  |       for b in self.input_blocks:
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/stable_diffusion_tf/diffusion_model.py", line 208, in call
stable-diffusion-tf-docker-app-1  |       for layer in b:
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/stable_diffusion_tf/diffusion_model.py", line 209, in call
stable-diffusion-tf-docker-app-1  |       x = apply(x, layer)
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/stable_diffusion_tf/diffusion_model.py", line 198, in apply
stable-diffusion-tf-docker-app-1  |       if isinstance(layer, ResBlock):
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/stable_diffusion_tf/diffusion_model.py", line 200, in apply
stable-diffusion-tf-docker-app-1  |       elif isinstance(layer, SpatialTransformer):
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/stable_diffusion_tf/diffusion_model.py", line 201, in apply
stable-diffusion-tf-docker-app-1  |       x = layer([x, context])
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/keras/utils/traceback_utils.py", line 65, in error_handler
stable-diffusion-tf-docker-app-1  |       return fn(*args, **kwargs)
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/keras/engine/base_layer.py", line 1097, in __call__
stable-diffusion-tf-docker-app-1  |       outputs = call_fn(inputs, *args, **kwargs)
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/keras/utils/traceback_utils.py", line 96, in error_handler
stable-diffusion-tf-docker-app-1  |       return fn(*args, **kwargs)
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/stable_diffusion_tf/diffusion_model.py", line 112, in call
stable-diffusion-tf-docker-app-1  |       for block in self.transformer_blocks:
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/stable_diffusion_tf/diffusion_model.py", line 113, in call
stable-diffusion-tf-docker-app-1  |       x = block([x, context])
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/keras/utils/traceback_utils.py", line 65, in error_handler
stable-diffusion-tf-docker-app-1  |       return fn(*args, **kwargs)
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/keras/engine/base_layer.py", line 1097, in __call__
stable-diffusion-tf-docker-app-1  |       outputs = call_fn(inputs, *args, **kwargs)
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/keras/utils/traceback_utils.py", line 96, in error_handler
stable-diffusion-tf-docker-app-1  |       return fn(*args, **kwargs)
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/stable_diffusion_tf/diffusion_model.py", line 91, in call
stable-diffusion-tf-docker-app-1  |       x = self.attn1([self.norm1(x)]) + x
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/keras/utils/traceback_utils.py", line 65, in error_handler
stable-diffusion-tf-docker-app-1  |       return fn(*args, **kwargs)
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/keras/engine/base_layer.py", line 1097, in __call__
stable-diffusion-tf-docker-app-1  |       outputs = call_fn(inputs, *args, **kwargs)
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/keras/utils/traceback_utils.py", line 96, in error_handler
stable-diffusion-tf-docker-app-1  |       return fn(*args, **kwargs)
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/stable_diffusion_tf/diffusion_model.py", line 67, in call
stable-diffusion-tf-docker-app-1  |       weights = keras.activations.softmax(score)  # (bs, num_heads, time, time)
stable-diffusion-tf-docker-app-1  |     File "/usr/local/lib/python3.8/dist-packages/keras/activations.py", line 84, in softmax
stable-diffusion-tf-docker-app-1  |       output = tf.nn.softmax(x, axis=axis)
stable-diffusion-tf-docker-app-1  | Node: 'model_1/u_net_model/spatial_transformer/basic_transformer_block/cross_attention/Softmax'
stable-diffusion-tf-docker-app-1  | OOM when allocating tensor with shape[1,8,16384,16384] and type float on /job:localhost/replica:0/task:0/device:GPU:0 by allocator GPU_0_bfc
stable-diffusion-tf-docker-app-1  | 	 [[{{node model_1/u_net_model/spatial_transformer/basic_transformer_block/cross_attention/Softmax}}]]
stable-diffusion-tf-docker-app-1  | Hint: If you want to see a list of allocated tensors when OOM happens, add report_tensor_allocations_upon_oom to RunOptions for current allocation info. This isn't available when running in Eager mode.
stable-diffusion-tf-docker-app-1  |  [Op:__inference_predict_function_40478]
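The failing allocation in the log is the self-attention softmax tensor of shape [1, 8, 16384, 16384], and 8 × 16384 × 16384 × 4 bytes is exactly the 8589934592-byte MaxAllocSize reported by the allocator. Assuming the standard Stable Diffusion 8x latent downsampling, 16384 positions corresponds to (1024 // 8)² — i.e. a 1024×1024 generation — so the attention cost grows with the fourth power of the image side, and dropping to 512×512 would shrink this one tensor from 8 GiB to 0.5 GiB. A small back-of-the-envelope helper (the downsampling factor and head count are assumptions based on the node names in the traceback):

```python
def attention_softmax_bytes(image_px: int, heads: int = 8,
                            dtype_bytes: int = 4) -> int:
    # An image_px x image_px image gives (image_px // 8) ** 2 latent
    # positions after the assumed 8x downsampling; self-attention scores
    # then form a [1, heads, n, n] float32 tensor.
    n = (image_px // 8) ** 2
    return heads * n * n * dtype_bytes

# Matches the log: shape [1, 8, 16384, 16384] and MaxAllocSize 8589934592.
print(attention_softmax_bytes(1024))  # 8589934592 bytes = 8 GiB
print(attention_softmax_bytes(512))   # 536870912 bytes = 0.5 GiB
```

The log also shows two chunks of that size in use (the conditional and unconditional UNet passes), which is why roughly 20 GiB of the 23 GiB limit is consumed before the softmax fails. Requesting a smaller output size, if the API exposes one, is the most direct fix.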
