diff --git a/ChatQnA/docker_compose/intel/cpu/xeon/README.md b/ChatQnA/docker_compose/intel/cpu/xeon/README.md
index 707ff52e5..4598c07ec 100644
--- a/ChatQnA/docker_compose/intel/cpu/xeon/README.md
+++ b/ChatQnA/docker_compose/intel/cpu/xeon/README.md
@@ -311,7 +311,7 @@ For details on how to verify the correctness of the response, refer to [how-to-v
 Try the command below to check whether the LLM serving is ready.
 
 ```bash
-docker logs ${CONTAINER_ID} | grep Connected
+docker logs tgi-service | grep Connected
 ```
 
 If the service is ready, you will get the response like below.
diff --git a/ChatQnA/docker_compose/intel/hpu/gaudi/README.md b/ChatQnA/docker_compose/intel/hpu/gaudi/README.md
index 2300996e4..43aa720f0 100644
--- a/ChatQnA/docker_compose/intel/hpu/gaudi/README.md
+++ b/ChatQnA/docker_compose/intel/hpu/gaudi/README.md
@@ -320,7 +320,7 @@ For validation details, please refer to [how-to-validate_service](./how_to_valid
 Try the command below to check whether the LLM serving is ready.
 
 ```bash
-docker logs ${CONTAINER_ID} | grep Connected
+docker logs tgi-service | grep Connected
 ```
 
 If the service is ready, you will get the response like below.