1.0.2: test_startup_without_pattern and test_startup_with_pattern_and_callback fails #154

Open
mtelka opened this issue Jul 25, 2024 · 3 comments · May be fixed by #157

mtelka commented Jul 25, 2024

Describe the bug
I ran the tests for pytest-xprocess version 1.0.2 and the following tests fail:

=========================== short test summary info ============================
FAILED test_process_initialization.py::test_startup_without_pattern[s1] - ConnectionRefusedError: [Errno 146] Connection refused
FAILED test_process_initialization.py::test_startup_without_pattern[s2] - ConnectionRefusedError: [Errno 146] Connection refused
FAILED test_process_initialization.py::test_startup_without_pattern[s3] - ConnectionRefusedError: [Errno 146] Connection refused
FAILED test_process_initialization.py::test_startup_with_pattern_and_callback[s1-will not match-21] - Failed: DID NOT RAISE <class 'RuntimeError'>
FAILED test_process_initialization.py::test_startup_with_pattern_and_callback[s2-spam, bacon, eggs-30] - ConnectionRefusedError: [Errno 146] Connection refused
FAILED test_process_initialization.py::test_startup_with_pattern_and_callback[s3-finally started-130] - ConnectionRefusedError: [Errno 146] Connection refused
======================== 6 failed, 45 passed in 41.26s =========================

To Reproduce
Run the test suite. The failures are easily reproducible in my environment.

Expected behavior
All tests pass.

Screenshots
N/A

Environment (please complete the following information):

  • OS: OpenIndiana
  • Python version: 3.9.19
  • pytest-xprocess version: 1.0.2

Additional context
I found that when I do this:

--- pytest-xprocess-1.0.2/tests/test_process_initialization.py.orig
+++ pytest-xprocess-1.0.2/tests/test_process_initialization.py
@@ -1,5 +1,6 @@
 import socket
 import sys
+import time
 from pathlib import Path
 
 import pytest
@@ -126,6 +127,7 @@
         args = [sys.executable, server_path, tcp_port, "--no-children"]
 
         def startup_check(self):
+            time.sleep(1)
             return request_response_cycle(tcp_port, data)
 
     xprocess.ensure(proc_name, Starter)
@@ -153,6 +155,7 @@
         args = [sys.executable, server_path, tcp_port, "--no-children"]
 
         def startup_check(self):
+            time.sleep(1)
             return request_response_cycle(tcp_port, data)
 
     if proc_name == "s1":

then all tests pass. It looks like the problem is some kind of race condition.
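
For completeness, here is a sketch of a less timing-dependent variant of the same workaround: instead of sleeping for a fixed interval, startup_check could poll the port until something is listening. The wait_for_port helper below is hypothetical and not part of the pytest-xprocess test suite; tcp_port, data and request_response_cycle are the names used in test_process_initialization.py:

import socket
import time

def wait_for_port(port, host="localhost", timeout=5.0):
    # Hypothetical helper: poll until something is listening on (host, port)
    # or the timeout expires.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, int(port)), timeout=0.2):
                return True
        except OSError:  # ConnectionRefusedError while the server is not up yet
            time.sleep(0.05)
    return False

# Inside the Starter classes the check could then read:
#     def startup_check(self):
#         return wait_for_port(tcp_port) and request_response_cycle(tcp_port, data)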

mtelka added the bug label Jul 25, 2024

iv-m commented Aug 29, 2024

I see the same problem on my riscv64 Linux box.

Here is my understanding of what's happening. On fast machines, startup_check for these tests never fails: by the time it is called, the server has already started. That is why the tests pass on most architectures and OSes. But on slower machines, the pytest process may be faster than server.py, and startup_check fails the first time it runs.

However, the request_response_cycle function does not return False when it fails to connect to the server; it raises an exception instead, and exceptions from the startup_check callback are not handled by the call site.
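
To illustrate the failure mode, a simplified stand-in for that helper (not the actual request_response_cycle from the test suite) would look roughly like this; socket.create_connection raises ConnectionRefusedError when nothing is listening yet, and since xprocess simply calls startup_check, that exception propagates straight up to pytest:

import socket

def request_response_cycle_like(port, data, host="localhost"):
    # Simplified stand-in: raises ConnectionRefusedError if the server
    # has not bound the port yet, rather than returning False.
    with socket.create_connection((host, int(port)), timeout=1.0) as sock:
        sock.sendall(data.encode())
        return sock.recv(1024).decode() == data

# On a slow machine the first startup_check call runs before server.py is
# listening, so the test fails with "ConnectionRefusedError: Connection refused"
# instead of simply retrying.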

To reproduce the issue on any machine, you can artificially make the server slower:

--- a/pytest-xprocess/tests/server.py
+++ b/pytest-xprocess/tests/server.py
@@ -76,6 +76,7 @@ if __name__ == "__main__":
         while True:
             sleep(1)

+    sleep(1)
     HOST, PORT = "localhost", int(sys.argv[1])
     server = TestServer((HOST, PORT), TestHandler)
     if "--ignore-sigterm" in sys.argv and sys.platform != "win32":

The simplest way to make the tests pass would be to handle exceptions in the callbacks of the test cases:

--- a/pytest-xprocess/tests/test_process_initialization.py
+++ b/pytest-xprocess/tests/test_process_initialization.py
@@ -126,7 +126,10 @@ def test_startup_without_pattern(tcp_port, proc_name, xprocess):
         args = [sys.executable, server_path, tcp_port, "--no-children"]

         def startup_check(self):
-            return request_response_cycle(tcp_port, data)
+            try:
+                return request_response_cycle(tcp_port, data)
+            except Exception:
+                return False

     xprocess.ensure(proc_name, Starter)
     info = xprocess.getinfo(proc_name)
@@ -153,7 +156,10 @@ def test_startup_with_pattern_and_callback(
         args = [sys.executable, server_path, tcp_port, "--no-children"]

         def startup_check(self):
-            return request_response_cycle(tcp_port, data)
+            try:
+                return request_response_cycle(tcp_port, data)
+            except Exception:
+                return False

     if proc_name == "s1":
         with pytest.raises(RuntimeError):

But maybe the exceptions should be handled in ProcessStarter.wait_callback?
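
For example (a hypothetical sketch of that retry loop, not the actual ProcessStarter.wait_callback implementation), an exception from the user callback could be treated the same as a False return:

import time

def wait_callback(startup_check, timeout=10.0, poll_interval=0.1):
    # Hypothetical sketch: "not ready yet" whether the callback returns
    # False or raises (e.g. ConnectionRefusedError before the server is up).
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            if startup_check():
                return True
        except Exception:
            pass  # treat as "not ready yet" and retry
        time.sleep(poll_interval)
    return False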

iv-m commented Sep 3, 2024

> But maybe the exceptions should be handled in ProcessStarter.wait_callback?

@northernSage @mtelka WDYT?

mtelka commented Sep 3, 2024

> But maybe the exceptions should be handled in ProcessStarter.wait_callback?
>
> @northernSage @mtelka WDYT?

@iv-m IHNI

U2FsdGVkX1 added a commit to U2FsdGVkX1/pytest-xprocess that referenced this issue Nov 22, 2024
U2FsdGVkX1 linked a pull request Nov 22, 2024 that will close this issue