
Add functionality for test flavors #102

Merged
merged 4 commits into from
Aug 30, 2024

Conversation

Korulag
Contributor

@Korulag Korulag commented Aug 16, 2024

This commit adds the ability to define additional search parameters for VMs, a.k.a. a test flavor. This allows searching for specific images and selecting them for VM deployment.
Right now it works only for the OpenNebula driver.
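The image selection this PR describes works by extending an image-name regex with flavor capture groups (visible in the Black diff further down). A minimal standalone sketch of that idea, with illustrative channel names standing in for `CONFIG.allowed_channel_names` and a hypothetical `build_image_regex` helper:

```python
import re


def build_image_regex(test_flavor=None, channels=('stable', 'beta')):
    """Sketch of the flavor-aware image-name pattern; names are illustrative."""
    # Base pattern: <platform>-<version>-<arch>
    regex_str = r'(?P<platform_name>\w+(-\w+)?)-(?P<version>\d+(.\d+)?)-(?P<arch>\w+)'
    # A test flavor adds an extra <flavor_name>-<flavor_version> segment
    if test_flavor:
        regex_str += (
            f".(?P<flavor_name>{test_flavor['name']})"
            f"-(?P<flavor_version>{test_flavor['version']})"
        )
    regex_str += rf".base_image.test_system.({'|'.join(channels)}).b\d+"
    return re.compile(regex_str)


pattern = build_image_regex({'name': 'ltp', 'version': '1'})
m = pattern.match('almalinux-9.4-x86_64.ltp-1.base_image.test_system.beta.b42')
```

Without a flavor, the base pattern still matches plain platform images, so existing image names keep working.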


github-actions bot commented Aug 16, 2024

33 passed

Code Coverage Summary

Package                  Line Rate
alts.scheduler             0%
alts.shared               87%
alts.shared.uploaders     37%
alts.shared.utils         52%
alts.worker                6%
alts.worker.executors     73%
alts.worker.runners       27%
Summary                   37% (933 / 2532)

Linter reports

Pylint report
************* Module alts.shared.models
alts/shared/models.py:358:0: C0301: Line too long (92/80) (line-too-long)
alts/shared/models.py:123:9: W0511: FIXME: Find correct way to search for certs store (fixme)
alts/shared/models.py:235:0: C0115: Missing class docstring (missing-class-docstring)
************* Module alts.worker.runners.base
alts/worker/runners/base.py:319:0: C0301: Line too long (81/80) (line-too-long)
alts/worker/runners/base.py:396:0: C0301: Line too long (85/80) (line-too-long)
alts/worker/runners/base.py:587:0: C0301: Line too long (87/80) (line-too-long)
alts/worker/runners/base.py:595:0: C0301: Line too long (89/80) (line-too-long)
alts/worker/runners/base.py:729:0: C0301: Line too long (86/80) (line-too-long)
alts/worker/runners/base.py:893:0: C0301: Line too long (83/80) (line-too-long)
alts/worker/runners/base.py:941:0: C0301: Line too long (81/80) (line-too-long)
alts/worker/runners/base.py:946:0: C0301: Line too long (96/80) (line-too-long)
alts/worker/runners/base.py:1004:0: C0301: Line too long (85/80) (line-too-long)
alts/worker/runners/base.py:1141:0: C0301: Line too long (87/80) (line-too-long)
alts/worker/runners/base.py:1207:0: C0301: Line too long (81/80) (line-too-long)
alts/worker/runners/base.py:1644:0: C0301: Line too long (82/80) (line-too-long)
alts/worker/runners/base.py:1768:0: C0301: Line too long (93/80) (line-too-long)
alts/worker/runners/base.py:1806:0: C0301: Line too long (91/80) (line-too-long)
alts/worker/runners/base.py:1:0: C0302: Too many lines in module (1826/1000) (too-many-lines)
alts/worker/runners/base.py:484:5: W0511: TODO: Think of better implementation (fixme)
alts/worker/runners/base.py:492:5: W0511: TODO: Think of better implementation (fixme)
alts/worker/runners/base.py:108:0: C0116: Missing function or method docstring (missing-function-docstring)
alts/worker/runners/base.py:120:16: W0212: Access to a protected member _raise_if_aborted of a client class (protected-access)
alts/worker/runners/base.py:121:19: W0212: Access to a protected member _work_dir of a client class (protected-access)
alts/worker/runners/base.py:121:56: W0212: Access to a protected member _work_dir of a client class (protected-access)
alts/worker/runners/base.py:131:21: W0212: Access to a protected member _artifacts of a client class (protected-access)
alts/worker/runners/base.py:138:39: W0212: Access to a protected member _artifacts of a client class (protected-access)
alts/worker/runners/base.py:139:20: W0212: Access to a protected member _artifacts of a client class (protected-access)
alts/worker/runners/base.py:140:25: W0212: Access to a protected member _artifacts of a client class (protected-access)
alts/worker/runners/base.py:146:12: W0212: Access to a protected member _stats of a client class (protected-access)
alts/worker/runners/base.py:152:16: W0212: Access to a protected member _logger of a client class (protected-access)
alts/worker/runners/base.py:159:12: W0212: Access to a protected member _logger of a client class (protected-access)
alts/worker/runners/base.py:117:8: R1710: Either all return statements in a function should return an expression, or none of them should. (inconsistent-return-statements)
alts/worker/runners/base.py:168:0: R0205: Class 'BaseRunner' inherits from object, can be safely removed from bases in python3 (useless-object-inheritance)
alts/worker/runners/base.py:598:15: W0718: Catching too general exception Exception (broad-exception-caught)
alts/worker/runners/base.py:599:12: E1206: Not enough arguments for logging format string (logging-too-few-args)
alts/worker/runners/base.py:682:19: W0718: Catching too general exception Exception (broad-exception-caught)
alts/worker/runners/base.py:671:4: R1710: Either all return statements in a function should return an expression, or none of them should. (inconsistent-return-statements)
alts/worker/runners/base.py:756:8: R1705: Unnecessary "elif" after "return", remove the leading "el" from "elif" (no-else-return)
alts/worker/runners/base.py:787:4: R0914: Too many local variables (16/15) (too-many-locals)
alts/worker/runners/base.py:808:19: W0718: Catching too general exception Exception (broad-exception-caught)
alts/worker/runners/base.py:832:4: C0116: Missing function or method docstring (missing-function-docstring)
alts/worker/runners/base.py:832:4: R0914: Too many local variables (17/15) (too-many-locals)
alts/worker/runners/base.py:842:8: W0613: Unused argument 'allow_fail' (unused-argument)
alts/worker/runners/base.py:928:27: W0612: Unused variable 'stderr' (unused-variable)
alts/worker/runners/base.py:1163:27: W0612: Unused variable 'stderr' (unused-variable)
alts/worker/runners/base.py:1217:12: R1724: Unnecessary "else" after "continue", remove the "else" and de-indent the code inside it (no-else-continue)
alts/worker/runners/base.py:1227:4: R1710: Either all return statements in a function should return an expression, or none of them should. (inconsistent-return-statements)
alts/worker/runners/base.py:1267:4: R0914: Too many local variables (16/15) (too-many-locals)
alts/worker/runners/base.py:1335:15: W0718: Catching too general exception Exception (broad-exception-caught)
alts/worker/runners/base.py:1349:4: C0116: Missing function or method docstring (missing-function-docstring)
alts/worker/runners/base.py:1349:4: R0914: Too many local variables (20/15) (too-many-locals)
alts/worker/runners/base.py:1422:4: C0116: Missing function or method docstring (missing-function-docstring)
alts/worker/runners/base.py:1487:4: R1710: Either all return statements in a function should return an expression, or none of them should. (inconsistent-return-statements)
alts/worker/runners/base.py:1510:19: W0718: Catching too general exception Exception (broad-exception-caught)
alts/worker/runners/base.py:1534:15: W0718: Catching too general exception Exception (broad-exception-caught)
alts/worker/runners/base.py:1539:19: W0718: Catching too general exception Exception (broad-exception-caught)
alts/worker/runners/base.py:168:0: R0904: Too many public methods (52/20) (too-many-public-methods)
alts/worker/runners/base.py:1613:0: C0115: Missing class docstring (missing-class-docstring)
alts/worker/runners/base.py:1613:0: W0223: Method '_render_tf_main_file' is abstract in class 'BaseRunner' but is not overridden in child class 'GenericVMRunner' (abstract-method)
alts/worker/runners/base.py:1613:0: W0223: Method '_render_tf_variables_file' is abstract in class 'BaseRunner' but is not overridden in child class 'GenericVMRunner' (abstract-method)
alts/worker/runners/base.py:1743:12: W0702: No exception type(s) specified (bare-except)
alts/worker/runners/base.py:1752:4: R1710: Either all return statements in a function should return an expression, or none of them should. (inconsistent-return-statements)
alts/worker/runners/base.py:1766:16: W0612: Unused variable 'attempt' (unused-variable)
************* Module alts.worker.runners.docker
alts/worker/runners/docker.py:302:0: C0301: Line too long (87/80) (line-too-long)
alts/worker/runners/docker.py:323:0: C0301: Line too long (87/80) (line-too-long)
alts/worker/runners/docker.py:141:4: W0221: Variadics removed in overriding 'DockerRunner.exec_command' method (arguments-differ)
alts/worker/runners/docker.py:321:4: E0102: method already defined line 300 (function-redefined)
alts/worker/runners/docker.py:337:4: R1710: Either all return statements in a function should return an expression, or none of them should. (inconsistent-return-statements)
************* Module alts.worker.runners.opennebula
alts/worker/runners/opennebula.py:81:0: C0301: Line too long (90/80) (line-too-long)
alts/worker/runners/opennebula.py:85:0: C0301: Line too long (82/80) (line-too-long)
alts/worker/runners/opennebula.py:124:0: C0301: Line too long (83/80) (line-too-long)
alts/worker/runners/opennebula.py:86:61: W1401: Anomalous backslash in string: '\d'. String constant might be missing an r prefix. (anomalous-backslash-in-string)
alts/worker/runners/opennebula.py:75:4: C0116: Missing function or method docstring (missing-function-docstring)
alts/worker/runners/opennebula.py:75:4: R0914: Too many local variables (20/15) (too-many-locals)
alts/worker/runners/opennebula.py:223:12: W0702: No exception type(s) specified (bare-except)
alts/worker/runners/opennebula.py:237:15: W0718: Catching too general exception Exception (broad-exception-caught)
alts/worker/runners/opennebula.py:251:4: R1710: Either all return statements in a function should return an expression, or none of them should. (inconsistent-return-statements)
************* Module alts.worker.tasks
alts/worker/tasks.py:245:17: W0511: FIXME: Temporary solution, needs to be removed when this (fixme)
alts/worker/tasks.py:26:0: W0622: Redefining built-in 'TimeoutError' (redefined-builtin)
alts/worker/tasks.py:78:11: W0718: Catching too general exception Exception (broad-exception-caught)
alts/worker/tasks.py:89:0: W0223: Method 'run' is abstract in class 'Task' but is not overridden in child class 'RetryableTask' (abstract-method)
alts/worker/tasks.py:98:0: R0914: Too many local variables (29/15) (too-many-locals)
alts/worker/tasks.py:290:8: W0134: 'return' shadowed by the 'finally' clause. (return-in-finally)
alts/worker/tasks.py:231:11: W0718: Catching too general exception Exception (broad-exception-caught)
alts/worker/tasks.py:290:8: W0150: return statement in finally block may swallow exception (lost-exception)
alts/worker/tasks.py:98:0: R0912: Too many branches (26/12) (too-many-branches)
alts/worker/tasks.py:98:0: R0915: Too many statements (99/50) (too-many-statements)
alts/worker/tasks.py:98:0: R1710: Either all return statements in a function should return an expression, or none of them should. (inconsistent-return-statements)
alts/worker/tasks.py:13:0: C0411: standard import "collections.defaultdict" should be placed before third party import "celery.contrib.abortable.AbortableTask" (wrong-import-order)
alts/worker/tasks.py:14:0: C0411: standard import "socket.timeout" should be placed before third party import "celery.contrib.abortable.AbortableTask" (wrong-import-order)
alts/worker/tasks.py:15:0: C0411: standard import "typing.Union" should be placed before third party import "celery.contrib.abortable.AbortableTask" (wrong-import-order)
alts/worker/tasks.py:1:0: R0801: Similar lines in 2 files
==alts.worker.runners.base:[1629:1642]
==alts.worker.runners.opennebula:[54:67]
        super().__init__(
            task_id,
            task_is_aborted,
            dist_name,
            dist_version,
            repositories=repositories,
            dist_arch=dist_arch,
            artifacts_uploader=artifacts_uploader,
            package_channel=package_channel,
            test_configuration=test_configuration,
            test_flavor=test_flavor,
            verbose=verbose,
        ) (duplicate-code)
alts/worker/tasks.py:1:0: R0801: Similar lines in 2 files
==alts.worker.runners.base:[1629:1636]
==alts.worker.runners.docker:[91:98]
        super().__init__(
            task_id,
            task_is_aborted,
            dist_name,
            dist_version,
            repositories=repositories,
            dist_arch=dist_arch, (duplicate-code)
alts/worker/tasks.py:1:0: R0801: Similar lines in 2 files
==alts.worker.runners.base:[913:919]
==alts.worker.tasks:[183:189]
            package_name,
            package_version=package_version,
            package_epoch=package_epoch,
            module_name=module_name,
            module_stream=module_stream,
            module_version=module_version, (duplicate-code)
alts/worker/tasks.py:1:0: R0801: Similar lines in 2 files
==alts.worker.runners.base:[1064:1070]
==alts.worker.runners.docker:[286:292]
        cmd_args.append('tests')
        self._logger.info(
            'Running package integrity tests for %s on %s...',
            full_pkg_name,
            self.env_name,
        ) (duplicate-code)

-----------------------------------
Your code has been rated at 9.30/10
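The W1401 finding above flags a literal `\d` inside a plain f-string. One common way to address it (an illustration, not necessarily the fix this PR applies) is an `rf` prefix so the backslash is kept as a literal regex token:

```python
import re

channels = 'stable|beta'
# A plain f-string with '\d' triggers W1401 (anomalous-backslash-in-string);
# an rf-string keeps the backslash literal, so the regex still means "digits":
suffix = rf'.base_image.test_system.({channels}).b\d+'
match = re.search(suffix, 'x.base_image.test_system.beta.b7')
```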


Black report
--- alts/worker/runners/opennebula.py	2024-08-19 10:47:56.056427+00:00
+++ alts/worker/runners/opennebula.py	2024-08-19 10:48:37.992338+00:00
@@ -6,11 +6,12 @@
 
 import os
 import re
 import time
 from typing import (
-    Callable, Dict,
+    Callable,
+    Dict,
     List,
     Optional,
     Union,
 )
 
@@ -80,12 +81,14 @@
         channels = '|'.join(CONFIG.allowed_channel_names)
         regex_str = r'(?P<platform_name>\w+(-\w+)?)-(?P<version>\d+(.\d+)?)-(?P<arch>\w+)'
         if self.test_flavor:
             name = self.test_flavor['name']
             version = self.test_flavor['version']
-            regex_str += f'.(?P<flavor_name>{name})-(?P<flavor_version>{version})'
-        regex_str += f'.base_image.test_system.({channels}).b\d+' # noqa
+            regex_str += (
+                f'.(?P<flavor_name>{name})-(?P<flavor_version>{version})'
+            )
+        regex_str += f'.base_image.test_system.({channels}).b\d+'  # noqa
         # Filter images to leave only those that are related to the particular
         # platform
         # Note: newer OS don't have 32-bit images usually, so we need to try
         # to find correct 64-bit replacement
         if self.dist_arch == 'i686':
@@ -119,11 +122,14 @@
             f'dist version: {self.dist_version}, '
             f'architecture: {self.dist_arch}'
         )
         if not filtered_templates:
             self._logger.info('Searching new templates without the channel')
-            if self.package_channel is not None and self.package_channel == 'beta':
+            if (
+                self.package_channel is not None
+                and self.package_channel == 'beta'
+            ):
                 filtered_templates = search_template(include_channel=False)
                 self._logger.info(
                     'Filtered templates: %s',
                     [i.NAME for i in filtered_templates],
                 )
@@ -221,11 +227,12 @@
                 self.opennebula_client.vm.recover(vm_id, 3)
                 wait_for_state(pyone.VM_STATE.DONE, attempts=60)
             except:
                 self._logger.exception(
                     'Cannot terminate VM %s via API, please contact infra '
-                    'team to ask for help', vm_id
+                    'team to ask for help',
+                    vm_id,
                 )
 
         try:
             self.opennebula_client.vm.action('terminate-hard', vm_id)
             wait_for_state(pyone.VM_STATE.DONE)
@@ -236,26 +243,30 @@
             recover_delete()
         except Exception as e:
             self._logger.error(
                 'Unexpected error during execution of '
                 'terminate-hard on VM %s:\n%s',
-                vm_id, str(e)
+                vm_id,
+                str(e),
             )
             recover_delete()
 
     @command_decorator(
         'stop_environment',
         'Cannot destroy environment',
         exception_class=StopEnvironmentError,
         is_abortable=False,
     )
     def stop_env(self):
-        id_exit_code, vm_id, id_stderr = local['terraform'].with_cwd(
-            self._work_dir).run(
-            args=('output', '-raw', '-no-color', 'vm_id'),
-            retcode=None,
-            timeout=CONFIG.provision_timeout,
+        id_exit_code, vm_id, id_stderr = (
+            local['terraform']
+            .with_cwd(self._work_dir)
+            .run(
+                args=('output', '-raw', '-no-color', 'vm_id'),
+                retcode=None,
+                timeout=CONFIG.provision_timeout,
+            )
         )
         if id_exit_code != 0:
             self._logger.warning('Cannot get VM ID: %s', id_stderr)
         try:
             return super().stop_env()
--- alts/worker/runners/docker.py	2024-08-19 10:47:56.056427+00:00
+++ alts/worker/runners/docker.py	2024-08-19 10:48:37.998950+00:00
@@ -165,13 +165,17 @@
         cmd.extend([str(self.env_name), *args])
         self._logger.debug(
             'Running "docker %s" command',
             ' '.join(cmd),
         )
-        return local['docker'].with_cwd(self._work_dir).run(
-            args=tuple(cmd),
-            retcode=None,
+        return (
+            local['docker']
+            .with_cwd(self._work_dir)
+            .run(
+                args=tuple(cmd),
+                retcode=None,
+            )
         )
 
     @staticmethod
     def copy(copy_args: List[str]):
         """
@@ -227,11 +231,12 @@
                     r's/(deb|security)\.debian\.org/archive\.debian\.org/',
                     '/etc/apt/sources.list',
                 )
             self._logger.info('Installing python3 package...')
             exit_code, stdout, stderr = self.exec_command(
-                self.pkg_manager, 'update',
+                self.pkg_manager,
+                'update',
             )
             if exit_code != 0:
                 return exit_code, stdout, stderr
             cmd_args = (self.pkg_manager, 'install', '-y', 'python3')
             exit_code, stdout, stderr = self.exec_command(*cmd_args)
@@ -239,12 +244,19 @@
                 return exit_code, stdout, stderr
             self._logger.info('Installation is completed')
         if self.dist_name in CONFIG.rhel_flavors and self.dist_version == '6':
             self._logger.info('Removing old repositories')
             self.exec_command(
-                'find', '/etc/yum.repos.d', '-type', 'f', '-exec',
-                'rm', '-f', '{}', '+',
+                'find',
+                '/etc/yum.repos.d',
+                '-type',
+                'f',
+                '-exec',
+                'rm',
+                '-f',
+                '{}',
+                '+',
             )
         return super().initial_provision(verbose=verbose)
 
     @command_decorator(
         'package_integrity_tests',
@@ -297,44 +309,56 @@
         'Third party tests failed',
         exception_class=ThirdPartyTestError,
     )
     def run_third_party_test(
         self,
-        executor: Union[AnsibleExecutor, BatsExecutor, CommandExecutor, ShellExecutor],
+        executor: Union[
+            AnsibleExecutor, BatsExecutor, CommandExecutor, ShellExecutor
+        ],
         cmd_args: List[str],
         docker_args: Optional[List[str]] = None,
         workdir: str = '',
         artifacts_key: str = '',
         additional_section_name: str = '',
         env_vars: Optional[List[str]] = None,
     ):
-        return executor.run_docker_command(
-            cmd_args=cmd_args,
-            docker_args=docker_args,
-            env_vars=env_vars,
-        ).model_dump().values()
+        return (
+            executor.run_docker_command(
+                cmd_args=cmd_args,
+                docker_args=docker_args,
+                env_vars=env_vars,
+            )
+            .model_dump()
+            .values()
+        )
 
     @command_decorator(
         '',
         'Third party tests failed',
         exception_class=ThirdPartyTestError,
     )
     def run_third_party_test(
         self,
-        executor: Union[AnsibleExecutor, BatsExecutor, CommandExecutor, ShellExecutor],
+        executor: Union[
+            AnsibleExecutor, BatsExecutor, CommandExecutor, ShellExecutor
+        ],
         cmd_args: List[str],
         docker_args: Optional[List[str]] = None,
         workdir: str = '',
         artifacts_key: str = '',
         additional_section_name: str = '',
         env_vars: Optional[List[str]] = None,
     ):
-        return executor.run_docker_command(
-            cmd_args=cmd_args,
-            docker_args=docker_args,
-            env_vars=env_vars,
-        ).model_dump().values()
+        return (
+            executor.run_docker_command(
+                cmd_args=cmd_args,
+                docker_args=docker_args,
+                env_vars=env_vars,
+            )
+            .model_dump()
+            .values()
+        )
 
     def clone_third_party_repo(
         self,
         repo_url: str,
         git_ref: str,
@@ -355,15 +379,18 @@
         'stop_environment',
         'Cannot destroy environment',
         exception_class=StopEnvironmentError,
     )
     def stop_env(self):
-        _, container_id, _ = local['terraform'].with_cwd(
-            self._work_dir).run(
-            args=('output', '-raw', '-no-color', 'container_id'),
-            retcode=None,
-            timeout=CONFIG.provision_timeout,
+        _, container_id, _ = (
+            local['terraform']
+            .with_cwd(self._work_dir)
+            .run(
+                args=('output', '-raw', '-no-color', 'container_id'),
+                retcode=None,
+                timeout=CONFIG.provision_timeout,
+            )
         )
         try:
             return super().stop_env()
         except StopEnvironmentError:
             # Attempt to delete environment via plain docker command
--- alts/shared/models.py	2024-08-19 10:47:56.056427+00:00
+++ alts/shared/models.py	2024-08-19 10:48:38.066324+00:00
@@ -292,13 +292,13 @@
     # Build system settings
     bs_host: Optional[str] = None
     bs_tasks_endpoint: str = '/api/v1/tests/get_test_tasks/'
     bs_token: Optional[str] = None
     # Log uploader settings
-    logs_uploader_config: Optional[
-        Union[PulpLogsConfig, AzureLogsConfig]
-    ] = None
+    logs_uploader_config: Optional[Union[PulpLogsConfig, AzureLogsConfig]] = (
+        None
+    )
     uninstall_excluded_pkgs: List[str] = [
         'almalinux-release',
         'kernel',
         'dnf',
     ]
@@ -307,13 +307,13 @@
     provision_timeout: int = 600  # 10 minutes in seconds
     tests_exec_timeout: int = 1800  # 30 minutes in seconds
     deprecated_ansible_venv: str = get_abspath('~/ansible_env')
     epel_release_urls: Dict[str, str] = {
         '6': 'http://dl.fedoraproject.org/pub/archive/epel/6/x86_64/'
-             'epel-release-6-8.noarch.rpm',
+        'epel-release-6-8.noarch.rpm',
         '7': 'https://dl.fedoraproject.org/pub/archive/epel/7/x86_64/'
-             'Packages/e/epel-release-7-14.noarch.rpm',
+        'Packages/e/epel-release-7-14.noarch.rpm',
     }
     centos_baseurl: str = 'http://mirror.centos.org/centos'
     git_reference_directory: Optional[str] = None
     tests_base_dir: str = '/tests'
     package_proxy: str = ''
@@ -355,17 +355,17 @@
             'task_track_started': True,
             'task_soft_time_limit': self.task_soft_time_limit,
             'worker_prefetch_multiplier': self.worker_prefetch_multiplier,
             'worker_deduplicate_successful_tasks': self.worker_deduplicate_successful_tasks,
             'worker_max_tasks_per_child': self.worker_max_tasks_per_child,
-            'broker_transport_options': {'visibility_timeout': 36000}
+            'broker_transport_options': {'visibility_timeout': 36000},
         }
         if isinstance(self.results_backend_config, AzureResultsConfig):
             for key in (
                 'azureblockblob_container_name',
                 'azureblockblob_base_path',
-                'azure_connection_string'
+                'azure_connection_string',
             ):
                 config_dict[key] = getattr(self.results_backend_config, key)
         elif isinstance(self.results_backend_config, S3ResultsConfig):
             for key in (
                 's3_access_key_id',
--- alts/worker/tasks.py	2024-08-19 10:47:56.056427+00:00
+++ alts/worker/tasks.py	2024-08-19 10:48:38.127376+00:00
@@ -158,11 +158,11 @@
     runner_kwargs = {
         'repositories': task_params.get('repositories', []),
         'dist_arch': task_params.get('dist_arch', 'x86_64'),
         'package_channel': task_params.get('package_channel', 'beta'),
         'test_configuration': task_params.get('test_configuration', {}),
-        'test_flavor': task_params.get('test_flavor', {})
+        'test_flavor': task_params.get('test_flavor', {}),
     }
 
     runner_class = RUNNER_MAPPING[task_params['runner_type']]
     runner: Union[DockerRunner, OpennebulaRunner] = runner_class(
         *runner_args,
@@ -201,11 +201,11 @@
         logging.exception('Cannot find VM image: %s', exc)
     except WorkDirPreparationError:
         runner.artifacts['prepare_environment'] = {
             'exit_code': 1,
             'stdout': '',
-            'stderr': traceback.format_exc()
+            'stderr': traceback.format_exc(),
         }
     except TerraformInitializationError as exc:
         logging.exception('Cannot initialize terraform: %s', exc)
     except StartEnvironmentError as exc:
         logging.exception('Cannot start environment: %s', exc)
@@ -222,11 +222,11 @@
     except StopEnvironmentError as exc:
         logging.exception('Cannot stop environment: %s', exc)
     except AbortedTestTask:
         logging.warning(
             'Task %s has been aborted. Gracefully stopping tests.',
-            task_params['task_id']
+            task_params['task_id'],
         )
         aborted = True
 
     except Exception as exc:
         logging.exception('Unexpected exception: %s', exc)
@@ -272,13 +272,11 @@
                 'result': summary,
                 'stats': runner.stats,
             }
             session = requests.Session()
             retries = Retry(total=5, backoff_factor=3, status_forcelist=[502])
-            retry_adapter = requests.adapters.HTTPAdapter(
-                max_retries=retries
-            )
+            retry_adapter = requests.adapters.HTTPAdapter(max_retries=retries)
             session.mount("http://", retry_adapter)
             session.mount("https://", retry_adapter)
             response = session.post(
                 full_url,
                 json=payload,
--- alts/worker/runners/base.py	2024-08-19 10:47:56.056427+00:00
+++ alts/worker/runners/base.py	2024-08-19 10:48:38.611516+00:00
@@ -95,16 +95,20 @@
     '.sh': ShellExecutor,
     '.yml': AnsibleExecutor,
     '.yaml': AnsibleExecutor,
 }
 
-DetectExecutorResult = Type[Optional[Union[
-    AnsibleExecutor,
-    BatsExecutor,
-    CommandExecutor,
-    ShellExecutor,
-]]]
+DetectExecutorResult = Type[
+    Optional[
+        Union[
+            AnsibleExecutor,
+            BatsExecutor,
+            CommandExecutor,
+            ShellExecutor,
+        ]
+    ]
+]
 
 
 def command_decorator(
     artifacts_key,
     error_message,
@@ -314,20 +318,20 @@
         return self._test_env.get('use_deprecated_ansible', False)
 
     @property
     def ansible_binary(self) -> str:
         if self.use_deprecated_ansible:
-            return os.path.join(CONFIG.deprecated_ansible_venv, 'bin', 'ansible')
+            return os.path.join(
+                CONFIG.deprecated_ansible_venv, 'bin', 'ansible'
+            )
         return 'ansible'
 
     @property
     def ansible_playbook_binary(self) -> str:
         if self.use_deprecated_ansible:
             return os.path.join(
-                CONFIG.deprecated_ansible_venv,
-                'bin',
-                'ansible-playbook'
+                CONFIG.deprecated_ansible_venv, 'bin', 'ansible-playbook'
             )
         return 'ansible-playbook'
 
     @property
     def vm_disk_size(self) -> int:
@@ -391,11 +395,13 @@
             url_parts.insert(1, f'[arch={self.dist_arch}]')
             repo['url'] = ' '.join(url_parts)
             self._logger.debug('Repository modified state: %s', repo)
         return repositories
 
-    def add_credentials_to_build_repos(self, repositories: List[dict]) -> List[dict]:
+    def add_credentials_to_build_repos(
+        self, repositories: List[dict]
+    ) -> List[dict]:
         modified_repositories = []
         for repo in repositories:
             if '-br' not in repo['name']:
                 modified_repositories.append(repo)
                 continue
@@ -423,12 +429,11 @@
                 parsed.path,
                 parsed.params,
                 parsed.query,
                 parsed.fragment,
             ))
-            if (self.dist_name in CONFIG.debian_flavors
-                    and not url_parts):
+            if self.dist_name in CONFIG.debian_flavors and not url_parts:
                 url = f'deb {url} ./'
             elif url_parts and parsed_url_index:
                 url_parts[parsed_url_index] = url
                 url = ' '.join(url_parts)
             self._logger.debug('Modified repo url: %s', url)
@@ -482,13 +487,11 @@
             pass
 
     # TODO: Think of better implementation
     def _create_work_dir(self):
         if not self._work_dir or not os.path.exists(self._work_dir):
-            self._work_dir = Path(
-                tempfile.mkdtemp(prefix=self.TEMPFILE_PREFIX)
-            )
+            self._work_dir = Path(tempfile.mkdtemp(prefix=self.TEMPFILE_PREFIX))
         return self._work_dir
 
     # TODO: Think of better implementation
     def _create_artifacts_dir(self):
         if not self._work_dir:
@@ -560,33 +563,38 @@
             if (
                 self.dist_name in CONFIG.rhel_flavors
                 and self.dist_version in ('8', '9', '10')
                 and package_version
             ):
-                full_pkg_name = (f'{package_name}{delimiter}{package_epoch}:'
-                                 f'{package_version}')
+                full_pkg_name = (
+                    f'{package_name}{delimiter}{package_epoch}:'
+                    f'{package_version}'
+                )
         return full_pkg_name
 
     def run_ansible_command(
-        self, args: Union[tuple, list], retcode_none: bool = False,
-        timeout: int = CONFIG.provision_timeout
+        self,
+        args: Union[tuple, list],
+        retcode_none: bool = False,
+        timeout: int = CONFIG.provision_timeout,
     ):
-        run_kwargs = {
-            'args': args,
-            'timeout': timeout
-        }
+        run_kwargs = {'args': args, 'timeout': timeout}
         if retcode_none:
             run_kwargs['retcode'] = None
         cmd = local[self.ansible_playbook_binary].with_cwd(self._work_dir)
         formulated_cmd = cmd.formulate(args=run_kwargs.get('args', ()))
         exception_happened = False
         cmd_pid = None
         try:
             future = cmd.run_bg(**run_kwargs)
             cmd_pid = future.proc.pid
             future.wait()
-            exit_code, stdout, stderr = future.returncode, future.stdout, future.stderr
+            exit_code, stdout, stderr = (
+                future.returncode,
+                future.stdout,
+                future.stderr,
+            )
         except ProcessExecutionError as e:
             stdout = e.stdout
             stderr = e.stderr
             exit_code = e.retcode
             exception_happened = True
@@ -595,12 +603,11 @@
             stderr = f'Timeout occurred when running ansible command: "{formulated_cmd}"'
             exit_code = COMMAND_TIMEOUT_EXIT_CODE
             exception_happened = True
         except Exception as e:
             self._logger.error(
-                'Unknown error happened during %s execution: %s',
-                formulated_cmd
+                'Unknown error happened during %s execution: %s', formulated_cmd
             )
             stdout = ''
             stderr = str(e)
             exit_code = 255
 
@@ -655,13 +662,17 @@
             'container_name': str(self.env_name),
         }
 
     def __terraform_init(self):
         with FileLock(TF_INIT_LOCK_PATH, timeout=60, thread_local=False):
-            return local['terraform'].with_cwd(self._work_dir).run(
-                ('init', '-no-color'),
-                timeout=CONFIG.provision_timeout,
+            return (
+                local['terraform']
+                .with_cwd(self._work_dir)
+                .run(
+                    ('init', '-no-color'),
+                    timeout=CONFIG.provision_timeout,
+                )
             )
 
     # After: prepare_work_dir_files
     @command_decorator(
         'initialize_terraform',
@@ -699,14 +710,18 @@
         )
         self._logger.debug('Running "terraform apply --auto-approve" command')
         cmd_args = ['apply', '--auto-approve', '-no-color']
         if self.TF_VARIABLES_FILE:
             cmd_args.extend(['--var-file', self.TF_VARIABLES_FILE])
-        return local['terraform'].with_cwd(self._work_dir).run(
-            args=cmd_args,
-            retcode=None,
-            timeout=CONFIG.provision_timeout,
+        return (
+            local['terraform']
+            .with_cwd(self._work_dir)
+            .run(
+                args=cmd_args,
+                retcode=None,
+                timeout=CONFIG.provision_timeout,
+            )
         )
 
     # After: start_env
     @command_decorator(
         'initial_provision',
@@ -724,11 +739,14 @@
             'pytest_is_needed': self.pytest_is_needed,
             'development_mode': CONFIG.development_mode,
             'package_proxy': CONFIG.package_proxy,
         }
         dist_major_version = self.dist_version[0]
-        if self.dist_name in CONFIG.rhel_flavors and dist_major_version in ('6', '7'):
+        if self.dist_name in CONFIG.rhel_flavors and dist_major_version in (
+            '6',
+            '7',
+        ):
             epel_release_url = CONFIG.epel_release_urls.get(dist_major_version)
             if epel_release_url:
                 var_dict['epel_release_url'] = epel_release_url
         if CONFIG.centos_baseurl:
             var_dict['centos_repo_baseurl'] = CONFIG.centos_baseurl
@@ -762,23 +780,38 @@
     def get_system_info_commands_list(self) -> Dict[str, tuple]:
         self._logger.debug('Returning default system info commands list')
         basic_commands = BASE_SYSTEM_INFO_COMMANDS.copy()
         if self._dist_name in CONFIG.rhel_flavors:
             basic_commands['Installed packages'] = ('rpm', '-qa')
-            basic_commands['Repositories list'] = (
-                 self.pkg_manager, 'repolist'
-            )
+            basic_commands['Repositories list'] = (self.pkg_manager, 'repolist')
             basic_commands['Repositories details'] = (
-                'find', '/etc/yum.repos.d/', '-type', 'f',
-                '-exec', 'cat', '{}', '+'
+                'find',
+                '/etc/yum.repos.d/',
+                '-type',
+                'f',
+                '-exec',
+                'cat',
+                '{}',
+                '+',
             )
         else:
             basic_commands['Installed packages'] = ('dpkg', '-l')
             basic_commands['Repositories list'] = ('apt-cache', 'policy')
             basic_commands['Repositories details'] = (
-                'find', '/etc/apt/', '-type', 'f', '-name', '*.list*',
-                '-o', '-name', '*.sources*', '-exec', 'cat', '{}', '+'
+                'find',
+                '/etc/apt/',
+                '-type',
+                'f',
+                '-name',
+                '*.list*',
+                '-o',
+                '-name',
+                '*.sources*',
+                '-exec',
+                'cat',
+                '{}',
+                '+',
             )
         return basic_commands
 
     @command_decorator(
         'system_info',
@@ -791,14 +824,11 @@
         error_output = ''
         executor_params = self.get_test_executor_params()
         executor_params['timeout'] = CONFIG.commands_exec_timeout
         for section, cmd in self.get_system_info_commands_list().items():
             start = datetime.datetime.utcnow()
-            self._logger.info(
-                'Running "%s" for env %s',
-                cmd, self.env_name
-            )
+            self._logger.info('Running "%s" for env %s', cmd, self.env_name)
             try:
                 binary, *args = cmd
                 result = CommandExecutor(binary, **executor_params).run(args)
                 output = '\n'.join([result.stdout, result.stderr])
                 if result.is_successful():
@@ -808,11 +838,13 @@
             except Exception as e:
                 errored_commands[section] = str(e)
             finish = datetime.datetime.utcnow()
             self._logger.info(
                 '"%s" for env %s took %s',
-                cmd, self.env_name, str(finish - start)
+                cmd,
+                self.env_name,
+                str(finish - start),
             )
         success_output = '\n\n'.join((
             section + '\n' + section_out
             for section, section_out in successful_commands.items()
         ))
@@ -885,14 +917,16 @@
             cmd_args, timeout=CONFIG.provision_timeout
         )
         if exit_code == COMMAND_TIMEOUT_EXIT_CODE:
             self._logger.error(
                 'Package was not installed due to command timeout: %s',
-                f'{out}\n{err}'
+                f'{out}\n{err}',
             )
         elif exit_code != 0:
-            self._logger.error('Cannot install package %s: %s', full_pkg_name, err)
+            self._logger.error(
+                'Cannot install package %s: %s', full_pkg_name, err
+            )
         return exit_code, out, err
 
     @command_decorator(
         'install_package',
         'Cannot install package',
@@ -932,27 +966,36 @@
             return []
         files = [i.strip() for i in stdout.split('\n') if i.strip()]
         protected = []
         for file_ in files:
             exit_code, stdout, stderr = self.exec_command(
-                'cat', f'/etc/{self.pkg_manager}/protected.d/{file_}',
+                'cat',
+                f'/etc/{self.pkg_manager}/protected.d/{file_}',
             )
             if exit_code != 0:
                 continue
-            file_protected = [i.strip() for i in stdout.split('\n') if i.strip()]
+            file_protected = [
+                i.strip() for i in stdout.split('\n') if i.strip()
+            ]
             if file_protected:
                 protected.extend(file_protected)
         protected.append('kernel-core')
         dnf_command = (
-            r'dnf', '-q', '--qf=%{NAME}', 'repoquery', '--requires', '--resolve', '--recursive',
-            *protected
+            r'dnf',
+            '-q',
+            '--qf=%{NAME}',
+            'repoquery',
+            '--requires',
+            '--resolve',
+            '--recursive',
+            *protected,
         )
         exit_code, stdout, stderr = self.exec_command(*dnf_command)
         if exit_code != 0:
             self._logger.warning(
                 'Cannot resolve non-uninstallable packages via DNF: %s',
-                dnf_command
+                dnf_command,
             )
             return protected
         dnf_protected = [i.strip() for i in stdout.split('\n') if i.strip()]
         if dnf_protected:
             protected.extend(dnf_protected)
@@ -996,14 +1039,16 @@
             cmd_args, timeout=CONFIG.provision_timeout
         )
         if exit_code == COMMAND_TIMEOUT_EXIT_CODE:
             self._logger.error(
                 'Package was not uninstalled due to command timeout: %s',
-                f'{out}\n{err}'
+                f'{out}\n{err}',
             )
         elif exit_code != 0:
-            self._logger.error('Cannot uninstall package %s: %s', full_pkg_name, err)
+            self._logger.error(
+                'Cannot uninstall package %s: %s', full_pkg_name, err
+            )
         return exit_code, out, err
 
     def ensure_package_is_uninstalled(self, package_name: str):
         package_exists = self.check_package_existence(package_name)
         if package_exists:
@@ -1066,33 +1111,35 @@
         self._logger.info(
             'Running package integrity tests for %s on %s...',
             full_pkg_name,
             self.env_name,
         )
-        return local['py.test'].with_cwd(self._integrity_tests_dir).run(
-            args=cmd_args,
-            retcode=None,
-            timeout=CONFIG.tests_exec_timeout,
+        return (
+            local['py.test']
+            .with_cwd(self._integrity_tests_dir)
+            .run(
+                args=cmd_args,
+                retcode=None,
+                timeout=CONFIG.tests_exec_timeout,
+            )
         )
 
     @staticmethod
     def prepare_gerrit_repo_url(url: str) -> str:
         parsed = urllib.parse.urlparse(url)
         if CONFIG.gerrit_username:
             netloc = f'{CONFIG.gerrit_username}@{parsed.netloc}'
         else:
             netloc = parsed.netloc
-        return urllib.parse.urlunparse(
-            (
-                parsed.scheme,
-                netloc,
-                parsed.path,
-                parsed.params,
-                parsed.query,
-                parsed.fragment,
-            )
-        )
+        return urllib.parse.urlunparse((
+            parsed.scheme,
+            netloc,
+            parsed.path,
+            parsed.params,
+            parsed.query,
+            parsed.fragment,
+        ))
 
     def clone_third_party_repo(
         self,
         repo_url: str,
         git_ref: str,
@@ -1110,37 +1157,41 @@
         if not repo_name.endswith('.git'):
             repo_name += '.git'
         repo_reference_dir = None
         if CONFIG.git_reference_directory:
             repo_reference_dir = os.path.join(
-                CONFIG.git_reference_directory, repo_name)
+                CONFIG.git_reference_directory, repo_name
+            )
         repo_path = None
         for attempt in range(1, 6):
             try:
                 repo_path = func(
                     repo_url,
                     git_ref,
                     self._work_dir,
                     self._logger,
-                    reference_directory=repo_reference_dir
+                    reference_directory=repo_reference_dir,
                 )
             except (ProcessExecutionError, ProcessTimedOut):
                 pass
             if not repo_path:
                 self._logger.warning(
                     'Attempt %d to clone %s locally has failed',
-                    attempt, repo_url
+                    attempt,
+                    repo_url,
                 )
                 self._logger.debug('Sleeping before making another attempt')
                 time.sleep(random.randint(5, 10))
             else:
                 break
         return repo_path
 
     def run_third_party_test(
         self,
-        executor: Union[AnsibleExecutor, BatsExecutor, CommandExecutor, ShellExecutor],
+        executor: Union[
+            AnsibleExecutor, BatsExecutor, CommandExecutor, ShellExecutor
+        ],
         cmd_args: List[str],
         docker_args: Optional[List[str]] = None,
         workdir: str = '',
         artifacts_key: str = '',
         additional_section_name: str = '',
@@ -1154,12 +1205,16 @@
         package_version: Optional[str] = None,
     ) -> bool:
         if self.dist_name in CONFIG.rhel_flavors:
             cmd = ('rpm', '-q', package_name)
         elif self.dist_name in CONFIG.debian_flavors:
-            cmd = ('dpkg-query', '-Wf', r'${db:Status-Status} ${Package}\n',
-                   package_name)
+            cmd = (
+                'dpkg-query',
+                '-Wf',
+                r'${db:Status-Status} ${Package}\n',
+                package_name,
+            )
         else:
             raise ValueError(f'Unknown distribution: {self.dist_name}')
         exit_code, stdout, stderr = self.exec_command(*cmd)
         installed = exit_code == 0
         if installed and package_version:
@@ -1179,11 +1234,11 @@
         if not package_installed:
             self.install_package_no_log(
                 package_name,
                 package_version=package_version,
                 package_epoch=package_epoch,
-                semi_verbose=True
+                semi_verbose=True,
             )
 
     def get_init_script(self, tests_dir: Path) -> Optional[Path]:
         init = None
         for test in tests_dir.iterdir():
@@ -1202,11 +1257,19 @@
     def find_tests(self, tests_dir: str) -> List[Path]:
         self._logger.info('Looking tests on the remote in %s', tests_dir)
         if not tests_dir.endswith('/'):
             tests_dir += '/'
         _, stdout, _ = self.exec_command(
-            'find', tests_dir, '-maxdepth', '1', '-type', 'f', '-o', '-type', 'l'
+            'find',
+            tests_dir,
+            '-maxdepth',
+            '1',
+            '-type',
+            'f',
+            '-o',
+            '-type',
+            'l',
         )
         tests_list = [Path(i) for i in stdout.split('\n')]
         self._logger.debug('Tests list: %s', tests_list)
         tests_list.sort()
         organized_tests_list = []
@@ -1241,16 +1304,17 @@
             if re.search(regex, magic_out, re.IGNORECASE):
                 return executor_class_  # noqa
         return ShellExecutor  # noqa
 
     def detect_python_binary(
-        self,
-        test_path: Union[Path, str]
+        self, test_path: Union[Path, str]
     ) -> Tuple[str, str]:
         default_python = 'python3'
-        if (self.dist_name in CONFIG.rhel_flavors
-                and self.dist_version.startswith(('6', '7'))):
+        if (
+            self.dist_name in CONFIG.rhel_flavors
+            and self.dist_version.startswith(('6', '7'))
+        ):
             default_python = 'python'
         with open(test_path, 'rt') as f:
             shebang = f.readline()
             result = INTERPRETER_REGEX.search(shebang)
             if not result:
@@ -1278,19 +1342,14 @@
         errors = []
         executor_class = self.detect_executor(
             os.path.join(remote_workdir, test_file.name)
         )
         if not executor_class:
-            self._logger.warning(
-                'Cannot get executor for test %s',
-                test_file
-            )
+            self._logger.warning('Cannot get executor for test %s', test_file)
             return errors
         self._logger.info('Running %s', test_file)
-        self._logger.debug(
-            'Executor: %s', executor_class.__name__
-        )
+        self._logger.debug('Executor: %s', executor_class.__name__)
         if executor_class == AnsibleExecutor:
             cmd_args = [test_file]
             workdir = local_workdir
             executor_params['binary_name'] = self.ansible_playbook_binary
         else:
@@ -1396,12 +1455,14 @@
             tests_list = self.find_tests(remote_workdir)
             # Check if package has 0_init-like script
             for test_file in tests_list:
                 if tests_to_run and test_file.name not in tests_to_run:
                     continue
-                if (('0_init' not in test_file.name
-                     or '0_install' not in test_file.name)):
+                if (
+                    '0_init' not in test_file.name
+                    or '0_install' not in test_file.name
+                ):
                     self.ensure_package_is_installed(
                         package_name,
                         package_version=package_version,
                         package_epoch=package_epoch,
                     )
@@ -1494,14 +1555,18 @@
                 'Running "terraform destroy --auto-approve" command'
             )
             cmd_args = ['destroy', '--auto-approve', '-no-color']
             if self.TF_VARIABLES_FILE:
                 cmd_args.extend(['--var-file', self.TF_VARIABLES_FILE])
-            return local['terraform'].with_cwd(self._work_dir).run(
-                args=cmd_args,
-                retcode=None,
-                timeout=CONFIG.provision_timeout,
+            return (
+                local['terraform']
+                .with_cwd(self._work_dir)
+                .run(
+                    args=cmd_args,
+                    retcode=None,
+                    timeout=CONFIG.provision_timeout,
+                )
             )
 
     def erase_work_dir(self):
         if self._work_dir and os.path.exists(self._work_dir):
             self._logger.info('Erasing working directory...')
@@ -1639,11 +1704,13 @@
             test_configuration=test_configuration,
             test_flavor=test_flavor,
             verbose=verbose,
         )
         self._tests_dir = CONFIG.tests_base_dir
-        self._ssh_client: Optional[Union[AsyncSSHClient, LongRunSSHClient]] = None
+        self._ssh_client: Optional[Union[AsyncSSHClient, LongRunSSHClient]] = (
+            None
+        )
         self._vm_ip = None
 
     def _wait_for_ssh(self, retries=60):
         ansible = local[self.ansible_binary]
         cmd_args = ('-i', self.ANSIBLE_INVENTORY_FILE, '-m', 'ping', 'all')
@@ -1700,15 +1767,18 @@
     def start_env(self):
         exit_code, stdout, stderr = super().start_env()
         # VM gets its IP address only after deploy.
         # To extract it, the `vm_ip` output should be defined
         # in Terraform main file.
-        ip_exit_code, ip_stdout, ip_stderr = local['terraform'].with_cwd(
-            self._work_dir).run(
-            args=('output', '-raw',  '-no-color', 'vm_ip'),
-            retcode=None,
-            timeout=CONFIG.provision_timeout,
+        ip_exit_code, ip_stdout, ip_stderr = (
+            local['terraform']
+            .with_cwd(self._work_dir)
+            .run(
+                args=('output', '-raw', '-no-color', 'vm_ip'),
+                retcode=None,
+                timeout=CONFIG.provision_timeout,
+            )
         )
         if ip_exit_code != 0:
             error_message = f'Cannot get VM IP: {ip_stderr}'
             self._logger.error(error_message)
             return ip_exit_code, ip_stdout, ip_stderr
@@ -1748,13 +1818,13 @@
         command = ' '.join(args)
         result = self._ssh_client.sync_run_command(command)
         return result.exit_code, result.stdout, result.stderr
 
     def clone_third_party_repo(
-            self,
-            repo_url: str,
-            git_ref: str,
+        self,
+        repo_url: str,
+        git_ref: str,
     ) -> Optional[Path]:
         git_repo_path = super().clone_third_party_repo(repo_url, git_ref)
         if not git_repo_path:
             return
         if self._ssh_client:
@@ -1762,19 +1832,22 @@
                 self._tests_dir,
                 Path(repo_url).name.replace('.git', ''),
             )
             result = None
             for attempt in range(1, 6):
-                cmd = (f'if [ -e {repo_path} ]; then cd {repo_path} && '
-                       f'git reset --hard origin/master && git checkout master && git pull; '
-                       f'else cd {self._tests_dir} && git clone {repo_url}; fi')
+                cmd = (
+                    f'if [ -e {repo_path} ]; then cd {repo_path} && '
+                    f'git reset --hard origin/master && git checkout master && git pull; '
+                    f'else cd {self._tests_dir} && git clone {repo_url}; fi'
+                )
                 result = self._ssh_client.sync_run_command(cmd)
                 if result.is_successful():
                     break
                 self._logger.warning(
                     'Attempt to clone repository on VM failed:\n%s\n%s',
-                    result.stdout, result.stderr,
+                    result.stdout,
+                    result.stderr,
                 )
                 time.sleep(random.randint(5, 10))
             if not result or (result and not result.is_successful()):
                 return
 
@@ -1800,26 +1873,30 @@
         '',
         'Third party tests failed',
         exception_class=ThirdPartyTestError,
     )
     def run_third_party_test(
-            self,
-            executor: Union[AnsibleExecutor, BatsExecutor, CommandExecutor, ShellExecutor],
-            cmd_args: List[str],
-            docker_args: Optional[List[str]] = None,
-            workdir: str = '',
-            artifacts_key: str = '',
-            additional_section_name: str = '',
-            env_vars: Optional[List[str]] = None,
+        self,
+        executor: Union[
+            AnsibleExecutor, BatsExecutor, CommandExecutor, ShellExecutor
+        ],
+        cmd_args: List[str],
+        docker_args: Optional[List[str]] = None,
+        workdir: str = '',
+        artifacts_key: str = '',
+        additional_section_name: str = '',
+        env_vars: Optional[List[str]] = None,
     ):
         result = executor.run(
             cmd_args=cmd_args,
             workdir=workdir,
             env_vars=env_vars,
         )
-        if (self.VM_RESTART_OUTPUT_TRIGGER in result.stdout
-                or self.VM_RESTART_OUTPUT_TRIGGER in result.stderr):
+        if (
+            self.VM_RESTART_OUTPUT_TRIGGER in result.stdout
+            or self.VM_RESTART_OUTPUT_TRIGGER in result.stderr
+        ):
             reboot_result = self.reboot_target()
             if not reboot_result:
                 exit_code = 1
                 stderr = result.stderr + '\n\nReboot failed'
                 return exit_code, result.stdout, stderr

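One detail worth flagging from the reformatting diff above: the `'Unknown error happened during %s execution: %s'` call passes only `formulated_cmd`, so the second `%s` placeholder is never filled and the logging module will emit a formatting-error traceback instead of the message. A corrected call would pass the caught exception as the second argument; the sketch below (with a hypothetical `log_unknown_error` wrapper standing in for the surrounding `except Exception as e:` block) shows the shape:

```python
import logging

logger = logging.getLogger(__name__)

def log_unknown_error(formulated_cmd, exc):
    # Two %s placeholders require two arguments; the hunk in the diff
    # above passes only the command, leaving the exception out.
    logger.error(
        'Unknown error happened during %s execution: %s',
        formulated_cmd,
        exc,
    )
```

Keeping the lazy `%s` style (rather than an f-string) preserves logging's deferred formatting, so the message is only rendered when the record is actually emitted.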
Isort report
--- /code/alts/shared/models.py:before	2024-08-19 10:47:56.056427
+++ /code/alts/shared/models.py:after	2024-08-19 10:48:39.187872
@@ -7,8 +7,8 @@
     List,
     Literal,
     Optional,
+    Set,
     Union,
-    Set,
 )
 
 from pydantic import BaseModel, ConfigDict, computed_field
--- /code/alts/worker/runners/base.py:before	2024-08-19 10:47:56.056427
+++ /code/alts/worker/runners/base.py:after	2024-08-19 10:48:39.214259
@@ -4,8 +4,8 @@
 import os
 import random
 import re
+import shutil
 import signal
-import shutil
 import tempfile
 import time
 import traceback
@@ -18,15 +18,15 @@
     Dict,
     List,
     Optional,
-    Union,
     Tuple,
     Type,
+    Union,
 )
 
 from billiard.exceptions import SoftTimeLimitExceeded
 from filelock import FileLock
 from mako.lookup import TemplateLookup
-from plumbum import local, ProcessExecutionError, ProcessTimedOut
+from plumbum import ProcessExecutionError, ProcessTimedOut, local
 
 from alts.shared.constants import COMMAND_TIMEOUT_EXIT_CODE
 from alts.shared.exceptions import (
--- /code/alts/worker/runners/docker.py:before	2024-08-19 10:47:56.056427
+++ /code/alts/worker/runners/docker.py:after	2024-08-19 10:48:39.221787
@@ -11,8 +11,8 @@
     Dict,
     List,
     Optional,
+    Tuple,
     Union,
-    Tuple,
 )
 
 from plumbum import local
--- /code/alts/worker/runners/opennebula.py:before	2024-08-19 10:47:56.056427
+++ /code/alts/worker/runners/opennebula.py:after	2024-08-19 10:48:39.226342
@@ -8,7 +8,8 @@
 import re
 import time
 from typing import (
-    Callable, Dict,
+    Callable,
+    Dict,
     List,
     Optional,
     Union,
@@ -20,8 +21,8 @@
 from alts.shared.constants import X32_ARCHITECTURES
 from alts.shared.exceptions import (
     OpennebulaVMStopError,
+    StopEnvironmentError,
     VMImageNotFound,
-    StopEnvironmentError,
 )
 from alts.shared.uploaders.base import BaseLogsUploader
 from alts.worker import CONFIG
--- /code/alts/worker/tasks.py:before	2024-08-19 10:47:56.056427
+++ /code/alts/worker/tasks.py:after	2024-08-19 10:48:39.234934
@@ -5,11 +5,10 @@
 """AlmaLinux Test System package testing tasks running."""
 
 import logging
-import traceback
 import random
 import time
+import traceback
 import urllib.parse
-from celery.contrib.abortable import AbortableTask
 from collections import defaultdict
 from socket import timeout
 from typing import Union
@@ -17,16 +16,18 @@
 import requests
 import requests.adapters
 import tap.parser
+from celery.contrib.abortable import AbortableTask
 from requests.exceptions import (
+    ConnectTimeout,
     HTTPError,
     ReadTimeout,
-    ConnectTimeout,
 )
 from urllib3 import Retry
 from urllib3.exceptions import TimeoutError
 
 from alts.shared.constants import API_VERSION, DEFAULT_REQUEST_TIMEOUT
 from alts.shared.exceptions import (
+    AbortedTestTask,
     InstallPackageError,
     PackageIntegrityTestsError,
     ProvisionError,
@@ -35,7 +36,6 @@
     TerraformInitializationError,
     ThirdPartyTestError,
     UninstallPackageError,
-    AbortedTestTask,
     VMImageNotFound,
     WorkDirPreparationError,
 )

Bandit report
Run started:2024-08-19 10:48:40.190388

Test results:
>> Issue: [B108:hardcoded_tmp_directory] Probable insecure usage of temp file/directory.
   Severity: Medium   Confidence: Medium
   CWE: CWE-377 (https://cwe.mitre.org/data/definitions/377.html)
   More Info: https://bandit.readthedocs.io/en/1.7.8/plugins/b108_hardcoded_tmp_directory.html
   Location: ./alts/worker/runners/base.py:76:20
75	)
76	TF_INIT_LOCK_PATH = '/tmp/tf_init_lock'
77	BASE_SYSTEM_INFO_COMMANDS = {

--------------------------------------------------
>> Issue: [B311:blacklist] Standard pseudo-random generators are not suitable for security/cryptographic purposes.
   Severity: Low   Confidence: High
   CWE: CWE-330 (https://cwe.mitre.org/data/definitions/330.html)
   More Info: https://bandit.readthedocs.io/en/1.7.8/blacklists/blacklist_calls.html#b311-random
   Location: ./alts/worker/runners/base.py:685:27
684	                attempts -= 1
685	                time.sleep(random.randint(5, 10))
686	        if attempts == 0 and recorded_exc:

--------------------------------------------------
>> Issue: [B311:blacklist] Standard pseudo-random generators are not suitable for security/cryptographic purposes.
   Severity: Low   Confidence: High
   CWE: CWE-330 (https://cwe.mitre.org/data/definitions/330.html)
   More Info: https://bandit.readthedocs.io/en/1.7.8/blacklists/blacklist_calls.html#b311-random
   Location: ./alts/worker/runners/base.py:1134:27
1133	                self._logger.debug('Sleeping before making another attempt')
1134	                time.sleep(random.randint(5, 10))
1135	            else:

--------------------------------------------------
>> Issue: [B110:try_except_pass] Try, Except, Pass detected.
   Severity: Low   Confidence: High
   CWE: CWE-703 (https://cwe.mitre.org/data/definitions/703.html)
   More Info: https://bandit.readthedocs.io/en/1.7.8/plugins/b110_try_except_pass.html
   Location: ./alts/worker/runners/base.py:1743:12
1742	                self._ssh_client.close()
1743	            except:
1744	                pass
1745	        super().teardown(publish_artifacts=publish_artifacts)

--------------------------------------------------
>> Issue: [B311:blacklist] Standard pseudo-random generators are not suitable for security/cryptographic purposes.
   Severity: Low   Confidence: High
   CWE: CWE-330 (https://cwe.mitre.org/data/definitions/330.html)
   More Info: https://bandit.readthedocs.io/en/1.7.8/blacklists/blacklist_calls.html#b311-random
   Location: ./alts/worker/runners/base.py:1777:27
1776	                )
1777	                time.sleep(random.randint(5, 10))
1778	            if not result or (result and not result.is_successful()):

--------------------------------------------------
>> Issue: [B311:blacklist] Standard pseudo-random generators are not suitable for security/cryptographic purposes.
   Severity: Low   Confidence: High
   CWE: CWE-330 (https://cwe.mitre.org/data/definitions/330.html)
   More Info: https://bandit.readthedocs.io/en/1.7.8/blacklists/blacklist_calls.html#b311-random
   Location: ./alts/worker/tasks.py:180:19
179	        # a lot of tasks are coming to the machine
180	        time.sleep(random.randint(5, 10))
181	        runner.setup()

--------------------------------------------------

Code scanned:
	Total lines of code: 2862
	Total lines skipped (#nosec): 0
	Total potential issues skipped due to specifically being disabled (e.g., #nosec BXXX): 0

Run metrics:
	Total issues (by severity):
		Undefined: 0
		Low: 5
		Medium: 1
		High: 0
	Total issues (by confidence):
		Undefined: 0
		Low: 0
		Medium: 1
		High: 5
Files skipped (0):

View full reports on the Job Summary page.
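The Bandit findings above are all addressable with small, standard changes. The sketch below is illustrative, not the project's code: `TF_INIT_LOCK_PATH` mirrors the constant named in the B108 finding, while `close_quietly` and `retry_delay` are hypothetical helpers standing in for the flagged `try/except/pass` and `random.randint` call sites:

```python
import contextlib
import os
import random  # acceptable for retry jitter; B311 targets security tokens
import tempfile

# B108: derive the lock path from the platform temp directory instead of
# hardcoding /tmp (respects TMPDIR; could also be scoped per user).
TF_INIT_LOCK_PATH = os.path.join(tempfile.gettempdir(), 'tf_init_lock')

# B110: make the intentionally ignored exception explicit rather than
# using a bare try/except/pass.
def close_quietly(client):
    with contextlib.suppress(Exception):
        client.close()

# B311: sleep jitter between retries has no security impact; a
# `# nosec B311` marker would document that the finding is accepted.
def retry_delay() -> int:
    return random.randint(5, 10)
```

None of these change behavior; they mainly make the intent explicit so the findings can be either fixed or deliberately suppressed.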

@Korulag Korulag merged commit 4c4e637 into master Aug 30, 2024
2 checks passed
@Korulag Korulag deleted the add-test-flavors branch August 30, 2024 09:44