
Improve CNS-related error messages #1012

Closed
rvhonorato opened this issue Sep 10, 2024 · 0 comments · Fixed by #1018
Labels: enhancement (Enhancing an existing feature or adding a new one)

rvhonorato (Member) commented:
Desired feature/enhancement

A better description of the errors found during the execution of a .inp file.

Motivation

Currently, when one of the CNS-based modules fails to produce its output, the error is only caught later during the tolerance check, which raises a RuntimeError exception that is not very informative (#1011 and others):

[2024-09-09 20:04:33,709 libutil ERROR] 100.00% of output was not generated for this module and tolerance was set to 5.00%.
Traceback (most recent call last):
  File "/data/foldseek/af-data/haddock3/src/haddock/libs/libutil.py", line 335, in log_error_and_exit
    yield
  File "/data/foldseek/af-data/haddock3/src/haddock/clis/cli.py", line 192, in main
    workflow.run()
  File "/data/foldseek/af-data/haddock3/src/haddock/libs/libworkflow.py", line 43, in run
    step.execute()
  File "/data/foldseek/af-data/haddock3/src/haddock/libs/libworkflow.py", line 162, in execute
    self.module.run()  # type: ignore
  File "/data/foldseek/af-data/haddock3/src/haddock/modules/base_cns_module.py", line 61, in run
    self._run()
  File "/data/foldseek/af-data/haddock3/src/haddock/modules/sampling/rigidbody/__init__.py", line 246, in _run
    self.export_io_models(faulty_tolerance=self.params["tolerance"])
  File "/data/foldseek/af-data/haddock3/src/haddock/modules/__init__.py", line 300, in export_io_models
    self.finish_with_error(_msg)
  File "/data/foldseek/af-data/haddock3/src/haddock/modules/__init__.py", line 308, in finish_with_error
    raise RuntimeError(reason)
RuntimeError: 100.00% of output was not generated for this module and tolerance was set to 5.00%.

Description

The CNS-based modules can fail to produce output for multiple reasons; however, over the years @amjjbonvin has collected several outputs tied to specific causes of failure. See haddock25/tools/check-error-messages.sh (only accessible to @haddocking/haddock-developers).

These checks can be added as a new parse_errors method of CNSJob and integrated into its run method, parsing the stdout of the p.communicate call.
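A minimal sketch of what such a hook could look like, assuming the error table is a plain string-to-hint mapping (the names KNOWN_ERRORS and parse_stdout are illustrative, not the actual CNSJob API):

```python
# Hypothetical sketch: check the stdout captured from p.communicate()
# against a lookup table of known CNS failure messages.
KNOWN_ERRORS = {
    # illustrative entry; the real table would mirror check-error-messages.sh
    "exceeded allocation for NOE-restraints": "check your active/passive definition",
}


def parse_stdout(stdout: bytes) -> "str | None":
    """Return a human-readable cause if a known CNS error appears in stdout."""
    for line in stdout.decode("utf-8", errors="replace").splitlines():
        for error, cause in KNOWN_ERRORS.items():
            if error in line:
                return cause
    return None  # no known error message found
```

The run method could then raise (or log) the returned hint instead of the generic tolerance-check RuntimeError.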

Additional context

When implementing this, it is important to consider efficiency, as these .out files can be quite large, with many lines.

Since the errors are going to be at the end, it is a good idea to read the file backwards and stop as soon as a known error is found - for example:

error_dict = {
    "exceeded allocation for NOE-restraints": "check your active/passive definition",
    "ROTMAT error encountered: rotation vector has zero length": "try turning off the sampling of 180 degrees rotation",
    # ... etc
}


def parse_errors(filename: str) -> "str | None":
    """Scan a CNS output file backwards; return the cause of the first known error."""

    def match(line: bytes) -> "str | None":
        decoded = line.decode('utf-8', errors='replace')
        for error, cause in error_dict.items():
            if error in decoded:
                return cause
        return None

    chunk_size = 4096
    with open(filename, 'rb') as file:
        file.seek(0, 2)  # jump to the end of the file
        pos = file.tell()
        buffer = b''
        while pos > 0:
            read_size = min(chunk_size, pos)
            pos -= read_size
            file.seek(pos)
            buffer = file.read(read_size) + buffer
            lines = buffer.split(b'\n')
            # lines[0] may be a partial line; keep it for the next chunk
            buffer = lines[0]
            for line in reversed(lines[1:]):
                cause = match(line)
                if cause is not None:
                    return cause
        # finally, check the very first line of the file
        return match(buffer)
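If the relevant messages reliably sit within the last few kilobytes of the .out file, an even simpler variant is a single tail read (a sketch; parse_errors_tail, ERROR_HINTS, and the 64 KiB default are illustrative assumptions, not haddock3 code):

```python
import os

# Illustrative subset; a real table would mirror check-error-messages.sh
ERROR_HINTS = {
    "exceeded allocation for NOE-restraints": "check your active/passive definition",
}


def parse_errors_tail(filename: str, tail_bytes: int = 65536) -> "str | None":
    """Read only the last tail_bytes of the file and scan for known errors."""
    with open(filename, 'rb') as fh:
        fh.seek(0, os.SEEK_END)
        size = fh.tell()
        fh.seek(max(size - tail_bytes, 0))
        text = fh.read().decode('utf-8', errors='replace')
    # scan the tail from the last line upwards
    for line in reversed(text.splitlines()):
        for error, cause in ERROR_HINTS.items():
            if error in line:
                return cause
    return None  # no known error in the tail
```

The trade-off is simplicity versus completeness: a fixed-size tail read avoids the chunked bookkeeping entirely, but silently misses errors printed earlier in the file.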
rvhonorato added the enhancement label on Sep 10, 2024
VGPReys self-assigned this on Sep 11, 2024