This repository has been archived by the owner on Feb 1, 2019. It is now read-only.

.coafile: Updated .coafile #19

Open · wants to merge 1 commit into master

Conversation

sosooding

.coafile: Updated .coafile

Closes: #18

@TravisBuddy

Travis tests have failed

Hey @Rishik1504,
Please read the following log in order to understand the failure reason.
It'll be awesome if you fix what's wrong and commit the changes.

3rd Build

View build log

coverage run $(which behave) ./tests/server.features
Feature: coalashim module # tests/server.features/coalashim.feature:1
  coalashim is a module of language-server, it interacts with coala core.
  Scenario: Test run_coala_with_specific_file                    # tests/server.features/coalashim.feature:4
    Given the current directory and path of qualified.py         # tests/server.features/steps/coalashim_steps.py:13 0.000s
    When I pass the qualified.py to run_coala_with_specific_file # tests/server.features/steps/coalashim_steps.py:26 5.013s
    Then it should return output in json format                  # tests/server.features/steps/coalashim_steps.py:31 0.000s
    And with no error in the output                              # tests/server.features/steps/coalashim_steps.py:36 0.000s

  Scenario: Test run_coala_with_specific_file                      # tests/server.features/coalashim.feature:10
    Given the current directory and path of unqualified.py         # tests/server.features/steps/coalashim_steps.py:41 0.000s
    When I pass the unqualified.py to run_coala_with_specific_file # tests/server.features/steps/coalashim_steps.py:54 4.350s
    Then it should return output in json format                    # tests/server.features/steps/coalashim_steps.py:31 0.000s
    And with autopep8 errors in the output                         # tests/server.features/steps/coalashim_steps.py:59 0.001s
      Traceback (most recent call last):
        File "/home/travis/virtualenv/python3.6.3/lib/python3.6/site-packages/behave/model.py", line 1329, in run
          match.run(runner.context)
        File "/home/travis/virtualenv/python3.6.3/lib/python3.6/site-packages/behave/matchers.py", line 98, in run
          self.func(context, *args, **kwargs)
        File "tests/server.features/steps/coalashim_steps.py", line 61, in step_impl
          assert json.loads(context.output)['results']['autopep8'] is not None
      KeyError: 'autopep8'
      
      Captured stderr:
      [WARNING][10:15:45] Section `all.todos` contain invalid language setting: 'Language `python3` is not a valid language name or not recognized by coala.'
      Output = {
        "results": {
          "all": [],
          "all.autopep8": [
            {
              "additional_info": "",
              "affected_code": [
                {
                  "end": {
                    "column": null,
                    "file": "/home/travis/build/coala/coala-ls/tests/resources/unqualified.py",
                    "line": 2
                  },
                  "file": "/home/travis/build/coala/coala-ls/tests/resources/unqualified.py",
                  "start": {
                    "column": null,
                    "file": "/home/travis/build/coala/coala-ls/tests/resources/unqualified.py",
                    "line": 2
                  }
                }
              ],
              "applied_actions": {},
              "aspect": "NoneType",
              "confidence": 100,
              "debug_msg": "",
              "diffs": {
                "/home/travis/build/coala/coala-ls/tests/resources/unqualified.py": "--- \n+++ \n@@ -1,2 +1,2 @@\n def test():\n-  a = 1\n+    a = 1\n"
              },
              "id": 166333784423917419720597156743241798632,
              "message": "The code does not comply to PEP8.",
              "message_arguments": {},
              "message_base": "The code does not comply to PEP8.",
              "origin": "PEP8Bear",
              "severity": 1
            },
            {
              "additional_info": "",
              "affected_code": [
                {
                  "end": {
                    "column": 3,
                    "file": "/home/travis/build/coala/coala-ls/tests/resources/unqualified.py",
                    "line": 2
                  },
                  "file": "/home/travis/build/coala/coala-ls/tests/resources/unqualified.py",
                  "start": {
                    "column": 3,
                    "file": "/home/travis/build/coala/coala-ls/tests/resources/unqualified.py",
                    "line": 2
                  }
                }
              ],
              "applied_actions": {},
              "aspect": "NoneType",
              "confidence": 100,
              "debug_msg": "",
              "diffs": null,
              "id": 84188821207710503332153972590973145849,
              "message": "E111 indentation is not a multiple of four",
              "message_arguments": {},
              "message_base": "E111 indentation is not a multiple of four",
              "origin": "PycodestyleBear (E111)",
              "severity": 1
            },
            {
              "additional_info": "",
              "affected_code": [
                {
                  "end": {
                    "column": 8,
                    "file": "/home/travis/build/coala/coala-ls/tests/resources/unqualified.py",
                    "line": 2
                  },
                  "file": "/home/travis/build/coala/coala-ls/tests/resources/unqualified.py",
                  "start": {
                    "column": 8,
                    "file": "/home/travis/build/coala/coala-ls/tests/resources/unqualified.py",
                    "line": 2
                  }
                }
              ],
              "applied_actions": {},
              "aspect": "NoneType",
              "confidence": 100,
              "debug_msg": "",
              "diffs": null,
              "id": 79143895640915389452441634259884638004,
              "message": "W292 no newline at end of file",
              "message_arguments": {},
              "message_base": "W292 no newline at end of file",
              "origin": "PycodestyleBear (W292)",
              "severity": 1
            }
          ],
          "all.commit": [],
          "all.linelength": [],
          "all.python": [
            {
              "additional_info": "A trailing newline character ('\\n') is missing from your file. <http://stackoverflow.com/a/5813359/3212182> gives more information about why you might need one.",
              "affected_code": [
                {
                  "end": {
                    "column": null,
                    "file": "/home/travis/build/coala/coala-ls/tests/resources/unqualified.py",
                    "line": 2
                  },
                  "file": "/home/travis/build/coala/coala-ls/tests/resources/unqualified.py",
                  "start": {
                    "column": null,
                    "file": "/home/travis/build/coala/coala-ls/tests/resources/unqualified.py",
                    "line": 2
                  }
                }
              ],
              "applied_actions": {},
              "aspect": "NoneType",
              "confidence": 100,
              "debug_msg": "",
              "diffs": {
                "/home/travis/build/coala/coala-ls/tests/resources/unqualified.py": "--- \n+++ \n@@ -1,2 +1,2 @@\n def test():\n-  a = 1\n+  a = 1\n"
              },
              "id": 125639603658679655154191626767604465341,
              "message": "Line contains following spacing inconsistencies:\n- No newline at EOF.",
              "message_arguments": {},
              "message_base": "Line contains following spacing inconsistencies:\n- No newline at EOF.",
              "origin": "SpaceConsistencyBear",
              "severity": 1
            }
          ],
          "all.yml": [],
          "cli": []
        }
      }
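The `KeyError: 'autopep8'` above comes from the step indexing `results['autopep8']` while the captured output keys each result list by its full section name (`all.autopep8`). A minimal sketch of a lookup that matches sections by bear suffix instead — the inline JSON here is a trimmed stand-in for the real output:

```python
import json

# Trimmed stand-in for the captured coala output above: results are keyed
# by full section name ("all.autopep8"), not the bare "autopep8" suffix
# that the failing step asserts on.
output = '{"results": {"all": [], "all.autopep8": [{"origin": "PEP8Bear"}]}}'
results = json.loads(output)["results"]

# Collect findings from any section ending in ".autopep8" rather than
# indexing results["autopep8"] directly, which raises KeyError here.
autopep8_results = [
    finding
    for section, findings in results.items()
    if section == "autopep8" or section.endswith(".autopep8")
    for finding in findings
]
assert autopep8_results, "expected autopep8 findings"
```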


Feature: diagnostic module # tests/server.features/diagnostic.feature:1
  diagnostic is a module of language-server.
  Scenario: Test output_to_diagnostics                  # tests/server.features/diagnostic.feature:4
    Given the output with errors by coala               # tests/server.features/steps/diagnostic_steps.py:13 4.152s
    When I pass the parameters to output_to_diagnostics # tests/server.features/steps/diagnostic_steps.py:28 0.000s
    Then it should return output in vscode format       # tests/server.features/steps/diagnostic_steps.py:33 0.000s

Feature: jsonrpc module # tests/server.features/jsonrpc.feature:1
  jsonrpc is a module of language-server.
  Scenario: Test JsonRpcStreamWriter and JsonRpcStreamReader  # tests/server.features/jsonrpc.feature:4
    Given the message                                         # tests/server.features/steps/jsonrpc_steps.py:23 0.000s
    When I write it to JsonRpcStreamWriter                    # tests/server.features/steps/jsonrpc_steps.py:30 0.001s
    Then it should read from JsonRpcStreamReader              # tests/server.features/steps/jsonrpc_steps.py:37 0.000s
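The writer/reader round trip above can be sketched with the Content-Length framing convention language servers use on their streams; the function names here are illustrative, not the project's actual JsonRpcStreamWriter/JsonRpcStreamReader API:

```python
import io
import json

# Write one JSON-RPC message with an LSP-style Content-Length header.
def write_message(stream, payload):
    body = json.dumps(payload).encode("utf-8")
    header = "Content-Length: {}\r\n\r\n".format(len(body))
    stream.write(header.encode("ascii"))
    stream.write(body)

# Read headers until the blank line, then read exactly Content-Length bytes.
def read_message(stream):
    length = None
    while True:
        line = stream.readline().decode("ascii").strip()
        if not line:
            break  # blank line terminates the header block
        name, _, value = line.partition(":")
        if name.lower() == "content-length":
            length = int(value)
    return json.loads(stream.read(length).decode("utf-8"))

buffer = io.BytesIO()
write_message(buffer, {"jsonrpc": "2.0", "method": "initialize", "id": 1})
buffer.seek(0)
message = read_message(buffer)
```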

  Scenario: Test notification and dispatcher                  # tests/server.features/jsonrpc.feature:9
    Given a notification type rpc request                     # tests/server.features/steps/jsonrpc_steps.py:55 0.000s
    When I send rpc request using JsonRpcStreamWriter         # tests/server.features/steps/jsonrpc_steps.py:67 0.000s
    Then it should invoke the notification consumer with args # tests/server.features/steps/jsonrpc_steps.py:74 0.001s

  Scenario: Test rpc request and response              # tests/server.features/jsonrpc.feature:14
    Given a request type rpc request                   # tests/server.features/steps/jsonrpc_steps.py:94 0.000s
    When I send rpc request using JsonRpcStreamWriter  # tests/server.features/steps/jsonrpc_steps.py:67 0.000s
    Then it should invoke consumer and return response # tests/server.features/steps/jsonrpc_steps.py:107 0.000s

Feature: langserver module # tests/server.features/langserver.feature:1
  langserver is the main program of language-server.
  Scenario: Test serve_initialize with rootPath                  # tests/server.features/langserver.feature:4
    Given the LangServer instance                                # tests/server.features/steps/langserver_steps.py:22 0.001s
    When I send a initialize request with rootPath to the server # tests/server.features/steps/langserver_steps.py:28 0.000s
    Then it should return the response with textDocumentSync     # tests/server.features/steps/langserver_steps.py:56 0.000s

  Scenario: Test serve_initialize with rootUri                  # tests/server.features/langserver.feature:9
    Given the LangServer instance                               # tests/server.features/steps/langserver_steps.py:22 0.000s
    When I send a initialize request with rootUri to the server # tests/server.features/steps/langserver_steps.py:42 0.000s
    Then it should return the response with textDocumentSync    # tests/server.features/steps/langserver_steps.py:56 0.000s

  Scenario: Test send_diagnostics                            # tests/server.features/langserver.feature:14
    Given the LangServer instance                            # tests/server.features/steps/langserver_steps.py:22 0.000s
    When I invoke send_diagnostics message                   # tests/server.features/steps/langserver_steps.py:75 0.000s
    Then I should receive a publishDiagnostics type response # tests/server.features/steps/langserver_steps.py:135 0.000s
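The publishDiagnostics response these steps check for can be sketched as a notification following the Language Server Protocol convention; the URI and diagnostic values below are illustrative, not taken from this build:

```python
# Hypothetical shape of a publishDiagnostics notification per the LSP
# convention; field values are illustrative placeholders.
notification = {
    "jsonrpc": "2.0",
    "method": "textDocument/publishDiagnostics",
    "params": {
        "uri": "file:///tmp/example.py",
        "diagnostics": [
            {
                "range": {
                    "start": {"line": 1, "character": 0},
                    "end": {"line": 1, "character": 0},
                },
                "severity": 2,
                "message": "The code does not comply to PEP8.",
            }
        ],
    },
}
assert notification["method"] == "textDocument/publishDiagnostics"
```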

  Scenario: Test negative m_text_document__did_save                       # tests/server.features/langserver.feature:19
    Given the LangServer instance                                         # tests/server.features/steps/langserver_steps.py:22 0.000s
    When I send a did_save request about a non-existed file to the server # tests/server.features/steps/langserver_steps.py:81 3.937s
    Then I should receive a publishDiagnostics type response              # tests/server.features/steps/langserver_steps.py:135 0.000s

  Scenario: Test positive m_text_document__did_save                    # tests/server.features/langserver.feature:24
    Given the LangServer instance                                      # tests/server.features/steps/langserver_steps.py:22 0.000s
    When I send a did_save request about a existing file to the server # tests/server.features/steps/langserver_steps.py:96 4.419s
    Then I should receive a publishDiagnostics type response           # tests/server.features/steps/langserver_steps.py:135 0.000s

  Scenario: Test when coafile is missing                               # tests/server.features/langserver.feature:29
    Given the LangServer instance                                      # tests/server.features/steps/langserver_steps.py:22 0.000s
    When I send a did_save request on a file with no coafile to server # tests/server.features/steps/langserver_steps.py:116 4.000s
    Then I should receive a publishDiagnostics type response           # tests/server.features/steps/langserver_steps.py:135 0.001s

  Scenario: Test didChange                                      # tests/server.features/langserver.feature:34
    Given the LangServer instance                               # tests/server.features/steps/langserver_steps.py:22 0.000s
    When I send a did_change request about a file to the server # tests/server.features/steps/langserver_steps.py:156 0.002s
    Then it should ignore the request                           # tests/server.features/steps/langserver_steps.py:179 0.000s

  Scenario: Test langserver shutdown             # tests/server.features/langserver.feature:39
    Given the LangServer instance                # tests/server.features/steps/langserver_steps.py:22 0.000s
    When I send a shutdown request to the server # tests/server.features/steps/langserver_steps.py:186 0.000s
    Then it should shutdown                      # tests/server.features/steps/langserver_steps.py:197 0.000s

  Scenario: Test language server in stdio mode                         # tests/server.features/langserver.feature:44
    Given I send a initialize request via stdio stream                 # tests/server.features/steps/langserver_steps.py:297 0.000s
    When the server is started in stdio mode                           # tests/server.features/steps/langserver_steps.py:315 0.001s
    Then it should return the response with textDocumentSync via stdio # tests/server.features/steps/langserver_steps.py:337 2.003s

  Scenario: Test language server in tcp mode                         # tests/server.features/langserver.feature:49
    Given the server started in TCP mode                             # tests/server.features/steps/langserver_steps.py:236 1.005s
    When I send a initialize request via TCP stream                  # tests/server.features/steps/langserver_steps.py:265 0.000s
    Then it should return the response with textDocumentSync via TCP # tests/server.features/steps/langserver_steps.py:279 0.001s

Feature: log module # tests/server.features/log.feature:1
  log is a module of language-server.
  Scenario: Test log               # tests/server.features/log.feature:4
    Given There is a string        # tests/server.features/steps/log_steps.py:11 0.001s
    When I pass the string to log  # tests/server.features/steps/log_steps.py:16 0.000s
    Then it should return normally # tests/server.features/steps/log_steps.py:21 0.000s
      Traceback (most recent call last):
        File "/home/travis/virtualenv/python3.6.3/lib/python3.6/site-packages/behave/model.py", line 1329, in run
          match.run(runner.context)
        File "/home/travis/virtualenv/python3.6.3/lib/python3.6/site-packages/behave/matchers.py", line 98, in run
          self.func(context, *args, **kwargs)
        File "tests/server.features/steps/log_steps.py", line 23, in step_impl
          assert context.failed is False
      AssertionError
      
      Captured stderr:
      file://Users


Feature: uri module # tests/server.features/uri.feature:1
  uri is a module of language-server.
  Scenario: Test path_from_uri                              # tests/server.features/uri.feature:4
    Given There is a string with "file://"                  # tests/server.features/steps/uri_steps.py:11 0.000s
    When I pass the string with the prefix to path_from_uri # tests/server.features/steps/uri_steps.py:16 0.000s
    Then it should return a string without "file://"        # tests/server.features/steps/uri_steps.py:31 0.000s
      Traceback (most recent call last):
        File "/home/travis/virtualenv/python3.6.3/lib/python3.6/site-packages/behave/model.py", line 1329, in run
          match.run(runner.context)
        File "/home/travis/virtualenv/python3.6.3/lib/python3.6/site-packages/behave/matchers.py", line 98, in run
          self.func(context, *args, **kwargs)
        File "tests/server.features/steps/uri_steps.py", line 33, in step_impl
          assert context.failed is False
      AssertionError


  Scenario: Test path_from_uri                                 # tests/server.features/uri.feature:9
    Given There is a string without "file://"                  # tests/server.features/steps/uri_steps.py:21 0.000s
    When I pass the string without the prefix to path_from_uri # tests/server.features/steps/uri_steps.py:26 0.000s
    Then it should return a string without "file://"           # tests/server.features/steps/uri_steps.py:31 0.000s
      Traceback (most recent call last):
        File "/home/travis/virtualenv/python3.6.3/lib/python3.6/site-packages/behave/model.py", line 1329, in run
          match.run(runner.context)
        File "/home/travis/virtualenv/python3.6.3/lib/python3.6/site-packages/behave/matchers.py", line 98, in run
          self.func(context, *args, **kwargs)
        File "tests/server.features/steps/uri_steps.py", line 33, in step_impl
          assert context.failed is False
      AssertionError


  Scenario: Test dir_from_uri                       # tests/server.features/uri.feature:14
    Given There is a string without "file://"       # tests/server.features/steps/uri_steps.py:21 0.000s
    When I pass the string to dir_from_uri          # tests/server.features/steps/uri_steps.py:37 0.000s
    Then it should return the directory of the path # tests/server.features/steps/uri_steps.py:42 0.000s
      Traceback (most recent call last):
        File "/home/travis/virtualenv/python3.6.3/lib/python3.6/site-packages/behave/model.py", line 1329, in run
          match.run(runner.context)
        File "/home/travis/virtualenv/python3.6.3/lib/python3.6/site-packages/behave/matchers.py", line 98, in run
          self.func(context, *args, **kwargs)
        File "tests/server.features/steps/uri_steps.py", line 44, in step_impl
          assert context.failed is False
      AssertionError
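The captured stderr `file://Users` in the log failure above suggests the scheme prefix was not stripped from the URI. A hypothetical sketch of the helpers the failing uri steps exercise, assuming only the behaviour the step names describe (strip a leading `file://`, pass plain paths through unchanged):

```python
import os

# Hypothetical path_from_uri: drop a leading "file://" scheme if present,
# otherwise return the path unchanged.
def path_from_uri(uri):
    prefix = "file://"
    return uri[len(prefix):] if uri.startswith(prefix) else uri

# Hypothetical dir_from_uri: directory component of the resulting path.
def dir_from_uri(uri):
    return os.path.dirname(path_from_uri(uri))

assert path_from_uri("file:///home/user/a.py") == "/home/user/a.py"
assert path_from_uri("/home/user/a.py") == "/home/user/a.py"
assert dir_from_uri("/home/user/a.py") == "/home/user"
```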



Failing scenarios:
  tests/server.features/coalashim.feature:10  Test run_coala_with_specific_file
  tests/server.features/log.feature:4  Test log
  tests/server.features/uri.feature:4  Test path_from_uri
  tests/server.features/uri.feature:9  Test path_from_uri
  tests/server.features/uri.feature:14  Test dir_from_uri

3 features passed, 3 failed, 0 skipped
15 scenarios passed, 5 failed, 0 skipped
57 steps passed, 5 failed, 0 skipped, 0 undefined
Took 0m28.901s
TravisBuddy Request Identifier: c80b1f30-e4d1-11e8-80bf-e932c516df5f
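One fix the log points at directly: the warning `Section 'all.todos' contain invalid language setting` means coala did not recognize `python3` as a language name. A hedged sketch of the corresponding `.coafile` section, assuming `Python` is the registered name (any other keys of the section are omitted here):

```ini
[all.todos]
# "python3" is not a language name coala recognizes; "Python" is.
language = Python
```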
