Test against multiple SQLite versions #2352

Merged · 30 commits · Jun 13, 2024

Conversation

asg017 (Collaborator) commented Jun 11, 2024

codecov bot commented Jun 11, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 92.63%. Comparing base (64a125b) to head (eb6b27d).

Current head eb6b27d differs from the pull request's most recent head 653c46d. Please upload reports for commit 653c46d to get more accurate results.

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #2352      +/-   ##
==========================================
- Coverage   92.63%   92.63%   -0.01%     
==========================================
  Files          41       41              
  Lines        6399     6395       -4     
==========================================
- Hits         5928     5924       -4     
  Misses        471      471              


simonw (Owner) commented Jun 12, 2024

Looks like that test failure is caused by:

    async def test_derive_named_parameters(sql, expected):
        ds = Datasette([], memory=True)
        db = ds.get_database("_memory")
        params = await utils.derive_named_parameters(db, sql)
>       assert params == expected
E       AssertionError: assert ['00', 'cat'] == ['cat']
E         
E         At index 0 diff: '00' != 'cat'
E         Left contains one more item: 'cat'
E         
E         Full diff:
E           [
E         +     '00',
E               'cat',
E           ]

simonw (Owner) commented Jun 12, 2024

That test is still failing because its whole point is to confirm that the clever trick I came up with (which gets skipped on SQLite 3.46.0) correctly avoids the "0:00" decoy string.

@pytest.mark.asyncio
@pytest.mark.parametrize(
    "sql,expected",
    (
        ("select 1", []),
        ("select 1 + :one", ["one"]),
        ("select 1 + :one + :two", ["one", "two"]),
        ("select 'bob' || '0:00' || :cat", ["cat"]),
        ("select this is invalid :one, :two, :three", ["one", "two", "three"]),
    ),
)
async def test_derive_named_parameters(sql, expected):
    ds = Datasette([], memory=True)
    db = ds.get_database("_memory")
    params = await utils.derive_named_parameters(db, sql)
    assert params == expected
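
For context, here is a minimal sketch of the EXPLAIN-based trick this test exercises, written against the plain sqlite3 API rather than Datasette's async wrapper, so the names and structure are illustrative rather than Datasette's actual implementation. A regex over-matches, picking up the ":00" inside the '0:00' string literal; EXPLAIN then narrows the candidates down to real parameters, which show up as Variable opcodes with the parameter name in the p4 column:

    import re
    import sqlite3

    _candidate_re = re.compile(r":(\w+)")

    def derive_named_parameters(conn, sql):
        # The regex over-matches: it also sees ':00' inside the literal '0:00'
        candidates = _candidate_re.findall(sql)
        try:
            # Real parameters survive as Variable opcodes in the bytecode;
            # sqlite3 ignores extra keys in the bindings dict
            rows = conn.execute(
                "explain " + sql.strip().rstrip(";"),
                {name: None for name in candidates},
            ).fetchall()
            # EXPLAIN columns: addr, opcode, p1, p2, p3, p4, p5, comment
            return [
                row[5].lstrip(":") for row in rows if row[1] == "Variable" and row[5]
            ]
        except sqlite3.DatabaseError:
            # Invalid SQL: fall back to the over-matching regex candidates
            return candidates

    conn = sqlite3.connect(":memory:")
    print(derive_named_parameters(conn, "select 'bob' || '0:00' || :cat"))
    # ['cat'] via EXPLAIN; the regex alone would have returned ['00', 'cat']

Per the comment above, the EXPLAIN step gets skipped on SQLite 3.46.0, which would leave the regex fallback and its '00' false positive, matching the failure shown earlier.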

simonw (Owner) commented Jun 13, 2024

Let's drop macOS for the moment, and just go with Linux.

simonw (Owner) commented Jun 13, 2024

Got this intermittent failure: https://github.com/simonw/datasette/actions/runs/9494472159/job/26164885108

Really frustrating. I don't think it's anything to do with SQLite versions; I think it's a flaky test.

    def test_max_csv_mb(app_client_csv_max_mb_one):
        # This query deliberately generates a really long string
        # should be 100*100*100*2 = roughly 2MB
        response = app_client_csv_max_mb_one.get(
            "/fixtures.csv?"
            + urllib.parse.urlencode(
                {
                    "sql": """
                select group_concat('ab', '')
                from json_each(json_array({lots})),
                    json_each(json_array({lots})),
                    json_each(json_array({lots}))
                """.format(
                        lots=", ".join(str(i) for i in range(100))
                    ),
                    "_stream": 1,
                    "_size": "max",
                }
            ),
        )
        # It's a 200 because we started streaming before we knew the error
>       assert response.status == 200
E       assert 400 == 200
E        +  where 400 = <datasette.utils.testing.TestResponse object at 0x7fd35cdc04d0>.status
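
An aside on why the test asserts a 200 at all: with a streaming response the status line is committed before the body is generated, so a size limit hit mid-stream can only be reported inside the body. A rough sketch of that pattern, with hypothetical names rather than Datasette's actual code:

    def stream_csv_rows(rows, max_bytes):
        # The 200 status has already been sent by the time rows are
        # generated, so exceeding the byte budget mid-stream can only
        # surface as an error line appended to the CSV body
        sent = 0
        for row in rows:
            line = ",".join(str(v) for v in row) + "\r\n"
            sent += len(line)
            if sent > max_bytes:
                yield "\r\nCSV contains more than the allowed number of bytes\r\n"
                return
            yield line

The intermittent 400 suggests that on some runs the limit is detected before the first byte goes out, at which point a real error status can still be sent.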

simonw (Owner) commented Jun 13, 2024

Let's comment out that test so it doesn't block merging this PR, then fix it in this issue:

simonw (Owner) commented on lines 54 to 55:

    # And the test that exercises a localhost HTTPS server
    tests/test_datasette_https_server.sh

We could drop this; it's covered elsewhere.

simonw (Owner) commented on the new workflow file (@@ -0,0 +1,53 @@):

    name: Test

We should change this to "Test SQLite versions", otherwise the GitHub Actions UI has a bunch of things all called "Test":

[Screenshot: GitHub Actions checks list showing several entries all named "Test"]

simonw marked this pull request as ready for review Jun 13, 2024 16:46
simonw changed the title from "WIP: Test on multiple SQLite version" to "Test against multiple SQLite versions" Jun 13, 2024
simonw (Owner) commented Jun 13, 2024

Flaky test failed again in a different way: https://github.com/asg017/datasette/actions/runs/9503497141/job/26194049093

    def test_max_csv_mb(app_client_csv_max_mb_one):
        # This query deliberately generates a really long string
        # should be 100*100*100*2 = roughly 2MB
        response = app_client_csv_max_mb_one.get(
            "/fixtures.csv?"
            + urllib.parse.urlencode(
                {
                    "sql": """
                select group_concat('ab', '')
                from json_each(json_array({lots})),
                    json_each(json_array({lots})),
                    json_each(json_array({lots}))
                """.format(
                        lots=", ".join(str(i) for i in range(100))
                    ),
                    "_stream": 1,
                    "_size": "max",
                }
            ),
        )
        # It's a 200 because we started streaming before we knew the error
>       assert response.status == 200
E       assert 500 == 200
E        +  where 500 = <datasette.utils.testing.TestResponse object at 0x7fe7fb6d0790>.status

/home/runner/work/datasette/datasette/tests/test_csv.py:213: AssertionError
----------------------------- Captured stderr call -----------------------------
Traceback (most recent call last):
  File "/home/runner/work/datasette/datasette/datasette/database.py", line 306, in sql_operation_in_thread
    cursor.execute(sql, params if params is not None else {})
sqlite3.OperationalError: interrupted

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/runner/work/datasette/datasette/datasette/app.py", line 1732, in route_path
    response = await view(request, send)
  File "/home/runner/work/datasette/datasette/datasette/app.py", line 1899, in async_view_for_class
    return await async_call_with_supported_arguments(
  File "/home/runner/work/datasette/datasette/datasette/utils/__init__.py", line 1022, in async_call_with_supported_arguments
    return await fn(*call_with)
  File "/home/runner/work/datasette/datasette/datasette/views/base.py", line 89, in __call__
    return await handler(request, datasette)
  File "/home/runner/work/datasette/datasette/datasette/views/database.py", line 61, in get
    return await QueryView()(request, datasette)
  File "/home/runner/work/datasette/datasette/datasette/views/base.py", line 89, in __call__
    return await handler(request, datasette)
  File "/home/runner/work/datasette/datasette/datasette/views/database.py", line 564, in get
    return await stream_csv(datasette, fetch_data_for_csv, request, db.name)
  File "/home/runner/work/datasette/datasette/datasette/views/base.py", line 441, in stream_csv
    response_or_template_contexts = await fetch_data(request)
  File "/home/runner/work/datasette/datasette/datasette/views/database.py", line 560, in fetch_data_for_csv
    results = await db.execute(sql, params, truncate=True)
  File "/home/runner/work/datasette/datasette/datasette/database.py", line 336, in execute
    results = await self.execute_fn(sql_operation_in_thread)
  File "/home/runner/work/datasette/datasette/datasette/database.py", line 282, in execute_fn
    return await asyncio.get_event_loop().run_in_executor(
  File "/opt/hostedtoolcache/Python/3.8.18/x64/lib/python3.8/concurrent/futures/thread.py", line 57, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/home/runner/work/datasette/datasette/datasette/database.py", line 280, in in_thread
    return fn(conn)
  File "/home/runner/work/datasette/datasette/datasette/database.py", line 319, in sql_operation_in_thread
    raise QueryInterrupted(e, sql, params)
datasette.database.QueryInterrupted: (OperationalError('interrupted'), "\n            select group_concat('ab', '')\n            from json_each(json_array(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99)),\n                json_each(json_array(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99)),\n                json_each(json_array(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99))\n            ", {'_stream': '1', '_size': 'max'})
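
Background on the "interrupted" error, as a sketch rather than Datasette's exact code: SQLite aborts a running statement when a registered progress handler returns a truthy value, and Datasette enforces its SQL time limit this way. A minimal illustration:

    import sqlite3
    import time

    def execute_with_time_limit(conn, sql, ms):
        # A progress handler that returns a truthy value makes SQLite abort
        # the statement with sqlite3.OperationalError: interrupted
        deadline = time.monotonic() + ms / 1000
        conn.set_progress_handler(lambda: time.monotonic() > deadline, 1000)
        try:
            return conn.execute(sql).fetchall()
        finally:
            conn.set_progress_handler(None, 1000)

On a slow CI runner a fixed time limit can fire on a query that usually finishes in time, which fits the read that this is flakiness rather than anything SQLite-version-specific.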

simonw (Owner) commented Jun 13, 2024

Split out another flaky test issue:

simonw merged commit 8f86d2a into simonw:main Jun 13, 2024 (19 checks passed)
simonw mentioned this pull request Aug 5, 2024
simonw added a commit that referenced this pull request Aug 5, 2024
simonw added a commit that referenced this pull request Aug 5, 2024