
[pre-commit.ci] pre-commit autoupdate #2810

Merged: 3 commits, Oct 3, 2023
4 changes: 2 additions & 2 deletions .pre-commit-config.yaml
@@ -14,13 +14,13 @@ repos:
hooks:
- id: black
- repo: https://github.com/astral-sh/ruff-pre-commit
- rev: v0.0.291
+ rev: v0.0.292
hooks:
- id: ruff
types: [file]
types_or: [python, pyi, toml]
- repo: https://github.com/codespell-project/codespell
- rev: v2.2.5
+ rev: v2.2.6
hooks:
- id: codespell

4 changes: 2 additions & 2 deletions docs/source/history.rst
@@ -265,7 +265,7 @@ Headline features
Features
~~~~~~~~

- - To speed up `trio.to_thread.run_sync`, Trio now caches and re-uses
+ - To speed up `trio.to_thread.run_sync`, Trio now caches and reuses
worker threads.

And in case you have some exotic use case where you need to spawn
@@ -419,7 +419,7 @@ Features

If you're using higher-level interfaces outside of the `trio.hazmat <trio.lowlevel>`
module, then you don't need to worry about any of this; those
- intefaces already take care of calling `~trio.lowlevel.notify_closing`
+ interfaces already take care of calling `~trio.lowlevel.notify_closing`
for you. (`#1272 <https://github.com/python-trio/trio/issues/1272>`__)


6 changes: 3 additions & 3 deletions docs/source/reference-core.rst
@@ -330,9 +330,9 @@ they might find it easier to work with absolute deadlines instead of
relative timeouts. If they're the ones calling into the cancellation
machinery, then they get to pick, and you don't have to worry about
it. Second, and more importantly, this makes it easier for others to
- re-use your code. If you write a ``http_get`` function, and then I
- come along later and write a ``log_in_to_twitter`` function that needs
- to internally make several ``http_get`` calls, I don't want to have to
+ reuse your code. If you write a ``http_get`` function, and then I come
+ along later and write a ``log_in_to_twitter`` function that needs to
+ internally make several ``http_get`` calls, I don't want to have to
figure out how to configure the individual timeouts on each of those
calls – and with Trio's timeout system, it's totally unnecessary.
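The composability argument above can be sketched even without Trio installed, using the stdlib's `asyncio.wait_for` as a stand-in for Trio's `trio.move_on_after` (the `http_get` and `log_in_to_twitter` bodies below are hypothetical placeholders, following the names in the text):

```python
import asyncio


async def http_get(url: str) -> str:
    # Hypothetical stand-in for a real HTTP fetch.
    await asyncio.sleep(0.01)
    return f"<response from {url}>"


async def log_in_to_twitter() -> list:
    # This function makes several http_get calls but configures no
    # timeouts itself; the *caller* owns the single overall deadline.
    urls = ["https://example.com/a", "https://example.com/b"]
    return [await http_get(u) for u in urls]


async def main() -> list:
    # One timeout covers everything underneath, with no per-call knobs.
    return await asyncio.wait_for(log_in_to_twitter(), timeout=5.0)
```

The point carries over directly: with Trio you would wrap the call in `with trio.move_on_after(5):` instead, and the inner functions still need no timeout parameters.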

2 changes: 1 addition & 1 deletion docs/source/reference-io.rst
@@ -444,7 +444,7 @@ Socket objects
has begun. If :meth:`connect` is cancelled, and is unable to
abort the connection attempt, then it will:

- 1. forcibly close the socket to prevent accidental re-use
+ 1. forcibly close the socket to prevent accidental reuse
2. raise :exc:`~trio.Cancelled`.

tl;dr: if :meth:`connect` is cancelled then the socket is
2 changes: 1 addition & 1 deletion docs/source/tutorial.rst
@@ -1043,7 +1043,7 @@ available; ``receive_some`` returns as soon as *any* data is available. If
``data`` is small, then our operating systems / network / server will
*probably* keep it all together in a single chunk, but there's no
guarantee. If the server sends ``hello`` then we might get ``hello``,
- or ``hel`` ``lo``, or ``h`` ``e`` ``l`` ``l`` ``o``, or ... bottom
+ or ``he`` ``llo``, or ``h`` ``e`` ``l`` ``l`` ``o``, or ... bottom
(Inline review comment from a contributor: "This is the only real™ change and I presume this is fine.")

line, any time we're expecting more than one byte of data, we have to
be prepared to call ``receive_some`` multiple times.
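The receive loop the tutorial describes can be sketched as a small helper (`receive_exactly` is a hypothetical name; `stream` is assumed to expose a Trio-style async `receive_some` method):

```python
async def receive_exactly(stream, count: int) -> bytes:
    # receive_some may hand back any nonempty chunk (or b"" at EOF),
    # so we must keep calling it until `count` bytes have accumulated.
    buf = bytearray()
    while len(buf) < count:
        chunk = await stream.receive_some(count - len(buf))
        if not chunk:
            raise EOFError("stream closed before enough data arrived")
        buf += chunk
    return bytes(buf)
```

Whether the server's ``hello`` arrives as one chunk or five, the loop only returns once all five bytes are in hand.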

2 changes: 1 addition & 1 deletion trio/_core/_thread_cache.py
@@ -228,7 +228,7 @@ def start_thread_soon(

This is a low-level, no-frills interface, very similar to using
`threading.Thread` to spawn a thread directly. The main difference is
- that this function tries to re-use threads when possible, so it can be
+ that this function tries to reuse threads when possible, so it can be
a bit faster than `threading.Thread`.

Worker threads have the `~threading.Thread.daemon` flag set, which means
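The reuse idea this docstring describes can be sketched with stdlib threading. This is an illustrative cache only, not Trio's actual implementation; the `ThreadCache`/`start_thread_soon` names simply mirror the text:

```python
import queue
import threading


class ThreadCache:
    """Sketch: idle workers park on a queue instead of exiting."""

    def __init__(self):
        # Queue of per-worker job queues; each entry is an idle thread
        # waiting for its next (fn, deliver) pair.
        self._idle = queue.SimpleQueue()

    def start_thread_soon(self, fn, deliver):
        try:
            jobs = self._idle.get_nowait()  # reuse an idle worker if any
        except queue.Empty:
            jobs = queue.SimpleQueue()      # otherwise spawn a fresh one
            threading.Thread(
                target=self._worker, args=(jobs,), daemon=True
            ).start()
        jobs.put((fn, deliver))

    def _worker(self, jobs):
        while True:
            fn, deliver = jobs.get()
            try:
                result = fn()
            except BaseException as exc:
                result = exc
            deliver(result)
            self._idle.put(jobs)            # park again, ready for reuse
```

Reusing a parked thread skips the OS-level thread creation cost, which is the speedup the docstring claims over bare `threading.Thread`.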
2 changes: 1 addition & 1 deletion trio/_tests/test_testing.py
@@ -190,7 +190,7 @@ async def f2(seq):
assert record == [("f2", 0), ("f1", 1), ("f2", 2), ("f1", 3), ("f1", 4)]

seq = Sequencer()
- # Catches us if we try to re-use a sequence point:
+ # Catches us if we try to reuse a sequence point:
async with seq(0):
pass
with pytest.raises(RuntimeError):
3 changes: 1 addition & 2 deletions trio/_tests/type_tests/check_wraps.py
@@ -1,8 +1,7 @@
# https://github.com/python-trio/trio/issues/2775#issuecomment-1702267589
# (except platform independent...)
+ import typing_extensions

import trio
- import typing_extensions


async def fn(s: trio.SocketStream) -> None:
2 changes: 1 addition & 1 deletion trio/testing/_sequencer.py
@@ -63,7 +63,7 @@ async def main():
@asynccontextmanager
async def __call__(self, position: int) -> AsyncIterator[None]:
if position in self._claimed:
- raise RuntimeError(f"Attempted to re-use sequence point {position}")
+ raise RuntimeError(f"Attempted to reuse sequence point {position}")
if self._broken:
raise RuntimeError("sequence broken!")
self._claimed.add(position)
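The claimed-position check shown in this excerpt can be demonstrated with a cut-down, runnable version (`MiniSequencer` is a hypothetical name; unlike `trio.testing.Sequencer`, this sketch omits the part that blocks until earlier sequence points complete):

```python
from contextlib import asynccontextmanager


class MiniSequencer:
    """Sketch of just the reuse check from the excerpt above."""

    def __init__(self):
        self._claimed = set()

    @asynccontextmanager
    async def __call__(self, position: int):
        # Entering the same position twice is a programming error,
        # so fail loudly rather than silently reordering.
        if position in self._claimed:
            raise RuntimeError(f"Attempted to reuse sequence point {position}")
        self._claimed.add(position)
        yield
```

Usage matches the test above: `async with seq(0): ...` succeeds once, and a second `async with seq(0):` raises `RuntimeError`.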