
Uncap urllib3 version on Python < 3.10 #777

Draft · wants to merge 2 commits into base: master

Conversation

dgw commented Oct 15, 2023

Another maintainer of our downstream project and I have tested Python 3.8 & 3.9 with both OpenSSL 1.1.1 and 3.0.2, using vcrpy 5.1.0 and a forcibly upgraded urllib3 2.0.6. Based on those results, I'm optimistically hoping that this incompatibility is resolved and #699 (comment) has become true.

However, since the test suite already appears to fail on the current HEAD revision prior to applying this change, I'm not sure how to use the test results to definitively verify this patch (except by noting that in manual checks of the relevant environments I didn't see any new failures as compared to running the same tox env on master).

Downstream, we plan to unblock ourselves on the cassette cross-compatibility problem we're having by adding a step to our CI that installs urllib3 2.x before running tests—but if we're lucky, that will be only a temporary kludge.
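
For context, the mechanism a change like this typically hinges on is a PEP 508 environment marker in the package metadata. A hypothetical sketch of the kind of pin being discussed (not the actual diff from this PR):

```ini
; setup.cfg — hypothetical sketch, not this PR's actual diff
[options]
install_requires =
    ; capped state: urllib3 2.x only permitted on newer interpreters
    urllib3 <2; python_version < "3.10"
    urllib3; python_version >= "3.10"
```

Uncapping would collapse the two marker lines into a single unconditional `urllib3` requirement; the CI kludge mentioned above amounts instead to forcing the newer version after dependency resolution, e.g. `pip install 'urllib3>=2'` as a separate step.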

piiq commented Nov 10, 2023

Hi 👋
First of all, a big thank you to all of the maintainers and contributors of vcrpy. Great tool, does its job.

The <2 pin on urllib3 is causing some trouble for us too, and it would be great if this PR could solve it. I wanted to ask @kevin1024 whether this PR actually solves the issue. How can I contribute to testing it out? It would be great if a solution to unpin urllib3 were found and a new version of vcrpy released.

Cheers and thanks 🙏

jairhenrique (Collaborator)

@dgw, can you rebase your branch on master to run the CI tests, please?

hartwork (Collaborator) commented Dec 8, 2023

If after a rebase CI is all green, this can be merged. If it's not, it may be the same issue as back then; more details at #699 (comment).

Commit message: "urllib3 2.0 works with OpenSSL 1.1.1 and 3.0 on up-to-date builds of Python 3.8 & 3.9. Left them on their own line for stylistic reasons, just because putting all of the py38-py312 permutations on one line was *very* long."
dgw (Author) commented Dec 8, 2023

Rebased!

> If after a rebase CI is all green

I hope this means the urllib3 envs, and does not apply to the boto3 env that is failing under py3.11 and pypy3.10 on master?

hartwork (Collaborator) commented Dec 8, 2023

@dgw I'm not 100% sure what you mean. We can only merge things that are fully green; if there is unrelated breakage elsewhere, that breakage needs dedicated fixing outside of this pull request first. At least that would be my take.

codecov-commenter commented Dec 8, 2023

Codecov Report

All modified and coverable lines are covered by tests ✅

Comparison is base (c062c9f) 90.10% compared to head (ea2ba18) 90.10%.


Additional details and impacted files
@@           Coverage Diff           @@
##           master     #777   +/-   ##
=======================================
  Coverage   90.10%   90.10%           
=======================================
  Files          27       27           
  Lines        1809     1809           
  Branches      335      335           
=======================================
  Hits         1630     1630           
  Misses        134      134           
  Partials       45       45           


dgw (Author) commented Dec 8, 2023

@hartwork I mean that every commit with a check status on master has been ❌ since c9da7a1, and I have never been able to cleanly build this patch because of that. Can't even claim that "it was green when I first opened the PR".

But it appears I will indeed need to look at the test suite and reevaluate something; 3.8 and 3.9 failed on this patch after rebasing. (pypy-3.10 is failing on master, therefore likely unrelated to the patch.)

Probably won't get to that today. Would you like this PR placed in draft state until it's ready for another look?

hartwork (Collaborator) commented Dec 8, 2023

> @hartwork I mean that every commit with a check status on master has been ❌ since c9da7a1, and I have never been able to cleanly build this patch because of that. Can't even claim that "it was green when I first opened the PR".

@dgw yes, that is understood; I should have put it more clearly, sorry. That red-CI-from-the-beginning state is a killer for pull requests and something we need to get rid of with priority. I have created pull request #786 on this topic just now, and I hope that @jairhenrique is okay with this approach also.

> But it appears I will indeed need to look at the test suite and reevaluate something; 3.8 and 3.9 failed on this patch after rebasing. Probably won't get to that today. Would you like this PR placed in draft state until it's ready for another look?

It's the same thing that breaks PyPy 3.10, apparently. I'd be good with status "draft"; I support the idea.

dgw marked this pull request as draft December 8, 2023 21:31
dgw (Author) commented Dec 8, 2023

In reference to the errors here, the `ConnectionResetError` shown in CI logs indeed seems similar to what was mentioned (`tests/integration/test_urllib3.py::test_post[https]`) back in May. I'll leave myself a link to #699 (comment) as reference; that and your (@hartwork) earlier link should give me a starting point when I sit down to examine the failure more closely.

Also including the 3.9 failure log in its entirety here, rather than digging through GHA again:
tox: py39-urllib3-2
  py39-urllib3-2: commands[0]> ./runtests.sh --cov=./vcr --cov-branch --cov-report=xml --cov-append -m 'not online'
  ============================= test session starts ==============================
  platform linux -- Python 3.9.18, pytest-7.4.3, pluggy-1.3.0
  cachedir: .tox/py39-urllib3-2/.pytest_cache
  rootdir: /home/runner/work/vcrpy/vcrpy
  configfile: pyproject.toml
  plugins: cov-4.1.0, httpbin-2.0.0
  collected 264 items / 14 deselected / 8 skipped / 250 selected
  
  tests/integration/test_basic.py .....                                    [  2%]
  tests/integration/test_config.py .                                       [  2%]
  tests/integration/test_filter.py ..........                              [  6%]
  tests/integration/test_ignore.py ....                                    [  8%]
  tests/integration/test_matchers.py ..............                        [ 13%]
  tests/integration/test_multiple.py .                                     [ 14%]
  tests/integration/test_record_mode.py ........                           [ 17%]
  tests/integration/test_register_persister.py ...                         [ 18%]
  tests/integration/test_register_serializer.py .                          [ 18%]
  tests/integration/test_request.py ..                                     [ 19%]
  tests/integration/test_stubs.py ....                                     [ 21%]
  tests/integration/test_urllib2.py ........                               [ 24%]
  tests/integration/test_urllib3.py .......                                [ 27%]
  tests/integration/test_urllib2.py ........                               [ 30%]
  tests/integration/test_urllib3.py .....F.                                [ 33%]
  tests/integration/test_urllib2.py .                                      [ 33%]
  tests/integration/test_urllib3.py ...                                    [ 34%]
  tests/unit/test_cassettes.py ...............................             [ 47%]
  tests/unit/test_errors.py ....                                           [ 48%]
  tests/unit/test_filters.py ........................                      [ 58%]
  tests/unit/test_json_serializer.py .                                     [ 58%]
  tests/unit/test_matchers.py ............................                 [ 70%]
  tests/unit/test_migration.py ...                                         [ 71%]
  tests/unit/test_persist.py ....                                          [ 72%]
  tests/unit/test_request.py .................                             [ 79%]
  tests/unit/test_response.py ....                                         [ 81%]
  tests/unit/test_serialize.py ...............                             [ 87%]
  tests/unit/test_stubs.py .                                               [ 87%]
  tests/unit/test_unittest.py .......                                      [ 90%]
  tests/unit/test_vcr.py .......................                           [ 99%]
  tests/unit/test_vcr_import.py .                                          [100%]
  
  =================================== FAILURES ===================================
  _______________________________ test_post[https] _______________________________
  
  self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7fd0646044f0>
  method = 'POST', url = '/post', body = {'key1': 'value1', 'key2': 'value2'}
  headers = HTTPHeaderDict({})
  retries = Retry(total=3, connect=None, read=None, redirect=None, status=None)
  redirect = False, assert_same_host = False, timeout = <_TYPE_DEFAULT.token: -1>
  pool_timeout = None, release_conn = True, chunked = False, body_pos = None
  preload_content = True, decode_content = True, response_kw = {}
  parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/post', query=None, fragment=None)
  destination_scheme = None, conn = None, release_this_conn = True
  http_tunnel_required = False, err = None, clean_exit = False
  
      def urlopen(  # type: ignore[override]
          self,
          method: str,
          url: str,
          body: _TYPE_BODY | None = None,
          headers: typing.Mapping[str, str] | None = None,
          retries: Retry | bool | int | None = None,
          redirect: bool = True,
          assert_same_host: bool = True,
          timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT,
          pool_timeout: int | None = None,
          release_conn: bool | None = None,
          chunked: bool = False,
          body_pos: _TYPE_BODY_POSITION | None = None,
          preload_content: bool = True,
          decode_content: bool = True,
          **response_kw: typing.Any,
      ) -> BaseHTTPResponse:
          """
          Get a connection from the pool and perform an HTTP request. This is the
          lowest level call for making a request, so you'll need to specify all
          the raw details.
      
          .. note::
      
             More commonly, it's appropriate to use a convenience method
             such as :meth:`request`.
      
          .. note::
      
             `release_conn` will only behave as expected if
             `preload_content=False` because we want to make
             `preload_content=False` the default behaviour someday soon without
             breaking backwards compatibility.
      
          :param method:
              HTTP request method (such as GET, POST, PUT, etc.)
      
          :param url:
              The URL to perform the request on.
      
          :param body:
              Data to send in the request body, either :class:`str`, :class:`bytes`,
              an iterable of :class:`str`/:class:`bytes`, or a file-like object.
      
          :param headers:
              Dictionary of custom headers to send, such as User-Agent,
              If-None-Match, etc. If None, pool headers are used. If provided,
              these headers completely replace any pool-specific headers.
      
          :param retries:
              Configure the number of retries to allow before raising a
              :class:`~urllib3.exceptions.MaxRetryError` exception.
      
              Pass ``None`` to retry until you receive a response. Pass a
              :class:`~urllib3.util.retry.Retry` object for fine-grained control
              over different types of retries.
              Pass an integer number to retry connection errors that many times,
              but no other types of errors. Pass zero to never retry.
      
              If ``False``, then retries are disabled and any exception is raised
              immediately. Also, instead of raising a MaxRetryError on redirects,
              the redirect response will be returned.
      
          :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int.
      
          :param redirect:
              If True, automatically handle redirects (status codes 301, 302,
              303, 307, 308). Each redirect counts as a retry. Disabling retries
              will disable redirect, too.
      
          :param assert_same_host:
              If ``True``, will make sure that the host of the pool requests is
              consistent else will raise HostChangedError. When ``False``, you can
              use the pool on an HTTP proxy and request foreign hosts.
      
          :param timeout:
              If specified, overrides the default timeout for this one
              request. It may be a float (in seconds) or an instance of
              :class:`urllib3.util.Timeout`.
      
          :param pool_timeout:
              If set and the pool is set to block=True, then this method will
              block for ``pool_timeout`` seconds and raise EmptyPoolError if no
              connection is available within the time period.
      
          :param bool preload_content:
              If True, the response's body will be preloaded into memory.
      
          :param bool decode_content:
              If True, will attempt to decode the body based on the
              'content-encoding' header.
      
          :param release_conn:
              If False, then the urlopen call will not release the connection
              back into the pool once a response is received (but will release if
              you read the entire contents of the response such as when
              `preload_content=True`). This is useful if you're not preloading
              the response's content immediately. You will need to call
              ``r.release_conn()`` on the response ``r`` to return the connection
              back into the pool. If None, it takes the value of ``preload_content``
              which defaults to ``True``.
      
          :param bool chunked:
              If True, urllib3 will send the body using chunked transfer
              encoding. Otherwise, urllib3 will send the body using the standard
              content-length form. Defaults to False.
      
          :param int body_pos:
              Position to seek to in file-like body in the event of a retry or
              redirect. Typically this won't need to be set because urllib3 will
              auto-populate the value when needed.
          """
          parsed_url = parse_url(url)
          destination_scheme = parsed_url.scheme
      
          if headers is None:
              headers = self.headers
      
          if not isinstance(retries, Retry):
              retries = Retry.from_int(retries, redirect=redirect, default=self.retries)
      
          if release_conn is None:
              release_conn = preload_content
      
          # Check host
          if assert_same_host and not self.is_same_host(url):
              raise HostChangedError(self, url, retries)
      
          # Ensure that the URL we're connecting to is properly encoded
          if url.startswith("/"):
              url = to_str(_encode_target(url))
          else:
              url = to_str(parsed_url.url)
      
          conn = None
      
          # Track whether `conn` needs to be released before
          # returning/raising/recursing. Update this variable if necessary, and
          # leave `release_conn` constant throughout the function. That way, if
          # the function recurses, the original value of `release_conn` will be
          # passed down into the recursive call, and its value will be respected.
          #
          # See issue #651 [1] for details.
          #
          # [1] <https://github.com/urllib3/urllib3/issues/651>
          release_this_conn = release_conn
      
          http_tunnel_required = connection_requires_http_tunnel(
              self.proxy, self.proxy_config, destination_scheme
          )
      
          # Merge the proxy headers. Only done when not using HTTP CONNECT. We
          # have to copy the headers dict so we can safely change it without those
          # changes being reflected in anyone else's copy.
          if not http_tunnel_required:
              headers = headers.copy()  # type: ignore[attr-defined]
              headers.update(self.proxy_headers)  # type: ignore[union-attr]
      
          # Must keep the exception bound to a separate variable or else Python 3
          # complains about UnboundLocalError.
          err = None
      
          # Keep track of whether we cleanly exited the except block. This
          # ensures we do proper cleanup in finally.
          clean_exit = False
      
          # Rewind body position, if needed. Record current position
          # for future rewinds in the event of a redirect/retry.
          body_pos = set_file_position(body, body_pos)
      
          try:
              # Request a connection from the queue.
              timeout_obj = self._get_timeout(timeout)
              conn = self._get_conn(timeout=pool_timeout)
      
              conn.timeout = timeout_obj.connect_timeout  # type: ignore[assignment]
      
              # Is this a closed/new connection that requires CONNECT tunnelling?
              if self.proxy is not None and http_tunnel_required and conn.is_closed:
                  try:
                      self._prepare_proxy(conn)
                  except (BaseSSLError, OSError, SocketTimeout) as e:
                      self._raise_timeout(
                          err=e, url=self.proxy.url, timeout_value=conn.timeout
                      )
                      raise
      
              # If we're going to release the connection in ``finally:``, then
              # the response doesn't need to know about the connection. Otherwise
              # it will also try to release it and we'll have a double-release
              # mess.
              response_conn = conn if not release_conn else None
      
              # Make the request on the HTTPConnection object
  >           response = self._make_request(
                  conn,
                  method,
                  url,
                  timeout=timeout_obj,
                  body=body,
                  headers=headers,
                  chunked=chunked,
                  retries=retries,
                  response_conn=response_conn,
                  preload_content=preload_content,
                  decode_content=decode_content,
                  **response_kw,
              )
  
  .tox/py39-urllib3-2/lib/python3.9/site-packages/urllib3/connectionpool.py:790: 
  _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
  .tox/py39-urllib3-2/lib/python3.9/site-packages/urllib3/connectionpool.py:536: in _make_request
      response = conn.getresponse()
  vcr/stubs/__init__.py:285: in getresponse
      response = self.real_connection.getresponse()
  .tox/py39-urllib3-2/lib/python3.9/site-packages/urllib3/connection.py:461: in getresponse
      httplib_response = super().getresponse()
  /opt/hostedtoolcache/Python/3.9.18/x64/lib/python3.9/http/client.py:1377: in getresponse
      response.begin()
  /opt/hostedtoolcache/Python/3.9.18/x64/lib/python3.9/http/client.py:339: in begin
      self.headers = self.msg = parse_headers(self.fp)
  /opt/hostedtoolcache/Python/3.9.18/x64/lib/python3.9/http/client.py:236: in parse_headers
      headers = _read_headers(fp)
  /opt/hostedtoolcache/Python/3.9.18/x64/lib/python3.9/http/client.py:216: in _read_headers
      line = fp.readline(_MAXLINE + 1)
  /opt/hostedtoolcache/Python/3.9.18/x64/lib/python3.9/socket.py:704: in readinto
      return self._sock.recv_into(b)
  /opt/hostedtoolcache/Python/3.9.18/x64/lib/python3.9/ssl.py:1275: in recv_into
      return self.read(nbytes, buffer)
  _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
  
  self = <ssl.SSLSocket [closed] fd=-1, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=6>
  len = 8192, buffer = <memory at 0x7fd0656cbdc0>
  
      def read(self, len=1024, buffer=None):
          """Read up to LEN bytes and return them.
          Return zero-length string on EOF."""
      
          self._checkClosed()
          if self._sslobj is None:
              raise ValueError("Read on closed or unwrapped SSL socket.")
          try:
              if buffer is not None:
  >               return self._sslobj.read(len, buffer)
  E               ConnectionResetError: [Errno 104] Connection reset by peer
  
  /opt/hostedtoolcache/Python/3.9.18/x64/lib/python3.9/ssl.py:1133: ConnectionResetError
  
  During handling of the above exception, another exception occurred:
  
  tmpdir = local('/tmp/pytest-of-root/pytest-8/test_post_https_0')
  httpbin_both = <pytest_httpbin.serve.SecureServer object at 0x7fd065737790>
  verify_pool_mgr = <urllib3.poolmanager.PoolManager object at 0x7fd064604700>
  
      def test_post(tmpdir, httpbin_both, verify_pool_mgr):
          """Ensure that we can post and cache the results"""
          data = {"key1": "value1", "key2": "value2"}
          url = httpbin_both.url + "/post"
          with vcr.use_cassette(str(tmpdir.join("verify_pool_mgr.yaml"))):
  >           req1 = verify_pool_mgr.request("POST", url, data).data
  
  tests/integration/test_urllib3.py:93: 
  _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
  .tox/py39-urllib3-2/lib/python3.9/site-packages/urllib3/_request_methods.py:118: in request
      return self.request_encode_body(
  .tox/py39-urllib3-2/lib/python3.9/site-packages/urllib3/_request_methods.py:217: in request_encode_body
      return self.urlopen(method, url, **extra_kw)
  .tox/py39-urllib3-2/lib/python3.9/site-packages/urllib3/poolmanager.py:444: in urlopen
      response = conn.urlopen(method, u.request_uri, **kw)
  .tox/py39-urllib3-2/lib/python3.9/site-packages/urllib3/connectionpool.py:844: in urlopen
      retries = retries.increment(
  .tox/py39-urllib3-2/lib/python3.9/site-packages/urllib3/util/retry.py:470: in increment
      raise reraise(type(error), error, _stacktrace)
  .tox/py39-urllib3-2/lib/python3.9/site-packages/urllib3/util/util.py:38: in reraise
      raise value.with_traceback(tb)
  .tox/py39-urllib3-2/lib/python3.9/site-packages/urllib3/connectionpool.py:790: in urlopen
      response = self._make_request(
  .tox/py39-urllib3-2/lib/python3.9/site-packages/urllib3/connectionpool.py:536: in _make_request
      response = conn.getresponse()
  vcr/stubs/__init__.py:285: in getresponse
      response = self.real_connection.getresponse()
  .tox/py39-urllib3-2/lib/python3.9/site-packages/urllib3/connection.py:461: in getresponse
      httplib_response = super().getresponse()
  /opt/hostedtoolcache/Python/3.9.18/x64/lib/python3.9/http/client.py:1377: in getresponse
      response.begin()
  /opt/hostedtoolcache/Python/3.9.18/x64/lib/python3.9/http/client.py:339: in begin
      self.headers = self.msg = parse_headers(self.fp)
  /opt/hostedtoolcache/Python/3.9.18/x64/lib/python3.9/http/client.py:236: in parse_headers
      headers = _read_headers(fp)
  /opt/hostedtoolcache/Python/3.9.18/x64/lib/python3.9/http/client.py:216: in _read_headers
      line = fp.readline(_MAXLINE + 1)
  /opt/hostedtoolcache/Python/3.9.18/x64/lib/python3.9/socket.py:704: in readinto
      return self._sock.recv_into(b)
  /opt/hostedtoolcache/Python/3.9.18/x64/lib/python3.9/ssl.py:1275: in recv_into
      return self.read(nbytes, buffer)
  _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
  
  self = <ssl.SSLSocket [closed] fd=-1, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=6>
  len = 8192, buffer = <memory at 0x7fd0656cbdc0>
  
      def read(self, len=1024, buffer=None):
          """Read up to LEN bytes and return them.
          Return zero-length string on EOF."""
      
          self._checkClosed()
          if self._sslobj is None:
              raise ValueError("Read on closed or unwrapped SSL socket.")
          try:
              if buffer is not None:
  >               return self._sslobj.read(len, buffer)
  E               urllib3.exceptions.ProtocolError: ('Connection aborted.', ConnectionResetError(104, 'Connection reset by peer'))
  
  /opt/hostedtoolcache/Python/3.9.18/x64/lib/python3.9/ssl.py:1133: ProtocolError
  ----------------------------- Captured stderr call -----------------------------
  127.0.0.1 - - [08/Dec/2023 21:10:25] "POST /post HTTP/1.1" 501 184
  
  ---------- coverage: platform linux, python 3.9.18-final-0 -----------
  Coverage XML written to file coverage.xml
  
  =========================== short test summary info ============================
  FAILED tests/integration/test_urllib3.py::test_post[https] - urllib3.exceptio...
  =========== 1 failed, 249 passed, 8 skipped, 14 deselected in 3.55s ============
  py39-urllib3-2: exit 1 (4.17 seconds) /home/runner/work/vcrpy/vcrpy> ./runtests.sh --cov=./vcr --cov-branch --cov-report=xml --cov-append -m 'not online' pid=5433
.pkg: _exit> python /opt/hostedtoolcache/Python/3.9.18/x64/lib/python3.9/site-packages/pyproject_api/_backend.py True setuptools.build_meta __legacy__
  py39-requests-urllib3-1: OK (7.17=setup[1.36]+cmd[5.81] seconds)
  py39-httplib2: OK (4.52=setup[0.84]+cmd[3.68] seconds)
  py39-urllib3-1: OK (4.69=setup[0.84]+cmd[3.85] seconds)
  py39-tornado4: OK (4.34=setup[0.88]+cmd[3.46] seconds)
  py39-boto3: OK (4.78=setup[0.87]+cmd[3.91] seconds)
  py39-aiohttp: OK (4.47=setup[0.84]+cmd[3.63] seconds)
  py39-httpx: OK (4.40=setup[0.85]+cmd[3.54] seconds)
  py39-requests-urllib3-2: FAIL code 1 (7.16=setup[0.84]+cmd[6.32] seconds)
  py39-urllib3-2: FAIL code 1 (5.04=setup[0.87]+cmd[4.17] seconds)

> I'd be good with status "draft", I support the idea.

Great, I've marked the PR as a draft and will be sure to comment again when (if?) I have something worth looking at again. Meanwhile I will keep an eye on the status of #786.

dgw (Author) commented Oct 31, 2024

I started rebasing this to try again, just with Python 3.9 this time (rest in peace, 3.8), and got stuck locally when I tried to prepare my venv for running tests with `urllib3>=2`. It turns out that botocore has always capped urllib3's version to something below 2, and it allows urllib3 2.x only on Python 3.10+ as well.
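
The gate described above can be illustrated with a tiny check (a hypothetical helper for illustration, not botocore's actual metadata logic):

```python
import sys

def botocore_allows_urllib3_2(version_info=sys.version_info):
    """Return True when the interpreter is new enough for botocore
    to permit urllib3 2.x — a hypothetical helper mirroring the
    constraint described above, not botocore's actual code."""
    return version_info >= (3, 10)

print(botocore_allows_urllib3_2((3, 9)))   # → False
print(botocore_allows_urllib3_2((3, 12)))  # → True
```

In practice this constraint lives in botocore's dependency metadata (presumably via environment markers), so on Python 3.9 pip will hold urllib3 below 2 whenever boto3 is in the install set.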

How to keep boto3 and `urllib3>=2` apart in the GitHub Actions matrix is not at all clear to me. Admittedly I have only a basic understanding of how to define and manipulate GHA matrices—but I really don't know how one would go about skipping a requirement defined in `setup.cfg` based on matrix parameters.

The good news is that if I comment out boto3 in the `tests` extra and `pip install -e '.[tests]' 'urllib3>=2'` in my Python 3.9 venv, running `./runtests.sh` collects `478 items / 1 skipped` and succeeds with `447 passed, 32 skipped in 25.21s`.

Presumably the skipped item at collection is `integration/test_boto3.py`. The skipped individual tests are all in `integration/test_tornado.py`. `importorskip` and `skipif` annotations in the test files appear to cover all of these.

My rebased branch is (for now) available at https://github.com/dgw/vcrpy/tree/rebase-uncap-urllib3 — pending some more thought (or hints) at how to finish setting up the GHA matrix. Alternatively we can just drop this whole idea. 😅 Python 3.9 has less than a year to live at this point, and it hasn't been as big of a deal as I thought it might be to use a custom fork of vcrpy for our CI upstream. This might be too much effort for not enough gains.
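
For what it's worth, one way to handle the boto3-vs-urllib3 clash in a workflow is a conditional step keyed on the matrix. A hypothetical sketch (not vcrpy's actual CI; the `tox-env` matrix key is made up):

```yaml
# Hypothetical workflow fragment, not vcrpy's actual CI.
# On Python < 3.10, drop boto3 so its botocore pin cannot hold
# urllib3 below 2.0, then force the version under test.
- name: Install test dependencies
  run: pip install -e '.[tests]'
- name: Swap in urllib3 2.x where botocore would block it
  if: ${{ matrix.python-version == '3.9' && matrix.tox-env == 'urllib3-2' }}
  run: |
    pip uninstall -y boto3 botocore
    pip install 'urllib3>=2'
```

This mirrors the "temporary kludge" described earlier in the thread, rather than solving the metadata question of skipping a `setup.cfg` requirement per matrix entry.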
