Pulling large dataframes using client.list_rows and rows.to_dataframe() leads to duplicates when setting max_results and start_index #1569
Labels

- `api: bigquery` — Issues related to the googleapis/python-bigquery API.
- `priority: p3` — Desirable enhancement or fix. May not be included in next release.
- `type: bug` — Error or flaw in code with unintended results or allowing sub-optimal usage patterns.
Environment details

```
google-api-core              2.11.0
google-auth                  2.18.0
google-cloud-bigquery        3.10.0
google-cloud-bigquery-storage 2.19.1
google-cloud-core            2.3.2
google-crc32c                1.5.0
google-resumable-media       2.5.0
```
Steps to reproduce
Code example
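The original code example was not captured here. A minimal sketch of the pattern the title describes — pulling a large table page by page with `client.list_rows(start_index=..., max_results=...)` and concatenating the per-page dataframes — might look like the following. The table id and page size are assumptions, not taken from the issue:

```python
PAGE_SIZE = 100_000  # assumed page size, not from the issue


def page_bounds(total_rows, page_size):
    """Yield non-overlapping (start_index, max_results) pairs that
    together cover the table exactly once."""
    for start in range(0, total_rows, page_size):
        yield start, min(page_size, total_rows - start)


def fetch_all(client, table_id):
    """Fetch every row of `table_id` as one pandas DataFrame using the
    paginated list_rows pattern described in the title."""
    # Imports kept local so the pure pagination helper above can be
    # exercised without BigQuery installed.
    import pandas as pd

    table = client.get_table(table_id)
    frames = []
    for start, size in page_bounds(table.num_rows, PAGE_SIZE):
        rows = client.list_rows(table, start_index=start, max_results=size)
        frames.append(rows.to_dataframe())
    # One would expect len(df) == table.num_rows with no duplicates,
    # but the issue reports duplicate rows across page boundaries.
    return pd.concat(frames, ignore_index=True)
```

Note that `page_bounds` itself produces disjoint ranges, so any duplicates seen in the concatenated DataFrame would come from the client/API side of `list_rows`, not from overlapping requests.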
These columns are not duplicated in the actual publications table, which can be confirmed with this SQL:
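The SQL statement itself was not captured here. A duplicate check of the kind referenced above might be shaped like this — the column name `id` and the fully qualified table name are hypothetical placeholders, not the author's actual query:

```python
# Hypothetical duplicate check: returns one row per id that appears
# more than once in the table. Zero result rows would confirm the
# duplicates originate in the client-side pagination, not the table.
DUP_CHECK_SQL = """
SELECT id, COUNT(*) AS n
FROM `my-project.my_dataset.publications`
GROUP BY id
HAVING n > 1
"""

# Usage (requires credentials):
#   client = bigquery.Client()
#   assert client.query(DUP_CHECK_SQL).result().total_rows == 0
```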