chore: Release v2.0.0 (#64)
* chore: release v2.0.0

* Update CHANGELOG.md

* Replace PROJECT_ID with GOOGLE_CLOUD_PROJECT in test

* Do not add google.cloud.bigquery as namespace package

* Install the library as non-editable in tests

This avoids import errors from google.cloud.bigquery.* namespace.

* Fix test coverage plugin paths

* Regenerate code with different namespace (bigquery_storage)

* Adjust import paths to bigquery_storage namespace

* Adjust docs to bigquery_storage namespace

* Adjust UPGRADING guide to changed namespace

Co-authored-by: Tim Swast <swast@google.com>
plamut and tswast committed Sep 29, 2020
1 parent 0a0eb2e commit 6254bf2
Showing 41 changed files with 173 additions and 171 deletions.
11 changes: 11 additions & 0 deletions CHANGELOG.md
@@ -4,6 +4,17 @@

[1]: https://pypi.org/project/google-cloud-bigquery-storage/#history

## 2.0.0

09-24-2020 08:21 PDT

### Implementation Changes

- Transition the library to microgenerator. ([#62](https://github.com/googleapis/python-bigquery-storage/pull/62))
  This is a **breaking change** that introduces several **method signature changes** and **drops support
  for Python 2.7 and 3.5**. See [migration guide](https://googleapis.dev/python/bigquerystorage/latest/UPGRADING.html)
  for more info.

## 1.1.0

09-14-2020 08:51 PDT
19 changes: 10 additions & 9 deletions UPGRADING.md
@@ -33,9 +33,10 @@ The 2.0.0 release requires Python 3.6+.

## Import Path

- The library was moved into `google.cloud.bigquery` namespace. It is recommended
- to use this path in order to reduce the chance of future compatibility issues
- in case the library is restuctured internally.
+ The library's top-level namespace is `google.cloud.bigquery_storage`. Importing
+ from `google.cloud.bigquery_storage_v1` still works, but it is advisable to use
+ the `google.cloud.bigquery_storage` path in order to reduce the chance of future
+ compatibility issues should the library be restructured internally.

**Before:**
```py
from google.cloud.bigquery_storage_v1 import BigQueryReadClient
```

**After:**
```py
- from google.cloud.bigquery.storage import BigQueryReadClient
+ from google.cloud.bigquery_storage import BigQueryReadClient
```
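The re-export pattern behind this recommendation can be sketched with stand-in modules (the `mylib`/`mylib_v1` names below are hypothetical, not part of the library):

```python
import sys
import types

# Hypothetical sketch: a versioned module whose public names are re-exported
# by a stable top-level module, mirroring how google.cloud.bigquery_storage
# re-exports from google.cloud.bigquery_storage_v1.
v1 = types.ModuleType("mylib_v1")

class BigQueryReadClient:  # stand-in for the real client class
    pass

v1.BigQueryReadClient = BigQueryReadClient
sys.modules["mylib_v1"] = v1

# The stable top-level module simply aliases the versioned names.
top = types.ModuleType("mylib")
top.BigQueryReadClient = v1.BigQueryReadClient
sys.modules["mylib"] = top

# Both import paths resolve to the same object, so code written against the
# stable path keeps working even if the versioned internals are reshuffled.
from mylib import BigQueryReadClient as stable_client
from mylib_v1 import BigQueryReadClient as versioned_client
print(stable_client is versioned_client)  # → True
```

This is why the guide prefers the unversioned path: only the alias needs to change if internals move.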


@@ -65,7 +66,7 @@ data_format = BigQueryReadClient.enums.DataFormat.ARROW

**After:**
```py
- from google.cloud.bigquery.storage import types
+ from google.cloud.bigquery_storage import types

data_format = types.DataFormat.ARROW
```
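The relocation from `BigQueryReadClient.enums.DataFormat` to `types.DataFormat` can be illustrated with a stdlib stand-in (the class below is hypothetical; the real type is a proto-plus enum, though the member values shown match the public API):

```python
import enum

# Stand-in for google.cloud.bigquery_storage.types.DataFormat: in 2.0 the
# enum lives on the types module instead of hanging off the client class.
class DataFormat(enum.IntEnum):
    DATA_FORMAT_UNSPECIFIED = 0
    AVRO = 1
    ARROW = 2

# 2.0-style access: reference the enum through the types container rather
# than through client.enums.
data_format = DataFormat.ARROW
print(data_format.name, data_format.value)  # → ARROW 2
```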
@@ -157,13 +158,13 @@ session = client.create_read_session(

**After:**
```py
- from google.cloud.bigquery import storage
+ from google.cloud import bigquery_storage

- client = storage.BigQueryReadClient()
+ client = bigquery_storage.BigQueryReadClient()

- requested_session = storage.types.ReadSession(
+ requested_session = bigquery_storage.types.ReadSession(
  table="projects/PROJECT_ID/datasets/DATASET_ID/tables/TABLE_ID",
- data_format=storage.types.DataFormat.ARROW,
+ data_format=bigquery_storage.types.DataFormat.ARROW,
)
session = client.create_read_session(
request={
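The microgenerated surface also replaces flattened positional arguments with a single `request` argument; the shape of that call can be sketched with a hypothetical stand-in (not the real client method):

```python
# Hypothetical stand-in for the generated create_read_session method: the
# microgenerator accepts one ``request`` mapping (or message) instead of
# separate positional parameters.
def create_read_session(*, request):
    # Echo back the fields a real call would send to the API.
    return {
        "parent": request["parent"],
        "table": request["read_session"]["table"],
    }

session = create_read_session(
    request={
        "parent": "projects/your-billing-project-id",
        "read_session": {
            "table": "projects/PROJECT_ID/datasets/DATASET_ID/tables/TABLE_ID",
        },
    }
)
print(session["parent"])  # → projects/your-billing-project-id
```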
File renamed without changes.
@@ -1,6 +1,6 @@
Services for Google Cloud Bigquery Storage v1 API
=================================================

- .. automodule:: google.cloud.bigquery.storage_v1.services.big_query_read
+ .. automodule:: google.cloud.bigquery_storage_v1.services.big_query_read
:members:
:inherited-members:
@@ -1,5 +1,5 @@
Types for Google Cloud Bigquery Storage v1 API
==============================================

- .. automodule:: google.cloud.bigquery.storage_v1.types
+ .. automodule:: google.cloud.bigquery_storage_v1.types
:members:
6 changes: 3 additions & 3 deletions docs/index.rst
@@ -18,9 +18,9 @@ API Reference
.. toctree::
:maxdepth: 2

- storage_v1/library
- storage_v1/services
- storage_v1/types
+ bigquery_storage_v1/library
+ bigquery_storage_v1/services
+ bigquery_storage_v1/types


Migration Guide
51 changes: 0 additions & 51 deletions google/cloud/bigquery/storage_v1/__init__.py

This file was deleted.

@@ -16,22 +16,22 @@
#

from google.cloud.bigquery_storage_v1 import BigQueryReadClient
- from google.cloud.bigquery_storage_v1 import types
+ from google.cloud.bigquery_storage_v1 import gapic_types as types
from google.cloud.bigquery_storage_v1 import __version__
- from google.cloud.bigquery.storage_v1.types.arrow import ArrowRecordBatch
- from google.cloud.bigquery.storage_v1.types.arrow import ArrowSchema
- from google.cloud.bigquery.storage_v1.types.avro import AvroRows
- from google.cloud.bigquery.storage_v1.types.avro import AvroSchema
- from google.cloud.bigquery.storage_v1.types.storage import CreateReadSessionRequest
- from google.cloud.bigquery.storage_v1.types.storage import ReadRowsRequest
- from google.cloud.bigquery.storage_v1.types.storage import ReadRowsResponse
- from google.cloud.bigquery.storage_v1.types.storage import SplitReadStreamRequest
- from google.cloud.bigquery.storage_v1.types.storage import SplitReadStreamResponse
- from google.cloud.bigquery.storage_v1.types.storage import StreamStats
- from google.cloud.bigquery.storage_v1.types.storage import ThrottleState
- from google.cloud.bigquery.storage_v1.types.stream import DataFormat
- from google.cloud.bigquery.storage_v1.types.stream import ReadSession
- from google.cloud.bigquery.storage_v1.types.stream import ReadStream
+ from google.cloud.bigquery_storage_v1.types.arrow import ArrowRecordBatch
+ from google.cloud.bigquery_storage_v1.types.arrow import ArrowSchema
+ from google.cloud.bigquery_storage_v1.types.avro import AvroRows
+ from google.cloud.bigquery_storage_v1.types.avro import AvroSchema
+ from google.cloud.bigquery_storage_v1.types.storage import CreateReadSessionRequest
+ from google.cloud.bigquery_storage_v1.types.storage import ReadRowsRequest
+ from google.cloud.bigquery_storage_v1.types.storage import ReadRowsResponse
+ from google.cloud.bigquery_storage_v1.types.storage import SplitReadStreamRequest
+ from google.cloud.bigquery_storage_v1.types.storage import SplitReadStreamResponse
+ from google.cloud.bigquery_storage_v1.types.storage import StreamStats
+ from google.cloud.bigquery_storage_v1.types.storage import ThrottleState
+ from google.cloud.bigquery_storage_v1.types.stream import DataFormat
+ from google.cloud.bigquery_storage_v1.types.stream import ReadSession
+ from google.cloud.bigquery_storage_v1.types.stream import ReadStream

__all__ = (
"__version__",
File renamed without changes.
12 changes: 6 additions & 6 deletions google/cloud/bigquery_storage_v1/client.py
@@ -23,8 +23,8 @@

import google.api_core.gapic_v1.method

- from google.cloud.bigquery import storage_v1
  from google.cloud.bigquery_storage_v1 import reader
+ from google.cloud.bigquery_storage_v1.services import big_query_read


_SCOPES = (
@@ -33,7 +33,7 @@
)


- class BigQueryReadClient(storage_v1.BigQueryReadClient):
+ class BigQueryReadClient(big_query_read.BigQueryReadClient):
"""Client for interacting with BigQuery Storage API.
The BigQuery storage API can be used to read data stored in BigQuery.
@@ -60,9 +60,9 @@ def read_rows(
to read data.
Example:
- >>> from google.cloud.bigquery import storage
+ >>> from google.cloud import bigquery_storage
  >>>
- >>> client = storage.BigQueryReadClient()
+ >>> client = bigquery_storage.BigQueryReadClient()
>>>
>>> # TODO: Initialize ``table``:
>>> table = "projects/{}/datasets/{}/tables/{}".format(
@@ -74,9 +74,9 @@
>>> # TODO: Initialize `parent`:
>>> parent = 'projects/your-billing-project-id'
>>>
- >>> requested_session = storage.types.ReadSession(
+ >>> requested_session = bigquery_storage.types.ReadSession(
  ... table=table,
- ... data_format=storage.types.DataFormat.AVRO,
+ ... data_format=bigquery_storage.types.DataFormat.AVRO,
... )
>>> session = client.create_read_session(
... parent=parent, read_session=requested_session
@@ -22,10 +22,10 @@

import proto

- from google.cloud.bigquery.storage_v1.types import arrow
- from google.cloud.bigquery.storage_v1.types import avro
- from google.cloud.bigquery.storage_v1.types import storage
- from google.cloud.bigquery.storage_v1.types import stream
+ from google.cloud.bigquery_storage_v1.types import arrow
+ from google.cloud.bigquery_storage_v1.types import avro
+ from google.cloud.bigquery_storage_v1.types import storage
+ from google.cloud.bigquery_storage_v1.types import stream

from google.protobuf import message as protobuf_message
from google.protobuf import timestamp_pb2
File renamed without changes.
6 changes: 3 additions & 3 deletions google/cloud/bigquery_storage_v1/reader.py
@@ -81,11 +81,11 @@ def __init__(self, wrapped, client, name, offset, read_rows_kwargs):
Args:
wrapped (Iterable[ \
- ~google.cloud.bigquery.storage.types.ReadRowsResponse \
+ ~google.cloud.bigquery_storage.types.ReadRowsResponse \
]):
The ReadRows stream to read.
client ( \
- ~google.cloud.bigquery.storage_v1.services. \
+ ~google.cloud.bigquery_storage_v1.services. \
big_query_read.BigQueryReadClient \
):
A GAPIC client used to reconnect to a ReadRows stream. This
@@ -104,7 +104,7 @@ def __init__(self, wrapped, client, name, offset, read_rows_kwargs):
Returns:
Iterable[ \
- ~google.cloud.bigquery_storage_v1.types.ReadRowsResponse \
+ ~google.cloud.bigquery_storage.types.ReadRowsResponse \
]:
A sequence of row messages.
"""
@@ -28,10 +28,10 @@
from google.auth import credentials # type: ignore
from google.oauth2 import service_account # type: ignore

- from google.cloud.bigquery.storage_v1.types import arrow
- from google.cloud.bigquery.storage_v1.types import avro
- from google.cloud.bigquery.storage_v1.types import storage
- from google.cloud.bigquery.storage_v1.types import stream
+ from google.cloud.bigquery_storage_v1.types import arrow
+ from google.cloud.bigquery_storage_v1.types import avro
+ from google.cloud.bigquery_storage_v1.types import storage
+ from google.cloud.bigquery_storage_v1.types import stream
from google.protobuf import timestamp_pb2 as timestamp # type: ignore

from .transports.base import BigQueryReadTransport, DEFAULT_CLIENT_INFO
@@ -32,10 +32,10 @@
from google.auth.exceptions import MutualTLSChannelError # type: ignore
from google.oauth2 import service_account # type: ignore

- from google.cloud.bigquery.storage_v1.types import arrow
- from google.cloud.bigquery.storage_v1.types import avro
- from google.cloud.bigquery.storage_v1.types import storage
- from google.cloud.bigquery.storage_v1.types import stream
+ from google.cloud.bigquery_storage_v1.types import arrow
+ from google.cloud.bigquery_storage_v1.types import avro
+ from google.cloud.bigquery_storage_v1.types import storage
+ from google.cloud.bigquery_storage_v1.types import stream
from google.protobuf import timestamp_pb2 as timestamp # type: ignore

from .transports.base import BigQueryReadTransport, DEFAULT_CLIENT_INFO
@@ -25,8 +25,8 @@
from google.api_core import retry as retries # type: ignore
from google.auth import credentials # type: ignore

- from google.cloud.bigquery.storage_v1.types import storage
- from google.cloud.bigquery.storage_v1.types import stream
+ from google.cloud.bigquery_storage_v1.types import storage
+ from google.cloud.bigquery_storage_v1.types import stream


try:
@@ -26,8 +26,8 @@

import grpc # type: ignore

- from google.cloud.bigquery.storage_v1.types import storage
- from google.cloud.bigquery.storage_v1.types import stream
+ from google.cloud.bigquery_storage_v1.types import storage
+ from google.cloud.bigquery_storage_v1.types import stream

from .base import BigQueryReadTransport, DEFAULT_CLIENT_INFO

@@ -27,8 +27,8 @@
import grpc # type: ignore
from grpc.experimental import aio # type: ignore

- from google.cloud.bigquery.storage_v1.types import storage
- from google.cloud.bigquery.storage_v1.types import stream
+ from google.cloud.bigquery_storage_v1.types import storage
+ from google.cloud.bigquery_storage_v1.types import stream

from .base import BigQueryReadTransport, DEFAULT_CLIENT_INFO
from .grpc import BigQueryReadGrpcTransport
@@ -24,6 +24,7 @@
AvroRows,
)
from .stream import (
+ DataFormat,
ReadSession,
ReadStream,
)
@@ -43,6 +44,7 @@
"ArrowRecordBatch",
"AvroSchema",
"AvroRows",
+ "DataFormat",
"ReadSession",
"ReadStream",
"CreateReadSessionRequest",
@@ -18,9 +18,9 @@
import proto # type: ignore


- from google.cloud.bigquery.storage_v1.types import arrow
- from google.cloud.bigquery.storage_v1.types import avro
- from google.cloud.bigquery.storage_v1.types import stream
+ from google.cloud.bigquery_storage_v1.types import arrow
+ from google.cloud.bigquery_storage_v1.types import avro
+ from google.cloud.bigquery_storage_v1.types import stream


__protobuf__ = proto.module(
@@ -18,8 +18,8 @@
import proto # type: ignore


- from google.cloud.bigquery.storage_v1.types import arrow
- from google.cloud.bigquery.storage_v1.types import avro
+ from google.cloud.bigquery_storage_v1.types import arrow
+ from google.cloud.bigquery_storage_v1.types import avro
from google.protobuf import timestamp_pb2 as timestamp # type: ignore


5 changes: 3 additions & 2 deletions noxfile.py
@@ -79,9 +79,10 @@ def default(session):
session.run(
"py.test",
"--quiet",
- "--cov=google.cloud.bigquerystorage",
+ "--cov=google.cloud.bigquery_storage",
+ "--cov=google.cloud.bigquery_storage_v1",
  "--cov=google.cloud",
- "--cov=tests.unit",
+ "--cov=tests/unit",
"--cov-append",
"--cov-config=.coveragerc",
"--cov-report=",
4 changes: 2 additions & 2 deletions samples/quickstart/quickstart.py
@@ -17,8 +17,8 @@

def main(project_id="your-project-id", snapshot_millis=0):
# [START bigquerystorage_quickstart]
- from google.cloud.bigquery.storage import BigQueryReadClient
- from google.cloud.bigquery.storage import types
+ from google.cloud.bigquery_storage import BigQueryReadClient
+ from google.cloud.bigquery_storage import types

# TODO(developer): Set the project_id variable.
# project_id = 'your-project-id'
