diff --git a/CHANGELOG.md b/CHANGELOG.md index 8d01843e..de1c6918 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -4,6 +4,19 @@ [1]: https://pypi.org/project/google-cloud-bigquery-storage/#history +## [2.1.0](https://www.github.com/googleapis/python-bigquery-storage/compare/v2.0.1...v2.1.0) (2020-11-04) + + +### Features + +* add public transport property and path formatting methods to client ([#80](https://www.github.com/googleapis/python-bigquery-storage/issues/80)) ([fbbb439](https://www.github.com/googleapis/python-bigquery-storage/commit/fbbb439b8c77fa9367a4b5bea725dd0b0f26b769)) + + +### Documentation + +* add intersphinx to proto-plus library ([#86](https://www.github.com/googleapis/python-bigquery-storage/issues/86)) ([4cd35d2](https://www.github.com/googleapis/python-bigquery-storage/commit/4cd35d21de4486f659b7efc4ff4dcb9b4eee6c9e)) +* show inheritance in types reference ([#91](https://www.github.com/googleapis/python-bigquery-storage/issues/91)) ([e5fd4e6](https://www.github.com/googleapis/python-bigquery-storage/commit/e5fd4e62de2768a49d633dc3a81e03d64df9fe1f)) + ### [2.0.1](https://www.github.com/googleapis/python-bigquery-storage/compare/v2.0.0...v2.0.1) (2020-10-21) diff --git a/CODE_OF_CONDUCT.md b/CODE_OF_CONDUCT.md index b3d1f602..039f4368 100644 --- a/CODE_OF_CONDUCT.md +++ b/CODE_OF_CONDUCT.md @@ -1,44 +1,95 @@ -# Contributor Code of Conduct +# Code of Conduct -As contributors and maintainers of this project, -and in the interest of fostering an open and welcoming community, -we pledge to respect all people who 
contribute through reporting issues, -posting feature requests, updating documentation, -submitting pull requests or patches, and other activities. +## Our Pledge -We are committed to making participation in this project -a harassment-free experience for everyone, -regardless of level of experience, gender, gender identity and expression, -sexual orientation, disability, personal appearance, -body size, race, ethnicity, age, religion, or nationality. +In the interest of fostering an open and welcoming environment, we as +contributors and maintainers pledge to making participation in our project and +our community a harassment-free experience for everyone, regardless of age, body +size, disability, ethnicity, gender identity and expression, level of +experience, education, socio-economic status, nationality, personal appearance, +race, religion, or sexual identity and orientation. + +## Our Standards + +Examples of behavior that contributes to creating a positive environment +include: + +* Using welcoming and inclusive language +* Being respectful of differing viewpoints and experiences +* Gracefully accepting constructive criticism +* Focusing on what is best for the community +* Showing empathy towards other community members Examples of unacceptable behavior by participants include: -* The use of sexualized language or imagery -* Personal attacks -* Trolling or insulting/derogatory comments -* Public or private harassment -* Publishing other's private information, -such as physical or electronic -addresses, without explicit permission -* Other unethical or unprofessional conduct. 
+* The use of sexualized language or imagery and unwelcome sexual attention or + advances +* Trolling, insulting/derogatory comments, and personal or political attacks +* Public or private harassment +* Publishing others' private information, such as a physical or electronic + address, without explicit permission +* Other conduct which could reasonably be considered inappropriate in a + professional setting + +## Our Responsibilities + +Project maintainers are responsible for clarifying the standards of acceptable +behavior and are expected to take appropriate and fair corrective action in +response to any instances of unacceptable behavior. Project maintainers have the right and responsibility to remove, edit, or reject -comments, commits, code, wiki edits, issues, and other contributions -that are not aligned to this Code of Conduct. -By adopting this Code of Conduct, -project maintainers commit themselves to fairly and consistently -applying these principles to every aspect of managing this project. -Project maintainers who do not follow or enforce the Code of Conduct -may be permanently removed from the project team. - -This code of conduct applies both within project spaces and in public spaces -when an individual is representing the project or its community. - -Instances of abusive, harassing, or otherwise unacceptable behavior -may be reported by opening an issue -or contacting one or more of the project maintainers. 
- -This Code of Conduct is adapted from the [Contributor Covenant](http://contributor-covenant.org), version 1.2.0, -available at [http://contributor-covenant.org/version/1/2/0/](http://contributor-covenant.org/version/1/2/0/) +comments, commits, code, wiki edits, issues, and other contributions that are +not aligned to this Code of Conduct, or to ban temporarily or permanently any +contributor for other behaviors that they deem inappropriate, threatening, +offensive, or harmful. + +## Scope + +This Code of Conduct applies both within project spaces and in public spaces +when an individual is representing the project or its community. Examples of +representing a project or community include using an official project e-mail +address, posting via an official social media account, or acting as an appointed +representative at an online or offline event. Representation of a project may be +further defined and clarified by project maintainers. + +This Code of Conduct also applies outside the project spaces when the Project +Steward has a reasonable belief that an individual's behavior may have a +negative impact on the project or its community. + +## Conflict Resolution + +We do not believe that all conflict is bad; healthy debate and disagreement +often yield positive results. However, it is never okay to be disrespectful or +to engage in behavior that violates the project’s code of conduct. + +If you see someone violating the code of conduct, you are encouraged to address +the behavior directly with those involved. Many issues can be resolved quickly +and easily, and this gives people more control over the outcome of their +dispute. If you are unable to resolve the matter for any reason, or if the +behavior is threatening or harassing, report it. 
We are dedicated to providing +an environment where participants feel welcome and safe. + + +Reports should be directed to *googleapis-stewards@google.com*, the +Project Steward(s) for *Google Cloud Client Libraries*. It is the Project Steward’s duty to +receive and address reported violations of the code of conduct. They will then +work with a committee consisting of representatives from the Open Source +Programs Office and the Google Open Source Strategy team. If for any reason you +are uncomfortable reaching out to the Project Steward, please email +opensource@google.com. + +We will investigate every complaint, but you may not receive a direct response. +We will use our discretion in determining when and how to follow up on reported +incidents, which may range from not taking action to permanent expulsion from +the project and project-sponsored spaces. We will notify the accused of the +report and provide them an opportunity to discuss it before any action is taken. +The identity of the reporter will be omitted from the details of the report +supplied to the accused. In potentially harmful situations, such as ongoing +harassment or threats to anyone's safety, we may take action without notice. + +## Attribution + +This Code of Conduct is adapted from the Contributor Covenant, version 1.4, +available at +https://www.contributor-covenant.org/version/1/4/code-of-conduct.html \ No newline at end of file diff --git a/docs/bigquery_storage_v1/types.rst b/docs/bigquery_storage_v1/types.rst index 1eb34796..3f722c57 100644 --- a/docs/bigquery_storage_v1/types.rst +++ b/docs/bigquery_storage_v1/types.rst @@ -3,3 +3,4 @@ Types for Google Cloud Bigquery Storage v1 API .. 
automodule:: google.cloud.bigquery_storage_v1.types :members: + :show-inheritance: diff --git a/docs/conf.py b/docs/conf.py index 3b109df5..e3efb9fd 100644 --- a/docs/conf.py +++ b/docs/conf.py @@ -349,6 +349,7 @@ "google-auth": ("https://google-auth.readthedocs.io/en/stable", None), "google.api_core": ("https://googleapis.dev/python/google-api-core/latest/", None,), "grpc": ("https://grpc.io/grpc/python/", None), + "proto-plus": ("https://proto-plus-python.readthedocs.io/en/latest/", None), } diff --git a/google/cloud/bigquery_storage_v1/services/big_query_read/async_client.py b/google/cloud/bigquery_storage_v1/services/big_query_read/async_client.py index f5c80cd0..7108ffd0 100644 --- a/google/cloud/bigquery_storage_v1/services/big_query_read/async_client.py +++ b/google/cloud/bigquery_storage_v1/services/big_query_read/async_client.py @@ -18,7 +18,7 @@ from collections import OrderedDict import functools import re -from typing import Dict, AsyncIterable, Sequence, Tuple, Type, Union +from typing import Dict, AsyncIterable, Awaitable, Sequence, Tuple, Type, Union import pkg_resources import google.api_core.client_options as ClientOptions # type: ignore @@ -53,10 +53,46 @@ class BigQueryReadAsyncClient: parse_read_session_path = staticmethod(BigQueryReadClient.parse_read_session_path) read_stream_path = staticmethod(BigQueryReadClient.read_stream_path) parse_read_stream_path = staticmethod(BigQueryReadClient.parse_read_stream_path) + table_path = staticmethod(BigQueryReadClient.table_path) + parse_table_path = staticmethod(BigQueryReadClient.parse_table_path) + + common_billing_account_path = staticmethod( + BigQueryReadClient.common_billing_account_path + ) + parse_common_billing_account_path = staticmethod( + BigQueryReadClient.parse_common_billing_account_path + ) + + 
common_folder_path = staticmethod(BigQueryReadClient.common_folder_path) + parse_common_folder_path = staticmethod(BigQueryReadClient.parse_common_folder_path) + + common_organization_path = staticmethod(BigQueryReadClient.common_organization_path) + parse_common_organization_path = staticmethod( + BigQueryReadClient.parse_common_organization_path + ) + + common_project_path = staticmethod(BigQueryReadClient.common_project_path) + parse_common_project_path = staticmethod( + BigQueryReadClient.parse_common_project_path + ) + + common_location_path = staticmethod(BigQueryReadClient.common_location_path) + parse_common_location_path = staticmethod( + BigQueryReadClient.parse_common_location_path + ) from_service_account_file = BigQueryReadClient.from_service_account_file from_service_account_json = from_service_account_file + @property + def transport(self) -> BigQueryReadTransport: + """Return the transport used by the client instance. + + Returns: + BigQueryReadTransport: The transport used by the client instance. + """ + return self._client.transport + get_transport_class = functools.partial( type(BigQueryReadClient).get_transport_class, type(BigQueryReadClient) ) @@ -191,7 +227,8 @@ async def create_read_session( # Create or coerce a protobuf request object. # Sanity check: If we got a request object, we should *not* have # gotten any keyword arguments that map to the request. - if request is not None and any([parent, read_session, max_stream_count]): + has_flattened_params = any([parent, read_session, max_stream_count]) + if request is not None and has_flattened_params: raise ValueError( "If the `request` argument is set, then none of " "the individual field arguments should be set." 
@@ -218,7 +255,7 @@ async def create_read_session( maximum=60.0, multiplier=1.3, predicate=retries.if_exception_type( - exceptions.ServiceUnavailable, exceptions.DeadlineExceeded, + exceptions.DeadlineExceeded, exceptions.ServiceUnavailable, ), ), default_timeout=600.0, @@ -248,7 +285,7 @@ def read_rows( retry: retries.Retry = gapic_v1.method.DEFAULT, timeout: float = None, metadata: Sequence[Tuple[str, str]] = (), - ) -> AsyncIterable[storage.ReadRowsResponse]: + ) -> Awaitable[AsyncIterable[storage.ReadRowsResponse]]: r"""Reads rows from the stream in the format prescribed by the ReadSession. Each response contains one or more table rows, up to a maximum of 100 MiB per response; @@ -291,7 +328,8 @@ def read_rows( # Create or coerce a protobuf request object. # Sanity check: If we got a request object, we should *not* have # gotten any keyword arguments that map to the request. - if request is not None and any([read_stream, offset]): + has_flattened_params = any([read_stream, offset]) + if request is not None and has_flattened_params: raise ValueError( "If the `request` argument is set, then none of " "the individual field arguments should be set." 
@@ -385,7 +423,7 @@ async def split_read_stream( maximum=60.0, multiplier=1.3, predicate=retries.if_exception_type( - exceptions.ServiceUnavailable, exceptions.DeadlineExceeded, + exceptions.DeadlineExceeded, exceptions.ServiceUnavailable, ), ), default_timeout=600.0, diff --git a/google/cloud/bigquery_storage_v1/services/big_query_read/client.py b/google/cloud/bigquery_storage_v1/services/big_query_read/client.py index f60e862d..3f04760f 100644 --- a/google/cloud/bigquery_storage_v1/services/big_query_read/client.py +++ b/google/cloud/bigquery_storage_v1/services/big_query_read/client.py @@ -133,6 +133,15 @@ def from_service_account_file(cls, filename: str, *args, **kwargs): from_service_account_json = from_service_account_file + @property + def transport(self) -> BigQueryReadTransport: + """Return the transport used by the client instance. + + Returns: + BigQueryReadTransport: The transport used by the client instance. + """ + return self._transport + @staticmethod def read_session_path(project: str, location: str, session: str,) -> str: """Return a fully-qualified read_session string.""" @@ -167,6 +176,81 @@ def parse_read_stream_path(path: str) -> Dict[str, str]: ) return m.groupdict() if m else {} + @staticmethod + def table_path(project: str, dataset: str, table: str,) -> str: + """Return a fully-qualified table string.""" + return "projects/{project}/datasets/{dataset}/tables/{table}".format( + project=project, dataset=dataset, table=table, + ) + + @staticmethod + def parse_table_path(path: str) -> Dict[str, str]: + """Parse a table path into its component segments.""" + m = re.match( + r"^projects/(?P<project>.+?)/datasets/(?P<dataset>.+?)/tables/(?P<table>.+?)$", + path, + ) + return m.groupdict() if m else {} + + @staticmethod + def common_billing_account_path(billing_account: str,) -> str: + """Return a fully-qualified billing_account string.""" + return "billingAccounts/{billing_account}".format( + billing_account=billing_account, + ) + + @staticmethod + def
parse_common_billing_account_path(path: str) -> Dict[str, str]: + """Parse a billing_account path into its component segments.""" + m = re.match(r"^billingAccounts/(?P<billing_account>.+?)$", path) + return m.groupdict() if m else {} + + @staticmethod + def common_folder_path(folder: str,) -> str: + """Return a fully-qualified folder string.""" + return "folders/{folder}".format(folder=folder,) + + @staticmethod + def parse_common_folder_path(path: str) -> Dict[str, str]: + """Parse a folder path into its component segments.""" + m = re.match(r"^folders/(?P<folder>.+?)$", path) + return m.groupdict() if m else {} + + @staticmethod + def common_organization_path(organization: str,) -> str: + """Return a fully-qualified organization string.""" + return "organizations/{organization}".format(organization=organization,) + + @staticmethod + def parse_common_organization_path(path: str) -> Dict[str, str]: + """Parse an organization path into its component segments.""" + m = re.match(r"^organizations/(?P<organization>.+?)$", path) + return m.groupdict() if m else {} + + @staticmethod + def common_project_path(project: str,) -> str: + """Return a fully-qualified project string.""" + return "projects/{project}".format(project=project,) + + @staticmethod + def parse_common_project_path(path: str) -> Dict[str, str]: + """Parse a project path into its component segments.""" + m = re.match(r"^projects/(?P<project>.+?)$", path) + return m.groupdict() if m else {} + + @staticmethod + def common_location_path(project: str, location: str,) -> str: + """Return a fully-qualified location string.""" + return "projects/{project}/locations/{location}".format( + project=project, location=location, + ) + + @staticmethod + def parse_common_location_path(path: str) -> Dict[str, str]: + """Parse a location path into its component segments.""" + m = re.match(r"^projects/(?P<project>.+?)/locations/(?P<location>.+?)$", path) + return m.groupdict() if m else {} + def __init__( self, *, @@ -202,10 +286,10 @@ def __init__( not provided, the default SSL client
certificate will be used if present. If GOOGLE_API_USE_CLIENT_CERTIFICATE is "false" or not set, no client certificate will be used. - client_info (google.api_core.gapic_v1.client_info.ClientInfo): - The client info used to send a user-agent string along with - API requests. If ``None``, then default info will be used. - Generally, you only need to set this if you're developing + client_info (google.api_core.gapic_v1.client_info.ClientInfo): + The client info used to send a user-agent string along with + API requests. If ``None``, then default info will be used. + Generally, you only need to set this if you're developing your own client library. Raises: diff --git a/google/cloud/bigquery_storage_v1/services/big_query_read/transports/base.py b/google/cloud/bigquery_storage_v1/services/big_query_read/transports/base.py index 5727ca5a..4497158f 100644 --- a/google/cloud/bigquery_storage_v1/services/big_query_read/transports/base.py +++ b/google/cloud/bigquery_storage_v1/services/big_query_read/transports/base.py @@ -118,7 +118,7 @@ def _prep_wrapped_messages(self, client_info): maximum=60.0, multiplier=1.3, predicate=retries.if_exception_type( - exceptions.ServiceUnavailable, exceptions.DeadlineExceeded, + exceptions.DeadlineExceeded, exceptions.ServiceUnavailable, ), ), default_timeout=600.0, @@ -142,7 +142,7 @@ def _prep_wrapped_messages(self, client_info): maximum=60.0, multiplier=1.3, predicate=retries.if_exception_type( - exceptions.ServiceUnavailable, exceptions.DeadlineExceeded, + exceptions.DeadlineExceeded, exceptions.ServiceUnavailable, ), ), default_timeout=600.0, diff --git a/google/cloud/bigquery_storage_v1/services/big_query_read/transports/grpc.py b/google/cloud/bigquery_storage_v1/services/big_query_read/transports/grpc.py index 36377f1d..041854b9 100644 --- a/google/cloud/bigquery_storage_v1/services/big_query_read/transports/grpc.py +++ b/google/cloud/bigquery_storage_v1/services/big_query_read/transports/grpc.py @@ -91,10 +91,10 @@ def __init__( for 
grpc channel. It is ignored if ``channel`` is provided. quota_project_id (Optional[str]): An optional project to use for billing and quota. - client_info (google.api_core.gapic_v1.client_info.ClientInfo): - The client info used to send a user-agent string along with - API requests. If ``None``, then default info will be used. - Generally, you only need to set this if you're developing + client_info (google.api_core.gapic_v1.client_info.ClientInfo): + The client info used to send a user-agent string along with + API requests. If ``None``, then default info will be used. + Generally, you only need to set this if you're developing your own client library. Raises: @@ -103,6 +103,8 @@ def __init__( google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials`` and ``credentials_file`` are passed. """ + self._ssl_channel_credentials = ssl_channel_credentials + if channel: # Sanity check: Ensure that channel and credentials are not both # provided. @@ -110,6 +112,7 @@ def __init__( # If a channel was explicitly provided, set it. self._grpc_channel = channel + self._ssl_channel_credentials = None elif api_mtls_endpoint: warnings.warn( "api_mtls_endpoint and client_cert_source are deprecated", @@ -150,6 +153,7 @@ def __init__( ("grpc.max_receive_message_length", -1), ), ) + self._ssl_channel_credentials = ssl_credentials else: host = host if ":" in host else host + ":443" @@ -231,12 +235,8 @@ def create_channel( @property def grpc_channel(self) -> grpc.Channel: - """Create the channel designed to connect to this service. - - This property caches on the instance; repeated calls return - the same channel. + """Return the channel designed to connect to this service. """ - # Return the channel from cache. 
return self._grpc_channel @property diff --git a/google/cloud/bigquery_storage_v1/services/big_query_read/transports/grpc_asyncio.py b/google/cloud/bigquery_storage_v1/services/big_query_read/transports/grpc_asyncio.py index b383f36d..3e08afdd 100644 --- a/google/cloud/bigquery_storage_v1/services/big_query_read/transports/grpc_asyncio.py +++ b/google/cloud/bigquery_storage_v1/services/big_query_read/transports/grpc_asyncio.py @@ -148,6 +148,8 @@ def __init__( google.api_core.exceptions.DuplicateCredentialArgs: If both ``credentials`` and ``credentials_file`` are passed. """ + self._ssl_channel_credentials = ssl_channel_credentials + if channel: # Sanity check: Ensure that channel and credentials are not both # provided. @@ -155,6 +157,7 @@ def __init__( # If a channel was explicitly provided, set it. self._grpc_channel = channel + self._ssl_channel_credentials = None elif api_mtls_endpoint: warnings.warn( "api_mtls_endpoint and client_cert_source are deprecated", @@ -195,6 +198,7 @@ def __init__( ("grpc.max_receive_message_length", -1), ), ) + self._ssl_channel_credentials = ssl_credentials else: host = host if ":" in host else host + ":443" diff --git a/google/cloud/bigquery_storage_v1/types/storage.py b/google/cloud/bigquery_storage_v1/types/storage.py index 3460dce7..1b9c9d35 100644 --- a/google/cloud/bigquery_storage_v1/types/storage.py +++ b/google/cloud/bigquery_storage_v1/types/storage.py @@ -166,9 +166,9 @@ class ReadRowsResponse(proto.Message): row_count = proto.Field(proto.INT64, number=6) - stats = proto.Field(proto.MESSAGE, number=2, message=StreamStats,) + stats = proto.Field(proto.MESSAGE, number=2, message="StreamStats",) - throttle_state = proto.Field(proto.MESSAGE, number=5, message=ThrottleState,) + throttle_state = proto.Field(proto.MESSAGE, number=5, message="ThrottleState",) class SplitReadStreamRequest(proto.Message): diff --git a/samples/to_dataframe/requirements.txt b/samples/to_dataframe/requirements.txt index 58712e8f..14c1784c 100644 --- 
a/samples/to_dataframe/requirements.txt +++ b/samples/to_dataframe/requirements.txt @@ -1,6 +1,6 @@ -google-auth==1.21.0 +google-auth==1.23.0 google-cloud-bigquery-storage==2.0.0 -google-cloud-bigquery==2.1.0 +google-cloud-bigquery==2.2.0 pyarrow==1.0.1 ipython==7.10.2; python_version > '3.0' ipython==5.9.0; python_version < '3.0' diff --git a/scripts/fixup_bigquery_storage_v1_keywords.py b/scripts/fixup_bigquery_storage_v1_keywords.py index 4fc6755e..28faf655 100644 --- a/scripts/fixup_bigquery_storage_v1_keywords.py +++ b/scripts/fixup_bigquery_storage_v1_keywords.py @@ -1,3 +1,4 @@ +#! /usr/bin/env python3 # -*- coding: utf-8 -*- # Copyright 2020 Google LLC diff --git a/setup.py b/setup.py index 67972bba..1ccf3b83 100644 --- a/setup.py +++ b/setup.py @@ -21,7 +21,7 @@ name = "google-cloud-bigquery-storage" description = "BigQuery Storage API API client library" -version = "2.0.1" +version = "2.1.0" release_status = "Development Status :: 5 - Production/Stable" dependencies = [ "google-api-core[grpc] >= 1.22.2, < 2.0.0dev", diff --git a/synth.metadata b/synth.metadata index 19f14807..ea2bcb4e 100644 --- a/synth.metadata +++ b/synth.metadata @@ -4,29 +4,29 @@ "git": { "name": ".", "remote": "https://github.com/googleapis/python-bigquery-storage.git", - "sha": "a7fe7626312a5b9fe1e7bd0e0fe5601ae97605c7" + "sha": "20ed21d40cd4f89c3d4ae5d8db7ed3c6b801cc4c" } }, { "git": { "name": "googleapis", "remote": "https://github.com/googleapis/googleapis.git", - "sha": "062f46f246c78fde2160524db593fa0fa7bdbe64", - "internalRef": "337404700" + "sha": "07d41a7e5cade45aba6f0d277c89722b48f2c956", + "internalRef": "339292950" } }, { "git": { "name": "synthtool", "remote": "https://github.com/googleapis/synthtool.git", - "sha": "da5c6050d13b4950c82666a81d8acd25157664ae" + "sha": "ea52b8a0bd560f72f376efcf45197fb7c8869120" } }, { "git": { 
"name": "synthtool", "remote": "https://github.com/googleapis/synthtool.git", - "sha": "da5c6050d13b4950c82666a81d8acd25157664ae" + "sha": "ea52b8a0bd560f72f376efcf45197fb7c8869120" } } ], diff --git a/tests/unit/gapic/bigquery_storage_v1/test_big_query_read.py b/tests/unit/gapic/bigquery_storage_v1/test_big_query_read.py index a86692f3..1c3cfafb 100644 --- a/tests/unit/gapic/bigquery_storage_v1/test_big_query_read.py +++ b/tests/unit/gapic/bigquery_storage_v1/test_big_query_read.py @@ -94,12 +94,12 @@ def test_big_query_read_client_from_service_account_file(client_class): ) as factory: factory.return_value = creds client = client_class.from_service_account_file("dummy/file/path.json") - assert client._transport._credentials == creds + assert client.transport._credentials == creds client = client_class.from_service_account_json("dummy/file/path.json") - assert client._transport._credentials == creds + assert client.transport._credentials == creds - assert client._transport._host == "bigquerystorage.googleapis.com:443" + assert client.transport._host == "bigquerystorage.googleapis.com:443" def test_big_query_read_client_get_transport_class(): @@ -444,7 +444,7 @@ def test_create_read_session( # Mock the actual call within the gRPC stub, and fake the request. with mock.patch.object( - type(client._transport.create_read_session), "__call__" + type(client.transport.create_read_session), "__call__" ) as call: # Designate an appropriate return value for the call. call.return_value = stream.ReadSession( @@ -463,6 +463,7 @@ def test_create_read_session( assert args[0] == storage.CreateReadSessionRequest() # Establish that the response is the type that we expect. 
+ assert isinstance(response, stream.ReadSession) assert response.name == "name_value" @@ -477,18 +478,20 @@ def test_create_read_session_from_dict(): @pytest.mark.asyncio -async def test_create_read_session_async(transport: str = "grpc_asyncio"): +async def test_create_read_session_async( + transport: str = "grpc_asyncio", request_type=storage.CreateReadSessionRequest +): client = BigQueryReadAsyncClient( credentials=credentials.AnonymousCredentials(), transport=transport, ) # Everything is optional in proto3 as far as the runtime is concerned, # and we are mocking out the actual API, so just send an empty request. - request = storage.CreateReadSessionRequest() + request = request_type() # Mock the actual call within the gRPC stub, and fake the request. with mock.patch.object( - type(client._client._transport.create_read_session), "__call__" + type(client.transport.create_read_session), "__call__" ) as call: # Designate an appropriate return value for the call. call.return_value = grpc_helpers_async.FakeUnaryUnaryCall( @@ -505,7 +508,7 @@ async def test_create_read_session_async(transport: str = "grpc_asyncio"): assert len(call.mock_calls) _, args, _ = call.mock_calls[0] - assert args[0] == request + assert args[0] == storage.CreateReadSessionRequest() # Establish that the response is the type that we expect. assert isinstance(response, stream.ReadSession) @@ -517,6 +520,11 @@ async def test_create_read_session_async(transport: str = "grpc_asyncio"): assert response.table == "table_value" +@pytest.mark.asyncio +async def test_create_read_session_async_from_dict(): + await test_create_read_session_async(request_type=dict) + + def test_create_read_session_field_headers(): client = BigQueryReadClient(credentials=credentials.AnonymousCredentials(),) @@ -527,7 +535,7 @@ def test_create_read_session_field_headers(): # Mock the actual call within the gRPC stub, and fake the request. 
     with mock.patch.object(
-        type(client._transport.create_read_session), "__call__"
+        type(client.transport.create_read_session), "__call__"
     ) as call:
         call.return_value = stream.ReadSession()
@@ -557,7 +565,7 @@ async def test_create_read_session_field_headers_async():
 
     # Mock the actual call within the gRPC stub, and fake the request.
     with mock.patch.object(
-        type(client._client._transport.create_read_session), "__call__"
+        type(client.transport.create_read_session), "__call__"
     ) as call:
         call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(stream.ReadSession())
 
@@ -581,7 +589,7 @@ def test_create_read_session_flattened():
 
     # Mock the actual call within the gRPC stub, and fake the request.
     with mock.patch.object(
-        type(client._transport.create_read_session), "__call__"
+        type(client.transport.create_read_session), "__call__"
     ) as call:
         # Designate an appropriate return value for the call.
         call.return_value = stream.ReadSession()
@@ -626,7 +634,7 @@ async def test_create_read_session_flattened_async():
 
     # Mock the actual call within the gRPC stub, and fake the request.
     with mock.patch.object(
-        type(client._client._transport.create_read_session), "__call__"
+        type(client.transport.create_read_session), "__call__"
     ) as call:
         # Designate an appropriate return value for the call.
         call.return_value = stream.ReadSession()
@@ -677,7 +685,7 @@ def test_read_rows(transport: str = "grpc", request_type=storage.ReadRowsRequest
 
     request = request_type()
 
     # Mock the actual call within the gRPC stub, and fake the request.
-    with mock.patch.object(type(client._transport.read_rows), "__call__") as call:
+    with mock.patch.object(type(client.transport.read_rows), "__call__") as call:
         # Designate an appropriate return value for the call.
         call.return_value = iter([storage.ReadRowsResponse()])
@@ -699,19 +707,19 @@ def test_read_rows_from_dict():
 
 
 @pytest.mark.asyncio
-async def test_read_rows_async(transport: str = "grpc_asyncio"):
+async def test_read_rows_async(
+    transport: str = "grpc_asyncio", request_type=storage.ReadRowsRequest
+):
     client = BigQueryReadAsyncClient(
         credentials=credentials.AnonymousCredentials(), transport=transport,
     )
 
     # Everything is optional in proto3 as far as the runtime is concerned,
     # and we are mocking out the actual API, so just send an empty request.
-    request = storage.ReadRowsRequest()
+    request = request_type()
 
     # Mock the actual call within the gRPC stub, and fake the request.
-    with mock.patch.object(
-        type(client._client._transport.read_rows), "__call__"
-    ) as call:
+    with mock.patch.object(type(client.transport.read_rows), "__call__") as call:
         # Designate an appropriate return value for the call.
         call.return_value = mock.Mock(aio.UnaryStreamCall, autospec=True)
         call.return_value.read = mock.AsyncMock(
@@ -724,13 +732,18 @@ async def test_read_rows_async(transport: str = "grpc_asyncio"):
 
     assert len(call.mock_calls)
     _, args, _ = call.mock_calls[0]
-    assert args[0] == request
+    assert args[0] == storage.ReadRowsRequest()
 
     # Establish that the response is the type that we expect.
     message = await response.read()
     assert isinstance(message, storage.ReadRowsResponse)
 
 
+@pytest.mark.asyncio
+async def test_read_rows_async_from_dict():
+    await test_read_rows_async(request_type=dict)
+
+
 def test_read_rows_field_headers():
     client = BigQueryReadClient(credentials=credentials.AnonymousCredentials(),)
 
@@ -740,7 +753,7 @@ def test_read_rows_field_headers():
     request.read_stream = "read_stream/value"
 
     # Mock the actual call within the gRPC stub, and fake the request.
-    with mock.patch.object(type(client._transport.read_rows), "__call__") as call:
+    with mock.patch.object(type(client.transport.read_rows), "__call__") as call:
         call.return_value = iter([storage.ReadRowsResponse()])
 
         client.read_rows(request)
@@ -765,9 +778,7 @@ async def test_read_rows_field_headers_async():
     request.read_stream = "read_stream/value"
 
     # Mock the actual call within the gRPC stub, and fake the request.
-    with mock.patch.object(
-        type(client._client._transport.read_rows), "__call__"
-    ) as call:
+    with mock.patch.object(type(client.transport.read_rows), "__call__") as call:
         call.return_value = mock.Mock(aio.UnaryStreamCall, autospec=True)
         call.return_value.read = mock.AsyncMock(
             side_effect=[storage.ReadRowsResponse()]
@@ -789,7 +800,7 @@ def test_read_rows_flattened():
     client = BigQueryReadClient(credentials=credentials.AnonymousCredentials(),)
 
     # Mock the actual call within the gRPC stub, and fake the request.
-    with mock.patch.object(type(client._transport.read_rows), "__call__") as call:
+    with mock.patch.object(type(client.transport.read_rows), "__call__") as call:
         # Designate an appropriate return value for the call.
         call.return_value = iter([storage.ReadRowsResponse()])
 
@@ -825,9 +836,7 @@ async def test_read_rows_flattened_async():
     client = BigQueryReadAsyncClient(credentials=credentials.AnonymousCredentials(),)
 
     # Mock the actual call within the gRPC stub, and fake the request.
-    with mock.patch.object(
-        type(client._client._transport.read_rows), "__call__"
-    ) as call:
+    with mock.patch.object(type(client.transport.read_rows), "__call__") as call:
         # Designate an appropriate return value for the call.
         call.return_value = iter([storage.ReadRowsResponse()])
 
@@ -871,7 +880,7 @@ def test_split_read_stream(
 
     # Mock the actual call within the gRPC stub, and fake the request.
     with mock.patch.object(
-        type(client._transport.split_read_stream), "__call__"
+        type(client.transport.split_read_stream), "__call__"
     ) as call:
         # Designate an appropriate return value for the call.
         call.return_value = storage.SplitReadStreamResponse()
@@ -885,6 +894,7 @@ def test_split_read_stream(
     assert args[0] == storage.SplitReadStreamRequest()
 
     # Establish that the response is the type that we expect.
+
     assert isinstance(response, storage.SplitReadStreamResponse)
 
 
@@ -893,18 +903,20 @@ def test_split_read_stream_from_dict():
 
 
 @pytest.mark.asyncio
-async def test_split_read_stream_async(transport: str = "grpc_asyncio"):
+async def test_split_read_stream_async(
+    transport: str = "grpc_asyncio", request_type=storage.SplitReadStreamRequest
+):
     client = BigQueryReadAsyncClient(
         credentials=credentials.AnonymousCredentials(), transport=transport,
     )
 
     # Everything is optional in proto3 as far as the runtime is concerned,
     # and we are mocking out the actual API, so just send an empty request.
-    request = storage.SplitReadStreamRequest()
+    request = request_type()
 
     # Mock the actual call within the gRPC stub, and fake the request.
     with mock.patch.object(
-        type(client._client._transport.split_read_stream), "__call__"
+        type(client.transport.split_read_stream), "__call__"
     ) as call:
         # Designate an appropriate return value for the call.
         call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
@@ -917,12 +929,17 @@ async def test_split_read_stream_async(transport: str = "grpc_asyncio"):
 
     assert len(call.mock_calls)
     _, args, _ = call.mock_calls[0]
-    assert args[0] == request
+    assert args[0] == storage.SplitReadStreamRequest()
 
     # Establish that the response is the type that we expect.
     assert isinstance(response, storage.SplitReadStreamResponse)
 
 
+@pytest.mark.asyncio
+async def test_split_read_stream_async_from_dict():
+    await test_split_read_stream_async(request_type=dict)
+
+
 def test_split_read_stream_field_headers():
     client = BigQueryReadClient(credentials=credentials.AnonymousCredentials(),)
 
@@ -933,7 +950,7 @@ def test_split_read_stream_field_headers():
 
     # Mock the actual call within the gRPC stub, and fake the request.
     with mock.patch.object(
-        type(client._transport.split_read_stream), "__call__"
+        type(client.transport.split_read_stream), "__call__"
     ) as call:
         call.return_value = storage.SplitReadStreamResponse()
 
@@ -960,7 +977,7 @@ async def test_split_read_stream_field_headers_async():
 
     # Mock the actual call within the gRPC stub, and fake the request.
     with mock.patch.object(
-        type(client._client._transport.split_read_stream), "__call__"
+        type(client.transport.split_read_stream), "__call__"
     ) as call:
         call.return_value = grpc_helpers_async.FakeUnaryUnaryCall(
             storage.SplitReadStreamResponse()
@@ -1014,7 +1031,7 @@ def test_transport_instance():
         credentials=credentials.AnonymousCredentials(),
    )
    client = BigQueryReadClient(transport=transport)
-    assert client._transport is transport
+    assert client.transport is transport
 
 
 def test_transport_get_channel():
@@ -1047,7 +1064,7 @@ def test_transport_adc(transport_class):
 
 def test_transport_grpc_default():
     # A client should use the gRPC transport by default.
     client = BigQueryReadClient(credentials=credentials.AnonymousCredentials(),)
-    assert isinstance(client._transport, transports.BigQueryReadGrpcTransport,)
+    assert isinstance(client.transport, transports.BigQueryReadGrpcTransport,)
 
 
 def test_big_query_read_base_transport_error():
@@ -1155,7 +1172,7 @@ def test_big_query_read_host_no_port():
             api_endpoint="bigquerystorage.googleapis.com"
         ),
     )
-    assert client._transport._host == "bigquerystorage.googleapis.com:443"
+    assert client.transport._host == "bigquerystorage.googleapis.com:443"
 
 
 def test_big_query_read_host_with_port():
@@ -1165,7 +1182,7 @@ def test_big_query_read_host_with_port():
             api_endpoint="bigquerystorage.googleapis.com:8000"
         ),
     )
-    assert client._transport._host == "bigquerystorage.googleapis.com:8000"
+    assert client.transport._host == "bigquerystorage.googleapis.com:8000"
 
 
 def test_big_query_read_grpc_transport_channel():
@@ -1177,6 +1194,7 @@ def test_big_query_read_grpc_transport_channel():
     )
     assert transport.grpc_channel == channel
     assert transport._host == "squid.clam.whelk:443"
+    assert transport._ssl_channel_credentials == None
 
 
 def test_big_query_read_grpc_asyncio_transport_channel():
@@ -1188,6 +1206,7 @@ def test_big_query_read_grpc_asyncio_transport_channel():
     )
     assert transport.grpc_channel == channel
     assert transport._host == "squid.clam.whelk:443"
+    assert transport._ssl_channel_credentials == None
 
 
 @pytest.mark.parametrize(
@@ -1238,6 +1257,7 @@ def test_big_query_read_transport_channel_mtls_with_client_cert_source(transport
         ),
     )
     assert transport.grpc_channel == mock_grpc_channel
+    assert transport._ssl_channel_credentials == mock_ssl_cred
 
 
 @pytest.mark.parametrize(
@@ -1311,10 +1331,10 @@ def test_parse_read_session_path():
 
 
 def test_read_stream_path():
-    project = "squid"
-    location = "clam"
-    session = "whelk"
-    stream = "octopus"
+    project = "cuttlefish"
+    location = "mussel"
+    session = "winkle"
+    stream = "nautilus"
 
     expected = "projects/{project}/locations/{location}/sessions/{session}/streams/{stream}".format(
         project=project, location=location, session=session, stream=stream,
@@ -1325,10 +1345,10 @@ def test_parse_read_stream_path():
     expected = {
-        "project": "oyster",
-        "location": "nudibranch",
-        "session": "cuttlefish",
-        "stream": "mussel",
+        "project": "scallop",
+        "location": "abalone",
+        "session": "squid",
+        "stream": "clam",
     }
     path = BigQueryReadClient.read_stream_path(**expected)
 
@@ -1337,6 +1357,132 @@ def test_parse_read_stream_path():
     assert expected == actual
 
 
+def test_table_path():
+    project = "whelk"
+    dataset = "octopus"
+    table = "oyster"
+
+    expected = "projects/{project}/datasets/{dataset}/tables/{table}".format(
+        project=project, dataset=dataset, table=table,
+    )
+    actual = BigQueryReadClient.table_path(project, dataset, table)
+    assert expected == actual
+
+
+def test_parse_table_path():
+    expected = {
+        "project": "nudibranch",
+        "dataset": "cuttlefish",
+        "table": "mussel",
+    }
+    path = BigQueryReadClient.table_path(**expected)
+
+    # Check that the path construction is reversible.
+    actual = BigQueryReadClient.parse_table_path(path)
+    assert expected == actual
+
+
+def test_common_billing_account_path():
+    billing_account = "winkle"
+
+    expected = "billingAccounts/{billing_account}".format(
+        billing_account=billing_account,
+    )
+    actual = BigQueryReadClient.common_billing_account_path(billing_account)
+    assert expected == actual
+
+
+def test_parse_common_billing_account_path():
+    expected = {
+        "billing_account": "nautilus",
+    }
+    path = BigQueryReadClient.common_billing_account_path(**expected)
+
+    # Check that the path construction is reversible.
+    actual = BigQueryReadClient.parse_common_billing_account_path(path)
+    assert expected == actual
+
+
+def test_common_folder_path():
+    folder = "scallop"
+
+    expected = "folders/{folder}".format(folder=folder,)
+    actual = BigQueryReadClient.common_folder_path(folder)
+    assert expected == actual
+
+
+def test_parse_common_folder_path():
+    expected = {
+        "folder": "abalone",
+    }
+    path = BigQueryReadClient.common_folder_path(**expected)
+
+    # Check that the path construction is reversible.
+    actual = BigQueryReadClient.parse_common_folder_path(path)
+    assert expected == actual
+
+
+def test_common_organization_path():
+    organization = "squid"
+
+    expected = "organizations/{organization}".format(organization=organization,)
+    actual = BigQueryReadClient.common_organization_path(organization)
+    assert expected == actual
+
+
+def test_parse_common_organization_path():
+    expected = {
+        "organization": "clam",
+    }
+    path = BigQueryReadClient.common_organization_path(**expected)
+
+    # Check that the path construction is reversible.
+    actual = BigQueryReadClient.parse_common_organization_path(path)
+    assert expected == actual
+
+
+def test_common_project_path():
+    project = "whelk"
+
+    expected = "projects/{project}".format(project=project,)
+    actual = BigQueryReadClient.common_project_path(project)
+    assert expected == actual
+
+
+def test_parse_common_project_path():
+    expected = {
+        "project": "octopus",
+    }
+    path = BigQueryReadClient.common_project_path(**expected)
+
+    # Check that the path construction is reversible.
+    actual = BigQueryReadClient.parse_common_project_path(path)
+    assert expected == actual
+
+
+def test_common_location_path():
+    project = "oyster"
+    location = "nudibranch"
+
+    expected = "projects/{project}/locations/{location}".format(
+        project=project, location=location,
+    )
+    actual = BigQueryReadClient.common_location_path(project, location)
+    assert expected == actual
+
+
+def test_parse_common_location_path():
+    expected = {
+        "project": "cuttlefish",
+        "location": "mussel",
+    }
+    path = BigQueryReadClient.common_location_path(**expected)
+
+    # Check that the path construction is reversible.
+    actual = BigQueryReadClient.parse_common_location_path(path)
+    assert expected == actual
+
+
 def test_client_withDEFAULT_CLIENT_INFO():
     client_info = gapic_v1.client_info.ClientInfo()