From cd7d4ba0fc3b3cf7ebb5aae690215c62d3166bf7 Mon Sep 17 00:00:00 2001 From: WhiteSource Renovate Date: Wed, 25 Nov 2020 20:31:51 +0100 Subject: [PATCH 01/26] chore(deps): update dependency google-cloud-datacatalog to v3 (#73) --- samples/snippets/requirements.txt | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/samples/snippets/requirements.txt b/samples/snippets/requirements.txt index af94dd90..62a69aee 100644 --- a/samples/snippets/requirements.txt +++ b/samples/snippets/requirements.txt @@ -1 +1 @@ -google-cloud-datacatalog==2.0.0 +google-cloud-datacatalog==3.0.0 From eed034a3969913e40554300ae97c5e00e4fcc79a Mon Sep 17 00:00:00 2001 From: Ricardo Mendes Date: Mon, 7 Dec 2020 13:58:04 -0300 Subject: [PATCH 02/26] docs: update the upgrade guide to be from 1.0 to 3.0 (#77) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit - [x] Make sure to open an issue as a [bug/issue](https://github.com/googleapis/python-datacatalog/issues/new/choose) before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea - Ensure the tests and linter pass - N/A - Code coverage does not decrease (if any source code was changed) - N/A - [x] Appropriate docs were updated (if necessary) Fixes #76 🦕 --- UPGRADING.md | 29 ++++++++++++----------------- 1 file changed, 12 insertions(+), 17 deletions(-) diff --git a/UPGRADING.md b/UPGRADING.md index 6a3961ce..1fa25990 100644 --- a/UPGRADING.md +++ b/UPGRADING.md @@ -1,13 +1,13 @@ -# 2.0.0 Migration Guide +# 3.0.0 Migration Guide -The 2.0 release of the `google-cloud-datacatalog` client is a significant upgrade based on a [next-gen code generator](https://github.com/googleapis/gapic-generator-python), and includes substantial interface changes. Existing code written for earlier versions of this library will likely require updates to use this version. This document describes the changes that have been made, and what you need to do to update your usage. +The 3.0 release of the `google-cloud-datacatalog` client is a significant upgrade based on a [next-gen code generator](https://github.com/googleapis/gapic-generator-python), and includes substantial interface changes. Existing code written for earlier versions of this library will likely require updates to use this version. This document describes the changes that have been made, and what you need to do to update your usage. If you experience issues or have questions, please file an [issue](https://github.com/googleapis/python-datacatalog/issues). ## Supported Python Versions > **WARNING**: Breaking change -The 2.0.0 release requires Python 3.6+. +The 3.0.0 release requires Python 3.6+. ## Method Calls @@ -45,7 +45,7 @@ return datacatalog.lookup_entry(request={'linked_resource': resource_name}) ### More Details -In `google-cloud-datacatalog<2.0.0`, parameters required by the API were positional parameters and optional parameters were keyword parameters. +In `google-cloud-datacatalog<=1.0.0`, parameters required by the API were positional parameters and optional parameters were keyword parameters. **Before:** ```py @@ -60,7 +60,7 @@ In `google-cloud-datacatalog<2.0.0`, parameters required by the API were positio ): ``` -In the 2.0.0 release, all methods have a single positional parameter `request`. 
Method docstrings indicate whether a parameter is required or optional. +In the 3.0.0 release, all methods have a single positional parameter `request`. Method docstrings indicate whether a parameter is required or optional. Some methods have additional keyword only parameters. The available parameters depend on the `google.api.method_signature` annotation specified by the API producer. @@ -127,24 +127,19 @@ The submodules `enums` and `types` have been removed. **Before:** ```py from google.cloud import datacatalog_v1 -entry = datacatalog_v1beta1.types.Entry() -entry.type = datacatalog_v1beta1.enums.EntryType.FILESET +entry = datacatalog_v1.types.Entry() +entry.type = datacatalog_v1.enums.EntryType.FILESET ``` **After:** ```py from google.cloud import datacatalog_v1 -entry = datacatalog_v1beta1.Entry() -entry.type = datacatalog_v1beta1.EntryType.FILESET +entry = datacatalog_v1.Entry() +entry.type = datacatalog_v1.EntryType.FILESET ``` -## Project Path Helper Methods +## Common Resource Path Helper Methods -The project path helper method `project_path` has been removed. Please construct -this path manually. - -```py -project = 'my-project' -project_path = f'projects/{project}' -``` \ No newline at end of file +The `location_path` method existing in `google-cloud-datacatalog<=1.0.0` was renamed to `common_location_path`. +And more resource path helper methods were added: `common_billing_account_path`, `common_folder_path`, `common_organization_path`, and `common_project_path`. From b2c8892d9e900d8b353ef6f035fc7c0f4f75a974 Mon Sep 17 00:00:00 2001 From: Bu Sun Kim <8822365+busunkim96@users.noreply.github.com> Date: Mon, 7 Dec 2020 16:50:08 -0700 Subject: [PATCH 03/26] chore: require samples checks (#79) Co-authored-by: Steffany Brown <30247553+steffnay@users.noreply.github.com> --- .github/sync-repo-settings.yaml | 13 +++++++++++++ 1 file changed, 13 insertions(+) create mode 100644 .github/sync-repo-settings.yaml diff --git a/.github/sync-repo-settings.yaml b/.github/sync-repo-settings.yaml new file mode 100644 index 00000000..af599353 --- /dev/null +++ b/.github/sync-repo-settings.yaml @@ -0,0 +1,13 @@ +# https://github.com/googleapis/repo-automation-bots/tree/master/packages/sync-repo-settings +# Rules for master branch protection +branchProtectionRules: +# Identifies the protection rule pattern. Name of the branch to be protected. 
+# Defaults to `master` +- pattern: master + requiredStatusCheckContexts: + - 'Kokoro' + - 'cla/google' + - 'Samples - Lint' + - 'Samples - Python 3.6' + - 'Samples - Python 3.7' + - 'Samples - Python 3.8' From 6ca679ec15c6bab726e311b7637d4c8c9c84e508 Mon Sep 17 00:00:00 2001 From: Yoshi Automation Bot Date: Mon, 28 Dec 2020 10:25:01 -0800 Subject: [PATCH 04/26] chore: reorder classes (#82) --- datacatalog-v1-py.tar.gz | 0 google/cloud/datacatalog_v1beta1/__init__.py | 4 +- synth.metadata | 167 ++++++++++++++++++- 3 files changed, 165 insertions(+), 6 deletions(-) create mode 100644 datacatalog-v1-py.tar.gz diff --git a/datacatalog-v1-py.tar.gz b/datacatalog-v1-py.tar.gz new file mode 100644 index 00000000..e69de29b diff --git a/google/cloud/datacatalog_v1beta1/__init__.py b/google/cloud/datacatalog_v1beta1/__init__.py index be0bdd8e..16534418 100644 --- a/google/cloud/datacatalog_v1beta1/__init__.py +++ b/google/cloud/datacatalog_v1beta1/__init__.py @@ -103,7 +103,6 @@ "CreateTagTemplateFieldRequest", "CreateTagTemplateRequest", "CreateTaxonomyRequest", - "DataCatalogClient", "DeleteEntryGroupRequest", "DeleteEntryRequest", "DeletePolicyTagRequest", @@ -141,6 +140,7 @@ "LookupEntryRequest", "PolicyTag", "PolicyTagManagerClient", + "PolicyTagManagerSerializationClient", "RenameTagTemplateFieldRequest", "Schema", "SearchCatalogRequest", @@ -165,5 +165,5 @@ "UpdateTagTemplateRequest", "UpdateTaxonomyRequest", "ViewSpec", - "PolicyTagManagerSerializationClient", + "DataCatalogClient", ) diff --git a/synth.metadata b/synth.metadata index 0f3327ea..01d1ea15 100644 --- a/synth.metadata +++ b/synth.metadata @@ -3,16 +3,16 @@ { "git": { "name": ".", - "remote": "git@github.com:googleapis/python-datacatalog", - "sha": "7f1b8ee4579c4306d9b6a56498a0755803b9eadf" + "remote": "https://github.com/googleapis/python-datacatalog.git", + "sha": "b2c8892d9e900d8b353ef6f035fc7c0f4f75a974" } }, { "git": { "name": "googleapis", "remote": "https://github.com/googleapis/googleapis.git", - "sha": "754a312a0d01cfc1484d397872ff45e5565af0da", - "internalRef": "342758098" + "sha": "53eb2512a55caabcbad1898225080a2a3dfcb6aa", + "internalRef": "346818879" } }, { @@ -49,5 +49,164 @@ "generator": "bazel" } } + ], + "generatedFiles": [ + ".flake8", + ".github/CONTRIBUTING.md", + ".github/ISSUE_TEMPLATE/bug_report.md", + ".github/ISSUE_TEMPLATE/feature_request.md", + ".github/ISSUE_TEMPLATE/support_request.md", + ".github/PULL_REQUEST_TEMPLATE.md", + ".github/release-please.yml", + ".github/snippet-bot.yml", + ".gitignore", + ".kokoro/build.sh", + ".kokoro/continuous/common.cfg", + ".kokoro/continuous/continuous.cfg", + ".kokoro/docker/docs/Dockerfile", + ".kokoro/docker/docs/fetch_gpg_keys.sh", + ".kokoro/docs/common.cfg", + ".kokoro/docs/docs-presubmit.cfg", + ".kokoro/docs/docs.cfg", + ".kokoro/populate-secrets.sh", + ".kokoro/presubmit/common.cfg", + ".kokoro/presubmit/presubmit.cfg", + ".kokoro/publish-docs.sh", + ".kokoro/release.sh", + ".kokoro/release/common.cfg", + ".kokoro/release/release.cfg", + ".kokoro/samples/lint/common.cfg", + ".kokoro/samples/lint/continuous.cfg", + ".kokoro/samples/lint/periodic.cfg", + ".kokoro/samples/lint/presubmit.cfg", + ".kokoro/samples/python3.6/common.cfg", + ".kokoro/samples/python3.6/continuous.cfg", + ".kokoro/samples/python3.6/periodic.cfg", + ".kokoro/samples/python3.6/presubmit.cfg", + ".kokoro/samples/python3.7/common.cfg", + ".kokoro/samples/python3.7/continuous.cfg", + 
".kokoro/samples/python3.7/periodic.cfg", + ".kokoro/samples/python3.7/presubmit.cfg", + ".kokoro/samples/python3.8/common.cfg", + ".kokoro/samples/python3.8/continuous.cfg", + ".kokoro/samples/python3.8/periodic.cfg", + ".kokoro/samples/python3.8/presubmit.cfg", + ".kokoro/test-samples.sh", + ".kokoro/trampoline.sh", + ".kokoro/trampoline_v2.sh", + ".trampolinerc", + "CODE_OF_CONDUCT.md", + "CONTRIBUTING.rst", + "LICENSE", + "MANIFEST.in", + "datacatalog-v1-py.tar.gz", + "docs/_static/custom.css", + "docs/_templates/layout.html", + "docs/conf.py", + "docs/datacatalog_v1/services.rst", + "docs/datacatalog_v1/types.rst", + "docs/datacatalog_v1beta1/services.rst", + "docs/datacatalog_v1beta1/types.rst", + "docs/multiprocessing.rst", + "google/cloud/datacatalog/__init__.py", + "google/cloud/datacatalog/py.typed", + "google/cloud/datacatalog_v1/__init__.py", + "google/cloud/datacatalog_v1/proto/common.proto", + "google/cloud/datacatalog_v1/proto/datacatalog.proto", + "google/cloud/datacatalog_v1/proto/gcs_fileset_spec.proto", + "google/cloud/datacatalog_v1/proto/schema.proto", + "google/cloud/datacatalog_v1/proto/search.proto", + "google/cloud/datacatalog_v1/proto/table_spec.proto", + "google/cloud/datacatalog_v1/proto/tags.proto", + "google/cloud/datacatalog_v1/proto/timestamps.proto", + "google/cloud/datacatalog_v1/py.typed", + "google/cloud/datacatalog_v1/services/__init__.py", + "google/cloud/datacatalog_v1/services/data_catalog/__init__.py", + "google/cloud/datacatalog_v1/services/data_catalog/async_client.py", + "google/cloud/datacatalog_v1/services/data_catalog/client.py", + "google/cloud/datacatalog_v1/services/data_catalog/pagers.py", + "google/cloud/datacatalog_v1/services/data_catalog/transports/__init__.py", + "google/cloud/datacatalog_v1/services/data_catalog/transports/base.py", + "google/cloud/datacatalog_v1/services/data_catalog/transports/grpc.py", + "google/cloud/datacatalog_v1/services/data_catalog/transports/grpc_asyncio.py", + "google/cloud/datacatalog_v1/types/__init__.py", + "google/cloud/datacatalog_v1/types/common.py", + "google/cloud/datacatalog_v1/types/datacatalog.py", + "google/cloud/datacatalog_v1/types/gcs_fileset_spec.py", + "google/cloud/datacatalog_v1/types/schema.py", + "google/cloud/datacatalog_v1/types/search.py", + "google/cloud/datacatalog_v1/types/table_spec.py", + "google/cloud/datacatalog_v1/types/tags.py", + "google/cloud/datacatalog_v1/types/timestamps.py", + "google/cloud/datacatalog_v1beta1/__init__.py", + "google/cloud/datacatalog_v1beta1/proto/common.proto", + "google/cloud/datacatalog_v1beta1/proto/datacatalog.proto", + "google/cloud/datacatalog_v1beta1/proto/gcs_fileset_spec.proto", + "google/cloud/datacatalog_v1beta1/proto/policytagmanager.proto", + "google/cloud/datacatalog_v1beta1/proto/policytagmanagerserialization.proto", + "google/cloud/datacatalog_v1beta1/proto/schema.proto", + "google/cloud/datacatalog_v1beta1/proto/search.proto", + "google/cloud/datacatalog_v1beta1/proto/table_spec.proto", + "google/cloud/datacatalog_v1beta1/proto/tags.proto", + "google/cloud/datacatalog_v1beta1/proto/timestamps.proto", + "google/cloud/datacatalog_v1beta1/py.typed", + "google/cloud/datacatalog_v1beta1/services/__init__.py", + "google/cloud/datacatalog_v1beta1/services/data_catalog/__init__.py", + "google/cloud/datacatalog_v1beta1/services/data_catalog/async_client.py", + "google/cloud/datacatalog_v1beta1/services/data_catalog/client.py", + "google/cloud/datacatalog_v1beta1/services/data_catalog/pagers.py", + 
"google/cloud/datacatalog_v1beta1/services/data_catalog/transports/__init__.py", + "google/cloud/datacatalog_v1beta1/services/data_catalog/transports/base.py", + "google/cloud/datacatalog_v1beta1/services/data_catalog/transports/grpc.py", + "google/cloud/datacatalog_v1beta1/services/data_catalog/transports/grpc_asyncio.py", + "google/cloud/datacatalog_v1beta1/services/policy_tag_manager/__init__.py", + "google/cloud/datacatalog_v1beta1/services/policy_tag_manager/async_client.py", + "google/cloud/datacatalog_v1beta1/services/policy_tag_manager/client.py", + "google/cloud/datacatalog_v1beta1/services/policy_tag_manager/pagers.py", + "google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/__init__.py", + "google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/base.py", + "google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/grpc.py", + "google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/grpc_asyncio.py", + "google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/__init__.py", + "google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/async_client.py", + "google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/client.py", + "google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/__init__.py", + "google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/base.py", + "google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/grpc.py", + "google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/grpc_asyncio.py", + "google/cloud/datacatalog_v1beta1/types/__init__.py", + "google/cloud/datacatalog_v1beta1/types/common.py", + "google/cloud/datacatalog_v1beta1/types/datacatalog.py", + "google/cloud/datacatalog_v1beta1/types/gcs_fileset_spec.py", + "google/cloud/datacatalog_v1beta1/types/policytagmanager.py", + "google/cloud/datacatalog_v1beta1/types/policytagmanagerserialization.py", + "google/cloud/datacatalog_v1beta1/types/schema.py", + "google/cloud/datacatalog_v1beta1/types/search.py", + "google/cloud/datacatalog_v1beta1/types/table_spec.py", + "google/cloud/datacatalog_v1beta1/types/tags.py", + "google/cloud/datacatalog_v1beta1/types/timestamps.py", + "mypy.ini", + "noxfile.py", + "renovate.json", + "samples/AUTHORING_GUIDE.md", + "samples/CONTRIBUTING.md", + "samples/snippets/noxfile.py", + "scripts/decrypt-secrets.sh", + "scripts/fixup_datacatalog_v1_keywords.py", + "scripts/fixup_datacatalog_v1beta1_keywords.py", + "scripts/readme-gen/readme_gen.py", + "scripts/readme-gen/templates/README.tmpl.rst", + "scripts/readme-gen/templates/auth.tmpl.rst", + "scripts/readme-gen/templates/auth_api_key.tmpl.rst", + "scripts/readme-gen/templates/install_deps.tmpl.rst", + "scripts/readme-gen/templates/install_portaudio.tmpl.rst", + "setup.cfg", + "testing/.gitignore", + "tests/unit/gapic/datacatalog_v1/__init__.py", + "tests/unit/gapic/datacatalog_v1/test_data_catalog.py", + "tests/unit/gapic/datacatalog_v1beta1/__init__.py", + "tests/unit/gapic/datacatalog_v1beta1/test_data_catalog.py", + "tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager.py", + "tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager_serialization.py" ] } \ No newline at end of file From 7e91e0fbe1c5cab4d7671c7a99b29ab198909c9e Mon Sep 17 00:00:00 2001 From: Yoshi Automation Bot Date: Tue, 29 Dec 2020 09:24:11 -0800 Subject: [PATCH 05/26] chore: update templates (#83) --- 
.coveragerc | 2 +- .pre-commit-config.yaml | 17 +++++++++++++++++ CONTRIBUTING.rst | 10 ++++++++++ docs/conf.py | 6 +++--- noxfile.py | 5 ++--- samples/snippets/noxfile.py | 17 +++++++++-------- synth.metadata | 5 +++-- 7 files changed, 45 insertions(+), 17 deletions(-) create mode 100644 .pre-commit-config.yaml diff --git a/.coveragerc b/.coveragerc index 7b592626..c0c0ad46 100644 --- a/.coveragerc +++ b/.coveragerc @@ -21,7 +21,7 @@ branch = True [report] fail_under = 100 show_missing = True -omit = google/cloud/datacatalog/__init.py +omit = google/cloud/datacatalog/__init__.py exclude_lines = # Re-enable the standard pragma pragma: NO COVER diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml new file mode 100644 index 00000000..6ad83346 --- /dev/null +++ b/.pre-commit-config.yaml @@ -0,0 +1,17 @@ +# See https://pre-commit.com for more information +# See https://pre-commit.com/hooks.html for more hooks +repos: +- repo: https://github.com/pre-commit/pre-commit-hooks + rev: v3.3.0 + hooks: + - id: trailing-whitespace + - id: end-of-file-fixer + - id: check-yaml +- repo: https://github.com/psf/black + rev: 19.10b0 + hooks: + - id: black +- repo: https://gitlab.com/pycqa/flake8 + rev: 3.8.4 + hooks: + - id: flake8 diff --git a/CONTRIBUTING.rst b/CONTRIBUTING.rst index cdd8c7f3..986e442b 100644 --- a/CONTRIBUTING.rst +++ b/CONTRIBUTING.rst @@ -111,6 +111,16 @@ Coding Style should point to the official ``googleapis`` checkout and the the branch should be the main branch on that remote (``master``). +- This repository contains configuration for the + `pre-commit `__ tool, which automates checking + our linters during a commit. If you have it installed on your ``$PATH``, + you can enable enforcing those checks via: + +.. code-block:: bash + + $ pre-commit install + pre-commit installed at .git/hooks/pre-commit + Exceptions to PEP8: - Many unit tests use a helper method, ``_call_fut`` ("FUT" is short for diff --git a/docs/conf.py b/docs/conf.py index 01be52f5..dba56c04 100644 --- a/docs/conf.py +++ b/docs/conf.py @@ -345,10 +345,10 @@ # Example configuration for intersphinx: refer to the Python standard library. 
intersphinx_mapping = { - "python": ("http://python.readthedocs.org/en/latest/", None), - "google-auth": ("https://google-auth.readthedocs.io/en/stable", None), + "python": ("https://python.readthedocs.org/en/latest/", None), + "google-auth": ("https://googleapis.dev/python/google-auth/latest/", None), "google.api_core": ("https://googleapis.dev/python/google-api-core/latest/", None,), - "grpc": ("https://grpc.io/grpc/python/", None), + "grpc": ("https://grpc.github.io/grpc/python/", None), "proto-plus": ("https://proto-plus-python.readthedocs.io/en/latest/", None), } diff --git a/noxfile.py b/noxfile.py index 5ba11445..5543011c 100644 --- a/noxfile.py +++ b/noxfile.py @@ -81,9 +81,8 @@ def default(session): session.run( "py.test", "--quiet", - "--cov=google.cloud.datacatalog ", - "--cov=google.cloud", - "--cov=tests.unit", + "--cov=google/cloud", + "--cov=tests/unit", "--cov-append", "--cov-config=.coveragerc", "--cov-report=", diff --git a/samples/snippets/noxfile.py b/samples/snippets/noxfile.py index b90eef00..bca0522e 100644 --- a/samples/snippets/noxfile.py +++ b/samples/snippets/noxfile.py @@ -17,6 +17,7 @@ import os from pathlib import Path import sys +from typing import Callable, Dict, List, Optional import nox @@ -68,7 +69,7 @@ TEST_CONFIG.update(TEST_CONFIG_OVERRIDE) -def get_pytest_env_vars(): +def get_pytest_env_vars() -> Dict[str, str]: """Returns a dict for pytest invocation.""" ret = {} @@ -97,7 +98,7 @@ def get_pytest_env_vars(): # -def _determine_local_import_names(start_dir): +def _determine_local_import_names(start_dir: str) -> List[str]: """Determines all import names that should be considered "local". This is used when running the linter to insure that import order is @@ -135,7 +136,7 @@ def _determine_local_import_names(start_dir): @nox.session -def lint(session): +def lint(session: nox.sessions.Session) -> None: if not TEST_CONFIG['enforce_type_hints']: session.install("flake8", "flake8-import-order") else: @@ -154,7 +155,7 @@ def lint(session): @nox.session -def blacken(session): +def blacken(session: nox.sessions.Session) -> None: session.install("black") python_files = [path for path in os.listdir(".") if path.endswith(".py")] @@ -168,7 +169,7 @@ def blacken(session): PYTEST_COMMON_ARGS = ["--junitxml=sponge_log.xml"] -def _session_tests(session, post_install=None): +def _session_tests(session: nox.sessions.Session, post_install: Callable = None) -> None: """Runs py.test for a particular project.""" if os.path.exists("requirements.txt"): session.install("-r", "requirements.txt") @@ -194,7 +195,7 @@ def _session_tests(session, post_install=None): @nox.session(python=ALL_VERSIONS) -def py(session): +def py(session: nox.sessions.Session) -> None: """Runs py.test for a sample using the specified version of Python.""" if session.python in TESTED_VERSIONS: _session_tests(session) @@ -209,7 +210,7 @@ def py(session): # -def _get_repo_root(): +def _get_repo_root() -> Optional[str]: """ Returns the root folder of the project. """ # Get root of this repository. Assume we don't have directories nested deeper than 10 items. 
p = Path(os.getcwd()) @@ -232,7 +233,7 @@ def _get_repo_root(): @nox.session @nox.parametrize("path", GENERATED_READMES) -def readmegen(session, path): +def readmegen(session: nox.sessions.Session, path: str) -> None: """(Re-)generates the readme for a sample.""" session.install("jinja2", "pyyaml") dir_ = os.path.dirname(path) diff --git a/synth.metadata b/synth.metadata index 01d1ea15..215d15d3 100644 --- a/synth.metadata +++ b/synth.metadata @@ -19,14 +19,14 @@ "git": { "name": "synthtool", "remote": "https://github.com/googleapis/synthtool.git", - "sha": "d5fc0bcf9ea9789c5b0e3154a9e3b29e5cea6116" + "sha": "18c5dbdb4ac8cf75d4d8174e7b4558f48e76f8a1" } }, { "git": { "name": "synthtool", "remote": "https://github.com/googleapis/synthtool.git", - "sha": "d5fc0bcf9ea9789c5b0e3154a9e3b29e5cea6116" + "sha": "18c5dbdb4ac8cf75d4d8174e7b4558f48e76f8a1" } } ], @@ -94,6 +94,7 @@ ".kokoro/test-samples.sh", ".kokoro/trampoline.sh", ".kokoro/trampoline_v2.sh", + ".pre-commit-config.yaml", ".trampolinerc", "CODE_OF_CONDUCT.md", "CONTRIBUTING.rst", From e0c40c765242868570532b5074fd239aa2c259e9 Mon Sep 17 00:00:00 2001 From: Yoshi Automation Bot Date: Tue, 29 Dec 2020 09:25:15 -0800 Subject: [PATCH 06/26] fix: remove gRPC send/recv limit; add enums to `types/__init__.py` (#87) --- datacatalog-v1-py.tar.gz | 0 .../services/data_catalog/transports/__init__.py | 1 - .../services/data_catalog/transports/grpc.py | 10 +++++++++- .../services/data_catalog/transports/grpc_asyncio.py | 8 ++++++++ google/cloud/datacatalog_v1/types/__init__.py | 12 ++++++++++-- google/cloud/datacatalog_v1beta1/__init__.py | 4 ++-- .../services/data_catalog/transports/__init__.py | 1 - .../services/data_catalog/transports/grpc.py | 10 +++++++++- .../services/data_catalog/transports/grpc_asyncio.py | 8 ++++++++ .../policy_tag_manager/transports/__init__.py | 1 - .../services/policy_tag_manager/transports/grpc.py | 10 +++++++++- .../policy_tag_manager/transports/grpc_asyncio.py | 8 ++++++++ .../transports/__init__.py | 1 - .../transports/grpc.py | 10 +++++++++- .../transports/grpc_asyncio.py | 8 ++++++++ google/cloud/datacatalog_v1beta1/types/__init__.py | 12 ++++++++++-- synth.metadata | 7 +++---- tests/unit/gapic/datacatalog_v1/test_data_catalog.py | 8 ++++++++ .../gapic/datacatalog_v1beta1/test_data_catalog.py | 8 ++++++++ .../datacatalog_v1beta1/test_policy_tag_manager.py | 8 ++++++++ .../test_policy_tag_manager_serialization.py | 8 ++++++++ 21 files changed, 125 insertions(+), 18 deletions(-) delete mode 100644 datacatalog-v1-py.tar.gz diff --git a/datacatalog-v1-py.tar.gz b/datacatalog-v1-py.tar.gz deleted file mode 100644 index e69de29b..00000000 diff --git a/google/cloud/datacatalog_v1/services/data_catalog/transports/__init__.py b/google/cloud/datacatalog_v1/services/data_catalog/transports/__init__.py index 77a41a96..f3f1cf12 100644 --- a/google/cloud/datacatalog_v1/services/data_catalog/transports/__init__.py +++ b/google/cloud/datacatalog_v1/services/data_catalog/transports/__init__.py @@ -28,7 +28,6 @@ _transport_registry["grpc"] = DataCatalogGrpcTransport _transport_registry["grpc_asyncio"] = DataCatalogGrpcAsyncIOTransport - __all__ = ( "DataCatalogTransport", "DataCatalogGrpcTransport", diff --git a/google/cloud/datacatalog_v1/services/data_catalog/transports/grpc.py b/google/cloud/datacatalog_v1/services/data_catalog/transports/grpc.py index 9150d2ac..b5b4d6c6 100644 --- 
a/google/cloud/datacatalog_v1/services/data_catalog/transports/grpc.py +++ b/google/cloud/datacatalog_v1/services/data_catalog/transports/grpc.py @@ -151,6 +151,10 @@ def __init__( ssl_credentials=ssl_credentials, scopes=scopes or self.AUTH_SCOPES, quota_project_id=quota_project_id, + options=[ + ("grpc.max_send_message_length", -1), + ("grpc.max_receive_message_length", -1), + ], ) self._ssl_channel_credentials = ssl_credentials else: @@ -169,6 +173,10 @@ def __init__( ssl_credentials=ssl_channel_credentials, scopes=scopes or self.AUTH_SCOPES, quota_project_id=quota_project_id, + options=[ + ("grpc.max_send_message_length", -1), + ("grpc.max_receive_message_length", -1), + ], ) self._stubs = {} # type: Dict[str, Callable] @@ -195,7 +203,7 @@ def create_channel( ) -> grpc.Channel: """Create and return a gRPC channel object. Args: - address (Optionsl[str]): The host for the channel to use. + address (Optional[str]): The host for the channel to use. credentials (Optional[~.Credentials]): The authorization credentials to attach to requests. These credentials identify this application to the service. If diff --git a/google/cloud/datacatalog_v1/services/data_catalog/transports/grpc_asyncio.py b/google/cloud/datacatalog_v1/services/data_catalog/transports/grpc_asyncio.py index 49b84e3a..2a25f5b8 100644 --- a/google/cloud/datacatalog_v1/services/data_catalog/transports/grpc_asyncio.py +++ b/google/cloud/datacatalog_v1/services/data_catalog/transports/grpc_asyncio.py @@ -196,6 +196,10 @@ def __init__( ssl_credentials=ssl_credentials, scopes=scopes or self.AUTH_SCOPES, quota_project_id=quota_project_id, + options=[ + ("grpc.max_send_message_length", -1), + ("grpc.max_receive_message_length", -1), + ], ) self._ssl_channel_credentials = ssl_credentials else: @@ -214,6 +218,10 @@ def __init__( ssl_credentials=ssl_channel_credentials, scopes=scopes or self.AUTH_SCOPES, quota_project_id=quota_project_id, + options=[ + ("grpc.max_send_message_length", -1), + ("grpc.max_receive_message_length", -1), + ], ) # Run the base constructor. 
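
For context on the transport change above: the `options` pairs added to `create_channel` configure gRPC's send and receive message-size limits, and a value of `-1` means unlimited. Below is a minimal, illustrative sketch of the same idea with the plain `grpc` package — it is not part of this patch, and the channel target is a hypothetical placeholder:

```py
import grpc

# Passing -1 for these channel options removes gRPC's message-size limits
# on outgoing and incoming messages for this channel.
channel = grpc.insecure_channel(
    "localhost:50051",  # hypothetical target, used only for illustration
    options=[
        ("grpc.max_send_message_length", -1),
        ("grpc.max_receive_message_length", -1),
    ],
)
```
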
diff --git a/google/cloud/datacatalog_v1/types/__init__.py b/google/cloud/datacatalog_v1/types/__init__.py index b3bff88c..de273a2c 100644 --- a/google/cloud/datacatalog_v1/types/__init__.py +++ b/google/cloud/datacatalog_v1/types/__init__.py @@ -24,12 +24,16 @@ Schema, ColumnSchema, ) -from .search import SearchCatalogResult +from .search import ( + SearchCatalogResult, + SearchResultType, +) from .table_spec import ( BigQueryTableSpec, ViewSpec, TableSpec, BigQueryDateShardedSpec, + TableSourceType, ) from .tags import ( Tag, @@ -69,20 +73,23 @@ ListTagsResponse, ListEntriesRequest, ListEntriesResponse, + EntryType, ) - __all__ = ( + "IntegratedSystem", "SystemTimestamps", "GcsFilesetSpec", "GcsFileSpec", "Schema", "ColumnSchema", "SearchCatalogResult", + "SearchResultType", "BigQueryTableSpec", "ViewSpec", "TableSpec", "BigQueryDateShardedSpec", + "TableSourceType", "Tag", "TagField", "TagTemplate", @@ -118,4 +125,5 @@ "ListTagsResponse", "ListEntriesRequest", "ListEntriesResponse", + "EntryType", ) diff --git a/google/cloud/datacatalog_v1beta1/__init__.py b/google/cloud/datacatalog_v1beta1/__init__.py index 16534418..8bc01583 100644 --- a/google/cloud/datacatalog_v1beta1/__init__.py +++ b/google/cloud/datacatalog_v1beta1/__init__.py @@ -103,6 +103,7 @@ "CreateTagTemplateFieldRequest", "CreateTagTemplateRequest", "CreateTaxonomyRequest", + "DataCatalogClient", "DeleteEntryGroupRequest", "DeleteEntryRequest", "DeletePolicyTagRequest", @@ -139,7 +140,6 @@ "ListTaxonomiesResponse", "LookupEntryRequest", "PolicyTag", - "PolicyTagManagerClient", "PolicyTagManagerSerializationClient", "RenameTagTemplateFieldRequest", "Schema", @@ -165,5 +165,5 @@ "UpdateTagTemplateRequest", "UpdateTaxonomyRequest", "ViewSpec", - "DataCatalogClient", + "PolicyTagManagerClient", ) diff --git a/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/__init__.py b/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/__init__.py index 77a41a96..f3f1cf12 100644 --- a/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/__init__.py +++ b/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/__init__.py @@ -28,7 +28,6 @@ _transport_registry["grpc"] = DataCatalogGrpcTransport _transport_registry["grpc_asyncio"] = DataCatalogGrpcAsyncIOTransport - __all__ = ( "DataCatalogTransport", "DataCatalogGrpcTransport", diff --git a/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/grpc.py b/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/grpc.py index 4a34e3f9..e4fd43f0 100644 --- a/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/grpc.py +++ b/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/grpc.py @@ -151,6 +151,10 @@ def __init__( ssl_credentials=ssl_credentials, scopes=scopes or self.AUTH_SCOPES, quota_project_id=quota_project_id, + options=[ + ("grpc.max_send_message_length", -1), + ("grpc.max_receive_message_length", -1), + ], ) self._ssl_channel_credentials = ssl_credentials else: @@ -169,6 +173,10 @@ def __init__( ssl_credentials=ssl_channel_credentials, scopes=scopes or self.AUTH_SCOPES, quota_project_id=quota_project_id, + options=[ + ("grpc.max_send_message_length", -1), + ("grpc.max_receive_message_length", -1), + ], ) self._stubs = {} # type: Dict[str, Callable] @@ -195,7 +203,7 @@ def create_channel( ) -> grpc.Channel: """Create and return a gRPC channel object. Args: - address (Optionsl[str]): The host for the channel to use. + address (Optional[str]): The host for the channel to use. 
credentials (Optional[~.Credentials]): The authorization credentials to attach to requests. These credentials identify this application to the service. If diff --git a/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/grpc_asyncio.py b/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/grpc_asyncio.py index b8670aa2..11229337 100644 --- a/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/grpc_asyncio.py +++ b/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/grpc_asyncio.py @@ -196,6 +196,10 @@ def __init__( ssl_credentials=ssl_credentials, scopes=scopes or self.AUTH_SCOPES, quota_project_id=quota_project_id, + options=[ + ("grpc.max_send_message_length", -1), + ("grpc.max_receive_message_length", -1), + ], ) self._ssl_channel_credentials = ssl_credentials else: @@ -214,6 +218,10 @@ def __init__( ssl_credentials=ssl_channel_credentials, scopes=scopes or self.AUTH_SCOPES, quota_project_id=quota_project_id, + options=[ + ("grpc.max_send_message_length", -1), + ("grpc.max_receive_message_length", -1), + ], ) # Run the base constructor. diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/__init__.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/__init__.py index 1a518753..95f18c5c 100644 --- a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/__init__.py +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/__init__.py @@ -28,7 +28,6 @@ _transport_registry["grpc"] = PolicyTagManagerGrpcTransport _transport_registry["grpc_asyncio"] = PolicyTagManagerGrpcAsyncIOTransport - __all__ = ( "PolicyTagManagerTransport", "PolicyTagManagerGrpcTransport", diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/grpc.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/grpc.py index ee1e1daa..8d316d4b 100644 --- a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/grpc.py +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/grpc.py @@ -150,6 +150,10 @@ def __init__( ssl_credentials=ssl_credentials, scopes=scopes or self.AUTH_SCOPES, quota_project_id=quota_project_id, + options=[ + ("grpc.max_send_message_length", -1), + ("grpc.max_receive_message_length", -1), + ], ) self._ssl_channel_credentials = ssl_credentials else: @@ -168,6 +172,10 @@ def __init__( ssl_credentials=ssl_channel_credentials, scopes=scopes or self.AUTH_SCOPES, quota_project_id=quota_project_id, + options=[ + ("grpc.max_send_message_length", -1), + ("grpc.max_receive_message_length", -1), + ], ) self._stubs = {} # type: Dict[str, Callable] @@ -194,7 +202,7 @@ def create_channel( ) -> grpc.Channel: """Create and return a gRPC channel object. Args: - address (Optionsl[str]): The host for the channel to use. + address (Optional[str]): The host for the channel to use. credentials (Optional[~.Credentials]): The authorization credentials to attach to requests. These credentials identify this application to the service. 
If diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/grpc_asyncio.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/grpc_asyncio.py index 71d83118..eef5872a 100644 --- a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/grpc_asyncio.py +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/grpc_asyncio.py @@ -195,6 +195,10 @@ def __init__( ssl_credentials=ssl_credentials, scopes=scopes or self.AUTH_SCOPES, quota_project_id=quota_project_id, + options=[ + ("grpc.max_send_message_length", -1), + ("grpc.max_receive_message_length", -1), + ], ) self._ssl_channel_credentials = ssl_credentials else: @@ -213,6 +217,10 @@ def __init__( ssl_credentials=ssl_channel_credentials, scopes=scopes or self.AUTH_SCOPES, quota_project_id=quota_project_id, + options=[ + ("grpc.max_send_message_length", -1), + ("grpc.max_receive_message_length", -1), + ], ) # Run the base constructor. diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/__init__.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/__init__.py index 9e8babd0..1e108bd2 100644 --- a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/__init__.py +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/__init__.py @@ -30,7 +30,6 @@ _transport_registry["grpc"] = PolicyTagManagerSerializationGrpcTransport _transport_registry["grpc_asyncio"] = PolicyTagManagerSerializationGrpcAsyncIOTransport - __all__ = ( "PolicyTagManagerSerializationTransport", "PolicyTagManagerSerializationGrpcTransport", diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/grpc.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/grpc.py index 84f435e9..943dcf5e 100644 --- a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/grpc.py +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/grpc.py @@ -150,6 +150,10 @@ def __init__( ssl_credentials=ssl_credentials, scopes=scopes or self.AUTH_SCOPES, quota_project_id=quota_project_id, + options=[ + ("grpc.max_send_message_length", -1), + ("grpc.max_receive_message_length", -1), + ], ) self._ssl_channel_credentials = ssl_credentials else: @@ -168,6 +172,10 @@ def __init__( ssl_credentials=ssl_channel_credentials, scopes=scopes or self.AUTH_SCOPES, quota_project_id=quota_project_id, + options=[ + ("grpc.max_send_message_length", -1), + ("grpc.max_receive_message_length", -1), + ], ) self._stubs = {} # type: Dict[str, Callable] @@ -194,7 +202,7 @@ def create_channel( ) -> grpc.Channel: """Create and return a gRPC channel object. Args: - address (Optionsl[str]): The host for the channel to use. + address (Optional[str]): The host for the channel to use. credentials (Optional[~.Credentials]): The authorization credentials to attach to requests. These credentials identify this application to the service. 
If diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/grpc_asyncio.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/grpc_asyncio.py index a93a8572..7d51d774 100644 --- a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/grpc_asyncio.py +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/grpc_asyncio.py @@ -195,6 +195,10 @@ def __init__( ssl_credentials=ssl_credentials, scopes=scopes or self.AUTH_SCOPES, quota_project_id=quota_project_id, + options=[ + ("grpc.max_send_message_length", -1), + ("grpc.max_receive_message_length", -1), + ], ) self._ssl_channel_credentials = ssl_credentials else: @@ -213,6 +217,10 @@ def __init__( ssl_credentials=ssl_channel_credentials, scopes=scopes or self.AUTH_SCOPES, quota_project_id=quota_project_id, + options=[ + ("grpc.max_send_message_length", -1), + ("grpc.max_receive_message_length", -1), + ], ) # Run the base constructor. diff --git a/google/cloud/datacatalog_v1beta1/types/__init__.py b/google/cloud/datacatalog_v1beta1/types/__init__.py index 8a5c9ee7..253122a9 100644 --- a/google/cloud/datacatalog_v1beta1/types/__init__.py +++ b/google/cloud/datacatalog_v1beta1/types/__init__.py @@ -24,12 +24,16 @@ Schema, ColumnSchema, ) -from .search import SearchCatalogResult +from .search import ( + SearchCatalogResult, + SearchResultType, +) from .table_spec import ( BigQueryTableSpec, ViewSpec, TableSpec, BigQueryDateShardedSpec, + TableSourceType, ) from .tags import ( Tag, @@ -69,6 +73,7 @@ ListTagsResponse, ListEntriesRequest, ListEntriesResponse, + EntryType, ) from .policytagmanager import ( Taxonomy, @@ -96,18 +101,20 @@ ExportTaxonomiesResponse, ) - __all__ = ( + "IntegratedSystem", "SystemTimestamps", "GcsFilesetSpec", "GcsFileSpec", "Schema", "ColumnSchema", "SearchCatalogResult", + "SearchResultType", "BigQueryTableSpec", "ViewSpec", "TableSpec", "BigQueryDateShardedSpec", + "TableSourceType", "Tag", "TagField", "TagTemplate", @@ -143,6 +150,7 @@ "ListTagsResponse", "ListEntriesRequest", "ListEntriesResponse", + "EntryType", "Taxonomy", "PolicyTag", "CreateTaxonomyRequest", diff --git a/synth.metadata b/synth.metadata index 215d15d3..93d5c202 100644 --- a/synth.metadata +++ b/synth.metadata @@ -4,15 +4,15 @@ "git": { "name": ".", "remote": "https://github.com/googleapis/python-datacatalog.git", - "sha": "b2c8892d9e900d8b353ef6f035fc7c0f4f75a974" + "sha": "6ca679ec15c6bab726e311b7637d4c8c9c84e508" } }, { "git": { "name": "googleapis", "remote": "https://github.com/googleapis/googleapis.git", - "sha": "53eb2512a55caabcbad1898225080a2a3dfcb6aa", - "internalRef": "346818879" + "sha": "dd372aa22ded7a8ba6f0e03a80e06358a3fa0907", + "internalRef": "347055288" } }, { @@ -100,7 +100,6 @@ "CONTRIBUTING.rst", "LICENSE", "MANIFEST.in", - "datacatalog-v1-py.tar.gz", "docs/_static/custom.css", "docs/_templates/layout.html", "docs/conf.py", diff --git a/tests/unit/gapic/datacatalog_v1/test_data_catalog.py b/tests/unit/gapic/datacatalog_v1/test_data_catalog.py index 7851ae04..ba76e5b9 100644 --- a/tests/unit/gapic/datacatalog_v1/test_data_catalog.py +++ b/tests/unit/gapic/datacatalog_v1/test_data_catalog.py @@ -6797,6 +6797,10 @@ def test_data_catalog_transport_channel_mtls_with_client_cert_source(transport_c 
scopes=("https://www.googleapis.com/auth/cloud-platform",), ssl_credentials=mock_ssl_cred, quota_project_id=None, + options=[ + ("grpc.max_send_message_length", -1), + ("grpc.max_receive_message_length", -1), + ], ) assert transport.grpc_channel == mock_grpc_channel assert transport._ssl_channel_credentials == mock_ssl_cred @@ -6835,6 +6839,10 @@ def test_data_catalog_transport_channel_mtls_with_adc(transport_class): scopes=("https://www.googleapis.com/auth/cloud-platform",), ssl_credentials=mock_ssl_cred, quota_project_id=None, + options=[ + ("grpc.max_send_message_length", -1), + ("grpc.max_receive_message_length", -1), + ], ) assert transport.grpc_channel == mock_grpc_channel diff --git a/tests/unit/gapic/datacatalog_v1beta1/test_data_catalog.py b/tests/unit/gapic/datacatalog_v1beta1/test_data_catalog.py index 57088c0c..e1b71ede 100644 --- a/tests/unit/gapic/datacatalog_v1beta1/test_data_catalog.py +++ b/tests/unit/gapic/datacatalog_v1beta1/test_data_catalog.py @@ -6792,6 +6792,10 @@ def test_data_catalog_transport_channel_mtls_with_client_cert_source(transport_c scopes=("https://www.googleapis.com/auth/cloud-platform",), ssl_credentials=mock_ssl_cred, quota_project_id=None, + options=[ + ("grpc.max_send_message_length", -1), + ("grpc.max_receive_message_length", -1), + ], ) assert transport.grpc_channel == mock_grpc_channel assert transport._ssl_channel_credentials == mock_ssl_cred @@ -6830,6 +6834,10 @@ def test_data_catalog_transport_channel_mtls_with_adc(transport_class): scopes=("https://www.googleapis.com/auth/cloud-platform",), ssl_credentials=mock_ssl_cred, quota_project_id=None, + options=[ + ("grpc.max_send_message_length", -1), + ("grpc.max_receive_message_length", -1), + ], ) assert transport.grpc_channel == mock_grpc_channel diff --git a/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager.py b/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager.py index fc201fe0..98b5c966 100644 --- a/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager.py +++ b/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager.py @@ -3676,6 +3676,10 @@ def test_policy_tag_manager_transport_channel_mtls_with_client_cert_source( scopes=("https://www.googleapis.com/auth/cloud-platform",), ssl_credentials=mock_ssl_cred, quota_project_id=None, + options=[ + ("grpc.max_send_message_length", -1), + ("grpc.max_receive_message_length", -1), + ], ) assert transport.grpc_channel == mock_grpc_channel assert transport._ssl_channel_credentials == mock_ssl_cred @@ -3717,6 +3721,10 @@ def test_policy_tag_manager_transport_channel_mtls_with_adc(transport_class): scopes=("https://www.googleapis.com/auth/cloud-platform",), ssl_credentials=mock_ssl_cred, quota_project_id=None, + options=[ + ("grpc.max_send_message_length", -1), + ("grpc.max_receive_message_length", -1), + ], ) assert transport.grpc_channel == mock_grpc_channel diff --git a/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager_serialization.py b/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager_serialization.py index cb5be9a9..a3c3540e 100644 --- a/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager_serialization.py +++ b/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager_serialization.py @@ -996,6 
+996,10 @@ def test_policy_tag_manager_serialization_transport_channel_mtls_with_client_cer scopes=("https://www.googleapis.com/auth/cloud-platform",), ssl_credentials=mock_ssl_cred, quota_project_id=None, + options=[ + ("grpc.max_send_message_length", -1), + ("grpc.max_receive_message_length", -1), + ], ) assert transport.grpc_channel == mock_grpc_channel assert transport._ssl_channel_credentials == mock_ssl_cred @@ -1039,6 +1043,10 @@ def test_policy_tag_manager_serialization_transport_channel_mtls_with_adc( scopes=("https://www.googleapis.com/auth/cloud-platform",), ssl_credentials=mock_ssl_cred, quota_project_id=None, + options=[ + ("grpc.max_send_message_length", -1), + ("grpc.max_receive_message_length", -1), + ], ) assert transport.grpc_channel == mock_grpc_channel From e727ece2fbb5ce5aae8bbda7bdd9d1b6864578aa Mon Sep 17 00:00:00 2001 From: Yoshi Automation Bot Date: Tue, 29 Dec 2020 09:25:44 -0800 Subject: [PATCH 07/26] chore: reorder classes (#86) From 22a5523cefdbffaa31f495864adbb06250eb224d Mon Sep 17 00:00:00 2001 From: Yoshi Automation Bot Date: Wed, 30 Dec 2020 08:02:17 -0800 Subject: [PATCH 08/26] chore: reorder classes (#89) --- google/cloud/datacatalog_v1beta1/__init__.py | 4 ++-- synth.metadata | 2 +- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/google/cloud/datacatalog_v1beta1/__init__.py b/google/cloud/datacatalog_v1beta1/__init__.py index 8bc01583..16534418 100644 --- a/google/cloud/datacatalog_v1beta1/__init__.py +++ b/google/cloud/datacatalog_v1beta1/__init__.py @@ -103,7 +103,6 @@ "CreateTagTemplateFieldRequest", "CreateTagTemplateRequest", "CreateTaxonomyRequest", - "DataCatalogClient", "DeleteEntryGroupRequest", "DeleteEntryRequest", "DeletePolicyTagRequest", @@ -140,6 +139,7 @@ "ListTaxonomiesResponse", "LookupEntryRequest", "PolicyTag", + "PolicyTagManagerClient", "PolicyTagManagerSerializationClient", "RenameTagTemplateFieldRequest", "Schema", @@ -165,5 +165,5 @@ "UpdateTagTemplateRequest", "UpdateTaxonomyRequest", "ViewSpec", - "PolicyTagManagerClient", + "DataCatalogClient", ) diff --git a/synth.metadata b/synth.metadata index 93d5c202..3dfe2d76 100644 --- a/synth.metadata +++ b/synth.metadata @@ -4,7 +4,7 @@ "git": { "name": ".", "remote": "https://github.com/googleapis/python-datacatalog.git", - "sha": "6ca679ec15c6bab726e311b7637d4c8c9c84e508" + "sha": "e727ece2fbb5ce5aae8bbda7bdd9d1b6864578aa" } }, { From fda528a1da2ec1dbf6b3ad33eb2d33780a77f3d9 Mon Sep 17 00:00:00 2001 From: Yoshi Automation Bot Date: Wed, 30 Dec 2020 08:03:03 -0800 Subject: [PATCH 09/26] chore: exclude `.nox` from linting; update pre-commit; update supported python versions (#90) --- .flake8 | 1 + .pre-commit-config.yaml | 2 +- CONTRIBUTING.rst | 11 +++++------ synth.metadata | 4 ++-- 4 files changed, 9 insertions(+), 9 deletions(-) diff --git a/.flake8 b/.flake8 index ed931638..29227d4c 100644 --- a/.flake8 +++ b/.flake8 @@ -26,6 +26,7 @@ exclude = *_pb2.py # Standard linting exemptions. 
+ **/.nox/** __pycache__, .git, *.pyc, diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml index 6ad83346..a9024b15 100644 --- a/.pre-commit-config.yaml +++ b/.pre-commit-config.yaml @@ -2,7 +2,7 @@ # See https://pre-commit.com/hooks.html for more hooks repos: - repo: https://github.com/pre-commit/pre-commit-hooks - rev: v3.3.0 + rev: v3.4.0 hooks: - id: trailing-whitespace - id: end-of-file-fixer diff --git a/CONTRIBUTING.rst b/CONTRIBUTING.rst index 986e442b..1d7f7e5d 100644 --- a/CONTRIBUTING.rst +++ b/CONTRIBUTING.rst @@ -21,8 +21,8 @@ In order to add a feature: - The feature must be documented in both the API and narrative documentation. -- The feature must work fully on the following CPython versions: 2.7, - 3.5, 3.6, 3.7 and 3.8 on both UNIX and Windows. +- The feature must work fully on the following CPython versions: + 3.6, 3.7, 3.8 and 3.9 on both UNIX and Windows. - The feature must not add unnecessary dependencies (where "unnecessary" is of course subjective, but new dependencies should @@ -202,25 +202,24 @@ Supported Python Versions We support: -- `Python 3.5`_ - `Python 3.6`_ - `Python 3.7`_ - `Python 3.8`_ +- `Python 3.9`_ -.. _Python 3.5: https://docs.python.org/3.5/ .. _Python 3.6: https://docs.python.org/3.6/ .. _Python 3.7: https://docs.python.org/3.7/ .. _Python 3.8: https://docs.python.org/3.8/ +.. _Python 3.9: https://docs.python.org/3.9/ Supported versions can be found in our ``noxfile.py`` `config`_. .. _config: https://github.com/googleapis/python-datacatalog/blob/master/noxfile.py -Python 2.7 support is deprecated. All code changes should maintain Python 2.7 compatibility until January 1, 2020. We also explicitly decided to support Python 3 beginning with version -3.5. Reasons for this include: +3.6. 
Reasons for this include: - Encouraging use of newest versions of Python 3 - Taking the lead of `prominent`_ open-source `projects`_ diff --git a/synth.metadata b/synth.metadata index 3dfe2d76..47a173ef 100644 --- a/synth.metadata +++ b/synth.metadata @@ -19,14 +19,14 @@ "git": { "name": "synthtool", "remote": "https://github.com/googleapis/synthtool.git", - "sha": "18c5dbdb4ac8cf75d4d8174e7b4558f48e76f8a1" + "sha": "373861061648b5fe5e0ac4f8a38b32d639ee93e4" } }, { "git": { "name": "synthtool", "remote": "https://github.com/googleapis/synthtool.git", - "sha": "18c5dbdb4ac8cf75d4d8174e7b4558f48e76f8a1" + "sha": "373861061648b5fe5e0ac4f8a38b32d639ee93e4" } } ], From 529b4f63cd6cf4ec3ca92040d1858bdeff37fb6d Mon Sep 17 00:00:00 2001 From: Bu Sun Kim <8822365+busunkim96@users.noreply.github.com> Date: Thu, 7 Jan 2021 12:52:56 -0700 Subject: [PATCH 10/26] chore: add constraints file (#91) * chore: add constraints file * chore: add constraints file * chore: add constraints file * chore: add constraints file * chore: add constraints file * chore: add constraints file --- testing/constraints-3.10.txt | 0 testing/constraints-3.11.txt | 0 testing/constraints-3.6.txt | 11 +++++++++++ testing/constraints-3.7.txt | 0 testing/constraints-3.8.txt | 0 testing/constraints-3.9.txt | 0 6 files changed, 11 insertions(+) create mode 100644 testing/constraints-3.10.txt create mode 100644 testing/constraints-3.11.txt create mode 100644 testing/constraints-3.6.txt create mode 100644 testing/constraints-3.7.txt create mode 100644 testing/constraints-3.8.txt create mode 100644 testing/constraints-3.9.txt diff --git a/testing/constraints-3.10.txt b/testing/constraints-3.10.txt new file mode 100644 index 00000000..e69de29b diff --git a/testing/constraints-3.11.txt b/testing/constraints-3.11.txt new file mode 100644 index 00000000..e69de29b diff --git a/testing/constraints-3.6.txt b/testing/constraints-3.6.txt new file mode 100644 index 00000000..421f979e --- /dev/null +++ b/testing/constraints-3.6.txt @@ -0,0 +1,11 @@ +# This constraints file is used to check that lower bounds +# are correct in setup.py +# List *all* library dependencies and extras in this file. +# Pin the version to the lower bound. +# +# e.g., if setup.py has "foo >= 1.14.0, < 2.0.0dev", +# Then this file should have foo==1.14.0 +google-api-core==1.22.0 +grpc-google-iam-v1==0.12.3 +libcst==0.2.5 +proto-plus==1.4.0 \ No newline at end of file diff --git a/testing/constraints-3.7.txt b/testing/constraints-3.7.txt new file mode 100644 index 00000000..e69de29b diff --git a/testing/constraints-3.8.txt b/testing/constraints-3.8.txt new file mode 100644 index 00000000..e69de29b diff --git a/testing/constraints-3.9.txt b/testing/constraints-3.9.txt new file mode 100644 index 00000000..e69de29b From 64719624aa16aa07c29f2cc29b54662307a96e2c Mon Sep 17 00:00:00 2001 From: Yoshi Automation Bot Date: Fri, 8 Jan 2021 09:09:13 -0800 Subject: [PATCH 11/26] chore: sort class names (#92) autosynth cannot find the source of changes triggered by earlier changes in this repository, or by version upgrades to tools such as linters. 
--- google/cloud/datacatalog_v1beta1/__init__.py | 4 ++-- synth.metadata | 2 +- 2 files changed, 3 insertions(+), 3 deletions(-) diff --git a/google/cloud/datacatalog_v1beta1/__init__.py b/google/cloud/datacatalog_v1beta1/__init__.py index 16534418..be0bdd8e 100644 --- a/google/cloud/datacatalog_v1beta1/__init__.py +++ b/google/cloud/datacatalog_v1beta1/__init__.py @@ -103,6 +103,7 @@ "CreateTagTemplateFieldRequest", "CreateTagTemplateRequest", "CreateTaxonomyRequest", + "DataCatalogClient", "DeleteEntryGroupRequest", "DeleteEntryRequest", "DeletePolicyTagRequest", @@ -140,7 +141,6 @@ "LookupEntryRequest", "PolicyTag", "PolicyTagManagerClient", - "PolicyTagManagerSerializationClient", "RenameTagTemplateFieldRequest", "Schema", "SearchCatalogRequest", @@ -165,5 +165,5 @@ "UpdateTagTemplateRequest", "UpdateTaxonomyRequest", "ViewSpec", - "DataCatalogClient", + "PolicyTagManagerSerializationClient", ) diff --git a/synth.metadata b/synth.metadata index 47a173ef..6978ee95 100644 --- a/synth.metadata +++ b/synth.metadata @@ -4,7 +4,7 @@ "git": { "name": ".", "remote": "https://github.com/googleapis/python-datacatalog.git", - "sha": "e727ece2fbb5ce5aae8bbda7bdd9d1b6864578aa" + "sha": "fda528a1da2ec1dbf6b3ad33eb2d33780a77f3d9" } }, { From 3d96b69d68295ad2032556b45faf729c715e71e0 Mon Sep 17 00:00:00 2001 From: Yoshi Automation Bot Date: Fri, 8 Jan 2021 09:58:34 -0800 Subject: [PATCH 12/26] chore: use http in LICENSE to match official Apache file (#95) * changes without context autosynth cannot find the source of changes triggered by earlier changes in this repository, or by version upgrades to tools such as linters. * chore(python): fix column sizing issue in docs Source-Author: Bu Sun Kim <8822365+busunkim96@users.noreply.github.com> Source-Date: Thu Jan 7 11:58:32 2021 -0700 Source-Repo: googleapis/synthtool Source-Sha: f15b57ccfd71106c2299e9b89835fe6e55015662 Source-Link: https://github.com/googleapis/synthtool/commit/f15b57ccfd71106c2299e9b89835fe6e55015662 * chore(python): use 'http' in LICENSE Co-authored-by: Tim Swast Source-Author: Bu Sun Kim <8822365+busunkim96@users.noreply.github.com> Source-Date: Thu Jan 7 13:05:12 2021 -0700 Source-Repo: googleapis/synthtool Source-Sha: 41a4e56982620d3edcf110d76f4fcdfdec471ac8 Source-Link: https://github.com/googleapis/synthtool/commit/41a4e56982620d3edcf110d76f4fcdfdec471ac8 * remove redundant class names Co-authored-by: Tim Swast --- LICENSE | 7 ++++--- docs/_static/custom.css | 7 ++++++- google/cloud/datacatalog_v1beta1/__init__.py | 2 +- synth.metadata | 6 +++--- 4 files changed, 14 insertions(+), 8 deletions(-) diff --git a/LICENSE b/LICENSE index a8ee855d..d6456956 100644 --- a/LICENSE +++ b/LICENSE @@ -1,6 +1,7 @@ - Apache License + + Apache License Version 2.0, January 2004 - https://www.apache.org/licenses/ + http://www.apache.org/licenses/ TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION @@ -192,7 +193,7 @@ you may not use this file except in compliance with the License. 
You may obtain a copy of the License at - https://www.apache.org/licenses/LICENSE-2.0 + http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, diff --git a/docs/_static/custom.css b/docs/_static/custom.css index 0abaf229..bcd37bbd 100644 --- a/docs/_static/custom.css +++ b/docs/_static/custom.css @@ -1,4 +1,9 @@ div#python2-eol { border-color: red; border-width: medium; -} \ No newline at end of file +} + +/* Ensure minimum width for 'Parameters' / 'Returns' column */ +dl.field-list > dt { + min-width: 100px +} diff --git a/google/cloud/datacatalog_v1beta1/__init__.py b/google/cloud/datacatalog_v1beta1/__init__.py index be0bdd8e..85a49234 100644 --- a/google/cloud/datacatalog_v1beta1/__init__.py +++ b/google/cloud/datacatalog_v1beta1/__init__.py @@ -141,6 +141,7 @@ "LookupEntryRequest", "PolicyTag", "PolicyTagManagerClient", + "PolicyTagManagerSerializationClient", "RenameTagTemplateFieldRequest", "Schema", "SearchCatalogRequest", @@ -165,5 +166,4 @@ "UpdateTagTemplateRequest", "UpdateTaxonomyRequest", "ViewSpec", - "PolicyTagManagerSerializationClient", ) diff --git a/synth.metadata b/synth.metadata index 6978ee95..969819b5 100644 --- a/synth.metadata +++ b/synth.metadata @@ -4,7 +4,7 @@ "git": { "name": ".", "remote": "https://github.com/googleapis/python-datacatalog.git", - "sha": "fda528a1da2ec1dbf6b3ad33eb2d33780a77f3d9" + "sha": "529b4f63cd6cf4ec3ca92040d1858bdeff37fb6d" } }, { @@ -19,14 +19,14 @@ "git": { "name": "synthtool", "remote": "https://github.com/googleapis/synthtool.git", - "sha": "373861061648b5fe5e0ac4f8a38b32d639ee93e4" + "sha": "41a4e56982620d3edcf110d76f4fcdfdec471ac8" } }, { "git": { "name": "synthtool", "remote": "https://github.com/googleapis/synthtool.git", - "sha": "373861061648b5fe5e0ac4f8a38b32d639ee93e4" + "sha": "41a4e56982620d3edcf110d76f4fcdfdec471ac8" } } ], From 890b6cc7c323a61255e001a21081beafb88c83f5 Mon Sep 17 00:00:00 2001 From: Yoshi Automation Bot Date: Mon, 11 Jan 2021 07:31:00 -0800 Subject: [PATCH 13/26] test: skip docfx in main presubmit (#97) * changes without context autosynth cannot find the source of changes triggered by earlier changes in this repository, or by version upgrades to tools such as linters. 
* chore(python): skip docfx in main presubmit * chore(python): skip docfx in main presubmit * fix: properly template the repo name Source-Author: Bu Sun Kim <8822365+busunkim96@users.noreply.github.com> Source-Date: Fri Jan 8 10:32:13 2021 -0700 Source-Repo: googleapis/synthtool Source-Sha: fb53b6fb373b7c3edf4e55f3e8036bc6d73fa483 Source-Link: https://github.com/googleapis/synthtool/commit/fb53b6fb373b7c3edf4e55f3e8036bc6d73fa483 --- .kokoro/build.sh | 16 ++++++++++------ .kokoro/docs/docs-presubmit.cfg | 11 +++++++++++ .trampolinerc | 2 ++ google/cloud/datacatalog_v1beta1/__init__.py | 2 +- noxfile.py | 11 +++++++++++ synth.metadata | 6 +++--- 6 files changed, 38 insertions(+), 10 deletions(-) diff --git a/.kokoro/build.sh b/.kokoro/build.sh index eead7b10..d86a114e 100755 --- a/.kokoro/build.sh +++ b/.kokoro/build.sh @@ -15,7 +15,11 @@ set -eo pipefail -cd github/python-datacatalog +if [[ -z "${PROJECT_ROOT:-}" ]]; then + PROJECT_ROOT="github/python-datacatalog" +fi + +cd "${PROJECT_ROOT}" # Disable buffering, so that the logs stream through. export PYTHONUNBUFFERED=1 @@ -30,16 +34,16 @@ export GOOGLE_APPLICATION_CREDENTIALS=${KOKORO_GFILE_DIR}/service-account.json export PROJECT_ID=$(cat "${KOKORO_GFILE_DIR}/project-id.json") # Remove old nox -python3.6 -m pip uninstall --yes --quiet nox-automation +python3 -m pip uninstall --yes --quiet nox-automation # Install nox -python3.6 -m pip install --upgrade --quiet nox -python3.6 -m nox --version +python3 -m pip install --upgrade --quiet nox +python3 -m nox --version # If NOX_SESSION is set, it only runs the specified session, # otherwise run all the sessions. if [[ -n "${NOX_SESSION:-}" ]]; then - python3.6 -m nox -s "${NOX_SESSION:-}" + python3 -m nox -s ${NOX_SESSION:-} else - python3.6 -m nox + python3 -m nox fi diff --git a/.kokoro/docs/docs-presubmit.cfg b/.kokoro/docs/docs-presubmit.cfg index 11181078..cf544000 100644 --- a/.kokoro/docs/docs-presubmit.cfg +++ b/.kokoro/docs/docs-presubmit.cfg @@ -15,3 +15,14 @@ env_vars: { key: "TRAMPOLINE_IMAGE_UPLOAD" value: "false" } + +env_vars: { + key: "TRAMPOLINE_BUILD_FILE" + value: github/python-datacatalog/.kokoro/build.sh" +} + +# Only run this nox session. +env_vars: { + key: "NOX_SESSION" + value: "docs docfx" +} diff --git a/.trampolinerc b/.trampolinerc index 995ee291..c7d663ae 100644 --- a/.trampolinerc +++ b/.trampolinerc @@ -18,12 +18,14 @@ required_envvars+=( "STAGING_BUCKET" "V2_STAGING_BUCKET" + "NOX_SESSION" ) # Add env vars which are passed down into the container here. pass_down_envvars+=( "STAGING_BUCKET" "V2_STAGING_BUCKET" + "NOX_SESSION" ) # Prevent unintentional override on the default image. 
diff --git a/google/cloud/datacatalog_v1beta1/__init__.py b/google/cloud/datacatalog_v1beta1/__init__.py index 85a49234..8bc01583 100644 --- a/google/cloud/datacatalog_v1beta1/__init__.py +++ b/google/cloud/datacatalog_v1beta1/__init__.py @@ -140,7 +140,6 @@ "ListTaxonomiesResponse", "LookupEntryRequest", "PolicyTag", - "PolicyTagManagerClient", "PolicyTagManagerSerializationClient", "RenameTagTemplateFieldRequest", "Schema", @@ -166,4 +165,5 @@ "UpdateTagTemplateRequest", "UpdateTaxonomyRequest", "ViewSpec", + "PolicyTagManagerClient", ) diff --git a/noxfile.py b/noxfile.py index 5543011c..8fca72e4 100644 --- a/noxfile.py +++ b/noxfile.py @@ -30,6 +30,17 @@ SYSTEM_TEST_PYTHON_VERSIONS = ["3.8"] UNIT_TEST_PYTHON_VERSIONS = ["3.6", "3.7", "3.8", "3.9"] +# 'docfx' is excluded since it only needs to run in 'docs-presubmit' +nox.options.sessions = [ + "unit", + "system", + "cover", + "lint", + "lint_setup_py", + "blacken", + "docs", +] + @nox.session(python=DEFAULT_PYTHON_VERSION) def lint(session): diff --git a/synth.metadata b/synth.metadata index 969819b5..adaa6b35 100644 --- a/synth.metadata +++ b/synth.metadata @@ -4,7 +4,7 @@ "git": { "name": ".", "remote": "https://github.com/googleapis/python-datacatalog.git", - "sha": "529b4f63cd6cf4ec3ca92040d1858bdeff37fb6d" + "sha": "3d96b69d68295ad2032556b45faf729c715e71e0" } }, { @@ -19,14 +19,14 @@ "git": { "name": "synthtool", "remote": "https://github.com/googleapis/synthtool.git", - "sha": "41a4e56982620d3edcf110d76f4fcdfdec471ac8" + "sha": "fb53b6fb373b7c3edf4e55f3e8036bc6d73fa483" } }, { "git": { "name": "synthtool", "remote": "https://github.com/googleapis/synthtool.git", - "sha": "41a4e56982620d3edcf110d76f4fcdfdec471ac8" + "sha": "fb53b6fb373b7c3edf4e55f3e8036bc6d73fa483" } } ], From bf883c6583838242b9b0bfd37da7780152b14690 Mon Sep 17 00:00:00 2001 From: Yoshi Automation Bot Date: Wed, 13 Jan 2021 14:51:50 -0800 Subject: [PATCH 14/26] chore: fix error in docs-presubmit config (#99) * changes without context autosynth cannot find the source of changes triggered by earlier changes in this repository, or by version upgrades to tools such as linters. * chore: add missing quotation mark Source-Author: Bu Sun Kim <8822365+busunkim96@users.noreply.github.com> Source-Date: Mon Jan 11 09:43:06 2021 -0700 Source-Repo: googleapis/synthtool Source-Sha: 16ec872dd898d7de6e1822badfac32484b5d9031 Source-Link: https://github.com/googleapis/synthtool/commit/16ec872dd898d7de6e1822badfac32484b5d9031 --- .kokoro/docs/docs-presubmit.cfg | 2 +- google/cloud/datacatalog_v1beta1/__init__.py | 4 ++-- synth.metadata | 6 +++--- 3 files changed, 6 insertions(+), 6 deletions(-) diff --git a/.kokoro/docs/docs-presubmit.cfg b/.kokoro/docs/docs-presubmit.cfg index cf544000..ff3c54c6 100644 --- a/.kokoro/docs/docs-presubmit.cfg +++ b/.kokoro/docs/docs-presubmit.cfg @@ -18,7 +18,7 @@ env_vars: { env_vars: { key: "TRAMPOLINE_BUILD_FILE" - value: github/python-datacatalog/.kokoro/build.sh" + value: "github/python-datacatalog/.kokoro/build.sh" } # Only run this nox session. 
diff --git a/google/cloud/datacatalog_v1beta1/__init__.py b/google/cloud/datacatalog_v1beta1/__init__.py index 8bc01583..16534418 100644 --- a/google/cloud/datacatalog_v1beta1/__init__.py +++ b/google/cloud/datacatalog_v1beta1/__init__.py @@ -103,7 +103,6 @@ "CreateTagTemplateFieldRequest", "CreateTagTemplateRequest", "CreateTaxonomyRequest", - "DataCatalogClient", "DeleteEntryGroupRequest", "DeleteEntryRequest", "DeletePolicyTagRequest", @@ -140,6 +139,7 @@ "ListTaxonomiesResponse", "LookupEntryRequest", "PolicyTag", + "PolicyTagManagerClient", "PolicyTagManagerSerializationClient", "RenameTagTemplateFieldRequest", "Schema", @@ -165,5 +165,5 @@ "UpdateTagTemplateRequest", "UpdateTaxonomyRequest", "ViewSpec", - "PolicyTagManagerClient", + "DataCatalogClient", ) diff --git a/synth.metadata b/synth.metadata index adaa6b35..059cec33 100644 --- a/synth.metadata +++ b/synth.metadata @@ -4,7 +4,7 @@ "git": { "name": ".", "remote": "https://github.com/googleapis/python-datacatalog.git", - "sha": "3d96b69d68295ad2032556b45faf729c715e71e0" + "sha": "890b6cc7c323a61255e001a21081beafb88c83f5" } }, { @@ -19,14 +19,14 @@ "git": { "name": "synthtool", "remote": "https://github.com/googleapis/synthtool.git", - "sha": "fb53b6fb373b7c3edf4e55f3e8036bc6d73fa483" + "sha": "16ec872dd898d7de6e1822badfac32484b5d9031" } }, { "git": { "name": "synthtool", "remote": "https://github.com/googleapis/synthtool.git", - "sha": "fb53b6fb373b7c3edf4e55f3e8036bc6d73fa483" + "sha": "16ec872dd898d7de6e1822badfac32484b5d9031" } } ], From 2dbb3ef062b52925ad421c5c469ed6e67671e878 Mon Sep 17 00:00:00 2001 From: Yoshi Automation Bot Date: Thu, 14 Jan 2021 08:12:02 -0800 Subject: [PATCH 15/26] docs: document enum values with `undoc-members` option (#93) This PR was generated using Autosynth. :rainbow: Synth log will be available here: https://source.cloud.google.com/results/invocations/f529093e-7e5c-432b-bfc8-df2fd7eacf9c/targets - [ ] To automatically regenerate this PR, check this box. 
PiperOrigin-RevId: 350246057 Source-Link: https://github.com/googleapis/googleapis/commit/520682435235d9c503983a360a2090025aa47cd1 --- .coveragerc | 22 +- docs/datacatalog_v1/data_catalog.rst | 11 + docs/datacatalog_v1/services.rst | 6 +- docs/datacatalog_v1/types.rst | 1 + docs/datacatalog_v1beta1/data_catalog.rst | 11 + .../policy_tag_manager.rst | 11 + .../policy_tag_manager_serialization.rst | 6 + docs/datacatalog_v1beta1/services.rst | 14 +- docs/datacatalog_v1beta1/types.rst | 1 + .../services/data_catalog/async_client.py | 619 +++++++-------- .../services/data_catalog/client.py | 696 +++++++++-------- .../services/data_catalog/pagers.py | 64 +- .../cloud/datacatalog_v1/types/datacatalog.py | 60 +- .../datacatalog_v1/types/gcs_fileset_spec.py | 4 +- google/cloud/datacatalog_v1/types/schema.py | 4 +- google/cloud/datacatalog_v1/types/search.py | 4 +- .../cloud/datacatalog_v1/types/table_spec.py | 6 +- google/cloud/datacatalog_v1/types/tags.py | 16 +- .../cloud/datacatalog_v1/types/timestamps.py | 6 +- .../services/data_catalog/async_client.py | 625 ++++++++-------- .../services/data_catalog/client.py | 702 ++++++++++-------- .../services/data_catalog/pagers.py | 64 +- .../policy_tag_manager/async_client.py | 352 +++++---- .../services/policy_tag_manager/client.py | 389 +++++----- .../services/policy_tag_manager/pagers.py | 32 +- .../async_client.py | 15 +- .../client.py | 34 +- .../datacatalog_v1beta1/types/datacatalog.py | 60 +- .../types/gcs_fileset_spec.py | 4 +- .../types/policytagmanager.py | 18 +- .../types/policytagmanagerserialization.py | 12 +- .../cloud/datacatalog_v1beta1/types/schema.py | 4 +- .../cloud/datacatalog_v1beta1/types/search.py | 2 +- .../datacatalog_v1beta1/types/table_spec.py | 6 +- .../cloud/datacatalog_v1beta1/types/tags.py | 16 +- .../datacatalog_v1beta1/types/timestamps.py | 6 +- synth.metadata | 11 +- .../gapic/datacatalog_v1/test_data_catalog.py | 30 +- .../datacatalog_v1beta1/test_data_catalog.py | 30 +- .../test_policy_tag_manager.py | 28 +- .../test_policy_tag_manager_serialization.py | 28 +- 41 files changed, 2137 insertions(+), 1893 deletions(-) create mode 100644 docs/datacatalog_v1/data_catalog.rst create mode 100644 docs/datacatalog_v1beta1/data_catalog.rst create mode 100644 docs/datacatalog_v1beta1/policy_tag_manager.rst create mode 100644 docs/datacatalog_v1beta1/policy_tag_manager_serialization.rst diff --git a/.coveragerc b/.coveragerc index c0c0ad46..8f9f82cd 100644 --- a/.coveragerc +++ b/.coveragerc @@ -1,34 +1,16 @@ -# -*- coding: utf-8 -*- -# -# Copyright 2020 Google LLC -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# https://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. - -# Generated by synthtool. DO NOT EDIT! 
[run] branch = True [report] fail_under = 100 show_missing = True -omit = google/cloud/datacatalog/__init__.py +omit = + google/cloud/datacatalog/__init__.py exclude_lines = # Re-enable the standard pragma pragma: NO COVER # Ignore debug-only repr def __repr__ - # Ignore abstract methods - raise NotImplementedError # Ignore pkg_resources exceptions. # This is added at the module level as a safeguard for if someone # generates the code and tries to run it without pip installing. This diff --git a/docs/datacatalog_v1/data_catalog.rst b/docs/datacatalog_v1/data_catalog.rst new file mode 100644 index 00000000..1f955a11 --- /dev/null +++ b/docs/datacatalog_v1/data_catalog.rst @@ -0,0 +1,11 @@ +DataCatalog +----------------------------- + +.. automodule:: google.cloud.datacatalog_v1.services.data_catalog + :members: + :inherited-members: + + +.. automodule:: google.cloud.datacatalog_v1.services.data_catalog.pagers + :members: + :inherited-members: diff --git a/docs/datacatalog_v1/services.rst b/docs/datacatalog_v1/services.rst index a73ca817..fd21338e 100644 --- a/docs/datacatalog_v1/services.rst +++ b/docs/datacatalog_v1/services.rst @@ -1,6 +1,6 @@ Services for Google Cloud Datacatalog v1 API ============================================ +.. toctree:: + :maxdepth: 2 -.. automodule:: google.cloud.datacatalog_v1.services.data_catalog - :members: - :inherited-members: + data_catalog diff --git a/docs/datacatalog_v1/types.rst b/docs/datacatalog_v1/types.rst index 19f12ef8..a27783ee 100644 --- a/docs/datacatalog_v1/types.rst +++ b/docs/datacatalog_v1/types.rst @@ -3,4 +3,5 @@ Types for Google Cloud Datacatalog v1 API .. automodule:: google.cloud.datacatalog_v1.types :members: + :undoc-members: :show-inheritance: diff --git a/docs/datacatalog_v1beta1/data_catalog.rst b/docs/datacatalog_v1beta1/data_catalog.rst new file mode 100644 index 00000000..e3b0675d --- /dev/null +++ b/docs/datacatalog_v1beta1/data_catalog.rst @@ -0,0 +1,11 @@ +DataCatalog +----------------------------- + +.. automodule:: google.cloud.datacatalog_v1beta1.services.data_catalog + :members: + :inherited-members: + + +.. automodule:: google.cloud.datacatalog_v1beta1.services.data_catalog.pagers + :members: + :inherited-members: diff --git a/docs/datacatalog_v1beta1/policy_tag_manager.rst b/docs/datacatalog_v1beta1/policy_tag_manager.rst new file mode 100644 index 00000000..01a7cf6b --- /dev/null +++ b/docs/datacatalog_v1beta1/policy_tag_manager.rst @@ -0,0 +1,11 @@ +PolicyTagManager +---------------------------------- + +.. automodule:: google.cloud.datacatalog_v1beta1.services.policy_tag_manager + :members: + :inherited-members: + + +.. automodule:: google.cloud.datacatalog_v1beta1.services.policy_tag_manager.pagers + :members: + :inherited-members: diff --git a/docs/datacatalog_v1beta1/policy_tag_manager_serialization.rst b/docs/datacatalog_v1beta1/policy_tag_manager_serialization.rst new file mode 100644 index 00000000..aed4c56c --- /dev/null +++ b/docs/datacatalog_v1beta1/policy_tag_manager_serialization.rst @@ -0,0 +1,6 @@ +PolicyTagManagerSerialization +----------------------------------------------- + +.. 
automodule:: google.cloud.datacatalog_v1beta1.services.policy_tag_manager_serialization + :members: + :inherited-members: diff --git a/docs/datacatalog_v1beta1/services.rst b/docs/datacatalog_v1beta1/services.rst index 43425e2f..4f762e1c 100644 --- a/docs/datacatalog_v1beta1/services.rst +++ b/docs/datacatalog_v1beta1/services.rst @@ -1,12 +1,8 @@ Services for Google Cloud Datacatalog v1beta1 API ================================================= +.. toctree:: + :maxdepth: 2 -.. automodule:: google.cloud.datacatalog_v1beta1.services.data_catalog - :members: - :inherited-members: -.. automodule:: google.cloud.datacatalog_v1beta1.services.policy_tag_manager - :members: - :inherited-members: -.. automodule:: google.cloud.datacatalog_v1beta1.services.policy_tag_manager_serialization - :members: - :inherited-members: + data_catalog + policy_tag_manager + policy_tag_manager_serialization diff --git a/docs/datacatalog_v1beta1/types.rst b/docs/datacatalog_v1beta1/types.rst index a1baedaf..687d8391 100644 --- a/docs/datacatalog_v1beta1/types.rst +++ b/docs/datacatalog_v1beta1/types.rst @@ -3,4 +3,5 @@ Types for Google Cloud Datacatalog v1beta1 API .. automodule:: google.cloud.datacatalog_v1beta1.types :members: + :undoc-members: :show-inheritance: diff --git a/google/cloud/datacatalog_v1/services/data_catalog/async_client.py b/google/cloud/datacatalog_v1/services/data_catalog/async_client.py index e5cb1dbf..dd7e759f 100644 --- a/google/cloud/datacatalog_v1/services/data_catalog/async_client.py +++ b/google/cloud/datacatalog_v1/services/data_catalog/async_client.py @@ -94,6 +94,7 @@ class DataCatalogAsyncClient: DataCatalogClient.parse_common_location_path ) + from_service_account_info = DataCatalogClient.from_service_account_info from_service_account_file = DataCatalogClient.from_service_account_file from_service_account_json = from_service_account_file @@ -187,15 +188,16 @@ async def search_catalog( for more information. Args: - request (:class:`~.datacatalog.SearchCatalogRequest`): + request (:class:`google.cloud.datacatalog_v1.types.SearchCatalogRequest`): The request object. Request message for [SearchCatalog][google.cloud.datacatalog.v1.DataCatalog.SearchCatalog]. - scope (:class:`~.datacatalog.SearchCatalogRequest.Scope`): + scope (:class:`google.cloud.datacatalog_v1.types.SearchCatalogRequest.Scope`): Required. The scope of this search request. A ``scope`` that has empty ``include_org_ids``, ``include_project_ids`` AND false ``include_gcp_public_datasets`` is considered invalid. Data Catalog will return an error in such a case. + This corresponds to the ``scope`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -214,6 +216,7 @@ async def search_catalog( `Data Catalog Search Syntax `__ for more information. + This corresponds to the ``query`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -225,9 +228,9 @@ async def search_catalog( sent along with the request as metadata. Returns: - ~.pagers.SearchCatalogAsyncPager: + google.cloud.datacatalog_v1.services.data_catalog.pagers.SearchCatalogAsyncPager: Response message for - [SearchCatalog][google.cloud.datacatalog.v1.DataCatalog.SearchCatalog]. + [SearchCatalog][google.cloud.datacatalog.v1.DataCatalog.SearchCatalog]. Iterating over this object will yield results and resolve additional pages automatically. @@ -313,7 +316,7 @@ async def create_entry_group( for more information). 
Args: - request (:class:`~.datacatalog.CreateEntryGroupRequest`): + request (:class:`google.cloud.datacatalog_v1.types.CreateEntryGroupRequest`): The request object. Request message for [CreateEntryGroup][google.cloud.datacatalog.v1.DataCatalog.CreateEntryGroup]. parent (:class:`str`): @@ -324,6 +327,7 @@ async def create_entry_group( Note that this EntryGroup and its child resources may not actually be stored in the location in this name. + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -334,12 +338,14 @@ async def create_entry_group( English letters, numbers and underscores, and be at most 64 characters. + This corresponds to the ``entry_group_id`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - entry_group (:class:`~.datacatalog.EntryGroup`): + entry_group (:class:`google.cloud.datacatalog_v1.types.EntryGroup`): The entry group to create. Defaults to an empty entry group. + This corresponds to the ``entry_group`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -351,10 +357,11 @@ async def create_entry_group( sent along with the request as metadata. Returns: - ~.datacatalog.EntryGroup: - EntryGroup Metadata. An EntryGroup resource represents a - logical grouping of zero or more Data Catalog - [Entry][google.cloud.datacatalog.v1.Entry] resources. + google.cloud.datacatalog_v1.types.EntryGroup: + EntryGroup Metadata. + An EntryGroup resource represents a logical grouping + of zero or more Data Catalog + [Entry][google.cloud.datacatalog.v1.Entry] resources. """ # Create or coerce a protobuf request object. @@ -412,18 +419,20 @@ async def get_entry_group( r"""Gets an EntryGroup. Args: - request (:class:`~.datacatalog.GetEntryGroupRequest`): + request (:class:`google.cloud.datacatalog_v1.types.GetEntryGroupRequest`): The request object. Request message for [GetEntryGroup][google.cloud.datacatalog.v1.DataCatalog.GetEntryGroup]. name (:class:`str`): Required. The name of the entry group. For example, ``projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}``. + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - read_mask (:class:`~.field_mask.FieldMask`): + read_mask (:class:`google.protobuf.field_mask_pb2.FieldMask`): The fields to return. If not set or empty, all fields are returned. + This corresponds to the ``read_mask`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -435,10 +444,11 @@ async def get_entry_group( sent along with the request as metadata. Returns: - ~.datacatalog.EntryGroup: - EntryGroup Metadata. An EntryGroup resource represents a - logical grouping of zero or more Data Catalog - [Entry][google.cloud.datacatalog.v1.Entry] resources. + google.cloud.datacatalog_v1.types.EntryGroup: + EntryGroup Metadata. + An EntryGroup resource represents a logical grouping + of zero or more Data Catalog + [Entry][google.cloud.datacatalog.v1.Entry] resources. """ # Create or coerce a protobuf request object. @@ -504,19 +514,21 @@ async def update_entry_group( for more information). Args: - request (:class:`~.datacatalog.UpdateEntryGroupRequest`): + request (:class:`google.cloud.datacatalog_v1.types.UpdateEntryGroupRequest`): The request object. Request message for [UpdateEntryGroup][google.cloud.datacatalog.v1.DataCatalog.UpdateEntryGroup]. 
- entry_group (:class:`~.datacatalog.EntryGroup`): + entry_group (:class:`google.cloud.datacatalog_v1.types.EntryGroup`): Required. The updated entry group. "name" field must be set. + This corresponds to the ``entry_group`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - update_mask (:class:`~.field_mask.FieldMask`): + update_mask (:class:`google.protobuf.field_mask_pb2.FieldMask`): The fields to update on the entry group. If absent or empty, all modifiable fields are updated. + This corresponds to the ``update_mask`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -528,10 +540,11 @@ async def update_entry_group( sent along with the request as metadata. Returns: - ~.datacatalog.EntryGroup: - EntryGroup Metadata. An EntryGroup resource represents a - logical grouping of zero or more Data Catalog - [Entry][google.cloud.datacatalog.v1.Entry] resources. + google.cloud.datacatalog_v1.types.EntryGroup: + EntryGroup Metadata. + An EntryGroup resource represents a logical grouping + of zero or more Data Catalog + [Entry][google.cloud.datacatalog.v1.Entry] resources. """ # Create or coerce a protobuf request object. @@ -593,12 +606,13 @@ async def delete_entry_group( for more information). Args: - request (:class:`~.datacatalog.DeleteEntryGroupRequest`): + request (:class:`google.cloud.datacatalog_v1.types.DeleteEntryGroupRequest`): The request object. Request message for [DeleteEntryGroup][google.cloud.datacatalog.v1.DataCatalog.DeleteEntryGroup]. name (:class:`str`): Required. The name of the entry group. For example, ``projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}``. + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -658,7 +672,7 @@ async def list_entry_groups( r"""Lists entry groups. Args: - request (:class:`~.datacatalog.ListEntryGroupsRequest`): + request (:class:`google.cloud.datacatalog_v1.types.ListEntryGroupsRequest`): The request object. Request message for [ListEntryGroups][google.cloud.datacatalog.v1.DataCatalog.ListEntryGroups]. parent (:class:`str`): @@ -667,6 +681,7 @@ async def list_entry_groups( Example: - projects/{project_id}/locations/{location} + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -678,9 +693,9 @@ async def list_entry_groups( sent along with the request as metadata. Returns: - ~.pagers.ListEntryGroupsAsyncPager: + google.cloud.datacatalog_v1.services.data_catalog.pagers.ListEntryGroupsAsyncPager: Response message for - [ListEntryGroups][google.cloud.datacatalog.v1.DataCatalog.ListEntryGroups]. + [ListEntryGroups][google.cloud.datacatalog.v1.DataCatalog.ListEntryGroups]. Iterating over this object will yield results and resolve additional pages automatically. @@ -759,7 +774,7 @@ async def create_entry( A maximum of 100,000 entries may be created per entry group. Args: - request (:class:`~.datacatalog.CreateEntryRequest`): + request (:class:`google.cloud.datacatalog_v1.types.CreateEntryRequest`): The request object. Request message for [CreateEntry][google.cloud.datacatalog.v1.DataCatalog.CreateEntry]. parent (:class:`str`): @@ -770,16 +785,18 @@ async def create_entry( Note that this Entry and its child resources may not actually be stored in the location in this name. + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. 
entry_id (:class:`str`): Required. The id of the entry to create. + This corresponds to the ``entry_id`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - entry (:class:`~.datacatalog.Entry`): + entry (:class:`google.cloud.datacatalog_v1.types.Entry`): Required. The entry to create. This corresponds to the ``entry`` field on the ``request`` instance; if ``request`` is provided, this @@ -792,18 +809,19 @@ async def create_entry( sent along with the request as metadata. Returns: - ~.datacatalog.Entry: - Entry Metadata. A Data Catalog Entry resource represents - another resource in Google Cloud Platform (such as a - BigQuery dataset or a Pub/Sub topic) or outside of - Google Cloud Platform. Clients can use the - ``linked_resource`` field in the Entry resource to refer - to the original resource ID of the source system. - - An Entry resource contains resource details, such as its - schema. An Entry can also be used to attach flexible - metadata, such as a - [Tag][google.cloud.datacatalog.v1.Tag]. + google.cloud.datacatalog_v1.types.Entry: + Entry Metadata. + A Data Catalog Entry resource represents another + resource in Google Cloud Platform (such as a BigQuery + dataset or a Pub/Sub topic) or outside of Google + Cloud Platform. Clients can use the linked_resource + field in the Entry resource to refer to the original + resource ID of the source system. + + An Entry resource contains resource details, such as + its schema. An Entry can also be used to attach + flexible metadata, such as a + [Tag][google.cloud.datacatalog.v1.Tag]. """ # Create or coerce a protobuf request object. @@ -865,16 +883,17 @@ async def update_entry( for more information). Args: - request (:class:`~.datacatalog.UpdateEntryRequest`): + request (:class:`google.cloud.datacatalog_v1.types.UpdateEntryRequest`): The request object. Request message for [UpdateEntry][google.cloud.datacatalog.v1.DataCatalog.UpdateEntry]. - entry (:class:`~.datacatalog.Entry`): + entry (:class:`google.cloud.datacatalog_v1.types.Entry`): Required. The updated entry. The "name" field must be set. + This corresponds to the ``entry`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - update_mask (:class:`~.field_mask.FieldMask`): + update_mask (:class:`google.protobuf.field_mask_pb2.FieldMask`): The fields to update on the entry. If absent or empty, all modifiable fields are updated. @@ -901,6 +920,7 @@ async def update_entry( - user_specified_system - linked_resource - source_system_timestamps + This corresponds to the ``update_mask`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -912,18 +932,19 @@ async def update_entry( sent along with the request as metadata. Returns: - ~.datacatalog.Entry: - Entry Metadata. A Data Catalog Entry resource represents - another resource in Google Cloud Platform (such as a - BigQuery dataset or a Pub/Sub topic) or outside of - Google Cloud Platform. Clients can use the - ``linked_resource`` field in the Entry resource to refer - to the original resource ID of the source system. - - An Entry resource contains resource details, such as its - schema. An Entry can also be used to attach flexible - metadata, such as a - [Tag][google.cloud.datacatalog.v1.Tag]. + google.cloud.datacatalog_v1.types.Entry: + Entry Metadata. + A Data Catalog Entry resource represents another + resource in Google Cloud Platform (such as a BigQuery + dataset or a Pub/Sub topic) or outside of Google + Cloud Platform. 
Clients can use the linked_resource + field in the Entry resource to refer to the original + resource ID of the source system. + + An Entry resource contains resource details, such as + its schema. An Entry can also be used to attach + flexible metadata, such as a + [Tag][google.cloud.datacatalog.v1.Tag]. """ # Create or coerce a protobuf request object. @@ -986,13 +1007,14 @@ async def delete_entry( for more information). Args: - request (:class:`~.datacatalog.DeleteEntryRequest`): + request (:class:`google.cloud.datacatalog_v1.types.DeleteEntryRequest`): The request object. Request message for [DeleteEntry][google.cloud.datacatalog.v1.DataCatalog.DeleteEntry]. name (:class:`str`): Required. The name of the entry. Example: - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1052,13 +1074,14 @@ async def get_entry( r"""Gets an entry. Args: - request (:class:`~.datacatalog.GetEntryRequest`): + request (:class:`google.cloud.datacatalog_v1.types.GetEntryRequest`): The request object. Request message for [GetEntry][google.cloud.datacatalog.v1.DataCatalog.GetEntry]. name (:class:`str`): Required. The name of the entry. Example: - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1070,18 +1093,19 @@ async def get_entry( sent along with the request as metadata. Returns: - ~.datacatalog.Entry: - Entry Metadata. A Data Catalog Entry resource represents - another resource in Google Cloud Platform (such as a - BigQuery dataset or a Pub/Sub topic) or outside of - Google Cloud Platform. Clients can use the - ``linked_resource`` field in the Entry resource to refer - to the original resource ID of the source system. - - An Entry resource contains resource details, such as its - schema. An Entry can also be used to attach flexible - metadata, such as a - [Tag][google.cloud.datacatalog.v1.Tag]. + google.cloud.datacatalog_v1.types.Entry: + Entry Metadata. + A Data Catalog Entry resource represents another + resource in Google Cloud Platform (such as a BigQuery + dataset or a Pub/Sub topic) or outside of Google + Cloud Platform. Clients can use the linked_resource + field in the Entry resource to refer to the original + resource ID of the source system. + + An Entry resource contains resource details, such as + its schema. An Entry can also be used to attach + flexible metadata, such as a + [Tag][google.cloud.datacatalog.v1.Tag]. """ # Create or coerce a protobuf request object. @@ -1142,7 +1166,7 @@ async def lookup_entry( Entry. Args: - request (:class:`~.datacatalog.LookupEntryRequest`): + request (:class:`google.cloud.datacatalog_v1.types.LookupEntryRequest`): The request object. Request message for [LookupEntry][google.cloud.datacatalog.v1.DataCatalog.LookupEntry]. @@ -1153,18 +1177,19 @@ async def lookup_entry( sent along with the request as metadata. Returns: - ~.datacatalog.Entry: - Entry Metadata. A Data Catalog Entry resource represents - another resource in Google Cloud Platform (such as a - BigQuery dataset or a Pub/Sub topic) or outside of - Google Cloud Platform. Clients can use the - ``linked_resource`` field in the Entry resource to refer - to the original resource ID of the source system. - - An Entry resource contains resource details, such as its - schema. 
An Entry can also be used to attach flexible - metadata, such as a - [Tag][google.cloud.datacatalog.v1.Tag]. + google.cloud.datacatalog_v1.types.Entry: + Entry Metadata. + A Data Catalog Entry resource represents another + resource in Google Cloud Platform (such as a BigQuery + dataset or a Pub/Sub topic) or outside of Google + Cloud Platform. Clients can use the linked_resource + field in the Entry resource to refer to the original + resource ID of the source system. + + An Entry resource contains resource details, such as + its schema. An Entry can also be used to attach + flexible metadata, such as a + [Tag][google.cloud.datacatalog.v1.Tag]. """ # Create or coerce a protobuf request object. @@ -1203,7 +1228,7 @@ async def list_entries( r"""Lists entries. Args: - request (:class:`~.datacatalog.ListEntriesRequest`): + request (:class:`google.cloud.datacatalog_v1.types.ListEntriesRequest`): The request object. Request message for [ListEntries][google.cloud.datacatalog.v1.DataCatalog.ListEntries]. parent (:class:`str`): @@ -1211,6 +1236,7 @@ async def list_entries( entries, which can be provided in URL format. Example: - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1222,9 +1248,9 @@ async def list_entries( sent along with the request as metadata. Returns: - ~.pagers.ListEntriesAsyncPager: + google.cloud.datacatalog_v1.services.data_catalog.pagers.ListEntriesAsyncPager: Response message for - [ListEntries][google.cloud.datacatalog.v1.DataCatalog.ListEntries]. + [ListEntries][google.cloud.datacatalog.v1.DataCatalog.ListEntries]. Iterating over this object will yield results and resolve additional pages automatically. @@ -1298,7 +1324,7 @@ async def create_tag_template( for more information). Args: - request (:class:`~.datacatalog.CreateTagTemplateRequest`): + request (:class:`google.cloud.datacatalog_v1.types.CreateTagTemplateRequest`): The request object. Request message for [CreateTagTemplate][google.cloud.datacatalog.v1.DataCatalog.CreateTagTemplate]. parent (:class:`str`): @@ -1309,16 +1335,18 @@ async def create_tag_template( Example: - projects/{project_id}/locations/us-central1 + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. tag_template_id (:class:`str`): Required. The id of the tag template to create. + This corresponds to the ``tag_template_id`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - tag_template (:class:`~.tags.TagTemplate`): + tag_template (:class:`google.cloud.datacatalog_v1.types.TagTemplate`): Required. The tag template to create. This corresponds to the ``tag_template`` field on the ``request`` instance; if ``request`` is provided, this @@ -1331,16 +1359,16 @@ async def create_tag_template( sent along with the request as metadata. Returns: - ~.tags.TagTemplate: - A tag template defines a tag, which can have one or more - typed fields. The template is used to create and attach - the tag to GCP resources. `Tag template - roles `__ - provide permissions to create, edit, and use the - template. See, for example, the `TagTemplate - User `__ - role, which includes permission to use the tag template - to tag resources. + google.cloud.datacatalog_v1.types.TagTemplate: + A tag template defines a tag, which can have one or more typed fields. 
+ The template is used to create and attach the tag to + GCP resources. [Tag template + roles](\ https://cloud.google.com/iam/docs/understanding-roles#data-catalog-roles) + provide permissions to create, edit, and use the + template. See, for example, the [TagTemplate + User](\ https://cloud.google.com/data-catalog/docs/how-to/template-user) + role, which includes permission to use the tag + template to tag resources. """ # Create or coerce a protobuf request object. @@ -1397,13 +1425,14 @@ async def get_tag_template( r"""Gets a tag template. Args: - request (:class:`~.datacatalog.GetTagTemplateRequest`): + request (:class:`google.cloud.datacatalog_v1.types.GetTagTemplateRequest`): The request object. Request message for [GetTagTemplate][google.cloud.datacatalog.v1.DataCatalog.GetTagTemplate]. name (:class:`str`): Required. The name of the tag template. Example: - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1415,16 +1444,16 @@ async def get_tag_template( sent along with the request as metadata. Returns: - ~.tags.TagTemplate: - A tag template defines a tag, which can have one or more - typed fields. The template is used to create and attach - the tag to GCP resources. `Tag template - roles `__ - provide permissions to create, edit, and use the - template. See, for example, the `TagTemplate - User `__ - role, which includes permission to use the tag template - to tag resources. + google.cloud.datacatalog_v1.types.TagTemplate: + A tag template defines a tag, which can have one or more typed fields. + The template is used to create and attach the tag to + GCP resources. [Tag template + roles](\ https://cloud.google.com/iam/docs/understanding-roles#data-catalog-roles) + provide permissions to create, edit, and use the + template. See, for example, the [TagTemplate + User](\ https://cloud.google.com/data-catalog/docs/how-to/template-user) + role, which includes permission to use the tag + template to tag resources. """ # Create or coerce a protobuf request object. @@ -1486,16 +1515,17 @@ async def update_tag_template( for more information). Args: - request (:class:`~.datacatalog.UpdateTagTemplateRequest`): + request (:class:`google.cloud.datacatalog_v1.types.UpdateTagTemplateRequest`): The request object. Request message for [UpdateTagTemplate][google.cloud.datacatalog.v1.DataCatalog.UpdateTagTemplate]. - tag_template (:class:`~.tags.TagTemplate`): + tag_template (:class:`google.cloud.datacatalog_v1.types.TagTemplate`): Required. The template to update. The "name" field must be set. + This corresponds to the ``tag_template`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - update_mask (:class:`~.field_mask.FieldMask`): + update_mask (:class:`google.protobuf.field_mask_pb2.FieldMask`): The field mask specifies the parts of the template to overwrite. @@ -1505,6 +1535,7 @@ async def update_tag_template( If absent or empty, all of the allowed fields above will be updated. + This corresponds to the ``update_mask`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1516,16 +1547,16 @@ async def update_tag_template( sent along with the request as metadata. 
Returns: - ~.tags.TagTemplate: - A tag template defines a tag, which can have one or more - typed fields. The template is used to create and attach - the tag to GCP resources. `Tag template - roles `__ - provide permissions to create, edit, and use the - template. See, for example, the `TagTemplate - User `__ - role, which includes permission to use the tag template - to tag resources. + google.cloud.datacatalog_v1.types.TagTemplate: + A tag template defines a tag, which can have one or more typed fields. + The template is used to create and attach the tag to + GCP resources. [Tag template + roles](\ https://cloud.google.com/iam/docs/understanding-roles#data-catalog-roles) + provide permissions to create, edit, and use the + template. See, for example, the [TagTemplate + User](\ https://cloud.google.com/data-catalog/docs/how-to/template-user) + role, which includes permission to use the tag + template to tag resources. """ # Create or coerce a protobuf request object. @@ -1587,7 +1618,7 @@ async def delete_tag_template( for more information). Args: - request (:class:`~.datacatalog.DeleteTagTemplateRequest`): + request (:class:`google.cloud.datacatalog_v1.types.DeleteTagTemplateRequest`): The request object. Request message for [DeleteTagTemplate][google.cloud.datacatalog.v1.DataCatalog.DeleteTagTemplate]. name (:class:`str`): @@ -1595,6 +1626,7 @@ async def delete_tag_template( Example: - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1603,6 +1635,7 @@ async def delete_tag_template( ``true``. This confirms the deletion of any possible tags using this template. ``force = false`` will be supported in the future. + This corresponds to the ``force`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1670,7 +1703,7 @@ async def create_tag_template_field( for more information). Args: - request (:class:`~.datacatalog.CreateTagTemplateFieldRequest`): + request (:class:`google.cloud.datacatalog_v1.types.CreateTagTemplateFieldRequest`): The request object. Request message for [CreateTagTemplateField][google.cloud.datacatalog.v1.DataCatalog.CreateTagTemplateField]. parent (:class:`str`): @@ -1681,6 +1714,7 @@ async def create_tag_template_field( Example: - projects/{project_id}/locations/us-central1/tagTemplates/{tag_template_id} + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1691,12 +1725,14 @@ async def create_tag_template_field( (-). Field IDs must be at least 1 character long and at most 128 characters long. Field IDs must also be unique within their template. + This corresponds to the ``tag_template_field_id`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - tag_template_field (:class:`~.tags.TagTemplateField`): + tag_template_field (:class:`google.cloud.datacatalog_v1.types.TagTemplateField`): Required. The tag template field to create. + This corresponds to the ``tag_template_field`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1708,7 +1744,7 @@ async def create_tag_template_field( sent along with the request as metadata. 
Returns: - ~.tags.TagTemplateField: + google.cloud.datacatalog_v1.types.TagTemplateField: The template for an individual field within a tag template. @@ -1774,22 +1810,23 @@ async def update_tag_template_field( for more information). Args: - request (:class:`~.datacatalog.UpdateTagTemplateFieldRequest`): + request (:class:`google.cloud.datacatalog_v1.types.UpdateTagTemplateFieldRequest`): The request object. Request message for [UpdateTagTemplateField][google.cloud.datacatalog.v1.DataCatalog.UpdateTagTemplateField]. name (:class:`str`): Required. The name of the tag template field. Example: - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - tag_template_field (:class:`~.tags.TagTemplateField`): + tag_template_field (:class:`google.cloud.datacatalog_v1.types.TagTemplateField`): Required. The template to update. This corresponds to the ``tag_template_field`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - update_mask (:class:`~.field_mask.FieldMask`): + update_mask (:class:`google.protobuf.field_mask_pb2.FieldMask`): Optional. The field mask specifies the parts of the template to be updated. Allowed fields: @@ -1805,6 +1842,7 @@ async def update_tag_template_field( can only be added, existing enum values cannot be deleted nor renamed. Updating a template field from optional to required is NOT allowed. + This corresponds to the ``update_mask`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1816,7 +1854,7 @@ async def update_tag_template_field( sent along with the request as metadata. Returns: - ~.tags.TagTemplateField: + google.cloud.datacatalog_v1.types.TagTemplateField: The template for an individual field within a tag template. @@ -1880,19 +1918,21 @@ async def rename_tag_template_field( for more information). Args: - request (:class:`~.datacatalog.RenameTagTemplateFieldRequest`): + request (:class:`google.cloud.datacatalog_v1.types.RenameTagTemplateFieldRequest`): The request object. Request message for [RenameTagTemplateField][google.cloud.datacatalog.v1.DataCatalog.RenameTagTemplateField]. name (:class:`str`): Required. The name of the tag template. Example: - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. new_tag_template_field_id (:class:`str`): Required. The new ID of this tag template field. For example, ``my_new_field``. + This corresponds to the ``new_tag_template_field_id`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1904,7 +1944,7 @@ async def rename_tag_template_field( sent along with the request as metadata. Returns: - ~.tags.TagTemplateField: + google.cloud.datacatalog_v1.types.TagTemplateField: The template for an individual field within a tag template. @@ -1967,7 +2007,7 @@ async def delete_tag_template_field( for more information). Args: - request (:class:`~.datacatalog.DeleteTagTemplateFieldRequest`): + request (:class:`google.cloud.datacatalog_v1.types.DeleteTagTemplateFieldRequest`): The request object. Request message for [DeleteTagTemplateField][google.cloud.datacatalog.v1.DataCatalog.DeleteTagTemplateField]. 
name (:class:`str`): @@ -1975,6 +2015,7 @@ async def delete_tag_template_field( Example: - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1983,6 +2024,7 @@ async def delete_tag_template_field( ``true``. This confirms the deletion of this field from any tags using this field. ``force = false`` will be supported in the future. + This corresponds to the ``force`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -2050,7 +2092,7 @@ async def create_tag( used to create the tag must be from the same organization. Args: - request (:class:`~.datacatalog.CreateTagRequest`): + request (:class:`google.cloud.datacatalog_v1.types.CreateTagRequest`): The request object. Request message for [CreateTag][google.cloud.datacatalog.v1.DataCatalog.CreateTag]. parent (:class:`str`): @@ -2061,10 +2103,11 @@ async def create_tag( Note that this Tag and its child resources may not actually be stored in the location in this name. + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - tag (:class:`~.tags.Tag`): + tag (:class:`google.cloud.datacatalog_v1.types.Tag`): Required. The tag to create. This corresponds to the ``tag`` field on the ``request`` instance; if ``request`` is provided, this @@ -2077,15 +2120,15 @@ async def create_tag( sent along with the request as metadata. Returns: - ~.tags.Tag: - Tags are used to attach custom metadata to Data Catalog - resources. Tags conform to the specifications within - their tag template. + google.cloud.datacatalog_v1.types.Tag: + Tags are used to attach custom metadata to Data Catalog resources. Tags + conform to the specifications within their tag + template. - See `Data Catalog - IAM `__ - for information on the permissions needed to create or - view tags. + See [Data Catalog + IAM](\ https://cloud.google.com/data-catalog/docs/concepts/iam) + for information on the permissions needed to create + or view tags. """ # Create or coerce a protobuf request object. @@ -2141,19 +2184,21 @@ async def update_tag( r"""Updates an existing tag. Args: - request (:class:`~.datacatalog.UpdateTagRequest`): + request (:class:`google.cloud.datacatalog_v1.types.UpdateTagRequest`): The request object. Request message for [UpdateTag][google.cloud.datacatalog.v1.DataCatalog.UpdateTag]. - tag (:class:`~.tags.Tag`): + tag (:class:`google.cloud.datacatalog_v1.types.Tag`): Required. The updated tag. The "name" field must be set. + This corresponds to the ``tag`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - update_mask (:class:`~.field_mask.FieldMask`): + update_mask (:class:`google.protobuf.field_mask_pb2.FieldMask`): The fields to update on the Tag. If absent or empty, all modifiable fields are updated. Currently the only modifiable field is the field ``fields``. + This corresponds to the ``update_mask`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -2165,15 +2210,15 @@ async def update_tag( sent along with the request as metadata. Returns: - ~.tags.Tag: - Tags are used to attach custom metadata to Data Catalog - resources. Tags conform to the specifications within - their tag template. 
+ google.cloud.datacatalog_v1.types.Tag: + Tags are used to attach custom metadata to Data Catalog resources. Tags + conform to the specifications within their tag + template. - See `Data Catalog - IAM `__ - for information on the permissions needed to create or - view tags. + See [Data Catalog + IAM](\ https://cloud.google.com/data-catalog/docs/concepts/iam) + for information on the permissions needed to create + or view tags. """ # Create or coerce a protobuf request object. @@ -2228,13 +2273,14 @@ async def delete_tag( r"""Deletes a tag. Args: - request (:class:`~.datacatalog.DeleteTagRequest`): + request (:class:`google.cloud.datacatalog_v1.types.DeleteTagRequest`): The request object. Request message for [DeleteTag][google.cloud.datacatalog.v1.DataCatalog.DeleteTag]. name (:class:`str`): Required. The name of the tag to delete. Example: - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id}/tags/{tag_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -2294,7 +2340,7 @@ async def list_tags( r"""Lists the tags on an [Entry][google.cloud.datacatalog.v1.Entry]. Args: - request (:class:`~.datacatalog.ListTagsRequest`): + request (:class:`google.cloud.datacatalog_v1.types.ListTagsRequest`): The request object. Request message for [ListTags][google.cloud.datacatalog.v1.DataCatalog.ListTags]. parent (:class:`str`): @@ -2307,6 +2353,7 @@ async def list_tags( - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -2318,9 +2365,9 @@ async def list_tags( sent along with the request as metadata. Returns: - ~.pagers.ListTagsAsyncPager: + google.cloud.datacatalog_v1.services.data_catalog.pagers.ListTagsAsyncPager: Response message for - [ListTags][google.cloud.datacatalog.v1.DataCatalog.ListTags]. + [ListTags][google.cloud.datacatalog.v1.DataCatalog.ListTags]. Iterating over this object will yield results and resolve additional pages automatically. @@ -2404,7 +2451,7 @@ async def set_iam_policy( entry groups. Args: - request (:class:`~.iam_policy.SetIamPolicyRequest`): + request (:class:`google.iam.v1.iam_policy_pb2.SetIamPolicyRequest`): The request object. Request message for `SetIamPolicy` method. resource (:class:`str`): @@ -2412,6 +2459,7 @@ async def set_iam_policy( policy is being specified. See the operation documentation for the appropriate value for this field. + This corresponds to the ``resource`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -2423,72 +2471,62 @@ async def set_iam_policy( sent along with the request as metadata. Returns: - ~.policy.Policy: - Defines an Identity and Access Management (IAM) policy. - It is used to specify access control policies for Cloud - Platform resources. - - A ``Policy`` is a collection of ``bindings``. A - ``binding`` binds one or more ``members`` to a single - ``role``. Members can be user accounts, service - accounts, Google groups, and domains (such as G Suite). - A ``role`` is a named list of permissions (defined by - IAM or configured by users). 
A ``binding`` can - optionally specify a ``condition``, which is a logic - expression that further constrains the role binding - based on attributes about the request and/or target - resource. - - **JSON Example** - - :: - - { - "bindings": [ - { - "role": "roles/resourcemanager.organizationAdmin", - "members": [ - "user:mike@example.com", - "group:admins@example.com", - "domain:google.com", - "serviceAccount:my-project-id@appspot.gserviceaccount.com" - ] - }, - { - "role": "roles/resourcemanager.organizationViewer", - "members": ["user:eve@example.com"], - "condition": { - "title": "expirable access", - "description": "Does not grant access after Sep 2020", - "expression": "request.time < - timestamp('2020-10-01T00:00:00.000Z')", - } - } - ] - } - - **YAML Example** - - :: - - bindings: - - members: - - user:mike@example.com - - group:admins@example.com - - domain:google.com - - serviceAccount:my-project-id@appspot.gserviceaccount.com - role: roles/resourcemanager.organizationAdmin - - members: - - user:eve@example.com - role: roles/resourcemanager.organizationViewer - condition: - title: expirable access - description: Does not grant access after Sep 2020 - expression: request.time < timestamp('2020-10-01T00:00:00.000Z') - - For a description of IAM and its features, see the `IAM - developer's - guide `__. + google.iam.v1.policy_pb2.Policy: + Defines an Identity and Access Management (IAM) policy. It is used to + specify access control policies for Cloud Platform + resources. + + A Policy is a collection of bindings. A binding binds + one or more members to a single role. Members can be + user accounts, service accounts, Google groups, and + domains (such as G Suite). A role is a named list of + permissions (defined by IAM or configured by users). + A binding can optionally specify a condition, which + is a logic expression that further constrains the + role binding based on attributes about the request + and/or target resource. + + **JSON Example** + + { + "bindings": [ + { + "role": + "roles/resourcemanager.organizationAdmin", + "members": [ "user:mike@example.com", + "group:admins@example.com", + "domain:google.com", + "serviceAccount:my-project-id@appspot.gserviceaccount.com" + ] + + }, { "role": + "roles/resourcemanager.organizationViewer", + "members": ["user:eve@example.com"], + "condition": { "title": "expirable access", + "description": "Does not grant access after + Sep 2020", "expression": "request.time < + timestamp('2020-10-01T00:00:00.000Z')", } } + + ] + + } + + **YAML Example** + + bindings: - members: - user:\ mike@example.com - + group:\ admins@example.com - domain:google.com - + serviceAccount:\ my-project-id@appspot.gserviceaccount.com + role: roles/resourcemanager.organizationAdmin - + members: - user:\ eve@example.com role: + roles/resourcemanager.organizationViewer + condition: title: expirable access description: + Does not grant access after Sep 2020 expression: + request.time < + timestamp('2020-10-01T00:00:00.000Z') + + For a description of IAM and its features, see the + [IAM developer's + guide](\ https://cloud.google.com/iam/docs). """ # Create or coerce a protobuf request object. @@ -2561,7 +2599,7 @@ async def get_iam_policy( entry groups. Args: - request (:class:`~.iam_policy.GetIamPolicyRequest`): + request (:class:`google.iam.v1.iam_policy_pb2.GetIamPolicyRequest`): The request object. Request message for `GetIamPolicy` method. 
resource (:class:`str`): @@ -2569,6 +2607,7 @@ async def get_iam_policy( policy is being requested. See the operation documentation for the appropriate value for this field. + This corresponds to the ``resource`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -2580,72 +2619,62 @@ async def get_iam_policy( sent along with the request as metadata. Returns: - ~.policy.Policy: - Defines an Identity and Access Management (IAM) policy. - It is used to specify access control policies for Cloud - Platform resources. - - A ``Policy`` is a collection of ``bindings``. A - ``binding`` binds one or more ``members`` to a single - ``role``. Members can be user accounts, service - accounts, Google groups, and domains (such as G Suite). - A ``role`` is a named list of permissions (defined by - IAM or configured by users). A ``binding`` can - optionally specify a ``condition``, which is a logic - expression that further constrains the role binding - based on attributes about the request and/or target - resource. - - **JSON Example** - - :: - - { - "bindings": [ - { - "role": "roles/resourcemanager.organizationAdmin", - "members": [ - "user:mike@example.com", - "group:admins@example.com", - "domain:google.com", - "serviceAccount:my-project-id@appspot.gserviceaccount.com" - ] - }, - { - "role": "roles/resourcemanager.organizationViewer", - "members": ["user:eve@example.com"], - "condition": { - "title": "expirable access", - "description": "Does not grant access after Sep 2020", - "expression": "request.time < - timestamp('2020-10-01T00:00:00.000Z')", - } - } - ] - } - - **YAML Example** - - :: - - bindings: - - members: - - user:mike@example.com - - group:admins@example.com - - domain:google.com - - serviceAccount:my-project-id@appspot.gserviceaccount.com - role: roles/resourcemanager.organizationAdmin - - members: - - user:eve@example.com - role: roles/resourcemanager.organizationViewer - condition: - title: expirable access - description: Does not grant access after Sep 2020 - expression: request.time < timestamp('2020-10-01T00:00:00.000Z') - - For a description of IAM and its features, see the `IAM - developer's - guide `__. + google.iam.v1.policy_pb2.Policy: + Defines an Identity and Access Management (IAM) policy. It is used to + specify access control policies for Cloud Platform + resources. + + A Policy is a collection of bindings. A binding binds + one or more members to a single role. Members can be + user accounts, service accounts, Google groups, and + domains (such as G Suite). A role is a named list of + permissions (defined by IAM or configured by users). + A binding can optionally specify a condition, which + is a logic expression that further constrains the + role binding based on attributes about the request + and/or target resource. 
+ + **JSON Example** + + { + "bindings": [ + { + "role": + "roles/resourcemanager.organizationAdmin", + "members": [ "user:mike@example.com", + "group:admins@example.com", + "domain:google.com", + "serviceAccount:my-project-id@appspot.gserviceaccount.com" + ] + + }, { "role": + "roles/resourcemanager.organizationViewer", + "members": ["user:eve@example.com"], + "condition": { "title": "expirable access", + "description": "Does not grant access after + Sep 2020", "expression": "request.time < + timestamp('2020-10-01T00:00:00.000Z')", } } + + ] + + } + + **YAML Example** + + bindings: - members: - user:\ mike@example.com - + group:\ admins@example.com - domain:google.com - + serviceAccount:\ my-project-id@appspot.gserviceaccount.com + role: roles/resourcemanager.organizationAdmin - + members: - user:\ eve@example.com role: + roles/resourcemanager.organizationViewer + condition: title: expirable access description: + Does not grant access after Sep 2020 expression: + request.time < + timestamp('2020-10-01T00:00:00.000Z') + + For a description of IAM and its features, see the + [IAM developer's + guide](\ https://cloud.google.com/iam/docs). """ # Create or coerce a protobuf request object. @@ -2716,7 +2745,7 @@ async def test_iam_permissions( this request. Args: - request (:class:`~.iam_policy.TestIamPermissionsRequest`): + request (:class:`google.iam.v1.iam_policy_pb2.TestIamPermissionsRequest`): The request object. Request message for `TestIamPermissions` method. @@ -2727,8 +2756,8 @@ async def test_iam_permissions( sent along with the request as metadata. Returns: - ~.iam_policy.TestIamPermissionsResponse: - Response message for ``TestIamPermissions`` method. + google.iam.v1.iam_policy_pb2.TestIamPermissionsResponse: + Response message for TestIamPermissions method. """ # Create or coerce a protobuf request object. diff --git a/google/cloud/datacatalog_v1/services/data_catalog/client.py b/google/cloud/datacatalog_v1/services/data_catalog/client.py index dae42e6a..a9663871 100644 --- a/google/cloud/datacatalog_v1/services/data_catalog/client.py +++ b/google/cloud/datacatalog_v1/services/data_catalog/client.py @@ -120,6 +120,22 @@ def _get_default_mtls_endpoint(api_endpoint): DEFAULT_ENDPOINT ) + @classmethod + def from_service_account_info(cls, info: dict, *args, **kwargs): + """Creates an instance of this client using the provided credentials info. + + Args: + info (dict): The service account private key info. + args: Additional arguments to pass to the constructor. + kwargs: Additional arguments to pass to the constructor. + + Returns: + DataCatalogClient: The constructed client. + """ + credentials = service_account.Credentials.from_service_account_info(info) + kwargs["credentials"] = credentials + return cls(*args, **kwargs) + @classmethod def from_service_account_file(cls, filename: str, *args, **kwargs): """Creates an instance of this client using the provided credentials @@ -132,7 +148,7 @@ def from_service_account_file(cls, filename: str, *args, **kwargs): kwargs: Additional arguments to pass to the constructor. Returns: - {@api.name}: The constructed client. + DataCatalogClient: The constructed client. """ credentials = service_account.Credentials.from_service_account_file(filename) kwargs["credentials"] = credentials @@ -312,10 +328,10 @@ def __init__( credentials identify the application to the service; if none are specified, the client will attempt to ascertain the credentials from the environment. 
- transport (Union[str, ~.DataCatalogTransport]): The + transport (Union[str, DataCatalogTransport]): The transport to use. If set to None, a transport is chosen automatically. - client_options (client_options_lib.ClientOptions): Custom options for the + client_options (google.api_core.client_options.ClientOptions): Custom options for the client. It won't take effect if a ``transport`` instance is provided. (1) The ``api_endpoint`` property can be used to override the default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT @@ -442,19 +458,20 @@ def search_catalog( for more information. Args: - request (:class:`~.datacatalog.SearchCatalogRequest`): + request (google.cloud.datacatalog_v1.types.SearchCatalogRequest): The request object. Request message for [SearchCatalog][google.cloud.datacatalog.v1.DataCatalog.SearchCatalog]. - scope (:class:`~.datacatalog.SearchCatalogRequest.Scope`): + scope (google.cloud.datacatalog_v1.types.SearchCatalogRequest.Scope): Required. The scope of this search request. A ``scope`` that has empty ``include_org_ids``, ``include_project_ids`` AND false ``include_gcp_public_datasets`` is considered invalid. Data Catalog will return an error in such a case. + This corresponds to the ``scope`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - query (:class:`str`): + query (str): Required. The query string in search query syntax. The query must be non-empty. @@ -469,6 +486,7 @@ def search_catalog( `Data Catalog Search Syntax `__ for more information. + This corresponds to the ``query`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -480,9 +498,9 @@ def search_catalog( sent along with the request as metadata. Returns: - ~.pagers.SearchCatalogPager: + google.cloud.datacatalog_v1.services.data_catalog.pagers.SearchCatalogPager: Response message for - [SearchCatalog][google.cloud.datacatalog.v1.DataCatalog.SearchCatalog]. + [SearchCatalog][google.cloud.datacatalog.v1.DataCatalog.SearchCatalog]. Iterating over this object will yield results and resolve additional pages automatically. @@ -563,10 +581,10 @@ def create_entry_group( for more information). Args: - request (:class:`~.datacatalog.CreateEntryGroupRequest`): + request (google.cloud.datacatalog_v1.types.CreateEntryGroupRequest): The request object. Request message for [CreateEntryGroup][google.cloud.datacatalog.v1.DataCatalog.CreateEntryGroup]. - parent (:class:`str`): + parent (str): Required. The name of the project this entry group is in. Example: @@ -574,22 +592,25 @@ def create_entry_group( Note that this EntryGroup and its child resources may not actually be stored in the location in this name. + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - entry_group_id (:class:`str`): + entry_group_id (str): Required. The id of the entry group to create. The id must begin with a letter or underscore, contain only English letters, numbers and underscores, and be at most 64 characters. + This corresponds to the ``entry_group_id`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - entry_group (:class:`~.datacatalog.EntryGroup`): + entry_group (google.cloud.datacatalog_v1.types.EntryGroup): The entry group to create. Defaults to an empty entry group. + This corresponds to the ``entry_group`` field on the ``request`` instance; if ``request`` is provided, this should not be set. 
@@ -601,10 +622,11 @@ def create_entry_group( sent along with the request as metadata. Returns: - ~.datacatalog.EntryGroup: - EntryGroup Metadata. An EntryGroup resource represents a - logical grouping of zero or more Data Catalog - [Entry][google.cloud.datacatalog.v1.Entry] resources. + google.cloud.datacatalog_v1.types.EntryGroup: + EntryGroup Metadata. + An EntryGroup resource represents a logical grouping + of zero or more Data Catalog + [Entry][google.cloud.datacatalog.v1.Entry] resources. """ # Create or coerce a protobuf request object. @@ -663,18 +685,20 @@ def get_entry_group( r"""Gets an EntryGroup. Args: - request (:class:`~.datacatalog.GetEntryGroupRequest`): + request (google.cloud.datacatalog_v1.types.GetEntryGroupRequest): The request object. Request message for [GetEntryGroup][google.cloud.datacatalog.v1.DataCatalog.GetEntryGroup]. - name (:class:`str`): + name (str): Required. The name of the entry group. For example, ``projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}``. + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - read_mask (:class:`~.field_mask.FieldMask`): + read_mask (google.protobuf.field_mask_pb2.FieldMask): The fields to return. If not set or empty, all fields are returned. + This corresponds to the ``read_mask`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -686,10 +710,11 @@ def get_entry_group( sent along with the request as metadata. Returns: - ~.datacatalog.EntryGroup: - EntryGroup Metadata. An EntryGroup resource represents a - logical grouping of zero or more Data Catalog - [Entry][google.cloud.datacatalog.v1.Entry] resources. + google.cloud.datacatalog_v1.types.EntryGroup: + EntryGroup Metadata. + An EntryGroup resource represents a logical grouping + of zero or more Data Catalog + [Entry][google.cloud.datacatalog.v1.Entry] resources. """ # Create or coerce a protobuf request object. @@ -750,19 +775,21 @@ def update_entry_group( for more information). Args: - request (:class:`~.datacatalog.UpdateEntryGroupRequest`): + request (google.cloud.datacatalog_v1.types.UpdateEntryGroupRequest): The request object. Request message for [UpdateEntryGroup][google.cloud.datacatalog.v1.DataCatalog.UpdateEntryGroup]. - entry_group (:class:`~.datacatalog.EntryGroup`): + entry_group (google.cloud.datacatalog_v1.types.EntryGroup): Required. The updated entry group. "name" field must be set. + This corresponds to the ``entry_group`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - update_mask (:class:`~.field_mask.FieldMask`): + update_mask (google.protobuf.field_mask_pb2.FieldMask): The fields to update on the entry group. If absent or empty, all modifiable fields are updated. + This corresponds to the ``update_mask`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -774,10 +801,11 @@ def update_entry_group( sent along with the request as metadata. Returns: - ~.datacatalog.EntryGroup: - EntryGroup Metadata. An EntryGroup resource represents a - logical grouping of zero or more Data Catalog - [Entry][google.cloud.datacatalog.v1.Entry] resources. + google.cloud.datacatalog_v1.types.EntryGroup: + EntryGroup Metadata. + An EntryGroup resource represents a logical grouping + of zero or more Data Catalog + [Entry][google.cloud.datacatalog.v1.Entry] resources. """ # Create or coerce a protobuf request object. 
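As an illustrative aside on the calling surface these updated docstrings describe: the sketch below is not taken from the patch (the project, location, and entry group identifiers are placeholders) and simply shows how the flattened keyword arguments of ``create_entry_group`` and the newly added ``from_service_account_info`` constructor might be used together.

    from google.cloud import datacatalog_v1

    # Either rely on ambient credentials...
    client = datacatalog_v1.DataCatalogClient()
    # ...or, per the classmethod added above, build the client from an
    # in-memory service account key dict (placeholder variable `key_info`):
    # client = datacatalog_v1.DataCatalogClient.from_service_account_info(key_info)

    # Placeholder identifiers, for illustration only.
    parent = "projects/my-project/locations/us-central1"

    entry_group = client.create_entry_group(
        parent=parent,
        # Per the docstring: must begin with a letter or underscore and contain
        # only English letters, numbers, and underscores (at most 64 characters).
        entry_group_id="my_entry_group",
        entry_group=datacatalog_v1.EntryGroup(display_name="Example group"),
    )
    print(entry_group.name)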
@@ -840,12 +868,13 @@ def delete_entry_group( for more information). Args: - request (:class:`~.datacatalog.DeleteEntryGroupRequest`): + request (google.cloud.datacatalog_v1.types.DeleteEntryGroupRequest): The request object. Request message for [DeleteEntryGroup][google.cloud.datacatalog.v1.DataCatalog.DeleteEntryGroup]. - name (:class:`str`): + name (str): Required. The name of the entry group. For example, ``projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}``. + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -906,15 +935,16 @@ def list_entry_groups( r"""Lists entry groups. Args: - request (:class:`~.datacatalog.ListEntryGroupsRequest`): + request (google.cloud.datacatalog_v1.types.ListEntryGroupsRequest): The request object. Request message for [ListEntryGroups][google.cloud.datacatalog.v1.DataCatalog.ListEntryGroups]. - parent (:class:`str`): + parent (str): Required. The name of the location that contains the entry groups, which can be provided in URL format. Example: - projects/{project_id}/locations/{location} + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -926,9 +956,9 @@ def list_entry_groups( sent along with the request as metadata. Returns: - ~.pagers.ListEntryGroupsPager: + google.cloud.datacatalog_v1.services.data_catalog.pagers.ListEntryGroupsPager: Response message for - [ListEntryGroups][google.cloud.datacatalog.v1.DataCatalog.ListEntryGroups]. + [ListEntryGroups][google.cloud.datacatalog.v1.DataCatalog.ListEntryGroups]. Iterating over this object will yield results and resolve additional pages automatically. @@ -1002,10 +1032,10 @@ def create_entry( A maximum of 100,000 entries may be created per entry group. Args: - request (:class:`~.datacatalog.CreateEntryRequest`): + request (google.cloud.datacatalog_v1.types.CreateEntryRequest): The request object. Request message for [CreateEntry][google.cloud.datacatalog.v1.DataCatalog.CreateEntry]. - parent (:class:`str`): + parent (str): Required. The name of the entry group this entry is in. Example: @@ -1013,16 +1043,18 @@ def create_entry( Note that this Entry and its child resources may not actually be stored in the location in this name. + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - entry_id (:class:`str`): + entry_id (str): Required. The id of the entry to create. + This corresponds to the ``entry_id`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - entry (:class:`~.datacatalog.Entry`): + entry (google.cloud.datacatalog_v1.types.Entry): Required. The entry to create. This corresponds to the ``entry`` field on the ``request`` instance; if ``request`` is provided, this @@ -1035,18 +1067,19 @@ def create_entry( sent along with the request as metadata. Returns: - ~.datacatalog.Entry: - Entry Metadata. A Data Catalog Entry resource represents - another resource in Google Cloud Platform (such as a - BigQuery dataset or a Pub/Sub topic) or outside of - Google Cloud Platform. Clients can use the - ``linked_resource`` field in the Entry resource to refer - to the original resource ID of the source system. - - An Entry resource contains resource details, such as its - schema. An Entry can also be used to attach flexible - metadata, such as a - [Tag][google.cloud.datacatalog.v1.Tag]. 
+ google.cloud.datacatalog_v1.types.Entry: + Entry Metadata. + A Data Catalog Entry resource represents another + resource in Google Cloud Platform (such as a BigQuery + dataset or a Pub/Sub topic) or outside of Google + Cloud Platform. Clients can use the linked_resource + field in the Entry resource to refer to the original + resource ID of the source system. + + An Entry resource contains resource details, such as + its schema. An Entry can also be used to attach + flexible metadata, such as a + [Tag][google.cloud.datacatalog.v1.Tag]. """ # Create or coerce a protobuf request object. @@ -1109,16 +1142,17 @@ def update_entry( for more information). Args: - request (:class:`~.datacatalog.UpdateEntryRequest`): + request (google.cloud.datacatalog_v1.types.UpdateEntryRequest): The request object. Request message for [UpdateEntry][google.cloud.datacatalog.v1.DataCatalog.UpdateEntry]. - entry (:class:`~.datacatalog.Entry`): + entry (google.cloud.datacatalog_v1.types.Entry): Required. The updated entry. The "name" field must be set. + This corresponds to the ``entry`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - update_mask (:class:`~.field_mask.FieldMask`): + update_mask (google.protobuf.field_mask_pb2.FieldMask): The fields to update on the entry. If absent or empty, all modifiable fields are updated. @@ -1145,6 +1179,7 @@ def update_entry( - user_specified_system - linked_resource - source_system_timestamps + This corresponds to the ``update_mask`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1156,18 +1191,19 @@ def update_entry( sent along with the request as metadata. Returns: - ~.datacatalog.Entry: - Entry Metadata. A Data Catalog Entry resource represents - another resource in Google Cloud Platform (such as a - BigQuery dataset or a Pub/Sub topic) or outside of - Google Cloud Platform. Clients can use the - ``linked_resource`` field in the Entry resource to refer - to the original resource ID of the source system. - - An Entry resource contains resource details, such as its - schema. An Entry can also be used to attach flexible - metadata, such as a - [Tag][google.cloud.datacatalog.v1.Tag]. + google.cloud.datacatalog_v1.types.Entry: + Entry Metadata. + A Data Catalog Entry resource represents another + resource in Google Cloud Platform (such as a BigQuery + dataset or a Pub/Sub topic) or outside of Google + Cloud Platform. Clients can use the linked_resource + field in the Entry resource to refer to the original + resource ID of the source system. + + An Entry resource contains resource details, such as + its schema. An Entry can also be used to attach + flexible metadata, such as a + [Tag][google.cloud.datacatalog.v1.Tag]. """ # Create or coerce a protobuf request object. @@ -1231,13 +1267,14 @@ def delete_entry( for more information). Args: - request (:class:`~.datacatalog.DeleteEntryRequest`): + request (google.cloud.datacatalog_v1.types.DeleteEntryRequest): The request object. Request message for [DeleteEntry][google.cloud.datacatalog.v1.DataCatalog.DeleteEntry]. - name (:class:`str`): + name (str): Required. The name of the entry. Example: - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1298,13 +1335,14 @@ def get_entry( r"""Gets an entry. 
Args: - request (:class:`~.datacatalog.GetEntryRequest`): + request (google.cloud.datacatalog_v1.types.GetEntryRequest): The request object. Request message for [GetEntry][google.cloud.datacatalog.v1.DataCatalog.GetEntry]. - name (:class:`str`): + name (str): Required. The name of the entry. Example: - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1316,18 +1354,19 @@ def get_entry( sent along with the request as metadata. Returns: - ~.datacatalog.Entry: - Entry Metadata. A Data Catalog Entry resource represents - another resource in Google Cloud Platform (such as a - BigQuery dataset or a Pub/Sub topic) or outside of - Google Cloud Platform. Clients can use the - ``linked_resource`` field in the Entry resource to refer - to the original resource ID of the source system. - - An Entry resource contains resource details, such as its - schema. An Entry can also be used to attach flexible - metadata, such as a - [Tag][google.cloud.datacatalog.v1.Tag]. + google.cloud.datacatalog_v1.types.Entry: + Entry Metadata. + A Data Catalog Entry resource represents another + resource in Google Cloud Platform (such as a BigQuery + dataset or a Pub/Sub topic) or outside of Google + Cloud Platform. Clients can use the linked_resource + field in the Entry resource to refer to the original + resource ID of the source system. + + An Entry resource contains resource details, such as + its schema. An Entry can also be used to attach + flexible metadata, such as a + [Tag][google.cloud.datacatalog.v1.Tag]. """ # Create or coerce a protobuf request object. @@ -1383,7 +1422,7 @@ def lookup_entry( Entry. Args: - request (:class:`~.datacatalog.LookupEntryRequest`): + request (google.cloud.datacatalog_v1.types.LookupEntryRequest): The request object. Request message for [LookupEntry][google.cloud.datacatalog.v1.DataCatalog.LookupEntry]. @@ -1394,18 +1433,19 @@ def lookup_entry( sent along with the request as metadata. Returns: - ~.datacatalog.Entry: - Entry Metadata. A Data Catalog Entry resource represents - another resource in Google Cloud Platform (such as a - BigQuery dataset or a Pub/Sub topic) or outside of - Google Cloud Platform. Clients can use the - ``linked_resource`` field in the Entry resource to refer - to the original resource ID of the source system. - - An Entry resource contains resource details, such as its - schema. An Entry can also be used to attach flexible - metadata, such as a - [Tag][google.cloud.datacatalog.v1.Tag]. + google.cloud.datacatalog_v1.types.Entry: + Entry Metadata. + A Data Catalog Entry resource represents another + resource in Google Cloud Platform (such as a BigQuery + dataset or a Pub/Sub topic) or outside of Google + Cloud Platform. Clients can use the linked_resource + field in the Entry resource to refer to the original + resource ID of the source system. + + An Entry resource contains resource details, such as + its schema. An Entry can also be used to attach + flexible metadata, such as a + [Tag][google.cloud.datacatalog.v1.Tag]. """ # Create or coerce a protobuf request object. @@ -1439,14 +1479,15 @@ def list_entries( r"""Lists entries. Args: - request (:class:`~.datacatalog.ListEntriesRequest`): + request (google.cloud.datacatalog_v1.types.ListEntriesRequest): The request object. Request message for [ListEntries][google.cloud.datacatalog.v1.DataCatalog.ListEntries]. 
- parent (:class:`str`): + parent (str): Required. The name of the entry group that contains the entries, which can be provided in URL format. Example: - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1458,9 +1499,9 @@ def list_entries( sent along with the request as metadata. Returns: - ~.pagers.ListEntriesPager: + google.cloud.datacatalog_v1.services.data_catalog.pagers.ListEntriesPager: Response message for - [ListEntries][google.cloud.datacatalog.v1.DataCatalog.ListEntries]. + [ListEntries][google.cloud.datacatalog.v1.DataCatalog.ListEntries]. Iterating over this object will yield results and resolve additional pages automatically. @@ -1529,10 +1570,10 @@ def create_tag_template( for more information). Args: - request (:class:`~.datacatalog.CreateTagTemplateRequest`): + request (google.cloud.datacatalog_v1.types.CreateTagTemplateRequest): The request object. Request message for [CreateTagTemplate][google.cloud.datacatalog.v1.DataCatalog.CreateTagTemplate]. - parent (:class:`str`): + parent (str): Required. The name of the project and the template location `region `__. @@ -1540,16 +1581,18 @@ def create_tag_template( Example: - projects/{project_id}/locations/us-central1 + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - tag_template_id (:class:`str`): + tag_template_id (str): Required. The id of the tag template to create. + This corresponds to the ``tag_template_id`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - tag_template (:class:`~.tags.TagTemplate`): + tag_template (google.cloud.datacatalog_v1.types.TagTemplate): Required. The tag template to create. This corresponds to the ``tag_template`` field on the ``request`` instance; if ``request`` is provided, this @@ -1562,16 +1605,16 @@ def create_tag_template( sent along with the request as metadata. Returns: - ~.tags.TagTemplate: - A tag template defines a tag, which can have one or more - typed fields. The template is used to create and attach - the tag to GCP resources. `Tag template - roles `__ - provide permissions to create, edit, and use the - template. See, for example, the `TagTemplate - User `__ - role, which includes permission to use the tag template - to tag resources. + google.cloud.datacatalog_v1.types.TagTemplate: + A tag template defines a tag, which can have one or more typed fields. + The template is used to create and attach the tag to + GCP resources. [Tag template + roles](\ https://cloud.google.com/iam/docs/understanding-roles#data-catalog-roles) + provide permissions to create, edit, and use the + template. See, for example, the [TagTemplate + User](\ https://cloud.google.com/data-catalog/docs/how-to/template-user) + role, which includes permission to use the tag + template to tag resources. """ # Create or coerce a protobuf request object. @@ -1629,13 +1672,14 @@ def get_tag_template( r"""Gets a tag template. Args: - request (:class:`~.datacatalog.GetTagTemplateRequest`): + request (google.cloud.datacatalog_v1.types.GetTagTemplateRequest): The request object. Request message for [GetTagTemplate][google.cloud.datacatalog.v1.DataCatalog.GetTagTemplate]. - name (:class:`str`): + name (str): Required. The name of the tag template. 
Example: - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1647,16 +1691,16 @@ def get_tag_template( sent along with the request as metadata. Returns: - ~.tags.TagTemplate: - A tag template defines a tag, which can have one or more - typed fields. The template is used to create and attach - the tag to GCP resources. `Tag template - roles `__ - provide permissions to create, edit, and use the - template. See, for example, the `TagTemplate - User `__ - role, which includes permission to use the tag template - to tag resources. + google.cloud.datacatalog_v1.types.TagTemplate: + A tag template defines a tag, which can have one or more typed fields. + The template is used to create and attach the tag to + GCP resources. [Tag template + roles](\ https://cloud.google.com/iam/docs/understanding-roles#data-catalog-roles) + provide permissions to create, edit, and use the + template. See, for example, the [TagTemplate + User](\ https://cloud.google.com/data-catalog/docs/how-to/template-user) + role, which includes permission to use the tag + template to tag resources. """ # Create or coerce a protobuf request object. @@ -1719,16 +1763,17 @@ def update_tag_template( for more information). Args: - request (:class:`~.datacatalog.UpdateTagTemplateRequest`): + request (google.cloud.datacatalog_v1.types.UpdateTagTemplateRequest): The request object. Request message for [UpdateTagTemplate][google.cloud.datacatalog.v1.DataCatalog.UpdateTagTemplate]. - tag_template (:class:`~.tags.TagTemplate`): + tag_template (google.cloud.datacatalog_v1.types.TagTemplate): Required. The template to update. The "name" field must be set. + This corresponds to the ``tag_template`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - update_mask (:class:`~.field_mask.FieldMask`): + update_mask (google.protobuf.field_mask_pb2.FieldMask): The field mask specifies the parts of the template to overwrite. @@ -1738,6 +1783,7 @@ def update_tag_template( If absent or empty, all of the allowed fields above will be updated. + This corresponds to the ``update_mask`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1749,16 +1795,16 @@ def update_tag_template( sent along with the request as metadata. Returns: - ~.tags.TagTemplate: - A tag template defines a tag, which can have one or more - typed fields. The template is used to create and attach - the tag to GCP resources. `Tag template - roles `__ - provide permissions to create, edit, and use the - template. See, for example, the `TagTemplate - User `__ - role, which includes permission to use the tag template - to tag resources. + google.cloud.datacatalog_v1.types.TagTemplate: + A tag template defines a tag, which can have one or more typed fields. + The template is used to create and attach the tag to + GCP resources. [Tag template + roles](\ https://cloud.google.com/iam/docs/understanding-roles#data-catalog-roles) + provide permissions to create, edit, and use the + template. See, for example, the [TagTemplate + User](\ https://cloud.google.com/data-catalog/docs/how-to/template-user) + role, which includes permission to use the tag + template to tag resources. 
""" # Create or coerce a protobuf request object. @@ -1821,22 +1867,24 @@ def delete_tag_template( for more information). Args: - request (:class:`~.datacatalog.DeleteTagTemplateRequest`): + request (google.cloud.datacatalog_v1.types.DeleteTagTemplateRequest): The request object. Request message for [DeleteTagTemplate][google.cloud.datacatalog.v1.DataCatalog.DeleteTagTemplate]. - name (:class:`str`): + name (str): Required. The name of the tag template to delete. Example: - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - force (:class:`bool`): + force (bool): Required. Currently, this field must always be set to ``true``. This confirms the deletion of any possible tags using this template. ``force = false`` will be supported in the future. + This corresponds to the ``force`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1905,10 +1953,10 @@ def create_tag_template_field( for more information). Args: - request (:class:`~.datacatalog.CreateTagTemplateFieldRequest`): + request (google.cloud.datacatalog_v1.types.CreateTagTemplateFieldRequest): The request object. Request message for [CreateTagTemplateField][google.cloud.datacatalog.v1.DataCatalog.CreateTagTemplateField]. - parent (:class:`str`): + parent (str): Required. The name of the project and the template location `region `__. @@ -1916,22 +1964,25 @@ def create_tag_template_field( Example: - projects/{project_id}/locations/us-central1/tagTemplates/{tag_template_id} + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - tag_template_field_id (:class:`str`): + tag_template_field_id (str): Required. The ID of the tag template field to create. Field ids can contain letters (both uppercase and lowercase), numbers (0-9), underscores (_) and dashes (-). Field IDs must be at least 1 character long and at most 128 characters long. Field IDs must also be unique within their template. + This corresponds to the ``tag_template_field_id`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - tag_template_field (:class:`~.tags.TagTemplateField`): + tag_template_field (google.cloud.datacatalog_v1.types.TagTemplateField): Required. The tag template field to create. + This corresponds to the ``tag_template_field`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1943,7 +1994,7 @@ def create_tag_template_field( sent along with the request as metadata. Returns: - ~.tags.TagTemplateField: + google.cloud.datacatalog_v1.types.TagTemplateField: The template for an individual field within a tag template. @@ -2012,22 +2063,23 @@ def update_tag_template_field( for more information). Args: - request (:class:`~.datacatalog.UpdateTagTemplateFieldRequest`): + request (google.cloud.datacatalog_v1.types.UpdateTagTemplateFieldRequest): The request object. Request message for [UpdateTagTemplateField][google.cloud.datacatalog.v1.DataCatalog.UpdateTagTemplateField]. - name (:class:`str`): + name (str): Required. The name of the tag template field. Example: - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. 
- tag_template_field (:class:`~.tags.TagTemplateField`): + tag_template_field (google.cloud.datacatalog_v1.types.TagTemplateField): Required. The template to update. This corresponds to the ``tag_template_field`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - update_mask (:class:`~.field_mask.FieldMask`): + update_mask (google.protobuf.field_mask_pb2.FieldMask): Optional. The field mask specifies the parts of the template to be updated. Allowed fields: @@ -2043,6 +2095,7 @@ def update_tag_template_field( can only be added, existing enum values cannot be deleted nor renamed. Updating a template field from optional to required is NOT allowed. + This corresponds to the ``update_mask`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -2054,7 +2107,7 @@ def update_tag_template_field( sent along with the request as metadata. Returns: - ~.tags.TagTemplateField: + google.cloud.datacatalog_v1.types.TagTemplateField: The template for an individual field within a tag template. @@ -2121,19 +2174,21 @@ def rename_tag_template_field( for more information). Args: - request (:class:`~.datacatalog.RenameTagTemplateFieldRequest`): + request (google.cloud.datacatalog_v1.types.RenameTagTemplateFieldRequest): The request object. Request message for [RenameTagTemplateField][google.cloud.datacatalog.v1.DataCatalog.RenameTagTemplateField]. - name (:class:`str`): + name (str): Required. The name of the tag template. Example: - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - new_tag_template_field_id (:class:`str`): + new_tag_template_field_id (str): Required. The new ID of this tag template field. For example, ``my_new_field``. + This corresponds to the ``new_tag_template_field_id`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -2145,7 +2200,7 @@ def rename_tag_template_field( sent along with the request as metadata. Returns: - ~.tags.TagTemplateField: + google.cloud.datacatalog_v1.types.TagTemplateField: The template for an individual field within a tag template. @@ -2211,22 +2266,24 @@ def delete_tag_template_field( for more information). Args: - request (:class:`~.datacatalog.DeleteTagTemplateFieldRequest`): + request (google.cloud.datacatalog_v1.types.DeleteTagTemplateFieldRequest): The request object. Request message for [DeleteTagTemplateField][google.cloud.datacatalog.v1.DataCatalog.DeleteTagTemplateField]. - name (:class:`str`): + name (str): Required. The name of the tag template field to delete. Example: - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - force (:class:`bool`): + force (bool): Required. Currently, this field must always be set to ``true``. This confirms the deletion of this field from any tags using this field. ``force = false`` will be supported in the future. + This corresponds to the ``force`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -2297,10 +2354,10 @@ def create_tag( used to create the tag must be from the same organization. 
Args: - request (:class:`~.datacatalog.CreateTagRequest`): + request (google.cloud.datacatalog_v1.types.CreateTagRequest): The request object. Request message for [CreateTag][google.cloud.datacatalog.v1.DataCatalog.CreateTag]. - parent (:class:`str`): + parent (str): Required. The name of the resource to attach this tag to. Tags can be attached to Entries. Example: @@ -2308,10 +2365,11 @@ def create_tag( Note that this Tag and its child resources may not actually be stored in the location in this name. + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - tag (:class:`~.tags.Tag`): + tag (google.cloud.datacatalog_v1.types.Tag): Required. The tag to create. This corresponds to the ``tag`` field on the ``request`` instance; if ``request`` is provided, this @@ -2324,15 +2382,15 @@ def create_tag( sent along with the request as metadata. Returns: - ~.tags.Tag: - Tags are used to attach custom metadata to Data Catalog - resources. Tags conform to the specifications within - their tag template. + google.cloud.datacatalog_v1.types.Tag: + Tags are used to attach custom metadata to Data Catalog resources. Tags + conform to the specifications within their tag + template. - See `Data Catalog - IAM `__ - for information on the permissions needed to create or - view tags. + See [Data Catalog + IAM](\ https://cloud.google.com/data-catalog/docs/concepts/iam) + for information on the permissions needed to create + or view tags. """ # Create or coerce a protobuf request object. @@ -2389,19 +2447,21 @@ def update_tag( r"""Updates an existing tag. Args: - request (:class:`~.datacatalog.UpdateTagRequest`): + request (google.cloud.datacatalog_v1.types.UpdateTagRequest): The request object. Request message for [UpdateTag][google.cloud.datacatalog.v1.DataCatalog.UpdateTag]. - tag (:class:`~.tags.Tag`): + tag (google.cloud.datacatalog_v1.types.Tag): Required. The updated tag. The "name" field must be set. + This corresponds to the ``tag`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - update_mask (:class:`~.field_mask.FieldMask`): + update_mask (google.protobuf.field_mask_pb2.FieldMask): The fields to update on the Tag. If absent or empty, all modifiable fields are updated. Currently the only modifiable field is the field ``fields``. + This corresponds to the ``update_mask`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -2413,15 +2473,15 @@ def update_tag( sent along with the request as metadata. Returns: - ~.tags.Tag: - Tags are used to attach custom metadata to Data Catalog - resources. Tags conform to the specifications within - their tag template. + google.cloud.datacatalog_v1.types.Tag: + Tags are used to attach custom metadata to Data Catalog resources. Tags + conform to the specifications within their tag + template. - See `Data Catalog - IAM `__ - for information on the permissions needed to create or - view tags. + See [Data Catalog + IAM](\ https://cloud.google.com/data-catalog/docs/concepts/iam) + for information on the permissions needed to create + or view tags. """ # Create or coerce a protobuf request object. @@ -2477,13 +2537,14 @@ def delete_tag( r"""Deletes a tag. Args: - request (:class:`~.datacatalog.DeleteTagRequest`): + request (google.cloud.datacatalog_v1.types.DeleteTagRequest): The request object. 
Request message for [DeleteTag][google.cloud.datacatalog.v1.DataCatalog.DeleteTag]. - name (:class:`str`): + name (str): Required. The name of the tag to delete. Example: - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id}/tags/{tag_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -2544,10 +2605,10 @@ def list_tags( r"""Lists the tags on an [Entry][google.cloud.datacatalog.v1.Entry]. Args: - request (:class:`~.datacatalog.ListTagsRequest`): + request (google.cloud.datacatalog_v1.types.ListTagsRequest): The request object. Request message for [ListTags][google.cloud.datacatalog.v1.DataCatalog.ListTags]. - parent (:class:`str`): + parent (str): Required. The name of the Data Catalog resource to list the tags of. The resource could be an [Entry][google.cloud.datacatalog.v1.Entry] or an @@ -2557,6 +2618,7 @@ def list_tags( - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -2568,9 +2630,9 @@ def list_tags( sent along with the request as metadata. Returns: - ~.pagers.ListTagsPager: + google.cloud.datacatalog_v1.services.data_catalog.pagers.ListTagsPager: Response message for - [ListTags][google.cloud.datacatalog.v1.DataCatalog.ListTags]. + [ListTags][google.cloud.datacatalog.v1.DataCatalog.ListTags]. Iterating over this object will yield results and resolve additional pages automatically. @@ -2649,14 +2711,15 @@ def set_iam_policy( entry groups. Args: - request (:class:`~.iam_policy.SetIamPolicyRequest`): + request (google.iam.v1.iam_policy_pb2.SetIamPolicyRequest): The request object. Request message for `SetIamPolicy` method. - resource (:class:`str`): + resource (str): REQUIRED: The resource for which the policy is being specified. See the operation documentation for the appropriate value for this field. + This corresponds to the ``resource`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -2668,72 +2731,62 @@ def set_iam_policy( sent along with the request as metadata. Returns: - ~.policy.Policy: - Defines an Identity and Access Management (IAM) policy. - It is used to specify access control policies for Cloud - Platform resources. - - A ``Policy`` is a collection of ``bindings``. A - ``binding`` binds one or more ``members`` to a single - ``role``. Members can be user accounts, service - accounts, Google groups, and domains (such as G Suite). - A ``role`` is a named list of permissions (defined by - IAM or configured by users). A ``binding`` can - optionally specify a ``condition``, which is a logic - expression that further constrains the role binding - based on attributes about the request and/or target - resource. 
- - **JSON Example** - - :: - - { - "bindings": [ - { - "role": "roles/resourcemanager.organizationAdmin", - "members": [ - "user:mike@example.com", - "group:admins@example.com", - "domain:google.com", - "serviceAccount:my-project-id@appspot.gserviceaccount.com" - ] - }, - { - "role": "roles/resourcemanager.organizationViewer", - "members": ["user:eve@example.com"], - "condition": { - "title": "expirable access", - "description": "Does not grant access after Sep 2020", - "expression": "request.time < - timestamp('2020-10-01T00:00:00.000Z')", - } - } - ] - } - - **YAML Example** - - :: - - bindings: - - members: - - user:mike@example.com - - group:admins@example.com - - domain:google.com - - serviceAccount:my-project-id@appspot.gserviceaccount.com - role: roles/resourcemanager.organizationAdmin - - members: - - user:eve@example.com - role: roles/resourcemanager.organizationViewer - condition: - title: expirable access - description: Does not grant access after Sep 2020 - expression: request.time < timestamp('2020-10-01T00:00:00.000Z') - - For a description of IAM and its features, see the `IAM - developer's - guide `__. + google.iam.v1.policy_pb2.Policy: + Defines an Identity and Access Management (IAM) policy. It is used to + specify access control policies for Cloud Platform + resources. + + A Policy is a collection of bindings. A binding binds + one or more members to a single role. Members can be + user accounts, service accounts, Google groups, and + domains (such as G Suite). A role is a named list of + permissions (defined by IAM or configured by users). + A binding can optionally specify a condition, which + is a logic expression that further constrains the + role binding based on attributes about the request + and/or target resource. + + **JSON Example** + + { + "bindings": [ + { + "role": + "roles/resourcemanager.organizationAdmin", + "members": [ "user:mike@example.com", + "group:admins@example.com", + "domain:google.com", + "serviceAccount:my-project-id@appspot.gserviceaccount.com" + ] + + }, { "role": + "roles/resourcemanager.organizationViewer", + "members": ["user:eve@example.com"], + "condition": { "title": "expirable access", + "description": "Does not grant access after + Sep 2020", "expression": "request.time < + timestamp('2020-10-01T00:00:00.000Z')", } } + + ] + + } + + **YAML Example** + + bindings: - members: - user:\ mike@example.com - + group:\ admins@example.com - domain:google.com - + serviceAccount:\ my-project-id@appspot.gserviceaccount.com + role: roles/resourcemanager.organizationAdmin - + members: - user:\ eve@example.com role: + roles/resourcemanager.organizationViewer + condition: title: expirable access description: + Does not grant access after Sep 2020 expression: + request.time < + timestamp('2020-10-01T00:00:00.000Z') + + For a description of IAM and its features, see the + [IAM developer's + guide](\ https://cloud.google.com/iam/docs). """ # Create or coerce a protobuf request object. @@ -2802,14 +2855,15 @@ def get_iam_policy( entry groups. Args: - request (:class:`~.iam_policy.GetIamPolicyRequest`): + request (google.iam.v1.iam_policy_pb2.GetIamPolicyRequest): The request object. Request message for `GetIamPolicy` method. - resource (:class:`str`): + resource (str): REQUIRED: The resource for which the policy is being requested. See the operation documentation for the appropriate value for this field. 
+ This corresponds to the ``resource`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -2821,72 +2875,62 @@ def get_iam_policy( sent along with the request as metadata. Returns: - ~.policy.Policy: - Defines an Identity and Access Management (IAM) policy. - It is used to specify access control policies for Cloud - Platform resources. - - A ``Policy`` is a collection of ``bindings``. A - ``binding`` binds one or more ``members`` to a single - ``role``. Members can be user accounts, service - accounts, Google groups, and domains (such as G Suite). - A ``role`` is a named list of permissions (defined by - IAM or configured by users). A ``binding`` can - optionally specify a ``condition``, which is a logic - expression that further constrains the role binding - based on attributes about the request and/or target - resource. - - **JSON Example** - - :: - - { - "bindings": [ - { - "role": "roles/resourcemanager.organizationAdmin", - "members": [ - "user:mike@example.com", - "group:admins@example.com", - "domain:google.com", - "serviceAccount:my-project-id@appspot.gserviceaccount.com" - ] - }, - { - "role": "roles/resourcemanager.organizationViewer", - "members": ["user:eve@example.com"], - "condition": { - "title": "expirable access", - "description": "Does not grant access after Sep 2020", - "expression": "request.time < - timestamp('2020-10-01T00:00:00.000Z')", - } - } - ] - } - - **YAML Example** - - :: - - bindings: - - members: - - user:mike@example.com - - group:admins@example.com - - domain:google.com - - serviceAccount:my-project-id@appspot.gserviceaccount.com - role: roles/resourcemanager.organizationAdmin - - members: - - user:eve@example.com - role: roles/resourcemanager.organizationViewer - condition: - title: expirable access - description: Does not grant access after Sep 2020 - expression: request.time < timestamp('2020-10-01T00:00:00.000Z') - - For a description of IAM and its features, see the `IAM - developer's - guide `__. + google.iam.v1.policy_pb2.Policy: + Defines an Identity and Access Management (IAM) policy. It is used to + specify access control policies for Cloud Platform + resources. + + A Policy is a collection of bindings. A binding binds + one or more members to a single role. Members can be + user accounts, service accounts, Google groups, and + domains (such as G Suite). A role is a named list of + permissions (defined by IAM or configured by users). + A binding can optionally specify a condition, which + is a logic expression that further constrains the + role binding based on attributes about the request + and/or target resource. 
+ + **JSON Example** + + { + "bindings": [ + { + "role": + "roles/resourcemanager.organizationAdmin", + "members": [ "user:mike@example.com", + "group:admins@example.com", + "domain:google.com", + "serviceAccount:my-project-id@appspot.gserviceaccount.com" + ] + + }, { "role": + "roles/resourcemanager.organizationViewer", + "members": ["user:eve@example.com"], + "condition": { "title": "expirable access", + "description": "Does not grant access after + Sep 2020", "expression": "request.time < + timestamp('2020-10-01T00:00:00.000Z')", } } + + ] + + } + + **YAML Example** + + bindings: - members: - user:\ mike@example.com - + group:\ admins@example.com - domain:google.com - + serviceAccount:\ my-project-id@appspot.gserviceaccount.com + role: roles/resourcemanager.organizationAdmin - + members: - user:\ eve@example.com role: + roles/resourcemanager.organizationViewer + condition: title: expirable access description: + Does not grant access after Sep 2020 expression: + request.time < + timestamp('2020-10-01T00:00:00.000Z') + + For a description of IAM and its features, see the + [IAM developer's + guide](\ https://cloud.google.com/iam/docs). """ # Create or coerce a protobuf request object. @@ -2947,7 +2991,7 @@ def test_iam_permissions( this request. Args: - request (:class:`~.iam_policy.TestIamPermissionsRequest`): + request (google.iam.v1.iam_policy_pb2.TestIamPermissionsRequest): The request object. Request message for `TestIamPermissions` method. @@ -2958,8 +3002,8 @@ def test_iam_permissions( sent along with the request as metadata. Returns: - ~.iam_policy.TestIamPermissionsResponse: - Response message for ``TestIamPermissions`` method. + google.iam.v1.iam_policy_pb2.TestIamPermissionsResponse: + Response message for TestIamPermissions method. """ # Create or coerce a protobuf request object. diff --git a/google/cloud/datacatalog_v1/services/data_catalog/pagers.py b/google/cloud/datacatalog_v1/services/data_catalog/pagers.py index 05c81bfd..a5ce7581 100644 --- a/google/cloud/datacatalog_v1/services/data_catalog/pagers.py +++ b/google/cloud/datacatalog_v1/services/data_catalog/pagers.py @@ -26,7 +26,7 @@ class SearchCatalogPager: """A pager for iterating through ``search_catalog`` requests. This class thinly wraps an initial - :class:`~.datacatalog.SearchCatalogResponse` object, and + :class:`google.cloud.datacatalog_v1.types.SearchCatalogResponse` object, and provides an ``__iter__`` method to iterate through its ``results`` field. @@ -35,7 +35,7 @@ class SearchCatalogPager: through the ``results`` field on the corresponding responses. - All the usual :class:`~.datacatalog.SearchCatalogResponse` + All the usual :class:`google.cloud.datacatalog_v1.types.SearchCatalogResponse` attributes are available on the pager. If multiple requests are made, only the most recent response is retained, and thus used for attribute lookup. """ @@ -53,9 +53,9 @@ def __init__( Args: method (Callable): The method that was originally called, and which instantiated this pager. - request (:class:`~.datacatalog.SearchCatalogRequest`): + request (google.cloud.datacatalog_v1.types.SearchCatalogRequest): The initial request object. - response (:class:`~.datacatalog.SearchCatalogResponse`): + response (google.cloud.datacatalog_v1.types.SearchCatalogResponse): The initial response object. metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. 
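For context on the pager docstrings rewritten here, a minimal usage sketch of how ``SearchCatalogPager`` is typically consumed (illustrative only; the project id and query string are placeholders, not taken from the diff):

    from google.cloud import datacatalog_v1

    client = datacatalog_v1.DataCatalogClient()

    # A scope with no org ids, no project ids, and public datasets disabled is
    # considered invalid, so restrict the search to a (placeholder) project.
    scope = datacatalog_v1.SearchCatalogRequest.Scope(
        include_project_ids=["my-project"],
    )

    # The pager wraps SearchCatalogResponse and transparently fetches
    # additional pages while iterating over `results`.
    for result in client.search_catalog(scope=scope, query="system=bigquery"):
        print(result.relative_resource_name)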
@@ -88,7 +88,7 @@ class SearchCatalogAsyncPager: """A pager for iterating through ``search_catalog`` requests. This class thinly wraps an initial - :class:`~.datacatalog.SearchCatalogResponse` object, and + :class:`google.cloud.datacatalog_v1.types.SearchCatalogResponse` object, and provides an ``__aiter__`` method to iterate through its ``results`` field. @@ -97,7 +97,7 @@ class SearchCatalogAsyncPager: through the ``results`` field on the corresponding responses. - All the usual :class:`~.datacatalog.SearchCatalogResponse` + All the usual :class:`google.cloud.datacatalog_v1.types.SearchCatalogResponse` attributes are available on the pager. If multiple requests are made, only the most recent response is retained, and thus used for attribute lookup. """ @@ -115,9 +115,9 @@ def __init__( Args: method (Callable): The method that was originally called, and which instantiated this pager. - request (:class:`~.datacatalog.SearchCatalogRequest`): + request (google.cloud.datacatalog_v1.types.SearchCatalogRequest): The initial request object. - response (:class:`~.datacatalog.SearchCatalogResponse`): + response (google.cloud.datacatalog_v1.types.SearchCatalogResponse): The initial response object. metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. @@ -154,7 +154,7 @@ class ListEntryGroupsPager: """A pager for iterating through ``list_entry_groups`` requests. This class thinly wraps an initial - :class:`~.datacatalog.ListEntryGroupsResponse` object, and + :class:`google.cloud.datacatalog_v1.types.ListEntryGroupsResponse` object, and provides an ``__iter__`` method to iterate through its ``entry_groups`` field. @@ -163,7 +163,7 @@ class ListEntryGroupsPager: through the ``entry_groups`` field on the corresponding responses. - All the usual :class:`~.datacatalog.ListEntryGroupsResponse` + All the usual :class:`google.cloud.datacatalog_v1.types.ListEntryGroupsResponse` attributes are available on the pager. If multiple requests are made, only the most recent response is retained, and thus used for attribute lookup. """ @@ -181,9 +181,9 @@ def __init__( Args: method (Callable): The method that was originally called, and which instantiated this pager. - request (:class:`~.datacatalog.ListEntryGroupsRequest`): + request (google.cloud.datacatalog_v1.types.ListEntryGroupsRequest): The initial request object. - response (:class:`~.datacatalog.ListEntryGroupsResponse`): + response (google.cloud.datacatalog_v1.types.ListEntryGroupsResponse): The initial response object. metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. @@ -216,7 +216,7 @@ class ListEntryGroupsAsyncPager: """A pager for iterating through ``list_entry_groups`` requests. This class thinly wraps an initial - :class:`~.datacatalog.ListEntryGroupsResponse` object, and + :class:`google.cloud.datacatalog_v1.types.ListEntryGroupsResponse` object, and provides an ``__aiter__`` method to iterate through its ``entry_groups`` field. @@ -225,7 +225,7 @@ class ListEntryGroupsAsyncPager: through the ``entry_groups`` field on the corresponding responses. - All the usual :class:`~.datacatalog.ListEntryGroupsResponse` + All the usual :class:`google.cloud.datacatalog_v1.types.ListEntryGroupsResponse` attributes are available on the pager. If multiple requests are made, only the most recent response is retained, and thus used for attribute lookup. 
""" @@ -243,9 +243,9 @@ def __init__( Args: method (Callable): The method that was originally called, and which instantiated this pager. - request (:class:`~.datacatalog.ListEntryGroupsRequest`): + request (google.cloud.datacatalog_v1.types.ListEntryGroupsRequest): The initial request object. - response (:class:`~.datacatalog.ListEntryGroupsResponse`): + response (google.cloud.datacatalog_v1.types.ListEntryGroupsResponse): The initial response object. metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. @@ -282,7 +282,7 @@ class ListEntriesPager: """A pager for iterating through ``list_entries`` requests. This class thinly wraps an initial - :class:`~.datacatalog.ListEntriesResponse` object, and + :class:`google.cloud.datacatalog_v1.types.ListEntriesResponse` object, and provides an ``__iter__`` method to iterate through its ``entries`` field. @@ -291,7 +291,7 @@ class ListEntriesPager: through the ``entries`` field on the corresponding responses. - All the usual :class:`~.datacatalog.ListEntriesResponse` + All the usual :class:`google.cloud.datacatalog_v1.types.ListEntriesResponse` attributes are available on the pager. If multiple requests are made, only the most recent response is retained, and thus used for attribute lookup. """ @@ -309,9 +309,9 @@ def __init__( Args: method (Callable): The method that was originally called, and which instantiated this pager. - request (:class:`~.datacatalog.ListEntriesRequest`): + request (google.cloud.datacatalog_v1.types.ListEntriesRequest): The initial request object. - response (:class:`~.datacatalog.ListEntriesResponse`): + response (google.cloud.datacatalog_v1.types.ListEntriesResponse): The initial response object. metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. @@ -344,7 +344,7 @@ class ListEntriesAsyncPager: """A pager for iterating through ``list_entries`` requests. This class thinly wraps an initial - :class:`~.datacatalog.ListEntriesResponse` object, and + :class:`google.cloud.datacatalog_v1.types.ListEntriesResponse` object, and provides an ``__aiter__`` method to iterate through its ``entries`` field. @@ -353,7 +353,7 @@ class ListEntriesAsyncPager: through the ``entries`` field on the corresponding responses. - All the usual :class:`~.datacatalog.ListEntriesResponse` + All the usual :class:`google.cloud.datacatalog_v1.types.ListEntriesResponse` attributes are available on the pager. If multiple requests are made, only the most recent response is retained, and thus used for attribute lookup. """ @@ -371,9 +371,9 @@ def __init__( Args: method (Callable): The method that was originally called, and which instantiated this pager. - request (:class:`~.datacatalog.ListEntriesRequest`): + request (google.cloud.datacatalog_v1.types.ListEntriesRequest): The initial request object. - response (:class:`~.datacatalog.ListEntriesResponse`): + response (google.cloud.datacatalog_v1.types.ListEntriesResponse): The initial response object. metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. @@ -410,7 +410,7 @@ class ListTagsPager: """A pager for iterating through ``list_tags`` requests. This class thinly wraps an initial - :class:`~.datacatalog.ListTagsResponse` object, and + :class:`google.cloud.datacatalog_v1.types.ListTagsResponse` object, and provides an ``__iter__`` method to iterate through its ``tags`` field. 
@@ -419,7 +419,7 @@ class ListTagsPager: through the ``tags`` field on the corresponding responses. - All the usual :class:`~.datacatalog.ListTagsResponse` + All the usual :class:`google.cloud.datacatalog_v1.types.ListTagsResponse` attributes are available on the pager. If multiple requests are made, only the most recent response is retained, and thus used for attribute lookup. """ @@ -437,9 +437,9 @@ def __init__( Args: method (Callable): The method that was originally called, and which instantiated this pager. - request (:class:`~.datacatalog.ListTagsRequest`): + request (google.cloud.datacatalog_v1.types.ListTagsRequest): The initial request object. - response (:class:`~.datacatalog.ListTagsResponse`): + response (google.cloud.datacatalog_v1.types.ListTagsResponse): The initial response object. metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. @@ -472,7 +472,7 @@ class ListTagsAsyncPager: """A pager for iterating through ``list_tags`` requests. This class thinly wraps an initial - :class:`~.datacatalog.ListTagsResponse` object, and + :class:`google.cloud.datacatalog_v1.types.ListTagsResponse` object, and provides an ``__aiter__`` method to iterate through its ``tags`` field. @@ -481,7 +481,7 @@ class ListTagsAsyncPager: through the ``tags`` field on the corresponding responses. - All the usual :class:`~.datacatalog.ListTagsResponse` + All the usual :class:`google.cloud.datacatalog_v1.types.ListTagsResponse` attributes are available on the pager. If multiple requests are made, only the most recent response is retained, and thus used for attribute lookup. """ @@ -499,9 +499,9 @@ def __init__( Args: method (Callable): The method that was originally called, and which instantiated this pager. - request (:class:`~.datacatalog.ListTagsRequest`): + request (google.cloud.datacatalog_v1.types.ListTagsRequest): The initial request object. - response (:class:`~.datacatalog.ListTagsResponse`): + response (google.cloud.datacatalog_v1.types.ListTagsResponse): The initial response object. metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. diff --git a/google/cloud/datacatalog_v1/types/datacatalog.py b/google/cloud/datacatalog_v1/types/datacatalog.py index a02ed993..e90fb67a 100644 --- a/google/cloud/datacatalog_v1/types/datacatalog.py +++ b/google/cloud/datacatalog_v1/types/datacatalog.py @@ -83,7 +83,7 @@ class SearchCatalogRequest(proto.Message): [SearchCatalog][google.cloud.datacatalog.v1.DataCatalog.SearchCatalog]. Attributes: - scope (~.datacatalog.SearchCatalogRequest.Scope): + scope (google.cloud.datacatalog_v1.types.SearchCatalogRequest.Scope): Required. The scope of this search request. A ``scope`` that has empty ``include_org_ids``, ``include_project_ids`` AND false ``include_gcp_public_datasets`` is considered invalid. @@ -211,7 +211,7 @@ class SearchCatalogResponse(proto.Message): [SearchCatalog][google.cloud.datacatalog.v1.DataCatalog.SearchCatalog]. Attributes: - results (Sequence[~.search.SearchCatalogResult]): + results (Sequence[google.cloud.datacatalog_v1.types.SearchCatalogResult]): Search results. next_page_token (str): The token that can be used to retrieve the @@ -256,7 +256,7 @@ class CreateEntryGroupRequest(proto.Message): underscore, contain only English letters, numbers and underscores, and be at most 64 characters. - entry_group (~.datacatalog.EntryGroup): + entry_group (google.cloud.datacatalog_v1.types.EntryGroup): The entry group to create. 
Defaults to an empty entry group. """ @@ -273,10 +273,10 @@ class UpdateEntryGroupRequest(proto.Message): [UpdateEntryGroup][google.cloud.datacatalog.v1.DataCatalog.UpdateEntryGroup]. Attributes: - entry_group (~.datacatalog.EntryGroup): + entry_group (google.cloud.datacatalog_v1.types.EntryGroup): Required. The updated entry group. "name" field must be set. - update_mask (~.field_mask.FieldMask): + update_mask (google.protobuf.field_mask_pb2.FieldMask): The fields to update on the entry group. If absent or empty, all modifiable fields are updated. @@ -295,7 +295,7 @@ class GetEntryGroupRequest(proto.Message): name (str): Required. The name of the entry group. For example, ``projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}``. - read_mask (~.field_mask.FieldMask): + read_mask (google.protobuf.field_mask_pb2.FieldMask): The fields to return. If not set or empty, all fields are returned. """ @@ -354,7 +354,7 @@ class ListEntryGroupsResponse(proto.Message): [ListEntryGroups][google.cloud.datacatalog.v1.DataCatalog.ListEntryGroups]. Attributes: - entry_groups (Sequence[~.datacatalog.EntryGroup]): + entry_groups (Sequence[google.cloud.datacatalog_v1.types.EntryGroup]): EntryGroup details. next_page_token (str): Token to retrieve the next page of results. @@ -386,7 +386,7 @@ class CreateEntryRequest(proto.Message): actually be stored in the location in this name. entry_id (str): Required. The id of the entry to create. - entry (~.datacatalog.Entry): + entry (google.cloud.datacatalog_v1.types.Entry): Required. The entry to create. """ @@ -402,10 +402,10 @@ class UpdateEntryRequest(proto.Message): [UpdateEntry][google.cloud.datacatalog.v1.DataCatalog.UpdateEntry]. Attributes: - entry (~.datacatalog.Entry): + entry (google.cloud.datacatalog_v1.types.Entry): Required. The updated entry. The "name" field must be set. - update_mask (~.field_mask.FieldMask): + update_mask (google.protobuf.field_mask_pb2.FieldMask): The fields to update on the entry. If absent or empty, all modifiable fields are updated. @@ -537,7 +537,7 @@ class Entry(proto.Message): Output only when Entry is of type in the EntryType enum. For entries with user_specified_type, this field is optional and defaults to an empty string. - type_ (~.datacatalog.EntryType): + type_ (google.cloud.datacatalog_v1.types.EntryType): The type of the entry. Only used for Entries with types in the EntryType enum. @@ -555,7 +555,7 @@ class Entry(proto.Message): Currently, only FILESET enum value is allowed. All other entries created through Data Catalog must use ``user_specified_type``. - integrated_system (~.common.IntegratedSystem): + integrated_system (google.cloud.datacatalog_v1.types.IntegratedSystem): Output only. This field indicates the entry's source system that Data Catalog integrates with, such as BigQuery or Pub/Sub. @@ -566,14 +566,14 @@ class Entry(proto.Message): contain letters, numbers, and underscores; are case insensitive; must be at least 1 character and at most 64 characters long. - gcs_fileset_spec (~.gcd_gcs_fileset_spec.GcsFilesetSpec): + gcs_fileset_spec (google.cloud.datacatalog_v1.types.GcsFilesetSpec): Specification that applies to a Cloud Storage fileset. This is only valid on entries of type FILESET. - bigquery_table_spec (~.table_spec.BigQueryTableSpec): + bigquery_table_spec (google.cloud.datacatalog_v1.types.BigQueryTableSpec): Specification that applies to a BigQuery table. This is only valid on entries of type ``TABLE``. 
- bigquery_date_sharded_spec (~.table_spec.BigQueryDateShardedSpec): + bigquery_date_sharded_spec (google.cloud.datacatalog_v1.types.BigQueryDateShardedSpec): Specification for a group of BigQuery tables with name pattern ``[prefix]YYYYMMDD``. Context: https://cloud.google.com/bigquery/docs/partitioned-tables#partitioning_versus_sharding. @@ -587,10 +587,10 @@ class Entry(proto.Message): several sentences or paragraphs that describe entry contents. Default value is an empty string. - schema (~.gcd_schema.Schema): + schema (google.cloud.datacatalog_v1.types.Schema): Schema of the entry. An entry might not have any schema attached to it. - source_system_timestamps (~.timestamps.SystemTimestamps): + source_system_timestamps (google.cloud.datacatalog_v1.types.SystemTimestamps): Timestamps about the underlying resource, not about this Data Catalog entry. Output only when Entry is of type in the EntryType enum. For entries with user_specified_type, this @@ -665,7 +665,7 @@ class EntryGroup(proto.Message): several sentences or paragraphs that describe entry group contents. Default value is an empty string. - data_catalog_timestamps (~.timestamps.SystemTimestamps): + data_catalog_timestamps (google.cloud.datacatalog_v1.types.SystemTimestamps): Output only. Timestamps about this EntryGroup. Default value is empty timestamps. """ @@ -696,7 +696,7 @@ class CreateTagTemplateRequest(proto.Message): tag_template_id (str): Required. The id of the tag template to create. - tag_template (~.gcd_tags.TagTemplate): + tag_template (google.cloud.datacatalog_v1.types.TagTemplate): Required. The tag template to create. """ @@ -726,10 +726,10 @@ class UpdateTagTemplateRequest(proto.Message): [UpdateTagTemplate][google.cloud.datacatalog.v1.DataCatalog.UpdateTagTemplate]. Attributes: - tag_template (~.gcd_tags.TagTemplate): + tag_template (google.cloud.datacatalog_v1.types.TagTemplate): Required. The template to update. The "name" field must be set. - update_mask (~.field_mask.FieldMask): + update_mask (google.protobuf.field_mask_pb2.FieldMask): The field mask specifies the parts of the template to overwrite. @@ -780,7 +780,7 @@ class CreateTagRequest(proto.Message): Note that this Tag and its child resources may not actually be stored in the location in this name. - tag (~.gcd_tags.Tag): + tag (google.cloud.datacatalog_v1.types.Tag): Required. The tag to create. """ @@ -794,10 +794,10 @@ class UpdateTagRequest(proto.Message): [UpdateTag][google.cloud.datacatalog.v1.DataCatalog.UpdateTag]. Attributes: - tag (~.gcd_tags.Tag): + tag (google.cloud.datacatalog_v1.types.Tag): Required. The updated tag. The "name" field must be set. - update_mask (~.field_mask.FieldMask): + update_mask (google.protobuf.field_mask_pb2.FieldMask): The fields to update on the Tag. If absent or empty, all modifiable fields are updated. Currently the only modifiable field is the field ``fields``. @@ -840,7 +840,7 @@ class CreateTagTemplateFieldRequest(proto.Message): numbers (0-9), underscores (_) and dashes (-). Field IDs must be at least 1 character long and at most 128 characters long. Field IDs must also be unique within their template. - tag_template_field (~.gcd_tags.TagTemplateField): + tag_template_field (google.cloud.datacatalog_v1.types.TagTemplateField): Required. The tag template field to create. """ @@ -862,9 +862,9 @@ class UpdateTagTemplateFieldRequest(proto.Message): Required. The name of the tag template field. 
Example: - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} - tag_template_field (~.gcd_tags.TagTemplateField): + tag_template_field (google.cloud.datacatalog_v1.types.TagTemplateField): Required. The template to update. - update_mask (~.field_mask.FieldMask): + update_mask (google.protobuf.field_mask_pb2.FieldMask): Optional. The field mask specifies the parts of the template to be updated. Allowed fields: @@ -967,7 +967,7 @@ class ListTagsResponse(proto.Message): [ListTags][google.cloud.datacatalog.v1.DataCatalog.ListTags]. Attributes: - tags (Sequence[~.gcd_tags.Tag]): + tags (Sequence[google.cloud.datacatalog_v1.types.Tag]): [Tag][google.cloud.datacatalog.v1.Tag] details. next_page_token (str): Token to retrieve the next page of results. @@ -1001,7 +1001,7 @@ class ListEntriesRequest(proto.Message): page_token (str): Token that specifies which page is requested. If empty, the first page is returned. - read_mask (~.field_mask.FieldMask): + read_mask (google.protobuf.field_mask_pb2.FieldMask): The fields to return for each Entry. If not set or empty, all fields are returned. For example, setting read_mask to contain only one path "name" will cause ListEntries to @@ -1022,7 +1022,7 @@ class ListEntriesResponse(proto.Message): [ListEntries][google.cloud.datacatalog.v1.DataCatalog.ListEntries]. Attributes: - entries (Sequence[~.datacatalog.Entry]): + entries (Sequence[google.cloud.datacatalog_v1.types.Entry]): Entry details. next_page_token (str): Token to retrieve the next page of results. diff --git a/google/cloud/datacatalog_v1/types/gcs_fileset_spec.py b/google/cloud/datacatalog_v1/types/gcs_fileset_spec.py index 64518aff..ce3a5a6e 100644 --- a/google/cloud/datacatalog_v1/types/gcs_fileset_spec.py +++ b/google/cloud/datacatalog_v1/types/gcs_fileset_spec.py @@ -63,7 +63,7 @@ class GcsFilesetSpec(proto.Message): for example: - ``gs://bucket_name/[a-m]??.j*g`` - sample_gcs_file_specs (Sequence[~.gcs_fileset_spec.GcsFileSpec]): + sample_gcs_file_specs (Sequence[google.cloud.datacatalog_v1.types.GcsFileSpec]): Output only. Sample files contained in this fileset, not all files contained in this fileset are represented here. @@ -83,7 +83,7 @@ class GcsFileSpec(proto.Message): file_path (str): Required. The full file path. Example: ``gs://bucket_name/a/b.txt``. - gcs_timestamps (~.timestamps.SystemTimestamps): + gcs_timestamps (google.cloud.datacatalog_v1.types.SystemTimestamps): Output only. Timestamps about the Cloud Storage file. size_bytes (int): diff --git a/google/cloud/datacatalog_v1/types/schema.py b/google/cloud/datacatalog_v1/types/schema.py index 98560462..debec332 100644 --- a/google/cloud/datacatalog_v1/types/schema.py +++ b/google/cloud/datacatalog_v1/types/schema.py @@ -27,7 +27,7 @@ class Schema(proto.Message): r"""Represents a schema (e.g. BigQuery, GoogleSQL, Avro schema). Attributes: - columns (Sequence[~.schema.ColumnSchema]): + columns (Sequence[google.cloud.datacatalog_v1.types.ColumnSchema]): Required. Schema of columns. A maximum of 10,000 columns and sub-columns can be specified. """ @@ -52,7 +52,7 @@ class ColumnSchema(proto.Message): this column are required, nullable, etc. Only ``NULLABLE``, ``REQUIRED`` and ``REPEATED`` are supported. Default mode is ``NULLABLE``. - subcolumns (Sequence[~.schema.ColumnSchema]): + subcolumns (Sequence[google.cloud.datacatalog_v1.types.ColumnSchema]): Optional. Schema of sub-columns. A column can have zero or more sub-columns. 
""" diff --git a/google/cloud/datacatalog_v1/types/search.py b/google/cloud/datacatalog_v1/types/search.py index eb4370da..cdcb129f 100644 --- a/google/cloud/datacatalog_v1/types/search.py +++ b/google/cloud/datacatalog_v1/types/search.py @@ -43,7 +43,7 @@ class SearchCatalogResult(proto.Message): search. Attributes: - search_result_type (~.search.SearchResultType): + search_result_type (google.cloud.datacatalog_v1.types.SearchResultType): Type of the search result. This field can be used to determine which Get method to call to fetch the full resource. @@ -66,7 +66,7 @@ class SearchCatalogResult(proto.Message): Example: - ``//bigquery.googleapis.com/projects/projectId/datasets/datasetId/tables/tableId`` - integrated_system (~.common.IntegratedSystem): + integrated_system (google.cloud.datacatalog_v1.types.IntegratedSystem): Output only. This field indicates the entry's source system that Data Catalog integrates with, such as BigQuery or Cloud Pub/Sub. diff --git a/google/cloud/datacatalog_v1/types/table_spec.py b/google/cloud/datacatalog_v1/types/table_spec.py index 4c86f64f..8404dba2 100644 --- a/google/cloud/datacatalog_v1/types/table_spec.py +++ b/google/cloud/datacatalog_v1/types/table_spec.py @@ -41,12 +41,12 @@ class BigQueryTableSpec(proto.Message): r"""Describes a BigQuery table. Attributes: - table_source_type (~.gcd_table_spec.TableSourceType): + table_source_type (google.cloud.datacatalog_v1.types.TableSourceType): Output only. The table source type. - view_spec (~.gcd_table_spec.ViewSpec): + view_spec (google.cloud.datacatalog_v1.types.ViewSpec): Table view specification. This field should only be populated if ``table_source_type`` is ``BIGQUERY_VIEW``. - table_spec (~.gcd_table_spec.TableSpec): + table_spec (google.cloud.datacatalog_v1.types.TableSpec): Spec of a BigQuery table. This field should only be populated if ``table_source_type`` is ``BIGQUERY_TABLE``. """ diff --git a/google/cloud/datacatalog_v1/types/tags.py b/google/cloud/datacatalog_v1/types/tags.py index 4ef4efa7..e85c5036 100644 --- a/google/cloud/datacatalog_v1/types/tags.py +++ b/google/cloud/datacatalog_v1/types/tags.py @@ -63,7 +63,7 @@ class Tag(proto.Message): separate the column names. Example: - ``outer_column.inner_column`` - fields (Sequence[~.tags.Tag.FieldsEntry]): + fields (Sequence[google.cloud.datacatalog_v1.types.Tag.FieldsEntry]): Required. This maps the ID of a tag field to the value of and additional information about that field. Valid field IDs are defined by the @@ -98,10 +98,10 @@ class TagField(proto.Message): bool_value (bool): Holds the value for a tag field with boolean type. - timestamp_value (~.timestamp.Timestamp): + timestamp_value (google.protobuf.timestamp_pb2.Timestamp): Holds the value for a tag field with timestamp type. - enum_value (~.tags.TagField.EnumValue): + enum_value (google.cloud.datacatalog_v1.types.TagField.EnumValue): Holds the value for a tag field with enum type. This value must be one of the allowed values in the definition of this enum. @@ -165,7 +165,7 @@ class TagTemplate(proto.Message): display_name (str): The display name for this template. Defaults to an empty string. - fields (Sequence[~.tags.TagTemplate.FieldsEntry]): + fields (Sequence[google.cloud.datacatalog_v1.types.TagTemplate.FieldsEntry]): Required. Map of tag template field IDs to the settings for the field. This map is an exhaustive list of the allowed fields. 
This map must contain at least one field and at most @@ -202,7 +202,7 @@ class TagTemplateField(proto.Message): display_name (str): The display name for this field. Defaults to an empty string. - type_ (~.tags.FieldType): + type_ (google.cloud.datacatalog_v1.types.FieldType): Required. The type of value this tag field can contain. is_required (bool): @@ -232,10 +232,10 @@ class FieldType(proto.Message): r""" Attributes: - primitive_type (~.tags.FieldType.PrimitiveType): + primitive_type (google.cloud.datacatalog_v1.types.FieldType.PrimitiveType): Represents primitive types - string, bool etc. - enum_type (~.tags.FieldType.EnumType): + enum_type (google.cloud.datacatalog_v1.types.FieldType.EnumType): Represents an enum type. """ @@ -251,7 +251,7 @@ class EnumType(proto.Message): r""" Attributes: - allowed_values (Sequence[~.tags.FieldType.EnumType.EnumValue]): + allowed_values (Sequence[google.cloud.datacatalog_v1.types.FieldType.EnumType.EnumValue]): Required on create; optional on update. The set of allowed values for this enum. This set must not be empty, the display names of the diff --git a/google/cloud/datacatalog_v1/types/timestamps.py b/google/cloud/datacatalog_v1/types/timestamps.py index 451b9a43..4d4a834f 100644 --- a/google/cloud/datacatalog_v1/types/timestamps.py +++ b/google/cloud/datacatalog_v1/types/timestamps.py @@ -31,13 +31,13 @@ class SystemTimestamps(proto.Message): system. Attributes: - create_time (~.timestamp.Timestamp): + create_time (google.protobuf.timestamp_pb2.Timestamp): The creation time of the resource within the given system. - update_time (~.timestamp.Timestamp): + update_time (google.protobuf.timestamp_pb2.Timestamp): The last-modified time of the resource within the given system. - expire_time (~.timestamp.Timestamp): + expire_time (google.protobuf.timestamp_pb2.Timestamp): Output only. The expiration time of the resource within the given system. Currently only apllicable to BigQuery resources. diff --git a/google/cloud/datacatalog_v1beta1/services/data_catalog/async_client.py b/google/cloud/datacatalog_v1beta1/services/data_catalog/async_client.py index bec3d14c..3a99aa56 100644 --- a/google/cloud/datacatalog_v1beta1/services/data_catalog/async_client.py +++ b/google/cloud/datacatalog_v1beta1/services/data_catalog/async_client.py @@ -94,6 +94,7 @@ class DataCatalogAsyncClient: DataCatalogClient.parse_common_location_path ) + from_service_account_info = DataCatalogClient.from_service_account_info from_service_account_file = DataCatalogClient.from_service_account_file from_service_account_json = from_service_account_file @@ -187,15 +188,16 @@ async def search_catalog( for more information. Args: - request (:class:`~.datacatalog.SearchCatalogRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.SearchCatalogRequest`): The request object. Request message for [SearchCatalog][google.cloud.datacatalog.v1beta1.DataCatalog.SearchCatalog]. - scope (:class:`~.datacatalog.SearchCatalogRequest.Scope`): + scope (:class:`google.cloud.datacatalog_v1beta1.types.SearchCatalogRequest.Scope`): Required. The scope of this search request. A ``scope`` that has empty ``include_org_ids``, ``include_project_ids`` AND false ``include_gcp_public_datasets`` is considered invalid. Data Catalog will return an error in such a case. + This corresponds to the ``scope`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -214,6 +216,7 @@ async def search_catalog( `Data Catalog Search Syntax `__ for more information. 
+ This corresponds to the ``query`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -225,9 +228,9 @@ async def search_catalog( sent along with the request as metadata. Returns: - ~.pagers.SearchCatalogAsyncPager: + google.cloud.datacatalog_v1beta1.services.data_catalog.pagers.SearchCatalogAsyncPager: Response message for - [SearchCatalog][google.cloud.datacatalog.v1beta1.DataCatalog.SearchCatalog]. + [SearchCatalog][google.cloud.datacatalog.v1beta1.DataCatalog.SearchCatalog]. Iterating over this object will yield results and resolve additional pages automatically. @@ -294,7 +297,7 @@ async def create_entry_group( for more information). Args: - request (:class:`~.datacatalog.CreateEntryGroupRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.CreateEntryGroupRequest`): The request object. Request message for [CreateEntryGroup][google.cloud.datacatalog.v1beta1.DataCatalog.CreateEntryGroup]. parent (:class:`str`): @@ -305,6 +308,7 @@ async def create_entry_group( Note that this EntryGroup and its child resources may not actually be stored in the location in this name. + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -315,12 +319,14 @@ async def create_entry_group( English letters, numbers and underscores, and be at most 64 characters. + This corresponds to the ``entry_group_id`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - entry_group (:class:`~.datacatalog.EntryGroup`): + entry_group (:class:`google.cloud.datacatalog_v1beta1.types.EntryGroup`): The entry group to create. Defaults to an empty entry group. + This corresponds to the ``entry_group`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -332,11 +338,12 @@ async def create_entry_group( sent along with the request as metadata. Returns: - ~.datacatalog.EntryGroup: - EntryGroup Metadata. An EntryGroup resource represents a - logical grouping of zero or more Data Catalog - [Entry][google.cloud.datacatalog.v1beta1.Entry] - resources. + google.cloud.datacatalog_v1beta1.types.EntryGroup: + EntryGroup Metadata. + An EntryGroup resource represents a logical grouping + of zero or more Data Catalog + [Entry][google.cloud.datacatalog.v1beta1.Entry] + resources. """ # Create or coerce a protobuf request object. @@ -398,19 +405,21 @@ async def update_entry_group( for more information). Args: - request (:class:`~.datacatalog.UpdateEntryGroupRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.UpdateEntryGroupRequest`): The request object. Request message for [UpdateEntryGroup][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateEntryGroup]. - entry_group (:class:`~.datacatalog.EntryGroup`): + entry_group (:class:`google.cloud.datacatalog_v1beta1.types.EntryGroup`): Required. The updated entry group. "name" field must be set. + This corresponds to the ``entry_group`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - update_mask (:class:`~.field_mask.FieldMask`): + update_mask (:class:`google.protobuf.field_mask_pb2.FieldMask`): The fields to update on the entry group. If absent or empty, all modifiable fields are updated. + This corresponds to the ``update_mask`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -422,11 +431,12 @@ async def update_entry_group( sent along with the request as metadata. 
Returns: - ~.datacatalog.EntryGroup: - EntryGroup Metadata. An EntryGroup resource represents a - logical grouping of zero or more Data Catalog - [Entry][google.cloud.datacatalog.v1beta1.Entry] - resources. + google.cloud.datacatalog_v1beta1.types.EntryGroup: + EntryGroup Metadata. + An EntryGroup resource represents a logical grouping + of zero or more Data Catalog + [Entry][google.cloud.datacatalog.v1beta1.Entry] + resources. """ # Create or coerce a protobuf request object. @@ -484,18 +494,20 @@ async def get_entry_group( r"""Gets an EntryGroup. Args: - request (:class:`~.datacatalog.GetEntryGroupRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.GetEntryGroupRequest`): The request object. Request message for [GetEntryGroup][google.cloud.datacatalog.v1beta1.DataCatalog.GetEntryGroup]. name (:class:`str`): Required. The name of the entry group. For example, ``projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}``. + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - read_mask (:class:`~.field_mask.FieldMask`): + read_mask (:class:`google.protobuf.field_mask_pb2.FieldMask`): The fields to return. If not set or empty, all fields are returned. + This corresponds to the ``read_mask`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -507,11 +519,12 @@ async def get_entry_group( sent along with the request as metadata. Returns: - ~.datacatalog.EntryGroup: - EntryGroup Metadata. An EntryGroup resource represents a - logical grouping of zero or more Data Catalog - [Entry][google.cloud.datacatalog.v1beta1.Entry] - resources. + google.cloud.datacatalog_v1beta1.types.EntryGroup: + EntryGroup Metadata. + An EntryGroup resource represents a logical grouping + of zero or more Data Catalog + [Entry][google.cloud.datacatalog.v1beta1.Entry] + resources. """ # Create or coerce a protobuf request object. @@ -579,12 +592,13 @@ async def delete_entry_group( for more information). Args: - request (:class:`~.datacatalog.DeleteEntryGroupRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.DeleteEntryGroupRequest`): The request object. Request message for [DeleteEntryGroup][google.cloud.datacatalog.v1beta1.DataCatalog.DeleteEntryGroup]. name (:class:`str`): Required. The name of the entry group. For example, ``projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}``. + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -652,7 +666,7 @@ async def list_entry_groups( r"""Lists entry groups. Args: - request (:class:`~.datacatalog.ListEntryGroupsRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.ListEntryGroupsRequest`): The request object. Request message for [ListEntryGroups][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntryGroups]. parent (:class:`str`): @@ -661,6 +675,7 @@ async def list_entry_groups( Example: - projects/{project_id}/locations/{location} + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -672,9 +687,9 @@ async def list_entry_groups( sent along with the request as metadata. Returns: - ~.pagers.ListEntryGroupsAsyncPager: + google.cloud.datacatalog_v1beta1.services.data_catalog.pagers.ListEntryGroupsAsyncPager: Response message for - [ListEntryGroups][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntryGroups]. 
+ [ListEntryGroups][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntryGroups]. Iterating over this object will yield results and resolve additional pages automatically. @@ -747,7 +762,7 @@ async def create_entry( A maximum of 100,000 entries may be created per entry group. Args: - request (:class:`~.datacatalog.CreateEntryRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.CreateEntryRequest`): The request object. Request message for [CreateEntry][google.cloud.datacatalog.v1beta1.DataCatalog.CreateEntry]. parent (:class:`str`): @@ -758,16 +773,18 @@ async def create_entry( Note that this Entry and its child resources may not actually be stored in the location in this name. + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. entry_id (:class:`str`): Required. The id of the entry to create. + This corresponds to the ``entry_id`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - entry (:class:`~.datacatalog.Entry`): + entry (:class:`google.cloud.datacatalog_v1beta1.types.Entry`): Required. The entry to create. This corresponds to the ``entry`` field on the ``request`` instance; if ``request`` is provided, this @@ -780,18 +797,19 @@ async def create_entry( sent along with the request as metadata. Returns: - ~.datacatalog.Entry: - Entry Metadata. A Data Catalog Entry resource represents - another resource in Google Cloud Platform (such as a - BigQuery dataset or a Pub/Sub topic), or outside of - Google Cloud Platform. Clients can use the - ``linked_resource`` field in the Entry resource to refer - to the original resource ID of the source system. - - An Entry resource contains resource details, such as its - schema. An Entry can also be used to attach flexible - metadata, such as a - [Tag][google.cloud.datacatalog.v1beta1.Tag]. + google.cloud.datacatalog_v1beta1.types.Entry: + Entry Metadata. + A Data Catalog Entry resource represents another + resource in Google Cloud Platform (such as a BigQuery + dataset or a Pub/Sub topic), or outside of Google + Cloud Platform. Clients can use the linked_resource + field in the Entry resource to refer to the original + resource ID of the source system. + + An Entry resource contains resource details, such as + its schema. An Entry can also be used to attach + flexible metadata, such as a + [Tag][google.cloud.datacatalog.v1beta1.Tag]. """ # Create or coerce a protobuf request object. @@ -853,16 +871,17 @@ async def update_entry( for more information). Args: - request (:class:`~.datacatalog.UpdateEntryRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.UpdateEntryRequest`): The request object. Request message for [UpdateEntry][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateEntry]. - entry (:class:`~.datacatalog.Entry`): + entry (:class:`google.cloud.datacatalog_v1beta1.types.Entry`): Required. The updated entry. The "name" field must be set. + This corresponds to the ``entry`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - update_mask (:class:`~.field_mask.FieldMask`): + update_mask (:class:`google.protobuf.field_mask_pb2.FieldMask`): The fields to update on the entry. If absent or empty, all modifiable fields are updated. @@ -889,6 +908,7 @@ async def update_entry( - user_specified_system - linked_resource - source_system_timestamps + This corresponds to the ``update_mask`` field on the ``request`` instance; if ``request`` is provided, this should not be set. 
@@ -900,18 +920,19 @@ async def update_entry( sent along with the request as metadata. Returns: - ~.datacatalog.Entry: - Entry Metadata. A Data Catalog Entry resource represents - another resource in Google Cloud Platform (such as a - BigQuery dataset or a Pub/Sub topic), or outside of - Google Cloud Platform. Clients can use the - ``linked_resource`` field in the Entry resource to refer - to the original resource ID of the source system. - - An Entry resource contains resource details, such as its - schema. An Entry can also be used to attach flexible - metadata, such as a - [Tag][google.cloud.datacatalog.v1beta1.Tag]. + google.cloud.datacatalog_v1beta1.types.Entry: + Entry Metadata. + A Data Catalog Entry resource represents another + resource in Google Cloud Platform (such as a BigQuery + dataset or a Pub/Sub topic), or outside of Google + Cloud Platform. Clients can use the linked_resource + field in the Entry resource to refer to the original + resource ID of the source system. + + An Entry resource contains resource details, such as + its schema. An Entry can also be used to attach + flexible metadata, such as a + [Tag][google.cloud.datacatalog.v1beta1.Tag]. """ # Create or coerce a protobuf request object. @@ -974,13 +995,14 @@ async def delete_entry( for more information). Args: - request (:class:`~.datacatalog.DeleteEntryRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.DeleteEntryRequest`): The request object. Request message for [DeleteEntry][google.cloud.datacatalog.v1beta1.DataCatalog.DeleteEntry]. name (:class:`str`): Required. The name of the entry. Example: - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1048,13 +1070,14 @@ async def get_entry( r"""Gets an entry. Args: - request (:class:`~.datacatalog.GetEntryRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.GetEntryRequest`): The request object. Request message for [GetEntry][google.cloud.datacatalog.v1beta1.DataCatalog.GetEntry]. name (:class:`str`): Required. The name of the entry. Example: - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1066,18 +1089,19 @@ async def get_entry( sent along with the request as metadata. Returns: - ~.datacatalog.Entry: - Entry Metadata. A Data Catalog Entry resource represents - another resource in Google Cloud Platform (such as a - BigQuery dataset or a Pub/Sub topic), or outside of - Google Cloud Platform. Clients can use the - ``linked_resource`` field in the Entry resource to refer - to the original resource ID of the source system. - - An Entry resource contains resource details, such as its - schema. An Entry can also be used to attach flexible - metadata, such as a - [Tag][google.cloud.datacatalog.v1beta1.Tag]. + google.cloud.datacatalog_v1beta1.types.Entry: + Entry Metadata. + A Data Catalog Entry resource represents another + resource in Google Cloud Platform (such as a BigQuery + dataset or a Pub/Sub topic), or outside of Google + Cloud Platform. Clients can use the linked_resource + field in the Entry resource to refer to the original + resource ID of the source system. + + An Entry resource contains resource details, such as + its schema. 
An Entry can also be used to attach + flexible metadata, such as a + [Tag][google.cloud.datacatalog.v1beta1.Tag]. """ # Create or coerce a protobuf request object. @@ -1140,7 +1164,7 @@ async def lookup_entry( Entry. Args: - request (:class:`~.datacatalog.LookupEntryRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.LookupEntryRequest`): The request object. Request message for [LookupEntry][google.cloud.datacatalog.v1beta1.DataCatalog.LookupEntry]. @@ -1151,18 +1175,19 @@ async def lookup_entry( sent along with the request as metadata. Returns: - ~.datacatalog.Entry: - Entry Metadata. A Data Catalog Entry resource represents - another resource in Google Cloud Platform (such as a - BigQuery dataset or a Pub/Sub topic), or outside of - Google Cloud Platform. Clients can use the - ``linked_resource`` field in the Entry resource to refer - to the original resource ID of the source system. - - An Entry resource contains resource details, such as its - schema. An Entry can also be used to attach flexible - metadata, such as a - [Tag][google.cloud.datacatalog.v1beta1.Tag]. + google.cloud.datacatalog_v1beta1.types.Entry: + Entry Metadata. + A Data Catalog Entry resource represents another + resource in Google Cloud Platform (such as a BigQuery + dataset or a Pub/Sub topic), or outside of Google + Cloud Platform. Clients can use the linked_resource + field in the Entry resource to refer to the original + resource ID of the source system. + + An Entry resource contains resource details, such as + its schema. An Entry can also be used to attach + flexible metadata, such as a + [Tag][google.cloud.datacatalog.v1beta1.Tag]. """ # Create or coerce a protobuf request object. @@ -1203,7 +1228,7 @@ async def list_entries( r"""Lists entries. Args: - request (:class:`~.datacatalog.ListEntriesRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.ListEntriesRequest`): The request object. Request message for [ListEntries][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntries]. parent (:class:`str`): @@ -1211,6 +1236,7 @@ async def list_entries( entries, which can be provided in URL format. Example: - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1222,9 +1248,9 @@ async def list_entries( sent along with the request as metadata. Returns: - ~.pagers.ListEntriesAsyncPager: + google.cloud.datacatalog_v1beta1.services.data_catalog.pagers.ListEntriesAsyncPager: Response message for - [ListEntries][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntries]. + [ListEntries][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntries]. Iterating over this object will yield results and resolve additional pages automatically. @@ -1292,7 +1318,7 @@ async def create_tag_template( for more information). Args: - request (:class:`~.datacatalog.CreateTagTemplateRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.CreateTagTemplateRequest`): The request object. Request message for [CreateTagTemplate][google.cloud.datacatalog.v1beta1.DataCatalog.CreateTagTemplate]. parent (:class:`str`): @@ -1303,16 +1329,18 @@ async def create_tag_template( Example: - projects/{project_id}/locations/us-central1 + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. tag_template_id (:class:`str`): Required. The id of the tag template to create. 
+ This corresponds to the ``tag_template_id`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - tag_template (:class:`~.tags.TagTemplate`): + tag_template (:class:`google.cloud.datacatalog_v1beta1.types.TagTemplate`): Required. The tag template to create. This corresponds to the ``tag_template`` field on the ``request`` instance; if ``request`` is provided, this @@ -1325,16 +1353,16 @@ async def create_tag_template( sent along with the request as metadata. Returns: - ~.tags.TagTemplate: - A tag template defines a tag, which can have one or more - typed fields. The template is used to create and attach - the tag to GCP resources. `Tag template - roles `__ - provide permissions to create, edit, and use the - template. See, for example, the `TagTemplate - User `__ - role, which includes permission to use the tag template - to tag resources. + google.cloud.datacatalog_v1beta1.types.TagTemplate: + A tag template defines a tag, which can have one or more typed fields. + The template is used to create and attach the tag to + GCP resources. [Tag template + roles](\ https://cloud.google.com/iam/docs/understanding-roles#data-catalog-roles) + provide permissions to create, edit, and use the + template. See, for example, the [TagTemplate + User](\ https://cloud.google.com/data-catalog/docs/how-to/template-user) + role, which includes permission to use the tag + template to tag resources. """ # Create or coerce a protobuf request object. @@ -1391,13 +1419,14 @@ async def get_tag_template( r"""Gets a tag template. Args: - request (:class:`~.datacatalog.GetTagTemplateRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.GetTagTemplateRequest`): The request object. Request message for [GetTagTemplate][google.cloud.datacatalog.v1beta1.DataCatalog.GetTagTemplate]. name (:class:`str`): Required. The name of the tag template. Example: - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1409,16 +1438,16 @@ async def get_tag_template( sent along with the request as metadata. Returns: - ~.tags.TagTemplate: - A tag template defines a tag, which can have one or more - typed fields. The template is used to create and attach - the tag to GCP resources. `Tag template - roles `__ - provide permissions to create, edit, and use the - template. See, for example, the `TagTemplate - User `__ - role, which includes permission to use the tag template - to tag resources. + google.cloud.datacatalog_v1beta1.types.TagTemplate: + A tag template defines a tag, which can have one or more typed fields. + The template is used to create and attach the tag to + GCP resources. [Tag template + roles](\ https://cloud.google.com/iam/docs/understanding-roles#data-catalog-roles) + provide permissions to create, edit, and use the + template. See, for example, the [TagTemplate + User](\ https://cloud.google.com/data-catalog/docs/how-to/template-user) + role, which includes permission to use the tag + template to tag resources. """ # Create or coerce a protobuf request object. @@ -1488,16 +1517,17 @@ async def update_tag_template( for more information). 
Args: - request (:class:`~.datacatalog.UpdateTagTemplateRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.UpdateTagTemplateRequest`): The request object. Request message for [UpdateTagTemplate][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateTagTemplate]. - tag_template (:class:`~.tags.TagTemplate`): + tag_template (:class:`google.cloud.datacatalog_v1beta1.types.TagTemplate`): Required. The template to update. The "name" field must be set. + This corresponds to the ``tag_template`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - update_mask (:class:`~.field_mask.FieldMask`): + update_mask (:class:`google.protobuf.field_mask_pb2.FieldMask`): The field mask specifies the parts of the template to overwrite. @@ -1507,6 +1537,7 @@ async def update_tag_template( If absent or empty, all of the allowed fields above will be updated. + This corresponds to the ``update_mask`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1518,16 +1549,16 @@ async def update_tag_template( sent along with the request as metadata. Returns: - ~.tags.TagTemplate: - A tag template defines a tag, which can have one or more - typed fields. The template is used to create and attach - the tag to GCP resources. `Tag template - roles `__ - provide permissions to create, edit, and use the - template. See, for example, the `TagTemplate - User `__ - role, which includes permission to use the tag template - to tag resources. + google.cloud.datacatalog_v1beta1.types.TagTemplate: + A tag template defines a tag, which can have one or more typed fields. + The template is used to create and attach the tag to + GCP resources. [Tag template + roles](\ https://cloud.google.com/iam/docs/understanding-roles#data-catalog-roles) + provide permissions to create, edit, and use the + template. See, for example, the [TagTemplate + User](\ https://cloud.google.com/data-catalog/docs/how-to/template-user) + role, which includes permission to use the tag + template to tag resources. """ # Create or coerce a protobuf request object. @@ -1589,7 +1620,7 @@ async def delete_tag_template( for more information). Args: - request (:class:`~.datacatalog.DeleteTagTemplateRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.DeleteTagTemplateRequest`): The request object. Request message for [DeleteTagTemplate][google.cloud.datacatalog.v1beta1.DataCatalog.DeleteTagTemplate]. name (:class:`str`): @@ -1597,6 +1628,7 @@ async def delete_tag_template( Example: - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1605,6 +1637,7 @@ async def delete_tag_template( ``true``. This confirms the deletion of any possible tags using this template. ``force = false`` will be supported in the future. + This corresponds to the ``force`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1680,7 +1713,7 @@ async def create_tag_template_field( for more information). Args: - request (:class:`~.datacatalog.CreateTagTemplateFieldRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.CreateTagTemplateFieldRequest`): The request object. Request message for [CreateTagTemplateField][google.cloud.datacatalog.v1beta1.DataCatalog.CreateTagTemplateField]. 
parent (:class:`str`): @@ -1691,6 +1724,7 @@ async def create_tag_template_field( Example: - projects/{project_id}/locations/us-central1/tagTemplates/{tag_template_id} + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1701,12 +1735,14 @@ async def create_tag_template_field( (-). Field IDs must be at least 1 character long and at most 128 characters long. Field IDs must also be unique within their template. + This corresponds to the ``tag_template_field_id`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - tag_template_field (:class:`~.tags.TagTemplateField`): + tag_template_field (:class:`google.cloud.datacatalog_v1beta1.types.TagTemplateField`): Required. The tag template field to create. + This corresponds to the ``tag_template_field`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1718,7 +1754,7 @@ async def create_tag_template_field( sent along with the request as metadata. Returns: - ~.tags.TagTemplateField: + google.cloud.datacatalog_v1beta1.types.TagTemplateField: The template for an individual field within a tag template. @@ -1784,22 +1820,23 @@ async def update_tag_template_field( for more information). Args: - request (:class:`~.datacatalog.UpdateTagTemplateFieldRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.UpdateTagTemplateFieldRequest`): The request object. Request message for [UpdateTagTemplateField][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateTagTemplateField]. name (:class:`str`): Required. The name of the tag template field. Example: - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - tag_template_field (:class:`~.tags.TagTemplateField`): + tag_template_field (:class:`google.cloud.datacatalog_v1beta1.types.TagTemplateField`): Required. The template to update. This corresponds to the ``tag_template_field`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - update_mask (:class:`~.field_mask.FieldMask`): + update_mask (:class:`google.protobuf.field_mask_pb2.FieldMask`): Optional. The field mask specifies the parts of the template to be updated. Allowed fields: @@ -1815,6 +1852,7 @@ async def update_tag_template_field( can only be added, existing enum values cannot be deleted nor renamed. Updating a template field from optional to required is NOT allowed. + This corresponds to the ``update_mask`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1826,7 +1864,7 @@ async def update_tag_template_field( sent along with the request as metadata. Returns: - ~.tags.TagTemplateField: + google.cloud.datacatalog_v1beta1.types.TagTemplateField: The template for an individual field within a tag template. @@ -1890,19 +1928,21 @@ async def rename_tag_template_field( for more information). Args: - request (:class:`~.datacatalog.RenameTagTemplateFieldRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.RenameTagTemplateFieldRequest`): The request object. Request message for [RenameTagTemplateField][google.cloud.datacatalog.v1beta1.DataCatalog.RenameTagTemplateField]. name (:class:`str`): Required. The name of the tag template. 
Example: - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. new_tag_template_field_id (:class:`str`): Required. The new ID of this tag template field. For example, ``my_new_field``. + This corresponds to the ``new_tag_template_field_id`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1914,7 +1954,7 @@ async def rename_tag_template_field( sent along with the request as metadata. Returns: - ~.tags.TagTemplateField: + google.cloud.datacatalog_v1beta1.types.TagTemplateField: The template for an individual field within a tag template. @@ -1977,7 +2017,7 @@ async def delete_tag_template_field( for more information). Args: - request (:class:`~.datacatalog.DeleteTagTemplateFieldRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.DeleteTagTemplateFieldRequest`): The request object. Request message for [DeleteTagTemplateField][google.cloud.datacatalog.v1beta1.DataCatalog.DeleteTagTemplateField]. name (:class:`str`): @@ -1985,6 +2025,7 @@ async def delete_tag_template_field( Example: - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1993,6 +2034,7 @@ async def delete_tag_template_field( ``true``. This confirms the deletion of this field from any tags using this field. ``force = false`` will be supported in the future. + This corresponds to the ``force`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -2069,7 +2111,7 @@ async def create_tag( used to create the tag must be from the same organization. Args: - request (:class:`~.datacatalog.CreateTagRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.CreateTagRequest`): The request object. Request message for [CreateTag][google.cloud.datacatalog.v1beta1.DataCatalog.CreateTag]. parent (:class:`str`): @@ -2080,10 +2122,11 @@ async def create_tag( Note that this Tag and its child resources may not actually be stored in the location in this name. + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - tag (:class:`~.tags.Tag`): + tag (:class:`google.cloud.datacatalog_v1beta1.types.Tag`): Required. The tag to create. This corresponds to the ``tag`` field on the ``request`` instance; if ``request`` is provided, this @@ -2096,15 +2139,15 @@ async def create_tag( sent along with the request as metadata. Returns: - ~.tags.Tag: - Tags are used to attach custom metadata to Data Catalog - resources. Tags conform to the specifications within - their tag template. + google.cloud.datacatalog_v1beta1.types.Tag: + Tags are used to attach custom metadata to Data Catalog resources. Tags + conform to the specifications within their tag + template. - See `Data Catalog - IAM `__ - for information on the permissions needed to create or - view tags. + See [Data Catalog + IAM](\ https://cloud.google.com/data-catalog/docs/concepts/iam) + for information on the permissions needed to create + or view tags. """ # Create or coerce a protobuf request object. @@ -2160,19 +2203,21 @@ async def update_tag( r"""Updates an existing tag. 
Args: - request (:class:`~.datacatalog.UpdateTagRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.UpdateTagRequest`): The request object. Request message for [UpdateTag][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateTag]. - tag (:class:`~.tags.Tag`): + tag (:class:`google.cloud.datacatalog_v1beta1.types.Tag`): Required. The updated tag. The "name" field must be set. + This corresponds to the ``tag`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - update_mask (:class:`~.field_mask.FieldMask`): + update_mask (:class:`google.protobuf.field_mask_pb2.FieldMask`): The fields to update on the Tag. If absent or empty, all modifiable fields are updated. Currently the only modifiable field is the field ``fields``. + This corresponds to the ``update_mask`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -2184,15 +2229,15 @@ async def update_tag( sent along with the request as metadata. Returns: - ~.tags.Tag: - Tags are used to attach custom metadata to Data Catalog - resources. Tags conform to the specifications within - their tag template. + google.cloud.datacatalog_v1beta1.types.Tag: + Tags are used to attach custom metadata to Data Catalog resources. Tags + conform to the specifications within their tag + template. - See `Data Catalog - IAM `__ - for information on the permissions needed to create or - view tags. + See [Data Catalog + IAM](\ https://cloud.google.com/data-catalog/docs/concepts/iam) + for information on the permissions needed to create + or view tags. """ # Create or coerce a protobuf request object. @@ -2247,13 +2292,14 @@ async def delete_tag( r"""Deletes a tag. Args: - request (:class:`~.datacatalog.DeleteTagRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.DeleteTagRequest`): The request object. Request message for [DeleteTag][google.cloud.datacatalog.v1beta1.DataCatalog.DeleteTag]. name (:class:`str`): Required. The name of the tag to delete. Example: - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id}/tags/{tag_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -2322,7 +2368,7 @@ async def list_tags( [Entry][google.cloud.datacatalog.v1beta1.Entry]. Args: - request (:class:`~.datacatalog.ListTagsRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.ListTagsRequest`): The request object. Request message for [ListTags][google.cloud.datacatalog.v1beta1.DataCatalog.ListTags]. parent (:class:`str`): @@ -2335,6 +2381,7 @@ async def list_tags( - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -2346,9 +2393,9 @@ async def list_tags( sent along with the request as metadata. Returns: - ~.pagers.ListTagsAsyncPager: + google.cloud.datacatalog_v1beta1.services.data_catalog.pagers.ListTagsAsyncPager: Response message for - [ListTags][google.cloud.datacatalog.v1beta1.DataCatalog.ListTags]. + [ListTags][google.cloud.datacatalog.v1beta1.DataCatalog.ListTags]. Iterating over this object will yield results and resolve additional pages automatically. @@ -2434,7 +2481,7 @@ async def set_iam_policy( entry groups. 
Args: - request (:class:`~.iam_policy.SetIamPolicyRequest`): + request (:class:`google.iam.v1.iam_policy_pb2.SetIamPolicyRequest`): The request object. Request message for `SetIamPolicy` method. resource (:class:`str`): @@ -2442,6 +2489,7 @@ async def set_iam_policy( policy is being specified. See the operation documentation for the appropriate value for this field. + This corresponds to the ``resource`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -2453,72 +2501,62 @@ async def set_iam_policy( sent along with the request as metadata. Returns: - ~.policy.Policy: - Defines an Identity and Access Management (IAM) policy. - It is used to specify access control policies for Cloud - Platform resources. - - A ``Policy`` is a collection of ``bindings``. A - ``binding`` binds one or more ``members`` to a single - ``role``. Members can be user accounts, service - accounts, Google groups, and domains (such as G Suite). - A ``role`` is a named list of permissions (defined by - IAM or configured by users). A ``binding`` can - optionally specify a ``condition``, which is a logic - expression that further constrains the role binding - based on attributes about the request and/or target - resource. - - **JSON Example** - - :: - - { - "bindings": [ - { - "role": "roles/resourcemanager.organizationAdmin", - "members": [ - "user:mike@example.com", - "group:admins@example.com", - "domain:google.com", - "serviceAccount:my-project-id@appspot.gserviceaccount.com" - ] - }, - { - "role": "roles/resourcemanager.organizationViewer", - "members": ["user:eve@example.com"], - "condition": { - "title": "expirable access", - "description": "Does not grant access after Sep 2020", - "expression": "request.time < - timestamp('2020-10-01T00:00:00.000Z')", - } - } - ] - } - - **YAML Example** - - :: - - bindings: - - members: - - user:mike@example.com - - group:admins@example.com - - domain:google.com - - serviceAccount:my-project-id@appspot.gserviceaccount.com - role: roles/resourcemanager.organizationAdmin - - members: - - user:eve@example.com - role: roles/resourcemanager.organizationViewer - condition: - title: expirable access - description: Does not grant access after Sep 2020 - expression: request.time < timestamp('2020-10-01T00:00:00.000Z') - - For a description of IAM and its features, see the `IAM - developer's - guide `__. + google.iam.v1.policy_pb2.Policy: + Defines an Identity and Access Management (IAM) policy. It is used to + specify access control policies for Cloud Platform + resources. + + A Policy is a collection of bindings. A binding binds + one or more members to a single role. Members can be + user accounts, service accounts, Google groups, and + domains (such as G Suite). A role is a named list of + permissions (defined by IAM or configured by users). + A binding can optionally specify a condition, which + is a logic expression that further constrains the + role binding based on attributes about the request + and/or target resource. 
+ + **JSON Example** + + { + "bindings": [ + { + "role": + "roles/resourcemanager.organizationAdmin", + "members": [ "user:mike@example.com", + "group:admins@example.com", + "domain:google.com", + "serviceAccount:my-project-id@appspot.gserviceaccount.com" + ] + + }, { "role": + "roles/resourcemanager.organizationViewer", + "members": ["user:eve@example.com"], + "condition": { "title": "expirable access", + "description": "Does not grant access after + Sep 2020", "expression": "request.time < + timestamp('2020-10-01T00:00:00.000Z')", } } + + ] + + } + + **YAML Example** + + bindings: - members: - user:\ mike@example.com - + group:\ admins@example.com - domain:google.com - + serviceAccount:\ my-project-id@appspot.gserviceaccount.com + role: roles/resourcemanager.organizationAdmin - + members: - user:\ eve@example.com role: + roles/resourcemanager.organizationViewer + condition: title: expirable access description: + Does not grant access after Sep 2020 expression: + request.time < + timestamp('2020-10-01T00:00:00.000Z') + + For a description of IAM and its features, see the + [IAM developer's + guide](\ https://cloud.google.com/iam/docs). """ # Create or coerce a protobuf request object. @@ -2591,7 +2629,7 @@ async def get_iam_policy( entry groups. Args: - request (:class:`~.iam_policy.GetIamPolicyRequest`): + request (:class:`google.iam.v1.iam_policy_pb2.GetIamPolicyRequest`): The request object. Request message for `GetIamPolicy` method. resource (:class:`str`): @@ -2599,6 +2637,7 @@ async def get_iam_policy( policy is being requested. See the operation documentation for the appropriate value for this field. + This corresponds to the ``resource`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -2610,72 +2649,62 @@ async def get_iam_policy( sent along with the request as metadata. Returns: - ~.policy.Policy: - Defines an Identity and Access Management (IAM) policy. - It is used to specify access control policies for Cloud - Platform resources. - - A ``Policy`` is a collection of ``bindings``. A - ``binding`` binds one or more ``members`` to a single - ``role``. Members can be user accounts, service - accounts, Google groups, and domains (such as G Suite). - A ``role`` is a named list of permissions (defined by - IAM or configured by users). A ``binding`` can - optionally specify a ``condition``, which is a logic - expression that further constrains the role binding - based on attributes about the request and/or target - resource. 
- - **JSON Example** - - :: - - { - "bindings": [ - { - "role": "roles/resourcemanager.organizationAdmin", - "members": [ - "user:mike@example.com", - "group:admins@example.com", - "domain:google.com", - "serviceAccount:my-project-id@appspot.gserviceaccount.com" - ] - }, - { - "role": "roles/resourcemanager.organizationViewer", - "members": ["user:eve@example.com"], - "condition": { - "title": "expirable access", - "description": "Does not grant access after Sep 2020", - "expression": "request.time < - timestamp('2020-10-01T00:00:00.000Z')", - } - } - ] - } - - **YAML Example** - - :: - - bindings: - - members: - - user:mike@example.com - - group:admins@example.com - - domain:google.com - - serviceAccount:my-project-id@appspot.gserviceaccount.com - role: roles/resourcemanager.organizationAdmin - - members: - - user:eve@example.com - role: roles/resourcemanager.organizationViewer - condition: - title: expirable access - description: Does not grant access after Sep 2020 - expression: request.time < timestamp('2020-10-01T00:00:00.000Z') - - For a description of IAM and its features, see the `IAM - developer's - guide `__. + google.iam.v1.policy_pb2.Policy: + Defines an Identity and Access Management (IAM) policy. It is used to + specify access control policies for Cloud Platform + resources. + + A Policy is a collection of bindings. A binding binds + one or more members to a single role. Members can be + user accounts, service accounts, Google groups, and + domains (such as G Suite). A role is a named list of + permissions (defined by IAM or configured by users). + A binding can optionally specify a condition, which + is a logic expression that further constrains the + role binding based on attributes about the request + and/or target resource. + + **JSON Example** + + { + "bindings": [ + { + "role": + "roles/resourcemanager.organizationAdmin", + "members": [ "user:mike@example.com", + "group:admins@example.com", + "domain:google.com", + "serviceAccount:my-project-id@appspot.gserviceaccount.com" + ] + + }, { "role": + "roles/resourcemanager.organizationViewer", + "members": ["user:eve@example.com"], + "condition": { "title": "expirable access", + "description": "Does not grant access after + Sep 2020", "expression": "request.time < + timestamp('2020-10-01T00:00:00.000Z')", } } + + ] + + } + + **YAML Example** + + bindings: - members: - user:\ mike@example.com - + group:\ admins@example.com - domain:google.com - + serviceAccount:\ my-project-id@appspot.gserviceaccount.com + role: roles/resourcemanager.organizationAdmin - + members: - user:\ eve@example.com role: + roles/resourcemanager.organizationViewer + condition: title: expirable access description: + Does not grant access after Sep 2020 expression: + request.time < + timestamp('2020-10-01T00:00:00.000Z') + + For a description of IAM and its features, see the + [IAM developer's + guide](\ https://cloud.google.com/iam/docs). """ # Create or coerce a protobuf request object. @@ -2740,7 +2769,7 @@ async def test_iam_permissions( this request. Args: - request (:class:`~.iam_policy.TestIamPermissionsRequest`): + request (:class:`google.iam.v1.iam_policy_pb2.TestIamPermissionsRequest`): The request object. Request message for `TestIamPermissions` method. @@ -2751,8 +2780,8 @@ async def test_iam_permissions( sent along with the request as metadata. Returns: - ~.iam_policy.TestIamPermissionsResponse: - Response message for ``TestIamPermissions`` method. 
+ google.iam.v1.iam_policy_pb2.TestIamPermissionsResponse: + Response message for TestIamPermissions method. """ # Create or coerce a protobuf request object. diff --git a/google/cloud/datacatalog_v1beta1/services/data_catalog/client.py b/google/cloud/datacatalog_v1beta1/services/data_catalog/client.py index da6b34fe..14b95915 100644 --- a/google/cloud/datacatalog_v1beta1/services/data_catalog/client.py +++ b/google/cloud/datacatalog_v1beta1/services/data_catalog/client.py @@ -120,6 +120,22 @@ def _get_default_mtls_endpoint(api_endpoint): DEFAULT_ENDPOINT ) + @classmethod + def from_service_account_info(cls, info: dict, *args, **kwargs): + """Creates an instance of this client using the provided credentials info. + + Args: + info (dict): The service account private key info. + args: Additional arguments to pass to the constructor. + kwargs: Additional arguments to pass to the constructor. + + Returns: + DataCatalogClient: The constructed client. + """ + credentials = service_account.Credentials.from_service_account_info(info) + kwargs["credentials"] = credentials + return cls(*args, **kwargs) + @classmethod def from_service_account_file(cls, filename: str, *args, **kwargs): """Creates an instance of this client using the provided credentials @@ -132,7 +148,7 @@ def from_service_account_file(cls, filename: str, *args, **kwargs): kwargs: Additional arguments to pass to the constructor. Returns: - {@api.name}: The constructed client. + DataCatalogClient: The constructed client. """ credentials = service_account.Credentials.from_service_account_file(filename) kwargs["credentials"] = credentials @@ -312,10 +328,10 @@ def __init__( credentials identify the application to the service; if none are specified, the client will attempt to ascertain the credentials from the environment. - transport (Union[str, ~.DataCatalogTransport]): The + transport (Union[str, DataCatalogTransport]): The transport to use. If set to None, a transport is chosen automatically. - client_options (client_options_lib.ClientOptions): Custom options for the + client_options (google.api_core.client_options.ClientOptions): Custom options for the client. It won't take effect if a ``transport`` instance is provided. (1) The ``api_endpoint`` property can be used to override the default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT @@ -442,19 +458,20 @@ def search_catalog( for more information. Args: - request (:class:`~.datacatalog.SearchCatalogRequest`): + request (google.cloud.datacatalog_v1beta1.types.SearchCatalogRequest): The request object. Request message for [SearchCatalog][google.cloud.datacatalog.v1beta1.DataCatalog.SearchCatalog]. - scope (:class:`~.datacatalog.SearchCatalogRequest.Scope`): + scope (google.cloud.datacatalog_v1beta1.types.SearchCatalogRequest.Scope): Required. The scope of this search request. A ``scope`` that has empty ``include_org_ids``, ``include_project_ids`` AND false ``include_gcp_public_datasets`` is considered invalid. Data Catalog will return an error in such a case. + This corresponds to the ``scope`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - query (:class:`str`): + query (str): Required. The query string in search query syntax. The query must be non-empty. @@ -469,6 +486,7 @@ def search_catalog( `Data Catalog Search Syntax `__ for more information. + This corresponds to the ``query`` field on the ``request`` instance; if ``request`` is provided, this should not be set. 
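For reference, a minimal sketch of the `from_service_account_info` constructor added here, combined with a `search_catalog` call in the request-object style; the key file path, project id, and query string are placeholders:

```py
import json

from google.cloud import datacatalog_v1beta1

# Parse a previously downloaded service account key (placeholder path).
with open("service-account.json") as fh:
    info = json.load(fh)

# Build the client directly from the parsed key material.
client = datacatalog_v1beta1.DataCatalogClient.from_service_account_info(info)

# Scope the search to a single project and pass everything as one request.
scope = datacatalog_v1beta1.SearchCatalogRequest.Scope(
    include_project_ids=["my-project"]
)
for result in client.search_catalog(request={"scope": scope, "query": "name:trips"}):
    print(result.relative_resource_name)
```

`from_service_account_file` behaves the same way but reads the key from disk itself.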
@@ -480,9 +498,9 @@ def search_catalog( sent along with the request as metadata. Returns: - ~.pagers.SearchCatalogPager: + google.cloud.datacatalog_v1beta1.services.data_catalog.pagers.SearchCatalogPager: Response message for - [SearchCatalog][google.cloud.datacatalog.v1beta1.DataCatalog.SearchCatalog]. + [SearchCatalog][google.cloud.datacatalog.v1beta1.DataCatalog.SearchCatalog]. Iterating over this object will yield results and resolve additional pages automatically. @@ -550,10 +568,10 @@ def create_entry_group( for more information). Args: - request (:class:`~.datacatalog.CreateEntryGroupRequest`): + request (google.cloud.datacatalog_v1beta1.types.CreateEntryGroupRequest): The request object. Request message for [CreateEntryGroup][google.cloud.datacatalog.v1beta1.DataCatalog.CreateEntryGroup]. - parent (:class:`str`): + parent (str): Required. The name of the project this entry group is in. Example: @@ -561,22 +579,25 @@ def create_entry_group( Note that this EntryGroup and its child resources may not actually be stored in the location in this name. + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - entry_group_id (:class:`str`): + entry_group_id (str): Required. The id of the entry group to create. The id must begin with a letter or underscore, contain only English letters, numbers and underscores, and be at most 64 characters. + This corresponds to the ``entry_group_id`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - entry_group (:class:`~.datacatalog.EntryGroup`): + entry_group (google.cloud.datacatalog_v1beta1.types.EntryGroup): The entry group to create. Defaults to an empty entry group. + This corresponds to the ``entry_group`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -588,11 +609,12 @@ def create_entry_group( sent along with the request as metadata. Returns: - ~.datacatalog.EntryGroup: - EntryGroup Metadata. An EntryGroup resource represents a - logical grouping of zero or more Data Catalog - [Entry][google.cloud.datacatalog.v1beta1.Entry] - resources. + google.cloud.datacatalog_v1beta1.types.EntryGroup: + EntryGroup Metadata. + An EntryGroup resource represents a logical grouping + of zero or more Data Catalog + [Entry][google.cloud.datacatalog.v1beta1.Entry] + resources. """ # Create or coerce a protobuf request object. @@ -655,19 +677,21 @@ def update_entry_group( for more information). Args: - request (:class:`~.datacatalog.UpdateEntryGroupRequest`): + request (google.cloud.datacatalog_v1beta1.types.UpdateEntryGroupRequest): The request object. Request message for [UpdateEntryGroup][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateEntryGroup]. - entry_group (:class:`~.datacatalog.EntryGroup`): + entry_group (google.cloud.datacatalog_v1beta1.types.EntryGroup): Required. The updated entry group. "name" field must be set. + This corresponds to the ``entry_group`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - update_mask (:class:`~.field_mask.FieldMask`): + update_mask (google.protobuf.field_mask_pb2.FieldMask): The fields to update on the entry group. If absent or empty, all modifiable fields are updated. + This corresponds to the ``update_mask`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -679,11 +703,12 @@ def update_entry_group( sent along with the request as metadata. 
Returns: - ~.datacatalog.EntryGroup: - EntryGroup Metadata. An EntryGroup resource represents a - logical grouping of zero or more Data Catalog - [Entry][google.cloud.datacatalog.v1beta1.Entry] - resources. + google.cloud.datacatalog_v1beta1.types.EntryGroup: + EntryGroup Metadata. + An EntryGroup resource represents a logical grouping + of zero or more Data Catalog + [Entry][google.cloud.datacatalog.v1beta1.Entry] + resources. """ # Create or coerce a protobuf request object. @@ -742,18 +767,20 @@ def get_entry_group( r"""Gets an EntryGroup. Args: - request (:class:`~.datacatalog.GetEntryGroupRequest`): + request (google.cloud.datacatalog_v1beta1.types.GetEntryGroupRequest): The request object. Request message for [GetEntryGroup][google.cloud.datacatalog.v1beta1.DataCatalog.GetEntryGroup]. - name (:class:`str`): + name (str): Required. The name of the entry group. For example, ``projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}``. + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - read_mask (:class:`~.field_mask.FieldMask`): + read_mask (google.protobuf.field_mask_pb2.FieldMask): The fields to return. If not set or empty, all fields are returned. + This corresponds to the ``read_mask`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -765,11 +792,12 @@ def get_entry_group( sent along with the request as metadata. Returns: - ~.datacatalog.EntryGroup: - EntryGroup Metadata. An EntryGroup resource represents a - logical grouping of zero or more Data Catalog - [Entry][google.cloud.datacatalog.v1beta1.Entry] - resources. + google.cloud.datacatalog_v1beta1.types.EntryGroup: + EntryGroup Metadata. + An EntryGroup resource represents a logical grouping + of zero or more Data Catalog + [Entry][google.cloud.datacatalog.v1beta1.Entry] + resources. """ # Create or coerce a protobuf request object. @@ -830,12 +858,13 @@ def delete_entry_group( for more information). Args: - request (:class:`~.datacatalog.DeleteEntryGroupRequest`): + request (google.cloud.datacatalog_v1beta1.types.DeleteEntryGroupRequest): The request object. Request message for [DeleteEntryGroup][google.cloud.datacatalog.v1beta1.DataCatalog.DeleteEntryGroup]. - name (:class:`str`): + name (str): Required. The name of the entry group. For example, ``projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}``. + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -896,15 +925,16 @@ def list_entry_groups( r"""Lists entry groups. Args: - request (:class:`~.datacatalog.ListEntryGroupsRequest`): + request (google.cloud.datacatalog_v1beta1.types.ListEntryGroupsRequest): The request object. Request message for [ListEntryGroups][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntryGroups]. - parent (:class:`str`): + parent (str): Required. The name of the location that contains the entry groups, which can be provided in URL format. Example: - projects/{project_id}/locations/{location} + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -916,9 +946,9 @@ def list_entry_groups( sent along with the request as metadata. Returns: - ~.pagers.ListEntryGroupsPager: + google.cloud.datacatalog_v1beta1.services.data_catalog.pagers.ListEntryGroupsPager: Response message for - [ListEntryGroups][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntryGroups]. 
+ [ListEntryGroups][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntryGroups]. Iterating over this object will yield results and resolve additional pages automatically. @@ -992,10 +1022,10 @@ def create_entry( A maximum of 100,000 entries may be created per entry group. Args: - request (:class:`~.datacatalog.CreateEntryRequest`): + request (google.cloud.datacatalog_v1beta1.types.CreateEntryRequest): The request object. Request message for [CreateEntry][google.cloud.datacatalog.v1beta1.DataCatalog.CreateEntry]. - parent (:class:`str`): + parent (str): Required. The name of the entry group this entry is in. Example: @@ -1003,16 +1033,18 @@ def create_entry( Note that this Entry and its child resources may not actually be stored in the location in this name. + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - entry_id (:class:`str`): + entry_id (str): Required. The id of the entry to create. + This corresponds to the ``entry_id`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - entry (:class:`~.datacatalog.Entry`): + entry (google.cloud.datacatalog_v1beta1.types.Entry): Required. The entry to create. This corresponds to the ``entry`` field on the ``request`` instance; if ``request`` is provided, this @@ -1025,18 +1057,19 @@ def create_entry( sent along with the request as metadata. Returns: - ~.datacatalog.Entry: - Entry Metadata. A Data Catalog Entry resource represents - another resource in Google Cloud Platform (such as a - BigQuery dataset or a Pub/Sub topic), or outside of - Google Cloud Platform. Clients can use the - ``linked_resource`` field in the Entry resource to refer - to the original resource ID of the source system. - - An Entry resource contains resource details, such as its - schema. An Entry can also be used to attach flexible - metadata, such as a - [Tag][google.cloud.datacatalog.v1beta1.Tag]. + google.cloud.datacatalog_v1beta1.types.Entry: + Entry Metadata. + A Data Catalog Entry resource represents another + resource in Google Cloud Platform (such as a BigQuery + dataset or a Pub/Sub topic), or outside of Google + Cloud Platform. Clients can use the linked_resource + field in the Entry resource to refer to the original + resource ID of the source system. + + An Entry resource contains resource details, such as + its schema. An Entry can also be used to attach + flexible metadata, such as a + [Tag][google.cloud.datacatalog.v1beta1.Tag]. """ # Create or coerce a protobuf request object. @@ -1099,16 +1132,17 @@ def update_entry( for more information). Args: - request (:class:`~.datacatalog.UpdateEntryRequest`): + request (google.cloud.datacatalog_v1beta1.types.UpdateEntryRequest): The request object. Request message for [UpdateEntry][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateEntry]. - entry (:class:`~.datacatalog.Entry`): + entry (google.cloud.datacatalog_v1beta1.types.Entry): Required. The updated entry. The "name" field must be set. + This corresponds to the ``entry`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - update_mask (:class:`~.field_mask.FieldMask`): + update_mask (google.protobuf.field_mask_pb2.FieldMask): The fields to update on the entry. If absent or empty, all modifiable fields are updated. 
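The flattened keyword parameters described in these docstrings can be used instead of a full request object; a short sketch of `create_entry_group` under that style, with placeholder ids:

```py
from google.cloud import datacatalog_v1beta1

client = datacatalog_v1beta1.DataCatalogClient()

# Keyword parameters and a `request` object are mutually exclusive;
# pass one or the other, never both.
entry_group = client.create_entry_group(
    parent="projects/my-project/locations/us-central1",
    entry_group_id="my_entry_group",
    entry_group=datacatalog_v1beta1.EntryGroup(display_name="Example group"),
)
print(entry_group.name)
```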
@@ -1135,6 +1169,7 @@ def update_entry( - user_specified_system - linked_resource - source_system_timestamps + This corresponds to the ``update_mask`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1146,18 +1181,19 @@ def update_entry( sent along with the request as metadata. Returns: - ~.datacatalog.Entry: - Entry Metadata. A Data Catalog Entry resource represents - another resource in Google Cloud Platform (such as a - BigQuery dataset or a Pub/Sub topic), or outside of - Google Cloud Platform. Clients can use the - ``linked_resource`` field in the Entry resource to refer - to the original resource ID of the source system. - - An Entry resource contains resource details, such as its - schema. An Entry can also be used to attach flexible - metadata, such as a - [Tag][google.cloud.datacatalog.v1beta1.Tag]. + google.cloud.datacatalog_v1beta1.types.Entry: + Entry Metadata. + A Data Catalog Entry resource represents another + resource in Google Cloud Platform (such as a BigQuery + dataset or a Pub/Sub topic), or outside of Google + Cloud Platform. Clients can use the linked_resource + field in the Entry resource to refer to the original + resource ID of the source system. + + An Entry resource contains resource details, such as + its schema. An Entry can also be used to attach + flexible metadata, such as a + [Tag][google.cloud.datacatalog.v1beta1.Tag]. """ # Create or coerce a protobuf request object. @@ -1221,13 +1257,14 @@ def delete_entry( for more information). Args: - request (:class:`~.datacatalog.DeleteEntryRequest`): + request (google.cloud.datacatalog_v1beta1.types.DeleteEntryRequest): The request object. Request message for [DeleteEntry][google.cloud.datacatalog.v1beta1.DataCatalog.DeleteEntry]. - name (:class:`str`): + name (str): Required. The name of the entry. Example: - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1288,13 +1325,14 @@ def get_entry( r"""Gets an entry. Args: - request (:class:`~.datacatalog.GetEntryRequest`): + request (google.cloud.datacatalog_v1beta1.types.GetEntryRequest): The request object. Request message for [GetEntry][google.cloud.datacatalog.v1beta1.DataCatalog.GetEntry]. - name (:class:`str`): + name (str): Required. The name of the entry. Example: - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1306,18 +1344,19 @@ def get_entry( sent along with the request as metadata. Returns: - ~.datacatalog.Entry: - Entry Metadata. A Data Catalog Entry resource represents - another resource in Google Cloud Platform (such as a - BigQuery dataset or a Pub/Sub topic), or outside of - Google Cloud Platform. Clients can use the - ``linked_resource`` field in the Entry resource to refer - to the original resource ID of the source system. - - An Entry resource contains resource details, such as its - schema. An Entry can also be used to attach flexible - metadata, such as a - [Tag][google.cloud.datacatalog.v1beta1.Tag]. + google.cloud.datacatalog_v1beta1.types.Entry: + Entry Metadata. + A Data Catalog Entry resource represents another + resource in Google Cloud Platform (such as a BigQuery + dataset or a Pub/Sub topic), or outside of Google + Cloud Platform. 
Clients can use the linked_resource + field in the Entry resource to refer to the original + resource ID of the source system. + + An Entry resource contains resource details, such as + its schema. An Entry can also be used to attach + flexible metadata, such as a + [Tag][google.cloud.datacatalog.v1beta1.Tag]. """ # Create or coerce a protobuf request object. @@ -1373,7 +1412,7 @@ def lookup_entry( Entry. Args: - request (:class:`~.datacatalog.LookupEntryRequest`): + request (google.cloud.datacatalog_v1beta1.types.LookupEntryRequest): The request object. Request message for [LookupEntry][google.cloud.datacatalog.v1beta1.DataCatalog.LookupEntry]. @@ -1384,18 +1423,19 @@ def lookup_entry( sent along with the request as metadata. Returns: - ~.datacatalog.Entry: - Entry Metadata. A Data Catalog Entry resource represents - another resource in Google Cloud Platform (such as a - BigQuery dataset or a Pub/Sub topic), or outside of - Google Cloud Platform. Clients can use the - ``linked_resource`` field in the Entry resource to refer - to the original resource ID of the source system. - - An Entry resource contains resource details, such as its - schema. An Entry can also be used to attach flexible - metadata, such as a - [Tag][google.cloud.datacatalog.v1beta1.Tag]. + google.cloud.datacatalog_v1beta1.types.Entry: + Entry Metadata. + A Data Catalog Entry resource represents another + resource in Google Cloud Platform (such as a BigQuery + dataset or a Pub/Sub topic), or outside of Google + Cloud Platform. Clients can use the linked_resource + field in the Entry resource to refer to the original + resource ID of the source system. + + An Entry resource contains resource details, such as + its schema. An Entry can also be used to attach + flexible metadata, such as a + [Tag][google.cloud.datacatalog.v1beta1.Tag]. """ # Create or coerce a protobuf request object. @@ -1429,14 +1469,15 @@ def list_entries( r"""Lists entries. Args: - request (:class:`~.datacatalog.ListEntriesRequest`): + request (google.cloud.datacatalog_v1beta1.types.ListEntriesRequest): The request object. Request message for [ListEntries][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntries]. - parent (:class:`str`): + parent (str): Required. The name of the entry group that contains the entries, which can be provided in URL format. Example: - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1448,9 +1489,9 @@ def list_entries( sent along with the request as metadata. Returns: - ~.pagers.ListEntriesPager: + google.cloud.datacatalog_v1beta1.services.data_catalog.pagers.ListEntriesPager: Response message for - [ListEntries][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntries]. + [ListEntries][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntries]. Iterating over this object will yield results and resolve additional pages automatically. @@ -1519,10 +1560,10 @@ def create_tag_template( for more information). Args: - request (:class:`~.datacatalog.CreateTagTemplateRequest`): + request (google.cloud.datacatalog_v1beta1.types.CreateTagTemplateRequest): The request object. Request message for [CreateTagTemplate][google.cloud.datacatalog.v1beta1.DataCatalog.CreateTagTemplate]. - parent (:class:`str`): + parent (str): Required. 
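A companion sketch for the entry methods documented above, listing the entries of an entry group through the pager; the resource name is a placeholder:

```py
from google.cloud import datacatalog_v1beta1

client = datacatalog_v1beta1.DataCatalogClient()

entry_group = (
    "projects/my-project/locations/us-central1/entryGroups/my_entry_group"
)

# The returned ListEntriesPager fetches additional pages transparently
# while it is iterated; each item is an Entry message.
for entry in client.list_entries(parent=entry_group):
    print(entry.name, entry.linked_resource)
```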
The name of the project and the template location [region](https://cloud.google.com/data-catalog/docs/concepts/regions. @@ -1530,16 +1571,18 @@ def create_tag_template( Example: - projects/{project_id}/locations/us-central1 + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - tag_template_id (:class:`str`): + tag_template_id (str): Required. The id of the tag template to create. + This corresponds to the ``tag_template_id`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - tag_template (:class:`~.tags.TagTemplate`): + tag_template (google.cloud.datacatalog_v1beta1.types.TagTemplate): Required. The tag template to create. This corresponds to the ``tag_template`` field on the ``request`` instance; if ``request`` is provided, this @@ -1552,16 +1595,16 @@ def create_tag_template( sent along with the request as metadata. Returns: - ~.tags.TagTemplate: - A tag template defines a tag, which can have one or more - typed fields. The template is used to create and attach - the tag to GCP resources. `Tag template - roles `__ - provide permissions to create, edit, and use the - template. See, for example, the `TagTemplate - User `__ - role, which includes permission to use the tag template - to tag resources. + google.cloud.datacatalog_v1beta1.types.TagTemplate: + A tag template defines a tag, which can have one or more typed fields. + The template is used to create and attach the tag to + GCP resources. [Tag template + roles](\ https://cloud.google.com/iam/docs/understanding-roles#data-catalog-roles) + provide permissions to create, edit, and use the + template. See, for example, the [TagTemplate + User](\ https://cloud.google.com/data-catalog/docs/how-to/template-user) + role, which includes permission to use the tag + template to tag resources. """ # Create or coerce a protobuf request object. @@ -1619,13 +1662,14 @@ def get_tag_template( r"""Gets a tag template. Args: - request (:class:`~.datacatalog.GetTagTemplateRequest`): + request (google.cloud.datacatalog_v1beta1.types.GetTagTemplateRequest): The request object. Request message for [GetTagTemplate][google.cloud.datacatalog.v1beta1.DataCatalog.GetTagTemplate]. - name (:class:`str`): + name (str): Required. The name of the tag template. Example: - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1637,16 +1681,16 @@ def get_tag_template( sent along with the request as metadata. Returns: - ~.tags.TagTemplate: - A tag template defines a tag, which can have one or more - typed fields. The template is used to create and attach - the tag to GCP resources. `Tag template - roles `__ - provide permissions to create, edit, and use the - template. See, for example, the `TagTemplate - User `__ - role, which includes permission to use the tag template - to tag resources. + google.cloud.datacatalog_v1beta1.types.TagTemplate: + A tag template defines a tag, which can have one or more typed fields. + The template is used to create and attach the tag to + GCP resources. 
[Tag template + roles](\ https://cloud.google.com/iam/docs/understanding-roles#data-catalog-roles) + provide permissions to create, edit, and use the + template. See, for example, the [TagTemplate + User](\ https://cloud.google.com/data-catalog/docs/how-to/template-user) + role, which includes permission to use the tag + template to tag resources. """ # Create or coerce a protobuf request object. @@ -1709,16 +1753,17 @@ def update_tag_template( for more information). Args: - request (:class:`~.datacatalog.UpdateTagTemplateRequest`): + request (google.cloud.datacatalog_v1beta1.types.UpdateTagTemplateRequest): The request object. Request message for [UpdateTagTemplate][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateTagTemplate]. - tag_template (:class:`~.tags.TagTemplate`): + tag_template (google.cloud.datacatalog_v1beta1.types.TagTemplate): Required. The template to update. The "name" field must be set. + This corresponds to the ``tag_template`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - update_mask (:class:`~.field_mask.FieldMask`): + update_mask (google.protobuf.field_mask_pb2.FieldMask): The field mask specifies the parts of the template to overwrite. @@ -1728,6 +1773,7 @@ def update_tag_template( If absent or empty, all of the allowed fields above will be updated. + This corresponds to the ``update_mask`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1739,16 +1785,16 @@ def update_tag_template( sent along with the request as metadata. Returns: - ~.tags.TagTemplate: - A tag template defines a tag, which can have one or more - typed fields. The template is used to create and attach - the tag to GCP resources. `Tag template - roles `__ - provide permissions to create, edit, and use the - template. See, for example, the `TagTemplate - User `__ - role, which includes permission to use the tag template - to tag resources. + google.cloud.datacatalog_v1beta1.types.TagTemplate: + A tag template defines a tag, which can have one or more typed fields. + The template is used to create and attach the tag to + GCP resources. [Tag template + roles](\ https://cloud.google.com/iam/docs/understanding-roles#data-catalog-roles) + provide permissions to create, edit, and use the + template. See, for example, the [TagTemplate + User](\ https://cloud.google.com/data-catalog/docs/how-to/template-user) + role, which includes permission to use the tag + template to tag resources. """ # Create or coerce a protobuf request object. @@ -1811,22 +1857,24 @@ def delete_tag_template( for more information). Args: - request (:class:`~.datacatalog.DeleteTagTemplateRequest`): + request (google.cloud.datacatalog_v1beta1.types.DeleteTagTemplateRequest): The request object. Request message for [DeleteTagTemplate][google.cloud.datacatalog.v1beta1.DataCatalog.DeleteTagTemplate]. - name (:class:`str`): + name (str): Required. The name of the tag template to delete. Example: - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - force (:class:`bool`): + force (bool): Required. Currently, this field must always be set to ``true``. This confirms the deletion of any possible tags using this template. 
``force = false`` will be supported in the future. + This corresponds to the ``force`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1895,10 +1943,10 @@ def create_tag_template_field( for more information). Args: - request (:class:`~.datacatalog.CreateTagTemplateFieldRequest`): + request (google.cloud.datacatalog_v1beta1.types.CreateTagTemplateFieldRequest): The request object. Request message for [CreateTagTemplateField][google.cloud.datacatalog.v1beta1.DataCatalog.CreateTagTemplateField]. - parent (:class:`str`): + parent (str): Required. The name of the project and the template location `region `__. @@ -1906,22 +1954,25 @@ def create_tag_template_field( Example: - projects/{project_id}/locations/us-central1/tagTemplates/{tag_template_id} + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - tag_template_field_id (:class:`str`): + tag_template_field_id (str): Required. The ID of the tag template field to create. Field ids can contain letters (both uppercase and lowercase), numbers (0-9), underscores (_) and dashes (-). Field IDs must be at least 1 character long and at most 128 characters long. Field IDs must also be unique within their template. + This corresponds to the ``tag_template_field_id`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - tag_template_field (:class:`~.tags.TagTemplateField`): + tag_template_field (google.cloud.datacatalog_v1beta1.types.TagTemplateField): Required. The tag template field to create. + This corresponds to the ``tag_template_field`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1933,7 +1984,7 @@ def create_tag_template_field( sent along with the request as metadata. Returns: - ~.tags.TagTemplateField: + google.cloud.datacatalog_v1beta1.types.TagTemplateField: The template for an individual field within a tag template. @@ -2002,22 +2053,23 @@ def update_tag_template_field( for more information). Args: - request (:class:`~.datacatalog.UpdateTagTemplateFieldRequest`): + request (google.cloud.datacatalog_v1beta1.types.UpdateTagTemplateFieldRequest): The request object. Request message for [UpdateTagTemplateField][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateTagTemplateField]. - name (:class:`str`): + name (str): Required. The name of the tag template field. Example: - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - tag_template_field (:class:`~.tags.TagTemplateField`): + tag_template_field (google.cloud.datacatalog_v1beta1.types.TagTemplateField): Required. The template to update. This corresponds to the ``tag_template_field`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - update_mask (:class:`~.field_mask.FieldMask`): + update_mask (google.protobuf.field_mask_pb2.FieldMask): Optional. The field mask specifies the parts of the template to be updated. Allowed fields: @@ -2033,6 +2085,7 @@ def update_tag_template_field( can only be added, existing enum values cannot be deleted nor renamed. Updating a template field from optional to required is NOT allowed. + This corresponds to the ``update_mask`` field on the ``request`` instance; if ``request`` is provided, this should not be set. 
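As the `force` description notes, the flag must currently be `true`; a minimal `delete_tag_template` sketch with a placeholder template name:

```py
from google.cloud import datacatalog_v1beta1

client = datacatalog_v1beta1.DataCatalogClient()

template_name = (
    "projects/my-project/locations/us-central1/tagTemplates/my_template"
)

# force=True confirms that tags created from this template are deleted too.
client.delete_tag_template(name=template_name, force=True)
```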
@@ -2044,7 +2097,7 @@ def update_tag_template_field( sent along with the request as metadata. Returns: - ~.tags.TagTemplateField: + google.cloud.datacatalog_v1beta1.types.TagTemplateField: The template for an individual field within a tag template. @@ -2111,19 +2164,21 @@ def rename_tag_template_field( for more information). Args: - request (:class:`~.datacatalog.RenameTagTemplateFieldRequest`): + request (google.cloud.datacatalog_v1beta1.types.RenameTagTemplateFieldRequest): The request object. Request message for [RenameTagTemplateField][google.cloud.datacatalog.v1beta1.DataCatalog.RenameTagTemplateField]. - name (:class:`str`): + name (str): Required. The name of the tag template. Example: - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - new_tag_template_field_id (:class:`str`): + new_tag_template_field_id (str): Required. The new ID of this tag template field. For example, ``my_new_field``. + This corresponds to the ``new_tag_template_field_id`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -2135,7 +2190,7 @@ def rename_tag_template_field( sent along with the request as metadata. Returns: - ~.tags.TagTemplateField: + google.cloud.datacatalog_v1beta1.types.TagTemplateField: The template for an individual field within a tag template. @@ -2201,22 +2256,24 @@ def delete_tag_template_field( for more information). Args: - request (:class:`~.datacatalog.DeleteTagTemplateFieldRequest`): + request (google.cloud.datacatalog_v1beta1.types.DeleteTagTemplateFieldRequest): The request object. Request message for [DeleteTagTemplateField][google.cloud.datacatalog.v1beta1.DataCatalog.DeleteTagTemplateField]. - name (:class:`str`): + name (str): Required. The name of the tag template field to delete. Example: - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - force (:class:`bool`): + force (bool): Required. Currently, this field must always be set to ``true``. This confirms the deletion of this field from any tags using this field. ``force = false`` will be supported in the future. + This corresponds to the ``force`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -2288,10 +2345,10 @@ def create_tag( used to create the tag must be from the same organization. Args: - request (:class:`~.datacatalog.CreateTagRequest`): + request (google.cloud.datacatalog_v1beta1.types.CreateTagRequest): The request object. Request message for [CreateTag][google.cloud.datacatalog.v1beta1.DataCatalog.CreateTag]. - parent (:class:`str`): + parent (str): Required. The name of the resource to attach this tag to. Tags can be attached to Entries. Example: @@ -2299,10 +2356,11 @@ def create_tag( Note that this Tag and its child resources may not actually be stored in the location in this name. + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - tag (:class:`~.tags.Tag`): + tag (google.cloud.datacatalog_v1beta1.types.Tag): Required. The tag to create. 
This corresponds to the ``tag`` field on the ``request`` instance; if ``request`` is provided, this @@ -2315,15 +2373,15 @@ def create_tag( sent along with the request as metadata. Returns: - ~.tags.Tag: - Tags are used to attach custom metadata to Data Catalog - resources. Tags conform to the specifications within - their tag template. + google.cloud.datacatalog_v1beta1.types.Tag: + Tags are used to attach custom metadata to Data Catalog resources. Tags + conform to the specifications within their tag + template. - See `Data Catalog - IAM `__ - for information on the permissions needed to create or - view tags. + See [Data Catalog + IAM](\ https://cloud.google.com/data-catalog/docs/concepts/iam) + for information on the permissions needed to create + or view tags. """ # Create or coerce a protobuf request object. @@ -2380,19 +2438,21 @@ def update_tag( r"""Updates an existing tag. Args: - request (:class:`~.datacatalog.UpdateTagRequest`): + request (google.cloud.datacatalog_v1beta1.types.UpdateTagRequest): The request object. Request message for [UpdateTag][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateTag]. - tag (:class:`~.tags.Tag`): + tag (google.cloud.datacatalog_v1beta1.types.Tag): Required. The updated tag. The "name" field must be set. + This corresponds to the ``tag`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - update_mask (:class:`~.field_mask.FieldMask`): + update_mask (google.protobuf.field_mask_pb2.FieldMask): The fields to update on the Tag. If absent or empty, all modifiable fields are updated. Currently the only modifiable field is the field ``fields``. + This corresponds to the ``update_mask`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -2404,15 +2464,15 @@ def update_tag( sent along with the request as metadata. Returns: - ~.tags.Tag: - Tags are used to attach custom metadata to Data Catalog - resources. Tags conform to the specifications within - their tag template. + google.cloud.datacatalog_v1beta1.types.Tag: + Tags are used to attach custom metadata to Data Catalog resources. Tags + conform to the specifications within their tag + template. - See `Data Catalog - IAM `__ - for information on the permissions needed to create or - view tags. + See [Data Catalog + IAM](\ https://cloud.google.com/data-catalog/docs/concepts/iam) + for information on the permissions needed to create + or view tags. """ # Create or coerce a protobuf request object. @@ -2468,13 +2528,14 @@ def delete_tag( r"""Deletes a tag. Args: - request (:class:`~.datacatalog.DeleteTagRequest`): + request (google.cloud.datacatalog_v1beta1.types.DeleteTagRequest): The request object. Request message for [DeleteTag][google.cloud.datacatalog.v1beta1.DataCatalog.DeleteTag]. - name (:class:`str`): + name (str): Required. The name of the tag to delete. Example: - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id}/tags/{tag_id} + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -2536,10 +2597,10 @@ def list_tags( [Entry][google.cloud.datacatalog.v1beta1.Entry]. Args: - request (:class:`~.datacatalog.ListTagsRequest`): + request (google.cloud.datacatalog_v1beta1.types.ListTagsRequest): The request object. Request message for [ListTags][google.cloud.datacatalog.v1beta1.DataCatalog.ListTags]. 
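To illustrate `create_tag` with the flattened `parent` and `tag` parameters, a sketch that assumes an existing template with a string field called `source` (all ids below are placeholders):

```py
from google.cloud import datacatalog_v1beta1

client = datacatalog_v1beta1.DataCatalogClient()

entry_name = (
    "projects/my-project/locations/us-central1/"
    "entryGroups/my_entry_group/entries/my_entry"
)

# The tag must reference an existing template from the same organization.
tag = datacatalog_v1beta1.Tag(
    template="projects/my-project/locations/us-central1/tagTemplates/my_template",
    fields={"source": datacatalog_v1beta1.TagField(string_value="nightly-etl")},
)

created = client.create_tag(parent=entry_name, tag=tag)
print(created.name)
```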
- parent (:class:`str`): + parent (str): Required. The name of the Data Catalog resource to list the tags of. The resource could be an [Entry][google.cloud.datacatalog.v1beta1.Entry] or an @@ -2549,6 +2610,7 @@ def list_tags( - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id} - projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}/entries/{entry_id} + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -2560,9 +2622,9 @@ def list_tags( sent along with the request as metadata. Returns: - ~.pagers.ListTagsPager: + google.cloud.datacatalog_v1beta1.services.data_catalog.pagers.ListTagsPager: Response message for - [ListTags][google.cloud.datacatalog.v1beta1.DataCatalog.ListTags]. + [ListTags][google.cloud.datacatalog.v1beta1.DataCatalog.ListTags]. Iterating over this object will yield results and resolve additional pages automatically. @@ -2641,14 +2703,15 @@ def set_iam_policy( entry groups. Args: - request (:class:`~.iam_policy.SetIamPolicyRequest`): + request (google.iam.v1.iam_policy_pb2.SetIamPolicyRequest): The request object. Request message for `SetIamPolicy` method. - resource (:class:`str`): + resource (str): REQUIRED: The resource for which the policy is being specified. See the operation documentation for the appropriate value for this field. + This corresponds to the ``resource`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -2660,72 +2723,62 @@ def set_iam_policy( sent along with the request as metadata. Returns: - ~.policy.Policy: - Defines an Identity and Access Management (IAM) policy. - It is used to specify access control policies for Cloud - Platform resources. - - A ``Policy`` is a collection of ``bindings``. A - ``binding`` binds one or more ``members`` to a single - ``role``. Members can be user accounts, service - accounts, Google groups, and domains (such as G Suite). - A ``role`` is a named list of permissions (defined by - IAM or configured by users). A ``binding`` can - optionally specify a ``condition``, which is a logic - expression that further constrains the role binding - based on attributes about the request and/or target - resource. - - **JSON Example** - - :: - - { - "bindings": [ - { - "role": "roles/resourcemanager.organizationAdmin", - "members": [ - "user:mike@example.com", - "group:admins@example.com", - "domain:google.com", - "serviceAccount:my-project-id@appspot.gserviceaccount.com" - ] - }, - { - "role": "roles/resourcemanager.organizationViewer", - "members": ["user:eve@example.com"], - "condition": { - "title": "expirable access", - "description": "Does not grant access after Sep 2020", - "expression": "request.time < - timestamp('2020-10-01T00:00:00.000Z')", - } - } - ] - } - - **YAML Example** - - :: - - bindings: - - members: - - user:mike@example.com - - group:admins@example.com - - domain:google.com - - serviceAccount:my-project-id@appspot.gserviceaccount.com - role: roles/resourcemanager.organizationAdmin - - members: - - user:eve@example.com - role: roles/resourcemanager.organizationViewer - condition: - title: expirable access - description: Does not grant access after Sep 2020 - expression: request.time < timestamp('2020-10-01T00:00:00.000Z') - - For a description of IAM and its features, see the `IAM - developer's - guide `__. + google.iam.v1.policy_pb2.Policy: + Defines an Identity and Access Management (IAM) policy. 
It is used to + specify access control policies for Cloud Platform + resources. + + A Policy is a collection of bindings. A binding binds + one or more members to a single role. Members can be + user accounts, service accounts, Google groups, and + domains (such as G Suite). A role is a named list of + permissions (defined by IAM or configured by users). + A binding can optionally specify a condition, which + is a logic expression that further constrains the + role binding based on attributes about the request + and/or target resource. + + **JSON Example** + + { + "bindings": [ + { + "role": + "roles/resourcemanager.organizationAdmin", + "members": [ "user:mike@example.com", + "group:admins@example.com", + "domain:google.com", + "serviceAccount:my-project-id@appspot.gserviceaccount.com" + ] + + }, { "role": + "roles/resourcemanager.organizationViewer", + "members": ["user:eve@example.com"], + "condition": { "title": "expirable access", + "description": "Does not grant access after + Sep 2020", "expression": "request.time < + timestamp('2020-10-01T00:00:00.000Z')", } } + + ] + + } + + **YAML Example** + + bindings: - members: - user:\ mike@example.com - + group:\ admins@example.com - domain:google.com - + serviceAccount:\ my-project-id@appspot.gserviceaccount.com + role: roles/resourcemanager.organizationAdmin - + members: - user:\ eve@example.com role: + roles/resourcemanager.organizationViewer + condition: title: expirable access description: + Does not grant access after Sep 2020 expression: + request.time < + timestamp('2020-10-01T00:00:00.000Z') + + For a description of IAM and its features, see the + [IAM developer's + guide](\ https://cloud.google.com/iam/docs). """ # Create or coerce a protobuf request object. @@ -2794,14 +2847,15 @@ def get_iam_policy( entry groups. Args: - request (:class:`~.iam_policy.GetIamPolicyRequest`): + request (google.iam.v1.iam_policy_pb2.GetIamPolicyRequest): The request object. Request message for `GetIamPolicy` method. - resource (:class:`str`): + resource (str): REQUIRED: The resource for which the policy is being requested. See the operation documentation for the appropriate value for this field. + This corresponds to the ``resource`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -2813,72 +2867,62 @@ def get_iam_policy( sent along with the request as metadata. Returns: - ~.policy.Policy: - Defines an Identity and Access Management (IAM) policy. - It is used to specify access control policies for Cloud - Platform resources. - - A ``Policy`` is a collection of ``bindings``. A - ``binding`` binds one or more ``members`` to a single - ``role``. Members can be user accounts, service - accounts, Google groups, and domains (such as G Suite). - A ``role`` is a named list of permissions (defined by - IAM or configured by users). A ``binding`` can - optionally specify a ``condition``, which is a logic - expression that further constrains the role binding - based on attributes about the request and/or target - resource. 
- - **JSON Example** - - :: - - { - "bindings": [ - { - "role": "roles/resourcemanager.organizationAdmin", - "members": [ - "user:mike@example.com", - "group:admins@example.com", - "domain:google.com", - "serviceAccount:my-project-id@appspot.gserviceaccount.com" - ] - }, - { - "role": "roles/resourcemanager.organizationViewer", - "members": ["user:eve@example.com"], - "condition": { - "title": "expirable access", - "description": "Does not grant access after Sep 2020", - "expression": "request.time < - timestamp('2020-10-01T00:00:00.000Z')", - } - } - ] - } - - **YAML Example** - - :: - - bindings: - - members: - - user:mike@example.com - - group:admins@example.com - - domain:google.com - - serviceAccount:my-project-id@appspot.gserviceaccount.com - role: roles/resourcemanager.organizationAdmin - - members: - - user:eve@example.com - role: roles/resourcemanager.organizationViewer - condition: - title: expirable access - description: Does not grant access after Sep 2020 - expression: request.time < timestamp('2020-10-01T00:00:00.000Z') - - For a description of IAM and its features, see the `IAM - developer's - guide `__. + google.iam.v1.policy_pb2.Policy: + Defines an Identity and Access Management (IAM) policy. It is used to + specify access control policies for Cloud Platform + resources. + + A Policy is a collection of bindings. A binding binds + one or more members to a single role. Members can be + user accounts, service accounts, Google groups, and + domains (such as G Suite). A role is a named list of + permissions (defined by IAM or configured by users). + A binding can optionally specify a condition, which + is a logic expression that further constrains the + role binding based on attributes about the request + and/or target resource. + + **JSON Example** + + { + "bindings": [ + { + "role": + "roles/resourcemanager.organizationAdmin", + "members": [ "user:mike@example.com", + "group:admins@example.com", + "domain:google.com", + "serviceAccount:my-project-id@appspot.gserviceaccount.com" + ] + + }, { "role": + "roles/resourcemanager.organizationViewer", + "members": ["user:eve@example.com"], + "condition": { "title": "expirable access", + "description": "Does not grant access after + Sep 2020", "expression": "request.time < + timestamp('2020-10-01T00:00:00.000Z')", } } + + ] + + } + + **YAML Example** + + bindings: - members: - user:\ mike@example.com - + group:\ admins@example.com - domain:google.com - + serviceAccount:\ my-project-id@appspot.gserviceaccount.com + role: roles/resourcemanager.organizationAdmin - + members: - user:\ eve@example.com role: + roles/resourcemanager.organizationViewer + condition: title: expirable access description: + Does not grant access after Sep 2020 expression: + request.time < + timestamp('2020-10-01T00:00:00.000Z') + + For a description of IAM and its features, see the + [IAM developer's + guide](\ https://cloud.google.com/iam/docs). """ # Create or coerce a protobuf request object. @@ -2939,7 +2983,7 @@ def test_iam_permissions( this request. Args: - request (:class:`~.iam_policy.TestIamPermissionsRequest`): + request (google.iam.v1.iam_policy_pb2.TestIamPermissionsRequest): The request object. Request message for `TestIamPermissions` method. @@ -2950,8 +2994,8 @@ def test_iam_permissions( sent along with the request as metadata. Returns: - ~.iam_policy.TestIamPermissionsResponse: - Response message for ``TestIamPermissions`` method. 
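The IAM methods take their request messages from `google.iam.v1`; a hedged sketch of reading, amending, and re-writing an entry group policy, then checking permissions (the role, permission, and resource names are illustrative placeholders):

```py
from google.cloud import datacatalog_v1beta1

client = datacatalog_v1beta1.DataCatalogClient()

entry_group = (
    "projects/my-project/locations/us-central1/entryGroups/my_entry_group"
)

# Read the current policy, add a binding, and write the policy back.
policy = client.get_iam_policy(request={"resource": entry_group})
policy.bindings.add(
    role="roles/datacatalog.entryGroupOwner",
    members=["user:eve@example.com"],
)
client.set_iam_policy(request={"resource": entry_group, "policy": policy})

# Ask which of the listed permissions the caller actually holds.
response = client.test_iam_permissions(
    request={
        "resource": entry_group,
        "permissions": ["datacatalog.entryGroups.get"],
    }
)
print(list(response.permissions))
```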
+ google.iam.v1.iam_policy_pb2.TestIamPermissionsResponse: + Response message for TestIamPermissions method. """ # Create or coerce a protobuf request object. diff --git a/google/cloud/datacatalog_v1beta1/services/data_catalog/pagers.py b/google/cloud/datacatalog_v1beta1/services/data_catalog/pagers.py index ae87331f..17b6f5ed 100644 --- a/google/cloud/datacatalog_v1beta1/services/data_catalog/pagers.py +++ b/google/cloud/datacatalog_v1beta1/services/data_catalog/pagers.py @@ -26,7 +26,7 @@ class SearchCatalogPager: """A pager for iterating through ``search_catalog`` requests. This class thinly wraps an initial - :class:`~.datacatalog.SearchCatalogResponse` object, and + :class:`google.cloud.datacatalog_v1beta1.types.SearchCatalogResponse` object, and provides an ``__iter__`` method to iterate through its ``results`` field. @@ -35,7 +35,7 @@ class SearchCatalogPager: through the ``results`` field on the corresponding responses. - All the usual :class:`~.datacatalog.SearchCatalogResponse` + All the usual :class:`google.cloud.datacatalog_v1beta1.types.SearchCatalogResponse` attributes are available on the pager. If multiple requests are made, only the most recent response is retained, and thus used for attribute lookup. """ @@ -53,9 +53,9 @@ def __init__( Args: method (Callable): The method that was originally called, and which instantiated this pager. - request (:class:`~.datacatalog.SearchCatalogRequest`): + request (google.cloud.datacatalog_v1beta1.types.SearchCatalogRequest): The initial request object. - response (:class:`~.datacatalog.SearchCatalogResponse`): + response (google.cloud.datacatalog_v1beta1.types.SearchCatalogResponse): The initial response object. metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. @@ -88,7 +88,7 @@ class SearchCatalogAsyncPager: """A pager for iterating through ``search_catalog`` requests. This class thinly wraps an initial - :class:`~.datacatalog.SearchCatalogResponse` object, and + :class:`google.cloud.datacatalog_v1beta1.types.SearchCatalogResponse` object, and provides an ``__aiter__`` method to iterate through its ``results`` field. @@ -97,7 +97,7 @@ class SearchCatalogAsyncPager: through the ``results`` field on the corresponding responses. - All the usual :class:`~.datacatalog.SearchCatalogResponse` + All the usual :class:`google.cloud.datacatalog_v1beta1.types.SearchCatalogResponse` attributes are available on the pager. If multiple requests are made, only the most recent response is retained, and thus used for attribute lookup. """ @@ -115,9 +115,9 @@ def __init__( Args: method (Callable): The method that was originally called, and which instantiated this pager. - request (:class:`~.datacatalog.SearchCatalogRequest`): + request (google.cloud.datacatalog_v1beta1.types.SearchCatalogRequest): The initial request object. - response (:class:`~.datacatalog.SearchCatalogResponse`): + response (google.cloud.datacatalog_v1beta1.types.SearchCatalogResponse): The initial response object. metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. @@ -154,7 +154,7 @@ class ListEntryGroupsPager: """A pager for iterating through ``list_entry_groups`` requests. This class thinly wraps an initial - :class:`~.datacatalog.ListEntryGroupsResponse` object, and + :class:`google.cloud.datacatalog_v1beta1.types.ListEntryGroupsResponse` object, and provides an ``__iter__`` method to iterate through its ``entry_groups`` field. 
@@ -163,7 +163,7 @@ class ListEntryGroupsPager: through the ``entry_groups`` field on the corresponding responses. - All the usual :class:`~.datacatalog.ListEntryGroupsResponse` + All the usual :class:`google.cloud.datacatalog_v1beta1.types.ListEntryGroupsResponse` attributes are available on the pager. If multiple requests are made, only the most recent response is retained, and thus used for attribute lookup. """ @@ -181,9 +181,9 @@ def __init__( Args: method (Callable): The method that was originally called, and which instantiated this pager. - request (:class:`~.datacatalog.ListEntryGroupsRequest`): + request (google.cloud.datacatalog_v1beta1.types.ListEntryGroupsRequest): The initial request object. - response (:class:`~.datacatalog.ListEntryGroupsResponse`): + response (google.cloud.datacatalog_v1beta1.types.ListEntryGroupsResponse): The initial response object. metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. @@ -216,7 +216,7 @@ class ListEntryGroupsAsyncPager: """A pager for iterating through ``list_entry_groups`` requests. This class thinly wraps an initial - :class:`~.datacatalog.ListEntryGroupsResponse` object, and + :class:`google.cloud.datacatalog_v1beta1.types.ListEntryGroupsResponse` object, and provides an ``__aiter__`` method to iterate through its ``entry_groups`` field. @@ -225,7 +225,7 @@ class ListEntryGroupsAsyncPager: through the ``entry_groups`` field on the corresponding responses. - All the usual :class:`~.datacatalog.ListEntryGroupsResponse` + All the usual :class:`google.cloud.datacatalog_v1beta1.types.ListEntryGroupsResponse` attributes are available on the pager. If multiple requests are made, only the most recent response is retained, and thus used for attribute lookup. """ @@ -243,9 +243,9 @@ def __init__( Args: method (Callable): The method that was originally called, and which instantiated this pager. - request (:class:`~.datacatalog.ListEntryGroupsRequest`): + request (google.cloud.datacatalog_v1beta1.types.ListEntryGroupsRequest): The initial request object. - response (:class:`~.datacatalog.ListEntryGroupsResponse`): + response (google.cloud.datacatalog_v1beta1.types.ListEntryGroupsResponse): The initial response object. metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. @@ -282,7 +282,7 @@ class ListEntriesPager: """A pager for iterating through ``list_entries`` requests. This class thinly wraps an initial - :class:`~.datacatalog.ListEntriesResponse` object, and + :class:`google.cloud.datacatalog_v1beta1.types.ListEntriesResponse` object, and provides an ``__iter__`` method to iterate through its ``entries`` field. @@ -291,7 +291,7 @@ class ListEntriesPager: through the ``entries`` field on the corresponding responses. - All the usual :class:`~.datacatalog.ListEntriesResponse` + All the usual :class:`google.cloud.datacatalog_v1beta1.types.ListEntriesResponse` attributes are available on the pager. If multiple requests are made, only the most recent response is retained, and thus used for attribute lookup. """ @@ -309,9 +309,9 @@ def __init__( Args: method (Callable): The method that was originally called, and which instantiated this pager. - request (:class:`~.datacatalog.ListEntriesRequest`): + request (google.cloud.datacatalog_v1beta1.types.ListEntriesRequest): The initial request object. - response (:class:`~.datacatalog.ListEntriesResponse`): + response (google.cloud.datacatalog_v1beta1.types.ListEntriesResponse): The initial response object. 
metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. @@ -344,7 +344,7 @@ class ListEntriesAsyncPager: """A pager for iterating through ``list_entries`` requests. This class thinly wraps an initial - :class:`~.datacatalog.ListEntriesResponse` object, and + :class:`google.cloud.datacatalog_v1beta1.types.ListEntriesResponse` object, and provides an ``__aiter__`` method to iterate through its ``entries`` field. @@ -353,7 +353,7 @@ class ListEntriesAsyncPager: through the ``entries`` field on the corresponding responses. - All the usual :class:`~.datacatalog.ListEntriesResponse` + All the usual :class:`google.cloud.datacatalog_v1beta1.types.ListEntriesResponse` attributes are available on the pager. If multiple requests are made, only the most recent response is retained, and thus used for attribute lookup. """ @@ -371,9 +371,9 @@ def __init__( Args: method (Callable): The method that was originally called, and which instantiated this pager. - request (:class:`~.datacatalog.ListEntriesRequest`): + request (google.cloud.datacatalog_v1beta1.types.ListEntriesRequest): The initial request object. - response (:class:`~.datacatalog.ListEntriesResponse`): + response (google.cloud.datacatalog_v1beta1.types.ListEntriesResponse): The initial response object. metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. @@ -410,7 +410,7 @@ class ListTagsPager: """A pager for iterating through ``list_tags`` requests. This class thinly wraps an initial - :class:`~.datacatalog.ListTagsResponse` object, and + :class:`google.cloud.datacatalog_v1beta1.types.ListTagsResponse` object, and provides an ``__iter__`` method to iterate through its ``tags`` field. @@ -419,7 +419,7 @@ class ListTagsPager: through the ``tags`` field on the corresponding responses. - All the usual :class:`~.datacatalog.ListTagsResponse` + All the usual :class:`google.cloud.datacatalog_v1beta1.types.ListTagsResponse` attributes are available on the pager. If multiple requests are made, only the most recent response is retained, and thus used for attribute lookup. """ @@ -437,9 +437,9 @@ def __init__( Args: method (Callable): The method that was originally called, and which instantiated this pager. - request (:class:`~.datacatalog.ListTagsRequest`): + request (google.cloud.datacatalog_v1beta1.types.ListTagsRequest): The initial request object. - response (:class:`~.datacatalog.ListTagsResponse`): + response (google.cloud.datacatalog_v1beta1.types.ListTagsResponse): The initial response object. metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. @@ -472,7 +472,7 @@ class ListTagsAsyncPager: """A pager for iterating through ``list_tags`` requests. This class thinly wraps an initial - :class:`~.datacatalog.ListTagsResponse` object, and + :class:`google.cloud.datacatalog_v1beta1.types.ListTagsResponse` object, and provides an ``__aiter__`` method to iterate through its ``tags`` field. @@ -481,7 +481,7 @@ class ListTagsAsyncPager: through the ``tags`` field on the corresponding responses. - All the usual :class:`~.datacatalog.ListTagsResponse` + All the usual :class:`google.cloud.datacatalog_v1beta1.types.ListTagsResponse` attributes are available on the pager. If multiple requests are made, only the most recent response is retained, and thus used for attribute lookup. """ @@ -499,9 +499,9 @@ def __init__( Args: method (Callable): The method that was originally called, and which instantiated this pager. 
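The ``list_tags`` pagers documented here behave the same way; a minimal synchronous sketch, assuming a hypothetical entry resource name:

```py
from google.cloud import datacatalog_v1beta1

client = datacatalog_v1beta1.DataCatalogClient()

# Hypothetical fully qualified entry name that owns the tags.
entry_name = (
    "projects/my-project/locations/us-central1/"
    "entryGroups/my_entry_group/entries/my_entry"
)

# Each iteration may trigger another ListTags RPC; only the most recent
# response is retained on the pager.
for tag in client.list_tags(parent=entry_name):
    print(tag.name, tag.template)
```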
- request (:class:`~.datacatalog.ListTagsRequest`): + request (google.cloud.datacatalog_v1beta1.types.ListTagsRequest): The initial request object. - response (:class:`~.datacatalog.ListTagsResponse`): + response (google.cloud.datacatalog_v1beta1.types.ListTagsResponse): The initial response object. metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/async_client.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/async_client.py index 759d80df..61f9daab 100644 --- a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/async_client.py +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/async_client.py @@ -82,6 +82,7 @@ class PolicyTagManagerAsyncClient: PolicyTagManagerClient.parse_common_location_path ) + from_service_account_info = PolicyTagManagerClient.from_service_account_info from_service_account_file = PolicyTagManagerClient.from_service_account_file from_service_account_json = from_service_account_file @@ -159,17 +160,18 @@ async def create_taxonomy( r"""Creates a taxonomy in the specified project. Args: - request (:class:`~.policytagmanager.CreateTaxonomyRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.CreateTaxonomyRequest`): The request object. Request message for [CreateTaxonomy][google.cloud.datacatalog.v1beta1.PolicyTagManager.CreateTaxonomy]. parent (:class:`str`): Required. Resource name of the project that the taxonomy will belong to. + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - taxonomy (:class:`~.policytagmanager.Taxonomy`): + taxonomy (:class:`google.cloud.datacatalog_v1beta1.types.Taxonomy`): The taxonomy to be created. This corresponds to the ``taxonomy`` field on the ``request`` instance; if ``request`` is provided, this @@ -182,14 +184,13 @@ async def create_taxonomy( sent along with the request as metadata. Returns: - ~.policytagmanager.Taxonomy: - A taxonomy is a collection of policy tags that classify - data along a common axis. For instance a data - *sensitivity* taxonomy could contain policy tags - denoting PII such as age, zipcode, and SSN. A data - *origin* taxonomy could contain policy tags to - distinguish user data, employee data, partner data, - public data. + google.cloud.datacatalog_v1beta1.types.Taxonomy: + A taxonomy is a collection of policy tags that classify data along a common + axis. For instance a data *sensitivity* taxonomy + could contain policy tags denoting PII such as age, + zipcode, and SSN. A data *origin* taxonomy could + contain policy tags to distinguish user data, + employee data, partner data, public data. """ # Create or coerce a protobuf request object. @@ -246,13 +247,14 @@ async def delete_taxonomy( associated policies. Args: - request (:class:`~.policytagmanager.DeleteTaxonomyRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.DeleteTaxonomyRequest`): The request object. Request message for [DeleteTaxonomy][google.cloud.datacatalog.v1beta1.PolicyTagManager.DeleteTaxonomy]. name (:class:`str`): Required. Resource name of the taxonomy to be deleted. All policy tags in this taxonomy will also be deleted. + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -312,12 +314,13 @@ async def update_taxonomy( r"""Updates a taxonomy. 
Args: - request (:class:`~.policytagmanager.UpdateTaxonomyRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.UpdateTaxonomyRequest`): The request object. Request message for [UpdateTaxonomy][google.cloud.datacatalog.v1beta1.PolicyTagManager.UpdateTaxonomy]. - taxonomy (:class:`~.policytagmanager.Taxonomy`): + taxonomy (:class:`google.cloud.datacatalog_v1beta1.types.Taxonomy`): The taxonomy to update. Only description, display_name, and activated policy types can be updated. + This corresponds to the ``taxonomy`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -329,14 +332,13 @@ async def update_taxonomy( sent along with the request as metadata. Returns: - ~.policytagmanager.Taxonomy: - A taxonomy is a collection of policy tags that classify - data along a common axis. For instance a data - *sensitivity* taxonomy could contain policy tags - denoting PII such as age, zipcode, and SSN. A data - *origin* taxonomy could contain policy tags to - distinguish user data, employee data, partner data, - public data. + google.cloud.datacatalog_v1beta1.types.Taxonomy: + A taxonomy is a collection of policy tags that classify data along a common + axis. For instance a data *sensitivity* taxonomy + could contain policy tags denoting PII such as age, + zipcode, and SSN. A data *origin* taxonomy could + contain policy tags to distinguish user data, + employee data, partner data, public data. """ # Create or coerce a protobuf request object. @@ -392,12 +394,13 @@ async def list_taxonomies( location that the caller has permission to view. Args: - request (:class:`~.policytagmanager.ListTaxonomiesRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.ListTaxonomiesRequest`): The request object. Request message for [ListTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListTaxonomies]. parent (:class:`str`): Required. Resource name of the project to list the taxonomies of. + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -409,9 +412,9 @@ async def list_taxonomies( sent along with the request as metadata. Returns: - ~.pagers.ListTaxonomiesAsyncPager: + google.cloud.datacatalog_v1beta1.services.policy_tag_manager.pagers.ListTaxonomiesAsyncPager: Response message for - [ListTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListTaxonomies]. + [ListTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListTaxonomies]. Iterating over this object will yield results and resolve additional pages automatically. @@ -473,12 +476,13 @@ async def get_taxonomy( r"""Gets a taxonomy. Args: - request (:class:`~.policytagmanager.GetTaxonomyRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.GetTaxonomyRequest`): The request object. Request message for [GetTaxonomy][google.cloud.datacatalog.v1beta1.PolicyTagManager.GetTaxonomy]. name (:class:`str`): Required. Resource name of the requested taxonomy. + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -490,14 +494,13 @@ async def get_taxonomy( sent along with the request as metadata. Returns: - ~.policytagmanager.Taxonomy: - A taxonomy is a collection of policy tags that classify - data along a common axis. For instance a data - *sensitivity* taxonomy could contain policy tags - denoting PII such as age, zipcode, and SSN. 
A data - *origin* taxonomy could contain policy tags to - distinguish user data, employee data, partner data, - public data. + google.cloud.datacatalog_v1beta1.types.Taxonomy: + A taxonomy is a collection of policy tags that classify data along a common + axis. For instance a data *sensitivity* taxonomy + could contain policy tags denoting PII such as age, + zipcode, and SSN. A data *origin* taxonomy could + contain policy tags to distinguish user data, + employee data, partner data, public data. """ # Create or coerce a protobuf request object. @@ -551,17 +554,18 @@ async def create_policy_tag( r"""Creates a policy tag in the specified taxonomy. Args: - request (:class:`~.policytagmanager.CreatePolicyTagRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.CreatePolicyTagRequest`): The request object. Request message for [CreatePolicyTag][google.cloud.datacatalog.v1beta1.PolicyTagManager.CreatePolicyTag]. parent (:class:`str`): Required. Resource name of the taxonomy that the policy tag will belong to. + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - policy_tag (:class:`~.policytagmanager.PolicyTag`): + policy_tag (:class:`google.cloud.datacatalog_v1beta1.types.PolicyTag`): The policy tag to be created. This corresponds to the ``policy_tag`` field on the ``request`` instance; if ``request`` is provided, this @@ -574,7 +578,7 @@ async def create_policy_tag( sent along with the request as metadata. Returns: - ~.policytagmanager.PolicyTag: + google.cloud.datacatalog_v1beta1.types.PolicyTag: Denotes one policy tag in a taxonomy (e.g. ssn). Policy Tags can be defined in a hierarchy. For example, consider @@ -638,13 +642,14 @@ async def delete_policy_tag( descendant policy tags. Args: - request (:class:`~.policytagmanager.DeletePolicyTagRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.DeletePolicyTagRequest`): The request object. Request message for [DeletePolicyTag][google.cloud.datacatalog.v1beta1.PolicyTagManager.DeletePolicyTag]. name (:class:`str`): Required. Resource name of the policy tag to be deleted. All of its descendant policy tags will also be deleted. + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -704,13 +709,14 @@ async def update_policy_tag( r"""Updates a policy tag. Args: - request (:class:`~.policytagmanager.UpdatePolicyTagRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.UpdatePolicyTagRequest`): The request object. Request message for [UpdatePolicyTag][google.cloud.datacatalog.v1beta1.PolicyTagManager.UpdatePolicyTag]. - policy_tag (:class:`~.policytagmanager.PolicyTag`): + policy_tag (:class:`google.cloud.datacatalog_v1beta1.types.PolicyTag`): The policy tag to update. Only the description, display_name, and parent_policy_tag fields can be updated. + This corresponds to the ``policy_tag`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -722,7 +728,7 @@ async def update_policy_tag( sent along with the request as metadata. Returns: - ~.policytagmanager.PolicyTag: + google.cloud.datacatalog_v1beta1.types.PolicyTag: Denotes one policy tag in a taxonomy (e.g. ssn). Policy Tags can be defined in a hierarchy. For example, consider @@ -785,12 +791,13 @@ async def list_policy_tags( r"""Lists all policy tags in a taxonomy. 
Args: - request (:class:`~.policytagmanager.ListPolicyTagsRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.ListPolicyTagsRequest`): The request object. Request message for [ListPolicyTags][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListPolicyTags]. parent (:class:`str`): Required. Resource name of the taxonomy to list the policy tags of. + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -802,9 +809,9 @@ async def list_policy_tags( sent along with the request as metadata. Returns: - ~.pagers.ListPolicyTagsAsyncPager: + google.cloud.datacatalog_v1beta1.services.policy_tag_manager.pagers.ListPolicyTagsAsyncPager: Response message for - [ListPolicyTags][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListPolicyTags]. + [ListPolicyTags][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListPolicyTags]. Iterating over this object will yield results and resolve additional pages automatically. @@ -866,12 +873,13 @@ async def get_policy_tag( r"""Gets a policy tag. Args: - request (:class:`~.policytagmanager.GetPolicyTagRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.GetPolicyTagRequest`): The request object. Request message for [GetPolicyTag][google.cloud.datacatalog.v1beta1.PolicyTagManager.GetPolicyTag]. name (:class:`str`): Required. Resource name of the requested policy tag. + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -883,7 +891,7 @@ async def get_policy_tag( sent along with the request as metadata. Returns: - ~.policytagmanager.PolicyTag: + google.cloud.datacatalog_v1beta1.types.PolicyTag: Denotes one policy tag in a taxonomy (e.g. ssn). Policy Tags can be defined in a hierarchy. For example, consider @@ -943,7 +951,7 @@ async def get_iam_policy( r"""Gets the IAM policy for a taxonomy or a policy tag. Args: - request (:class:`~.iam_policy.GetIamPolicyRequest`): + request (:class:`google.iam.v1.iam_policy_pb2.GetIamPolicyRequest`): The request object. Request message for `GetIamPolicy` method. @@ -954,72 +962,62 @@ async def get_iam_policy( sent along with the request as metadata. Returns: - ~.policy.Policy: - Defines an Identity and Access Management (IAM) policy. - It is used to specify access control policies for Cloud - Platform resources. - - A ``Policy`` is a collection of ``bindings``. A - ``binding`` binds one or more ``members`` to a single - ``role``. Members can be user accounts, service - accounts, Google groups, and domains (such as G Suite). - A ``role`` is a named list of permissions (defined by - IAM or configured by users). A ``binding`` can - optionally specify a ``condition``, which is a logic - expression that further constrains the role binding - based on attributes about the request and/or target - resource. 
- - **JSON Example** - - :: - - { - "bindings": [ - { - "role": "roles/resourcemanager.organizationAdmin", - "members": [ - "user:mike@example.com", - "group:admins@example.com", - "domain:google.com", - "serviceAccount:my-project-id@appspot.gserviceaccount.com" - ] - }, - { - "role": "roles/resourcemanager.organizationViewer", - "members": ["user:eve@example.com"], - "condition": { - "title": "expirable access", - "description": "Does not grant access after Sep 2020", - "expression": "request.time < - timestamp('2020-10-01T00:00:00.000Z')", - } - } - ] - } - - **YAML Example** - - :: - - bindings: - - members: - - user:mike@example.com - - group:admins@example.com - - domain:google.com - - serviceAccount:my-project-id@appspot.gserviceaccount.com - role: roles/resourcemanager.organizationAdmin - - members: - - user:eve@example.com - role: roles/resourcemanager.organizationViewer - condition: - title: expirable access - description: Does not grant access after Sep 2020 - expression: request.time < timestamp('2020-10-01T00:00:00.000Z') - - For a description of IAM and its features, see the `IAM - developer's - guide `__. + google.iam.v1.policy_pb2.Policy: + Defines an Identity and Access Management (IAM) policy. It is used to + specify access control policies for Cloud Platform + resources. + + A Policy is a collection of bindings. A binding binds + one or more members to a single role. Members can be + user accounts, service accounts, Google groups, and + domains (such as G Suite). A role is a named list of + permissions (defined by IAM or configured by users). + A binding can optionally specify a condition, which + is a logic expression that further constrains the + role binding based on attributes about the request + and/or target resource. + + **JSON Example** + + { + "bindings": [ + { + "role": + "roles/resourcemanager.organizationAdmin", + "members": [ "user:mike@example.com", + "group:admins@example.com", + "domain:google.com", + "serviceAccount:my-project-id@appspot.gserviceaccount.com" + ] + + }, { "role": + "roles/resourcemanager.organizationViewer", + "members": ["user:eve@example.com"], + "condition": { "title": "expirable access", + "description": "Does not grant access after + Sep 2020", "expression": "request.time < + timestamp('2020-10-01T00:00:00.000Z')", } } + + ] + + } + + **YAML Example** + + bindings: - members: - user:\ mike@example.com - + group:\ admins@example.com - domain:google.com - + serviceAccount:\ my-project-id@appspot.gserviceaccount.com + role: roles/resourcemanager.organizationAdmin - + members: - user:\ eve@example.com role: + roles/resourcemanager.organizationViewer + condition: title: expirable access description: + Does not grant access after Sep 2020 expression: + request.time < + timestamp('2020-10-01T00:00:00.000Z') + + For a description of IAM and its features, see the + [IAM developer's + guide](\ https://cloud.google.com/iam/docs). """ # Create or coerce a protobuf request object. @@ -1060,7 +1058,7 @@ async def set_iam_policy( r"""Sets the IAM policy for a taxonomy or a policy tag. Args: - request (:class:`~.iam_policy.SetIamPolicyRequest`): + request (:class:`google.iam.v1.iam_policy_pb2.SetIamPolicyRequest`): The request object. Request message for `SetIamPolicy` method. @@ -1071,72 +1069,62 @@ async def set_iam_policy( sent along with the request as metadata. Returns: - ~.policy.Policy: - Defines an Identity and Access Management (IAM) policy. 
- It is used to specify access control policies for Cloud - Platform resources. - - A ``Policy`` is a collection of ``bindings``. A - ``binding`` binds one or more ``members`` to a single - ``role``. Members can be user accounts, service - accounts, Google groups, and domains (such as G Suite). - A ``role`` is a named list of permissions (defined by - IAM or configured by users). A ``binding`` can - optionally specify a ``condition``, which is a logic - expression that further constrains the role binding - based on attributes about the request and/or target - resource. - - **JSON Example** - - :: - - { - "bindings": [ - { - "role": "roles/resourcemanager.organizationAdmin", - "members": [ - "user:mike@example.com", - "group:admins@example.com", - "domain:google.com", - "serviceAccount:my-project-id@appspot.gserviceaccount.com" - ] - }, - { - "role": "roles/resourcemanager.organizationViewer", - "members": ["user:eve@example.com"], - "condition": { - "title": "expirable access", - "description": "Does not grant access after Sep 2020", - "expression": "request.time < - timestamp('2020-10-01T00:00:00.000Z')", - } - } - ] - } - - **YAML Example** - - :: - - bindings: - - members: - - user:mike@example.com - - group:admins@example.com - - domain:google.com - - serviceAccount:my-project-id@appspot.gserviceaccount.com - role: roles/resourcemanager.organizationAdmin - - members: - - user:eve@example.com - role: roles/resourcemanager.organizationViewer - condition: - title: expirable access - description: Does not grant access after Sep 2020 - expression: request.time < timestamp('2020-10-01T00:00:00.000Z') - - For a description of IAM and its features, see the `IAM - developer's - guide `__. + google.iam.v1.policy_pb2.Policy: + Defines an Identity and Access Management (IAM) policy. It is used to + specify access control policies for Cloud Platform + resources. + + A Policy is a collection of bindings. A binding binds + one or more members to a single role. Members can be + user accounts, service accounts, Google groups, and + domains (such as G Suite). A role is a named list of + permissions (defined by IAM or configured by users). + A binding can optionally specify a condition, which + is a logic expression that further constrains the + role binding based on attributes about the request + and/or target resource. + + **JSON Example** + + { + "bindings": [ + { + "role": + "roles/resourcemanager.organizationAdmin", + "members": [ "user:mike@example.com", + "group:admins@example.com", + "domain:google.com", + "serviceAccount:my-project-id@appspot.gserviceaccount.com" + ] + + }, { "role": + "roles/resourcemanager.organizationViewer", + "members": ["user:eve@example.com"], + "condition": { "title": "expirable access", + "description": "Does not grant access after + Sep 2020", "expression": "request.time < + timestamp('2020-10-01T00:00:00.000Z')", } } + + ] + + } + + **YAML Example** + + bindings: - members: - user:\ mike@example.com - + group:\ admins@example.com - domain:google.com - + serviceAccount:\ my-project-id@appspot.gserviceaccount.com + role: roles/resourcemanager.organizationAdmin - + members: - user:\ eve@example.com role: + roles/resourcemanager.organizationViewer + condition: title: expirable access description: + Does not grant access after Sep 2020 expression: + request.time < + timestamp('2020-10-01T00:00:00.000Z') + + For a description of IAM and its features, see the + [IAM developer's + guide](\ https://cloud.google.com/iam/docs). 
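A minimal sketch of reading the policy described above for a taxonomy, using the synchronous client and a hypothetical taxonomy resource name; ``get_iam_policy`` takes only the ``request`` parameter, passed here as a dict:

```py
from google.cloud import datacatalog_v1beta1

client = datacatalog_v1beta1.PolicyTagManagerClient()

# Hypothetical taxonomy resource name.
taxonomy_name = "projects/my-project/locations/us/taxonomies/12345"

policy = client.get_iam_policy(request={"resource": taxonomy_name})
for binding in policy.bindings:
    print(binding.role, list(binding.members))
```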
""" # Create or coerce a protobuf request object. @@ -1178,7 +1166,7 @@ async def test_iam_permissions( specified taxonomy or policy tag. Args: - request (:class:`~.iam_policy.TestIamPermissionsRequest`): + request (:class:`google.iam.v1.iam_policy_pb2.TestIamPermissionsRequest`): The request object. Request message for `TestIamPermissions` method. @@ -1189,8 +1177,8 @@ async def test_iam_permissions( sent along with the request as metadata. Returns: - ~.iam_policy.TestIamPermissionsResponse: - Response message for ``TestIamPermissions`` method. + google.iam.v1.iam_policy_pb2.TestIamPermissionsResponse: + Response message for TestIamPermissions method. """ # Create or coerce a protobuf request object. diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/client.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/client.py index ffbb1f7f..1b88fc10 100644 --- a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/client.py +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/client.py @@ -114,6 +114,22 @@ def _get_default_mtls_endpoint(api_endpoint): DEFAULT_ENDPOINT ) + @classmethod + def from_service_account_info(cls, info: dict, *args, **kwargs): + """Creates an instance of this client using the provided credentials info. + + Args: + info (dict): The service account private key info. + args: Additional arguments to pass to the constructor. + kwargs: Additional arguments to pass to the constructor. + + Returns: + PolicyTagManagerClient: The constructed client. + """ + credentials = service_account.Credentials.from_service_account_info(info) + kwargs["credentials"] = credentials + return cls(*args, **kwargs) + @classmethod def from_service_account_file(cls, filename: str, *args, **kwargs): """Creates an instance of this client using the provided credentials @@ -126,7 +142,7 @@ def from_service_account_file(cls, filename: str, *args, **kwargs): kwargs: Additional arguments to pass to the constructor. Returns: - {@api.name}: The constructed client. + PolicyTagManagerClient: The constructed client. """ credentials = service_account.Credentials.from_service_account_file(filename) kwargs["credentials"] = credentials @@ -255,10 +271,10 @@ def __init__( credentials identify the application to the service; if none are specified, the client will attempt to ascertain the credentials from the environment. - transport (Union[str, ~.PolicyTagManagerTransport]): The + transport (Union[str, PolicyTagManagerTransport]): The transport to use. If set to None, a transport is chosen automatically. - client_options (client_options_lib.ClientOptions): Custom options for the + client_options (google.api_core.client_options.ClientOptions): Custom options for the client. It won't take effect if a ``transport`` instance is provided. (1) The ``api_endpoint`` property can be used to override the default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT @@ -369,17 +385,18 @@ def create_taxonomy( r"""Creates a taxonomy in the specified project. Args: - request (:class:`~.policytagmanager.CreateTaxonomyRequest`): + request (google.cloud.datacatalog_v1beta1.types.CreateTaxonomyRequest): The request object. Request message for [CreateTaxonomy][google.cloud.datacatalog.v1beta1.PolicyTagManager.CreateTaxonomy]. - parent (:class:`str`): + parent (str): Required. Resource name of the project that the taxonomy will belong to. + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. 
- taxonomy (:class:`~.policytagmanager.Taxonomy`): + taxonomy (google.cloud.datacatalog_v1beta1.types.Taxonomy): The taxonomy to be created. This corresponds to the ``taxonomy`` field on the ``request`` instance; if ``request`` is provided, this @@ -392,14 +409,13 @@ def create_taxonomy( sent along with the request as metadata. Returns: - ~.policytagmanager.Taxonomy: - A taxonomy is a collection of policy tags that classify - data along a common axis. For instance a data - *sensitivity* taxonomy could contain policy tags - denoting PII such as age, zipcode, and SSN. A data - *origin* taxonomy could contain policy tags to - distinguish user data, employee data, partner data, - public data. + google.cloud.datacatalog_v1beta1.types.Taxonomy: + A taxonomy is a collection of policy tags that classify data along a common + axis. For instance a data *sensitivity* taxonomy + could contain policy tags denoting PII such as age, + zipcode, and SSN. A data *origin* taxonomy could + contain policy tags to distinguish user data, + employee data, partner data, public data. """ # Create or coerce a protobuf request object. @@ -457,13 +473,14 @@ def delete_taxonomy( associated policies. Args: - request (:class:`~.policytagmanager.DeleteTaxonomyRequest`): + request (google.cloud.datacatalog_v1beta1.types.DeleteTaxonomyRequest): The request object. Request message for [DeleteTaxonomy][google.cloud.datacatalog.v1beta1.PolicyTagManager.DeleteTaxonomy]. - name (:class:`str`): + name (str): Required. Resource name of the taxonomy to be deleted. All policy tags in this taxonomy will also be deleted. + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -524,12 +541,13 @@ def update_taxonomy( r"""Updates a taxonomy. Args: - request (:class:`~.policytagmanager.UpdateTaxonomyRequest`): + request (google.cloud.datacatalog_v1beta1.types.UpdateTaxonomyRequest): The request object. Request message for [UpdateTaxonomy][google.cloud.datacatalog.v1beta1.PolicyTagManager.UpdateTaxonomy]. - taxonomy (:class:`~.policytagmanager.Taxonomy`): + taxonomy (google.cloud.datacatalog_v1beta1.types.Taxonomy): The taxonomy to update. Only description, display_name, and activated policy types can be updated. + This corresponds to the ``taxonomy`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -541,14 +559,13 @@ def update_taxonomy( sent along with the request as metadata. Returns: - ~.policytagmanager.Taxonomy: - A taxonomy is a collection of policy tags that classify - data along a common axis. For instance a data - *sensitivity* taxonomy could contain policy tags - denoting PII such as age, zipcode, and SSN. A data - *origin* taxonomy could contain policy tags to - distinguish user data, employee data, partner data, - public data. + google.cloud.datacatalog_v1beta1.types.Taxonomy: + A taxonomy is a collection of policy tags that classify data along a common + axis. For instance a data *sensitivity* taxonomy + could contain policy tags denoting PII such as age, + zipcode, and SSN. A data *origin* taxonomy could + contain policy tags to distinguish user data, + employee data, partner data, public data. """ # Create or coerce a protobuf request object. @@ -605,12 +622,13 @@ def list_taxonomies( location that the caller has permission to view. Args: - request (:class:`~.policytagmanager.ListTaxonomiesRequest`): + request (google.cloud.datacatalog_v1beta1.types.ListTaxonomiesRequest): The request object. 
Request message for [ListTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListTaxonomies]. - parent (:class:`str`): + parent (str): Required. Resource name of the project to list the taxonomies of. + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -622,9 +640,9 @@ def list_taxonomies( sent along with the request as metadata. Returns: - ~.pagers.ListTaxonomiesPager: + google.cloud.datacatalog_v1beta1.services.policy_tag_manager.pagers.ListTaxonomiesPager: Response message for - [ListTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListTaxonomies]. + [ListTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListTaxonomies]. Iterating over this object will yield results and resolve additional pages automatically. @@ -687,12 +705,13 @@ def get_taxonomy( r"""Gets a taxonomy. Args: - request (:class:`~.policytagmanager.GetTaxonomyRequest`): + request (google.cloud.datacatalog_v1beta1.types.GetTaxonomyRequest): The request object. Request message for [GetTaxonomy][google.cloud.datacatalog.v1beta1.PolicyTagManager.GetTaxonomy]. - name (:class:`str`): + name (str): Required. Resource name of the requested taxonomy. + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -704,14 +723,13 @@ def get_taxonomy( sent along with the request as metadata. Returns: - ~.policytagmanager.Taxonomy: - A taxonomy is a collection of policy tags that classify - data along a common axis. For instance a data - *sensitivity* taxonomy could contain policy tags - denoting PII such as age, zipcode, and SSN. A data - *origin* taxonomy could contain policy tags to - distinguish user data, employee data, partner data, - public data. + google.cloud.datacatalog_v1beta1.types.Taxonomy: + A taxonomy is a collection of policy tags that classify data along a common + axis. For instance a data *sensitivity* taxonomy + could contain policy tags denoting PII such as age, + zipcode, and SSN. A data *origin* taxonomy could + contain policy tags to distinguish user data, + employee data, partner data, public data. """ # Create or coerce a protobuf request object. @@ -766,17 +784,18 @@ def create_policy_tag( r"""Creates a policy tag in the specified taxonomy. Args: - request (:class:`~.policytagmanager.CreatePolicyTagRequest`): + request (google.cloud.datacatalog_v1beta1.types.CreatePolicyTagRequest): The request object. Request message for [CreatePolicyTag][google.cloud.datacatalog.v1beta1.PolicyTagManager.CreatePolicyTag]. - parent (:class:`str`): + parent (str): Required. Resource name of the taxonomy that the policy tag will belong to. + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. - policy_tag (:class:`~.policytagmanager.PolicyTag`): + policy_tag (google.cloud.datacatalog_v1beta1.types.PolicyTag): The policy tag to be created. This corresponds to the ``policy_tag`` field on the ``request`` instance; if ``request`` is provided, this @@ -789,7 +808,7 @@ def create_policy_tag( sent along with the request as metadata. Returns: - ~.policytagmanager.PolicyTag: + google.cloud.datacatalog_v1beta1.types.PolicyTag: Denotes one policy tag in a taxonomy (e.g. ssn). Policy Tags can be defined in a hierarchy. For example, consider @@ -854,13 +873,14 @@ def delete_policy_tag( descendant policy tags. 
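``create_taxonomy`` and ``create_policy_tag`` keep ``parent`` plus the resource message as flattened keyword parameters; a minimal sketch chaining the two, with hypothetical project and location values:

```py
from google.cloud import datacatalog_v1beta1

client = datacatalog_v1beta1.PolicyTagManagerClient()

parent = "projects/my-project/locations/us"  # hypothetical project/location

taxonomy = client.create_taxonomy(
    parent=parent,
    taxonomy=datacatalog_v1beta1.types.Taxonomy(display_name="Data sensitivity"),
)

policy_tag = client.create_policy_tag(
    parent=taxonomy.name,
    policy_tag=datacatalog_v1beta1.types.PolicyTag(display_name="PII"),
)
print(policy_tag.name)
```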
Args: - request (:class:`~.policytagmanager.DeletePolicyTagRequest`): + request (google.cloud.datacatalog_v1beta1.types.DeletePolicyTagRequest): The request object. Request message for [DeletePolicyTag][google.cloud.datacatalog.v1beta1.PolicyTagManager.DeletePolicyTag]. - name (:class:`str`): + name (str): Required. Resource name of the policy tag to be deleted. All of its descendant policy tags will also be deleted. + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -921,13 +941,14 @@ def update_policy_tag( r"""Updates a policy tag. Args: - request (:class:`~.policytagmanager.UpdatePolicyTagRequest`): + request (google.cloud.datacatalog_v1beta1.types.UpdatePolicyTagRequest): The request object. Request message for [UpdatePolicyTag][google.cloud.datacatalog.v1beta1.PolicyTagManager.UpdatePolicyTag]. - policy_tag (:class:`~.policytagmanager.PolicyTag`): + policy_tag (google.cloud.datacatalog_v1beta1.types.PolicyTag): The policy tag to update. Only the description, display_name, and parent_policy_tag fields can be updated. + This corresponds to the ``policy_tag`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -939,7 +960,7 @@ def update_policy_tag( sent along with the request as metadata. Returns: - ~.policytagmanager.PolicyTag: + google.cloud.datacatalog_v1beta1.types.PolicyTag: Denotes one policy tag in a taxonomy (e.g. ssn). Policy Tags can be defined in a hierarchy. For example, consider @@ -1003,12 +1024,13 @@ def list_policy_tags( r"""Lists all policy tags in a taxonomy. Args: - request (:class:`~.policytagmanager.ListPolicyTagsRequest`): + request (google.cloud.datacatalog_v1beta1.types.ListPolicyTagsRequest): The request object. Request message for [ListPolicyTags][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListPolicyTags]. - parent (:class:`str`): + parent (str): Required. Resource name of the taxonomy to list the policy tags of. + This corresponds to the ``parent`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1020,9 +1042,9 @@ def list_policy_tags( sent along with the request as metadata. Returns: - ~.pagers.ListPolicyTagsPager: + google.cloud.datacatalog_v1beta1.services.policy_tag_manager.pagers.ListPolicyTagsPager: Response message for - [ListPolicyTags][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListPolicyTags]. + [ListPolicyTags][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListPolicyTags]. Iterating over this object will yield results and resolve additional pages automatically. @@ -1085,12 +1107,13 @@ def get_policy_tag( r"""Gets a policy tag. Args: - request (:class:`~.policytagmanager.GetPolicyTagRequest`): + request (google.cloud.datacatalog_v1beta1.types.GetPolicyTagRequest): The request object. Request message for [GetPolicyTag][google.cloud.datacatalog.v1beta1.PolicyTagManager.GetPolicyTag]. - name (:class:`str`): + name (str): Required. Resource name of the requested policy tag. + This corresponds to the ``name`` field on the ``request`` instance; if ``request`` is provided, this should not be set. @@ -1102,7 +1125,7 @@ def get_policy_tag( sent along with the request as metadata. Returns: - ~.policytagmanager.PolicyTag: + google.cloud.datacatalog_v1beta1.types.PolicyTag: Denotes one policy tag in a taxonomy (e.g. ssn). Policy Tags can be defined in a hierarchy. For example, consider @@ -1163,7 +1186,7 @@ def get_iam_policy( r"""Gets the IAM policy for a taxonomy or a policy tag. 
Args: - request (:class:`~.iam_policy.GetIamPolicyRequest`): + request (google.iam.v1.iam_policy_pb2.GetIamPolicyRequest): The request object. Request message for `GetIamPolicy` method. @@ -1174,72 +1197,62 @@ def get_iam_policy( sent along with the request as metadata. Returns: - ~.policy.Policy: - Defines an Identity and Access Management (IAM) policy. - It is used to specify access control policies for Cloud - Platform resources. - - A ``Policy`` is a collection of ``bindings``. A - ``binding`` binds one or more ``members`` to a single - ``role``. Members can be user accounts, service - accounts, Google groups, and domains (such as G Suite). - A ``role`` is a named list of permissions (defined by - IAM or configured by users). A ``binding`` can - optionally specify a ``condition``, which is a logic - expression that further constrains the role binding - based on attributes about the request and/or target - resource. - - **JSON Example** - - :: - - { - "bindings": [ - { - "role": "roles/resourcemanager.organizationAdmin", - "members": [ - "user:mike@example.com", - "group:admins@example.com", - "domain:google.com", - "serviceAccount:my-project-id@appspot.gserviceaccount.com" - ] - }, - { - "role": "roles/resourcemanager.organizationViewer", - "members": ["user:eve@example.com"], - "condition": { - "title": "expirable access", - "description": "Does not grant access after Sep 2020", - "expression": "request.time < - timestamp('2020-10-01T00:00:00.000Z')", - } - } - ] - } - - **YAML Example** - - :: - - bindings: - - members: - - user:mike@example.com - - group:admins@example.com - - domain:google.com - - serviceAccount:my-project-id@appspot.gserviceaccount.com - role: roles/resourcemanager.organizationAdmin - - members: - - user:eve@example.com - role: roles/resourcemanager.organizationViewer - condition: - title: expirable access - description: Does not grant access after Sep 2020 - expression: request.time < timestamp('2020-10-01T00:00:00.000Z') - - For a description of IAM and its features, see the `IAM - developer's - guide `__. + google.iam.v1.policy_pb2.Policy: + Defines an Identity and Access Management (IAM) policy. It is used to + specify access control policies for Cloud Platform + resources. + + A Policy is a collection of bindings. A binding binds + one or more members to a single role. Members can be + user accounts, service accounts, Google groups, and + domains (such as G Suite). A role is a named list of + permissions (defined by IAM or configured by users). + A binding can optionally specify a condition, which + is a logic expression that further constrains the + role binding based on attributes about the request + and/or target resource. 
+ + **JSON Example** + + { + "bindings": [ + { + "role": + "roles/resourcemanager.organizationAdmin", + "members": [ "user:mike@example.com", + "group:admins@example.com", + "domain:google.com", + "serviceAccount:my-project-id@appspot.gserviceaccount.com" + ] + + }, { "role": + "roles/resourcemanager.organizationViewer", + "members": ["user:eve@example.com"], + "condition": { "title": "expirable access", + "description": "Does not grant access after + Sep 2020", "expression": "request.time < + timestamp('2020-10-01T00:00:00.000Z')", } } + + ] + + } + + **YAML Example** + + bindings: - members: - user:\ mike@example.com - + group:\ admins@example.com - domain:google.com - + serviceAccount:\ my-project-id@appspot.gserviceaccount.com + role: roles/resourcemanager.organizationAdmin - + members: - user:\ eve@example.com role: + roles/resourcemanager.organizationViewer + condition: title: expirable access description: + Does not grant access after Sep 2020 expression: + request.time < + timestamp('2020-10-01T00:00:00.000Z') + + For a description of IAM and its features, see the + [IAM developer's + guide](\ https://cloud.google.com/iam/docs). """ # Create or coerce a protobuf request object. @@ -1276,7 +1289,7 @@ def set_iam_policy( r"""Sets the IAM policy for a taxonomy or a policy tag. Args: - request (:class:`~.iam_policy.SetIamPolicyRequest`): + request (google.iam.v1.iam_policy_pb2.SetIamPolicyRequest): The request object. Request message for `SetIamPolicy` method. @@ -1287,72 +1300,62 @@ def set_iam_policy( sent along with the request as metadata. Returns: - ~.policy.Policy: - Defines an Identity and Access Management (IAM) policy. - It is used to specify access control policies for Cloud - Platform resources. - - A ``Policy`` is a collection of ``bindings``. A - ``binding`` binds one or more ``members`` to a single - ``role``. Members can be user accounts, service - accounts, Google groups, and domains (such as G Suite). - A ``role`` is a named list of permissions (defined by - IAM or configured by users). A ``binding`` can - optionally specify a ``condition``, which is a logic - expression that further constrains the role binding - based on attributes about the request and/or target - resource. - - **JSON Example** - - :: - - { - "bindings": [ - { - "role": "roles/resourcemanager.organizationAdmin", - "members": [ - "user:mike@example.com", - "group:admins@example.com", - "domain:google.com", - "serviceAccount:my-project-id@appspot.gserviceaccount.com" - ] - }, - { - "role": "roles/resourcemanager.organizationViewer", - "members": ["user:eve@example.com"], - "condition": { - "title": "expirable access", - "description": "Does not grant access after Sep 2020", - "expression": "request.time < - timestamp('2020-10-01T00:00:00.000Z')", - } - } - ] - } - - **YAML Example** - - :: - - bindings: - - members: - - user:mike@example.com - - group:admins@example.com - - domain:google.com - - serviceAccount:my-project-id@appspot.gserviceaccount.com - role: roles/resourcemanager.organizationAdmin - - members: - - user:eve@example.com - role: roles/resourcemanager.organizationViewer - condition: - title: expirable access - description: Does not grant access after Sep 2020 - expression: request.time < timestamp('2020-10-01T00:00:00.000Z') - - For a description of IAM and its features, see the `IAM - developer's - guide `__. + google.iam.v1.policy_pb2.Policy: + Defines an Identity and Access Management (IAM) policy. 
It is used to + specify access control policies for Cloud Platform + resources. + + A Policy is a collection of bindings. A binding binds + one or more members to a single role. Members can be + user accounts, service accounts, Google groups, and + domains (such as G Suite). A role is a named list of + permissions (defined by IAM or configured by users). + A binding can optionally specify a condition, which + is a logic expression that further constrains the + role binding based on attributes about the request + and/or target resource. + + **JSON Example** + + { + "bindings": [ + { + "role": + "roles/resourcemanager.organizationAdmin", + "members": [ "user:mike@example.com", + "group:admins@example.com", + "domain:google.com", + "serviceAccount:my-project-id@appspot.gserviceaccount.com" + ] + + }, { "role": + "roles/resourcemanager.organizationViewer", + "members": ["user:eve@example.com"], + "condition": { "title": "expirable access", + "description": "Does not grant access after + Sep 2020", "expression": "request.time < + timestamp('2020-10-01T00:00:00.000Z')", } } + + ] + + } + + **YAML Example** + + bindings: - members: - user:\ mike@example.com - + group:\ admins@example.com - domain:google.com - + serviceAccount:\ my-project-id@appspot.gserviceaccount.com + role: roles/resourcemanager.organizationAdmin - + members: - user:\ eve@example.com role: + roles/resourcemanager.organizationViewer + condition: title: expirable access description: + Does not grant access after Sep 2020 expression: + request.time < + timestamp('2020-10-01T00:00:00.000Z') + + For a description of IAM and its features, see the + [IAM developer's + guide](\ https://cloud.google.com/iam/docs). """ # Create or coerce a protobuf request object. @@ -1390,7 +1393,7 @@ def test_iam_permissions( specified taxonomy or policy tag. Args: - request (:class:`~.iam_policy.TestIamPermissionsRequest`): + request (google.iam.v1.iam_policy_pb2.TestIamPermissionsRequest): The request object. Request message for `TestIamPermissions` method. @@ -1401,8 +1404,8 @@ def test_iam_permissions( sent along with the request as metadata. Returns: - ~.iam_policy.TestIamPermissionsResponse: - Response message for ``TestIamPermissions`` method. + google.iam.v1.iam_policy_pb2.TestIamPermissionsResponse: + Response message for TestIamPermissions method. """ # Create or coerce a protobuf request object. diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/pagers.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/pagers.py index 4dd9013d..c216e352 100644 --- a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/pagers.py +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/pagers.py @@ -24,7 +24,7 @@ class ListTaxonomiesPager: """A pager for iterating through ``list_taxonomies`` requests. This class thinly wraps an initial - :class:`~.policytagmanager.ListTaxonomiesResponse` object, and + :class:`google.cloud.datacatalog_v1beta1.types.ListTaxonomiesResponse` object, and provides an ``__iter__`` method to iterate through its ``taxonomies`` field. @@ -33,7 +33,7 @@ class ListTaxonomiesPager: through the ``taxonomies`` field on the corresponding responses. - All the usual :class:`~.policytagmanager.ListTaxonomiesResponse` + All the usual :class:`google.cloud.datacatalog_v1beta1.types.ListTaxonomiesResponse` attributes are available on the pager. 
If multiple requests are made, only the most recent response is retained, and thus used for attribute lookup. """ @@ -51,9 +51,9 @@ def __init__( Args: method (Callable): The method that was originally called, and which instantiated this pager. - request (:class:`~.policytagmanager.ListTaxonomiesRequest`): + request (google.cloud.datacatalog_v1beta1.types.ListTaxonomiesRequest): The initial request object. - response (:class:`~.policytagmanager.ListTaxonomiesResponse`): + response (google.cloud.datacatalog_v1beta1.types.ListTaxonomiesResponse): The initial response object. metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. @@ -86,7 +86,7 @@ class ListTaxonomiesAsyncPager: """A pager for iterating through ``list_taxonomies`` requests. This class thinly wraps an initial - :class:`~.policytagmanager.ListTaxonomiesResponse` object, and + :class:`google.cloud.datacatalog_v1beta1.types.ListTaxonomiesResponse` object, and provides an ``__aiter__`` method to iterate through its ``taxonomies`` field. @@ -95,7 +95,7 @@ class ListTaxonomiesAsyncPager: through the ``taxonomies`` field on the corresponding responses. - All the usual :class:`~.policytagmanager.ListTaxonomiesResponse` + All the usual :class:`google.cloud.datacatalog_v1beta1.types.ListTaxonomiesResponse` attributes are available on the pager. If multiple requests are made, only the most recent response is retained, and thus used for attribute lookup. """ @@ -113,9 +113,9 @@ def __init__( Args: method (Callable): The method that was originally called, and which instantiated this pager. - request (:class:`~.policytagmanager.ListTaxonomiesRequest`): + request (google.cloud.datacatalog_v1beta1.types.ListTaxonomiesRequest): The initial request object. - response (:class:`~.policytagmanager.ListTaxonomiesResponse`): + response (google.cloud.datacatalog_v1beta1.types.ListTaxonomiesResponse): The initial response object. metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. @@ -152,7 +152,7 @@ class ListPolicyTagsPager: """A pager for iterating through ``list_policy_tags`` requests. This class thinly wraps an initial - :class:`~.policytagmanager.ListPolicyTagsResponse` object, and + :class:`google.cloud.datacatalog_v1beta1.types.ListPolicyTagsResponse` object, and provides an ``__iter__`` method to iterate through its ``policy_tags`` field. @@ -161,7 +161,7 @@ class ListPolicyTagsPager: through the ``policy_tags`` field on the corresponding responses. - All the usual :class:`~.policytagmanager.ListPolicyTagsResponse` + All the usual :class:`google.cloud.datacatalog_v1beta1.types.ListPolicyTagsResponse` attributes are available on the pager. If multiple requests are made, only the most recent response is retained, and thus used for attribute lookup. """ @@ -179,9 +179,9 @@ def __init__( Args: method (Callable): The method that was originally called, and which instantiated this pager. - request (:class:`~.policytagmanager.ListPolicyTagsRequest`): + request (google.cloud.datacatalog_v1beta1.types.ListPolicyTagsRequest): The initial request object. - response (:class:`~.policytagmanager.ListPolicyTagsResponse`): + response (google.cloud.datacatalog_v1beta1.types.ListPolicyTagsResponse): The initial response object. metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. @@ -214,7 +214,7 @@ class ListPolicyTagsAsyncPager: """A pager for iterating through ``list_policy_tags`` requests. 
This class thinly wraps an initial - :class:`~.policytagmanager.ListPolicyTagsResponse` object, and + :class:`google.cloud.datacatalog_v1beta1.types.ListPolicyTagsResponse` object, and provides an ``__aiter__`` method to iterate through its ``policy_tags`` field. @@ -223,7 +223,7 @@ class ListPolicyTagsAsyncPager: through the ``policy_tags`` field on the corresponding responses. - All the usual :class:`~.policytagmanager.ListPolicyTagsResponse` + All the usual :class:`google.cloud.datacatalog_v1beta1.types.ListPolicyTagsResponse` attributes are available on the pager. If multiple requests are made, only the most recent response is retained, and thus used for attribute lookup. """ @@ -241,9 +241,9 @@ def __init__( Args: method (Callable): The method that was originally called, and which instantiated this pager. - request (:class:`~.policytagmanager.ListPolicyTagsRequest`): + request (google.cloud.datacatalog_v1beta1.types.ListPolicyTagsRequest): The initial request object. - response (:class:`~.policytagmanager.ListPolicyTagsResponse`): + response (google.cloud.datacatalog_v1beta1.types.ListPolicyTagsResponse): The initial response object. metadata (Sequence[Tuple[str, str]]): Strings which should be sent along with the request as metadata. diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/async_client.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/async_client.py index cfbd3082..36c2a489 100644 --- a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/async_client.py +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/async_client.py @@ -87,6 +87,9 @@ class PolicyTagManagerSerializationAsyncClient: PolicyTagManagerSerializationClient.parse_common_location_path ) + from_service_account_info = ( + PolicyTagManagerSerializationClient.from_service_account_info + ) from_service_account_file = ( PolicyTagManagerSerializationClient.from_service_account_file ) @@ -169,7 +172,7 @@ async def import_taxonomies( creation using nested proto structure. Args: - request (:class:`~.policytagmanagerserialization.ImportTaxonomiesRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.ImportTaxonomiesRequest`): The request object. Request message for [ImportTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization.ImportTaxonomies]. @@ -180,9 +183,9 @@ async def import_taxonomies( sent along with the request as metadata. Returns: - ~.policytagmanagerserialization.ImportTaxonomiesResponse: + google.cloud.datacatalog_v1beta1.types.ImportTaxonomiesResponse: Response message for - [ImportTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization.ImportTaxonomies]. + [ImportTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization.ImportTaxonomies]. """ # Create or coerce a protobuf request object. @@ -224,7 +227,7 @@ async def export_taxonomies( future ImportTaxonomies calls. Args: - request (:class:`~.policytagmanagerserialization.ExportTaxonomiesRequest`): + request (:class:`google.cloud.datacatalog_v1beta1.types.ExportTaxonomiesRequest`): The request object. Request message for [ExportTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization.ExportTaxonomies]. @@ -235,9 +238,9 @@ async def export_taxonomies( sent along with the request as metadata. 
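A minimal sketch of ``export_taxonomies`` with the synchronous client; the resource names are hypothetical, and ``serialized_taxonomies`` is assumed to be the request field selecting the inline export format:

```py
from google.cloud import datacatalog_v1beta1

client = datacatalog_v1beta1.PolicyTagManagerSerializationClient()

response = client.export_taxonomies(
    request={
        "parent": "projects/my-project/locations/us",             # hypothetical
        "taxonomies": [
            "projects/my-project/locations/us/taxonomies/12345"   # hypothetical
        ],
        "serialized_taxonomies": True,  # assumed field name (inline export)
    }
)

for taxonomy in response.taxonomies:
    print(taxonomy.display_name)
```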
Returns: - ~.policytagmanagerserialization.ExportTaxonomiesResponse: + google.cloud.datacatalog_v1beta1.types.ExportTaxonomiesResponse: Response message for - [ExportTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization.ExportTaxonomies]. + [ExportTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization.ExportTaxonomies]. """ # Create or coerce a protobuf request object. diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/client.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/client.py index 65a709e3..739c3020 100644 --- a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/client.py +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/client.py @@ -119,6 +119,22 @@ def _get_default_mtls_endpoint(api_endpoint): DEFAULT_ENDPOINT ) + @classmethod + def from_service_account_info(cls, info: dict, *args, **kwargs): + """Creates an instance of this client using the provided credentials info. + + Args: + info (dict): The service account private key info. + args: Additional arguments to pass to the constructor. + kwargs: Additional arguments to pass to the constructor. + + Returns: + PolicyTagManagerSerializationClient: The constructed client. + """ + credentials = service_account.Credentials.from_service_account_info(info) + kwargs["credentials"] = credentials + return cls(*args, **kwargs) + @classmethod def from_service_account_file(cls, filename: str, *args, **kwargs): """Creates an instance of this client using the provided credentials @@ -131,7 +147,7 @@ def from_service_account_file(cls, filename: str, *args, **kwargs): kwargs: Additional arguments to pass to the constructor. Returns: - {@api.name}: The constructed client. + PolicyTagManagerSerializationClient: The constructed client. """ credentials = service_account.Credentials.from_service_account_file(filename) kwargs["credentials"] = credentials @@ -239,10 +255,10 @@ def __init__( credentials identify the application to the service; if none are specified, the client will attempt to ascertain the credentials from the environment. - transport (Union[str, ~.PolicyTagManagerSerializationTransport]): The + transport (Union[str, PolicyTagManagerSerializationTransport]): The transport to use. If set to None, a transport is chosen automatically. - client_options (client_options_lib.ClientOptions): Custom options for the + client_options (google.api_core.client_options.ClientOptions): Custom options for the client. It won't take effect if a ``transport`` instance is provided. (1) The ``api_endpoint`` property can be used to override the default endpoint provided by the client. GOOGLE_API_USE_MTLS_ENDPOINT @@ -355,7 +371,7 @@ def import_taxonomies( creation using nested proto structure. Args: - request (:class:`~.policytagmanagerserialization.ImportTaxonomiesRequest`): + request (google.cloud.datacatalog_v1beta1.types.ImportTaxonomiesRequest): The request object. Request message for [ImportTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization.ImportTaxonomies]. @@ -366,9 +382,9 @@ def import_taxonomies( sent along with the request as metadata. Returns: - ~.policytagmanagerserialization.ImportTaxonomiesResponse: + google.cloud.datacatalog_v1beta1.types.ImportTaxonomiesResponse: Response message for - [ImportTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization.ImportTaxonomies]. 
+ [ImportTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization.ImportTaxonomies]. """ # Create or coerce a protobuf request object. @@ -413,7 +429,7 @@ def export_taxonomies( future ImportTaxonomies calls. Args: - request (:class:`~.policytagmanagerserialization.ExportTaxonomiesRequest`): + request (google.cloud.datacatalog_v1beta1.types.ExportTaxonomiesRequest): The request object. Request message for [ExportTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization.ExportTaxonomies]. @@ -424,9 +440,9 @@ def export_taxonomies( sent along with the request as metadata. Returns: - ~.policytagmanagerserialization.ExportTaxonomiesResponse: + google.cloud.datacatalog_v1beta1.types.ExportTaxonomiesResponse: Response message for - [ExportTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization.ExportTaxonomies]. + [ExportTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization.ExportTaxonomies]. """ # Create or coerce a protobuf request object. diff --git a/google/cloud/datacatalog_v1beta1/types/datacatalog.py b/google/cloud/datacatalog_v1beta1/types/datacatalog.py index ee843cac..f12ca6e6 100644 --- a/google/cloud/datacatalog_v1beta1/types/datacatalog.py +++ b/google/cloud/datacatalog_v1beta1/types/datacatalog.py @@ -85,7 +85,7 @@ class SearchCatalogRequest(proto.Message): [SearchCatalog][google.cloud.datacatalog.v1beta1.DataCatalog.SearchCatalog]. Attributes: - scope (~.datacatalog.SearchCatalogRequest.Scope): + scope (google.cloud.datacatalog_v1beta1.types.SearchCatalogRequest.Scope): Required. The scope of this search request. A ``scope`` that has empty ``include_org_ids``, ``include_project_ids`` AND false ``include_gcp_public_datasets`` is considered invalid. @@ -172,7 +172,7 @@ class SearchCatalogResponse(proto.Message): [SearchCatalog][google.cloud.datacatalog.v1beta1.DataCatalog.SearchCatalog]. Attributes: - results (Sequence[~.search.SearchCatalogResult]): + results (Sequence[google.cloud.datacatalog_v1beta1.types.SearchCatalogResult]): Search results. next_page_token (str): The token that can be used to retrieve the @@ -209,7 +209,7 @@ class CreateEntryGroupRequest(proto.Message): underscore, contain only English letters, numbers and underscores, and be at most 64 characters. - entry_group (~.datacatalog.EntryGroup): + entry_group (google.cloud.datacatalog_v1beta1.types.EntryGroup): The entry group to create. Defaults to an empty entry group. """ @@ -226,10 +226,10 @@ class UpdateEntryGroupRequest(proto.Message): [UpdateEntryGroup][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateEntryGroup]. Attributes: - entry_group (~.datacatalog.EntryGroup): + entry_group (google.cloud.datacatalog_v1beta1.types.EntryGroup): Required. The updated entry group. "name" field must be set. - update_mask (~.field_mask.FieldMask): + update_mask (google.protobuf.field_mask_pb2.FieldMask): The fields to update on the entry group. If absent or empty, all modifiable fields are updated. @@ -248,7 +248,7 @@ class GetEntryGroupRequest(proto.Message): name (str): Required. The name of the entry group. For example, ``projects/{project_id}/locations/{location}/entryGroups/{entry_group_id}``. - read_mask (~.field_mask.FieldMask): + read_mask (google.protobuf.field_mask_pb2.FieldMask): The fields to return. If not set or empty, all fields are returned. """ @@ -307,7 +307,7 @@ class ListEntryGroupsResponse(proto.Message): [ListEntryGroups][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntryGroups]. 
Attributes: - entry_groups (Sequence[~.datacatalog.EntryGroup]): + entry_groups (Sequence[google.cloud.datacatalog_v1beta1.types.EntryGroup]): EntryGroup details. next_page_token (str): Token to retrieve the next page of results. @@ -339,7 +339,7 @@ class CreateEntryRequest(proto.Message): actually be stored in the location in this name. entry_id (str): Required. The id of the entry to create. - entry (~.datacatalog.Entry): + entry (google.cloud.datacatalog_v1beta1.types.Entry): Required. The entry to create. """ @@ -355,10 +355,10 @@ class UpdateEntryRequest(proto.Message): [UpdateEntry][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateEntry]. Attributes: - entry (~.datacatalog.Entry): + entry (google.cloud.datacatalog_v1beta1.types.Entry): Required. The updated entry. The "name" field must be set. - update_mask (~.field_mask.FieldMask): + update_mask (google.protobuf.field_mask_pb2.FieldMask): The fields to update on the entry. If absent or empty, all modifiable fields are updated. @@ -490,7 +490,7 @@ class Entry(proto.Message): Output only when Entry is of type in the EntryType enum. For entries with user_specified_type, this field is optional and defaults to an empty string. - type_ (~.datacatalog.EntryType): + type_ (google.cloud.datacatalog_v1beta1.types.EntryType): The type of the entry. Only used for Entries with types in the EntryType enum. @@ -508,7 +508,7 @@ class Entry(proto.Message): Currently, only FILESET enum value is allowed. All other entries created through Data Catalog must use ``user_specified_type``. - integrated_system (~.common.IntegratedSystem): + integrated_system (google.cloud.datacatalog_v1beta1.types.IntegratedSystem): Output only. This field indicates the entry's source system that Data Catalog integrates with, such as BigQuery or Pub/Sub. @@ -519,14 +519,14 @@ class Entry(proto.Message): contain letters, numbers, and underscores; are case insensitive; must be at least 1 character and at most 64 characters long. - gcs_fileset_spec (~.gcd_gcs_fileset_spec.GcsFilesetSpec): + gcs_fileset_spec (google.cloud.datacatalog_v1beta1.types.GcsFilesetSpec): Specification that applies to a Cloud Storage fileset. This is only valid on entries of type FILESET. - bigquery_table_spec (~.table_spec.BigQueryTableSpec): + bigquery_table_spec (google.cloud.datacatalog_v1beta1.types.BigQueryTableSpec): Specification that applies to a BigQuery table. This is only valid on entries of type ``TABLE``. - bigquery_date_sharded_spec (~.table_spec.BigQueryDateShardedSpec): + bigquery_date_sharded_spec (google.cloud.datacatalog_v1beta1.types.BigQueryDateShardedSpec): Specification for a group of BigQuery tables with name pattern ``[prefix]YYYYMMDD``. Context: https://cloud.google.com/bigquery/docs/partitioned-tables#partitioning_versus_sharding. @@ -540,10 +540,10 @@ class Entry(proto.Message): several sentences or paragraphs that describe entry contents. Default value is an empty string. - schema (~.gcd_schema.Schema): + schema (google.cloud.datacatalog_v1beta1.types.Schema): Schema of the entry. An entry might not have any schema attached to it. - source_system_timestamps (~.timestamps.SystemTimestamps): + source_system_timestamps (google.cloud.datacatalog_v1beta1.types.SystemTimestamps): Output only. Timestamps about the underlying resource, not about this Data Catalog entry. Output only when Entry is of type in the EntryType enum. 
For entries with @@ -619,7 +619,7 @@ class EntryGroup(proto.Message): several sentences or paragraphs that describe entry group contents. Default value is an empty string. - data_catalog_timestamps (~.timestamps.SystemTimestamps): + data_catalog_timestamps (google.cloud.datacatalog_v1beta1.types.SystemTimestamps): Output only. Timestamps about this EntryGroup. Default value is empty timestamps. """ @@ -650,7 +650,7 @@ class CreateTagTemplateRequest(proto.Message): tag_template_id (str): Required. The id of the tag template to create. - tag_template (~.gcd_tags.TagTemplate): + tag_template (google.cloud.datacatalog_v1beta1.types.TagTemplate): Required. The tag template to create. """ @@ -680,10 +680,10 @@ class UpdateTagTemplateRequest(proto.Message): [UpdateTagTemplate][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateTagTemplate]. Attributes: - tag_template (~.gcd_tags.TagTemplate): + tag_template (google.cloud.datacatalog_v1beta1.types.TagTemplate): Required. The template to update. The "name" field must be set. - update_mask (~.field_mask.FieldMask): + update_mask (google.protobuf.field_mask_pb2.FieldMask): The field mask specifies the parts of the template to overwrite. @@ -734,7 +734,7 @@ class CreateTagRequest(proto.Message): Note that this Tag and its child resources may not actually be stored in the location in this name. - tag (~.gcd_tags.Tag): + tag (google.cloud.datacatalog_v1beta1.types.Tag): Required. The tag to create. """ @@ -748,10 +748,10 @@ class UpdateTagRequest(proto.Message): [UpdateTag][google.cloud.datacatalog.v1beta1.DataCatalog.UpdateTag]. Attributes: - tag (~.gcd_tags.Tag): + tag (google.cloud.datacatalog_v1beta1.types.Tag): Required. The updated tag. The "name" field must be set. - update_mask (~.field_mask.FieldMask): + update_mask (google.protobuf.field_mask_pb2.FieldMask): The fields to update on the Tag. If absent or empty, all modifiable fields are updated. Currently the only modifiable field is the field ``fields``. @@ -794,7 +794,7 @@ class CreateTagTemplateFieldRequest(proto.Message): numbers (0-9), underscores (_) and dashes (-). Field IDs must be at least 1 character long and at most 128 characters long. Field IDs must also be unique within their template. - tag_template_field (~.gcd_tags.TagTemplateField): + tag_template_field (google.cloud.datacatalog_v1beta1.types.TagTemplateField): Required. The tag template field to create. """ @@ -816,9 +816,9 @@ class UpdateTagTemplateFieldRequest(proto.Message): Required. The name of the tag template field. Example: - projects/{project_id}/locations/{location}/tagTemplates/{tag_template_id}/fields/{tag_template_field_id} - tag_template_field (~.gcd_tags.TagTemplateField): + tag_template_field (google.cloud.datacatalog_v1beta1.types.TagTemplateField): Required. The template to update. - update_mask (~.field_mask.FieldMask): + update_mask (google.protobuf.field_mask_pb2.FieldMask): Optional. The field mask specifies the parts of the template to be updated. Allowed fields: @@ -921,7 +921,7 @@ class ListTagsResponse(proto.Message): [ListTags][google.cloud.datacatalog.v1beta1.DataCatalog.ListTags]. Attributes: - tags (Sequence[~.gcd_tags.Tag]): + tags (Sequence[google.cloud.datacatalog_v1beta1.types.Tag]): [Tag][google.cloud.datacatalog.v1beta1.Tag] details. next_page_token (str): Token to retrieve the next page of results. @@ -955,7 +955,7 @@ class ListEntriesRequest(proto.Message): page_token (str): Token that specifies which page is requested. If empty, the first page is returned. 
- read_mask (~.field_mask.FieldMask): + read_mask (google.protobuf.field_mask_pb2.FieldMask): The fields to return for each Entry. If not set or empty, all fields are returned. For example, setting read_mask to contain only one path "name" will cause ListEntries to @@ -976,7 +976,7 @@ class ListEntriesResponse(proto.Message): [ListEntries][google.cloud.datacatalog.v1beta1.DataCatalog.ListEntries]. Attributes: - entries (Sequence[~.datacatalog.Entry]): + entries (Sequence[google.cloud.datacatalog_v1beta1.types.Entry]): Entry details. next_page_token (str): Token to retrieve the next page of results. diff --git a/google/cloud/datacatalog_v1beta1/types/gcs_fileset_spec.py b/google/cloud/datacatalog_v1beta1/types/gcs_fileset_spec.py index cc52615b..68826009 100644 --- a/google/cloud/datacatalog_v1beta1/types/gcs_fileset_spec.py +++ b/google/cloud/datacatalog_v1beta1/types/gcs_fileset_spec.py @@ -64,7 +64,7 @@ class GcsFilesetSpec(proto.Message): for example: - ``gs://bucket_name/[a-m]??.j*g`` - sample_gcs_file_specs (Sequence[~.gcs_fileset_spec.GcsFileSpec]): + sample_gcs_file_specs (Sequence[google.cloud.datacatalog_v1beta1.types.GcsFileSpec]): Output only. Sample files contained in this fileset, not all files contained in this fileset are represented here. @@ -84,7 +84,7 @@ class GcsFileSpec(proto.Message): file_path (str): Required. The full file path. Example: ``gs://bucket_name/a/b.txt``. - gcs_timestamps (~.timestamps.SystemTimestamps): + gcs_timestamps (google.cloud.datacatalog_v1beta1.types.SystemTimestamps): Output only. Timestamps about the Cloud Storage file. size_bytes (int): diff --git a/google/cloud/datacatalog_v1beta1/types/policytagmanager.py b/google/cloud/datacatalog_v1beta1/types/policytagmanager.py index ad1694c3..f3478c90 100644 --- a/google/cloud/datacatalog_v1beta1/types/policytagmanager.py +++ b/google/cloud/datacatalog_v1beta1/types/policytagmanager.py @@ -67,7 +67,7 @@ class Taxonomy(proto.Message): be at most 2000 bytes long when encoded in UTF-8. If not set, defaults to an empty description. - activated_policy_types (Sequence[~.policytagmanager.Taxonomy.PolicyType]): + activated_policy_types (Sequence[google.cloud.datacatalog_v1beta1.types.Taxonomy.PolicyType]): Optional. A list of policy types that are activated for this taxonomy. If not set, defaults to an empty list. @@ -146,7 +146,7 @@ class CreateTaxonomyRequest(proto.Message): parent (str): Required. Resource name of the project that the taxonomy will belong to. - taxonomy (~.policytagmanager.Taxonomy): + taxonomy (google.cloud.datacatalog_v1beta1.types.Taxonomy): The taxonomy to be created. """ @@ -174,10 +174,10 @@ class UpdateTaxonomyRequest(proto.Message): [UpdateTaxonomy][google.cloud.datacatalog.v1beta1.PolicyTagManager.UpdateTaxonomy]. Attributes: - taxonomy (~.policytagmanager.Taxonomy): + taxonomy (google.cloud.datacatalog_v1beta1.types.Taxonomy): The taxonomy to update. Only description, display_name, and activated policy types can be updated. - update_mask (~.field_mask.FieldMask): + update_mask (google.protobuf.field_mask_pb2.FieldMask): The update mask applies to the resource. For the ``FieldMask`` definition, see https://developers.google.com/protocol-buffers/docs/reference/google.protobuf#fieldmask @@ -219,7 +219,7 @@ class ListTaxonomiesResponse(proto.Message): [ListTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListTaxonomies]. 
Attributes: - taxonomies (Sequence[~.policytagmanager.Taxonomy]): + taxonomies (Sequence[google.cloud.datacatalog_v1beta1.types.Taxonomy]): Taxonomies that the project contains. next_page_token (str): Token used to retrieve the next page of @@ -257,7 +257,7 @@ class CreatePolicyTagRequest(proto.Message): parent (str): Required. Resource name of the taxonomy that the policy tag will belong to. - policy_tag (~.policytagmanager.PolicyTag): + policy_tag (google.cloud.datacatalog_v1beta1.types.PolicyTag): The policy tag to be created. """ @@ -285,10 +285,10 @@ class UpdatePolicyTagRequest(proto.Message): [UpdatePolicyTag][google.cloud.datacatalog.v1beta1.PolicyTagManager.UpdatePolicyTag]. Attributes: - policy_tag (~.policytagmanager.PolicyTag): + policy_tag (google.cloud.datacatalog_v1beta1.types.PolicyTag): The policy tag to update. Only the description, display_name, and parent_policy_tag fields can be updated. - update_mask (~.field_mask.FieldMask): + update_mask (google.protobuf.field_mask_pb2.FieldMask): The update mask applies to the resource. Only display_name, description and parent_policy_tag can be updated and thus can be listed in the mask. If update_mask is not provided, @@ -334,7 +334,7 @@ class ListPolicyTagsResponse(proto.Message): [ListPolicyTags][google.cloud.datacatalog.v1beta1.PolicyTagManager.ListPolicyTags]. Attributes: - policy_tags (Sequence[~.policytagmanager.PolicyTag]): + policy_tags (Sequence[google.cloud.datacatalog_v1beta1.types.PolicyTag]): The policy tags that are in the requested taxonomy. next_page_token (str): diff --git a/google/cloud/datacatalog_v1beta1/types/policytagmanagerserialization.py b/google/cloud/datacatalog_v1beta1/types/policytagmanagerserialization.py index 2f76dbc7..eba6c7b6 100644 --- a/google/cloud/datacatalog_v1beta1/types/policytagmanagerserialization.py +++ b/google/cloud/datacatalog_v1beta1/types/policytagmanagerserialization.py @@ -48,7 +48,7 @@ class SerializedTaxonomy(proto.Message): length of the description is limited to 2000 bytes when encoded in UTF-8. If not set, defaults to an empty description. - policy_tags (Sequence[~.policytagmanagerserialization.SerializedPolicyTag]): + policy_tags (Sequence[google.cloud.datacatalog_v1beta1.types.SerializedPolicyTag]): Top level policy tags associated with the taxonomy if any. """ @@ -75,7 +75,7 @@ class SerializedPolicyTag(proto.Message): length of the description is limited to 2000 bytes when encoded in UTF-8. If not set, defaults to an empty description. - child_policy_tags (Sequence[~.policytagmanagerserialization.SerializedPolicyTag]): + child_policy_tags (Sequence[google.cloud.datacatalog_v1beta1.types.SerializedPolicyTag]): Children of the policy tag if any. """ @@ -96,7 +96,7 @@ class ImportTaxonomiesRequest(proto.Message): parent (str): Required. Resource name of project that the newly created taxonomies will belong to. - inline_source (~.policytagmanagerserialization.InlineSource): + inline_source (google.cloud.datacatalog_v1beta1.types.InlineSource): Inline source used for taxonomies import """ @@ -111,7 +111,7 @@ class InlineSource(proto.Message): r"""Inline source used for taxonomies import. Attributes: - taxonomies (Sequence[~.policytagmanagerserialization.SerializedTaxonomy]): + taxonomies (Sequence[google.cloud.datacatalog_v1beta1.types.SerializedTaxonomy]): Required. Taxonomies to be imported. """ @@ -125,7 +125,7 @@ class ImportTaxonomiesResponse(proto.Message): [ImportTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization.ImportTaxonomies]. 
Attributes: - taxonomies (Sequence[~.policytagmanager.Taxonomy]): + taxonomies (Sequence[google.cloud.datacatalog_v1beta1.types.Taxonomy]): Taxonomies that were imported. """ @@ -161,7 +161,7 @@ class ExportTaxonomiesResponse(proto.Message): [ExportTaxonomies][google.cloud.datacatalog.v1beta1.PolicyTagManagerSerialization.ExportTaxonomies]. Attributes: - taxonomies (Sequence[~.policytagmanagerserialization.SerializedTaxonomy]): + taxonomies (Sequence[google.cloud.datacatalog_v1beta1.types.SerializedTaxonomy]): List of taxonomies and policy tags in a tree structure. """ diff --git a/google/cloud/datacatalog_v1beta1/types/schema.py b/google/cloud/datacatalog_v1beta1/types/schema.py index ebc56879..51c2c566 100644 --- a/google/cloud/datacatalog_v1beta1/types/schema.py +++ b/google/cloud/datacatalog_v1beta1/types/schema.py @@ -27,7 +27,7 @@ class Schema(proto.Message): r"""Represents a schema (e.g. BigQuery, GoogleSQL, Avro schema). Attributes: - columns (Sequence[~.schema.ColumnSchema]): + columns (Sequence[google.cloud.datacatalog_v1beta1.types.ColumnSchema]): Required. Schema of columns. A maximum of 10,000 columns and sub-columns can be specified. """ @@ -52,7 +52,7 @@ class ColumnSchema(proto.Message): this column are required, nullable, etc. Only ``NULLABLE``, ``REQUIRED`` and ``REPEATED`` are supported. Default mode is ``NULLABLE``. - subcolumns (Sequence[~.schema.ColumnSchema]): + subcolumns (Sequence[google.cloud.datacatalog_v1beta1.types.ColumnSchema]): Optional. Schema of sub-columns. A column can have zero or more sub-columns. """ diff --git a/google/cloud/datacatalog_v1beta1/types/search.py b/google/cloud/datacatalog_v1beta1/types/search.py index 87f828d2..5c4d9568 100644 --- a/google/cloud/datacatalog_v1beta1/types/search.py +++ b/google/cloud/datacatalog_v1beta1/types/search.py @@ -40,7 +40,7 @@ class SearchCatalogResult(proto.Message): search. Attributes: - search_result_type (~.search.SearchResultType): + search_result_type (google.cloud.datacatalog_v1beta1.types.SearchResultType): Type of the search result. This field can be used to determine which Get method to call to fetch the full resource. diff --git a/google/cloud/datacatalog_v1beta1/types/table_spec.py b/google/cloud/datacatalog_v1beta1/types/table_spec.py index 254afd21..8c041930 100644 --- a/google/cloud/datacatalog_v1beta1/types/table_spec.py +++ b/google/cloud/datacatalog_v1beta1/types/table_spec.py @@ -41,12 +41,12 @@ class BigQueryTableSpec(proto.Message): r"""Describes a BigQuery table. Attributes: - table_source_type (~.gcd_table_spec.TableSourceType): + table_source_type (google.cloud.datacatalog_v1beta1.types.TableSourceType): Output only. The table source type. - view_spec (~.gcd_table_spec.ViewSpec): + view_spec (google.cloud.datacatalog_v1beta1.types.ViewSpec): Table view specification. This field should only be populated if ``table_source_type`` is ``BIGQUERY_VIEW``. - table_spec (~.gcd_table_spec.TableSpec): + table_spec (google.cloud.datacatalog_v1beta1.types.TableSpec): Spec of a BigQuery table. This field should only be populated if ``table_source_type`` is ``BIGQUERY_TABLE``. """ diff --git a/google/cloud/datacatalog_v1beta1/types/tags.py b/google/cloud/datacatalog_v1beta1/types/tags.py index 098fd3c2..575e9964 100644 --- a/google/cloud/datacatalog_v1beta1/types/tags.py +++ b/google/cloud/datacatalog_v1beta1/types/tags.py @@ -63,7 +63,7 @@ class Tag(proto.Message): separate the column names. 
Example: - ``outer_column.inner_column`` - fields (Sequence[~.tags.Tag.FieldsEntry]): + fields (Sequence[google.cloud.datacatalog_v1beta1.types.Tag.FieldsEntry]): Required. This maps the ID of a tag field to the value of and additional information about that field. Valid field IDs are defined by the @@ -98,10 +98,10 @@ class TagField(proto.Message): bool_value (bool): Holds the value for a tag field with boolean type. - timestamp_value (~.timestamp.Timestamp): + timestamp_value (google.protobuf.timestamp_pb2.Timestamp): Holds the value for a tag field with timestamp type. - enum_value (~.tags.TagField.EnumValue): + enum_value (google.cloud.datacatalog_v1beta1.types.TagField.EnumValue): Holds the value for a tag field with enum type. This value must be one of the allowed values in the definition of this enum. @@ -165,7 +165,7 @@ class TagTemplate(proto.Message): display_name (str): The display name for this template. Defaults to an empty string. - fields (Sequence[~.tags.TagTemplate.FieldsEntry]): + fields (Sequence[google.cloud.datacatalog_v1beta1.types.TagTemplate.FieldsEntry]): Required. Map of tag template field IDs to the settings for the field. This map is an exhaustive list of the allowed fields. This map must contain at least one field and at most @@ -202,7 +202,7 @@ class TagTemplateField(proto.Message): display_name (str): The display name for this field. Defaults to an empty string. - type_ (~.tags.FieldType): + type_ (google.cloud.datacatalog_v1beta1.types.FieldType): Required. The type of value this tag field can contain. is_required (bool): @@ -232,10 +232,10 @@ class FieldType(proto.Message): r""" Attributes: - primitive_type (~.tags.FieldType.PrimitiveType): + primitive_type (google.cloud.datacatalog_v1beta1.types.FieldType.PrimitiveType): Represents primitive types - string, bool etc. - enum_type (~.tags.FieldType.EnumType): + enum_type (google.cloud.datacatalog_v1beta1.types.FieldType.EnumType): Represents an enum type. """ @@ -251,7 +251,7 @@ class EnumType(proto.Message): r""" Attributes: - allowed_values (Sequence[~.tags.FieldType.EnumType.EnumValue]): + allowed_values (Sequence[google.cloud.datacatalog_v1beta1.types.FieldType.EnumType.EnumValue]): Required on create; optional on update. The set of allowed values for this enum. This set must not be empty, the display names of the diff --git a/google/cloud/datacatalog_v1beta1/types/timestamps.py b/google/cloud/datacatalog_v1beta1/types/timestamps.py index 82ef8a06..fe45394c 100644 --- a/google/cloud/datacatalog_v1beta1/types/timestamps.py +++ b/google/cloud/datacatalog_v1beta1/types/timestamps.py @@ -31,13 +31,13 @@ class SystemTimestamps(proto.Message): system. Attributes: - create_time (~.timestamp.Timestamp): + create_time (google.protobuf.timestamp_pb2.Timestamp): The creation time of the resource within the given system. - update_time (~.timestamp.Timestamp): + update_time (google.protobuf.timestamp_pb2.Timestamp): The last-modified time of the resource within the given system. - expire_time (~.timestamp.Timestamp): + expire_time (google.protobuf.timestamp_pb2.Timestamp): Output only. The expiration time of the resource within the given system. Currently only apllicable to BigQuery resources. 
diff --git a/synth.metadata b/synth.metadata index 059cec33..3dc57936 100644 --- a/synth.metadata +++ b/synth.metadata @@ -4,15 +4,15 @@ "git": { "name": ".", "remote": "https://github.com/googleapis/python-datacatalog.git", - "sha": "890b6cc7c323a61255e001a21081beafb88c83f5" + "sha": "fda528a1da2ec1dbf6b3ad33eb2d33780a77f3d9" } }, { "git": { "name": "googleapis", "remote": "https://github.com/googleapis/googleapis.git", - "sha": "dd372aa22ded7a8ba6f0e03a80e06358a3fa0907", - "internalRef": "347055288" + "sha": "520682435235d9c503983a360a2090025aa47cd1", + "internalRef": "350246057" } }, { @@ -51,6 +51,7 @@ } ], "generatedFiles": [ + ".coveragerc", ".flake8", ".github/CONTRIBUTING.md", ".github/ISSUE_TEMPLATE/bug_report.md", @@ -103,8 +104,12 @@ "docs/_static/custom.css", "docs/_templates/layout.html", "docs/conf.py", + "docs/datacatalog_v1/data_catalog.rst", "docs/datacatalog_v1/services.rst", "docs/datacatalog_v1/types.rst", + "docs/datacatalog_v1beta1/data_catalog.rst", + "docs/datacatalog_v1beta1/policy_tag_manager.rst", + "docs/datacatalog_v1beta1/policy_tag_manager_serialization.rst", "docs/datacatalog_v1beta1/services.rst", "docs/datacatalog_v1beta1/types.rst", "docs/multiprocessing.rst", diff --git a/tests/unit/gapic/datacatalog_v1/test_data_catalog.py b/tests/unit/gapic/datacatalog_v1/test_data_catalog.py index ba76e5b9..3c2ea655 100644 --- a/tests/unit/gapic/datacatalog_v1/test_data_catalog.py +++ b/tests/unit/gapic/datacatalog_v1/test_data_catalog.py @@ -94,7 +94,20 @@ def test__get_default_mtls_endpoint(): assert DataCatalogClient._get_default_mtls_endpoint(non_googleapi) == non_googleapi -@pytest.mark.parametrize("client_class", [DataCatalogClient, DataCatalogAsyncClient]) +def test_data_catalog_client_from_service_account_info(): + creds = credentials.AnonymousCredentials() + with mock.patch.object( + service_account.Credentials, "from_service_account_info" + ) as factory: + factory.return_value = creds + info = {"valid": True} + client = DataCatalogClient.from_service_account_info(info) + assert client.transport._credentials == creds + + assert client.transport._host == "datacatalog.googleapis.com:443" + + +@pytest.mark.parametrize("client_class", [DataCatalogClient, DataCatalogAsyncClient,]) def test_data_catalog_client_from_service_account_file(client_class): creds = credentials.AnonymousCredentials() with mock.patch.object( @@ -112,7 +125,10 @@ def test_data_catalog_client_from_service_account_file(client_class): def test_data_catalog_client_get_transport_class(): transport = DataCatalogClient.get_transport_class() - assert transport == transports.DataCatalogGrpcTransport + available_transports = [ + transports.DataCatalogGrpcTransport, + ] + assert transport in available_transports transport = DataCatalogClient.get_transport_class("grpc") assert transport == transports.DataCatalogGrpcTransport @@ -6589,7 +6605,7 @@ def test_transport_get_channel(): @pytest.mark.parametrize( "transport_class", - [transports.DataCatalogGrpcTransport, transports.DataCatalogGrpcAsyncIOTransport], + [transports.DataCatalogGrpcTransport, transports.DataCatalogGrpcAsyncIOTransport,], ) def test_transport_adc(transport_class): # Test default credentials are used if not provided. 
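The hunks above also introduce a `from_service_account_info` constructor on each generated client (and exercise it in the new `test_*_from_service_account_info` tests), mirroring the existing `from_service_account_file` but taking the service-account key as an already-parsed dict rather than a filename. A minimal sketch of how it might be called follows; the package-level import and the key-retrieval helper are assumptions for illustration, not part of this patch.

```py
import json

from google.cloud import datacatalog_v1


def fetch_service_account_key() -> str:
    """Hypothetical helper: return a service-account key as a JSON string,
    e.g. fetched from a secret manager, so the key never has to live on disk."""
    raise NotImplementedError


# Parse the key material and build the client directly from the dict;
# from_service_account_file behaves the same way but reads a path instead.
client = datacatalog_v1.DataCatalogClient.from_service_account_info(
    json.loads(fetch_service_account_key())
)
```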
@@ -6736,7 +6752,7 @@ def test_data_catalog_host_with_port(): def test_data_catalog_grpc_transport_channel(): - channel = grpc.insecure_channel("http://localhost/") + channel = grpc.secure_channel("http://localhost/", grpc.local_channel_credentials()) # Check that channel is used if provided. transport = transports.DataCatalogGrpcTransport( @@ -6748,7 +6764,7 @@ def test_data_catalog_grpc_transport_channel(): def test_data_catalog_grpc_asyncio_transport_channel(): - channel = aio.insecure_channel("http://localhost/") + channel = aio.secure_channel("http://localhost/", grpc.local_channel_credentials()) # Check that channel is used if provided. transport = transports.DataCatalogGrpcAsyncIOTransport( @@ -6768,7 +6784,7 @@ def test_data_catalog_transport_channel_mtls_with_client_cert_source(transport_c "grpc.ssl_channel_credentials", autospec=True ) as grpc_ssl_channel_cred: with mock.patch.object( - transport_class, "create_channel", autospec=True + transport_class, "create_channel" ) as grpc_create_channel: mock_ssl_cred = mock.Mock() grpc_ssl_channel_cred.return_value = mock_ssl_cred @@ -6818,7 +6834,7 @@ def test_data_catalog_transport_channel_mtls_with_adc(transport_class): ssl_credentials=mock.PropertyMock(return_value=mock_ssl_cred), ): with mock.patch.object( - transport_class, "create_channel", autospec=True + transport_class, "create_channel" ) as grpc_create_channel: mock_grpc_channel = mock.Mock() grpc_create_channel.return_value = mock_grpc_channel diff --git a/tests/unit/gapic/datacatalog_v1beta1/test_data_catalog.py b/tests/unit/gapic/datacatalog_v1beta1/test_data_catalog.py index e1b71ede..4f9f630f 100644 --- a/tests/unit/gapic/datacatalog_v1beta1/test_data_catalog.py +++ b/tests/unit/gapic/datacatalog_v1beta1/test_data_catalog.py @@ -96,7 +96,20 @@ def test__get_default_mtls_endpoint(): assert DataCatalogClient._get_default_mtls_endpoint(non_googleapi) == non_googleapi -@pytest.mark.parametrize("client_class", [DataCatalogClient, DataCatalogAsyncClient]) +def test_data_catalog_client_from_service_account_info(): + creds = credentials.AnonymousCredentials() + with mock.patch.object( + service_account.Credentials, "from_service_account_info" + ) as factory: + factory.return_value = creds + info = {"valid": True} + client = DataCatalogClient.from_service_account_info(info) + assert client.transport._credentials == creds + + assert client.transport._host == "datacatalog.googleapis.com:443" + + +@pytest.mark.parametrize("client_class", [DataCatalogClient, DataCatalogAsyncClient,]) def test_data_catalog_client_from_service_account_file(client_class): creds = credentials.AnonymousCredentials() with mock.patch.object( @@ -114,7 +127,10 @@ def test_data_catalog_client_from_service_account_file(client_class): def test_data_catalog_client_get_transport_class(): transport = DataCatalogClient.get_transport_class() - assert transport == transports.DataCatalogGrpcTransport + available_transports = [ + transports.DataCatalogGrpcTransport, + ] + assert transport in available_transports transport = DataCatalogClient.get_transport_class("grpc") assert transport == transports.DataCatalogGrpcTransport @@ -6584,7 +6600,7 @@ def test_transport_get_channel(): @pytest.mark.parametrize( "transport_class", - [transports.DataCatalogGrpcTransport, transports.DataCatalogGrpcAsyncIOTransport], + 
[transports.DataCatalogGrpcTransport, transports.DataCatalogGrpcAsyncIOTransport,], ) def test_transport_adc(transport_class): # Test default credentials are used if not provided. @@ -6731,7 +6747,7 @@ def test_data_catalog_host_with_port(): def test_data_catalog_grpc_transport_channel(): - channel = grpc.insecure_channel("http://localhost/") + channel = grpc.secure_channel("http://localhost/", grpc.local_channel_credentials()) # Check that channel is used if provided. transport = transports.DataCatalogGrpcTransport( @@ -6743,7 +6759,7 @@ def test_data_catalog_grpc_transport_channel(): def test_data_catalog_grpc_asyncio_transport_channel(): - channel = aio.insecure_channel("http://localhost/") + channel = aio.secure_channel("http://localhost/", grpc.local_channel_credentials()) # Check that channel is used if provided. transport = transports.DataCatalogGrpcAsyncIOTransport( @@ -6763,7 +6779,7 @@ def test_data_catalog_transport_channel_mtls_with_client_cert_source(transport_c "grpc.ssl_channel_credentials", autospec=True ) as grpc_ssl_channel_cred: with mock.patch.object( - transport_class, "create_channel", autospec=True + transport_class, "create_channel" ) as grpc_create_channel: mock_ssl_cred = mock.Mock() grpc_ssl_channel_cred.return_value = mock_ssl_cred @@ -6813,7 +6829,7 @@ def test_data_catalog_transport_channel_mtls_with_adc(transport_class): ssl_credentials=mock.PropertyMock(return_value=mock_ssl_cred), ): with mock.patch.object( - transport_class, "create_channel", autospec=True + transport_class, "create_channel" ) as grpc_create_channel: mock_grpc_channel = mock.Mock() grpc_create_channel.return_value = mock_grpc_channel diff --git a/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager.py b/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager.py index 98b5c966..a8e44f60 100644 --- a/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager.py +++ b/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager.py @@ -94,8 +94,21 @@ def test__get_default_mtls_endpoint(): ) +def test_policy_tag_manager_client_from_service_account_info(): + creds = credentials.AnonymousCredentials() + with mock.patch.object( + service_account.Credentials, "from_service_account_info" + ) as factory: + factory.return_value = creds + info = {"valid": True} + client = PolicyTagManagerClient.from_service_account_info(info) + assert client.transport._credentials == creds + + assert client.transport._host == "datacatalog.googleapis.com:443" + + @pytest.mark.parametrize( - "client_class", [PolicyTagManagerClient, PolicyTagManagerAsyncClient] + "client_class", [PolicyTagManagerClient, PolicyTagManagerAsyncClient,] ) def test_policy_tag_manager_client_from_service_account_file(client_class): creds = credentials.AnonymousCredentials() @@ -114,7 +127,10 @@ def test_policy_tag_manager_client_from_service_account_file(client_class): def test_policy_tag_manager_client_get_transport_class(): transport = PolicyTagManagerClient.get_transport_class() - assert transport == transports.PolicyTagManagerGrpcTransport + available_transports = [ + transports.PolicyTagManagerGrpcTransport, + ] + assert transport in available_transports transport = PolicyTagManagerClient.get_transport_class("grpc") assert transport == transports.PolicyTagManagerGrpcTransport @@ -3610,7 +3626,7 @@ def 
test_policy_tag_manager_host_with_port(): def test_policy_tag_manager_grpc_transport_channel(): - channel = grpc.insecure_channel("http://localhost/") + channel = grpc.secure_channel("http://localhost/", grpc.local_channel_credentials()) # Check that channel is used if provided. transport = transports.PolicyTagManagerGrpcTransport( @@ -3622,7 +3638,7 @@ def test_policy_tag_manager_grpc_transport_channel(): def test_policy_tag_manager_grpc_asyncio_transport_channel(): - channel = aio.insecure_channel("http://localhost/") + channel = aio.secure_channel("http://localhost/", grpc.local_channel_credentials()) # Check that channel is used if provided. transport = transports.PolicyTagManagerGrpcAsyncIOTransport( @@ -3647,7 +3663,7 @@ def test_policy_tag_manager_transport_channel_mtls_with_client_cert_source( "grpc.ssl_channel_credentials", autospec=True ) as grpc_ssl_channel_cred: with mock.patch.object( - transport_class, "create_channel", autospec=True + transport_class, "create_channel" ) as grpc_create_channel: mock_ssl_cred = mock.Mock() grpc_ssl_channel_cred.return_value = mock_ssl_cred @@ -3700,7 +3716,7 @@ def test_policy_tag_manager_transport_channel_mtls_with_adc(transport_class): ssl_credentials=mock.PropertyMock(return_value=mock_ssl_cred), ): with mock.patch.object( - transport_class, "create_channel", autospec=True + transport_class, "create_channel" ) as grpc_create_channel: mock_grpc_channel = mock.Mock() grpc_create_channel.return_value = mock_grpc_channel diff --git a/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager_serialization.py b/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager_serialization.py index a3c3540e..6a42e9d0 100644 --- a/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager_serialization.py +++ b/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager_serialization.py @@ -95,9 +95,22 @@ def test__get_default_mtls_endpoint(): ) +def test_policy_tag_manager_serialization_client_from_service_account_info(): + creds = credentials.AnonymousCredentials() + with mock.patch.object( + service_account.Credentials, "from_service_account_info" + ) as factory: + factory.return_value = creds + info = {"valid": True} + client = PolicyTagManagerSerializationClient.from_service_account_info(info) + assert client.transport._credentials == creds + + assert client.transport._host == "datacatalog.googleapis.com:443" + + @pytest.mark.parametrize( "client_class", - [PolicyTagManagerSerializationClient, PolicyTagManagerSerializationAsyncClient], + [PolicyTagManagerSerializationClient, PolicyTagManagerSerializationAsyncClient,], ) def test_policy_tag_manager_serialization_client_from_service_account_file( client_class, @@ -118,7 +131,10 @@ def test_policy_tag_manager_serialization_client_from_service_account_file( def test_policy_tag_manager_serialization_client_get_transport_class(): transport = PolicyTagManagerSerializationClient.get_transport_class() - assert transport == transports.PolicyTagManagerSerializationGrpcTransport + available_transports = [ + transports.PolicyTagManagerSerializationGrpcTransport, + ] + assert transport in available_transports transport = PolicyTagManagerSerializationClient.get_transport_class("grpc") assert transport == transports.PolicyTagManagerSerializationGrpcTransport @@ -930,7 +946,7 @@ def 
test_policy_tag_manager_serialization_host_with_port(): def test_policy_tag_manager_serialization_grpc_transport_channel(): - channel = grpc.insecure_channel("http://localhost/") + channel = grpc.secure_channel("http://localhost/", grpc.local_channel_credentials()) # Check that channel is used if provided. transport = transports.PolicyTagManagerSerializationGrpcTransport( @@ -942,7 +958,7 @@ def test_policy_tag_manager_serialization_grpc_transport_channel(): def test_policy_tag_manager_serialization_grpc_asyncio_transport_channel(): - channel = aio.insecure_channel("http://localhost/") + channel = aio.secure_channel("http://localhost/", grpc.local_channel_credentials()) # Check that channel is used if provided. transport = transports.PolicyTagManagerSerializationGrpcAsyncIOTransport( @@ -967,7 +983,7 @@ def test_policy_tag_manager_serialization_transport_channel_mtls_with_client_cer "grpc.ssl_channel_credentials", autospec=True ) as grpc_ssl_channel_cred: with mock.patch.object( - transport_class, "create_channel", autospec=True + transport_class, "create_channel" ) as grpc_create_channel: mock_ssl_cred = mock.Mock() grpc_ssl_channel_cred.return_value = mock_ssl_cred @@ -1022,7 +1038,7 @@ def test_policy_tag_manager_serialization_transport_channel_mtls_with_adc( ssl_credentials=mock.PropertyMock(return_value=mock_ssl_cred), ): with mock.patch.object( - transport_class, "create_channel", autospec=True + transport_class, "create_channel" ) as grpc_create_channel: mock_grpc_channel = mock.Mock() grpc_create_channel.return_value = mock_grpc_channel From dcbb907ae8b7e20b431a8dd8e86e11a789d02762 Mon Sep 17 00:00:00 2001 From: Yoshi Automation Bot Date: Thu, 21 Jan 2021 11:58:02 -0800 Subject: [PATCH 16/26] chore: add 3.9 to noxfile template (#101) This PR was generated using Autosynth. :rainbow: Synth log will be available here: https://source.cloud.google.com/results/invocations/29194dd0-d137-4c19-b14a-efe9aaef350f/targets - [ ] To automatically regenerate this PR, check this box. Source-Link: https://github.com/googleapis/synthtool/commit/56ddc68f36b32341e9f22c2c59b4ce6aa3ba635f --- samples/snippets/noxfile.py | 2 +- synth.metadata | 6 +++--- 2 files changed, 4 insertions(+), 4 deletions(-) diff --git a/samples/snippets/noxfile.py b/samples/snippets/noxfile.py index bca0522e..97bf7da8 100644 --- a/samples/snippets/noxfile.py +++ b/samples/snippets/noxfile.py @@ -85,7 +85,7 @@ def get_pytest_env_vars() -> Dict[str, str]: # DO NOT EDIT - automatically generated. # All versions used to tested samples. -ALL_VERSIONS = ["2.7", "3.6", "3.7", "3.8"] +ALL_VERSIONS = ["2.7", "3.6", "3.7", "3.8", "3.9"] # Any default versions that should be ignored. 
IGNORED_VERSIONS = TEST_CONFIG['ignored_versions'] diff --git a/synth.metadata b/synth.metadata index 3dc57936..16bf97bf 100644 --- a/synth.metadata +++ b/synth.metadata @@ -4,7 +4,7 @@ "git": { "name": ".", "remote": "https://github.com/googleapis/python-datacatalog.git", - "sha": "fda528a1da2ec1dbf6b3ad33eb2d33780a77f3d9" + "sha": "2dbb3ef062b52925ad421c5c469ed6e67671e878" } }, { @@ -19,14 +19,14 @@ "git": { "name": "synthtool", "remote": "https://github.com/googleapis/synthtool.git", - "sha": "16ec872dd898d7de6e1822badfac32484b5d9031" + "sha": "56ddc68f36b32341e9f22c2c59b4ce6aa3ba635f" } }, { "git": { "name": "synthtool", "remote": "https://github.com/googleapis/synthtool.git", - "sha": "16ec872dd898d7de6e1822badfac32484b5d9031" + "sha": "56ddc68f36b32341e9f22c2c59b4ce6aa3ba635f" } } ], From 00a5f47c852550de644960a258fcff29121cda91 Mon Sep 17 00:00:00 2001 From: Justin Beckwith Date: Fri, 29 Jan 2021 17:06:02 -0800 Subject: [PATCH 17/26] build: migrate to flakybot (#106) --- .kokoro/test-samples.sh | 8 ++++---- .kokoro/trampoline_v2.sh | 2 +- 2 files changed, 5 insertions(+), 5 deletions(-) diff --git a/.kokoro/test-samples.sh b/.kokoro/test-samples.sh index 49ac61fa..8fb79dcb 100755 --- a/.kokoro/test-samples.sh +++ b/.kokoro/test-samples.sh @@ -87,11 +87,11 @@ for file in samples/**/requirements.txt; do python3.6 -m nox -s "$RUN_TESTS_SESSION" EXIT=$? - # If this is a periodic build, send the test log to the Build Cop Bot. - # See https://github.com/googleapis/repo-automation-bots/tree/master/packages/buildcop. + # If this is a periodic build, send the test log to the FlakyBot. + # See https://github.com/googleapis/repo-automation-bots/tree/master/packages/flakybot. if [[ $KOKORO_BUILD_ARTIFACTS_SUBDIR = *"periodic"* ]]; then - chmod +x $KOKORO_GFILE_DIR/linux_amd64/buildcop - $KOKORO_GFILE_DIR/linux_amd64/buildcop + chmod +x $KOKORO_GFILE_DIR/linux_amd64/flakybot + $KOKORO_GFILE_DIR/linux_amd64/flakybot fi if [[ $EXIT -ne 0 ]]; then diff --git a/.kokoro/trampoline_v2.sh b/.kokoro/trampoline_v2.sh index 719bcd5b..4af6cdc2 100755 --- a/.kokoro/trampoline_v2.sh +++ b/.kokoro/trampoline_v2.sh @@ -159,7 +159,7 @@ if [[ -n "${KOKORO_BUILD_ID:-}" ]]; then "KOKORO_GITHUB_COMMIT" "KOKORO_GITHUB_PULL_REQUEST_NUMBER" "KOKORO_GITHUB_PULL_REQUEST_COMMIT" - # For Build Cop Bot + # For FlakyBot "KOKORO_GITHUB_COMMIT_URL" "KOKORO_GITHUB_PULL_REQUEST_URL" ) From 59a44bc744a6322a2a23313c851eb77204110e79 Mon Sep 17 00:00:00 2001 From: Yoshi Automation Bot Date: Wed, 3 Feb 2021 09:42:56 -0800 Subject: [PATCH 18/26] feat: add `client_cert_source_for_mtls` argument to transports (#107) * changes without context autosynth cannot find the source of changes triggered by earlier changes in this repository, or by version upgrades to tools such as linters. 
* chore: update Go generator, rules_go, and protobuf PiperOrigin-RevId: 352816749 Source-Author: Google APIs Source-Date: Wed Jan 20 10:06:23 2021 -0800 Source-Repo: googleapis/googleapis Source-Sha: ceaaf31b3d13badab7cf9d3b570f5639db5593d9 Source-Link: https://github.com/googleapis/googleapis/commit/ceaaf31b3d13badab7cf9d3b570f5639db5593d9 * chore: upgrade gapic-generator-python to 0.40.5 PiperOrigin-RevId: 354996675 Source-Author: Google APIs Source-Date: Mon Feb 1 12:11:49 2021 -0800 Source-Repo: googleapis/googleapis Source-Sha: 20712b8fe95001b312f62c6c5f33e3e3ec92cfaf Source-Link: https://github.com/googleapis/googleapis/commit/20712b8fe95001b312f62c6c5f33e3e3ec92cfaf * feat: Add Pub/Sub endpoints for Cloud Channel API. PiperOrigin-RevId: 355059873 Source-Author: Google APIs Source-Date: Mon Feb 1 17:13:22 2021 -0800 Source-Repo: googleapis/googleapis Source-Sha: 6ef9eaea379fc1cc0355e06a5a20b594543ee693 Source-Link: https://github.com/googleapis/googleapis/commit/6ef9eaea379fc1cc0355e06a5a20b594543ee693 * revert flakybot changes Co-authored-by: Tim Swast --- .../services/data_catalog/client.py | 18 +- .../services/data_catalog/transports/grpc.py | 23 ++- .../data_catalog/transports/grpc_asyncio.py | 23 ++- google/cloud/datacatalog_v1beta1/__init__.py | 4 +- .../services/data_catalog/client.py | 18 +- .../services/data_catalog/transports/grpc.py | 23 ++- .../data_catalog/transports/grpc_asyncio.py | 23 ++- .../services/policy_tag_manager/client.py | 18 +- .../policy_tag_manager/transports/grpc.py | 23 ++- .../transports/grpc_asyncio.py | 23 ++- .../client.py | 18 +- .../transports/grpc.py | 23 ++- .../transports/grpc_asyncio.py | 23 ++- synth.metadata | 6 +- .../gapic/datacatalog_v1/test_data_catalog.py | 179 ++++++++++------- .../datacatalog_v1beta1/test_data_catalog.py | 179 ++++++++++------- .../test_policy_tag_manager.py | 182 ++++++++++------- .../test_policy_tag_manager_serialization.py | 184 +++++++++++------- 18 files changed, 593 insertions(+), 397 deletions(-) diff --git a/google/cloud/datacatalog_v1/services/data_catalog/client.py b/google/cloud/datacatalog_v1/services/data_catalog/client.py index a9663871..acc03c3c 100644 --- a/google/cloud/datacatalog_v1/services/data_catalog/client.py +++ b/google/cloud/datacatalog_v1/services/data_catalog/client.py @@ -367,21 +367,17 @@ def __init__( util.strtobool(os.getenv("GOOGLE_API_USE_CLIENT_CERTIFICATE", "false")) ) - ssl_credentials = None + client_cert_source_func = None is_mtls = False if use_client_cert: if client_options.client_cert_source: - import grpc # type: ignore - - cert, key = client_options.client_cert_source() - ssl_credentials = grpc.ssl_channel_credentials( - certificate_chain=cert, private_key=key - ) is_mtls = True + client_cert_source_func = client_options.client_cert_source else: - creds = SslCredentials() - is_mtls = creds.is_mtls - ssl_credentials = creds.ssl_credentials if is_mtls else None + is_mtls = mtls.has_default_client_cert_source() + client_cert_source_func = ( + mtls.default_client_cert_source() if is_mtls else None + ) # Figure out which api endpoint to use. 
if client_options.api_endpoint is not None: @@ -424,7 +420,7 @@ def __init__( credentials_file=client_options.credentials_file, host=api_endpoint, scopes=client_options.scopes, - ssl_channel_credentials=ssl_credentials, + client_cert_source_for_mtls=client_cert_source_func, quota_project_id=client_options.quota_project_id, client_info=client_info, ) diff --git a/google/cloud/datacatalog_v1/services/data_catalog/transports/grpc.py b/google/cloud/datacatalog_v1/services/data_catalog/transports/grpc.py index b5b4d6c6..34c1e8b4 100644 --- a/google/cloud/datacatalog_v1/services/data_catalog/transports/grpc.py +++ b/google/cloud/datacatalog_v1/services/data_catalog/transports/grpc.py @@ -62,6 +62,7 @@ def __init__( api_mtls_endpoint: str = None, client_cert_source: Callable[[], Tuple[bytes, bytes]] = None, ssl_channel_credentials: grpc.ChannelCredentials = None, + client_cert_source_for_mtls: Callable[[], Tuple[bytes, bytes]] = None, quota_project_id: Optional[str] = None, client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO, ) -> None: @@ -92,6 +93,10 @@ def __init__( ``api_mtls_endpoint`` is None. ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials for grpc channel. It is ignored if ``channel`` is provided. + client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]): + A callback to provide client certificate bytes and private key bytes, + both in PEM format. It is used to configure mutual TLS channel. It is + ignored if ``channel`` or ``ssl_channel_credentials`` is provided. quota_project_id (Optional[str]): An optional project to use for billing and quota. client_info (google.api_core.gapic_v1.client_info.ClientInfo): @@ -108,6 +113,11 @@ def __init__( """ self._ssl_channel_credentials = ssl_channel_credentials + if api_mtls_endpoint: + warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning) + if client_cert_source: + warnings.warn("client_cert_source is deprecated", DeprecationWarning) + if channel: # Sanity check: Ensure that channel and credentials are not both # provided. @@ -117,11 +127,6 @@ def __init__( self._grpc_channel = channel self._ssl_channel_credentials = None elif api_mtls_endpoint: - warnings.warn( - "api_mtls_endpoint and client_cert_source are deprecated", - DeprecationWarning, - ) - host = ( api_mtls_endpoint if ":" in api_mtls_endpoint @@ -165,12 +170,18 @@ def __init__( scopes=self.AUTH_SCOPES, quota_project_id=quota_project_id ) + if client_cert_source_for_mtls and not ssl_channel_credentials: + cert, key = client_cert_source_for_mtls() + self._ssl_channel_credentials = grpc.ssl_channel_credentials( + certificate_chain=cert, private_key=key + ) + # create a new channel. The provided one is ignored. 
self._grpc_channel = type(self).create_channel( host, credentials=credentials, credentials_file=credentials_file, - ssl_credentials=ssl_channel_credentials, + ssl_credentials=self._ssl_channel_credentials, scopes=scopes or self.AUTH_SCOPES, quota_project_id=quota_project_id, options=[ diff --git a/google/cloud/datacatalog_v1/services/data_catalog/transports/grpc_asyncio.py b/google/cloud/datacatalog_v1/services/data_catalog/transports/grpc_asyncio.py index 2a25f5b8..08c887ca 100644 --- a/google/cloud/datacatalog_v1/services/data_catalog/transports/grpc_asyncio.py +++ b/google/cloud/datacatalog_v1/services/data_catalog/transports/grpc_asyncio.py @@ -106,6 +106,7 @@ def __init__( api_mtls_endpoint: str = None, client_cert_source: Callable[[], Tuple[bytes, bytes]] = None, ssl_channel_credentials: grpc.ChannelCredentials = None, + client_cert_source_for_mtls: Callable[[], Tuple[bytes, bytes]] = None, quota_project_id=None, client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO, ) -> None: @@ -137,6 +138,10 @@ def __init__( ``api_mtls_endpoint`` is None. ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials for grpc channel. It is ignored if ``channel`` is provided. + client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]): + A callback to provide client certificate bytes and private key bytes, + both in PEM format. It is used to configure mutual TLS channel. It is + ignored if ``channel`` or ``ssl_channel_credentials`` is provided. quota_project_id (Optional[str]): An optional project to use for billing and quota. client_info (google.api_core.gapic_v1.client_info.ClientInfo): @@ -153,6 +158,11 @@ def __init__( """ self._ssl_channel_credentials = ssl_channel_credentials + if api_mtls_endpoint: + warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning) + if client_cert_source: + warnings.warn("client_cert_source is deprecated", DeprecationWarning) + if channel: # Sanity check: Ensure that channel and credentials are not both # provided. @@ -162,11 +172,6 @@ def __init__( self._grpc_channel = channel self._ssl_channel_credentials = None elif api_mtls_endpoint: - warnings.warn( - "api_mtls_endpoint and client_cert_source are deprecated", - DeprecationWarning, - ) - host = ( api_mtls_endpoint if ":" in api_mtls_endpoint @@ -210,12 +215,18 @@ def __init__( scopes=self.AUTH_SCOPES, quota_project_id=quota_project_id ) + if client_cert_source_for_mtls and not ssl_channel_credentials: + cert, key = client_cert_source_for_mtls() + self._ssl_channel_credentials = grpc.ssl_channel_credentials( + certificate_chain=cert, private_key=key + ) + # create a new channel. The provided one is ignored. 
self._grpc_channel = type(self).create_channel( host, credentials=credentials, credentials_file=credentials_file, - ssl_credentials=ssl_channel_credentials, + ssl_credentials=self._ssl_channel_credentials, scopes=scopes or self.AUTH_SCOPES, quota_project_id=quota_project_id, options=[ diff --git a/google/cloud/datacatalog_v1beta1/__init__.py b/google/cloud/datacatalog_v1beta1/__init__.py index 16534418..be0bdd8e 100644 --- a/google/cloud/datacatalog_v1beta1/__init__.py +++ b/google/cloud/datacatalog_v1beta1/__init__.py @@ -103,6 +103,7 @@ "CreateTagTemplateFieldRequest", "CreateTagTemplateRequest", "CreateTaxonomyRequest", + "DataCatalogClient", "DeleteEntryGroupRequest", "DeleteEntryRequest", "DeletePolicyTagRequest", @@ -140,7 +141,6 @@ "LookupEntryRequest", "PolicyTag", "PolicyTagManagerClient", - "PolicyTagManagerSerializationClient", "RenameTagTemplateFieldRequest", "Schema", "SearchCatalogRequest", @@ -165,5 +165,5 @@ "UpdateTagTemplateRequest", "UpdateTaxonomyRequest", "ViewSpec", - "DataCatalogClient", + "PolicyTagManagerSerializationClient", ) diff --git a/google/cloud/datacatalog_v1beta1/services/data_catalog/client.py b/google/cloud/datacatalog_v1beta1/services/data_catalog/client.py index 14b95915..d7441888 100644 --- a/google/cloud/datacatalog_v1beta1/services/data_catalog/client.py +++ b/google/cloud/datacatalog_v1beta1/services/data_catalog/client.py @@ -367,21 +367,17 @@ def __init__( util.strtobool(os.getenv("GOOGLE_API_USE_CLIENT_CERTIFICATE", "false")) ) - ssl_credentials = None + client_cert_source_func = None is_mtls = False if use_client_cert: if client_options.client_cert_source: - import grpc # type: ignore - - cert, key = client_options.client_cert_source() - ssl_credentials = grpc.ssl_channel_credentials( - certificate_chain=cert, private_key=key - ) is_mtls = True + client_cert_source_func = client_options.client_cert_source else: - creds = SslCredentials() - is_mtls = creds.is_mtls - ssl_credentials = creds.ssl_credentials if is_mtls else None + is_mtls = mtls.has_default_client_cert_source() + client_cert_source_func = ( + mtls.default_client_cert_source() if is_mtls else None + ) # Figure out which api endpoint to use. if client_options.api_endpoint is not None: @@ -424,7 +420,7 @@ def __init__( credentials_file=client_options.credentials_file, host=api_endpoint, scopes=client_options.scopes, - ssl_channel_credentials=ssl_credentials, + client_cert_source_for_mtls=client_cert_source_func, quota_project_id=client_options.quota_project_id, client_info=client_info, ) diff --git a/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/grpc.py b/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/grpc.py index e4fd43f0..ecaf17bf 100644 --- a/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/grpc.py +++ b/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/grpc.py @@ -62,6 +62,7 @@ def __init__( api_mtls_endpoint: str = None, client_cert_source: Callable[[], Tuple[bytes, bytes]] = None, ssl_channel_credentials: grpc.ChannelCredentials = None, + client_cert_source_for_mtls: Callable[[], Tuple[bytes, bytes]] = None, quota_project_id: Optional[str] = None, client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO, ) -> None: @@ -92,6 +93,10 @@ def __init__( ``api_mtls_endpoint`` is None. ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials for grpc channel. It is ignored if ``channel`` is provided. 
+ client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]): + A callback to provide client certificate bytes and private key bytes, + both in PEM format. It is used to configure mutual TLS channel. It is + ignored if ``channel`` or ``ssl_channel_credentials`` is provided. quota_project_id (Optional[str]): An optional project to use for billing and quota. client_info (google.api_core.gapic_v1.client_info.ClientInfo): @@ -108,6 +113,11 @@ def __init__( """ self._ssl_channel_credentials = ssl_channel_credentials + if api_mtls_endpoint: + warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning) + if client_cert_source: + warnings.warn("client_cert_source is deprecated", DeprecationWarning) + if channel: # Sanity check: Ensure that channel and credentials are not both # provided. @@ -117,11 +127,6 @@ def __init__( self._grpc_channel = channel self._ssl_channel_credentials = None elif api_mtls_endpoint: - warnings.warn( - "api_mtls_endpoint and client_cert_source are deprecated", - DeprecationWarning, - ) - host = ( api_mtls_endpoint if ":" in api_mtls_endpoint @@ -165,12 +170,18 @@ def __init__( scopes=self.AUTH_SCOPES, quota_project_id=quota_project_id ) + if client_cert_source_for_mtls and not ssl_channel_credentials: + cert, key = client_cert_source_for_mtls() + self._ssl_channel_credentials = grpc.ssl_channel_credentials( + certificate_chain=cert, private_key=key + ) + # create a new channel. The provided one is ignored. self._grpc_channel = type(self).create_channel( host, credentials=credentials, credentials_file=credentials_file, - ssl_credentials=ssl_channel_credentials, + ssl_credentials=self._ssl_channel_credentials, scopes=scopes or self.AUTH_SCOPES, quota_project_id=quota_project_id, options=[ diff --git a/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/grpc_asyncio.py b/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/grpc_asyncio.py index 11229337..05440f84 100644 --- a/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/grpc_asyncio.py +++ b/google/cloud/datacatalog_v1beta1/services/data_catalog/transports/grpc_asyncio.py @@ -106,6 +106,7 @@ def __init__( api_mtls_endpoint: str = None, client_cert_source: Callable[[], Tuple[bytes, bytes]] = None, ssl_channel_credentials: grpc.ChannelCredentials = None, + client_cert_source_for_mtls: Callable[[], Tuple[bytes, bytes]] = None, quota_project_id=None, client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO, ) -> None: @@ -137,6 +138,10 @@ def __init__( ``api_mtls_endpoint`` is None. ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials for grpc channel. It is ignored if ``channel`` is provided. + client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]): + A callback to provide client certificate bytes and private key bytes, + both in PEM format. It is used to configure mutual TLS channel. It is + ignored if ``channel`` or ``ssl_channel_credentials`` is provided. quota_project_id (Optional[str]): An optional project to use for billing and quota. client_info (google.api_core.gapic_v1.client_info.ClientInfo): @@ -153,6 +158,11 @@ def __init__( """ self._ssl_channel_credentials = ssl_channel_credentials + if api_mtls_endpoint: + warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning) + if client_cert_source: + warnings.warn("client_cert_source is deprecated", DeprecationWarning) + if channel: # Sanity check: Ensure that channel and credentials are not both # provided. 
@@ -162,11 +172,6 @@ def __init__( self._grpc_channel = channel self._ssl_channel_credentials = None elif api_mtls_endpoint: - warnings.warn( - "api_mtls_endpoint and client_cert_source are deprecated", - DeprecationWarning, - ) - host = ( api_mtls_endpoint if ":" in api_mtls_endpoint @@ -210,12 +215,18 @@ def __init__( scopes=self.AUTH_SCOPES, quota_project_id=quota_project_id ) + if client_cert_source_for_mtls and not ssl_channel_credentials: + cert, key = client_cert_source_for_mtls() + self._ssl_channel_credentials = grpc.ssl_channel_credentials( + certificate_chain=cert, private_key=key + ) + # create a new channel. The provided one is ignored. self._grpc_channel = type(self).create_channel( host, credentials=credentials, credentials_file=credentials_file, - ssl_credentials=ssl_channel_credentials, + ssl_credentials=self._ssl_channel_credentials, scopes=scopes or self.AUTH_SCOPES, quota_project_id=quota_project_id, options=[ diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/client.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/client.py index 1b88fc10..7bbb2b41 100644 --- a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/client.py +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/client.py @@ -310,21 +310,17 @@ def __init__( util.strtobool(os.getenv("GOOGLE_API_USE_CLIENT_CERTIFICATE", "false")) ) - ssl_credentials = None + client_cert_source_func = None is_mtls = False if use_client_cert: if client_options.client_cert_source: - import grpc # type: ignore - - cert, key = client_options.client_cert_source() - ssl_credentials = grpc.ssl_channel_credentials( - certificate_chain=cert, private_key=key - ) is_mtls = True + client_cert_source_func = client_options.client_cert_source else: - creds = SslCredentials() - is_mtls = creds.is_mtls - ssl_credentials = creds.ssl_credentials if is_mtls else None + is_mtls = mtls.has_default_client_cert_source() + client_cert_source_func = ( + mtls.default_client_cert_source() if is_mtls else None + ) # Figure out which api endpoint to use. if client_options.api_endpoint is not None: @@ -367,7 +363,7 @@ def __init__( credentials_file=client_options.credentials_file, host=api_endpoint, scopes=client_options.scopes, - ssl_channel_credentials=ssl_credentials, + client_cert_source_for_mtls=client_cert_source_func, quota_project_id=client_options.quota_project_id, client_info=client_info, ) diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/grpc.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/grpc.py index 8d316d4b..d12db533 100644 --- a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/grpc.py +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/grpc.py @@ -61,6 +61,7 @@ def __init__( api_mtls_endpoint: str = None, client_cert_source: Callable[[], Tuple[bytes, bytes]] = None, ssl_channel_credentials: grpc.ChannelCredentials = None, + client_cert_source_for_mtls: Callable[[], Tuple[bytes, bytes]] = None, quota_project_id: Optional[str] = None, client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO, ) -> None: @@ -91,6 +92,10 @@ def __init__( ``api_mtls_endpoint`` is None. ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials for grpc channel. It is ignored if ``channel`` is provided. + client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]): + A callback to provide client certificate bytes and private key bytes, + both in PEM format. 
It is used to configure mutual TLS channel. It is + ignored if ``channel`` or ``ssl_channel_credentials`` is provided. quota_project_id (Optional[str]): An optional project to use for billing and quota. client_info (google.api_core.gapic_v1.client_info.ClientInfo): @@ -107,6 +112,11 @@ def __init__( """ self._ssl_channel_credentials = ssl_channel_credentials + if api_mtls_endpoint: + warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning) + if client_cert_source: + warnings.warn("client_cert_source is deprecated", DeprecationWarning) + if channel: # Sanity check: Ensure that channel and credentials are not both # provided. @@ -116,11 +126,6 @@ def __init__( self._grpc_channel = channel self._ssl_channel_credentials = None elif api_mtls_endpoint: - warnings.warn( - "api_mtls_endpoint and client_cert_source are deprecated", - DeprecationWarning, - ) - host = ( api_mtls_endpoint if ":" in api_mtls_endpoint @@ -164,12 +169,18 @@ def __init__( scopes=self.AUTH_SCOPES, quota_project_id=quota_project_id ) + if client_cert_source_for_mtls and not ssl_channel_credentials: + cert, key = client_cert_source_for_mtls() + self._ssl_channel_credentials = grpc.ssl_channel_credentials( + certificate_chain=cert, private_key=key + ) + # create a new channel. The provided one is ignored. self._grpc_channel = type(self).create_channel( host, credentials=credentials, credentials_file=credentials_file, - ssl_credentials=ssl_channel_credentials, + ssl_credentials=self._ssl_channel_credentials, scopes=scopes or self.AUTH_SCOPES, quota_project_id=quota_project_id, options=[ diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/grpc_asyncio.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/grpc_asyncio.py index eef5872a..e4e5790e 100644 --- a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/grpc_asyncio.py +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/transports/grpc_asyncio.py @@ -105,6 +105,7 @@ def __init__( api_mtls_endpoint: str = None, client_cert_source: Callable[[], Tuple[bytes, bytes]] = None, ssl_channel_credentials: grpc.ChannelCredentials = None, + client_cert_source_for_mtls: Callable[[], Tuple[bytes, bytes]] = None, quota_project_id=None, client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO, ) -> None: @@ -136,6 +137,10 @@ def __init__( ``api_mtls_endpoint`` is None. ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials for grpc channel. It is ignored if ``channel`` is provided. + client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]): + A callback to provide client certificate bytes and private key bytes, + both in PEM format. It is used to configure mutual TLS channel. It is + ignored if ``channel`` or ``ssl_channel_credentials`` is provided. quota_project_id (Optional[str]): An optional project to use for billing and quota. client_info (google.api_core.gapic_v1.client_info.ClientInfo): @@ -152,6 +157,11 @@ def __init__( """ self._ssl_channel_credentials = ssl_channel_credentials + if api_mtls_endpoint: + warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning) + if client_cert_source: + warnings.warn("client_cert_source is deprecated", DeprecationWarning) + if channel: # Sanity check: Ensure that channel and credentials are not both # provided. 
@@ -161,11 +171,6 @@ def __init__( self._grpc_channel = channel self._ssl_channel_credentials = None elif api_mtls_endpoint: - warnings.warn( - "api_mtls_endpoint and client_cert_source are deprecated", - DeprecationWarning, - ) - host = ( api_mtls_endpoint if ":" in api_mtls_endpoint @@ -209,12 +214,18 @@ def __init__( scopes=self.AUTH_SCOPES, quota_project_id=quota_project_id ) + if client_cert_source_for_mtls and not ssl_channel_credentials: + cert, key = client_cert_source_for_mtls() + self._ssl_channel_credentials = grpc.ssl_channel_credentials( + certificate_chain=cert, private_key=key + ) + # create a new channel. The provided one is ignored. self._grpc_channel = type(self).create_channel( host, credentials=credentials, credentials_file=credentials_file, - ssl_credentials=ssl_channel_credentials, + ssl_credentials=self._ssl_channel_credentials, scopes=scopes or self.AUTH_SCOPES, quota_project_id=quota_project_id, options=[ diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/client.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/client.py index 739c3020..59221c65 100644 --- a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/client.py +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/client.py @@ -294,21 +294,17 @@ def __init__( util.strtobool(os.getenv("GOOGLE_API_USE_CLIENT_CERTIFICATE", "false")) ) - ssl_credentials = None + client_cert_source_func = None is_mtls = False if use_client_cert: if client_options.client_cert_source: - import grpc # type: ignore - - cert, key = client_options.client_cert_source() - ssl_credentials = grpc.ssl_channel_credentials( - certificate_chain=cert, private_key=key - ) is_mtls = True + client_cert_source_func = client_options.client_cert_source else: - creds = SslCredentials() - is_mtls = creds.is_mtls - ssl_credentials = creds.ssl_credentials if is_mtls else None + is_mtls = mtls.has_default_client_cert_source() + client_cert_source_func = ( + mtls.default_client_cert_source() if is_mtls else None + ) # Figure out which api endpoint to use. if client_options.api_endpoint is not None: @@ -351,7 +347,7 @@ def __init__( credentials_file=client_options.credentials_file, host=api_endpoint, scopes=client_options.scopes, - ssl_channel_credentials=ssl_credentials, + client_cert_source_for_mtls=client_cert_source_func, quota_project_id=client_options.quota_project_id, client_info=client_info, ) diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/grpc.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/grpc.py index 943dcf5e..30911c24 100644 --- a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/grpc.py +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/grpc.py @@ -61,6 +61,7 @@ def __init__( api_mtls_endpoint: str = None, client_cert_source: Callable[[], Tuple[bytes, bytes]] = None, ssl_channel_credentials: grpc.ChannelCredentials = None, + client_cert_source_for_mtls: Callable[[], Tuple[bytes, bytes]] = None, quota_project_id: Optional[str] = None, client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO, ) -> None: @@ -91,6 +92,10 @@ def __init__( ``api_mtls_endpoint`` is None. ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials for grpc channel. It is ignored if ``channel`` is provided. 
+ client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]): + A callback to provide client certificate bytes and private key bytes, + both in PEM format. It is used to configure mutual TLS channel. It is + ignored if ``channel`` or ``ssl_channel_credentials`` is provided. quota_project_id (Optional[str]): An optional project to use for billing and quota. client_info (google.api_core.gapic_v1.client_info.ClientInfo): @@ -107,6 +112,11 @@ def __init__( """ self._ssl_channel_credentials = ssl_channel_credentials + if api_mtls_endpoint: + warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning) + if client_cert_source: + warnings.warn("client_cert_source is deprecated", DeprecationWarning) + if channel: # Sanity check: Ensure that channel and credentials are not both # provided. @@ -116,11 +126,6 @@ def __init__( self._grpc_channel = channel self._ssl_channel_credentials = None elif api_mtls_endpoint: - warnings.warn( - "api_mtls_endpoint and client_cert_source are deprecated", - DeprecationWarning, - ) - host = ( api_mtls_endpoint if ":" in api_mtls_endpoint @@ -164,12 +169,18 @@ def __init__( scopes=self.AUTH_SCOPES, quota_project_id=quota_project_id ) + if client_cert_source_for_mtls and not ssl_channel_credentials: + cert, key = client_cert_source_for_mtls() + self._ssl_channel_credentials = grpc.ssl_channel_credentials( + certificate_chain=cert, private_key=key + ) + # create a new channel. The provided one is ignored. self._grpc_channel = type(self).create_channel( host, credentials=credentials, credentials_file=credentials_file, - ssl_credentials=ssl_channel_credentials, + ssl_credentials=self._ssl_channel_credentials, scopes=scopes or self.AUTH_SCOPES, quota_project_id=quota_project_id, options=[ diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/grpc_asyncio.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/grpc_asyncio.py index 7d51d774..3a0d0aad 100644 --- a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/grpc_asyncio.py +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/transports/grpc_asyncio.py @@ -105,6 +105,7 @@ def __init__( api_mtls_endpoint: str = None, client_cert_source: Callable[[], Tuple[bytes, bytes]] = None, ssl_channel_credentials: grpc.ChannelCredentials = None, + client_cert_source_for_mtls: Callable[[], Tuple[bytes, bytes]] = None, quota_project_id=None, client_info: gapic_v1.client_info.ClientInfo = DEFAULT_CLIENT_INFO, ) -> None: @@ -136,6 +137,10 @@ def __init__( ``api_mtls_endpoint`` is None. ssl_channel_credentials (grpc.ChannelCredentials): SSL credentials for grpc channel. It is ignored if ``channel`` is provided. + client_cert_source_for_mtls (Optional[Callable[[], Tuple[bytes, bytes]]]): + A callback to provide client certificate bytes and private key bytes, + both in PEM format. It is used to configure mutual TLS channel. It is + ignored if ``channel`` or ``ssl_channel_credentials`` is provided. quota_project_id (Optional[str]): An optional project to use for billing and quota. 
client_info (google.api_core.gapic_v1.client_info.ClientInfo): @@ -152,6 +157,11 @@ def __init__( """ self._ssl_channel_credentials = ssl_channel_credentials + if api_mtls_endpoint: + warnings.warn("api_mtls_endpoint is deprecated", DeprecationWarning) + if client_cert_source: + warnings.warn("client_cert_source is deprecated", DeprecationWarning) + if channel: # Sanity check: Ensure that channel and credentials are not both # provided. @@ -161,11 +171,6 @@ def __init__( self._grpc_channel = channel self._ssl_channel_credentials = None elif api_mtls_endpoint: - warnings.warn( - "api_mtls_endpoint and client_cert_source are deprecated", - DeprecationWarning, - ) - host = ( api_mtls_endpoint if ":" in api_mtls_endpoint @@ -209,12 +214,18 @@ def __init__( scopes=self.AUTH_SCOPES, quota_project_id=quota_project_id ) + if client_cert_source_for_mtls and not ssl_channel_credentials: + cert, key = client_cert_source_for_mtls() + self._ssl_channel_credentials = grpc.ssl_channel_credentials( + certificate_chain=cert, private_key=key + ) + # create a new channel. The provided one is ignored. self._grpc_channel = type(self).create_channel( host, credentials=credentials, credentials_file=credentials_file, - ssl_credentials=ssl_channel_credentials, + ssl_credentials=self._ssl_channel_credentials, scopes=scopes or self.AUTH_SCOPES, quota_project_id=quota_project_id, options=[ diff --git a/synth.metadata b/synth.metadata index 16bf97bf..008c72bb 100644 --- a/synth.metadata +++ b/synth.metadata @@ -4,15 +4,15 @@ "git": { "name": ".", "remote": "https://github.com/googleapis/python-datacatalog.git", - "sha": "2dbb3ef062b52925ad421c5c469ed6e67671e878" + "sha": "00a5f47c852550de644960a258fcff29121cda91" } }, { "git": { "name": "googleapis", "remote": "https://github.com/googleapis/googleapis.git", - "sha": "520682435235d9c503983a360a2090025aa47cd1", - "internalRef": "350246057" + "sha": "6ef9eaea379fc1cc0355e06a5a20b594543ee693", + "internalRef": "355059873" } }, { diff --git a/tests/unit/gapic/datacatalog_v1/test_data_catalog.py b/tests/unit/gapic/datacatalog_v1/test_data_catalog.py index 3c2ea655..8fc96676 100644 --- a/tests/unit/gapic/datacatalog_v1/test_data_catalog.py +++ b/tests/unit/gapic/datacatalog_v1/test_data_catalog.py @@ -177,7 +177,7 @@ def test_data_catalog_client_client_options( credentials_file=None, host="squid.clam.whelk", scopes=None, - ssl_channel_credentials=None, + client_cert_source_for_mtls=None, quota_project_id=None, client_info=transports.base.DEFAULT_CLIENT_INFO, ) @@ -193,7 +193,7 @@ def test_data_catalog_client_client_options( credentials_file=None, host=client.DEFAULT_ENDPOINT, scopes=None, - ssl_channel_credentials=None, + client_cert_source_for_mtls=None, quota_project_id=None, client_info=transports.base.DEFAULT_CLIENT_INFO, ) @@ -209,7 +209,7 @@ def test_data_catalog_client_client_options( credentials_file=None, host=client.DEFAULT_MTLS_ENDPOINT, scopes=None, - ssl_channel_credentials=None, + client_cert_source_for_mtls=None, quota_project_id=None, client_info=transports.base.DEFAULT_CLIENT_INFO, ) @@ -237,7 +237,7 @@ def test_data_catalog_client_client_options( credentials_file=None, host=client.DEFAULT_ENDPOINT, scopes=None, - ssl_channel_credentials=None, + client_cert_source_for_mtls=None, quota_project_id="octopus", client_info=transports.base.DEFAULT_CLIENT_INFO, ) @@ -286,29 +286,25 @@ def test_data_catalog_client_mtls_env_auto( 
client_cert_source=client_cert_source_callback ) with mock.patch.object(transport_class, "__init__") as patched: - ssl_channel_creds = mock.Mock() - with mock.patch( - "grpc.ssl_channel_credentials", return_value=ssl_channel_creds - ): - patched.return_value = None - client = client_class(client_options=options) + patched.return_value = None + client = client_class(client_options=options) - if use_client_cert_env == "false": - expected_ssl_channel_creds = None - expected_host = client.DEFAULT_ENDPOINT - else: - expected_ssl_channel_creds = ssl_channel_creds - expected_host = client.DEFAULT_MTLS_ENDPOINT + if use_client_cert_env == "false": + expected_client_cert_source = None + expected_host = client.DEFAULT_ENDPOINT + else: + expected_client_cert_source = client_cert_source_callback + expected_host = client.DEFAULT_MTLS_ENDPOINT - patched.assert_called_once_with( - credentials=None, - credentials_file=None, - host=expected_host, - scopes=None, - ssl_channel_credentials=expected_ssl_channel_creds, - quota_project_id=None, - client_info=transports.base.DEFAULT_CLIENT_INFO, - ) + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=expected_host, + scopes=None, + client_cert_source_for_mtls=expected_client_cert_source, + quota_project_id=None, + client_info=transports.base.DEFAULT_CLIENT_INFO, + ) # Check the case ADC client cert is provided. Whether client cert is used depends on # GOOGLE_API_USE_CLIENT_CERTIFICATE value. @@ -317,66 +313,53 @@ def test_data_catalog_client_mtls_env_auto( ): with mock.patch.object(transport_class, "__init__") as patched: with mock.patch( - "google.auth.transport.grpc.SslCredentials.__init__", return_value=None + "google.auth.transport.mtls.has_default_client_cert_source", + return_value=True, ): with mock.patch( - "google.auth.transport.grpc.SslCredentials.is_mtls", - new_callable=mock.PropertyMock, - ) as is_mtls_mock: - with mock.patch( - "google.auth.transport.grpc.SslCredentials.ssl_credentials", - new_callable=mock.PropertyMock, - ) as ssl_credentials_mock: - if use_client_cert_env == "false": - is_mtls_mock.return_value = False - ssl_credentials_mock.return_value = None - expected_host = client.DEFAULT_ENDPOINT - expected_ssl_channel_creds = None - else: - is_mtls_mock.return_value = True - ssl_credentials_mock.return_value = mock.Mock() - expected_host = client.DEFAULT_MTLS_ENDPOINT - expected_ssl_channel_creds = ( - ssl_credentials_mock.return_value - ) - - patched.return_value = None - client = client_class() - patched.assert_called_once_with( - credentials=None, - credentials_file=None, - host=expected_host, - scopes=None, - ssl_channel_credentials=expected_ssl_channel_creds, - quota_project_id=None, - client_info=transports.base.DEFAULT_CLIENT_INFO, - ) + "google.auth.transport.mtls.default_client_cert_source", + return_value=client_cert_source_callback, + ): + if use_client_cert_env == "false": + expected_host = client.DEFAULT_ENDPOINT + expected_client_cert_source = None + else: + expected_host = client.DEFAULT_MTLS_ENDPOINT + expected_client_cert_source = client_cert_source_callback - # Check the case client_cert_source and ADC client cert are not provided. 
- with mock.patch.dict( - os.environ, {"GOOGLE_API_USE_CLIENT_CERTIFICATE": use_client_cert_env} - ): - with mock.patch.object(transport_class, "__init__") as patched: - with mock.patch( - "google.auth.transport.grpc.SslCredentials.__init__", return_value=None - ): - with mock.patch( - "google.auth.transport.grpc.SslCredentials.is_mtls", - new_callable=mock.PropertyMock, - ) as is_mtls_mock: - is_mtls_mock.return_value = False patched.return_value = None client = client_class() patched.assert_called_once_with( credentials=None, credentials_file=None, - host=client.DEFAULT_ENDPOINT, + host=expected_host, scopes=None, - ssl_channel_credentials=None, + client_cert_source_for_mtls=expected_client_cert_source, quota_project_id=None, client_info=transports.base.DEFAULT_CLIENT_INFO, ) + # Check the case client_cert_source and ADC client cert are not provided. + with mock.patch.dict( + os.environ, {"GOOGLE_API_USE_CLIENT_CERTIFICATE": use_client_cert_env} + ): + with mock.patch.object(transport_class, "__init__") as patched: + with mock.patch( + "google.auth.transport.mtls.has_default_client_cert_source", + return_value=False, + ): + patched.return_value = None + client = client_class() + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_ENDPOINT, + scopes=None, + client_cert_source_for_mtls=None, + quota_project_id=None, + client_info=transports.base.DEFAULT_CLIENT_INFO, + ) + @pytest.mark.parametrize( "client_class,transport_class,transport_name", @@ -402,7 +385,7 @@ def test_data_catalog_client_client_options_scopes( credentials_file=None, host=client.DEFAULT_ENDPOINT, scopes=["1", "2"], - ssl_channel_credentials=None, + client_cert_source_for_mtls=None, quota_project_id=None, client_info=transports.base.DEFAULT_CLIENT_INFO, ) @@ -432,7 +415,7 @@ def test_data_catalog_client_client_options_credentials_file( credentials_file="credentials.json", host=client.DEFAULT_ENDPOINT, scopes=None, - ssl_channel_credentials=None, + client_cert_source_for_mtls=None, quota_project_id=None, client_info=transports.base.DEFAULT_CLIENT_INFO, ) @@ -449,7 +432,7 @@ def test_data_catalog_client_client_options_from_dict(): credentials_file=None, host="squid.clam.whelk", scopes=None, - ssl_channel_credentials=None, + client_cert_source_for_mtls=None, quota_project_id=None, client_info=transports.base.DEFAULT_CLIENT_INFO, ) @@ -6731,6 +6714,48 @@ def test_data_catalog_transport_auth_adc(): ) +@pytest.mark.parametrize( + "transport_class", + [transports.DataCatalogGrpcTransport, transports.DataCatalogGrpcAsyncIOTransport], +) +def test_data_catalog_grpc_transport_client_cert_source_for_mtls(transport_class): + cred = credentials.AnonymousCredentials() + + # Check ssl_channel_credentials is used if provided. + with mock.patch.object(transport_class, "create_channel") as mock_create_channel: + mock_ssl_channel_creds = mock.Mock() + transport_class( + host="squid.clam.whelk", + credentials=cred, + ssl_channel_credentials=mock_ssl_channel_creds, + ) + mock_create_channel.assert_called_once_with( + "squid.clam.whelk:443", + credentials=cred, + credentials_file=None, + scopes=("https://www.googleapis.com/auth/cloud-platform",), + ssl_credentials=mock_ssl_channel_creds, + quota_project_id=None, + options=[ + ("grpc.max_send_message_length", -1), + ("grpc.max_receive_message_length", -1), + ], + ) + + # Check if ssl_channel_credentials is not provided, then client_cert_source_for_mtls + # is used. 
+ with mock.patch.object(transport_class, "create_channel", return_value=mock.Mock()): + with mock.patch("grpc.ssl_channel_credentials") as mock_ssl_cred: + transport_class( + credentials=cred, + client_cert_source_for_mtls=client_cert_source_callback, + ) + expected_cert, expected_key = client_cert_source_callback() + mock_ssl_cred.assert_called_once_with( + certificate_chain=expected_cert, private_key=expected_key + ) + + def test_data_catalog_host_no_port(): client = DataCatalogClient( credentials=credentials.AnonymousCredentials(), @@ -6775,6 +6800,8 @@ def test_data_catalog_grpc_asyncio_transport_channel(): assert transport._ssl_channel_credentials == None +# Remove this test when deprecated arguments (api_mtls_endpoint, client_cert_source) are +# removed from grpc/grpc_asyncio transport constructor. @pytest.mark.parametrize( "transport_class", [transports.DataCatalogGrpcTransport, transports.DataCatalogGrpcAsyncIOTransport], @@ -6822,6 +6849,8 @@ def test_data_catalog_transport_channel_mtls_with_client_cert_source(transport_c assert transport._ssl_channel_credentials == mock_ssl_cred +# Remove this test when deprecated arguments (api_mtls_endpoint, client_cert_source) are +# removed from grpc/grpc_asyncio transport constructor. @pytest.mark.parametrize( "transport_class", [transports.DataCatalogGrpcTransport, transports.DataCatalogGrpcAsyncIOTransport], diff --git a/tests/unit/gapic/datacatalog_v1beta1/test_data_catalog.py b/tests/unit/gapic/datacatalog_v1beta1/test_data_catalog.py index 4f9f630f..7191fc9b 100644 --- a/tests/unit/gapic/datacatalog_v1beta1/test_data_catalog.py +++ b/tests/unit/gapic/datacatalog_v1beta1/test_data_catalog.py @@ -179,7 +179,7 @@ def test_data_catalog_client_client_options( credentials_file=None, host="squid.clam.whelk", scopes=None, - ssl_channel_credentials=None, + client_cert_source_for_mtls=None, quota_project_id=None, client_info=transports.base.DEFAULT_CLIENT_INFO, ) @@ -195,7 +195,7 @@ def test_data_catalog_client_client_options( credentials_file=None, host=client.DEFAULT_ENDPOINT, scopes=None, - ssl_channel_credentials=None, + client_cert_source_for_mtls=None, quota_project_id=None, client_info=transports.base.DEFAULT_CLIENT_INFO, ) @@ -211,7 +211,7 @@ def test_data_catalog_client_client_options( credentials_file=None, host=client.DEFAULT_MTLS_ENDPOINT, scopes=None, - ssl_channel_credentials=None, + client_cert_source_for_mtls=None, quota_project_id=None, client_info=transports.base.DEFAULT_CLIENT_INFO, ) @@ -239,7 +239,7 @@ def test_data_catalog_client_client_options( credentials_file=None, host=client.DEFAULT_ENDPOINT, scopes=None, - ssl_channel_credentials=None, + client_cert_source_for_mtls=None, quota_project_id="octopus", client_info=transports.base.DEFAULT_CLIENT_INFO, ) @@ -288,29 +288,25 @@ def test_data_catalog_client_mtls_env_auto( client_cert_source=client_cert_source_callback ) with mock.patch.object(transport_class, "__init__") as patched: - ssl_channel_creds = mock.Mock() - with mock.patch( - "grpc.ssl_channel_credentials", return_value=ssl_channel_creds - ): - patched.return_value = None - client = client_class(client_options=options) + patched.return_value = None + client = client_class(client_options=options) - if use_client_cert_env == "false": - expected_ssl_channel_creds = None - expected_host = client.DEFAULT_ENDPOINT - else: - expected_ssl_channel_creds = ssl_channel_creds - expected_host = client.DEFAULT_MTLS_ENDPOINT + if use_client_cert_env == "false": + expected_client_cert_source = None + expected_host = 
client.DEFAULT_ENDPOINT + else: + expected_client_cert_source = client_cert_source_callback + expected_host = client.DEFAULT_MTLS_ENDPOINT - patched.assert_called_once_with( - credentials=None, - credentials_file=None, - host=expected_host, - scopes=None, - ssl_channel_credentials=expected_ssl_channel_creds, - quota_project_id=None, - client_info=transports.base.DEFAULT_CLIENT_INFO, - ) + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=expected_host, + scopes=None, + client_cert_source_for_mtls=expected_client_cert_source, + quota_project_id=None, + client_info=transports.base.DEFAULT_CLIENT_INFO, + ) # Check the case ADC client cert is provided. Whether client cert is used depends on # GOOGLE_API_USE_CLIENT_CERTIFICATE value. @@ -319,66 +315,53 @@ def test_data_catalog_client_mtls_env_auto( ): with mock.patch.object(transport_class, "__init__") as patched: with mock.patch( - "google.auth.transport.grpc.SslCredentials.__init__", return_value=None + "google.auth.transport.mtls.has_default_client_cert_source", + return_value=True, ): with mock.patch( - "google.auth.transport.grpc.SslCredentials.is_mtls", - new_callable=mock.PropertyMock, - ) as is_mtls_mock: - with mock.patch( - "google.auth.transport.grpc.SslCredentials.ssl_credentials", - new_callable=mock.PropertyMock, - ) as ssl_credentials_mock: - if use_client_cert_env == "false": - is_mtls_mock.return_value = False - ssl_credentials_mock.return_value = None - expected_host = client.DEFAULT_ENDPOINT - expected_ssl_channel_creds = None - else: - is_mtls_mock.return_value = True - ssl_credentials_mock.return_value = mock.Mock() - expected_host = client.DEFAULT_MTLS_ENDPOINT - expected_ssl_channel_creds = ( - ssl_credentials_mock.return_value - ) - - patched.return_value = None - client = client_class() - patched.assert_called_once_with( - credentials=None, - credentials_file=None, - host=expected_host, - scopes=None, - ssl_channel_credentials=expected_ssl_channel_creds, - quota_project_id=None, - client_info=transports.base.DEFAULT_CLIENT_INFO, - ) + "google.auth.transport.mtls.default_client_cert_source", + return_value=client_cert_source_callback, + ): + if use_client_cert_env == "false": + expected_host = client.DEFAULT_ENDPOINT + expected_client_cert_source = None + else: + expected_host = client.DEFAULT_MTLS_ENDPOINT + expected_client_cert_source = client_cert_source_callback - # Check the case client_cert_source and ADC client cert are not provided. - with mock.patch.dict( - os.environ, {"GOOGLE_API_USE_CLIENT_CERTIFICATE": use_client_cert_env} - ): - with mock.patch.object(transport_class, "__init__") as patched: - with mock.patch( - "google.auth.transport.grpc.SslCredentials.__init__", return_value=None - ): - with mock.patch( - "google.auth.transport.grpc.SslCredentials.is_mtls", - new_callable=mock.PropertyMock, - ) as is_mtls_mock: - is_mtls_mock.return_value = False patched.return_value = None client = client_class() patched.assert_called_once_with( credentials=None, credentials_file=None, - host=client.DEFAULT_ENDPOINT, + host=expected_host, scopes=None, - ssl_channel_credentials=None, + client_cert_source_for_mtls=expected_client_cert_source, quota_project_id=None, client_info=transports.base.DEFAULT_CLIENT_INFO, ) + # Check the case client_cert_source and ADC client cert are not provided. 
+ with mock.patch.dict( + os.environ, {"GOOGLE_API_USE_CLIENT_CERTIFICATE": use_client_cert_env} + ): + with mock.patch.object(transport_class, "__init__") as patched: + with mock.patch( + "google.auth.transport.mtls.has_default_client_cert_source", + return_value=False, + ): + patched.return_value = None + client = client_class() + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_ENDPOINT, + scopes=None, + client_cert_source_for_mtls=None, + quota_project_id=None, + client_info=transports.base.DEFAULT_CLIENT_INFO, + ) + @pytest.mark.parametrize( "client_class,transport_class,transport_name", @@ -404,7 +387,7 @@ def test_data_catalog_client_client_options_scopes( credentials_file=None, host=client.DEFAULT_ENDPOINT, scopes=["1", "2"], - ssl_channel_credentials=None, + client_cert_source_for_mtls=None, quota_project_id=None, client_info=transports.base.DEFAULT_CLIENT_INFO, ) @@ -434,7 +417,7 @@ def test_data_catalog_client_client_options_credentials_file( credentials_file="credentials.json", host=client.DEFAULT_ENDPOINT, scopes=None, - ssl_channel_credentials=None, + client_cert_source_for_mtls=None, quota_project_id=None, client_info=transports.base.DEFAULT_CLIENT_INFO, ) @@ -451,7 +434,7 @@ def test_data_catalog_client_client_options_from_dict(): credentials_file=None, host="squid.clam.whelk", scopes=None, - ssl_channel_credentials=None, + client_cert_source_for_mtls=None, quota_project_id=None, client_info=transports.base.DEFAULT_CLIENT_INFO, ) @@ -6726,6 +6709,48 @@ def test_data_catalog_transport_auth_adc(): ) +@pytest.mark.parametrize( + "transport_class", + [transports.DataCatalogGrpcTransport, transports.DataCatalogGrpcAsyncIOTransport], +) +def test_data_catalog_grpc_transport_client_cert_source_for_mtls(transport_class): + cred = credentials.AnonymousCredentials() + + # Check ssl_channel_credentials is used if provided. + with mock.patch.object(transport_class, "create_channel") as mock_create_channel: + mock_ssl_channel_creds = mock.Mock() + transport_class( + host="squid.clam.whelk", + credentials=cred, + ssl_channel_credentials=mock_ssl_channel_creds, + ) + mock_create_channel.assert_called_once_with( + "squid.clam.whelk:443", + credentials=cred, + credentials_file=None, + scopes=("https://www.googleapis.com/auth/cloud-platform",), + ssl_credentials=mock_ssl_channel_creds, + quota_project_id=None, + options=[ + ("grpc.max_send_message_length", -1), + ("grpc.max_receive_message_length", -1), + ], + ) + + # Check if ssl_channel_credentials is not provided, then client_cert_source_for_mtls + # is used. + with mock.patch.object(transport_class, "create_channel", return_value=mock.Mock()): + with mock.patch("grpc.ssl_channel_credentials") as mock_ssl_cred: + transport_class( + credentials=cred, + client_cert_source_for_mtls=client_cert_source_callback, + ) + expected_cert, expected_key = client_cert_source_callback() + mock_ssl_cred.assert_called_once_with( + certificate_chain=expected_cert, private_key=expected_key + ) + + def test_data_catalog_host_no_port(): client = DataCatalogClient( credentials=credentials.AnonymousCredentials(), @@ -6770,6 +6795,8 @@ def test_data_catalog_grpc_asyncio_transport_channel(): assert transport._ssl_channel_credentials == None +# Remove this test when deprecated arguments (api_mtls_endpoint, client_cert_source) are +# removed from grpc/grpc_asyncio transport constructor. 
@pytest.mark.parametrize( "transport_class", [transports.DataCatalogGrpcTransport, transports.DataCatalogGrpcAsyncIOTransport], @@ -6817,6 +6844,8 @@ def test_data_catalog_transport_channel_mtls_with_client_cert_source(transport_c assert transport._ssl_channel_credentials == mock_ssl_cred +# Remove this test when deprecated arguments (api_mtls_endpoint, client_cert_source) are +# removed from grpc/grpc_asyncio transport constructor. @pytest.mark.parametrize( "transport_class", [transports.DataCatalogGrpcTransport, transports.DataCatalogGrpcAsyncIOTransport], diff --git a/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager.py b/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager.py index a8e44f60..c5ed26ec 100644 --- a/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager.py +++ b/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager.py @@ -181,7 +181,7 @@ def test_policy_tag_manager_client_client_options( credentials_file=None, host="squid.clam.whelk", scopes=None, - ssl_channel_credentials=None, + client_cert_source_for_mtls=None, quota_project_id=None, client_info=transports.base.DEFAULT_CLIENT_INFO, ) @@ -197,7 +197,7 @@ def test_policy_tag_manager_client_client_options( credentials_file=None, host=client.DEFAULT_ENDPOINT, scopes=None, - ssl_channel_credentials=None, + client_cert_source_for_mtls=None, quota_project_id=None, client_info=transports.base.DEFAULT_CLIENT_INFO, ) @@ -213,7 +213,7 @@ def test_policy_tag_manager_client_client_options( credentials_file=None, host=client.DEFAULT_MTLS_ENDPOINT, scopes=None, - ssl_channel_credentials=None, + client_cert_source_for_mtls=None, quota_project_id=None, client_info=transports.base.DEFAULT_CLIENT_INFO, ) @@ -241,7 +241,7 @@ def test_policy_tag_manager_client_client_options( credentials_file=None, host=client.DEFAULT_ENDPOINT, scopes=None, - ssl_channel_credentials=None, + client_cert_source_for_mtls=None, quota_project_id="octopus", client_info=transports.base.DEFAULT_CLIENT_INFO, ) @@ -302,29 +302,25 @@ def test_policy_tag_manager_client_mtls_env_auto( client_cert_source=client_cert_source_callback ) with mock.patch.object(transport_class, "__init__") as patched: - ssl_channel_creds = mock.Mock() - with mock.patch( - "grpc.ssl_channel_credentials", return_value=ssl_channel_creds - ): - patched.return_value = None - client = client_class(client_options=options) + patched.return_value = None + client = client_class(client_options=options) - if use_client_cert_env == "false": - expected_ssl_channel_creds = None - expected_host = client.DEFAULT_ENDPOINT - else: - expected_ssl_channel_creds = ssl_channel_creds - expected_host = client.DEFAULT_MTLS_ENDPOINT + if use_client_cert_env == "false": + expected_client_cert_source = None + expected_host = client.DEFAULT_ENDPOINT + else: + expected_client_cert_source = client_cert_source_callback + expected_host = client.DEFAULT_MTLS_ENDPOINT - patched.assert_called_once_with( - credentials=None, - credentials_file=None, - host=expected_host, - scopes=None, - ssl_channel_credentials=expected_ssl_channel_creds, - quota_project_id=None, - client_info=transports.base.DEFAULT_CLIENT_INFO, - ) + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=expected_host, + scopes=None, + client_cert_source_for_mtls=expected_client_cert_source, + quota_project_id=None, + client_info=transports.base.DEFAULT_CLIENT_INFO, + ) # Check the case ADC client cert is provided. Whether client cert is used depends on # GOOGLE_API_USE_CLIENT_CERTIFICATE value. 
@@ -333,66 +329,53 @@ def test_policy_tag_manager_client_mtls_env_auto( ): with mock.patch.object(transport_class, "__init__") as patched: with mock.patch( - "google.auth.transport.grpc.SslCredentials.__init__", return_value=None + "google.auth.transport.mtls.has_default_client_cert_source", + return_value=True, ): with mock.patch( - "google.auth.transport.grpc.SslCredentials.is_mtls", - new_callable=mock.PropertyMock, - ) as is_mtls_mock: - with mock.patch( - "google.auth.transport.grpc.SslCredentials.ssl_credentials", - new_callable=mock.PropertyMock, - ) as ssl_credentials_mock: - if use_client_cert_env == "false": - is_mtls_mock.return_value = False - ssl_credentials_mock.return_value = None - expected_host = client.DEFAULT_ENDPOINT - expected_ssl_channel_creds = None - else: - is_mtls_mock.return_value = True - ssl_credentials_mock.return_value = mock.Mock() - expected_host = client.DEFAULT_MTLS_ENDPOINT - expected_ssl_channel_creds = ( - ssl_credentials_mock.return_value - ) - - patched.return_value = None - client = client_class() - patched.assert_called_once_with( - credentials=None, - credentials_file=None, - host=expected_host, - scopes=None, - ssl_channel_credentials=expected_ssl_channel_creds, - quota_project_id=None, - client_info=transports.base.DEFAULT_CLIENT_INFO, - ) + "google.auth.transport.mtls.default_client_cert_source", + return_value=client_cert_source_callback, + ): + if use_client_cert_env == "false": + expected_host = client.DEFAULT_ENDPOINT + expected_client_cert_source = None + else: + expected_host = client.DEFAULT_MTLS_ENDPOINT + expected_client_cert_source = client_cert_source_callback - # Check the case client_cert_source and ADC client cert are not provided. - with mock.patch.dict( - os.environ, {"GOOGLE_API_USE_CLIENT_CERTIFICATE": use_client_cert_env} - ): - with mock.patch.object(transport_class, "__init__") as patched: - with mock.patch( - "google.auth.transport.grpc.SslCredentials.__init__", return_value=None - ): - with mock.patch( - "google.auth.transport.grpc.SslCredentials.is_mtls", - new_callable=mock.PropertyMock, - ) as is_mtls_mock: - is_mtls_mock.return_value = False patched.return_value = None client = client_class() patched.assert_called_once_with( credentials=None, credentials_file=None, - host=client.DEFAULT_ENDPOINT, + host=expected_host, scopes=None, - ssl_channel_credentials=None, + client_cert_source_for_mtls=expected_client_cert_source, quota_project_id=None, client_info=transports.base.DEFAULT_CLIENT_INFO, ) + # Check the case client_cert_source and ADC client cert are not provided. 
+ with mock.patch.dict( + os.environ, {"GOOGLE_API_USE_CLIENT_CERTIFICATE": use_client_cert_env} + ): + with mock.patch.object(transport_class, "__init__") as patched: + with mock.patch( + "google.auth.transport.mtls.has_default_client_cert_source", + return_value=False, + ): + patched.return_value = None + client = client_class() + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_ENDPOINT, + scopes=None, + client_cert_source_for_mtls=None, + quota_project_id=None, + client_info=transports.base.DEFAULT_CLIENT_INFO, + ) + @pytest.mark.parametrize( "client_class,transport_class,transport_name", @@ -418,7 +401,7 @@ def test_policy_tag_manager_client_client_options_scopes( credentials_file=None, host=client.DEFAULT_ENDPOINT, scopes=["1", "2"], - ssl_channel_credentials=None, + client_cert_source_for_mtls=None, quota_project_id=None, client_info=transports.base.DEFAULT_CLIENT_INFO, ) @@ -448,7 +431,7 @@ def test_policy_tag_manager_client_client_options_credentials_file( credentials_file="credentials.json", host=client.DEFAULT_ENDPOINT, scopes=None, - ssl_channel_credentials=None, + client_cert_source_for_mtls=None, quota_project_id=None, client_info=transports.base.DEFAULT_CLIENT_INFO, ) @@ -467,7 +450,7 @@ def test_policy_tag_manager_client_client_options_from_dict(): credentials_file=None, host="squid.clam.whelk", scopes=None, - ssl_channel_credentials=None, + client_cert_source_for_mtls=None, quota_project_id=None, client_info=transports.base.DEFAULT_CLIENT_INFO, ) @@ -3605,6 +3588,51 @@ def test_policy_tag_manager_transport_auth_adc(): ) +@pytest.mark.parametrize( + "transport_class", + [ + transports.PolicyTagManagerGrpcTransport, + transports.PolicyTagManagerGrpcAsyncIOTransport, + ], +) +def test_policy_tag_manager_grpc_transport_client_cert_source_for_mtls(transport_class): + cred = credentials.AnonymousCredentials() + + # Check ssl_channel_credentials is used if provided. + with mock.patch.object(transport_class, "create_channel") as mock_create_channel: + mock_ssl_channel_creds = mock.Mock() + transport_class( + host="squid.clam.whelk", + credentials=cred, + ssl_channel_credentials=mock_ssl_channel_creds, + ) + mock_create_channel.assert_called_once_with( + "squid.clam.whelk:443", + credentials=cred, + credentials_file=None, + scopes=("https://www.googleapis.com/auth/cloud-platform",), + ssl_credentials=mock_ssl_channel_creds, + quota_project_id=None, + options=[ + ("grpc.max_send_message_length", -1), + ("grpc.max_receive_message_length", -1), + ], + ) + + # Check if ssl_channel_credentials is not provided, then client_cert_source_for_mtls + # is used. + with mock.patch.object(transport_class, "create_channel", return_value=mock.Mock()): + with mock.patch("grpc.ssl_channel_credentials") as mock_ssl_cred: + transport_class( + credentials=cred, + client_cert_source_for_mtls=client_cert_source_callback, + ) + expected_cert, expected_key = client_cert_source_callback() + mock_ssl_cred.assert_called_once_with( + certificate_chain=expected_cert, private_key=expected_key + ) + + def test_policy_tag_manager_host_no_port(): client = PolicyTagManagerClient( credentials=credentials.AnonymousCredentials(), @@ -3649,6 +3677,8 @@ def test_policy_tag_manager_grpc_asyncio_transport_channel(): assert transport._ssl_channel_credentials == None +# Remove this test when deprecated arguments (api_mtls_endpoint, client_cert_source) are +# removed from grpc/grpc_asyncio transport constructor. 
@pytest.mark.parametrize( "transport_class", [ @@ -3701,6 +3731,8 @@ def test_policy_tag_manager_transport_channel_mtls_with_client_cert_source( assert transport._ssl_channel_credentials == mock_ssl_cred +# Remove this test when deprecated arguments (api_mtls_endpoint, client_cert_source) are +# removed from grpc/grpc_asyncio transport constructor. @pytest.mark.parametrize( "transport_class", [ diff --git a/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager_serialization.py b/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager_serialization.py index 6a42e9d0..88c6a79b 100644 --- a/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager_serialization.py +++ b/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager_serialization.py @@ -193,7 +193,7 @@ def test_policy_tag_manager_serialization_client_client_options( credentials_file=None, host="squid.clam.whelk", scopes=None, - ssl_channel_credentials=None, + client_cert_source_for_mtls=None, quota_project_id=None, client_info=transports.base.DEFAULT_CLIENT_INFO, ) @@ -209,7 +209,7 @@ def test_policy_tag_manager_serialization_client_client_options( credentials_file=None, host=client.DEFAULT_ENDPOINT, scopes=None, - ssl_channel_credentials=None, + client_cert_source_for_mtls=None, quota_project_id=None, client_info=transports.base.DEFAULT_CLIENT_INFO, ) @@ -225,7 +225,7 @@ def test_policy_tag_manager_serialization_client_client_options( credentials_file=None, host=client.DEFAULT_MTLS_ENDPOINT, scopes=None, - ssl_channel_credentials=None, + client_cert_source_for_mtls=None, quota_project_id=None, client_info=transports.base.DEFAULT_CLIENT_INFO, ) @@ -253,7 +253,7 @@ def test_policy_tag_manager_serialization_client_client_options( credentials_file=None, host=client.DEFAULT_ENDPOINT, scopes=None, - ssl_channel_credentials=None, + client_cert_source_for_mtls=None, quota_project_id="octopus", client_info=transports.base.DEFAULT_CLIENT_INFO, ) @@ -314,29 +314,25 @@ def test_policy_tag_manager_serialization_client_mtls_env_auto( client_cert_source=client_cert_source_callback ) with mock.patch.object(transport_class, "__init__") as patched: - ssl_channel_creds = mock.Mock() - with mock.patch( - "grpc.ssl_channel_credentials", return_value=ssl_channel_creds - ): - patched.return_value = None - client = client_class(client_options=options) + patched.return_value = None + client = client_class(client_options=options) - if use_client_cert_env == "false": - expected_ssl_channel_creds = None - expected_host = client.DEFAULT_ENDPOINT - else: - expected_ssl_channel_creds = ssl_channel_creds - expected_host = client.DEFAULT_MTLS_ENDPOINT + if use_client_cert_env == "false": + expected_client_cert_source = None + expected_host = client.DEFAULT_ENDPOINT + else: + expected_client_cert_source = client_cert_source_callback + expected_host = client.DEFAULT_MTLS_ENDPOINT - patched.assert_called_once_with( - credentials=None, - credentials_file=None, - host=expected_host, - scopes=None, - ssl_channel_credentials=expected_ssl_channel_creds, - quota_project_id=None, - client_info=transports.base.DEFAULT_CLIENT_INFO, - ) + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=expected_host, + scopes=None, + client_cert_source_for_mtls=expected_client_cert_source, + quota_project_id=None, + client_info=transports.base.DEFAULT_CLIENT_INFO, + ) # Check the case ADC client cert is provided. Whether client cert is used depends on # GOOGLE_API_USE_CLIENT_CERTIFICATE value. 
@@ -345,66 +341,53 @@ def test_policy_tag_manager_serialization_client_mtls_env_auto( ): with mock.patch.object(transport_class, "__init__") as patched: with mock.patch( - "google.auth.transport.grpc.SslCredentials.__init__", return_value=None + "google.auth.transport.mtls.has_default_client_cert_source", + return_value=True, ): with mock.patch( - "google.auth.transport.grpc.SslCredentials.is_mtls", - new_callable=mock.PropertyMock, - ) as is_mtls_mock: - with mock.patch( - "google.auth.transport.grpc.SslCredentials.ssl_credentials", - new_callable=mock.PropertyMock, - ) as ssl_credentials_mock: - if use_client_cert_env == "false": - is_mtls_mock.return_value = False - ssl_credentials_mock.return_value = None - expected_host = client.DEFAULT_ENDPOINT - expected_ssl_channel_creds = None - else: - is_mtls_mock.return_value = True - ssl_credentials_mock.return_value = mock.Mock() - expected_host = client.DEFAULT_MTLS_ENDPOINT - expected_ssl_channel_creds = ( - ssl_credentials_mock.return_value - ) - - patched.return_value = None - client = client_class() - patched.assert_called_once_with( - credentials=None, - credentials_file=None, - host=expected_host, - scopes=None, - ssl_channel_credentials=expected_ssl_channel_creds, - quota_project_id=None, - client_info=transports.base.DEFAULT_CLIENT_INFO, - ) + "google.auth.transport.mtls.default_client_cert_source", + return_value=client_cert_source_callback, + ): + if use_client_cert_env == "false": + expected_host = client.DEFAULT_ENDPOINT + expected_client_cert_source = None + else: + expected_host = client.DEFAULT_MTLS_ENDPOINT + expected_client_cert_source = client_cert_source_callback - # Check the case client_cert_source and ADC client cert are not provided. - with mock.patch.dict( - os.environ, {"GOOGLE_API_USE_CLIENT_CERTIFICATE": use_client_cert_env} - ): - with mock.patch.object(transport_class, "__init__") as patched: - with mock.patch( - "google.auth.transport.grpc.SslCredentials.__init__", return_value=None - ): - with mock.patch( - "google.auth.transport.grpc.SslCredentials.is_mtls", - new_callable=mock.PropertyMock, - ) as is_mtls_mock: - is_mtls_mock.return_value = False patched.return_value = None client = client_class() patched.assert_called_once_with( credentials=None, credentials_file=None, - host=client.DEFAULT_ENDPOINT, + host=expected_host, scopes=None, - ssl_channel_credentials=None, + client_cert_source_for_mtls=expected_client_cert_source, quota_project_id=None, client_info=transports.base.DEFAULT_CLIENT_INFO, ) + # Check the case client_cert_source and ADC client cert are not provided. 
+ with mock.patch.dict( + os.environ, {"GOOGLE_API_USE_CLIENT_CERTIFICATE": use_client_cert_env} + ): + with mock.patch.object(transport_class, "__init__") as patched: + with mock.patch( + "google.auth.transport.mtls.has_default_client_cert_source", + return_value=False, + ): + patched.return_value = None + client = client_class() + patched.assert_called_once_with( + credentials=None, + credentials_file=None, + host=client.DEFAULT_ENDPOINT, + scopes=None, + client_cert_source_for_mtls=None, + quota_project_id=None, + client_info=transports.base.DEFAULT_CLIENT_INFO, + ) + @pytest.mark.parametrize( "client_class,transport_class,transport_name", @@ -434,7 +417,7 @@ def test_policy_tag_manager_serialization_client_client_options_scopes( credentials_file=None, host=client.DEFAULT_ENDPOINT, scopes=["1", "2"], - ssl_channel_credentials=None, + client_cert_source_for_mtls=None, quota_project_id=None, client_info=transports.base.DEFAULT_CLIENT_INFO, ) @@ -468,7 +451,7 @@ def test_policy_tag_manager_serialization_client_client_options_credentials_file credentials_file="credentials.json", host=client.DEFAULT_ENDPOINT, scopes=None, - ssl_channel_credentials=None, + client_cert_source_for_mtls=None, quota_project_id=None, client_info=transports.base.DEFAULT_CLIENT_INFO, ) @@ -487,7 +470,7 @@ def test_policy_tag_manager_serialization_client_client_options_from_dict(): credentials_file=None, host="squid.clam.whelk", scopes=None, - ssl_channel_credentials=None, + client_cert_source_for_mtls=None, quota_project_id=None, client_info=transports.base.DEFAULT_CLIENT_INFO, ) @@ -925,6 +908,53 @@ def test_policy_tag_manager_serialization_transport_auth_adc(): ) +@pytest.mark.parametrize( + "transport_class", + [ + transports.PolicyTagManagerSerializationGrpcTransport, + transports.PolicyTagManagerSerializationGrpcAsyncIOTransport, + ], +) +def test_policy_tag_manager_serialization_grpc_transport_client_cert_source_for_mtls( + transport_class, +): + cred = credentials.AnonymousCredentials() + + # Check ssl_channel_credentials is used if provided. + with mock.patch.object(transport_class, "create_channel") as mock_create_channel: + mock_ssl_channel_creds = mock.Mock() + transport_class( + host="squid.clam.whelk", + credentials=cred, + ssl_channel_credentials=mock_ssl_channel_creds, + ) + mock_create_channel.assert_called_once_with( + "squid.clam.whelk:443", + credentials=cred, + credentials_file=None, + scopes=("https://www.googleapis.com/auth/cloud-platform",), + ssl_credentials=mock_ssl_channel_creds, + quota_project_id=None, + options=[ + ("grpc.max_send_message_length", -1), + ("grpc.max_receive_message_length", -1), + ], + ) + + # Check if ssl_channel_credentials is not provided, then client_cert_source_for_mtls + # is used. 
+ with mock.patch.object(transport_class, "create_channel", return_value=mock.Mock()): + with mock.patch("grpc.ssl_channel_credentials") as mock_ssl_cred: + transport_class( + credentials=cred, + client_cert_source_for_mtls=client_cert_source_callback, + ) + expected_cert, expected_key = client_cert_source_callback() + mock_ssl_cred.assert_called_once_with( + certificate_chain=expected_cert, private_key=expected_key + ) + + def test_policy_tag_manager_serialization_host_no_port(): client = PolicyTagManagerSerializationClient( credentials=credentials.AnonymousCredentials(), @@ -969,6 +999,8 @@ def test_policy_tag_manager_serialization_grpc_asyncio_transport_channel(): assert transport._ssl_channel_credentials == None +# Remove this test when deprecated arguments (api_mtls_endpoint, client_cert_source) are +# removed from grpc/grpc_asyncio transport constructor. @pytest.mark.parametrize( "transport_class", [ @@ -1021,6 +1053,8 @@ def test_policy_tag_manager_serialization_transport_channel_mtls_with_client_cer assert transport._ssl_channel_credentials == mock_ssl_cred +# Remove this test when deprecated arguments (api_mtls_endpoint, client_cert_source) are +# removed from grpc/grpc_asyncio transport constructor. @pytest.mark.parametrize( "transport_class", [ From 2dd95220a49eee886c2da27248a2a5941474943a Mon Sep 17 00:00:00 2001 From: Yoshi Automation Bot Date: Wed, 3 Feb 2021 09:52:03 -0800 Subject: [PATCH 19/26] test: update to latest configs (#104) This PR was generated using Autosynth. :rainbow: Synth log will be available here: https://source.cloud.google.com/results/invocations/6490b6f9-79ee-41b3-8813-ec1cd161dafb/targets - [ ] To automatically regenerate this PR, check this box. Source-Link: https://github.com/googleapis/synthtool/commit/778d8beae28d6d87eb01fdc839a4b4d966ed2ebe Source-Link: https://github.com/googleapis/synthtool/commit/573f7655311b553a937f9123bee17bf78497db95 Source-Link: https://github.com/googleapis/synthtool/commit/ba960d730416fe05c50547e975ce79fcee52c671 --- .github/header-checker-lint.yml | 15 +++++++++++++++ .trampolinerc | 1 - noxfile.py | 1 + synth.metadata | 5 +++-- 4 files changed, 19 insertions(+), 3 deletions(-) create mode 100644 .github/header-checker-lint.yml diff --git a/.github/header-checker-lint.yml b/.github/header-checker-lint.yml new file mode 100644 index 00000000..fc281c05 --- /dev/null +++ b/.github/header-checker-lint.yml @@ -0,0 +1,15 @@ +{"allowedCopyrightHolders": ["Google LLC"], + "allowedLicenses": ["Apache-2.0", "MIT", "BSD-3"], + "ignoreFiles": ["**/requirements.txt", "**/requirements-test.txt"], + "sourceFileExtensions": [ + "ts", + "js", + "java", + "sh", + "Dockerfile", + "yaml", + "py", + "html", + "txt" + ] +} \ No newline at end of file diff --git a/.trampolinerc b/.trampolinerc index c7d663ae..383b6ec8 100644 --- a/.trampolinerc +++ b/.trampolinerc @@ -18,7 +18,6 @@ required_envvars+=( "STAGING_BUCKET" "V2_STAGING_BUCKET" - "NOX_SESSION" ) # Add env vars which are passed down into the container here. diff --git a/noxfile.py b/noxfile.py index 8fca72e4..52fd61ca 100644 --- a/noxfile.py +++ b/noxfile.py @@ -86,6 +86,7 @@ def default(session): session.install( "mock", "pytest", "pytest-cov", ) + session.install("-e", ".") # Run py.test against the unit tests. 
diff --git a/synth.metadata b/synth.metadata index 008c72bb..e56a72ab 100644 --- a/synth.metadata +++ b/synth.metadata @@ -19,14 +19,14 @@ "git": { "name": "synthtool", "remote": "https://github.com/googleapis/synthtool.git", - "sha": "56ddc68f36b32341e9f22c2c59b4ce6aa3ba635f" + "sha": "778d8beae28d6d87eb01fdc839a4b4d966ed2ebe" } }, { "git": { "name": "synthtool", "remote": "https://github.com/googleapis/synthtool.git", - "sha": "56ddc68f36b32341e9f22c2c59b4ce6aa3ba635f" + "sha": "778d8beae28d6d87eb01fdc839a4b4d966ed2ebe" } } ], @@ -58,6 +58,7 @@ ".github/ISSUE_TEMPLATE/feature_request.md", ".github/ISSUE_TEMPLATE/support_request.md", ".github/PULL_REQUEST_TEMPLATE.md", + ".github/header-checker-lint.yml", ".github/release-please.yml", ".github/snippet-bot.yml", ".gitignore", From 2f98f2244271d92f79fdb26103478166958b8c8a Mon Sep 17 00:00:00 2001 From: Ricardo Mendes Date: Tue, 16 Feb 2021 20:14:10 -0300 Subject: [PATCH 20/26] docs: fix `type_` attribute name in the migration guide (#113) The main goal of this PR is to fix the `entry.type_` attribute name, which is incorrect in the current sample due to the breaking changes introduced by version 3.0.0. I'm leveraging it to add blank lines between the **Breaking Change** blocks' titles and contents in order to improve readability. --- UPGRADING.md | 15 +++++++++------ 1 file changed, 9 insertions(+), 6 deletions(-) diff --git a/UPGRADING.md b/UPGRADING.md index 1fa25990..1046f20e 100644 --- a/UPGRADING.md +++ b/UPGRADING.md @@ -7,13 +7,15 @@ If you experience issues or have questions, please file an [issue](https://githu ## Supported Python Versions > **WARNING**: Breaking change -The 3.0.0 release requires Python 3.6+. +> +> The 3.0.0 release requires Python 3.6+. ## Method Calls > **WARNING**: Breaking change -Methods expect request objects. We provide a script that will convert most common use cases. +> +> Methods expect request objects. We provide a script that will convert most common use cases. * Install the library @@ -100,7 +102,7 @@ response = client.create_entry_group( parent=parent, entry_group_id=entry_group_id, entry_group=entry_group - ) # Make an API request. +) # Make an API request. ``` This call is invalid because it mixes `request` with a keyword argument `entry_group`. Executing this code @@ -121,8 +123,9 @@ response = client.create_entry_group( ## Enums and Types -> **WARNING**: Breaking change -The submodules `enums` and `types` have been removed. +> **WARNING**: Breaking changes +> +> The submodules `enums` and `types` have been removed; the `type` attributes were renamed to `type_` to avoid name collisions. **Before:** ```py @@ -136,7 +139,7 @@ entry.type = datacatalog_v1.enums.EntryType.FILESET ```py from google.cloud import datacatalog_v1 entry = datacatalog_v1.Entry() -entry.type = datacatalog_v1.EntryType.FILESET +entry.type_ = datacatalog_v1.EntryType.FILESET ``` ## Common Resource Path Helper Methods From 4bfa587903105cb3de2272618374df0b04156017 Mon Sep 17 00:00:00 2001 From: Shinichi TAMURA Date: Sat, 20 Feb 2021 01:56:03 +0900 Subject: [PATCH 21/26] docs: fix upgrade guide (#114) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Current upgrade guide is confusing especially when upgrading from v2.0.0 to v3.0.0 because some of the change is only introduced at v2.0.0. I fixed this. 
This is just a documentation fix rather than a code fix, so note that the following checks are not passed

---

Thank you for opening a Pull Request! Before submitting your PR, there are a few things you can do to make sure it goes smoothly:
- [ ] Make sure to open an issue as a [bug/issue](https://github.com/googleapis/python-datacatalog/issues/new/choose) before writing your code! That way we can discuss the change, evaluate designs, and agree on the general idea
- [ ] Ensure the tests and linter pass
- [ ] Code coverage does not decrease (if any source code was changed)
- [ ] Appropriate docs were updated (if necessary)

Fixes # 🦕

---

## Additional question

There are three ways to invoke API-calling methods: (1) pass keyword arguments, (2) pass a `request` argument with a request class, and (3) pass a `request` argument with a dict.

The auto-migration tool `fixup_datacatalog_v1_keywords.py` fixes code to use method (3) even if the methods accept keyword arguments. However, IMO, it's poorly typed and not linter/autocompletion-friendly. Actually, what is the recommended way? And is there a plan to remove support for method (1)?

|     | Available since | Typed | Note |
|:---|:---|:---|:---|
| (1) | v1 | ◎ (since v2) | |
| (2) | v2 | △ | Code looks redundant |
| (3) | v2 | ☓ | |

(1)
```py
response = client.create_entry_group(
    parent=parent, entry_group_id=entry_group_id, entry_group=entry_group
)
```

(2)
```py
response = client.create_entry_group(
    datacatalog.CreateEntryGroupRequest(
        parent=parent, entry_group_id=entry_group_id, entry_group=entry_group
    )
)
```

(3)
```py
response = client.create_entry_group(
    request={
        "parent": parent, "entry_group_id": entry_group_id, "entry_group": entry_group
    }
)
```
---
 UPGRADING.md | 38 +++++++++++++++++++++++++++++++++-----
 1 file changed, 33 insertions(+), 5 deletions(-)

diff --git a/UPGRADING.md b/UPGRADING.md
index 1046f20e..34f5be7e 100644
--- a/UPGRADING.md
+++ b/UPGRADING.md
@@ -1,18 +1,28 @@
 # 3.0.0 Migration Guide
 
-The 3.0 release of the `google-cloud-datacatalog` client is a significant upgrade based on a [next-gen code generator](https://github.com/googleapis/gapic-generator-python), and includes substantial interface changes. Existing code written for earlier versions of this library will likely require updates to use this version. This document describes the changes that have been made, and what you need to do to update your usage.
+This document describes the breaking changes that have been made, and what you need to do to update your usage.
+
+The most significant change was introduced in the v2.0 release based on a [next-gen code generator](https://github.com/googleapis/gapic-generator-python), and includes substantial interface changes. Existing code written for earlier versions of this library will likely require updates to use this version.
 
 If you experience issues or have questions, please file an [issue](https://github.com/googleapis/python-datacatalog/issues).
 
 ## Supported Python Versions
 
+| Applicable previous versions |
+|:-----------------------------|
+| v1.0.0 or lower |
+
 > **WARNING**: Breaking change
 >
-> The 3.0.0 release requires Python 3.6+.
+> The 2.0.0 release requires Python 3.6+.
 
 
 ## Method Calls
 
+| Applicable previous versions |
+|:-----------------------------|
+| v1.0.0 or lower |
+
 > **WARNING**: Breaking change
 >
 > Methods expect request objects.
We provide a script that will convert most common use cases.
 
@@ -62,7 +72,7 @@ In `google-cloud-datacatalog<=1.0.0`, parameters required by the API were positi
     ):
 ```
 
-In the 3.0.0 release, all methods have a single positional parameter `request`. Method docstrings indicate whether a parameter is required or optional.
+Since the 2.0.0 release, all methods have a single positional parameter `request`. Method docstrings indicate whether a parameter is required or optional.
 
 Some methods have additional keyword only parameters. The available parameters depend on the `google.api.method_signature` annotation specified by the API producer.
 
@@ -122,6 +132,9 @@ response = client.create_entry_group(
 
 ## Enums and Types
 
+| Applicable previous versions |
+|:-----------------------------|
+| v2.0.0 or lower |
 
 > **WARNING**: Breaking changes
 >
@@ -142,7 +155,22 @@ entry = datacatalog_v1.Entry()
 entry.type_ = datacatalog_v1.EntryType.FILESET
 ```
 
+The renamed attributes are:
+
+* `TagTemplateField.type` -> `TagTemplateField.type_`
+* `ColumnSchema.type` -> `ColumnSchema.type_`
+* `Entry.type` -> `Entry.type_`
+
 ## Common Resource Path Helper Methods
 
-The `location_path` method existing in `google-cloud-datacatalog<=1.0.0` was renamed to `common_location_path`.
-And more resource path helper methods were added: `common_billing_account_path`, `common_folder_path`, `common_organization_path`, and `common_project_path`.
+| Applicable previous versions |
+|:-----------------------------|
+| v1.0.0 or lower |
+
+The `location_path` method existing in `google-cloud-datacatalog<=1.0.0` was renamed to `common_location_path` in v3.0.0.
+
+If you are upgrading from v1.0.0 or lower, modify your code to use the new method name.
+
+If you are upgrading from v2.0.0 and constructing paths manually as described in the [previous upgrade guide](https://github.com/googleapis/python-datacatalog/blob/v2.0.0/UPGRADING.md#project-path-helper-methods), you can now use the `common_location_path` method.
+
+More resource path helper methods were also added: `common_billing_account_path`, `common_folder_path`, `common_organization_path`, and `common_project_path`.

From ec9d92555c4adaa97a3dfccb85f060cc86ed2747 Mon Sep 17 00:00:00 2001
From: Bu Sun Kim <8822365+busunkim96@users.noreply.github.com>
Date: Thu, 18 Mar 2021 03:24:01 -0600
Subject: [PATCH 22/26] chore: clean up synth.py (#117)

Clean up some replacements that are not needed.

Fixes #94.
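
For readers unfamiliar with synthtool, the `s.replace(...)` calls removed below are post-generation text substitutions that the synth script applies to the generated sources. A minimal sketch of the pattern, assuming the standard synthtool API; the file glob, search pattern, and replacement string here are placeholders for illustration, not the exact directives deleted in this PR:

```py
# Illustrative only: shape of a synthtool replacement directive.
# The glob, regex, and replacement below are hypothetical examples.
import synthtool as s

s.replace(
    "google/cloud/**/*.py",               # generated files to post-process
    r"# TODO\(generator\): placeholder",  # regex to search for
    "",                                   # text to substitute in its place
)
```

When the generator output no longer contains the text a directive was patching, the directive matches nothing and can be dropped, which is presumably why these replacements are being removed here.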
--- synth.py | 21 --------------------- 1 file changed, 21 deletions(-) diff --git a/synth.py b/synth.py index 7037d24b..ed4e1f6b 100644 --- a/synth.py +++ b/synth.py @@ -46,14 +46,6 @@ ], ) -# Fix docstring issue for classes with no summary line -s.replace( - "google/cloud/**/proto/*_pb2.py", - ''''__doc__': """Attributes:''', - '''"__doc__": """ - Attributes:''', -) - # ---------------------------------------------------------------------------- # Add templated files # ---------------------------------------------------------------------------- @@ -69,18 +61,5 @@ python.py_samples() -# Temporarily disable warnings due to -# https://github.com/googleapis/gapic-generator-python/issues/525 -s.replace("noxfile.py", '[\"\']-W[\"\']', '# "-W"') - -# ---------------------------------------------------------------------------- -# Samples templates -# ---------------------------------------------------------------------------- - -python.py_samples() - -# Temporarily disable warnings due to -# https://github.com/googleapis/gapic-generator-python/issues/525 -s.replace("noxfile.py", '[\"\']-W[\"\']', '# "-W"') s.shell.run(["nox", "-s", "blacken"], hide_output=False) From cde0673fbfb2289bc017268d3799eb428162f3d8 Mon Sep 17 00:00:00 2001 From: Yoshi Automation Bot Date: Thu, 18 Mar 2021 11:26:02 -0700 Subject: [PATCH 23/26] chore: update build templates (#120) This PR was generated using Autosynth. :rainbow: Synth log will be available here: https://source.cloud.google.com/results/invocations/207b67ab-fbf5-4761-ae84-bb5f23460acf/targets - [ ] To automatically regenerate this PR, check this box. Source-Link: https://github.com/googleapis/synthtool/commit/eda422b90c3dde4a872a13e6b78a8f802c40d0db Source-Link: https://github.com/googleapis/synthtool/commit/2c54c473779ea731128cea61a3a6c975a08a5378 Source-Link: https://github.com/googleapis/synthtool/commit/0780323da96d5a53925fe0547757181fe76e8f1e Source-Link: https://github.com/googleapis/synthtool/commit/d17674372e27fb8f23013935e794aa37502071aa Source-Link: https://github.com/googleapis/synthtool/commit/4679e7e415221f03ff2a71e3ffad75b9ec41d87e Source-Link: https://github.com/googleapis/synthtool/commit/33366574ffb9e11737b3547eb6f020ecae0536e8 Source-Link: https://github.com/googleapis/synthtool/commit/d1bb9173100f62c0cfc8f3138b62241e7f47ca6a Source-Link: https://github.com/googleapis/synthtool/commit/778d8beae28d6d87eb01fdc839a4b4d966ed2ebe --- .gitignore | 4 +++- .kokoro/build.sh | 10 ++++++++++ CONTRIBUTING.rst | 22 ++++++++++++++++++---- MANIFEST.in | 4 ++-- docs/UPGRADING.md | 0 noxfile.py | 25 ++++++++++++++++++++++--- renovate.json | 3 ++- synth.metadata | 6 +++--- 8 files changed, 60 insertions(+), 14 deletions(-) mode change 100644 => 120000 docs/UPGRADING.md diff --git a/.gitignore b/.gitignore index b9daa52f..b4243ced 100644 --- a/.gitignore +++ b/.gitignore @@ -50,8 +50,10 @@ docs.metadata # Virtual environment env/ + +# Test logs coverage.xml -sponge_log.xml +*sponge_log.xml # System test environment variables. 
system_tests/local_test_setup diff --git a/.kokoro/build.sh b/.kokoro/build.sh index d86a114e..a6220c8a 100755 --- a/.kokoro/build.sh +++ b/.kokoro/build.sh @@ -40,6 +40,16 @@ python3 -m pip uninstall --yes --quiet nox-automation python3 -m pip install --upgrade --quiet nox python3 -m nox --version +# If this is a continuous build, send the test log to the FlakyBot. +# See https://github.com/googleapis/repo-automation-bots/tree/master/packages/flakybot. +if [[ $KOKORO_BUILD_ARTIFACTS_SUBDIR = *"continuous"* ]]; then + cleanup() { + chmod +x $KOKORO_GFILE_DIR/linux_amd64/flakybot + $KOKORO_GFILE_DIR/linux_amd64/flakybot + } + trap cleanup EXIT HUP +fi + # If NOX_SESSION is set, it only runs the specified session, # otherwise run all the sessions. if [[ -n "${NOX_SESSION:-}" ]]; then diff --git a/CONTRIBUTING.rst b/CONTRIBUTING.rst index 1d7f7e5d..631f94f9 100644 --- a/CONTRIBUTING.rst +++ b/CONTRIBUTING.rst @@ -70,9 +70,14 @@ We use `nox `__ to instrument our tests. - To test your changes, run unit tests with ``nox``:: $ nox -s unit-2.7 - $ nox -s unit-3.7 + $ nox -s unit-3.8 $ ... +- Args to pytest can be passed through the nox command separated by a `--`. For + example, to run a single test:: + + $ nox -s unit-3.8 -- -k + .. note:: The unit tests and system tests are described in the @@ -93,8 +98,12 @@ On Debian/Ubuntu:: ************ Coding Style ************ +- We use the automatic code formatter ``black``. You can run it using + the nox session ``blacken``. This will eliminate many lint errors. Run via:: + + $ nox -s blacken -- PEP8 compliance, with exceptions defined in the linter configuration. +- PEP8 compliance is required, with exceptions defined in the linter configuration. If you have ``nox`` installed, you can test that you have not introduced any non-compliant code via:: @@ -133,13 +142,18 @@ Running System Tests - To run system tests, you can execute:: - $ nox -s system-3.7 + # Run all system tests + $ nox -s system-3.8 $ nox -s system-2.7 + # Run a single system test + $ nox -s system-3.8 -- -k + + .. note:: System tests are only configured to run under Python 2.7 and - Python 3.7. For expediency, we do not run them in older versions + Python 3.8. For expediency, we do not run them in older versions of Python 3. This alone will not run the tests. You'll need to change some local diff --git a/MANIFEST.in b/MANIFEST.in index e9e29d12..e783f4c6 100644 --- a/MANIFEST.in +++ b/MANIFEST.in @@ -16,10 +16,10 @@ # Generated by synthtool. DO NOT EDIT! 
include README.rst LICENSE -recursive-include google *.json *.proto +recursive-include google *.json *.proto py.typed recursive-include tests * global-exclude *.py[co] global-exclude __pycache__ # Exclude scripts for samples readmegen -prune scripts/readme-gen \ No newline at end of file +prune scripts/readme-gen diff --git a/docs/UPGRADING.md b/docs/UPGRADING.md deleted file mode 100644 index 01097c8c..00000000 --- a/docs/UPGRADING.md +++ /dev/null @@ -1 +0,0 @@ -../UPGRADING.md \ No newline at end of file diff --git a/docs/UPGRADING.md b/docs/UPGRADING.md new file mode 120000 index 00000000..01097c8c --- /dev/null +++ b/docs/UPGRADING.md @@ -0,0 +1 @@ +../UPGRADING.md \ No newline at end of file diff --git a/noxfile.py b/noxfile.py index 52fd61ca..9249c5b9 100644 --- a/noxfile.py +++ b/noxfile.py @@ -41,6 +41,9 @@ "docs", ] +# Error if a python version is missing +nox.options.error_on_missing_interpreters = True + @nox.session(python=DEFAULT_PYTHON_VERSION) def lint(session): @@ -93,6 +96,7 @@ def default(session): session.run( "py.test", "--quiet", + f"--junitxml=unit_{session.python}_sponge_log.xml", "--cov=google/cloud", "--cov=tests/unit", "--cov-append", @@ -122,6 +126,9 @@ def system(session): # Sanity check: Only run tests if the environment variable is set. if not os.environ.get("GOOGLE_APPLICATION_CREDENTIALS", ""): session.skip("Credentials must be set via environment variable") + # Install pyopenssl for mTLS testing. + if os.environ.get("GOOGLE_API_USE_CLIENT_CERTIFICATE", "false") == "true": + session.install("pyopenssl") system_test_exists = os.path.exists(system_test_path) system_test_folder_exists = os.path.exists(system_test_folder_path) @@ -141,9 +148,21 @@ def system(session): # Run py.test against the system tests. if system_test_exists: - session.run("py.test", "--quiet", system_test_path, *session.posargs) + session.run( + "py.test", + "--quiet", + f"--junitxml=system_{session.python}_sponge_log.xml", + system_test_path, + *session.posargs, + ) if system_test_folder_exists: - session.run("py.test", "--quiet", system_test_folder_path, *session.posargs) + session.run( + "py.test", + "--quiet", + f"--junitxml=system_{session.python}_sponge_log.xml", + system_test_folder_path, + *session.posargs, + ) @nox.session(python=DEFAULT_PYTHON_VERSION) @@ -169,7 +188,7 @@ def docs(session): shutil.rmtree(os.path.join("docs", "_build"), ignore_errors=True) session.run( "sphinx-build", - # # "-W", # warnings as errors + "-W", # warnings as errors "-T", # show full traceback on exception "-N", # no colors "-b", diff --git a/renovate.json b/renovate.json index 4fa94931..f08bc22c 100644 --- a/renovate.json +++ b/renovate.json @@ -1,5 +1,6 @@ { "extends": [ "config:base", ":preserveSemverRanges" - ] + ], + "ignorePaths": [".pre-commit-config.yaml"] } diff --git a/synth.metadata b/synth.metadata index e56a72ab..34e965a4 100644 --- a/synth.metadata +++ b/synth.metadata @@ -4,7 +4,7 @@ "git": { "name": ".", "remote": "https://github.com/googleapis/python-datacatalog.git", - "sha": "00a5f47c852550de644960a258fcff29121cda91" + "sha": "ec9d92555c4adaa97a3dfccb85f060cc86ed2747" } }, { @@ -19,14 +19,14 @@ "git": { "name": "synthtool", "remote": "https://github.com/googleapis/synthtool.git", - "sha": "778d8beae28d6d87eb01fdc839a4b4d966ed2ebe" + "sha": "eda422b90c3dde4a872a13e6b78a8f802c40d0db" } }, { "git": { "name": "synthtool", "remote": 
"https://github.com/googleapis/synthtool.git", - "sha": "778d8beae28d6d87eb01fdc839a4b4d966ed2ebe" + "sha": "eda422b90c3dde4a872a13e6b78a8f802c40d0db" } } ], From 931c1c460c40c7c1b7fe32ef16c48275be2e5a16 Mon Sep 17 00:00:00 2001 From: Yoshi Automation Bot Date: Fri, 19 Mar 2021 10:53:00 -0700 Subject: [PATCH 24/26] chore: upgrade gapic-generator-python to 0.42.2 (#119) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit * changes without context autosynth cannot find the source of changes triggered by earlier changes in this repository, or by version upgrades to tools such as linters. * feat: Add Pub/Sub endpoints for Cloud Channel API. PiperOrigin-RevId: 355059873 Source-Author: Google APIs Source-Date: Mon Feb 1 17:13:22 2021 -0800 Source-Repo: googleapis/googleapis Source-Sha: 6ef9eaea379fc1cc0355e06a5a20b594543ee693 Source-Link: https://github.com/googleapis/googleapis/commit/6ef9eaea379fc1cc0355e06a5a20b594543ee693 * fix: Fix constraint resource pattern annotation PiperOrigin-RevId: 355915985 Source-Author: Google APIs Source-Date: Fri Feb 5 13:27:16 2021 -0800 Source-Repo: googleapis/googleapis Source-Sha: 35ce99fec72979f6f9b2a5adae835a49648a3231 Source-Link: https://github.com/googleapis/googleapis/commit/35ce99fec72979f6f9b2a5adae835a49648a3231 * chore: update gapic-generator-python PiperOrigin-RevId: 355923884 Source-Author: Google APIs Source-Date: Fri Feb 5 14:04:52 2021 -0800 Source-Repo: googleapis/googleapis Source-Sha: 5e3dacee19405529b841b53797df799c2383536c Source-Link: https://github.com/googleapis/googleapis/commit/5e3dacee19405529b841b53797df799c2383536c * chore: remove non-existent package option in java_gapic_library rules for cloud APIs Committer: @miraleung PiperOrigin-RevId: 356328938 Source-Author: Google APIs Source-Date: Mon Feb 8 12:39:42 2021 -0800 Source-Repo: googleapis/googleapis Source-Sha: 78e0057d81c6969507bf1195b5aad8ac3e7feafd Source-Link: https://github.com/googleapis/googleapis/commit/78e0057d81c6969507bf1195b5aad8ac3e7feafd * feat: Make resolution status field available for error groups. Now callers can set the status of an error group by passing this to UpdateGroup. When not specified, it's treated like OPEN. feat: Make source location available for error groups created from GAE. PiperOrigin-RevId: 356330876 Source-Author: Google APIs Source-Date: Mon Feb 8 12:48:44 2021 -0800 Source-Repo: googleapis/googleapis Source-Sha: c1e59709c1d28795fe9b70eb479579556056bfad Source-Link: https://github.com/googleapis/googleapis/commit/c1e59709c1d28795fe9b70eb479579556056bfad * feat: added ApplySoftwareUpdate API docs: various clarifications, new documentation for ApplySoftwareUpdate chore: update proto annotations PiperOrigin-RevId: 356380191 Source-Author: Google APIs Source-Date: Mon Feb 8 16:30:59 2021 -0800 Source-Repo: googleapis/googleapis Source-Sha: 84cf54e45ed5970980ae868e0a1e5ad1266a8830 Source-Link: https://github.com/googleapis/googleapis/commit/84cf54e45ed5970980ae868e0a1e5ad1266a8830 * fix: Remove dependency on AppEngine's proto definitions. This also removes the source_references field. 
PiperOrigin-RevId: 356540804 Source-Author: Google APIs Source-Date: Tue Feb 9 10:53:59 2021 -0800 Source-Repo: googleapis/googleapis Source-Sha: 4f60776fe99f1fd8261b6a0493a5f5f4d7e8d969 Source-Link: https://github.com/googleapis/googleapis/commit/4f60776fe99f1fd8261b6a0493a5f5f4d7e8d969 * docs: Update rules for currency_code in budget_amount. PiperOrigin-RevId: 357051517 Source-Author: Google APIs Source-Date: Thu Feb 11 13:54:03 2021 -0800 Source-Repo: googleapis/googleapis Source-Sha: f3a60f63c13fb434745ea59b990a82d6ffc803b5 Source-Link: https://github.com/googleapis/googleapis/commit/f3a60f63c13fb434745ea59b990a82d6ffc803b5 * feat: Publish new fields to support Customer Managed Encryption Keys (CMEK) on the existing Cloud Bigtable service methods. PiperOrigin-RevId: 359130387 Source-Author: Google APIs Source-Date: Tue Feb 23 14:08:20 2021 -0800 Source-Repo: googleapis/googleapis Source-Sha: eabec5a21219401bad79e1cc7d900c1658aee5fd Source-Link: https://github.com/googleapis/googleapis/commit/eabec5a21219401bad79e1cc7d900c1658aee5fd * chore: update gapic-generator-python to 0.40.11 PiperOrigin-RevId: 359562873 Source-Author: Google APIs Source-Date: Thu Feb 25 10:52:32 2021 -0800 Source-Repo: googleapis/googleapis Source-Sha: 07932bb995e7dc91b43620ea8402c6668c7d102c Source-Link: https://github.com/googleapis/googleapis/commit/07932bb995e7dc91b43620ea8402c6668c7d102c * feat: add PHP µ-generator build targets to googleads Committer: @aohren PiperOrigin-RevId: 361555541 Source-Author: Google APIs Source-Date: Mon Mar 8 07:00:53 2021 -0800 Source-Repo: googleapis/googleapis Source-Sha: 65d71a60baf9650404a4d9d65f29e9ba8db490d1 Source-Link: https://github.com/googleapis/googleapis/commit/65d71a60baf9650404a4d9d65f29e9ba8db490d1 * chore: upgrade gapic-generator-python to 0.42.2 PiperOrigin-RevId: 361662015 Source-Author: Google APIs Source-Date: Mon Mar 8 14:47:18 2021 -0800 Source-Repo: googleapis/googleapis Source-Sha: 28a591963253d52ce3a25a918cafbdd9928de8cf Source-Link: https://github.com/googleapis/googleapis/commit/28a591963253d52ce3a25a918cafbdd9928de8cf * feat: added fallback option when restoring an agent docs: clarified experiment length PiperOrigin-RevId: 362090097 Source-Author: Google APIs Source-Date: Wed Mar 10 10:50:48 2021 -0800 Source-Repo: googleapis/googleapis Source-Sha: 4b16c60a8fffe213d3a5002f85696fef2b6a8172 Source-Link: https://github.com/googleapis/googleapis/commit/4b16c60a8fffe213d3a5002f85696fef2b6a8172 * feat:added mosaic layout docs:clarified alignment_period max value and updated IAM docs link PiperOrigin-RevId: 362979558 Source-Author: Google APIs Source-Date: Mon Mar 15 10:55:19 2021 -0700 Source-Repo: googleapis/googleapis Source-Sha: 0dafa3963ef6fcb8a7f5daaa4bec12adb04de518 Source-Link: https://github.com/googleapis/googleapis/commit/0dafa3963ef6fcb8a7f5daaa4bec12adb04de518 * fix!: remove rpc or fields that are unintended to release fix!: remove StreamingAnalyzeContent, CreateCallMatcher, ListCallMatchers, DeleteCallMatcher rpc from v2/v2beta1 fix!: remove `input_audio` field from AnalyzeContentRequest from v2/v2beta1 fix!: remove proto message CreateCallMatcherRequest, 
CreateCallMatcherResponse, ListCallMatchersRequest, ListCallMatchersResponse, DeleteCallMatcherRequest, DeleteCallMatcherResponse, CallMatcher, StreamingAnalyzeContentRequest, StreamingAnalyzeContentResponse, AudioInput from v2/v2beta1, TelephonyDtmfEvents, TelephonyDtmf from v2 Committer: @sheimi PiperOrigin-RevId: 363762006 Source-Author: Google APIs Source-Date: Thu Mar 18 15:37:05 2021 -0700 Source-Repo: googleapis/googleapis Source-Sha: 8d17d8fafbb87ac64bb3179b99ac34ed41375a51 Source-Link: https://github.com/googleapis/googleapis/commit/8d17d8fafbb87ac64bb3179b99ac34ed41375a51 --- .../services/data_catalog/async_client.py | 32 +- .../services/data_catalog/client.py | 29 +- .../services/data_catalog/pagers.py | 11 +- google/cloud/datacatalog_v1/types/__init__.py | 148 +++--- google/cloud/datacatalog_v1beta1/__init__.py | 4 +- .../services/data_catalog/async_client.py | 32 +- .../services/data_catalog/client.py | 29 +- .../services/data_catalog/pagers.py | 11 +- .../policy_tag_manager/async_client.py | 32 +- .../services/policy_tag_manager/client.py | 21 +- .../services/policy_tag_manager/pagers.py | 11 +- .../async_client.py | 36 +- .../datacatalog_v1beta1/types/__init__.py | 232 ++++----- synth.metadata | 6 +- tests/unit/gapic/datacatalog_v1/__init__.py | 15 + .../gapic/datacatalog_v1/test_data_catalog.py | 464 +++++++++++++++++- .../gapic/datacatalog_v1beta1/__init__.py | 15 + .../datacatalog_v1beta1/test_data_catalog.py | 464 +++++++++++++++++- .../test_policy_tag_manager.py | 226 ++++++++- .../test_policy_tag_manager_serialization.py | 49 +- 20 files changed, 1623 insertions(+), 244 deletions(-) diff --git a/google/cloud/datacatalog_v1/services/data_catalog/async_client.py b/google/cloud/datacatalog_v1/services/data_catalog/async_client.py index dd7e759f..befe47ea 100644 --- a/google/cloud/datacatalog_v1/services/data_catalog/async_client.py +++ b/google/cloud/datacatalog_v1/services/data_catalog/async_client.py @@ -94,8 +94,36 @@ class DataCatalogAsyncClient: DataCatalogClient.parse_common_location_path ) - from_service_account_info = DataCatalogClient.from_service_account_info - from_service_account_file = DataCatalogClient.from_service_account_file + @classmethod + def from_service_account_info(cls, info: dict, *args, **kwargs): + """Creates an instance of this client using the provided credentials info. + + Args: + info (dict): The service account private key info. + args: Additional arguments to pass to the constructor. + kwargs: Additional arguments to pass to the constructor. + + Returns: + DataCatalogAsyncClient: The constructed client. + """ + return DataCatalogClient.from_service_account_info.__func__(DataCatalogAsyncClient, info, *args, **kwargs) # type: ignore + + @classmethod + def from_service_account_file(cls, filename: str, *args, **kwargs): + """Creates an instance of this client using the provided credentials + file. + + Args: + filename (str): The path to the service account private key json + file. + args: Additional arguments to pass to the constructor. + kwargs: Additional arguments to pass to the constructor. + + Returns: + DataCatalogAsyncClient: The constructed client. 
+ """ + return DataCatalogClient.from_service_account_file.__func__(DataCatalogAsyncClient, filename, *args, **kwargs) # type: ignore + from_service_account_json = from_service_account_file @property diff --git a/google/cloud/datacatalog_v1/services/data_catalog/client.py b/google/cloud/datacatalog_v1/services/data_catalog/client.py index acc03c3c..ea8551d4 100644 --- a/google/cloud/datacatalog_v1/services/data_catalog/client.py +++ b/google/cloud/datacatalog_v1/services/data_catalog/client.py @@ -2795,13 +2795,16 @@ def set_iam_policy( "the individual field arguments should be set." ) - # The request isn't a proto-plus wrapped type, - # so it must be constructed via keyword expansion. if isinstance(request, dict): + # The request isn't a proto-plus wrapped type, + # so it must be constructed via keyword expansion. request = iam_policy.SetIamPolicyRequest(**request) - elif not request: - request = iam_policy.SetIamPolicyRequest(resource=resource,) + # Null request, just make one. + request = iam_policy.SetIamPolicyRequest() + + if resource is not None: + request.resource = resource # Wrap the RPC method; this adds retry and timeout information, # and friendly error handling. @@ -2939,13 +2942,16 @@ def get_iam_policy( "the individual field arguments should be set." ) - # The request isn't a proto-plus wrapped type, - # so it must be constructed via keyword expansion. if isinstance(request, dict): + # The request isn't a proto-plus wrapped type, + # so it must be constructed via keyword expansion. request = iam_policy.GetIamPolicyRequest(**request) - elif not request: - request = iam_policy.GetIamPolicyRequest(resource=resource,) + # Null request, just make one. + request = iam_policy.GetIamPolicyRequest() + + if resource is not None: + request.resource = resource # Wrap the RPC method; this adds retry and timeout information, # and friendly error handling. @@ -3003,10 +3009,13 @@ def test_iam_permissions( """ # Create or coerce a protobuf request object. - # The request isn't a proto-plus wrapped type, - # so it must be constructed via keyword expansion. if isinstance(request, dict): + # The request isn't a proto-plus wrapped type, + # so it must be constructed via keyword expansion. request = iam_policy.TestIamPermissionsRequest(**request) + elif not request: + # Null request, just make one. + request = iam_policy.TestIamPermissionsRequest() # Wrap the RPC method; this adds retry and timeout information, # and friendly error handling. diff --git a/google/cloud/datacatalog_v1/services/data_catalog/pagers.py b/google/cloud/datacatalog_v1/services/data_catalog/pagers.py index a5ce7581..7ce770e2 100644 --- a/google/cloud/datacatalog_v1/services/data_catalog/pagers.py +++ b/google/cloud/datacatalog_v1/services/data_catalog/pagers.py @@ -15,7 +15,16 @@ # limitations under the License. # -from typing import Any, AsyncIterable, Awaitable, Callable, Iterable, Sequence, Tuple +from typing import ( + Any, + AsyncIterable, + Awaitable, + Callable, + Iterable, + Sequence, + Tuple, + Optional, +) from google.cloud.datacatalog_v1.types import datacatalog from google.cloud.datacatalog_v1.types import search diff --git a/google/cloud/datacatalog_v1/types/__init__.py b/google/cloud/datacatalog_v1/types/__init__.py index de273a2c..fc60cdfe 100644 --- a/google/cloud/datacatalog_v1/types/__init__.py +++ b/google/cloud/datacatalog_v1/types/__init__.py @@ -15,115 +15,115 @@ # limitations under the License. 
# -from .timestamps import SystemTimestamps +from .datacatalog import ( + CreateEntryGroupRequest, + CreateEntryRequest, + CreateTagRequest, + CreateTagTemplateFieldRequest, + CreateTagTemplateRequest, + DeleteEntryGroupRequest, + DeleteEntryRequest, + DeleteTagRequest, + DeleteTagTemplateFieldRequest, + DeleteTagTemplateRequest, + Entry, + EntryGroup, + GetEntryGroupRequest, + GetEntryRequest, + GetTagTemplateRequest, + ListEntriesRequest, + ListEntriesResponse, + ListEntryGroupsRequest, + ListEntryGroupsResponse, + ListTagsRequest, + ListTagsResponse, + LookupEntryRequest, + RenameTagTemplateFieldRequest, + SearchCatalogRequest, + SearchCatalogResponse, + UpdateEntryGroupRequest, + UpdateEntryRequest, + UpdateTagRequest, + UpdateTagTemplateFieldRequest, + UpdateTagTemplateRequest, + EntryType, +) from .gcs_fileset_spec import ( GcsFilesetSpec, GcsFileSpec, ) from .schema import ( - Schema, ColumnSchema, + Schema, ) from .search import ( SearchCatalogResult, SearchResultType, ) from .table_spec import ( + BigQueryDateShardedSpec, BigQueryTableSpec, - ViewSpec, TableSpec, - BigQueryDateShardedSpec, + ViewSpec, TableSourceType, ) from .tags import ( + FieldType, Tag, TagField, TagTemplate, TagTemplateField, - FieldType, -) -from .datacatalog import ( - SearchCatalogRequest, - SearchCatalogResponse, - CreateEntryGroupRequest, - UpdateEntryGroupRequest, - GetEntryGroupRequest, - DeleteEntryGroupRequest, - ListEntryGroupsRequest, - ListEntryGroupsResponse, - CreateEntryRequest, - UpdateEntryRequest, - DeleteEntryRequest, - GetEntryRequest, - LookupEntryRequest, - Entry, - EntryGroup, - CreateTagTemplateRequest, - GetTagTemplateRequest, - UpdateTagTemplateRequest, - DeleteTagTemplateRequest, - CreateTagRequest, - UpdateTagRequest, - DeleteTagRequest, - CreateTagTemplateFieldRequest, - UpdateTagTemplateFieldRequest, - RenameTagTemplateFieldRequest, - DeleteTagTemplateFieldRequest, - ListTagsRequest, - ListTagsResponse, - ListEntriesRequest, - ListEntriesResponse, - EntryType, ) +from .timestamps import SystemTimestamps __all__ = ( "IntegratedSystem", - "SystemTimestamps", + "CreateEntryGroupRequest", + "CreateEntryRequest", + "CreateTagRequest", + "CreateTagTemplateFieldRequest", + "CreateTagTemplateRequest", + "DeleteEntryGroupRequest", + "DeleteEntryRequest", + "DeleteTagRequest", + "DeleteTagTemplateFieldRequest", + "DeleteTagTemplateRequest", + "Entry", + "EntryGroup", + "GetEntryGroupRequest", + "GetEntryRequest", + "GetTagTemplateRequest", + "ListEntriesRequest", + "ListEntriesResponse", + "ListEntryGroupsRequest", + "ListEntryGroupsResponse", + "ListTagsRequest", + "ListTagsResponse", + "LookupEntryRequest", + "RenameTagTemplateFieldRequest", + "SearchCatalogRequest", + "SearchCatalogResponse", + "UpdateEntryGroupRequest", + "UpdateEntryRequest", + "UpdateTagRequest", + "UpdateTagTemplateFieldRequest", + "UpdateTagTemplateRequest", + "EntryType", "GcsFilesetSpec", "GcsFileSpec", - "Schema", "ColumnSchema", + "Schema", "SearchCatalogResult", "SearchResultType", + "BigQueryDateShardedSpec", "BigQueryTableSpec", - "ViewSpec", "TableSpec", - "BigQueryDateShardedSpec", + "ViewSpec", "TableSourceType", + "FieldType", "Tag", "TagField", "TagTemplate", "TagTemplateField", - "FieldType", - "SearchCatalogRequest", - "SearchCatalogResponse", - "CreateEntryGroupRequest", - "UpdateEntryGroupRequest", - "GetEntryGroupRequest", - "DeleteEntryGroupRequest", - "ListEntryGroupsRequest", - "ListEntryGroupsResponse", - "CreateEntryRequest", - "UpdateEntryRequest", - "DeleteEntryRequest", - "GetEntryRequest", 
- "LookupEntryRequest", - "Entry", - "EntryGroup", - "CreateTagTemplateRequest", - "GetTagTemplateRequest", - "UpdateTagTemplateRequest", - "DeleteTagTemplateRequest", - "CreateTagRequest", - "UpdateTagRequest", - "DeleteTagRequest", - "CreateTagTemplateFieldRequest", - "UpdateTagTemplateFieldRequest", - "RenameTagTemplateFieldRequest", - "DeleteTagTemplateFieldRequest", - "ListTagsRequest", - "ListTagsResponse", - "ListEntriesRequest", - "ListEntriesResponse", - "EntryType", + "SystemTimestamps", ) diff --git a/google/cloud/datacatalog_v1beta1/__init__.py b/google/cloud/datacatalog_v1beta1/__init__.py index be0bdd8e..8bc01583 100644 --- a/google/cloud/datacatalog_v1beta1/__init__.py +++ b/google/cloud/datacatalog_v1beta1/__init__.py @@ -140,7 +140,7 @@ "ListTaxonomiesResponse", "LookupEntryRequest", "PolicyTag", - "PolicyTagManagerClient", + "PolicyTagManagerSerializationClient", "RenameTagTemplateFieldRequest", "Schema", "SearchCatalogRequest", @@ -165,5 +165,5 @@ "UpdateTagTemplateRequest", "UpdateTaxonomyRequest", "ViewSpec", - "PolicyTagManagerSerializationClient", + "PolicyTagManagerClient", ) diff --git a/google/cloud/datacatalog_v1beta1/services/data_catalog/async_client.py b/google/cloud/datacatalog_v1beta1/services/data_catalog/async_client.py index 3a99aa56..c937d527 100644 --- a/google/cloud/datacatalog_v1beta1/services/data_catalog/async_client.py +++ b/google/cloud/datacatalog_v1beta1/services/data_catalog/async_client.py @@ -94,8 +94,36 @@ class DataCatalogAsyncClient: DataCatalogClient.parse_common_location_path ) - from_service_account_info = DataCatalogClient.from_service_account_info - from_service_account_file = DataCatalogClient.from_service_account_file + @classmethod + def from_service_account_info(cls, info: dict, *args, **kwargs): + """Creates an instance of this client using the provided credentials info. + + Args: + info (dict): The service account private key info. + args: Additional arguments to pass to the constructor. + kwargs: Additional arguments to pass to the constructor. + + Returns: + DataCatalogAsyncClient: The constructed client. + """ + return DataCatalogClient.from_service_account_info.__func__(DataCatalogAsyncClient, info, *args, **kwargs) # type: ignore + + @classmethod + def from_service_account_file(cls, filename: str, *args, **kwargs): + """Creates an instance of this client using the provided credentials + file. + + Args: + filename (str): The path to the service account private key json + file. + args: Additional arguments to pass to the constructor. + kwargs: Additional arguments to pass to the constructor. + + Returns: + DataCatalogAsyncClient: The constructed client. + """ + return DataCatalogClient.from_service_account_file.__func__(DataCatalogAsyncClient, filename, *args, **kwargs) # type: ignore + from_service_account_json = from_service_account_file @property diff --git a/google/cloud/datacatalog_v1beta1/services/data_catalog/client.py b/google/cloud/datacatalog_v1beta1/services/data_catalog/client.py index d7441888..28d471aa 100644 --- a/google/cloud/datacatalog_v1beta1/services/data_catalog/client.py +++ b/google/cloud/datacatalog_v1beta1/services/data_catalog/client.py @@ -2787,13 +2787,16 @@ def set_iam_policy( "the individual field arguments should be set." ) - # The request isn't a proto-plus wrapped type, - # so it must be constructed via keyword expansion. if isinstance(request, dict): + # The request isn't a proto-plus wrapped type, + # so it must be constructed via keyword expansion. 
request = iam_policy.SetIamPolicyRequest(**request) - elif not request: - request = iam_policy.SetIamPolicyRequest(resource=resource,) + # Null request, just make one. + request = iam_policy.SetIamPolicyRequest() + + if resource is not None: + request.resource = resource # Wrap the RPC method; this adds retry and timeout information, # and friendly error handling. @@ -2931,13 +2934,16 @@ def get_iam_policy( "the individual field arguments should be set." ) - # The request isn't a proto-plus wrapped type, - # so it must be constructed via keyword expansion. if isinstance(request, dict): + # The request isn't a proto-plus wrapped type, + # so it must be constructed via keyword expansion. request = iam_policy.GetIamPolicyRequest(**request) - elif not request: - request = iam_policy.GetIamPolicyRequest(resource=resource,) + # Null request, just make one. + request = iam_policy.GetIamPolicyRequest() + + if resource is not None: + request.resource = resource # Wrap the RPC method; this adds retry and timeout information, # and friendly error handling. @@ -2995,10 +3001,13 @@ def test_iam_permissions( """ # Create or coerce a protobuf request object. - # The request isn't a proto-plus wrapped type, - # so it must be constructed via keyword expansion. if isinstance(request, dict): + # The request isn't a proto-plus wrapped type, + # so it must be constructed via keyword expansion. request = iam_policy.TestIamPermissionsRequest(**request) + elif not request: + # Null request, just make one. + request = iam_policy.TestIamPermissionsRequest() # Wrap the RPC method; this adds retry and timeout information, # and friendly error handling. diff --git a/google/cloud/datacatalog_v1beta1/services/data_catalog/pagers.py b/google/cloud/datacatalog_v1beta1/services/data_catalog/pagers.py index 17b6f5ed..9cd6e4d7 100644 --- a/google/cloud/datacatalog_v1beta1/services/data_catalog/pagers.py +++ b/google/cloud/datacatalog_v1beta1/services/data_catalog/pagers.py @@ -15,7 +15,16 @@ # limitations under the License. # -from typing import Any, AsyncIterable, Awaitable, Callable, Iterable, Sequence, Tuple +from typing import ( + Any, + AsyncIterable, + Awaitable, + Callable, + Iterable, + Sequence, + Tuple, + Optional, +) from google.cloud.datacatalog_v1beta1.types import datacatalog from google.cloud.datacatalog_v1beta1.types import search diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/async_client.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/async_client.py index 61f9daab..7f0cbecc 100644 --- a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/async_client.py +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/async_client.py @@ -82,8 +82,36 @@ class PolicyTagManagerAsyncClient: PolicyTagManagerClient.parse_common_location_path ) - from_service_account_info = PolicyTagManagerClient.from_service_account_info - from_service_account_file = PolicyTagManagerClient.from_service_account_file + @classmethod + def from_service_account_info(cls, info: dict, *args, **kwargs): + """Creates an instance of this client using the provided credentials info. + + Args: + info (dict): The service account private key info. + args: Additional arguments to pass to the constructor. + kwargs: Additional arguments to pass to the constructor. + + Returns: + PolicyTagManagerAsyncClient: The constructed client. 
+ """ + return PolicyTagManagerClient.from_service_account_info.__func__(PolicyTagManagerAsyncClient, info, *args, **kwargs) # type: ignore + + @classmethod + def from_service_account_file(cls, filename: str, *args, **kwargs): + """Creates an instance of this client using the provided credentials + file. + + Args: + filename (str): The path to the service account private key json + file. + args: Additional arguments to pass to the constructor. + kwargs: Additional arguments to pass to the constructor. + + Returns: + PolicyTagManagerAsyncClient: The constructed client. + """ + return PolicyTagManagerClient.from_service_account_file.__func__(PolicyTagManagerAsyncClient, filename, *args, **kwargs) # type: ignore + from_service_account_json = from_service_account_file @property diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/client.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/client.py index 7bbb2b41..152d0a10 100644 --- a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/client.py +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/client.py @@ -1253,10 +1253,13 @@ def get_iam_policy( """ # Create or coerce a protobuf request object. - # The request isn't a proto-plus wrapped type, - # so it must be constructed via keyword expansion. if isinstance(request, dict): + # The request isn't a proto-plus wrapped type, + # so it must be constructed via keyword expansion. request = iam_policy.GetIamPolicyRequest(**request) + elif not request: + # Null request, just make one. + request = iam_policy.GetIamPolicyRequest() # Wrap the RPC method; this adds retry and timeout information, # and friendly error handling. @@ -1356,10 +1359,13 @@ def set_iam_policy( """ # Create or coerce a protobuf request object. - # The request isn't a proto-plus wrapped type, - # so it must be constructed via keyword expansion. if isinstance(request, dict): + # The request isn't a proto-plus wrapped type, + # so it must be constructed via keyword expansion. request = iam_policy.SetIamPolicyRequest(**request) + elif not request: + # Null request, just make one. + request = iam_policy.SetIamPolicyRequest() # Wrap the RPC method; this adds retry and timeout information, # and friendly error handling. @@ -1405,10 +1411,13 @@ def test_iam_permissions( """ # Create or coerce a protobuf request object. - # The request isn't a proto-plus wrapped type, - # so it must be constructed via keyword expansion. if isinstance(request, dict): + # The request isn't a proto-plus wrapped type, + # so it must be constructed via keyword expansion. request = iam_policy.TestIamPermissionsRequest(**request) + elif not request: + # Null request, just make one. + request = iam_policy.TestIamPermissionsRequest() # Wrap the RPC method; this adds retry and timeout information, # and friendly error handling. diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/pagers.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/pagers.py index c216e352..7253f781 100644 --- a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/pagers.py +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager/pagers.py @@ -15,7 +15,16 @@ # limitations under the License. 
# -from typing import Any, AsyncIterable, Awaitable, Callable, Iterable, Sequence, Tuple +from typing import ( + Any, + AsyncIterable, + Awaitable, + Callable, + Iterable, + Sequence, + Tuple, + Optional, +) from google.cloud.datacatalog_v1beta1.types import policytagmanager diff --git a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/async_client.py b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/async_client.py index 36c2a489..40eda2b7 100644 --- a/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/async_client.py +++ b/google/cloud/datacatalog_v1beta1/services/policy_tag_manager_serialization/async_client.py @@ -87,12 +87,36 @@ class PolicyTagManagerSerializationAsyncClient: PolicyTagManagerSerializationClient.parse_common_location_path ) - from_service_account_info = ( - PolicyTagManagerSerializationClient.from_service_account_info - ) - from_service_account_file = ( - PolicyTagManagerSerializationClient.from_service_account_file - ) + @classmethod + def from_service_account_info(cls, info: dict, *args, **kwargs): + """Creates an instance of this client using the provided credentials info. + + Args: + info (dict): The service account private key info. + args: Additional arguments to pass to the constructor. + kwargs: Additional arguments to pass to the constructor. + + Returns: + PolicyTagManagerSerializationAsyncClient: The constructed client. + """ + return PolicyTagManagerSerializationClient.from_service_account_info.__func__(PolicyTagManagerSerializationAsyncClient, info, *args, **kwargs) # type: ignore + + @classmethod + def from_service_account_file(cls, filename: str, *args, **kwargs): + """Creates an instance of this client using the provided credentials + file. + + Args: + filename (str): The path to the service account private key json + file. + args: Additional arguments to pass to the constructor. + kwargs: Additional arguments to pass to the constructor. + + Returns: + PolicyTagManagerSerializationAsyncClient: The constructed client. + """ + return PolicyTagManagerSerializationClient.from_service_account_file.__func__(PolicyTagManagerSerializationAsyncClient, filename, *args, **kwargs) # type: ignore + from_service_account_json = from_service_account_file @property diff --git a/google/cloud/datacatalog_v1beta1/types/__init__.py b/google/cloud/datacatalog_v1beta1/types/__init__.py index 253122a9..55067b1a 100644 --- a/google/cloud/datacatalog_v1beta1/types/__init__.py +++ b/google/cloud/datacatalog_v1beta1/types/__init__.py @@ -15,161 +15,161 @@ # limitations under the License. 
# -from .timestamps import SystemTimestamps -from .gcs_fileset_spec import ( - GcsFilesetSpec, - GcsFileSpec, -) -from .schema import ( - Schema, - ColumnSchema, -) -from .search import ( - SearchCatalogResult, - SearchResultType, -) -from .table_spec import ( - BigQueryTableSpec, - ViewSpec, - TableSpec, - BigQueryDateShardedSpec, - TableSourceType, -) -from .tags import ( - Tag, - TagField, - TagTemplate, - TagTemplateField, - FieldType, -) from .datacatalog import ( - SearchCatalogRequest, - SearchCatalogResponse, CreateEntryGroupRequest, - UpdateEntryGroupRequest, - GetEntryGroupRequest, - DeleteEntryGroupRequest, - ListEntryGroupsRequest, - ListEntryGroupsResponse, CreateEntryRequest, - UpdateEntryRequest, + CreateTagRequest, + CreateTagTemplateFieldRequest, + CreateTagTemplateRequest, + DeleteEntryGroupRequest, DeleteEntryRequest, - GetEntryRequest, - LookupEntryRequest, + DeleteTagRequest, + DeleteTagTemplateFieldRequest, + DeleteTagTemplateRequest, Entry, EntryGroup, - CreateTagTemplateRequest, + GetEntryGroupRequest, + GetEntryRequest, GetTagTemplateRequest, - UpdateTagTemplateRequest, - DeleteTagTemplateRequest, - CreateTagRequest, - UpdateTagRequest, - DeleteTagRequest, - CreateTagTemplateFieldRequest, - UpdateTagTemplateFieldRequest, - RenameTagTemplateFieldRequest, - DeleteTagTemplateFieldRequest, - ListTagsRequest, - ListTagsResponse, ListEntriesRequest, ListEntriesResponse, + ListEntryGroupsRequest, + ListEntryGroupsResponse, + ListTagsRequest, + ListTagsResponse, + LookupEntryRequest, + RenameTagTemplateFieldRequest, + SearchCatalogRequest, + SearchCatalogResponse, + UpdateEntryGroupRequest, + UpdateEntryRequest, + UpdateTagRequest, + UpdateTagTemplateFieldRequest, + UpdateTagTemplateRequest, EntryType, ) +from .gcs_fileset_spec import ( + GcsFilesetSpec, + GcsFileSpec, +) from .policytagmanager import ( - Taxonomy, - PolicyTag, + CreatePolicyTagRequest, CreateTaxonomyRequest, + DeletePolicyTagRequest, DeleteTaxonomyRequest, - UpdateTaxonomyRequest, - ListTaxonomiesRequest, - ListTaxonomiesResponse, + GetPolicyTagRequest, GetTaxonomyRequest, - CreatePolicyTagRequest, - DeletePolicyTagRequest, - UpdatePolicyTagRequest, ListPolicyTagsRequest, ListPolicyTagsResponse, - GetPolicyTagRequest, + ListTaxonomiesRequest, + ListTaxonomiesResponse, + PolicyTag, + Taxonomy, + UpdatePolicyTagRequest, + UpdateTaxonomyRequest, ) from .policytagmanagerserialization import ( - SerializedTaxonomy, - SerializedPolicyTag, - ImportTaxonomiesRequest, - InlineSource, - ImportTaxonomiesResponse, ExportTaxonomiesRequest, ExportTaxonomiesResponse, + ImportTaxonomiesRequest, + ImportTaxonomiesResponse, + InlineSource, + SerializedPolicyTag, + SerializedTaxonomy, +) +from .schema import ( + ColumnSchema, + Schema, +) +from .search import ( + SearchCatalogResult, + SearchResultType, +) +from .table_spec import ( + BigQueryDateShardedSpec, + BigQueryTableSpec, + TableSpec, + ViewSpec, + TableSourceType, ) +from .tags import ( + FieldType, + Tag, + TagField, + TagTemplate, + TagTemplateField, +) +from .timestamps import SystemTimestamps __all__ = ( "IntegratedSystem", - "SystemTimestamps", - "GcsFilesetSpec", - "GcsFileSpec", - "Schema", - "ColumnSchema", - "SearchCatalogResult", - "SearchResultType", - "BigQueryTableSpec", - "ViewSpec", - "TableSpec", - "BigQueryDateShardedSpec", - "TableSourceType", - "Tag", - "TagField", - "TagTemplate", - "TagTemplateField", - "FieldType", - "SearchCatalogRequest", - "SearchCatalogResponse", "CreateEntryGroupRequest", - "UpdateEntryGroupRequest", - 
"GetEntryGroupRequest", - "DeleteEntryGroupRequest", - "ListEntryGroupsRequest", - "ListEntryGroupsResponse", "CreateEntryRequest", - "UpdateEntryRequest", + "CreateTagRequest", + "CreateTagTemplateFieldRequest", + "CreateTagTemplateRequest", + "DeleteEntryGroupRequest", "DeleteEntryRequest", - "GetEntryRequest", - "LookupEntryRequest", + "DeleteTagRequest", + "DeleteTagTemplateFieldRequest", + "DeleteTagTemplateRequest", "Entry", "EntryGroup", - "CreateTagTemplateRequest", + "GetEntryGroupRequest", + "GetEntryRequest", "GetTagTemplateRequest", - "UpdateTagTemplateRequest", - "DeleteTagTemplateRequest", - "CreateTagRequest", - "UpdateTagRequest", - "DeleteTagRequest", - "CreateTagTemplateFieldRequest", - "UpdateTagTemplateFieldRequest", - "RenameTagTemplateFieldRequest", - "DeleteTagTemplateFieldRequest", - "ListTagsRequest", - "ListTagsResponse", "ListEntriesRequest", "ListEntriesResponse", + "ListEntryGroupsRequest", + "ListEntryGroupsResponse", + "ListTagsRequest", + "ListTagsResponse", + "LookupEntryRequest", + "RenameTagTemplateFieldRequest", + "SearchCatalogRequest", + "SearchCatalogResponse", + "UpdateEntryGroupRequest", + "UpdateEntryRequest", + "UpdateTagRequest", + "UpdateTagTemplateFieldRequest", + "UpdateTagTemplateRequest", "EntryType", - "Taxonomy", - "PolicyTag", + "GcsFilesetSpec", + "GcsFileSpec", + "CreatePolicyTagRequest", "CreateTaxonomyRequest", + "DeletePolicyTagRequest", "DeleteTaxonomyRequest", - "UpdateTaxonomyRequest", - "ListTaxonomiesRequest", - "ListTaxonomiesResponse", + "GetPolicyTagRequest", "GetTaxonomyRequest", - "CreatePolicyTagRequest", - "DeletePolicyTagRequest", - "UpdatePolicyTagRequest", "ListPolicyTagsRequest", "ListPolicyTagsResponse", - "GetPolicyTagRequest", - "SerializedTaxonomy", - "SerializedPolicyTag", - "ImportTaxonomiesRequest", - "InlineSource", - "ImportTaxonomiesResponse", + "ListTaxonomiesRequest", + "ListTaxonomiesResponse", + "PolicyTag", + "Taxonomy", + "UpdatePolicyTagRequest", + "UpdateTaxonomyRequest", "ExportTaxonomiesRequest", "ExportTaxonomiesResponse", + "ImportTaxonomiesRequest", + "ImportTaxonomiesResponse", + "InlineSource", + "SerializedPolicyTag", + "SerializedTaxonomy", + "ColumnSchema", + "Schema", + "SearchCatalogResult", + "SearchResultType", + "BigQueryDateShardedSpec", + "BigQueryTableSpec", + "TableSpec", + "ViewSpec", + "TableSourceType", + "FieldType", + "Tag", + "TagField", + "TagTemplate", + "TagTemplateField", + "SystemTimestamps", ) diff --git a/synth.metadata b/synth.metadata index 34e965a4..fdc054a6 100644 --- a/synth.metadata +++ b/synth.metadata @@ -4,15 +4,15 @@ "git": { "name": ".", "remote": "https://github.com/googleapis/python-datacatalog.git", - "sha": "ec9d92555c4adaa97a3dfccb85f060cc86ed2747" + "sha": "cde0673fbfb2289bc017268d3799eb428162f3d8" } }, { "git": { "name": "googleapis", "remote": "https://github.com/googleapis/googleapis.git", - "sha": "6ef9eaea379fc1cc0355e06a5a20b594543ee693", - "internalRef": "355059873" + "sha": "8d17d8fafbb87ac64bb3179b99ac34ed41375a51", + "internalRef": "363762006" } }, { diff --git a/tests/unit/gapic/datacatalog_v1/__init__.py b/tests/unit/gapic/datacatalog_v1/__init__.py index 8b137891..42ffdf2b 100644 --- a/tests/unit/gapic/datacatalog_v1/__init__.py +++ b/tests/unit/gapic/datacatalog_v1/__init__.py @@ -1 +1,16 @@ +# -*- coding: utf-8 -*- +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file 
except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# diff --git a/tests/unit/gapic/datacatalog_v1/test_data_catalog.py b/tests/unit/gapic/datacatalog_v1/test_data_catalog.py index 8fc96676..301b8027 100644 --- a/tests/unit/gapic/datacatalog_v1/test_data_catalog.py +++ b/tests/unit/gapic/datacatalog_v1/test_data_catalog.py @@ -94,15 +94,17 @@ def test__get_default_mtls_endpoint(): assert DataCatalogClient._get_default_mtls_endpoint(non_googleapi) == non_googleapi -def test_data_catalog_client_from_service_account_info(): +@pytest.mark.parametrize("client_class", [DataCatalogClient, DataCatalogAsyncClient,]) +def test_data_catalog_client_from_service_account_info(client_class): creds = credentials.AnonymousCredentials() with mock.patch.object( service_account.Credentials, "from_service_account_info" ) as factory: factory.return_value = creds info = {"valid": True} - client = DataCatalogClient.from_service_account_info(info) + client = client_class.from_service_account_info(info) assert client.transport._credentials == creds + assert isinstance(client, client_class) assert client.transport._host == "datacatalog.googleapis.com:443" @@ -116,9 +118,11 @@ def test_data_catalog_client_from_service_account_file(client_class): factory.return_value = creds client = client_class.from_service_account_file("dummy/file/path.json") assert client.transport._credentials == creds + assert isinstance(client, client_class) client = client_class.from_service_account_json("dummy/file/path.json") assert client.transport._credentials == creds + assert isinstance(client, client_class) assert client.transport._host == "datacatalog.googleapis.com:443" @@ -477,6 +481,22 @@ def test_search_catalog_from_dict(): test_search_catalog(request_type=dict) +def test_search_catalog_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.search_catalog), "__call__") as call: + client.search_catalog() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.SearchCatalogRequest() + + @pytest.mark.asyncio async def test_search_catalog_async( transport: str = "grpc_asyncio", request_type=datacatalog.SearchCatalogRequest @@ -793,6 +813,24 @@ def test_create_entry_group_from_dict(): test_create_entry_group(request_type=dict) +def test_create_entry_group_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object( + type(client.transport.create_entry_group), "__call__" + ) as call: + client.create_entry_group() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.CreateEntryGroupRequest() + + @pytest.mark.asyncio async def test_create_entry_group_async( transport: str = "grpc_asyncio", request_type=datacatalog.CreateEntryGroupRequest @@ -1032,6 +1070,22 @@ def test_get_entry_group_from_dict(): test_get_entry_group(request_type=dict) +def test_get_entry_group_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.get_entry_group), "__call__") as call: + client.get_entry_group() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.GetEntryGroupRequest() + + @pytest.mark.asyncio async def test_get_entry_group_async( transport: str = "grpc_asyncio", request_type=datacatalog.GetEntryGroupRequest @@ -1253,6 +1307,24 @@ def test_update_entry_group_from_dict(): test_update_entry_group(request_type=dict) +def test_update_entry_group_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client.transport.update_entry_group), "__call__" + ) as call: + client.update_entry_group() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.UpdateEntryGroupRequest() + + @pytest.mark.asyncio async def test_update_entry_group_async( transport: str = "grpc_asyncio", request_type=datacatalog.UpdateEntryGroupRequest @@ -1479,6 +1551,24 @@ def test_delete_entry_group_from_dict(): test_delete_entry_group(request_type=dict) +def test_delete_entry_group_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client.transport.delete_entry_group), "__call__" + ) as call: + client.delete_entry_group() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.DeleteEntryGroupRequest() + + @pytest.mark.asyncio async def test_delete_entry_group_async( transport: str = "grpc_asyncio", request_type=datacatalog.DeleteEntryGroupRequest @@ -1676,6 +1766,24 @@ def test_list_entry_groups_from_dict(): test_list_entry_groups(request_type=dict) +def test_list_entry_groups_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object( + type(client.transport.list_entry_groups), "__call__" + ) as call: + client.list_entry_groups() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.ListEntryGroupsRequest() + + @pytest.mark.asyncio async def test_list_entry_groups_async( transport: str = "grpc_asyncio", request_type=datacatalog.ListEntryGroupsRequest @@ -2049,6 +2157,22 @@ def test_create_entry_from_dict(): test_create_entry(request_type=dict) +def test_create_entry_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.create_entry), "__call__") as call: + client.create_entry() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.CreateEntryRequest() + + @pytest.mark.asyncio async def test_create_entry_async( transport: str = "grpc_asyncio", request_type=datacatalog.CreateEntryRequest @@ -2285,6 +2409,22 @@ def test_update_entry_from_dict(): test_update_entry(request_type=dict) +def test_update_entry_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.update_entry), "__call__") as call: + client.update_entry() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.UpdateEntryRequest() + + @pytest.mark.asyncio async def test_update_entry_async( transport: str = "grpc_asyncio", request_type=datacatalog.UpdateEntryRequest @@ -2494,6 +2634,22 @@ def test_delete_entry_from_dict(): test_delete_entry(request_type=dict) +def test_delete_entry_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.delete_entry), "__call__") as call: + client.delete_entry() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.DeleteEntryRequest() + + @pytest.mark.asyncio async def test_delete_entry_async( transport: str = "grpc_asyncio", request_type=datacatalog.DeleteEntryRequest @@ -2691,6 +2847,22 @@ def test_get_entry_from_dict(): test_get_entry(request_type=dict) +def test_get_entry_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object(type(client.transport.get_entry), "__call__") as call: + client.get_entry() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.GetEntryRequest() + + @pytest.mark.asyncio async def test_get_entry_async( transport: str = "grpc_asyncio", request_type=datacatalog.GetEntryRequest @@ -2905,6 +3077,22 @@ def test_lookup_entry_from_dict(): test_lookup_entry(request_type=dict) +def test_lookup_entry_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.lookup_entry), "__call__") as call: + client.lookup_entry() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.LookupEntryRequest() + + @pytest.mark.asyncio async def test_lookup_entry_async( transport: str = "grpc_asyncio", request_type=datacatalog.LookupEntryRequest @@ -2991,6 +3179,22 @@ def test_list_entries_from_dict(): test_list_entries(request_type=dict) +def test_list_entries_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.list_entries), "__call__") as call: + client.list_entries() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.ListEntriesRequest() + + @pytest.mark.asyncio async def test_list_entries_async( transport: str = "grpc_asyncio", request_type=datacatalog.ListEntriesRequest @@ -3326,6 +3530,24 @@ def test_create_tag_template_from_dict(): test_create_tag_template(request_type=dict) +def test_create_tag_template_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client.transport.create_tag_template), "__call__" + ) as call: + client.create_tag_template() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.CreateTagTemplateRequest() + + @pytest.mark.asyncio async def test_create_tag_template_async( transport: str = "grpc_asyncio", request_type=datacatalog.CreateTagTemplateRequest @@ -3551,6 +3773,22 @@ def test_get_tag_template_from_dict(): test_get_tag_template(request_type=dict) +def test_get_tag_template_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object(type(client.transport.get_tag_template), "__call__") as call: + client.get_tag_template() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.GetTagTemplateRequest() + + @pytest.mark.asyncio async def test_get_tag_template_async( transport: str = "grpc_asyncio", request_type=datacatalog.GetTagTemplateRequest @@ -3746,6 +3984,24 @@ def test_update_tag_template_from_dict(): test_update_tag_template(request_type=dict) +def test_update_tag_template_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client.transport.update_tag_template), "__call__" + ) as call: + client.update_tag_template() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.UpdateTagTemplateRequest() + + @pytest.mark.asyncio async def test_update_tag_template_async( transport: str = "grpc_asyncio", request_type=datacatalog.UpdateTagTemplateRequest @@ -3964,6 +4220,24 @@ def test_delete_tag_template_from_dict(): test_delete_tag_template(request_type=dict) +def test_delete_tag_template_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client.transport.delete_tag_template), "__call__" + ) as call: + client.delete_tag_template() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.DeleteTagTemplateRequest() + + @pytest.mark.asyncio async def test_delete_tag_template_async( transport: str = "grpc_asyncio", request_type=datacatalog.DeleteTagTemplateRequest @@ -4176,6 +4450,24 @@ def test_create_tag_template_field_from_dict(): test_create_tag_template_field(request_type=dict) +def test_create_tag_template_field_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client.transport.create_tag_template_field), "__call__" + ) as call: + client.create_tag_template_field() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.CreateTagTemplateFieldRequest() + + @pytest.mark.asyncio async def test_create_tag_template_field_async( transport: str = "grpc_asyncio", @@ -4424,6 +4716,24 @@ def test_update_tag_template_field_from_dict(): test_update_tag_template_field(request_type=dict) +def test_update_tag_template_field_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object( + type(client.transport.update_tag_template_field), "__call__" + ) as call: + client.update_tag_template_field() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.UpdateTagTemplateFieldRequest() + + @pytest.mark.asyncio async def test_update_tag_template_field_async( transport: str = "grpc_asyncio", @@ -4672,6 +4982,24 @@ def test_rename_tag_template_field_from_dict(): test_rename_tag_template_field(request_type=dict) +def test_rename_tag_template_field_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client.transport.rename_tag_template_field), "__call__" + ) as call: + client.rename_tag_template_field() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.RenameTagTemplateFieldRequest() + + @pytest.mark.asyncio async def test_rename_tag_template_field_async( transport: str = "grpc_asyncio", @@ -4898,6 +5226,24 @@ def test_delete_tag_template_field_from_dict(): test_delete_tag_template_field(request_type=dict) +def test_delete_tag_template_field_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client.transport.delete_tag_template_field), "__call__" + ) as call: + client.delete_tag_template_field() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.DeleteTagTemplateFieldRequest() + + @pytest.mark.asyncio async def test_delete_tag_template_field_async( transport: str = "grpc_asyncio", @@ -5107,6 +5453,22 @@ def test_create_tag_from_dict(): test_create_tag(request_type=dict) +def test_create_tag_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.create_tag), "__call__") as call: + client.create_tag() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.CreateTagRequest() + + @pytest.mark.asyncio async def test_create_tag_async( transport: str = "grpc_asyncio", request_type=datacatalog.CreateTagRequest @@ -5321,6 +5683,22 @@ def test_update_tag_from_dict(): test_update_tag(request_type=dict) +def test_update_tag_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object(type(client.transport.update_tag), "__call__") as call: + client.update_tag() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.UpdateTagRequest() + + @pytest.mark.asyncio async def test_update_tag_async( transport: str = "grpc_asyncio", request_type=datacatalog.UpdateTagRequest @@ -5525,6 +5903,22 @@ def test_delete_tag_from_dict(): test_delete_tag(request_type=dict) +def test_delete_tag_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.delete_tag), "__call__") as call: + client.delete_tag() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.DeleteTagRequest() + + @pytest.mark.asyncio async def test_delete_tag_async( transport: str = "grpc_asyncio", request_type=datacatalog.DeleteTagRequest @@ -5708,6 +6102,22 @@ def test_list_tags_from_dict(): test_list_tags(request_type=dict) +def test_list_tags_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.list_tags), "__call__") as call: + client.list_tags() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.ListTagsRequest() + + @pytest.mark.asyncio async def test_list_tags_async( transport: str = "grpc_asyncio", request_type=datacatalog.ListTagsRequest @@ -6003,6 +6413,22 @@ def test_set_iam_policy_from_dict(): test_set_iam_policy(request_type=dict) +def test_set_iam_policy_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.set_iam_policy), "__call__") as call: + client.set_iam_policy() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == iam_policy.SetIamPolicyRequest() + + @pytest.mark.asyncio async def test_set_iam_policy_async( transport: str = "grpc_asyncio", request_type=iam_policy.SetIamPolicyRequest @@ -6210,6 +6636,22 @@ def test_get_iam_policy_from_dict(): test_get_iam_policy(request_type=dict) +def test_get_iam_policy_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object(type(client.transport.get_iam_policy), "__call__") as call: + client.get_iam_policy() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == iam_policy.GetIamPolicyRequest() + + @pytest.mark.asyncio async def test_get_iam_policy_async( transport: str = "grpc_asyncio", request_type=iam_policy.GetIamPolicyRequest @@ -6419,6 +6861,24 @@ def test_test_iam_permissions_from_dict(): test_test_iam_permissions(request_type=dict) +def test_test_iam_permissions_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client.transport.test_iam_permissions), "__call__" + ) as call: + client.test_iam_permissions() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == iam_policy.TestIamPermissionsRequest() + + @pytest.mark.asyncio async def test_test_iam_permissions_async( transport: str = "grpc_asyncio", request_type=iam_policy.TestIamPermissionsRequest diff --git a/tests/unit/gapic/datacatalog_v1beta1/__init__.py b/tests/unit/gapic/datacatalog_v1beta1/__init__.py index 8b137891..42ffdf2b 100644 --- a/tests/unit/gapic/datacatalog_v1beta1/__init__.py +++ b/tests/unit/gapic/datacatalog_v1beta1/__init__.py @@ -1 +1,16 @@ +# -*- coding: utf-8 -*- +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
+# diff --git a/tests/unit/gapic/datacatalog_v1beta1/test_data_catalog.py b/tests/unit/gapic/datacatalog_v1beta1/test_data_catalog.py index 7191fc9b..1d7aeb41 100644 --- a/tests/unit/gapic/datacatalog_v1beta1/test_data_catalog.py +++ b/tests/unit/gapic/datacatalog_v1beta1/test_data_catalog.py @@ -96,15 +96,17 @@ def test__get_default_mtls_endpoint(): assert DataCatalogClient._get_default_mtls_endpoint(non_googleapi) == non_googleapi -def test_data_catalog_client_from_service_account_info(): +@pytest.mark.parametrize("client_class", [DataCatalogClient, DataCatalogAsyncClient,]) +def test_data_catalog_client_from_service_account_info(client_class): creds = credentials.AnonymousCredentials() with mock.patch.object( service_account.Credentials, "from_service_account_info" ) as factory: factory.return_value = creds info = {"valid": True} - client = DataCatalogClient.from_service_account_info(info) + client = client_class.from_service_account_info(info) assert client.transport._credentials == creds + assert isinstance(client, client_class) assert client.transport._host == "datacatalog.googleapis.com:443" @@ -118,9 +120,11 @@ def test_data_catalog_client_from_service_account_file(client_class): factory.return_value = creds client = client_class.from_service_account_file("dummy/file/path.json") assert client.transport._credentials == creds + assert isinstance(client, client_class) client = client_class.from_service_account_json("dummy/file/path.json") assert client.transport._credentials == creds + assert isinstance(client, client_class) assert client.transport._host == "datacatalog.googleapis.com:443" @@ -477,6 +481,22 @@ def test_search_catalog_from_dict(): test_search_catalog(request_type=dict) +def test_search_catalog_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.search_catalog), "__call__") as call: + client.search_catalog() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.SearchCatalogRequest() + + @pytest.mark.asyncio async def test_search_catalog_async( transport: str = "grpc_asyncio", request_type=datacatalog.SearchCatalogRequest @@ -788,6 +808,24 @@ def test_create_entry_group_from_dict(): test_create_entry_group(request_type=dict) +def test_create_entry_group_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client.transport.create_entry_group), "__call__" + ) as call: + client.create_entry_group() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.CreateEntryGroupRequest() + + @pytest.mark.asyncio async def test_create_entry_group_async( transport: str = "grpc_asyncio", request_type=datacatalog.CreateEntryGroupRequest @@ -1029,6 +1067,24 @@ def test_update_entry_group_from_dict(): test_update_entry_group(request_type=dict) +def test_update_entry_group_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. 
request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client.transport.update_entry_group), "__call__" + ) as call: + client.update_entry_group() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.UpdateEntryGroupRequest() + + @pytest.mark.asyncio async def test_update_entry_group_async( transport: str = "grpc_asyncio", request_type=datacatalog.UpdateEntryGroupRequest @@ -1264,6 +1320,22 @@ def test_get_entry_group_from_dict(): test_get_entry_group(request_type=dict) +def test_get_entry_group_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.get_entry_group), "__call__") as call: + client.get_entry_group() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.GetEntryGroupRequest() + + @pytest.mark.asyncio async def test_get_entry_group_async( transport: str = "grpc_asyncio", request_type=datacatalog.GetEntryGroupRequest @@ -1474,6 +1546,24 @@ def test_delete_entry_group_from_dict(): test_delete_entry_group(request_type=dict) +def test_delete_entry_group_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client.transport.delete_entry_group), "__call__" + ) as call: + client.delete_entry_group() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.DeleteEntryGroupRequest() + + @pytest.mark.asyncio async def test_delete_entry_group_async( transport: str = "grpc_asyncio", request_type=datacatalog.DeleteEntryGroupRequest @@ -1671,6 +1761,24 @@ def test_list_entry_groups_from_dict(): test_list_entry_groups(request_type=dict) +def test_list_entry_groups_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client.transport.list_entry_groups), "__call__" + ) as call: + client.list_entry_groups() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.ListEntryGroupsRequest() + + @pytest.mark.asyncio async def test_list_entry_groups_async( transport: str = "grpc_asyncio", request_type=datacatalog.ListEntryGroupsRequest @@ -2044,6 +2152,22 @@ def test_create_entry_from_dict(): test_create_entry(request_type=dict) +def test_create_entry_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. 
+ client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.create_entry), "__call__") as call: + client.create_entry() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.CreateEntryRequest() + + @pytest.mark.asyncio async def test_create_entry_async( transport: str = "grpc_asyncio", request_type=datacatalog.CreateEntryRequest @@ -2280,6 +2404,22 @@ def test_update_entry_from_dict(): test_update_entry(request_type=dict) +def test_update_entry_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.update_entry), "__call__") as call: + client.update_entry() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.UpdateEntryRequest() + + @pytest.mark.asyncio async def test_update_entry_async( transport: str = "grpc_asyncio", request_type=datacatalog.UpdateEntryRequest @@ -2489,6 +2629,22 @@ def test_delete_entry_from_dict(): test_delete_entry(request_type=dict) +def test_delete_entry_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.delete_entry), "__call__") as call: + client.delete_entry() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.DeleteEntryRequest() + + @pytest.mark.asyncio async def test_delete_entry_async( transport: str = "grpc_asyncio", request_type=datacatalog.DeleteEntryRequest @@ -2686,6 +2842,22 @@ def test_get_entry_from_dict(): test_get_entry(request_type=dict) +def test_get_entry_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.get_entry), "__call__") as call: + client.get_entry() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.GetEntryRequest() + + @pytest.mark.asyncio async def test_get_entry_async( transport: str = "grpc_asyncio", request_type=datacatalog.GetEntryRequest @@ -2900,6 +3072,22 @@ def test_lookup_entry_from_dict(): test_lookup_entry(request_type=dict) +def test_lookup_entry_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object(type(client.transport.lookup_entry), "__call__") as call: + client.lookup_entry() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.LookupEntryRequest() + + @pytest.mark.asyncio async def test_lookup_entry_async( transport: str = "grpc_asyncio", request_type=datacatalog.LookupEntryRequest @@ -2986,6 +3174,22 @@ def test_list_entries_from_dict(): test_list_entries(request_type=dict) +def test_list_entries_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.list_entries), "__call__") as call: + client.list_entries() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.ListEntriesRequest() + + @pytest.mark.asyncio async def test_list_entries_async( transport: str = "grpc_asyncio", request_type=datacatalog.ListEntriesRequest @@ -3321,6 +3525,24 @@ def test_create_tag_template_from_dict(): test_create_tag_template(request_type=dict) +def test_create_tag_template_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client.transport.create_tag_template), "__call__" + ) as call: + client.create_tag_template() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.CreateTagTemplateRequest() + + @pytest.mark.asyncio async def test_create_tag_template_async( transport: str = "grpc_asyncio", request_type=datacatalog.CreateTagTemplateRequest @@ -3546,6 +3768,22 @@ def test_get_tag_template_from_dict(): test_get_tag_template(request_type=dict) +def test_get_tag_template_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.get_tag_template), "__call__") as call: + client.get_tag_template() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.GetTagTemplateRequest() + + @pytest.mark.asyncio async def test_get_tag_template_async( transport: str = "grpc_asyncio", request_type=datacatalog.GetTagTemplateRequest @@ -3741,6 +3979,24 @@ def test_update_tag_template_from_dict(): test_update_tag_template(request_type=dict) +def test_update_tag_template_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object( + type(client.transport.update_tag_template), "__call__" + ) as call: + client.update_tag_template() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.UpdateTagTemplateRequest() + + @pytest.mark.asyncio async def test_update_tag_template_async( transport: str = "grpc_asyncio", request_type=datacatalog.UpdateTagTemplateRequest @@ -3959,6 +4215,24 @@ def test_delete_tag_template_from_dict(): test_delete_tag_template(request_type=dict) +def test_delete_tag_template_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client.transport.delete_tag_template), "__call__" + ) as call: + client.delete_tag_template() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.DeleteTagTemplateRequest() + + @pytest.mark.asyncio async def test_delete_tag_template_async( transport: str = "grpc_asyncio", request_type=datacatalog.DeleteTagTemplateRequest @@ -4171,6 +4445,24 @@ def test_create_tag_template_field_from_dict(): test_create_tag_template_field(request_type=dict) +def test_create_tag_template_field_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client.transport.create_tag_template_field), "__call__" + ) as call: + client.create_tag_template_field() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.CreateTagTemplateFieldRequest() + + @pytest.mark.asyncio async def test_create_tag_template_field_async( transport: str = "grpc_asyncio", @@ -4419,6 +4711,24 @@ def test_update_tag_template_field_from_dict(): test_update_tag_template_field(request_type=dict) +def test_update_tag_template_field_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client.transport.update_tag_template_field), "__call__" + ) as call: + client.update_tag_template_field() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.UpdateTagTemplateFieldRequest() + + @pytest.mark.asyncio async def test_update_tag_template_field_async( transport: str = "grpc_asyncio", @@ -4667,6 +4977,24 @@ def test_rename_tag_template_field_from_dict(): test_rename_tag_template_field(request_type=dict) +def test_rename_tag_template_field_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object( + type(client.transport.rename_tag_template_field), "__call__" + ) as call: + client.rename_tag_template_field() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.RenameTagTemplateFieldRequest() + + @pytest.mark.asyncio async def test_rename_tag_template_field_async( transport: str = "grpc_asyncio", @@ -4893,6 +5221,24 @@ def test_delete_tag_template_field_from_dict(): test_delete_tag_template_field(request_type=dict) +def test_delete_tag_template_field_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client.transport.delete_tag_template_field), "__call__" + ) as call: + client.delete_tag_template_field() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.DeleteTagTemplateFieldRequest() + + @pytest.mark.asyncio async def test_delete_tag_template_field_async( transport: str = "grpc_asyncio", @@ -5102,6 +5448,22 @@ def test_create_tag_from_dict(): test_create_tag(request_type=dict) +def test_create_tag_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.create_tag), "__call__") as call: + client.create_tag() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.CreateTagRequest() + + @pytest.mark.asyncio async def test_create_tag_async( transport: str = "grpc_asyncio", request_type=datacatalog.CreateTagRequest @@ -5316,6 +5678,22 @@ def test_update_tag_from_dict(): test_update_tag(request_type=dict) +def test_update_tag_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.update_tag), "__call__") as call: + client.update_tag() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.UpdateTagRequest() + + @pytest.mark.asyncio async def test_update_tag_async( transport: str = "grpc_asyncio", request_type=datacatalog.UpdateTagRequest @@ -5520,6 +5898,22 @@ def test_delete_tag_from_dict(): test_delete_tag(request_type=dict) +def test_delete_tag_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object(type(client.transport.delete_tag), "__call__") as call: + client.delete_tag() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.DeleteTagRequest() + + @pytest.mark.asyncio async def test_delete_tag_async( transport: str = "grpc_asyncio", request_type=datacatalog.DeleteTagRequest @@ -5703,6 +6097,22 @@ def test_list_tags_from_dict(): test_list_tags(request_type=dict) +def test_list_tags_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.list_tags), "__call__") as call: + client.list_tags() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == datacatalog.ListTagsRequest() + + @pytest.mark.asyncio async def test_list_tags_async( transport: str = "grpc_asyncio", request_type=datacatalog.ListTagsRequest @@ -5998,6 +6408,22 @@ def test_set_iam_policy_from_dict(): test_set_iam_policy(request_type=dict) +def test_set_iam_policy_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.set_iam_policy), "__call__") as call: + client.set_iam_policy() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == iam_policy.SetIamPolicyRequest() + + @pytest.mark.asyncio async def test_set_iam_policy_async( transport: str = "grpc_asyncio", request_type=iam_policy.SetIamPolicyRequest @@ -6205,6 +6631,22 @@ def test_get_iam_policy_from_dict(): test_get_iam_policy(request_type=dict) +def test_get_iam_policy_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.get_iam_policy), "__call__") as call: + client.get_iam_policy() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == iam_policy.GetIamPolicyRequest() + + @pytest.mark.asyncio async def test_get_iam_policy_async( transport: str = "grpc_asyncio", request_type=iam_policy.GetIamPolicyRequest @@ -6414,6 +6856,24 @@ def test_test_iam_permissions_from_dict(): test_test_iam_permissions(request_type=dict) +def test_test_iam_permissions_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = DataCatalogClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object( + type(client.transport.test_iam_permissions), "__call__" + ) as call: + client.test_iam_permissions() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == iam_policy.TestIamPermissionsRequest() + + @pytest.mark.asyncio async def test_test_iam_permissions_async( transport: str = "grpc_asyncio", request_type=iam_policy.TestIamPermissionsRequest diff --git a/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager.py b/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager.py index c5ed26ec..4b073a35 100644 --- a/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager.py +++ b/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager.py @@ -94,15 +94,19 @@ def test__get_default_mtls_endpoint(): ) -def test_policy_tag_manager_client_from_service_account_info(): +@pytest.mark.parametrize( + "client_class", [PolicyTagManagerClient, PolicyTagManagerAsyncClient,] +) +def test_policy_tag_manager_client_from_service_account_info(client_class): creds = credentials.AnonymousCredentials() with mock.patch.object( service_account.Credentials, "from_service_account_info" ) as factory: factory.return_value = creds info = {"valid": True} - client = PolicyTagManagerClient.from_service_account_info(info) + client = client_class.from_service_account_info(info) assert client.transport._credentials == creds + assert isinstance(client, client_class) assert client.transport._host == "datacatalog.googleapis.com:443" @@ -118,9 +122,11 @@ def test_policy_tag_manager_client_from_service_account_file(client_class): factory.return_value = creds client = client_class.from_service_account_file("dummy/file/path.json") assert client.transport._credentials == creds + assert isinstance(client, client_class) client = client_class.from_service_account_json("dummy/file/path.json") assert client.transport._credentials == creds + assert isinstance(client, client_class) assert client.transport._host == "datacatalog.googleapis.com:443" @@ -506,6 +512,22 @@ def test_create_taxonomy_from_dict(): test_create_taxonomy(request_type=dict) +def test_create_taxonomy_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = PolicyTagManagerClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.create_taxonomy), "__call__") as call: + client.create_taxonomy() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == policytagmanager.CreateTaxonomyRequest() + + @pytest.mark.asyncio async def test_create_taxonomy_async( transport: str = "grpc_asyncio", request_type=policytagmanager.CreateTaxonomyRequest @@ -729,6 +751,22 @@ def test_delete_taxonomy_from_dict(): test_delete_taxonomy(request_type=dict) +def test_delete_taxonomy_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = PolicyTagManagerClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object(type(client.transport.delete_taxonomy), "__call__") as call: + client.delete_taxonomy() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == policytagmanager.DeleteTaxonomyRequest() + + @pytest.mark.asyncio async def test_delete_taxonomy_async( transport: str = "grpc_asyncio", request_type=policytagmanager.DeleteTaxonomyRequest @@ -933,6 +971,22 @@ def test_update_taxonomy_from_dict(): test_update_taxonomy(request_type=dict) +def test_update_taxonomy_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = PolicyTagManagerClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.update_taxonomy), "__call__") as call: + client.update_taxonomy() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == policytagmanager.UpdateTaxonomyRequest() + + @pytest.mark.asyncio async def test_update_taxonomy_async( transport: str = "grpc_asyncio", request_type=policytagmanager.UpdateTaxonomyRequest @@ -1155,6 +1209,22 @@ def test_list_taxonomies_from_dict(): test_list_taxonomies(request_type=dict) +def test_list_taxonomies_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = PolicyTagManagerClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.list_taxonomies), "__call__") as call: + client.list_taxonomies() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == policytagmanager.ListTaxonomiesRequest() + + @pytest.mark.asyncio async def test_list_taxonomies_async( transport: str = "grpc_asyncio", request_type=policytagmanager.ListTaxonomiesRequest @@ -1515,6 +1585,22 @@ def test_get_taxonomy_from_dict(): test_get_taxonomy(request_type=dict) +def test_get_taxonomy_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = PolicyTagManagerClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.get_taxonomy), "__call__") as call: + client.get_taxonomy() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == policytagmanager.GetTaxonomyRequest() + + @pytest.mark.asyncio async def test_get_taxonomy_async( transport: str = "grpc_asyncio", request_type=policytagmanager.GetTaxonomyRequest @@ -1743,6 +1829,24 @@ def test_create_policy_tag_from_dict(): test_create_policy_tag(request_type=dict) +def test_create_policy_tag_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = PolicyTagManagerClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object( + type(client.transport.create_policy_tag), "__call__" + ) as call: + client.create_policy_tag() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == policytagmanager.CreatePolicyTagRequest() + + @pytest.mark.asyncio async def test_create_policy_tag_async( transport: str = "grpc_asyncio", @@ -1978,6 +2082,24 @@ def test_delete_policy_tag_from_dict(): test_delete_policy_tag(request_type=dict) +def test_delete_policy_tag_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = PolicyTagManagerClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client.transport.delete_policy_tag), "__call__" + ) as call: + client.delete_policy_tag() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == policytagmanager.DeletePolicyTagRequest() + + @pytest.mark.asyncio async def test_delete_policy_tag_async( transport: str = "grpc_asyncio", @@ -2194,6 +2316,24 @@ def test_update_policy_tag_from_dict(): test_update_policy_tag(request_type=dict) +def test_update_policy_tag_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = PolicyTagManagerClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client.transport.update_policy_tag), "__call__" + ) as call: + client.update_policy_tag() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == policytagmanager.UpdatePolicyTagRequest() + + @pytest.mark.asyncio async def test_update_policy_tag_async( transport: str = "grpc_asyncio", @@ -2428,6 +2568,22 @@ def test_list_policy_tags_from_dict(): test_list_policy_tags(request_type=dict) +def test_list_policy_tags_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = PolicyTagManagerClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.list_policy_tags), "__call__") as call: + client.list_policy_tags() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == policytagmanager.ListPolicyTagsRequest() + + @pytest.mark.asyncio async def test_list_policy_tags_async( transport: str = "grpc_asyncio", request_type=policytagmanager.ListPolicyTagsRequest @@ -2799,6 +2955,22 @@ def test_get_policy_tag_from_dict(): test_get_policy_tag(request_type=dict) +def test_get_policy_tag_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = PolicyTagManagerClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object(type(client.transport.get_policy_tag), "__call__") as call: + client.get_policy_tag() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == policytagmanager.GetPolicyTagRequest() + + @pytest.mark.asyncio async def test_get_policy_tag_async( transport: str = "grpc_asyncio", request_type=policytagmanager.GetPolicyTagRequest @@ -3012,6 +3184,22 @@ def test_get_iam_policy_from_dict(): test_get_iam_policy(request_type=dict) +def test_get_iam_policy_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = PolicyTagManagerClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.get_iam_policy), "__call__") as call: + client.get_iam_policy() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == iam_policy.GetIamPolicyRequest() + + @pytest.mark.asyncio async def test_get_iam_policy_async( transport: str = "grpc_asyncio", request_type=iam_policy.GetIamPolicyRequest @@ -3156,6 +3344,22 @@ def test_set_iam_policy_from_dict(): test_set_iam_policy(request_type=dict) +def test_set_iam_policy_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = PolicyTagManagerClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object(type(client.transport.set_iam_policy), "__call__") as call: + client.set_iam_policy() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == iam_policy.SetIamPolicyRequest() + + @pytest.mark.asyncio async def test_set_iam_policy_async( transport: str = "grpc_asyncio", request_type=iam_policy.SetIamPolicyRequest @@ -3302,6 +3506,24 @@ def test_test_iam_permissions_from_dict(): test_test_iam_permissions(request_type=dict) +def test_test_iam_permissions_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = PolicyTagManagerClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object( + type(client.transport.test_iam_permissions), "__call__" + ) as call: + client.test_iam_permissions() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == iam_policy.TestIamPermissionsRequest() + + @pytest.mark.asyncio async def test_test_iam_permissions_async( transport: str = "grpc_asyncio", request_type=iam_policy.TestIamPermissionsRequest diff --git a/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager_serialization.py b/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager_serialization.py index 88c6a79b..8a3ccf97 100644 --- a/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager_serialization.py +++ b/tests/unit/gapic/datacatalog_v1beta1/test_policy_tag_manager_serialization.py @@ -95,15 +95,22 @@ def test__get_default_mtls_endpoint(): ) -def test_policy_tag_manager_serialization_client_from_service_account_info(): +@pytest.mark.parametrize( + "client_class", + [PolicyTagManagerSerializationClient, PolicyTagManagerSerializationAsyncClient,], +) +def test_policy_tag_manager_serialization_client_from_service_account_info( + client_class, +): creds = credentials.AnonymousCredentials() with mock.patch.object( service_account.Credentials, "from_service_account_info" ) as factory: factory.return_value = creds info = {"valid": True} - client = PolicyTagManagerSerializationClient.from_service_account_info(info) + client = client_class.from_service_account_info(info) assert client.transport._credentials == creds + assert isinstance(client, client_class) assert client.transport._host == "datacatalog.googleapis.com:443" @@ -122,9 +129,11 @@ def test_policy_tag_manager_serialization_client_from_service_account_file( factory.return_value = creds client = client_class.from_service_account_file("dummy/file/path.json") assert client.transport._credentials == creds + assert isinstance(client, client_class) client = client_class.from_service_account_json("dummy/file/path.json") assert client.transport._credentials == creds + assert isinstance(client, client_class) assert client.transport._host == "datacatalog.googleapis.com:443" @@ -512,6 +521,24 @@ def test_import_taxonomies_from_dict(): test_import_taxonomies(request_type=dict) +def test_import_taxonomies_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = PolicyTagManagerSerializationClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. + with mock.patch.object( + type(client.transport.import_taxonomies), "__call__" + ) as call: + client.import_taxonomies() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == policytagmanagerserialization.ImportTaxonomiesRequest() + + @pytest.mark.asyncio async def test_import_taxonomies_async( transport: str = "grpc_asyncio", @@ -646,6 +673,24 @@ def test_export_taxonomies_from_dict(): test_export_taxonomies(request_type=dict) +def test_export_taxonomies_empty_call(): + # This test is a coverage failsafe to make sure that totally empty calls, + # i.e. request == None and no flattened fields passed, work. + client = PolicyTagManagerSerializationClient( + credentials=credentials.AnonymousCredentials(), transport="grpc", + ) + + # Mock the actual call within the gRPC stub, and fake the request. 
+ with mock.patch.object( + type(client.transport.export_taxonomies), "__call__" + ) as call: + client.export_taxonomies() + call.assert_called() + _, args, _ = call.mock_calls[0] + + assert args[0] == policytagmanagerserialization.ExportTaxonomiesRequest() + + @pytest.mark.asyncio async def test_export_taxonomies_async( transport: str = "grpc_asyncio", From 865cf514eae6ed301ece1a1b3a018fbe2599bd6f Mon Sep 17 00:00:00 2001 From: Yoshi Automation Bot Date: Mon, 22 Mar 2021 07:47:58 -0700 Subject: [PATCH 25/26] chore(python): add kokoro configs for periodic builds against head (#122) * changes without context autosynth cannot find the source of changes triggered by earlier changes in this repository, or by version upgrades to tools such as linters. * chore(python): add kokoro configs for periodic builds against head This change should be non-destructive. Note for library repo maintainers: After applying this change, you can easily add (or change) periodic builds against head by adding config files in google3. See python-pubsub repo for example. Source-Author: Takashi Matsuo Source-Date: Fri Mar 19 11:17:59 2021 -0700 Source-Repo: googleapis/synthtool Source-Sha: 79c8dd7ee768292f933012d3a69a5b4676404cda Source-Link: https://github.com/googleapis/synthtool/commit/79c8dd7ee768292f933012d3a69a5b4676404cda --- .kokoro/samples/python3.6/periodic-head.cfg | 11 ++ .kokoro/samples/python3.7/periodic-head.cfg | 11 ++ .kokoro/samples/python3.8/periodic-head.cfg | 11 ++ .kokoro/test-samples-against-head.sh | 28 +++++ .kokoro/test-samples-impl.sh | 102 +++++++++++++++++++ .kokoro/test-samples.sh | 96 +++-------------- google/cloud/datacatalog_v1beta1/__init__.py | 4 +- synth.metadata | 11 +- 8 files changed, 189 insertions(+), 85 deletions(-) create mode 100644 .kokoro/samples/python3.6/periodic-head.cfg create mode 100644 .kokoro/samples/python3.7/periodic-head.cfg create mode 100644 .kokoro/samples/python3.8/periodic-head.cfg create mode 100755 .kokoro/test-samples-against-head.sh create mode 100755 .kokoro/test-samples-impl.sh diff --git a/.kokoro/samples/python3.6/periodic-head.cfg b/.kokoro/samples/python3.6/periodic-head.cfg new file mode 100644 index 00000000..f9cfcd33 --- /dev/null +++ b/.kokoro/samples/python3.6/periodic-head.cfg @@ -0,0 +1,11 @@ +# Format: //devtools/kokoro/config/proto/build.proto + +env_vars: { + key: "INSTALL_LIBRARY_FROM_SOURCE" + value: "True" +} + +env_vars: { + key: "TRAMPOLINE_BUILD_FILE" + value: "github/python-pubsub/.kokoro/test-samples-against-head.sh" +} diff --git a/.kokoro/samples/python3.7/periodic-head.cfg b/.kokoro/samples/python3.7/periodic-head.cfg new file mode 100644 index 00000000..f9cfcd33 --- /dev/null +++ b/.kokoro/samples/python3.7/periodic-head.cfg @@ -0,0 +1,11 @@ +# Format: //devtools/kokoro/config/proto/build.proto + +env_vars: { + key: "INSTALL_LIBRARY_FROM_SOURCE" + value: "True" +} + +env_vars: { + key: "TRAMPOLINE_BUILD_FILE" + value: "github/python-pubsub/.kokoro/test-samples-against-head.sh" +} diff --git a/.kokoro/samples/python3.8/periodic-head.cfg b/.kokoro/samples/python3.8/periodic-head.cfg new file mode 100644 index 00000000..f9cfcd33 --- /dev/null +++ b/.kokoro/samples/python3.8/periodic-head.cfg @@ -0,0 +1,11 @@ +# Format: //devtools/kokoro/config/proto/build.proto + +env_vars: { + key: "INSTALL_LIBRARY_FROM_SOURCE" + value: "True" +} + +env_vars: { + key: "TRAMPOLINE_BUILD_FILE" + value: "github/python-pubsub/.kokoro/test-samples-against-head.sh" +} diff --git 
a/.kokoro/test-samples-against-head.sh b/.kokoro/test-samples-against-head.sh new file mode 100755 index 00000000..bba2ca68 --- /dev/null +++ b/.kokoro/test-samples-against-head.sh @@ -0,0 +1,28 @@ +#!/bin/bash +# Copyright 2020 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# https://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +# A customized test runner for samples. +# +# For periodic builds, you can specify this file for testing against head. + +# `-e` enables the script to automatically fail when a command fails +# `-o pipefail` sets the exit code to the rightmost comment to exit with a non-zero +set -eo pipefail +# Enables `**` to include files nested inside sub-folders +shopt -s globstar + +cd github/python-datacatalog + +exec .kokoro/test-samples-impl.sh diff --git a/.kokoro/test-samples-impl.sh b/.kokoro/test-samples-impl.sh new file mode 100755 index 00000000..cf5de74c --- /dev/null +++ b/.kokoro/test-samples-impl.sh @@ -0,0 +1,102 @@ +#!/bin/bash +# Copyright 2021 Google LLC +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# https://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + + +# `-e` enables the script to automatically fail when a command fails +# `-o pipefail` sets the exit code to the rightmost comment to exit with a non-zero +set -eo pipefail +# Enables `**` to include files nested inside sub-folders +shopt -s globstar + +# Exit early if samples directory doesn't exist +if [ ! -d "./samples" ]; then + echo "No tests run. `./samples` not found" + exit 0 +fi + +# Disable buffering, so that the logs stream through. +export PYTHONUNBUFFERED=1 + +# Debug: show build environment +env | grep KOKORO + +# Install nox +python3.6 -m pip install --upgrade --quiet nox + +# Use secrets acessor service account to get secrets +if [[ -f "${KOKORO_GFILE_DIR}/secrets_viewer_service_account.json" ]]; then + gcloud auth activate-service-account \ + --key-file="${KOKORO_GFILE_DIR}/secrets_viewer_service_account.json" \ + --project="cloud-devrel-kokoro-resources" +fi + +# This script will create 3 files: +# - testing/test-env.sh +# - testing/service-account.json +# - testing/client-secrets.json +./scripts/decrypt-secrets.sh + +source ./testing/test-env.sh +export GOOGLE_APPLICATION_CREDENTIALS=$(pwd)/testing/service-account.json + +# For cloud-run session, we activate the service account for gcloud sdk. 
+gcloud auth activate-service-account \ + --key-file "${GOOGLE_APPLICATION_CREDENTIALS}" + +export GOOGLE_CLIENT_SECRETS=$(pwd)/testing/client-secrets.json + +echo -e "\n******************** TESTING PROJECTS ********************" + +# Switch to 'fail at end' to allow all tests to complete before exiting. +set +e +# Use RTN to return a non-zero value if the test fails. +RTN=0 +ROOT=$(pwd) +# Find all requirements.txt in the samples directory (may break on whitespace). +for file in samples/**/requirements.txt; do + cd "$ROOT" + # Navigate to the project folder. + file=$(dirname "$file") + cd "$file" + + echo "------------------------------------------------------------" + echo "- testing $file" + echo "------------------------------------------------------------" + + # Use nox to execute the tests for the project. + python3.6 -m nox -s "$RUN_TESTS_SESSION" + EXIT=$? + + # If this is a periodic build, send the test log to the FlakyBot. + # See https://github.com/googleapis/repo-automation-bots/tree/master/packages/flakybot. + if [[ $KOKORO_BUILD_ARTIFACTS_SUBDIR = *"periodic"* ]]; then + chmod +x $KOKORO_GFILE_DIR/linux_amd64/flakybot + $KOKORO_GFILE_DIR/linux_amd64/flakybot + fi + + if [[ $EXIT -ne 0 ]]; then + RTN=1 + echo -e "\n Testing failed: Nox returned a non-zero exit code. \n" + else + echo -e "\n Testing completed.\n" + fi + +done +cd "$ROOT" + +# Workaround for Kokoro permissions issue: delete secrets +rm testing/{test-env.sh,client-secrets.json,service-account.json} + +exit "$RTN" diff --git a/.kokoro/test-samples.sh b/.kokoro/test-samples.sh index 8fb79dcb..8038bde4 100755 --- a/.kokoro/test-samples.sh +++ b/.kokoro/test-samples.sh @@ -13,6 +13,10 @@ # See the License for the specific language governing permissions and # limitations under the License. +# The default test runner for samples. +# +# For periodic builds, we rewinds the repo to the latest release, and +# run test-samples-impl.sh. # `-e` enables the script to automatically fail when a command fails # `-o pipefail` sets the exit code to the rightmost comment to exit with a non-zero @@ -24,87 +28,19 @@ cd github/python-datacatalog # Run periodic samples tests at latest release if [[ $KOKORO_BUILD_ARTIFACTS_SUBDIR = *"periodic"* ]]; then + # preserving the test runner implementation. + cp .kokoro/test-samples-impl.sh "${TMPDIR}/test-samples-impl.sh" + echo "--- IMPORTANT IMPORTANT IMPORTANT ---" + echo "Now we rewind the repo back to the latest release..." LATEST_RELEASE=$(git describe --abbrev=0 --tags) git checkout $LATEST_RELEASE -fi - -# Exit early if samples directory doesn't exist -if [ ! -d "./samples" ]; then - echo "No tests run. `./samples` not found" - exit 0 -fi - -# Disable buffering, so that the logs stream through. 
-export PYTHONUNBUFFERED=1 - -# Debug: show build environment -env | grep KOKORO - -# Install nox -python3.6 -m pip install --upgrade --quiet nox - -# Use secrets acessor service account to get secrets -if [[ -f "${KOKORO_GFILE_DIR}/secrets_viewer_service_account.json" ]]; then - gcloud auth activate-service-account \ - --key-file="${KOKORO_GFILE_DIR}/secrets_viewer_service_account.json" \ - --project="cloud-devrel-kokoro-resources" -fi - -# This script will create 3 files: -# - testing/test-env.sh -# - testing/service-account.json -# - testing/client-secrets.json -./scripts/decrypt-secrets.sh - -source ./testing/test-env.sh -export GOOGLE_APPLICATION_CREDENTIALS=$(pwd)/testing/service-account.json - -# For cloud-run session, we activate the service account for gcloud sdk. -gcloud auth activate-service-account \ - --key-file "${GOOGLE_APPLICATION_CREDENTIALS}" - -export GOOGLE_CLIENT_SECRETS=$(pwd)/testing/client-secrets.json - -echo -e "\n******************** TESTING PROJECTS ********************" - -# Switch to 'fail at end' to allow all tests to complete before exiting. -set +e -# Use RTN to return a non-zero value if the test fails. -RTN=0 -ROOT=$(pwd) -# Find all requirements.txt in the samples directory (may break on whitespace). -for file in samples/**/requirements.txt; do - cd "$ROOT" - # Navigate to the project folder. - file=$(dirname "$file") - cd "$file" - - echo "------------------------------------------------------------" - echo "- testing $file" - echo "------------------------------------------------------------" - - # Use nox to execute the tests for the project. - python3.6 -m nox -s "$RUN_TESTS_SESSION" - EXIT=$? - - # If this is a periodic build, send the test log to the FlakyBot. - # See https://github.com/googleapis/repo-automation-bots/tree/master/packages/flakybot. - if [[ $KOKORO_BUILD_ARTIFACTS_SUBDIR = *"periodic"* ]]; then - chmod +x $KOKORO_GFILE_DIR/linux_amd64/flakybot - $KOKORO_GFILE_DIR/linux_amd64/flakybot + echo "The current head is: " + echo $(git rev-parse --verify HEAD) + echo "--- IMPORTANT IMPORTANT IMPORTANT ---" + # move back the test runner implementation if there's no file. + if [ ! -f .kokoro/test-samples-impl.sh ]; then + cp "${TMPDIR}/test-samples-impl.sh" .kokoro/test-samples-impl.sh fi +fi - if [[ $EXIT -ne 0 ]]; then - RTN=1 - echo -e "\n Testing failed: Nox returned a non-zero exit code. 
\n" - else - echo -e "\n Testing completed.\n" - fi - -done -cd "$ROOT" - -# Workaround for Kokoro permissions issue: delete secrets -rm testing/{test-env.sh,client-secrets.json,service-account.json} - -exit "$RTN" +exec .kokoro/test-samples-impl.sh diff --git a/google/cloud/datacatalog_v1beta1/__init__.py b/google/cloud/datacatalog_v1beta1/__init__.py index 8bc01583..16534418 100644 --- a/google/cloud/datacatalog_v1beta1/__init__.py +++ b/google/cloud/datacatalog_v1beta1/__init__.py @@ -103,7 +103,6 @@ "CreateTagTemplateFieldRequest", "CreateTagTemplateRequest", "CreateTaxonomyRequest", - "DataCatalogClient", "DeleteEntryGroupRequest", "DeleteEntryRequest", "DeletePolicyTagRequest", @@ -140,6 +139,7 @@ "ListTaxonomiesResponse", "LookupEntryRequest", "PolicyTag", + "PolicyTagManagerClient", "PolicyTagManagerSerializationClient", "RenameTagTemplateFieldRequest", "Schema", @@ -165,5 +165,5 @@ "UpdateTagTemplateRequest", "UpdateTaxonomyRequest", "ViewSpec", - "PolicyTagManagerClient", + "DataCatalogClient", ) diff --git a/synth.metadata b/synth.metadata index fdc054a6..981413d7 100644 --- a/synth.metadata +++ b/synth.metadata @@ -4,7 +4,7 @@ "git": { "name": ".", "remote": "https://github.com/googleapis/python-datacatalog.git", - "sha": "cde0673fbfb2289bc017268d3799eb428162f3d8" + "sha": "931c1c460c40c7c1b7fe32ef16c48275be2e5a16" } }, { @@ -19,14 +19,14 @@ "git": { "name": "synthtool", "remote": "https://github.com/googleapis/synthtool.git", - "sha": "eda422b90c3dde4a872a13e6b78a8f802c40d0db" + "sha": "79c8dd7ee768292f933012d3a69a5b4676404cda" } }, { "git": { "name": "synthtool", "remote": "https://github.com/googleapis/synthtool.git", - "sha": "eda422b90c3dde4a872a13e6b78a8f802c40d0db" + "sha": "79c8dd7ee768292f933012d3a69a5b4676404cda" } } ], @@ -83,16 +83,21 @@ ".kokoro/samples/lint/presubmit.cfg", ".kokoro/samples/python3.6/common.cfg", ".kokoro/samples/python3.6/continuous.cfg", + ".kokoro/samples/python3.6/periodic-head.cfg", ".kokoro/samples/python3.6/periodic.cfg", ".kokoro/samples/python3.6/presubmit.cfg", ".kokoro/samples/python3.7/common.cfg", ".kokoro/samples/python3.7/continuous.cfg", + ".kokoro/samples/python3.7/periodic-head.cfg", ".kokoro/samples/python3.7/periodic.cfg", ".kokoro/samples/python3.7/presubmit.cfg", ".kokoro/samples/python3.8/common.cfg", ".kokoro/samples/python3.8/continuous.cfg", + ".kokoro/samples/python3.8/periodic-head.cfg", ".kokoro/samples/python3.8/periodic.cfg", ".kokoro/samples/python3.8/presubmit.cfg", + ".kokoro/test-samples-against-head.sh", + ".kokoro/test-samples-impl.sh", ".kokoro/test-samples.sh", ".kokoro/trampoline.sh", ".kokoro/trampoline_v2.sh", From fba0d036a27a4895aad975c7b63c7917c18de2b9 Mon Sep 17 00:00:00 2001 From: "release-please[bot]" <55107282+release-please[bot]@users.noreply.github.com> Date: Mon, 22 Mar 2021 14:56:02 +0000 Subject: [PATCH 26/26] chore: release 3.1.0 (#108) :robot: I have created a release \*beep\* \*boop\* --- ## [3.1.0](https://www.github.com/googleapis/python-datacatalog/compare/v3.0.0...v3.1.0) (2021-03-22) ### Features * add `client_cert_source_for_mtls` argument to transports ([#107](https://www.github.com/googleapis/python-datacatalog/issues/107)) 
([59a44bc](https://www.github.com/googleapis/python-datacatalog/commit/59a44bc744a6322a2a23313c851eb77204110e79)) ### Bug Fixes * remove gRPC send/recv limit; add enums to `types/__init__.py` ([#87](https://www.github.com/googleapis/python-datacatalog/issues/87)) ([e0c40c7](https://www.github.com/googleapis/python-datacatalog/commit/e0c40c765242868570532b5074fd239aa2c259e9)) ### Documentation * document enum values with `undoc-members` option ([#93](https://www.github.com/googleapis/python-datacatalog/issues/93)) ([2dbb3ef](https://www.github.com/googleapis/python-datacatalog/commit/2dbb3ef062b52925ad421c5c469ed6e67671e878)) * fix `type_` attribute name in the migration guide ([#113](https://www.github.com/googleapis/python-datacatalog/issues/113)) ([2f98f22](https://www.github.com/googleapis/python-datacatalog/commit/2f98f2244271d92f79fdb26103478166958b8c8a)) * fix upgrade guide ([#114](https://www.github.com/googleapis/python-datacatalog/issues/114)) ([4bfa587](https://www.github.com/googleapis/python-datacatalog/commit/4bfa587903105cb3de2272618374df0b04156017)) * update the upgrade guide to be from 1.0 to 3.0 ([#77](https://www.github.com/googleapis/python-datacatalog/issues/77)) ([eed034a](https://www.github.com/googleapis/python-datacatalog/commit/eed034a3969913e40554300ae97c5e00e4fcc79a)) --- This PR was generated with [Release Please](https://github.com/googleapis/release-please). See [documentation](https://github.com/googleapis/release-please#release-please). 
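As a rough illustration of the `client_cert_source_for_mtls` feature listed above: the argument is a callable that supplies a client certificate/key pair when a transport is built. This is a minimal sketch only — the transport import path, the mTLS host, and the certificate file names are assumptions based on the generated GAPIC layout, not details confirmed by this changelog.

```py
# Sketch: assumed GAPIC transport layout; file names and host are illustrative.
from google.cloud import datacatalog_v1beta1
from google.cloud.datacatalog_v1beta1.services.data_catalog.transports import (
    DataCatalogGrpcTransport,
)


def client_cert_source():
    # Hypothetical loader: return (certificate bytes, private key bytes)
    # for the client-side mTLS certificate.
    with open("client_cert.pem", "rb") as cert, open("client_key.pem", "rb") as key:
        return cert.read(), key.read()


# Passing the callable enables mutual TLS on the underlying gRPC channel.
transport = DataCatalogGrpcTransport(
    host="datacatalog.mtls.googleapis.com",
    client_cert_source_for_mtls=client_cert_source,
)
client = datacatalog_v1beta1.DataCatalogClient(transport=transport)
```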
--- CHANGELOG.md | 20 ++++++++++++++++++++ setup.py | 2 +- 2 files changed, 21 insertions(+), 1 deletion(-) diff --git a/CHANGELOG.md b/CHANGELOG.md index a7dbc5b1..d5eb10dc 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -4,6 +4,26 @@ [1]: https://pypi.org/project/google-cloud-datacatalog/#history +## [3.1.0](https://www.github.com/googleapis/python-datacatalog/compare/v3.0.0...v3.1.0) (2021-03-22) + + +### Features + +* add `client_cert_source_for_mtls` argument to transports ([#107](https://www.github.com/googleapis/python-datacatalog/issues/107)) ([59a44bc](https://www.github.com/googleapis/python-datacatalog/commit/59a44bc744a6322a2a23313c851eb77204110e79)) + + +### Bug Fixes + +* remove gRPC send/recv limit; add enums to `types/__init__.py` ([#87](https://www.github.com/googleapis/python-datacatalog/issues/87)) ([e0c40c7](https://www.github.com/googleapis/python-datacatalog/commit/e0c40c765242868570532b5074fd239aa2c259e9)) + + +### Documentation + +* document enum values with `undoc-members` option ([#93](https://www.github.com/googleapis/python-datacatalog/issues/93)) ([2dbb3ef](https://www.github.com/googleapis/python-datacatalog/commit/2dbb3ef062b52925ad421c5c469ed6e67671e878)) +* fix `type_` attribute name in the migration guide ([#113](https://www.github.com/googleapis/python-datacatalog/issues/113)) ([2f98f22](https://www.github.com/googleapis/python-datacatalog/commit/2f98f2244271d92f79fdb26103478166958b8c8a)) +* fix upgrade guide ([#114](https://www.github.com/googleapis/python-datacatalog/issues/114)) ([4bfa587](https://www.github.com/googleapis/python-datacatalog/commit/4bfa587903105cb3de2272618374df0b04156017)) +* update the upgrade guide to be from 1.0 to 3.0 ([#77](https://www.github.com/googleapis/python-datacatalog/issues/77)) ([eed034a](https://www.github.com/googleapis/python-datacatalog/commit/eed034a3969913e40554300ae97c5e00e4fcc79a)) + ## [3.0.0](https://www.github.com/googleapis/python-datacatalog/compare/v2.0.0...v3.0.0) (2020-11-17) diff --git a/setup.py b/setup.py index e302810c..fc8dbc1a 100644 --- a/setup.py +++ b/setup.py @@ -21,7 +21,7 @@ name = "google-cloud-datacatalog" description = "Google Cloud Data Catalog API API client library" -version = "3.0.0" +version = "3.1.0" # Should be one of: # 'Development Status :: 3 - Alpha' # 'Development Status :: 4 - Beta'