Send Sensitive Data Protection inspection job results to Security Command Center

This guide walks you through inspecting data in Cloud Storage, Firestore in Datastore mode (Datastore), or BigQuery, and sending the inspection results to Security Command Center.

For BigQuery data, you can also run profiling, which is different from inspection. You can send data profiles to Security Command Center as well. For more information, see "Publish data profiles to Security Command Center."

Overview

Security Command Center lets you gather data about security threats, identify them, and act on them before they cause business damage or loss. With Security Command Center, you can perform many security-related actions from a single, centralized dashboard.

Sensitive Data Protection has built-in integration with Security Command Center. When you use a Sensitive Data Protection action to inspect your Google Cloud storage repositories for sensitive data, the results are sent directly to the Security Command Center dashboard, where they appear alongside your other security metrics.

By completing the steps in this guide, you do the following:

  • Enable Security Command Center and Sensitive Data Protection.
  • Set up Sensitive Data Protection to inspect a Google Cloud storage repository (a Cloud Storage bucket, a BigQuery table, or a Datastore kind).
  • Set up a Sensitive Data Protection scan that sends inspection job results to Security Command Center.

For more information about Security Command Center, see the Security Command Center documentation.

To send the results of a discovery scan (as opposed to an inspection job) to Security Command Center, see the documentation on profiling organizations, folders, and projects.

Costs

In this document, you use the following billable components of Google Cloud:

  • Sensitive Data Protection
  • Cloud Storage
  • BigQuery
  • Datastore

To generate a cost estimate based on your projected usage, use the Pricing Calculator.

New Google Cloud users might be eligible for a free trial.

Before you begin

Before you can send Sensitive Data Protection scan results to Security Command Center, you must complete each of the following:

  • Step 1: Set up Google Cloud storage repositories.
  • Step 2: Set Identity and Access Management (IAM) roles.
  • Step 3: Enable Security Command Center.
  • Step 4: Enable Sensitive Data Protection.
  • Step 5: Enable Sensitive Data Protection as an integrated service in Security Command Center.

The steps for setting up these components are described in the following sections.

Step 1: Set up Google Cloud storage repositories

Choose whether to scan your own Google Cloud storage repository or a sample repository. This topic provides instructions for both scenarios.

Scan your own data

To scan your own existing Cloud Storage bucket, BigQuery table, or Datastore kind, first open the project that the repository is in. In later steps, you enable Security Command Center and Sensitive Data Protection for this project and its organization.

After you open the project that you want to use, proceed to Step 2 to set up some IAM roles.

Scan sample data

To scan a test dataset, first make sure that you have a billing account set up, and then create a new project. To complete this step, you must have the Project Creator IAM role. Learn more about IAM roles.

  1. If you haven't already set up billing, set up a billing account.

    Learn how to enable billing

  2. Go to the New Project page in the Google Cloud console.

    Go to New Project

  3. On the Billing account drop-down list, select the billing account that the project should be billed to.
  4. On the Organization drop-down list, select the organization in which you want to create the project.
  5. On the Location drop-down list, select the organization or folder in which you want to create the project.

Next, download and store the sample data:

  1. Go to the Cloud Run functions tutorial repository on GitHub.
  2. Click Clone or download, and then click Download ZIP.
  3. Extract the ZIP file that you downloaded.
  4. Go to the Storage Browser page in the Google Cloud console.

    Go to Cloud Storage

  5. Click Create bucket.
  6. On the Create a bucket page, give the bucket a unique name, and then click Create.
  7. On the Bucket details page, click Upload folder.
  8. Browse to the extracted dlp-cloud-functions-tutorials-master folder, open it, and then select the sample_data folder. Click Upload to upload the folder's contents to Cloud Storage.

Note the name that you gave your Cloud Storage bucket; you'll need it in a later step. When the files finish uploading, you're ready to proceed.

Step 2: Set IAM roles

To use Sensitive Data Protection to send scan results to Security Command Center, you need the Security Center Admin and DLP Jobs Editor IAM roles. This section describes how to add the roles. To complete this section, you must have the Organization Administrator IAM role.

  1. Go to the IAM page.

    Go to the IAM page

  2. On the View by principals tab, find your Google Account, and then click Edit principal.
  3. Add the Security Center Admin and DLP Jobs Editor roles:

    1. In the Edit access panel, click Add another role.
    2. In the Select a role list, search for and select Security Center Admin.
    3. Click Add another role.
    4. In the Select a role list, search for and select DLP Jobs Editor.
    5. Click Save.

You now have the DLP Jobs Editor and Security Center Admin roles for your organization. These roles let you complete the tasks in the rest of this topic.

Step 3: Enable Security Command Center

  1. Go to the Security Command Center page in the Google Cloud console.

    Go to Security Command Center

  2. On the Organization drop-down list, select the organization for which you want to enable Sensitive Data Protection, and then click Select.

  3. On the Enable asset discovery page that appears, select All current and future projects, and then click Enable. A message should state that Security Command Center is beginning asset discovery.

After asset discovery is complete, Security Command Center displays your supported Google Cloud assets. Asset discovery can take several minutes, and you might need to refresh the page for the assets to appear.

For more information about enabling Security Command Center, see the Security Command Center documentation.

Step 4: Enable Sensitive Data Protection

Enable Sensitive Data Protection for the project that you want to scan. The project must be in the same organization for which you enabled Security Command Center. To enable Sensitive Data Protection using the Google Cloud console, do the following:

  1. In the Google Cloud console, go to the Enable access to API page.

    Enable the API

  2. On the toolbar, select the project from Step 1 of this guide. The project must contain the Cloud Storage bucket, BigQuery table, or Datastore kind that you want to scan.
  3. Click Next.
  4. Click Enable.

Sensitive Data Protection is now enabled for your project.

Step 5: Enable Sensitive Data Protection as an integrated service in Security Command Center

To view Sensitive Data Protection scan results in Security Command Center, enable Sensitive Data Protection as an integrated service. For more information, see "Add a Google Cloud integrated service" in the Security Command Center documentation.

Findings from Sensitive Data Protection appear on the Findings page in Security Command Center.

Configure and run a Sensitive Data Protection inspection scan

In this section, you configure and run a Sensitive Data Protection inspection job.

The inspection job that you configure here instructs Sensitive Data Protection to scan either the sample data stored in Cloud Storage or your own data stored in Cloud Storage, Datastore, or BigQuery. The job configuration you specify also instructs Sensitive Data Protection to save its scan results to Security Command Center.
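
At a high level, every request body that you build in the following steps has three parts: a "storageConfig" that points at the data to scan, an "inspectConfig" that lists the infoTypes to look for, and an "actions" list containing "publishSummaryToCscc", which is what sends the results to Security Command Center. The following skeleton only illustrates that shape (the bucket URL is a placeholder); use one of the complete request bodies from Step 2 for your actual request:

{
  "inspectJob":{
    "storageConfig":{
      "cloudStorageOptions":{
        "fileSet":{
          "url":"gs://BUCKET_NAME/**"
        }
      }
    },
    "inspectConfig":{
      "infoTypes":[
        {
          "name":"EMAIL_ADDRESS"
        }
      ]
    },
    "actions":[
      {
        "publishSummaryToCscc":{

        }
      }
    ]
  }
}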

Step 1: Note your project ID

  1. Go to the Google Cloud console.

    Go to the Google Cloud console

  2. Click Select.
  3. On the Select from drop-down list, select the organization for which you enabled Security Command Center.
  4. Under ID, copy the project ID of the project that contains the data you want to scan.
  5. Under Name, click the project to select it.

Step 2: Open APIs Explorer and configure the job

  1. Go to APIs Explorer on the reference page for the dlpJobs.create method by clicking the following button:

    Open APIs Explorer

  2. In the parent box, enter the following, where PROJECT_ID is the project ID you noted in Step 1:
    projects/PROJECT_ID

Replace the contents of the Request body field with the following JSON for the kind of data you want to use: the sample data in a Cloud Storage bucket, or your own data stored in Cloud Storage, Datastore, or BigQuery.

Sample data

If you created a Cloud Storage bucket to store the sample data, copy the following JSON and paste it into the Request body field. Replace BUCKET_NAME with the name you gave your Cloud Storage bucket:

{
  "inspectJob":{
    "storageConfig":{
      "cloudStorageOptions":{
        "fileSet":{
          "url":"gs://BUCKET_NAME/**"
        }
      }
    },
    "inspectConfig":{
      "infoTypes":[
        {
          "name":"EMAIL_ADDRESS"
        },
        {
          "name":"PERSON_NAME"
        },
        {
          "name": "LOCATION"
        },
        {
          "name":"PHONE_NUMBER"
        }
      ],
      "includeQuote":true,
      "minLikelihood":"UNLIKELY",
      "limits":{
        "maxFindingsPerRequest":100
      }
    },
    "actions":[
      {
        "publishSummaryToCscc":{

        }
      }
    ]
  }
}

Cloud Storage data

To scan your own Cloud Storage bucket, copy the following JSON and paste it into the Request body field.

Replace PATH_NAME with the path to the location that you want to scan. To scan recursively, end the path with two asterisks, for example gs://path_to_files/**. To scan only a specific directory and no deeper, end the path with one asterisk, for example gs://path_to_files/*.

{
  "inspectJob":{
    "storageConfig":{
      "cloudStorageOptions":{
        "fileSet":{
          "url":"gs://PATH_NAME"
        }
      }
    },
    "inspectConfig":{
      "infoTypes":[
        {
          "name":"EMAIL_ADDRESS"
        },
        {
          "name":"PERSON_NAME"
        },
        {
          "name": "LOCATION"
        },
        {
          "name":"PHONE_NUMBER"
        }
      ],
      "includeQuote":true,
      "minLikelihood":"UNLIKELY",
      "limits":{
        "maxFindingsPerRequest":100
      }
    },
    "actions":[
      {
        "publishSummaryToCscc":{

        }
      }
    ]
  }
}

For more information about available scan options, see "Inspecting storage and databases for sensitive data."

Datastore data

To scan your own data kept in Datastore, copy the following JSON and paste it into the Request body field.

Replace DATASTORE_KIND with the name of your Datastore kind. You can also replace NAMESPACE_ID and PROJECT_ID with your namespace and project identifiers, respectively, or remove "partitionId" entirely if you prefer.

{
  "inspectJob":{
    "storageConfig":{
      "datastoreOptions":{
        "kind":{
          "name":"DATASTORE_KIND"
        },
        "partitionId":{
          "namespaceId":"NAMESPACE_ID",
          "projectId":"PROJECT_ID"
        }
      }
    },
    "inspectConfig":{
      "infoTypes":[
        {
          "name":"EMAIL_ADDRESS"
        },
        {
          "name":"PERSON_NAME"
        },
        {
          "name": "LOCATION"
        },
        {
          "name":"PHONE_NUMBER"
        }
      ],
      "includeQuote":true,
      "minLikelihood":"UNLIKELY",
      "limits":{
        "maxFindingsPerRequest":100
      }
    },
    "actions":[
      {
        "publishSummaryToCscc":{

        }
      }
    ]
  }
}

For more information about available scan options, see "Inspecting storage and databases for sensitive data."

BigQuery data

To scan your own BigQuery table, copy the following JSON and paste it into the Request body field.

Replace PROJECT_ID, BIGQUERY_DATASET_NAME, and BIGQUERY_TABLE_NAME with the project ID and the BigQuery dataset and table names, respectively.

{
  "inspectJob":
  {
    "storageConfig":
    {
      "bigQueryOptions":
      {
        "tableReference":
        {
          "projectId": "PROJECT_ID",
          "datasetId": "BIGQUERY_DATASET_NAME",
          "tableId": "BIGQUERY_TABLE_NAME"
        }
      }
    },
    "inspectConfig":
    {
      "infoTypes":
      [
        {
          "name": "EMAIL_ADDRESS"
        },
        {
          "name": "PERSON_NAME"
        },
        {
          "name": "LOCATION"
        },
        {
          "name": "PHONE_NUMBER"
        }
      ],
      "includeQuote": true,
      "minLikelihood": "UNLIKELY",
      "limits":
      {
        "maxFindingsPerRequest": 100
      }
    },
    "actions":
    [
      {
        "publishSummaryToCscc":
        {
        }
      }
    ]
  }
}

For more information about available scan options, see "Inspecting storage and databases for sensitive data."

Step 3: Execute the request to start the inspection job

After you configure the job by following the preceding steps, click Execute to send the request. If the request is successful, a response appears below the request with a success code and a JSON object that indicates the status of the Sensitive Data Protection job you created.

Check the status of the Sensitive Data Protection inspection scan

The response to your scan request includes the job ID of your inspection scan job (the "name" key) and the current state of the inspection job (the "state" key). Immediately after you submit the request, the job's state is "PENDING".

After you submit the scan request, the scan of your content begins immediately.
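
For reference, an abridged dlpJobs.create response looks similar to the following; the job ID and timestamp are illustrative placeholders, and additional fields are omitted:

{
  "name": "projects/PROJECT_ID/dlpJobs/i-1234567890123456789",
  "type": "INSPECT_JOB",
  "state": "PENDING",
  "createTime": "2024-05-01T12:00:00.000Z"
}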

To check the status of the inspection job, follow these steps:

  1. Go to APIs Explorer on the reference page for the dlpJobs.get method by clicking the following button:

    Open APIs Explorer

  2. In the name box, enter the name of the job from the JSON response to the scan request, in the following format:
    projects/PROJECT_ID/dlpJobs/JOB_ID
    The job ID is in the format i-1234567890123456789.
  3. To submit the request, click Execute.

If the "state" key of the response JSON object indicates that the job is "DONE", then the inspection job has finished.

To view the rest of the response JSON, scroll down the page. Under "result" > "infoTypeStats", each info type listed should have a corresponding "count". If not, make sure that you entered the JSON accurately, and that the path or location of your data is correct.
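
As an illustration of what to look for, the relevant portion of a completed job's response resembles the following excerpt; the byte and finding counts are placeholders:

"result": {
  "processedBytes": "2571921",
  "infoTypeStats": [
    {
      "infoType": {
        "name": "EMAIL_ADDRESS"
      },
      "count": "10"
    },
    {
      "infoType": {
        "name": "PERSON_NAME"
      },
      "count": "30"
    }
  ]
}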

After the inspection job is done, you can continue to the next section of this guide to view the scan results in Security Command Center.

Code example: Inspect a Cloud Storage bucket

This example demonstrates how to use the DLP API to create an inspection job that inspects a Cloud Storage bucket and sends findings to Security Command Center.

C#

To learn how to install and use the client library for Sensitive Data Protection, see the Sensitive Data Protection client libraries documentation.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see "Set up authentication for a local development environment."


using System.Collections.Generic;
using System.Linq;
using Google.Api.Gax.ResourceNames;
using Google.Cloud.Dlp.V2;
using static Google.Cloud.Dlp.V2.InspectConfig.Types;

public class InspectStorageWithSCCIntegration
{
    public static DlpJob SendGcsData(
        string projectId,
        string gcsPath,
        Likelihood minLikelihood = Likelihood.Unlikely,
        IEnumerable<InfoType> infoTypes = null)
    {
        // Instantiate the dlp client.
        var dlp = DlpServiceClient.Create();

        // Specify the GCS file to be inspected.
        var storageConfig = new StorageConfig
        {
            CloudStorageOptions = new CloudStorageOptions
            {
                FileSet = new CloudStorageOptions.Types.FileSet
                {
                    Url = gcsPath
                }
            }
        };

        // Specify the type of info to be inspected and construct the inspect config.
        var inspectConfig = new InspectConfig
        {
            InfoTypes =
            {
                infoTypes ?? new InfoType[]
                {
                    new InfoType { Name = "EMAIL_ADDRESS" },
                    new InfoType { Name = "PERSON_NAME" },
                    new InfoType { Name = "LOCATION" },
                    new InfoType { Name = "PHONE_NUMBER" }
                }
            },
            IncludeQuote = true,
            MinLikelihood = minLikelihood,
            Limits = new FindingLimits
            {
                MaxFindingsPerRequest = 100
            }
        };

        // Construct the SCC action which will be performed after inspecting the storage.
        var actions = new Action[]
        {
            new Action
            {
                PublishSummaryToCscc = new Action.Types.PublishSummaryToCscc()
            }
        };

        // Construct the inspect job config using storage config, inspect config and action.
        var inspectJob = new InspectJobConfig
        {
            StorageConfig = storageConfig,
            InspectConfig = inspectConfig,
            Actions = { actions }
        };

        // Construct the request.
        var request = new CreateDlpJobRequest
        {
            ParentAsLocationName = new LocationName(projectId, "global"),
            InspectJob = inspectJob
        };

        // Call the API.
        DlpJob response = dlp.CreateDlpJob(request);

        return response;
    }
}

Go

To learn how to install and use the client library for Sensitive Data Protection, see the Sensitive Data Protection client libraries documentation.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see "Set up authentication for a local development environment."

import (
	"context"
	"fmt"
	"io"

	dlp "cloud.google.com/go/dlp/apiv2"
	"cloud.google.com/go/dlp/apiv2/dlppb"
)

// inspectGCSFileSendToScc inspects sensitive data in a Google Cloud Storage (GCS) file
// and sends the inspection results to Google Cloud Security Command Center (SCC) for further analysis.
func inspectGCSFileSendToScc(w io.Writer, projectID, gcsPath string) error {
	// projectID := "my-project-id"
	// gcsPath := "gs://" + "your-bucket-name" + "/path/to/file.txt"

	ctx := context.Background()

	// Initialize a client once and reuse it to send multiple requests. Clients
	// are safe to use across goroutines. When the client is no longer needed,
	// call the Close method to cleanup its resources.
	client, err := dlp.NewClient(ctx)
	if err != nil {
		return err
	}

	// Closing the client safely cleans up background resources.
	defer client.Close()

	// Specify the GCS file to be inspected.
	cloudStorageOptions := &dlppb.CloudStorageOptions{
		FileSet: &dlppb.CloudStorageOptions_FileSet{
			Url: gcsPath,
		},
	}

	// storageCfg represents the configuration for data inspection in various storage types.
	storageConfig := &dlppb.StorageConfig{
		Type: &dlppb.StorageConfig_CloudStorageOptions{
			CloudStorageOptions: cloudStorageOptions,
		},
	}

	// Specify the type of info the inspection will look for.
	// See https://cloud.google.com/dlp/docs/infotypes-reference for complete list of info types
	infoTypes := []*dlppb.InfoType{
		{Name: "EMAIL_ADDRESS"},
		{Name: "PERSON_NAME"},
		{Name: "LOCATION"},
		{Name: "PHONE_NUMBER"},
	}

	// The minimum likelihood required before returning a match.
	minLikelihood := dlppb.Likelihood_UNLIKELY

	// The maximum number of findings to report (0 = server maximum).
	findingLimits := &dlppb.InspectConfig_FindingLimits{
		MaxFindingsPerItem: 100,
	}

	inspectConfig := &dlppb.InspectConfig{
		InfoTypes:     infoTypes,
		MinLikelihood: minLikelihood,
		Limits:        findingLimits,
		IncludeQuote:  true,
	}

	// Specify the action that is triggered when the job completes.
	action := &dlppb.Action{
		Action: &dlppb.Action_PublishSummaryToCscc_{
			PublishSummaryToCscc: &dlppb.Action_PublishSummaryToCscc{},
		},
	}

	// Configure the inspection job we want the service to perform.
	inspectJobConfig := &dlppb.InspectJobConfig{
		StorageConfig: storageConfig,
		InspectConfig: inspectConfig,
		Actions: []*dlppb.Action{
			action,
		},
	}

	// Create the request for the job configured above.
	req := &dlppb.CreateDlpJobRequest{
		Parent: fmt.Sprintf("projects/%s/locations/global", projectID),
		Job: &dlppb.CreateDlpJobRequest_InspectJob{
			InspectJob: inspectJobConfig,
		},
	}

	// Send the request.
	resp, err := client.CreateDlpJob(ctx, req)
	if err != nil {
		return err
	}

	// Print the result.
	fmt.Fprintf(w, "Job created successfully: %v", resp.Name)
	return nil
}

Java

To learn how to install and use the client library for Sensitive Data Protection, see the Sensitive Data Protection client libraries documentation.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see "Set up authentication for a local development environment."


import com.google.cloud.dlp.v2.DlpServiceClient;
import com.google.privacy.dlp.v2.Action;
import com.google.privacy.dlp.v2.CloudStorageOptions;
import com.google.privacy.dlp.v2.CreateDlpJobRequest;
import com.google.privacy.dlp.v2.DlpJob;
import com.google.privacy.dlp.v2.InfoType;
import com.google.privacy.dlp.v2.InfoTypeStats;
import com.google.privacy.dlp.v2.InspectConfig;
import com.google.privacy.dlp.v2.InspectDataSourceDetails;
import com.google.privacy.dlp.v2.InspectJobConfig;
import com.google.privacy.dlp.v2.Likelihood;
import com.google.privacy.dlp.v2.LocationName;
import com.google.privacy.dlp.v2.StorageConfig;
import java.io.IOException;
import java.util.List;
import java.util.concurrent.TimeUnit;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class InspectGcsFileSendToScc {

  private static final int TIMEOUT_MINUTES = 15;

  public static void main(String[] args) throws Exception {
    // TODO(developer): Replace these variables before running the sample.
    // The Google Cloud project id to use as a parent resource.
    String projectId = "your-project-id";
    // The name of the file in the Google Cloud Storage bucket.
    String gcsPath = "gs://" + "your-bucket-name" + "/path/to/file.txt";
    createJobSendToScc(projectId, gcsPath);
  }

  // Creates a DLP Job to scan the sample data stored in a Cloud Storage and save its scan results
  // to Security Command Center.
  public static void createJobSendToScc(String projectId, String gcsPath)
      throws IOException, InterruptedException {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (DlpServiceClient dlpServiceClient = DlpServiceClient.create()) {

      // Specify the GCS file to be inspected.
      CloudStorageOptions cloudStorageOptions =
          CloudStorageOptions.newBuilder()
              .setFileSet(CloudStorageOptions.FileSet.newBuilder().setUrl(gcsPath))
              .build();

      StorageConfig storageConfig =
          StorageConfig.newBuilder()
              .setCloudStorageOptions(cloudStorageOptions)
              .build();

      // Specify the type of info the inspection will look for.
      // See https://cloud.google.com/dlp/docs/infotypes-reference for complete list of info types
      List<InfoType> infoTypes =
          Stream.of("EMAIL_ADDRESS", "PERSON_NAME", "LOCATION", "PHONE_NUMBER")
              .map(it -> InfoType.newBuilder().setName(it).build())
              .collect(Collectors.toList());

      // The minimum likelihood required before returning a match.
      // See: https://cloud.google.com/dlp/docs/likelihood
      Likelihood minLikelihood = Likelihood.UNLIKELY;

      // The maximum number of findings to report (0 = server maximum)
      InspectConfig.FindingLimits findingLimits =
          InspectConfig.FindingLimits.newBuilder().setMaxFindingsPerItem(100).build();

      InspectConfig inspectConfig =
          InspectConfig.newBuilder()
              .addAllInfoTypes(infoTypes)
              .setIncludeQuote(true)
              .setMinLikelihood(minLikelihood)
              .setLimits(findingLimits)
              .build();

      // Specify the action that is triggered when the job completes.
      Action.PublishSummaryToCscc publishSummaryToCscc =
          Action.PublishSummaryToCscc.getDefaultInstance();
      Action action = Action.newBuilder().setPublishSummaryToCscc(publishSummaryToCscc).build();

      // Configure the inspection job we want the service to perform.
      InspectJobConfig inspectJobConfig =
          InspectJobConfig.newBuilder()
              .setInspectConfig(inspectConfig)
              .setStorageConfig(storageConfig)
              .addActions(action)
              .build();

      // Construct the job creation request to be sent by the client.
      CreateDlpJobRequest createDlpJobRequest =
          CreateDlpJobRequest.newBuilder()
              .setParent(LocationName.of(projectId, "global").toString())
              .setInspectJob(inspectJobConfig)
              .build();

      // Send the job creation request and process the response.
      DlpJob response = dlpServiceClient.createDlpJob(createDlpJobRequest);
      // Get the current time.
      long startTime = System.currentTimeMillis();

      // Check if the job state is DONE.
      while (response.getState() != DlpJob.JobState.DONE) {
        // Sleep for 30 second.
        Thread.sleep(30000);

        // Get the updated job status.
        response = dlpServiceClient.getDlpJob(response.getName());

        // Check if the timeout duration has exceeded.
        long elapsedTime = System.currentTimeMillis() - startTime;
        if (TimeUnit.MILLISECONDS.toMinutes(elapsedTime) >= TIMEOUT_MINUTES) {
          System.out.printf("Job did not complete within %d minutes.%n", TIMEOUT_MINUTES);
          break;
        }
      }
      // Print the results.
      System.out.println("Job status: " + response.getState());
      System.out.println("Job name: " + response.getName());
      InspectDataSourceDetails.Result result = response.getInspectDetails().getResult();
      System.out.println("Findings: ");
      for (InfoTypeStats infoTypeStat : result.getInfoTypeStatsList()) {
        System.out.print("\tInfo type: " + infoTypeStat.getInfoType().getName());
        System.out.println("\tCount: " + infoTypeStat.getCount());
      }
    }
  }
}

Node.js

To learn how to install and use the client library for Sensitive Data Protection, see the Sensitive Data Protection client libraries documentation.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see "Set up authentication for a local development environment."

// Imports the Google Cloud Data Loss Prevention library
const DLP = require('@google-cloud/dlp');

// Instantiates a client
const dlpClient = new DLP.DlpServiceClient();

// The project ID to run the API call under
// const projectId = 'your-project-id';

// The name of the file in the bucket
// const gcsPath = 'gcs-file-path';

async function inspectGCSSendToScc() {
  // Specify the storage configuration object with GCS URL.
  const storageConfig = {
    cloudStorageOptions: {
      fileSet: {
        url: gcsPath,
      },
    },
  };

  // Construct the info types to look for in the GCS file.
  const infoTypes = [
    {name: 'EMAIL_ADDRESS'},
    {name: 'PERSON_NAME'},
    {name: 'LOCATION'},
    {name: 'PHONE_NUMBER'},
  ];

  // Construct the inspection configuration.
  const inspectConfig = {
    infoTypes,
    minLikelihood: DLP.protos.google.privacy.dlp.v2.Likelihood.UNLIKELY,
    limits: {
      maxFindingsPerItem: 100,
    },
  };

  // Specify the action that is triggered when the job completes.
  const action = {
    publishSummaryToCscc: {},
  };

  // Configure the inspection job we want the service to perform.
  const jobConfig = {
    inspectConfig,
    storageConfig,
    actions: [action],
  };

  // Construct the job creation request to be sent by the client.
  const request = {
    parent: `projects/${projectId}/locations/global`,
    inspectJob: jobConfig,
  };

  // Send the job creation request and process the response.
  const [jobsResponse] = await dlpClient.createDlpJob(request);
  const jobName = jobsResponse.name;

  // Waiting for a maximum of 15 minutes for the job to get complete.
  let job;
  let numOfAttempts = 30;
  while (numOfAttempts > 0) {
    // Fetch DLP Job status
    [job] = await dlpClient.getDlpJob({name: jobName});

    // Check if the job has completed.
    if (job.state === 'DONE') {
      break;
    }
    if (job.state === 'FAILED') {
      console.log('Job Failed, Please check the configuration.');
      return;
    }
    // Sleep for a short duration before checking the job status again.
    await new Promise(resolve => {
      setTimeout(() => resolve(), 30000);
    });
    numOfAttempts -= 1;
  }

  // Print out the results.
  const infoTypeStats = job.inspectDetails.result.infoTypeStats;
  if (infoTypeStats.length > 0) {
    infoTypeStats.forEach(infoTypeStat => {
      console.log(
        `Found ${infoTypeStat.count} instance(s) of infoType ${infoTypeStat.infoType.name}.`
      );
    });
  } else {
    console.log('No findings.');
  }
}
inspectGCSSendToScc();

PHP

To learn how to install and use the client library for Sensitive Data Protection, see the Sensitive Data Protection client libraries documentation.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see "Set up authentication for a local development environment."

use Google\Cloud\Dlp\V2\Action;
use Google\Cloud\Dlp\V2\Action\PublishSummaryToCscc;
use Google\Cloud\Dlp\V2\Client\DlpServiceClient;
use Google\Cloud\Dlp\V2\CloudStorageOptions;
use Google\Cloud\Dlp\V2\CloudStorageOptions\FileSet;
use Google\Cloud\Dlp\V2\CreateDlpJobRequest;
use Google\Cloud\Dlp\V2\DlpJob\JobState;
use Google\Cloud\Dlp\V2\GetDlpJobRequest;
use Google\Cloud\Dlp\V2\InfoType;
use Google\Cloud\Dlp\V2\InspectConfig;
use Google\Cloud\Dlp\V2\InspectConfig\FindingLimits;
use Google\Cloud\Dlp\V2\InspectJobConfig;
use Google\Cloud\Dlp\V2\Likelihood;
use Google\Cloud\Dlp\V2\StorageConfig;

/**
 * (GCS) Send Cloud DLP scan results to Security Command Center.
 * Using Cloud Data Loss Prevention to scan specific Google Cloud resources and send data to Security Command Center.
 *
 * @param string $callingProjectId  The project ID to run the API call under.
 * @param string $gcsUri            GCS file to be inspected.
 */
function inspect_gcs_send_to_scc(
    // TODO(developer): Replace sample parameters before running the code.
    string $callingProjectId,
    string $gcsUri = 'gs://GOOGLE_STORAGE_BUCKET_NAME/dlp_sample.csv'
): void {
    // Instantiate a client.
    $dlp = new DlpServiceClient();

    // Construct the items to be inspected.
    $cloudStorageOptions = (new CloudStorageOptions())
        ->setFileSet((new FileSet())
            ->setUrl($gcsUri));

    $storageConfig = (new StorageConfig())
        ->setCloudStorageOptions(($cloudStorageOptions));

    // Specify the type of info the inspection will look for.
    $infoTypes = [
        (new InfoType())->setName('EMAIL_ADDRESS'),
        (new InfoType())->setName('PERSON_NAME'),
        (new InfoType())->setName('LOCATION'),
        (new InfoType())->setName('PHONE_NUMBER')
    ];

    // Specify how the content should be inspected.
    $inspectConfig = (new InspectConfig())
        ->setMinLikelihood(likelihood::UNLIKELY)
        ->setLimits((new FindingLimits())
            ->setMaxFindingsPerRequest(100))
        ->setInfoTypes($infoTypes)
        ->setIncludeQuote(true);

    // Specify the action that is triggered when the job completes.
    $action = (new Action())
        ->setPublishSummaryToCscc(new PublishSummaryToCscc());

    // Construct inspect job config to run.
    $inspectJobConfig = (new InspectJobConfig())
        ->setInspectConfig($inspectConfig)
        ->setStorageConfig($storageConfig)
        ->setActions([$action]);

    // Send the job creation request and process the response.
    $parent = "projects/$callingProjectId/locations/global";
    $createDlpJobRequest = (new CreateDlpJobRequest())
        ->setParent($parent)
        ->setInspectJob($inspectJobConfig);
    $job = $dlp->createDlpJob($createDlpJobRequest);

    $numOfAttempts = 10;
    do {
        printf('Waiting for job to complete' . PHP_EOL);
        sleep(10);
        $getDlpJobRequest = (new GetDlpJobRequest())
            ->setName($job->getName());
        $job = $dlp->getDlpJob($getDlpJobRequest);
        if ($job->getState() == JobState::DONE) {
            break;
        }
        $numOfAttempts--;
    } while ($numOfAttempts > 0);

    // Print finding counts.
    printf('Job %s status: %s' . PHP_EOL, $job->getName(), JobState::name($job->getState()));
    switch ($job->getState()) {
        case JobState::DONE:
            $infoTypeStats = $job->getInspectDetails()->getResult()->getInfoTypeStats();
            if (count($infoTypeStats) === 0) {
                printf('No findings.' . PHP_EOL);
            } else {
                foreach ($infoTypeStats as $infoTypeStat) {
                    printf(
                        '  Found %s instance(s) of infoType %s' . PHP_EOL,
                        $infoTypeStat->getCount(),
                        $infoTypeStat->getInfoType()->getName()
                    );
                }
            }
            break;
        case JobState::FAILED:
            printf('Job %s had errors:' . PHP_EOL, $job->getName());
            $errors = $job->getErrors();
            foreach ($errors as $error) {
                var_dump($error->getDetails());
            }
            break;
        case JobState::PENDING:
            printf('Job has not completed. Consider a longer timeout or an asynchronous execution model' . PHP_EOL);
            break;
        default:
            printf('Unexpected job state. Most likely, the job is either running or has not yet started.');
    }
}

Python

To learn how to install and use the client library for Sensitive Data Protection, see the Sensitive Data Protection client libraries documentation.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see "Set up authentication for a local development environment."

import time
from typing import List

import google.cloud.dlp


def inspect_gcs_send_to_scc(
    project: str,
    bucket: str,
    info_types: List[str],
    max_findings: int = 100,
) -> None:
    """
    Uses the Data Loss Prevention API to inspect Google Cloud Storage
    data and send the results to Google Security Command Center.
    Args:
        project: The Google Cloud project id to use as a parent resource.
        bucket: The name of the GCS bucket containing the file, as a string.
        info_types: A list of strings representing infoTypes to inspect for.
            A full list of infoType categories can be fetched from the API.
        max_findings: The maximum number of findings to report; 0 = no maximum.
    """
    # Instantiate a client.
    dlp = google.cloud.dlp_v2.DlpServiceClient()

    # Prepare info_types by converting the list of strings into a list of
    # dictionaries.
    info_types = [{"name": info_type} for info_type in info_types]

    # Construct the configuration dictionary.
    inspect_config = {
        "info_types": info_types,
        "min_likelihood": google.cloud.dlp_v2.Likelihood.UNLIKELY,
        "limits": {"max_findings_per_request": max_findings},
        "include_quote": True,
    }

    # Construct a cloud_storage_options dictionary with the bucket's URL.
    url = f"gs://{bucket}"
    storage_config = {"cloud_storage_options": {"file_set": {"url": url}}}

    # Tell the API where to send a notification when the job is complete.
    actions = [{"publish_summary_to_cscc": {}}]

    # Construct the job definition.
    job = {
        "inspect_config": inspect_config,
        "storage_config": storage_config,
        "actions": actions,
    }

    # Convert the project id into a full resource id.
    parent = f"projects/{project}"

    # Call the API.
    response = dlp.create_dlp_job(
        request={
            "parent": parent,
            "inspect_job": job,
        }
    )
    print(f"Inspection Job started : {response.name}")

    job_name = response.name

    # Waiting for maximum 15 minutes for the job to get complete.
    no_of_attempts = 30
    while no_of_attempts > 0:
        # Get the DLP job status.
        job = dlp.get_dlp_job(request={"name": job_name})
        # Check if the job has completed.
        if job.state == google.cloud.dlp_v2.DlpJob.JobState.DONE:
            break
        elif job.state == google.cloud.dlp_v2.DlpJob.JobState.FAILED:
            print("Job Failed, Please check the configuration.")
            return

        # Sleep for a short duration before checking the job status again.
        time.sleep(30)
        no_of_attempts -= 1

    # Print out the results.
    print(f"Job name: {job.name}")
    result = job.inspect_details.result
    print("Processed Bytes: ", result.processed_bytes)
    if result.info_type_stats:
        for stats in result.info_type_stats:
            print(f"Info type: {stats.info_type.name}")
            print(f"Count: {stats.count}")
    else:
        print("No findings.")

Code example: Inspect a BigQuery table

This example demonstrates how to use the DLP API to create an inspection job that inspects a BigQuery table and sends the results to Security Command Center.

C#

To learn how to install and use the client library for Sensitive Data Protection, see the Sensitive Data Protection client libraries documentation.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see "Set up authentication for a local development environment."


using System.Collections.Generic;
using Google.Api.Gax.ResourceNames;
using Google.Cloud.Dlp.V2;
using static Google.Cloud.Dlp.V2.InspectConfig.Types;

public class InspectBigQueryWithSCCIntegration
{
    public static DlpJob SendBigQueryData(
        string projectId,
        Likelihood minLikelihood = Likelihood.Unlikely,
        IEnumerable<InfoType> infoTypes = null)
    {
        // Instantiate the dlp client.
        var dlp = DlpServiceClient.Create();

        // Construct the storage config by providing the table to be inspected.
        var storageConfig = new StorageConfig
        {
            BigQueryOptions = new BigQueryOptions
            {
                TableReference = new BigQueryTable
                {
                    ProjectId = "bigquery-public-data",
                    DatasetId = "usa_names",
                    TableId = "usa_1910_current",
                }
            }
        };

        // Construct the inspect config by specifying the type of info to be inspected.
        var inspectConfig = new InspectConfig
        {
            InfoTypes =
            {
                infoTypes ?? new InfoType[]
                {
                    new InfoType { Name = "EMAIL_ADDRESS" },
                    new InfoType { Name = "PERSON_NAME" }
                }
            },
            IncludeQuote = true,
            MinLikelihood = minLikelihood,
            Limits = new FindingLimits
            {
                MaxFindingsPerRequest = 100
            }
        };

        // Construct the SCC action which will be performed after inspecting the source.
        var actions = new Action[]
        {
            new Action
            {
                PublishSummaryToCscc = new Action.Types.PublishSummaryToCscc()
            }
        };

        // Construct the inspect job config using storage config, inspect config and action.
        var inspectJob = new InspectJobConfig
        {
            StorageConfig = storageConfig,
            InspectConfig = inspectConfig,
            Actions = { actions }
        };

        // Construct the request.
        var request = new CreateDlpJobRequest
        {
            ParentAsLocationName = new LocationName(projectId, "global"),
            InspectJob = inspectJob
        };

        // Call the API.
        DlpJob response = dlp.CreateDlpJob(request);

        System.Console.WriteLine($"Job created successfully. Job name: {response.Name}");

        return response;
    }
}

Go

To learn how to install and use the client library for Sensitive Data Protection, see the Sensitive Data Protection client libraries documentation.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see "Set up authentication for a local development environment."

import (
	"context"
	"fmt"
	"io"

	dlp "cloud.google.com/go/dlp/apiv2"
	"cloud.google.com/go/dlp/apiv2/dlppb"
)

// inspectBigQuerySendToScc configures the inspection job that instructs Cloud DLP to scan data stored in BigQuery,
// and also instructs Cloud DLP to save its scan results to Security Command Center.
func inspectBigQuerySendToScc(w io.Writer, projectID, bigQueryDatasetId, bigQueryTableId string) error {
	// projectID := "my-project-id"
	// bigQueryDatasetId := "your-project-bigquery-dataset"
	// bigQueryTableId := "your-project-bigquery_table"

	ctx := context.Background()

	// Initialize a client once and reuse it to send multiple requests. Clients
	// are safe to use across goroutines. When the client is no longer needed,
	// call the Close method to cleanup its resources.
	client, err := dlp.NewClient(ctx)
	if err != nil {
		return err
	}

	// Closing the client safely cleans up background resources.
	defer client.Close()

	// Specify the BigQuery table to be inspected.
	tableReference := &dlppb.BigQueryTable{
		ProjectId: projectID,
		DatasetId: bigQueryDatasetId,
		TableId:   bigQueryTableId,
	}

	bigQueryOptions := &dlppb.BigQueryOptions{
		TableReference: tableReference,
	}

	// Specify the type of storage that you have configured.
	storageConfig := &dlppb.StorageConfig{
		Type: &dlppb.StorageConfig_BigQueryOptions{
			BigQueryOptions: bigQueryOptions,
		},
	}

	// Specify the type of info the inspection will look for.
	// See https://cloud.google.com/dlp/docs/infotypes-reference for complete list of info types.
	infoTypes := []*dlppb.InfoType{
		{Name: "EMAIL_ADDRESS"},
		{Name: "PERSON_NAME"},
		{Name: "LOCATION"},
		{Name: "PHONE_NUMBER"},
	}

	// The minimum likelihood required before returning a match.
	minLikelihood := dlppb.Likelihood_UNLIKELY

	// The maximum number of findings to report (0 = server maximum).
	findingLimits := &dlppb.InspectConfig_FindingLimits{
		MaxFindingsPerItem: 100,
	}

	// Specify how the content should be inspected.
	inspectConfig := &dlppb.InspectConfig{
		InfoTypes:     infoTypes,
		MinLikelihood: minLikelihood,
		Limits:        findingLimits,
		IncludeQuote:  true,
	}

	// Specify the action that is triggered when the job completes.
	action := &dlppb.Action{
		Action: &dlppb.Action_PublishSummaryToCscc_{
			PublishSummaryToCscc: &dlppb.Action_PublishSummaryToCscc{},
		},
	}

	// Configure the inspection job we want the service to perform.
	inspectJobConfig := &dlppb.InspectJobConfig{
		StorageConfig: storageConfig,
		InspectConfig: inspectConfig,
		Actions: []*dlppb.Action{
			action,
		},
	}

	// Create the request for the job configured above.
	req := &dlppb.CreateDlpJobRequest{
		Parent: fmt.Sprintf("projects/%s/locations/global", projectID),
		Job: &dlppb.CreateDlpJobRequest_InspectJob{
			InspectJob: inspectJobConfig,
		},
	}

	// Send the request.
	resp, err := client.CreateDlpJob(ctx, req)
	if err != nil {
		return err
	}

	// Print the result
	fmt.Fprintf(w, "Job created successfully: %v", resp.Name)
	return nil
}

Java

To learn how to install and use the client library for Sensitive Data Protection, see the Sensitive Data Protection client libraries documentation.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see "Set up authentication for a local development environment."


import com.google.cloud.dlp.v2.DlpServiceClient;
import com.google.privacy.dlp.v2.Action;
import com.google.privacy.dlp.v2.BigQueryOptions;
import com.google.privacy.dlp.v2.BigQueryTable;
import com.google.privacy.dlp.v2.CreateDlpJobRequest;
import com.google.privacy.dlp.v2.DlpJob;
import com.google.privacy.dlp.v2.InfoType;
import com.google.privacy.dlp.v2.InfoTypeStats;
import com.google.privacy.dlp.v2.InspectConfig;
import com.google.privacy.dlp.v2.InspectDataSourceDetails;
import com.google.privacy.dlp.v2.InspectJobConfig;
import com.google.privacy.dlp.v2.Likelihood;
import com.google.privacy.dlp.v2.LocationName;
import com.google.privacy.dlp.v2.StorageConfig;
import java.util.List;
import java.util.concurrent.TimeUnit;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class InspectBigQuerySendToScc {

  private static final int TIMEOUT_MINUTES = 15;

  public static void main(String[] args) throws Exception {
    // TODO(developer): Replace these variables before running the sample.
    // The Google Cloud project id to use as a parent resource.
    String projectId = "your-project-id";
    // The BigQuery dataset id to be used and the reference table name to be inspected.
    String bigQueryDatasetId = "your-project-bigquery-dataset";
    String bigQueryTableId = "your-project-bigquery_table";
    inspectBigQuerySendToScc(projectId, bigQueryDatasetId, bigQueryTableId);
  }

  // Inspects a BigQuery Table to send data to Security Command Center.
  public static void inspectBigQuerySendToScc(
      String projectId, String bigQueryDatasetId, String bigQueryTableId) throws Exception {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (DlpServiceClient dlpServiceClient = DlpServiceClient.create()) {

      // Specify the BigQuery table to be inspected.
      BigQueryTable tableReference =
          BigQueryTable.newBuilder()
              .setProjectId(projectId)
              .setDatasetId(bigQueryDatasetId)
              .setTableId(bigQueryTableId)
              .build();

      BigQueryOptions bigQueryOptions =
          BigQueryOptions.newBuilder().setTableReference(tableReference).build();

      StorageConfig storageConfig =
          StorageConfig.newBuilder().setBigQueryOptions(bigQueryOptions).build();

      // Specify the type of info the inspection will look for.
      List<InfoType> infoTypes =
          Stream.of("EMAIL_ADDRESS", "PERSON_NAME", "LOCATION", "PHONE_NUMBER")
              .map(it -> InfoType.newBuilder().setName(it).build())
              .collect(Collectors.toList());

      // The minimum likelihood required before returning a match.
      Likelihood minLikelihood = Likelihood.UNLIKELY;

      // The maximum number of findings to report (0 = server maximum)
      InspectConfig.FindingLimits findingLimits =
          InspectConfig.FindingLimits.newBuilder().setMaxFindingsPerItem(100).build();

      // Specify how the content should be inspected.
      InspectConfig inspectConfig =
          InspectConfig.newBuilder()
              .addAllInfoTypes(infoTypes)
              .setIncludeQuote(true)
              .setMinLikelihood(minLikelihood)
              .setLimits(findingLimits)
              .build();

      // Specify the action that is triggered when the job completes.
      Action.PublishSummaryToCscc publishSummaryToCscc =
          Action.PublishSummaryToCscc.getDefaultInstance();
      Action action = Action.newBuilder().setPublishSummaryToCscc(publishSummaryToCscc).build();

      // Configure the inspection job we want the service to perform.
      InspectJobConfig inspectJobConfig =
          InspectJobConfig.newBuilder()
              .setInspectConfig(inspectConfig)
              .setStorageConfig(storageConfig)
              .addActions(action)
              .build();

      // Construct the job creation request to be sent by the client.
      CreateDlpJobRequest createDlpJobRequest =
          CreateDlpJobRequest.newBuilder()
              .setParent(LocationName.of(projectId, "global").toString())
              .setInspectJob(inspectJobConfig)
              .build();

      // Send the job creation request and process the response.
      DlpJob response = dlpServiceClient.createDlpJob(createDlpJobRequest);

      // Get the current time.
      long startTime = System.currentTimeMillis();

      // Check if the job state is DONE.
      while (response.getState() != DlpJob.JobState.DONE) {
        // Sleep for 30 second.
        Thread.sleep(30000);

        // Get the updated job status.
        response = dlpServiceClient.getDlpJob(response.getName());

        // Check if the timeout duration has exceeded.
        long elapsedTime = System.currentTimeMillis() - startTime;
        if (TimeUnit.MILLISECONDS.toMinutes(elapsedTime) >= TIMEOUT_MINUTES) {
          System.out.printf("Job did not complete within %d minutes.%n", TIMEOUT_MINUTES);
          break;
        }
      }
      // Print the results.
      System.out.println("Job status: " + response.getState());
      System.out.println("Job name: " + response.getName());
      InspectDataSourceDetails.Result result = response.getInspectDetails().getResult();
      System.out.println("Findings: ");
      for (InfoTypeStats infoTypeStat : result.getInfoTypeStatsList()) {
        System.out.print("\tInfo type: " + infoTypeStat.getInfoType().getName());
        System.out.println("\tCount: " + infoTypeStat.getCount());
      }
    }
  }
}

Node.js

To learn how to install and use the client library for Sensitive Data Protection, see the Sensitive Data Protection client libraries documentation.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see "Set up authentication for a local development environment."

// Imports the Google Cloud Data Loss Prevention library
const DLP = require('@google-cloud/dlp');

// Instantiates a client
const dlp = new DLP.DlpServiceClient();

// The project ID to run the API call under.
// const projectId = "your-project-id";

// The project ID the table is stored under
// This may or (for public datasets) may not equal the calling project ID
// const dataProjectId = 'my-project';

// The ID of the dataset to inspect, e.g. 'my_dataset'
// const datasetId = 'my_dataset';

// The ID of the table to inspect, e.g. 'my_table'
// const tableId = 'my_table';

async function inspectBigQuerySendToScc() {
  // Specify the storage configuration object with big query table.
  const storageItem = {
    bigQueryOptions: {
      tableReference: {
        projectId: dataProjectId,
        datasetId: datasetId,
        tableId: tableId,
      },
    },
  };

  // Specify the type of info the inspection will look for.
  const infoTypes = [
    {name: 'EMAIL_ADDRESS'},
    {name: 'PERSON_NAME'},
    {name: 'LOCATION'},
    {name: 'PHONE_NUMBER'},
  ];

  // Construct inspect configuration.
  const inspectConfig = {
    infoTypes: infoTypes,
    includeQuote: true,
    minLikelihood: DLP.protos.google.privacy.dlp.v2.Likelihood.UNLIKELY,
    limits: {
      maxFindingsPerItem: 100,
    },
  };

  // Specify the action that is triggered when the job completes.
  const action = {
    publishSummaryToCscc: {
      enable: true,
    },
  };

  // Configure the inspection job we want the service to perform.
  const inspectJobConfig = {
    inspectConfig: inspectConfig,
    storageConfig: storageItem,
    actions: [action],
  };

  // Construct the job creation request to be sent by the client.
  const request = {
    parent: `projects/${projectId}/locations/global`,
    inspectJob: inspectJobConfig,
  };

  // Send the job creation request and process the response.
  const [jobsResponse] = await dlp.createDlpJob(request);
  const jobName = jobsResponse.name;

  // Waiting for a maximum of 15 minutes for the job to get complete.
  let job;
  let numOfAttempts = 30;
  while (numOfAttempts > 0) {
    // Fetch DLP Job status
    [job] = await dlp.getDlpJob({name: jobName});

    // Check if the job has completed.
    if (job.state === 'DONE') {
      break;
    }
    if (job.state === 'FAILED') {
      console.log('Job Failed, Please check the configuration.');
      return;
    }
    // Sleep for a short duration before checking the job status again.
    await new Promise(resolve => {
      setTimeout(() => resolve(), 30000);
    });
    numOfAttempts -= 1;
  }

  // Print out the results.
  const infoTypeStats = job.inspectDetails.result.infoTypeStats;
  if (infoTypeStats.length > 0) {
    infoTypeStats.forEach(infoTypeStat => {
      console.log(
        `  Found ${infoTypeStat.count} instance(s) of infoType ${infoTypeStat.infoType.name}.`
      );
    });
  } else {
    console.log('No findings.');
  }
}
inspectBigQuerySendToScc();

PHP

To learn how to install and use the client library for Sensitive Data Protection, see the Sensitive Data Protection client libraries documentation.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see "Set up authentication for a local development environment."

use Google\Cloud\Dlp\V2\Action;
use Google\Cloud\Dlp\V2\Action\PublishSummaryToCscc;
use Google\Cloud\Dlp\V2\BigQueryOptions;
use Google\Cloud\Dlp\V2\BigQueryTable;
use Google\Cloud\Dlp\V2\Client\DlpServiceClient;
use Google\Cloud\Dlp\V2\CreateDlpJobRequest;
use Google\Cloud\Dlp\V2\DlpJob\JobState;
use Google\Cloud\Dlp\V2\GetDlpJobRequest;
use Google\Cloud\Dlp\V2\InfoType;
use Google\Cloud\Dlp\V2\InspectConfig;
use Google\Cloud\Dlp\V2\InspectConfig\FindingLimits;
use Google\Cloud\Dlp\V2\InspectJobConfig;
use Google\Cloud\Dlp\V2\Likelihood;
use Google\Cloud\Dlp\V2\StorageConfig;

/**
 * (BIGQUERY) Send Cloud DLP scan results to Security Command Center.
 * Using Cloud Data Loss Prevention to scan specific Google Cloud resources and send data to Security Command Center.
 *
 * @param string $callingProjectId  The project ID to run the API call under.
 * @param string $projectId         The ID of the Project.
 * @param string $datasetId         The ID of the BigQuery Dataset.
 * @param string $tableId           The ID of the BigQuery Table to be inspected.
 */
function inspect_bigquery_send_to_scc(
    // TODO(developer): Replace sample parameters before running the code.
    string $callingProjectId,
    string $projectId,
    string $datasetId,
    string $tableId
): void {
    // Instantiate a client.
    $dlp = new DlpServiceClient();

    // Construct the items to be inspected.
    $bigqueryTable = (new BigQueryTable())
        ->setProjectId($projectId)
        ->setDatasetId($datasetId)
        ->setTableId($tableId);
    $bigQueryOptions = (new BigQueryOptions())
        ->setTableReference($bigqueryTable);

    $storageConfig = (new StorageConfig())
        ->setBigQueryOptions(($bigQueryOptions));

    // Specify the type of info the inspection will look for.
    $infoTypes = [
        (new InfoType())->setName('EMAIL_ADDRESS'),
        (new InfoType())->setName('PERSON_NAME'),
        (new InfoType())->setName('LOCATION'),
        (new InfoType())->setName('PHONE_NUMBER')
    ];

    // Specify how the content should be inspected.
    $inspectConfig = (new InspectConfig())
        ->setMinLikelihood(likelihood::UNLIKELY)
        ->setLimits((new FindingLimits())
            ->setMaxFindingsPerRequest(100))
        ->setInfoTypes($infoTypes)
        ->setIncludeQuote(true);

    // Specify the action that is triggered when the job completes.
    $action = (new Action())
        ->setPublishSummaryToCscc(new PublishSummaryToCscc());

    // Configure the inspection job we want the service to perform.
    $inspectJobConfig = (new InspectJobConfig())
        ->setInspectConfig($inspectConfig)
        ->setStorageConfig($storageConfig)
        ->setActions([$action]);

    // Send the job creation request and process the response.
    $parent = "projects/$callingProjectId/locations/global";
    $createDlpJobRequest = (new CreateDlpJobRequest())
        ->setParent($parent)
        ->setInspectJob($inspectJobConfig);
    $job = $dlp->createDlpJob($createDlpJobRequest);

    $numOfAttempts = 10;
    do {
        printf('Waiting for job to complete' . PHP_EOL);
        sleep(10);
        $getDlpJobRequest = (new GetDlpJobRequest())
            ->setName($job->getName());
        $job = $dlp->getDlpJob($getDlpJobRequest);
        if ($job->getState() == JobState::DONE) {
            break;
        }
        $numOfAttempts--;
    } while ($numOfAttempts > 0);

    // Print finding counts.
    printf('Job %s status: %s' . PHP_EOL, $job->getName(), JobState::name($job->getState()));
    switch ($job->getState()) {
        case JobState::DONE:
            $infoTypeStats = $job->getInspectDetails()->getResult()->getInfoTypeStats();
            if (count($infoTypeStats) === 0) {
                printf('No findings.' . PHP_EOL);
            } else {
                foreach ($infoTypeStats as $infoTypeStat) {
                    printf(
                        '  Found %s instance(s) of infoType %s' . PHP_EOL,
                        $infoTypeStat->getCount(),
                        $infoTypeStat->getInfoType()->getName()
                    );
                }
            }
            break;
        case JobState::FAILED:
            printf('Job %s had errors:' . PHP_EOL, $job->getName());
            $errors = $job->getErrors();
            foreach ($errors as $error) {
                var_dump($error->getDetails());
            }
            break;
        case JobState::PENDING:
            printf('Job has not completed. Consider a longer timeout or an asynchronous execution model' . PHP_EOL);
            break;
        default:
            printf('Unexpected job state. Most likely, the job is either running or has not yet started.');
    }
}

Python

To learn how to install and use the client library for Sensitive Data Protection, see the Sensitive Data Protection client libraries documentation.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see "Set up authentication for a local development environment."

import time
from typing import List

import google.cloud.dlp


def inspect_bigquery_send_to_scc(
    project: str,
    info_types: List[str],
    max_findings: int = 100,
) -> None:
    """
    Uses the Data Loss Prevention API to inspect public bigquery dataset
    and send the results to Google Security Command Center.
    Args:
        project: The Google Cloud project id to use as a parent resource.
        info_types: A list of strings representing infoTypes to inspect for.
            A full list of infoType categories can be fetched from the API.
        max_findings: The maximum number of findings to report; 0 = no maximum
    """
    # Instantiate a client.
    dlp = google.cloud.dlp_v2.DlpServiceClient()

    # Prepare info_types by converting the list of strings into a list of
    # dictionaries.
    info_types = [{"name": info_type} for info_type in info_types]

    # Construct the configuration dictionary.
    inspect_config = {
        "info_types": info_types,
        "min_likelihood": google.cloud.dlp_v2.Likelihood.UNLIKELY,
        "limits": {"max_findings_per_request": max_findings},
        "include_quote": True,
    }

    # Construct a Cloud Storage Options dictionary with the big query options.
    storage_config = {
        "big_query_options": {
            "table_reference": {
                "project_id": "bigquery-public-data",
                "dataset_id": "usa_names",
                "table_id": "usa_1910_current",
            }
        }
    }

    # Tell the API where to send a notification when the job is complete.
    actions = [{"publish_summary_to_cscc": {}}]

    # Construct the job definition.
    job = {
        "inspect_config": inspect_config,
        "storage_config": storage_config,
        "actions": actions,
    }

    # Convert the project id into a full resource id.
    parent = f"projects/{project}"

    # Call the API.
    response = dlp.create_dlp_job(
        request={
            "parent": parent,
            "inspect_job": job,
        }
    )
    print(f"Inspection Job started : {response.name}")

    job_name = response.name

    # Waiting for a maximum of 15 minutes for the job to get complete.
    no_of_attempts = 30
    while no_of_attempts > 0:
        # Get the DLP job status.
        job = dlp.get_dlp_job(request={"name": job_name})
        # Check if the job has completed.
        if job.state == google.cloud.dlp_v2.DlpJob.JobState.DONE:
            break
        if job.state == google.cloud.dlp_v2.DlpJob.JobState.FAILED:
            print("Job Failed, Please check the configuration.")
            return

        # Sleep for a short duration before checking the job status again.
        time.sleep(30)
        no_of_attempts -= 1

    # Print out the results.
    print(f"Job name: {job.name}")
    result = job.inspect_details.result
    if result.info_type_stats:
        for stats in result.info_type_stats:
            print(f"Info type: {stats.info_type.name}")
            print(f"Count: {stats.count}")
    else:
        print("No findings.")

Code example: Inspect a Datastore kind

This example demonstrates how to use the DLP API to create an inspection job that inspects a Datastore kind and sends findings to Security Command Center.

C#

To learn how to install and use the client library for Sensitive Data Protection, see the Sensitive Data Protection client libraries documentation.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see "Set up authentication for a local development environment."


using System.Collections.Generic;
using System.Linq;
using Google.Api.Gax.ResourceNames;
using Google.Cloud.Dlp.V2;
using static Google.Cloud.Dlp.V2.InspectConfig.Types;

public class InspectDataStoreJobWithSCCIntegration
{
    public static DlpJob SendInspectDatastoreToSCC(
        string projectId,
        string kindName,
        string namespaceId,
        Likelihood minLikelihood = Likelihood.Unlikely,
        IEnumerable<InfoType> infoTypes = null)
    {
        // Instantiate the dlp client.
        var dlp = DlpServiceClient.Create();

        // Specify the Datastore entity to be inspected and construct the storage
        // config. The NamespaceId is to be used for partition entity and the datastore kind defining
        // a data set.
        var storageConfig = new StorageConfig
        {
            DatastoreOptions = new DatastoreOptions
            {
                Kind = new KindExpression { Name = kindName },
                PartitionId = new PartitionId
                {
                    NamespaceId = namespaceId,
                    ProjectId = projectId
                }
            }
        };

        // Specify the type of info to be inspected and construct the inspect config.
        var inspectConfig = new InspectConfig
        {
            InfoTypes =
            {
                infoTypes ?? new InfoType[]
                {
                    new InfoType { Name = "EMAIL_ADDRESS" },
                    new InfoType { Name = "PERSON_NAME" },
                    new InfoType { Name = "LOCATION" },
                    new InfoType { Name = "PHONE_NUMBER" }
                }
            },
            IncludeQuote = true,
            MinLikelihood = minLikelihood,
            Limits = new FindingLimits
            {
                MaxFindingsPerRequest = 100
            }
        };

        // Construct the SCC action which will be performed after inspecting the datastore.
        var actions = new Action[]
        {
            new Action
            {
                PublishSummaryToCscc = new Action.Types.PublishSummaryToCscc()
            }
        };

        // Construct the inspect job config using storage config, inspect config and action.
        var inspectJob = new InspectJobConfig
        {
            StorageConfig = storageConfig,
            InspectConfig = inspectConfig,
            Actions = { actions }
        };

        // Construct the request.
        var request = new CreateDlpJobRequest
        {
            ParentAsLocationName = new LocationName(projectId, "global"),
            InspectJob = inspectJob
        };

        // Call the API.
        DlpJob response = dlp.CreateDlpJob(request);

        return response;
    }
}

Go

To learn how to install and use the client library for Sensitive Data Protection, see the Sensitive Data Protection client libraries documentation.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

import (
	"context"
	"fmt"
	"io"

	dlp "cloud.google.com/go/dlp/apiv2"
	"cloud.google.com/go/dlp/apiv2/dlppb"
)

// inspectDataStoreSendToScc inspects sensitive data in a Datastore
// and sends the results to Google Cloud Security Command Center (SCC).
func inspectDataStoreSendToScc(w io.Writer, projectID, datastoreNamespace, datastoreKind string) error {
	// projectID := "my-project-id"
	// datastoreNamespace := "your-datastore-namespace"
	// datastoreKind := "your-datastore-kind"

	ctx := context.Background()

	// Initialize a client once and reuse it to send multiple requests. Clients
	// are safe to use across goroutines. When the client is no longer needed,
	// call the Close method to cleanup its resources.
	client, err := dlp.NewClient(ctx)
	if err != nil {
		return err
	}

	// Closing the client safely cleans up background resources.
	defer client.Close()

	// Specify the Datastore entity to be inspected.
	partitionId := &dlppb.PartitionId{
		ProjectId:   projectID,
		NamespaceId: datastoreNamespace,
	}

	// kindExpr represents an expression specifying a kind or range of kinds for data inspection in DLP.
	kindExpression := &dlppb.KindExpression{
		Name: datastoreKind,
	}

	// datastoreOptions holds the configuration options for inspecting data in
	// Google Cloud Datastore.
	datastoreOptions := &dlppb.DatastoreOptions{
		PartitionId: partitionId,
		Kind:        kindExpression,
	}

	// storageConfig wraps the Datastore options. The same message also supports
	// other storage types, such as BigQuery and Cloud Storage.
	storageConfig := &dlppb.StorageConfig{
		Type: &dlppb.StorageConfig_DatastoreOptions{
			DatastoreOptions: datastoreOptions,
		},
	}

	// Specify the type of info the inspection will look for.
	// See https://cloud.google.com/dlp/docs/infotypes-reference for complete list of info types
	infoTypes := []*dlppb.InfoType{
		{Name: "EMAIL_ADDRESS"},
		{Name: "PERSON_NAME"},
		{Name: "LOCATION"},
		{Name: "PHONE_NUMBER"},
	}

	// The minimum likelihood required before returning a match.
	minLikelihood := dlppb.Likelihood_UNLIKELY

	// The maximum number of findings to report (0 = server maximum).
	findingLimits := &dlppb.InspectConfig_FindingLimits{
		MaxFindingsPerItem: 100,
	}

	inspectConfig := &dlppb.InspectConfig{
		InfoTypes:     infoTypes,
		MinLikelihood: minLikelihood,
		Limits:        findingLimits,
		IncludeQuote:  true,
	}

	// Specify the action that is triggered when the job completes.
	action := &dlppb.Action{
		Action: &dlppb.Action_PublishSummaryToCscc_{
			PublishSummaryToCscc: &dlppb.Action_PublishSummaryToCscc{},
		},
	}

	// Configure the inspection job we want the service to perform.
	inspectJobConfig := &dlppb.InspectJobConfig{
		StorageConfig: storageConfig,
		InspectConfig: inspectConfig,
		Actions: []*dlppb.Action{
			action,
		},
	}

	// Create the request for the job configured above.
	req := &dlppb.CreateDlpJobRequest{
		Parent: fmt.Sprintf("projects/%s/locations/global", projectID),
		Job: &dlppb.CreateDlpJobRequest_InspectJob{
			InspectJob: inspectJobConfig,
		},
	}

	// Send the request.
	resp, err := client.CreateDlpJob(ctx, req)
	if err != nil {
		return err
	}

	// Print the result
	fmt.Fprintf(w, "Job created successfully: %v", resp.Name)
	return nil
}

Java

To learn how to install and use the client library for Sensitive Data Protection, see the Sensitive Data Protection client libraries documentation.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.


import com.google.cloud.dlp.v2.DlpServiceClient;
import com.google.privacy.dlp.v2.Action;
import com.google.privacy.dlp.v2.CreateDlpJobRequest;
import com.google.privacy.dlp.v2.DatastoreOptions;
import com.google.privacy.dlp.v2.DlpJob;
import com.google.privacy.dlp.v2.InfoType;
import com.google.privacy.dlp.v2.InfoTypeStats;
import com.google.privacy.dlp.v2.InspectConfig;
import com.google.privacy.dlp.v2.InspectDataSourceDetails;
import com.google.privacy.dlp.v2.InspectJobConfig;
import com.google.privacy.dlp.v2.KindExpression;
import com.google.privacy.dlp.v2.Likelihood;
import com.google.privacy.dlp.v2.LocationName;
import com.google.privacy.dlp.v2.PartitionId;
import com.google.privacy.dlp.v2.StorageConfig;
import java.io.IOException;
import java.util.List;
import java.util.concurrent.TimeUnit;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class InspectDatastoreSendToScc {

  private static final int TIMEOUT_MINUTES = 15;

  public static void main(String[] args) throws Exception {
    // TODO(developer): Replace these variables before running the sample.
    // The Google Cloud project id to use as a parent resource.
    String projectId = "your-project-id";
    // The namespace specifier to be used for the partition entity.
    String datastoreNamespace = "your-datastore-namespace";
    // The datastore kind defining a data set.
    String datastoreKind = "your-datastore-kind";
    inspectDatastoreSendToScc(projectId, datastoreNamespace, datastoreKind);
  }

  // Creates a DLP job to scan the data stored in a Datastore kind and send its
  // scan results to Security Command Center.
  public static void inspectDatastoreSendToScc(
      String projectId, String datastoreNamespace, String datastoreKind)
      throws IOException, InterruptedException {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (DlpServiceClient dlpServiceClient = DlpServiceClient.create()) {

      // Specify the Datastore entity to be inspected.
      PartitionId partitionId =
          PartitionId.newBuilder()
              .setProjectId(projectId)
              .setNamespaceId(datastoreNamespace)
              .build();

      KindExpression kindExpression = KindExpression.newBuilder().setName(datastoreKind).build();

      DatastoreOptions datastoreOptions =
          DatastoreOptions.newBuilder().setKind(kindExpression).setPartitionId(partitionId).build();

      StorageConfig storageConfig =
          StorageConfig.newBuilder().setDatastoreOptions(datastoreOptions).build();

      // Specify the type of info the inspection will look for.
      List<InfoType> infoTypes =
          Stream.of("EMAIL_ADDRESS", "PERSON_NAME", "LOCATION", "PHONE_NUMBER")
              .map(it -> InfoType.newBuilder().setName(it).build())
              .collect(Collectors.toList());

      // The minimum likelihood required before returning a match.
      Likelihood minLikelihood = Likelihood.UNLIKELY;

      // The maximum number of findings to report (0 = server maximum)
      InspectConfig.FindingLimits findingLimits =
          InspectConfig.FindingLimits.newBuilder().setMaxFindingsPerItem(100).build();

      // Specify how the content should be inspected.
      InspectConfig inspectConfig =
          InspectConfig.newBuilder()
              .addAllInfoTypes(infoTypes)
              .setIncludeQuote(true)
              .setMinLikelihood(minLikelihood)
              .setLimits(findingLimits)
              .build();

      // Specify the action that is triggered when the job completes.
      Action.PublishSummaryToCscc publishSummaryToCscc =
          Action.PublishSummaryToCscc.getDefaultInstance();
      Action action = Action.newBuilder().setPublishSummaryToCscc(publishSummaryToCscc).build();

      // Configure the inspection job we want the service to perform.
      InspectJobConfig inspectJobConfig =
          InspectJobConfig.newBuilder()
              .setInspectConfig(inspectConfig)
              .setStorageConfig(storageConfig)
              .addActions(action)
              .build();

      // Construct the job creation request to be sent by the client.
      CreateDlpJobRequest createDlpJobRequest =
          CreateDlpJobRequest.newBuilder()
              .setParent(LocationName.of(projectId, "global").toString())
              .setInspectJob(inspectJobConfig)
              .build();

      // Send the job creation request and process the response.
      DlpJob response = dlpServiceClient.createDlpJob(createDlpJobRequest);
      // Get the current time.
      long startTime = System.currentTimeMillis();

      // Check if the job state is DONE.
      while (response.getState() != DlpJob.JobState.DONE) {
        // Sleep for 30 seconds.
        Thread.sleep(30000);

        // Get the updated job status.
        response = dlpServiceClient.getDlpJob(response.getName());

        // Check if the timeout duration has exceeded.
        long elapsedTime = System.currentTimeMillis() - startTime;
        if (TimeUnit.MILLISECONDS.toMinutes(elapsedTime) >= TIMEOUT_MINUTES) {
          System.out.printf("Job did not complete within %d minutes.%n", TIMEOUT_MINUTES);
          break;
        }
      }
      // Print the results.
      System.out.println("Job status: " + response.getState());
      System.out.println("Job name: " + response.getName());
      InspectDataSourceDetails.Result result = response.getInspectDetails().getResult();
      System.out.println("Findings: ");
      for (InfoTypeStats infoTypeStat : result.getInfoTypeStatsList()) {
        System.out.print("\tInfo type: " + infoTypeStat.getInfoType().getName());
        System.out.println("\tCount: " + infoTypeStat.getCount());
      }
    }
  }
}

Node.js

To learn how to install and use the client library for Sensitive Data Protection, see the Sensitive Data Protection client libraries documentation.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

// Imports the Google Cloud Data Loss Prevention library
const DLP = require('@google-cloud/dlp');

// Instantiates a client
const dlp = new DLP.DlpServiceClient();

// The project ID to run the API call under.
// const projectId = "your-project-id";

// Datastore namespace
// const datastoreNamespace = 'datastore-namespace';

// Datastore kind
// const datastoreKind = 'datastore-kind';

async function inspectDatastoreSendToScc() {
  // Specify the storage configuration object with datastore.
  const storageConfig = {
    datastoreOptions: {
      kind: {
        name: datastoreKind,
      },
      partitionId: {
        projectId: projectId,
        namespaceId: datastoreNamespace,
      },
    },
  };

  // Construct the info types to look for in the datastore.
  const infoTypes = [
    {name: 'EMAIL_ADDRESS'},
    {name: 'PERSON_NAME'},
    {name: 'LOCATION'},
    {name: 'PHONE_NUMBER'},
  ];

  // Construct the inspection configuration.
  const inspectConfig = {
    infoTypes: infoTypes,
    minLikelihood: DLP.protos.google.privacy.dlp.v2.Likelihood.UNLIKELY,
    limits: {
      maxFindingsPerItem: 100,
    },
    includeQuote: true,
  };

  // Specify the action that is triggered when the job completes
  const action = {
    publishSummaryToCscc: {enable: true},
  };

  // Configure the inspection job we want the service to perform.
  const inspectJobConfig = {
    inspectConfig: inspectConfig,
    storageConfig: storageConfig,
    actions: [action],
  };

  // Construct the job creation request to be sent by the client.
  const request = {
    parent: `projects/${projectId}/locations/global`,
    inspectJob: inspectJobConfig,
  };

  // Send the job creation request and process the response.
  const [jobsResponse] = await dlp.createDlpJob(request);
  const jobName = jobsResponse.name;

  // Waiting for a maximum of 15 minutes for the job to get complete.
  let job;
  let numOfAttempts = 30;
  while (numOfAttempts > 0) {
    // Fetch DLP Job status
    [job] = await dlp.getDlpJob({name: jobName});

    // Check if the job has completed.
    if (job.state === 'DONE') {
      break;
    }
    if (job.state === 'FAILED') {
      console.log('Job failed. Please check the configuration.');
      return;
    }
    // Sleep for a short duration before checking the job status again.
    await new Promise(resolve => {
      setTimeout(() => resolve(), 30000);
    });
    numOfAttempts -= 1;
  }

  // Print out the results.
  const infoTypeStats = job.inspectDetails.result.infoTypeStats;
  if (infoTypeStats.length > 0) {
    infoTypeStats.forEach(infoTypeStat => {
      console.log(
        `Found ${infoTypeStat.count} instance(s) of infoType ${infoTypeStat.infoType.name}.`
      );
    });
  } else {
    console.log('No findings.');
  }
}
inspectDatastoreSendToScc();

PHP

To learn how to install and use the client library for Sensitive Data Protection, see the Sensitive Data Protection client libraries documentation.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

use Google\Cloud\Dlp\V2\Action;
use Google\Cloud\Dlp\V2\Action\PublishSummaryToCscc;
use Google\Cloud\Dlp\V2\Client\DlpServiceClient;
use Google\Cloud\Dlp\V2\CreateDlpJobRequest;
use Google\Cloud\Dlp\V2\DatastoreOptions;
use Google\Cloud\Dlp\V2\DlpJob\JobState;
use Google\Cloud\Dlp\V2\GetDlpJobRequest;
use Google\Cloud\Dlp\V2\InfoType;
use Google\Cloud\Dlp\V2\InspectConfig;
use Google\Cloud\Dlp\V2\InspectConfig\FindingLimits;
use Google\Cloud\Dlp\V2\InspectJobConfig;
use Google\Cloud\Dlp\V2\KindExpression;
use Google\Cloud\Dlp\V2\Likelihood;
use Google\Cloud\Dlp\V2\PartitionId;
use Google\Cloud\Dlp\V2\StorageConfig;

/**
 * (DATASTORE) Send Cloud DLP scan results to Security Command Center.
 * Using Cloud Data Loss Prevention to scan specific Google Cloud resources and send data to Security Command Center.
 *
 * @param string $callingProjectId  The project ID to run the API call under.
 * @param string $kindName          Datastore kind name to be inspected.
 * @param string $namespaceId       Namespace Id to be inspected.
 */
function inspect_datastore_send_to_scc(
    string $callingProjectId,
    string $kindName,
    string $namespaceId
): void {
    // Instantiate a client.
    $dlp = new DlpServiceClient();

    // Construct the items to be inspected.
    $datastoreOptions = (new DatastoreOptions())
        ->setKind((new KindExpression())
            ->setName($kindName))
        ->setPartitionId((new PartitionId())
            ->setNamespaceId($namespaceId)
            ->setProjectId($callingProjectId));

    $storageConfig = (new StorageConfig())
        ->setDatastoreOptions($datastoreOptions);

    // Specify the type of info the inspection will look for.
    $infoTypes = [
        (new InfoType())->setName('EMAIL_ADDRESS'),
        (new InfoType())->setName('PERSON_NAME'),
        (new InfoType())->setName('LOCATION'),
        (new InfoType())->setName('PHONE_NUMBER')
    ];

    // Specify how the content should be inspected.
    $inspectConfig = (new InspectConfig())
        ->setMinLikelihood(Likelihood::UNLIKELY)
        ->setLimits((new FindingLimits())
            ->setMaxFindingsPerRequest(100))
        ->setInfoTypes($infoTypes)
        ->setIncludeQuote(true);

    // Specify the action that is triggered when the job completes.
    $action = (new Action())
        ->setPublishSummaryToCscc(new PublishSummaryToCscc());

    // Construct inspect job config to run.
    $inspectJobConfig = (new InspectJobConfig())
        ->setInspectConfig($inspectConfig)
        ->setStorageConfig($storageConfig)
        ->setActions([$action]);

    // Send the job creation request and process the response.
    $parent = "projects/$callingProjectId/locations/global";
    $createDlpJobRequest = (new CreateDlpJobRequest())
        ->setParent($parent)
        ->setInspectJob($inspectJobConfig);
    $job = $dlp->createDlpJob($createDlpJobRequest);

    $numOfAttempts = 10;
    do {
        printf('Waiting for job to complete' . PHP_EOL);
        sleep(10);
        $getDlpJobRequest = (new GetDlpJobRequest())
            ->setName($job->getName());
        $job = $dlp->getDlpJob($getDlpJobRequest);
        if ($job->getState() == JobState::DONE) {
            break;
        }
        $numOfAttempts--;
    } while ($numOfAttempts > 0);

    // Print finding counts.
    printf('Job %s status: %s' . PHP_EOL, $job->getName(), JobState::name($job->getState()));
    switch ($job->getState()) {
        case JobState::DONE:
            $infoTypeStats = $job->getInspectDetails()->getResult()->getInfoTypeStats();
            if (count($infoTypeStats) === 0) {
                printf('No findings.' . PHP_EOL);
            } else {
                foreach ($infoTypeStats as $infoTypeStat) {
                    printf(
                        '  Found %s instance(s) of infoType %s' . PHP_EOL,
                        $infoTypeStat->getCount(),
                        $infoTypeStat->getInfoType()->getName()
                    );
                }
            }
            break;
        case JobState::FAILED:
            printf('Job %s had errors:' . PHP_EOL, $job->getName());
            $errors = $job->getErrors();
            foreach ($errors as $error) {
                var_dump($error->getDetails());
            }
            break;
        case JobState::PENDING:
            printf('Job has not completed. Consider a longer timeout or an asynchronous execution model' . PHP_EOL);
            break;
        default:
            printf('Unexpected job state. Most likely, the job is either running or has not yet started.');
    }
}

Python

To learn how to install and use the client library for Sensitive Data Protection, see the Sensitive Data Protection client libraries documentation.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

import time
from typing import List, Optional

import google.cloud.dlp


def inspect_datastore_send_to_scc(
    project: str,
    datastore_project: str,
    kind: str,
    info_types: List[str],
    namespace_id: Optional[str] = None,
    max_findings: int = 100,
) -> None:
    """
    Uses the Data Loss Prevention API to inspect Datastore data and
    send the results to Google Security Command Center.
    Args:
        project: The Google Cloud project id to use as a parent resource.
        datastore_project: The Google Cloud project id of the target Datastore.
        kind: The kind of the Datastore entity to inspect, e.g. 'Person'.
        info_types: A list of strings representing infoTypes to inspect for.
            A full list of infoType categories can be fetched from the API.
        namespace_id: The namespace of the Datastore document, if applicable.
        max_findings: The maximum number of findings to report; 0 = no maximum

    """
    # Instantiate a client.
    dlp = google.cloud.dlp_v2.DlpServiceClient()

    # Prepare info_types by converting the list of strings into a list of
    # dictionaries.
    info_types = [{"name": info_type} for info_type in info_types]

    # Construct the configuration dictionary.
    inspect_config = {
        "info_types": info_types,
        "min_likelihood": google.cloud.dlp_v2.Likelihood.UNLIKELY,
        "limits": {"max_findings_per_request": max_findings},
        "include_quote": True,
    }

    # Construct a storage_config dictionary with Datastore options.
    storage_config = {
        "datastore_options": {
            "partition_id": {
                "project_id": datastore_project,
                "namespace_id": namespace_id,
            },
            "kind": {"name": kind},
        }
    }

    # Specify the action that sends a summary of findings to Security Command Center.
    actions = [{"publish_summary_to_cscc": {}}]

    # Construct the job definition.
    job = {
        "inspect_config": inspect_config,
        "storage_config": storage_config,
        "actions": actions,
    }

    # Convert the project id into a full resource id.
    parent = f"projects/{project}"

    # Call the API
    response = dlp.create_dlp_job(
        request={
            "parent": parent,
            "inspect_job": job,
        }
    )
    print(f"Inspection Job started : {response.name}")

    job_name = response.name

    # Wait for a maximum of 15 minutes for the job to complete.
    no_of_attempts = 30
    while no_of_attempts > 0:
        # Get the DLP job status.
        job = dlp.get_dlp_job(request={"name": job_name})
        # Check if the job has completed.
        if job.state == google.cloud.dlp_v2.DlpJob.JobState.DONE:
            break
        if job.state == google.cloud.dlp_v2.DlpJob.JobState.FAILED:
            print("Job Failed, Please check the configuration.")
            return

        # Sleep for a short duration before checking the job status again.
        time.sleep(30)
        no_of_attempts -= 1

    # Print out the results.
    print(f"Job name: {job.name}")
    result = job.inspect_details.result
    if result.info_type_stats:
        for stats in result.info_type_stats:
            print(f"Info type: {stats.info_type.name}")
            print(f"Count: {stats.count}")
    else:
        print("No findings.")

View Sensitive Data Protection scan results in Security Command Center

Because you instructed Sensitive Data Protection to send its inspection job results to Security Command Center, you can now view those results there:

  1. In the Google Cloud console, go to the Security Command Center Findings page.

    Go to Findings

  2. Select the organization for which you enabled Security Command Center.
  3. In the Query editor field, enter the following query to return Sensitive Data Protection findings.

    state="ACTIVE"
    AND NOT mute="MUTED"
    AND (parent_display_name="Sensitive Data Protection" OR parent_display_name="Cloud Data Loss Prevention")
    

    To learn more about the query editor, see Edit a findings query in the Google Cloud console.

    If Sensitive Data Protection sent any findings, they appear in the findings list. The list includes all Sensitive Data Protection findings, from both inspection jobs and discovery (data profiling) operations. If you want to retrieve the same findings programmatically, see the sketch after this list.
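
The following is a minimal sketch, assuming the Security Command Center Python client library, that lists active findings and keeps only those produced by Sensitive Data Protection. The organization ID is a placeholder, and the parent_display_name matching is done client-side here rather than in the API filter.

# Minimal sketch: list active Security Command Center findings and keep only
# those produced by Sensitive Data Protection. ORGANIZATION_ID is a placeholder.
from google.cloud import securitycenter_v1

client = securitycenter_v1.SecurityCenterClient()

# "sources/-" queries findings from all sources in the organization.
all_sources = "organizations/ORGANIZATION_ID/sources/-"

results = client.list_findings(
    request={"parent": all_sources, "filter": 'state="ACTIVE"'}
)
for result in results:
    finding = result.finding
    if finding.parent_display_name in (
        "Sensitive Data Protection",
        "Cloud Data Loss Prevention",
    ):
        print(finding.category, finding.resource_name)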

The instructions in this guide turn on only some of the built-in Sensitive Data Protection detectors.

Clean up

To avoid incurring charges to your Google Cloud account for the resources used in this topic, follow these instructions:

Delete the project

The easiest way to avoid charges is to delete the project that you created while following the instructions in this topic.

  1. In the Google Cloud console, go to the Manage resources page.

    Go to Manage resources

  2. In the project list, select the project that you want to delete, and then click Delete.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.

If you delete the project using this method, the Sensitive Data Protection job and the Cloud Storage bucket that you created are also deleted, and you don't need to follow the instructions in the remaining sections. If you prefer to delete the project programmatically, see the sketch below.
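
The following is a minimal sketch, assuming the Resource Manager Python client library; PROJECT_ID is a placeholder, and the project goes through the standard pending-deletion period before it is removed.

# Minimal sketch: request deletion of the project with the Resource Manager
# client library. PROJECT_ID is a placeholder.
from google.cloud import resourcemanager_v3

client = resourcemanager_v3.ProjectsClient()
operation = client.delete_project(name="projects/PROJECT_ID")
# Project deletion is a long-running operation; wait for the request to be accepted.
operation.result()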

Delete the Sensitive Data Protection job

If you scanned your own data, you only need to delete the inspection job that you created:

  1. Open APIs Explorer on the reference page for the dlpJobs.delete method by clicking the following button:

    Open APIs Explorer

  2. In the name box, enter the name of the job from the JSON response to the scan request, in the following format:
    projects/PROJECT_ID/dlpJobs/JOB_ID
    The job ID has the format i-1234567890123456789.

If you created other inspection jobs, or if you want to confirm that you deleted the job successfully, you can list all existing jobs:

  1. Open APIs Explorer on the reference page for the dlpJobs.list method by clicking the following button:

    Open APIs Explorer

  2. In the parent box, enter the project ID in the following format:
    projects/PROJECT_ID
  3. Click Execute.

If the response doesn't list any jobs, you have deleted them all. If the response does list jobs, repeat the deletion procedure for those jobs. You can also list and delete jobs with the client library, as in the sketch below.
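
The following is a minimal sketch, assuming the Python client library, that lists the DLP jobs in a project and deletes them; PROJECT_ID is a placeholder, and the loop deletes every job it lists, so filter the list first if you have jobs you want to keep.

# Minimal sketch: list DLP jobs in the project and delete them.
# PROJECT_ID is a placeholder; this deletes every job it lists.
import google.cloud.dlp_v2

dlp = google.cloud.dlp_v2.DlpServiceClient()
parent = "projects/PROJECT_ID"

for job in dlp.list_dlp_jobs(request={"parent": parent}):
    print(f"Deleting job: {job.name}")
    dlp.delete_dlp_job(request={"name": job.name})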

Delete the Cloud Storage bucket

If you created a new Cloud Storage bucket to hold the sample data, delete the bucket:

  1. Open the Cloud Storage browser.

    Open Cloud Storage

  2. In the Cloud Storage browser, select the checkbox next to the name of the bucket that you created, and then click Delete. You can also delete the bucket programmatically, as in the sketch below.
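
The following is a minimal sketch, assuming the Cloud Storage Python client library, that deletes the sample bucket and its contents; BUCKET_NAME is a placeholder for the bucket you created.

# Minimal sketch: delete the sample bucket and its contents.
# BUCKET_NAME is a placeholder.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("BUCKET_NAME")
# force=True deletes the objects uploaded from the sample_data folder first.
bucket.delete(force=True)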

What's next