
Google Cloud Certified - Professional Cloud Security Engineer Question and Answers

Google Cloud Certified - Professional Cloud Security Engineer

Last Update Jan 14, 2026
Total Questions: 318

Questions 1

Your organization has implemented synchronization and SAML federation between Cloud Identity and Microsoft Active Directory. You want to reduce the risk of Google Cloud user accounts being compromised. What should you do?

Options:

A.  

Create a Cloud Identity password policy with strong password settings, and configure 2-Step Verification with security keys in the Google Admin console.

B.  

Create a Cloud Identity password policy with strong password settings, and configure 2-Step Verification with verification codes via text or phone call in the Google Admin console.

C.  

Create an Active Directory domain password policy with strong password settings, and configure post-SSO (single sign-on) 2-Step Verification with security keys in the Google Admin console.

D.  

Create an Active Directory domain password policy with strong password settings, and configure post-SSO (single sign-on) 2-Step Verification with verification codes via text or phone call in the Google Admin console.

Discussion 0
Questions 2

You are routing all your internet-facing traffic from Google Cloud through your on-premises internet connection. You want to accomplish this goal securely and with the highest bandwidth possible.

What should you do?

Options:

A.  

Create an HA VPN connection to Google Cloud. Replace the default 0.0.0.0/0 route.

B.  

Create a routing VM in Compute Engine. Configure the default route with the VM as the next hop.

C.  

Configure Cloud Interconnect with HA VPN. Replace the default 0.0.0.0/0 route with a route to an on-premises destination.

D.  

Configure Cloud Interconnect and route traffic through an on-premises firewall.

Discussion 0
Questions 3

Your organization is using Vertex AI Workbench Instances. You must ensure that newly deployed instances are automatically kept up-to-date and that users cannot accidentally alter settings in the operating system. What should you do?​

Options:

A.  

Enable the VM Manager and ensure the corresponding Google Compute Engine instances are added.​

B.  

Enforce the disableRootAccess and requireAutoUpgradeSchedule organization policies for newly deployed instances.​

C.  

Assign the AI Notebooks Runner and AI Notebooks Viewer roles to the users of the AI Workbench Instances.​

D.  

Implement a firewall rule that prevents Secure Shell access to the corresponding Google Compute Engine instances by using tags.​

Discussion 0
Questions 4

A DevOps team will create a new container to run on Google Kubernetes Engine. As the application will be internet-facing, they want to minimize the attack surface of the container.

What should they do?

Options:

A.  

Use Cloud Build to build the container images.

B.  

Build small containers using small base images.

C.  

Delete non-used versions from Container Registry.

D.  

Use a Continuous Delivery tool to deploy the application.

Discussion 0
Questions 5

A customer terminates an engineer and needs to make sure the engineer's Google account is automatically deprovisioned.

What should the customer do?

Options:

A.  

Use the Cloud SDK with their directory service to remove their IAM permissions in Cloud Identity.

B.  

Use the Cloud SDK with their directory service to provision and deprovision users from Cloud Identity.

C.  

Configure Cloud Directory Sync with their directory service to provision and deprovision users from Cloud Identity.

D.  

Configure Cloud Directory Sync with their directory service to remove their IAM permissions in Cloud Identity.

Discussion 0
Questions 6

Your security team uses encryption keys to ensure confidentiality of user data. You want to establish a process to reduce the impact of a potentially compromised symmetric encryption key in Cloud Key Management Service (Cloud KMS).

Which steps should your team take before an incident occurs? (Choose two.)

Options:

A.  

Disable and revoke access to compromised keys.

B.  

Enable automatic key version rotation on a regular schedule.

C.  

Manually rotate key versions on an ad hoc schedule.

D.  

Limit the number of messages encrypted with each key version.

E.  

Disable the Cloud KMS API.

Discussion 0
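
For reference, the automatic key rotation described in option B can be configured with the gcloud CLI along the following lines (a minimal sketch; the key, key ring, location, and schedule values are placeholders):

    # Set a 90-day automatic rotation schedule on an existing symmetric key
    gcloud kms keys update my-key \
        --keyring=my-keyring \
        --location=us-central1 \
        --rotation-period=90d \
        --next-rotation-time=2026-03-01T00:00:00Z

Rotating to new key versions ahead of time limits how much data any single compromised version can decrypt; disabling or destroying the compromised version is then the response step during an incident.
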
Questions 7

A company has redundant mail servers in different Google Cloud Platform regions and wants to route customers to the nearest mail server based on location.

How should the company accomplish this?

Options:

A.  

Configure TCP Proxy Load Balancing as a global load balancing service listening on port 995.

B.  

Create a Network Load Balancer to listen on TCP port 995 with a forwarding rule to forward traffic based on location.

C.  

Use Cross-Region Load Balancing with an HTTP(S) load balancer to route traffic to the nearest region.

D.  

Use Cloud CDN to route the mail traffic to the closest origin mail server based on client IP address.

Discussion 0
Questions 8

You are migrating an application into the cloud. The application will need to read data from a Cloud Storage bucket. Due to local regulatory requirements, you need to hold the key material used for encryption fully under your control, and you require a valid rationale for accessing the key material.

What should you do?

Options:

A.  

Encrypt the data in the Cloud Storage bucket by using Customer Managed Encryption Keys. Configure an IAM deny policy for unauthorized groups.

B.  

Encrypt the data in the Cloud Storage bucket by using Customer Managed Encryption Keys backed by a Cloud Hardware Security Module (HSM). Enable data access logs.

C.  

Generate a key in your on-premises environment and store it in a Hardware Security Module (HSM) that is managed on-premises. Use this key as an external key in the Cloud Key Management Service (KMS). Activate Key Access Justifications (KAJ) and set the external key system to reject unauthorized access.

D.  

Generate a key in your on-premises environment to encrypt the data before you upload the data to the Cloud Storage bucket. Upload the key to the Cloud Key Management Service (KMS). Activate Key Access Justifications (KAJ) and have the external key system reject unauthorized access.

Discussion 0
Questions 9

Your organization wants to protect all workloads that run on Compute Engine VMs to ensure that the instances weren't compromised by boot-level or kernel-level malware. You also need to ensure that data in use on the VMs cannot be read by the underlying host system, using a hardware-based solution.

What should you do?

Options:

A.  

1. Use Google Shielded VM, including Secure Boot, Virtual Trusted Platform Module (vTPM), and integrity monitoring. 2. Create a Cloud Run function to check the VM settings and generate metrics, and run the function regularly.

B.  

1. Activate Virtual Machine Threat Detection in Security Command Center (SCC) Premium. 2. Monitor the findings in SCC.

C.  

1. Use Google Shielded VM, including Secure Boot, Virtual Trusted Platform Module (vTPM), and integrity monitoring. 2. Activate Confidential Computing. 3. Enforce these actions by using organization policies.

D.  

1. Use secure hardened images from the Google Cloud Marketplace. 2. When deploying the images, activate the Confidential Computing option. 3. Enforce the use of the correct images and Confidential Computing by using organization policies.

Discussion 0
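
As an illustration of the controls mentioned in option C, a Shielded, Confidential VM can be created and the Shielded VM requirement enforced with commands like the following (a sketch with placeholder names; Confidential VM support depends on the machine type and image used):

    # Create a VM with Shielded VM features and Confidential Computing enabled
    gcloud compute instances create protected-vm \
        --zone=us-central1-a \
        --machine-type=n2d-standard-2 \
        --shielded-secure-boot --shielded-vtpm --shielded-integrity-monitoring \
        --confidential-compute \
        --maintenance-policy=TERMINATE

    # Enforce Shielded VM usage for new instances with an organization policy
    gcloud resource-manager org-policies enable-enforce compute.requireShieldedVm \
        --organization=ORGANIZATION_ID
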
Questions 10

Your organization is using Model Garden to maintain a collection of models in a single location and to deploy different types of models in a consistent way. You must ensure that your users can only access the approved models. What should you do?

Options:

A.  

Configure IAM permissions on individual Model Garden to restrict access to specific models.

B.  

Regularly audit user activity logs in Vertex AI to identify and revoke access to unapproved models.

C.  

Train custom models within your Vertex AI project and restrict user access to these models.

D.  

Implement an organization policy that restricts the vertexai.allowedModels constraint.

Discussion 0
Questions 11

A customer implements Cloud Identity-Aware Proxy for their ERP system hosted on Compute Engine. Their security team wants to add a security layer so that the ERP systems only accept traffic from Cloud Identity-Aware Proxy.

What should the customer do to meet these requirements?

Options:

A.  

Make sure that the ERP system can validate the JWT assertion in the HTTP requests.

B.  

Make sure that the ERP system can validate the identity headers in the HTTP requests.

C.  

Make sure that the ERP system can validate the x-forwarded-for headers in the HTTP requests.

D.  

Make sure that the ERP system can validate the user’s unique identifier headers in the HTTP requests.

Discussion 0
Questions 12

You want data on Compute Engine disks to be encrypted at rest with keys managed by Cloud Key Management Service (KMS). Cloud Identity and Access Management (IAM) permissions to these keys must be managed in a grouped way because the permissions should be the same for all keys.

What should you do?

Options:

A.  

Create a single KeyRing for all persistent disks and all Keys in this KeyRing. Manage the IAM permissions at the Key level.

B.  

Create a single KeyRing for all persistent disks and all Keys in this KeyRing. Manage the IAM permissions at the KeyRing level.

C.  

Create a KeyRing per persistent disk, with each KeyRing containing a single Key. Manage the IAM permissions at the Key level.

D.  

Create a KeyRing per persistent disk, with each KeyRing containing a single Key. Manage the IAM permissions at the KeyRing level.

Discussion 0
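
To illustrate managing permissions at the key ring level (options B and D), an IAM binding can be granted on the key ring itself, for example (placeholder names):

    # Grant a group permission to use every key in the key ring for encrypt/decrypt
    gcloud kms keyrings add-iam-policy-binding disk-keyring \
        --location=us-central1 \
        --member=group:disk-key-users@example.com \
        --role=roles/cloudkms.cryptoKeyEncrypterDecrypter

Because keys inherit the key ring's IAM policy, a single binding here applies uniformly to all keys in the ring.
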
Questions 13

Your company is deploying a large number of containerized applications to GKE. The existing CI/CD pipeline uses Cloud Build to construct container images, transfers the images to Artifact Registry, and then deploys the images to GKE. You need to ensure that only images that have passed vulnerability scanning and meet specific corporate policies are allowed to be deployed. The process needs to be automated and integrated into the existing CI/CD pipeline. What should you do?

Options:

A.  

Implement a custom script in the Cloud Build pipeline that uses a third-party vulnerability scanning tool. Fail the build if vulnerabilities are found.

B.  

Configure GKE to use only images from a specific, trusted Artifact Registry repository. Manually inspect all images before pushing them to this repository.

C.  

Configure a policy in Binary Authorization to use Artifact Analysis vulnerability scanning to only allow images that pass the scan to deploy to your GKE clusters.

D.  

Enable Artifact Analysis vulnerability scanning and regularly scan images in Artifact Registry. Remove any images that do not meet the vulnerability requirements before deployment.

Discussion 0
Questions 14

Your company requires the security and network engineering teams to identify all network anomalies and be able to capture payloads within VPCs. Which method should you use?

Options:

A.  

Define an organization policy constraint.

B.  

Configure packet mirroring policies.

C.  

Enable VPC Flow Logs on the subnet.

D.  

Monitor and analyze Cloud Audit Logs.

Discussion 0
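
A Packet Mirroring policy (option B) can be created roughly as follows; the network, subnet, and collector forwarding rule names are placeholders, and the collector must be an internal load balancer configured for mirroring:

    # Mirror traffic from a subnet to a collector internal load balancer
    gcloud compute packet-mirrorings create mirror-prod-traffic \
        --region=us-central1 \
        --network=prod-vpc \
        --mirrored-subnets=subnet-a \
        --collector-ilb=collector-forwarding-rule
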
Questions 15

Your company is using GSuite and has developed an application meant for internal usage on Google App Engine. You need to make sure that an external user cannot gain access to the application even when an employee’s password has been compromised.

What should you do?

Options:

A.  

Enforce 2-factor authentication in GSuite for all users.

B.  

Configure Cloud Identity-Aware Proxy for the App Engine Application.

C.  

Provision user passwords using GSuite Password Sync.

D.  

Configure Cloud VPN between your private network and GCP.

Discussion 0
Questions 16

An organization is starting to move its infrastructure from its on-premises environment to Google Cloud Platform (GCP). The first step the organization wants to take is to migrate its ongoing data backup and disaster recovery solutions to GCP. The organization's on-premises production environment is going to be the next phase for migration to GCP. Stable networking connectivity between the on-premises environment and GCP is also being implemented.

Which GCP solution should the organization use?

Options:

A.  

BigQuery using a data pipeline job with continuous updates via Cloud VPN

B.  

Cloud Storage using a scheduled task and gsutil via Cloud Interconnect

C.  

Compute Engine Virtual Machines using Persistent Disk via Cloud Interconnect

D.  

Cloud Datastore using regularly scheduled batch upload jobs via Cloud VPN

Discussion 0
Questions 17

You are auditing all your Google Cloud resources in the production project. You want to identify all principals who can change firewall rules.

What should you do?

Options:

A.  

Use Policy Analyzer to query the permissions compute.firewalls.create, compute.firewalls.update, or compute.firewalls.delete.

B.  

Reference the Security Health Analytics - Firewall Vulnerability Findings in the Security Command Center.

C.  

Use Policy Analyzer to query the permissions compute.firewalls.get or compute.firewalls.list.

D.  

Use Firewall Insights to understand your firewall rules usage patterns.

Discussion 0
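
The Policy Analyzer query described in option A can be run through Cloud Asset Inventory, for example (the organization ID is a placeholder):

    # List which principals hold firewall-changing permissions anywhere in the org
    gcloud asset analyze-iam-policy \
        --organization=ORGANIZATION_ID \
        --permissions='compute.firewalls.create,compute.firewalls.update,compute.firewalls.delete'
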
Questions 18

You are tasked with exporting and auditing security logs for login activity events for Google Cloud console and API calls that modify configurations to Google Cloud resources. Your export must meet the following requirements:

Export related logs for all projects in the Google Cloud organization.

Export logs in near real-time to an external SIEM.

What should you do? (Choose two.)

Options:

A.  

Create a Log Sink at the organization level with a Pub/Sub destination.

B.  

Create a Log Sink at the organization level with the includeChildren parameter, and set the destination to a Pub/Sub topic.

C.  

Enable Data Access audit logs at the organization level to apply to all projects.

D.  

Enable Google Workspace audit logs to be shared with Google Cloud in the Admin Console.

E.  

Ensure that the SIEM processes the AuthenticationInfo field in the audit log entry to gather identity information.

Discussion 0
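
An organization-level aggregated sink to Pub/Sub (option B) might look like this sketch; the topic, filter, and organization ID are placeholders:

    # Route audit logs from all projects in the organization to a Pub/Sub topic
    gcloud logging sinks create org-audit-sink \
        pubsub.googleapis.com/projects/siem-project/topics/audit-logs \
        --organization=ORGANIZATION_ID \
        --include-children \
        --log-filter='logName:"cloudaudit.googleapis.com"'

The SIEM then subscribes to the topic to receive the entries in near real time.
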
Questions 19

Your organization leverages folders to represent different teams within your Google Cloud environment. To support Infrastructure as Code (IaC) practices, each team receives a dedicated service account upon onboarding. You want to ensure that teams have comprehensive permissions to manage resources within their assigned folders while adhering to the principle of least privilege. You must design the permissions for these team-based service accounts in the most effective way possible. What should you do?​

Options:

A.  

Grant each service account the folder administrator role on its respective folder.​

B.  

Grant each service account the project creator role at the organization level and use folder-level IAM conditions to restrict project creation to specific folders.

C.  

Assign each service account the project editor role at the organization level and instruct teams to use IAM bindings at the folder level for fine-grained permissions.​

D.  

Assign each service account the folder IAM administrator role on its respective folder to allow teams to create and manage additional custom roles if needed.​

Discussion 0
Questions 20

Your company is deploying a three-tier web application—web, application, and database—on Google Cloud. You need to configure network isolation between tiers to minimize the attack surface. The web tier needs to be accessible from the public internet, the application tier should only be accessible from the web tier, and the database tier should only be accessible from the application tier. Your solution must follow Google-recommended practices. What should you do?

Options:

A.  

Create three separate VPC networks, one for each tier. Configure VPC Network Peering between the web and application VPCs, and between the application and database VPCs. Use firewall rules to control the traffic.

B.  

Create a single subnet for all tiers. Create firewall rules that allow all traffic between instances within the same subnet. Use application-level security to prevent unauthorized access.

C.  

Create three subnets within the VPC, one for each tier. Create firewall rules that allow traffic on specific ports on each subnet. Use network tags or service accounts on the VMs to apply the firewall rules.

D.  

Create three subnets within the VPC, one for each tier. Enable Private Google Access on each subnet. Create a single firewall rule allowing all traffic between the subnets.

Discussion 0
Questions 21

Your team needs to make sure that a Compute Engine instance does not have access to the internet or to any Google APIs or services.

Which two settings must remain disabled to meet these requirements? (Choose two.)

Options:

A.  

Public IP

B.  

IP Forwarding

C.  

Private Google Access

D.  

Static routes

E.  

IAM Network User Role

Discussion 0
Questions 22

You are setting up a CI/CD pipeline to deploy containerized applications to your production clusters on Google Kubernetes Engine (GKE). You need to prevent containers with known vulnerabilities from being deployed. You have the following requirements for your solution:

Must be cloud-native

Must be cost-efficient

Minimize operational overhead

How should you accomplish this? (Choose two.)

Options:

A.  

Create a Cloud Build pipeline that will monitor changes to your container templates in a Cloud Source Repositories repository. Add a step to analyze Container Analysis results before allowing the build to continue.

B.  

Use a Cloud Function triggered by log events in Google Cloud's operations suite to automatically scan your container images in Container Registry.

C.  

Use a cron job on a Compute Engine instance to scan your existing repositories for known vulnerabilities and raise an alert if a non-compliant container image is found.

D.  

Deploy Jenkins on GKE and configure a CI/CD pipeline to deploy your containers to Container Registry. Add a step to validate your container images before deploying your container to the cluster.

E.  

In your CI/CD pipeline, add an attestation on your container image when no vulnerabilities have been found. Use a Binary Authorization policy to block deployments of containers with no attestation in your cluster.

Discussion 0
Questions 23

Which two security characteristics are related to the use of VPC peering to connect two VPC networks? (Choose two.)

Options:

A.  

Central management of routes, firewalls, and VPNs for peered networks

B.  

Non-transitive peered networks; where only directly peered networks can communicate

C.  

Ability to peer networks that belong to different Google Cloud Platform organizations

D.  

Firewall rules that can be created with a tag from one peered network to another peered network

E.  

Ability to share specific subnets across peered networks

Discussion 0
Questions 24

Your organization has recently migrated sensitive customer data to Cloud Storage buckets. For compliance reasons, you must ensure that all vendor data access and administrative access by Google personnel is logged. What should you do?

Options:

A.  

Configure Data Access audit logs for Cloud Storage on the project hosting the Cloud Storage buckets.

B.  

Enable Access Transparency for the organization.

C.  

Configure Data Access audit logs for Cloud Storage at the organization level.

D.  

Enable Access Transparency for the project hosting the Cloud Storage buckets.

Discussion 0
Questions 25

You have just created a new log bucket to replace the _Default log bucket. You want to route all log entries that are currently routed to the _Default log bucket to this new log bucket in the most efficient manner. What should you do?​

Options:

A.  

Create a user-defined sink with inclusion filters copied from the _Default sink. Select the new log bucket as the sink destination.​

B.  

Create exclusion filters for the _Default sink to prevent it from receiving new logs. Create a user-defined sink, and select the new log bucket as the sink destination.​

C.  

Disable the _Default sink. Create a user-defined sink and select the new log bucket as the sink destination.​

D.  

Edit the _Default sink, and select the new log bucket as the sink destination.​

Discussion 0
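
Editing the _Default sink's destination (option D) can be done with the gcloud CLI roughly as follows (the bucket name, location, and project ID are placeholders):

    # Create the replacement log bucket
    gcloud logging buckets create new-default-bucket --location=global

    # Point the _Default sink at the new bucket
    gcloud logging sinks update _Default \
        logging.googleapis.com/projects/PROJECT_ID/locations/global/buckets/new-default-bucket
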
Questions 26

You want to set up a secure, internal network within Google Cloud for database servers. The servers must not have any direct communication with the public internet. What should you do?

Options:

A.  

Assign a static public IP address to each database server. Use firewall rules to restrict external access.

B.  

Create a VPC with a private subnet. Assign a private IP address to each database server.

C.  

Assign both a private IP address and a public IP address to each database server.

D.  

Assign a private IP address to each database server. Use a NAT gateway to provide internet connectivity to the database servers.

Discussion 0
Questions 27

All logs in your organization are aggregated into a centralized Google Cloud logging project for analysis and long-term retention. While most of the log data can be viewed by operations teams, specific sensitive fields (for example, protoPayload.authenticationInfo.principalEmail) contain identifiable information that should be restricted to security teams only. You need to implement a solution that allows different teams to view their respective application logs in the centralized logging project. It must also restrict access to specific sensitive fields within those logs to only a designated security group. Your solution must ensure that other fields in the same log entry remain visible to other authorized groups. What should you do?

Options:

A.  

Configure field-level access in Cloud Logging by defining data access policies that specify sensitive fields and the authorized principals.

B.  

Use Cloud IAM custom roles with specific permissions on logging.privateLogEntries.list. Define field-level access within the custom role's conditions.

C.  

Implement a log sink to exclude sensitive fields before logs are sent to the centralized logging project. Create separate sinks for sensitive data.

D.  

Create a BigQuery authorized view on the exported log sink to filter out the sensitive fields based on user groups.

Discussion 0
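
Field-level access in Cloud Logging (option A) is configured by marking restricted fields on the log bucket and granting the field accessor role to the security group; a sketch with placeholder names (flag availability may vary by gcloud version):

    # Restrict a sensitive field on the centralized log bucket
    gcloud logging buckets update central-logs \
        --location=global \
        --restricted-fields="protoPayload.authenticationInfo.principalEmail"

    # Allow only the security team to read restricted fields
    gcloud projects add-iam-policy-binding central-logging-project \
        --member=group:security-team@example.com \
        --role=roles/logging.fieldAccessor
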
Questions 28

You need to audit the network segmentation for your Google Cloud footprint. You currently operate Production and Non-Production infrastructure-as-a-service (IaaS) environments. All your VM instances are deployed without any service account customization.

After observing the traffic in your custom network, you notice that all instances can communicate freely, despite tag-based VPC firewall rules with a priority of 1000 that are in place to segment traffic properly. What are the most likely reasons for this behavior?

Options:

A.  

All VM instances are missing the respective network tags.

B.  

All VM instances are residing in the same network subnet.

C.  

All VM instances are configured with the same network route.

D.  

A VPC firewall rule is allowing traffic between source/targets based on the same service account with priority 999.

E.  

A VPC firewall rule is allowing traffic between source/targets based on the same service account with priority 1001.

Discussion 0
Questions 29

A security audit uncovered several inconsistencies in your project's Identity and Access Management (IAM) configuration. Some service accounts have overly permissive roles, and a few external collaborators have more access than necessary. You need to gain detailed visibility into changes to IAM policies, user activity, service account behavior, and access to sensitive projects. What should you do?

Options:

A.  

Enable Metrics Explorer in Cloud Monitoring to follow the service account authentication events, and build alerts based on them.

B.  

Use Cloud Audit Logs. Create log export sinks to send these logs to a security information and event management (SIEM) solution for correlation with other event sources.​

C.  

Configure Google Cloud Functions to be triggered by changes to IAM policies. Analyze changes by using the policy simulator, send alerts upon risky modifications, and store event details.​

D.  

Deploy the OS Config Management agent to your VMs. Use OS Config Management to create patch management jobs and monitor system modifications.​

Discussion 0
Questions 30

Your financial services company needs to process customer personally identifiable information (PII) for analytics while adhering to strict privacy regulations. You must transform this data to protect individual privacy while ensuring that the data retains its original format and consistency for analytical integrity. Your solution must avoid full irreversible deletion. What should you do?

Options:

A.  

Configure Sensitive Data Protection (SDP) to de-identify PII using format-preserving encryption (FPE).

B.  

Use Cloud Key Management Service (Cloud KMS) to encrypt the entire dataset with a customer-managed encryption key (CMEK).

C.  

Implement a custom BigQuery user-defined function (UDF) by using JavaScript to hash all sensitive fields before they are loaded into the analytical tables.

D.  

Set up VPC Service Controls around the BigQuery project. Implement row-level encryption.

Discussion 0
Questions 31

Your company uses Google Cloud and has publicly exposed network assets. You want to discover the assets and perform a security audit on these assets by using a software tool in the least amount of time.

What should you do?

Options:

A.  

Run a platform security scanner on all instances in the organization.

B.  

Notify Google about the pending audit and wait for confirmation before performing the scan.

C.  

Contact a Google approved security vendor to perform the audit.

D.  

Identify all external assets by using Cloud Asset Inventory and then run a network security scanner against them.

Discussion 0
Questions 32

You have an application where the frontend is deployed on a managed instance group in subnet A, and the data layer is stored on a MySQL Compute Engine virtual machine (VM) in subnet B on the same VPC. Subnet A and subnet B hold several other Compute Engine VMs. You only want to allow the application frontend to access the data in the application's MySQL instance on port 3306.

What should you do?

Options:

A.  

Configure an ingress firewall rule that allows communication from the src IP range of subnet A to the tag "data-tag" that is applied to the mysql Compute Engine VM on port 3306.

B.  

Configure an ingress firewall rule that allows communication from the frontend's unique service account to the unique service account of the mysql Compute Engine VM on port 3306.

C.  

Configure a network tag "fe-tag" to be applied to all instances in subnet A and a network tag "data-tag" to be applied to all instances in subnet B. Then configure an egress firewall rule that allows communication from Compute Engine VMs tagged with data-tag to destination Compute Engine VMs tagged fe-tag.

D.  

Configure a network tag "fe-tag" to be applied to all instances in subnet A and a network tag "data-tag" to be applied to all instances in subnet B. Then configure an ingress firewall rule that allows communication from Compute Engine VMs tagged with fe-tag to destination Compute Engine VMs tagged with data-tag.

Discussion 0
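
A service-account-based firewall rule (option B) could be created along these lines; the VPC, project, and service account names are placeholders:

    # Allow only the frontend service account to reach the MySQL VM on port 3306
    gcloud compute firewall-rules create allow-fe-to-mysql \
        --network=app-vpc \
        --direction=INGRESS \
        --action=ALLOW \
        --rules=tcp:3306 \
        --source-service-accounts=frontend-sa@my-project.iam.gserviceaccount.com \
        --target-service-accounts=mysql-sa@my-project.iam.gserviceaccount.com
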
Questions 33

You are consulting with a client that requires end-to-end encryption of application data (including data in transit, data in use, and data at rest) within Google Cloud. Which options should you utilize to accomplish this? (Choose two.)

Options:

A.  

External Key Manager

B.  

Customer-supplied encryption keys

C.  

Hardware Security Module

D.  

Confidential Computing and Istio

E.  

Client-side encryption

Discussion 0
Questions 34

Your organization is using Model Garden to maintain a collection of models in a single location and to deploy different types of models in a consistent way. You must ensure that your users can only access the approved models. What should you do?

Options:

A.  

Configure IAM permissions on individual Model Garden to restrict access to specific models.

B.  

Regularly audit user activity logs in Vertex AI to identify and revoke access to unapproved models.

C.  

Train custom models within your Vertex AI project, and restrict user access to these models.

D.  

Implement an organization policy that restricts the vertexai.allowedModels constraint.

Discussion 0
Questions 35

You plan to deploy your cloud infrastructure using a CI/CD cluster hosted on Compute Engine. You want to minimize the risk of its credentials being stolen by a third party. What should you do?

Options:

A.  

Create a dedicated Cloud Identity user account for the cluster. Use a strong self-hosted vault solution to store the user's temporary credentials.

B.  

Create a dedicated Cloud Identity user account for the cluster. Enable the constraints/iam.disableServiceAccountCreation organization policy at the project level.

C.  

Create a custom service account for the cluster. Enable the constraints/iam.disableServiceAccountKeyCreation organization policy at the project level.

D.  

Create a custom service account for the cluster. Enable the constraints/iam.allowServiceAccountCredentialLifetimeExtension organization policy at the project level.

Discussion 0
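
The boolean organization policy named in option C can be enforced on a project with a single command, for example (the project ID is a placeholder):

    # Block creation of user-managed service account keys in the CI/CD project
    gcloud resource-manager org-policies enable-enforce \
        iam.disableServiceAccountKeyCreation \
        --project=ci-cd-project

With key creation blocked, the cluster's custom service account relies on the credentials of the attached VM instead of exportable key files, which removes the most common theft vector.
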
Questions 36

Your company is using Cloud Dataproc for its Spark and Hadoop jobs. You want to be able to create, rotate, and destroy symmetric encryption keys used for the persistent disks used by Cloud Dataproc. Keys can be stored in the cloud.

What should you do?

Options:

A.  

Use the Cloud Key Management Service to manage the data encryption key (DEK).

B.  

Use the Cloud Key Management Service to manage the key encryption key (KEK).

C.  

Use customer-supplied encryption keys to manage the data encryption key (DEK).

D.  

Use customer-supplied encryption keys to manage the key encryption key (KEK).

Discussion 0
Questions 37

You are a member of the security team at an organization. Your team has a single GCP project with credit card payment processing systems alongside web applications and data processing systems. You want to reduce the scope of systems subject to PCI audit standards.

What should you do?

Options:

A.  

Use multi-factor authentication for admin access to the web application.

B.  

Use only applications certified compliant with PA-DSS.

C.  

Move the cardholder data environment into a separate GCP project.

D.  

Use VPN for all connections between your office and cloud environments.

Discussion 0
Questions 38

Options:

A.  

Implement a Cloud Function that scans the environment variables multiple times a day, and creates a finding in Security Command Center if secrets are discovered.

B.  

Implement regular peer reviews to assess the environment variables and identify secrets in your Cloud Functions. Raise a security incident if secrets are discovered.

C.  

Use Sensitive Data Protection to scan the environment variables multiple times per day, and create a finding in Security Command Center if secrets are discovered.

D.  

Integrate dynamic application security testing into the CI/CD pipeline that scans the application code for the Cloud Functions. Fail the build process if secrets are discovered.

Discussion 0
Questions 39

An office manager at your small startup company is responsible for matching payments to invoices and creating billing alerts. For compliance reasons, the office manager is only permitted to have the Identity and Access Management (IAM) permissions necessary for these tasks. Which two IAM roles should the office manager have? (Choose two.)

Options:

A.  

Organization Administrator

B.  

Project Creator

C.  

Billing Account Viewer

D.  

Billing Account Costs Manager

E.  

Billing Account User

Discussion 0
Questions 40

You work for a large organization that runs many custom training jobs on Vertex AI. A recent compliance audit identified a security concern. All jobs currently use the Vertex AI service agent. The audit mandates that each training job must be isolated, with access only to the required Cloud Storage buckets, following the principle of least privilege. You need to design a secure, scalable solution to enforce this requirement. What should you do?

Options:

A.  

Create a custom service account. Assign it the storage object user role at the project level. Configure all Vertex AI custom training jobs to run as this service account.

B.  

Continue to use the default Vertex AI service agent. Implement VPC Service Controls around the Vertex AI and Cloud Storage services.

C.  

Modify the IAM policy of each Cloud Storage bucket to grant the default Vertex AI service agent the storage Legacy Object Reader role.

D.  

Use a dedicated service account for each custom training job. Grant each account the storage Legacy Object Reader role for the necessary Cloud Storage buckets.

Discussion 0
Questions 41

Your organization operates virtual machines (VMs) with only private IPs in the Virtual Private Cloud (VPC), with internet access through Cloud NAT. Every day, you must patch all VMs with critical OS updates and provide summary reports.

What should you do?

Options:

A.  

Validate that the egress firewall rules allow any outgoing traffic. Log in to each VM and execute OS-specific update commands. Configure a Cloud Scheduler job to update with critical patches daily.

B.  

Ensure that VM Manager is installed and running on the VMs. In the OS patch management service, configure the patch jobs to update with critical patches daily.

C.  

Assign public IPs to the VMs. Validate that the egress firewall rules allow any outgoing traffic. Log in to each VM, and configure a daily cron job to run OS updates at night during low-activity periods.

D.  

Copy the latest patches to a Cloud Storage bucket. Log in to each VM, download the patches from the bucket, and install them.

Discussion 0
Questions 42

You are part of a security team investigating a compromised service account key. You need to audit which new resources were created by the service account.

What should you do?

Options:

A.  

Query Data Access logs.

B.  

Query Admin Activity logs.

C.  

Query Access Transparency logs.

D.  

Query Stackdriver Monitoring Workspace.

Discussion 0
Questions 43

You are working with a client that is concerned about control of their encryption keys for sensitive data. The client does not want to store encryption keys at rest in the same cloud service provider (CSP) as the data that the keys are encrypting. Which Google Cloud encryption solutions should you recommend to this client? (Choose two.)

Options:

A.  

Customer-supplied encryption keys.

B.  

Google default encryption

C.  

Secret Manager

D.  

Cloud External Key Manager

E.  

Customer-managed encryption keys

Discussion 0
Questions 44

You are the security admin of your company. You have 3,000 objects in your Cloud Storage bucket. You do not want to manage access to each object individually. You also do not want the uploader of an object to always have full control of the object. However, you want to use Cloud Audit Logs to manage access to your bucket.

What should you do?

Options:

A.  

Set up an ACL with OWNER permission to a scope of allUsers.

B.  

Set up an ACL with READER permission to a scope of allUsers.

C.  

Set up a default bucket ACL and manage access for users using IAM.

D.  

Set up Uniform bucket-level access on the Cloud Storage bucket and manage access for users using IAM.

Discussion 0
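
Uniform bucket-level access (option D) can be turned on and combined with bucket-level IAM bindings roughly as follows (the bucket and group names are placeholders):

    # Disable object ACLs so that IAM alone controls access
    gcloud storage buckets update gs://my-objects-bucket --uniform-bucket-level-access

    # Grant read access through IAM at the bucket level
    gcloud storage buckets add-iam-policy-binding gs://my-objects-bucket \
        --member=group:readers@example.com \
        --role=roles/storage.objectViewer
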
Questions 45

Your organization has on-premises hosts that need to access Google Cloud APIs. You must enforce private connectivity between these hosts, minimize costs, and optimize for operational efficiency.

What should you do?

Options:

A.  

Route all on-premises traffic to Google Cloud through an IPsec VPN tunnel to a VPC with Private Google Access enabled.

B.  

Set up VPC peering between the hosts on-premises and the VPC through the internet.

C.  

Enforce a security policy that mandates all applications to encrypt data with a Cloud Key Management Service (KMS) key before you send it over the network.

D.  

Route all on-premises traffic to Google Cloud through a Dedicated Interconnect or Partner Interconnect to a VPC with Private Google Access enabled.

Discussion 0
Questions 46

Your organization must comply with the regulation to keep instance logging data within Europe. Your workloads will be hosted in the Netherlands in region europe-west4 in a new project. You must configure Cloud Logging to keep your data in the country.

What should you do?

Options:

A.  

Configure the organization policy constraint gcp.resourceLocations to europe-west4.

B.  

Set the logging storage region to europe-west4 by using the gcloud CLI command gcloud logging settings update.

C.  

Create a new log bucket in europe-west4, and redirect the _Default bucket to the new bucket.

D.  

Configure a log sink to export all logs into a Cloud Storage bucket in europe-west4.

Discussion 0
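
Option B refers to the Log Router settings. As a sketch, the default log storage location can be pinned at the organization (or folder) level so that the _Default and _Required buckets of new projects are created in europe-west4 (the organization ID is a placeholder):

    # Store logs for newly created projects in europe-west4 by default
    gcloud logging settings update \
        --organization=ORGANIZATION_ID \
        --storage-location=europe-west4
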
Questions 47

Your organization has an operational image classification model running on a managed AI service on Google Cloud. You are in a configuration review with stakeholders and must describe the security responsibilities for the image classification model. What should you do?

Options:

A.  

Explain the development of custom network firewalls around the image classification service for deep intrusion detection and prevention. Describe vulnerability scanning tools for known vulnerabilities.

B.  

Explain Google's shared responsibility model. Focus the configuration review on Identity and Access Management (IAM) permissions, secure data upload/download procedures, and monitoring logs for any potential malicious activity.

C.  

Explain that using platform-as-a-service (PaaS) transfers security concerns to Google. Describe the need for strict API usage limits to protect against unexpected usage and billing spikes.

D.  

Explain the security aspects of the code that transforms user-uploaded images using Google's service. Define Cloud IAM for fine-grained access control within the development team.

Discussion 0
Questions 48

A company is running their webshop on Google Kubernetes Engine and wants to analyze customer transactions in BigQuery. You need to ensure that no credit card numbers are stored in BigQuery.

What should you do?

Options:

A.  

Create a BigQuery view with regular expressions matching credit card numbers to query and delete affected rows.

B.  

Use the Cloud Data Loss Prevention API to redact related infoTypes before data is ingested into BigQuery.

C.  

Leverage Security Command Center to scan for the assets of type Credit Card Number in BigQuery.

D.  

Enable Cloud Identity-Aware Proxy to filter out credit card numbers before storing the logs in BigQuery.

Discussion 0
Questions 49

You manage a mission-critical workload for your organization, which is in a highly regulated industry. The workload uses Compute Engine VMs to analyze and process the sensitive data after it is uploaded to Cloud Storage from the endpoint computers. Your compliance team has detected that this workload does not meet the data protection requirements for sensitive data. You need to meet these requirements:

• Manage the data encryption key (DEK) outside the Google Cloud boundary.

• Maintain full control of encryption keys through a third-party provider.

• Encrypt the sensitive data before uploading it to Cloud Storage

• Decrypt the sensitive data during processing in the Compute Engine VMs

• Encrypt the sensitive data in memory while in use in the Compute Engine VMs

What should you do?

Choose 2 answers

Options:

A.  

Create a VPC Service Controls service perimeter across your existing Compute Engine VMs and Cloud Storage buckets

B.  

Migrate the Compute Engine VMs to Confidential VMs to access the sensitive data.

C.  

Configure Cloud External Key Manager to encrypt the sensitive data before it is uploaded to Cloud Storage and decrypt the sensitive data after it is downloaded into your VMs

D.  

Create Confidential VMs to access the sensitive data.

E.  

Configure Customer Managed Encryption Keys to encrypt the sensitive data before it is uploaded to Cloud Storage, and decrypt the sensitive data after it is downloaded into your VMs.

Discussion 0
Questions 50

A customer’s company has multiple business units. Each business unit operates independently, and each has their own engineering group. Your team wants visibility into all projects created within the company and wants to organize their Google Cloud Platform (GCP) projects based on different business units. Each business unit also requires separate sets of IAM permissions.

Which strategy should you use to meet these needs?

Options:

A.  

Create an organization node, and assign folders for each business unit.

B.  

Establish standalone projects for each business unit, using gmail.com accounts.

C.  

Assign GCP resources in a project, with a label identifying which business unit owns the resource.

D.  

Assign GCP resources in a VPC for each business unit to separate network access.

Discussion 0
Questions 51

You work for a healthcare provider that is expanding into the cloud to store and process sensitive patient data. You must ensure the chosen Google Cloud configuration meets these strict regulatory requirements:​

Data must reside within specific geographic regions.​

Certain administrative actions on patient data require explicit approval from designated compliance officers.​

Access to patient data must be auditable.​

What should you do?

Options:

A.  

Select multiple standard Google Cloud regions for high availability. Implement Access Control Lists (ACLs) on individual storage objects containing patient data. Enable Cloud Audit Logs.​

B.  

Deploy an Assured Workloads environment in multiple regions for redundancy. Utilize custom IAM roles with granular permissions. Isolate network-level data by using VPC Service Controls.​

C.  

Deploy an Assured Workloads environment in an approved region. Configure Access Approval for sensitive operations on patient data. Enable both Cloud Audit Logs and Access Transparency.​

D.  

Select a standard Google Cloud region. Restrict access to patient data based on user location and job function by using Access Context Manager. Enable both Cloud Audit Logging and Access Transparency.​

Discussion 0
Questions 52

Your organization has a hybrid cloud environment with a data center connected to Google Cloud through a dedicated Cloud Interconnect connection. You need to configure private access from your on-premises hosts to Google APIs, specifically Cloud Storage and BigQuery, without exposing traffic to the public internet. What should you do?

Options:

A.  

Configure Shared VPC to extend your Google Cloud VPC network to your on-premises environment. Use Private Google Access to access Google APIs.

B.  

Use Private Google Access for on-premises hosts. Configure DNS resolution to point to the private.googleapis.com domain.

C.  

Configure Cloud NAT on your on-premises network. Configure DNS records in a private DNS zone to send requests to 199.36.153.8/30 to access Google APIs.

D.  

Establish VPC peering between your on-premises network and your Google Cloud VPC network. Configure Cloud Firewall rules to allow traffic to Google API IP ranges.

Discussion 0
Questions 53

You are implementing communications restrictions for specific services in your Google Cloud organization. Your data analytics team works in a dedicated folder. You need to ensure that access to BigQuery is controlled for that folder and its projects. The data analytics team must be able to control the restrictions only at the folder level. What should you do?

Options:

A.  

Enforce the Restrict Resource Service Usage organization policy constraint on the folder to restrict BigQuery access. Assign the data analytics team the Organization Policy Administrator role to allow the team to manage exclusions within the folder.

B.  

Create a scoped policy on the folder with a service perimeter to restrict BigQuery access. Assign the data analytics team the Access Context Manager Editor role on the scoped policy to allow the team to configure the scoped policy.

C.  

Define a hierarchical firewall policy on the folder to deny BigQuery access. Assign the data analytics team the Compute Organization Firewall Policy Admin role to allow the team to configure rules for the firewall policy.

D.  

Create an organization-level access policy with a service perimeter to restrict BigQuery access. Assign the data analytics team the Access Context Manager Editor role on the access policy to allow the team to configure the access policy.

Discussion 0
Questions 54

Your Google Cloud environment has one organization node, one folder named "Apps," and several projects within that folder. The organization node enforces the constraints/iam.allowedPolicyMemberDomains organization policy, which allows members from the terramearth.com organization. The "Apps" folder enforces the constraints/iam.allowedPolicyMemberDomains organization policy, which allows members from the flowlogistic.com organization. It also has the inheritFromParent: false property.

You attempt to grant access to a project in the Apps folder to the user testuser@terramearth.com.

What is the result of your action and why?

Options:

A.  

The action fails because a constraints/iam.allowedPolicyMemberDomains organization policy must be defined on the current project to deactivate the constraint temporarily.

B.  

The action fails because a constraints/iam.allowedPolicyMemberDomains organization policy is in place and only members from the flowlogistic.com organization are allowed.

C.  

The action succeeds because members from both organizations, terramearth.com and flowlogistic.com, are allowed on projects in the "Apps" folder.

D.  

The action succeeds and the new member is successfully added to the project's Identity and Access Management (IAM) policy because all policies are inherited by underlying folders and projects.

Discussion 0
Questions 55

Your company wants to determine what products they can build to help customers improve their credit scores depending on their age range. To achieve this, you need to join user information in the company's banking app with customers' credit score data received from a third party. While using this raw data will allow you to complete this task, it exposes sensitive data, which could be propagated into new systems.

This risk needs to be addressed using de-identification and tokenization with Cloud Data Loss Prevention while maintaining the referential integrity across the database. Which cryptographic token format should you use to meet these requirements?

Options:

A.  

Deterministic encryption

B.  

Secure, key-based hashes

C.  

Format-preserving encryption

D.  

Cryptographic hashing

Discussion 0
Questions 56

Your company's storage team manages all product images within a specific Google Cloud project. To maintain control, you must isolate access to Cloud Storage for this project, allowing the storage team to manage restrictions at the project level. They must be restricted to using corporate computers. What should you do?

Options:

A.  

Employ organization-level firewall rules to block all traffic to Cloud Storage. Create exceptions for specific service accounts used by the storage team within their project.

B.  

Implement VPC Service Controls by establishing an organization-wide service perimeter with all projects. Configure ingress and egress rules to restrict access to Cloud Storage based on IP address ranges.

C.  

Use Context-Aware Access. Create an access level that defines the required context. Apply it as an organization policy specifically at the project level, restricting access to Cloud Storage based on that context.

D.  

Use Identity and Access Management (IAM) roles at the project level within the storage team's project. Grant the storage team granular permissions on the project's Cloud Storage resources.

Discussion 0
Questions 57

You are the project owner for a regulated workload that runs in a project you own and manage as an Identity and Access Management (IAM) admin. For an upcoming audit, you need to provide access reviews evidence. Which tool should you use?

Options:

A.  

Policy Troubleshooter

B.  

Policy Analyzer

C.  

IAM Recommender

D.  

Policy Simulator

Discussion 0
Questions 58

Your organization previously stored files in Cloud Storage by using Google Managed Encryption Keys (GMEK), but has recently updated the internal policy to require Customer Managed Encryption Keys (CMEK). You need to re-encrypt the files quickly and efficiently with minimal cost.

What should you do?

Options:

A.  

Encrypt the files locally, and then use gsutil to upload the files to a new bucket.

B.  

Copy the files to a new bucket with CMEK enabled in a secondary region

C.  

Reupload the files to the same Cloud Storage bucket specifying a key file by using gsutil.

D.  

Change the encryption type on the bucket to CMEK, and rewrite the objects

Discussion 0
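
Rewriting objects in place with a CMEK (option D) can be done without re-uploading the data, for example (the bucket and key paths are placeholders):

    # Set the bucket's default CMEK for new objects
    gcloud storage buckets update gs://my-bucket \
        --default-encryption-key=projects/my-proj/locations/europe-west4/keyRings/my-kr/cryptoKeys/my-key

    # Re-encrypt existing objects server-side with the new key
    gsutil -m rewrite -r -k projects/my-proj/locations/europe-west4/keyRings/my-kr/cryptoKeys/my-key gs://my-bucket

Because the rewrite happens server side, no data leaves Cloud Storage, which keeps both cost and time low.
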
Questions 59

You want to prevent users from accidentally deleting a Shared VPC host project. Which organization-level policy constraint should you enable?

Options:

A.  

compute.restrictSharedVpcHostProjects

B.  

compute.restrictXpnProjectLienRemoval

C.  

compute.restrictSharedVpcSubnetworks

D.  

compute.sharedReservationsOwnerProjects

Discussion 0
Questions 60

You have numerous private virtual machines on Google Cloud. You occasionally need to manage the servers through Secure Socket Shell (SSH) from a remote location. You want to configure remote access to the servers in a manner that optimizes security and cost efficiency.

What should you do?

Options:

A.  

Create a site-to-site VPN from your corporate network to Google Cloud.

B.  

Configure server instances with public IP addresses. Create a firewall rule to only allow traffic from your corporate IPs.

C.  

Create a firewall rule to allow access from the Identity-Aware Proxy (IAP) IP range. Grant the IAP-secured Tunnel User role to the administrators.

D.  

Create a jump host instance with a public IP. Manage the instances by connecting through the jump host.

Discussion 0
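
The IAP-based approach in option C typically involves a firewall rule for Google's IAP TCP forwarding range, the tunnel-user role, and tunneled SSH; a sketch with placeholder names:

    # Allow SSH only from the IAP TCP forwarding range
    gcloud compute firewall-rules create allow-ssh-from-iap \
        --network=my-vpc \
        --direction=INGRESS \
        --action=ALLOW \
        --rules=tcp:22 \
        --source-ranges=35.235.240.0/20

    # Let administrators open IAP tunnels
    gcloud projects add-iam-policy-binding my-project \
        --member=group:admins@example.com \
        --role=roles/iap.tunnelResourceAccessor

    # Connect to a private VM through the tunnel
    gcloud compute ssh my-private-vm --zone=us-central1-a --tunnel-through-iap
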
Questions 61

Your company’s new CEO recently sold two of the company’s divisions. Your Director asks you to help migrate the Google Cloud projects associated with those divisions to a new organization node. Which preparation steps are necessary before this migration occurs? (Choose two.)

Options:

A.  

Remove all project-level custom Identity and Access Management (IAM) roles.

B.  

Disallow inheritance of organization policies.

C.  

Identify inherited Identity and Access Management (IAM) roles on projects to be migrated.

D.  

Create a new folder for all projects to be migrated.

E.  

Remove the specific migration projects from any VPC Service Controls perimeters and bridges.

Discussion 0
Questions 62

You need to implement an encryption at-rest strategy that reduces key management complexity for non-sensitive data and protects sensitive data while providing the flexibility of controlling the key residency and rotation schedule. FIPS 140-2 L1 compliance is required for all data types. What should you do?

Options:

A.  

Encrypt non-sensitive data and sensitive data with Cloud External Key Manager.

B.  

Encrypt non-sensitive data and sensitive data with Cloud Key Management Service

C.  

Encrypt non-sensitive data with Google default encryption, and encrypt sensitive data with Cloud External Key Manager.

D.  

Encrypt non-sensitive data with Google default encryption, and encrypt sensitive data with Cloud Key Management Service.

Discussion 0
Questions 63

You are part of a security team that wants to ensure that a Cloud Storage bucket in Project A can only be readable from Project B. You also want to ensure that data in the Cloud Storage bucket cannot be accessed from or copied to Cloud Storage buckets outside the network, even if the user has the correct credentials.

What should you do?

Options:

A.  

Enable VPC Service Controls, create a perimeter with Project A and B, and include Cloud Storage service.

B.  

Enable Domain Restricted Sharing Organization Policy and Bucket Policy Only on the Cloud Storage bucket.

C.  

Enable Private Access in Project A and B networks with strict firewall rules to allow communication between the networks.

D.  

Enable VPC Peering between Project A and B networks with strict firewall rules to allow communication between the networks.

Discussion 0
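
A service perimeter covering both projects (option A) can be created along these lines; the access policy ID and project numbers are placeholders:

    # Put Project A and Project B inside one perimeter that restricts Cloud Storage
    gcloud access-context-manager perimeters create storage_perimeter \
        --policy=ACCESS_POLICY_ID \
        --title="storage-perimeter" \
        --resources=projects/PROJECT_A_NUMBER,projects/PROJECT_B_NUMBER \
        --restricted-services=storage.googleapis.com

Data can then move between the two projects, but it cannot be copied to Cloud Storage resources outside the perimeter, even with valid credentials.
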
Questions 64

You need to create a VPC that enables your security team to control network resources such as firewall rules. How should you configure the network to allow for separation of duties for network resources?

Options:

A.  

Set up multiple VPC networks, and set up multi-NIC virtual appliances to connect the networks.

B.  

Set up VPC Network Peering, and allow developers to peer their network with a Shared VPC.

C.  

Set up a VPC in a project. Assign the Compute Network Admin role to the security team, and assign the Compute Admin role to the developers.

D.  

Set up a Shared VPC where the security team manages the firewall rules, and share the network with developers via service projects.

Discussion 0
Questions 65

Your organization operates a hybrid cloud environment and has recently deployed a private Artifact Registry repository in Google Cloud. On-premises developers cannot resolve the Artifact Registry hostname and therefore cannot push or pull artifacts. You've verified the following:

Connectivity to Google Cloud is established by Cloud VPN or Cloud Interconnect.

No custom DNS configurations exist on-premises.

There is no route to the internet from the on-premises network.

You need to identify the cause and enable the developers to push and pull artifacts. What is likely causing the issue and what should you do to fix the issue?

Options:

A.  

Artifact Registry requires external HTTP/HTTPS access. Create a new firewall rule allowing ingress traffic on ports 80 and 443 from the developer's IP ranges.

B.  

Private Google Access is not enabled for the subnet hosting the Artifact Registry. Enable Private Google Access for the appropriate subnet.

C.  

On-premises DNS servers lack the necessary records to resolve private Google API domains. Create DNS records for restricted.googleapis.com or private.googleapis.com pointing to Google's published IP ranges.

D.  

Developers must be granted the artifactregistry.writer IAM role. Grant the relevant developer group this role.

Discussion 0
Questions 66

You need to set up a Cloud Interconnect connection between your company's on-premises data center and your VPC host network. You want to make sure that on-premises applications can only access Google APIs over the Cloud Interconnect and not through the public internet. You are required to only use APIs that are supported by VPC Service Controls to mitigate against exfiltration risk to non-supported APIs. How should you configure the network?

Options:

A.  

Enable Private Google Access on the regional subnets and global dynamic routing mode.

B.  

Set up a Private Service Connect endpoint IP address with the API bundle of "all-apis", which is advertised as a route over the Cloud interconnect connection.

C.  

Use private.googleapis.com to access Google APIs using a set of IP addresses only routable from within Google Cloud, which are advertised as routes over the connection.

D.  

Use restricted.googleapis.com to access Google APIs using a set of IP addresses only routable from within Google Cloud, which are advertised as routes over the Cloud Interconnect connection.

Discussion 0
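
The restricted.googleapis.com approach is usually paired with a private Cloud DNS zone and route advertisement of 199.36.153.4/30 over the Interconnect; a sketch with placeholder names:

    # Private zone that maps *.googleapis.com onto the restricted VIP
    gcloud dns managed-zones create restricted-apis \
        --description="Restricted Google APIs" \
        --dns-name=googleapis.com. \
        --visibility=private \
        --networks=my-vpc

    gcloud dns record-sets create restricted.googleapis.com. --zone=restricted-apis \
        --type=A --ttl=300 --rrdatas=199.36.153.4,199.36.153.5,199.36.153.6,199.36.153.7

    gcloud dns record-sets create "*.googleapis.com." --zone=restricted-apis \
        --type=CNAME --ttl=300 --rrdatas=restricted.googleapis.com.

The 199.36.153.4/30 range is then advertised to on-premises through the Cloud Router so that API calls stay on the Cloud Interconnect connection.
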
Questions 67

You have been tasked with inspecting IP packet data for invalid or malicious content. What should you do?

Options:

A.  

Use Packet Mirroring to mirror traffic to and from particular VM instances. Perform inspection using security software that analyzes the mirrored traffic.

B.  

Enable VPC Flow Logs for all subnets in the VPC. Perform inspection on the Flow Logs data using Cloud Logging.

C.  

Configure the Fluentd agent on each VM instance within the VPC. Perform inspection on the log data using Cloud Logging.

D.  

Configure Google Cloud Armor access logs to perform inspection on the log data.

Discussion 0
Questions 68

You want to limit the images that can be used as the source for boot disks. These images will be stored in a dedicated project.

What should you do?

Options:

A.  

Use the Organization Policy Service to create a compute.trustedImageProjects constraint at the organization level. List the trusted project as an allowed value in an allow operation.

B.  

Use the Organization Policy Service to create a compute.trustedImageProjects constraint at the organization level. List the trusted projects as the exceptions in a deny operation.

C.  

In Resource Manager, edit the project permissions for the trusted project. Add the organization as a member with the role: Compute Image User.

D.  

In Resource Manager, edit the organization permissions. Add the project ID as a member with the role: Compute Image User.

Discussion 0
Questions 69

Your organization recently deployed a new application on Google Kubernetes Engine. You need to deploy a solution to protect the application. The solution has the following requirements:

Scans must run at least once per week

Must be able to detect cross-site scripting vulnerabilities

Must be able to authenticate using Google accounts

Which solution should you use?

Options:

A.  

Google Cloud Armor

B.  

Web Security Scanner

C.  

Security Health Analytics

D.  

Container Threat Detection

Discussion 0
Questions 70

A large e-retailer is moving to Google Cloud Platform with its ecommerce website. The company wants to ensure payment information is encrypted between the customer’s browser and GCP when customers check out online.

What should they do?

Options:

A.  

Configure an SSL Certificate on an L7 Load Balancer and require encryption.

B.  

Configure an SSL Certificate on a Network TCP Load Balancer and require encryption.

C.  

Configure the firewall to allow inbound traffic on port 443, and block all other inbound traffic.

D.  

Configure the firewall to allow outbound traffic on port 443, and block all other outbound traffic.

Discussion 0
Questions 71

Your organization has established a highly sensitive project within a VPC Service Controls perimeter. You need to ensure that only users meeting specific contextual requirements—such as having a company-managed device, a specific location, and a valid user identity—can access resources within this perimeter. You want to evaluate the impact of this change without blocking legitimate access. What should you do?

Options:

A.  

Configure a VPC Service Controls perimeter in dry run mode, and enforce strict network segmentation using firewall rules. Use multi-factor authentication (MFA) for user verification.

B.  

Use the VPC Service Controls violation dashboard to identify the impact and details of access denials by service perimeters.

C.  

Use Cloud Audit Logs to monitor user access to the project resources. Use post-incident analysis to identify unauthorized access attempts.

D.  

Establish a Context-Aware Access policy that specifies the required contextual attributes, and associate the policy with the VPC Service Controls perimeter in dry run mode.

Discussion 0
Questions 72

You need to follow Google-recommended practices to leverage envelope encryption and encrypt data at the application layer.

What should you do?

Options:

A.  

Generate a data encryption key (DEK) locally to encrypt the data, and generate a new key encryption key (KEK) in Cloud KMS to encrypt the DEK. Store both the encrypted data and the encrypted DEK.

B.  

Generate a data encryption key (DEK) locally to encrypt the data, and generate a new key encryption key (KEK) in Cloud KMS to encrypt the DEK. Store both the encrypted data and the KEK.

C.  

Generate a new data encryption key (DEK) in Cloud KMS to encrypt the data, and generate a key encryption key (KEK) locally to encrypt the key. Store both the encrypted data and the encrypted DEK.

D.  

Generate a new data encryption key (DEK) in Cloud KMS to encrypt the data, and generate a key encryption key (KEK) locally to encrypt the key. Store both the encrypted data and the KEK.
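
Note: as background on the envelope-encryption workflow this question describes, here is a minimal Python sketch assuming the google-cloud-kms and cryptography packages; the project, key ring, and key names are hypothetical placeholders, and the AES-256-GCM choice for the local DEK is an assumption for illustration.

    import os

    from cryptography.hazmat.primitives.ciphers.aead import AESGCM
    from google.cloud import kms

    # Hypothetical KEK resource name; replace with a real Cloud KMS key.
    KEK_NAME = "projects/example-project/locations/us/keyRings/app-keyring/cryptoKeys/app-kek"

    def encrypt_envelope(plaintext: bytes) -> dict:
        """Encrypt data locally with a DEK, then wrap the DEK with the KEK in Cloud KMS."""
        dek = AESGCM.generate_key(bit_length=256)   # data encryption key, generated locally
        nonce = os.urandom(12)                      # 96-bit nonce required by AES-GCM
        ciphertext = AESGCM(dek).encrypt(nonce, plaintext, None)

        # Wrap (encrypt) the DEK with the key encryption key held in Cloud KMS.
        client = kms.KeyManagementServiceClient()
        wrapped_dek = client.encrypt(request={"name": KEK_NAME, "plaintext": dek}).ciphertext

        # Store only the encrypted data, the nonce, and the wrapped DEK; never the raw DEK.
        return {"ciphertext": ciphertext, "nonce": nonce, "wrapped_dek": wrapped_dek}

    def decrypt_envelope(blob: dict) -> bytes:
        """Unwrap the DEK with Cloud KMS, then decrypt the data locally."""
        client = kms.KeyManagementServiceClient()
        dek = client.decrypt(
            request={"name": KEK_NAME, "ciphertext": blob["wrapped_dek"]}
        ).plaintext
        return AESGCM(dek).decrypt(blob["nonce"], blob["ciphertext"], None)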

Discussion 0
Questions 73

Your organization is transitioning to Google Cloud. You want to ensure that only trusted container images are deployed on Google Kubernetes Engine (GKE) clusters in a project. The containers must be deployed from a centrally managed Container Registry and signed by a trusted authority.

What should you do?

Choose 2 answers

Options:

A.  

Configure the Binary Authorization policy with respective attestations for the project.

B.  

Create a custom organization policy constraint to enforce Binary Authorization for Google Kubernetes Engine (GKE).

C.  

Enable Container Threat Detection in the Security Command Center (SCC) for the project.

D.  

Configure the trusted image organization policy constraint for the project.

E.  

Enable Pod Security standards and set them to Restricted.

Discussion 0
Questions 74

Your team needs to prevent users from creating projects in the organization. Only the DevOps team should be allowed to create projects on behalf of the requester.

Which two tasks should your team perform to handle this request? (Choose two.)

Options:

A.  

Remove all users from the Project Creator role at the organizational level.

B.  

Create an Organization Policy constraint, and apply it at the organizational level.

C.  

Grant the Project Editor role at the organizational level to a designated group of users.

D.  

Add a designated group of users to the Project Creator role at the organizational level.

E.  

Grant the billing account creator role to the designated DevOps team.

Discussion 0
Questions 75

A customer deploys an application to App Engine and needs to check for Open Web Application Security Project (OWASP) vulnerabilities.

Which service should be used to accomplish this?

Options:

A.  

Cloud Armor

B.  

Google Cloud Audit Logs

C.  

Cloud Security Scanner

D.  

Forseti Security

Discussion 0
Questions 76

Your customer has an on-premises Public Key Infrastructure (PKI) with a certificate authority (CA). You need to issue certificates for many HTTP load balancer frontends. Because the on-premises PKI involves many manual processes, it should be affected as little as possible, and the solution needs to scale.

What should you do?

Options:

A.  

Use Certificate Manager to issue Google-managed public certificates and configure them at the HTTP load balancers in your infrastructure as code (IaC).

B.  

Use Certificate Manager to import certificates issued from the on-premises PKI for the frontends. Use the gcloud tool for importing.

C.  

Use a subordinate CA in Google Certificate Authority Service, chained to the on-premises PKI, to issue certificates for the load balancers.

D.  

Use the web applications with PKCS#12 certificates issued from an OpenSSL-based subordinate CA on-premises. Use the gcloud tool for importing. Use an external TCP/UDP Network Load Balancer instead of an external HTTP Load Balancer.

Discussion 0
Questions 77

Your organization wants to be General Data Protection Regulation (GDPR) compliant. You want to ensure that your DevOps teams can only create Google Cloud resources in the Europe regions.

What should you do?

Options:

A.  

Use the org policy constraint "Restrict Resource Service Usage" on your Google Cloud organization node.

B.  

Use Identity and Access Management (IAM) custom roles to ensure that your DevOps team can only create resources in the Europe regions.

C.  

Use the org policy constraint "Google Cloud Platform - Resource Location Restriction" on your Google Cloud organization node.

D.  

Use Identity-Aware Proxy (IAP) with Access Context Manager to restrict the location of Google Cloud resources.

Discussion 0
Questions 78

You are a security engineer at a finance company. Your organization plans to store data on Google Cloud, but your leadership team is worried about the security of their highly sensitive data. Specifically, your company is concerned about internal Google employees' ability to access your company's data on Google Cloud. What solution should you propose?

Options:

A.  

Use customer-managed encryption keys.

B.  

Use Google's Identity and Access Management (IAM) service to manage access controls on Google Cloud.

C.  

Enable Admin activity logs to monitor access to resources.

D.  

Enable Access Transparency logs with Access Approval requests for Google employees.

Discussion 0
Questions 79

You need to enforce a security policy in your Google Cloud organization that prevents users from exposing objects in their buckets externally. There are currently no buckets in your organization. Which solution should you implement proactively to achieve this goal with the least operational overhead?

Options:

A.  

Create an hourly cron job to run a Cloud Function that finds public buckets and makes them private.

B.  

Enable the constraints/storage.publicAccessPrevention constraint at the organization level.

C.  

Enable the constraints/storage.uniformBucketLevelAccess constraint at the organization level.

D.  

Create a VPC Service Controls perimeter that protects the storage.googleapis.com service in your projects that contains buckets. Add any new project that contains a bucket to the perimeter.
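
Note: for readers who want to see what enforcing an organization policy constraint programmatically can look like, here is a minimal sketch assuming the google-cloud-org-policy client library and a hypothetical organization ID; in practice this is commonly done with gcloud or Terraform instead, and the caller needs the Organization Policy Administrator role.

    from google.cloud import orgpolicy_v2

    # Hypothetical organization ID; replace with your own numeric organization ID.
    ORG = "organizations/123456789012"
    CONSTRAINT = "storage.publicAccessPrevention"

    client = orgpolicy_v2.OrgPolicyClient()

    # Build a boolean policy that enforces the constraint across the organization.
    policy = orgpolicy_v2.Policy(
        name=f"{ORG}/policies/{CONSTRAINT}",
        spec=orgpolicy_v2.PolicySpec(
            rules=[orgpolicy_v2.PolicySpec.PolicyRule(enforce=True)]
        ),
    )

    client.create_policy(parent=ORG, policy=policy)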

Discussion 0
Questions 80

Your organization operates in a highly regulated environment and has a stringent set of compliance requirements for protecting customer data. You must encrypt data while in use to meet regulations. What should you do?

Options:

A.  

Use customer-managed encryption keys (CMEK) and Cloud KMS to enable your organization to control its keys for data encryption in Cloud SQL.

B.  

Enable the use of customer-supplied encryption keys (CSEK) on the Compute Engine VMs to give your organization maximum control over its VM disk encryption.

C.  

Establish a trusted execution environment with a Confidential VM.

D.  

Use a Shielded VM to ensure a secure boot with integrity monitoring for the application environment.
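
Note: as context for the data-in-use requirement, the sketch below shows one way a Confidential VM might be created with the google-cloud-compute library. The project, zone, instance name, and image family are hypothetical assumptions; the image must be Confidential VM-capable and the machine type must support AMD SEV (for example, the N2D family).

    from google.cloud import compute_v1

    # Hypothetical project and zone; replace with real values.
    PROJECT, ZONE = "example-project", "us-central1-a"

    instance = compute_v1.Instance(
        name="confidential-app-vm",
        machine_type=f"zones/{ZONE}/machineTypes/n2d-standard-2",
        # Enables the trusted execution environment (AMD SEV) for data-in-use encryption.
        confidential_instance_config=compute_v1.ConfidentialInstanceConfig(
            enable_confidential_compute=True
        ),
        # Confidential VMs cannot live-migrate, so host maintenance must terminate them.
        scheduling=compute_v1.Scheduling(on_host_maintenance="TERMINATE"),
        disks=[
            compute_v1.AttachedDisk(
                boot=True,
                auto_delete=True,
                initialize_params=compute_v1.AttachedDiskInitializeParams(
                    source_image="projects/ubuntu-os-cloud/global/images/family/ubuntu-2204-lts"
                ),
            )
        ],
        network_interfaces=[compute_v1.NetworkInterface(network="global/networks/default")],
    )

    # Submit the insert operation for the Confidential VM.
    compute_v1.InstancesClient().insert(
        project=PROJECT, zone=ZONE, instance_resource=instance
    )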

Discussion 0
Questions 81

Your organization is using Google Cloud to develop and host its applications. Following Google-recommended practices, the team has created dedicated projects for development and production. Your development team is located in Canada and Germany. The operations team works exclusively from Germany to adhere to local laws. You need to ensure that admin access to Google Cloud APIs is restricted to these countries and environments. What should you do?

Options:

A.  

Create dedicated firewall policies for each environment at the organization level, and then apply these policies to the projects. Create a rule to restrict access based on geolocations.

B.  

Group all development and production projects in separate folders. Activate the organization policy on the folders to restrict resource location according to the requirements.

C.  

Create dedicated VPC Service Controls perimeters for development and production projects. Configure distinct ingress policies to allow access from the respective countries.

D.  

Create dedicated IAM Groups for the Canadian and German developers. Grant access to the development and production projects according to the requirements.

Discussion 0
Questions 82

Your organization has 3 TB of information in BigQuery and Cloud SQL. You need to develop a cost-effective, scalable, and secure strategy to anonymize the personally identifiable information (PII) that exists today. What should you do?

Options:

A.  

Scan your BigQuery and Cloud SQL data using the Cloud DLP data profiling feature. Use the data profiling results to create a de-identification strategy with either Cloud Sensitive Data Protection's de-identification templates or custom configurations.

B.  

Create a new BigQuery dataset and Cloud SQL instance. Copy a small subset of the data to these new locations. Use Cloud Data Loss Prevention API to scan this subset for PII. Based on the results, create a custom anonymization script and apply the script to the entire 3 TB dataset in the original locations.

C.  

Export all 3TB of data from BigQuery and Cloud SQL to Cloud Storage. Use Cloud Sensitive Data Protection to anonymize the exported data. Re-import the anonymized data back into BigQuery and Cloud SQL.

D.  

Inspect a representative sample of the data in BigQuery and Cloud SQL to identify PII. Based on this analysis, develop a custom script to anonymize the identified PII.
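
Note: as background on de-identification with Sensitive Data Protection (Cloud DLP), here is a minimal Python sketch that replaces detected findings in free text with their info-type names, assuming the google-cloud-dlp package; the project ID, info types, and sample text are hypothetical, and a real 3 TB migration would typically use DLP jobs or de-identification templates rather than per-request calls.

    from google.cloud import dlp_v2

    # Hypothetical project ID; the info types and transformation are illustrative.
    PROJECT = "example-project"

    client = dlp_v2.DlpServiceClient()
    response = client.deidentify_content(
        request={
            "parent": f"projects/{PROJECT}/locations/global",
            "inspect_config": {
                "info_types": [{"name": "EMAIL_ADDRESS"}, {"name": "PHONE_NUMBER"}]
            },
            "deidentify_config": {
                "info_type_transformations": {
                    "transformations": [
                        # Replace each finding with its info type, e.g. [EMAIL_ADDRESS].
                        {"primitive_transformation": {"replace_with_info_type_config": {}}}
                    ]
                }
            },
            "item": {"value": "Contact jane.doe@example.com or +1 555-0100"},
        }
    )
    print(response.item.value)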

Discussion 0
Questions 83

You work for a large organization where each business unit has thousands of users. You need to delegate management of access control permissions to each business unit. You have the following requirements:

Each business unit manages access controls for their own projects.

Each business unit manages access control permissions at scale.

Business units cannot access other business units' projects.

Users lose their access if they move to a different business unit or leave the company.

Users and access control permissions are managed by the on-premises directory service.

What should you do? (Choose two.)

Options:

A.  

Use VPC Service Controls to create perimeters around each business unit's project.

B.  

Organize projects in folders, and assign permissions to Google groups at the folder level.

C.  

Group business units based on Organization Units (OUs) and manage permissions based on OUs.

D.  

Create a project naming convention, and use Google's IAM Conditions to manage access based on the prefix of project names.

E.  

Use Google Cloud Directory Sync to synchronize users and group memberships in Cloud Identity.

Discussion 0
Questions 84

Your organization wants to be continuously evaluated against the CIS Google Cloud Computing Foundations Benchmark v1.3.0 (CIS Google Cloud Foundation 1.3). Some of the controls are irrelevant to your organization and must be disregarded during evaluation. You need to create an automated system or process to ensure that only the relevant controls are evaluated.

What should you do?

Options:

A.  

Mark all security findings that are irrelevant with a tag and a value that indicates a security exception. Select all marked findings and mute them in the console every time they appear. Activate Security Command Center (SCC) Premium.

B.  

Activate Security Command Center (SCC) Premium. Create a rule to mute the irrelevant security findings in SCC so they are not evaluated.

C.  

Download all findings from Security Command Center (SCC) to a CSV file. Mark the findings that are part of CIS Google Cloud Foundation 1.3 in the file. Ignore the entries that are irrelevant and out of scope for the company.

D.  

Ask an external audit company to provide independent reports that include the needed CIS benchmarks. In the scope of the audit, clarify that some of the controls are not needed and must be disregarded.

Discussion 0
Questions 85

You are responsible for the operation of your company's application that runs on Google Cloud. The database for the application will be maintained by an external partner. You need to give the partner team access to the database. This access must be restricted solely to the database and cannot extend to any other resources within your company's network. Your solution should follow Google-recommended practices. What should you do?

Options:

A.  

Add a public IP address to the application's database. Create database users for each of the partner’s employees. Securely distribute the credentials for these users to the partner team.

B.  

Ask the partner team to set up Cloud Identity accounts within their own corporate environment and identity provider. Grant the partner's Cloud Identity accounts access to the database.

C.  

Create accounts for the partner team in your corporate identity provider. Synchronize these accounts with Google Cloud Identity. Grant the accounts access to the database.

D.  

Configure Workforce Identity Federation for the partner. Connect the identity pool provider to the partner’s identity provider. Grant the workforce pool resources access to the database.

Discussion 0
Questions 86

Your company’s cloud security policy dictates that VM instances should not have an external IP address. You need to identify the Google Cloud service that will allow VM instances without external IP addresses to connect to the internet to update the VMs. Which service should you use?

Options:

A.  

Identity Aware-Proxy

B.  

Cloud NAT

C.  

TCP/UDP Load Balancing

D.  

Cloud DNS

Discussion 0
Questions 87

You are working with a client who plans to migrate their data to Google Cloud. You are responsible for recommending an encryption service to manage their encryption keys. You have the following requirements:

The master key must be rotated at least once every 45 days.

The solution that stores the master key must be FIPS 140-2 Level 3 validated.

The master key must be stored in multiple regions within the US for redundancy.

Which solution meets these requirements?

Options:

A.  

Customer-managed encryption keys with Cloud Key Management Service

B.  

Customer-managed encryption keys with Cloud HSM

C.  

Customer-supplied encryption keys

D.  

Google-managed encryption keys
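
Note: as background on the key requirements in this question, the sketch below shows how a Cloud KMS key with HSM protection, a 45-day rotation period, and a US multi-region location might be created with the google-cloud-kms library; the project, key ring, and key names are hypothetical placeholders.

    import time

    from google.cloud import kms

    # Hypothetical key ring in the "us" multi-region location for redundancy.
    PARENT = "projects/example-project/locations/us/keyRings/payments-keyring"
    ROTATION_SECONDS = 45 * 24 * 60 * 60  # rotate at least every 45 days

    client = kms.KeyManagementServiceClient()
    key = client.create_crypto_key(
        request={
            "parent": PARENT,
            "crypto_key_id": "payments-master-key",
            "crypto_key": {
                "purpose": kms.CryptoKey.CryptoKeyPurpose.ENCRYPT_DECRYPT,
                "version_template": {
                    "algorithm": kms.CryptoKeyVersion.CryptoKeyVersionAlgorithm.GOOGLE_SYMMETRIC_ENCRYPTION,
                    # HSM protection backs the key with FIPS 140-2 Level 3 validated hardware.
                    "protection_level": kms.ProtectionLevel.HSM,
                },
                "rotation_period": {"seconds": ROTATION_SECONDS},
                "next_rotation_time": {"seconds": int(time.time()) + ROTATION_SECONDS},
            },
        }
    )
    print(key.name)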

Discussion 0
Questions 88

In an effort for your company's messaging app to comply with FIPS 140-2, a decision was made to use GCP compute and network services. The messaging app architecture includes a Managed Instance Group (MIG) that controls a cluster of Compute Engine instances. The instances use Local SSDs for data caching and UDP for instance-to-instance communications. The app development team is willing to make any changes necessary to comply with the standard.

Which options should you recommend to meet the requirements?

Options:

A.  

Encrypt all cache storage and VM-to-VM communication using the BoringCrypto module.

B.  

Set Disk Encryption on the Instance Template used by the MIG to customer-managed key and use BoringSSL for all data transit between instances.

C.  

Change the app instance-to-instance communications from UDP to TCP and enable BoringSSL on clients' TLS connections.

D.  

Set Disk Encryption on the Instance Template used by the MIG to Google-managed Key and use BoringSSL library on all instance-to-instance communications.

Discussion 0
Questions 89

You are a security administrator at your company. Per Google-recommended best practices, you implemented the domain restricted sharing organization policy to allow only required domains to access your projects. An engineering team is now reporting that users at an external partner outside your organization domain cannot be granted access to the resources in a project. How should you make an exception for your partner's domain while following the stated best practices?

Options:

A.  

Turn off the domain restricted sharing organization policy. Set the policy value to "Allow All."

B.  

Turn off the domain restricted sharing organization policy. Provide the external partners with the required permissions using Google's Identity and Access Management (IAM) service.

C.  

Turn off the domain restricted sharing organization policy. Add each partner's Google Workspace customer ID to a Google group, add the Google group as an exception under the organization policy, and then turn the policy back on.

D.  

Turn off the domain restricted sharing organization policy. Set the policy value to "Custom." Add each external partner's Cloud Identity or Google Workspace customer ID as an exception under the organization policy, and then turn the policy back on.

Discussion 0
Questions 90

You are a Cloud Identity administrator for your organization. In your Google Cloud environment, groups are used to manage user permissions. Each application team has a dedicated group. Your team is responsible for creating these groups, and the application teams can manage the team members on their own through the Google Cloud console. You must ensure that the application teams can only add users from within your organization to their groups.

What should you do?

Options:

A.  

Change the configuration of the relevant groups in the Google Workspace Admin console to prevent external users from being added to the group.

B.  

Set an Identity and Access Management (IAM) policy that includes a condition that restricts group membership to user principals that belong to your organization.

C.  

Define an Identity and Access Management (IAM) deny policy that denies the assignment of principals that are outside your organization to the groups in scope.

D.  

Export the Cloud Identity logs to BigQuery. Configure an alert for external members added to groups. Have the alert trigger a Cloud Function instance that removes the external members from the group.

Discussion 0
Questions 91

Your company runs a website that will store PII on Google Cloud Platform. To comply with data privacy regulations, this data can only be stored for a specific amount of time and must be fully deleted after this specific period. Data that has not yet reached the time period should not be deleted. You want to automate the process of complying with this regulation.

What should you do?

Options:

A.  

Store the data in a single Persistent Disk, and delete the disk at expiration time.

B.  

Store the data in a single BigQuery table and set the appropriate table expiration time.

C.  

Store the data in a Cloud Storage bucket, and configure the bucket's Object Lifecycle Management feature.

D.  

Store the data in a single BigTable table and set an expiration time on the column families.
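
Note: as an illustration of the Object Lifecycle Management approach mentioned in option C, here is a minimal Python sketch using the google-cloud-storage library; the bucket name and retention period are hypothetical placeholders, and the age-based delete rule only removes objects older than the configured limit.

    from google.cloud import storage

    # Hypothetical bucket and retention period; the regulation defines the real value.
    RETENTION_DAYS = 365

    client = storage.Client()
    bucket = client.get_bucket("example-pii-bucket")

    # Add a lifecycle rule that deletes each object once it is older than the limit,
    # leaving younger objects untouched, then persist the change to the bucket.
    bucket.add_lifecycle_delete_rule(age=RETENTION_DAYS)
    bucket.patch()

    print(list(bucket.lifecycle_rules))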

Discussion 0
Questions 92

You are deploying regulated workloads on Google Cloud. The regulation has data residency and data access requirements. It also requires that support is provided from the same geographical location as where the data resides.

What should you do?

Options:

A.  

Enable Access Transparency Logging.

B.  

Deploy resources only to regions permitted by data residency requirements.

C.  

Use Data Access logging and Access Transparency logging to confirm that no users are accessing data from another region.

D.  

Deploy Assured Workloads.

Discussion 0
Questions 93

You are a Security Administrator at your organization. You need to restrict service account creation capability within production environments. You want to accomplish this centrally across the organization. What should you do?

Options:

A.  

Use Identity and Access Management (IAM) to restrict access of all users and service accounts that have access to the production environment.

B.  

Use organization policy constraints/iam.disableServiceAccountKeyCreation boolean to disable the creation of new service accounts.

C.  

Use organization policy constraints/iam.disableServiceAccountKeyUpload boolean to disable the creation of new service accounts.

D.  

Use organization policy constraints/iam.disableServiceAccountCreation boolean to disable the creation of new service accounts.

Discussion 0
Questions 94

A company is deploying their application on Google Cloud Platform. Company policy requires long-term data to be stored using a solution that can automatically replicate data across at least two geographic locations.

Which Storage solution are they allowed to use?

Options:

A.  

Cloud Bigtable

B.  

Cloud BigQuery

C.  

Compute Engine SSD Disk

D.  

Compute Engine Persistent Disk

Discussion 0
Questions 95

You are setting up Cloud Identity for your company's Google Cloud organization. User accounts will be provisioned from Microsoft Entra ID through Directory Sync, and there will be single sign-on through Entra ID. You need to secure the super administrator accounts for the organization. Your solution must follow the principle of least privilege and implement strong authentication. What should you do?

Options:

A.  

Create dedicated accounts for super administrators. Ensure that 2-step verification is enforced for the super administrator accounts in Entra ID.

B.  

Create dedicated accounts for super administrators. Enforce Google 2-step verification for the super administrator accounts.

C.  

Create accounts that combine the organization administrator and the super administrator privileges. Ensure that 2-step verification is enforced for the super administrator accounts in Entra ID.

D.  

Create accounts that combine the organization administrator and the super administrator privileges. Enforce Google 2-step verification for the super administrator accounts.

Discussion 0