
ExamsBrite Dumps

Google Cloud Certified - Professional Cloud Security Engineer Question and Answers

Google Cloud Certified - Professional Cloud Security Engineer

Last Update Dec 14, 2024
Total Questions: 234

We are offering FREE Professional-Cloud-Security-Engineer Google exam questions. All you need to do is sign up, provide your details, and prepare with the free Professional-Cloud-Security-Engineer exam questions before moving on to the complete pool of Google Cloud Certified - Professional Cloud Security Engineer test questions.

Questions 1

You are on your company's development team. You noticed that your web application hosted in staging on GKE dynamically includes user data in web pages without first properly validating the inputted data. This could allow an attacker to execute gibberish commands and display arbitrary content in a victim user's browser in a production environment.

How should you prevent and fix this vulnerability?

Options:

A.  

Use Cloud IAP based on IP address or end-user device attributes to prevent and fix the vulnerability.

B.  

Set up an HTTPS load balancer, and then use Cloud Armor for the production environment to prevent the potential XSS attack.

C.  

Use Web Security Scanner to validate the usage of an outdated library in the code, and then use a secured version of the included library.

D.  

Use Web Security Scanner in staging to simulate an XSS injection attack, and then use a templating system that supports contextual auto-escaping.

Discussion 0
Questions 2

You need to enforce a security policy in your Google Cloud organization that prevents users from exposing objects in their buckets externally. There are currently no buckets in your organization. Which solution should you implement proactively to achieve this goal with the least operational overhead?

Options:

A.  

Create an hourly cron job to run a Cloud Function that finds public buckets and makes them private.

B.  

Enable the constraints/storage.publicAccessPrevention constraint at the organization level.

C.  

Enable the constraints/storage.uniformBucketLevelAccess constraint at the organization level.

D.  

Create a VPC Service Controls perimeter that protects the storage.googleapis.com service in your projects that contains buckets. Add any new project that contains a bucket to the perimeter.

Discussion 0
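For reference, public access prevention is a boolean organization policy constraint and can be enforced organization-wide with a single command. A minimal sketch, assuming you hold the Organization Policy Administrator role; ORGANIZATION_ID is a placeholder:

gcloud resource-manager org-policies enable-enforce \
    constraints/storage.publicAccessPrevention \
    --organization=ORGANIZATION_ID

Because the policy is set at the organization node, any bucket created later inherits it with no per-bucket configuration.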
Questions 3

Applications often require access to “secrets”, which are small pieces of sensitive data needed at build or run time. The administrator managing these secrets on GCP wants to keep track of “who did what, where, and when?” within their GCP projects.

Which two log streams would provide the information that the administrator is looking for? (Choose two.)

Options:

A.  

Admin Activity logs

B.  

System Event logs

C.  

Data Access logs

D.  

VPC Flow logs

E.  

Agent logs

Discussion 0
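As context for the audit log streams listed above, both Admin Activity and Data Access logs can be queried from Cloud Logging. A minimal sketch, assuming a placeholder project ID and that Data Access audit logs have been enabled for the services in question:

# Admin Activity audit logs ("who changed what")
gcloud logging read 'logName="projects/PROJECT_ID/logs/cloudaudit.googleapis.com%2Factivity"' --project=PROJECT_ID --limit=10

# Data Access audit logs ("who read what")
gcloud logging read 'logName="projects/PROJECT_ID/logs/cloudaudit.googleapis.com%2Fdata_access"' --project=PROJECT_ID --limit=10

Admin Activity logs are always on; Data Access logs (other than for BigQuery) must be explicitly enabled.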
Questions 4

Your organization acquired a new workload. The Web and Application (App) servers will be running on Compute Engine in a newly created custom VPC. You are responsible for configuring a secure network communication solution that meets the following requirements:

Only allows communication between the Web and App tiers.

Enforces consistent network security when autoscaling the Web and App tiers.

Prevents Compute Engine Instance Admins from altering network traffic.

What should you do?

Options:

A.  

1. Configure all running Web and App servers with respective network tags.

2. Create an allow VPC firewall rule that specifies the target/source with respective network tags.

B.  

1. Configure all running Web and App servers with respective service accounts.

2. Create an allow VPC firewall rule that specifies the target/source with respective service accounts.

C.  

1. Re-deploy the Web and App servers with instance templates configured with respective network tags.

2. Create an allow VPC firewall rule that specifies the target/source with respective network tags.

D.  

1. Re-deploy the Web and App servers with instance templates configured with respective service accounts.

2. Create an allow VPC firewall rule that specifies the target/source with respective service accounts.

Discussion 0
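To illustrate the service-account-based rules mentioned in the options above: firewall rules that filter on service accounts survive autoscaling, because every instance created from the template carries the same identity. A minimal sketch with placeholder project, network, service account, and port values:

gcloud compute firewall-rules create allow-web-to-app \
    --network=prod-vpc \
    --direction=INGRESS \
    --action=ALLOW \
    --rules=tcp:8443 \
    --source-service-accounts=web-sa@PROJECT_ID.iam.gserviceaccount.com \
    --target-service-accounts=app-sa@PROJECT_ID.iam.gserviceaccount.com

Unlike network tags, the attached service account cannot be changed on a running VM without stopping it and having permission to act as the new service account.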
Questions 5

Your organization hosts a financial services application running on Compute Engine instances for a third-party company. The third-party company’s servers that will consume the application also run on Compute Engine in a separate Google Cloud organization. You need to configure a secure network connection between the Compute Engine instances. You have the following requirements:

    The network connection must be encrypted.

    The communication between servers must be over private IP addresses.

What should you do?

Options:

A.  

Configure a Cloud VPN connection between your organization's VPC network and the third party's that is controlled by VPC firewall rules.

B.  

Configure a VPC peering connection between your organization's VPC network and the third party's that is controlled by VPC firewall rules.

C.  

Configure a VPC Service Controls perimeter around your Compute Engine instances, and provide access to the third party via an access level.

D.  

Configure an Apigee proxy that exposes your Compute Engine-hosted application as an API, and is encrypted with TLS which allows access only to the third party.

Discussion 0
Questions 6

A customer wants to make it convenient for their mobile workforce to access a CRM web interface that is hosted on Google Cloud Platform (GCP). The CRM can only be accessed by someone on the corporate network. The customer wants to make it available over the internet. Your team requires an authentication layer in front of the application that supports two-factor authentication.

Which GCP product should the customer implement to meet these requirements?

Options:

A.  

Cloud Identity-Aware Proxy

B.  

Cloud Armor

C.  

Cloud Endpoints

D.  

Cloud VPN

Discussion 0
Questions 7

Last week, a company deployed a new App Engine application that writes logs to BigQuery. No other workloads are running in the project. You need to validate that all data written to BigQuery was done using the App Engine Default Service Account.

What should you do?

Options:

A.  

1. Use StackDriver Logging and filter on BigQuery Insert Jobs.

2. Click on the email address in line with the App Engine Default Service Account in the authentication field.

3. Click Hide Matching Entries.

4. Make sure the resulting list is empty.

B.  

1. Use StackDriver Logging and filter on BigQuery Insert Jobs.

2. Click on the email address in line with the App Engine Default Service Account in the authentication field.

3. Click Show Matching Entries.

4. Make sure the resulting list is empty.

C.  

1. In BigQuery, select the related dataset.

2. Make sure the App Engine Default Service Account is the only account that can write to the dataset.

D.  

1. Go to the IAM section on the project.

2. Validate that the App Engine Default Service Account is the only account that has a role that can write to BigQuery.

Discussion 0
Questions 8

Your Google Cloud organization allows for administrative capabilities to be distributed to each team through provisioning of a Google Cloud project with the Owner role (roles/owner). The organization contains thousands of Google Cloud projects. Security Command Center Premium has surfaced multiple open_mysql_port findings. You are enforcing guardrails and need to prevent these types of common misconfigurations.

What should you do?

Options:

A.  

Create a firewall rule for each virtual private cloud (VPC) to deny traffic from 0.0.0.0/0 with priority 0.

B.  

Create a hierarchical firewall policy configured at the organization to deny all connections from 0.0.0.0/0.

C.  

Create a Google Cloud Armor security policy to deny traffic from 0.0.0.0/0.

D.  

Create a hierarchical firewall policy configured at the organization to allow connections only from internal IP ranges.

Discussion 0
Questions 9

You want to make sure that your organization’s Cloud Storage buckets cannot have data publicly available to the internet. You want to enforce this across all Cloud Storage buckets. What should you do?

Options:

A.  

Remove Owner roles from end users, and configure Cloud Data Loss Prevention.

B.  

Remove Owner roles from end users, and enforce domain restricted sharing in an organization policy.

C.  

Configure uniform bucket-level access, and enforce domain restricted sharing in an organization policy.

D.  

Remove *.setIamPolicy permissions from all roles, and enforce domain restricted sharing in an organization policy.

Discussion 0
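For context on the option that combines uniform bucket-level access with domain restricted sharing, both controls can be applied with gcloud. A minimal sketch using placeholder bucket, customer ID, and organization ID values:

# Switch an existing bucket to uniform bucket-level access (disables object ACLs)
gcloud storage buckets update gs://BUCKET_NAME --uniform-bucket-level-access

# Restrict IAM bindings to identities from an allowed Workspace/Cloud Identity customer ID
gcloud resource-manager org-policies allow \
    constraints/iam.allowedPolicyMemberDomains CUSTOMER_ID \
    --organization=ORGANIZATION_ID

Together these prevent object ACL sprawl and block grants to allUsers or to identities outside the allowed domains.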
Questions 10

You are routing all your internet facing traffic from Google Cloud through your on-premises internet connection. You want to accomplish this goal securely and with the highest bandwidth possible.

What should you do?

Options:

A.  

Create an HA VPN connection to Google Cloud. Replace the default 0.0.0.0/0 route.

B.  

Create a routing VM in Compute Engine. Configure the default route with the VM as the next hop.

C.  

Configure Cloud Interconnect with HA VPN. Replace the default 0.0.0.0/0 route to an on-premises destination.

D.  

Configure Cloud Interconnect and route traffic through an on-premises firewall.

Discussion 0
Questions 11

Your company has been creating users manually in Cloud Identity to provide access to Google Cloud resources. Due to continued growth of the environment, you want to authorize the Google Cloud Directory Sync (GCDS) instance and integrate it with your on-premises LDAP server to onboard hundreds of users. You are required to:

Replicate user and group lifecycle changes from the on-premises LDAP server in Cloud Identity.

Disable any manually created users in Cloud Identity.

You have already configured the LDAP search attributes to include the users and security groups in scope for Google Cloud. What should you do next to complete this solution?

Options:

A.  

1. Configure the option to suspend domain users not found in LDAP.

2. Set up a recurring GCDS task.

B.  

1. Configure the option to delete domain users not found in LDAP.

2. Run GCDS after user and group lifecycle changes.

C.  

1. Configure the LDAP search attributes to exclude manually created Cloud Identity users not found in LDAP.

2. Set up a recurring GCDS task.

D.  

1. Configure the LDAP search attributes to exclude manually created Cloud identity users not found in LDAP.

2. Run GCDS after user and group lifecycle changes.

Discussion 0
Questions 12

You are backing up application logs to a shared Cloud Storage bucket that is accessible to both the administrator and analysts. Analysts should not have access to logs that contain any personally identifiable information (PII). Log files containing PII should be stored in another bucket that is only accessible to the administrator. What should you do?

Options:

A.  

Upload the logs to both the shared bucket and the bucket with PII that is only accessible to the administrator. Use the Cloud Data Loss Prevention API to create a job trigger. Configure the trigger to delete any files that contain PII from the shared bucket.

B.  

On the shared bucket, configure Object Lifecycle Management to delete objects that contain PII.

C.  

On the shared bucket, configure a Cloud Storage trigger that is only triggered when PII is uploaded. Use Cloud Functions to capture the trigger and delete the files that contain PII.

D.  

Use Pub/Sub and Cloud Functions to trigger a Cloud Data Loss Prevention scan every time a file is uploaded to the administrator's bucket. If the scan does not detect PII, have the function move the objects into the shared Cloud Storage bucket.

Discussion 0
Questions 13

You are migrating an on-premises data warehouse to BigQuery, Cloud SQL, and Cloud Storage. You need to configure security services in the data warehouse. Your company compliance policies mandate that the data warehouse must:

• Protect data at rest with full lifecycle management on cryptographic keys

• Implement a separate key management provider from data management

• Provide visibility into all encryption key requests

What services should be included in the data warehouse implementation?

Choose 2 answers

Options:

A.  

Customer-managed encryption keys

B.  

Customer-Supplied Encryption Keys

C.  

Key Access Justifications

D.  

Access Transparency and Approval

E.  

Cloud External Key Manager

Discussion 0
Questions 14

An administrative application is running on a virtual machine (VM) in a managed instance group at port 5601 inside a Virtual Private Cloud (VPC) that currently has no access to the internet. You want to expose the web interface at port 5601 to users and enforce authentication and authorization with Google credentials.

What should you do?

Options:

A.  

Modify the VPC routing so that the default route points to the default internet gateway. Modify the VPC firewall rule to allow access from the internet (0.0.0.0/0) to port 5601 on the application instance.

B.  

Configure a bastion host with OS Login enabled, and allow connections to port 5601 in the VPC firewall. Log in to the bastion host from the Google Cloud console by using SSH-in-browser, and then connect to the web application.

C.  

Configure an HTTP load balancer that points to the managed instance group, with Identity-Aware Proxy (IAP) protection using Google credentials. Modify the VPC firewall to allow access from the IAP network range.

D.  

Configure a Secure Shell (SSH) bastion host in a public network, and allow only the bastion host to connect to the application on port 5601. Use the bastion host as a jump host to connect to the application.

Discussion 0
Questions 15

Your company is storing sensitive data in Cloud Storage. You want a key generated on-premises to be used in the encryption process.

What should you do?

Options:

A.  

Use the Cloud Key Management Service to manage a data encryption key (DEK).

B.  

Use the Cloud Key Management Service to manage a key encryption key (KEK).

C.  

Use customer-supplied encryption keys to manage the data encryption key (DEK).

D.  

Use customer-supplied encryption keys to manage the key encryption key (KEK).

Discussion 0
Questions 16

Your organization has on-premises hosts that need to access Google Cloud APIs. You must enforce private connectivity between these hosts, minimize costs, and optimize for operational efficiency.

What should you do?

Options:

A.  

Route all on-premises traffic to Google Cloud through an IPsec VPN tunnel to a VPC with Private Google Access enabled.

B.  

Set up VPC peering between the hosts on-premises and the VPC through the internet.

C.  

Enforce a security policy that mandates all applications encrypt data with a Cloud Key Management Service (KMS) key before sending it over the network.

D.  

Route all on-premises traffic to Google Cloud through a Dedicated Interconnect or Partner Interconnect to a VPC with Private Google Access enabled.

Discussion 0
Questions 17

Your team sets up a Shared VPC Network where project co-vpc-prod is the host project. Your team has configured the firewall rules, subnets, and VPN gateway on the host project. They need to enable Engineering Group A to attach a Compute Engine instance to only the 10.1.1.0/24 subnet.

What should your team grant to Engineering Group A to meet this requirement?

Options:

A.  

Compute Network User Role at the host project level.

B.  

Compute Network User Role at the subnet level.

C.  

Compute Shared VPC Admin Role at the host project level.

D.  

Compute Shared VPC Admin Role at the service project level.

Discussion 0
Questions 18

A company’s application is deployed with a user-managed Service Account key. You want to use Google-recommended practices to rotate the key.

What should you do?

Options:

A.  

Open Cloud Shell and run gcloud iam service-accounts enable-auto-rotate --iam-account=IAM_ACCOUNT.

B.  

Open Cloud Shell and run gcloud iam service-accounts keys rotate --iam-account=IAM_ACCOUNT --key=NEW_KEY.

C.  

Create a new key, and use the new key in the application. Delete the old key from the Service Account.

D.  

Create a new key, and use the new key in the application. Store the old key on the system as a backup key.

Discussion 0
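As a reference for the manual rotation flow mentioned in the options above, user-managed keys are rotated by creating a new key, switching the application to it, and then removing the old key. A minimal sketch with placeholder service account and key ID values:

# Create a new key and deploy it to the application
gcloud iam service-accounts keys create new-key.json \
    --iam-account=my-app-sa@PROJECT_ID.iam.gserviceaccount.com

# List existing keys to find the ID of the old one
gcloud iam service-accounts keys list \
    --iam-account=my-app-sa@PROJECT_ID.iam.gserviceaccount.com

# Delete the old key once the application is using the new one
gcloud iam service-accounts keys delete OLD_KEY_ID \
    --iam-account=my-app-sa@PROJECT_ID.iam.gserviceaccount.com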
Questions 19

You are exporting application logs to Cloud Storage. You encounter an error message that the log sinks don't support uniform bucket-level access policies. How should you resolve this error?

Options:

A.  

Change the access control model for the bucket

B.  

Update your sink with the correct bucket destination.

C.  

Add the roles/logging.logWriter Identity and Access Management (IAM) role to the bucket for the log sink identity.

D.  

Add the roles/logging.bucketWriter Identity and Access Management (IAM) role to the bucket for the log sink identity.

Discussion 0
Questions 20

You need to use Cloud External Key Manager to create an encryption key to encrypt specific BigQuery data at rest in Google Cloud. Which steps should you do first?

Options:

A.  

1. Create or use an existing key with a unique uniform resource identifier (URI) in your Google Cloud project.

2. Grant your Google Cloud project access to a supported external key management partner system.

B.  

1. Create or use an existing key with a unique uniform resource identifier (URI) in Cloud Key Management Service (Cloud KMS).

2. In Cloud KMS, grant your Google Cloud project access to use the key.

C.  

1. Create or use an existing key with a unique uniform resource identifier (URI) in a supported external key management partner system.

2. In the external key management partner system, grant access for this key to use your Google Cloud project.

D.  

1. Create an external key with a unique uniform resource identifier (URI) in Cloud Key Management Service (Cloud KMS).

2. In Cloud KMS, grant your Google Cloud project access to use the key.

Discussion 0
Questions 21

You are a consultant for an organization that is considering migrating their data from its private cloud to Google Cloud. The organization’s compliance team is not familiar with Google Cloud and needs guidance on how compliance requirements will be met on Google Cloud. One specific compliance requirement is for customer data at rest to reside within specific geographic boundaries. Which option should you recommend for the organization to meet their data residency requirements on Google Cloud?

Options:

A.  

Organization Policy Service constraints

B.  

Shielded VM instances

C.  

Access control lists

D.  

Geolocation access controls

E.  

Google Cloud Armor

Discussion 0
Questions 22

You are deploying regulated workloads on Google Cloud. The regulation has data residency and data access requirements. It also requires that support is provided from the same geographical location as where the data resides.

What should you do?

Options:

A.  

Enable Access Transparency Logging.

B.  

Deploy resources only to regions permitted by data residency requirements.

C.  

Use Data Access logging and Access Transparency logging to confirm that no users are accessing data from another region.

D.  

Deploy Assured Workloads.

Discussion 0
Questions 23

You are developing a new application that uses exclusively Compute Engine VMs. Once a day, this application will execute five different batch jobs. Each of the batch jobs requires a dedicated set of permissions on Google Cloud resources outside of your application. You need to design a secure access concept for the batch jobs that adheres to the least-privilege principle.

What should you do?

Options:

A.  

1. Create a general service account "g-sa" to execute the batch jobs.

2. Grant the permissions required to execute the batch jobs to g-sa.

3. Execute the batch jobs with the permissions granted to g-sa.

B.  

1. Create a general service account "g-sa" to orchestrate the batch jobs.

2. Create one service account per batch job "b-sa-[1-5]", and grant only the permissions required to run the individual batch jobs to the service accounts.

3. Grant the Service Account Token Creator role to g-sa. Use g-sa to obtain short-lived access tokens for b-sa-[1-5] and to execute the batch jobs with the permissions of b-sa-[1-5].

C.  

1. Create a workload identity pool and configure workload identity pool providers for each batch job.

2. Assign the Workload Identity User role to each of the identities configured in the providers.

3. Create one service account per batch job "b-sa-[1-5]", and grant only the permissions required to run the individual batch jobs to the service accounts.

4. Generate credential configuration files for each of the providers.

D.  

1. Create a general service account "g-sa" to orchestrate the batch jobs.

2. Create one service account per batch job "b-sa-[1-5]". Grant only the permissions required to run the individual batch jobs to the service accounts, and generate service account keys for each of these service accounts.

3. Store the service account keys in Secret Manager. Grant g-sa access to Secret Manager, and run the batch jobs with the permissions of b-sa-[1-5].

Discussion 0
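For background on the orchestrator pattern with per-job service accounts referenced above: a service account that holds the Service Account Token Creator role on another service account can mint short-lived credentials for it without any long-lived keys. A minimal sketch using the hypothetical account names from the options:

# Allow g-sa to impersonate the per-job account b-sa-1
gcloud iam service-accounts add-iam-policy-binding \
    b-sa-1@PROJECT_ID.iam.gserviceaccount.com \
    --member=serviceAccount:g-sa@PROJECT_ID.iam.gserviceaccount.com \
    --role=roles/iam.serviceAccountTokenCreator

# From the orchestrator, obtain a short-lived token as b-sa-1
gcloud auth print-access-token \
    --impersonate-service-account=b-sa-1@PROJECT_ID.iam.gserviceaccount.com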
Questions 24

You are in charge of creating a new Google Cloud organization for your company. Which two actions should you take when creating the super administrator accounts? (Choose two.)

Options:

A.  

Create an access level in the Google Admin console to prevent super admin from logging in to Google Cloud.

B.  

Disable any Identity and Access Management (IAM) roles for super admin at the organization level in the Google Cloud Console.

C.  

Use a physical token to secure the super admin credentials with multi-factor authentication (MFA).

D.  

Use a private connection to create the super admin accounts to avoid sending your credentials over the Internet.

E.  

Provide non-privileged identities to the super admin users for their day-to-day activities.

Discussion 0
Questions 25

You are a security administrator at your company. Per Google-recommended best practices, you implemented the domain restricted sharing organization policy to allow only required domains to access your projects. An engineering team is now reporting that users at an external partner outside your organization domain cannot be granted access to the resources in a project. How should you make an exception for your partner's domain while following the stated best practices?

Options:

A.  

Turn off the domain restricted sharing organization policy. Set the policy value to "Allow All."

B.  

Turn off the domain restricted sharing organization policy. Provide the external partners with the required permissions using Google's Identity and Access Management (IAM) service.

C.  

Turn off the domain restricted sharing organization policy. Add each partner's Google Workspace customer ID to a Google group, add the Google group as an exception under the organization policy, and then turn the policy back on.

D.  

Turn off the domain restricted sharing organization policy. Set the policy value to "Custom." Add each external partner's Cloud Identity or Google Workspace customer ID as an exception under the organization policy, and then turn the policy back on.

Discussion 0
Questions 26

You are part of a security team investigating a compromised service account key. You need to audit which new resources were created by the service account.

What should you do?

Options:

A.  

Query Data Access logs.

B.  

Query Admin Activity logs.

C.  

Query Access Transparency logs.

D.  

Query Stackdriver Monitoring Workspace.

Discussion 0
Questions 27

You are migrating an application into the cloud. The application will need to read data from a Cloud Storage bucket. Due to local regulatory requirements, you need to hold the key material used for encryption fully under your control, and you require a valid rationale for accessing the key material.

What should you do?

Options:

A.  

Encrypt the data in the Cloud Storage bucket by using Customer Managed Encryption Keys. Configure an IAM deny policy for unauthorized groups.

B.  

Encrypt the data in the Cloud Storage bucket by using Customer Managed Encryption Keys backed by a Cloud Hardware Security Module (HSM). Enable data access logs.

C.  

Generate a key in your on-premises environment and store it in a Hardware Security Module (HSM) that is managed on-premises. Use this key as an external key in the Cloud Key Management Service (KMS). Activate Key Access Justifications (KAJ) and set the external key system to reject unauthorized accesses.

D.  

Generate a key in your on-premises environment to encrypt the data before you upload the data to the Cloud Storage bucket. Upload the key to the Cloud Key Management Service (KMS). Activate Key Access Justifications (KAJ) and have the external key system reject unauthorized accesses.

Discussion 0
Questions 28

An organization receives an increasing number of phishing emails.

Which method should be used to protect employee credentials in this situation?

Options:

A.  

Multifactor Authentication

B.  

A strict password policy

C.  

Captcha on login pages

D.  

Encrypted emails

Discussion 0
Questions 29

You are setting up a new Cloud Storage bucket in your environment that is encrypted with a customer-managed encryption key (CMEK). The CMEK is stored in Cloud Key Management Service (KMS) in project "prj-a", and the Cloud Storage bucket will use project "prj-b". The key is backed by a Cloud Hardware Security Module (HSM) and resides in the region europe-west3. Your storage bucket will be located in the region europe-west1. When you create the bucket, you cannot access the key, and you need to troubleshoot why.

What has caused the access issue?

Options:

A.  

A firewall rule prevents the key from being accessible.

B.  

Cloud HSM does not support Cloud Storage.

C.  

The CMEK is in a different project than the Cloud Storage bucket.

D.  

The CMEK is in a different region than the Cloud Storage bucket.

Discussion 0
Questions 30

A customer terminates an engineer and needs to make sure the engineer's Google account is automatically deprovisioned.

What should the customer do?

Options:

A.  

Use the Cloud SDK with their directory service to remove their IAM permissions in Cloud Identity.

B.  

Use the Cloud SDK with their directory service to provision and deprovision users from Cloud Identity.

C.  

Configure Cloud Directory Sync with their directory service to provision and deprovision users from Cloud Identity.

D.  

Configure Cloud Directory Sync with their directory service to remove their IAM permissions in Cloud Identity.

Discussion 0
Questions 31

Your company is moving to Google Cloud. You plan to sync your users first by using Google Cloud Directory Sync (GCDS). Some employees have already created Google Cloud accounts by using their company email addresses that were created outside of GCDS. You must create your users on Cloud Identity.

What should you do?

Options:

A.  

Configure GCDS and use GCDS search rules to sync these users.

B.  

Use the transfer tool to migrate unmanaged users.

C.  

Write a custom script to identify existing Google Cloud users and call the Admin SDK Directory API to transfer their account.

D.  

Configure GCDS and use GCDS exclusion rules to ensure users are not suspended.

Discussion 0
Questions 32

A database administrator notices malicious activities within their Cloud SQL instance. The database administrator wants to monitor the API calls that read the configuration or metadata of resources. Which logs should the database administrator review?

Options:

A.  

Admin Activity

B.  

System Event

C.  

Access Transparency

D.  

Data Access

Discussion 0
Questions 33

You need to set up two network segments: one with an untrusted subnet and the other with a trusted subnet. You want to configure a virtual appliance such as a next-generation firewall (NGFW) to inspect all traffic between the two network segments. How should you design the network to inspect the traffic?

Options:

A.  

1. Set up one VPC with two subnets: one trusted and the other untrusted.

2. Configure a custom route for all traffic (0.0.0.0/0) pointed to the virtual appliance.

B.  

1. Set up one VPC with two subnets: one trusted and the other untrusted.

2. Configure a custom route for all RFC1918 subnets pointed to the virtual appliance.

C.  

1. Set up two VPC networks: one trusted and the other untrusted, and peer them together.

2. Configure a custom route on each network pointed to the virtual appliance.

D.  

1. Set up two VPC networks: one trusted and the other untrusted.

2. Configure a virtual appliance using multiple network interfaces, with each interface connected to one of the VPC networks.

Discussion 0
Questions 34

You are tasked with exporting and auditing security logs for login activity events for Google Cloud console and API calls that modify configurations to Google Cloud resources. Your export must meet the following requirements:

Export related logs for all projects in the Google Cloud organization.

Export logs in near real-time to an external SIEM.

What should you do? (Choose two.)

Options:

A.  

Create a Log Sink at the organization level with a Pub/Sub destination.

B.  

Create a Log Sink at the organization level with the includeChildren parameter, and set the destination to a Pub/Sub topic.

C.  

Enable Data Access audit logs at the organization level to apply to all projects.

D.  

Enable Google Workspace audit logs to be shared with Google Cloud in the Admin Console.

E.  

Ensure that the SIEM processes the AuthenticationInfo field in the audit log entry to gather identity information.

Discussion 0
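For reference, an organization-level aggregated sink that streams audit logs to Pub/Sub for an external SIEM can be created in one command. A minimal sketch with placeholder organization, project, and topic names; the filter shown is an assumption and would need to match the login and configuration-change events you care about:

gcloud logging sinks create siem-audit-sink \
    pubsub.googleapis.com/projects/SIEM_PROJECT/topics/siem-audit-topic \
    --organization=ORGANIZATION_ID \
    --include-children \
    --log-filter='logName:"cloudaudit.googleapis.com"'

After creation, grant the sink's writer identity the Pub/Sub Publisher role on the topic so entries can be delivered.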
Questions 35

A company is backing up application logs to a Cloud Storage bucket shared with both analysts and the administrator. Analysts should only have access to logs that do not contain any personally identifiable information (PII). Log files containing PII should be stored in another bucket that is only accessible by the administrator.

What should you do?

Options:

A.  

Use Cloud Pub/Sub and Cloud Functions to trigger a Data Loss Prevention scan every time a file is uploaded to the shared bucket. If the scan detects PII, have the function move into a Cloud Storage bucket only accessible by the administrator.

B.  

Upload the logs to both the shared bucket and the bucket only accessible by the administrator. Create a job trigger using the Cloud Data Loss Prevention API. Configure the trigger to delete any files from the shared bucket that contain PII.

C.  

On the bucket shared with both the analysts and the administrator, configure Object Lifecycle Management to delete objects that contain any PII.

D.  

On the bucket shared with both the analysts and the administrator, configure a Cloud Storage Trigger that is only triggered when PII data is uploaded. Use Cloud Functions to capture the trigger and delete such files.

Discussion 0
Questions 36

Your organization has had a few recent DDoS attacks. You need to authenticate responses to domain name lookups. Which Google Cloud service should you use?

Options:

A.  

Cloud DNS with DNSSEC

B.  

Cloud NAT

C.  

HTTP(S) Load Balancing

D.  

Google Cloud Armor

Discussion 0
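As context for the DNSSEC option above, signing an existing Cloud DNS managed zone is a single zone update. A minimal sketch with a placeholder zone name:

gcloud dns managed-zones update my-public-zone --dnssec-state=on

The DS record must then be published at the domain's registrar so resolvers can validate the signed responses.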
Questions 37

Your organization operates virtual machines (VMs) with only private IPs in the Virtual Private Cloud (VPC), with internet access through Cloud NAT. Every day, you must patch all VMs with critical OS updates and provide summary reports.

What should you do?

Options:

A.  

Validate that the egress firewall rules allow any outgoing traffic. Log in to each VM and execute OS-specific update commands. Configure the Cloud Scheduler job to update with critical patches daily.

B.  

Ensure that VM Manager is installed and running on the VMs. In the OS patch management service, configure the patch jobs to update with critical patches daily.

C.  

Assign public IPs to the VMs. Validate that the egress firewall rules allow any outgoing traffic. Log in to each VM, and configure a daily cron job to enable OS updates at night during low-activity periods.

D.  

Copy the latest patches to the Cloud Storage bucket. Log in to each VM, download the patches from the bucket, and install them.

Discussion 0
Questions 38

You discovered that sensitive personally identifiable information (PII) is being ingested to your Google Cloud environment in the daily ETL process from an on-premises environment to your BigQuery datasets. You need to redact this data to obfuscate the PII, but need to re-identify it for data analytics purposes. Which components should you use in your solution? (Choose two.)

Options:

A.  

Secret Manager

B.  

Cloud Key Management Service

C.  

Cloud Data Loss Prevention with cryptographic hashing

D.  

Cloud Data Loss Prevention with automatic text redaction

E.  

Cloud Data Loss Prevention with deterministic encryption using AES-SIV

Discussion 0
Questions 39

You need to centralize your team’s logs for production projects. You want your team to be able to search and analyze the logs using Logs Explorer. What should you do?

Options:

A.  

Enable Cloud Monitoring workspace, and add the production projects to be monitored.

B.  

Use Logs Explorer at the organization level and filter for production project logs.

C.  

Create an aggregate org sink at the parent folder of the production projects, and set the destination to a Cloud Storage bucket.

D.  

Create an aggregate org sink at the parent folder of the production projects, and set the destination to a logs bucket.

Discussion 0
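To illustrate the aggregated-sink-to-a-logs-bucket approach from the options above, a central Cloud Logging bucket can be created in one project and fed by a folder-level sink. A minimal sketch with placeholder project, folder, and bucket names:

# Central logs bucket in a dedicated project
gcloud logging buckets create central-prod-logs \
    --location=global --project=LOGGING_PROJECT_ID

# Folder-level aggregated sink that routes child project logs into it
gcloud logging sinks create prod-logs-sink \
    logging.googleapis.com/projects/LOGGING_PROJECT_ID/locations/global/buckets/central-prod-logs \
    --folder=PROD_FOLDER_ID \
    --include-children

Because the destination is a Logging bucket, the team can keep using Logs Explorer rather than querying exported files.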
Questions 40

An organization’s typical network and security review consists of analyzing application transit routes, request handling, and firewall rules. They want to enable their developer teams to deploy new applications without the overhead of this full review.

How should you advise this organization?

Options:

A.  

Use Forseti with Firewall filters to catch any unwanted configurations in production.

B.  

Mandate use of infrastructure as code and provide static analysis in the CI/CD pipelines to enforce policies.

C.  

Route all VPC traffic through customer-managed routers to detect malicious patterns in production.

D.  

All production applications will run on-premises. Allow developers free rein in GCP as their dev and QA platforms.

Discussion 0
Questions 41

How should a customer reliably deliver Stackdriver logs from GCP to their on-premises SIEM system?

Options:

A.  

Send all logs to the SIEM system via an existing protocol such as syslog.

B.  

Configure every project to export all their logs to a common BigQuery DataSet, which will be queried by the SIEM system.

C.  

Configure Organizational Log Sinks to export logs to a Cloud Pub/Sub Topic, which will be sent to the SIEM via Dataflow.

D.  

Build a connector for the SIEM to query for all logs in real time from the GCP RESTful JSON APIs.

Discussion 0
Questions 42

You have numerous private virtual machines on Google Cloud. You occasionally need to manage the servers through Secure Socket Shell (SSH) from a remote location. You want to configure remote access to the servers in a manner that optimizes security and cost efficiency.

What should you do?

Options:

A.  

Create a site-to-site VPN from your corporate network to Google Cloud.

B.  

Configure the server instances with public IP addresses. Create a firewall rule to only allow traffic from your corporate IPs.

C.  

Create a firewall rule to allow access from the Identity-Aware Proxy (IAP) IP range. Grant the IAP-secured Tunnel User role to the administrators.

D.  

Create a jump host instance with a public IP. Manage the instances by connecting through the jump host.

Discussion 0
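For reference, SSH through Identity-Aware Proxy (IAP) TCP forwarding needs only an ingress rule for Google's IAP range and the IAP-secured Tunnel User role; no public IPs or VPN are required. A minimal sketch with placeholder network, zone, and VM names:

# Allow IAP's TCP-forwarding range to reach SSH
gcloud compute firewall-rules create allow-ssh-from-iap \
    --network=prod-vpc --direction=INGRESS --action=ALLOW \
    --rules=tcp:22 --source-ranges=35.235.240.0/20

# Connect to a private VM through the IAP tunnel
gcloud compute ssh my-private-vm --zone=us-central1-a --tunnel-through-iap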
Questions 43

You are deploying a web application hosted on Compute Engine. A business requirement mandates that application logs are preserved for 12 years and data is kept within European boundaries. You want to implement a storage solution that minimizes overhead and is cost-effective. What should you do?

Options:

A.  

Create a Cloud Storage bucket to store your logs in the EUROPE-WEST1 region. Modify your application code to ship logs directly to your bucket for increased efficiency.

B.  

Configure your Compute Engine instances to use the Google Cloud's operations suite Cloud Logging agent to send application logs to a custom log bucket in the EUROPE-WEST1 region with a custom retention of 12 years.

C.  

Use a Pub/Sub topic to forward your application logs to a Cloud Storage bucket in the EUROPE-WEST1 region.

D.  

Configure a custom retention policy of 12 years on your Google Cloud's operations suite log bucket in the EUROPE-WEST1 region.

Discussion 0
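As context for the custom-retention option above, Cloud Logging buckets accept a retention period expressed in days and a fixed region at creation time. A minimal sketch, assuming a placeholder bucket name and using roughly 4,383 days to approximate 12 years:

gcloud logging buckets create app-logs-eu \
    --location=europe-west1 \
    --retention-days=4383 \
    --project=PROJECT_ID

A sink (or an update to the _Default sink) then routes the application's log entries into this bucket.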
Questions 44

Your organization uses Google Workspace Enterprise Edition for authentication. You are concerned about employees leaving their laptops unattended for extended periods of time after authenticating into Google Cloud. You must prevent malicious people from using an employee's unattended laptop to modify their environment.

What should you do?

Options:

A.  

Create a policy that requires employees to not leave their sessions open for long durations.

B.  

Review and disable unnecessary Google Cloud APIs.

C.  

Require strong passwords and 2SV through a security token or Google Authenticator.

D.  

Set the session length timeout for Google Cloud services to a shorter duration.

Discussion 0
Questions 45

You need to implement an encryption-at-rest strategy that protects sensitive data and reduces key management complexity for non-sensitive data. Your solution has the following requirements:

    Schedule key rotation for sensitive data.

    Control which region the encryption keys for sensitive data are stored in.

    Minimize the latency to access encryption keys for both sensitive and non-sensitive data.

What should you do?

Options:

A.  

Encrypt non-sensitive data and sensitive data with Cloud External Key Manager.

B.  

Encrypt non-sensitive data and sensitive data with Cloud Key Management Service.

C.  

Encrypt non-sensitive data with Google default encryption, and encrypt sensitive data with Cloud External Key Manager.

D.  

Encrypt non-sensitive data with Google default encryption, and encrypt sensitive data with Cloud Key Management Service.

Discussion 0
Questions 46

Your team wants to make sure Compute Engine instances running in your production project do not have public IP addresses. The frontend application Compute Engine instances will require public IPs. The product engineers have the Editor role to modify resources. Your team wants to enforce this requirement.

How should your team meet these requirements?

Options:

A.  

Enable Private Access on the VPC network in the production project.

B.  

Remove the Editor role and grant the Compute Admin IAM role to the engineers.

C.  

Set up an organization policy to only permit public IPs for the front-end Compute Engine instances.

D.  

Set up a VPC network with two subnets: one with public IPs and one without public IPs.

Discussion 0
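To illustrate the organization-policy approach from the options above, the compute.vmExternalIpAccess list constraint can allow external IPs only for named instances. A minimal sketch with placeholder project, zone, and instance values:

gcloud resource-manager org-policies allow \
    constraints/compute.vmExternalIpAccess \
    projects/PROJECT_ID/zones/us-central1-a/instances/frontend-vm-1 \
    --project=PROJECT_ID

Any instance not on the allow list cannot be given an external IP, even by users holding the Editor role.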
Questions 47

Your company must follow industry specific regulations. Therefore, you need to enforce customer-managed encryption keys (CMEK) for all new Cloud Storage resources in the organization called org1.

What command should you execute?

Options:

A.  

• organization policy: constraints/gcp.restrictStorageNonCmekServices

• binding at: org1

• policy type: deny

• policy value: storage.googleapis.com

B.  

• organization policy: constraints/gcp.restrictNonCmekServices

• binding at: org1

• policy type: deny

• policy value: storage.googleapis.com

C.  

• organization policy: constraints/gcp.restrictStorageNonCmekServices

• binding at: org1

• policy type: allow

• policy value: all supported services

D.  

• organization policy: constraints/gcp.restrictNonCmekServices

• binding at: org1

• policy type: allow

• policy value: storage.googleapis.com

Discussion 0
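For reference, the gcp.restrictNonCmekServices constraint discussed above is a list constraint whose denied values are the services that must use CMEK. A minimal sketch binding it at the organization with a placeholder organization ID:

gcloud resource-manager org-policies deny \
    constraints/gcp.restrictNonCmekServices \
    storage.googleapis.com \
    --organization=ORGANIZATION_ID

With this in place, new Cloud Storage resources in org1 are rejected unless they specify a customer-managed encryption key.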
Questions 48

A customer implements Cloud Identity-Aware Proxy for their ERP system hosted on Compute Engine. Their security team wants to add a security layer so that the ERP systems only accept traffic from Cloud Identity- Aware Proxy.

What should the customer do to meet these requirements?

Options:

A.  

Make sure that the ERP system can validate the JWT assertion in the HTTP requests.

B.  

Make sure that the ERP system can validate the identity headers in the HTTP requests.

C.  

Make sure that the ERP system can validate the x-forwarded-for headers in the HTTP requests.

D.  

Make sure that the ERP system can validate the user’s unique identifier headers in the HTTP requests.

Discussion 0
Questions 49

You have been tasked with implementing external web application protection against common web application attacks for a public application on Google Cloud. You want to validate these policy changes before they are enforced. What service should you use?

Options:

A.  

Google Cloud Armor's preconfigured rules in preview mode

B.  

Prepopulated VPC firewall rules in monitor mode

C.  

The inherent protections of Google Front End (GFE)

D.  

Cloud Load Balancing firewall rules

E.  

VPC Service Controls in dry run mode

Discussion 0
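As context for validating policies before enforcement, Google Cloud Armor rules can be deployed in preview mode, which logs what a rule would have done without actually blocking traffic. A minimal sketch with a placeholder policy name, using one of the preconfigured WAF expressions:

gcloud compute security-policies create web-app-policy

gcloud compute security-policies rules create 1000 \
    --security-policy=web-app-policy \
    --expression="evaluatePreconfiguredExpr('xss-stable')" \
    --action=deny-403 \
    --preview

Once the logged matches look correct, the rule can be updated to drop the --preview flag and start enforcing.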
Questions 50

You plan to use a Google Cloud Armor policy to prevent common attacks such as cross-site scripting (XSS) and SQL injection (SQLi) from reaching your web application's backend. What are two requirements for using Google Cloud Armor security policies? (Choose two.)

Options:

A.  

The load balancer must be an external SSL proxy load balancer.

B.  

Google Cloud Armor Policy rules can only match on Layer 7 (L7) attributes.

C.  

The load balancer must use the Premium Network Service Tier.

D.  

The backend service's load balancing scheme must be EXTERNAL.

E.  

The load balancer must be an external HTTP(S) load balancer.

Discussion 0
Questions 51

Your company is using Cloud Dataproc for its Spark and Hadoop jobs. You want to be able to create, rotate, and destroy symmetric encryption keys used for the persistent disks used by Cloud Dataproc. Keys can be stored in the cloud.

What should you do?

Options:

A.  

Use the Cloud Key Management Service to manage the data encryption key (DEK).

B.  

Use the Cloud Key Management Service to manage the key encryption key (KEK).

C.  

Use customer-supplied encryption keys to manage the data encryption key (DEK).

D.  

Use customer-supplied encryption keys to manage the key encryption key (KEK).

Discussion 0
Questions 52

Which Google Cloud service should you use to enforce access control policies for applications and resources?

Options:

A.  

Identity-Aware Proxy

B.  

Cloud NAT

C.  

Google Cloud Armor

D.  

Shielded VMs

Discussion 0
Questions 53

Your company conducts clinical trials and needs to analyze the results of a recent study that are stored in BigQuery. The interval when the medicine was taken contains start and stop dates. The interval data is critical to the analysis, but specific dates may identify a particular batch and introduce bias. You need to obfuscate the start and end dates for each row and preserve the interval data.

What should you do?

Options:

A.  

Use bucketing to shift values to a predetermined date based on the initial value.

B.  

Extract the date using TimePartConfig from each date field and append a random month and year.

C.  

Use date shifting with the context set to the unique ID of the test subject.

D.  

Use the FFX mode of format-preserving encryption (FPE) and maintain data consistency.

Discussion 0
Questions 54

Your organization wants to be continuously evaluated against the CIS Google Cloud Computing Foundations Benchmark v1.3.0 (CIS Google Cloud Foundation 1.3). Some of the controls are irrelevant to your organization and must be disregarded in evaluation. You need to create an automated system or process to ensure that only the relevant controls are evaluated.

What should you do?

Options:

A.  

Mark all security findings that are irrelevant with a tag and a value that indicates a security exception. Select all marked findings and mute them on the console every time they appear. Activate Security Command Center (SCC) Premium.

B.  

Activate Security Command Center (SCC) Premium. Create a rule to mute the security findings in SCC so they are not evaluated.

C.  

Download all findings from Security Command Center (SCC) to a CSV file. Mark the findings that are part of CIS Google Cloud Foundation 1.3 in the file. Ignore the entries that are irrelevant and out of scope for the company.

D.  

Ask an external audit company to provide independent reports including the needed CIS benchmarks. In the scope of the audit, clarify that some of the controls are not needed and must be disregarded.

Discussion 0
Questions 55

Your team needs to make sure that their backend database can only be accessed by the frontend application and no other instances on the network.

How should your team design this network?

Options:

A.  

Create an ingress firewall rule to allow access only from the application to the database using firewall tags.

B.  

Create a different subnet for the frontend application and database to ensure network isolation.

C.  

Create two VPC networks, and connect the two networks using Cloud VPN gateways to ensure network isolation.

D.  

Create two VPC networks, and connect the two networks using VPC peering to ensure network isolation.

Discussion 0
Questions 56

Your organization uses BigQuery to process highly sensitive, structured datasets. Following the "need to know" principle, you need to create the Identity and Access Management (IAM) design to meet the needs of these users:

• Business user: must access curated reports.

• Data engineer: must administer the data lifecycle in the platform.

• Security operator: must review user activity on the data platform.

What should you do?

Options:

A.  

Configure Data Access logs for BigQuery services, and grant the Project Viewer role to security operators.

B.  

Generate a CSV data file based on the business user's needs, and send the data to their email addresses.

C.  

Create curated tables in a separate dataset and assign the role roles/bigquery.dataViewer.

D.  

Set row-based access control based on the "region" column, and filter the record from the United States for data engineers.

Discussion 0
Questions 57

An organization is migrating from their current on-premises productivity software systems to G Suite. Some network security controls were in place that were mandated by a regulatory body in their region for their previous on-premises system. The organization’s risk team wants to ensure that network security controls are maintained and effective in G Suite. A security architect supporting this migration has been asked to ensure that network security controls are in place as part of the new shared responsibility model between the organization and Google Cloud.

What solution would help meet the requirements?

Options:

A.  

Ensure that firewall rules are in place to meet the required controls.

B.  

Set up Cloud Armor to ensure that network security controls can be managed for G Suite.

C.  

Network security is a built-in solution and Google’s Cloud responsibility for SaaS products like G Suite.

D.  

Set up an array of Virtual Private Cloud (VPC) networks to control network security as mandated by the relevant regulation.

Discussion 0
Questions 58

Your company recently published a security policy to minimize the usage of service account keys. On-premises Windows-based applications are interacting with Google Cloud APIs. You need to implement Workload Identity Federation (WIF) with your identity provider on-premises.

What should you do?

Options:

A.  

Set up a workload identity pool with your corporate Active Directory Federation Service (ADFS). Configure a rule to let principals in the pool impersonate the Google Cloud service account.

B.  

Set up a workload identity pool with your corporate Active Directory Federation Service (ADFS). Let all principals in the pool impersonate the Google Cloud service account.

C.  

Set up a workload identity pool with an OpenID Connect (OIDC) service on the same machine. Configure a rule to let principals in the pool impersonate the Google Cloud service account.

D.  

Set up a workload identity pool with an OpenID Connect (OIDC) service on the same machine. Let all principals in the pool impersonate the Google Cloud service account.

Discussion 0
Questions 59

Your Google Cloud environment has one organization node, one folder named "Apps", and several projects within that folder. The organization node enforces the constraints/iam.allowedPolicyMemberDomains organization policy, which allows members from the terramearth.com organization. The "Apps" folder enforces the constraints/iam.allowedPolicyMemberDomains organization policy, which allows members from the flowlogistic.com organization. It also has the inheritFromParent: false property.

You attempt to grant access to a project in the Apps folder to the user testuser@terramearth.com.

What is the result of your action and why?

Options:

A.  

The action fails because a constraints/iam.allowedPolicyMemberDomains organization policy must be defined on the current project to deactivate the constraint temporarily.

B.  

The action fails because a constraints/iam.allowedPolicyMemberDomains organization policy is in place and only members from the flowlogistic.com organization are allowed.

C.  

The action succeeds because members from both organizations, terramearth.com and flowlogistic.com, are allowed on projects in the "Apps" folder.

D.  

The action succeeds and the new member is successfully added to the project's Identity and Access Management (IAM) policy because all policies are inherited by underlying folders and projects.

Discussion 0
Questions 60

Your team needs to obtain a unified log view of all development cloud projects in your SIEM. The development projects are under the NONPROD organization folder with the test and pre-production projects. The development projects share the ABC-BILLING billing account with the rest of the organization.

Which logging export strategy should you use to meet the requirements?

Options:

A.  

1. Export logs to a Cloud Pub/Sub topic with folders/NONPROD parent and includeChildren property set to True in a dedicated SIEM project.

2. Subscribe SIEM to the topic.

B.  

1. Create a Cloud Storage sink with billingAccounts/ABC-BILLING parent and includeChildren property set to False in a dedicated SIEM project.

2. Process Cloud Storage objects in SIEM.

C.  

1. Export logs in each dev project to a Cloud Pub/Sub topic in a dedicated SIEM project.

2. Subscribe SIEM to the topic.

D.  

1. Create a Cloud Storage sink with a publicly shared Cloud Storage bucket in each project.

2. Process Cloud Storage objects in SIEM.

Discussion 0
Questions 61

Your privacy team uses crypto-shredding (deleting encryption keys) as a strategy to delete personally identifiable information (PII). You need to implement this practice on Google Cloud while still utilizing the majority of the platform’s services and minimizing operational overhead. What should you do?

Options:

A.  

Use client-side encryption before sending data to Google Cloud, and delete encryption keys on-premises.

B.  

Use Cloud External Key Manager to delete specific encryption keys.

C.  

Use customer-managed encryption keys to delete specific encryption keys.

D.  

Use Google default encryption to delete specific encryption keys.

Discussion 0
Questions 62

You need to audit the network segmentation for your Google Cloud footprint. You currently operate Production and Non-Production infrastructure-as-a-service (IaaS) environments. All your VM instances are deployed without any service account customization.

After observing the traffic in your custom network, you notice that all instances can communicate freely, despite tag-based VPC firewall rules with a priority of 1000 that are in place to segment traffic properly. What are the most likely reasons for this behavior?

Options:

A.  

All VM instances are missing the respective network tags.

B.  

All VM instances are residing in the same network subnet.

C.  

All VM instances are configured with the same network route.

D.  

A VPC firewall rule is allowing traffic between source/targets based on the same service account with priority 999.

E.  

A VPC firewall rule is allowing traffic between source/targets based on the same service account with priority 1001.

Discussion 0
Questions 63

A manager wants to start retaining security event logs for 2 years while minimizing costs. You write a filter to select the appropriate log entries.

Where should you export the logs?

Options:

A.  

BigQuery datasets

B.  

Cloud Storage buckets

C.  

StackDriver logging

D.  

Cloud Pub/Sub topics

Discussion 0
Questions 64

A customer wants to run a batch processing system on VMs and store the output files in a Cloud Storage bucket. The networking and security teams have decided that no VMs may reach the public internet.

How should this be accomplished?

Options:

A.  

Create a firewall rule to block internet traffic from the VM.

B.  

Provision a NAT Gateway to access the Cloud Storage API endpoint.

C.  

Enable Private Google Access on the VPC.

D.  

Mount a Cloud Storage bucket as a local filesystem on every VM.

Discussion 0
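For reference, Private Google Access is enabled per subnet and lets VMs with only internal IPs reach Google APIs such as the Cloud Storage endpoint. A minimal sketch with placeholder subnet and region values:

gcloud compute networks subnets update batch-subnet \
    --region=us-central1 \
    --enable-private-ip-google-access

The VMs keep no external IPs, and requests to storage.googleapis.com are routed over Google's network rather than the public internet.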
Questions 65

A customer has 300 engineers. The company wants to grant different levels of access and efficiently manage IAM permissions between users in the development and production environment projects.

Which two steps should the company take to meet these requirements? (Choose two.)

Options:

A.  

Create a project with multiple VPC networks for each environment.

B.  

Create a folder for each development and production environment.

C.  

Create a Google Group for the Engineering team, and assign permissions at the folder level.

D.  

Create an Organizational Policy constraint for each folder environment.

E.  

Create projects for each environment, and grant IAM rights to each engineering user.

Discussion 0
Questions 66

Your company wants to determine what products they can build to help customers improve their credit scores depending on their age range. To achieve this, you need to join user information in the company's banking app with customers' credit score data received from a third party. While using this raw data will allow you to complete this task, it exposes sensitive data, which could be propagated into new systems.

This risk needs to be addressed using de-identification and tokenization with Cloud Data Loss Prevention while maintaining the referential integrity across the database. Which cryptographic token format should you use to meet these requirements?

Options:

A.  

Deterministic encryption

B.  

Secure, key-based hashes

C.  

Format-preserving encryption

D.  

Cryptographic hashing

Discussion 0
Questions 67

A patch for a vulnerability has been released, and a DevOps team needs to update their running containers in Google Kubernetes Engine (GKE).

How should the DevOps team accomplish this?

Options:

A.  

Use Puppet or Chef to push out the patch to the running container.

B.  

Verify that auto upgrade is enabled; if so, Google will upgrade the nodes in a GKE cluster.

C.  

Update the application code or apply a patch, build a new image, and redeploy it.

D.  

Configure containers to automatically upgrade when the base image is available in Container Registry.

Discussion 0
Questions 68

An office manager at your small startup company is responsible for matching payments to invoices and creating billing alerts. For compliance reasons, the office manager is only permitted to have the Identity and Access Management (IAM) permissions necessary for these tasks. Which two IAM roles should the office manager have? (Choose two.)

Options:

A.  

Organization Administrator

B.  

Project Creator

C.  

Billing Account Viewer

D.  

Billing Account Costs Manager

E.  

Billing Account User

Discussion 0
Questions 69

A batch job running on Compute Engine needs temporary write access to a Cloud Storage bucket. You want the batch job to use the minimum permissions necessary to complete the task. What should you do?

Options:

A.  

Create a service account with full Cloud Storage administrator permissions. Assign the service account to the Compute Engine instance.

B.  

Grant the predefined storage.objectCreator role to the Compute Engine instance's default service account.

C.  

Create a service account and embed a long-lived service account key file that has write permissions specified directly in the batch job script.

D.  

Create a service account with the storage.objectCreator role. Use service account impersonation in the batch job's code.

Discussion 0
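To illustrate the least-privilege grant discussed above, the Storage Object Creator role can be bound directly on the target bucket for the service account attached to the batch VM. A minimal sketch with placeholder bucket, project, and service account names:

gcloud storage buckets add-iam-policy-binding gs://batch-output-bucket \
    --member=serviceAccount:batch-job-sa@PROJECT_ID.iam.gserviceaccount.com \
    --role=roles/storage.objectCreator

Scoping the binding to the bucket (rather than the project) keeps the write access limited to exactly the resource the job needs.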
Questions 70

What are the steps to encrypt data using envelope encryption?

Options:

A.  

1. Generate a data encryption key (DEK) locally.

2. Use a key encryption key (KEK) to wrap the DEK.

3. Encrypt data with the KEK.

4. Store the encrypted data and the wrapped KEK.

B.  

1. Generate a key encryption key (KEK) locally.

2. Use the KEK to generate a data encryption key (DEK).

3. Encrypt data with the DEK.

4. Store the encrypted data and the wrapped DEK.

C.  

1. Generate a data encryption key (DEK) locally.

2. Encrypt data with the DEK.

3. Use a key encryption key (KEK) to wrap the DEK.

4. Store the encrypted data and the wrapped DEK.

D.  

1. Generate a key encryption key (KEK) locally.

2. Generate a data encryption key (DEK) locally.

3. Encrypt data with the KEK.

4. Store the encrypted data and the wrapped DEK.

Discussion 0
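As a worked illustration of the envelope-encryption flow above (generate a DEK locally, encrypt the data with the DEK, wrap the DEK with a KEK, store the ciphertext plus the wrapped DEK), here is a minimal sketch using OpenSSL for the local steps and Cloud KMS as the KEK; the key ring, key, and file names are placeholders:

# 1. Generate a 256-bit data encryption key (DEK) locally
openssl rand -out dek.bin 32

# 2. Encrypt the data with the DEK
openssl enc -aes-256-cbc -pbkdf2 -in data.txt -out data.enc -pass file:dek.bin

# 3. Wrap the DEK with a key encryption key (KEK) held in Cloud KMS
gcloud kms encrypt --location=global --keyring=my-keyring --key=my-kek \
    --plaintext-file=dek.bin --ciphertext-file=dek.wrapped

# 4. Store data.enc and dek.wrapped; discard the plaintext DEK
rm dek.bin

To decrypt, the wrapped DEK is first unwrapped with gcloud kms decrypt, then used to decrypt the data file.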