
ExamsBrite Dumps

Google Cloud Certified - Professional Cloud Security Engineer Question and Answers

Google Cloud Certified - Professional Cloud Security Engineer

Last Update Nov 30, 2025
Total Questions : 297

Questions 1

Your Security team believes that a former employee of your company gained unauthorized access to Google Cloud resources some time in the past 2 months by using a service account key. You need to confirm the unauthorized access and determine the user activity. What should you do?

Options:

A.  

Use Security Health Analytics to determine user activity.

B.  

Use the Cloud Monitoring console to filter audit logs by user.

C.  

Use the Cloud Data Loss Prevention API to query logs in Cloud Storage.

D.  

Use the Logs Explorer to search for user activity.
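
For the options that involve querying audit logs, the same Logs Explorer query can be run from the CLI. A minimal sketch, assuming a hypothetical project ID and service account email:

```shell
# Search Admin Activity and Data Access audit logs for activity by a
# specific service account over the past 60 days.
# The project ID and service account email are placeholders.
gcloud logging read \
  'logName:"cloudaudit.googleapis.com" AND protoPayload.authenticationInfo.principalEmail="suspect-sa@example-project.iam.gserviceaccount.com"' \
  --project=example-project \
  --freshness=60d \
  --format=json
```

The `--freshness` flag bounds how far back the query reaches, and the `principalEmail` filter isolates actions performed with the suspect service account key.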

Questions 2

Your organization is developing a sophisticated machine learning (ML) model to predict customer behavior for targeted marketing campaigns. The BigQuery dataset used for training includes sensitive personal information. You must design the security controls around the AI/ML pipeline. Data privacy must be maintained throughout the model's lifecycle, and you must ensure that personal data is not used in the training process. Additionally, you must restrict access to the dataset to an authorized subset of people only. What should you do?

Options:

A.  

Implement at-rest encryption by using customer-managed encryption keys (CMEK) for the pipeline. Implement strict Identity and Access Management (IAM) policies to control access to BigQuery.

B.  

De-identify sensitive data before model training by using Cloud Data Loss Prevention (DLP) APIs, and implement strict Identity and Access Management (IAM) policies to control access to BigQuery.

C.  

Implement Identity-Aware Proxy to enforce context-aware access to BigQuery and models based on user identity and device.

D.  

Deploy the model on Confidential VMs for enhanced protection of data and code while in use. Implement strict Identity and Access Management (IAM) policies to control access to BigQuery.

Questions 3

You want to use the gcloud command-line tool to authenticate using a third-party single sign-on (SSO) SAML identity provider. Which options are necessary to ensure that authentication is supported by the third-party identity provider (IdP)? (Choose two.)

Options:

A.  

SSO SAML as a third-party IdP

B.  

Identity Platform

C.  

OpenID Connect

D.  

Identity-Aware Proxy

E.  

Cloud Identity

Questions 4

You are a security administrator at your company and are responsible for managing access controls (identification, authentication, and authorization) on Google Cloud. Which Google-recommended best practices should you follow when configuring authentication and authorization? (Choose two.)

Options:

A.  

Use Google default encryption.

B.  

Manually add users to Google Cloud.

C.  

Provision users with basic roles using Google's Identity and Access Management (IAM) service.

D.  

Use SSO/SAML integration with Cloud Identity for user authentication and user lifecycle management.

E.  

Provide granular access with predefined roles.

Questions 5

You are designing a new governance model for your organization's secrets that are stored in Secret Manager. Currently, secrets for Production and Non-Production applications are stored and accessed using service accounts. Your proposed solution must:

Provide granular access to secrets

Give you control over the rotation schedules for the encryption keys that wrap your secrets

Maintain environment separation

Provide ease of management

Which approach should you take?

Options:

A.  

1. Use separate Google Cloud projects to store Production and Non-Production secrets.
2. Enforce access control to secrets using project-level Identity and Access Management (IAM) bindings.
3. Use customer-managed encryption keys to encrypt secrets.

B.  

1. Use a single Google Cloud project to store both Production and Non-Production secrets.
2. Enforce access control to secrets using secret-level Identity and Access Management (IAM) bindings.
3. Use Google-managed encryption keys to encrypt secrets.

C.  

1. Use separate Google Cloud projects to store Production and Non-Production secrets.
2. Enforce access control to secrets using secret-level Identity and Access Management (IAM) bindings.
3. Use Google-managed encryption keys to encrypt secrets.

D.  

1. Use a single Google Cloud project to store both Production and Non-Production secrets.
2. Enforce access control to secrets using project-level Identity and Access Management (IAM) bindings.
3. Use customer-managed encryption keys to encrypt secrets.
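
For the options that wrap secrets with customer-managed encryption keys (CMEK), the Secret Manager pieces can be sketched as follows. All project, key ring, key, and service account names are placeholders:

```shell
# Create a secret whose payload is wrapped with a CMEK, so the key's
# rotation schedule stays under your control.
gcloud secrets create prod-db-password \
  --project=prod-secrets-project \
  --replication-policy=user-managed \
  --locations=us-central1 \
  --kms-key-name=projects/prod-secrets-project/locations/us-central1/keyRings/secrets-kr/cryptoKeys/secrets-key

# Secret-level IAM binding: grant one service account access to
# just this secret (granular access control).
gcloud secrets add-iam-policy-binding prod-db-password \
  --project=prod-secrets-project \
  --member=serviceAccount:app@prod-app-project.iam.gserviceaccount.com \
  --role=roles/secretmanager.secretAccessor
```

Note that `--kms-key-name` requires user-managed replication, since the CMEK must live in the same location as the secret replica.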

Questions 6

A customer has 300 engineers. The company wants to grant different levels of access and efficiently manage IAM permissions between users in the development and production environment projects.

Which two steps should the company take to meet these requirements? (Choose two.)

Options:

A.  

Create a project with multiple VPC networks for each environment.

B.  

Create a folder for each development and production environment.

C.  

Create a Google Group for the Engineering team, and assign permissions at the folder level.

D.  

Create an Organizational Policy constraint for each folder environment.

E.  

Create projects for each environment, and grant IAM rights to each engineering user.

Questions 7

You want to prevent users from accidentally deleting a Shared VPC host project. Which organization-level policy constraint should you enable?

Options:

A.  

compute.restrictSharedVpcHostProjects

B.  

compute.restrictXpnProjectLienRemoval

C.  

compute.restrictSharedVpcSubnetworks

D.  

compute.sharedReservationsOwnerProjects
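
Whichever constraint applies, organization policy constraints of this kind are enabled the same way. A sketch for the lien-removal constraint in option B, with a placeholder organization ID:

```shell
# Enforce the constraint at the organization level so liens protecting
# Shared VPC host projects cannot be removed (and the projects cannot
# be deleted while the lien is in place).
gcloud resource-manager org-policies enable-enforce \
  compute.restrictXpnProjectLienRemoval \
  --organization=123456789012

# Verify the effective policy.
gcloud resource-manager org-policies describe \
  compute.restrictXpnProjectLienRemoval \
  --organization=123456789012 --effective
```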

Questions 8

You are responsible for the operation of your company's application that runs on Google Cloud. The database for the application will be maintained by an external partner. You need to give the partner team access to the database. This access must be restricted solely to the database and cannot extend to any other resources within your company's network. Your solution should follow Google-recommended practices. What should you do?

Options:

A.  

Add a public IP address to the application's database. Create database users for each of the partner's employees. Securely distribute the credentials for these users to the partner team.

B.  

Create accounts for the partner team in your corporate identity provider. Synchronize these accounts with Google Cloud Identity. Grant the accounts access to the database.

C.  

Ask the partner team to set up Cloud Identity accounts within their own corporate environment and identity provider. Grant the partner’s Cloud Identity accounts access to the database.

D.  

Configure Workforce Identity Federation for the partner. Connect the identity pool provider to the partner's identity provider. Grant the workforce pool resources access to the database.

Questions 9

A company is using Google Kubernetes Engine (GKE) with container images of a mission-critical application. The company wants to scan the images for known security issues and securely share the report with the security team without exposing them outside Google Cloud.

What should you do?

Options:

A.  

1. Enable Container Threat Detection in the Security Command Center Premium tier.
2. Upgrade all clusters that are not on a supported version of GKE to the latest possible GKE version.
3. View and share the results from the Security Command Center.

B.  

1. Use an open source tool in Cloud Build to scan the images.
2. Upload reports to publicly accessible buckets in Cloud Storage by using gsutil.
3. Share the scan report link with your security department.

C.  

1. Enable vulnerability scanning in the Artifact Registry settings.
2. Use Cloud Build to build the images.
3. Push the images to Artifact Registry for automatic scanning.
4. View the reports in Artifact Registry.

D.  

1. Get a GitHub subscription.
2. Build the images in Cloud Build and store them in GitHub for automatic scanning.
3. Download the report from GitHub and share it with the security team.
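
The Artifact Registry scanning workflow from option C can be sketched with two commands. The project ID and image path are placeholders:

```shell
# Enable the Container Scanning API, which triggers automatic
# vulnerability scanning of images pushed to Artifact Registry.
gcloud services enable containerscanning.googleapis.com \
  --project=example-project

# After pushing an image, review the scan findings without leaving
# Google Cloud.
gcloud artifacts docker images describe \
  us-central1-docker.pkg.dev/example-project/app-repo/app:latest \
  --show-package-vulnerability
```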

Questions 10

Your organization recently activated the Security Command Center (SCC) Standard tier. There are a few Cloud Storage buckets that were accidentally made accessible to the public. You need to investigate the impact of the incident and remediate it.

What should you do?

Options:

A.  

1. Remove the Identity and Access Management (IAM) binding granting access to allUsers from the buckets.
2. Apply the organization policy storage.uniformBucketLevelAccess to prevent regressions.
3. Query the data access logs to report on unauthorized access.

B.  

1. Change bucket permissions to limit access.
2. Query the data access audit logs for any unauthorized access to the buckets.
3. After the misconfiguration is corrected, mute the finding in the Security Command Center.

C.  

1. Change permissions to limit access to authorized users.
2. Enforce a VPC Service Controls perimeter around all the production projects to immediately stop any unauthorized access.
3. Review the Admin Activity audit logs to report on any unauthorized access.

D.  

1. Change the bucket permissions to limit access.
2. Query the bucket usage logs to report on unauthorized access to the data.
3. Enforce the organization policy storage.publicAccessPrevention to avoid regressions.
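
The remediation steps that appear across these options map onto a few CLI commands. Bucket name and organization ID are placeholders:

```shell
# Remove the public grant from the exposed bucket.
gcloud storage buckets remove-iam-policy-binding gs://exposed-bucket \
  --member=allUsers --role=roles/storage.objectViewer

# Enforce public access prevention on the bucket itself.
gcloud storage buckets update gs://exposed-bucket \
  --public-access-prevention

# Prevent regressions organization-wide via organization policy.
gcloud resource-manager org-policies enable-enforce \
  storage.publicAccessPrevention --organization=123456789012
```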

Questions 11

You need to implement an encryption at-rest strategy that reduces key management complexity for non-sensitive data and protects sensitive data while providing the flexibility of controlling the key residency and rotation schedule. FIPS 140-2 L1 compliance is required for all data types. What should you do?

Options:

A.  

Encrypt non-sensitive data and sensitive data with Cloud External Key Manager.

B.  

Encrypt non-sensitive data and sensitive data with Cloud Key Management Service

C.  

Encrypt non-sensitive data with Google default encryption, and encrypt sensitive data with Cloud External Key Manager.

D.  

Encrypt non-sensitive data with Google default encryption, and encrypt sensitive data with Cloud Key Management Service.

Questions 12

You have been tasked with inspecting IP packet data for invalid or malicious content. What should you do?

Options:

A.  

Use Packet Mirroring to mirror traffic to and from particular VM instances. Perform inspection using security software that analyzes the mirrored traffic.

B.  

Enable VPC Flow Logs for all subnets in the VPC. Perform inspection on the Flow Logs data using Cloud Logging.

C.  

Configure the Fluentd agent on each VM instance within the VPC. Perform inspection on the log data using Cloud Logging.

D.  

Configure Google Cloud Armor access logs to perform inspection on the log data.

Questions 13

A customer needs to launch a 3-tier internal web application on Google Cloud Platform (GCP). The customer’s internal compliance requirements dictate that end-user access may only be allowed if the traffic seems to originate from a specific known good CIDR. The customer accepts the risk that their application will only have SYN flood DDoS protection. They want to use GCP’s native SYN flood protection.

Which product should be used to meet these requirements?

Options:

A.  

Cloud Armor

B.  

VPC Firewall Rules

C.  

Cloud Identity and Access Management

D.  

Cloud CDN

Questions 14

Your organization has Google Cloud applications that require access to external web services. You must monitor, control, and log access to these services. What should you do?

Options:

A.  

Configure VPC firewall rules to allow the services to access the IP addresses of required external web services.

B.  

Set up a Secure Web Proxy that allows access to the specific external web services. Configure applications to use the proxy for the web service requests.

C.  

Configure Google Cloud Armor to monitor and protect your applications by checking incoming traffic patterns for attack patterns.

D.  

Set up a Cloud NAT instance to allow egress traffic from your VPC.

Questions 15

Your company is deploying a large number of containerized applications to GKE. The existing CI/CD pipeline uses Cloud Build to construct container images, transfers the images to Artifact Registry, and then deploys the images to GKE. You need to ensure that only images that have passed vulnerability scanning and meet specific corporate policies are allowed to be deployed. The process needs to be automated and integrated into the existing CI/CD pipeline. What should you do?

Options:

A.  

Implement a custom script in the Cloud Build pipeline that uses a third-party vulnerability scanning tool. Fail the build if vulnerabilities are found.

B.  

Configure GKE to use only images from a specific, trusted Artifact Registry repository. Manually inspect all images before pushing them to this repository.

C.  

Configure a policy in Binary Authorization to use Artifact Analysis vulnerability scanning to only allow images that pass the scan to deploy to your GKE clusters.

D.  

Enable Artifact Analysis vulnerability scanning and regularly scan images in Artifact Registry. Remove any images that do not meet the vulnerability requirements before deployment.
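
For the Binary Authorization approach described in option C, the policy is managed as a YAML document. A sketch, where the file contents and attestor names are assumptions:

```shell
# Export the current Binary Authorization policy for the project.
gcloud container binauthz policy export > policy.yaml

# Edit policy.yaml: set evaluationMode to REQUIRE_ATTESTATION and
# reference an attestor (for example, one whose attestations are only
# created after a passing vulnerability scan) under requireAttestationsBy.

# Re-import the edited policy; GKE clusters with Binary Authorization
# enabled will then block unattested images at deploy time.
gcloud container binauthz policy import policy.yaml
```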

Questions 16

You will create a new Service Account that should be able to list the Compute Engine instances in the project. You want to follow Google-recommended practices.

What should you do?

Options:

A.  

Create an Instance Template, and allow the Service Account Read Only access for the Compute Engine Access Scope.

B.  

Create a custom role with the permission compute.instances.list and grant the Service Account this role.

C.  

Give the Service Account the role of Compute Viewer, and use the new Service Account for all instances.

D.  

Give the Service Account the role of Project Viewer, and use the new Service Account for all instances.
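
The least-privilege option (a custom role limited to listing instances) can be sketched as follows; the project, role, and service account names are placeholders:

```shell
# Create a custom role containing only the permission needed.
gcloud iam roles create instanceLister \
  --project=example-project \
  --title="Instance Lister" \
  --permissions=compute.instances.list

# Grant the custom role to the new service account.
gcloud projects add-iam-policy-binding example-project \
  --member=serviceAccount:lister@example-project.iam.gserviceaccount.com \
  --role=projects/example-project/roles/instanceLister
```

The predefined Compute Viewer role (option C) would also work but grants read access to far more than instance listing.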

Questions 17

You are responsible for managing identities in your company's Google Cloud organization. Employees are frequently using your organization's corporate domain name to create unmanaged Google accounts. You want to implement a practical and efficient solution to prevent employees from completing this action in the future. What should you do?

Options:

A.  

Implement an automated process that scans all identities in your organization and disables any unmanaged accounts.

B.  

Create a Google Cloud identity for all users in your organization. Ensure that new users are added automatically.

C.  

Register a new domain for your Google Cloud resources. Move all existing identities and resources to this domain.

D.  

Switch your corporate email system to another domain to avoid using the same domain for Google Cloud identities and corporate emails.

Questions 18

Your company is concerned about unauthorized parties gaining access to the Google Cloud environment by using a fake login page. You must implement a solution to protect against person-in-the-middle attacks.

Which security measure should you use?

Options:

A.  

Text message or phone call code

B.  

Security key

C.  

Google Authenticator application

D.  

Google prompt

Questions 19

Your company has multiple teams needing access to specific datasets across various Google Cloud data services for different projects. You need to ensure that team members can only access the data relevant to their projects and prevent unauthorized access to sensitive information within BigQuery, Cloud Storage, and Cloud SQL. What should you do?

Options:

A.  

Grant project-level group permissions by using specific Cloud IAM roles. Use BigQuery authorized views, Cloud Storage uniform bucket-level access, and Cloud SQL database roles.

B.  

Configure an access level to control access to the Google Cloud console for users managing these data services. Require multi-factor authentication for all access attempts.

C.  

Use VPC Service Controls to create security perimeters around the projects for BigQuery, Cloud Storage, and Cloud SQL services, restricting access based on the network origin of the requests.

D.  

Enable project-level data access logs for BigQuery, Cloud Storage, and Cloud SQL. Configure log sinks to export these logs to Security Command Center to identify unauthorized access attempts.

Questions 20

Your organization acquired a new workload. The Web and Application (App) servers will be running on Compute Engine in a newly created custom VPC. You are responsible for configuring a secure network communication solution that meets the following requirements:

Only allows communication between the Web and App tiers.

Enforces consistent network security when autoscaling the Web and App tiers.

Prevents Compute Engine Instance Admins from altering network traffic.

What should you do?

Options:

A.  

1. Configure all running Web and App servers with respective network tags.
2. Create an allow VPC firewall rule that specifies the target/source with respective network tags.

B.  

1. Configure all running Web and App servers with respective service accounts.
2. Create an allow VPC firewall rule that specifies the target/source with respective service accounts.

C.  

1. Re-deploy the Web and App servers with instance templates configured with respective network tags.
2. Create an allow VPC firewall rule that specifies the target/source with respective network tags.

D.  

1. Re-deploy the Web and App servers with instance templates configured with respective service accounts.
2. Create an allow VPC firewall rule that specifies the target/source with respective service accounts.
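
The service-account-based firewall rule that several options describe can be sketched like this. Network, project, and service account names are placeholders; the design point is that network tags can be changed by an Instance Admin, while attaching a service account to an instance requires separate IAM permission:

```shell
# Allow only the Web tier's instances to reach the App tier on tcp:8080,
# identified by service account rather than by mutable network tags.
gcloud compute firewall-rules create allow-web-to-app \
  --network=app-vpc \
  --direction=INGRESS \
  --action=ALLOW \
  --rules=tcp:8080 \
  --source-service-accounts=web-tier@example-project.iam.gserviceaccount.com \
  --target-service-accounts=app-tier@example-project.iam.gserviceaccount.com
```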

Questions 21

Users are reporting an outage on your public-facing application that is hosted on Compute Engine. You suspect that a recent change to your firewall rules is responsible. You need to test whether your firewall rules are working properly. What should you do?

Options:

A.  

Enable Firewall Rules Logging on the latest rules that were changed. Use Logs Explorer to analyze whether the rules are working correctly.

B.  

Connect to a bastion host in your VPC. Use a network traffic analyzer to determine at which point your requests are being blocked.

C.  

In a pre-production environment, disable all firewall rules individually to determine which one is blocking user traffic.

D.  

Enable VPC Flow Logs in your VPC. Use Logs Explorer to analyze whether the rules are working correctly.
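
For the option that enables Firewall Rules Logging, a sketch with placeholder rule and project names:

```shell
# Turn on logging for the recently changed rule.
gcloud compute firewall-rules update allow-https \
  --enable-logging

# Inspect rule hits in the firewall log stream.
gcloud logging read \
  'logName:"compute.googleapis.com%2Ffirewall" AND jsonPayload.rule_details.reference:"allow-https"' \
  --project=example-project --freshness=1h
```

Note that VPC Flow Logs (option D) sample connections but do not record firewall allow/deny decisions, which is why rule-level logging exists.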

Questions 22

Your organization uses Google Workspace Enterprise Edition for authentication. You are concerned about employees leaving their laptops unattended for extended periods of time after authenticating into Google Cloud. You must prevent malicious people from using an employee's unattended laptop to modify their environment.

What should you do?

Options:

A.  

Create a policy that requires employees to not leave their sessions open for long durations.

B.  

Review and disable unnecessary Google Cloud APIs.

C.  

Require strong passwords and 2-Step Verification (2SV) through a security token or Google Authenticator.

D.  

Set the session length timeout for Google Cloud services to a shorter duration.

Questions 23

Your organization develops software involved in many open source projects and is concerned about software supply chain threats. You need to deliver provenance for the build to demonstrate the software has not been tampered with.

What should you do?

Options:

A.  

1. Generate Supply Chain Levels for Software Artifacts (SLSA) level 3 assurance by using Cloud Build.
2. View the build provenance in the Security insights side panel within the Google Cloud console.

B.  

1. Review the software process.
2. Generate private and public key pairs, and use Pretty Good Privacy (PGP) protocols to sign the output software artifacts together with a file containing the address of your enterprise and point of contact.
3. Publish the PGP-signed attestation to your public web page.

C.  

1. Publish the software code on GitHub as open source.
2. Establish a bug bounty program, and encourage the open source community to review, report, and fix the vulnerabilities.

D.  

1. Hire an external auditor to review and provide provenance.
2. Define the scope and conditions.
3. Get support from the Security department or representative.
4. Publish the attestation to your public web page.

Questions 24

You work at a company in a regulated industry and are responsible for ongoing security of the Cloud environment. You need to prevent and detect misconfigurations in a particular folder based on specific compliance policies. You need to adhere to industry-specific compliance policies and policies that are internal to your company. What should you do?

Options:

A.  

Enable Assured Workloads on the folder level, with the specific control bundle appropriate for your industry's regulations.

B.  

Use Workload Manager with custom Rego policies to continuously scan the environment for misconfigurations on the folder level.

C.  

Create a Posture file by using custom and predefined Security Health Analytics (SHA) or organization policies. Enforce the posture on the folder level.

D.  

Create custom organization policies that follow specific business requirements. Enforce the policies on the folder level.

Questions 25

A company has been running their application on Compute Engine. A bug in the application allowed a malicious user to repeatedly execute a script that results in the Compute Engine instance crashing. Although the bug has been fixed, you want to get notified in case this hack recurs.

What should you do?

Options:

A.  

Create an Alerting Policy in Stackdriver using a Process Health condition, checking that the number of executions of the script remains below the desired threshold. Enable notifications.

B.  

Create an Alerting Policy in Stackdriver using the CPU usage metric. Set the threshold to 80% to be notified when the CPU usage goes above this 80%.

C.  

Log every execution of the script to Stackdriver Logging. Create a User-defined metric in Stackdriver Logging on the logs, and create a Stackdriver Dashboard displaying the metric.

D.  

Log every execution of the script to Stackdriver Logging. Configure BigQuery as a log sink, and create a BigQuery scheduled query to count the number of executions in a specific timeframe.

Questions 26

You discovered that sensitive personally identifiable information (PII) is being ingested to your Google Cloud environment in the daily ETL process from an on-premises environment to your BigQuery datasets. You need to redact this data to obfuscate the PII, but need to re-identify it for data analytics purposes. Which components should you use in your solution? (Choose two.)

Options:

A.  

Secret Manager

B.  

Cloud Key Management Service

C.  

Cloud Data Loss Prevention with cryptographic hashing

D.  

Cloud Data Loss Prevention with automatic text redaction

E.  

Cloud Data Loss Prevention with deterministic encryption using AES-SIV

Questions 27

You are onboarding new users into Cloud Identity and discover that some users have created consumer user accounts using the corporate domain name. How should you manage these consumer user accounts with Cloud Identity?

Options:

A.  

Use Google Cloud Directory Sync to convert the unmanaged user accounts.

B.  

Create a new managed user account for each consumer user account.

C.  

Use the transfer tool for unmanaged user accounts.

D.  

Configure single sign-on using a customer's third-party provider.

Questions 28

You are consulting with a client that requires end-to-end encryption of application data (including data in transit, data in use, and data at rest) within Google Cloud. Which options should you utilize to accomplish this? (Choose two.)

Options:

A.  

External Key Manager

B.  

Customer-supplied encryption keys

C.  

Hardware Security Module

D.  

Confidential Computing and Istio

E.  

Client-side encryption

Questions 29

Your application is deployed as a highly available cross-region solution behind a global external HTTP(S) load balancer. You notice significant spikes in traffic from multiple IP addresses, but it is unknown whether the IPs are malicious. You are concerned about your application's availability. You want to limit traffic from these clients over a specified time interval.

What should you do?

Options:

A.  

Configure a rate_based_ban action by using Google Cloud Armor and set the ban_duration_sec parameter to the specified time interval.

B.  

Configure a deny action by using Google Cloud Armor to deny the clients that issued too many requests over the specified time interval.

C.  

Configure a throttle action by using Google Cloud Armor to limit the number of requests per client over a specified time interval.

D.  

Configure a firewall rule in your VPC to throttle traffic from the identified IP addresses.
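
A Cloud Armor throttle rule of the kind option C describes can be sketched as follows; the policy name, priority, and thresholds are placeholders:

```shell
# Limit each client IP to 100 requests per 60 seconds on this policy;
# requests over the threshold receive HTTP 429 instead of being banned.
gcloud compute security-policies rules create 1000 \
  --security-policy=edge-policy \
  --src-ip-ranges="*" \
  --action=throttle \
  --rate-limit-threshold-count=100 \
  --rate-limit-threshold-interval-sec=60 \
  --conform-action=allow \
  --exceed-action=deny-429 \
  --enforce-on-key=IP
```

By contrast, a `rate_based_ban` action (option A) blocks all traffic from an offending client for `ban_duration_sec` once the threshold is crossed, rather than shaping it.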

Questions 30

Your organization is deploying a serverless web application on Cloud Run that must be publicly accessible over HTTPS. To meet security requirements, you need to terminate TLS at the edge, apply threat mitigation, and prepare for geo-based access restrictions. What should you do?

Options:

A.  

Make the Cloud Run service public by enabling allUsers access. Configure Identity-Aware Proxy (IAP) for authentication and IP-based access control. Use custom SSL certificates for HTTPS.

B.  

Assign a custom domain to the Cloud Run service. Enable HTTPS. Configure IAM to allow allUsers to invoke the service. Use firewall rules and VPC Service Controls for geo-based restriction and traffic filtering.

C.  

Deploy an external HTTP(S) load balancer with a serverless NEG that points to the Cloud Run service. Use a Google-managed certificate for TLS termination. Configure a Cloud Armor policy with geo-based access control.

D.  

Create a Cloud DNS public zone for the Cloud Run URL. Bind a static IP to the service. Use VPC firewall rules to restrict incoming traffic based on IP ranges and threat signatures.
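
The load-balancer-fronted architecture in option C hinges on a serverless network endpoint group (NEG). A sketch with placeholder names, plus an example geo rule of the kind Cloud Armor supports:

```shell
# Serverless NEG pointing the external HTTPS load balancer at Cloud Run.
gcloud compute network-endpoint-groups create run-neg \
  --region=us-central1 \
  --network-endpoint-type=serverless \
  --cloud-run-service=web-app

# Example geo-based rule on the backend's Cloud Armor policy
# (region code shown is an arbitrary illustration).
gcloud compute security-policies rules create 1000 \
  --security-policy=edge-policy \
  --expression="origin.region_code == 'XX'" \
  --action=deny-403
```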

Questions 31

You are working with protected health information (PHI) for an electronic health record system. The privacy officer is concerned that sensitive data is stored in the analytics system. You are tasked with anonymizing the sensitive data in a way that is not reversible. Also, the anonymized data should not preserve the character set and length. Which Google Cloud solution should you use?

Options:

A.  

Cloud Data Loss Prevention with deterministic encryption using AES-SIV

B.  

Cloud Data Loss Prevention with format-preserving encryption

C.  

Cloud Data Loss Prevention with cryptographic hashing

D.  

Cloud Data Loss Prevention with Cloud Key Management Service wrapped cryptographic keys

Questions 32

You have stored company approved compute images in a single Google Cloud project that is used as an image repository. This project is protected with VPC Service Controls and exists in the perimeter along with other projects in your organization. This lets other projects deploy images from the image repository project. A team requires deploying a third-party disk image that is stored in an external Google Cloud organization. You need to grant read access to the disk image so that it can be deployed into the perimeter.

What should you do?

Options:

A.  

1. Update the perimeter.
2. Configure the egressTo field to set identityType to any_identity.
3. Configure the egressFrom field to include the external Google Cloud project number as an allowed resource and the serviceName to compute.googleapis.com.

B.  

Allow the external project by using the organization policy constraints/compute.trustedImageProjects.

C.  

1. Update the perimeter.
2. Configure the egressTo field to include the external Google Cloud project number as an allowed resource and the serviceName to compute.googleapis.com.
3. Configure the egressFrom field to set identityType to any_identity.

D.  

1. Update the perimeter.
2. Configure the ingressFrom field to set identityType to any_identity.
3. Configure the ingressTo field to include the external Google Cloud project number as an allowed resource and the serviceName to compute.googleapis.com.

Questions 33

You are working with a client who plans to migrate their data to Google Cloud. You are responsible for recommending an encryption service to manage their encrypted keys. You have the following requirements:

    The master key must be rotated at least once every 45 days.

    The solution that stores the master key must be FIPS 140-2 Level 3 validated.

    The master key must be stored in multiple regions within the US for redundancy.

Which solution meets these requirements?

Options:

A.  

Customer-managed encryption keys with Cloud Key Management Service

B.  

Customer-managed encryption keys with Cloud HSM

C.  

Customer-supplied encryption keys

D.  

Google-managed encryption keys
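
A Cloud KMS key that satisfies the three stated requirements (45-day rotation, HSM protection, US multi-region) could be sketched like this; the key ring and key names, and the rotation start time, are placeholders:

```shell
# Key ring in the US multi-region for redundancy.
gcloud kms keyrings create master-kr --location=us

# HSM-protected key (Cloud HSM is FIPS 140-2 Level 3 validated)
# with an automatic 45-day rotation period.
gcloud kms keys create master-key \
  --keyring=master-kr \
  --location=us \
  --purpose=encryption \
  --protection-level=hsm \
  --rotation-period=45d \
  --next-rotation-time=2025-01-01T00:00:00Z
```

Software-protected Cloud KMS keys are validated only to FIPS 140-2 Level 1, which is the distinction the question turns on.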

Questions 34

Your organization has established a highly sensitive project within a VPC Service Controls perimeter. You need to ensure that only users meeting specific contextual requirements such as having a company-managed device, a specific location, and a valid user identity can access resources within this perimeter. You want to evaluate the impact of this change without blocking legitimate access. What should you do?

Options:

A.  

Configure a VPC Service Controls perimeter in dry run mode, and enforce strict network segmentation using firewall rules. Use multi-factor authentication (MFA) for user verification.

B.  

Use Cloud Audit Logs to monitor user access to the project resources. Use post-incident analysis to identify unauthorized access attempts.

C.  

Establish a Context-Aware Access policy that specifies the required contextual attributes, and associate the policy with the VPC Service Controls perimeter in dry run mode.

D.  

Use the VPC Service Control Violation dashboard to identify the impact of details about access denials by service perimeters.

Questions 35

Your organization's record data exists in Cloud Storage. You must retain all record data for at least seven years. This policy must be permanent.

What should you do?

Options:

A.  

1. Identify buckets with record data.
2. Apply a retention policy and set it to retain for seven years.
3. Monitor the buckets by using log-based alerts to ensure that no modifications to the retention policy occur.

B.  

1. Identify buckets with record data.
2. Apply a retention policy and set it to retain for seven years.
3. Remove any Identity and Access Management (IAM) roles that contain the storage.buckets.update permission.

C.  

1. Identify buckets with record data.
2. Enable Bucket Policy Only to ensure that data is retained.
3. Enable bucket lock.

D.  

1. Identify buckets with record data.
2. Apply a retention policy and set it to retain for seven years.
3. Enable bucket lock.
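
The retention-policy-plus-lock combination that these options reference can be sketched with gsutil; the bucket name is a placeholder:

```shell
# Set a seven-year retention policy on the bucket.
gsutil retention set 7y gs://records-bucket

# Lock the policy. Locking is irreversible: the retention period can
# never be reduced or removed afterward, which makes the policy permanent.
gsutil retention lock gs://records-bucket
```

The lock step is what distinguishes a permanent policy from one that merely relies on monitoring or IAM hygiene to stay in place.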

Discussion 0
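The options above turn on Cloud Storage retention policies and Bucket Lock; only locking makes a retention policy permanent and irreversible. A minimal sketch using gsutil, with a hypothetical bucket name:

```shell
# Hypothetical bucket name. Set a 7-year retention policy, then lock it.
gsutil retention set 7y gs://records-bucket

# Locking is permanent and cannot be undone; gsutil asks for confirmation.
gsutil retention lock gs://records-bucket
```

Once locked, the retention period can still be increased, but it can never be reduced or removed.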
Questions 36

Options:

A.  

Implement a Cloud Function that scans the environment variables multiple times a day, and creates a finding in Security Command Center if secrets are discovered.

B.  

Implement regular peer reviews to assess the environment variables and identify secrets in your Cloud Functions. Raise a security incident if secrets are discovered.

C.  

Use Sensitive Data Protection to scan the environment variables multiple times per day, and create a finding in Security Command Center if secrets are discovered.

D.  

Integrate dynamic application security testing into the CI/CD pipeline that scans the application code for the Cloud Functions. Fail the build process if secrets are discovered.

Discussion 0
Questions 37

Your organization is rolling out a new continuous integration and delivery (CI/CD) process to deploy infrastructure and applications in Google Cloud. Many teams will use their own instances of the CI/CD workflow. It will run on Google Kubernetes Engine (GKE). The CI/CD pipelines must be designed to securely access Google Cloud APIs.

What should you do?

Options:

A.  

1. Create a dedicated service account for the CI/CD pipelines. 2. Run the deployment pipelines in a dedicated node pool in the GKE cluster. 3. Use the service account that you created as the identity for the nodes in the pool to authenticate to the Google Cloud APIs.

B.  

1. Create service accounts for each deployment pipeline. 2. Generate private keys for the service accounts. 3. Securely store the private keys as Kubernetes Secrets accessible only by the pods that run the specific deployment pipeline.

C.  

1. Create individual service accounts for each deployment pipeline. 2. Add an identifier for the pipeline in the service account naming convention. 3. Ensure each pipeline runs on dedicated pods. 4. Use Workload Identity to map a deployment pipeline pod to a service account.

D.  

1. Create two service accounts: one for the infrastructure and one for the application deployment. 2. Use workload identities to let the pods run the two pipelines and authenticate with the service accounts. 3. Run the infrastructure and application pipelines in separate namespaces.

Discussion 0
Questions 38

You have an application where the frontend is deployed on a managed instance group in subnet A and the data layer is stored on a MySQL Compute Engine virtual machine (VM) in subnet B on the same VPC. Subnet A and subnet B hold several other Compute Engine VMs. You only want to allow the application frontend to access the data in the application's MySQL instance on port 3306.

What should you do?

Options:

A.  

Configure an ingress firewall rule that allows communication from the source IP range of subnet A to the tag "data-tag" that is applied to the MySQL Compute Engine VM on port 3306.

B.  

Configure an ingress firewall rule that allows communication from the frontend's unique service account to the unique service account of the MySQL Compute Engine VM on port 3306.

C.  

Configure a network tag "fe-tag" to be applied to all instances in subnet A and a network tag "data-tag" to be applied to all instances in subnet B. Then configure an egress firewall rule that allows communication from Compute Engine VMs tagged with data-tag to destination Compute Engine VMs tagged fe-tag.

D.  

Configure a network tag "fe-tag" to be applied to all instances in subnet A and a network tag "data-tag" to be applied to all instances in subnet B. Then configure an ingress firewall rule that allows communication from Compute Engine VMs tagged with fe-tag to destination Compute Engine VMs tagged with data-tag.

Discussion 0
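A tag-based ingress rule like the ones described in the options above could be sketched as follows; the network and tag names are hypothetical:

```shell
# Allow only VMs tagged fe-tag to reach VMs tagged data-tag on TCP 3306.
gcloud compute firewall-rules create allow-fe-to-mysql \
  --network=app-vpc \
  --direction=INGRESS \
  --action=ALLOW \
  --rules=tcp:3306 \
  --source-tags=fe-tag \
  --target-tags=data-tag
```

Because the default implied ingress rule denies all other traffic, no additional deny rule is needed for the remaining VMs in either subnet.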
Questions 39

You work for a financial organization in a highly regulated industry that is subject to active regulatory compliance. To meet compliance requirements, you need to continuously maintain a specific set of configurations, data residency, organizational policies, and personnel data access controls. What should you do?

Options:

A.  

Create an Assured Workloads folder for your required compliance program to apply defined controls and requirements.

B.  

Create a posture.yaml file with the required security compliance posture. Apply the posture with the gcloud scc postures create POSTURE_NAME --posture-from-file=posture.yaml command in Security Command Center Premium.

C.  

Apply an organizational policy constraint at the organization level to limit the location of new resource creation.

D.  

Go to the Compliance page in Security Command Center. View the report for your status against the required compliance standard. Triage violations to maintain compliance on a regular basis.

Discussion 0
Questions 40

You are a Cloud Identity administrator for your organization. In your Google Cloud environment, groups are used to manage user permissions. Each application team has a dedicated group. Your team is responsible for creating these groups, and the application teams can manage the team members on their own through the Google Cloud console. You must ensure that the application teams can only add users from within your organization to their groups.

What should you do?

Options:

A.  

Change the configuration of the relevant groups in the Google Workspace Admin console to prevent external users from being added to the group.

B.  

Set an Identity and Access Management (IAM) policy that includes a condition that restricts group membership to user principals that belong to your organization.

C.  

Define an Identity and Access Management (IAM) deny policy that denies the assignment of principals that are outside your organization to the groups in scope.

D.  

Export the Cloud Identity logs to BigQuery. Configure an alert for external members added to groups. Have the alert trigger a Cloud Function instance that removes the external members from the group.

Discussion 0
Questions 41

Your organization wants to be General Data Protection Regulation (GDPR) compliant. You want to ensure that your DevOps teams can only create Google Cloud resources in the Europe regions.

What should you do?

Options:

A.  

Use the org policy constraint "Restrict Resource Service Usage" on your Google Cloud organization node.

B.  

Use Identity and Access Management (IAM) custom roles to ensure that your DevOps team can only create resources in the Europe regions.

C.  

Use the org policy constraint "Google Cloud Platform - Resource Location Restriction" on your Google Cloud organization node.

D.  

Use Identity-Aware Proxy (IAP) with Access Context Manager to restrict the location of Google Cloud resources.

Discussion 0
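Enforcing the resource location constraint at the organization node can be sketched with the legacy org-policy commands; the organization ID is hypothetical, and flag shapes differ slightly in the newer gcloud org-policies surface:

```shell
# Restrict new resource creation to the EU location group.
gcloud resource-manager org-policies allow gcp.resourceLocations \
  in:eu-locations \
  --organization=123456789012
```

The in:eu-locations value group covers all EU regions, so new regions Google adds to the group are included automatically.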
Questions 42

You have been tasked with implementing external web application protection against common web application attacks for a public application on Google Cloud. You want to validate these policy changes before they are enforced. What service should you use?

Options:

A.  

Google Cloud Armor's preconfigured rules in preview mode

B.  

Prepopulated VPC firewall rules in monitor mode

C.  

The inherent protections of Google Front End (GFE)

D.  

Cloud Load Balancing firewall rules

E.  

VPC Service Controls in dry run mode

Discussion 0
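Preview mode evaluates a Google Cloud Armor rule and logs what it would have done without enforcing it, which matches the "validate before enforcing" requirement above. A sketch, with a hypothetical policy name and priority:

```shell
# Preconfigured XSS WAF rule in preview mode: matches are logged, not enforced.
gcloud compute security-policies rules create 1000 \
  --security-policy=web-app-policy \
  --expression="evaluatePreconfiguredExpr('xss-v33-stable')" \
  --action=deny-403 \
  --preview
```

After reviewing the logged matches, updating the rule without the preview flag turns on enforcement.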
Questions 43

While migrating your organization's infrastructure to GCP, a large number of users will need to access the GCP Console. The Identity Management team already has a well-established way to manage your users and wants to keep using your existing Active Directory or LDAP server along with the existing SSO password.

What should you do?

Options:

A.  

Manually synchronize the data in Google domain with your existing Active Directory or LDAP server.

B.  

Use Google Cloud Directory Sync to synchronize the data in Google domain with your existing Active Directory or LDAP server.

C.  

Users sign in directly to the GCP Console using the credentials from your on-premises Kerberos compliant identity provider.

D.  

Users sign in using OpenID (OIDC) compatible IdP, receive an authentication token, then use that token to log in to the GCP Console.

Discussion 0
Questions 44

Your team needs to obtain a unified log view of all development cloud projects in your SIEM. The development projects are under the NONPROD organization folder with the test and pre-production projects. The development projects share the ABC-BILLING billing account with the rest of the organization.

Which logging export strategy should you use to meet the requirements?

Options:

A.  

1. Export logs to a Cloud Pub/Sub topic with folders/NONPROD parent and includeChildren property set to True in a dedicated SIEM project. 2. Subscribe SIEM to the topic.

B.  

1. Create a Cloud Storage sink with billingAccounts/ABC-BILLING parent and includeChildren property set to False in a dedicated SIEM project. 2. Process Cloud Storage objects in SIEM.

C.  

1. Export logs in each dev project to a Cloud Pub/Sub topic in a dedicated SIEM project. 2. Subscribe SIEM to the topic.

D.  

1. Create a Cloud Storage sink with a publicly shared Cloud Storage bucket in each project. 2. Process Cloud Storage objects in SIEM.

Discussion 0
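An aggregated folder-level sink that also captures child-project logs, as option A describes, could be sketched like this; the sink, project, topic, and folder names are hypothetical:

```shell
# Folder-scoped sink that also exports logs from all child projects.
gcloud logging sinks create nonprod-siem-sink \
  pubsub.googleapis.com/projects/siem-project/topics/nonprod-logs \
  --folder=NONPROD_FOLDER_ID \
  --include-children
```

The command returns a sink writer identity that must then be granted the Pub/Sub Publisher role on the topic before logs flow.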
Questions 45

A company allows every employee to use Google Cloud Platform. Each department has a Google Group, with all department members as group members. If a department member creates a new project, all members of that department should automatically have read-only access to all new project resources. Members of any other department should not have access to the project. You need to configure this behavior.

What should you do to meet these requirements?

Options:

A.  

Create a Folder per department under the Organization. For each department’s Folder, assign the Project Viewer role to the Google Group related to that department.

B.  

Create a Folder per department under the Organization. For each department’s Folder, assign the Project Browser role to the Google Group related to that department.

C.  

Create a Project per department under the Organization. For each department’s Project, assign the Project Viewer role to the Google Group related to that department.

D.  

Create a Project per department under the Organization. For each department’s Project, assign the Project Browser role to the Google Group related to that department.

Discussion 0
Questions 46

Your organization wants to protect all workloads that run on Compute Engine VMs to ensure that the instances weren't compromised by boot-level or kernel-level malware. Also, you need to ensure that data in use on the VM cannot be read by the underlying host system, by using a hardware-based solution.

What should you do?

Options:

A.  

1. Use Google Shielded VM, including Secure Boot, Virtual Trusted Platform Module (vTPM), and integrity monitoring. 2. Create a Cloud Run function to check for the VM settings, generate metrics, and run the function regularly.

B.  

1. Activate Virtual Machine Threat Detection in Security Command Center (SCC) Premium. 2. Monitor the findings in SCC.

C.  

1. Use Google Shielded VM, including Secure Boot, Virtual Trusted Platform Module (vTPM), and integrity monitoring. 2. Activate Confidential Computing. 3. Enforce these actions by using organization policies.

D.  

1. Use secure hardened images from the Google Cloud Marketplace. 2. When deploying the images, activate the Confidential Computing option. 3. Enforce the use of the correct images and Confidential Computing by using organization policies.

Discussion 0
Questions 47

You work for an ecommerce company that stores sensitive customer data across multiple Google Cloud regions. The development team has built a new 3-tier application to process orders and must integrate the application into the production environment. You must design the network architecture to ensure strong security boundaries and isolation for the new application, facilitate secure remote maintenance by authorized third-party vendors, and follow the principle of least privilege. What should you do?

Options:

A.  

Create separate VPC networks for each tier. Use VPC peering between application tiers and other required VPCs. Provide vendors with SSH keys and root access only to the instances within the VPC for maintenance purposes.

B.  

Create a single VPC network and create different subnets for each tier. Create a new Google project specifically for the third-party vendors and grant the network admin role to the vendors. Deploy a VPN appliance and rely on the vendors' configurations to secure third-party access.

C.  

Create separate VPC networks for each tier. Use VPC peering between application tiers and other required VPCs. Enable Identity-Aware Proxy (IAP) for remote access to management resources, limiting access to authorized vendors.

D.  

Create a single VPC network and create different subnets for each tier. Create a new Google project specifically for the third-party vendors. Grant the vendors ownership of that project and the ability to modify the Shared VPC configuration.

Discussion 0
Questions 48

You want to update your existing VPC Service Controls perimeter with a new access level. You need to avoid breaking the existing perimeter with this change, and ensure the least disruptions to users while minimizing overhead. What should you do?

Options:

A.  

Create an exact replica of your existing perimeter. Add your new access level to the replica. Update the original perimeter after the access level has been vetted.

B.  

Update your perimeter with a new access level that never matches. Update the new access level to match your desired state one condition at a time to avoid being overly permissive.

C.  

Enable the dry run mode on your perimeter. Add your new access level to the perimeter configuration. Update the perimeter configuration after the access level has been vetted.

D.  

Enable the dry run mode on your perimeter. Add your new access level to the perimeter dry run configuration. Update the perimeter configuration after the access level has been vetted.

Discussion 0
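A dry-run configuration change of the kind option D describes might look like this; the perimeter name, policy ID, and access level name are hypothetical, and exact flags vary by gcloud version:

```shell
# Stage the new access level in the perimeter's dry-run config only.
gcloud access-context-manager perimeters dry-run update my-perimeter \
  --policy=1234567890 \
  --add-access-levels=new_device_level

# After vetting the dry-run violation logs, promote the config to enforcement.
gcloud access-context-manager perimeters dry-run enforce my-perimeter \
  --policy=1234567890
```

While the dry-run config is staged, would-be denials appear in audit logs as dry-run violations without blocking any requests.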
Questions 49

Your customer has an on-premises Public Key Infrastructure (PKI) with a certificate authority (CA). You need to issue certificates for many HTTP load balancer frontends. The on-premises PKI should be minimally affected due to many manual processes, and the solution needs to scale.

What should you do?

Options:

A.  

Use Certificate Manager to issue Google-managed public certificates, and configure it at the HTTP load balancers in your infrastructure as code (IaC).

B.  

Use Certificate Manager to import certificates issued from the on-premises PKI for the frontends. Leverage the gcloud tool for importing.

C.  

Use a subordinate CA in the Google Certificate Authority Service from the on-premises PKI system to issue certificates for the load balancers.

D.  

Use the web applications with PKCS12 certificates issued from a subordinate CA based on OpenSSL on-premises. Use the gcloud tool for importing. Use the external TCP/UDP network load balancer instead of an external HTTP load balancer.

Discussion 0
Questions 50

Your organization uses Google Workspace as the primary identity provider for Google Cloud. Users in your organization initially created their passwords. You need to improve password security due to a recent security event. What should you do?

Options:

A.  

Audit user activity for suspicious logins by using the audit and investigation tool.

B.  

Conduct a security awareness training session, and set the password expiration settings to require more frequent updates.

C.  

Check the Enforce strong password box, and set the password expiration to occur more frequently.

D.  

Check the Enforce strong password box, and check Enforce password policy at the next sign-in.

Discussion 0
Questions 51

Your organization's application is being integrated with a partner application that requires read access to customer data to process customer orders. The customer data is stored in one of your Cloud Storage buckets. You have evaluated different options and determined that this activity requires the use of service account keys. You must advise the partner on how to minimize the risk of a compromised service account key causing a loss of data. What should you advise the partner to do?

Options:

A.  

Define a VPC Service Controls perimeter, and restrict the Cloud Storage API. Add an ingress rule to the perimeter to allow access to the Cloud Storage API for the service account from outside of the perimeter.

B.  

Scan the Cloud Storage bucket with Sensitive Data Protection when new data is added, and automatically mask all customer data.

C.  

Ensure that all data for the application that is accessed through the relevant service accounts is encrypted at rest by using customer-managed encryption keys (CMEK).

D.  

Implement a secret management service. Configure the service to frequently rotate the service account key. Configure proper access control to the key, and restrict who can create service account keys.

Discussion 0
Questions 52

Your organization processes sensitive health information. You want to ensure that data is encrypted while in use by the virtual machines (VMs). You must create a policy that is enforced across the entire organization.

What should you do?

Options:

A.  

Implement an organization policy that ensures that all VM resources created across your organization use customer-managed encryption keys (CMEK) protection.

B.  

Implement an organization policy that ensures all VM resources created across your organization are Confidential VM instances.

C.  

Implement an organization policy that ensures that all VM resources created across your organization use Cloud External Key Manager (EKM) protection.

D.  

No action is necessary because Google encrypts data while it is in use by default.

Discussion 0
Questions 53

You are migrating an application into the cloud. The application will need to read data from a Cloud Storage bucket. Due to local regulatory requirements, you need to hold the key material used for encryption fully under your control, and you require a valid rationale for accessing the key material.

What should you do?

Options:

A.  

Encrypt the data in the Cloud Storage bucket by using customer-managed encryption keys. Configure an IAM deny policy for unauthorized groups.

B.  

Encrypt the data in the Cloud Storage bucket by using Customer Managed Encryption Keys backed by a Cloud Hardware Security Module (HSM). Enable data access logs.

C.  

Generate a key in your on-premises environment and store it in a Hardware Security Module (HSM) that is managed on-premises. Use this key as an external key in the Cloud Key Management Service (KMS). Activate Key Access Justifications (KAJ) and set the external key system to reject unauthorized accesses.

D.  

Generate a key in your on-premises environment to encrypt the data before you upload the data to the Cloud Storage bucket. Upload the key to the Cloud Key Management Service (KMS). Activate Key Access Justifications (KAJ) and have the external key system reject unauthorized accesses.

Discussion 0
Questions 54

Your organization's customers must scan and upload the contract and their driver license into a web portal in Cloud Storage. You must remove all personally identifiable information (PII) from files that are older than 12 months. Also, you must archive the anonymized files for retention purposes.

What should you do?

Options:

A.  

Set a time to live (TTL) of 12 months for the files in the Cloud Storage bucket that removes PII and moves the files to the archive storage class.

B.  

Create a Cloud Data Loss Prevention (DLP) inspection job that de-identifies PII in files created more than 12 months ago and archives them to another Cloud Storage bucket. Delete the original files.

C.  

Schedule a Cloud Key Management Service (KMS) rotation period of 12 months for the encryption keys of the Cloud Storage files containing PII to de-identify them. Delete the original keys.

D.  

Configure the Autoclass feature of the Cloud Storage bucket to de-identify PII. Archive the files that are older than 12 months. Delete the original files.

Discussion 0
Questions 55

Your company is storing sensitive data in Cloud Storage. You want a key generated on-premises to be used in the encryption process.

What should you do?

Options:

A.  

Use the Cloud Key Management Service to manage a data encryption key (DEK).

B.  

Use the Cloud Key Management Service to manage a key encryption key (KEK).

C.  

Use customer-supplied encryption keys to manage the data encryption key (DEK).

D.  

Use customer-supplied encryption keys to manage the key encryption key (KEK).

Discussion 0
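With customer-supplied encryption keys, a key generated on-premises is passed with each request and used to wrap the Google-generated data encryption key; Google does not persist the supplied key. A sketch of a CSEK upload with gsutil, using a hypothetical bucket name and a base64-encoded 256-bit AES key you generate yourself:

```shell
# Generate a 256-bit key on-premises and base64-encode it.
KEY=$(head -c 32 /dev/urandom | base64)

# Supply the key as a customer-supplied encryption key for this operation.
gsutil -o "GSUtil:encryption_key=${KEY}" \
  cp ./sensitive.csv gs://my-sensitive-bucket/
```

The same key must be supplied again on every read of the object; losing it makes the data unrecoverable.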
Questions 56

A database administrator notices malicious activities within their Cloud SQL instance. The database administrator wants to monitor the API calls that read the configuration or metadata of resources. Which logs should the database administrator review?

Options:

A.  

Admin Activity

B.  

System Event

C.  

Access Transparency

D.  

Data Access

Discussion 0
Questions 57

An organization's security and risk management teams are concerned about where their responsibility lies for certain production workloads they are running in Google Cloud Platform (GCP), and where Google's responsibility lies. They are mostly running workloads using Google Cloud's Platform-as-a-Service (PaaS) offerings, including App Engine primarily.

Which one of these areas in the technology stack would they need to focus on as their primary responsibility when using App Engine?

Options:

A.  

Configuring and monitoring VPC Flow Logs

B.  

Defending against XSS and SQLi attacks

C.  

Manage the latest updates and security patches for the Guest OS

D.  

Encrypting all stored data

Discussion 0
Questions 58

In order to meet PCI DSS requirements, a customer wants to ensure that all outbound traffic is authorized.

Which two cloud offerings meet this requirement without additional compensating controls? (Choose two.)

Options:

A.  

App Engine

B.  

Cloud Functions

C.  

Compute Engine

D.  

Google Kubernetes Engine

E.  

Cloud Storage

Discussion 0
Questions 59

An organization receives an increasing number of phishing emails.

Which method should be used to protect employee credentials in this situation?

Options:

A.  

Multifactor Authentication

B.  

A strict password policy

C.  

Captcha on login pages

D.  

Encrypted emails

Discussion 0
Questions 60

You are on your company's development team. You noticed that your web application hosted in staging on GKE dynamically includes user data in web pages without first properly validating the inputted data. This could allow an attacker to execute malicious scripts and display arbitrary content in a victim user's browser in a production environment.

How should you prevent and fix this vulnerability?

Options:

A.  

Use Cloud IAP based on IP address or end-user device attributes to prevent and fix the vulnerability.

B.  

Set up an HTTPS load balancer, and then use Cloud Armor for the production environment to prevent the potential XSS attack.

C.  

Use Web Security Scanner to validate the usage of an outdated library in the code, and then use a secured version of the included library.

D.  

Use Web Security Scanner in staging to simulate an XSS injection attack, and then use a templating system that supports contextual auto-escaping.

Discussion 0
Questions 61

You are the security admin of your company. Your development team creates multiple GCP projects under the "implementation" folder for several dev, staging, and production workloads. You want to prevent data exfiltration by malicious insiders or compromised code by setting up a security perimeter. However, you do not want to restrict communication between the projects.

What should you do?

Options:

A.  

Use a Shared VPC to enable communication between all projects, and use firewall rules to prevent data exfiltration.

B.  

Create access levels in Access Context Manager to prevent data exfiltration, and use a shared VPC for communication between projects.

C.  

Use an infrastructure-as-code software tool to set up a single service perimeter and to deploy a Cloud Function that monitors the "implementation" folder via Stackdriver and Cloud Pub/Sub. When the function notices that a new project is added to the folder, it executes Terraform to add the new project to the associated perimeter.

D.  

Use an infrastructure-as-code software tool to set up three different service perimeters for dev, staging, and prod and to deploy a Cloud Function that monitors the "implementation" folder via Stackdriver and Cloud Pub/Sub. When the function notices that a new project is added to the folder, it executes Terraform to add the new project to the respective perimeter.

Discussion 0
Questions 62

A customer wants to move their sensitive workloads to a Compute Engine-based cluster using Managed Instance Groups (MIGs). The jobs are bursty and must be completed quickly. They have a requirement to be able to manage and rotate the encryption keys.

Which boot disk encryption solution should you use on the cluster to meet this customer’s requirements?

Options:

A.  

Customer-supplied encryption keys (CSEK)

B.  

Customer-managed encryption keys (CMEK) using Cloud Key Management Service (KMS)

C.  

Encryption by default

D.  

Pre-encrypting files before transferring to Google Cloud Platform (GCP) for analysis

Discussion 0
Questions 63

A company is backing up application logs to a Cloud Storage bucket shared with both analysts and the administrator. Analysts should only have access to logs that do not contain any personally identifiable information (PII). Log files containing PII should be stored in another bucket that is only accessible by the administrator.

What should you do?

Options:

A.  

Use Cloud Pub/Sub and Cloud Functions to trigger a Data Loss Prevention scan every time a file is uploaded to the shared bucket. If the scan detects PII, have the function move into a Cloud Storage bucket only accessible by the administrator.

B.  

Upload the logs to both the shared bucket and the bucket only accessible by the administrator. Create a job trigger using the Cloud Data Loss Prevention API. Configure the trigger to delete any files from the shared bucket that contain PII.

C.  

On the bucket shared with both the analysts and the administrator, configure Object Lifecycle Management to delete objects that contain any PII.

D.  

On the bucket shared with both the analysts and the administrator, configure a Cloud Storage Trigger that is only triggered when PII data is uploaded. Use Cloud Functions to capture the trigger and delete such files.

Discussion 0
Questions 64

Your company is developing a new application for your organization. The application consists of two Cloud Run services, service A and service B. Service A provides a web-based user front-end. Service B provides back-end services that are called by service A. You need to set up identity and access management for the application. Your solution should follow the principle of least privilege. What should you do?

Options:

A.  

Create a new service account with the permissions to run service A and service B. Require authentication for service B. Permit only the new service account to call the backend.

B.  

Create two separate service accounts. Grant one service account the permissions to execute service A, and grant the other service account the permissions to execute service B. Require authentication for service B. Permit only the service account for service A to call the back-end.

C.  

Use the Compute Engine default service account to run service A and service B. Require authentication for service B. Permit only the default service account to call the backend.

D.  

Create three separate service accounts. Grant one service account the permissions to execute service A. Grant the second service account the permissions to run service B. Grant the third service account the permissions to communicate between both services A and B. Require authentication for service B. Call the back-end by authenticating with a service account key for the third service account.

Discussion 0
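Restricting a Cloud Run back-end to a single caller identity, as the options above describe, can be sketched as follows; the service, region, account, and project names are hypothetical:

```shell
# Allow only service A's dedicated service account to invoke service B.
gcloud run services add-iam-policy-binding service-b \
  --region=us-central1 \
  --member=serviceAccount:service-a-sa@my-project.iam.gserviceaccount.com \
  --role=roles/run.invoker
```

With authentication required on service B, any caller lacking the run.invoker role is rejected before the request reaches the container.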
Questions 65

Your organization’s Google Cloud VMs are deployed via an instance template that configures them with a public IP address in order to host web services for external users. The VMs reside in a service project that is attached to a host (VPC) project containing one custom Shared VPC for the VMs. You have been asked to reduce the exposure of the VMs to the internet while continuing to service external users. You have already recreated the instance template without a public IP address configuration to launch the managed instance group (MIG). What should you do?

Options:

A.  

Deploy a Cloud NAT Gateway in the service project for the MIG.

B.  

Deploy a Cloud NAT Gateway in the host (VPC) project for the MIG.

C.  

Deploy an external HTTP(S) load balancer in the service project with the MIG as a backend.

D.  

Deploy an external HTTP(S) load balancer in the host (VPC) project with the MIG as a backend.

Discussion 0
Questions 66

Your company is moving to Google Cloud. You plan to sync your users first by using Google Cloud Directory Sync (GCDS). Some employees have already created Google Cloud accounts by using their company email addresses that were created outside of GCDS. You must create your users on Cloud Identity.

What should you do?

Options:

A.  

Configure GCDS and use GCDS search rules to sync these users.

B.  

Use the transfer tool to migrate unmanaged users.

C.  

Write a custom script to identify existing Google Cloud users and call the Admin SDK Directory API to transfer their account.

D.  

Configure GCDS and use GCDS exclusion rules to ensure users are not suspended.

Discussion 0
Questions 67

An organization is evaluating the use of Google Cloud Platform (GCP) for certain IT workloads. A well-established directory service is used to manage user identities and lifecycle management. This directory service must continue to serve as the organization's "source of truth" directory for identities.

Which solution meets the organization's requirements?

Options:

A.  

Google Cloud Directory Sync (GCDS)

B.  

Cloud Identity

C.  

Security Assertion Markup Language (SAML)

D.  

Pub/Sub

Discussion 0
Questions 68

Your company hosts a critical web application on Google Cloud. The application is experiencing an increasing number of sophisticated layer 7 attacks, including cross-site scripting (XSS) and SQL injection attempts. You need to protect the application from these attacks while minimizing the impact on legitimate traffic and ensuring high availability. What should you do?

Options:

A.  

Enable Google Cloud Armor's pre-configured WAF rules for OWASP Top 10 vulnerabilities at the backend service.

B.  

Implement a load balancer in front of the web application instances, and enable Adaptive Protection and throttling to mitigate the occurrence of these malicious requests.

C.  

Configure Cloud Next Generation Firewall to block known malicious IP addresses targeting /32 addresses.

D.  

Configure a Cloud Armor security policy with customized and pre-configured WAF rules for OWASP Top 10 vulnerabilities at the load balancer.

Discussion 0
Questions 69

Which Google Cloud service should you use to enforce access control policies for applications and resources?

Options:

A.  

Identity-Aware Proxy

B.  

Cloud NAT

C.  

Google Cloud Armor

D.  

Shielded VMs

Discussion 0
Questions 70

You are developing a new application that uses exclusively Compute Engine VMs. Once a day, this application will execute five different batch jobs. Each of the batch jobs requires a dedicated set of permissions on Google Cloud resources outside of your application. You need to design a secure access concept for the batch jobs that adheres to the least-privilege principle.

What should you do?

Options:

A.  

1. Create a general service account "g-sa" to execute the batch jobs. 2. Grant the permissions required to execute the batch jobs to g-sa. 3. Execute the batch jobs with the permissions granted to g-sa.

B.  

1. Create a general service account "g-sa" to orchestrate the batch jobs. 2. Create one service account per batch job "b-sa-[1-5]", and grant only the permissions required to run the individual batch jobs to the service accounts. 3. Grant the Service Account Token Creator role to g-sa. Use g-sa to obtain short-lived access tokens for b-sa-[1-5] and to execute the batch jobs with the permissions of b-sa-[1-5].

C.  

1. Create a workload identity pool and configure workload identity pool providers for each batch job. 2. Assign the workload identity user role to each of the identities configured in the providers. 3. Create one service account per batch job "b-sa-[1-5]", and grant only the permissions required to run the individual batch jobs to the service accounts. 4. Generate credential configuration files for each of the providers. Use these files to execute the batch jobs with the permissions of b-sa-[1-5].

D.  

1. Create a general service account "g-sa" to orchestrate the batch jobs. 2. Create one service account per batch job "b-sa-[1-5]". Grant only the permissions required to run the individual batch jobs to the service accounts, and generate service account keys for each of these service accounts. 3. Store the service account keys in Secret Manager. Grant g-sa access to Secret Manager, and run the batch jobs with the permissions of b-sa-[1-5].

Discussion 0
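The impersonation pattern in option B can be sketched with the gcloud CLI. The project and service account names below are placeholders, and fetching a token requires real credentials, so this sketch only assembles the command rather than invoking it.

```shell
#!/usr/bin/env bash
# Sketch of short-lived service account impersonation (assumed names:
# orchestrator g-sa, per-job accounts b-sa-[1-5], project my-project).
# g-sa needs roles/iam.serviceAccountTokenCreator on each b-sa-N.
PROJECT="my-project"                                # placeholder project ID
JOB_SA="b-sa-1@${PROJECT}.iam.gserviceaccount.com"  # placeholder SA name

# Running as g-sa, the orchestrator would fetch a short-lived token with:
CMD="gcloud auth print-access-token --impersonate-service-account=${JOB_SA}"
echo "${CMD}"
```

Because each token is short-lived and scoped to a single b-sa-N, every batch job runs with only its own permissions, which is what makes this option least-privilege.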
Questions 71

You are implementing a new web application on Google Cloud that will be accessed from your on-premises network. To provide protection from threats like malware, you must implement transport layer security (TLS) interception for incoming traffic to your application. What should you do?​

Options:

A.  

Configure Secure Web Proxy. Offload the TLS traffic in the load balancer, inspect the traffic, and forward the traffic to the web application.​

B.  

Configure an internal proxy load balancer. Offload the TLS traffic in the load balancer, inspect the traffic, and forward the traffic to the web application.​

C.  

Configure a hierarchical firewall policy. Enable TLS interception by using Cloud Next Generation Firewall (NGFW) Enterprise.​

D.  

Configure a VPC firewall rule. Enable TLS interception by using Cloud Next Generation Firewall (NGFW) Enterprise.​

Discussion 0
Questions 72

Your security team wants to reduce the risk of user-managed keys being mismanaged and compromised. To achieve this, you need to prevent developers from creating user-managed service account keys for projects in their organization. How should you enforce this?

Options:

A.  

Configure Secret Manager to manage service account keys.

B.  

Enable an organization policy to disable service accounts from being created.

C.  

Enable an organization policy to prevent service account keys from being created.

D.  

Remove the iam.serviceAccounts.getAccessToken permission from users.

Discussion 0
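Option C corresponds to the iam.disableServiceAccountKeyCreation constraint. A minimal sketch of the policy file follows; ORG_ID is a placeholder, and actually applying it requires the gcloud CLI with Organization Policy Administrator rights.

```shell
# Write an org policy that blocks user-managed service account key
# creation organization-wide (ORG_ID is a placeholder).
cat > /tmp/disable-sa-keys.yaml <<'EOF'
name: organizations/ORG_ID/policies/iam.disableServiceAccountKeyCreation
spec:
  rules:
  - enforce: true
EOF
# It would then be applied with:
#   gcloud org-policies set-policy /tmp/disable-sa-keys.yaml
cat /tmp/disable-sa-keys.yaml
```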
Questions 73

Your company must follow industry-specific regulations. Therefore, you need to enforce customer-managed encryption keys (CMEK) for all new Cloud Storage resources in the organization called org1.

What command should you execute?

Options:

A.  

• organization policy: constraints/gcp.restrictStorageNonCmekServices • binding at: org1 • policy type: deny • policy value: storage.googleapis.com

B.  

• organization policy: constraints/gcp.restrictNonCmekServices • binding at: org1 • policy type: deny • policy value: storage.googleapis.com

C.  

• organization policy: constraints/gcp.restrictStorageNonCmekServices • binding at: org1 • policy type: allow • policy value: all supported services

D.  

• organization policy: constraints/gcp.restrictNonCmekServices • binding at: org1 • policy type: allow • policy value: storage.googleapis.com

Discussion 0
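A deny policy on gcp.restrictNonCmekServices with storage.googleapis.com as a denied value, as described in the options, can be sketched as a policy file; ORG_ID is a placeholder.

```shell
# Deny non-CMEK resource creation for Cloud Storage under the org
# (ORG_ID is a placeholder).
cat > /tmp/restrict-non-cmek.yaml <<'EOF'
name: organizations/ORG_ID/policies/gcp.restrictNonCmekServices
spec:
  rules:
  - values:
      deniedValues:
      - storage.googleapis.com
EOF
# Would be applied with:
#   gcloud org-policies set-policy /tmp/restrict-non-cmek.yaml
cat /tmp/restrict-non-cmek.yaml
```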
Questions 74

A security audit uncovered several inconsistencies in your project’s Identity and Access Management (IAM) configuration. Some service accounts have overly permissive roles, and a few external collaborators have more access than necessary. You need to gain detailed visibility into changes to IAM policies, user activity, service account behavior, and access to sensitive projects. What should you do?

Options:

A.  

Deploy the OS Config Management agent to your VMs. Use OS Config Management to create patch management jobs and monitor system modifications.

B.  

Enable the metrics explorer in Cloud Monitoring to follow the service account authentication events and build alerts linked on it.

C.  

Use Cloud Audit Logs. Create log export sinks to send these logs to a security information and event management (SIEM) solution for correlation with other event sources.

D.  

Configure Google Cloud Functions to be triggered by changes to IAM policies. Analyze changes by using the policy simulator, send alerts upon risky modifications, and store event details.

Discussion 0
Questions 75

Your organization deploys a large number of containerized applications on Google Kubernetes Engine (GKE). Node updates are currently applied manually. Audit findings show that a critical patch has not been installed due to a missed notification. You need to design a more reliable, cloud-first, and scalable process for node updates. What should you do?​

Options:

A.  

Migrate the cluster infrastructure to a self-managed Kubernetes environment for greater control over the patching process.​

B.  

Develop a custom script to continuously check for patch availability, download patches, and apply the patches across all components of the cluster.​

C.  

Schedule a daily reboot for all nodes to automatically upgrade.​

D.  

Configure node auto-upgrades for node pools in the maintenance windows.​

Discussion 0
Questions 76

Applications often require access to “secrets” - small pieces of sensitive data at build or run time. The administrator managing these secrets on GCP wants to keep a track of “who did what, where, and when?” within their GCP projects.

Which two log streams would provide the information that the administrator is looking for? (Choose two.)

Options:

A.  

Admin Activity logs

B.  

System Event logs

C.  

Data Access logs

D.  

VPC Flow logs

E.  

Agent logs

Discussion 0
Questions 77

Your team needs to make sure that a Compute Engine instance does not have access to the internet or to any Google APIs or services.

Which two settings must remain disabled to meet these requirements? (Choose two.)

Options:

A.  

Public IP

B.  

IP Forwarding

C.  

Private Google Access

D.  

Static routes

E.  

IAM Network User Role

Discussion 0
Questions 78

Your company wants to determine what products they can build to help customers improve their credit scores depending on their age range. To achieve this, you need to join user information in the company's banking app with customers' credit score data received from a third party. While using this raw data will allow you to complete this task, it exposes sensitive data, which could be propagated into new systems.

This risk needs to be addressed using de-identification and tokenization with Cloud Data Loss Prevention while maintaining the referential integrity across the database. Which cryptographic token format should you use to meet these requirements?

Options:

A.  

Deterministic encryption

B.  

Secure, key-based hashes

C.  

Format-preserving encryption

D.  

Cryptographic hashing

Discussion 0
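Deterministic encryption preserves referential integrity because the same plaintext always yields the same token, so joins across tables still match after de-identification. Below is a minimal sketch of a Cloud DLP de-identify request body using CryptoDeterministicConfig; the KMS key name, wrapped key, and surrogate info type name are placeholders.

```shell
# Build (but do not send) a DLP deidentify config that tokenizes values
# deterministically, keeping joins intact across datasets.
cat > /tmp/dlp-deid.json <<'EOF'
{
  "deidentifyConfig": {
    "infoTypeTransformations": {
      "transformations": [{
        "primitiveTransformation": {
          "cryptoDeterministicConfig": {
            "cryptoKey": {
              "kmsWrapped": {
                "wrappedKey": "BASE64_WRAPPED_KEY",
                "cryptoKeyName": "projects/p/locations/l/keyRings/r/cryptoKeys/k"
              }
            },
            "surrogateInfoType": {"name": "CUSTOMER_ID_TOKEN"}
          }
        }
      }]
    }
  }
}
EOF
cat /tmp/dlp-deid.json
```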
Questions 79

You are a member of your company's security team. You have been asked to reduce your Linux bastion host external attack surface by removing all public IP addresses. Site Reliability Engineers (SREs) require access to the bastion host from public locations so they can access the internal VPC while off-site. How should you enable this access?

Options:

A.  

Implement Cloud VPN for the region where the bastion host lives.

B.  

Implement OS Login with 2-step verification for the bastion host.

C.  

Implement Identity-Aware Proxy TCP forwarding for the bastion host.

D.  

Implement Google Cloud Armor in front of the bastion host.

Discussion 0
Questions 80

You want to limit the images that can be used as the source for boot disks. These images will be stored in a dedicated project.

What should you do?

Options:

A.  

Use the Organization Policy Service to create a compute.trustedImageProjects constraint on the organization level. List the trusted project as the whitelist in an allow operation.

B.  

Use the Organization Policy Service to create a compute.trustedImageProjects constraint on the organization level. List the trusted projects as the exceptions in a deny operation.

C.  

In Resource Manager, edit the project permissions for the trusted project. Add the organization as member with the role: Compute Image User.

D.  

In Resource Manager, edit the organization permissions. Add the project ID as member with the role: Compute Image User.

Discussion 0
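The allow-operation variant (option A) can be sketched as an org policy file; ORG_ID and image-project are placeholders.

```shell
# Only allow boot-disk images sourced from the dedicated image project
# (ORG_ID and image-project are placeholders).
cat > /tmp/trusted-images.yaml <<'EOF'
name: organizations/ORG_ID/policies/compute.trustedImageProjects
spec:
  rules:
  - values:
      allowedValues:
      - projects/image-project
EOF
# Would be applied with:
#   gcloud org-policies set-policy /tmp/trusted-images.yaml
cat /tmp/trusted-images.yaml
```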
Questions 81

A customer is running an analytics workload on Google Cloud Platform (GCP) where Compute Engine instances are accessing data stored on Cloud Storage. Your team wants to make sure that this workload will not be able to access, or be accessed from, the internet.

Which two strategies should your team use to meet these requirements? (Choose two.)

Options:

A.  

Configure Private Google Access on the Compute Engine subnet

B.  

Avoid assigning public IP addresses to the Compute Engine cluster.

C.  

Make sure that the Compute Engine cluster is running on a separate subnet.

D.  

Turn off IP forwarding on the Compute Engine instances in the cluster.

E.  

Configure a Cloud NAT gateway.

Discussion 0
Questions 82

Your company wants to deploy 2-step verification (2SV). The organizational unit (OU) structure of your company is divided into four departmental units: Human Resources, Finance, Engineering, and Marketing. You need to prevent many access issues from occurring at the same time. Your solution should minimize complexity in management and configuration. What should you do?

Options:

A.  

Create a single new OU to configure enforcement of 2SV to certain users but not others.

B.  

Create configuration groups, and enable a phased migration to control the number of individuals in which to enforce 2SV.

C.  

In the Admin console, for each OU, check the checkbox to Allow users to turn on 2-Step Verification and set Enforcement to Off.

D.  

In the Admin console, for each OU, uncheck the checkbox to Allow users to turn on 2-Step Verification and set Enforcement to On.

Discussion 0
Questions 83

Your organization uses the top-tier folder to separate application environments (prod and dev). The developers need to see all application development audit logs, but they are not permitted to review production logs. Your security team can review all logs in production and development environments. You must grant Identity and Access Management (IAM) roles at the right resource level for the developers and the security team while ensuring least privilege.

What should you do?

Options:

A.  

• 1. Grant the logging.viewer role to the security team at the organization resource level. • 2. Grant the logging.viewer role to the developer team at the folder resource level that contains all the dev projects.

B.  

• 1. Grant the logging.viewer role to the security team at the organization resource level. • 2. Grant the logging.admin role to the developer team at the organization resource level.

C.  

• 1. Grant the logging.admin role to the security team at the organization resource level. • 2. Grant the logging.viewer role to the developer team at the folder resource level that contains all the dev projects.

D.  

• 1. Grant the logging.admin role to the security team at the organization resource level. • 2. Grant the logging.admin role to the developer team at the organization resource level.

Discussion 0
Questions 84

Your organization operates in a highly regulated industry and uses multiple Google Cloud services. You need to identify potential risks to regulatory compliance. Which situation introduces the greatest risk?

Options:

A.  

Principals have broad IAM roles allowing the creation and management of Compute Engine VMs without a pre-defined hardening process.

B.  

Sensitive data is stored in a Cloud Storage bucket with the uniform bucket-level access setting enabled.

C.  

The security team mandates the use of customer-managed encryption keys (CMEK) for all data classified as sensitive.

D.  

The audit team needs access to Cloud Audit Logs related to managed services like BigQuery.

Discussion 0
Questions 85

Your team wants to make sure Compute Engine instances running in your production project do not have public IP addresses. The frontend application Compute Engine instances will require public IPs. The product engineers have the Editor role to modify resources. Your team wants to enforce this requirement.

How should your team meet these requirements?

Options:

A.  

Enable Private Access on the VPC network in the production project.

B.  

Remove the Editor role and grant the Compute Admin IAM role to the engineers.

C.  

Set up an organization policy to only permit public IPs for the front-end Compute Engine instances.

D.  

Set up a VPC network with two subnets: one with public IPs and one without public IPs.

Discussion 0
Questions 86

In an effort for your company messaging app to comply with FIPS 140-2, a decision was made to use GCP compute and network services. The messaging app architecture includes a Managed Instance Group (MIG) that controls a cluster of Compute Engine instances. The instances use Local SSDs for data caching and UDP for instance-to-instance communications. The app development team is willing to make any changes necessary to comply with the standard.

Which options should you recommend to meet the requirements?

Options:

A.  

Encrypt all cache storage and VM-to-VM communication using the BoringCrypto module.

B.  

Set Disk Encryption on the Instance Template used by the MIG to customer-managed key and use BoringSSL for all data transit between instances.

C.  

Change the app instance-to-instance communications from UDP to TCP and enable BoringSSL on clients' TLS connections.

D.  

Set Disk Encryption on the Instance Template used by the MIG to Google-managed Key and use BoringSSL library on all instance-to-instance communications.

Discussion 0
Questions 87

You are a Security Administrator at your organization. You need to restrict service account creation capability within production environments. You want to accomplish this centrally across the organization. What should you do?

Options:

A.  

Use Identity and Access Management (IAM) to restrict access of all users and service accounts that have access to the production environment.

B.  

Use organization policy constraints/iam.disableServiceAccountKeyCreation boolean to disable the creation of new service accounts.

C.  

Use organization policy constraints/iam.disableServiceAccountKeyUpload boolean to disable the creation of new service accounts.

D.  

Use organization policy constraints/iam.disableServiceAccountCreation boolean to disable the creation of new service accounts.

Discussion 0
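The iam.disableServiceAccountCreation boolean from option D can be sketched the same way as other boolean org policies; ORG_ID is a placeholder.

```shell
# Centrally block creation of new service accounts across the
# organization (ORG_ID is a placeholder).
cat > /tmp/disable-sa-creation.yaml <<'EOF'
name: organizations/ORG_ID/policies/iam.disableServiceAccountCreation
spec:
  rules:
  - enforce: true
EOF
# Would be applied with:
#   gcloud org-policies set-policy /tmp/disable-sa-creation.yaml
cat /tmp/disable-sa-creation.yaml
```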
Questions 88

Your company’s chief information security officer (CISO) is requiring business data to be stored in specific locations due to regulatory requirements that affect the company’s global expansion plans. After working on a plan to implement this requirement, you determine the following:

    The services in scope are included in the Google Cloud data residency requirements.

    The business data remains within specific locations under the same organization.

    The folder structure can contain multiple data residency locations.

    The projects are aligned to specific locations.

You plan to use the Resource Location Restriction organization policy constraint with very granular control. At which level in the hierarchy should you set the constraint?

Options:

A.  

Organization

B.  

Resource

C.  

Project

D.  

Folder

Discussion 0
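If the Resource Location Restriction constraint is bound at the project level (the most granular scope among the listed options, matching projects that are aligned to specific locations), the policy file could be sketched as follows; PROJECT_ID is a placeholder and the location value group is one example of Google's predefined groups.

```shell
# Restrict resource locations for a single project (PROJECT_ID is a
# placeholder; in:europe-locations is an example predefined value group).
cat > /tmp/resource-locations.yaml <<'EOF'
name: projects/PROJECT_ID/policies/gcp.resourceLocations
spec:
  rules:
  - values:
      allowedValues:
      - in:europe-locations
EOF
# Would be applied with:
#   gcloud org-policies set-policy /tmp/resource-locations.yaml
cat /tmp/resource-locations.yaml
```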
Questions 89

Which Identity-Aware Proxy role should you grant to an Identity and Access Management (IAM) user to access HTTPS resources?

Options:

A.  

Security Reviewer

B.  

IAP-Secured Tunnel User

C.  

IAP-Secured Web App User

D.  

Service Broker Operator

Discussion 0