
ExamsBrite Dumps

AWS Certified Solutions Architect - Associate (SAA-C03) Questions and Answers

Last Update Apr 15, 2026
Total Questions : 879


Question 1

A solutions architect needs to host a high performance computing (HPC) workload in the AWS Cloud. The workload will run on hundreds of Amazon EC2 instances and will require parallel access to a shared file system to enable distributed processing of large datasets. Datasets will be accessed across multiple instances simultaneously. The workload requires access latency within 1 ms. After processing has completed, engineers will need access to the dataset for manual postprocessing.

Which solution will meet these requirements?

Options:

A.  

Use Amazon Elastic File System (Amazon EFS) as a shared file system. Access the dataset from Amazon EFS.

B.  

Mount an Amazon S3 bucket to serve as the shared file system. Perform postprocessing directly from the S3 bucket.

C.  

Use Amazon FSx for Lustre as a shared file system. Link the file system to an Amazon S3 bucket for postprocessing.

D.  

Configure AWS Resource Access Manager to share an Amazon S3 bucket so that it can be mounted to all instances for processing and postprocessing.

Question 2

A company has developed an API using Amazon API Gateway REST API and AWS Lambda. How can latency be reduced for users worldwide?

Options:

A.  

Deploy the REST API as an edge-optimized API endpoint. Enable caching. Enable content encoding to compress data in transit.

B.  

Deploy the REST API as a Regional API endpoint. Enable caching. Enable content encoding to compress data in transit.

C.  

Deploy the REST API as an edge-optimized API endpoint. Enable caching. Configure reserved concurrency for Lambda functions.

D.  

Deploy the REST API as a Regional API endpoint. Enable caching. Configure reserved concurrency for Lambda functions.

Question 3

A company stores customer data in a multitenant Amazon S3 bucket. Each customer's data is stored in a prefix that is unique to the customer. The company needs to migrate data for specific customers to a new, dedicated S3 bucket that is in the same AWS Region as the source bucket. The company must preserve object metadata such as creation date and version IDs.

After the migration is finished, the company must delete the source data for the migrated customers from the original multitenant S3 bucket.

Which combination of solutions will meet these requirements with the LEAST overhead? (Select THREE.)

Options:

A.  

Create a new S3 bucket as a destination bucket. Enable versioning on the new bucket.

B.  

Use S3 batch operations to copy objects from the specified prefixes to the destination bucket.

C.  

Use the S3 CopyObject API, and create a script to copy data to the destination S3 bucket.

D.  

Configure S3 Same-Region Replication (SRR) to replicate existing data from the specified prefixes in the source bucket to the destination bucket.

E.  

Configure AWS DataSync to migrate data from the specified prefixes in the source bucket to the destination bucket.

F.  

Use an S3 Lifecycle policy to delete objects from the source bucket after the data is migrated to the destination bucket.

Question 4

A company stores a file in an S3 bucket containing IP allow/deny lists. The file must be accessible via an HTTP endpoint. Firewalls outside AWS must read the file. The company wants to restrict access to only the firewall IP addresses.

The S3 Block Public Access feature is enabled on the account.

Which solution meets these requirements?

Options:

A.  

Host the bucket as a static website and restrict access by IP.

B.  

Create a bucket policy that explicitly allows access only from the firewall IP addresses.

C.  

Create a CloudFront distribution with the S3 bucket as the origin. Use an origin access control (OAC) that allows access only from the firewall IP addresses.

D.  

Create a Lambda function to validate IP addresses and return the lists.
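Option B hinges on the `aws:SourceIp` condition key in an S3 bucket policy. The sketch below builds such a policy as a plain JSON document; the bucket name, object key, and firewall IPs are hypothetical placeholders. Note that with S3 Block Public Access enabled on the account, a policy with `"Principal": "*"` like this one would be rejected, which is the tension the question is testing.

```python
import json

# Hypothetical firewall IP addresses (CIDR /32 entries)
firewall_ips = ["198.51.100.10/32", "203.0.113.25/32"]

# Bucket policy restricting GetObject to the registered firewall IPs.
# Bucket name "allowlist-bucket" and key "ip-lists.json" are placeholders.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowFirewallIPsOnly",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::allowlist-bucket/ip-lists.json",
            "Condition": {"IpAddress": {"aws:SourceIp": firewall_ips}},
        }
    ],
}

print(json.dumps(policy, indent=2))
```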

Question 5

A company has hired an external vendor to work in the company's AWS account. The vendor uses an automated tool that the vendor hosts in its own AWS account. The vendor does not have IAM access to the company's AWS account. A solutions architect needs to grant access to the vendor.

Which solution will meet these requirements MOST securely?

Options:

A.  

Create an IAM role in the company's account to delegate access to the vendor's IAM role. Attach the appropriate IAM policies to the new IAM role to grant the permissions that the vendor requires.

B.  

Create an IAM user in the company's account with a password. Attach the appropriate IAM policies to the IAM user.

C.  

Create an IAM group in the company's account. Add the IAM user for the vendor's automated tool from the vendor account to the IAM group. Attach policies to the group.

D.  

Create a new identity provider (IdP) of provider type AWS account. Supply the vendor's AWS account ID and username. Attach policies to the IdP.

Question 6

A media company hosts a mobile app backend in the AWS Cloud. The company is releasing a new feature to allow users to upload short videos and apply special effects by using the mobile app. The company uses AWS Amplify to store the videos that customers upload in an Amazon S3 bucket.

The videos must be processed immediately. Users must receive a notification when processing is finished.

Which solution will meet these requirements?

Options:

A.  

Use Amazon EventBridge Scheduler to schedule an AWS Lambda function to process the videos. Save the processed videos to the S3 bucket. Use Amazon Simple Notification Service (Amazon SNS) to send push notifications to customers when processing is finished.

B.  

Use Amazon EventBridge Scheduler to schedule AWS Fargate to process the videos. Save the processed videos to the S3 bucket. Use Amazon Simple Notification Service (Amazon SNS) to send push notifications to customers when processing is finished.

C.  

Use an S3 trigger to invoke an AWS Lambda function to process the videos. Save the processed videos to the S3 bucket. Use Amazon Simple Notification Service (Amazon SNS) to send push notifications to customers when processing is finished.

D.  

Use an S3 trigger to invoke an AWS Lambda function to process the videos. Save the processed videos to the S3 bucket. Use AWS Amplify to send push notifications to customers when processing is finished.
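The "S3 trigger" in options C and D refers to an S3 event notification invoking a Lambda function with a documented event shape. Below is a minimal handler sketch that extracts the uploaded object from that event; the processing and notification steps are placeholder comments, since real code would call a video pipeline and Amazon SNS with credentials.

```python
# Minimal sketch of an S3-triggered Lambda handler. The record layout
# follows the S3 event notification format; processing and SNS publish
# steps are hypothetical placeholders.
def handler(event, context=None):
    processed = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # process_video(bucket, key)        # hypothetical processing step
        # sns.publish(TopicArn=..., ...)    # notify the user when finished
        processed.append(f"s3://{bucket}/{key}")
    return processed

# Sample event in the shape S3 delivers to Lambda
sample_event = {
    "Records": [
        {"s3": {"bucket": {"name": "uploads"}, "object": {"key": "clip.mp4"}}}
    ]
}
print(handler(sample_event))  # → ['s3://uploads/clip.mp4']
```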

Question 7

A solutions architect is designing a web application that will run on Amazon EC2 instances behind an Application Load Balancer (ALB). The company strictly requires that the application be resilient against malicious internet activity and attacks, and protect against new common vulnerabilities and exposures.

What should the solutions architect recommend?

Options:

A.  

Leverage Amazon CloudFront with the ALB endpoint as the origin.

B.  

Deploy an appropriate managed rule for AWS WAF and associate it with the ALB.

C.  

Subscribe to AWS Shield Advanced and ensure common vulnerabilities and exposures are blocked.

D.  

Configure network ACLs and security groups to allow only ports 80 and 443 to access the EC2 instances.

Question 8

A company needs to design a resilient web application to process customer orders. The web application must automatically handle increases in web traffic and application usage without affecting the customer experience or losing customer orders.

Which solution will meet these requirements?

Options:

A.  

Use a NAT gateway to manage web traffic. Use Amazon EC2 Auto Scaling groups to receive, process, and store processed customer orders. Use an AWS Lambda function to capture and store unprocessed orders.

B.  

Use a Network Load Balancer (NLB) to manage web traffic. Use an Application Load Balancer to receive customer orders from the NLB. Use Amazon Redshift with a Multi-AZ deployment to store unprocessed and processed customer orders.

C.  

Use a Gateway Load Balancer (GWLB) to manage web traffic. Use Amazon Elastic Container Service (Amazon ECS) to receive and process customer orders. Use the GWLB to capture and store unprocessed orders. Use Amazon DynamoDB to store processed customer orders.

D.  

Use an Application Load Balancer to manage web traffic. Use Amazon EC2 Auto Scaling groups to receive and process customer orders. Use Amazon Simple Queue Service (Amazon SQS) to store unprocessed orders. Use Amazon RDS with a Multi-AZ deployment to store processed customer orders.

Question 9

A company hosts its main public web application in one AWS Region across multiple Availability Zones. The application uses an Amazon EC2 Auto Scaling group and an Application Load Balancer (ALB).

A web development team needs a cost-optimized compute solution to improve the company's ability to serve dynamic content globally to millions of customers.

Which solution will meet these requirements?

Options:

A.  

Create an Amazon CloudFront distribution. Configure the existing ALB as the origin.

B.  

Use Amazon Route 53 to serve traffic to the ALB and EC2 instances based on the geographic location of each customer.

C.  

Create an Amazon S3 bucket with public read access enabled. Migrate the web application to the S3 bucket. Configure the S3 bucket for website hosting.

D.  

Use AWS Direct Connect to directly serve content from the web application to the location of each customer.

Question 10

A company has an application that serves clients that are deployed in more than 20,000 retail storefront locations around the world. The application consists of backend web services that are exposed over HTTPS on port 443. The application is hosted on Amazon EC2 instances behind an Application Load Balancer (ALB). The retail locations communicate with the web application over the public internet. The company allows each retail location to register the IP address that the retail location has been allocated by its local ISP.

The company's security team recommends increasing the security of the application endpoint by restricting access to only the IP addresses registered by the retail locations.

What should a solutions architect do to meet these requirements?

Options:

A.  

Associate an AWS WAF web ACL with the ALB. Use IP rule sets on the ALB to filter traffic. Update the IP addresses in the rule to include the registered IP addresses.

B.  

Deploy AWS Firewall Manager to manage the ALB. Configure firewall rules to restrict traffic to the ALB. Modify the firewall rules to include the registered IP addresses.

C.  

Store the IP addresses in an Amazon DynamoDB table. Configure an AWS Lambda authorization function on the ALB to validate that incoming requests are from the registered IP addresses.

D.  

Configure the network ACL on the subnet that contains the public interface of the ALB. Update the ingress rules on the network ACL with entries for each of the registered IP addresses.

Question 11

An ecommerce company runs its application on AWS. The application uses an Amazon Aurora PostgreSQL cluster in Multi-AZ mode for the underlying database. During a recent promotional campaign, the application experienced heavy read load and write load. Users experienced timeout issues when they attempted to access the application.

A solutions architect needs to make the application architecture more scalable and highly available.

Which solution will meet these requirements with the LEAST downtime?

Options:

A.  

Create an Amazon EventBridge rule that has the Aurora cluster as a source. Create an AWS Lambda function to log the state change events of the Aurora cluster. Add the Lambda function as a target for the EventBridge rule. Add additional reader nodes to fail over to.

B.  

Modify the Aurora cluster and activate the zero-downtime restart (ZDR) feature. Use Database Activity Streams on the cluster to track the cluster status.

C.  

Add additional reader instances to the Aurora cluster. Create an Amazon RDS Proxy target group for the Aurora cluster.

D.  

Create an Amazon ElastiCache for Redis cache. Replicate data from the Aurora cluster to Redis by using AWS Database Migration Service (AWS DMS) with a write-around approach.

Question 12

A company deployed an application in two AWS Regions. If the application fails in one Region, traffic must fail over to the second Region. The failover must avoid stale DNS client caches, and the company requires one endpoint for both Regions.

Which solution meets these requirements?

Options:

A.  

Use a CloudFront distribution with multiple origins.

B.  

Use Route 53 weighted routing with equal weights.

C.  

Use AWS Global Accelerator and assign static anycast IPs to the application.

D.  

Use Route 53 IP-based routing to switch Regions.

Question 13

A financial company is migrating banking applications to AWS accounts managed through AWS Organizations. The applications store sensitive customer data on Amazon EBS volumes, and the company takes regular snapshots for backups.

The company must implement controls across all accounts to prevent sharing EBS snapshots publicly, with the least operational overhead.

Which solution will meet these requirements?

Options:

A.  

Enable AWS Config rules for each OU to monitor EBS snapshot permissions.

B.  

Enable block public access for EBS snapshots at the organization level.

C.  

Create an IAM policy in the root account that prevents users from modifying snapshot permissions.

D.  

Use AWS CloudTrail to track snapshot permission changes.

Question 14

A company runs a mobile game app on AWS. The app stores data for every user session. The data updates frequently during a gaming session. The app stores up to 256 KB for each session. Sessions can last up to 48 hours.

The company wants to automate the deletion of expired session data. The company must be able to restore all session data automatically if necessary.

Which solution will meet these requirements?

Options:

A.  

Use an Amazon DynamoDB table to store the session data. Enable point-in-time recovery (PITR) and TTL for the table. Select the corresponding attribute for TTL in the session data.

B.  

Use an Amazon MemoryDB table to store the session data. Enable point-in-time recovery (PITR) and TTL for the table. Select the corresponding attribute for TTL in the session data.

C.  

Store session data in an Amazon S3 bucket. Use the S3 Standard storage class. Enable S3 Versioning for the bucket. Create an S3 Lifecycle configuration to expire objects after 48 hours.

D.  

Store session data in an Amazon S3 bucket. Use the S3 Intelligent-Tiering storage class. Enable S3 Versioning for the bucket. Create an S3 Lifecycle configuration to expire objects after 48 hours.
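The TTL mechanism in option A works by storing an epoch-seconds timestamp in a designated item attribute; DynamoDB deletes the item after that time passes, while point-in-time recovery keeps the data restorable within the PITR window. A sketch of building such an item, with a hypothetical table schema (`session_id` key, `expires_at` TTL attribute):

```python
import time

SESSION_TTL_SECONDS = 48 * 60 * 60  # sessions last up to 48 hours

def build_session_item(session_id: str, payload: bytes) -> dict:
    """Build a DynamoDB item whose TTL attribute expires 48 hours out.

    Attribute names are hypothetical; the TTL attribute must hold
    epoch seconds and be selected in the table's TTL settings.
    """
    now = int(time.time())
    return {
        "session_id": session_id,                  # partition key
        "data": payload,                           # up to 256 KB per session
        "expires_at": now + SESSION_TTL_SECONDS,   # TTL attribute
    }

item = build_session_item("abc123", b"session-state")
```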

Question 15

A company runs all its business applications in the AWS Cloud. The company uses AWS Organizations to manage multiple AWS accounts.

A solutions architect needs to review all permissions granted to IAM users to determine which users have more permissions than required.

Which solution will meet these requirements with the LEAST administrative overhead?

Options:

A.  

Use Network Access Analyzer to review all access permissions in the company's AWS accounts.

B.  

Create an AWS CloudWatch alarm that activates when an IAM user creates or modifies resources in an AWS account.

C.  

Use AWS Identity and Access Management (IAM) Access Analyzer to review all the company's resources and accounts.

D.  

Use Amazon Inspector to find vulnerabilities in existing IAM policies.

Question 16

An international company needs to share data from an Amazon S3 bucket to employees who are located around the world. The company needs a secure solution to provide employees with access to the S3 bucket. The employees are already enrolled in AWS IAM Identity Center.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Create a help desk application to generate an Amazon S3 presigned URL for each employee. Configure the presigned URLs to have short expirations. Instruct employees to contact the company help desk to receive a presigned URL to access the S3 bucket.

B.  

Create a group for Amazon S3 access in IAM Identity Center. Add the employees who require access to the S3 bucket to the group. Create an IAM policy to allow Amazon S3 access from the group. Instruct employees to use the AWS access portal to access the AWS Management Console and navigate to the S3 bucket.

C.  

Create an Amazon S3 File Gateway. Create one share for data uploads and a second share for data downloads. Set up an SFTP service on an Amazon EC2 instance. Mount the shares to the EC2 instance. Instruct employees to use the SFTP server.

D.  

Configure AWS Transfer Family SFTP endpoints. Select the custom identity provider option. Use AWS Secrets Manager to manage the user credentials. Instruct employees to use Transfer Family SFTP.

Question 17

A company hosts a database that runs on an Amazon RDS instance deployed to multiple Availability Zones. A periodic script negatively affects a critical application by querying the database. How can application performance be improved with minimal costs?

Options:

A.  

Add functionality to the script to identify the instance with the fewest active connections and query that instance.

B.  

Create a read replica of the database. Configure the script to query only the read replica.

C.  

Instruct the development team to manually export new entries at the end of the day.

D.  

Use Amazon ElastiCache to cache the common queries the script runs.

Question 18

A solutions architect is creating a new Amazon CloudFront distribution for an application. Some of the information submitted by users is sensitive. The application uses HTTPS but needs another layer of security. The sensitive information should be protected throughout the entire application stack, and access to the information should be restricted to certain applications.

Which action should the solutions architect take?

Options:

A.  

Configure a CloudFront signed URL.

B.  

Configure a CloudFront signed cookie.

C.  

Configure a CloudFront field-level encryption profile.

D.  

Configure CloudFront and set the Origin Protocol Policy setting to HTTPS Only for the Viewer Protocol Policy.

Question 19

A company is setting up a development environment on AWS for a team of developers. The team needs to access multiple Amazon S3 buckets to store project data. The team also needs to use Amazon EC2 to run development instances.

The company needs to ensure that the developers have access only to specific Amazon S3 buckets and EC2 instances. Access permissions must be assigned according to each developer's role on the team. The company wants to minimize the use of permanent credentials and to ensure access is securely managed according to the principle of least privilege.

Which solution will meet these requirements?

Options:

A.  

Create IAM roles that have administrative-level permissions for Amazon S3 and Amazon EC2. Require developers to sign in by using Amazon Cognito to access Amazon S3 and Amazon EC2.

B.  

Create IAM roles that have fine-grained permissions for Amazon S3 and Amazon EC2. Configure AWS IAM Identity Center to manage credentials for the developers.

C.  

Create IAM users that have programmatic access to Amazon S3 and Amazon EC2. Generate individual access keys for each developer to access Amazon S3 and Amazon EC2.

D.  

Create a VPC endpoint for Amazon S3. Require developers to access Amazon EC2 instances and Amazon S3 buckets through a bastion host.

Question 20

A company runs a container application on a Kubernetes cluster in the company's data center. The application uses Advanced Message Queuing Protocol (AMQP) to communicate with a message queue. The data center cannot scale fast enough to meet the company's expanding business needs. The company wants to migrate the workloads to AWS.

Which solution will meet these requirements with the LEAST overhead?

Options:

A.  

Migrate the container application to Amazon ECS. Use Amazon SQS to retrieve the messages.

B.  

Migrate the container application to Amazon EKS. Use Amazon MQ to retrieve the messages.

C.  

Use highly available Amazon EC2 instances to run the application. Use Amazon MQ to retrieve the messages.

D.  

Use AWS Lambda functions to run the application. Use Amazon SQS to retrieve the messages.

Question 21

A company is building a new web application that serves static and dynamic content from an API. Users will access the application from around the world. The company wants to minimize latency in the most cost-effective way.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.  

Deploy the static content to an Amazon S3 bucket. Use an Amazon API Gateway HTTP API to serve the dynamic content. Create an Amazon CloudFront distribution that uses the S3 bucket and the HTTP API as origins. Enable caching for static content.

B.  

Deploy the static content to an Amazon S3 bucket. Provide the bucket website endpoint to users. Use an Amazon API Gateway HTTP API with caching enabled to serve the dynamic content.

C.  

Deploy the static content to an Amazon S3 bucket. Use two Amazon EC2 instances as web servers. Deploy an Application Load Balancer to distribute traffic. Create an Amazon CloudFront distribution in front of the S3 bucket to cache static content.

D.  

Deploy the static content to an Amazon S3 bucket. Provide the bucket website endpoint to users. Create an Amazon CloudFront distribution in front of the S3 bucket to cache static content.

Question 22

A company has deployed a multi-tier web application to support a website. The architecture includes an Application Load Balancer (ALB) in public subnets, two Amazon Elastic Container Service (Amazon ECS) tasks in the public subnets, and a PostgreSQL cluster that runs on Amazon EC2 instances in private subnets.

The EC2 instances that host the PostgreSQL database run shell scripts that need to access an external API to retrieve product information. A solutions architect must design a solution to allow the EC2 instances to securely communicate with the external API without increasing operational overhead.

Which solution will meet these requirements?

Options:

A.  

Assign public IP addresses to the EC2 instances in the private subnets. Configure security groups to allow outbound internet access.

B.  

Configure a NAT gateway in the public subnets. Update the route table for the private subnets to route traffic to the NAT gateway.

C.  

Configure a VPC peering connection between the private subnets and a public subnet that has access to the external API.

D.  

Deploy an interface VPC endpoint to securely connect to the external API.

Question 23

A company wants to migrate an application to AWS. The application runs on Docker containers behind an Application Load Balancer (ALB). The application stores data in a PostgreSQL database. The cloud-based solution must use AWS WAF to inspect all application traffic. The application experiences most traffic on weekdays. There is significantly less traffic on weekends. Which solution will meet these requirements in the MOST cost-effective way?

Options:

A.  

Use a Network Load Balancer (NLB). Create a web access control list (web ACL) in AWS WAF that includes the necessary rules. Attach the web ACL to the NLB. Run the application on Amazon Elastic Container Service (Amazon ECS). Use Amazon RDS for PostgreSQL as the database.

B.  

Create a web access control list (web ACL) in AWS WAF that includes the necessary rules. Attach the web ACL to the ALB. Run the application on Amazon Elastic Kubernetes Service (Amazon EKS). Use Amazon RDS for PostgreSQL as the database.

C.  

Create a web access control list (web ACL) in AWS WAF that includes the necessary rules. Attach the web ACL to the ALB. Run the application on Amazon Elastic Container Service (Amazon ECS). Use Amazon Aurora Serverless as the database.

D.  

Use a Network Load Balancer (NLB). Create a web access control list (web ACL) in AWS WAF that has the necessary rules. Attach the web ACL to the NLB. Run the application on Amazon Elastic Container Service (Amazon ECS). Use Amazon Aurora Serverless as the database.

Question 24

A company is building a serverless web application with multiple interdependent workflows that millions of users worldwide will access. The application needs to handle bursts of traffic.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.  

Deploy an Amazon API Gateway HTTP API with a usage plan and throttle settings. Use AWS Step Functions with a Standard Workflow.

B.  

Deploy an Amazon API Gateway HTTP API with a usage plan and throttle settings. Use AWS Step Functions with an Express Workflow.

C.  

Deploy an Amazon API Gateway HTTP API without a usage plan. Use AWS Step Functions with an Express Workflow.

D.  

Deploy an Amazon API Gateway HTTP API without a usage plan. Use AWS Step Functions and multiple AWS Lambda functions with reserved concurrency.

Question 25

A company stores data in Amazon S3. According to regulations, the data must not contain personally identifiable information (PII). The company recently discovered that S3 buckets have some objects that contain PII. The company needs to automatically detect PII in S3 buckets and to notify the company's security team. Which solution will meet these requirements?

Options:

A.  

Use Amazon Macie. Create an Amazon EventBridge rule to filter the SensitiveData event type from Macie findings and to send an Amazon Simple Notification Service (Amazon SNS) notification to the security team.

B.  

Use Amazon GuardDuty. Create an Amazon EventBridge rule to filter the CRITICAL event type from GuardDuty findings and to send an Amazon Simple Notification Service (Amazon SNS) notification to the security team.

C.  

Use Amazon Macie. Create an Amazon EventBridge rule to filter the SensitiveData:S3Object/Personal event type from Macie findings and to send an Amazon Simple Queue Service (Amazon SQS) notification to the security team.

D.  

Use Amazon GuardDuty. Create an Amazon EventBridge rule to filter the CRITICAL event type from GuardDuty findings and to send an Amazon Simple Queue Service (Amazon SQS) notification to the security team.
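Options A and C both describe an EventBridge event pattern matched against Amazon Macie findings. A sketch of such a pattern as a JSON document is below; it uses Macie's documented event source and detail type, with a prefix match on the `SensitiveData` finding-type family (the rule name and SNS/SQS target wiring are omitted).

```python
import json

# EventBridge event pattern matching Macie sensitive-data findings.
# "aws.macie" and "Macie Finding" are the documented source/detail-type;
# the prefix filter narrows to SensitiveData finding types.
macie_pattern = {
    "source": ["aws.macie"],
    "detail-type": ["Macie Finding"],
    "detail": {"type": [{"prefix": "SensitiveData"}]},
}

print(json.dumps(macie_pattern, indent=2))
```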

Question 26

A company runs an online order management system on AWS. The company stores order and inventory data for the previous 5 years in an Amazon Aurora MySQL database. The company deletes inventory data after 5 years.

The company wants to optimize costs to archive data. Which solution will meet these requirements?

Options:

A.  

Create an AWS Glue crawler to export data to Amazon S3. Create an AWS Lambda function to compress the data.

B.  

Use the SELECT INTO OUTFILE S3 query on the Aurora database to export the data to Amazon S3. Configure S3 Lifecycle rules on the S3 bucket.

C.  

Create an AWS Glue DataBrew Job to migrate data from Aurora to Amazon S3. Configure S3 Lifecycle rules on the S3 bucket.

D.  

Use the AWS Schema Conversion Tool (AWS SCT) to replicate data from Aurora to Amazon S3. Use the S3 Standard-Infrequent Access (S3 Standard-IA) storage class.

Question 27

A solutions architect is building a static website hosted on Amazon S3. The website uses an Amazon Aurora PostgreSQL database accessed through an AWS Lambda function. The production website uses a Lambda alias that points to a specific version of the Lambda function.

Database credentials must rotate every 2 weeks. Previously deployed Lambda versions must always use the most recent credentials.

Which solution will meet these requirements?

Options:

A.  

Store credentials in AWS Secrets Manager. Turn on rotation. Write code in the Lambda function to retrieve credentials from Secrets Manager.

B.  

Include the credentials in the Lambda function code and update the function regularly.

C.  

Use Lambda environment variables and update them when new credentials are available.

D.  

Store credentials in AWS Systems Manager Parameter Store. Turn on rotation. Write code to retrieve credentials from Parameter Store.
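The reason run-time retrieval (as in options A and D) keeps old Lambda versions working is that credentials are fetched when the function runs rather than baked into a deployment. A common pattern is a short-lived in-memory cache around the fetch call; the sketch below uses an injectable fetcher as a stand-in for a real Secrets Manager call such as `boto3.client("secretsmanager").get_secret_value(...)`, so it runs without AWS credentials.

```python
import json
import time

class CredentialCache:
    """Fetch credentials at run time, caching them briefly.

    `fetch` is any callable returning the secret as a JSON string;
    in a real Lambda it would wrap a Secrets Manager call.
    """

    def __init__(self, fetch, max_age_seconds=300):
        self._fetch = fetch
        self._max_age = max_age_seconds
        self._value = None
        self._fetched_at = 0.0

    def get(self):
        # Refresh when empty or older than max_age, so rotated
        # credentials are picked up without redeploying the function.
        if self._value is None or time.time() - self._fetched_at > self._max_age:
            self._value = json.loads(self._fetch())
            self._fetched_at = time.time()
        return self._value

# Stand-in for Secrets Manager returning the latest rotated secret
fake_secret = lambda: '{"username": "app", "password": "rotated-pw"}'
cache = CredentialCache(fake_secret)
creds = cache.get()
```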

Question 28

A company operates multiple VPCs in a single AWS account. Account users need temporary access to Amazon S3 buckets. The S3 buckets are private and have no public endpoints.

The solution must follow the principle of least privilege for access to each environment and must avoid distributing permanent access keys.

Which solution will meet these requirements?

Options:

A.  

Create a gateway VPC endpoint for Amazon S3 in each VPC. Attach an endpoint policy that allows only environment-scoped IAM roles to access the S3 buckets.

B.  

Configure the S3 buckets to use SSE-S3. Create bucket policies that allow access only from the VPC CIDR blocks.

C.  

Define separate S3 access points for each environment. Allow users to assume a role associated with the access points. Use the default Amazon S3 endpoints.

D.  

Route S3 traffic through a NAT gateway. Configure bucket policies that allow traffic only from the NAT gateway’s public IP addresses.
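Option A's endpoint policy can scope a gateway VPC endpoint for S3 to specific IAM roles via the `aws:PrincipalArn` condition key. A sketch of such a policy follows; the account ID, role-name pattern, and bucket ARN are hypothetical placeholders.

```python
import json

# Gateway VPC endpoint policy allowing only environment-scoped roles.
# All ARNs below are hypothetical examples.
endpoint_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": "*",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": "arn:aws:s3:::dev-env-bucket/*",
            "Condition": {
                "ArnLike": {
                    "aws:PrincipalArn": "arn:aws:iam::111122223333:role/dev-env-*"
                }
            },
        }
    ],
}

print(json.dumps(endpoint_policy, indent=2))
```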

Question 29

A company has AWS Lambda functions that use environment variables. The company does not want its developers to see environment variables in plaintext.

Which solution will meet these requirements?

Options:

A.  

Deploy code to Amazon EC2 instances instead of using Lambda functions.

B.  

Configure SSL encryption on the Lambda functions to use AWS CloudHSM to store and encrypt the environment variables.

C.  

Create a certificate in AWS Certificate Manager (ACM). Configure the Lambda functions to use the certificate to encrypt the environment variables.

D.  

Create an AWS Key Management Service (AWS KMS) key. Enable encryption helpers on the Lambda functions to use the KMS key to store and encrypt the environment variables.

Question 30

A company needs to migrate a MySQL database from an on-premises data center to AWS within 2 weeks. The database is 180 TB in size. The company cannot partition the database.

The company wants to minimize downtime during the migration. The company's internet connection speed is 100 Mbps.

Which solution will meet these requirements?

Options:

A.  

Order an AWS Snowball Edge Storage Optimized device. Use AWS Database Migration Service (AWS DMS) and the AWS Schema Conversion Tool (AWS SCT) to migrate the database to Amazon RDS for MySQL and replicate ongoing changes. Send the Snowball Edge device back to AWS to finish the migration. Continue to replicate ongoing changes.

B.  

Establish an AWS Site-to-Site VPN connection between the data center and AWS. Use AWS Database Migration Service (AWS DMS) and the AWS Schema Conversion Tool (AWS SCT) to migrate the database to Amazon RDS for MySQL and replicate ongoing changes.

C.  

Establish a 10 Gbps dedicated AWS Direct Connect connection between the data center and AWS. Use AWS DataSync to replicate the database to Amazon S3. Create a script to import the data from Amazon S3 to a new Amazon RDS for MySQL database instance.

D.  

Use the company's existing internet connection. Use AWS DataSync to replicate the database to Amazon S3. Create a script to import the data from Amazon S3 to a new Amazon RDS for MySQL database instance.

Discussion 0
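A quick back-of-the-envelope calculation shows why the Snowball Edge option wins here: at 100 Mbps, 180 TB would take well over five months to upload, far beyond the 2-week deadline. The sketch below uses the question's figures; the `efficiency` parameter is an added assumption to model real-world link utilization.

```python
def transfer_days(size_tb: float, link_mbps: float, efficiency: float = 1.0) -> float:
    """Days needed to move size_tb terabytes over a link_mbps connection."""
    bits = size_tb * 1e12 * 8                       # decimal terabytes to bits
    seconds = bits / (link_mbps * 1e6 * efficiency)
    return seconds / 86400

# 180 TB over a fully saturated 100 Mbps line
print(round(transfer_days(180, 100), 1))  # ≈ 166.7 days
```

Even at perfect utilization the upload alone takes roughly 167 days, which is why the question steers toward a physical transfer plus DMS replication of ongoing changes.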
Questions 31

A company runs its applications on both Amazon EKS clusters and on-premises Kubernetes clusters. The company wants to view all clusters and workloads from a central location.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Use Amazon CloudWatch Container Insights to collect and group the cluster information.

B.  

Use Amazon EKS Connector to register and connect all Kubernetes clusters.

C.  

Use AWS Systems Manager to collect and view the cluster information.

D.  

Use Amazon EKS Anywhere as the primary cluster to view the other clusters with native Kubernetes commands.

Discussion 0
Questions 32

A company needs to use its on-premises LDAP directory service to authenticate its users to the AWS Management Console. The directory service is not compatible with Security Assertion Markup Language (SAML).

Which solution meets these requirements?

Options:

A.  

Enable AWS IAM Identity Center between AWS and the on-premises LDAP.

B.  

Create an IAM policy that uses AWS credentials, and integrate the policy into LDAP.

C.  

Set up a process that rotates the IAM credentials whenever LDAP credentials are updated.

D.  

Develop an on-premises custom identity broker application or process that uses AWS STS to get short-lived credentials.

Discussion 0
Questions 33

A company wants to visualize its AWS spend and resource usage. The company wants to use an AWS managed service to provide visual dashboards.

Which solution will meet these requirements?

Options:

A.  

Configure an export in AWS Data Exports. Use Amazon QuickSight to create a cost and usage dashboard. View the data in QuickSight.

B.  

Configure one custom budget in AWS Budgets for costs. Configure a second custom budget for usage. Schedule daily AWS Budgets reports by using the two budgets as sources.

C.  

Configure AWS Cost Explorer to use user-defined cost allocation tags with hourly granularity to generate detailed data.

D.  

Configure an export in AWS Data Exports. Use the standard export option. View the data in Amazon Athena.

Discussion 0
Questions 34

An ecommerce company hosts an API that handles sales requests. The company hosts the API frontend on Amazon EC2 instances that run behind an Application Load Balancer (ALB). The company hosts the API backend on EC2 instances that perform the transactions. The backend tiers are loosely coupled by an Amazon Simple Queue Service (Amazon SQS) queue.

The company anticipates a significant increase in request volume during a new product launch event. The company wants to ensure that the API can handle increased loads successfully.

Which solution will meet these requirements?

Options:

A.  

Double the number of frontend and backend EC2 instances to handle the increased traffic during the product launch event. Create a dead-letter queue to retain unprocessed sales requests when the demand exceeds the system capacity.

B.  

Place the frontend EC2 instances into an Auto Scaling group. Create an Auto Scaling policy to launch new instances to handle the incoming network traffic.

C.  

Place the frontend EC2 instances into an Auto Scaling group. Add an Amazon ElastiCache cluster in front of the ALB to reduce the amount of traffic the API needs to handle.

D.  

Place the frontend and backend EC2 instances into separate Auto Scaling groups. Create a policy for the frontend Auto Scaling group to launch instances based on incoming network traffic. Create a policy for the backend Auto Scaling group to launch instances based on the SQS queue backlog.

Discussion 0
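The backend half of option D is the standard "backlog per instance" pattern: size the Auto Scaling group so each instance owns at most a fixed share of the SQS queue depth. The throughput and bounds below are illustrative numbers, not from the question.

```python
import math

def desired_backend_capacity(queue_depth: int, msgs_per_instance: int,
                             min_size: int = 1, max_size: int = 20) -> int:
    """Target backend instance count so each instance handles at most
    msgs_per_instance messages from the current SQS backlog."""
    needed = math.ceil(queue_depth / msgs_per_instance)
    return max(min_size, min(max_size, needed))  # clamp to group bounds

print(desired_backend_capacity(950, 100))  # → 10
```

In practice this value would feed a target tracking policy driven by a CloudWatch metric derived from `ApproximateNumberOfMessagesVisible`.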
Questions 35

A company wants to use automatic machine learning (ML) to create and visualize forecasts of complex scenarios and trends.

Which solution will meet these requirements with the LEAST management overhead?

Options:

A.  

Use an AWS Glue ML job to transform the data and create forecasts. Use Amazon QuickSight to visualize the data.

B.  

Use Amazon QuickSight to visualize the data. Use ML-powered forecasting in QuickSight to create forecasts.

C.  

Use a prebuilt ML AMI from the AWS Marketplace to create forecasts. Use Amazon QuickSight to visualize the data.

D.  

Use Amazon SageMaker AI inference pipelines to create and update forecasts. Use Amazon QuickSight to visualize the combined data.

Discussion 0
Questions 36

A solutions architect needs to build a log storage solution for a client. The client has an application that produces user activity logs that track user API calls to the application. The application typically produces 50 GB of logs each day. The client needs a storage solution that makes the logs available for occasional querying and analytics.

Which solution will meet these requirements?

Options:

A.  

Store user activity logs in an Amazon S3 bucket. Use Amazon Athena to perform queries and analytics.

B.  

Store user activity logs in an Amazon OpenSearch Service cluster. Use OpenSearch Dashboards to perform queries and analytics.

C.  

Store user activity logs in an Amazon RDS instance. Use an Open Database Connectivity (ODBC) connector to perform queries and analytics.

D.  

Store user activity logs in an Amazon CloudWatch Logs log group. Use CloudWatch Logs Insights to perform queries and analytics.

Discussion 0
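The S3-plus-Athena option works best when logs are written under Hive-style date partitions so Athena can prune by day instead of scanning everything. The prefix layout and names below are illustrative, not part of the question.

```python
from datetime import datetime, timezone

def log_object_key(service: str, event_time: datetime, file_id: str) -> str:
    """Build a Hive-style partitioned S3 key (year=/month=/day=) so an
    Athena table partitioned on those columns can prune old data."""
    t = event_time.astimezone(timezone.utc)
    return (f"logs/service={service}/year={t:%Y}/month={t:%m}/day={t:%d}/"
            f"{file_id}.json.gz")

key = log_object_key("api", datetime(2025, 6, 1, 12, tzinfo=timezone.utc), "a1b2")
print(key)  # logs/service=api/year=2025/month=06/day=01/a1b2.json.gz
```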
Questions 37

A company runs a web application on Amazon EC2 instances. The application also uses an Amazon DynamoDB table. The application generates sporadic HTTP 500 errors. The DynamoDB table is operating in on-demand mode, and other applications use the table without any issues.

A solutions architect wants to resolve the HTTP 500 errors without disrupting the web application.

Which solution will meet these requirements?

Options:

A.  

Configure DynamoDB to support larger write requests for increased throughput.

B.  

Enable DynamoDB Streams to monitor changes in the table.

C.  

Configure the application to use exponential backoff and retries to query the table.

D.  

Configure the application to use strongly consistent reads.

Discussion 0
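Option C's fix reflects how DynamoDB throttling is meant to be handled: retry with exponential backoff and jitter, which the AWS SDKs do automatically. A pure-Python sketch of the "full jitter" strategy, with illustrative constants:

```python
import random

def backoff_delays(base=0.05, cap=2.0, attempts=5, seed=0):
    """'Full jitter' backoff: each retry sleeps a random time in
    [0, min(cap, base * 2**attempt)] so throttled clients spread out
    instead of retrying in lockstep."""
    rng = random.Random(seed)  # seeded only to keep this demo reproducible
    return [rng.uniform(0, min(cap, base * 2 ** n)) for n in range(attempts)]

delays = backoff_delays()
print([round(d, 3) for d in delays])
```

The cap keeps the worst-case sleep bounded while the exponential base quickly backs pressure off the table.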
Questions 38

A company is redesigning its data intake process. In the existing process, the company receives data transfers and uploads the data to an Amazon S3 bucket every night. The company uses AWS Glue crawlers and jobs to prepare the data for a machine learning (ML) workflow.

The company needs a low-code solution to run multiple AWS Glue jobs in sequence and provide a visual workflow.

Which solution will meet these requirements?

Options:

A.  

Use an Amazon EC2 instance to run a cron job and a script to check for the S3 files and call the AWS Glue jobs. Create an Amazon CloudWatch dashboard to visualize the workflow.

B.  

Use Amazon EventBridge to call an AWS Step Functions workflow for the AWS Glue jobs. Use Step Functions to create a visual workflow.

C.  

Use S3 Event Notifications to invoke a series of AWS Lambda functions and AWS Glue jobs in sequence. Use Amazon QuickSight to create a visual workflow.

D.  

Create an Amazon Elastic Container Service (Amazon ECS) task that contains a Python script that manages the AWS Glue jobs and creates a visual workflow. Use Amazon EventBridge Scheduler to start the ECS task.

Discussion 0
Questions 39

A company wants to protect AWS-hosted resources, including Application Load Balancers and CloudFront distributions. They need near real-time visibility into attacks and a dedicated AWS response team for DDoS events.

Which AWS service meets these requirements?

Options:

A.  

AWS WAF

B.  

AWS Shield Standard

C.  

Amazon Macie

D.  

AWS Shield Advanced

Discussion 0
Questions 40

A company is creating a low-latency payment processing application that supports TLS connections from IPv4 clients. The application requires outbound access to the public internet. Users must access the application from a single entry point.

The company wants to use Amazon Elastic Container Service (Amazon ECS) tasks to deploy the application. The company wants to enable the awsvpc network mode.

Which solution will meet these requirements MOST securely?

Options:

A.  

Create a VPC that has an internet gateway, public subnets, and private subnets. Deploy a Network Load Balancer and a NAT gateway in the public subnets. Deploy the ECS tasks in the private subnets.

B.  

Create a VPC that has an outbound-only internet gateway, public subnets, and private subnets. Deploy an Application Load Balancer and a NAT gateway in the public subnets. Deploy the ECS tasks in the private subnets.

C.  

Create a VPC that has an internet gateway, public subnets, and private subnets. Deploy an Application Load Balancer in the public subnets. Deploy the ECS tasks in the public subnets.

D.  

Create a VPC that has an outbound-only internet gateway, public subnets, and private subnets. Deploy a Network Load Balancer in the public subnets. Deploy the ECS tasks in the public subnets.

Discussion 0
Questions 41

A company recently migrated its application to a VPC on AWS. An AWS Site-to-Site VPN connection connects the company's on-premises network to the VPC. The application retrieves customer data from another system that resides on premises. The application uses an on-premises DNS server to resolve domain records. After the migration, the application is not able to connect to the customer data because of name resolution errors.

Which solution will give the application the ability to resolve the internal domain names?

Options:

A.  

Launch EC2 instances in the VPC. On the EC2 instances, deploy a custom DNS forwarder that forwards all DNS requests to the on-premises DNS server. Create an Amazon Route 53 private hosted zone that uses the EC2 instances for name servers.

B.  

Create an Amazon Route 53 Resolver outbound endpoint. Configure the outbound endpoint to forward DNS queries against the on-premises domain to the on-premises DNS server.

C.  

Set up two AWS Direct Connect connections between the AWS environment and the on-premises network. Set up a link aggregation group (LAG) that includes the two connections. Change the VPC resolver address to point to the on-premises DNS server.

D.  

Create an Amazon Route 53 public hosted zone for the on-premises domain. Configure the network ACLs to forward DNS requests against the on-premises domain to the Route 53 public hosted zone.

Discussion 0
Questions 42

A company is using Amazon DocumentDB global clusters to support an ecommerce application. The application serves customers across multiple AWS Regions. To ensure business continuity, the company needs a solution to minimize downtime during maintenance windows or other disruptions.

Which solution will meet these requirements?

Options:

A.  

Regularly create manual snapshots of the DocumentDB instance in the primary Region.

B.  

Perform a managed failover to a secondary Region when needed.

C.  

Perform a failover to a replica DocumentDB instance within the primary Region.

D.  

Configure increased replication lag to manage cross-Region replication.

Discussion 0
Questions 43

How can trade data from DynamoDB be ingested into an S3 data lake for near real-time analysis?

Options:

A.  

Use DynamoDB Streams to invoke a Lambda function that writes to S3.

B.  

Use DynamoDB Streams to invoke a Lambda function that writes to Data Firehose, which writes to S3.

C.  

Enable Kinesis Data Streams on DynamoDB. Configure it to invoke a Lambda function that writes to S3.

D.  

Enable Kinesis Data Streams on DynamoDB. Use Data Firehose to write to S3.

Discussion 0
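Option B hinges on a Lambda stage that reshapes DynamoDB stream records before handing them to Data Firehose for batching into S3. The handler below runs against a sample event; the attribute names are hypothetical and the actual Firehose `put_record_batch` call is omitted.

```python
import json

def handler(event, context=None):
    """Flatten DynamoDB stream INSERT records into JSON lines that a
    delivery stream could batch into the S3 data lake."""
    lines = []
    for record in event.get("Records", []):
        if record.get("eventName") != "INSERT":
            continue
        image = record["dynamodb"]["NewImage"]
        # DynamoDB attribute values are typed maps like {"S": "..."} / {"N": "..."}
        flat = {k: next(iter(v.values())) for k, v in image.items()}
        lines.append(json.dumps(flat, sort_keys=True))
    return lines

sample = {"Records": [{"eventName": "INSERT",
                       "dynamodb": {"NewImage": {"trade_id": {"S": "t-1"},
                                                 "price": {"N": "101.5"}}}}]}
print(handler(sample))  # ['{"price": "101.5", "trade_id": "t-1"}']
```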
Questions 44

A company is developing a content sharing platform that currently handles 500 GB of user-generated media files. The company expects the amount of content to grow significantly in the future. The company needs a storage solution that can automatically scale, provide high durability, and allow direct user uploads from web browsers.

Options:

A.  

Store the data in an Amazon Elastic Block Store (Amazon EBS) volume with Multi-Attach enabled.

B.  

Store the data in an Amazon Elastic File System (Amazon EFS) Standard file system.

C.  

Store the data in an Amazon S3 Standard bucket.

D.  

Store the data in an Amazon S3 Express One Zone bucket.

Discussion 0
Questions 45

A company wants to run a hybrid workload for data processing. The data needs to be accessed by on-premises applications for local data processing using an NFS protocol, and must also be accessible from the AWS Cloud for further analytics and batch processing.

Which solution will meet these requirements?

Options:

A.  

Use an AWS Storage Gateway file gateway to provide file storage to AWS, then perform analytics on this data in the AWS Cloud.

B.  

Use an AWS Storage Gateway tape gateway to copy the backup of the local data to AWS, then perform analytics on this data in the AWS Cloud.

C.  

Use an AWS Storage Gateway volume gateway in a stored volume configuration to regularly take snapshots of the local data, then copy the data to AWS.

D.  

Use an AWS Storage Gateway volume gateway in a cached volume configuration to back up all the local storage in the AWS Cloud, then perform analytics on this data in the cloud.

Discussion 0
Questions 46

A company wants to isolate its workloads by creating an AWS account for each workload. The company needs a solution that centrally manages networking components for the workloads. The solution also must create accounts with automatic security controls (guardrails).

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Use AWS Control Tower to deploy accounts. Create a networking account that has a VPC with private subnets and public subnets. Use AWS Resource Access Manager (AWS RAM) to share the subnets with the workload accounts.

B.  

Use AWS Organizations to deploy accounts. Create a networking account that has a VPC with private subnets and public subnets. Use AWS Resource Access Manager (AWS RAM) to share the subnets with the workload accounts.

C.  

Use AWS Control Tower to deploy accounts. Deploy a VPC in each workload account. Configure each VPC to route through an inspection VPC by using a transit gateway attachment.

D.  

Use AWS Organizations to deploy accounts. Deploy a VPC in each workload account. Configure each VPC to route through an inspection VPC by using a transit gateway attachment.

Discussion 0
Questions 47

A company runs an application on EC2 instances that need access to RDS credentials stored in AWS Secrets Manager.

Which solution meets this requirement?

Options:

A.  

Create an IAM role, and attach the role to each EC2 instance profile. Use an identity-based policy to grant the role access to the secret.

B.  

Create an IAM user, and attach the user to each EC2 instance profile. Use a resource-based policy to grant the user access to the secret.

C.  

Create a resource-based policy for the secret. Use EC2 Instance Connect to access the secret.

D.  

Create an identity-based policy for the secret. Grant direct access to the EC2 instances.

Discussion 0
Questions 48

A company has resources across multiple AWS Regions and accounts. A new solutions architect needs to build a map of the workloads and their relationships but has no documentation from the previous employee.

Which solution will provide these details with the least operational effort?

Options:

A.  

Use AWS Systems Manager Inventory to generate a map from the detailed report.

B.  

Use AWS Step Functions to collect workload details and build diagrams manually.

C.  

Use Workload Discovery on AWS to generate architecture diagrams.

D.  

Use AWS X-Ray to view workload details and manually draw diagrams.

Discussion 0
Questions 49

A company has applications that run on Amazon EC2 instances in a VPC. One of the applications needs to call the Amazon S3 API to store and read objects. According to the company's security regulations, no traffic from the applications is allowed to travel across the internet.

Which solution will meet these requirements?

Options:

A.  

Configure an S3 gateway endpoint.

B.  

Create an S3 bucket in a private subnet.

C.  

Create an S3 bucket in the same AWS Region as the EC2 instances.

D.  

Configure a NAT gateway in the same subnet as the EC2 instances.

Discussion 0
Questions 50

A company processes large amounts of data by using Amazon EC2 instances in an Auto Scaling group. The data processing jobs run for up to 48 hours each week. The data processing jobs can handle interruptions. However, the company wants to minimize the interruptions.

The company wants to use the latest generation of Amazon EC2 instances each year.

Which solution will meet these requirements in the MOST cost-effective way?

Options:

A.  

Purchase Convertible Reserved Instances (RIs) on an All Upfront basis for a 3-year term for the instance types currently in use.

B.  

Purchase Standard Reserved Instances (RIs) on an All Upfront basis for a 1-year term for the instance types in use.

C.  

Purchase Spot Instances with a price-capacity-optimized allocation strategy. Override instance types in the Auto Scaling group.

D.  

Purchase Spot Instances with a capacity-optimized allocation strategy. Override instance types in the Auto Scaling group.

Discussion 0
Questions 51

A company is using a loosely coupled serverless architecture on AWS. The architecture consists of multiple web applications and APIs distributed across multiple teams. The company uses AWS Control Tower to provision AWS accounts. The company's development teams use AWS CloudFormation.

The company wants to improve trace monitoring and gain insight into how individual services in application stacks are performing.

Which solution will meet these requirements?

Options:

A.  

Enable AWS CloudTrail across all accounts by using AWS Control Tower.

B.  

Enable AWS X-Ray across all accounts by using AWS Control Tower.

C.  

Enable Amazon CloudWatch in the CloudFormation templates.

D.  

Enable AWS X-Ray in the CloudFormation templates.

Discussion 0
Questions 52

How can a law firm make files publicly readable while preventing modifications or deletions until a specific future date?

Options:

A.  

Upload files to an Amazon S3 bucket configured for static website hosting. Grant read-only IAM permissions to any AWS principals.

B.  

Create an S3 bucket. Enable S3 Versioning. Use S3 Object Lock with a retention period. Create a CloudFront distribution. Use a bucket policy to restrict access.

C.  

Create an S3 bucket. Enable S3 Versioning. Configure an event trigger with AWS Lambda to restore modified objects from a private S3 bucket.

D.  

Upload files to an S3 bucket for static website hosting. Use S3 Object Lock with a retention period. Grant read-only IAM permissions.

Discussion 0
Questions 53

A company has an application that runs only on Amazon EC2 Spot Instances. The instances run in an Amazon EC2 Auto Scaling group with scheduled scaling actions. However, the capacity does not always increase at the scheduled times, and instances terminate many times a day. A solutions architect must ensure that the instances launch on time and have fewer interruptions.

Which action will meet these requirements?

Options:

A.  

Specify the capacity-optimized allocation strategy for Spot Instances. Add more instance types to the Auto Scaling group.

B.  

Specify the capacity-optimized allocation strategy for Spot Instances. Increase the size of the instances in the Auto Scaling group.

C.  

Specify the lowest-price allocation strategy for Spot Instances. Add more instance types to the Auto Scaling group.

D.  

Specify the lowest-price allocation strategy for Spot Instances. Increase the size of the instances in the Auto Scaling group.

Discussion 0
Questions 54

A company is designing an advertisement distribution application to run on AWS. The company wants to deploy the application as a container to Amazon Elastic Container Service (Amazon ECS).

Advertisements must be displayed to users around the world with low latency. The company needs to optimize data transfer costs.

Which solution will meet these requirements?

Options:

A.  

Deploy the application in a single AWS Region. Use an Application Load Balancer (ALB) to distribute traffic. Create an Amazon CloudFront distribution, and set the ALB as the origin.

B.  

Deploy the application in multiple AWS Regions. Create an Application Load Balancer (ALB) in each Region. Use Amazon Route 53 with a latency-based weighted routing policy to distribute traffic to the ALBs.

C.  

Deploy the application in multiple AWS Regions. Create an Application Load Balancer (ALB) in each Region. Create a transit gateway in each Region. Route traffic between the ALBs and Amazon ECS through the transit gateways.

D.  

Deploy the application in a single AWS Region. Use an Application Load Balancer (ALB) to distribute traffic. Create an accelerator in AWS Global Accelerator. Associate the accelerator with the ALB.

Discussion 0
Questions 55

A company has a large amount of data in an Amazon DynamoDB table. A large batch of data is appended to the table once each day. The company wants a solution that will make all the existing and future data in DynamoDB available for analytics on a long-term basis.

Which solution meets these requirements with the LEAST operational overhead?

Options:

A.  

Configure DynamoDB incremental exports to Amazon S3.

B.  

Configure Amazon DynamoDB Streams to write records to Amazon S3.

C.  

Configure Amazon EMR to copy DynamoDB data to Amazon S3.

D.  

Configure Amazon EMR to copy DynamoDB data to Hadoop Distributed File System (HDFS).

Discussion 0
Questions 56

A company decides to use AWS Key Management Service (AWS KMS) for data encryption operations. The company must create a KMS key and automate the rotation of the key. The company also needs the ability to deactivate the key and schedule the key for deletion.

Which solution will meet these requirements?

Options:

A.  

Create an asymmetric customer managed KMS key. Enable automatic key rotation.

B.  

Create a symmetric customer managed KMS key. Disable the envelope encryption option.

C.  

Create a symmetric customer managed KMS key. Enable automatic key rotation.

D.  

Create an asymmetric customer managed KMS key. Disable the envelope encryption option.

Discussion 0
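Symmetric KMS keys are used through envelope encryption: KMS generates a per-item data key, the data is encrypted locally with it, and only a KMS-wrapped copy of the data key is stored beside the ciphertext. The toy below sketches that wrap/unwrap flow with the standard library only; the XOR keystream is NOT real cryptography, just a stand-in to show the key flow.

```python
import hashlib
import secrets

def _keystream(key: bytes, n: int) -> bytes:
    """Deterministic pseudo-keystream (demo only, not secure AES)."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def xor(data: bytes, key: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, _keystream(key, len(data))))

master_key = secrets.token_bytes(32)     # stands in for the KMS key material
data_key = secrets.token_bytes(32)       # what GenerateDataKey would return
wrapped_key = xor(data_key, master_key)  # the "encrypted data key" copy

ciphertext = xor(b"db-password", data_key)
recovered = xor(ciphertext, xor(wrapped_key, master_key))  # unwrap, then decrypt
print(recovered)  # b'db-password'
```

Automatic rotation then only has to rotate the master key inside KMS; the wrapped data keys stored with old ciphertexts remain decryptable.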
Questions 57

A company is developing a new online gaming application. The application will run on Amazon EC2 instances in multiple AWS Regions and will have a high number of globally distributed users. A solutions architect must design the application to optimize network latency for the users.

Which actions should the solutions architect take to meet these requirements? (Select TWO.)

Options:

A.  

Configure AWS Global Accelerator. Create Regional endpoint groups in each Region where an EC2 fleet is hosted.

B.  

Create a content delivery network (CDN) by using Amazon CloudFront. Enable caching for static and dynamic content, and specify a high expiration period.

C.  

Integrate AWS Client VPN into the application. Instruct users to select which Region is closest to them after they launch the application. Establish a VPN connection to that Region.

D.  

Create an Amazon Route 53 weighted routing policy. Configure the routing policy to give the highest weight to the EC2 instances in the Region that has the largest number of users.

E.  

Configure an Amazon API Gateway endpoint in each Region where an EC2 fleet is hosted. Instruct users to select which Region is closest to them after they launch the application. Use the API Gateway endpoint that is closest to them.

Discussion 0
Questions 58

A finance company is migrating its trading platform to AWS. The trading platform processes a high volume of market data and processes stock trades. The company needs to establish a consistent, low-latency network connection from its on-premises data center to AWS.

The company will host resources in a VPC. The solution must not use the public internet.

Which solution will meet these requirements?

Options:

A.  

Use AWS Client VPN to connect the on-premises data center to AWS.

B.  

Use AWS Direct Connect to set up a connection from the on-premises data center to AWS.

C.  

Use AWS PrivateLink to set up a connection from the on-premises data center to AWS.

D.  

Use AWS Site-to-Site VPN to connect the on-premises data center to AWS.

Discussion 0
Questions 59

A law firm needs to make hundreds of files readable for the general public. The law firm must prevent members of the public from modifying or deleting the files before a specified future date.

Which solution will meet these requirements MOST securely?

Options:

A.  

Upload the files to an Amazon S3 bucket that is configured for static website hosting. Grant read-only IAM permissions to any AWS principals that access the S3 bucket until the specified date.

B.  

Create a new Amazon S3 bucket. Enable S3 Versioning. Use S3 Object Lock and set a retention period based on the specified date. Create an Amazon CloudFront distribution to serve content from the bucket. Use an S3 bucket policy to restrict access to the CloudFront origin access control (OAC).

C.  

Create a new Amazon S3 bucket. Enable S3 Versioning. Configure an event trigger to run an AWS Lambda function if a user modifies or deletes an object. Configure the Lambda function to replace the modified or deleted objects with the original versions of the objects from a private S3 bucket.

D.  

Upload the files to an Amazon S3 bucket that is configured for static website hosting. Select the folder that contains the files. Use S3 Object Lock with a retention period based on the specified date. Grant read-only IAM permissions to any AWS principals that access the S3 bucket.

Discussion 0
Questions 60

A company wants to create a payment processing application. The application must run when a payment record arrives in an existing Amazon S3 bucket. The application must process each payment record exactly once. The company wants to use an AWS Lambda function to process the payments.

Which solution will meet these requirements?

Options:

A.  

Configure the existing S3 bucket to send object creation events to Amazon EventBridge. Configure EventBridge to route events to an Amazon Simple Queue Service (Amazon SQS) FIFO queue. Configure the Lambda function to run when a new event arrives in the SQS queue.

B.  

Configure the existing S3 bucket to send object creation events to an Amazon Simple Notification Service (Amazon SNS) topic. Configure the Lambda function to run when a new event arrives in the SNS topic.

C.  

Configure the existing S3 bucket to send object creation events to an Amazon Simple Queue Service (Amazon SQS) queue. Configure the Lambda function to run when a new event arrives in the SQS queue.

D.  

Configure the existing S3 bucket to send object creation events directly to the Lambda function. Configure the Lambda function to handle object creation events and to process the payments.

Discussion 0
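One caveat worth knowing even with the S3-to-SQS option: standard SQS queues deliver at least once, so "exactly once" processing in practice means making the consumer idempotent. A minimal sketch of that dedup step, using an in-memory set where production code would use a durable store such as a DynamoDB conditional write (names here are hypothetical):

```python
processed = set()  # production: a durable store, e.g. a DynamoDB conditional put

def process_payment(record_key: str) -> bool:
    """Process a payment record once; repeat deliveries become no-ops."""
    if record_key in processed:
        return False          # duplicate delivery, skip
    processed.add(record_key)
    # ... charge the payment here ...
    return True

print(process_payment("payments/2025/06/01/p-123.json"))  # True
print(process_payment("payments/2025/06/01/p-123.json"))  # False (duplicate)
```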
Questions 61

A company has 15 employees. The company stores employee start dates in an Amazon DynamoDB table. The company wants to send an email message to each employee on the day of the employee's work anniversary.

Which solution will meet these requirements with the MOST operational efficiency?

Options:

A.  

Create a script that scans the DynamoDB table and uses Amazon Simple Notification Service (Amazon SNS) to send email messages to employees when necessary. Use a cron job to run this script every day on an Amazon EC2 instance.

B.  

Create a script that scans the DynamoDB table and uses Amazon Simple Queue Service (Amazon SQS) to send email messages to employees when necessary. Use a cron job to run this script every day on an Amazon EC2 instance.

C.  

Create an AWS Lambda function that scans the DynamoDB table and uses Amazon Simple Notification Service (Amazon SNS) to send email messages to employees when necessary. Schedule this Lambda function to run every day.

D.  

Create an AWS Lambda function that scans the DynamoDB table and uses Amazon Simple Queue Service (Amazon SQS) to send email messages to employees when necessary. Schedule this Lambda function to run every day.

Discussion 0
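The daily Lambda function in the scheduled option boils down to a month-and-day match over the scanned items. A sketch with plain dicts standing in for DynamoDB items (names and attributes are illustrative):

```python
from datetime import date

def anniversaries_today(employees, today=None):
    """Return the names whose start date falls on today's month and day."""
    today = today or date.today()
    return [e["name"] for e in employees
            if (e["start_date"].month, e["start_date"].day)
            == (today.month, today.day)]

staff = [{"name": "Ana", "start_date": date(2019, 6, 1)},
         {"name": "Ben", "start_date": date(2021, 3, 15)}]
print(anniversaries_today(staff, today=date(2025, 6, 1)))  # ['Ana']
```

Each match would then be handed to an SNS topic subscription to deliver the email.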
Questions 62

An ecommerce company runs several internal applications in multiple AWS accounts. The company uses AWS Organizations to manage its AWS accounts.

A security appliance in the company's networking account must inspect interactions between applications across AWS accounts.

Which solution will meet these requirements?

Options:

A.  

Deploy a Network Load Balancer (NLB) in the networking account to send traffic to the security appliance. Configure the application accounts to send traffic to the NLB by using an interface VPC endpoint in the application accounts.

B.  

Deploy an Application Load Balancer (ALB) in the application accounts to send traffic directly to the security appliance.

C.  

Deploy a Gateway Load Balancer (GWLB) in the networking account to send traffic to the security appliance. Configure the application accounts to send traffic to the GWLB by using an interface GWLB endpoint in the application accounts.

D.  

Deploy an interface VPC endpoint in the application accounts to send traffic directly to the security appliance.

Discussion 0
Questions 63

A company runs an application on premises. The application stores files that the application servers process in a shared storage system. The company uses Linux file system permissions to control access to the files.

The company plans to migrate the application servers to Amazon EC2 instances across multiple Availability Zones. The company does not want to change the application code.

Which solution will meet these requirements?

Options:

A.  

Migrate the files to an Amazon S3 bucket. Use the S3 Intelligent-Tiering storage class. Mount the S3 bucket to the EC2 instances.

B.  

Migrate the files to a set of Amazon EC2 instance store volumes. Mount the instance store volumes to the EC2 instances.

C.  

Migrate the files to a set of Amazon EBS volumes. Mount the EBS volumes to the EC2 instances.

D.  

Migrate the files to an Amazon EFS file system. Mount the EFS file system to the EC2 instances.

Discussion 0
Questions 64

A media company hosts a web application on AWS for uploading videos. Only authenticated users should upload within a specified time frame after authentication.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Configure the application to generate IAM temporary security credentials for authenticated users.

B.  

Create an AWS Lambda function that generates pre-signed URLs when a user authenticates.

C.  

Develop a custom authentication service that integrates with Amazon Cognito to control and log direct S3 bucket access through the application.

D.  

Use AWS Security Token Service (AWS STS) to assume a pre-defined IAM role that grants authenticated users temporary permissions to upload videos directly to the S3 bucket.

Discussion 0
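The pre-signed-URL option grants a time-boxed right to upload. Real S3 pre-signing is done by the SDK with SigV4; the standard-library sketch below only illustrates the underlying idea of an expiring HMAC signature. The secret and field layout are made up for the demo.

```python
import hashlib
import hmac

SECRET = b"demo-signing-key"  # stands in for the SDK's SigV4 credentials

def sign_upload(key: str, expires_at: int) -> str:
    """Sign an object key together with its expiry timestamp."""
    msg = f"{key}:{expires_at}".encode()
    return hmac.new(SECRET, msg, hashlib.sha256).hexdigest()

def allow_upload(key: str, expires_at: int, signature: str, now: int) -> bool:
    """Accept the upload only if the signature matches and has not expired."""
    expected = sign_upload(key, expires_at)
    return hmac.compare_digest(expected, signature) and now < expires_at

exp = 1_700_000_000
sig = sign_upload("videos/clip.mp4", exp)
print(allow_upload("videos/clip.mp4", exp, sig, now=exp - 60))  # True
print(allow_upload("videos/clip.mp4", exp, sig, now=exp + 60))  # False
```

Because the expiry is part of the signed message, a client cannot extend its own window, which is exactly the property the question asks for.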
Questions 65

A solutions architect is creating a website that will be hosted from an Amazon S3 bucket. The website must support secure browser connections (HTTPS).

Which combination of actions must the solutions architect take to meet this requirement? (Select TWO.)

Options:

A.  

Create an Elastic Load Balancing (ELB) load balancer. Configure the load balancer to direct traffic to the S3 bucket.

B.  

Create an Amazon CloudFront distribution. Set the S3 bucket as an origin.

C.  

Configure the Elastic Load Balancing (ELB) load balancer with an SSL/TLS certificate.

D.  

Configure the Amazon CloudFront distribution with an SSL/TLS certificate.

E.  

Configure the S3 bucket with an SSL/TLS certificate.

Discussion 0
Questions 66

A company runs its application on Oracle Database Enterprise Edition. The company needs to migrate the application and the database to AWS. The company can use the Bring Your Own License (BYOL) model while migrating to AWS. The application uses third-party database features that require privileged access.

A solutions architect must design a solution for the database migration.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.  

Migrate the database to Amazon RDS for Oracle by using native tools. Replace the third-party features with AWS Lambda.

B.  

Migrate the database to Amazon RDS Custom for Oracle by using native tools. Customize the new database settings to support the third-party features.

C.  

Migrate the database to Amazon DynamoDB by using AWS Database Migration Service (AWS DMS). Customize the new database settings to support the third-party features.

D.  

Migrate the database to Amazon RDS for PostgreSQL by using AWS Database Migration Service (AWS DMS). Rewrite the application code to remove the dependency on third-party features.

Discussion 0
Questions 67

A company hosts an application on AWS that uses an Amazon S3 bucket and an Amazon Aurora database. The company wants to implement a multi-Region disaster recovery (DR) strategy that minimizes potential data loss.

Which solution will meet these requirements?

Options:

A.  

Create an Aurora read replica in a second Availability Zone within the same AWS Region. Enable S3 Versioning for the bucket.

B.  

Create an Aurora read replica in a second AWS Region. Configure AWS Backup to create continuous backups of the S3 bucket to a second bucket in a second Availability Zone.

C.  

Enable Aurora native database backups across multiple AWS Regions. Use S3 cross-account backups within the company's local Region.

D.  

Migrate the database to an Aurora global database. Create a second S3 bucket in a second Region. Configure Cross-Region Replication.

Discussion 0
Questions 68

A healthcare provider is planning to store patient data on AWS as PDF files. To comply with regulations, the company must encrypt the data and store the files in multiple locations. The data must be available for immediate access from any environment.

Which solution will meet these requirements?

Options:

A.  

Store the files in an Amazon S3 bucket. Use the Standard storage class. Enable server-side encryption with Amazon S3 managed keys (SSE-S3) on the bucket. Configure cross-Region replication on the bucket.

B.  

Store the files in an Amazon Elastic File System (Amazon EFS) volume. Use an AWS KMS managed key to encrypt the EFS volume. Use AWS DataSync to replicate the EFS volume to a second AWS Region.

C.  

Store the files in an Amazon Elastic Block Store (Amazon EBS) volume. Configure AWS Backup to back up the volume on a regular schedule. Use an AWS KMS key to encrypt the backups.

D.  

Store the files in an Amazon S3 bucket. Use the S3 Glacier Flexible Retrieval storage class. Ensure that all PDF files are encrypted by using client-side encryption before the files are uploaded. Configure cross-Region replication on the bucket.

Discussion 0
Questions 69

A media company hosts a web application on AWS. The application gives users the ability to upload and view videos. The application stores the videos in an Amazon S3 bucket. The company wants to ensure that only authenticated users can upload videos. Authenticated users must have the ability to upload videos only within a specified time frame after authentication. Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Configure the application to generate IAM temporary security credentials for authenticated users.

B.  

Create an AWS Lambda function that generates pre-signed URLs when a user authenticates.

C.  

Develop a custom authentication service that integrates with Amazon Cognito to control and log direct S3 bucket access through the application.

D.  

Use AWS Security Token Service (AWS STS) to assume a pre-defined IAM role that grants authenticated users temporary permissions to upload videos directly to the S3 bucket.

Discussion 0
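For the question above, option B's pre-signed URLs are time-limited, signed links: anyone holding the URL can upload until the embedded expiry passes, after which the signature check fails. The mechanism can be sketched in plain Python. This is a simplified illustration of the idea, not the real AWS Signature Version 4 scheme; the signing key, bucket, and host names are hypothetical:

```python
import hashlib
import hmac
import time
from urllib.parse import urlencode

SECRET_KEY = b"hypothetical-signing-key"  # stands in for the signer's AWS credentials

def presign(bucket, key, expires_in, now=None):
    """Return an upload URL that is valid for expires_in seconds."""
    expires = int((now if now is not None else time.time()) + expires_in)
    payload = f"PUT\n{bucket}\n{key}\n{expires}".encode()
    sig = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    qs = urlencode({"Expires": expires, "Signature": sig})
    return f"https://{bucket}.example.com/{key}?{qs}"

def is_valid(bucket, key, expires, signature, now=None):
    """Server-side check: the signature must match and the expiry must be in the future."""
    if (now if now is not None else time.time()) >= expires:
        return False
    payload = f"PUT\n{bucket}\n{key}\n{expires}".encode()
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)
```

With boto3, the equivalent is `s3_client.generate_presigned_url("put_object", Params={...}, ExpiresIn=...)`, which performs the signing locally against the caller's credentials, so no extra infrastructure is needed beyond the authenticating Lambda function.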
Questions 70

A company has an on-premises MySQL database that handles transactional data. The company is migrating the database to the AWS Cloud. The migrated database must maintain compatibility with the company's applications that use the database. The migrated database also must scale automatically during periods of increased demand.

Which migration solution will meet these requirements?

Options:

A.  

Use native MySQL tools to migrate the database to Amazon RDS for MySQL. Configure elastic storage scaling.

B.  

Migrate the database to Amazon Redshift by using the mysqldump utility. Turn on Auto Scaling for the Amazon Redshift cluster.

C.  

Use AWS Database Migration Service (AWS DMS) to migrate the database to Amazon Aurora. Turn on Aurora Auto Scaling.

D.  

Use AWS Database Migration Service (AWS DMS) to migrate the database to Amazon DynamoDB. Configure an Auto Scaling policy.

Discussion 0
Questions 71

A company runs an application on Amazon EC2 instances behind an Application Load Balancer (ALB). The company uses Amazon Route 53 to route traffic to the ALB. The ALB is a resource in an AWS Shield Advanced protection group.

The company is preparing for a blue/green deployment in which traffic will shift to a new ALB. The company wants to protect against DDoS attacks during the deployment.

Which solution will meet this requirement?

Options:

A.  

Add the new ALB to the Shield Advanced protection group. Select Sum as the aggregation type for the volume of traffic for the whole group.

B.  

Add the new ALB to the Shield Advanced protection group. Select Mean as the aggregation type for the volume of traffic for the whole group.

C.  

Create a new Shield Advanced protection group. Add the new ALB to the new protection group. Select Sum as the aggregation type for the volume of traffic.

D.  

Set up an Amazon CloudFront distribution. Add the CloudFront distribution and the new ALB to the Shield Advanced protection group. Select Max as the aggregation type for the volume of traffic for the whole group.

Discussion 0
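The arithmetic behind option A in the Shield Advanced question: with Sum aggregation the protection group is baselined as one unit, so adding the idle green ALB, or shifting traffic between the two ALBs, leaves the group total unchanged, whereas Mean drops as soon as a zero-traffic member joins. A quick sketch with illustrative numbers:

```python
# Traffic volume (requests/s) per protection-group member at three moments.
before = {"blue_alb": 10_000}                      # only the blue ALB is in the group
after = {"blue_alb": 10_000, "green_alb": 0}       # idle green ALB added for the deployment
halfway = {"blue_alb": 5_000, "green_alb": 5_000}  # mid-cutover, traffic split 50/50

def sum_agg(group):
    """Group volume under Sum aggregation: members are treated as one unit."""
    return sum(group.values())

def mean_agg(group):
    """Group volume under Mean aggregation: an idle member drags the value down."""
    return sum(group.values()) / len(group)
```

The Sum value stays at 10,000 req/s through the whole blue/green shift, so the group's DDoS baseline is undisturbed; the Mean value halves the moment the idle ALB is added.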
Questions 72

An ecommerce company runs a PostgreSQL database on an Amazon EC2 instance. The database stores data in Amazon Elastic Block Store (Amazon EBS) volumes. The daily peak input/output transactions per second (IOPS) do not exceed 15,000 IOPS. The company wants to migrate the database to Amazon RDS for PostgreSQL and to provision disk IOPS performance that is independent of disk storage capacity.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.  

Configure General Purpose SSD (gp2) EBS volumes. Provision a 5 TiB volume.

B.  

Configure Provisioned IOPS SSD (io1) EBS volumes. Provision 15,000 IOPS.

C.  

Configure General Purpose SSD (gp3) EBS volumes. Provision 15,000 IOPS.

D.  

Configure magnetic EBS volumes to achieve maximum IOPS.

Discussion 0
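A quick sanity check on the numbers in options A and C above: gp2 delivers a baseline of 3 IOPS per GiB, so reaching 15,000 IOPS forces roughly a 5 TiB volume even if far less storage is needed, while gp3 includes a 3,000 IOPS baseline and lets additional IOPS (up to 16,000 per volume) be provisioned independently of size:

```python
GP2_IOPS_PER_GIB = 3       # gp2 baseline performance scales with volume size
GP3_BASELINE_IOPS = 3_000  # included with every gp3 volume
GP3_MAX_IOPS = 16_000      # per-volume gp3 ceiling

def gp2_size_for_iops(iops):
    """Smallest gp2 volume (GiB) whose baseline reaches the target IOPS."""
    return -(-iops // GP2_IOPS_PER_GIB)  # ceiling division

def gp3_extra_iops(iops):
    """IOPS to provision on gp3 beyond the included baseline."""
    if iops > GP3_MAX_IOPS:
        raise ValueError("beyond the gp3 per-volume limit")
    return max(0, iops - GP3_BASELINE_IOPS)
```

So gp2 couples IOPS to capacity (15,000 IOPS needs about 5,000 GiB), while gp3 only needs 12,000 additional provisioned IOPS on a volume sized to the actual data.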
Questions 73

A company's solutions architect is building a static website to be deployed in Amazon S3 for a production environment. The website integrates with an Amazon Aurora PostgreSQL database by using an AWS Lambda function. The website that is deployed to production will use a Lambda alias that points to a specific version of the Lambda function.

The company must rotate the database credentials every 2 weeks. Lambda functions that the company deployed previously must be able to use the most recent credentials.

Which solution will meet these requirements?

Options:

A.  

Store the database credentials in AWS Secrets Manager. Turn on rotation. Write code in the Lambda function to retrieve the credentials from Secrets Manager.

B.  

Include the database credentials as part of the Lambda function code. Update the credentials periodically and deploy the new Lambda function.

C.  

Use Lambda environment variables. Update the environment variables when new credentials are available.

D.  

Store the database credentials in AWS Systems Manager Parameter Store. Turn on rotation. Write code in the Lambda function to retrieve the credentials from Systems Manager Parameter Store.

Discussion 0
Questions 74

A company wants to migrate a visual search application from an on-premises environment to AWS. The application uses NFS storage to cache images. The image cache is currently a few terabytes in size. The company needs to migrate to a cost-effective cloud alternative.

Which solution will meet these requirements in the MOST cost-effective way?

Options:

A.  

Use an Amazon ElastiCache (Memcached) cluster as the image cache. Set the cache TTL according to the required image lifetime in the cache.

B.  

Use compute-optimized Amazon EC2 instances with instance store volumes as the image cache. Recycle EC2 instances for cache invalidation.

C.  

Use an Amazon EFS One Zone file system as the image cache. Configure the application to use the EFS mount target.

D.  

Use Amazon S3 Express One Zone to store the images. Store the S3 object URLs in an Amazon DynamoDB table. Use DynamoDB TTL to invalidate image cache entries.

Discussion 0
Questions 75

A company's SAP application has a backend SQL Server database in an on-premises environment. The company wants to migrate its on-premises application and database server to AWS. The company needs an instance type that meets the high demands of its SAP database. On-premises performance data shows that both the SAP application and the database have high memory utilization.

Which solution will meet these requirements?

Options:

A.  

Use the compute optimized instance family for the application. Use the memory optimized instance family for the database.

B.  

Use the storage optimized instance family for both the application and the database.

C.  

Use the memory optimized instance family for both the application and the database.

D.  

Use the high performance computing (HPC) optimized instance family for the application. Use the memory optimized instance family for the database.

Discussion 0
Questions 76

A company is using an Amazon Elastic Kubernetes Service (Amazon EKS) cluster. The company must ensure that Kubernetes service accounts in the EKS cluster have secure and granular access to specific AWS resources by using IAM roles for service accounts (IRSA).

Which combination of solutions will meet these requirements? (Select TWO.)

Options:

A.  

Create an IAM policy that defines the required permissions. Attach the policy directly to the IAM role of the EKS nodes.

B.  

Implement network policies within the EKS cluster to prevent Kubernetes service accounts from accessing specific AWS services.

C.  

Modify the EKS cluster's IAM role to include permissions for each Kubernetes service account. Ensure a one-to-one mapping between IAM roles and Kubernetes roles.

D.  

Define an IAM role that includes the necessary permissions. Annotate the Kubernetes service accounts with the Amazon Resource Name (ARN) of the IAM role.

E.  

Set up a trust relationship between the IAM roles for the service accounts and an OpenID Connect (OIDC) identity provider.

Discussion 0
Questions 77

A company needs to run its external website on Amazon EC2 instances and on-premises virtualized servers. The AWS environment has a 1 Gbps AWS Direct Connect connection to the data center. The application has IP addresses that will not change. The on-premises and AWS servers are able to restart themselves while maintaining the same IP address if a failure occurs. Some website users have to add their vendors to an allow list, so the solution must have a fixed IP address. The company needs a solution with the lowest operational overhead to handle this split traffic.

What should a solutions architect do to meet these requirements?

Options:

A.  

Deploy an Amazon Route 53 Resolver with rules pointing to the on-premises and AWS IP addresses.

B.  

Deploy a Network Load Balancer on AWS. Create target groups for the on-premises and AWS IP addresses.

C.  

Deploy an Application Load Balancer on AWS. Register the on-premises and AWS IP addresses with the target group.

D.  

Deploy Amazon API Gateway to direct traffic to the on-premises and AWS IP addresses based on the header of the request.

Discussion 0
Questions 78

A company has a multi-tier web application. The application's internal service components are deployed on Amazon EC2 instances. The internal service components need to access third-party software as a service (SaaS) APIs that are hosted on AWS.

The company needs to provide secure and private connectivity from the application's internal services to the third-party SaaS application. The company needs to ensure that there is minimal public internet exposure.

Which solution will meet these requirements?

Options:

A.  

Implement an AWS Site-to-Site VPN to establish a secure connection with the third-party SaaS provider.

B.  

Deploy AWS Transit Gateway to manage and route traffic between the application's VPC and the third-party SaaS provider.

C.  

Configure AWS PrivateLink to allow only outbound traffic from the VPC without enabling the third-party SaaS provider to establish a return path to the network.

D.  

Use AWS PrivateLink to create a private connection between the application's VPC and the third-party SaaS provider.

Discussion 0
Questions 79

A software company needs to upgrade a critical web application. The application is hosted on an Amazon EC2 instance in a public subnet. The EC2 instance runs a MySQL database. The application's DNS records are published in an Amazon Route 53 zone.

A solutions architect must reconfigure the application to be scalable and highly available. The solutions architect must also reduce MySQL read latency.

Which combination of solutions will meet these requirements? (Select TWO.)

Options:

A.  

Launch a second EC2 instance in a second AWS Region. Use a Route 53 failover routing policy to redirect the traffic to the second EC2 instance.

B.  

Create and configure an Auto Scaling group to launch private EC2 instances in multiple Availability Zones. Add the instances to a target group behind a new Application Load Balancer.

C.  

Migrate the database to an Amazon Aurora MySQL cluster. Create the primary DB instance and reader DB instance in separate Availability Zones.

D.  

Create and configure an Auto Scaling group to launch private EC2 instances in multiple AWS Regions. Add the instances to a target group behind a new Application Load Balancer.

E.  

Migrate the database to an Amazon Aurora MySQL cluster with cross-Region read replicas.

Discussion 0
Questions 80

A company has an ecommerce application that users access through multiple mobile apps and web applications. The company needs a solution that will receive requests from the mobile apps and web applications through an API.

Request traffic volume varies significantly throughout each day. Traffic spikes during sales events. The solution must be loosely coupled and ensure that no requests are lost.

Which solution will meet these requirements?

Options:

A.  

Create an Application Load Balancer (ALB). Create an AWS Elastic Beanstalk endpoint to process the requests. Add the Elastic Beanstalk endpoint to the target group of the ALB.

B.  

Set up an Amazon API Gateway REST API with an integration to an Amazon Simple Queue Service (Amazon SQS) queue. Configure a dead-letter queue. Create an AWS Lambda function to poll the queue to process the requests.

C.  

Create an Application Load Balancer (ALB). Create an AWS Lambda function to process the requests. Add the Lambda function as a target of the ALB.

D.  

Set up an Amazon API Gateway HTTP API with an integration to an Amazon Simple Notification Service (Amazon SNS) topic. Create an AWS Lambda function to process the requests. Subscribe the function to the SNS topic to process the requests.

Discussion 0
Questions 81

A company needs a solution to ingest streaming sensor data from 100,000 devices, transform the data in near real time, and load the data into Amazon S3 for analysis. The solution must be fully managed, scalable, and maintain sub-second ingestion latency.

Which solution will meet these requirements?

Options:

A.  

Use Amazon Kinesis Data Streams to ingest the data. Use Amazon Managed Service for Apache Flink to process the data in near real time. Use an Amazon Data Firehose stream to send processed data to Amazon S3.

B.  

Use Amazon Simple Queue Service (Amazon SQS) standard queues to collect the sensor data. Invoke AWS Lambda functions to transform and process SQS messages in batches. Configure the Lambda functions to use an AWS SDK to write transformed data to Amazon S3.

C.  

Deploy a fleet of Amazon EC2 instances that run Apache Kafka to ingest the data. Run Apache Spark on Amazon EMR clusters to process the data. Configure Spark to write processed data directly to Amazon S3.

D.  

Implement Amazon EventBridge to capture all sensor data. Use AWS Batch to run containerized transformation jobs on a schedule. Configure AWS Batch jobs to process data in chunks. Save results to Amazon S3.

Discussion 0
Questions 82

A company wants to use AWS Direct Connect to connect on-premises networks to AWS. The company runs many VPCs in a single Region and plans to scale to hundreds of VPCs.

Which service will simplify and scale the network architecture?

Options:

A.  

VPC endpoints

B.  

AWS Transit Gateway

C.  

Amazon Route 53

D.  

AWS Secrets Manager

Discussion 0
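The scaling argument for AWS Transit Gateway (option B) can be made numerically: connecting n VPCs to one another with pairwise VPC peering needs on the order of n(n-1)/2 connections, while a transit gateway needs only one attachment per VPC, so the count grows linearly instead of quadratically:

```python
def full_mesh_peering(n):
    """Pairwise VPC peering connections needed to fully connect n VPCs."""
    return n * (n - 1) // 2

def transit_gateway_attachments(n):
    """One attachment per VPC when all routing goes through a transit gateway."""
    return n
```

At the hundreds-of-VPCs scale in the question, the difference is stark: 100 VPCs would need 4,950 peering connections versus 100 transit gateway attachments, and the Direct Connect gateway can likewise attach to the transit gateway once rather than to each VPC.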
Questions 83

A company runs an application on an Amazon ECS cluster that uses AWS Fargate On-Demand capacity. The application cannot tolerate any sudden interruptions. The company wants to optimize costs for the application and ensure that the application remains operational.

Which solution will meet these requirements?

Options:

A.  

Create an On-Demand Capacity Reservation.

B.  

Purchase Convertible Reserved Instances.

C.  

Use Fargate Spot capacity instead of On-Demand capacity with a rolling update deployment type.

D.  

Purchase a Compute Savings Plan.

Discussion 0
Questions 84

An ecommerce company is redesigning a web application to run on the AWS Cloud. The application needs to store static website content and must use a Microsoft SQL Server database to store customer data. The company needs to deploy the application in a resilient way across multiple Availability Zones.

Which solution will meet these requirements?

Options:

A.  

Use an Amazon S3 bucket to store static content. Deploy an Amazon RDS Custom for SQL Server DB instance for the database.

B.  

Use an Amazon S3 bucket to store static content. Create an Amazon RDS for SQL Server Multi-AZ deployment for the database.

C.  

Create an Amazon Elastic Block Store (Amazon EBS) Multi-Attach volume to store static content. Deploy an Amazon RDS for SQL Server DB instance for the database.

D.  

Create an Amazon Elastic Block Store (Amazon EBS) Multi-Attach volume to store static content. Deploy SQL Server on two Amazon EC2 instances in separate Availability Zones.

Discussion 0
Questions 85

A company runs its critical storage application in the AWS Cloud. The application uses Amazon S3 in two AWS Regions. The company wants the application to send remote user data to the nearest S3 bucket with no public network congestion. The company also wants the application to fail over with the least amount of management of Amazon S3.

Which solution will meet these requirements?

Options:

A.  

Implement an active-active design between the two Regions. Configure the application to use the regional S3 endpoints closest to the user.

B.  

Use an active-passive configuration with S3 Multi-Region Access Points. Create a global endpoint for each of the Regions.

C.  

Send user data to the regional S3 endpoints closest to the user. Configure an S3 cross-account replication rule to keep the S3 buckets synchronized.

D.  

Set up Amazon S3 to use Multi-Region Access Points in an active-active configuration with a single global endpoint. Configure S3 Cross-Region Replication.

Discussion 0
Questions 86

A media streaming company is redesigning its infrastructure to accommodate increasing demand for video content that users consume daily. The company needs to process terabyte-sized videos to block some content in the videos. Video processing can take up to 20 minutes.

The company needs a solution that is cost-effective, highly available, and scalable.

Which solution will meet these requirements?

Options:

A.  

Use AWS Lambda functions to process the videos. Store video metadata in Amazon DynamoDB. Store video content in Amazon S3 Intelligent-Tiering.

B.  

Use Amazon Elastic Container Service (Amazon ECS) with the AWS Fargate launch type to implement microservices to process videos. Store video metadata in Amazon Aurora. Store video content in Amazon S3 Intelligent-Tiering.

C.  

Use Amazon EMR to process the videos with Apache Spark. Store video content in Amazon FSx for Lustre. Use Amazon Kinesis Data Streams to ingest videos in real time.

D.  

Deploy a containerized video processing application on Amazon Elastic Kubernetes Service (Amazon EKS) with the Amazon EC2 launch type. Store video metadata in Amazon RDS in a single Availability Zone. Store video content in Amazon S3 Glacier Deep Archive.

Discussion 0
Questions 87

A company is developing software that uses a PostgreSQL database schema. The company needs to configure development environments and test environments for its developers.

Each developer at the company uses their own development environment, which includes a PostgreSQL database. On average, each development environment is used for an 8-hour workday. The test environments will be used for load testing that can take up to 2 hours each day.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.  

Configure development environments and test environments with their own Amazon Aurora Serverless v2 PostgreSQL database.

B.  

For each development environment, configure an Amazon RDS for PostgreSQL Single-AZ DB instance. For the test environment, configure a single Amazon RDS for PostgreSQL Multi-AZ DB instance.

C.  

Configure development environments and test environments with their own Amazon Aurora PostgreSQL DB cluster.

D.  

Configure an Amazon Aurora global database. Allow developers to connect to the database with their own credentials.

Discussion 0
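Rough numbers behind option A in the development/test environment question: a provisioned DB instance bills for every hour of the month, while Aurora Serverless v2 scales capacity down when a database sits idle outside its usage window. The active-hour figures below assume roughly 22 working days a month and omit per-ACU pricing, so they only show the utilization gap:

```python
HOURS_PER_MONTH = 730  # approximate hours in a month
WORK_DAYS = 22         # assumed working days per month

dev_active_hours = 8 * WORK_DAYS   # one developer environment, 8-hour workday
test_active_hours = 2 * WORK_DAYS  # load-testing window, up to 2 hours per day

def active_fraction(active_hours, total=HOURS_PER_MONTH):
    """Share of the month during which the environment actually needs capacity."""
    return active_hours / total
```

A dev database is busy for only about a quarter of the month and a test database for well under a tenth, which is why per-second, scale-to-demand billing beats always-on provisioned instances here.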
Questions 88

A company runs an ecommerce platform with a monolithic architecture on Amazon EC2 instances. The platform runs web and API services. The company wants to decouple the architecture and enhance scalability. The company also wants the ability to track orders and reprocess any failed orders.

Which solution will meet these requirements?

Options:

A.  

Send orders to an Amazon Simple Queue Service (Amazon SQS) queue. Configure AWS Lambda functions to consume the queue and process orders. Implement an SQS dead-letter queue.

B.  

Send orders to an Amazon Simple Queue Service (Amazon SQS) queue. Configure Amazon Elastic Container Service (Amazon ECS) tasks to consume the queue. Implement SQS visibility timeout.

C.  

Use Amazon Kinesis Data Streams to queue orders. Use AWS Lambda functions to consume the data stream. Configure Amazon S3 to track and reprocess failed orders.

D.  

Send orders to an Amazon Simple Queue Service (Amazon SQS) queue. Configure AWS Lambda functions to consume the queue and process orders. Configure the Lambda functions to use SQS long polling.

Discussion 0
Questions 89

A company has an employee web portal. Employees log in to the portal to view payroll details. The company is developing a new system to give employees the ability to upload scanned documents for reimbursement. The company runs a program to extract text-based data from the documents and attach the extracted information to each employee's reimbursement IDs for processing.

The employee web portal requires 100% uptime. The document extract program runs infrequently throughout the day on an on-demand basis. The company wants to build a scalable and cost-effective new system that will require minimal changes to the existing web portal. The company does not want to make any code changes.

Which solution will meet these requirements with the LEAST implementation effort?

Options:

A.  

Run Amazon EC2 On-Demand Instances in an Auto Scaling group for the web portal. Use an AWS Lambda function to run the document extract program. Invoke the Lambda function when an employee uploads a new reimbursement document.

B.  

Run Amazon EC2 Spot Instances in an Auto Scaling group for the web portal. Run the document extract program on EC2 Spot Instances. Start document extract program instances when an employee uploads a new reimbursement document.

C.  

Purchase a Savings Plan to run the web portal and the document extract program. Run the web portal and the document extract program in an Auto Scaling group.

D.  

Create an Amazon S3 bucket to host the web portal. Use Amazon API Gateway and an AWS Lambda function for the existing functionalities. Use the Lambda function to run the document extract program. Invoke the Lambda function when the API that is associated with a new document upload is called.

Discussion 0
Questions 90

A company is creating a mobile financial app that gives users the ability to sign up and store personal information. The app uses an Amazon DynamoDB table to store user details and preferences.

The app generates a credit score report by using the data that is stored in DynamoDB. The app sends credit score reports to users once every month.

The company needs to provide users with an option to remove their data and preferences. The app must delete customer data within one month of receiving a request to delete the data.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Create an AWS Lambda function to delete user information. Create an Amazon EventBridge rule that runs when a specified TTL expires. Configure the EventBridge rule to invoke the Lambda function.

B.  

Create a DynamoDB stream. Create an AWS Lambda function to delete user information. When a specified TTL expires, write user information to the DynamoDB stream from the DynamoDB table. Configure the DynamoDB stream to invoke the Lambda function to delete user information.

C.  

Enable TTL in DynamoDB. Set the expiration date as an attribute. Create an AWS Lambda function to set the TTL based on the expiration date value. Invoke the Lambda function when a user requests to delete personal data.

D.  

Enable TTL in DynamoDB. Create an AWS Lambda function to delete user information. Configure AWS Config to detect the DynamoDB state change when TTL expires and to invoke the Lambda function.

Discussion 0
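For the TTL-based options in the question above, the DynamoDB item carries an expiration attribute holding an epoch-seconds timestamp; DynamoDB deletes the item in the background some time after that moment passes, which fits the one-month deletion window without any scheduled jobs. A sketch of computing the attribute when a deletion request arrives (the `ttl` attribute name and 30-day grace period are assumptions for illustration):

```python
import time

SECONDS_PER_DAY = 86_400

def deletion_item(user_id, requested_at=None, grace_days=30):
    """Attributes to write so DynamoDB TTL removes the item within about a month."""
    requested_at = requested_at if requested_at is not None else time.time()
    return {
        "user_id": user_id,
        "ttl": int(requested_at) + grace_days * SECONDS_PER_DAY,  # epoch seconds
    }
```

The application would write this item back with `put_item`/`update_item` after enabling TTL on the table and designating `ttl` as the TTL attribute.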
Questions 91

A company has an application that runs on Amazon EC2 instances and uses an Amazon Aurora database. The EC2 instances connect to the Aurora database by using user names and passwords that the company stores locally in a file.

The company changes the user names and passwords every month. The company wants to minimize the operational overhead of credential management.

Which solution will meet these requirements?

Options:

A.  

Store the credentials as a secret within AWS Secrets Manager. Assign IAM permissions to the secret. Reconfigure the application to call the secret. Enable rotation on the secret and configure rotation to occur on a monthly schedule.

B.  

Use AWS Systems Manager Parameter Store to create a new parameter for the credentials. Use IAM policies to restrict access to the parameter. Reconfigure the application to access the parameter.

C.  

Create an Amazon S3 bucket to store objects. Use an AWS Key Management Service (AWS KMS) key to encrypt the objects. Migrate the credentials file to the S3 bucket. Update the application to retrieve the credentials file from the S3 bucket.

D.  

Create an encrypted Amazon Elastic Block Store (Amazon EBS) volume for each EC2 instance. Attach the encrypted EBS volumes to the EC2 instances. Migrate the credentials file to the new EBS volumes.

Discussion 0
Questions 92

A company uses Amazon RDS for PostgreSQL databases for its data tier. The company must implement password rotation for the databases.

Which solution meets this requirement with the LEAST operational overhead?

Options:

A.  

Store the password in AWS Secrets Manager. Enable automatic rotation on the secret.

B.  

Store the password in AWS Systems Manager Parameter Store. Enable automatic rotation on the parameter.

C.  

Store the password in AWS Systems Manager Parameter Store. Write an AWS Lambda function that rotates the password.

D.  

Store the password in AWS Key Management Service (AWS KMS). Enable automatic rotation on the AWS KMS key.

Discussion 0
Questions 93

A company has stored millions of objects across multiple prefixes in an Amazon S3 bucket by using the Amazon S3 Glacier Deep Archive storage class. The company needs to delete all data older than 3 years except for a subset of data that must be retained. The company has identified the data that must be retained and wants to implement a serverless solution.

Which solution will meet these requirements?

Options:

A.  

Use S3 Inventory to list all objects. Use the AWS CLI to create a script that runs on an Amazon EC2 instance that deletes objects from the inventory list.

B.  

Use AWS Batch to delete objects older than 3 years except for the data that must be retained.

C.  

Provision an AWS Glue crawler to query objects older than 3 years. Save the manifest file of old objects. Create a script to delete objects in the manifest.

D.  

Enable S3 Inventory. Create an AWS Lambda function to filter and delete objects. Invoke the Lambda function with S3 Batch Operations to delete objects by using the inventory reports.

Discussion 0
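The Lambda function in option D essentially applies a filter like the following over each S3 Inventory row: delete an object when its last-modified date is more than 3 years old, unless its key appears in the retain list. The field shapes and the retain-list key below are assumptions for illustration:

```python
from datetime import datetime, timedelta, timezone

# Keys the company has identified as data that must be retained (hypothetical).
RETAIN = {"logs/2020/keep-me.gz"}

def should_delete(key, last_modified, now=None, max_age_days=3 * 365):
    """True if the object is older than ~3 years and is not on the retain list."""
    now = now if now is not None else datetime.now(timezone.utc)
    return key not in RETAIN and (now - last_modified) > timedelta(days=max_age_days)
```

S3 Batch Operations would then invoke the Lambda function once per inventory object, and the function issues the actual `DeleteObject` call only when this predicate is true, keeping the whole workflow serverless.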
Questions 94

A data science team requires storage for nightly log processing. The size and number of logs is unknown and the logs will persist for 24 hours only.

What is the MOST cost-effective solution?

Options:

A.  

Amazon S3 Glacier Deep Archive

B.  

Amazon S3 Standard

C.  

Amazon S3 Intelligent-Tiering

D.  

Amazon S3 One Zone-Infrequent Access (S3 One Zone-IA)

Discussion 0
Questions 95

A company has a large fleet of vehicles that are equipped with internet connectivity to send telemetry to the company. The company receives over 1 million data points every 5 minutes from the vehicles. The company uses the data in machine learning (ML) applications to predict vehicle maintenance needs and to preorder parts. The company produces visual reports based on the captured data. The company wants to migrate the telemetry ingestion, processing, and visualization workloads to AWS.

Which solution will meet these requirements?

Options:

A.  

Use Amazon Timestream for LiveAnalytics to store the data points. Grant Amazon SageMaker permission to access the data for processing. Use Amazon QuickSight to visualize the data.

B.  

Use Amazon DynamoDB to store the data points. Use DynamoDB Connector to ingest data from DynamoDB into Amazon EMR for processing. Use Amazon QuickSight to visualize the data.

C.  

Use Amazon Neptune to store the data points. Use Amazon Kinesis Data Streams to ingest data from Neptune into an AWS Lambda function for processing. Use Amazon QuickSight to visualize the data.

D.  

Use Amazon Timestream for LiveAnalytics to store the data points. Grant Amazon SageMaker permission to access the data for processing. Use Amazon Athena to visualize the data.

Discussion 0
Questions 96

Question:

A company wants to deploy an internal web application on AWS. The web application must be accessible only from the company ' s office. The company needs to download security patches for the web application from the internet. The company has created a VPC and has configured an AWS Site-to-Site VPN connection to the company ' s office. A solutions architect must design a secure architecture for the web application. Which solution will meet these requirements?

Options:

A.  

Deploy the web application on Amazon EC2 instances in public subnets behind a public Application Load Balancer (ALB). Attach an internet gateway to the VPC. Set the inbound source of the ALB's security group to 0.0.0.0/0.

B.  

Deploy the web application on Amazon EC2 instances in private subnets behind an internal Application Load Balancer (ALB). Deploy NAT gateways in public subnets. Attach an internet gateway to the VPC. Set the inbound source of the ALB's security group to the company's office network CIDR block.

C.  

Deploy the web application on Amazon EC2 instances in public subnets behind an internal Application Load Balancer (ALB). Deploy NAT gateways in private subnets. Attach an internet gateway to the VPC. Set the outbound destination of the ALB's security group to the company's office network CIDR block.

D.  

Deploy the web application on Amazon EC2 instances in private subnets behind a public Application Load Balancer (ALB). Attach an internet gateway to the VPC. Set the outbound destination of the ALB's security group to 0.0.0.0/0.

Discussion 0
Questions 97

How can a company detect and notify security teams about PII in S3 buckets?

Options:

A.  

Use Amazon Macie. Create an EventBridge rule for SensitiveData findings and send an SNS notification.

B.  

Use Amazon GuardDuty. Create an EventBridge rule for CRITICAL findings and send an SNS notification.

C.  

Use Amazon Macie. Create an EventBridge rule for SensitiveData:S3Object/Personal findings and send an SQS notification.

D.  

Use Amazon GuardDuty. Create an EventBridge rule for CRITICAL findings and send an SQS notification.

Discussion 0
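The Macie-to-EventBridge wiring in option A can be sketched as a plain event pattern. This is a minimal sketch: the `"Macie Finding"` detail-type and the `SensitiveData` type prefix are assumptions based on Macie's documented finding categories, not values taken from this question.

```python
import json

# Hypothetical EventBridge event pattern that matches Amazon Macie
# sensitive-data findings so a rule can route them to an SNS topic.
event_pattern = {
    "source": ["aws.macie"],
    "detail-type": ["Macie Finding"],
    "detail": {
        # Matches finding types such as SensitiveData:S3Object/Personal.
        "type": [{"prefix": "SensitiveData"}],
    },
}

# EventBridge's PutRule API expects the pattern as a JSON string.
event_pattern_json = json.dumps(event_pattern)
```

A rule with this pattern plus an SNS topic target (with the security team subscribed) completes the notification path.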
Questions 98

A company is creating a payment processing application that supports TLS connections from IPv4 clients. The application requires outbound access to the public internet. The application must allow users to access the application from a single entry point while maintaining the lowest possible attack surface.

The company wants to use Amazon ECS tasks to deploy the application. The company wants to enable awsvpc network mode.

Which solution will meet these requirements?

Options:

A.  

Create a VPC that has an internet gateway, public subnets, and private subnets. Deploy a Network Load Balancer (NLB) and a NAT gateway in the public subnets. Deploy the ECS tasks in the private subnets.

B.  

Create a VPC that has an egress-only internet gateway, public subnets, and private subnets. Deploy an Application Load Balancer (ALB) and a NAT gateway in the public subnets. Deploy the ECS tasks in the private subnets.

C.  

Create a VPC that has an internet gateway, public subnets, and private subnets. Deploy an Application Load Balancer (ALB) in the public subnets. Deploy the ECS tasks in the public subnets.

D.  

Create a VPC that has an egress-only internet gateway, public subnets, and private subnets. Deploy a Network Load Balancer (NLB) in the public subnets. Deploy the ECS tasks in the public subnets.

Discussion 0
Questions 99

A company is designing a solution to capture customer activity on the company's web applications. The company wants to analyze the activity data to make predictions.

Customer activity on the web applications is unpredictable and can increase suddenly. The company requires a solution that integrates with other web applications. The solution must include an authorization step.

Which solution will meet these requirements?

Options:

A.  

Deploy a Gateway Load Balancer (GWLB) in front of an Amazon Elastic Container Service (Amazon ECS) container instance. Store the data in an Amazon Elastic File System (Amazon EFS) file system. Configure the applications to pass an authorization header to the GWLB.

B.  

Deploy an Amazon API Gateway endpoint in front of an Amazon Kinesis data stream. Store the data in an Amazon S3 bucket. Use an AWS Lambda function to handle authorization.

C.  

Deploy an Amazon API Gateway endpoint in front of an Amazon Data Firehose delivery stream. Store the data in an Amazon S3 bucket. Use an API Gateway Lambda authorizer to handle authorization.

D.  

Deploy a Gateway Load Balancer (GWLB) in front of an Amazon Elastic Container Service (Amazon ECS) container instance. Store the data in an Amazon Elastic File System (Amazon EFS) file system. Use an AWS Lambda function to handle authorization.

Discussion 0
Questions 100

A company has an application that uses an Amazon DynamoDB table for storage. A solutions architect discovers that many requests to the table are not returning the latest data. The company’s users have not reported any other issues with database performance. Latency is in an acceptable range.

Which design change should the solutions architect recommend?

Options:

A.  

Add read replicas to the table.

B.  

Use a global secondary index (GSI).

C.  

Request strongly consistent reads for the table.

D.  

Request eventually consistent reads for the table.

Discussion 0
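The scenario above hinges on DynamoDB's read consistency model: reads are eventually consistent by default, and strong consistency is requested per call. A minimal sketch of the `GetItem` request parameters (table and key names are hypothetical):

```python
# DynamoDB reads are eventually consistent by default, which can return
# stale data. Setting ConsistentRead=True requests a strongly consistent
# read for this call instead.
get_item_params = {
    "TableName": "AppData",             # hypothetical table name
    "Key": {"pk": {"S": "user#123"}},   # hypothetical partition key
    "ConsistentRead": True,
}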
Questions 101

A company is moving data from an on-premises data center to the AWS Cloud. The company must store all its data in an Amazon S3 bucket. To comply with regulations, the company must also ensure that the data will be protected against overwriting indefinitely.

Which solution will ensure that the data in the S3 bucket cannot be overwritten?

Options:

A.  

Enable versioning for the S3 bucket. Use server-side encryption with Amazon S3 managed keys (SSE-S3) to protect the data.

B.  

Disable versioning for the S3 bucket. Configure S3 Object Lock for the S3 bucket with a retention period of 1 year.

C.  

Enable versioning for the S3 bucket. Configure S3 Object Lock for the S3 bucket with a legal hold.

D.  

Configure S3 Storage Lens for the S3 bucket. Use server-side encryption with customer-provided keys (SSE-C) to protect the data.

Discussion 0
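For the Object Lock approach, a legal hold protects an object version indefinitely until the hold is explicitly removed, which fits the "indefinitely" requirement better than a fixed retention period. A sketch of the request parameters for placing a legal hold (bucket and key names are hypothetical; the bucket must have versioning and Object Lock enabled):

```python
# Sketch of S3 PutObjectLegalHold parameters. A legal hold has no
# expiry date; the object version stays protected until the hold's
# Status is set back to OFF.
legal_hold_params = {
    "Bucket": "compliance-data-bucket",   # hypothetical bucket
    "Key": "records/customer-001.json",   # hypothetical object key
    "LegalHold": {"Status": "ON"},
}
```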
Questions 102

A company hosts an application on AWS that gives users the ability to download photos. The company stores all photos in an Amazon S3 bucket that is located in the us-east-1 Region. The company wants to provide the photo download application to global customers with low latency.

Which solution will meet these requirements?

Options:

A.  

Find the public IP addresses that Amazon S3 uses in us-east-1. Configure an Amazon Route 53 latency-based routing policy that routes to all the public IP addresses.

B.  

Configure an Amazon CloudFront distribution in front of the S3 bucket. Use the distribution endpoint to access the photos that are in the S3 bucket.

C.  

Configure an Amazon Route 53 geoproximity routing policy to route the traffic to the S3 bucket that is closest to each customer's location.

D.  

Create a new S3 bucket in the us-west-1 Region. Configure an S3 Cross-Region Replication rule to copy the photos to the new S3 bucket.

Discussion 0
Questions 103

A company deploys its applications on Amazon Elastic Kubernetes Service (Amazon EKS) behind an Application Load Balancer in an AWS Region. The application needs to store data in a PostgreSQL database engine. The company wants the data in the database to be highly available. The company also needs increased capacity for read workloads.

Which solution will meet these requirements with the MOST operational efficiency?

Options:

A.  

Create an Amazon DynamoDB database table configured with global tables.

B.  

Create an Amazon RDS database with Multi-AZ deployments.

C.  

Create an Amazon RDS database with Multi-AZ DB cluster deployment.

D.  

Create an Amazon RDS database configured with cross-Region read replicas.

Discussion 0
Questions 104

A company is running a highly sensitive application on Amazon EC2 backed by an Amazon RDS database. Compliance regulations mandate that all personally identifiable information (PII) be encrypted at rest.

Which solution should a solutions architect recommend to meet this requirement with the LEAST amount of changes to the infrastructure?

Options:

A.  

Deploy AWS Certificate Manager to generate certificates. Use the certificates to encrypt the database volume.

B.  

Deploy AWS CloudHSM, generate encryption keys, and use the keys to encrypt database volumes.

C.  

Configure SSL encryption using AWS Key Management Service (AWS KMS) keys to encrypt database volumes.

D.  

Configure Amazon Elastic Block Store (Amazon EBS) encryption and Amazon RDS encryption with AWS Key Management Service (AWS KMS) keys to encrypt instance and database volumes.

Discussion 0
Questions 105

A company has migrated several applications to AWS in the past 3 months. The company wants to know the breakdown of costs for each of these applications. The company wants to receive a regular report that includes this information.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.  

Use AWS Budgets to download data for the past 3 months into a CSV file. Look up the desired information.

B.  

Load AWS Cost and Usage Reports into an Amazon RDS DB instance. Run SQL queries to get the desired information.

C.  

Tag all the AWS resources with a key for cost and a value of the application's name. Activate cost allocation tags. Use Cost Explorer to get the desired information.

D.  

Tag all the AWS resources with a key for cost and a value of the application's name. Use the AWS Billing and Cost Management console to download bills for the past 3 months. Look up the desired information.

Discussion 0
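The tagging approach in option C can be sketched as plain data: every resource gets the same tag key, with the application name as the value. Resource IDs below are hypothetical; once the key is activated as a cost allocation tag, Cost Explorer can group and filter spend by it.

```python
# Hypothetical resource IDs mapped to the application they belong to.
application_of = {
    "app1-ec2": "app1",
    "app1-rds": "app1",
    "app2-ec2": "app2",
}

# Build the tag set each resource would receive: a shared "cost" key
# whose value is the application name, as described in the option.
tags_by_resource = {
    resource: [{"Key": "cost", "Value": app}]
    for resource, app in application_of.items()
}
```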
Questions 106

A company is building a stock trading application in the AWS Cloud. The company requires a highly available solution that provides low-latency access to block storage across multiple Availability Zones.

Which solution will meet these requirements?

Options:

A.  

Use an Amazon S3 bucket and an S3 File Gateway as shared storage for the application.

B.  

Create an Amazon EC2 instance in each Availability Zone. Attach a General Purpose SSD (gp3) Amazon Elastic Block Store (Amazon EBS) volume to each EC2 instance. Create a Bash script to sync data between volumes.

C.  

Use an Amazon FSx for NetApp ONTAP Multi-AZ file system to access data by using the iSCSI protocol.

D.  

Create an Amazon EC2 instance in each Availability Zone. Attach a Provisioned IOPS SSD (io2) Amazon Elastic Block Store (Amazon EBS) volume to each EC2 instance. Create a Python script to sync data between volumes.

Discussion 0
Questions 107

A gaming company is building an application that uses a database to store user data. The company wants the database to have an active-active configuration that allows data writes to a secondary AWS Region. The database must achieve a sub-second recovery point objective (RPO).

Which solution will meet these requirements?

Options:

A.  

Deploy an Amazon ElastiCache (Redis OSS) cluster. Configure a global data store for disaster recovery. Configure the ElastiCache cluster to cache data from an Amazon RDS database that is deployed in the primary Region.

B.  

Deploy an Amazon DynamoDB table in the primary Region and the secondary Region. Configure Amazon DynamoDB Streams to invoke an AWS Lambda function to write changes from the table in the primary Region to the table in the secondary Region.

C.  

Deploy an Amazon Aurora MySQL database in the primary Region. Configure a global database for the secondary Region.

D.  

Deploy an Amazon DynamoDB table in the primary Region. Configure global tables for the secondary Region.

Discussion 0
Questions 108

A company uses AWS Lake Formation to govern its S3 data lake. It wants to visualize data in QuickSight by joining S3 data with Aurora MySQL operational data. The marketing team must see only specific columns.

Which solution provides column-level authorization with the least operational overhead?

Options:

A.  

Use EMR to ingest database data into SPICE with only required columns.

B.  

Use AWS Glue Studio to ingest database data into S3 and use IAM policies for column control.

C.  

Use AWS Glue Elastic Views to create materialized S3 views with column restrictions.

D.  

Use a Lake Formation blueprint to ingest database data to S3. Use Lake Formation for column-level access control. Use Athena as the QuickSight data source.

Discussion 0
Questions 109

A company hosts an application on AWS that stores files that users need to access. The application uses two Amazon EC2 instances. One instance is in Availability Zone A, and the second instance is in Availability Zone B. Both instances use Amazon Elastic Block Store (Amazon EBS) volumes. Users must be able to access the files at any time without delay. Users report that the two instances occasionally contain different versions of the same file. Users occasionally receive HTTP 404 errors when they try to download files. The company must address the customer issues. The company cannot make changes to the application code.

Which solution will meet these requirements in the MOST operationally efficient way?

Options:

A.  

Run the robocopy command on one of the EC2 instances on a schedule to copy files from the Availability Zone A instance to the Availability Zone B instance.

B.  

Configure the application to store the files on both EBS volumes each time a user writes or updates a file.

C.  

Mount an Amazon Elastic File System (Amazon EFS) file system to the EC2 instances. Copy the files from the EBS volumes to the EFS file system. Configure the application to store files in the EFS file system.

D.  

Create an EC2 instance profile that allows the instance in Availability Zone A to access the S3 bucket. Re-associate the instance profile to the instance in Availability Zone B when needed.

Discussion 0
Questions 110

A company is migrating mobile banking applications to run on Amazon EC2 instances in a VPC. Backend service applications run in an on-premises data center. The data center has an AWS Direct Connect connection into AWS. The applications that run in the VPC need to resolve DNS requests to an on-premises Active Directory domain that runs in the data center.

Which solution will meet these requirements with the LEAST administrative overhead?

Options:

A.  

Provision a set of EC2 instances across two Availability Zones in the VPC as caching DNS servers to resolve DNS queries from the application servers within the VPC.

B.  

Provision an Amazon Route 53 private hosted zone. Configure NS records that point to on-premises DNS servers.

C.  

Create DNS endpoints by using Amazon Route 53 Resolver. Add conditional forwarding rules to resolve DNS namespaces between the on-premises data center and the VPC.

D.  

Provision a new Active Directory domain controller in the VPC with a bidirectional trust between this new domain and the on-premises Active Directory domain.

Discussion 0
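The Route 53 Resolver approach can be sketched as the parameters for a forwarding rule that sends queries for the on-premises Active Directory domain to the data-center DNS servers. The domain name and target IP addresses below are hypothetical; the rule would then be associated with the VPC alongside an outbound Resolver endpoint.

```python
# Sketch of Route 53 Resolver CreateResolverRule parameters for a
# conditional forwarding rule. Queries for the on-premises domain are
# forwarded to the data center's DNS servers over Direct Connect.
resolver_rule_params = {
    "CreatorRequestId": "corp-ad-forward-1",
    "Name": "forward-corp-ad",
    "RuleType": "FORWARD",
    "DomainName": "corp.example.com",        # hypothetical AD domain
    "TargetIps": [
        {"Ip": "10.0.100.10", "Port": 53},   # hypothetical on-prem DNS
        {"Ip": "10.0.100.11", "Port": 53},
    ],
}
```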
Questions 111

A company is using microservices to build an ecommerce application on AWS. The company wants to preserve customer transaction information after customers submit orders. The company wants to store transaction data in an Amazon Aurora database. The company expects sales volumes to vary throughout each year.

Which solution will meet these requirements?

Options:

A.  

Use an Amazon API Gateway REST API to invoke an AWS Lambda function to send transaction data to the Aurora database. Send transaction data to an Amazon Simple Queue Service (Amazon SQS) queue that has a dead-letter queue. Use a second Lambda function to read from the SQS queue and to update the Aurora database.

B.  

Use an Amazon API Gateway HTTP API to send transaction data to an Application Load Balancer (ALB). Use the ALB to send the transaction data to Amazon Elastic Container Service (Amazon ECS) on Amazon EC2. Use ECS tasks to store the data in the Aurora database.

C.  

Use an Application Load Balancer (ALB) to route transaction data to Amazon Elastic Kubernetes Service (Amazon EKS). Use Amazon EKS to send the data to the Aurora database.

D.  

Use Amazon Data Firehose to send transaction data to Amazon S3. Use AWS Database Migration Service (AWS DMS) to migrate the data from Amazon S3 to the Aurora database.

Discussion 0
Questions 112

A company wants to run big data workloads on Amazon EMR. The workloads need to process terabytes of data in memory.

A solutions architect needs to identify the appropriate EMR cluster instance configuration for the workloads.

Which solution will meet these requirements?

Options:

A.  

Use a storage optimized instance for the primary node. Use compute optimized instances for core nodes and task nodes.

B.  

Use a memory optimized instance for the primary node. Use storage optimized instances for core nodes and task nodes.

C.  

Use a general purpose instance for the primary node. Use memory optimized instances for core nodes and task nodes.

D.  

Use general purpose instances for the primary, core, and task nodes.

Discussion 0
Questions 113

A digital image processing company wants to migrate its on-premises monolithic application to the AWS Cloud. The company processes thousands of images and generates large files as part of the processing workflow.

The company needs a solution to manage the growing number of image processing jobs. The solution must also reduce the manual tasks in the image processing workflow. The company does not want to manage the underlying infrastructure of the solution.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Use Amazon Elastic Container Service (Amazon ECS) with Amazon EC2 Spot Instances to process the images. Configure Amazon Simple Queue Service (Amazon SQS) to orchestrate the workflow. Store the processed files in Amazon Elastic File System (Amazon EFS).

B.  

Use AWS Batch jobs to process the images. Use AWS Step Functions to orchestrate the workflow. Store the processed files in an Amazon S3 bucket.

C.  

Use AWS Lambda functions and Amazon EC2 Spot Instances to process the images. Store the processed files in Amazon FSx.

D.  

Deploy a group of Amazon EC2 instances to process the images. Use AWS Step Functions to orchestrate the workflow. Store the processed files in an Amazon Elastic Block Store (Amazon EBS) volume.

Discussion 0
Questions 114

A media company has an ecommerce website to sell music. Each music file is stored as an MP3 file. Premium users of the website purchase music files and download the files. The company wants to store music files on AWS. The company wants to provide access only to the premium users. The company wants to use the same URL for all premium users.

Which solution will meet these requirements?

Options:

A.  

Store the MP3 files on a set of Amazon EC2 instances that have Amazon Elastic Block Store (Amazon EBS) volumes attached. Manage access to the files by creating an IAM user and an IAM policy for each premium user.

B.  

Store all the MP3 files in an Amazon S3 bucket. Create a presigned URL for each MP3 file. Share the presigned URLs with the premium users.

C.  

Store all the MP3 files in an Amazon S3 bucket. Create an Amazon CloudFront distribution that uses the S3 bucket as the origin. Generate CloudFront signed cookies for the music files. Share the signed cookies with the premium users.

D.  

Store all the MP3 files in an Amazon S3 bucket. Create an Amazon CloudFront distribution that uses the S3 bucket as the origin. Use a CloudFront signed URL for each music file. Share the signed URLs with the premium users.

Discussion 0
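The signed-URL and signed-cookie options both rest on a CloudFront policy that restricts access to a resource until an expiry time. A minimal sketch of a custom policy for one music file follows; the distribution URL is hypothetical, and actually signing the policy additionally requires an RSA key pair registered with CloudFront, which is omitted here.

```python
import base64
import json
import time

# Custom policy limiting access to one file until an expiry timestamp.
expires = int(time.time()) + 3600  # valid for one hour
policy = {
    "Statement": [
        {
            "Resource": "https://d111111abcdef8.cloudfront.net/song.mp3",
            "Condition": {"DateLessThan": {"AWS:EpochTime": expires}},
        }
    ]
}

# CloudFront expects the policy base64-encoded with the characters
# +, =, / replaced by -, _, ~ so the value stays URL-safe.
encoded = base64.b64encode(json.dumps(policy).encode()).decode()
policy_param = encoded.replace("+", "-").replace("=", "_").replace("/", "~")
```

The same policy shape backs signed cookies; the difference is only whether the policy and signature travel as query-string parameters or as cookies.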
Questions 115

A company is enhancing the security of its AWS environment, where the company stores a significant amount of sensitive customer data. The company needs a solution that automatically identifies and classifies sensitive data that is stored in multiple Amazon S3 buckets. The solution must automatically respond to data breaches and alert the company's security team through email immediately when noncompliant data is found.

Which solution will meet these requirements?

Options:

A.  

Use Amazon GuardDuty. Configure an AWS Lambda function to route alerts to an Amazon Simple Notification Service (Amazon SNS) topic. Subscribe the security team to the SNS topic.

B.  

Use Amazon GuardDuty. Configure an AWS Lambda function to route alerts to an Amazon Simple Queue Service (Amazon SQS) queue. Configure a second Lambda function to periodically poll the SQS queue and to send emails to the security team by using Amazon Simple Email Service (Amazon SES).

C.  

Use Amazon Macie. Integrate Amazon EventBridge with Macie, and configure EventBridge to send alerts to an Amazon Simple Notification Service (Amazon SNS) topic. Subscribe the security team to the SNS topic.

D.  

Use Amazon Macie. Integrate Amazon EventBridge with Macie, and configure EventBridge to route alerts to an Amazon Simple Queue Service (Amazon SQS) queue. Configure an AWS Lambda function to periodically poll the SQS queue and to send alerts to the security team by using Amazon Simple Email Service (Amazon SES).

Discussion 0
Questions 116

A company has a web application that uses several web servers that run on Amazon EC2 instances. The instances use a shared Amazon RDS for MySQL database.

The company requires a secure method to store database credentials. The credentials must be automatically rotated every 30 days without affecting application availability.

Which solution will meet these requirements?

Options:

A.  

Store database credentials in AWS Secrets Manager. Create an AWS Lambda function to automatically rotate the credentials. Use Amazon EventBridge to run the Lambda function on a schedule. Grant the necessary IAM permissions to allow the web servers to access Secrets Manager.

B.  

Store database credentials in AWS Systems Manager OpsCenter. Grant the necessary IAM permissions to allow the web servers to access OpsCenter.

C.  

Store database credentials in an Amazon S3 bucket. Create an AWS Lambda function to automatically rotate the credentials. Use Amazon EventBridge to run the Lambda function on a schedule. Grant the necessary IAM permissions to allow the web servers to retrieve credentials from the S3 bucket.

D.  

Store the credentials in a local file on each of the web servers. Use an AWS KMS key to encrypt the credentials. Create a cron job on each server to rotate the credentials every 30 days.

Discussion 0
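Worth noting alongside the Lambda-plus-EventBridge answer: Secrets Manager also exposes rotation as a first-class setting via its RotateSecret API, where the schedule lives on the secret itself. A sketch of those parameters for a 30-day cadence (secret name and function ARN are hypothetical):

```python
# Sketch of Secrets Manager RotateSecret parameters: the rotation Lambda
# function is attached to the secret, and Secrets Manager invokes it on
# the configured schedule, so the application keeps reading the secret
# by name without interruption.
rotate_secret_params = {
    "SecretId": "prod/mysql/app-credentials",  # hypothetical secret name
    "RotationLambdaARN": (
        "arn:aws:lambda:us-east-1:123456789012:function:rotate-mysql"  # hypothetical
    ),
    "RotationRules": {"AutomaticallyAfterDays": 30},
}
```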
Questions 117

A company is creating a new application that will store a large amount of data. The data will be analyzed hourly and will be modified by several Amazon EC2 Linux instances that are deployed across multiple Availability Zones. The needed amount of storage space will continue to grow for the next 6 months.

Which storage solution should a solutions architect recommend to meet these requirements?

Options:

A.  

Store the data in Amazon S3 Glacier. Update the S3 Glacier vault policy to allow access to the application instances.

B.  

Store the data in an Amazon Elastic Block Store (Amazon EBS) volume. Mount the EBS volume on the application instances.

C.  

Store the data in an Amazon Elastic File System (Amazon EFS) file system. Mount the file system on the application instances.

D.  

Store the data in an Amazon Elastic Block Store (Amazon EBS) Provisioned IOPS volume shared between the application instances.

Discussion 0
Questions 118

A company is moving a legacy data processing application to the AWS Cloud. The application needs to run on Amazon EC2 instances behind an Application Load Balancer (ALB).

The application must handle incoming traffic spikes and continue to work in the event of an application fault in one Availability Zone. The company requires that a Web Application Firewall (WAF) must be attached to the ALB.

Which solution will meet these requirements?

Options:

A.  

Deploy the application to EC2 instances in an Auto Scaling group that is in a single Availability Zone. Use an ALB to distribute traffic. Use AWS WAF.

B.  

Deploy the application to EC2 instances in an Auto Scaling group across multiple Availability Zones. Use an ALB to distribute traffic. Use AWS WAF.

C.  

Deploy the application to EC2 instances in Auto Scaling groups across multiple AWS Regions. Use Route 53 latency routing. Attach AWS WAF to Route 53.

D.  

Deploy the application to EC2 instances in an Auto Scaling group across multiple Availability Zones. Use a Network Load Balancer (NLB). Use AWS WAF.

Discussion 0
Questions 119

A company uses Amazon S3 to store customer data that contains personally identifiable information (PII) attributes. The company needs to make the customer information available to company resources through an AWS Glue Catalog. The company needs to have fine-grained access control for the data so that only specific IAM roles can access the PII data.

Which solution will meet these requirements?

Options:

A.  

Create one IAM policy that grants access to PII. Create a second IAM policy that grants access to non-PII data. Assign the PII policy to the specified IAM roles.

B.  

Create one IAM role that grants access to PII. Create a second IAM role that grants access to non-PII data. Assign the PII policy to the specified IAM roles.

C.  

Use AWS Lake Formation to provide the specified IAM roles access to the PII data.

D.  

Use AWS Glue to create one view for PII data. Create a second view for non-PII data. Provide the specified IAM roles access to the PII view.

Discussion 0
Questions 120

A company is implementing a new policy to enhance the security of its AWS environment. The policy requires all administrative actions that users perform on the AWS Management Console to be secured by multi-factor authentication (MFA).

Which solution will allow the company to enforce this policy in the MOST operationally efficient way?

Options:

A.  

Enable MFA on the root account. Ensure that all administrators use the root account to perform administrative actions.

B.  

Create an IAM policy that requires MFA to be enabled for the IAM roles that administrators assume to perform administrative actions.

C.  

Configure an Amazon CloudWatch alarm that sends an email notification when an administrator performs an administrative action without MFA.

D.  

Use AWS Config to periodically audit IAM users and to automatically attach an IAM policy that requires MFA when AWS Config detects administrative actions.

Discussion 0
Questions 121

The DNS provider that hosts a company's domain name records is experiencing outages that cause service disruption for a website running on AWS. The company needs to migrate to a more resilient managed DNS service and wants the service to run on AWS.

What should a solutions architect do to rapidly migrate the DNS hosting service?

Options:

A.  

Create an Amazon Route 53 public hosted zone for the domain name. Import the zone file containing the domain records hosted by the previous provider.

B.  

Create an Amazon Route 53 private hosted zone for the domain name. Import the zone file containing the domain records hosted by the previous provider.

C.  

Create a Simple AD directory in AWS. Enable zone transfer between the DNS provider and AWS Directory Service for Microsoft Active Directory for the domain records.

D.  

Create an Amazon Route 53 Resolver inbound endpoint in the VPC. Specify the IP addresses that the provider's DNS will forward DNS queries to. Configure the provider's DNS to forward DNS queries for the domain to the IP addresses that are specified in the inbound endpoint.

Discussion 0
Questions 122

A company has developed a non-production application that is composed of multiple microservices for each of the company's business units. A single development team maintains all the microservices.

The current architecture uses a static web frontend and a Java-based backend that contains the application logic. The architecture also uses a MySQL database that the company hosts on an Amazon EC2 instance.

The company needs to ensure that the application is secure and available globally.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Use Amazon CloudFront and AWS Amplify to host the static web frontend. Refactor the microservices to use AWS Lambda functions that the microservices access by using Amazon API Gateway. Migrate the MySQL database to an Amazon EC2 Reserved Instance.

B.  

Use Amazon CloudFront and Amazon S3 to host the static web frontend. Refactor the microservices to use AWS Lambda functions that the microservices access by using Amazon API Gateway. Migrate the MySQL database to Amazon RDS for MySQL.

C.  

Use Amazon CloudFront and Amazon S3 to host the static web frontend. Refactor the microservices to use AWS Lambda functions that are in a target group behind a Network Load Balancer. Migrate the MySQL database to Amazon RDS for MySQL.

D.  

Use Amazon S3 to host the static web frontend. Refactor the microservices to use AWS Lambda functions that are in a target group behind an Application Load Balancer. Migrate the MySQL database to an Amazon EC2 Reserved Instance.

Discussion 0
Questions 123

A company is developing a social media application that must scale to meet demand spikes and handle ordered processes.

Which AWS services meet these requirements?

Options:

A.  

ECS with Fargate, RDS, and SQS for decoupling.

B.  

ECS with Fargate, RDS, and SNS for decoupling.

C.  

DynamoDB, Lambda, DynamoDB Streams, and Step Functions.

D.  

Elastic Beanstalk, RDS, and SNS for decoupling.

Discussion 0
Questions 124

A company is building a serverless application to process large video files that users upload. The application performs multiple tasks to process each video file. Processing can take up to 30 minutes for the largest files.

The company needs a scalable architecture to support the processing application.

Which solution will meet these requirements?

Options:

A.  

Store the uploaded video files in Amazon Elastic File System (Amazon EFS). Configure a schedule in Amazon EventBridge Scheduler to invoke an AWS Lambda function periodically to check for new files. Configure the Lambda function to perform all the processing tasks.

B.  

Store the uploaded video files in Amazon Elastic File System (Amazon EFS). Configure an Amazon EFS event notification to start an AWS Step Functions workflow that uses AWS Fargate tasks to perform the processing tasks.

C.  

Store the uploaded video files in Amazon S3. Configure an Amazon S3 event notification to send an event to Amazon EventBridge when a user uploads a new video file. Configure an AWS Step Functions workflow as a target for an EventBridge rule. Use the workflow to manage AWS Fargate tasks to perform the processing tasks.

D.  

Store the uploaded video files in Amazon S3. Configure an Amazon S3 event notification to invoke an AWS Lambda function when a user uploads a new video file. Configure the Lambda function to perform all the processing tasks.

Discussion 0
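The S3-to-EventBridge-to-Step-Functions path in option C can be sketched as the EventBridge rule pattern that matches the "Object Created" events S3 emits to EventBridge. The bucket name below is hypothetical; the rule's target would be the Step Functions state machine that runs the Fargate processing tasks.

```python
import json

# Sketch of an EventBridge rule pattern matching S3 "Object Created"
# events for a specific upload bucket. S3 must have EventBridge
# notifications enabled on the bucket for these events to flow.
rule_pattern = {
    "source": ["aws.s3"],
    "detail-type": ["Object Created"],
    "detail": {"bucket": {"name": ["uploaded-videos-bucket"]}},
}

# PutRule takes the pattern as a JSON string.
rule_pattern_json = json.dumps(rule_pattern)
```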
Questions 125

A solutions architect is designing the architecture for a company website that is composed of static content. The company's target customers are located in the United States and Europe.

Which architecture should the solutions architect recommend to MINIMIZE cost?

Options:

A.  

Store the website files on Amazon S3 in the us-east-2 Region. Use an Amazon CloudFront distribution with the price class configured to limit the edge locations in use.

B.  

Store the website files on Amazon S3 in the us-east-2 Region. Use an Amazon CloudFront distribution with the price class configured to maximize the use of edge locations.

C.  

Store the website files on Amazon S3 in the us-east-2 Region and the eu-west-1 Region. Use an Amazon CloudFront geolocation routing policy to route requests to the closest Region to the user.

D.  

Store the website files on Amazon S3 in the us-east-2 Region and the eu-west-1 Region. Use an Amazon CloudFront distribution with an Amazon Route 53 latency routing policy to route requests to the closest Region to the user.

Discussion 0
Questions 126

A company hosts its application on several Amazon EC2 instances inside a VPC. The company creates a dedicated Amazon S3 bucket for each customer to store their relevant information in Amazon S3.

The company wants to ensure that the application running on EC2 instances can securely access only the S3 buckets that belong to the company's AWS account.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Create a gateway endpoint for Amazon S3 that is attached to the VPC. Update the IAM instance profile policy to provide access to only the specific buckets that the application needs.

B.  

Create a NAT gateway in a public subnet with a security group that allows access to only Amazon S3. Update the route tables to use the NAT gateway.

C.  

Create a gateway endpoint for Amazon S3 that is attached to the VPC. Update the IAM instance profile policy with a Deny action and the following condition key:

D.  

Create a NAT gateway in a public subnet. Update the route tables to use the NAT gateway. Assign bucket policies for all buckets with a Deny action and the following condition key:
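
For the gateway-endpoint approach in option A, the instance profile role would carry an identity-based policy that names only the buckets the application needs. The sketch below is a minimal illustration, assuming hypothetical bucket names; a real policy would list the company's actual buckets and the exact actions required.

```python
import json

# Hypothetical bucket names for illustration only.
COMPANY_BUCKETS = ["customer-a-bucket", "customer-b-bucket"]

# Identity-based policy for the EC2 instance profile role: allow object
# access only on the named buckets, which the instances reach privately
# through the S3 gateway endpoint.
instance_profile_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": [f"arn:aws:s3:::{b}/*" for b in COMPANY_BUCKETS],
        }
    ],
}

print(json.dumps(instance_profile_policy, indent=2))
```

A policy like this would be attached to the instance profile role with `iam:PutRolePolicy` or as a managed policy; no bucket policies or NAT gateways are needed.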

Questions 127

A company hosts a web application in a VPC on AWS. A public Application Load Balancer (ALB) forwards connections from the internet to an Auto Scaling group of Amazon EC2 instances. The Auto Scaling group runs in private subnets across four Availability Zones.

The company stores data in an Amazon S3 bucket in the same Region. The EC2 instances use NAT gateways in each Availability Zone for outbound internet connectivity.

The company wants to optimize costs for its AWS architecture.

Which solution will meet this requirement?

Options:

A.  

Reconfigure the Auto Scaling group and the ALB to use two Availability Zones instead of four. Do not change the desired count or scaling metrics for the Auto Scaling group to maintain application availability.

B.  

Create a new, smaller VPC that still has sufficient IP address availability to run the application. Redeploy the application stack in the new VPC. Delete the existing VPC and its resources.

C.  

Deploy an S3 gateway endpoint to the VPC. Configure the EC2 instances to access the S3 bucket through the S3 gateway endpoint.

D.  

Deploy an S3 interface endpoint to the VPC. Configure the EC2 instances to access the S3 bucket through the S3 interface endpoint.

Questions 128

A company stores data for multiple business units in a single Amazon S3 bucket that is in the company's payer AWS account. To maintain data isolation, the business units store data in separate prefixes in the S3 bucket by using an S3 bucket policy.

The company plans to add a large number of dynamic prefixes. The company does not want to rely on a single S3 bucket policy to manage data access at scale. The company wants to develop a secure access management solution in addition to the bucket policy to enforce prefix-level data isolation.

Which solution will meet these requirements?

Options:

A.  

Configure the S3 bucket policy to deny s3:GetObject permissions for all users. Configure the bucket policy to allow s3:* access to individual business units.

B.  

Enable default encryption on the S3 bucket by using server-side encryption with Amazon S3 managed keys (SSE-S3).

C.  

Configure resource-based permissions on the S3 bucket by creating an S3 access point for each business unit.

D.  

Use pre-signed URLs to provide access to the S3 bucket.

Questions 129

A manufacturing company runs an order processing application in its VPC. The company wants to securely send messages from the application to an external Salesforce system that uses Open Authorization (OAuth).

A solutions architect needs to integrate the company's order processing application with the external Salesforce system.

Which solution will meet these requirements?

Options:

A.  

Create an Amazon Simple Notification Service (Amazon SNS) topic in a fanout configuration that pushes data to an HTTPS endpoint. Configure the order processing application to publish messages to the SNS topic.

B.  

Create an Amazon Simple Notification Service (Amazon SNS) topic in a fanout configuration that pushes data to an Amazon Data Firehose delivery stream that has an HTTP destination. Configure the order processing application to publish messages to the SNS topic.

C.  

Create an Amazon EventBridge rule, and configure an Amazon EventBridge API destination partner. Configure the order processing application to publish messages to Amazon EventBridge.

D.  

Create an Amazon Managed Streaming for Apache Kafka (Amazon MSK) topic that has an outbound MSK Connect connector. Configure the order processing application to publish messages to the MSK topic.

Questions 130

A company is storing data that will not be frequently accessed in the AWS Cloud. If the company needs to access the data, the data must be retrieved within 12 hours. The company wants a solution that is cost-effective for storage costs per gigabyte.

Which Amazon S3 storage class will meet these requirements?

Options:

A.  

S3 Standard

B.  

S3 Glacier Flexible Retrieval

C.  

S3 One Zone-Infrequent Access (S3 One Zone-IA)

D.  

S3 Standard-Infrequent Access (S3 Standard-IA)
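
S3 Glacier Flexible Retrieval (storage-class name `GLACIER` in the S3 API) supports standard retrievals in hours, comfortably inside the 12-hour window, at a lower per-gigabyte price than the Standard and Infrequent Access classes. A minimal sketch of a lifecycle configuration that transitions objects into that class follows; the rule ID, prefix, and day count are illustrative assumptions.

```python
import json

# Lifecycle configuration, as it would be passed to S3's
# put_bucket_lifecycle_configuration call: move objects under an
# illustrative "archive/" prefix into S3 Glacier Flexible Retrieval
# (API storage-class name "GLACIER") after 30 days.
lifecycle_configuration = {
    "Rules": [
        {
            "ID": "archive-cold-data",         # illustrative rule name
            "Status": "Enabled",
            "Filter": {"Prefix": "archive/"},  # illustrative prefix
            "Transitions": [
                {"Days": 30, "StorageClass": "GLACIER"}
            ],
        }
    ]
}

print(json.dumps(lifecycle_configuration, indent=2))
```

Objects can also be written to the class directly by setting `StorageClass="GLACIER"` on upload, skipping the lifecycle transition.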

Questions 131

A solutions architect needs to implement a solution that can handle up to 5,000 messages per second. The solution must publish messages as events to multiple consumers. The messages are up to 500 KB in size. The message consumers need to have the ability to use multiple programming languages to consume the messages with minimal latency. The solution must retain published messages for more than 3 months. The solution must enforce strict ordering of the messages.

Which solution will meet these requirements?

Options:

A.  

Publish messages to an Amazon Kinesis Data Streams data stream. Enable enhanced fan-out. Ensure that consumers ingest the data stream by using dedicated throughput.

B.  

Publish messages to an Amazon Simple Notification Service (Amazon SNS) topic. Ensure that consumers use an Amazon Simple Queue Service (Amazon SQS) FIFO queue to subscribe to the topic.

C.  

Publish messages to Amazon EventBridge. Allow each consumer to create rules to deliver messages to the consumer's own target.

D.  

Publish messages to an Amazon Simple Notification Service (Amazon SNS) topic. Ensure that consumers use Amazon Data Firehose to subscribe to the topic.

Questions 132

A company uses Amazon Route 53 as its DNS provider. The company hosts a website both on premises and in the AWS Cloud. The company's on-premises data center is near the us-west-1 Region. The company hosts the website on AWS in the eu-central-1 Region.

The company wants to optimize load times for the website as much as possible.

Which solution will meet these requirements?

Options:

A.  

Create a DNS record with a failover routing policy that routes all primary traffic to eu-central-1. Configure the routing policy to use the on-premises data center as the secondary location.

B.  

Create a DNS record with an IP-based routing policy. Configure specific IP ranges to return the value for the eu-central-1 website. Configure all other IP ranges to return the value for the on-premises website.

C.  

Create a DNS record with a latency-based routing policy. Configure one latency record for the eu-central-1 website and one latency record for the on-premises data center. Associate the record for the on-premises data center with the us-west-1 Region.

D.  

Create a DNS record with a weighted routing policy. Split the traffic evenly between eu-central-1 and the on-premises data center.
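
Latency-based routing (option C) uses two record sets with the same name, each tagged with a `SetIdentifier` and an AWS `Region`; the on-premises endpoint is associated with us-west-1 because that is the nearest Region. The sketch below shows the `ChangeBatch` shape Route 53's `change_resource_record_sets` call expects; the domain name and IP addresses are placeholders.

```python
# ChangeBatch for Route 53 change_resource_record_sets: one latency record
# for the AWS-hosted site and one for the on-premises site, which is
# associated with the nearest Region (us-west-1). Name/IPs are placeholders.
latency_records = {
    "Changes": [
        {
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": "www.example.com",
                "Type": "A",
                "SetIdentifier": "aws-eu-central-1",
                "Region": "eu-central-1",   # website hosted on AWS
                "TTL": 60,
                "ResourceRecords": [{"Value": "203.0.113.10"}],
            },
        },
        {
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": "www.example.com",
                "Type": "A",
                "SetIdentifier": "onprem-us-west-1",
                "Region": "us-west-1",      # Region nearest the data center
                "TTL": 60,
                "ResourceRecords": [{"Value": "198.51.100.20"}],
            },
        },
    ]
}

regions = [c["ResourceRecordSet"]["Region"] for c in latency_records["Changes"]]
print(regions)
```

Route 53 then answers each query with the record whose associated Region has the lowest measured latency from the resolver.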

Questions 133

A company is developing a highly available natural language processing (NLP) application. The application handles large volumes of concurrent requests. The application performs NLP tasks such as entity recognition, sentiment analysis, and key phrase extraction on text data.

The company needs to store data that the application processes in a highly available and scalable database.

Which solution will meet these requirements?

Options:

A.  

Create an Amazon API Gateway REST API endpoint to handle incoming requests. Configure the REST API to invoke an AWS Lambda function for each request. Configure the Lambda function to call Amazon Comprehend to perform NLP tasks on the text data. Store the processed data in Amazon DynamoDB.

B.  

Create an Amazon API Gateway HTTP API endpoint to handle incoming requests. Configure the HTTP API to invoke an AWS Lambda function for each request. Configure the Lambda function to call Amazon Translate to perform NLP tasks on the text data. Store the processed data in Amazon ElastiCache.

C.  

Create an Amazon SQS queue to buffer incoming requests. Deploy the NLP application on Amazon EC2 instances in an Auto Scaling group. Use Amazon Comprehend to perform NLP tasks. Store the processed data in an Amazon RDS database.

D.  

Create an Amazon API Gateway WebSocket API endpoint to handle incoming requests. Configure the WebSocket API to invoke an AWS Lambda function for each request. Configure the Lambda function to call Amazon Textract to perform NLP tasks on the text data. Store the processed data in Amazon ElastiCache.

Questions 134

A company has an on-premises SFTP file transfer solution. The company is migrating to the AWS Cloud to scale the file transfer solution and to optimize costs by using Amazon S3. The company's employees will use their credentials for the on-premises Microsoft Active Directory (AD) to access the new solution. The company wants to keep the current authentication and file access mechanisms.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Configure an S3 File Gateway. Create SMB file shares on the file gateway that use the existing Active Directory to authenticate.

B.  

Configure an Auto Scaling group with Amazon EC2 instances to run an SFTP solution. Configure the group to scale up at 60% CPU utilization.

C.  

Create an AWS Transfer Family server with SFTP endpoints. Choose the AWS Directory Service option as the identity provider. Use AD Connector to connect to the on-premises Active Directory.

D.  

Create an AWS Transfer Family SFTP endpoint. Configure the endpoint to use the AWS Directory Service option as the identity provider to connect to the existing Active Directory.

Questions 135

A company needs an automated solution to detect cryptocurrency mining activity on Amazon EC2 instances. The solution must automatically isolate any identified EC2 instances for forensic analysis.

Which solution will meet these requirements?

Options:

A.  

Create an Amazon EventBridge rule that runs when Amazon GuardDuty detects cryptocurrency mining activity. Configure the rule to invoke an AWS Lambda function to isolate the identified EC2 instances.

B.  

Create an AWS Security Hub custom action that runs when Amazon GuardDuty detects cryptocurrency mining activity. Configure the custom action to invoke an AWS Lambda function to isolate the identified EC2 instances.

C.  

Create an Amazon Inspector rule that runs when Amazon GuardDuty detects cryptocurrency mining activity. Configure the rule to invoke an AWS Lambda function to isolate the identified EC2 instances.

D.  

Create an AWS Config custom rule that runs when AWS Config detects cryptocurrency mining activity. Configure the rule to invoke an AWS Lambda function to isolate the identified EC2 instances.
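
GuardDuty publishes its findings to EventBridge, so the automation in option A hinges on an event pattern that matches cryptocurrency-related finding types (which all begin with the `CryptoCurrency:` prefix). A minimal sketch of that pattern follows; the Lambda target that performs the isolation (for example, by swapping the instance into a quarantine security group) is assumed, not shown.

```python
import json

# EventBridge event pattern: match GuardDuty findings whose type starts
# with "CryptoCurrency:" (e.g. CryptoCurrency:EC2/BitcoinTool.B!DNS).
# An EventBridge rule with this pattern would invoke the isolation Lambda.
event_pattern = {
    "source": ["aws.guardduty"],
    "detail-type": ["GuardDuty Finding"],
    "detail": {
        "type": [{"prefix": "CryptoCurrency:"}]
    },
}

print(json.dumps(event_pattern))
```

The pattern string would be supplied as the `EventPattern` argument to EventBridge's `put_rule` call, with the Lambda function registered via `put_targets`.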

Questions 136

A company runs a three-tier web application in a VPC on AWS. The company deployed an Application Load Balancer (ALB) in a public subnet. The web tier and application tier Amazon EC2 instances are deployed in a private subnet. The company uses a self-managed MySQL database that runs on EC2 instances in an isolated private subnet for the database tier.

The company wants a mechanism that will give a DevOps team the ability to use SSH to access all the servers. The company also wants to have a centrally managed log of all connections made to the servers.

Which combination of solutions will meet these requirements with the MOST operational efficiency? (Select TWO.)

Options:

A.  

Create a bastion host in the public subnet. Configure security groups in the public, private, and isolated subnets to allow SSH access.

B.  

Create an interface VPC endpoint for AWS Systems Manager Session Manager. Attach the endpoint to the VPC.

C.  

Create an IAM policy that grants access to AWS Systems Manager Session Manager. Attach the IAM policy to the EC2 instances.

D.  

Create a gateway VPC endpoint for AWS Systems Manager Session Manager. Attach the endpoint to the VPC.

E.  

Attach an AmazonSSMManagedInstanceCore AWS managed IAM policy to all the EC2 instance roles.

Questions 137

A gaming company is building an application with Voice over IP capabilities. The application will serve traffic to users across the world. The application needs to be highly available with automated failover across AWS Regions. The company wants to minimize the latency of users without relying on IP address caching on user devices.

What should a solutions architect do to meet these requirements?

Options:

A.  

Use AWS Global Accelerator with health checks.

B.  

Use Amazon Route 53 with a geolocation routing policy.

C.  

Create an Amazon CloudFront distribution that includes multiple origins.

D.  

Create an Application Load Balancer that uses path-based routing.

Questions 138

A consulting company provides professional services to customers worldwide. The company provides solutions and tools for customers to expedite gathering and analyzing data on AWS. The company needs to centrally manage and deploy a common set of solutions and tools for customers to use for self-service purposes.

Which solution will meet these requirements?

Options:

A.  

Create AWS CloudFormation templates for the customers.

B.  

Create AWS Service Catalog products for the customers.

C.  

Create AWS Systems Manager templates for the customers.

D.  

Create AWS Config items for the customers.

Questions 139

A company has developed an API using an Amazon API Gateway REST API and AWS Lambda functions. The API serves static and dynamic content to users worldwide. The company wants to decrease the latency of transferring content for API requests.

Which solution will meet these requirements?

Options:

A.  

Deploy the REST API as an edge-optimized API endpoint. Enable caching. Enable content encoding in the API definition to compress the application data in transit.

B.  

Deploy the REST API as a Regional API endpoint. Enable caching. Enable content encoding in the API definition to compress the application data in transit.

C.  

Deploy the REST API as an edge-optimized API endpoint. Enable caching. Configure reserved concurrency for the Lambda functions.

D.  

Deploy the REST API as a Regional API endpoint. Enable caching. Configure reserved concurrency for the Lambda functions.

Questions 140

A company is designing a web application with an internet-facing Application Load Balancer (ALB).

The company needs the ALB to receive HTTPS web traffic from the public internet. The ALB must send only HTTPS traffic to the web application servers hosted on the Amazon EC2 instances on port 443. The ALB must perform a health check of the web application servers over HTTPS on port 8443.

Which combination of configurations of the security group that is associated with the ALB will meet these requirements? (Select THREE.)

Options:

A.  

Allow HTTPS inbound traffic from 0.0.0.0/0 for port 443.

B.  

Allow all outbound traffic to 0.0.0.0/0 for port 443.

C.  

Allow HTTPS outbound traffic to the web application instances for port 443.

D.  

Allow HTTPS inbound traffic from the web application instances for port 443.

E.  

Allow HTTPS outbound traffic to the web application instances for the health check on port 8443.

F.  

Allow HTTPS inbound traffic from the web application instances for the health check on port 8443.
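
The requirements translate into three security-group rules on the ALB: one inbound rule from the internet and two outbound rules toward the web tier, one for application traffic and one for health checks. A minimal sketch of those rules follows; `"web-tier-sg"` is a placeholder for the instances' security group, and in a real deployment each rule would be created with EC2's `authorize_security_group_ingress`/`authorize_security_group_egress` calls.

```python
# ALB security group (options A, C, and E): HTTPS in from anywhere,
# HTTPS out to the web tier on 443 (traffic) and 8443 (health checks).
alb_security_group_rules = {
    "ingress": [
        {"protocol": "tcp", "port": 443, "source": "0.0.0.0/0"},
    ],
    "egress": [
        {"protocol": "tcp", "port": 443, "destination": "web-tier-sg"},
        {"protocol": "tcp", "port": 8443, "destination": "web-tier-sg"},
    ],
}

egress_ports = sorted(r["port"] for r in alb_security_group_rules["egress"])
print(egress_ports)
```

Restricting egress to the web tier's security group, rather than 0.0.0.0/0, keeps the ALB from reaching anything other than its targets.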

Questions 141

A company is building a serverless application to process orders from an e-commerce site. The application needs to handle bursts of traffic during peak usage hours and to maintain high availability. The orders must be processed asynchronously in the order the application receives them.

Which solution will meet these requirements?

Options:

A.  

Use an Amazon Simple Notification Service (Amazon SNS) topic to receive orders. Use an AWS Lambda function to process the orders.

B.  

Use an Amazon Simple Queue Service (Amazon SQS) FIFO queue to receive orders. Use an AWS Lambda function to process the orders.

C.  

Use an Amazon Simple Queue Service (Amazon SQS) standard queue to receive orders. Use AWS Batch jobs to process the orders.

D.  

Use an Amazon Simple Notification Service (Amazon SNS) topic to receive orders. Use AWS Batch jobs to process the orders.

Questions 142

A solutions architect needs to design a system to process incoming work items immediately. Processing can take up to 30 minutes and involves calling external APIs, executing multiple states, and storing intermediate states.

The solution must scale with variable workloads and minimize operational overhead.

Which combination of steps meets these requirements? (Select TWO.)

Options:

A.  

Invoke an AWS Lambda function for each incoming work item. Configure each function to handle the work item completely. Store states in DynamoDB.

B.  

Invoke an AWS Step Functions workflow to process incoming work items. Use Lambda functions for business logic. Store work item states in DynamoDB.

C.  

Set up an API Gateway REST API to receive work items. Configure the API to invoke a Lambda function for each work item.

D.  

Deploy two EC2 Reserved Instances behind an ALB and send requests to an SQS queue.

E.  

Set up an API Gateway REST API to receive work items. Send the work items to an SQS queue.

Questions 143

A company hosts an application in a private subnet. The company has already integrated the application with Amazon Cognito. The company uses an Amazon Cognito user pool to authenticate users.

The company needs to modify the application so the application can securely store user documents in an Amazon S3 bucket.

Which combination of steps will securely integrate Amazon S3 with the application? (Select TWO.)

Options:

A.  

Create an Amazon Cognito identity pool to generate secure Amazon S3 access tokens for users when they successfully log in.

B.  

Use the existing Amazon Cognito user pool to generate Amazon S3 access tokens for users when they successfully log in.

C.  

Create an Amazon S3 VPC endpoint in the same VPC where the company hosts the application.

D.  

Create a NAT gateway in the VPC where the company hosts the application. Assign a policy to the S3 bucket to deny any request that is not initiated from Amazon Cognito.

E.  

Attach a policy to the S3 bucket that allows access only from the users' IP addresses.

Questions 144

A solutions architect is designing the architecture for a two-tier web application. The web application consists of an internet-facing Application Load Balancer (ALB) that forwards traffic to an Auto Scaling group of Amazon EC2 instances.

The EC2 instances must be able to access an Amazon RDS database. The company does not want to rely solely on security groups or network ACLs. Only the minimum resources that are necessary should be routable from the internet.

Which network design meets these requirements?

Options:

A.  

Place the ALB, EC2 instances, and RDS database in private subnets.

B.  

Place the ALB in public subnets. Place the EC2 instances and RDS database in private subnets.

C.  

Place the ALB and EC2 instances in public subnets. Place the RDS database in private subnets.

D.  

Place the ALB outside the VPC. Place the EC2 instances and RDS database in private subnets.

Questions 145

An ecommerce company experiences a surge in mobile application traffic every Monday at 8 AM during the company's weekly sales events. The application's backend uses an Amazon API Gateway HTTP API and AWS Lambda functions to process user requests. During peak sales periods, users report encountering TooManyRequestsException errors from the Lambda functions. The errors result in a degraded user experience. A solutions architect needs to design a scalable and resilient solution that minimizes the errors and ensures that the application's overall functionality remains unaffected.

Which solution will meet these requirements?

Options:

A.  

Create an Amazon Simple Queue Service (Amazon SQS) queue. Send user requests to the SQS queue. Configure the Lambda function with provisioned concurrency. Set the SQS queue as the event source trigger.

B.  

Use AWS Step Functions to orchestrate and process user requests. Configure Step Functions to invoke the Lambda functions and to manage the request flow.

C.  

Create an Amazon Simple Notification Service (Amazon SNS) topic. Send user requests to the SNS topic. Configure the Lambda functions with provisioned concurrency. Subscribe the functions to the SNS topic.

D.  

Create an Amazon Simple Queue Service (Amazon SQS) queue. Send user requests to the SQS queue. Configure the Lambda functions with reserved concurrency. Set the SQS queue as the event source trigger for the functions.
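
The SQS-plus-reserved-concurrency pattern in option D decouples the burst from the function: requests queue up during the spike while the concurrency cap keeps Lambda from being throttled with TooManyRequestsException. A sketch of the two payloads a deployment script might build follows; the queue ARN, function name, and concurrency value are illustrative assumptions.

```python
# Parameters for Lambda's create_event_source_mapping: drain the orders
# queue in batches. ARN and function name are placeholders.
event_source_mapping = {
    "EventSourceArn": "arn:aws:sqs:us-east-1:123456789012:orders-queue",
    "FunctionName": "process-orders",
    "BatchSize": 10,
}

# Parameters for Lambda's put_function_concurrency: cap the function so
# bursts wait in the queue instead of being throttled. Value illustrative.
reserved_concurrency = {
    "FunctionName": "process-orders",
    "ReservedConcurrentExecutions": 100,
}

print(event_source_mapping["BatchSize"],
      reserved_concurrency["ReservedConcurrentExecutions"])
```

Because SQS retains messages until they are processed, a Monday-morning burst degrades into slightly higher latency rather than dropped requests.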

Questions 146

A company has a single AWS account that contains resources belonging to several teams. The company needs to identify the costs associated with each team. The company wants to use a tag named CostCenter to identify resources that belong to each team.

Which combination of steps will meet these requirements? (Select TWO.)

Options:

A.  

Tag all resources that belong to each team with the user-defined CostCenter tag.

B.  

Create a tag for each team, and set the value to CostCenter.

C.  

Activate the CostCenter tag to track cost allocation.

D.  

Configure AWS Billing and Cost Management to send monthly invoices to the company through email messages.

E.  

Set up consolidated billing in the existing AWS account.

Questions 147

A developer is creating an ecommerce workflow in an AWS Step Functions state machine that includes an HTTP Task state. The task passes shipping information and order details to an endpoint.

The developer needs to test the workflow to confirm that the HTTP headers and body are correct and that the responses meet expectations.

Which solution will meet these requirements?

Options:

A.  

Use the TestState API to invoke only the HTTP Task. Set the inspection level to TRACE.

B.  

Use the TestState API to invoke the state machine. Set the inspection level to DEBUG.

C.  

Use the data flow simulator to invoke only the HTTP Task. View the request and response data.

D.  

Change the log level of the state machine to ALL. Run the state machine.

Questions 148

A company has an application that uses a MySQL database that runs on an Amazon EC2 instance. The instance currently runs in a single Availability Zone. The company requires a fault-tolerant database solution that provides a recovery time objective (RTO) and a recovery point objective (RPO) of 2 minutes or less. Which solution will meet these requirements?

Options:

A.  

Migrate the MySQL database to Amazon RDS. Create a read replica in a second Availability Zone. Create a script that detects availability interruptions and promotes the read replica when needed.

B.  

Migrate the MySQL database to Amazon RDS for MySQL. Configure the new RDS for MySQL database to use a Multi-AZ deployment.

C.  

Create a second MySQL database in a second Availability Zone. Use native MySQL commands to sync the two databases every 2 minutes. Create a script that detects availability interruptions and promotes the second MySQL database when needed.

D.  

Create a copy of the EC2 instance that runs the MySQL database. Deploy the copy in a second Availability Zone. Create a Network Load Balancer. Add both instances as targets.

Questions 149

A company is implementing a shared storage solution for a media application that the company hosts on AWS. The company needs the ability to use SMB clients to access stored data.

Which solution will meet these requirements with the LEAST administrative overhead?

Options:

A.  

Create an AWS Storage Gateway Volume Gateway. Create a file share that uses the required client protocol. Connect the application server to the file share.

B.  

Create an AWS Storage Gateway Tape Gateway. Configure tapes to use Amazon S3. Connect the application server to the Tape Gateway.

C.  

Create an Amazon EC2 Windows instance. Install and configure a Windows file share role on the instance. Connect the application server to the file share.

D.  

Create an Amazon FSx for Windows File Server file system. Connect the application server to the file system.

Questions 150

A company wants to enhance its ecommerce order-processing application that is deployed on AWS. The application must process each order exactly once without affecting the customer experience during unpredictable traffic surges.

Which solution will meet these requirements?

Options:

A.  

Create an Amazon Simple Queue Service (Amazon SQS) FIFO queue. Put all the orders in the SQS queue. Configure an AWS Lambda function as the target to process the orders.

B.  

Create an Amazon Simple Notification Service (Amazon SNS) standard topic. Publish all the orders to the SNS standard topic. Configure the application as a notification target.

C.  

Create a flow by using Amazon AppFlow. Send the orders to the flow. Configure an AWS Lambda function as the target to process the orders.

D.  

Configure AWS X-Ray in the application to track the order requests. Configure the application to process the orders by pulling the orders from Amazon CloudWatch.

Questions 151

A company wants to migrate applications from its on-premises servers to AWS. As a first step, the company is modifying and migrating a non-critical application to a single Amazon EC2 instance. The application will store information in an Amazon S3 bucket. The company needs to follow security best practices when deploying the application on AWS.

Which approach should the company take to allow the application to interact with Amazon S3?

Options:

A.  

Create an IAM role that has administrative access to AWS. Attach the role to the EC2 instance.

B.  

Create an IAM user. Attach the AdministratorAccess policy. Copy the generated access key and secret key. Within the application code, use the access key and secret key along with the AWS SDK to communicate with Amazon S3.

C.  

Create an IAM role that has the necessary access to Amazon S3. Attach the role to the EC2 instance.

D.  

Create an IAM user. Attach a policy that provides the necessary access to Amazon S3. Copy the generated access key and secret key. Within the application code, use the access key and secret key along with the AWS SDK to communicate with Amazon S3.

Questions 152

A company sets up an organization in AWS Organizations that contains 10 AWS accounts. A solutions architect must design a solution to provide access to the accounts for several thousand employees. The company has an existing identity provider (IdP). The company wants to use the existing IdP for authentication to AWS.

Which solution will meet these requirements?

Options:

A.  

Create IAM users for the employees in the required AWS accounts. Connect IAM users to the existing IdP. Configure federated authentication for the IAM users.

B.  

Set up AWS account root users with user email addresses and passwords that are synchronized from the existing IdP.

C.  

Configure AWS IAM Identity Center. Connect IAM Identity Center to the existing IdP. Provision users and groups from the existing IdP.

D.  

Use AWS Resource Access Manager (AWS RAM) to share access to the AWS accounts with the users in the existing IdP.

Questions 153

A company runs a production database on Amazon RDS for MySQL. The company wants to upgrade the database version for security compliance reasons. Because the database contains critical data, the company wants a quick solution to upgrade and test functionality without losing any data.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Create an RDS manual snapshot. Upgrade to the new version of Amazon RDS for MySQL.

B.  

Use native backup and restore. Restore the data to the upgraded new version of Amazon RDS for MySQL.

C.  

Use AWS Database Migration Service (AWS DMS) to replicate the data to the upgraded new version of Amazon RDS for MySQL.

D.  

Use Amazon RDS Blue/Green Deployments to deploy and test production changes.

Questions 154

A company runs a Microsoft Windows SMB file share on-premises to support an application. The company wants to migrate the application to AWS. The company wants to share storage across multiple Amazon EC2 instances.

Which solutions will meet these requirements with the LEAST operational overhead? (Select TWO.)

Options:

A.  

Create an Amazon Elastic File System (Amazon EFS) file system with elastic throughput.

B.  

Create an Amazon FSx for NetApp ONTAP file system.

C.  

Use Amazon Elastic Block Store (Amazon EBS) to create a self-managed Windows file share on the instances.

D.  

Create an Amazon FSx for Windows File Server file system.

E.  

Create an Amazon FSx for OpenZFS file system.

Questions 155

A company operates a data lake in Amazon S3. The company wants to query and filter data directly in S3 without downloading objects.

Which solution will meet these requirements?

Options:

A.  

Use Amazon Athena to query and filter the objects in Amazon S3.

B.  

Use Amazon EMR to process and filter the objects.

C.  

Use Amazon API Gateway to retrieve filtered results.

D.  

Use Amazon ElastiCache to cache the objects.
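
Athena (option A) runs standard SQL over objects where they sit in S3 and writes only the result set to an output location, so no source objects are downloaded. The sketch below shows the parameter shape for Athena's `start_query_execution` call; the database, table, column names, and result bucket are placeholders.

```python
# Parameters for Athena's start_query_execution: query the data lake in
# place and write results to a results bucket. All names are placeholders.
athena_query = {
    "QueryString": "SELECT order_id, total FROM sales WHERE total > 100",
    "QueryExecutionContext": {"Database": "datalake"},
    "ResultConfiguration": {"OutputLocation": "s3://query-results-bucket/"},
}

print(athena_query["QueryString"])
```

The table definition itself would live in the AWS Glue Data Catalog, pointing at the S3 prefix and file format of the lake's objects.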

Questions 156

A company wants to protect resources that the company hosts on AWS, including Application Load Balancers and Amazon CloudFront distributions.

The company wants an AWS service that can provide near real-time visibility into attacks on the company's resources. The service must also have a dedicated AWS team to assist with DDoS attacks.

Which AWS service will meet these requirements?

Options:

A.  

AWS WAF

B.  

AWS Shield Standard

C.  

Amazon Macie

D.  

AWS Shield Advanced

Questions 157

A company runs an application on Amazon EC2 instances. The instances need to access an Amazon RDS database by using specific credentials. The company uses AWS Secrets Manager to store the credentials that the EC2 instances must use. Which solution will meet this requirement?

Options:

A.  

Create an IAM role, and attach the role to each EC2 instance profile. Use an identity-based policy to grant the new IAM role access to the secret that contains the database credentials.

B.  

Create an IAM user, and attach the user to each EC2 instance profile. Use a resource-based policy to grant the new IAM user access to the secret that contains the database credentials.

C.  

Create a resource-based policy for the secret that contains the database credentials. Use EC2 Instance Connect to access the secret.

D.  

Create an identity-based policy for the secret that contains the database credentials. Grant direct access to the EC2 instances.
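
The instance-profile approach in option A comes down to an identity-based policy on the role that grants `secretsmanager:GetSecretValue` on the one secret holding the database credentials. A minimal sketch follows; the secret ARN is a placeholder.

```python
import json

# Identity-based policy for the EC2 instance profile role: allow reading
# only the secret that holds the RDS credentials. ARN is a placeholder.
secret_access_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "secretsmanager:GetSecretValue",
            "Resource": "arn:aws:secretsmanager:us-east-1:123456789012:secret:rds-credentials-AbCdEf",
        }
    ],
}

print(json.dumps(secret_access_policy))
```

With this policy on the role, the application retrieves the credentials at runtime (for example via the SDK's `get_secret_value`) without any long-lived keys on the instance.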

Questions 158

A company is implementing a new application on AWS. The company will run the application on multiple Amazon EC2 instances across multiple Availability Zones within multiple AWS Regions. The application will be available through the internet. Users will access the application from around the world.

The company wants to ensure that each user who accesses the application is sent to the EC2 instances that are closest to the user's location.

Which solution will meet these requirements?

Options:

A.  

Implement an Amazon Route 53 geolocation routing policy. Use an internet-facing Application Load Balancer to distribute the traffic across all Availability Zones within the same Region.

B.  

Implement an Amazon Route 53 geoproximity routing policy. Use an internet-facing Network Load Balancer to distribute the traffic across all Availability Zones within the same Region.

C.  

Implement an Amazon Route 53 multivalue answer routing policy. Use an internet-facing Application Load Balancer to distribute the traffic across all Availability Zones within the same Region.

D.  

Implement an Amazon Route 53 weighted routing policy. Use an internet-facing Network Load Balancer to distribute the traffic across all Availability Zones within the same Region.

Discussion 0
Questions 159

A company hosts a web application on Amazon EC2 instances that are part of an Auto Scaling group behind an Application Load Balancer (ALB). The application experiences spikes in requests that come through the ALB throughout each day. The traffic spikes last between 15 and 20 minutes.

The company needs a solution that uses a standard or custom metric to scale the EC2 instances based on the number of requests that come from the ALB.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.  

Configure an Amazon CloudWatch alarm to monitor the ALB RequestCount metric. Configure a simple scaling policy to scale the EC2 instances in response to the metric.

B.  

Configure a predictive scaling policy based on the ALB RequestCount metric to scale the EC2 instances.

C.  

Configure an Amazon CloudWatch alarm to monitor the ALB UnhealthyHostCount metric. Configure a target tracking policy to scale the EC2 instances in response to the metric.

D.  

Create an Amazon CloudWatch alarm to monitor a user-defined metric for GET requests. Configure a target tracking policy threshold to scale the EC2 instances.
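To make the ALB-request-based scaling concrete, here is a sketch of the parameters one might pass to boto3's `autoscaling.put_scaling_policy(...)` using the predefined `ALBRequestCountPerTarget` metric. The Auto Scaling group name, resource label, and target value are illustrative assumptions.

```python
# Target tracking policy keyed to ALB request count per target.
# The ResourceLabel format is "app/<alb-name>/<alb-id>/targetgroup/<tg-name>/<tg-id>";
# the values below are made up for illustration.
scaling_policy_params = {
    "AutoScalingGroupName": "web-asg",
    "PolicyName": "alb-request-count-target-tracking",
    "PolicyType": "TargetTrackingScaling",
    "TargetTrackingConfiguration": {
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ALBRequestCountPerTarget",
            "ResourceLabel": "app/my-alb/50dc6c495c0c9188/targetgroup/my-tg/943f017f100becff",
        },
        "TargetValue": 1000.0,  # desired average requests per target per minute
    },
}

print(scaling_policy_params["PolicyType"])
```

Target tracking adds and removes instances automatically as the metric moves around the target value, which suits short 15-20 minute spikes better than manual threshold tuning.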

Discussion 0
Questions 160

An insurance company is creating an application to record personal user data. The data includes users’ names, ages, and health data. The company wants to run the application in a private subnet on AWS.

Because of data security requirements, the company must have access to the operating system of the compute resources that run the application tier. The company must use a low-latency NoSQL database to store the data.

Which solution will meet these requirements?

Options:

A.  

Use Amazon EC2 instances for the application tier. Use an Amazon DynamoDB table for the database tier. Create a VPC endpoint for DynamoDB. Assign the instances an instance profile that has permission to access DynamoDB.

B.  

Use AWS Lambda functions for the application tier. Use an Amazon DynamoDB table for the database tier. Assign a Lambda function an appropriate IAM role to access the table.

C.  

Use AWS Fargate for the application tier. Create an Amazon Aurora PostgreSQL instance inside a private subnet for the database tier.

D.  

Use Amazon EC2 instances for the application tier. Use an Amazon S3 bucket to store the data in JSON format. Configure the application to use Amazon Athena to read and write the data to and from the S3 bucket.

Discussion 0
Questions 161

A company is building an application on AWS that connects to an Amazon RDS database. The company wants to manage the application configuration and to securely store and retrieve credentials for the database and other services.

Which solution will meet these requirements with the LEAST administrative overhead?

Options:

A.  

Use AWS AppConfig to store and manage the application configuration. Use AWS Secrets Manager to store and retrieve the credentials.

B.  

Use AWS Lambda to store and manage the application configuration. Use AWS Systems Manager Parameter Store to store and retrieve the credentials.

C.  

Use an encrypted application configuration file. Store the file in Amazon S3 for the application configuration. Create another S3 file to store and retrieve the credentials.

D.  

Use AWS AppConfig to store and manage the application configuration. Use Amazon RDS to store and retrieve the credentials.

Discussion 0
Questions 162

A company's reporting system delivers hundreds of .csv files to an Amazon S3 bucket each day. The company must convert these files to Apache Parquet format and must store the files in a transformed data bucket.

Which solution will meet these requirements with the LEAST development effort?

Options:

A.  

Create an Amazon EMR cluster with Apache Spark installed. Write a Spark application to transform the data. Use EMR File System (EMRFS) to write files to the transformed data bucket.

B.  

Create an AWS Glue crawler to discover the data. Create an AWS Glue extract, transform, and load (ETL) job to transform the data. Specify the transformed data bucket in the output step.

C.  

Use AWS Batch to create a job definition with Bash syntax to transform the data and output the data to the transformed data bucket. Use the job definition to submit a job. Specify an array job as the job type.

D.  

Create an AWS Lambda function to transform the data and output the data to the transformed data bucket. Configure an event notification for the S3 bucket. Specify the Lambda function as the destination for the event notification.

Discussion 0
Questions 163

A company wants to share data that is collected from self-driving cars with the automobile community. The data will be made available from within an Amazon S3 bucket. The company wants to minimize its cost of making this data available to other AWS accounts.

What should a solutions architect do to accomplish this goal?

Options:

A.  

Create an S3 VPC endpoint for the bucket.

B.  

Configure the S3 bucket to be a Requester Pays bucket.

C.  

Create an Amazon CloudFront distribution in front of the S3 bucket.

D.  

Require that the files be accessible only with the use of the BitTorrent protocol.

Discussion 0
Questions 164

A company wants to create an Amazon EMR cluster that multiple teams will use. The company wants to ensure that each team's big data workloads can access only the AWS services that each team needs to interact with. The company does not want the workloads to have access to Instance Metadata Service Version 2 (IMDSv2) on the cluster's underlying EC2 instances.

Which solution will meet these requirements?

Options:

A.  

Configure interface VPC endpoints for each AWS service that the teams need. Use the required interface VPC endpoints to submit the big data workloads.

B.  

Create EMR runtime roles. Configure the cluster to use the runtime roles. Use the runtime roles to submit the big data workloads.

C.  

Create an EC2 IAM instance profile that has the required permissions for each team. Use the instance profile to submit the big data workloads.

D.  

Create an EMR security configuration that has the EnableApplicationScoped IAM Role option set to false. Use the security configuration to submit the big data workloads.

Discussion 0
Questions 165

A company has a relational database workload that runs on Amazon Aurora MySQL. According to new compliance standards, the company must rotate all database credentials every 30 days. The company needs a solution that maximizes security and minimizes development effort.

Which solution will meet these requirements?

Options:

A.  

Store the database credentials in AWS Secrets Manager. Configure automatic credential rotation for every 30 days.

B.  

Store the database credentials in AWS Systems Manager Parameter Store. Create an AWS Lambda function to rotate the credentials every 30 days.

C.  

Store the database credentials in an environment file or in a configuration file. Modify the credentials every 30 days.

D.  

Store the database credentials in an environment file or in a configuration file. Create an AWS Lambda function to rotate the credentials every 30 days.
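As a sketch of the managed-rotation approach in option A, these are the kinds of parameters one might pass to boto3's `secretsmanager.rotate_secret(...)`. The secret ID and rotation Lambda ARN are placeholder assumptions (Secrets Manager provides rotation function templates for Aurora MySQL).

```python
# Enable automatic rotation of the Aurora MySQL credentials every 30 days.
# SecretId and RotationLambdaARN below are illustrative placeholders.
rotate_params = {
    "SecretId": "prod/aurora-mysql-creds",
    "RotationLambdaARN": "arn:aws:lambda:us-east-1:123456789012:function:SecretsManagerRotation",
    "RotationRules": {"AutomaticallyAfterDays": 30},
}

print(rotate_params["RotationRules"])
```

Once configured, Secrets Manager invokes the rotation function on schedule and stores the new credential version, so no custom rotation code needs to be maintained.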

Discussion 0
Questions 166

A company receives data transfers from a small number of external clients that use SFTP software on an Amazon EC2 instance. The clients use an SFTP client to upload data. The clients use SSH keys for authentication. Every hour, an automated script transfers new uploads to an Amazon S3 bucket for processing.

The company wants to move the transfer process to an AWS managed service and to reduce the time required to start data processing. The company wants to retain the existing user management and SSH key generation process. The solution must not require clients to make significant changes to their existing processes.

Which solution will meet these requirements?

Options:

A.  

Reconfigure the script that runs on the EC2 instance to run every 15 minutes. Create an S3 Event Notifications rule for all new object creation events. Set an Amazon Simple Notification Service (Amazon SNS) topic as the destination.

B.  

Create an AWS Transfer Family SFTP server that uses the existing S3 bucket as a target. Use service-managed users to enable authentication.

C.  

Require clients to add the AWS DataSync agent into their local environments. Create an IAM user for each client that has permission to upload data to the target S3 bucket.

D.  

Create an AWS Transfer Family SFTP connector that has permission to access the target S3 bucket for each client. Store credentials in AWS Systems Manager. Create an IAM role to allow the SFTP connector to securely use the credentials.
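To illustrate the managed-SFTP approach in option B, here is a sketch of parameters for boto3's `transfer.create_server(...)` with service-managed users and an S3 backing store. All values are illustrative assumptions.

```python
# AWS Transfer Family SFTP endpoint backed by Amazon S3, with users
# (and their existing SSH public keys) managed by the service itself.
create_server_params = {
    "Protocols": ["SFTP"],
    "Domain": "S3",
    "IdentityProviderType": "SERVICE_MANAGED",
    "EndpointType": "PUBLIC",
}

print(create_server_params)
```

Each client would then be added with `transfer.create_user(...)`, supplying the client's existing SSH public key, a home directory in the target bucket, and an IAM role that grants access to it. Uploads land in S3 immediately, which removes the hourly transfer delay.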

Discussion 0
Questions 167

A company hosts its multi-tier, public web application in the AWS Cloud. The web application runs on Amazon EC2 instances, and its database runs on Amazon RDS. The company is anticipating a large increase in sales during an upcoming holiday weekend. A solutions architect needs to build a solution to analyze the performance of the web application with a granularity of no more than 2 minutes.

What should the solutions architect do to meet this requirement?

Options:

A.  

Send Amazon CloudWatch logs to Amazon Redshift. Use Amazon QuickSight to perform further analysis.

B.  

Enable detailed monitoring on all EC2 instances. Use Amazon CloudWatch metrics to perform further analysis.

C.  

Create an AWS Lambda function to fetch EC2 logs from Amazon CloudWatch Logs. Use Amazon CloudWatch metrics to perform further analysis.

D.  

Send EC2 logs to Amazon S3. Use Amazon Redshift to fetch logs from the S3 bucket to process raw data for further analysis with Amazon QuickSight.

Discussion 0
Questions 168

A company wants to migrate its on-premises Oracle database to Amazon Aurora. The company wants to use a secure and encrypted network to transfer the data. Which combination of steps will meet these requirements? (Select TWO.)

Options:

A.  

Use AWS Application Migration Service to migrate the data.

B.  

Use AWS Schema Conversion Tool (AWS SCT) and AWS Database Migration Service (AWS DMS) to migrate the data.

C.  

Use AWS Direct Connect SiteLink to transfer data from the on-premises environment to AWS.

D.  

Use AWS Site-to-Site VPN to establish a connection to transfer the data from the on-premises environment to AWS.

E.  

Use AWS App2Container to migrate the data.

Discussion 0
Questions 169

A company is storing data in Amazon S3 buckets. The company needs to retain any objects that contain personally identifiable information (PII) that might need to be reviewed.

A solutions architect must develop an automated solution to identify objects that contain PII and apply the necessary controls to prevent deletion before review.

Which combination of steps should the solutions architect take to meet these requirements? (Select THREE.)

Options:

A.  

Create a job in Amazon Macie to scan the S3 buckets for the relevant sensitive data identifiers.

B.  

Move the identified objects to the S3 Glacier Deep Archive storage class.

C.  

Create an AWS Lambda function that performs an S3 Object Lock legal hold operation on the identified objects.

D.  

Create an AWS Lambda function that applies an S3 Object Lock retention period to the identified objects in governance mode.

E.  

Create an Amazon EventBridge rule that invokes the AWS Lambda function when Amazon Macie detects sensitive data.

F.  

Configure multi-factor authentication (MFA) delete on the S3 buckets.
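As a sketch of the remediation step in option C, a Lambda function invoked by the EventBridge rule for Macie findings could place a legal hold on each flagged object. The bucket and key below are placeholders for values the function would read from a Macie finding, and the bucket must have S3 Object Lock enabled at creation time.

```python
# Parameters for s3.put_object_legal_hold(...): a legal hold prevents
# deletion of the object version until the hold is explicitly removed,
# which suits an open-ended "retain until reviewed" requirement better
# than a fixed retention period.
legal_hold_params = {
    "Bucket": "sensitive-data-bucket",        # placeholder bucket name
    "Key": "uploads/customer-record.csv",     # placeholder object key
    "LegalHold": {"Status": "ON"},
}

print(legal_hold_params["LegalHold"])
```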

Discussion 0
Questions 170

A company is hosting multiple websites for several lines of business under its registered parent domain. Users accessing these websites will be routed to appropriate backend Amazon EC2 instances based on the subdomain. The websites host static webpages, images, and server-side scripts like PHP and JavaScript.

Some of the websites experience peak access during the first two hours of business with constant usage throughout the rest of the day. A solutions architect needs to design a solution that will automatically adjust capacity to these traffic patterns while keeping costs low.

Which combination of AWS services or features will meet these requirements? (Select TWO.)

Options:

A.  

AWS Batch

B.  

Network Load Balancer

C.  

Application Load Balancer

D.  

Amazon EC2 Auto Scaling

E.  

Amazon S3 website hosting

Discussion 0
Questions 171

A company has a static website that is hosted on Amazon CloudFront in front of Amazon S3. The static website uses a database backend. The company notices that the website does not reflect updates that have been made in the website's Git repository. The company checks the continuous integration and continuous delivery (CI/CD) pipeline between the Git repository and Amazon S3. The company verifies that the webhooks are configured properly and that the CI/CD pipeline is sending messages that indicate successful deployments.

A solutions architect needs to implement a solution that displays the updates on the website.

Which solution will meet these requirements?

Options:

A.  

Add an Application Load Balancer.

B.  

Add Amazon ElastiCache for Redis or Memcached to the database layer of the web application.

C.  

Invalidate the CloudFront cache.

D.  

Use AWS Certificate Manager (ACM) to validate the website's SSL certificate.

Discussion 0
Questions 172

A company hosts an application on AWS. The application gives users the ability to upload photos and store the photos in an Amazon S3 bucket. The company wants to use Amazon CloudFront and a custom domain name to upload the photo files to the S3 bucket in the eu-west-1 Region.

Which solution will meet these requirements? (Select TWO.)

Options:

A.  

Use AWS Certificate Manager (ACM) to create a public certificate in the us-east-1 Region. Use the certificate in CloudFront

B.  

Use AWS Certificate Manager (ACM) to create a public certificate in eu-west-1. Use the certificate in CloudFront.

C.  

Configure Amazon S3 to allow uploads from CloudFront. Configure S3 Transfer Acceleration.

D.  

Configure Amazon S3 to allow uploads from CloudFront origin access control (OAC).

E.  

Configure Amazon S3 to allow uploads from CloudFront. Configure an Amazon S3 website endpoint.

Discussion 0
Questions 173

A company runs production workloads in its AWS account. Multiple teams create and maintain the workloads.

The company needs to be able to detect changes in resource configurations. The company needs to capture changes as configuration items without changing or modifying the existing resources.

Which solution will meet these requirements?

Options:

A.  

Use AWS Config. Start the configuration recorder for AWS resources to detect changes in resource configurations.

B.  

Use AWS CloudFormation. Initiate drift detection to capture changes in resource configurations.

C.  

Use Amazon Detective to detect, analyze, and investigate changes in resource configurations.

D.  

Use AWS Audit Manager to capture management events and global service events for resource configurations.

Discussion 0
Questions 174

A security audit reveals that Amazon EC2 instances are not being patched regularly. A solutions architect needs to provide a solution that will run regular security scans across a large fleet of EC2 instances. The solution should also patch the EC2 instances on a regular schedule and provide a report of each instance's patch status.

Which solution will meet these requirements?

Options:

A.  

Set up Amazon Macie to scan the EC2 instances for software vulnerabilities. Set up a cron job on each EC2 instance to patch the instance on a regular schedule.

B.  

Turn on Amazon GuardDuty in the account. Configure GuardDuty to scan the EC2 instances for software vulnerabilities. Set up AWS Systems Manager Session Manager to patch the EC2 instances on a regular schedule.

C.  

Set up Amazon Detective to scan the EC2 instances for software vulnerabilities. Set up an Amazon EventBridge scheduled rule to patch the EC2 instances on a regular schedule.

D.  

Turn on Amazon Inspector in the account. Configure Amazon Inspector to scan the EC2 instances for software vulnerabilities. Set up AWS Systems Manager Patch Manager to patch the EC2 instances on a regular schedule.

Discussion 0
Questions 175

A company uses Amazon FSx for NetApp ONTAP in its primary AWS Region for CIFS and NFS file shares. Applications that run on Amazon EC2 instances access the file shares. The company needs a storage disaster recovery (DR) solution in a secondary Region. The data that is replicated in the secondary Region needs to be accessed by using the same protocols as the primary Region.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Create an AWS Lambda function to copy the data to an Amazon S3 bucket. Replicate the S3 bucket to the secondary Region.

B.  

Create a backup of the FSx for ONTAP volumes by using AWS Backup. Copy the volumes to the secondary Region. Create a new FSx for ONTAP instance from the backup.

C.  

Create an FSx for ONTAP instance in the secondary Region. Use NetApp SnapMirror to replicate data from the primary Region to the secondary Region.

D.  

Create an Amazon EFS volume. Migrate the current data to the volume. Replicate the volume to the secondary Region.

Discussion 0
Questions 176

A company has an application that uses an Amazon DynamoDB table for storage. A solutions architect discovers that many requests to the table are not returning the latest data.

Users have not reported latency or performance issues.

Which design change should the solutions architect recommend?

Options:

A.  

Add read replicas to the table.

B.  

Use a global secondary index (GSI).

C.  

Request strongly consistent reads for the table.

D.  

Request eventually consistent reads for the table.
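To show what "strongly consistent reads" (option C) looks like in practice, here is a sketch of `dynamodb.get_item(...)` parameters. The table name and key are placeholder assumptions.

```python
# DynamoDB GetItem with ConsistentRead=True returns the most recent
# committed write. The default (False) is eventually consistent and may
# briefly return stale data, which matches the symptom in the question.
get_item_params = {
    "TableName": "AppData",                  # placeholder table name
    "Key": {"pk": {"S": "user#123"}},        # placeholder key
    "ConsistentRead": True,
}

print(get_item_params["ConsistentRead"])
```

The trade-off is that strongly consistent reads consume twice the read capacity and are not served from global secondary indexes.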

Discussion 0
Questions 177

How can DynamoDB data be made available for long-term analytics with minimal operational overhead?

Options:

A.  

Configure DynamoDB incremental exports to S3.

B.  

Configure DynamoDB Streams to write records to S3.

C.  

Configure EMR to copy DynamoDB data to S3.

D.  

Configure EMR to copy DynamoDB data to HDFS.

Discussion 0
Questions 178

A company needs to ensure that an IAM group that contains database administrators can perform operations only within Amazon RDS. The company must ensure that the members of the IAM group cannot access any other AWS services.

Which solution will meet these requirements?

Options:

A.  

Create an IAM policy that includes a statement that has the Effect "Allow" and the Action "rds:*". Attach the IAM policy to the IAM group.

B.  

Create an IAM policy that includes two statements. Configure the first statement to have the Effect "Allow" and the Action "rds:*". Configure the second statement to have the Effect "Deny" and the Action "*". Attach the IAM policy to the IAM group.

C.  

Create an IAM policy that includes a statement that has the Effect "Deny" and the NotAction "rds:*". Attach the IAM policy to the IAM group.

D.  

Create an IAM policy with a statement that includes the Effect "Allow" and the Action "rds:*". Include a permissions boundary that has the Effect "Allow" and the Action "rds:*". Attach the IAM policy to the IAM group.
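To make the Deny/NotAction pattern concrete, here is a sketch of a policy that pairs the explicit Deny from option C with an Allow for RDS. Note that the Allow statement is my addition for completeness (a Deny-only policy grants nothing by itself); the Deny on NotAction blocks every service outside the rds namespace regardless of any other Allow the group members might have.

```python
import json

# Explicit Deny on everything that is NOT an rds:* action, plus an Allow
# so the database administrators can actually use RDS.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Allow", "Action": "rds:*", "Resource": "*"},
        {"Effect": "Deny", "NotAction": "rds:*", "Resource": "*"},
    ],
}

print(json.dumps(policy, indent=2))
```

Because an explicit Deny always overrides an Allow in IAM evaluation, this confines the group to RDS even if another attached policy grants broader access.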

Discussion 0
Questions 179

A solutions architect needs to secure an Amazon API Gateway REST API. Users need to be able to log in to the API by using common external social identity providers (IdPs). The social IdPs must use standard authentication protocols such as SAML or OpenID Connect (OIDC). The solutions architect needs to protect the API against attempts to exploit application vulnerabilities.

Which combination of steps will meet these security requirements? (Select TWO.)

Options:

A.  

Create an AWS WAF web ACL that is associated with the REST API. Add the appropriate managed rules to the ACL.

B.  

Subscribe to AWS Shield Advanced. Enable DDoS protection. Associate Shield Advanced with the REST API.

C.  

Create an Amazon Cognito user pool with a federation to the social IdPs. Integrate the user pool with the REST API.

D.  

Create an API key in API Gateway. Associate the API key with the REST API.

E.  

Create an IP address filter in AWS WAF that allows only the social IdPs. Associate the filter with the web ACL and the API.

Discussion 0
Questions 180

A company is building a data analysis platform on AWS by using AWS Lake Formation. The platform will ingest data from different sources such as Amazon S3 and Amazon RDS. The company needs a secure solution to prevent access to portions of the data that contain sensitive information.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Create an IAM role that includes permissions to access Lake Formation tables.

B.  

Create data filters to implement row-level security and cell-level security.

C.  

Create an AWS Lambda function that removes sensitive information before Lake Formation ingests the data.

D.  

Create an AWS Lambda function that periodically queries and removes sensitive information from Lake Formation tables.

Discussion 0
Questions 181

A social media company wants to store its database of user profiles, relationships, and interactions in the AWS Cloud. The company needs an application to monitor any changes in the database. The application needs to analyze the relationships between the data entities and to provide recommendations to users.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Use Amazon Neptune to store the information. Use Amazon Kinesis Data Streams to process changes in the database.

B.  

Use Amazon Neptune to store the information. Use Neptune Streams to process changes in the database.

C.  

Use Amazon Quantum Ledger Database (Amazon QLDB) to store the information. Use Amazon Kinesis Data Streams to process changes in the database.

D.  

Use Amazon Quantum Ledger Database (Amazon QLDB) to store the information. Use Neptune Streams to process changes in the database.

Discussion 0
Questions 182

A global company runs its workloads on AWS. The company's application uses Amazon S3 buckets across AWS Regions for sensitive data storage and analysis. The company stores millions of objects in multiple S3 buckets daily. The company wants to identify all S3 buckets that are not versioning-enabled.

Which solution will meet these requirements?

Options:

A.  

Set up an AWS CloudTrail event that has a rule to identify all S3 buckets that are not versioning-enabled across Regions

B.  

Use Amazon S3 Storage Lens to identify all S3 buckets that are not versioning-enabled across Regions.

C.  

Enable IAM Access Analyzer for S3 to identify all S3 buckets that are not versioning-enabled across Regions

D.  

Create an S3 Multi-Region Access Point to identify all S3 buckets that are not versioning-enabled across Regions

Discussion 0
Questions 183

A company discovers that an Amazon DynamoDB Accelerator (DAX) cluster for the company's web application workload is not encrypting data at rest. The company needs to resolve the security issue.

Which solution will meet this requirement?

Options:

A.  

Stop the existing DAX cluster. Enable encryption at rest for the existing DAX cluster, and start the cluster again.

B.  

Delete the existing DAX cluster. Recreate the DAX cluster, and configure the new cluster to encrypt the data at rest.

C.  

Update the configuration of the existing DAX cluster to encrypt the data at rest.

D.  

Integrate the existing DAX cluster with AWS Security Hub to automatically enable encryption at rest.

Discussion 0
Questions 184

A company is building a data processing application that uses AWS Lambda functions. The Lambda functions need to communicate with an Amazon RDS DB instance deployed inside a VPC in the same AWS account.

Which solution meets these requirements in the most secure way?

Options:

A.  

Configure the DB instance for public access. Allow inbound traffic from the public AWS Lambda address space.

B.  

Deploy Lambda inside the VPC. Attach a network ACL allowing outbound access to the VPC CIDR. Update the DB security group to allow traffic from 0.0.0.0/0.

C.  

Deploy Lambda inside the VPC. Attach a security group to the Lambda functions. Allow outbound access only to the VPC CIDR. Update the DB instance security group to allow traffic from the Lambda security group.

D.  

Peer the Lambda default VPC with the DB VPC and avoid security groups.

Discussion 0
Questions 185

A company wants to release a new device that will collect data to track overnight sleep on an intelligent mattress. Sensors will send data that will be uploaded to an Amazon S3 bucket. Each mattress generates about 2 MB of data each night.

An application must process the data and summarize the data for each user. The application must make the results available as soon as possible. Every invocation of the application will require about 1 GB of memory and will finish running within 30 seconds.

Which solution will run the application MOST cost-effectively?

Options:

A.  

AWS Lambda with a Python script

B.  

AWS Glue with a Scala job

C.  

Amazon EMR with an Apache Spark script

D.  

AWS Glue with a PySpark job

Discussion 0
Questions 186

A company is building a new furniture inventory application. The company has deployed the application on a fleet of Amazon EC2 instances across multiple Availability Zones. The EC2 instances run behind an Application Load Balancer (ALB) in their VPC.

A solutions architect has observed that incoming traffic seems to favor one EC2 instance, resulting in latency for some requests.

What should the solutions architect do to resolve this issue?

Options:

A.  

Disable session affinity (sticky sessions) on the ALB.

B.  

Replace the ALB with a Network Load Balancer.

C.  

Increase the number of EC2 instances in each Availability Zone.

D.  

Adjust the frequency of the health checks on the ALB's target group.

Discussion 0
Questions 187

A company is developing a monolithic Microsoft Windows-based application that will run on Amazon EC2 instances. The application will run long data-processing jobs that must not be interrupted. The company has modeled expected usage growth for the next 3 years. The company wants to optimize costs for the EC2 instances during the 3-year growth period.

Which solution will meet these requirements?

Options:

A.  

Purchase a Compute Savings Plan with a 3-year commitment. Adjust the hourly commitment based on the plan recommendations.

B.  

Purchase an EC2 Instance Savings Plan with a 3-year commitment. Adjust the hourly commitment based on the plan recommendations.

C.  

Purchase a Compute Savings Plan with a 1-year commitment. Renew the purchase and adjust the capacity each year as necessary.

D.  

Deploy the application on EC2 Spot Instances. Use an Auto Scaling group with a minimum size of 1 to ensure that the application is always running.

Discussion 0
Questions 188

A company uses an Amazon CloudFront distribution to serve content pages for its website. The company needs to ensure that clients use a TLS certificate when accessing the company's website. The company wants to automate the creation and renewal of the TLS certificates.

Which solution will meet these requirements with the MOST operational efficiency?

Options:

A.  

Use a CloudFront security policy to create a certificate.

B.  

Use a CloudFront origin access control (OAC) to create a certificate.

C.  

Use AWS Certificate Manager (ACM) to create a certificate. Use DNS validation for the domain.

D.  

Use AWS Certificate Manager (ACM) to create a certificate. Use email validation for the domain.
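As a sketch of the DNS-validation approach in option C, here are parameters one might pass to boto3's `acm.request_certificate(...)`. The domain names are placeholders, and for use with CloudFront the certificate must be requested in the us-east-1 Region.

```python
# Request a public certificate with DNS validation. Once the validation
# CNAME record is in place (Route 53 can add it automatically), ACM
# renews the certificate without any manual steps.
request_cert_params = {
    "DomainName": "www.example.com",                 # placeholder domain
    "ValidationMethod": "DNS",
    "SubjectAlternativeNames": ["example.com"],      # placeholder SAN
}

print(request_cert_params["ValidationMethod"])
```

Email validation, by contrast, requires someone to click an approval link at every issuance and renewal, which is why DNS validation is the more operationally efficient choice.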

Discussion 0
Questions 189

A company is migrating a legacy application from an on-premises data center to AWS. The application relies on hundreds of cron jobs that run between 1 and 20 minutes on different recurring schedules throughout the day.

The company wants a solution to schedule and run the cron jobs on AWS with minimal refactoring. The solution must support running the cron jobs in response to an event in the future.

Which solution will meet these requirements?

Options:

A.  

Create a container image for the cron jobs. Use Amazon EventBridge Scheduler to create a recurring schedule. Run the cron job tasks as AWS Lambda functions.

B.  

Create a container image for the cron jobs. Use AWS Batch on Amazon Elastic Container Service (Amazon ECS) with a scheduling policy to run the cron jobs.

C.  

Create a container image for the cron jobs. Use Amazon EventBridge Scheduler to create a recurring schedule. Run the cron job tasks on AWS Fargate.

D.  

Create a container image for the cron jobs. Create a workflow in AWS Step Functions that uses a Wait state to run the cron jobs at a specified time. Use the RunTask action to run the cron job tasks on AWS Fargate.

Discussion 0
Questions 190

A solutions architect needs to design a solution for a high performance computing (HPC) workload. The solution must include multiple Amazon EC2 instances. Each EC2 instance requires 10 Gbps of bandwidth individually for single-flow traffic. The EC2 instances require an aggregate throughput of 100 Gbps of bandwidth across all EC2 instances. Communication between the EC2 instances must have low latency.

Which solution will meet these requirements?

Options:

A.  

Place the EC2 instances in a single subnet of a VPC. Configure a cluster placement group. Ensure that the latest Elastic Fabric Adapter (EFA) drivers are installed on the EC2 instances with a supported operating system.

B.  

Place the EC2 instances in multiple subnets in a single VPC. Configure a spread placement group. Ensure that the EC2 instances support Elastic Network Adapters (ENAs) and that the drivers are updated on each instance operating system.

C.  

Place the EC2 instances in multiple VPCs. Use AWS Transit Gateway to route traffic between the VPCs. Ensure that the latest Elastic Fabric Adapter (EFA) drivers are installed on the EC2 instances with a supported operating system.

D.  

Place the EC2 instances in multiple subnets across multiple Availability Zones. Configure a cluster placement group. Ensure that the EC2 instances support Elastic Network Adapters (ENAs) and that the drivers are updated on each instance operating system.
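To picture the cluster placement group plus EFA setup from option A, here is a sketch of the EC2 parameters involved. The AMI ID, subnet ID, group name, and instance type are placeholder assumptions (the instance type must be one that supports EFA).

```python
# A cluster placement group packs instances close together on the same
# high-bisection-bandwidth network segment for low-latency traffic.
placement_group_params = {
    "GroupName": "hpc-cluster-pg",
    "Strategy": "cluster",
}

# Launch EFA-enabled instances into the placement group. EFA provides the
# OS-bypass networking that single-flow 10 Gbps HPC traffic needs.
run_instances_params = {
    "ImageId": "ami-0123456789abcdef0",     # placeholder AMI with EFA drivers
    "InstanceType": "c5n.18xlarge",         # placeholder EFA-capable type
    "MinCount": 2,
    "MaxCount": 2,
    "Placement": {"GroupName": "hpc-cluster-pg"},
    "NetworkInterfaces": [
        {"DeviceIndex": 0, "InterfaceType": "efa", "SubnetId": "subnet-0abc"}
    ],
}

print(placement_group_params["Strategy"])
```

A single subnet (and therefore a single Availability Zone) is what makes the cluster strategy possible; spreading across zones would defeat the latency requirement.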

Discussion 0
Questions 191

A company uses an Amazon EC2 instance to handle requests for a public web application. The application routes traffic to multiple application pages by using URL paths.

The company begins to experience large surges of traffic at unpredictable times. The traffic surges cause the web application to experience issues and to occasionally become unavailable.

The company needs to make the web application more scalable to handle sudden increases in traffic.

Which solution will meet this requirement?

Options:

A.  

Create an Amazon Machine Image (AMI) of the web application instance. Use the AMI to create an Auto Scaling group of EC2 instances that has a minimum capacity of two. Create an Application Load Balancer. Set the Auto Scaling group as the target group.

B.  

Create a Docker image of the application. Use Amazon Elastic Container Service (Amazon ECS) to create an Auto Scaling ECS cluster. Enable managed scaling. Create a Network Load Balancer. Set the ECS cluster as the target group.

C.  

Create an Amazon Machine Image (AMI) of the web application instance. Use the AMI to create two more web application instances in separate Availability Zones. Update the website DNS record to refer to all three instances.

D.  

Create an Application Load Balancer (ALB). Set the web application instance as the target. Create an Amazon CloudWatch alarm based on ALB traffic metrics. Configure the alert to activate when traffic spikes.

Discussion 0
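The correct choice (A) works because the Auto Scaling group keeps at least two instances registered with the ALB target group and replaces any that fail health checks. A minimal sketch of the request that configuration implies is below; all identifiers (group name, AMI launch template, target group ARN, subnet IDs) are placeholders, not values from the question.

```python
# Sketch of the Auto Scaling group configuration option A describes.
# Every identifier here is a hypothetical placeholder.

def build_asg_request(launch_template, target_group_arn, subnet_ids):
    """Build a CreateAutoScalingGroup-style request with a minimum
    capacity of two instances, registered against an ALB target group."""
    return {
        "AutoScalingGroupName": "web-app-asg",
        "LaunchTemplate": launch_template,
        "MinSize": 2,                           # survive the loss of one instance
        "MaxSize": 6,                           # headroom for traffic surges
        "DesiredCapacity": 2,
        "TargetGroupARNs": [target_group_arn],  # ALB routes traffic to the group
        "VPCZoneIdentifier": ",".join(subnet_ids),
        "HealthCheckType": "ELB",               # replace instances the ALB marks unhealthy
    }

request = build_asg_request(
    {"LaunchTemplateName": "web-app-from-ami", "Version": "$Latest"},
    "arn:aws:elasticloadbalancing:region:acct:targetgroup/web/placeholder",
    ["subnet-aaaa", "subnet-bbbb"],
)
```

Setting `HealthCheckType` to `ELB` is what makes the group self-healing rather than merely scalable: instances failing ALB health checks are terminated and replaced.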
Questions 192

A company needs a solution to back up and protect critical AWS resources. The company needs to regularly take backups of several Amazon EC2 instances and Amazon RDS for PostgreSQL databases. To ensure high resiliency, the company must have the ability to validate and restore backups.

Which solution meets the requirement with LEAST operational overhead?

Options:

A.  

Use AWS Backup to create a backup schedule for the resources. Use AWS Backup to create a restoration testing plan for the required resources.

B.  

Take snapshots of the EC2 instances and RDS DB instances. Create AWS Batch jobs to validate and restore the snapshots.

C.  

Create a custom AWS Lambda function to take snapshots of the EC2 instances and RDS DB instances. Create a second Lambda function to restore the snapshots periodically to validate the backups.

D.  

Take snapshots of the EC2 instances and RDS DB instances. Create an AWS Lambda function to restore the snapshots periodically to validate the backups.

Discussion 0
Questions 193

A company hosts its applications in multiple private and public subnets in a VPC. The applications in the private subnets need to access an API. The API is available on the internet and is hosted in the company's on-premises data center. A solutions architect needs to establish connectivity for applications in the private subnets.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.  

Create a transit gateway to connect the VPC to the on-premises network. Use the transit gateway to route API calls from the private subnets to the on-premises data center.

B.  

Create a NAT gateway in the public subnet of the VPC. Use the NAT gateway to allow the private subnets to access the API over the internet.

C.  

Establish an AWS PrivateLink connection to connect the VPC to the on-premises network. Use PrivateLink to make API calls from the private subnets to the on-premises data center.

D.  

Implement an AWS Site-to-Site VPN connection between the VPC and the on-premises data center. Use the VPN connection to make API calls from the private subnets to the on-premises data center.

Discussion 0
Questions 194

A company runs multiple applications in multiple AWS accounts within the same organization in AWS Organizations. A content management system (CMS) runs on Amazon EC2 instances in a VPC. The CMS needs to access shared files from an Amazon Elastic File System (Amazon EFS) file system that is deployed in a separate AWS account. The EFS account is in a separate VPC.

Which solution will meet this requirement?

Options:

A.  

Mount the EFS file system on the EC2 instances by using the EFS Elastic IP address.

B.  

Enable VPC sharing between the two accounts. Use the EFS mount helper to mount the file system on the EC2 instances. Redeploy the EFS file system in a shared subnet.

C.  

Configure AWS Systems Manager Run Command to mount the EFS file system on the EC2 instances.

D.  

Install the amazon-efs-utils package on the EC2 instances. Add the mount target in the efs-config file. Mount the EFS file system by using the EFS access point.

Discussion 0
Questions 195

A company uses Amazon EC2 instances to host a website. The website uses an Amazon S3 bucket to store media files. The company wants to automate infrastructure creation across multiple Regions and securely grant EC2 access to S3 using IAM.

Which solution will meet these requirements MOST securely?

Options:

A.  

Store IAM access keys in UserData.

B.  

Store access keys in S3 and reference them in CloudFormation.

C.  

Use an IAM role and instance profile in CloudFormation.

D.  

Retrieve access keys dynamically and store them on EC2.

Discussion 0
Questions 196

An e-commerce company has an application that uses Amazon DynamoDB tables configured with provisioned capacity. Order data is stored in a table named Orders. The Orders table has a primary key of order-ID and a sort key of product-ID. The company configured an AWS Lambda function to receive DynamoDB streams from the Orders table and update a table named Inventory. The company has noticed that during peak sales periods, updates to the Inventory table take longer than the company can tolerate.

Which solutions will resolve the slow table updates? (Select TWO.)

Options:

A.  

Add a global secondary index to the Orders table. Include the product-ID attribute.

B.  

Set the batch size attribute of the DynamoDB streams to be based on the size of items in the Orders table.

C.  

Increase the DynamoDB table provisioned capacity by 1,000 write capacity units (WCUs).

D.  

Increase the DynamoDB table provisioned capacity by 1,000 read capacity units (RCUs).

E.  

Increase the timeout of the Lambda function to 15 minutes.

Discussion 0
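Options B and C address the bottleneck from both sides: a larger stream batch size lets each Lambda invocation process more records, and more WCUs let the Inventory table absorb the resulting writes. A hedged sketch of what such a stream handler might do is below; the event shape follows DynamoDB Streams, the table and key names come from the question, but the `quantity` attribute is an assumption for illustration.

```python
# Hypothetical stream-processing logic for the Orders -> Inventory flow.
# Collapsing a batch into one delta per product means fewer, larger
# writes against the Inventory table's provisioned capacity.

from collections import defaultdict

def aggregate_inventory_updates(event):
    """Reduce a batch of Orders stream records to a single stock delta
    per product-ID. The 'quantity' attribute is assumed, not from the
    question."""
    deltas = defaultdict(int)
    for record in event["Records"]:
        if record["eventName"] != "INSERT":
            continue  # only new orders change stock in this sketch
        new_image = record["dynamodb"]["NewImage"]
        product_id = new_image["product-ID"]["S"]
        quantity = int(new_image["quantity"]["N"])
        deltas[product_id] -= quantity  # each sale decrements stock
    return dict(deltas)

# Two orders for the same product collapse into one update.
event = {"Records": [
    {"eventName": "INSERT", "dynamodb": {"NewImage": {
        "product-ID": {"S": "p-1"}, "quantity": {"N": "2"}}}},
    {"eventName": "INSERT", "dynamodb": {"NewImage": {
        "product-ID": {"S": "p-1"}, "quantity": {"N": "3"}}}},
]}
```

With a larger batch size (option B), more records arrive per invocation, so this aggregation saves proportionally more Inventory writes during peak sales.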
Questions 197

An ecommerce company is migrating its on-premises workload to the AWS Cloud. The workload currently consists of a web application and a backend Microsoft SQL database for storage.

The company expects a high volume of customers during a promotional event. The new infrastructure in the AWS Cloud must be highly available and scalable.

Which solution will meet these requirements with the LEAST administrative overhead?

Options:

A.  

Migrate the web application to two Amazon EC2 instances across two Availability Zones behind an Application Load Balancer. Migrate the database to Amazon RDS for Microsoft SQL Server with read replicas in both Availability Zones.

B.  

Migrate the web application to an Amazon EC2 instance that runs in an Auto Scaling group across two Availability Zones behind an Application Load Balancer. Migrate the database to two EC2 instances across separate AWS Regions with database replication.

C.  

Migrate the web application to Amazon EC2 instances that run in an Auto Scaling group across two Availability Zones behind an Application Load Balancer. Migrate the database to Amazon RDS with Multi-AZ deployment.

D.  

Migrate the web application to three Amazon EC2 instances across three Availability Zones behind an Application Load Balancer. Migrate the database to three EC2 instances across three Availability Zones.

Discussion 0
Questions 198

A company runs a latency-sensitive gaming service in the AWS Cloud. The gaming service runs on a fleet of Amazon EC2 instances behind an Application Load Balancer (ALB). An Amazon DynamoDB table stores the gaming data. All the infrastructure is in a single AWS Region. The main user base is in that same Region.

A solutions architect needs to update the architecture to support a global expansion of the gaming service. The gaming service must operate with the least possible latency.

Which solution will meet these requirements?

Options:

A.  

Create an Amazon CloudFront distribution in front of the ALB.

B.  

Deploy an Amazon API Gateway regional API endpoint. Integrate the API endpoint with the ALB.

C.  

Create an accelerator in AWS Global Accelerator. Add a listener. Configure the endpoint to point to the ALB.

D.  

Deploy the ALB and the fleet of EC2 instances to another Region. Use Amazon Route 53 with geolocation routing.

Discussion 0
Questions 199

A company is migrating applications from an on-premises Microsoft Active Directory that the company manages to AWS. The company deploys the applications in multiple AWS accounts. The company uses AWS Organizations to manage the accounts centrally.

The company's security team needs a single sign-on solution across all the company's AWS accounts. The company must continue to manage users and groups that are in the on-premises Active Directory.

Which solution will meet these requirements?

Options:

A.  

Create an Enterprise Edition Active Directory in AWS Directory Service for Microsoft Active Directory. Configure the Active Directory to be the identity source for AWS IAM Identity Center.

B.  

Enable AWS IAM Identity Center. Configure a two-way forest trust relationship to connect the company's self-managed Active Directory with IAM Identity Center by using AWS Directory Service for Microsoft Active Directory.

C.  

Use AWS Directory Service and create a two-way trust relationship with the company's self-managed Active Directory.

D.  

Deploy an identity provider (IdP) on Amazon EC2. Link the IdP as an identity source within AWS IAM Identity Center.

Discussion 0
Questions 200

A developer is creating a serverless application that performs video encoding. The encoding process runs as background jobs and takes several minutes to encode each video. The process must not send an immediate result to users.

The developer is using Amazon API Gateway to manage an API for the application. The developer needs to run test invocations and request validations. The developer must distribute API keys to control access to the API.

Which solution will meet these requirements?

Options:

A.  

Create an HTTP API. Create an AWS Lambda function to handle the encoding jobs. Integrate the function with the HTTP API. Use the Event invocation type to call the Lambda function.

B.  

Create a REST API with the default endpoint type. Create an AWS Lambda function to handle the encoding jobs. Integrate the function with the REST API. Use the Event invocation type to call the Lambda function.

C.  

Create an HTTP API. Create an AWS Lambda function to handle the encoding jobs. Integrate the function with the HTTP API. Use the RequestResponse invocation type to call the Lambda function.

D.  

Create a REST API with the default endpoint type. Create an AWS Lambda function to handle the encoding jobs. Integrate the function with the REST API. Use the RequestResponse invocation type to call the Lambda function.

Discussion 0
Questions 201

A company runs a Node.js function on a server in its on-premises data center. The data center stores data in a PostgreSQL database. The company stores the credentials in a connection string in an environment variable on the server. The company wants to migrate its application to AWS and to replace the Node.js application server with AWS Lambda. The company also wants to migrate to Amazon RDS for PostgreSQL and to ensure that the database credentials are securely managed.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Store the database credentials as a parameter in AWS Systems Manager Parameter Store. Configure Parameter Store to automatically rotate the secrets every 30 days. Update the Lambda function to retrieve the credentials from the parameter.

B.  

Store the database credentials as a secret in AWS Secrets Manager. Configure Secrets Manager to automatically rotate the credentials every 30 days. Update the Lambda function to retrieve the credentials from the secret.

C.  

Store the database credentials as an encrypted Lambda environment variable. Write a custom Lambda function to rotate the credentials. Schedule the Lambda function to run every 30 days.

D.  

Store the database credentials as a key in AWS Key Management Service (AWS KMS). Configure automatic rotation for the key. Update the Lambda function to retrieve the credentials from the KMS key.

Discussion 0
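The retrieval side of option B is a small amount of code: Secrets Manager returns the RDS credentials as a JSON `SecretString` with well-known keys. The sketch below injects the client so the parsing can be exercised without AWS access; in the real Lambda function the client would be `boto3.client("secretsmanager")`, and the secret name is a placeholder.

```python
import json

def get_db_credentials(secrets_client, secret_id):
    """Fetch and parse a database secret. RDS secrets managed by
    Secrets Manager store a JSON string with host/username/password."""
    response = secrets_client.get_secret_value(SecretId=secret_id)
    secret = json.loads(response["SecretString"])
    return {
        "host": secret["host"],
        "user": secret["username"],
        "password": secret["password"],
    }

# A stub stands in for boto3.client("secretsmanager") so the parsing
# logic can run offline; values are illustrative only.
class StubSecrets:
    def get_secret_value(self, SecretId):
        return {"SecretString": json.dumps({
            "host": "db.example.com", "username": "app", "password": "s3cret"})}

creds = get_db_credentials(StubSecrets(), "prod/app/postgres")
```

Because Secrets Manager handles rotation, the function always sees current credentials at invocation time with no code changes, which is why option B has the least operational overhead.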
Questions 202

A company is migrating some workloads to AWS. However, many workloads will remain on premises. The on-premises workloads require secure and reliable connectivity to AWS with consistent, low-latency performance.

The company has deployed the AWS workloads across multiple AWS accounts and multiple VPCs. The company plans to scale to hundreds of VPCs within the next year.

The company must establish connectivity between each of the VPCs and from the on-premises environment to each VPC.

Which solution will meet these requirements?

Options:

A.  

Use an AWS Direct Connect connection to connect the on-premises environment to AWS. Configure VPC peering to establish connectivity between VPCs.

B.  

Use multiple AWS Site-to-Site VPN connections to connect the on-premises environment to AWS. Create a transit gateway to establish connectivity between VPCs.

C.  

Use an AWS Direct Connect connection with a Direct Connect gateway to connect the on-premises environment to AWS. Create a transit gateway to establish connectivity between VPCs. Associate the transit gateway with the Direct Connect gateway.

D.  

Use an AWS Site-to-Site VPN connection to connect the on-premises environment to AWS. Configure VPC peering to establish connectivity between VPCs.

Discussion 0
Questions 203

A company is developing an ecommerce application that will consist of a load-balanced front end, a container-based application, and a relational database. A solutions architect needs to create a highly available solution that operates with as little manual intervention as possible.

Which solutions meet these requirements? (Select TWO.)

Options:

A.  

Create an Amazon RDS DB instance in Multi-AZ mode.

B.  

Create an Amazon RDS DB instance and one or more replicas in another Availability Zone.

C.  

Create an Amazon EC2 instance-based Docker cluster to handle the dynamic application load.

D.  

Create an Amazon Elastic Container Service (Amazon ECS) cluster with a Fargate launch type to handle the dynamic application load.

E.  

Create an Amazon Elastic Container Service (Amazon ECS) cluster with an Amazon EC2 launch type to handle the dynamic application load.

Discussion 0
Questions 204

A company hosts an application that processes highly sensitive customer transactions on AWS. The application uses Amazon RDS as its database. The company manages its own encryption keys to secure the data in Amazon RDS.

The company needs to update the customer-managed encryption keys at least once each year.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Set up automatic key rotation in AWS Key Management Service (AWS KMS) for the encryption keys.

B.  

Configure AWS Key Management Service (AWS KMS) to alert the company to rotate the encryption keys annually.

C.  

Schedule an AWS Lambda function to rotate the encryption keys annually.

D.  

Create an AWS CloudFormation stack to run an AWS Lambda function that deploys new encryption keys once each year.

Discussion 0
Questions 205

A company is building an application in the AWS Cloud. The application is hosted on Amazon EC2 instances behind an Application Load Balancer (ALB). The company uses Amazon Route 53 for the DNS.

The company needs a managed solution with proactive engagement to detect against DDoS attacks.

Which solution will meet these requirements?

Options:

A.  

Enable AWS Config. Configure an AWS Config managed rule that detects DDoS attacks.

B.  

Enable AWS WAF on the ALB. Create an AWS WAF web ACL with rules to detect and prevent DDoS attacks. Associate the web ACL with the ALB.

C.  

Store the ALB access logs in an Amazon S3 bucket. Configure Amazon GuardDuty to detect and take automated preventative actions for DDoS attacks.

D.  

Subscribe to AWS Shield Advanced. Configure hosted zones in Route 53. Add ALB resources as protected resources.

Discussion 0
Questions 206

A company runs multiple workloads on virtual machines (VMs) in an on-premises data center. The company is expanding rapidly. The on-premises data center is not able to scale fast enough to meet business needs. The company wants to migrate the workloads to AWS.

The migration is time sensitive. The company wants to use a lift-and-shift strategy for non-critical workloads.

Which combination of steps will meet these requirements? (Select THREE.)

Options:

A.  

Use the AWS Schema Conversion Tool (AWS SCT) to collect data about the VMs.

B.  

Use AWS Application Migration Service. Install the AWS Replication Agent on the VMs.

C.  

Complete the initial replication of the VMs. Launch test instances to perform acceptance tests on the VMs.

D.  

Stop all operations on the VMs. Launch a cutover instance.

E.  

Use AWS App2Container (A2C) to collect data about the VMs.

F.  

Use AWS Database Migration Service (AWS DMS) to migrate the VMs.

Discussion 0
Questions 207

A company is building an application composed of multiple microservices that communicate over HTTP. The company must deploy the application across multiple AWS Regions to meet disaster recovery requirements. The application must maintain high availability and automatic fault recovery.

Which solution will meet these requirements?

Options:

A.  

Deploy all microservices on a single large EC2 instance in one Region to simplify communication.

B.  

Use AWS Fargate to run each microservice in separate containers. Deploy across multiple Availability Zones in one Region behind an Application Load Balancer.

C.  

Use Amazon Route 53 with latency-based routing. Deploy microservices on Amazon EC2 instances in multiple Regions behind Application Load Balancers.

D.  

Implement each microservice using AWS Lambda. Expose the microservices using an Amazon API Gateway REST API.

Discussion 0
Questions 208

A company's application receives requests from customers in JSON format. The company uses Amazon Simple Queue Service (Amazon SQS) to handle the requests.

After the application's most recent update, the company's customers reported that requests were being duplicated. A solutions architect discovers that the application is consuming messages from the SQS queue more than once.

What is the root cause of the issue?

Options:

A.  

The visibility timeout is longer than the time it takes the application to process messages from the queue.

B.  

The duplicated messages in the SQS queue contain unescaped Unicode characters.

C.  

The message size exceeds the maximum of 256 KiB for each SQS message.

D.  

The visibility timeout is shorter than the time it takes the application to process messages from the queue.

Discussion 0
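The mechanism behind answer D can be modeled directly: while a consumer processes a message, the visibility timeout hides it from other polls; if the timeout lapses before the consumer deletes the message, SQS makes it visible again and it is delivered a second time. The toy simulation below (not real SQS behavior, just the timing arithmetic) shows why a timeout shorter than the processing time produces duplicates.

```python
def deliveries(processing_time, visibility_timeout):
    """Count deliveries of one message while a single consumer works on
    it. Each time the visibility timeout expires before the consumer
    finishes (and deletes the message), SQS redelivers it."""
    count = 1                       # first delivery starts processing
    elapsed = visibility_timeout
    while elapsed < processing_time:
        count += 1                  # timeout expired mid-processing -> redelivered
        elapsed += visibility_timeout
    return count
```

With a 30-second timeout and 70 seconds of processing, the message is delivered three times; raising the timeout above the processing time (or having the consumer call `ChangeMessageVisibility` to extend it) yields exactly one delivery.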
Questions 209

An ecommerce company is migrating its on-premises workload to the AWS Cloud. The workload consists of a web application and a backend Microsoft SQL Server database. The company expects a high volume of customers during a promotional event. The new AWS infrastructure must be highly available and scalable.

Which solution will meet these requirements with the LEAST administrative overhead?

Options:

A.  

Migrate the web application to two EC2 instances across two Availability Zones behind an Application Load Balancer. Migrate the database to Amazon RDS for Microsoft SQL Server with read replicas in both Availability Zones.

B.  

Migrate the web application to an EC2 instance in an Auto Scaling group across two Availability Zones behind an Application Load Balancer. Migrate the database to two EC2 instances across separate Regions with database replication.

C.  

Migrate the web application to EC2 instances in an Auto Scaling group across two Availability Zones behind an Application Load Balancer. Migrate the database to Amazon RDS with a Multi-AZ deployment.

D.  

Migrate the web application to three EC2 instances across three Availability Zones behind an Application Load Balancer. Migrate the database to three EC2 instances across three Availability Zones.

Discussion 0
Questions 210

A solutions architect is building an Amazon S3 data lake for a company. The company uses Amazon Kinesis Data Firehose to ingest customer personally identifiable information (PII) and transactional data in near real time to an S3 bucket. The company needs to mask all PII data before storing the data in the data lake.

Which solution will meet these requirements?

Options:

A.  

Create an AWS Lambda function to detect and mask PII. Invoke the function from Kinesis Data Firehose.

B.  

Use Amazon Macie to scan the S3 bucket. Configure Macie to detect and mask PII.

C.  

Enable server-side encryption (SSE) on the S3 bucket.

D.  

Create an AWS Lambda function that integrates with AWS CloudHSM. Configure the function to detect and mask PII.

Discussion 0
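Option A relies on Firehose's data-transformation feature: Firehose hands the Lambda function a batch of base64-encoded records and expects each back with a `recordId`, a `result` of `"Ok"`, and re-encoded `data`. The sketch below follows that contract; detecting PII only as email addresses via regex is a simplifying assumption for illustration (a real implementation would use broader detection, e.g. Amazon Comprehend).

```python
import base64
import re

# Assumption: PII here is limited to email addresses; real pipelines
# would detect names, card numbers, etc.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def handler(event, context=None):
    """Firehose data-transformation Lambda: decode each record, mask
    PII, and return it with result 'Ok' so Firehose writes the masked
    version to Amazon S3."""
    output = []
    for record in event["records"]:
        payload = base64.b64decode(record["data"]).decode("utf-8")
        masked = EMAIL.sub("***MASKED***", payload)
        output.append({
            "recordId": record["recordId"],
            "result": "Ok",
            "data": base64.b64encode(masked.encode("utf-8")).decode("utf-8"),
        })
    return {"records": output}

# Synthetic Firehose event for local testing.
event = {"records": [{
    "recordId": "1",
    "data": base64.b64encode(b'{"email": "jane@example.com", "amount": 42}').decode(),
}]}
```

Macie (option B) can only detect PII after it lands in S3; the transformation Lambda masks it in flight, before storage, which is what the requirement demands.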
Questions 211

A company has several on-premises Internet Small Computer Systems Interface (iSCSI) network storage servers. The company wants to reduce the number of these servers by moving to the AWS Cloud. A solutions architect must provide low-latency access to frequently used data and reduce the dependency on on-premises servers with a minimal number of infrastructure changes.

Which solution will meet these requirements?

Options:

A.  

Deploy an Amazon S3 File Gateway

B.  

Deploy Amazon Elastic Block Store (Amazon EBS) storage with backups to Amazon S3

C.  

Deploy an AWS Storage Gateway volume gateway that is configured with stored volumes

D.  

Deploy an AWS Storage Gateway volume gateway that is configured with cached volumes.

Discussion 0
Questions 212

An online food delivery company wants to optimize its storage costs. The company has been collecting operational data for the last 10 years in a data lake that was built on Amazon S3 by using a Standard storage class. The company does not keep data that is older than 7 years. A solutions architect frequently uses data from the past 6 months for reporting and runs queries on data from the last 2 years about once a month. Data that is more than 2 years old is rarely accessed and is only used for audit purposes.

Which combination of solutions will optimize the company's storage costs? (Select TWO.)

Options:

A.  

Create an S3 Lifecycle configuration rule to transition data that is older than 6 months to the S3 Standard-Infrequent Access (S3 Standard-IA) storage class. Create another S3 Lifecycle configuration rule to transition data that is older than 2 years to the S3 Glacier Deep Archive storage class.

B.  

Create an S3 Lifecycle configuration rule to transition data that is older than 6 months to the S3 One Zone-Infrequent Access (S3 One Zone-IA) storage class. Create another S3 Lifecycle configuration rule to transition data that is older than 2 years to the S3 Glacier Flexible Retrieval storage class.

C.  

Use the S3 Intelligent-Tiering storage class to store data instead of the S3 Standard storage class.

D.  

Create an S3 Lifecycle expiration rule to delete data that is older than 7 years.

E.  

Create an S3 Lifecycle configuration rule to transition data that is older than 7 years to the S3 Glacier Deep Archive storage class.

Discussion 0
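Options A and D translate directly into two S3 Lifecycle rules. The sketch below builds the configuration dict that would be passed to `put_bucket_lifecycle_configuration`; the rule IDs are placeholders, and the day counts (180, 730, 2555) are approximate conversions of 6 months, 2 years, and 7 years.

```python
def lifecycle_configuration():
    """S3 Lifecycle rules matching options A and D: Standard-IA after
    ~6 months, Glacier Deep Archive after ~2 years, delete after ~7
    years. Rule IDs are illustrative placeholders."""
    return {"Rules": [
        {
            "ID": "tier-down-old-data",
            "Status": "Enabled",
            "Filter": {"Prefix": ""},  # apply to the whole bucket
            "Transitions": [
                {"Days": 180, "StorageClass": "STANDARD_IA"},
                {"Days": 730, "StorageClass": "DEEP_ARCHIVE"},
            ],
        },
        {
            "ID": "expire-after-7-years",
            "Status": "Enabled",
            "Filter": {"Prefix": ""},
            "Expiration": {"Days": 2555},  # ~7 years, matching the retention rule
        },
    ]}

# This dict is the LifecycleConfiguration argument to the S3 client's
# put_bucket_lifecycle_configuration call.
config = lifecycle_configuration()
```

Option E is the trap: transitioning 7-year-old data to Deep Archive keeps paying for data the company no longer needs, whereas the expiration rule deletes it.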
Questions 213

A solutions architect needs to retain a particular automated database snapshot from an Amazon RDS for Microsoft SQL Server DB instance for longer than the maximum automated snapshot retention period allows.

Which solution will meet these requirements in the MOST operationally efficient way?

Options:

A.  

Create a manual copy of the snapshot.

B.  

Export the contents of the snapshot to an Amazon S3 bucket.

C.  

Change the retention period of the snapshot to 45 days.

D.  

Create a native SQL Server backup. Save the backup to an Amazon S3 bucket.

Discussion 0
Questions 214

A company has built an application that uses an Amazon Simple Queue Service (Amazon SQS) standard queue and an AWS Lambda function. The Lambda function writes messages to the SQS queue. The company needs a solution to ensure that the consumer of the SQS queue never receives duplicate messages.

Which solution will meet this requirement with the FEWEST changes to the current architecture?

Options:

A.  

Modify the SQS queue to enable long polling for the queue.

B.  

Delete the existing SQS queue. Recreate the queue as a FIFO queue. Enable content-based deduplication for the queue.

C.  

Modify the SQS queue to enable content-based deduplication for the queue.

D.  

Delete the SQS queue. Create an Amazon MQ message broker. Configure the broker to deduplicate messages.

Discussion 0
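Content-based deduplication (option B) works because SQS computes a SHA-256 hash of each message body and silently drops any message whose hash was already seen within the 5-minute deduplication interval. The toy model below illustrates that window logic; it is a simplification, not SQS itself (for example, it ignores message group IDs and explicit deduplication IDs).

```python
import hashlib

class FifoDedupWindow:
    """Toy model of FIFO content-based deduplication: hash the body
    (SHA-256) and drop any message whose hash was seen within the
    5-minute deduplication interval."""
    INTERVAL = 300  # seconds, the FIFO deduplication window

    def __init__(self):
        self.seen = {}  # body hash -> timestamp of first accepted send

    def send(self, body, now):
        digest = hashlib.sha256(body.encode("utf-8")).hexdigest()
        first = self.seen.get(digest)
        if first is not None and now - first < self.INTERVAL:
            return False            # duplicate within the window: dropped
        self.seen[digest] = now
        return True                 # accepted for delivery

queue = FifoDedupWindow()
```

Option C fails because deduplication cannot simply be "enabled" on an existing standard queue; the queue must be recreated as a FIFO queue, which is why B requires deleting and recreating it.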
Questions 215

A company runs business applications on AWS. The company uses 50 AWS accounts, thousands of VPCs, and three AWS Regions across the United States and Europe. The company has an existing AWS Direct Connect connection that connects an on-premises data center to a single Region.

A solutions architect needs to establish network connectivity between the on-premises data center and the remaining two Regions. The solutions architect must also establish connectivity between the VPCs. On-premises users and applications must be able to connect to applications that run in the VPCs. The solutions architect creates a transit gateway in each Region and configures the transit gateways as inter-Region peers.

What should the solutions architect do next to meet these requirements?

Options:

A.  

Create a private virtual interface (VIF) with a gateway type of virtual private gateway. Configure the private VIF to use a virtual private gateway that is associated with one of the VPCs.

B.  

Create a private virtual interface (VIF) to a new Direct Connect gateway. Associate the new Direct Connect gateway with a virtual private gateway in each VPC.

C.  

Create a transit virtual interface (VIF) with a gateway association to a new Direct Connect gateway. Associate each transit gateway with the new Direct Connect gateway.

D.  

Create an AWS Site-to-Site VPN connection that uses a public virtual interface (VIF) for the Direct Connect connection. Attach the Site-to-Site VPN connection to the transit gateways.

Discussion 0
Questions 216

A company uses AWS to host a public website. The load on the webservers recently increased.

The company wants to learn more about the traffic flow and traffic sources. The company also wants to increase the overall security of the website.

Which solution will meet these requirements?

Options:

A.  

Deploy AWS WAF and set up logging. Use Amazon Data Firehose to deliver the log files to an Amazon S3 bucket for analysis.

B.  

Deploy Amazon API Gateway and set up logging. Use Amazon Kinesis Data Streams to deliver the log files to an Amazon S3 bucket for analysis.

C.  

Deploy a Network Load Balancer and set up logging. Use Amazon Data Firehose to deliver the log files to an Amazon S3 bucket for analysis.

D.  

Deploy an Application Load Balancer and set up logging. Use Amazon Kinesis Data Streams to deliver the log files to an Amazon S3 bucket for analysis.

Discussion 0
Questions 217

A company needs to migrate its customer transactions database from on-premises to AWS. The database resides on an Oracle DB instance that runs on a Linux server. According to a new security requirement, the company must rotate the database password each year.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Convert the database to Amazon DynamoDB by using AWS Schema Conversion Tool (AWS SCT). Store the password in AWS Systems Manager Parameter Store. Create an Amazon CloudWatch alarm to invoke an AWS Lambda function for yearly password rotation.

B.  

Migrate the database to Amazon RDS for Oracle. Store the password in AWS Secrets Manager. Turn on automatic rotation. Configure a yearly rotation schedule.

C.  

Migrate the database to an Amazon EC2 instance. Use AWS Systems Manager Parameter Store to keep and rotate the connection string by using an AWS Lambda function on a yearly schedule.

D.  

Migrate the database to Amazon Neptune by using AWS Schema Conversion Tool (AWS SCT). Create an Amazon CloudWatch alarm to invoke an AWS Lambda function for yearly password rotation.

Discussion 0
Questions 218

An ecommerce company is planning to migrate an on-premises Microsoft SQL Server database to the AWS Cloud. The company needs to migrate the database to SQL Server Always On availability groups. The cloud-based solution must be highly available.

Which solution will meet these requirements?

Options:

A.  

Deploy three Amazon EC2 instances with SQL Server across three Availability Zones. Attach one Amazon Elastic Block Store (Amazon EBS) volume to the EC2 instances.

B.  

Migrate the database to Amazon RDS for SQL Server. Configure a Multi-AZ deployment and read replicas.

C.  

Deploy three Amazon EC2 instances with SQL Server across three Availability Zones. Use Amazon FSx for Windows File Server as the storage tier.

D.  

Deploy three Amazon EC2 instances with SQL Server across three Availability Zones. Use Amazon S3 as the storage tier.

Discussion 0
Questions 219

A company hosts a website on Amazon EC2 instances behind an Application Load Balancer (ALB). The instances run Amazon Linux in an Auto Scaling group. Each instance stores product manuals on Amazon EBS volumes.

New instances often start with outdated data and can take up to 30 minutes to download updates. The company needs a solution that ensures all instances always have up-to-date product manuals, supports rapid scaling, and requires no application code changes.

Which solution will meet these requirements?

Options:

A.  

Store the product manuals on instance store volumes attached to each EC2 instance.

B.  

Store the product manuals in an Amazon S3 bucket. Configure EC2 instances to download updates from the bucket.

C.  

Store the product manuals in an Amazon EFS file system. Mount the EFS volume on the EC2 instances.

D.  

Store the product manuals in an S3 bucket using S3 Standard-IA. Configure EC2 instances to download updates from S3.

Discussion 0
Questions 220

A mining company is using Amazon S3 as its data lake. The company wants to analyze the data collected by the sensors in its mines. A data pipeline is being built to capture data from the sensors, ingest the data into an S3 bucket, and convert the data to Apache Parquet format. The pipeline must process the data in near real time. The data will be used for on-demand queries with Amazon Athena.

Which solution will meet these requirements?

Options:

A.  

Use Amazon Data Firehose to invoke an AWS Lambda function that converts the data to Parquet format and stores the data in Amazon S3.

B.  

Use Amazon Kinesis Data Streams to invoke an AWS Lambda function that converts the data to Parquet format and stores the data in Amazon S3.

C.  

Use AWS DataSync to invoke an AWS Lambda function that converts the data to Parquet format and stores the data in Amazon S3.

D.  

Use Amazon Simple Queue Service (Amazon SQS) to stream data directly to an AWS Glue job that converts the data to Parquet format and stores the data in Amazon S3.

Discussion 0
Questions 221

A company is running a two-tier web-based application in an on-premises data center. The application layer consists of a single server running a stateful application. The application connects to a PostgreSQL database running on a separate server. The user base is expected to grow significantly, so the company is migrating the application and database to AWS. The solution will use Amazon Aurora PostgreSQL, Amazon EC2 Auto Scaling, and Elastic Load Balancing.

Which solution will provide a consistent user experience that will allow the application and database tiers to scale?

Options:

A.  

Enable Aurora Auto Scaling for Aurora Replicas. Use a Network Load Balancer with the least outstanding requests routing algorithm and sticky sessions enabled.

B.  

Enable Aurora Auto Scaling for Aurora writers. Use an Application Load Balancer with the round robin routing algorithm and sticky sessions enabled.

C.  

Enable Aurora Auto Scaling for Aurora Replicas. Use an Application Load Balancer with the round robin routing algorithm and sticky sessions enabled.

D.  

Enable Aurora Auto Scaling for Aurora writers. Use a Network Load Balancer with the least outstanding requests routing algorithm and sticky sessions enabled.

Discussion 0
Questions 222

A company is designing a new internal web application in the AWS Cloud. The new application must securely retrieve and store multiple employee usernames and passwords from an AWS managed service.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Store the employee credentials in AWS Systems Manager Parameter Store. Use AWS CloudFormation and the BatchGetSecretValue API to retrieve usernames and passwords from Parameter Store.

B.  

Store the employee credentials in AWS Secrets Manager. Use AWS CloudFormation and AWS Batch with the BatchGetSecretValue API to retrieve the usernames and passwords from Secrets Manager.

C.  

Store the employee credentials in AWS Systems Manager Parameter Store. Use AWS CloudFormation and AWS Batch with the BatchGetSecretValue API to retrieve the usernames and passwords from Parameter Store.

D.  

Store the employee credentials in AWS Secrets Manager. Use AWS CloudFormation and the BatchGetSecretValue API to retrieve the usernames and passwords from Secrets Manager.

Discussion 0
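The BatchGetSecretValue API named in the options above retrieves multiple secrets from AWS Secrets Manager in a single call. A sketch of the request parameters, with hypothetical secret names:

```python
def batch_get_secrets_request(secret_ids: list[str], max_results: int = 20) -> dict:
    """Build parameters for SecretsManager.batch_get_secret_value.

    SecretIdList holds the names or ARNs of the secrets to fetch in one call.
    """
    return {
        "SecretIdList": secret_ids,
        "MaxResults": max_results,
    }

# Hypothetical per-employee secret names.
params = batch_get_secrets_request(["employee/alice", "employee/bob"])
```

The response's `SecretValues` list would then contain one entry per resolved secret; any that failed to resolve are reported separately in `Errors`.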
Questions 223

A company wants to store a large amount of data as objects for analytics and long-term archiving. Resources from outside AWS need to access the data with unpredictable frequency. However, the external resources must have immediate access when necessary.

The company needs a cost-optimized solution that provides high durability and data security.

Which solution will meet these requirements?

Options:

A.  

Store the data in Amazon S3 Standard. Apply S3 Lifecycle policies to transition older data to S3 Glacier Deep Archive.

B.  

Store the data in Amazon S3 Intelligent-Tiering.

C.  

Store the data in Amazon S3 Glacier Flexible Retrieval. Use expedited retrieval to provide immediate access when necessary.

D.  

Store the data in Amazon Elastic File System (Amazon EFS) Infrequent Access (IA). Use lifecycle policies to archive older files.

Discussion 0
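S3 Intelligent-Tiering, one of the options above, is selected per object via the storage class at upload time; S3 then moves the object between access tiers automatically while keeping it immediately retrievable. A sketch of the put-object parameters (bucket and key are hypothetical):

```python
def intelligent_tiering_put(bucket: str, key: str) -> dict:
    """Build parameters for S3.put_object that place the object directly
    in the S3 Intelligent-Tiering storage class."""
    return {
        "Bucket": bucket,
        "Key": key,
        "StorageClass": "INTELLIGENT_TIERING",
    }

params = intelligent_tiering_put("analytics-archive-bucket", "datasets/2025/run-01.parquet")
```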
Questions 224

A transaction-processing company has weekly batch jobs that run on Amazon EC2 instances in an Auto Scaling group. Transaction volume varies, but CPU utilization is always at least 60% during the batch runs. Capacity must be provisioned 30 minutes before the jobs begin.

Engineers currently scale the Auto Scaling group manually. The company needs an automated solution but cannot allocate time to analyze scaling trends.

Which solution will meet these requirements with the least operational overhead?

Options:

A.  

Create a dynamic scaling policy based on CPU utilization at 60%.

B.  

Create a scheduled scaling policy. Set desired, minimum, and maximum capacity. Set recurrence weekly. Set the start time to 30 minutes before the jobs run.

C.  

Create a predictive scaling policy that forecasts CPU usage and pre-launches instances 30 minutes before the jobs run.

D.  

Create an EventBridge rule that invokes a Lambda function when CPU reaches 60%. The Lambda function increases the Auto Scaling group size by 20%.

Discussion 0
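A scheduled scaling policy, as described in option B above, is a recurring action on the Auto Scaling group whose cron expression fires 30 minutes before the batch window. A sketch that derives the cron time from the job start time (group name and action name are hypothetical):

```python
def weekly_prescale_action(group: str, desired: int, job_hour_utc: int, job_minute_utc: int) -> dict:
    """Build parameters for AutoScaling.put_scheduled_update_group_action.

    Schedules the scale-out 30 minutes before the weekly batch start time.
    """
    # Subtract 30 minutes, wrapping across midnight if needed.
    total = (job_hour_utc * 60 + job_minute_utc - 30) % (24 * 60)
    hour, minute = divmod(total, 60)
    return {
        "AutoScalingGroupName": group,
        "ScheduledActionName": "weekly-batch-prescale",
        "Recurrence": f"{minute} {hour} * * SUN",  # assumed weekly Sunday run
        "DesiredCapacity": desired,
    }

# Jobs start Sundays at 02:00 UTC, so capacity is provisioned at 01:30 UTC.
params = weekly_prescale_action("batch-asg", 20, 2, 0)
```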
Questions 225

A genomics research company is designing a scalable architecture for a loosely coupled workload. Tasks in the workload are independent and can be processed in parallel. The architecture needs to minimize management overhead and provide automatic scaling based on demand.

Options:

A.  

Use a cluster of Amazon EC2 instances. Use AWS Systems Manager to manage the workload.

B.  

Implement a serverless architecture that uses AWS Lambda functions.

C.  

Use AWS ParallelCluster to deploy a dedicated high-performance cluster.

D.  

Implement vertical scaling for each workload task.

Discussion 0
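For a loosely coupled workload like the one above, a serverless option maps each independent task to one function invocation, so parallelism comes from Lambda's concurrency scaling rather than managed servers. A minimal handler sketch (the event shape and GC-content computation are illustrative assumptions, not part of the question):

```python
def handler(event, context=None):
    """Minimal AWS Lambda handler sketch: each invocation processes one
    independent task, so concurrent invocations run tasks in parallel."""
    sequence = event.get("sequence", "")
    # Illustrative work only: compute the GC fraction of the input sequence.
    gc = sum(1 for base in sequence.upper() if base in "GC")
    return {"length": len(sequence), "gc_fraction": gc / len(sequence) if sequence else 0.0}

result = handler({"sequence": "ATGCGC"})
```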
Questions 226

An events company runs a web application on Amazon EKS that uses an Amazon DynamoDB table. The table has 1,000 RCUs and 500 WCUs provisioned. The application uses eventually consistent reads.

Traffic is usually low but occasionally spikes. During spikes, DynamoDB throttles requests, causing user-facing errors.

What should a solutions architect do to reduce these errors?

Options:

A.  

Change the DynamoDB table to on-demand capacity mode.

B.  

Create a DynamoDB read replica.

C.  

Purchase DynamoDB reserved capacity.

D.  

Use strongly consistent reads.

Discussion 0
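Switching a table to on-demand capacity, as in option A above, is a single billing-mode change on the table; afterward there are no pre-set RCU/WCU limits to throttle against during spikes. A sketch of the update-table parameters (the table name is hypothetical):

```python
def on_demand_update(table_name: str) -> dict:
    """Build parameters for DynamoDB.update_table that switch the table
    from provisioned throughput to on-demand (pay-per-request) capacity."""
    return {
        "TableName": table_name,
        "BillingMode": "PAY_PER_REQUEST",
    }

params = on_demand_update("events-table")
```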
Questions 227

A company has a single AWS account. The company runs workloads on Amazon EC2 instances in multiple VPCs in one AWS Region. The company also runs workloads in an on-premises data center that connects to the company's AWS account by using AWS Direct Connect.

The company needs all EC2 instances in the VPCs to resolve DNS queries for the internal.example.com domain to the authoritative DNS server that is located in the on-premises data center. The solution must use private communication between the VPCs and the on-premises network. All route tables, network ACLs, and security groups are configured correctly between AWS and the on-premises data center.

Which combination of actions will meet these requirements? (Select THREE.)

Options:

A.  

Create an Amazon Route 53 inbound endpoint in all the workload VPCs.

B.  

Create an Amazon Route 53 outbound endpoint in one of the workload VPCs.

C.  

Create an Amazon Route 53 Resolver rule with the Forward type configured to forward queries for internal.example.com to the on-premises DNS server.

D.  

Create an Amazon Route 53 Resolver rule with the System type configured to forward queries for internal.example.com to the on-premises DNS server.

E.  

Associate the Amazon Route 53 Resolver rule with all the workload VPCs.

F.  

Associate the Amazon Route 53 Resolver rule with the workload VPC with the new Route 53 endpoint.

Discussion 0
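A Route 53 Resolver forwarding rule, as described in options B and C above, names the domain to forward and the on-premises DNS targets, and references the outbound endpoint that carries the queries. A sketch of the create-resolver-rule parameters (the endpoint ID, target IP, and request token are hypothetical):

```python
def forward_rule_request(domain: str, endpoint_id: str, target_ips: list[str]) -> dict:
    """Build parameters for Route53Resolver.create_resolver_rule.

    A FORWARD rule sends queries for the domain to on-premises DNS
    servers through the outbound endpoint; the rule is then associated
    with each workload VPC.
    """
    return {
        "CreatorRequestId": "internal-dns-forward-1",  # hypothetical idempotency token
        "RuleType": "FORWARD",
        "DomainName": domain,
        "ResolverEndpointId": endpoint_id,
        "TargetIps": [{"Ip": ip, "Port": 53} for ip in target_ips],
    }

params = forward_rule_request("internal.example.com", "rslvr-out-example", ["10.0.10.5"])
```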