
AWS Certified Solutions Architect - Associate (SAA-C03) Questions and Answers


Last Updated: Sep 20, 2025
Total Questions: 467


Question 1

A company wants to store a large amount of data as objects for analytics and long-term archiving. Resources from outside AWS need to access the data with unpredictable frequency. However, the external resources must have immediate access when necessary.

The company needs a cost-optimized solution that provides high durability and data security.

Which solution will meet these requirements?

Options:

A.  

Store the data in Amazon S3 Standard. Apply S3 Lifecycle policies to transition older data to S3 Glacier Deep Archive.

B.  

Store the data in Amazon S3 Intelligent-Tiering.

C.  

Store the data in Amazon S3 Glacier Flexible Retrieval. Use expedited retrieval to provide immediate access when necessary.

D.  

Store the data in Amazon Elastic File System (Amazon EFS) Infrequent Access (IA). Use lifecycle policies to archive older files.
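
To make option B concrete, here is a minimal boto3 sketch (Python) that uploads an object directly into the S3 Intelligent-Tiering storage class; the bucket and key names are hypothetical:

    import boto3

    s3 = boto3.client("s3")

    # Upload an object into S3 Intelligent-Tiering, which moves objects
    # between access tiers automatically based on access patterns while
    # keeping millisecond retrieval available at all times.
    with open("batch-001.parquet", "rb") as body:
        s3.put_object(
            Bucket="example-analytics-archive",   # hypothetical bucket
            Key="telemetry/2025/09/batch-001.parquet",
            Body=body,
            StorageClass="INTELLIGENT_TIERING",
        )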

Question 2

A company needs a solution to back up and protect critical AWS resources. The company needs to regularly take backups of several Amazon EC2 instances and Amazon RDS for PostgreSQL databases. To ensure high resiliency, the company must have the ability to validate and restore backups.

Which solution meets these requirements with the LEAST operational overhead?

Options:

A.  

Use AWS Backup to create a backup schedule for the resources. Use AWS Backup to create a restoration testing plan for the required resources.

B.  

Take snapshots of the EC2 instances and RDS DB instances. Create AWS Batch jobs to validate and restore the snapshots.

C.  

Create a custom AWS Lambda function to take snapshots of the EC2 instances and RDS DB instances. Create a second Lambda function to restore the snapshots periodically to validate the backups.

D.  

Take snapshots of the EC2 instances and RDS DB instances. Create an AWS Lambda function to restore the snapshots periodically to validate the backups.
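
As an illustration of the AWS Backup approach in option A, the following boto3 sketch creates a daily backup plan and assigns EC2 and RDS resources to it; all names and ARNs are hypothetical:

    import boto3

    backup = boto3.client("backup")

    # Define a daily backup schedule for the selected resources.
    plan = backup.create_backup_plan(
        BackupPlan={
            "BackupPlanName": "critical-resources-daily",
            "Rules": [{
                "RuleName": "daily-0300-utc",
                "TargetBackupVaultName": "Default",
                "ScheduleExpression": "cron(0 3 * * ? *)",
                "Lifecycle": {"DeleteAfterDays": 35},
            }],
        }
    )

    # Assign the EC2 instances and RDS databases to the plan by ARN.
    backup.create_backup_selection(
        BackupPlanId=plan["BackupPlanId"],
        BackupSelection={
            "SelectionName": "ec2-and-rds",
            "IamRoleArn": "arn:aws:iam::111122223333:role/BackupRole",  # hypothetical
            "Resources": [
                "arn:aws:ec2:us-east-1:111122223333:instance/i-0abcd1234example",
                "arn:aws:rds:us-east-1:111122223333:db:orders-postgres",
            ],
        },
    )

A restore testing plan can then be defined in the same service so that AWS Backup restores and validates recovery points on a schedule without any custom code.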

Question 3

A company is storing data in Amazon S3 buckets. The company needs to retain any objects that contain personally identifiable information (PII) that might need to be reviewed.

A solutions architect must develop an automated solution to identify objects that contain PII and apply the necessary controls to prevent deletion before review.

Which combination of steps should the solutions architect take to meet these requirements? (Select THREE.)

Options:

A.  

Create a job in Amazon Macie to scan the S3 buckets for the relevant sensitive data identifiers.

B.  

Move the identified objects to the S3 Glacier Deep Archive storage class.

C.  

Create an AWS Lambda function that performs an S3 Object Lock legal hold operation on the identified objects.

D.  

Create an AWS Lambda function that applies an S3 Object Lock retention period to the identified objects in governance mode.

E.  

Create an Amazon EventBridge rule that invokes the AWS Lambda function when Amazon Macie detects sensitive data.

F.  

Configure multi-factor authentication (MFA) delete on the S3 buckets.
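
As an illustration of how options A, C, and E fit together, here is a minimal sketch of a Lambda handler that an EventBridge rule could invoke on a Macie finding and that places an S3 Object Lock legal hold on the affected object. The field paths into the Macie finding are assumptions to verify against the finding schema:

    import boto3

    s3 = boto3.client("s3")

    def handler(event, context):
        # EventBridge delivers the Macie finding in event["detail"].
        # The affected bucket and object are reported under resourcesAffected
        # (field paths assumed from the Macie finding format).
        affected = event["detail"]["resourcesAffected"]
        bucket = affected["s3Bucket"]["name"]
        key = affected["s3Object"]["key"]

        # A legal hold blocks deletion until the hold is explicitly removed,
        # which fits a "retain until reviewed" requirement better than a
        # fixed retention period. The bucket must have Object Lock enabled.
        s3.put_object_legal_hold(
            Bucket=bucket,
            Key=key,
            LegalHold={"Status": "ON"},
        )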

Question 4

A company has an on-premises application that uses SFTP to collect financial data from multiple vendors. The company is migrating to the AWS Cloud. The company has created an application that uses Amazon S3 APIs to upload files from vendors.

Some vendors run their systems on legacy applications that do not support S3 APIs. The vendors want to continue to use SFTP-based applications to upload data. The company wants to use managed services for the needs of the vendors that use legacy applications.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Create an AWS Database Migration Service (AWS DMS) instance to replicate data from the storage of the vendors that use legacy applications to Amazon S3. Provide the vendors with the credentials to access the AWS DMS instance.

B.  

Create an AWS Transfer Family endpoint for vendors that use legacy applications.

C.  

Configure an Amazon EC2 instance to run an SFTP server. Instruct the vendors that use legacy applications to use the SFTP server to upload data.

D.  

Configure an Amazon S3 File Gateway for vendors that use legacy applications to upload files to an SMB file share.
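
Option B can be illustrated with a short boto3 sketch that creates a managed SFTP endpoint backed by Amazon S3 and registers a vendor's existing SSH key; the role ARN and names are hypothetical:

    import boto3

    transfer = boto3.client("transfer")

    # Create a fully managed SFTP endpoint whose storage backend is Amazon S3,
    # so legacy SFTP clients keep working without the company running servers.
    server = transfer.create_server(
        Domain="S3",
        Protocols=["SFTP"],
        IdentityProviderType="SERVICE_MANAGED",
    )

    # Register a vendor with an SSH public key and an S3 home directory.
    transfer.create_user(
        ServerId=server["ServerId"],
        UserName="vendor-a",
        Role="arn:aws:iam::111122223333:role/TransferS3AccessRole",  # hypothetical
        HomeDirectory="/example-vendor-uploads/vendor-a",
        SshPublicKeyBody="ssh-rsa AAAA...",  # vendor's existing public key
    )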

Question 5

A solutions architect is designing a multi-Region disaster recovery (DR) strategy for a company. The company runs an application on Amazon EC2 instances in Auto Scaling groups that are behind an Application Load Balancer (ALB). The company hosts the application in the company's primary and secondary AWS Regions.

The application must respond to DNS queries from the secondary Region if the primary Region fails. Only one Region must serve traffic at a time.

Which solution will meet these requirements?

Options:

A.  

Create an outbound endpoint in Amazon Route 53 Resolver. Create forwarding rules that determine how queries will be forwarded to DNS resolvers on the network. Associate the rules with VPCs in each Region.

B.  

Create primary and secondary DNS records in Amazon Route 53. Configure health checks and a failover routing policy.

C.  

Create a traffic policy in Amazon Route 53. Use a geolocation routing policy and a value type of ELB Application Load Balancer.

D.  

Create an Amazon Route 53 profile. Associate DNS resources to the profile. Associate the profile with VPCs in each Region.
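
A minimal boto3 sketch of the failover routing approach in option B, assuming hypothetical hosted zone IDs and ALB DNS names:

    import boto3

    route53 = boto3.client("route53")

    # Health check against the primary Region's ALB.
    hc = route53.create_health_check(
        CallerReference="primary-alb-check-001",
        HealthCheckConfig={
            "Type": "HTTPS",
            "FullyQualifiedDomainName": "primary-alb-123.us-east-1.elb.amazonaws.com",
            "Port": 443,
            "ResourcePath": "/health",
        },
    )

    # PRIMARY failover record; a matching record with Failover="SECONDARY"
    # points at the ALB in the secondary Region. Only the primary serves
    # traffic while its health check passes.
    route53.change_resource_record_sets(
        HostedZoneId="Z0EXAMPLE",  # hypothetical hosted zone
        ChangeBatch={"Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": "app.example.com",
                "Type": "A",
                "SetIdentifier": "primary",
                "Failover": "PRIMARY",
                "HealthCheckId": hc["HealthCheck"]["Id"],
                "AliasTarget": {
                    "HostedZoneId": "Z35SXDOTRQ7X7K",  # ALB zone ID; varies by Region
                    "DNSName": "primary-alb-123.us-east-1.elb.amazonaws.com",
                    "EvaluateTargetHealth": True,
                },
            },
        }]},
    )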

Question 6

A company is designing a solution to capture customer activity on the company's web applications. The company wants to analyze the activity data to make predictions.

Customer activity on the web applications is unpredictable and can increase suddenly. The company requires a solution that integrates with other web applications. The solution must include an authorization step.

Which solution will meet these requirements?

Options:

A.  

Deploy a Gateway Load Balancer (GWLB) in front of an Amazon Elastic Container Service (Amazon ECS) container instance. Store the data in an Amazon Elastic File System (Amazon EFS) file system. Configure the applications to pass an authorization header to the GWLB.

B.  

Deploy an Amazon API Gateway endpoint in front of an Amazon Kinesis data stream. Store the data in an Amazon S3 bucket. Use an AWS Lambda function to handle authorization.

C.  

Deploy an Amazon API Gateway endpoint in front of an Amazon Data Firehose delivery stream. Store the data in an Amazon S3 bucket. Use an API Gateway Lambda authorizer to handle authorization.

D.  

Deploy a Gateway Load Balancer (GWLB) in front of an Amazon Elastic Container Service (Amazon ECS) container instance. Store the data in an Amazon Elastic File System (Amazon EFS) file system. Use an AWS Lambda function to handle authorization.

Question 7

A company runs production workloads in its AWS account. Multiple teams create and maintain the workloads.

The company needs to be able to detect changes in resource configurations. The company needs to capture changes as configuration items without changing or modifying the existing resources.

Which solution will meet these requirements?

Options:

A.  

Use AWS Config. Start the configuration recorder for AWS resources to detect changes in resource configurations.

B.  

Use AWS CloudFormation. Initiate drift detection to capture changes in resource configurations.

C.  

Use Amazon Detective to detect, analyze, and investigate changes in resource configurations.

D.  

Use AWS Audit Manager to capture management events and global service events for resource configurations.
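
Option A amounts to turning on the AWS Config recorder, which captures changes as configuration items without touching the resources themselves. A minimal boto3 sketch, with a hypothetical role and bucket:

    import boto3

    config = boto3.client("config")

    # The recorder observes resource changes and records them as
    # configuration items; it never modifies the resources.
    config.put_configuration_recorder(
        ConfigurationRecorder={
            "name": "default",
            "roleARN": "arn:aws:iam::111122223333:role/ConfigRecorderRole",  # hypothetical
            "recordingGroup": {
                "allSupported": True,
                "includeGlobalResourceTypes": True,
            },
        }
    )

    # Configuration items are delivered to an S3 bucket.
    config.put_delivery_channel(
        DeliveryChannel={"name": "default", "s3BucketName": "example-config-history"}
    )

    config.start_configuration_recorder(ConfigurationRecorderName="default")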

Question 8

A company needs a solution to prevent photos with unwanted content from being uploaded to the company’s web application. The solution must not involve training a machine learning (ML) model.

Which solution will meet these requirements?

Options:

A.  

Create and deploy a model by using Amazon SageMaker Autopilot. Create a real-time endpoint that the web application invokes when new photos are uploaded.

B.  

Create an AWS Lambda function that uses Amazon Rekognition to detect unwanted content. Create a Lambda function URL that the web application invokes when new photos are uploaded.

C.  

Create an Amazon CloudFront function that uses Amazon Comprehend to detect unwanted content. Associate the function with the web application.

D.  

Create an AWS Lambda function that uses Amazon Rekognition Video to detect unwanted content. Create a Lambda function URL that the web application invokes when new photos are uploaded.
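
To show what the Rekognition approach in option B might look like, here is a sketch of a Lambda handler that checks an uploaded image with the pretrained moderation model; the event shape and bucket name are assumptions:

    import boto3

    rekognition = boto3.client("rekognition")

    def handler(event, context):
        # Check an uploaded photo for unwanted content with the pretrained
        # moderation model; no custom ML training is involved.
        response = rekognition.detect_moderation_labels(
            Image={"S3Object": {"Bucket": "example-uploads",   # hypothetical
                                "Name": event["key"]}},        # assumed payload field
            MinConfidence=80,
        )
        labels = [label["Name"] for label in response["ModerationLabels"]]
        # Reject the upload if any moderation label was detected.
        return {"allowed": not labels, "labels": labels}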

Question 9

An online gaming company hosts its platform on Amazon EC2 instances behind Network Load Balancers (NLBs) across multiple AWS Regions. The NLBs can route requests to targets over the internet. The company wants to improve the customer playing experience by reducing end-to-end load time for its global customer base.

Which solution will meet these requirements?

Options:

A.  

Create Application Load Balancers (ALBs) in each Region to replace the existing NLBs. Register the existing EC2 instances as targets for the ALBs in each Region.

B.  

Configure Amazon Route 53 to route equally weighted traffic to the NLBs in each Region.

C.  

Create additional NLBs and EC2 instances in other Regions where the company has large customer bases.

D.  

Create a standard accelerator in AWS Global Accelerator. Configure the existing NLBs as target endpoints.
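
Option D in sketch form: a standard accelerator whose endpoint groups point at the existing NLBs, so players enter the AWS global network at the nearest edge location. The Global Accelerator API is called in us-west-2, and the NLB ARNs are hypothetical:

    import boto3

    # The Global Accelerator control plane lives in us-west-2.
    ga = boto3.client("globalaccelerator", region_name="us-west-2")

    acc = ga.create_accelerator(Name="gaming-platform", IpAddressType="IPV4", Enabled=True)

    listener = ga.create_listener(
        AcceleratorArn=acc["Accelerator"]["AcceleratorArn"],
        Protocol="TCP",
        PortRanges=[{"FromPort": 443, "ToPort": 443}],
    )

    # One endpoint group per Region, each pointing at the existing NLB.
    nlbs = {
        "us-east-1": "arn:aws:elasticloadbalancing:us-east-1:111122223333:loadbalancer/net/game/abc",  # hypothetical
        "eu-west-1": "arn:aws:elasticloadbalancing:eu-west-1:111122223333:loadbalancer/net/game/def",
    }
    for region, nlb_arn in nlbs.items():
        ga.create_endpoint_group(
            ListenerArn=listener["Listener"]["ListenerArn"],
            EndpointGroupRegion=region,
            EndpointConfigurations=[{"EndpointId": nlb_arn, "Weight": 128}],
        )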

Question 10

A company runs an application on Microsoft SQL Server databases in an on-premises data center. The company wants to migrate to AWS and optimize costs for its infrastructure on AWS.

Which solution will meet these requirements?

Options:

A.  

Migrate the databases to Amazon EC2 instances that use SQL Server Amazon Machine Images (AMIs) provided by AWS.

B.  

Migrate to Amazon Aurora PostgreSQL by using Babelfish for Aurora PostgreSQL.

C.  

Migrate the databases to a PostgreSQL database that runs on Amazon EC2 instances.

D.  

Migrate the databases to Amazon RDS for Microsoft SQL Server.

Question 11

A company is designing a new multi-tier web application that consists of the following components:

• Web and application servers that run on Amazon EC2 instances as part of Auto Scaling groups

• An Amazon RDS DB instance for data storage

A solutions architect needs to limit access to the application servers so that only the web servers can access them. Which solution will meet these requirements?

Options:

A.  

Deploy AWS PrivateLink in front of the application servers. Configure the network ACL to allow only the web servers to access the application servers.

B.  

Deploy a VPC endpoint in front of the application servers. Configure the security group to allow only the web servers to access the application servers.

C.  

Deploy a Network Load Balancer with a target group that contains the application servers' Auto Scaling group. Configure the network ACL to allow only the web servers to access the application servers.

D.  

Deploy an Application Load Balancer with a target group that contains the application servers' Auto Scaling group. Configure the security group to allow only the web servers to access the application servers.

Question 12

A company receives data transfers from a small number of external clients that use SFTP software on an Amazon EC2 instance. The clients use an SFTP client to upload data. The clients use SSH keys for authentication. Every hour, an automated script transfers new uploads to an Amazon S3 bucket for processing.

The company wants to move the transfer process to an AWS managed service and to reduce the time required to start data processing. The company wants to retain the existing user management and SSH key generation process. The solution must not require clients to make significant changes to their existing processes.

Which solution will meet these requirements?

Options:

A.  

Reconfigure the script that runs on the EC2 instance to run every 15 minutes. Create an S3 Event Notifications rule for all new object creation events. Set an Amazon Simple Notification Service (Amazon SNS) topic as the destination.

B.  

Create an AWS Transfer Family SFTP server that uses the existing S3 bucket as a target. Use service-managed users to enable authentication.

C.  

Require clients to add the AWS DataSync agent into their local environments. Create an IAM user for each client that has permission to upload data to the target S3 bucket.

D.  

Create an AWS Transfer Family SFTP connector that has permission to access the target S3 bucket for each client. Store credentials in AWS Systems Manager. Create an IAM role to allow the SFTP connector to securely use the credentials.

Question 13

An ecommerce company runs a multi-tier application on AWS. The frontend and backend tiers both run on Amazon EC2 instances. The database tier runs on an Amazon RDS for MySQL DB instance. The backend tier communicates with the RDS DB instance.

The application makes frequent calls to return identical datasets from the database. The frequent calls on the database cause performance slowdowns. A solutions architect must improve the performance of the application backend.

Which solution will meet this requirement?

Options:

A.  

Configure an Amazon Simple Notification Service (Amazon SNS) topic between the EC2 instances and the RDS DB instance.

B.  

Configure an Amazon ElastiCache (Redis OSS) cache. Configure the backend EC2 instances to read from the cache.

C.  

Configure an Amazon DynamoDB Accelerator (DAX) cluster. Configure the backend EC2 instances to read from the cluster.

D.  

Configure Amazon Data Firehose to stream the calls to the database.

Question 14

A company is developing a public web application that needs to access multiple AWS services. The application will have hundreds of users who must log in to the application first before using the services.

The company needs to implement a secure and scalable method to grant the web application temporary access to the AWS resources.

Which solution will meet these requirements?

Options:

A.  

Create an IAM role for each AWS service that the application needs to access. Assign the roles directly to the instances that the web application runs on.

B.  

Create an IAM role that has the access permissions the web application requires. Configure the web application to use AWS Security Token Service (AWS STS) to assume the IAM role. Use STS tokens to access the required AWS services.

C.  

Use AWS IAM Identity Center to create a user pool that includes the application users. Assign access credentials to the web application users. Use the credentials to access the required AWS services.

D.  

Create an IAM user that has programmatic access keys for the AWS services. Store the access keys in AWS Systems Manager Parameter Store. Retrieve the access keys from Parameter Store. Use the keys in the web application.
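
Option B can be sketched as follows: the application calls AWS STS to assume a role and then uses the short-lived credentials, so no long-lived access keys are embedded in the application. The role ARN is hypothetical:

    import boto3

    sts = boto3.client("sts")

    # Exchange the application's identity for short-lived credentials.
    resp = sts.assume_role(
        RoleArn="arn:aws:iam::111122223333:role/WebAppAccessRole",  # hypothetical
        RoleSessionName="web-app-session",
        DurationSeconds=3600,
    )
    creds = resp["Credentials"]

    # Use the temporary credentials; they expire automatically.
    s3 = boto3.client(
        "s3",
        aws_access_key_id=creds["AccessKeyId"],
        aws_secret_access_key=creds["SecretAccessKey"],
        aws_session_token=creds["SessionToken"],
    )
    print(s3.list_buckets()["Buckets"])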

Question 15

A company needs to run a critical data processing workload that uses a Python script every night. The workload takes 1 hour to finish.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.  

Deploy an Amazon Elastic Container Service (Amazon ECS) cluster with the AWS Fargate launch type. Use the Fargate Spot capacity provider. Schedule the job to run once every night.

B.  

Deploy an Amazon Elastic Container Service (Amazon ECS) cluster with the Amazon EC2 launch type. Schedule the job to run once every night.

C.  

Create an AWS Lambda function that uses the existing Python code. Configure Amazon EventBridge to invoke the function once every night.

D.  

Create an Amazon EC2 On-Demand Instance that runs Amazon Linux. Migrate the Python script to the instance. Use a cron job to schedule the script. Create an AWS Lambda function to start and stop the instance once every night.

Question 16

A company is migrating a new application from an on-premises data center to a new VPC in the AWS Cloud. The company has multiple AWS accounts and VPCs that share many subnets and applications. The company wants to have fine-grained access control for the new application. The company wants to ensure that all network resources across accounts and VPCs that are granted permission to access the new application can access the application.

Which solution will meet these requirements?

Options:

A.  

Set up a VPC peering connection for each VPC that needs access to the new application VPC. Update route tables in each VPC to enable connectivity.

B.  

Deploy a transit gateway in the account that hosts the new application. Share the transit gateway with each account that needs to connect to the application. Update route tables in the VPC that hosts the new application and in the transit gateway to enable connectivity.

C.  

Use an AWS PrivateLink endpoint service to make the new application accessible to other VPCs. Control access to the application by using an endpoint policy.

D.  

Use an Application Load Balancer (ALB) to expose the new application to the internet. Configure authentication and authorization processes to ensure that only specified VPCs can access the application.

Question 17

A company wants to restrict access to the content of its web application. The company needs to protect the content by using authorization techniques that are available on AWS. The company also wants to implement a serverless architecture for authorization and authentication that has low login latency.

The solution must integrate with the web application and serve web content globally. The application currently has a small user base, but the company expects the application's user base to increase.

Which solution will meet these requirements?

Options:

A.  

Configure Amazon Cognito for authentication. Implement Lambda@Edge for authorization. Configure Amazon CloudFront to serve the web application globally.

B.  

Configure AWS Directory Service for Microsoft Active Directory for authentication. Implement AWS Lambda for authorization. Use an Application Load Balancer to serve the web application globally.

C.  

Configure Amazon Cognito for authentication. Implement AWS Lambda for authorization. Use Amazon S3 Transfer Acceleration to serve the web application globally.

D.  

Configure AWS Directory Service for Microsoft Active Directory for authentication. Implement Lambda@Edge for authorization. Use AWS Elastic Beanstalk to serve the web application globally.

Question 18

A company hosts a multi-tier inventory reporting application on AWS. The company needs a cost-effective solution to generate inventory reports on demand. Admin users need to have the ability to generate new reports. Reports take approximately 5-10 minutes to finish. The application must send reports to the email address of the admin user who generates each report.

Which solution will meet these requirements?

Options:

A.  

Use Amazon Elastic Container Service (Amazon ECS) to host the report generation code. Use an Amazon API Gateway HTTP API to invoke the code. Use Amazon Simple Email Service (Amazon SES) to send the reports to admin users.

B.  

Use Amazon EventBridge to invoke a scheduled AWS Lambda function to generate the reports. Use Amazon Simple Notification Service (Amazon SNS) to send the reports to admin users.

C.  

Use Amazon Elastic Kubernetes Service (Amazon EKS) to host the report generation code. Use an Amazon API Gateway REST API to invoke the code. Use Amazon Simple Notification Service (Amazon SNS) to send the reports to admin users.

D.  

Create an AWS Lambda function to generate the reports. Use a function URL to invoke the function. Use Amazon Simple Email Service (Amazon SES) to send the reports to admin users.

Question 19

A company stores petabytes of historical medical information on premises. The company has a process to manage encryption of the data to comply with regulations. The company needs a cloud-based solution for data backup, recovery, and archiving. The company must retain control over the encryption key material. Which combination of solutions will meet these requirements? (Select TWO.)

Options:

A.  

Create an AWS Key Management Service (AWS KMS) key without key material. Import the company's key material into the KMS key.

B.  

Create an AWS Key Management Service (AWS KMS) encryption key that contains key material generated by AWS KMS.

C.  

Store the data in Amazon S3 Standard-Infrequent Access (S3 Standard-IA) storage. Use S3 Bucket Keys with AWS Key Management Service (AWS KMS) keys.

D.  

Store the data in an Amazon S3 Glacier storage class. Use server-side encryption with customer-provided keys (SSE-C).

E.  

Store the data in AWS Snowball devices. Use server-side encryption with AWS KMS keys (SSE-KMS).
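
Option A's import flow can be sketched with boto3 as below. The wrapping of the company's key material with the returned public key is omitted, and the placeholder must be replaced with real wrapped material:

    import boto3

    kms = boto3.client("kms")

    # Create a KMS key with no key material; the company supplies its own.
    key = kms.create_key(Origin="EXTERNAL", Description="Imported key material")
    key_id = key["KeyMetadata"]["KeyId"]

    # Fetch the public wrapping key and import token that protect the
    # key material in transit.
    params = kms.get_parameters_for_import(
        KeyId=key_id,
        WrappingAlgorithm="RSAES_OAEP_SHA_256",
        WrappingKeySpec="RSA_2048",
    )

    # encrypted_material must be the company's key material wrapped with
    # params["PublicKey"] (for example by using OpenSSL); omitted here.
    encrypted_material = b"..."
    kms.import_key_material(
        KeyId=key_id,
        ImportToken=params["ImportToken"],
        EncryptedKeyMaterial=encrypted_material,
        ExpirationModel="KEY_MATERIAL_DOES_NOT_EXPIRE",
    )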

Question 20

A company runs its databases on Amazon RDS for PostgreSQL. The company wants a secure solution to manage the master user password by rotating the password every 30 days. Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Use Amazon EventBridge to schedule a custom AWS Lambda function to rotate the password every 30 days.

B.  

Use the modify-db-instance command in the AWS CLI to change the password.

C.  

Integrate AWS Secrets Manager with Amazon RDS for PostgreSQL to automate password rotation.

D.  

Integrate AWS Systems Manager Parameter Store with Amazon RDS for PostgreSQL to automate password rotation.
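
For option C, a sketch of handing master-password management to RDS and Secrets Manager. The exact way a 30-day schedule is applied to an RDS-managed secret is an assumption to verify; identifiers are hypothetical:

    import boto3

    rds = boto3.client("rds")
    secrets = boto3.client("secretsmanager")

    # Hand master-password management to RDS, which stores the password
    # in AWS Secrets Manager and rotates it automatically.
    resp = rds.modify_db_instance(
        DBInstanceIdentifier="orders-postgres",  # hypothetical
        ManageMasterUserPassword=True,
        ApplyImmediately=True,
    )
    secret_arn = resp["DBInstance"]["MasterUserSecret"]["SecretArn"]

    # Adjust the rotation schedule to every 30 days (assumed to apply to
    # RDS-managed secrets; verify for your engine and configuration).
    secrets.rotate_secret(
        SecretId=secret_arn,
        RotationRules={"AutomaticallyAfterDays": 30},
    )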

Question 21

A company stores a large volume of critical data in Amazon RDS for PostgreSQL tables. The company is developing several new features for an upcoming product launch. Some of the new features require many table alterations.

The company needs a solution to test the altered tables for several days. After testing, the solution must make the new features available to customers in production.

Which solution will meet these requirements with the HIGHEST availability?

Options:

A.  

Create a new instance of the database in RDS for PostgreSQL to test the new features. When the testing is finished, take a backup of the test database, and restore the test database to the production database.

B.  

Create new database tables in the production database to test the new features. When the testing is finished, copy the data from the older tables to the new tables. Delete the older tables, and rename the new tables accordingly.

C.  

Create an Amazon RDS read replica to deploy a new instance of the database. Make updates to the database tables in the replica instance. When the testing is finished, promote the replica instance to become the new production instance.

D.  

Use an Amazon RDS blue/green deployment to deploy a new test instance of the database. Make database table updates in the test instance. When the testing is finished, promote the test instance to become the new production instance.

Question 22

A company is launching a new application that will be hosted on Amazon EC2 instances. A solutions architect needs to design a solution that does not allow public IPv4 access that originates from the internet. However, the solution must allow the EC2 instances to make outbound IPv4 internet requests.

Which solution will meet these requirements?

Options:

A.  

Deploy a NAT gateway in public subnets in both Availability Zones. Create and configure one route table for each private subnet.

B.  

Deploy an internet gateway in public subnets in both Availability Zones. Create and configure a shared route table for the private subnets.

C.  

Deploy a NAT gateway in public subnets in both Availability Zones. Create and configure a shared route table for the private subnets.

D.  

Deploy an egress-only internet gateway in public subnets in both Availability Zones. Create and configure one route table for each private subnet.

Question 23

A company runs an AWS Lambda function in private subnets in a VPC. The subnets have a default route to the internet through an Amazon EC2 NAT instance. The Lambda function processes input data and saves its output as an object to Amazon S3.

Intermittently, the Lambda function times out while trying to upload the object because of saturated traffic on the NAT instance's network. The company wants to access Amazon S3 without traversing the internet.

Which solution will meet these requirements?

Options:

A.  

Replace the EC2 NAT instance with an AWS managed NAT gateway.

B.  

Increase the size of the EC2 NAT instance in the VPC to a network optimized instance type.

C.

Provision a gateway endpoint for Amazon S3 in the VPC. Update the route tables of the subnets accordingly.

D.  

Provision a transit gateway. Place transit gateway attachments in the private subnets where the Lambda function is running.
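
Option C in sketch form, with hypothetical VPC and route table IDs:

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # A gateway endpoint adds an S3 route to the chosen route tables, so
    # traffic to S3 bypasses the NAT instance and stays on the AWS network.
    ec2.create_vpc_endpoint(
        VpcEndpointType="Gateway",
        VpcId="vpc-0abc1234example",              # hypothetical
        ServiceName="com.amazonaws.us-east-1.s3",
        RouteTableIds=["rtb-0aaa111example", "rtb-0bbb222example"],
    )

Gateway endpoints for Amazon S3 carry no hourly or data processing charge, which is also why they relieve the NAT bottleneck at no extra cost.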

Question 24

A company hosts an Amazon EC2 instance in a private subnet in a new VPC. The VPC also has a public subnet that has the default route set to an internet gateway. The private subnet does not have outbound internet access.

The EC2 instance needs to have the ability to download monthly security updates from an outside vendor. However, the company must block any connections that are initiated from the internet.

Which solution will meet these requirements?

Options:

A.  

Configure the private subnet route table to use the internet gateway as the default route.

B.  

Create a NAT gateway in the public subnet. Configure the private subnet route table to use the NAT gateway as the default route.

C.  

Create a NAT instance in the private subnet. Configure the private subnet route table to use the NAT instance as the default route.

D.  

Create a NAT instance in the private subnet. Configure the private subnet route table to use the internet gateway as the default route.

Question 25

A company is migrating an online marketplace application from a mainframe system to an Auto Scaling group of Amazon EC2 instances. The EC2 instances access an Amazon Aurora cluster. The application requires a scalable, persistent caching solution to store the results of in-progress transactions and SQL queries.

Which solution will meet these requirements?

Options:

A.  

Use an Amazon ElastiCache (Redis OSS) cluster to serve transaction and query results.

B.  

Use an Amazon CloudFront distribution with an Amazon S3 bucket as the origin to cache the transactions. Add an Amazon EC2 instance store volume to the EC2 instances for query result caching.

C.  

Use an Amazon ElastiCache (Memcached) cluster to serve transaction and query results.

D.  

Use an Amazon ElastiCache (Redis OSS) cluster to cache the transactions. Add an Amazon EC2 instance store volume to the EC2 instances for query result caching.

Question 26

An ecommerce company wants a disaster recovery solution for its Amazon RDS DB instances that run Microsoft SQL Server Enterprise Edition. The company's current recovery point objective (RPO) and recovery time objective (RTO) are 24 hours.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.  

Create a cross-Region read replica and promote the read replica to the primary instance.

B.  

Use AWS Database Migration Service (AWS DMS) to create RDS cross-Region replication.

C.  

Use cross-Region replication every 24 hours to copy native backups to an Amazon S3 bucket.

D.  

Copy automatic snapshots to another Region every 24 hours.

Question 27

A global company runs its workloads on AWS. The company's application uses Amazon S3 buckets across AWS Regions for sensitive data storage and analysis. The company stores millions of objects in multiple S3 buckets daily. The company wants to identify all S3 buckets that are not versioning-enabled.

Which solution will meet these requirements?

Options:

A.

Set up an AWS CloudTrail event that has a rule to identify all S3 buckets that are not versioning-enabled across Regions.

B.

Use Amazon S3 Storage Lens to identify all S3 buckets that are not versioning-enabled across Regions.

C.

Enable IAM Access Analyzer for S3 to identify all S3 buckets that are not versioning-enabled across Regions.

D.

Create an S3 Multi-Region Access Point to identify all S3 buckets that are not versioning-enabled across Regions.

Question 28

A company is migrating some workloads to AWS. However, many workloads will remain on premises. The on-premises workloads require secure and reliable connectivity to AWS with consistent, low-latency performance.

The company has deployed the AWS workloads across multiple AWS accounts and multiple VPCs. The company plans to scale to hundreds of VPCs within the next year.

The company must establish connectivity between each of the VPCs and from the on-premises environment to each VPC.

Which solution will meet these requirements?

Options:

A.  

Use an AWS Direct Connect connection to connect the on-premises environment to AWS. Configure VPC peering to establish connectivity between VPCs.

B.  

Use multiple AWS Site-to-Site VPN connections to connect the on-premises environment to AWS. Create a transit gateway to establish connectivity between VPCs.

C.  

Use an AWS Direct Connect connection with a Direct Connect gateway to connect the on-premises environment to AWS. Create a transit gateway to establish connectivity between VPCs. Associate the transit gateway with the Direct Connect gateway.

D.  

Use an AWS Site-to-Site VPN connection to connect the on-premises environment to AWS. Configure VPC peering to establish connectivity between VPCs.

Question 29

A company wants to implement a data lake in the AWS Cloud. The company must ensure that only specific teams have access to sensitive data in the data lake. The company must have row-level access control for the data lake.

Which solution will meet these requirements?

Options:

A.  

Use Amazon RDS to store the data. Use IAM roles and permissions for data governance and access control.

B.  

Use Amazon Redshift to store the data. Use IAM roles and permissions for data governance and access control.

C.  

Use Amazon S3 to store the data. Use AWS Lake Formation for data governance and access control.

D.  

Use AWS Glue Catalog to store the data. Use AWS Glue DataBrew for data governance and access control.

Question 30

A company has developed an API by using an Amazon API Gateway REST API and AWS Lambda functions. The API serves static content and dynamic content to users worldwide. The company wants to decrease the latency of transferring the content for API requests. Which solution will meet these requirements?

Options:

A.  

Deploy the REST API as an edge-optimized API endpoint. Enable caching. Enable content encoding in the API definition to compress the application data in transit.

B.  

Deploy the REST API as a Regional API endpoint. Enable caching. Enable content encoding in the API definition to compress the application data in transit.

C.  

Deploy the REST API as an edge-optimized API endpoint. Enable caching. Configure reserved concurrency for the Lambda functions.

D.  

Deploy the REST API as a Regional API endpoint. Enable caching. Configure reserved concurrency for the Lambda functions.

Question 31

A company has AWS Lambda functions that use environment variables. The company does not want its developers to see environment variables in plaintext.

Which solution will meet these requirements?

Options:

A.  

Deploy code to Amazon EC2 instances instead of using Lambda functions.

B.  

Configure SSL encryption on the Lambda functions to use AWS CloudHSM to store and encrypt the environment variables.

C.  

Create a certificate in AWS Certificate Manager (ACM). Configure the Lambda functions to use the certificate to encrypt the environment variables.

D.  

Create an AWS Key Management Service (AWS KMS) key. Enable encryption helpers on the Lambda functions to use the KMS key to store and encrypt the environment variables.

Question 32

A company uses Amazon S3 to store customer data that contains personally identifiable information (PII) attributes. The company needs to make the customer information available to company resources through an AWS Glue Catalog. The company needs to have fine-grained access control for the data so that only specific IAM roles can access the PII data.

Which solution will meet these requirements?

Options:

A.  

Create one IAM policy that grants access to PII. Create a second IAM policy that grants access to non-PII data. Assign the PII policy to the specified IAM roles.

B.  

Create one IAM role that grants access to PII. Create a second IAM role that grants access to non-PII data. Assign the PII policy to the specified IAM roles.

C.  

Use AWS Lake Formation to provide the specified IAM roles access to the PII data.

D.  

Use AWS Glue to create one view for PII data. Create a second view for non-PII data. Provide the specified IAM roles access to the PII view.

Question 33

A gaming company hosts a browser-based application on AWS. The users of the application consume a large number of videos and images that are stored in Amazon S3. This content is the same for all users.

The application has increased in popularity, and millions of users worldwide are accessing these media files. The company wants to provide the files to the users while reducing the load on the origin.

Which solution meets these requirements MOST cost-effectively?

Options:

A.  

Deploy an AWS Global Accelerator accelerator in front of the web servers.

B.  

Deploy an Amazon CloudFront web distribution in front of the S3 bucket.

C.  

Deploy an Amazon ElastiCache (Redis OSS) instance in front of the web servers.

D.  

Deploy an Amazon ElastiCache (Memcached) instance in front of the web servers.

Question 34

A company is deploying a new gaming application on Amazon EC2 instances. The gaming application needs to have access to shared storage.

The company requires a high-performance solution to give the application the ability to use an existing custom protocol to access shared storage. The solution must ensure low latency and must be operationally efficient.

Which solution will meet these requirements?

Options:

A.  

Create an Amazon FSx File Gateway. Create a file share that uses the existing custom protocol. Connect the EC2 instances that host the application to the file share.

B.  

Create an Amazon EC2 Windows instance. Install and configure a Windows file share role on the instance. Connect the EC2 instances that host the application to the file share.

C.  

Create an Amazon Elastic File System (Amazon EFS) file system. Configure the file system to support Lustre. Connect the EC2 instances that host the application to the file system.

D.  

Create an Amazon FSx for Lustre file system. Connect the EC2 instances that host the application to the file system.

Question 35

A financial services company has a two-tier consumer banking application. The frontend serves static web content. The backend consists of APIs. The company needs to migrate the frontend component to AWS. The backend of the application will remain on premises. The company must protect the application from common web vulnerabilities and attacks.

Which solution will meet these requirements?

Options:

A.  

Migrate the frontend to Amazon EC2 instances. Deploy an Application Load Balancer (ALB) in front of the instances. Use the instances to invoke the on-premises APIs. Associate AWS WAF rules with the instances.

B.  

Deploy the frontend as an Amazon CloudFront distribution that has multiple origins. Configure one origin to be an Amazon S3 bucket that serves the static web content. Configure a second origin to route traffic to the on-premises APIs based on the URL pattern. Associate AWS WAF rules with the distribution.

C.  

Migrate the frontend to Amazon EC2 instances. Deploy a Network Load Balancer (NLB) in front of the instances. Use the instances to invoke the on-premises APIs. Create an AWS Network Firewall instance. Route all traffic through the Network Firewall instance.

D.  

Deploy the frontend as a static website based on an Amazon S3 bucket. Use an Amazon API Gateway REST API and a set of Amazon EC2 instances to invoke the on-premises APIs. Associate AWS WAF rules with the REST API and the S3 bucket.

Question 36

A company has set up hybrid connectivity between an on-premises data center and AWS by using AWS Site-to-Site VPN. The company is migrating a workload to AWS.

The company sets up a VPC that has two public subnets and two private subnets. The company wants to monitor the total packet loss and round-trip-time (RTT) between the data center and AWS.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Use Amazon CloudWatch Network Monitor to set up Internet Control Message Protocol (ICMP) probe monitoring from each subnet to the on-premises destination.

B.  

Create an Amazon EC2 instance in each subnet. Create a scheduled job to send Internet Control Message Protocol (ICMP) packets to the on-premises destination.

C.  

Create an AWS Lambda function in each subnet. Write a script to perform Internet Control Message Protocol (ICMP) connectivity checks.

D.  

Create an AWS Batch job in each subnet. Write a script to perform Internet Control Message Protocol (ICMP) connectivity checks.

Question 37

A company has a three-tier web application that processes orders from customers. The web tier consists of Amazon EC2 instances behind an Application Load Balancer. The processing tier consists of EC2 instances. The company decoupled the web tier and processing tier by using Amazon Simple Queue Service (Amazon SQS). The storage layer uses Amazon DynamoDB.

At peak times, some users report order processing delays and halts. The company has noticed that during these delays, the EC2 instances are running at 100% CPU usage, and the SQS queue fills up. The peak times are variable and unpredictable.

The company needs to improve the performance of the application.

Which solution will meet these requirements?

Options:

A.  

Use scheduled scaling for Amazon EC2 Auto Scaling to scale out the processing tier instances for the duration of peak usage times. Use the CPU Utilization metric to determine when to scale.

B.  

Use Amazon ElastiCache for Redis in front of the DynamoDB backend tier. Use target utilization as a metric to determine when to scale.

C.  

Add an Amazon CloudFront distribution to cache the responses for the web tier. Use HTTP latency as a metric to determine when to scale.

D.  

Use an Amazon EC2 Auto Scaling target tracking policy to scale out the processing tier instances. Use the ApproximateNumberOfMessages attribute to determine when to scale.
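
Option D might be implemented roughly as below. Note that AWS guidance often recommends a computed backlog-per-instance metric rather than raw queue depth; this simplified sketch tracks the queue-depth metric directly, and all names are hypothetical:

    import boto3

    autoscaling = boto3.client("autoscaling")

    # Scale the processing tier so the queue depth stays near the target.
    autoscaling.put_scaling_policy(
        AutoScalingGroupName="processing-tier-asg",  # hypothetical
        PolicyName="sqs-queue-depth-target",
        PolicyType="TargetTrackingScaling",
        TargetTrackingConfiguration={
            "CustomizedMetricSpecification": {
                "Namespace": "AWS/SQS",
                "MetricName": "ApproximateNumberOfMessagesVisible",
                "Dimensions": [{"Name": "QueueName", "Value": "orders-queue"}],
                "Statistic": "Average",
            },
            "TargetValue": 100.0,
        },
    )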

Question 38

An ecommerce company is migrating its on-premises workload to the AWS Cloud. The workload currently consists of a web application and a backend Microsoft SQL database for storage.

The company expects a high volume of customers during a promotional event. The new infrastructure in the AWS Cloud must be highly available and scalable.

Which solution will meet these requirements with the LEAST administrative overhead?

Options:

A.  

Migrate the web application to two Amazon EC2 instances across two Availability Zones behind an Application Load Balancer. Migrate the database to Amazon RDS for Microsoft SQL Server with read replicas in both Availability Zones.

B.  

Migrate the web application to an Amazon EC2 instance that runs in an Auto Scaling group across two Availability Zones behind an Application Load Balancer. Migrate the database to two EC2 instances across separate AWS Regions with database replication.

C.  

Migrate the web application to Amazon EC2 instances that run in an Auto Scaling group across two Availability Zones behind an Application Load Balancer. Migrate the database to Amazon RDS with Multi-AZ deployment.

D.  

Migrate the web application to three Amazon EC2 instances across three Availability Zones behind an Application Load Balancer. Migrate the database to three EC2 instances across three Availability Zones.

Question 39

A company needs a secure connection between its on-premises environment and AWS. This connection does not need high bandwidth and will handle a small amount of traffic. The connection should be set up quickly.

What is the MOST cost-effective method to establish this type of connection?

Options:

A.  

Implement a client VPN.

B.  

Implement AWS Direct Connect.

C.  

Implement a bastion host on Amazon EC2.

D.  

Implement an AWS Site-to-Site VPN connection.

Question 40

A company is migrating applications from an on-premises Microsoft Active Directory that the company manages to AWS. The company deploys the applications in multiple AWS accounts. The company uses AWS Organizations to manage the accounts centrally.

The company's security team needs a single sign-on solution across all the company's AWS accounts. The company must continue to manage users and groups that are in the on-premises Active Directory.

Which solution will meet these requirements?

Options:

A.  

Create an Enterprise Edition Active Directory in AWS Directory Service for Microsoft Active Directory. Configure the Active Directory to be the identity source for AWS IAM Identity Center.

B.  

Enable AWS IAM Identity Center. Configure a two-way forest trust relationship to connect the company's self-managed Active Directory with IAM Identity Center by using AWS Directory Service for Microsoft Active Directory.

C.  

Use AWS Directory Service and create a two-way trust relationship with the company's self-managed Active Directory.

D.  

Deploy an identity provider (IdP) on Amazon EC2. Link the IdP as an identity source within AWS IAM Identity Center.

Question 41

A company is enhancing the security of its AWS environment, where the company stores a significant amount of sensitive customer data. The company needs a solution that automatically identifies and classifies sensitive data that is stored in multiple Amazon S3 buckets. The solution must automatically respond to data breaches and alert the company's security team through email immediately when noncompliant data is found.

Which solution will meet these requirements?

Options:

A.  

Use Amazon GuardDuty. Configure an AWS Lambda function to route alerts to an Amazon Simple Notification Service (Amazon SNS) topic. Subscribe the security team to the SNS topic.

B.  

Use Amazon GuardDuty. Configure an AWS Lambda function to route alerts to an Amazon Simple Queue Service (Amazon SQS) queue. Configure a second Lambda function to periodically poll the SQS queue and to send emails to the security team by using Amazon Simple Email Service (Amazon SES).

C.  

Use Amazon Macie. Integrate Amazon EventBridge with Macie, and configure EventBridge to send alerts to an Amazon Simple Notification Service (Amazon SNS) topic. Subscribe the security team to the SNS topic.

D.  

Use Amazon Macie. Integrate Amazon EventBridge with Macie, and configure EventBridge to route alerts to an Amazon Simple Queue Service (Amazon SQS) queue. Configure an AWS Lambda function to periodically poll the SQS queue and to send alerts to the security team by using Amazon Simple Email Service (Amazon SES).
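
Option C wired together with boto3; the topic ARN, email address, and the Macie detail-type string are assumptions to verify:

    import boto3, json

    events = boto3.client("events")
    sns = boto3.client("sns")

    topic_arn = "arn:aws:sns:us-east-1:111122223333:security-alerts"  # hypothetical

    # Match Macie sensitive-data findings as they are published to EventBridge.
    events.put_rule(
        Name="macie-findings-to-sns",
        EventPattern=json.dumps({
            "source": ["aws.macie"],
            "detail-type": ["Macie Finding"],  # assumed detail-type; verify
        }),
    )
    events.put_targets(
        Rule="macie-findings-to-sns",
        Targets=[{"Id": "sns-alerts", "Arn": topic_arn}],
    )

    # Email the security team whenever the rule fires.
    sns.subscribe(TopicArn=topic_arn, Protocol="email", Endpoint="security@example.com")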

Question 42

A developer needs to export the contents of several Amazon DynamoDB tables into Amazon S3 buckets to comply with company data regulations. The developer uses the AWS CLI to run commands to export from each table to the proper S3 bucket. The developer sets up AWS credentials correctly and grants resources appropriate permissions. However, the exports of some tables fail.

What should the developer do to resolve this issue?

Options:

A.  

Ensure that point-in-time recovery is enabled on the DynamoDB tables.

B.  

Ensure that the target S3 bucket is in the same AWS Region as the DynamoDB table.

C.  

Ensure that DynamoDB streaming is enabled for the tables.

D.  

Ensure that DynamoDB Accelerator (DAX) is enabled.

Question 43

An online education platform experiences lag and buffering during peak usage hours, when thousands of students access video lessons concurrently. A solutions architect needs to improve the performance of the education platform.

The platform needs to handle unpredictable traffic surges without losing responsiveness. The platform must provide smooth video playback performance at all times. The platform must create multiple copies of each video lesson and store the copies in various bitrates to serve users who have different internet speeds. The smallest video size is 7 GB.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.  

Use Amazon ElastiCache to cache videos in all the required bitrates. Use AWS Lambda functions to process the videos and to convert the videos to the required bitrates.

B.  

Create an Auto Scaling group that includes Amazon EC2 instances that are sized to meet peak loads. Use the Auto Scaling group to serve videos. Use the Auto Scaling group to convert the videos to the required bitrates.

C.  

Store a copy of every video in every required bitrate in an Amazon S3 bucket. Use a single Amazon EC2 instance to serve the videos.

D.  

Use Amazon Kinesis Video Streams to store and serve the videos. Use AWS Lambda functions to process the videos and to convert the videos to the required bitrates.

Question 44

A company has an ecommerce application that users access through multiple mobile apps and web applications. The company needs a solution that will receive requests from the mobile apps and web applications through an API.

Request traffic volume varies significantly throughout each day. Traffic spikes during sales events. The solution must be loosely coupled and ensure that no requests are lost.

Which solution will meet these requirements?

Options:

A.  

Create an Application Load Balancer (ALB). Create an AWS Elastic Beanstalk endpoint to process the requests. Add the Elastic Beanstalk endpoint to the target group of the ALB.

B.  

Set up an Amazon API Gateway REST API with an integration to an Amazon Simple Queue Service (Amazon SQS) queue. Configure a dead-letter queue. Create an AWS Lambda function to poll the queue to process the requests.

C.  

Create an Application Load Balancer (ALB). Create an AWS Lambda function to process the requests. Add the Lambda function as a target of the ALB.

D.  

Set up an Amazon API Gateway HTTP API with an integration to an Amazon Simple Notification Service (Amazon SNS) topic. Create an AWS Lambda function to process the requests. Subscribe the function to the SNS topic to process the requests.

Question 45

A company runs an application on EC2 instances that need access to RDS credentials stored in AWS Secrets Manager.

Which solution meets this requirement?

Options:

A.  

Create an IAM role, and attach the role to each EC2 instance profile. Use an identity-based policy to grant the role access to the secret.

B.  

Create an IAM user, and attach the user to each EC2 instance profile. Use a resource-based policy to grant the user access to the secret.

C.  

Create a resource-based policy for the secret. Use EC2 Instance Connect to access the secret.

D.  

Create an identity-based policy for the secret. Grant direct access to the EC2 instances.
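
Option A in sketch form: an identity-based policy on the instance role, then retrieval from the instance, where boto3 picks up the role credentials from the instance profile automatically. Names and ARNs are hypothetical:

    import boto3, json

    iam = boto3.client("iam")

    # Identity-based policy on the instance role: allow reading one secret.
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": "secretsmanager:GetSecretValue",
            "Resource": "arn:aws:secretsmanager:us-east-1:111122223333:secret:rds-credentials-*",  # hypothetical
        }],
    }
    iam.put_role_policy(
        RoleName="AppInstanceRole",  # role attached via the instance profile
        PolicyName="read-rds-secret",
        PolicyDocument=json.dumps(policy),
    )

    # On the instance, boto3 uses the role's credentials automatically;
    # no keys are stored on the host.
    secret = boto3.client("secretsmanager").get_secret_value(SecretId="rds-credentials")
    db_config = json.loads(secret["SecretString"])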

Question 46

An e-commerce company has an application that uses Amazon DynamoDB tables configured with provisioned capacity. Order data is stored in a table named Orders. The Orders table has a primary key of order-ID and a sort key of product-ID. The company configured an AWS Lambda function to receive DynamoDB streams from the Orders table and update a table named Inventory. The company has noticed that during peak sales periods, updates to the Inventory table take longer than the company can tolerate.

Which solutions will resolve the slow table updates? (Select TWO.)

Options:

A.  

Add a global secondary index to the Orders table. Include the product-ID attribute.

B.  

Set the batch size attribute of the DynamoDB streams to be based on the size of items in the Orders table.

C.  

Increase the DynamoDB table provisioned capacity by 1,000 write capacity units (WCUs).

D.  

Increase the DynamoDB table provisioned capacity by 1,000 read capacity units (RCUs).

E.  

Increase the timeout of the Lambda function to 15 minutes.

Question 47

A company needs to ingest and analyze telemetry data from vehicles at scale for machine learning and reporting.

Which solution will meet these requirements?

Options:

A.  

Use Amazon Timestream for LiveAnalytics to store data points. Grant Amazon SageMaker permission to access the data. Use Amazon QuickSight to visualize the data.

B.  

Use Amazon DynamoDB to store data points. Use DynamoDB Connector to ingest data into Amazon EMR for processing. Use Amazon QuickSight to visualize the data.

C.  

Use Amazon Neptune to store data points. Use Amazon Kinesis Data Streams to ingest data into a Lambda function for processing. Use Amazon QuickSight to visualize the data.

D.  

Use Amazon Timestream for LiveAnalytics to store data points. Grant Amazon SageMaker permission to access the data. Use Amazon Athena to visualize the data.

Question 48

A company uses Apache Hadoop and Apache Spark on premises. The infrastructure is complex and does not scale well. The company wants to reduce operational complexity but must keep data processing on premises.

Which solution will meet these requirements?

Options:

A.  

Use an AWS Site-to-Site VPN connection to access the on-premises HDFS. Use Amazon EMR to process the data.

B.  

Use AWS DataSync to connect to the on-premises HDFS. Use Amazon EMR to process the data.

C.  

Migrate to Amazon EMR on AWS Outposts.

D.  

Use AWS Snowball to migrate the data to Amazon S3. Use Amazon EMR to process the data.

Question 49

A manufacturing company runs an order processing application in its VPC. The company wants to securely send messages from the application to an external Salesforce system that uses Open Authorization (OAuth).

A solutions architect needs to integrate the company's order processing application with the external Salesforce system.

Which solution will meet these requirements?

Options:

A.  

Create an Amazon Simple Notification Service (Amazon SNS) topic in a fanout configuration that pushes data to an HTTPS endpoint. Configure the order processing application to publish messages to the SNS topic.

B.  

Create an Amazon Simple Notification Service (Amazon SNS) topic in a fanout configuration that pushes data to an Amazon Data Firehose delivery stream that has an HTTP destination. Configure the order processing application to publish messages to the SNS topic.

C.  

Create an Amazon EventBridge rule and configure an Amazon EventBridge API destination partner. Configure the order processing application to publish messages to Amazon EventBridge.

D.  

Create an Amazon Managed Streaming for Apache Kafka (Amazon MSK) topic that has an outbound MSK Connect connector. Configure the order processing application to publish messages to the MSK topic.

Question 50

A company stores 5 PB of archived data on physical tapes. The company needs to preserve the data for another 10 years. The data center that stores the tapes has a 10 Gbps Direct Connect connection to an AWS Region. The company wants to migrate the data to AWS within the next 6 months.

Which solution will meet these requirements?

Options:

A.  

Read the data from the tapes on premises. Use local storage to stage the data. Use AWS DataSync to migrate the data to Amazon S3 Glacier Flexible Retrieval storage.

B.  

Use an on-premises backup application to read the data from the tapes. Use the backup application to write directly to Amazon S3 Glacier Deep Archive storage.

C.  

Order multiple AWS Snowball Edge devices. Copy the physical tapes to virtual tapes on the Snowball Edge devices. Ship the Snowball Edge devices to AWS. Create an S3 Lifecycle policy to move the tapes to Amazon S3 Glacier Instant Retrieval storage.

D.  

Configure an on-premises AWS Storage Gateway Tape Gateway. Create virtual tapes in the AWS Cloud. Use backup software to copy the physical tapes to the virtual tapes. Move the virtual tapes to Amazon S3 Glacier Deep Archive storage.

Question 51

A company is preparing to store confidential data in Amazon S3. For compliance reasons, the data must be encrypted at rest. Encryption key usage must be logged for auditing purposes. Keys must be rotated every year.

Which solution meets these requirements and is the MOST operationally efficient?

Options:

A.  

Server-side encryption with customer-provided keys (SSE-C)

B.  

Server-side encryption with Amazon S3 managed keys (SSE-S3)

C.  

Server-side encryption with AWS KMS keys (SSE-KMS) with manual rotation

D.  

Server-side encryption with AWS KMS keys (SSE-KMS) with automatic rotation
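
Option D sketched with boto3: a customer managed KMS key with automatic rotation enabled, set as the bucket's default encryption. Key usage is logged to AWS CloudTrail; the bucket name is hypothetical:

    import boto3

    kms = boto3.client("kms")
    s3 = boto3.client("s3")

    # Customer managed KMS key; every use is logged in AWS CloudTrail.
    key = kms.create_key(Description="confidential-data key")
    key_id = key["KeyMetadata"]["KeyId"]

    # Automatic rotation generates new key material on a yearly cycle.
    kms.enable_key_rotation(KeyId=key_id)

    # Default bucket encryption: SSE-KMS with the rotated key.
    s3.put_bucket_encryption(
        Bucket="example-confidential-data",  # hypothetical
        ServerSideEncryptionConfiguration={"Rules": [{
            "ApplyServerSideEncryptionByDefault": {
                "SSEAlgorithm": "aws:kms",
                "KMSMasterKeyID": key_id,
            },
            "BucketKeyEnabled": True,
        }]},
    )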

Question 52

A company hosts its application on several Amazon EC2 instances inside a VPC. The company creates a dedicated Amazon S3 bucket for each customer to store their relevant information in Amazon S3.

The company wants to ensure that the application running on EC2 instances can securely access only the S3 buckets that belong to the company's AWS account.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Create a gateway endpoint for Amazon S3 that is attached to the VPC. Update the IAM instance profile policy to provide access to only the specific buckets that the application needs.

B.  

Create a NAT gateway in a public subnet with a security group that allows access to only Amazon S3. Update the route tables to use the NAT gateway.

C.  

Create a gateway endpoint for Amazon S3 that is attached to the VPC. Update the IAM instance profile policy with a Deny action and the following condition key:

D.  

Create a NAT gateway in a public subnet. Update the route tables to use the NAT gateway. Assign bucket policies for all buckets with a Deny action and the following condition key:

Question 53

A company wants to share data that is collected from self-driving cars with the automobile community. The data will be made available from within an Amazon S3 bucket. The company wants to minimize its cost of making this data available to other AWS accounts.

What should a solutions architect do to accomplish this goal?

Options:

A.  

Create an S3 VPC endpoint for the bucket.

B.  

Configure the S3 bucket to be a Requester Pays bucket.

C.  

Create an Amazon CloudFront distribution in front of the S3 bucket.

D.  

Require that the files be accessible only with the use of the BitTorrent protocol.

Question 54

A company has customers located across the world. The company wants to use automation to secure its systems and network infrastructure. The company's security team must be able to track and audit all incremental changes to the infrastructure.

Which solution will meet these requirements?

Options:

A.  

Use AWS Organizations to set up the infrastructure. Use AWS Config to track changes.

B.  

Use AWS CloudFormation to set up the infrastructure. Use AWS Config to track changes.

C.  

Use AWS Organizations to set up the infrastructure. Use AWS Service Catalog to track changes.

D.  

Use AWS CloudFormation to set up the infrastructure. Use AWS Service Catalog to track changes.

Question 55

A company is planning to migrate customer records to an Amazon S3 bucket. The company needs to ensure that customer records are protected against unauthorized access and are encrypted in transit and at rest. The company must monitor all access to the S3 bucket.

Which solution will meet these requirements?

Options:

A.  

Use AWS Key Management Service (AWS KMS) to encrypt customer records at rest. Create an S3 bucket policy that includes the aws:SecureTransport condition. Use an IAM policy to control access to the records. Use AWS CloudTrail to monitor access to the records.

B.  

Use AWS Nitro Enclaves to encrypt customer records at rest. Use AWS Key Management Service (AWS KMS) to encrypt the records in transit. Use an IAM policy to control access to the records. Use AWS CloudTrail and AWS Security Hub to monitor access to the records.

C.  

Use AWS Key Management Service (AWS KMS) to encrypt customer records at rest. Create an Amazon Cognito user pool to control access to the records. Use AWS CloudTrail to monitor access to the records. Use Amazon GuardDuty to detect threats.

D.  

Use server-side encryption with Amazon S3 managed keys (SSE-S3) with default settings to encrypt the records at rest. Access the records by using an Amazon CloudFront distribution that uses the S3 bucket as the origin. Use IAM roles to control access to the records. Use Amazon CloudWatch to monitor access to the records.
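
A minimal sketch of the aws:SecureTransport control from option A: a bucket policy statement that denies any request made without TLS. The bucket name is a hypothetical placeholder.

    import boto3
    import json

    bucket = "example-customer-records"  # hypothetical bucket name

    # Deny every request that is not made over TLS, enforcing
    # encryption in transit at the bucket level.
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "DenyInsecureTransport",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                f"arn:aws:s3:::{bucket}",
                f"arn:aws:s3:::{bucket}/*",
            ],
            "Condition": {"Bool": {"aws:SecureTransport": "false"}},
        }],
    }

    boto3.client("s3").put_bucket_policy(Bucket=bucket, Policy=json.dumps(policy))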

Discussion 0
Questions 56

A company wants to isolate its workloads by creating an AWS account for each workload. The company needs a solution that centrally manages networking components for the workloads. The solution also must create accounts with automatic security controls (guardrails).

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Use AWS Control Tower to deploy accounts. Create a networking account that has a VPC with private subnets and public subnets. Use AWS Resource Access Manager (AWS RAM) to share the subnets with the workload accounts.

B.  

Use AWS Organizations to deploy accounts. Create a networking account that has a VPC with private subnets and public subnets. Use AWS Resource Access Manager (AWS RAM) to share the subnets with the workload accounts.

C.  

Use AWS Control Tower to deploy accounts. Deploy a VPC in each workload account. Configure each VPC to route through an inspection VPC by using a transit gateway attachment.

D.  

Use AWS Organizations to deploy accounts. Deploy a VPC in each workload account. Configure each VPC to route through an inspection VPC by using a transit gateway attachment.

Discussion 0
Questions 57

A company is building an ecommerce application that uses a relational database to store customer data and order history. The company also needs a solution to store 100 GB of product images. The company expects the traffic flow for the application to be predictable. Which solution will meet these requirements MOST cost-effectively?

Options:

A.  

Use Amazon RDS for MySQL for the database. Store the product images in an Amazon S3 bucket.

B.  

Use Amazon DynamoDB for the database. Store the product images in an Amazon S3 bucket.

C.  

Use Amazon RDS for MySQL for the database. Store the product images in an Amazon Aurora MySQL database.

D.  

Create three Amazon EC2 instances. Install MongoDB software on the instances to use as the database. Store the product images in an Amazon RDS for MySQL database with a Multi-AZ deployment.

Discussion 0
Questions 58

A financial services company plans to launch a new application on AWS to handle sensitive financial transactions. The company will deploy the application on Amazon EC2 instances. The company will use Amazon RDS for MySQL as the database. The company's security policies mandate that data must be encrypted at rest and in transit.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Configure encryption at rest for Amazon RDS for MySQL by using AWS KMS managed keys. Configure AWS Certificate Manager (ACM) SSL/TLS certificates for encryption in transit.

B.  

Configure encryption at rest for Amazon RDS for MySQL by using AWS KMS managed keys. Configure IPsec tunnels for encryption in transit.

C.  

Implement third-party application-level data encryption before storing data in Amazon RDS for MySQL. Configure AWS Certificate Manager (ACM) SSL/TLS certificates for encryption in transit.

D.  

Configure encryption at rest for Amazon RDS for MySQL by using AWS KMS managed keys. Configure a VPN connection to enable private connectivity to encrypt data in transit.

Discussion 0
Questions 59

A company generates approximately 20 GB of data multiple times each day. The company uses AWS DataSync to copy all data from on-premises storage to Amazon S3 every 6 hours for further processing. The analytics team wants to modify the copy process to copy only data relevant to the analytics team and ignore the rest of the data. The team wants to copy data as soon as possible and receive a notification when the copy process is finished.

Which combination of steps will meet these requirements MOST cost-effectively? (Select THREE.)

Options:

A.  

Modify the data generation process on-premises to create a manifest file at the end of the copy process with the names of the objects to be copied to Amazon S3. Create a custom script to upload the manifest file to an S3 bucket.

B.  

Modify the data generation process on-premises to create a manifest file at the end of the copy process with the names of the objects to be copied to Amazon S3. Create an AWS Lambda function to load the manifest file data into an Amazon DynamoDB table.

C.  

Create an AWS Lambda function that Amazon EventBridge invokes when the manifest file is loaded into Amazon DynamoDB. Configure the Lambda function to copy the data from on-premises storage to the S3 bucket that uses the manifest file.

D.  

Create an AWS Lambda function that an S3 Event Notification invokes when the manifest file is uploaded. Configure the Lambda function to invoke the DataSync task by calling the StartTaskExecution API action with a manifest.

E.  

Create an Amazon Simple Notification Service (Amazon SNS) topic. Create an Amazon EventBridge rule to send an email notification to the SNS topic when the DataSync task execution status changes to SUCCESS or to ERROR.

F.  

Create an Amazon Simple Notification Service (Amazon SNS) topic. Create an AWS Lambda function to send an email notification to the SNS topic when the DataSync task execution status changes to SUCCESS or to ERROR.
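
To illustrate option D, here is a hedged sketch of a Lambda handler that starts a DataSync task execution with a manifest when an S3 Event Notification reports that a manifest file was uploaded. The task ARN and role ARN are placeholders, and the exact ManifestConfig shape should be verified against the current DataSync API reference.

    import boto3

    datasync = boto3.client("datasync")

    # Lambda handler invoked by the S3 Event Notification when a new
    # manifest file lands; it starts a DataSync run limited to the
    # objects listed in that manifest.
    def handler(event, context):
        s3_info = event["Records"][0]["s3"]
        datasync.start_task_execution(
            TaskArn="arn:aws:datasync:us-east-1:111122223333:task/task-0123456789abcdef0",  # placeholder
            ManifestConfig={
                "Action": "TRANSFER",
                "Format": "CSV",
                "Source": {
                    "S3": {
                        "S3BucketArn": f"arn:aws:s3:::{s3_info['bucket']['name']}",
                        "ManifestObjectPath": s3_info["object"]["key"],
                        "BucketAccessRoleArn": "arn:aws:iam::111122223333:role/DataSyncManifestRole",  # placeholder
                    }
                },
            },
        )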

Discussion 0
Questions 60

A company recently migrated a data warehouse to AWS. The company has an AWS Direct Connect connection to AWS. Company users query the data warehouse by using a visualization tool. The average size of the queries that the data warehouse returns is 50 MB. The average visualization that the visualization tool produces is 500 KB in size. The result sets that the data warehouse returns are not cached.

The company wants to optimize costs for data transfers between the data warehouse and the company.

Which solution will meet this requirement?

Options:

A.  

Host the visualization tool on premises. Connect to the data warehouse directly through the internet.

B.  

Host the visualization tool in the same AWS Region as the data warehouse. Access the visualization tool through the internet.

C.  

Host the visualization tool on premises. Connect to the data warehouse through the Direct Connect connection.

D.  

Host the visualization tool in the same AWS Region as the data warehouse. Access the visualization tool through the Direct Connect connection.

Discussion 0
Questions 61

A company has an application that uses an Amazon RDS for PostgreSQL database. The company is developing an application feature that will store sensitive information for an individual in the database.

During a security review of the environment, the company discovers that the RDS DB instance is not encrypting data at rest. The company needs a solution that will provide encryption at rest for all the existing data and for any new data that is entered for an individual.

Which combination of steps should the company take to meet these requirements? (Select TWO.)

Options:

A.  

Create a snapshot of the DB instance. Enable encryption on the snapshot. Use the encrypted snapshot to create a new DB instance. Adjust the application configuration to use the new DB instance.

B.  

Create a snapshot of the DB instance. Create an encrypted copy of the snapshot. Use the encrypted snapshot to create a new DB instance. Adjust the application configuration to use the new DB instance.

C.  

Modify the configuration of the DB instance by enabling encryption. Create a snapshot of the DB instance. Use the snapshot to create a new DB instance. Adjust the application configuration to use the new DB instance.

D.  

Use AWS Key Management Service (AWS KMS) to create a new default AWS managed aws/rds key. Select this key as the encryption key for operations with Amazon RDS.

E.  

Use AWS Key Management Service (AWS KMS) to create a new customer managed key. Select this key as the encryption key for operations with Amazon RDS.
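
A sketch of the snapshot-copy path from options B and E, using boto3 with hypothetical identifiers: snapshot the unencrypted instance, copy the snapshot with a customer managed KMS key (the copy is encrypted), then restore a new encrypted instance from the copy.

    import boto3

    rds = boto3.client("rds")

    # 1. Snapshot the unencrypted instance (identifiers are hypothetical).
    rds.create_db_snapshot(
        DBInstanceIdentifier="customer-db",
        DBSnapshotIdentifier="customer-db-snap",
    )
    rds.get_waiter("db_snapshot_available").wait(DBSnapshotIdentifier="customer-db-snap")

    # 2. Copy the snapshot with a customer managed KMS key; the copy is encrypted.
    rds.copy_db_snapshot(
        SourceDBSnapshotIdentifier="customer-db-snap",
        TargetDBSnapshotIdentifier="customer-db-snap-encrypted",
        KmsKeyId="alias/customer-db-key",  # hypothetical key alias
    )
    rds.get_waiter("db_snapshot_available").wait(DBSnapshotIdentifier="customer-db-snap-encrypted")

    # 3. Restore a new, encrypted DB instance from the encrypted copy.
    rds.restore_db_instance_from_db_snapshot(
        DBInstanceIdentifier="customer-db-encrypted",
        DBSnapshotIdentifier="customer-db-snap-encrypted",
    )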

Discussion 0
Questions 62

A company deploys its applications on Amazon Elastic Kubernetes Service (Amazon EKS) behind an Application Load Balancer in an AWS Region. The application needs to store data in a PostgreSQL database engine. The company wants the data in the database to be highly available. The company also needs increased capacity for read workloads.

Which solution will meet these requirements with the MOST operational efficiency?

Options:

A.  

Create an Amazon DynamoDB database table configured with global tables.

B.  

Create an Amazon RDS database with Multi-AZ deployments

C.  

Create an Amazon RDS database with Multi-AZ DB cluster deployment.

D.  

Create an Amazon RDS database configured with cross-Region read replicas.

Discussion 0
Questions 63

A company wants to improve the availability and performance of its hybrid application. The application consists of a stateful TCP-based workload hosted on Amazon EC2 instances in different AWS Regions and a stateless UDP-based workload hosted on premises.

Which combination of actions should a solutions architect take to improve availability and performance? (Select TWO.)

Options:

A.  

Create an accelerator using AWS Global Accelerator. Add the load balancers as endpoints.

B.  

Create an Amazon CloudFront distribution with an origin that uses Amazon Route 53 latency-based routing to route requests to the load balancers.

C.  

Configure two Application Load Balancers in each Region. The first will route to the EC2 endpoints, and the second will route to the on-premises endpoints.

D.  

Configure a Network Load Balancer in each Region to address the EC2 endpoints. Configure a Network Load Balancer in each Region that routes to the on-premises endpoints.

E.  

Configure a Network Load Balancer in each Region to address the EC2 endpoints. Configure an Application Load Balancer in each Region that routes to the on-premises endpoints.

Discussion 0
Questions 64

An ecommerce company runs an application that uses an Amazon DynamoDB table in a single AWS Region. The company wants to deploy the application to a second Region. The company needs to support multi-active replication with low latency reads and writes to the existing DynamoDB table in both Regions.

Which solution will meet these requirements in the MOST operationally efficient way?

Options:

A.  

Create a DynamoDB global secondary index (GSI) for the existing table. Create a new table in the second Region. Convert the existing DynamoDB table to a global table. Specify the new table as the secondary table.

B.  

Enable Amazon DynamoDB Streams for the existing table. Create a new table in the second Region. Create a new application that uses the DynamoDB Streams Kinesis Adapter and the Amazon Kinesis Client Library (KCL). Configure the new application to read data from the DynamoDB table in the first Region and to write the data to the new table in the second Region.

C.  

Convert the existing DynamoDB table to a global table. Choose the appropriate second Region to achieve active-active write capabilities in both Regions.

D.  

Enable Amazon DynamoDB Streams for the existing table. Create a new table in the second Region. Create an AWS Lambda function in the first Region that reads data from the table in the first Region and writes the data to the new table in the second Region. Set a DynamoDB stream as the input trigger for the Lambda function.
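
Option C amounts to a single UpdateTable call when the table uses global tables version 2019.11.21. A minimal sketch with a hypothetical table name follows; the table must already have DynamoDB Streams enabled with new and old images.

    import boto3

    dynamodb = boto3.client("dynamodb", region_name="us-east-1")

    # Adding a replica to a table on global tables version 2019.11.21
    # converts it to a multi-active global table with low-latency
    # reads and writes in both Regions.
    dynamodb.update_table(
        TableName="Orders",  # hypothetical table name
        ReplicaUpdates=[{"Create": {"RegionName": "eu-west-1"}}],
    )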

Discussion 0
Questions 65

A company stores sensitive customer data in an Amazon DynamoDB table. The company frequently updates the data. The company wants to use the data to personalize offers for customers.

The company's analytics team has its own AWS account. The analytics team runs an application on Amazon EC2 instances that needs to process data from the DynamoDB tables. The company needs to follow security best practices to create a process to regularly share data from DynamoDB to the analytics team.

Which solution will meet these requirements?

Options:

A.  

Export the required data from the DynamoDB table to an Amazon S3 bucket as multiple JSON files. Provide the analytics team with the necessary IAM permissions to access the S3 bucket.

B.  

Allow public access to the DynamoDB table. Create an IAM user that has permission to access DynamoDB. Share the IAM user with the analytics team.

C.  

Allow public access to the DynamoDB table. Create an IAM user that has read-only permission for DynamoDB. Share the IAM user with the analytics team.

D.  

Create a cross-account IAM role. Create an IAM policy that allows the AWS account ID of the analytics team to access the DynamoDB table. Attach the IAM policy to the IAM role. Establish a trust relationship between accounts.
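
A minimal sketch of the cross-account role in option D, with hypothetical account IDs and table name: the trust policy lets principals in the analytics account assume the role, and the inline policy scopes access to the one table.

    import boto3
    import json

    iam = boto3.client("iam")
    ANALYTICS_ACCOUNT = "222233334444"  # hypothetical analytics account ID

    # Trust policy: only principals in the analytics account may assume the role.
    trust = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"AWS": f"arn:aws:iam::{ANALYTICS_ACCOUNT}:root"},
            "Action": "sts:AssumeRole",
        }],
    }
    iam.create_role(
        RoleName="AnalyticsDynamoDBReadRole",
        AssumeRolePolicyDocument=json.dumps(trust),
    )

    # Permissions policy: read-only access scoped to the one customer table.
    permissions = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["dynamodb:GetItem", "dynamodb:Query", "dynamodb:Scan"],
            "Resource": "arn:aws:dynamodb:us-east-1:111122223333:table/CustomerData",  # placeholder
        }],
    }
    iam.put_role_policy(
        RoleName="AnalyticsDynamoDBReadRole",
        PolicyName="CustomerTableReadOnly",
        PolicyDocument=json.dumps(permissions),
    )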

Discussion 0
Questions 66

A company is building a serverless web application with multiple interdependent workflows that millions of users worldwide will access. The application needs to handle bursts of traffic.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.  

Deploy an Amazon API Gateway HTTP API with a usage plan and throttle settings. Use AWS Step Functions with a Standard Workflow.

B.  

Deploy an Amazon API Gateway HTTP API with a usage plan and throttle settings. Use AWS Step Functions with an Express Workflow.

C.  

Deploy an Amazon API Gateway HTTP API without a usage plan. Use AWS Step Functions with an Express Workflow.

D.  

Deploy an Amazon API Gateway HTTP API without a usage plan. Use AWS Step Functions and multiple AWS Lambda functions with reserved concurrency.

Discussion 0
Questions 67

A financial service company has a two-tier consumer banking application. The frontend serves static web content. The backend consists of APIs. The company needs to migrate the frontend component to AWS. The backend of the application will remain on premises. The company must protect the application from common web vulnerabilities and attacks.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Migrate the frontend to Amazon EC2 instances. Deploy an Application Load Balancer (ALB) in front of the instances. Use the instances to invoke the on-premises APIs. Associate AWS WAF rules with the instances.

B.  

Deploy the frontend as an Amazon CloudFront distribution that has multiple origins. Configure one origin to be an Amazon S3 bucket that serves the static web content. Configure a second origin to route traffic to the on-premises APIs based on the URL pattern. Associate AWS WAF rules with the distribution.

C.  

Migrate the frontend to Amazon EC2 instances. Deploy a Network Load Balancer (NLB) in front of the instances. Use the instances to invoke the on-premises APIs. Create an AWS Network Firewall instance. Route all traffic through the Network Firewall instance.

D.  

Deploy the frontend as a static website based on an Amazon S3 bucket. Use an Amazon API Gateway REST API and a set of Amazon EC2 instances to invoke the on-premises APIs. Associate AWS WAF rules with the REST API and the S3 bucket.

Discussion 0
Questions 68

A company has an application that runs on an Amazon Elastic Kubernetes Service (Amazon EKS) cluster on Amazon EC2 instances. The application has a UI that uses Amazon DynamoDB and data services that use Amazon S3 as part of the application deployment.

The company must ensure that the EKS Pods for the UI can access only Amazon DynamoDB and that the EKS Pods for the data services can access only Amazon S3. The company uses AWS Identity and Access Management (IAM).

Which solution meets these requirements?

Options:

A.  

Create separate IAM policies for Amazon S3 and DynamoDB access with the required permissions. Attach both IAM policies to the EC2 instance profile. Use role-based access control (RBAC) to control access to Amazon S3 or DynamoDB for the respective EKS Pods.

B.  

Create separate IAM policies for Amazon S3 and DynamoDB access with the required permissions. Attach the Amazon S3 IAM policy directly to the EKS Pods for the data services and the DynamoDB policy to the EKS Pods for the UI.

C.  

Create separate Kubernetes service accounts for the UI and data services to assume an IAM role. Attach the AmazonS3FullAccess policy to the data services account and the AmazonDynamoDBFullAccess policy to the UI service account.

D.  

Create separate Kubernetes service accounts for the UI and data services to assume an IAM role. Use IAM Roles for Service Accounts (IRSA) to provide the EKS Pods for the UI with access to DynamoDB and the EKS Pods for the data services with access to Amazon S3.

Discussion 0
Questions 69

A media company hosts a web application on AWS for uploading videos. Only authenticated users should be able to upload videos, and only within a specified time frame after authentication.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Configure the application to generate IAM temporary security credentials for authenticated users.

B.  

Create an AWS Lambda function that generates pre-signed URLs when a user authenticates.

C.  

Develop a custom authentication service that integrates with Amazon Cognito to control and log direct S3 bucket access through the application.

D.  

Use AWS Security Token Service (AWS STS) to assume a pre-defined IAM role that grants authenticated users temporary permissions to upload videos directly to the S3 bucket.
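
A sketch of option B's core mechanic: after the user authenticates, generate a pre-signed PUT URL whose expiry bounds the upload window. The bucket name and key scheme are illustrative.

    import boto3

    s3 = boto3.client("s3")

    # After the user authenticates, return a URL that permits a single PUT
    # to a user-scoped key and expires after 15 minutes.
    def presign_upload(user_id: str, filename: str) -> str:
        return s3.generate_presigned_url(
            "put_object",
            Params={
                "Bucket": "example-video-uploads",       # hypothetical bucket
                "Key": f"uploads/{user_id}/{filename}",
            },
            ExpiresIn=900,  # seconds; bounds the allowed upload window
        )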

Discussion 0
Questions 70

A company runs multiple workloads on virtual machines (VMs) in an on-premises data center. The company is expanding rapidly. The on-premises data center is not able to scale fast enough to meet business needs. The company wants to migrate the workloads to AWS.

The migration is time sensitive. The company wants to use a lift-and-shift strategy for non-critical workloads.

Which combination of steps will meet these requirements? (Select THREE.)

Options:

A.  

Use the AWS Schema Conversion Tool (AWS SCT) to collect data about the VMs.

B.  

Use AWS Application Migration Service. Install the AWS Replication Agent on the VMs.

C.  

Complete the initial replication of the VMs. Launch test instances to perform acceptance tests on the VMs.

D.  

Stop all operations on the VMs. Launch a cutover instance.

E.  

Use AWS App2Container (A2C) to collect data about the VMs.

F.  

Use AWS Database Migration Service (AWS DMS) to migrate the VMs.

Discussion 0
Questions 71

A company runs an application on Amazon EC2 instances. The instances need to access an Amazon RDS database by using specific credentials. The company uses AWS Secrets Manager to store the credentials that the EC2 instances must use.

Which solution will meet this requirement?

Options:

A.  

Create an IAM role, and attach the role to each EC2 instance profile. Use an identity-based policy to grant the new IAM role access to the secret that contains the database credentials.

B.  

Create an IAM user, and attach the user to each EC2 instance profile. Use a resource-based policy to grant the new IAM user access to the secret that contains the database credentials.

C.  

Create a resource-based policy for the secret that contains the database credentials. Use EC2 Instance Connect to access the secret.

D.  

Create an identity-based policy for the secret that contains the database credentials. Grant direct access to the EC2 instances.
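
To illustrate option A, a sketch of the identity-based policy attached to the instance role and the call the application makes on the instance; the secret name and ARN are placeholders.

    import boto3
    import json

    # Identity-based policy for the instance role: the instances may read
    # only the one secret that holds the database credentials.
    role_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": "secretsmanager:GetSecretValue",
            "Resource": "arn:aws:secretsmanager:us-east-1:111122223333:secret:prod/db-creds-*",  # placeholder
        }],
    }

    # On the instance, the role's credentials are picked up automatically
    # from the instance profile; no long-lived keys are stored anywhere.
    secret = boto3.client("secretsmanager").get_secret_value(SecretId="prod/db-creds")
    credentials = json.loads(secret["SecretString"])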

Discussion 0
Questions 72

A company is migrating a large amount of data from on-premises storage to AWS. Windows, Mac, and Linux based Amazon EC2 instances in the same AWS Region will access the data by using SMB and NFS storage protocols. The company will access a portion of the data routinely. The company will access the remaining data infrequently.

The company needs to design a solution to host the data.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Create an Amazon Elastic File System (Amazon EFS) volume that uses EFS Intelligent-Tiering. Use AWS DataSync to migrate the data to the EFS volume.

B.  

Create an Amazon FSx for ONTAP instance. Create an FSx for ONTAP file system with a root volume that uses the auto tiering policy. Migrate the data to the FSx for ONTAP volume.

C.  

Create an Amazon S3 bucket that uses S3 Intelligent-Tiering. Migrate the data to the S3 bucket by using an AWS Storage Gateway Amazon S3 File Gateway.

D.  

Create an Amazon FSx for OpenZFS file system. Migrate the data to the new volume.

Discussion 0
Questions 73

A company hosts a website on Amazon EC2 instances behind an Application Load Balancer (ALB). The website serves static content. Website traffic is increasing. The company wants to minimize the website hosting costs.

Which solution will meet these requirements?

Options:

A.  

Move the website to an Amazon S3 bucket. Configure an Amazon CloudFront distribution for the S3 bucket.

B.  

Move the website to an Amazon S3 bucket. Configure an Amazon ElastiCache cluster for the S3 bucket.

C.  

Move the website to AWS Amplify. Configure an ALB to resolve to the Amplify website.

D.  

Move the website to AWS Amplify. Configure EC2 instances to cache the website.

Discussion 0
Questions 74

A company wants to release a new device that will collect data to track overnight sleep on an intelligent mattress. Sensors will send data that will be uploaded to an Amazon S3 bucket. Each mattress generates about 2 MB of data each night.

An application must process the data and summarize the data for each user. The application must make the results available as soon as possible. Every invocation of the application will require about 1 GB of memory and will finish running within 30 seconds.

Which solution will run the application MOST cost-effectively?

Options:

A.  

AWS Lambda with a Python script

B.  

AWS Glue with a Scala job

C.  

Amazon EMR with an Apache Spark script

D.  

AWS Glue with a PySpark job

Discussion 0
Questions 75

A company is setting up a development environment on AWS for a team of developers. The team needs to access multiple Amazon S3 buckets to store project data. The team also needs to use Amazon EC2 to run development instances.

The company needs to ensure that the developers have access only to specific Amazon S3 buckets and EC2 instances. Access permissions must be assigned according to each developer's role on the team. The company wants to minimize the use of permanent credentials and to ensure access is securely managed according to the principle of least privilege.

Which solution will meet these requirements?

Options:

A.  

Create IAM roles that have administrative-level permissions for Amazon S3 and Amazon EC2. Require developers to sign in by using Amazon Cognito to access Amazon S3 and Amazon EC2.

B.  

Create IAM roles that have fine-grained permissions for Amazon S3 and Amazon EC2. Configure AWS IAM Identity Center to manage credentials for the developers.

C.  

Create IAM users that have programmatic access to Amazon S3 and Amazon EC2. Generate individual access keys for each developer to access Amazon S3 and Amazon EC2.

D.  

Create a VPC endpoint for Amazon S3. Require developers to access Amazon EC2 instances and Amazon S3 buckets through a bastion host.

Discussion 0
Questions 76

A company is migrating a data processing application to AWS. The application processes several short-lived batch jobs that cannot be disrupted. The process generates data after each batch job finishes running. The company accesses the data for 30 days following data generation. After 30 days, the company stores the data for 2 years.

The company wants to optimize costs for the application and data storage.

Which solution will meet these requirements?

Options:

A.  

Use Amazon EC2 Spot Instances to run the application. Store the data in Amazon S3 Standard. Move the data to S3 Glacier Instant Retrieval after 30 days. Configure a bucket policy to delete the data after 2 years.

B.  

Use Amazon EC2 On-Demand Instances to run the application. Store the data in Amazon S3 Glacier Instant Retrieval. Move the data to S3 Glacier Deep Archive after 30 days. Configure an S3 Lifecycle configuration to delete the data after 2 years.

C.  

Use Amazon EC2 Spot Instances to run the application. Store the data in Amazon S3 Standard. Move the data to S3 Glacier Flexible Retrieval after 30 days. Configure a bucket policy to delete the data after 2 years.

D.  

Use Amazon EC2 On-Demand Instances to run the application. Store the data in Amazon S3 Standard. Move the data to S3 Glacier Deep Archive after 30 days. Configure an S3 Lifecycle configuration to delete the data after 2 years.
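
The lifecycle configuration in option D might look like the following boto3 sketch, with a hypothetical bucket name: transition to Glacier Deep Archive after the 30-day access window, then expire after roughly 2 years.

    import boto3

    s3 = boto3.client("s3")

    # Transition the batch output to Glacier Deep Archive after the 30-day
    # access window, then expire it after the 2-year retention period.
    s3.put_bucket_lifecycle_configuration(
        Bucket="example-batch-output",  # hypothetical bucket name
        LifecycleConfiguration={
            "Rules": [{
                "ID": "archive-then-expire",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},  # apply to every object
                "Transitions": [{"Days": 30, "StorageClass": "DEEP_ARCHIVE"}],
                "Expiration": {"Days": 730},  # roughly 2 years
            }]
        },
    )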

Discussion 0
Questions 77

A company needs an automated solution to detect cryptocurrency mining activity on Amazon EC2 instances. The solution must automatically isolate any identified EC2 instances for forensic analysis.

Which solution will meet these requirements?

Options:

A.  

Create an Amazon EventBridge rule that runs when Amazon GuardDuty detects cryptocurrency mining activity. Configure the rule to invoke an AWS Lambda function to isolate the identified EC2 instances.

B.  

Create an AWS Security Hub custom action that runs when Amazon GuardDuty detects cryptocurrency mining activity. Configure the custom action to invoke an AWS Lambda function to isolate the identified EC2 instances.

C.  

Create an Amazon Inspector rule that runs when Amazon GuardDuty detects cryptocurrency mining activity. Configure the rule to invoke an AWS Lambda function to isolate the identified EC2 instances.

D.  

Create an AWS Config custom rule that runs when AWS Config detects cryptocurrency mining activity. Configure the rule to invoke an AWS Lambda function to isolate the identified EC2 instances.
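
A sketch of the EventBridge rule in option A, assuming a hypothetical isolation Lambda function: the pattern matches GuardDuty findings whose type starts with "CryptoCurrency". The Lambda function would also need a resource-based permission that allows events.amazonaws.com to invoke it.

    import boto3
    import json

    events = boto3.client("events")

    # Match only GuardDuty findings whose type begins with "CryptoCurrency".
    events.put_rule(
        Name="guardduty-cryptomining",
        EventPattern=json.dumps({
            "source": ["aws.guardduty"],
            "detail-type": ["GuardDuty Finding"],
            "detail": {"type": [{"prefix": "CryptoCurrency"}]},
        }),
    )

    # Route matching findings to the isolation Lambda function.
    events.put_targets(
        Rule="guardduty-cryptomining",
        Targets=[{
            "Id": "isolate-instance",
            "Arn": "arn:aws:lambda:us-east-1:111122223333:function:IsolateEc2Instance",  # placeholder
        }],
    )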

Discussion 0
Questions 78

A company runs a web application in a single AWS Region. A solutions architect wants to ensure that the web application can continue to operate if the application becomes unavailable in the Region.

Which solution will meet this requirement?

Options:

A.  

Deploy the application in multiple Regions. Use Amazon Route 53 DNS health checks to route traffic to a healthy Region.

B.  

Deploy the application in multiple Availability Zones within a single Region. Use Amazon Route 53 DNS health checks to route traffic to healthy application resources.

C.  

Deploy the application in multiple Regions. Use an Amazon Route 53 simple routing record to route traffic to a healthy Region.

D.  

Deploy the application in multiple Availability Zones within a single Region. Use an Amazon Route 53 latency record in each Availability Zone to route traffic to a healthy Availability Zone.

Discussion 0
Questions 79

A company is designing an IPv6 application that is hosted on Amazon EC2 instances in a private subnet within a VPC. The application will store user-uploaded content in Amazon S3 buckets. The application will save each S3 object's URL link and metadata in Amazon DynamoDB.

The company must not use public internet connections to transmit user-uploaded content or metadata.

Which solution will meet these requirements?

Options:

A.  

Implement a gateway VPC endpoint for Amazon S3 and an interface VPC endpoint for Amazon DynamoDB.

B.  

Implement interface VPC endpoints for both Amazon S3 and Amazon DynamoDB.

C.  

Implement gateway VPC endpoints for both Amazon S3 and Amazon DynamoDB.

D.  

Implement a gateway VPC endpoint for Amazon DynamoDB and an interface VPC endpoint for Amazon S3.

Discussion 0
Questions 80

A company runs an application as a task in an Amazon Elastic Container Service (Amazon ECS) cluster. The application must have read and write access to a specific group of Amazon S3 buckets. The S3 buckets are in the same AWS Region and AWS account as the ECS cluster. The company needs to grant the application access to the S3 buckets according to the principle of least privilege.

Which combination of solutions will meet these requirements? (Select TWO.)

Options:

A.  

Add a tag to each bucket. Create an IAM policy that includes a StringEquals condition that matches the tags and values of the buckets.

B.  

Create an IAM policy that lists the full Amazon Resource Name (ARN) for each S3 bucket.

C.  

Attach the IAM policy to the instance role of the ECS task.

D.  

Create an IAM policy that includes a wildcard Amazon Resource Name (ARN) that matches all combinations of the S3 bucket names.

E.  

Attach the IAM policy to the task role of the ECS task.

Discussion 0
Questions 81

A company needs to store confidential files on AWS. The company accesses the files every week. The company must encrypt the files by using envelope encryption, and the encryption keys must be rotated automatically. The company must have an audit trail to monitor encryption key usage.

Which combination of solutions will meet these requirements? (Select TWO.)

Options:

A.  

Store the confidential files in Amazon S3.

B.  

Store the confidential files in Amazon S3 Glacier Deep Archive.

C.  

Use server-side encryption with customer-provided keys (SSE-C).

D.  

Use server-side encryption with Amazon S3 managed keys (SSE-S3).

E.  

Use server-side encryption with AWS KMS managed keys (SSE-KMS).

Discussion 0
Questions 82

A company has an application that runs on Amazon EC2 instances within a private subnet in a VPC. The instances access data in an Amazon S3 bucket in the same AWS Region. The VPC contains a NAT gateway in a public subnet to access the S3 bucket. The company wants to reduce costs by replacing the NAT gateway without compromising security or redundancy.

Which solution meets these requirements?

Options:

A.  

Replace the NAT gateway with a NAT instance.

B.  

Replace the NAT gateway with an internet gateway.

C.  

Replace the NAT gateway with a gateway VPC endpoint.

D.  

Replace the NAT gateway with an AWS Direct Connect connection.

Discussion 0
Questions 83

A company hosts an application on AWS that stores files that users need to access. The application uses two Amazon EC2 instances. One instance is in Availability Zone A, and the second instance is in Availability Zone B. Both instances use Amazon Elastic Block Store (Amazon EBS) volumes. Users must be able to access the files at any time without delay.

Users report that the two instances occasionally contain different versions of the same file. Users occasionally receive HTTP 404 errors when they try to download files. The company must address the customer issues. The company cannot make changes to the application code.

Which solution will meet these requirements in the MOST operationally efficient way?

Options:

A.  

Run the robocopy command on one of the EC2 instances on a schedule to copy files from the Availability Zone A instance to the Availability Zone B instance.

B.  

Configure the application to store the files on both EBS volumes each time a user writes or updates a file.

C.  

Mount an Amazon Elastic File System (Amazon EFS) file system to the EC2 instances. Copy the files from the EBS volumes to the EFS file system. Configure the application to store files in the EFS file system.

D.  

Create an EC2 instance profile that allows the instance in Availability Zone A to access the S3 bucket. Re-associate the instance profile to the instance in Availability Zone B when needed.

Discussion 0
Questions 84

A company stores data in Amazon S3. According to regulations, the data must not contain personally identifiable information (PII). The company recently discovered that S3 buckets have some objects that contain PII. The company needs to automatically detect PII in S3 buckets and to notify the company's security team.

Which solution will meet these requirements?

Options:

A.  

Use Amazon Macie. Create an Amazon EventBridge rule to filter the SensitiveData event type from Macie findings and to send an Amazon Simple Notification Service (Amazon SNS) notification to the security team.

B.  

Use Amazon GuardDuty. Create an Amazon EventBridge rule to filter the CRITICAL event type from GuardDuty findings and to send an Amazon Simple Notification Service (Amazon SNS) notification to the security team.

C.  

Use Amazon Macie. Create an Amazon EventBridge rule to filter the SensitiveData:S3Object/Personal event type from Macie findings and to send an Amazon Simple Queue Service (Amazon SQS) notification to the security team.

D.  

Use Amazon GuardDuty. Create an Amazon EventBridge rule to filter the CRITICAL event type from GuardDuty findings and to send an Amazon Simple Queue Service (Amazon SQS) notification to the security team.
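
Option A's rule could be sketched as follows, with a hypothetical SNS topic ARN; Macie publishes findings to EventBridge with source aws.macie.

    import boto3
    import json

    events = boto3.client("events")

    # Filter Macie findings to the PII-specific type and forward them
    # to the security team's SNS topic.
    events.put_rule(
        Name="macie-pii-findings",
        EventPattern=json.dumps({
            "source": ["aws.macie"],
            "detail-type": ["Macie Finding"],
            "detail": {"type": ["SensitiveData:S3Object/Personal"]},
        }),
    )
    events.put_targets(
        Rule="macie-pii-findings",
        Targets=[{
            "Id": "notify-security-team",
            "Arn": "arn:aws:sns:us-east-1:111122223333:security-team",  # placeholder topic ARN
        }],
    )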

Discussion 0
Questions 85

A company is planning to run an AI/ML workload on AWS. The company needs to train a model on a dataset that is in Amazon S3 Standard. A model training application requires multiple compute nodes and single-digit millisecond access to the data.

Which solution will meet these requirements in the MOST cost-effective way?

Options:

A.  

Move the data to S3 Intelligent-Tiering. Point the model training application to S3 Intelligent-Tiering as the data source.

B.  

Add partitions to the S3 bucket by adding random prefixes. Reconfigure the model training application to point to the new prefixes as the data source.

C.  

Move the data to S3 Express One Zone. Point the model training application to S3 Express One Zone as the data source.

D.  

Move the data to a General Purpose SSD (gp3) Amazon Elastic Block Store (Amazon EBS) volume attached to an Amazon EC2 instance. Point the model training application to the gp3 volume as the data source.

Discussion 0
Questions 86

A company has applications that run on Amazon EC2 instances in a VPC. One of the applications needs to call the Amazon S3 API to store and read objects. According to the company's security regulations, no traffic from the applications is allowed to travel across the internet.

Which solution will meet these requirements?

Options:

A.  

Configure an S3 gateway endpoint.

B.  

Create an S3 bucket in a private subnet.

C.  

Create an S3 bucket in the same AWS Region as the EC2 instances.

D.  

Configure a NAT gateway in the same subnet as the EC2 instances.

Discussion 0
Questions 87

A company is running a media store across multiple Amazon EC2 instances distributed across multiple Availability Zones in a single VPC. The company wants a high-performing solution to share data between all the EC2 instances, and prefers to keep the data within the VPC only.

What should a solutions architect recommend?

Options:

A.  

Create an Amazon S3 bucket and call the service APIs from each instance's application.

B.  

Create an Amazon S3 bucket and configure all instances to access it as a mounted volume.

C.  

Configure an Amazon Elastic Block Store (Amazon EBS) volume and mount it across all instances.

D.  

Configure an Amazon Elastic File System (Amazon EFS) file system and mount it across all instances.

Discussion 0
Questions 88

A company is developing a latency-sensitive application. Part of the application includes several AWS Lambda functions that need to initialize as quickly as possible. The Lambda functions are written in Java and contain initialization code outside the handlers to load libraries, initialize classes, and generate unique IDs.

Which solution will meet the startup performance requirement MOST cost-effectively?

Options:

A.  

Move all the initialization code to the handlers for each Lambda function. Activate Lambda SnapStart for each Lambda function. Configure SnapStart to reference the $LATEST version of each Lambda function.

B.  

Publish a version of each Lambda function. Create an alias for each Lambda function. Configure each alias to point to its corresponding version. Set up provisioned concurrency configuration for each Lambda function to point to the corresponding alias.

C.  

Publish a version of each Lambda function. Set up a provisioned concurrency configuration for each Lambda function to point to the corresponding version. Activate Lambda SnapStart for the published versions of the Lambda functions.

D.  

Update the Lambda functions to add a pre-snapshot hook. Move the code that generates unique IDs into the handlers. Publish a version of each Lambda function. Activate Lambda SnapStart for the published versions of the Lambda functions.
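
A minimal sketch of activating SnapStart for published versions (option C), with a hypothetical function name. SnapStart does not apply to $LATEST, which is why option A's $LATEST configuration is invalid.

    import boto3

    lam = boto3.client("lambda")
    fn = "order-processor"  # hypothetical function name

    # SnapStart applies to published versions only, never to $LATEST.
    lam.update_function_configuration(
        FunctionName=fn,
        SnapStart={"ApplyOn": "PublishedVersions"},
    )
    lam.get_waiter("function_updated_v2").wait(FunctionName=fn)

    # Publishing a version triggers the snapshot of the initialized
    # execution environment that later invocations resume from.
    lam.publish_version(FunctionName=fn)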

Discussion 0
Questions 89

A company operates an online photo-sharing service and stores data in AWS Account A in a centralized Amazon S3 bucket. The company wants to grant a second AWS account named Account B access to the centralized S3 bucket. The company owns Account B.

Which solution will meet this requirement?

Options:

A.  

Enable S3 Transfer Acceleration to provide Account B access to the centralized S3 bucket in Account A.

B.  

Enable cross-Region replication between Account A and Account B to share the S3 bucket data.

C.  

Use Amazon CloudFront to distribute the S3 bucket contents. Grant Account B access to the bucket contents through a signed URL.

D.  

Create a bucket policy that grants Account B permission to access the centralized S3 bucket in Account A.
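
Option D's bucket policy could look like the following sketch, with a hypothetical bucket name and account ID. Principals in Account B still need matching IAM permissions in their own account.

    import boto3
    import json

    BUCKET = "example-central-photos"  # hypothetical bucket in Account A
    ACCOUNT_B = "444455556666"         # hypothetical Account B ID

    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "AllowAccountB",
            "Effect": "Allow",
            "Principal": {"AWS": f"arn:aws:iam::{ACCOUNT_B}:root"},
            "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
            "Resource": [
                f"arn:aws:s3:::{BUCKET}",
                f"arn:aws:s3:::{BUCKET}/*",
            ],
        }],
    }
    boto3.client("s3").put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))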

Discussion 0
Questions 90

A solutions architect needs to implement a solution that can handle up to 5,000 messages per second. The solution must publish messages as events to multiple consumers. The messages are up to 500 KB in size. The message consumers need to have the ability to use multiple programming languages to consume the messages with minimal latency. The solution must retain published messages for more than 3 months. The solution must enforce strict ordering of the messages.

Which solution will meet these requirements?

Options:

A.  

Publish messages to an Amazon Kinesis Data Streams data stream. Enable enhanced fan-out. Ensure that consumers ingest the data stream by using dedicated throughput.

B.  

Publish messages to an Amazon Simple Notification Service (Amazon SNS) topic. Ensure that consumers use an Amazon Simple Queue Service (Amazon SQS) FIFO queue to subscribe to the topic.

C.  

Publish messages to Amazon EventBridge. Allow each consumer to create rules to deliver messages to the consumer's own target.

D.  

Publish messages to an Amazon Simple Notification Service (Amazon SNS) topic. Ensure that consumers use Amazon Data Firehose to subscribe to the topic.

Discussion 0
Questions 91

A healthcare provider is planning to store patient data on AWS as PDF files. To comply with regulations, the company must encrypt the data and store the files in multiple locations. The data must be available for immediate access from any environment.

Which solution will meet these requirements?

Options:

A.  

Store the files in an Amazon S3 bucket. Use the Standard storage class. Enable server-side encryption with Amazon S3 managed keys (SSE-S3) on the bucket. Configure cross-Region replication on the bucket.

B.  

Store the files in an Amazon Elastic File System (Amazon EFS) volume. Use an AWS KMS managed key to encrypt the EFS volume. Use AWS DataSync to replicate the EFS volume to a second AWS Region.

C.  

Store the files in an Amazon Elastic Block Store (Amazon EBS) volume. Configure AWS Backup to back up the volume on a regular schedule. Use an AWS KMS key to encrypt the backups.

D.  

Store the files in an Amazon S3 bucket. Use the S3 Glacier Flexible Retrieval storage class. Ensure that all PDF files are encrypted by using client-side encryption before the files are uploaded. Configure cross-Region replication on the bucket.

Discussion 0
Questions 92

A company runs all its business applications in the AWS Cloud. The company uses AWS Organizations to manage multiple AWS accounts.

A solutions architect needs to review all permissions granted to IAM users to determine which users have more permissions than required.

Which solution will meet these requirements with the LEAST administrative overhead?

Options:

A.  

Use Network Access Analyzer to review all access permissions in the company's AWS accounts.

B.  

Create an Amazon CloudWatch alarm that activates when an IAM user creates or modifies resources in an AWS account.

C.  

Use AWS Identity and Access Management (IAM) Access Analyzer to review all the company's resources and accounts.

D.  

Use Amazon Inspector to find vulnerabilities in existing IAM policies.

Discussion 0
Questions 93

A media streaming company is redesigning its infrastructure to accommodate increasing demand for video content that users consume daily. The company needs to process terabyte-sized videos to block some content in the videos. Video processing can take up to 20 minutes.

The company needs a solution that is cost-effective, highly available, and scalable.

Which solution will meet these requirements?

Options:

A.  

Use AWS Lambda functions to process the videos. Store video metadata in Amazon DynamoDB. Store video content in Amazon S3 Intelligent-Tiering.

B.  

Use Amazon Elastic Container Service (Amazon ECS) with the AWS Fargate launch type to implement microservices to process videos. Store video metadata in Amazon Aurora. Store video content in Amazon S3 Intelligent-Tiering.

C.  

Use Amazon EMR to process the videos with Apache Spark. Store video content in Amazon FSx for Lustre. Use Amazon Kinesis Data Streams to ingest videos in real time.

D.  

Deploy a containerized video processing application on Amazon Elastic Kubernetes Service (Amazon EKS) with the Amazon EC2 launch type. Store video metadata in Amazon RDS in a single Availability Zone. Store video content in Amazon S3 Glacier Deep Archive.

Discussion 0
Questions 94

A company is building a new web application on AWS. The application needs to consume files from a legacy on-premises application that runs a batch process and outputs approximately 1 GB of data every night to an NFS file mount.

A solutions architect needs to design a storage solution that requires minimal changes to the legacy application and keeps costs low.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.  

Deploy an Outpost in AWS Outposts at the on-premises location where the legacy application runs. Configure the legacy application and the web application to store and retrieve the files in Amazon S3 on the Outpost.

B.  

Deploy an AWS Storage Gateway Volume Gateway on premises. Point the legacy application to the Volume Gateway. Configure the web application to use the Amazon S3 bucket that the Volume Gateway uses.

C.  

Deploy an Amazon S3 interface endpoint on AWS. Reconfigure the legacy application to store the files directly on an Amazon S3 endpoint. Configure the web application to retrieve the files from Amazon S3.

D.  

Deploy an Amazon S3 File Gateway on premises. Point the legacy application to the File Gateway. Configure the web application to retrieve the files from the S3 bucket that the File Gateway uses.

Discussion 0
Questions 95

A company is designing an application to maintain a record of customer orders. The application will generate events. The company wants to use an Amazon EventBridge event bus to send the application's events to an Amazon DynamoDB table.

Which solution will meet these requirements?

Options:

A.  

Use the EventBridge default event bus. Configure DynamoDB Streams for the DynamoDB table that hosts the customer order data.

B.  

Create an EventBridge custom event bus. Create an AWS Lambda function as a target. Configure the Lambda function to forward the customer order data to the DynamoDB table.

C.  

Create an EventBridge partner event bus. Create an Amazon Simple Notification Service (Amazon SNS) topic. Subscribe an AWS Lambda function to the SNS topic. Configure the Lambda function to read the customer order data and to forward the data to the DynamoDB table.

D.  

Create an EventBridge partner event bus. Create an AWS Lambda function as a target. Configure the Lambda function to forward the customer order data to the DynamoDB table.

Discussion 0
Questions 96

A company runs an order management application on AWS. The application allows customers to place orders and pay with a credit card. The company uses an Amazon CloudFront distribution to deliver the application.

A security team has set up logging for all incoming requests. The security team needs a solution to generate an alert if any user modifies the logging configuration.

Which combination of solutions will meet these requirements? (Select TWO.)

Options:

A.  

Configure an Amazon EventBridge rule that is invoked when a user creates or modifies a CloudFront distribution. Add the AWS Lambda function as a target of the EventBridge rule.

B.  

Create an Application Load Balancer (ALB). Enable AWS WAF rules for the ALB. Configure an AWS Config rule to detect security violations.

C.  

Create an AWS Lambda function to detect changes in CloudFront distribution logging. Configure the Lambda function to use Amazon Simple Notification Service (Amazon SNS) to send notifications to the security team.

D.  

Set up Amazon GuardDuty. Configure GuardDuty to monitor findings from the CloudFront distribution. Create an AWS Lambda function to address the findings.

E.  

Create a private API in Amazon API Gateway. Use AWS WAF rules to protect the private API from common security problems.

Discussion 0
Questions 97

A media company hosts a web application on AWS. The application gives users the ability to upload and view videos. The application stores the videos in an Amazon S3 bucket. The company wants to ensure that only authenticated users can upload videos. Authenticated users must have the ability to upload videos only within a specified time frame after authentication.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Configure the application to generate IAM temporary security credentials for authenticated users.

B.  

Create an AWS Lambda function that generates pre-signed URLs when a user authenticates.

C.  

Develop a custom authentication service that integrates with Amazon Cognito to control and log direct S3 bucket access through the application.

D.  

Use AWS Security Token Service (AWS STS) to assume a pre-defined IAM role that grants authenticated users temporary permissions to upload videos directly to the S3 bucket.

Discussion 0
Questions 98

A company uses a set of Amazon EC2 instances to host a website. The website uses an Amazon S3 bucket to store images and media files.

The company wants to automate website infrastructure creation to deploy the website to multiple AWS Regions. The company also wants to provide the EC2 instances access to the S3 bucket so the instances can store and access data by using AWS Identity and Access Management (IAM).

Which solution will meet these requirements MOST securely?

Options:

A.  

Create an AWS CloudFormation template for the web server EC2 instances. Save an IAM access key in the UserData section of the AWS::EC2::Instance entity in the CloudFormation template.

B.  

Create a file that contains an IAM secret access key and access key ID. Store the file in a new S3 bucket. Create an AWS CloudFormation template. In the template, create a parameter to specify the location of the S3 object that contains the access key and access key ID.

C.  

Create an IAM role and an IAM access policy that allows the web server EC2 instances to access the S3 bucket. Create an AWS CloudFormation template for the web server EC2 instances that contains an IAM instance profile entity that references the IAM role and the IAM access policy.

D.  

Create a script that retrieves an IAM secret access key and access key ID from IAM and stores them on the web server EC2 instances. Include the script in the UserData section of the AWS::EC2::Instance entity in an AWS CloudFormation template.

Discussion 0
Questions 99

A global company runs a data lake application in the us-east-1 Region and the eu-west-1 Region in an active-passive configuration. Application data is stored locally in Amazon S3 buckets in each AWS Region. The bucket in us-east-1 is the primary active bucket that handles all writes.

The company needs to ensure that the application has Regional fault tolerance. The company also needs the storage layer to provide a highly available active-active capability for reads across Regions. The storage layer must provide low latency access through a single global endpoint.

Which solution will meet these requirements?

Options:

A.  

Create an Amazon CloudFront distribution in each Region. Set the S3 bucket within each Region as the origin for the CloudFront distribution in the same Region.

B.  

Use S3 Transfer Acceleration for cross-Region data transfers to the S3 buckets.

C.  

Configure AWS Backup to replicate S3 buckets across Regions. Set up a disaster recovery environment.

D.  

Create an S3 Multi-Region Access Point. Configure cross-Region replication.
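
A hedged sketch of creating the Multi-Region Access Point from option D with boto3; the account ID and bucket names are placeholders, and the control-plane call is served from the us-west-2 endpoint. Cross-Region replication between the buckets is configured separately.

    import boto3

    # Multi-Region Access Point control-plane calls are served from us-west-2.
    s3control = boto3.client("s3control", region_name="us-west-2")

    s3control.create_multi_region_access_point(
        AccountId="111122223333",  # hypothetical account ID
        Details={
            "Name": "datalake-global",
            "Regions": [
                {"Bucket": "datalake-us-east-1"},  # hypothetical bucket names
                {"Bucket": "datalake-eu-west-1"},
            ],
        },
    )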

Discussion 0
Questions 100

A company is creating a web application that will store a large number of images in Amazon S3. The images will be accessed by users over variable periods of time. The company wants to:

Retain all the images.

Incur no cost for retrieval.

Have minimal management overhead.

Have the images available with no impact on retrieval time.

Which solution meets these requirements?

Options:

A.  

Implement S3 Intelligent-Tiering.

B.  

Implement S3 storage class analysis.

C.  

Implement an S3 Lifecycle policy to move data to S3 Standard-Infrequent Access (S3 Standard-IA).

D.  

Implement an S3 Lifecycle policy to move data to S3 One Zone-Infrequent Access (S3 One Zone-IA).

Discussion 0
Questions 101

A solutions architect needs to host a high performance computing (HPC) workload in the AWS Cloud. The workload will run on hundreds of Amazon EC2 instances and will require parallel access to a shared file system to enable distributed processing of large datasets. Datasets will be accessed across multiple instances simultaneously. The workload requires access latency within 1 ms. After processing has completed, engineers will need access to the dataset for manual postprocessing.

Which solution will meet these requirements?

Options:

A.  

Use Amazon Elastic File System (Amazon EFS) as a shared file system. Access the dataset from Amazon EFS.

B.  

Mount an Amazon S3 bucket to serve as the shared file system. Perform postprocessing directly from the S3 bucket.

C.  

Use Amazon FSx for Lustre as a shared file system. Link the file system to an Amazon S3 bucket for postprocessing.

D.  

Configure AWS Resource Access Manager to share an Amazon S3 bucket so that it can be mounted to all instances for processing and postprocessing.

Discussion 0
Questions 102

A finance company uses backup software to back up its data to physical tape storage on-premises. To comply with regulations, the company needs to store the data for 7 years. The company must be able to restore archived data within one week when necessary.

The company wants to migrate the backup data to AWS to reduce costs. The company does not want to change the current backup software.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.  

Use AWS Storage Gateway Tape Gateway to copy the data to virtual tapes. Use AWS DataSync to migrate the virtual tapes to Amazon S3 Standard-Infrequent Access (S3 Standard-IA). Change the target of the backup software to S3 Standard-IA.

B.  

Convert the physical tapes to virtual tapes. Use AWS DataSync to migrate the virtual tapes to Amazon S3 Glacier Flexible Retrieval. Change the target of the backup software to S3 Glacier Flexible Retrieval.

C.  

Use AWS Storage Gateway Tape Gateway to copy the data to virtual tapes. Migrate the virtual tapes to Amazon S3 Glacier Deep Archive. Change the target of the backup software to the virtual tapes.

D.  

Convert the physical tapes to virtual tapes. Use AWS Snowball Edge storage-optimized devices to migrate the virtual tapes to Amazon S3 Glacier Flexible Retrieval. Change the target of the backup software to S3 Glacier Flexible Retrieval.

Discussion 0
Questions 103

A company hosts a database that runs on an Amazon RDS instance deployed to multiple Availability Zones. A periodic script negatively affects a critical application by querying the database.

How can application performance be improved at minimal cost?

Options:

A.  

Add functionality to the script to identify the instance with the fewest active connections and query that instance.

B.  

Create a read replica of the database. Configure the script to query only the read replica.

C.  

Instruct the development team to manually export new entries at the end of the day.

D.  

Use Amazon ElastiCache to cache the common queries the script runs.

Discussion 0
Questions 104

A company is planning to migrate an on-premises online transaction processing (OLTP) database that uses MySQL to an AWS managed database management system. Several reporting and analytics applications use the on-premises database heavily on weekends and at the end of each month. The cloud-based solution must be able to handle read-heavy surges during weekends and at the end of each month.

Which solution will meet these requirements?

Options:

A.  

Migrate the database to an Amazon Aurora MySQL cluster. Configure Aurora Auto Scaling to use replicas to handle surges.

B.  

Migrate the database to an Amazon EC2 instance that runs MySQL. Use an EC2 instance type that has ephemeral storage. Attach Amazon EBS Provisioned IOPS SSD (io2) volumes to the instance.

C.  

Migrate the database to an Amazon RDS for MySQL database. Configure the RDS for MySQL database for a Multi-AZ deployment, and set up auto scaling.

D.  

Migrate the database to Amazon Redshift. Use Amazon Redshift as the database for both OLTP and analytics applications.

Discussion 0
Questions 105

A website uses EC2 instances with Auto Scaling and EFS. How can the company optimize costs?

Options:

A.  

Reconfigure the Auto Scaling group to set a desired number of instances. Turn off scheduled scaling.

B.  

Create a new launch template version that uses larger EC2 instances.

C.  

Reconfigure the Auto Scaling group to use a target tracking scaling policy.

D.  

Replace the EFS volume with instance store volumes.

Discussion 0
Questions 106

A company needs to give a globally distributed development team secure access to the company's AWS resources in a way that complies with security policies.

The company currently uses an on-premises Active Directory for internal authentication. The company uses AWS Organizations to manage multiple AWS accounts that support multiple projects.

The company needs a solution to integrate with the existing infrastructure to provide centralized identity management and access control.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Set up AWS Directory Service to create an AWS managed Microsoft Active Directory on AWS. Establish a trust relationship with the on-premises Active Directory. Use IAM roles that are assigned to Active Directory groups to access AWS resources within the company's AWS accounts.

B.  

Create an IAM user for each developer. Manually manage permissions for each IAM user based on each user's involvement with each project. Enforce multi-factor authentication (MFA) as an additional layer of security.

C.  

Use AD Connector in AWS Directory Service to connect to the on-premises Active Directory. Integrate AD Connector with AWS IAM Identity Center. Configure permissions sets to give each AD group access to specific AWS accounts and resources.

D.  

Use Amazon Cognito to deploy an identity federation solution. Integrate the identity federation solution with the on-premises Active Directory. Use Amazon Cognito to provide access tokens for developers to access AWS accounts and resources.

Discussion 0
Questions 107

A company is building a serverless application to process clickstream data from its website. The clickstream data is sent to an Amazon Kinesis Data Streams data stream from the application web servers.

The company wants to enrich the clickstream data by joining the clickstream data with customer profile data from an Amazon Aurora Multi-AZ database. The company wants to use Amazon Redshift to analyze the enriched data. The solution must be highly available.

Which solution will meet these requirements?

Options:

A.  

Use an AWS Lambda function to process and enrich the clickstream data. Use the same Lambda function to write the clickstream data to Amazon S3. Use Amazon Redshift Spectrum to query the enriched data in Amazon S3.

B.  

Use an Amazon EC2 Spot Instance to poll the data stream and enrich the clickstream data. Configure the EC2 instance to use the COPY command to send the enriched results to Amazon Redshift.

C.  

Use an Amazon Elastic Container Service (Amazon ECS) task with AWS Fargate Spot capacity to poll the data stream and enrich the clickstream data. Configure an Amazon EC2 instance to use the COPY command to send the enriched results to Amazon Redshift.

D.  

Use Amazon Kinesis Data Firehose to load the clickstream data from Kinesis Data Streams to Amazon S3. Use AWS Glue crawlers to infer the schema and populate the AWS Glue Data Catalog. Use Amazon Athena to query the raw data in Amazon S3.

Discussion 0
Questions 108

A company tracks customer satisfaction by using surveys that the company hosts on its website. The surveys sometimes reach thousands of customers every hour. Survey results are currently sent in email messages to the company so company employees can manually review results and assess customer sentiment.

The company wants to automate the customer survey process. Survey results must be available for the previous 12 months.

Which solution will meet these requirements in the MOST scalable way?

Options:

A.  

Send the survey results data to an Amazon API Gateway endpoint that is connected to an Amazon Simple Queue Service (Amazon SQS) queue. Create an AWS Lambda function to poll the SQS queue, call Amazon Comprehend for sentiment analysis, and save the results to an Amazon DynamoDB table. Set the TTL for all records to 365 days in the future.

B.  

Send the survey results data to an API that is running on an Amazon EC2 instance. Configure the API to store the survey results as a new record in an Amazon DynamoDB table, call Amazon Comprehend for sentiment analysis, and save the results in a second DynamoDB table. Set the TTL for all records to 365 days in the future.

C.  

Write the survey results data to an Amazon S3 bucket. Use S3 Event Notifications to invoke an AWS Lambda function to read the data and call Amazon Rekognition for sentiment analysis. Store the sentiment analysis results in a second S3 bucket. Use S3 Lifecycle policies on each bucket to expire objects after 365 days.

D.  

Send the survey results data to an Amazon API Gateway endpoint that is connected to an Amazon Simple Queue Service (Amazon SQS) queue. Configure the SQS queue to invoke an AWS Lambda function that calls Amazon Lex for sentiment analysis and saves the results to an Amazon DynamoDB table. Set the TTL for all records to 365 days in the future.

Discussion 0
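Note: the Lambda stage of the SQS-based pipeline in option A might look like the following sketch, which calls Amazon Comprehend and writes the result with a DynamoDB TTL attribute. Table and field names are hypothetical.

```python
import json
import time

import boto3

comprehend = boto3.client("comprehend")
table = boto3.resource("dynamodb").Table("survey-results")  # hypothetical name

def handler(event, context):
    # Each SQS record carries one survey submission forwarded by API Gateway.
    for record in event["Records"]:
        survey = json.loads(record["body"])
        sentiment = comprehend.detect_sentiment(
            Text=survey["comments"], LanguageCode="en"
        )
        table.put_item(
            Item={
                "survey_id": survey["id"],
                "sentiment": sentiment["Sentiment"],
                # TTL attribute in epoch seconds: expire after 365 days.
                "expires_at": int(time.time()) + 365 * 24 * 60 * 60,
            }
        )
```

DynamoDB deletes expired items automatically once TTL is enabled on the `expires_at` attribute, which is what keeps only the previous 12 months of results.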
Questions 109

A company is building a serverless application to process orders from an e-commerce site. The application needs to handle bursts of traffic during peak usage hours and to maintain high availability. The orders must be processed asynchronously in the order in which the application receives them.

Which solution will meet these requirements?

Options:

A.  

Use an Amazon Simple Notification Service (Amazon SNS) topic to receive orders. Use an AWS Lambda function to process the orders.

B.  

Use an Amazon Simple Queue Service (Amazon SQS) FIFO queue to receive orders. Use an AWS Lambda function to process the orders.

C.  

Use an Amazon Simple Queue Service (Amazon SQS) standard queue to receive orders. Use AWS Batch jobs to process the orders.

D.  

Use an Amazon Simple Notification Service (Amazon SNS) topic to receive orders. Use AWS Batch jobs to process the orders.

Discussion 0
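Note: the FIFO queue in option B is what enforces ordering. A minimal boto3 sketch; the queue and group names are hypothetical, and FIFO queue names must end in `.fifo`.

```python
import boto3

sqs = boto3.client("sqs")

# FIFO queues preserve ordering within a message group and deduplicate
# retried sends, matching the requirement to process orders in the
# order they are received.
queue = sqs.create_queue(
    QueueName="orders.fifo",  # hypothetical name
    Attributes={
        "FifoQueue": "true",
        "ContentBasedDeduplication": "true",
    },
)

# Messages sharing a MessageGroupId are delivered strictly in order.
sqs.send_message(
    QueueUrl=queue["QueueUrl"],
    MessageBody='{"order_id": "1001"}',
    MessageGroupId="storefront-orders",
)
```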
Questions 110

A company is migrating its on-premises Oracle database to an Amazon RDS for Oracle database. The company needs to retain data for 90 days to meet regulatory requirements. The company must also be able to restore the database to a specific point in time for up to 14 days.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Create Amazon RDS automated backups. Set the retention period to 90 days.

B.  

Create an Amazon RDS manual snapshot every day. Delete manual snapshots that are older than 90 days.

C.  

Use the Amazon Aurora Clone feature for Oracle to create a point-in-time restore. Delete clones that are older than 90 days.

D.  

Create a backup plan that has a retention period of 90 days by using AWS Backup for Amazon RDS.

Discussion 0
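Note: RDS automated backups retain data for at most 35 days, so a 90-day retention requirement typically points to AWS Backup, as in option D, with automated backups covering the 14-day point-in-time window. A minimal boto3 sketch of the backup plan; the vault and plan names are hypothetical.

```python
import boto3

backup = boto3.client("backup")

# Daily backups kept for 90 days. RDS automated backups (up to 35 days)
# continue to provide the point-in-time restore capability.
backup.create_backup_plan(
    BackupPlan={
        "BackupPlanName": "rds-90-day-retention",  # hypothetical name
        "Rules": [
            {
                "RuleName": "daily-backup",
                "TargetBackupVaultName": "Default",
                "ScheduleExpression": "cron(0 5 * * ? *)",
                "Lifecycle": {"DeleteAfterDays": 90},
            }
        ],
    }
)
```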
Questions 111

A company is planning to migrate a legacy application to AWS. The application currently uses NFS to communicate with an on-premises storage solution to store application data. The application cannot be modified to use any communication protocol other than NFS for this purpose.

Which storage solution should a solutions architect recommend for use after the migration?

Options:

A.  

AWS DataSync

B.  

Amazon Elastic Block Store (Amazon EBS)

C.  

Amazon Elastic File System (Amazon EFS)

D.  

Amazon EMR File System (Amazon EMRFS)

Discussion 0
Questions 112

A company stores user data in AWS. The data is used continuously with peak usage during business hours. Access patterns vary, with some data not being used for months at a time. A solutions architect must choose a cost-effective solution that maintains the highest level of durability while maintaining high availability.

Which storage solution meets these requirements?

Options:

A.  

Amazon S3 Standard

B.  

Amazon S3 Intelligent-Tiering

C.  

Amazon S3 Glacier Deep Archive

D.  

Amazon S3 One Zone-Infrequent Access (S3 One Zone-IA)

Discussion 0
Questions 113

An ecommerce company is launching a new marketing campaign. The company anticipates the campaign to generate ten times the normal number of daily orders through the company's ecommerce application. The campaign will last 3 days.

The ecommerce application architecture is based on Amazon EC2 instances in an Auto Scaling group and an Amazon RDS for MySQL database. The application writes order transactions to an Amazon Elastic File System (Amazon EFS) file system before the application writes orders to the database. During normal operations, the application write operations peak at 5,000 IOPS.

A solutions architect needs to ensure that the application can handle the anticipated workload during the marketing campaign.

Which solution will meet this requirement?

Options:

A.  

For the duration of the campaign, increase the provisioned IOPS for the RDS for MySQL database. Set the Amazon EFS throughput mode to Bursting throughput.

B.  

For the duration of the campaign, increase the provisioned IOPS for the RDS for MySQL database. Set the Amazon EFS throughput mode to Elastic throughput.

C.  

Convert the database to a Multi-AZ deployment. Set the Amazon EFS throughput mode to Elastic throughput for the duration of the campaign.

D.  

Use AWS Database Migration Service (AWS DMS) to convert the database to RDS for PostgreSQL. Set the Amazon EFS throughput mode to Bursting throughput.

Discussion 0
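Note: both changes in option B are one-call modifications. A minimal boto3 sketch, with hypothetical resource identifiers and an illustrative IOPS value:

```python
import boto3

efs = boto3.client("efs")
rds = boto3.client("rds")

# Elastic throughput scales EFS performance with demand instead of
# relying on burst credits during the campaign spike.
efs.update_file_system(
    FileSystemId="fs-0123456789abcdef0",  # hypothetical ID
    ThroughputMode="elastic",
)

# Raise provisioned IOPS on the database for the campaign window;
# revert after the 3 days to control cost.
rds.modify_db_instance(
    DBInstanceIdentifier="orders-mysql",  # hypothetical identifier
    Iops=50000,  # illustrative value sized for 10x the 5,000 IOPS baseline
    ApplyImmediately=True,
)
```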
Questions 114

A company has an employee web portal. Employees log in to the portal to view payroll details. The company is developing a new system to give employees the ability to upload scanned documents for reimbursement. The company runs a program to extract text-based data from the documents and attach the extracted information to each employee's reimbursement IDs for processing.

The employee web portal requires 100% uptime. The document extract program runs infrequently throughout the day on an on-demand basis. The company wants to build a scalable and cost-effective new system that will require minimal changes to the existing web portal. The company does not want to make any code changes.

Which solution will meet these requirements with the LEAST implementation effort?

Options:

A.  

Run Amazon EC2 On-Demand Instances in an Auto Scaling group for the web portal. Use an AWS Lambda function to run the document extract program. Invoke the Lambda function when an employee uploads a new reimbursement document.

B.  

Run Amazon EC2 Spot Instances in an Auto Scaling group for the web portal. Run the document extract program on EC2 Spot Instances. Start document extract program instances when an employee uploads a new reimbursement document.

C.  

Purchase a Savings Plan to run the web portal and the document extract program. Run the web portal and the document extract program in an Auto Scaling group.

D.  

Create an Amazon S3 bucket to host the web portal. Use Amazon API Gateway and an AWS Lambda function for the existing functionalities. Use the Lambda function to run the document extract program. Invoke the Lambda function when the API that is associated with a new document upload is called.

Discussion 0
Questions 115

A company wants to publish a private website for its on-premises employees. The website consists of several HTML pages and image files. The website must be available only through HTTPS and must be available only to on-premises employees. A solutions architect plans to store the website files in an Amazon S3 bucket.

Which solution will meet these requirements?

Options:

A.  

Create an S3 bucket policy to deny access when the source IP address is not the public IP address of the on-premises environment. Set up an Amazon Route 53 alias record to point to the S3 bucket. Provide the alias record to the on-premises employees to grant the employees access to the website.

B.  

Create an S3 access point to provide website access. Attach an access point policy to deny access when the source IP address is not the public IP address of the on-premises environment. Provide the S3 access point alias to the on-premises employees to grant the employees access to the website.

C.  

Create an Amazon CloudFront distribution that includes an origin access control (OAC) that is configured for the S3 bucket. Use AWS Certificate Manager for SSL. Use AWS WAF with an IP set rule that allows access for the on-premises IP address. Set up an Amazon Route 53 alias record to point to the CloudFront distribution.

D.  

Create an Amazon CloudFront distribution that includes an origin access control (OAC) that is configured for the S3 bucket. Create a CloudFront signed URL for the objects in the bucket. Set up an Amazon Route 53 alias record to point to the CloudFront distribution. Provide the signed URL to the on-premises employees to grant the employees access to the website.

Discussion 0
Questions 116

A company runs an application on Amazon EC2 instances behind an Application Load Balancer (ALB). The company wants to create a public API for the application that uses JSON Web Tokens (JWT) for authentication. The company wants the API to integrate directly with the ALB.

Which solution will meet these requirements?

Options:

A.  

Use Amazon API Gateway to create a REST API.

B.  

Use Amazon API Gateway to create an HTTP API.

C.  

Use Amazon API Gateway to create a WebSocket API.

D.  

Use Amazon API Gateway to create a gRPC API.

Discussion 0
Questions 117

A company is deploying an application that processes streaming data in near-real time. The company plans to use Amazon EC2 instances for the workload. The network architecture must be configurable to provide the lowest possible latency between nodes.

Which networking solution meets these requirements?

Options:

A.  

Place the EC2 instances in multiple VPCs, and configure VPC peering.

B.  

Attach an Elastic Fabric Adapter (EFA) to each EC2 instance.

C.  

Run the EC2 instances in a spread placement group.

D.  

Use Amazon Elastic Block Store (Amazon EBS) optimized instance types.

Discussion 0
Questions 118

A company needs a solution to automate email ingestion. The company needs to automatically parse email messages, look for email attachments, and save any attachments to an Amazon S3 bucket in near real time. Email volume varies significantly from day to day.

Which solution will meet these requirements?

Options:

A.  

Set up email receiving in Amazon Simple Email Service (Amazon SES). Create a rule set and a receipt rule. Create an AWS Lambda function that Amazon SES can invoke to process the email bodies and attachments.

B.  

Set up email content filtering in Amazon Simple Email Service (Amazon SES). Create a content filtering rule based on sender, recipient, message body, and attachments.

C.  

Set up email receiving in Amazon Simple Email Service (Amazon SES). Configure Amazon SES and S3 Event Notifications to process the email bodies and attachments.

D.  

Create an AWS Lambda function to process the email bodies and attachments. Use Amazon EventBridge to invoke the Lambda function. Configure an EventBridge rule to listen for incoming emails.

Discussion 0
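Note: option A's receipt rule wiring is shown below as a minimal boto3 sketch. The rule set name, recipient address, and Lambda ARN are hypothetical placeholders.

```python
import boto3

ses = boto3.client("ses")

# Create a rule set and a rule that invokes a Lambda function for each
# inbound message; the function can parse MIME parts and save any
# attachments to S3 in near real time.
ses.create_receipt_rule_set(RuleSetName="inbound-email")
ses.create_receipt_rule(
    RuleSetName="inbound-email",
    Rule={
        "Name": "save-attachments",
        "Enabled": True,
        "Recipients": ["intake@example.com"],  # hypothetical address
        "Actions": [
            {
                "LambdaAction": {
                    "FunctionArn": "arn:aws:lambda:us-east-1:111122223333:function:parse-email",  # hypothetical
                    "InvocationType": "Event",
                }
            }
        ],
    },
)
ses.set_active_receipt_rule_set(RuleSetName="inbound-email")
```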
Questions 119

A company is launching a new application that requires a structured database to store user profiles, application settings, and transactional data. The database must be scalable with application traffic and must offer backups.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.  

Deploy a self-managed database on Amazon EC2 instances by using open-source software. Use Spot Instances for cost optimization. Configure automated backups to Amazon S3.

B.  

Use Amazon RDS. Use on-demand capacity mode for the database with General Purpose SSD storage. Configure automatic backups with a retention period of 7 days.

C.  

Use Amazon Aurora Serverless for the database. Use serverless capacity scaling. Configure automated backups to Amazon S3.

D.  

Deploy a self-managed NoSQL database on Amazon EC2 instances. Use Reserved Instances for cost optimization. Configure automated backups directly to Amazon S3 Glacier Flexible Retrieval.

Discussion 0
Questions 120

A company is developing a microservices-based application to manage the company's delivery operations. The application consists of microservices that process orders, manage a fleet of delivery vehicles, and optimize delivery routes.

The microservices must be able to scale independently and must be able to handle bursts of traffic without any data loss.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Use Amazon API Gateway REST APIs to establish communication between microservices. Deploy the application on Amazon EC2 instances in Auto Scaling groups.

B.  

Use Amazon Simple Queue Service (Amazon SQS) to establish communication between microservices. Deploy the application on Amazon Elastic Container Service (Amazon ECS) containers on AWS Fargate.

C.  

Use WebSocket-based communication between microservices. Deploy the application on Amazon EC2 instances in Auto Scaling groups.

D.  

Use Amazon Simple Notification Service (Amazon SNS) to establish communication between microservices. Deploy the application on Amazon Elastic Container Service (Amazon ECS) containers on Amazon EC2 instances.

Discussion 0
Questions 121

A company has an ordering application that stores customer information in Amazon RDS for MySQL. During regular business hours, employees run one-time queries for reporting purposes. Timeouts are occurring during order processing because the reporting queries are taking a long time to run. The company needs to eliminate the timeouts without preventing employees from performing queries.

Which solution will meet these requirements?

Options:

A.  

Create a read replica. Move reporting queries to the read replica.

B.  

Create a read replica. Distribute the ordering application to the primary DB instance and the read replica.

C.  

Migrate the ordering application to Amazon DynamoDB with on-demand capacity.

D.  

Schedule the reporting queries for non-peak hours.

Discussion 0
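Note: option A's read replica can be created with a single call. A minimal boto3 sketch with hypothetical identifiers:

```python
import boto3

rds = boto3.client("rds")

# The replica absorbs the long-running reporting queries so they no
# longer contend with order processing on the primary instance.
rds.create_db_instance_read_replica(
    DBInstanceIdentifier="orders-db-reporting",  # hypothetical name
    SourceDBInstanceIdentifier="orders-db",      # hypothetical source
    DBInstanceClass="db.r6g.large",
)
```

Employees would then point their reporting tools at the replica's endpoint while the application keeps using the primary.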
Questions 122

A marketing company receives a large amount of new clickstream data in Amazon S3 from a marketing campaign. The company needs to analyze the clickstream data in Amazon S3 quickly. Then the company needs to determine whether to process the data further in the data pipeline.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Create external tables in a Spark catalog. Configure jobs in AWS Glue to query the data.

B.  

Configure an AWS Glue crawler to crawl the data. Configure Amazon Athena to query the data.

C.  

Create external tables in a Hive metastore. Configure Spark jobs in Amazon EMR to query the data.

D.  

Configure an AWS Glue crawler to crawl the data. Configure Amazon Kinesis Data Analytics to use SQL to query the data.

Discussion 0
Questions 123

A solutions architect is designing the cloud architecture for a new stateless application that will be deployed on AWS. The solutions architect created an Amazon Machine Image (AMI) and launch template for the application.

Based on the number of jobs that need to be processed, the processing must run in parallel while adding and removing Amazon EC2 application instances as needed. The application must be loosely coupled. The job items must be durably stored.

Which solution will meet these requirements?

Options:

A.  

Create an Amazon Simple Notification Service (Amazon SNS) topic to send the jobs that need to be processed. Create an Auto Scaling group by using the launch template with the scaling policy set to add and remove EC2 instances based on CPU usage.

B.  

Create an Amazon Simple Queue Service (Amazon SQS) queue to hold the jobs that need to be processed. Create an Auto Scaling group by using the launch template with the scaling policy set to add and remove EC2 instances based on network usage.

C.  

Create an Amazon Simple Queue Service (Amazon SQS) queue to hold the jobs that need to be processed. Create an Auto Scaling group by using the launch template with the scaling policy set to add and remove EC2 instances based on the number of items in the SQS queue.

D.  

Create an Amazon Simple Notification Service (Amazon SNS) topic to send the jobs that need to be processed. Create an Auto Scaling group by using the launch template with the scaling policy set to add and remove EC2 instances based on the number of messages published to the SNS topic.

Discussion 0
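Note: for option C, AWS guidance recommends scaling on a computed backlog-per-instance metric; the simpler sketch below target-tracks the raw visible-message count instead, which illustrates the mechanism. Group and queue names are hypothetical.

```python
import boto3

autoscaling = boto3.client("autoscaling")

# Add workers when the queue backs up and remove them as it drains.
autoscaling.put_scaling_policy(
    AutoScalingGroupName="job-workers",  # hypothetical group name
    PolicyName="queue-depth-target",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "CustomizedMetricSpecification": {
            "Namespace": "AWS/SQS",
            "MetricName": "ApproximateNumberOfMessagesVisible",
            "Dimensions": [{"Name": "QueueName", "Value": "job-queue"}],
            "Statistic": "Average",
        },
        "TargetValue": 100.0,  # illustrative messages-in-flight target
    },
)
```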
Questions 124

A company is launching a new gaming application. The company will use Amazon EC2 Auto Scaling groups to deploy the application. The application stores user data in a relational database.

The company has office locations around the world that need to run analytics on the user data in the database. The company needs a cost-effective database solution that provides cross-Region disaster recovery with low-latency read performance across AWS Regions.

Which solution will meet these requirements?

Options:

A.  

Create an Amazon ElastiCache for Redis cluster in the Region where the application is deployed. Create read replicas in Regions where the company offices are located. Ensure the company offices read from the read replica instances.

B.  

Create Amazon DynamoDB global tables. Deploy the tables to the Regions where the company offices are located and to the Region where the application is deployed. Ensure that each company office reads from the tables that are in the same Region as the office.

C.  

Create an Amazon Aurora global database. Configure the primary cluster to be in the Region where the application is deployed. Configure the secondary Aurora replicas to be in the Regions where the company offices are located. Ensure the company offices read from the Aurora replicas.

D.  

Create an Amazon RDS Multi-AZ DB cluster deployment in the Region where the application is deployed. Ensure the company offices read from read replica instances.

Discussion 0
Questions 125

A startup company is hosting a website for its customers on an Amazon EC2 instance. The website consists of a stateless Python application and a MySQL database. The website serves only a small amount of traffic. The company is concerned about the reliability of the instance and needs to migrate to a highly available architecture. The company cannot modify the application code.

Which combination of actions should a solutions architect take to achieve high availability for the website? (Select TWO.)

Options:

A.  

Provision an internet gateway in each Availability Zone in use.

B.  

Migrate the database to an Amazon RDS for MySQL Multi-AZ DB instance.

C.  

Migrate the database to Amazon DynamoDB, and enable DynamoDB auto scaling.

D.  

Use AWS DataSync to synchronize the database data across multiple EC2 instances.

E.  

Create an Application Load Balancer to distribute traffic to an Auto Scaling group of EC2 instances that are distributed across two Availability Zones.

Discussion 0
Questions 126

A company is migrating a daily Microsoft Windows batch job from the company's on-premises environment to AWS. The current batch job runs for up to 1 hour. The company wants to modernize the batch job process for the cloud environment.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Create a fleet of Amazon EC2 instances in an Auto Scaling group to handle the Windows batch job processing.

B.  

Implement an AWS Lambda function to process the Windows batch job. Use an Amazon EventBridge rule to invoke the Lambda function.

C.  

Use AWS Fargate to deploy the Windows batch job as a container. Use AWS Batch to manage the batch job processing.

D.  

Use Amazon Elastic Kubernetes Service (Amazon EKS) on Amazon EC2 instances to orchestrate Windows containers for the batch job processing.

Discussion 0
Questions 127

A company needs to grant a team of developers access to the company's AWS resources. The company must maintain a high level of security for the resources.

The company requires an access control solution that will prevent unauthorized access to the sensitive data.

Which solution will meet these requirements?

Options:

A.  

Share the IAM user credentials for each development team member with the rest of the team to simplify access management and to streamline development workflows.

B.  

Define IAM roles that have fine-grained permissions based on the principle of least privilege. Assign an IAM role to each developer.

C.  

Create IAM access keys to grant programmatic access to AWS resources. Allow only developers to interact with AWS resources through API calls by using the access keys.

D.  

Create an Amazon Cognito user pool. Grant developers access to AWS resources by using the user pool.

Discussion 0
Questions 128

A global ecommerce company runs its critical workloads on AWS. The workloads use an Amazon RDS for PostgreSQL DB instance that is configured for a Multi-AZ deployment.

Customers have reported application timeouts when the company undergoes database failovers. The company needs a resilient solution to reduce failover time.

Which solution will meet these requirements?

Options:

A.  

Create an Amazon RDS Proxy. Assign the proxy to the DB instance.

B.  

Create a read replica for the DB instance. Move the read traffic to the read replica.

C.  

Enable Performance Insights. Monitor the CPU load to identify the timeouts.

D.  

Take regular automatic snapshots. Copy the automatic snapshots to multiple AWS Regions.

Discussion 0
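Note: option A's proxy sits between the application and the database and keeps a warm connection pool, which typically shortens the window in which clients see timeouts during a failover. A minimal boto3 sketch; every ARN and ID is a hypothetical placeholder.

```python
import boto3

rds = boto3.client("rds")

rds.create_db_proxy(
    DBProxyName="orders-proxy",
    EngineFamily="POSTGRESQL",
    Auth=[
        {
            "AuthScheme": "SECRETS",
            "SecretArn": "arn:aws:secretsmanager:us-east-1:111122223333:secret:db-creds",
            "IAMAuth": "DISABLED",
        }
    ],
    RoleArn="arn:aws:iam::111122223333:role/rds-proxy-role",
    VpcSubnetIds=["subnet-0abc1234", "subnet-0def5678"],
)
```

The application then connects to the proxy endpoint instead of the DB instance endpoint; no other application change is required.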
Questions 129

A weather forecasting company needs to process hundreds of gigabytes of data with sub-millisecond latency. The company has a high performance computing (HPC) environment in its data center and wants to expand its forecasting capabilities.

A solutions architect must identify a highly available cloud storage solution that can handle large amounts of sustained throughput. Files that are stored in the solution should be accessible to thousands of compute instances that will simultaneously access and process the entire dataset.

What should the solutions architect do to meet these requirements?

Options:

A.  

Use Amazon FSx for Lustre scratch file systems.

B.  

Use Amazon FSx for Lustre persistent file systems.

C.  

Use Amazon Elastic File System (Amazon EFS) with Bursting Throughput mode.

D.  

Use Amazon Elastic File System (Amazon EFS) with Provisioned Throughput mode.

Discussion 0
Questions 130

A company is creating a low-latency payment processing application that supports TLS connections from IPv4 clients. The application requires outbound access to the public internet. Users must access the application from a single entry point.

The company wants to use Amazon Elastic Container Service (Amazon ECS) tasks to deploy the application and wants to enable the awsvpc network mode.

Which solution will meet these requirements MOST securely?

Options:

A.  

Create a VPC that has an internet gateway, public subnets, and private subnets. Deploy a Network Load Balancer and a NAT gateway in the public subnets. Deploy the ECS tasks in the private subnets.

B.  

Create a VPC that has an outbound-only internet gateway, public subnets, and private subnets. Deploy an Application Load Balancer and a NAT gateway in the public subnets. Deploy the ECS tasks in the private subnets.

C.  

Create a VPC that has an internet gateway, public subnets, and private subnets. Deploy an Application Load Balancer in the public subnets. Deploy the ECS tasks in the public subnets.

D.  

Create a VPC that has an outbound-only internet gateway, public subnets, and private subnets. Deploy a Network Load Balancer in the public subnets. Deploy the ECS tasks in the public subnets.

Discussion 0
Questions 131

A company runs a monolithic application in its on-premises data center. The company used Java/Tomcat to build the application. The application uses Microsoft SQL Server as a database.

The company wants to migrate the application to AWS.

Which solution will meet this requirement with the LEAST operational overhead?

Options:

A.  

Use AWS App2Container to containerize the application. Deploy the application on Amazon Elastic Kubernetes Service (Amazon EKS). Deploy the database to Amazon RDS for SQL Server. Configure a Multi-AZ deployment.

B.  

Containerize the application and deploy the application on a self-managed Kubernetes cluster on an Amazon EC2 instance. Deploy the database on a separate EC2 instance. Set up Microsoft SQL Server Always On availability groups.

C.  

Deploy the frontend of the web application as a website on Amazon S3. Use Amazon DynamoDB for the database tier.

D.  

Use AWS App2Container to containerize the application. Deploy the application on Amazon Elastic Kubernetes Service (Amazon EKS). Use Amazon DynamoDB for the database tier.

Discussion 0
Questions 132

A company wants to deploy an AWS Lambda function that will read and write objects in an Amazon S3 bucket. The Lambda function must be connected to the company's VPC. The company must deploy the Lambda function only to private subnets in the VPC. The Lambda function must not be allowed to access the internet.

Which solutions will meet these requirements? (Select TWO.)

Options:

A.  

Create a private NAT gateway to access the S3 bucket.

B.  

Attach an Elastic IP address to the NAT gateway.

C.  

Create a gateway VPC endpoint for the S3 bucket.

D.  

Create an interface VPC endpoint for the S3 bucket.

E.  

Create a public NAT gateway to access the S3 bucket.

Discussion 0
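Note: option C's gateway endpoint is created against the VPC's route tables, so traffic to S3 never leaves the AWS network. A minimal boto3 sketch with hypothetical IDs:

```python
import boto3

ec2 = boto3.client("ec2")

# Routes S3-bound traffic from the private subnets through the endpoint,
# letting the VPC-attached Lambda function reach S3 with no internet path.
ec2.create_vpc_endpoint(
    VpcEndpointType="Gateway",
    VpcId="vpc-0123456789abcdef0",             # hypothetical VPC ID
    ServiceName="com.amazonaws.us-east-1.s3",  # match the VPC's Region
    RouteTableIds=["rtb-0123456789abcdef0"],   # hypothetical route table
)
```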
Questions 133

A company is hosting multiple websites for several lines of business under its registered parent domain. Users accessing these websites will be routed to appropriate backend Amazon EC2 instances based on the subdomain. The websites host static webpages, images, and server-side scripts like PHP and JavaScript.

Some of the websites experience peak access during the first two hours of business with constant usage throughout the rest of the day. A solutions architect needs to design a solution that will automatically adjust capacity to these traffic patterns while keeping costs low.

Which combination of AWS services or features will meet these requirements? (Select TWO.)

Options:

A.  

AWS Batch

B.  

Network Load Balancer

C.  

Application Load Balancer

D.  

Amazon EC2 Auto Scaling

E.  

Amazon S3 website hosting

Discussion 0
Questions 134

A company hosts a video streaming web application in a VPC. The company uses a Network Load Balancer (NLB) to handle TCP traffic for real-time data processing. There have been unauthorized attempts to access the application.

The company wants to improve application security with minimal architectural change to prevent unauthorized attempts to access the application.

Which solution will meet these requirements?

Options:

A.  

Implement a series of AWS WAF rules directly on the NLB to filter out unauthorized traffic.

B.  

Recreate the NLB with a security group to allow only trusted IP addresses.

C.  

Deploy a second NLB in parallel with the existing NLB configured with a strict IP address allow list.

D.  

Use AWS Shield Advanced to provide enhanced DDoS protection and prevent unauthorized access attempts.

Discussion 0
Questions 135

A company has an application with a REST-based interface that allows data to be received in near-real time from a third-party vendor. Once received, the application processes and stores the data for further analysis. The application is running on Amazon EC2 instances.

The third-party vendor has received many 503 Service Unavailable errors when sending data to the application. When the data volume spikes, the compute capacity reaches its maximum limit and the application is unable to process all requests.

Which design should a solutions architect recommend to provide a more scalable solution?

Options:

A.  

Use Amazon Kinesis Data Streams to ingest the data. Process the data using AWS Lambda functions.

B.  

Use Amazon API Gateway on top of the existing application. Create a usage plan with a quota limit for the third-party vendor.

C.  

Use Amazon Simple Notification Service (Amazon SNS) to ingest the data. Put the EC2 instances in an Auto Scaling group behind an Application Load Balancer.

D.  

Repackage the application as a container. Deploy the application using Amazon Elastic Container Service (Amazon ECS) using the EC2 launch type with an Auto Scaling group.

Discussion 0
Questions 136

A marketing team wants to build a campaign for an upcoming multi-sport event. The team has news reports from the past five years in PDF format. The team needs a solution to extract insights about the content and the sentiment of the news reports. The solution must use Amazon Textract to process the news reports.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Provide the extracted insights to Amazon Athena for analysis. Store the extracted insights and analysis in an Amazon S3 bucket.

B.  

Store the extracted insights in an Amazon DynamoDB table. Use Amazon SageMaker to build a sentiment model.

C.  

Provide the extracted insights to Amazon Comprehend for analysis. Save the analysis to an Amazon S3 bucket.

D.  

Store the extracted insights in an Amazon S3 bucket. Use Amazon QuickSight to visualize and analyze the data.

Discussion 0
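Note: the Textract-to-Comprehend flow in option C might look like the following sketch. PDFs require Textract's asynchronous API; a production pipeline would use an SNS completion notification rather than this simplified polling loop. Bucket and key names are hypothetical.

```python
import time

import boto3

textract = boto3.client("textract")
comprehend = boto3.client("comprehend")

# Start asynchronous text detection on a PDF stored in S3.
job = textract.start_document_text_detection(
    DocumentLocation={
        "S3Object": {"Bucket": "news-reports", "Name": "report.pdf"}
    }
)

# Simplified polling; production code should subscribe to an SNS topic.
while True:
    result = textract.get_document_text_detection(JobId=job["JobId"])
    if result["JobStatus"] != "IN_PROGRESS":
        break
    time.sleep(5)

text = " ".join(
    block["Text"]
    for block in result.get("Blocks", [])
    if block["BlockType"] == "LINE"
)

# DetectSentiment accepts up to 5,000 bytes per call, so long reports
# would be chunked; this sketch truncates for brevity.
sentiment = comprehend.detect_sentiment(Text=text[:4500], LanguageCode="en")
print(sentiment["Sentiment"])
```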
Questions 137

A digital image processing company wants to migrate its on-premises monolithic application to the AWS Cloud. The company processes thousands of images and generates large files as part of the processing workflow.

The company needs a solution to manage the growing number of image processing jobs. The solution must also reduce the manual tasks in the image processing workflow. The company does not want to manage the underlying infrastructure of the solution.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Use Amazon Elastic Container Service (Amazon ECS) with Amazon EC2 Spot Instances to process the images. Configure Amazon Simple Queue Service (Amazon SQS) to orchestrate the workflow. Store the processed files in Amazon Elastic File System (Amazon EFS).

B.  

Use AWS Batch jobs to process the images. Use AWS Step Functions to orchestrate the workflow. Store the processed files in an Amazon S3 bucket.

C.  

Use AWS Lambda functions and Amazon EC2 Spot Instances to process the images. Store the processed files in Amazon FSx.

D.  

Deploy a group of Amazon EC2 instances to process the images. Use AWS Step Functions to orchestrate the workflow. Store the processed files in an Amazon Elastic Block Store (Amazon EBS) volume.

Discussion 0
Questions 138

A company recently launched a new application for its customers. The application runs on multiple Amazon EC2 instances across two Availability Zones. End users use TCP to communicate with the application.

The application must be highly available and must automatically scale as the number of users increases.

Which combination of steps will meet these requirements MOST cost-effectively? (Select TWO.)

Options:

A.  

Add a Network Load Balancer in front of the EC2 instances.

B.  

Configure an Auto Scaling group for the EC2 instances.

C.  

Add an Application Load Balancer in front of the EC2 instances.

D.  

Manually add more EC2 instances for the application.

E.  

Add a Gateway Load Balancer in front of the EC2 instances.

Discussion 0
Questions 139

A solutions architect needs to secure an Amazon API Gateway REST API. Users need to be able to log in to the API by using common external social identity providers (IdPs). The social IdPs must use standard authentication protocols such as SAML or OpenID Connect (OIDC). The solutions architect needs to protect the API against attempts to exploit application vulnerabilities.

Which combination of steps will meet these security requirements? (Select TWO.)

Options:

A.  

Create an AWS WAF web ACL that is associated with the REST API. Add the appropriate managed rules to the ACL.

B.  

Subscribe to AWS Shield Advanced. Enable DDoS protection. Associate Shield Advanced with the REST API.

C.  

Create an Amazon Cognito user pool with a federation to the social IdPs. Integrate the user pool with the REST API.

D.  

Create an API key in API Gateway. Associate the API key with the REST API.

E.  

Create an IP address filter in AWS WAF that allows only the social IdPs. Associate the filter with the web ACL and the API.

Discussion 0
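Note: option A's web ACL can attach AWS managed rules that cover common application exploits, while option C handles the social-IdP login through a Cognito user pool. A minimal boto3 sketch of the WAF half; the ACL name is hypothetical, and Scope is REGIONAL because the ACL attaches to an API Gateway stage.

```python
import boto3

wafv2 = boto3.client("wafv2")

wafv2.create_web_acl(
    Name="rest-api-protection",  # hypothetical name
    Scope="REGIONAL",
    DefaultAction={"Allow": {}},
    Rules=[
        {
            "Name": "common-rule-set",
            "Priority": 0,
            # AWS managed rules that block common exploit patterns
            # such as SQL injection and cross-site scripting.
            "Statement": {
                "ManagedRuleGroupStatement": {
                    "VendorName": "AWS",
                    "Name": "AWSManagedRulesCommonRuleSet",
                }
            },
            "OverrideAction": {"None": {}},
            "VisibilityConfig": {
                "SampledRequestsEnabled": True,
                "CloudWatchMetricsEnabled": True,
                "MetricName": "common-rule-set",
            },
        }
    ],
    VisibilityConfig={
        "SampledRequestsEnabled": True,
        "CloudWatchMetricsEnabled": True,
        "MetricName": "rest-api-protection",
    },
)
```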
Questions 140

A social media company wants to store its database of user profiles, relationships, and interactions in the AWS Cloud. The company needs an application to monitor any changes in the database. The application needs to analyze the relationships between the data entities and to provide recommendations to users.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Use Amazon Neptune to store the information. Use Amazon Kinesis Data Streams to process changes in the database.

B.  

Use Amazon Neptune to store the information. Use Neptune Streams to process changes in the database.

C.  

Use Amazon Quantum Ledger Database (Amazon QLDB) to store the information. Use Amazon Kinesis Data Streams to process changes in the database.

D.  

Use Amazon Quantum Ledger Database (Amazon QLDB) to store the information. Use Neptune Streams to process changes in the database.

Discussion 0