
AWS Certified Developer - Associate Questions and Answers

AWS Certified Developer - Associate

Last Update Apr 26, 2024
Total Questions : 197

We offer free DVA-C02 Amazon Web Services exam questions. Simply sign up with your details, practice with the free DVA-C02 exam questions, and then move on to the complete pool of AWS Certified Developer - Associate test questions for further preparation.

Questions 1

A mobile app stores blog posts in an Amazon DynamoDB table. Millions of posts are added every day, and each post represents a single item in the table. The mobile app requires only recent posts. Any post that is older than 48 hours can be removed.

What is the MOST cost-effective way to delete posts that are older than 48 hours?

Options:

A.  

For each item, add a new attribute of type String that contains a timestamp set to the blog post creation time. Create a script to find old posts with a table scan and remove posts that are older than 48 hours by using the BatchWriteItem API operation. Schedule a cron job on an Amazon EC2 instance once an hour to start the script.

B.  

For each item, add a new attribute of type String that contains a timestamp set to the blog post creation time. Create a script to find old posts with a table scan and remove posts that are older than 48 hours by using the BatchWriteItem API operation. Place the script in a container image. Schedule an Amazon Elastic Container Service (Amazon ECS) task on AWS Fargate that invokes the container every 5 minutes.

C.  

For each item, add a new attribute of type Date that contains a timestamp set to 48 hours after the blog post creation time. Create a global secondary index (GSI) that uses the new attribute as a sort key. Create an AWS Lambda function that references the GSI and removes expired items by using the BatchWriteItem API operation. Schedule the function with an Amazon CloudWatch event every minute.

D.  

For each item, add a new attribute of type Number that contains a timestamp set to 48 hours after the blog post creation time. Configure the DynamoDB table with a TTL that references the new attribute.
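
For reference, the TTL mechanism that option D describes takes only a short boto3 call to enable. This is a minimal sketch, assuming a table named blog-posts and a Number attribute named expires_at (both placeholder names):

import time
import boto3

dynamodb = boto3.client("dynamodb")

# Turn on TTL for the (placeholder) blog-posts table, keyed on the expires_at attribute.
dynamodb.update_time_to_live(
    TableName="blog-posts",
    TimeToLiveSpecification={"Enabled": True, "AttributeName": "expires_at"},
)

# When writing a post, store an epoch-seconds timestamp 48 hours in the future.
# DynamoDB deletes the item shortly after this time passes, at no extra write cost.
expires_at = int(time.time()) + 48 * 60 * 60
dynamodb.put_item(
    TableName="blog-posts",
    Item={
        "post_id": {"S": "post-123"},
        "body": {"S": "Hello, world"},
        "expires_at": {"N": str(expires_at)},
    },
)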

Discussion 0
Questions 2

A developer is working on a Python application that runs on Amazon EC2 instances. The developer wants to enable tracing of application requests to debug performance issues in the code.

Which combination of actions should the developer take to achieve this goal? (Select TWO)

Options:

A.  

Install the Amazon CloudWatch agent on the EC2 instances.

B.  

Install the AWS X-Ray daemon on the EC2 instances.

C.  

Configure the application to write JSON-formatted logs to /var/log/cloudwatch.

D.  

Configure the application to write trace data to /var/log/xray.

E.  

Install and configure the AWS X-Ray SDK for Python in the application.
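
For reference, wiring the AWS X-Ray SDK for Python into application code (option E) so it can report through a locally running X-Ray daemon (option B) typically looks like the minimal sketch below. The service name, table name, and helper function are placeholders, and the third-party aws-xray-sdk package must be installed:

import boto3
from aws_xray_sdk.core import xray_recorder, patch_all

xray_recorder.configure(service="ec2-python-app")  # placeholder service name
patch_all()  # instruments supported libraries such as boto3 so downstream calls are traced

@xray_recorder.capture("load_orders")  # records a subsegment for this function
def load_orders():
    table = boto3.resource("dynamodb").Table("orders")  # placeholder table name
    return table.scan(Limit=10)["Items"]

The SDK emits segment data over UDP to the local X-Ray daemon, which batches and relays it to the X-Ray service.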

Discussion 0
Questions 3

A company is running a custom application on a set of on-premises Linux servers that are accessed using Amazon API Gateway. AWS X-Ray tracing has been enabled on the API test stage.

How can a developer enable X-Ray tracing on the on-premises servers with the LEAST amount of configuration?

Options:

A.  

Install and run the X-Ray SDK on the on-premises servers to capture and relay the data to the X-Ray service.

B.  

Install and run the X-Ray daemon on the on-premises servers to capture and relay the data to the X-Ray service.

C.  

Capture incoming requests on-premises and configure an AWS Lambda function to pull, process, and relay relevant data to X-Ray using the PutTraceSegments API call.

D.  

Capture incoming requests on-premises and configure an AWS Lambda function to pull, process, and relay relevant data to X-Ray using the PutTelemetryRecords API call.

Discussion 0
Questions 4

A developer is designing an AWS Lambda function that creates temporary files that are less than 10 MB during invocation. The temporary files will be accessed and modified multiple times during invocation. The developer has no need to save or retrieve these files in the future.

Where should the temporary files be stored?

Options:

A.  

the /tmp directory

B.  

Amazon Elastic File System (Amazon EFS)

C.  

Amazon Elastic Block Store (Amazon EBS)

D.  

Amazon S3
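
As a quick illustration of option A, a Lambda handler can freely create and rework small scratch files under /tmp during a single invocation. This is a hedged sketch that uses a placeholder file name:

import os

TMP_PATH = "/tmp/scratch.bin"  # /tmp is the only writable local directory in Lambda

def handler(event, context):
    # Create roughly 1 MB of scratch data, then append to it in place.
    with open(TMP_PATH, "wb") as f:
        f.write(os.urandom(1024 * 1024))

    with open(TMP_PATH, "ab") as f:
        f.write(b"additional scratch data")

    return {"tmp_file_bytes": os.path.getsize(TMP_PATH)}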

Discussion 0
Questions 5

A developer has code that is stored in an Amazon S3 bucket. The code must be deployed as an AWS Lambda function across multiple accounts in the same AWS Region as the S3 bucket. An AWS CloudFormation template that runs in each account will deploy the Lambda function.

What is the MOST secure way to allow CloudFormation to access the Lambda code in the S3 bucket?

Options:

A.  

Grant the CloudFormation service role the S3 ListBucket and GetObject permissions. Add a bucket policy to Amazon S3 with a principal of "AWS": (account numbers).

B.  

Grant the CloudFormation service role the S3 GetObject permission. Add a bucket policy to Amazon S3 with a principal of "*".

C.  

Use a service-based link to grant the Lambda function the S3 ListBucket and GetObject permissions by explicitly adding the S3 bucket's account number in the resource.

D.  

Use a service-based link to grant the Lambda function the S3 GetObject permission. Add a resource of "*" to allow access to the S3 bucket.

Discussion 0
Questions 6

A developer is designing a serverless application for a game in which users register and log in through a web browser. The application makes requests on behalf of users to a set of AWS Lambda functions that run behind an Amazon API Gateway HTTP API.

The developer needs to implement a solution to register and log in users on the application's sign-in page. The solution must minimize operational overhead and must minimize ongoing management of user identities.

Which solution will meet these requirements?

Options:

A.  

Create Amazon Cognito user pools for external social identity providers. Configure IAM roles for the identity pools.

B.  

Program the sign-in page to create users' IAM groups with the IAM roles attached to the groups.

C.  

Create an Amazon RDS for SQL Server DB instance to store the users and manage the permissions to the backend resources in AWS

D.  

Configure the sign-in page to register and store the users and their passwords in an Amazon DynamoDB table with an attached IAM policy.

Discussion 0
Questions 7

A developer is creating an Amazon DynamoDB table by using the AWS CLI. The DynamoDB table must use server-side encryption with an AWS owned encryption key.

How should the developer create the DynamoDB table to meet these requirements?

Options:

A.  

Create an AWS Key Management Service (AWS KMS) customer managed key. Provide the key's Amazon Resource Name (ARN) in the KMSMasterKeyId parameter during creation of the DynamoDB table.

B.  

Create an AWS Key Management Service (AWS KMS) AWS managed key. Provide the key's Amazon Resource Name (ARN) in the KMSMasterKeyId parameter during creation of the DynamoDB table.

C.  

Create an AWS owned key. Provide the key's Amazon Resource Name (ARN) in the KMSMasterKeyId parameter during creation of the DynamoDB table.

D.  

Create the DynamoDB table with the default encryption options

Discussion 0
Questions 8

A developer is creating an AWS Lambda function. The Lambda function needs an external library to connect to a third-party solution. The external library is a collection of files with a total size of 100 MB. The developer needs to make the external library available to the Lambda execution environment and reduce the Lambda package size.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Create a Lambda layer to store the external library. Configure the Lambda function to use the layer.

B.  

Create an Amazon S3 bucket. Upload the external library into the S3 bucket. Mount the S3 bucket folder in the Lambda function. Import the library by using the proper folder in the mount point.

C.  

Load the external library to the Lambda function's /tmp directory during deployment of the Lambda package. Import the library from the /tmp directory.

D.  

Create an Amazon Elastic File System (Amazon EFS) volume. Upload the external library to the EFS volume. Mount the EFS volume in the Lambda function. Import the library by using the proper folder in the mount point.
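
For reference, option A's layer approach can be scripted with boto3 roughly as follows. The layer name, bucket, key, and function name are placeholders, and the library is assumed to already be packaged as a .zip file in Amazon S3:

import boto3

lambda_client = boto3.client("lambda")

# Publish the packaged external library as a layer version.
layer = lambda_client.publish_layer_version(
    LayerName="third-party-connector",                                              # placeholder
    Content={"S3Bucket": "my-artifacts-bucket", "S3Key": "layers/connector.zip"},   # placeholders
    CompatibleRuntimes=["python3.12"],
)

# Attach the layer to the function so its contents are extracted under /opt at runtime.
lambda_client.update_function_configuration(
    FunctionName="connector-function",   # placeholder
    Layers=[layer["LayerVersionArn"]],
)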

Discussion 0
Questions 9

A developer is using an AWS Lambda function to generate avatars for profile pictures that are uploaded to an Amazon S3 bucket. The Lambda function is automatically invoked for profile pictures that are saved under the /original/ S3 prefix. The developer notices that some pictures cause the Lambda function to time out. The developer wants to implement a fallback mechanism by using another Lambda function that resizes the profile picture.

Which solution will meet these requirements with the LEAST development effort?

Options:

A.  

Set the image resize Lambda function as a destination of the avatar generator Lambda function for the events that fail processing.

B.  

Create an Amazon Simple Queue Service (Amazon SQS) queue. Set the SQS queue as a destination with an on failure condition for the avatar generator Lambda function. Configure the image resize Lambda function to poll from the SQS queue.

C.  

Create an AWS Step Functions state machine that invokes the avatar generator Lambda function and uses the image resize Lambda function as a fallback. Create an Amazon EventBridge rule that matches events from the S3 bucket to invoke the state machine.

D.  

Create an Amazon Simple Notification Service (Amazon SNS) topic. Set the SNS topic as a destination with an on failure condition for the avatar generator Lambda function. Subscribe the image resize Lambda function to the SNS topic.
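
For reference, the Lambda on-failure destination mechanism that options A, B, and D rely on is configured per function. This is a hedged boto3 sketch with placeholder names and ARNs:

import boto3

lambda_client = boto3.client("lambda")

# Send events that fail asynchronous processing (after retries) to another target,
# such as a second Lambda function, an SQS queue, or an SNS topic.
lambda_client.put_function_event_invoke_config(
    FunctionName="avatar-generator",   # placeholder function name
    MaximumRetryAttempts=2,
    DestinationConfig={
        "OnFailure": {
            "Destination": "arn:aws:lambda:us-east-1:111122223333:function:image-resize"  # placeholder ARN
        }
    },
)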

Discussion 0
Questions 10

An application that runs on AWS Lambda requires access to specific highly confidential objects in an Amazon S3 bucket. In accordance with the principle of least privilege, a company grants access to the S3 bucket by using only temporary credentials.

How can a developer configure access to the S3 bucket in the MOST secure way?

Options:

A.  

Hardcode the credentials that are required to access the S3 objects in the application code. Use the credentials to access the required S3 objects.

B.  

Create a secret access key and access key ID with permission to access the S3 bucket. Store the key and key ID in AWS Secrets Manager. Configure the application to retrieve the Secrets Manager secret and use the credentials to access the S3 objects.

C.  

Create a Lambda function execution role. Attach a policy to the role that grants access to specific objects in the S3 bucket.

D.  

Create a secret access key and access key ID with permission to access the S3 bucket. Store the key and key ID as environment variables in Lambda. Use the environment variables to access the required S3 objects.

Discussion 0
Questions 11

A company hosts its application on AWS. The application runs on an Amazon Elastic Container Service (Amazon ECS) cluster that uses AWS Fargate. The cluster runs behind an Application Load Balancer. The application stores data in an Amazon Aurora database. A developer encrypts and manages database credentials inside the application.

The company wants to use a more secure credential storage method and implement periodic credential rotation.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Migrate the secret credentials to Amazon RDS parameter groups. Encrypt the parameter by using an AWS Key Management Service (AWS KMS) key. Turn on secret rotation. Use IAM policies and roles to grant AWS KMS permissions to access Amazon RDS.

B.  

Migrate the credentials to AWS Systems Manager Parameter Store. Encrypt the parameter by using an AWS Key Management Service (AWS KMS) key. Turn on secret rotation. Use IAM policies and roles to grant Amazon ECS Fargate permissions to access AWS Secrets Manager.

C.  

Migrate the credentials to ECS Fargate environment variables. Encrypt the credentials by using an AWS Key Management Service (AWS KMS) key. Turn on secret rotation. Use IAM policies and roles to grant Amazon ECS Fargate permissions to access AWS Secrets Manager.

D.  

Migrate the credentials to AWS Secrets Manager. Encrypt the credentials by using an AWS Key Management Service (AWS KMS) key. Turn on secret rotation. Use IAM policies and roles to grant Amazon ECS Fargate permissions to access AWS Secrets Manager by using keys.
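
For reference, once credentials live in AWS Secrets Manager, application code typically fetches them at startup as in this minimal sketch. The secret name and JSON field names are placeholders, and the task role needs secretsmanager:GetSecretValue on the secret:

import json
import boto3

secrets = boto3.client("secretsmanager")

def get_db_credentials():
    # Retrieve and parse the secret; rotation changes the stored value, not this code.
    response = secrets.get_secret_value(SecretId="prod/app/aurora")  # placeholder secret name
    secret = json.loads(response["SecretString"])
    return secret["username"], secret["password"]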

Discussion 0
Questions 12

A company built an online event platform. For each event, the company organizes quizzes and generates leaderboards that are based on the quiz scores. The company stores the leaderboard data in Amazon DynamoDB and retains the data for 30 days after an event is complete. The company then uses a scheduled job to delete the old leaderboard data.

The DynamoDB table is configured with a fixed write capacity. During the months when many events occur, the DynamoDB write API requests are throttled when the scheduled delete job runs.

A developer must create a long-term solution that deletes the old leaderboard data and optimizes write throughput

Which solution meets these requirements?

Options:

A.  

Configure a TTL attribute for the leaderboard data

B.  

Use DynamoDB Streams to schedule and delete the leaderboard data

C.  

Use AWS Step Functions to schedule and delete the leaderboard data.

D.  

Set a higher write capacity when the scheduled delete job runs

Discussion 0
Questions 13

A company has an application that is hosted on Amazon EC2 instances. The application stores objects in an Amazon S3 bucket and allows users to download objects from the S3 bucket. A developer turns on S3 Block Public Access for the S3 bucket. After this change, users report errors when they attempt to download objects. The developer needs to implement a solution so that only users who are signed in to the application can access objects in the S3 bucket.

Which combination of steps will meet these requirements in the MOST secure way? (Select TWO.)

Options:

A.  

Create an EC2 instance profile and role with an appropriate policy. Associate the role with the EC2 instances.

B.  

Create an IAM user with an appropriate policy. Store the access key ID and secret access key on the EC2 instances.

C.  

Modify the application to use the S3 GeneratePresignedUrl API call

D.  

Modify the application to use the S3 GetObject API call and to return the object handle to the user

E.  

Modify the application to delegate requests to the S3 bucket.
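
For reference, the presigned URL mechanism mentioned in option C looks like this with boto3. The bucket and key are placeholders, and the URL is signed with the caller's credentials (for example, an EC2 instance role):

import boto3

s3 = boto3.client("s3")

def presigned_download_url(bucket: str, key: str) -> str:
    # Returns a time-limited URL that lets a signed-in user download one object
    # without the bucket being public.
    return s3.generate_presigned_url(
        ClientMethod="get_object",
        Params={"Bucket": bucket, "Key": key},
        ExpiresIn=300,  # five minutes
    )

# Example with placeholder names:
# url = presigned_download_url("app-downloads-bucket", "reports/2024-04.pdf")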

Discussion 0
Questions 14

A developer must analyze performance issues with production distributed applications that are written as AWS Lambda functions. These distributed Lambda applications invoke other components that make up the applications. How should the developer identify and troubleshoot the root cause of the performance issues in production?

Options:

A.  

Add logging statements to the Lambda functions, then use Amazon CloudWatch to view the logs.

B.  

Use AWS CloudTrail and then examine the logs.

C.  

Use AWS X-Ray, then examine the segments and errors.

D.  

Run Amazon Inspector agents and then analyze performance.

Discussion 0
Questions 15

A company has an existing application that has hardcoded database credentials. A developer needs to modify the existing application. The application is deployed in two AWS Regions with an active-passive failover configuration to meet the company's disaster recovery strategy.

The developer needs a solution to store the credentials outside the code. The solution must comply with the company's disaster recovery strategy.

Which solution will meet these requirements in the MOST secure way?

Options:

A.  

Store the credentials in AWS Secrets Manager in the primary Region. Enable secret replication to the secondary Region. Update the application to use the Amazon Resource Name (ARN) based on the Region.

B.  

Store credentials in AWS Systems Manager Parameter Store in the primary Region. Enable parameter replication to the secondary Region. Update the application to use the Amazon Resource Name (ARN) based on the Region.

C.  

Store credentials in a config file. Upload the config file to an S3 bucket in the primary Region. Enable Cross-Region Replication (CRR) to an S3 bucket in the secondary Region. Update the application to access the config file from the S3 bucket based on the Region.

D.  

Store credentials in a config file. Upload the config file to an Amazon Elastic File System (Amazon EFS) file system. Update the application to use the Amazon EFS file system Regional endpoints to access the config file in the primary and secondary Regions.

Discussion 0
Questions 16

An application that is hosted on an Amazon EC2 instance needs access to files that are stored in an Amazon S3 bucket. The application lists the objects that are stored in the S3 bucket and displays a table to the user. During testing, a developer discovers that the application does not show any objects in the list.

What is the MOST secure way to resolve this issue?

Options:

A.  

Update the IAM instance profile that is attached to the EC2 instance to include the S3:* permission for the S3 bucket.

B.  

Update the IAM instance profile that is attached to the EC2 instance to include the S3:ListBucket permission for the S3 bucket.

C.  

Update the developer's user permissions to include the S3:ListBucket permission for the S3 bucket.

D.  

Update the S3 bucket policy by including the S3:ListBucket permission and by setting the Principal element to specify the account number of the EC2 instance.

Discussion 0
Questions 17

A developer is working on an ecommerce platform that communicates with several third-party payment processing APIs. The third-party payment services do not provide a test environment.

The developer needs to validate the ecommerce platform's integration with the third-party payment processing APIs. The developer must test the API integration code without invoking the third-party payment processing APIs.

Which solution will meet these requirements?

Options:

A.  

Set up an Amazon API Gateway REST API with a gateway response configured for status code 200. Add response templates that contain sample responses captured from the real third-party API.

B.  

Set up an AWS AppSync GraphQL API with a data source configured for each third-party API. Specify an integration type of Mock. Configure integration responses by using sample responses captured from the real third-party API.

C.  

Create an AWS Lambda function for each third-party API. Embed responses captured from the real third-party API. Configure Amazon Route 53 Resolver with an inbound endpoint for each Lambda function's Amazon Resource Name (ARN).

D.  

Set up an Amazon API Gateway REST API for each third-party API. Specify an integration request type of Mock. Configure integration responses by using sample responses captured from the real third-party API.

Discussion 0
Questions 18

A company is migrating its PostgreSQL database into the AWS Cloud. The company wants to use a database that will secure and regularly rotate database credentials. The company wants a solution that does not require additional programming overhead.

Which solution will meet these requirements?

Options:

A.  

Use Amazon Aurora PostgreSQL for the database. Store the database credentials in AWS Systems Manager Parameter Store. Turn on rotation.

B.  

Use Amazon Aurora PostgreSQL for the database. Store the database credentials in AWS Secrets Manager. Turn on rotation.

C.  

Use Amazon DynamoDB for the database. Store the database credentials in AWS Systems Manager Parameter Store. Turn on rotation.

D.  

Use Amazon DynamoDB for the database. Store the database credentials in AWS Secrets Manager. Turn on rotation.

Discussion 0
Questions 19

A company is planning to use AWS CodeDeploy to deploy an application to Amazon Elastic Container Service (Amazon ECS). During the deployment of a new version of the application, the company initially must expose only 10% of live traffic to the new version of the deployed application. Then, after 15 minutes elapse, the company must route all the remaining live traffic to the new version of the deployed application.

Which CodeDeploy predefined configuration will meet these requirements?

Options:

A.  

CodeDeployDefault.ECSCanary10Percent15Minutes

B.  

CodeDeployDefault.LambdaCanary10Percent5Minutes

C.  

CodeDeployDefault.LambdaCanary10Percent15Minutes

D.  

CodeDeployDefault.ECSLinear10PercentEvery1Minutes

Discussion 0
Questions 20

A developer is configuring an application's deployment environment in AWS CodePipeline. The application code is stored in a GitHub repository. The developer wants to ensure that the repository package's unit tests run in the new deployment environment. The developer has already set the pipeline's source provider to GitHub and has specified the repository and branch to use in the deployment.

Which combination of steps should the developer take next to meet these requirements with the LEAST overhead? (Select TWO.)

Options:

A.  

Create an AWS CodeCommit project. Add the repository package's build and test commands to the project's buildspec.

B.  

Create an AWS CodeBuild project. Add the repository package's build and test commands to the project's buildspec.

C.  

Create an AWS CodeDeploy project. Add the repository package's build and test commands to the project's buildspec.

D.  

Add an action to the source stage. Specify the newly created project as the action provider. Specify the build artifact as the action's input artifact.

E.  

Add a new stage to the pipeline after the source stage. Add an action to the new stage. Specify the newly created project as the action provider. Specify the source artifact as the action's input artifact.

Discussion 0
Questions 21

A developer is building a serverless application by using AWS Serverless Application Model (AWS SAM) on multiple AWS Lambda functions. When the application is deployed, the developer wants to shift 10% of the traffic to the new deployment of the application for the first 10 minutes after deployment. If there are no issues, all traffic must switch over to the new version.

Which change to the AWS SAM template will meet these requirements?

Options:

A.  

Set the Deployment Preference Type to Canary10Percent10Minutes. Set the AutoPublishAlias property to the Lambda alias.

B.  

Set the Deployment Preference Type to Linear10PercentEvery10Minutes. Set the AutoPublishAlias property to the Lambda alias.

C.  

Set the Deployment Preference Type to Canary10Percent10Minutes. Set the PreTraffic and PostTraffic properties to the Lambda alias.

D.  

Set the Deployment Preference Type to Linear10PercentEvery10Minutes. Set the PreTraffic and PostTraffic properties to the Lambda alias.

Discussion 0
Questions 22

A developer is migrating an application to Amazon Elastic Kubernetes Service (Amazon EKS). The developer migrates the application to Amazon Elastic Container Registry (Amazon ECR) with an EKS cluster.

As part of the application migration to a new backend, the developer creates a new AWS account. The developer makes configuration changes to the application to point the application to the new AWS account and to use new backend resources. The developer successfully tests the changes within the application by deploying the pipeline.

The Docker image build and the pipeline deployment are successful, but the application is still connecting to the old backend. The developer finds that the application's configuration is still referencing the original EKS cluster and not referencing the new backend resources.

Which reason can explain why the application is not connecting to the new resources?

Options:

A.  

The developer did not successfully create the new AWS account.

B.  

The developer added a new tag to the Docker image.

C.  

The developer did not update the Docker image tag to a new version.

D.  

The developer pushed the changes to a new Docker image tag.

Discussion 0
Questions 23

A company has an Amazon S3 bucket containing premier content that it intends to make available to only paid subscribers of its website. The S3 bucket currently has default permissions of all objects being private to prevent inadvertent exposure of the premier content to non-paying website visitors.

How can the company limit the ability to download a premier content file in the S3 bucket to paid subscribers only?

Options:

A.  

Apply a bucket policy that allows anonymous users to download the content from the S3 bucket.

B.  

Generate a pre-signed object URL for the premier content file when a paid subscriber requests a download.

C.  

Add a bucket policy that requires multi-factor authentication for requests to access the S3 bucket objects.

D.  

Enable server-side encryption on the S3 bucket for data protection against the non-paying website visitors.

Discussion 0
Questions 24

A developer is preparing to begin development of a new version of an application. The previous version of the application is deployed in a production environment. The developer needs to deploy fixes and updates to the current version during the development of the new version of the application. The code for the new version of the application is stored in AWS CodeCommit.

Which solution will meet these requirements?

Options:

A.  

From the main branch, create a feature branch for production bug fixes. Create a second feature branch from the main branch for development of the new version.

B.  

Create a Git tag of the code that is currently deployed in production. Create a Git tag for the development of the new version. Push the two tags to the CodeCommit repository.

C.  

From the main branch, create a branch of the code that is currently deployed in production. Apply an IAM policy that ensures no other users can push or merge to the branch.

D.  

Create a new CodeCommit repository for development of the new version of the application. Create a Git tag for the development of the new version.

Discussion 0
Questions 25

A company wants to share information with a third party. The third party has an HTTP API endpoint that the company can use to share the information. The company has the required API key to access the HTTP API.

The company needs a way to manage the API key by using code. The integration of the API key with the application code cannot affect application performance.

Which solution will meet these requirements MOST securely?

Options:

A.  

Store the API credentials in AWS Secrets Manager. Retrieve the API credentials at runtime by using the AWS SDK. Use the credentials to make the API call.

B.  

Store the API credentials in a local code variable. Push the code to a secure Git repository. Use the local code variable at runtime to make the API call.

C.  

Store the API credentials as an object in a private Amazon S3 bucket. Restrict access to the S3 object by using IAM policies. Retrieve the API credentials at runtime by using the AWS SDK. Use the credentials to make the API call.

D.  

Store the API credentials in an Amazon DynamoDB table. Restrict access to the table by using resource-based policies. Retrieve the API credentials at runtime by using the AWS SDK. Use the credentials to make the API call.

Discussion 0
Questions 26

A company is developing an ecommerce application that uses Amazon API Gateway APIs. The application uses AWS Lambda as a backend. The company needs to test the code in a dedicated, monitored test environment before the company releases the code to the production environment.

Which solution will meet these requirements?

Options:

A.  

Use a single stage in API Gateway. Create a Lambda function for each environment. Configure API clients to send a query parameter that indicates the environment and the specific Lambda function.

B.  

Use multiple stages in API Gateway. Create a single Lambda function for all environments. Add different code blocks for different environments in the Lambda function based on Lambda environment variables.

C.  

Use multiple stages in API Gateway. Create a Lambda function for each environment. Configure API Gateway stage variables to route traffic to the Lambda function in different environments.

D.  

Use a single stage in API Gateway. Configure an API client to send a query parameter that indicates the environment. Add different code blocks for different environments in the Lambda function to match the value of the query parameter.

Discussion 0
Questions 27

A company has an application that stores data in Amazon RDS instances. The application periodically experiences surges of high traffic that cause performance problems.

During periods of peak traffic, a developer notices a reduction in query speed in all database queries.

The team's technical lead determines that a multi-threaded and scalable caching solution should be used to offload the heavy read traffic. The solution needs to improve performance.

Which solution will meet these requirements with the LEAST complexity?

Options:

A.  

Use Amazon ElastiCache for Memcached to offload read requests from the main database.

B.  

Replicate the data to Amazon DynamoDB. Set up a DynamoDB Accelerator (DAX) cluster.

C.  

Configure the Amazon RDS instances to use Multi-AZ deployment with one standby instance. Offload read requests from the main database to the standby instance.

D.  

Use Amazon ElastiCache for Redis to offload read requests from the main database.

Discussion 0
Questions 28

A developer wants to add request validation to a production environment Amazon API Gateway API. The developer needs to test the changes before the API is deployed to the production environment. For the test, the developer will send test requests to the API through a testing tool.

Which solution will meet these requirements with the LEAST operational overhead?

Options:

A.  

Export the existing API to an OpenAPI file. Create a new API. Import the OpenAPI file. Modify the new API to add request validation. Perform the tests. Modify the existing API to add request validation. Deploy the existing API to production.

B.  

Modify the existing API to add request validation. Deploy the updated API to a new API Gateway stage. Perform the tests. Deploy the updated API to the API Gateway production stage.

C.  

Create a new API. Add the necessary resources and methods, including new request validation. Perform the tests. Modify the existing API to add request validation. Deploy the existing API to production.

D.  

Clone the existing API. Modify the new API to add request validation. Perform the tests. Modify the existing API to add request validation. Deploy the existing API to production.

Discussion 0
Questions 29

A company is running Amazon EC2 instances in multiple AWS accounts. A developer needs to implement an application that collects all the lifecycle events of the EC2 instances. The application needs to store the lifecycle events in a single Amazon Simple Queue Service (Amazon SQS) queue in the company's main AWS account for further processing.

Which solution will meet these requirements?

Options:

A.  

Configure Amazon EC2 to deliver the EC2 instance lifecycle events from all accounts to the Amazon EventBridge event bus of the main account. Add an EventBridge rule to the event bus of the main account that matches all EC2 instance lifecycle events. Add the SQS queue as a target of the rule.

B.  

Use the resource policies of the SQS queue in the main account to give each account permissions to write to that SQS queue. Add to the Amazon EventBridge event bus of each account an EventBridge rule that matches all EC2 instance lifecycle events. Add the SQS queue in the main account as a target of the rule.

C.  

Write an AWS Lambda function that scans through all EC2 instances in the company accounts to detect EC2 instance lifecycle changes. Configure the Lambda function to write a notification message to the SQS queue in the main account if the function detects an EC2 instance lifecycle change. Add an Amazon EventBridge scheduled rule that invokes the Lambda function every minute.

D.  

Configure the permissions on the main account event bus to receive events from all accounts. Create an Amazon EventBridge rule in each account to send all the EC2 instance lifecycle events to the main account event bus. Add an EventBridge rule to the main account event bus that matches all EC2 instance lifecycle events. Set the SQS queue as a target for the rule.
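
For reference, the cross-account event bus pattern in option D involves two pieces of configuration. The sketch below is a hedged outline with placeholder account IDs, ARNs, and rule names:

import json
import boto3

events = boto3.client("events")

# In the main account: allow a member account to put events onto the default bus.
events.put_permission(
    EventBusName="default",
    Action="events:PutEvents",
    Principal="111122223333",              # placeholder member account ID
    StatementId="AllowMemberAccount111122223333",
)

# In each member account: forward EC2 lifecycle events to the main account's bus.
events.put_rule(
    Name="forward-ec2-lifecycle",          # placeholder rule name
    EventPattern=json.dumps({
        "source": ["aws.ec2"],
        "detail-type": ["EC2 Instance State-change Notification"],
    }),
)
events.put_targets(
    Rule="forward-ec2-lifecycle",
    Targets=[{
        "Id": "main-account-bus",
        "Arn": "arn:aws:events:us-east-1:999988887777:event-bus/default",  # placeholder main bus ARN
        "RoleArn": "arn:aws:iam::111122223333:role/forward-events-role",   # placeholder role for cross-account delivery
    }],
)

A rule in the main account's event bus can then match the forwarded lifecycle events and target the SQS queue.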

Discussion 0
Questions 30

A developer is building a new application on AWS. The application uses an AWS Lambda function that retrieves information from an Amazon DynamoDB table. The developer hard coded the DynamoDB table name into the Lambda function code. The table name might change over time. The developer does not want to modify the Lambda code if the table name changes.

Which solution will meet these requirements MOST efficiently?

Options:

A.  

Create a Lambda environment variable to store the table name. Use the standard method for the programming language to retrieve the variable.

B.  

Store the table name in a file. Store the file in the /tmp folder. Use the SDK for the programming language to retrieve the table name.

C.  

Create a file to store the table name. Zip the file and upload the file to the Lambda layer. Use the SDK for the programming language to retrieve the table name.

D.  

Create a global variable that is outside the handler in the Lambda function to store the table name.
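
For reference, option A usually amounts to a couple of lines in the function. TABLE_NAME and the key schema below are placeholders:

import os
import boto3

# The table name comes from a Lambda environment variable, so a rename only
# requires updating the function configuration, not the code.
TABLE_NAME = os.environ["TABLE_NAME"]
table = boto3.resource("dynamodb").Table(TABLE_NAME)

def handler(event, context):
    response = table.get_item(Key={"pk": event["id"]})  # placeholder key schema
    return response.get("Item")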

Discussion 0
Questions 31

A developer is building a serverless application by using the AWS Serverless Application Model (AWS SAM). The developer is currently testing the application in a development environment. When the application is nearly finished, the developer will need to set up additional testing and staging environments for a quality assurance team.

The developer wants to use a feature of the AWS SAM to set up deployments to multiple environments.

Which solution will meet these requirements with the LEAST development effort?

Options:

A.  

Add a configuration file in TOML format to group configuration entries for every environment. Add a table for each testing and staging environment. Deploy updates to the environments by using the sam deploy command and the --config-env flag that corresponds to each environment.

B.  

Create additional AWS SAM templates for each testing and staging environment. Write a custom shell script that uses the sam deploy command and the --template-file flag to deploy updates to the environments.

C.  

Create one AWS SAM configuration file that has default parameters. Perform updates to the testing and staging environments by using the —parameter-overrides flag in the AWS SAM CLI and the parameters that the updates will override.

D.  

Use the existing AWS SAM template. Add additional parameters to configure specific attributes for the serverless function and database table resources that are in each environment. Deploy updates to the testing and staging environments by using the sam deploy command.

Discussion 0
Questions 32

A company has installed smart meters in all its customer locations. The smart meters measure power usage at 1-minute intervals and send the usage readings to a remote endpoint for collection. The company needs to create an endpoint that will receive the smart meter readings and store the readings in a database. The company wants to store the location ID and timestamp information.

The company wants to give its customers low-latency access to their current usage and historical usage on demand. The company expects demand to increase significantly. The solution must not impact performance or include downtime while scaling.

Which solution will meet these requirements MOST cost-effectively?

Options:

A.  

Store the smart meter readings in an Amazon RDS database. Create an index on the location ID and timestamp columns. Use the columns to filter on the customers' data.

B.  

Store the smart meter readings in an Amazon DynamoDB table. Create a composite key by using the location ID and timestamp columns. Use the columns to filter on the customers' data.

C.  

Store the smart meter readings in Amazon ElastiCache for Redis. Create a sorted set key by using the location ID and timestamp columns. Use the columns to filter on the customers' data.

D.  

Store the smart meter readings in Amazon S3. Partition the data by using the location ID and timestamp columns. Use Amazon Athena to filter on the customers' data.

Discussion 0
Questions 33

A company wants to automate part of its deployment process. A developer needs to automate the process of checking for and deleting unused resources that supported previously deployed stacks but that are no longer used.

The company has a central application that uses the AWS Cloud Development Kit (AWS CDK) to manage all deployment stacks. The stacks are spread out across multiple accounts. The developer’s solution must integrate as seamlessly as possible within the current deployment process.

Which solution will meet these requirements with the LEAST amount of configuration?

Options:

A.  

In the central AWS CDK application, write a handler function in the code that uses AWS SDK calls to check for and delete unused resources. Create an AWS CloudFormation template from a JSON file. Use the template to attach the function code to an AWS Lambda function and to invoke the Lambda function when the deployment stack runs.

B.  

In the central AWS CDK application, write a handler function in the code that uses AWS SDK calls to check for and delete unused resources. Create an AWS CDK custom resource. Use the custom resource to attach the function code to an AWS Lambda function and to invoke the Lambda function when the deployment stack runs.

C.  

In the central AWS CDK application, write a handler function in the code that uses AWS SDK calls to check for and delete unused resources. Create an API in AWS Amplify. Use the API to attach the function code to an AWS Lambda function and to invoke the Lambda function when the deployment stack runs.

D.  

In the AWS Lambda console, write a handler function in the code that uses AWS SDK calls to check for and delete unused resources. Create an AWS CDK custom resource. Use the custom resource to import the Lambda function into the stack and to invoke the Lambda function when the deployment stack runs.

Discussion 0
Questions 34

A company uses Amazon API Gateway to expose a set of APIs to customers. The APIs have caching enabled in API Gateway. Customers need a way to invalidate the cache for each API when they test the API.

What should a developer do to give customers the ability to invalidate the API cache?

Options:

A.  

Ask the customers to use AWS credentials to call the InvalidateCache API operation.

B.  

Attach an InvalidateCache policy to the IAM execution role that the customers use to invoke the API. Ask the customers to send a request that contains the HTTP header when they make an API call.

C.  

Ask the customers to use the AWS SDK API Gateway class to invoke the InvalidateCache API operation.

D.  

Attach an InvalidateCache policy to the IAM execution role that the customers use to invoke the API. Ask the customers to add the INVALIDATE_CACHE query string parameter when they make an API call.

Discussion 0
Questions 35

A company developed an API application on AWS by using Amazon CloudFront, Amazon API Gateway, and AWS Lambda. The API has a minimum of four requests every second. A developer notices that many API users run the same query by using the POST method. The developer wants to cache the POST request to optimize the API resources.

Which solution will meet these requirements?

Options:

A.  

Configure the CloudFront cache. Update the application to return cached content based upon the default request headers.

B.  

Override the cache method in the selected stage of API Gateway. Select the POST method.

C.  

Save the latest request response in the Lambda /tmp directory. Update the Lambda function to check the /tmp directory.

D.  

Save the latest request in AWS Systems Manager Parameter Store. Modify the Lambda function to take the latest request response from Parameter Store.

Discussion 0
Questions 36

A company runs an application on AWS. The application uses an AWS Lambda function that is configured with an Amazon Simple Queue Service (Amazon SQS) queue called high priority queue as the event source. A developer is updating the Lambda function with another SQS queue called low priority queue as the event source. The Lambda function must always read up to 10 simultaneous messages from the high priority queue before processing messages from the low priority queue. The Lambda function must be limited to 100 simultaneous invocations.

Which solution will meet these requirements?

Options:

A.  

Set the event source mapping batch size to 10 for the high priority queue and to 90 for the low priority queue

B.  

Set the delivery delay to 0 seconds for the high priority queue and to 10 seconds for the low priority queue

C.  

Set the event source mapping maximum concurrency to 10 for the high priority queue and to 90 for the low priority queue

D.  

Set the event source mapping batch window to 10 for the high priority queue and to 90 for the low priority queue
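
For reference, the event source mapping setting named in option C is applied per queue mapping. This is a hedged boto3 sketch with placeholder mapping UUIDs (SQS maximum concurrency must be at least 2):

import boto3

lambda_client = boto3.client("lambda")

# Cap how many concurrent invocations each SQS event source mapping can consume.
lambda_client.update_event_source_mapping(
    UUID="11111111-2222-3333-4444-555555555555",   # placeholder: high priority queue mapping
    ScalingConfig={"MaximumConcurrency": 10},
)
lambda_client.update_event_source_mapping(
    UUID="66666666-7777-8888-9999-000000000000",   # placeholder: low priority queue mapping
    ScalingConfig={"MaximumConcurrency": 90},
)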

Discussion 0
Questions 37

A developer at a company needs to create a small application that makes the same API call once each day at a designated time. The company does not have infrastructure in the AWS Cloud yet, but the company wants to implement this functionality on AWS.

Which solution meets these requirements in the MOST operationally efficient manner?

Options:

A.  

Use a Kubernetes cron job that runs on Amazon Elastic Kubernetes Service (Amazon EKS).

B.  

Use an Amazon Linux crontab scheduled job that runs on Amazon EC2.

C.  

Use an AWS Lambda function that is invoked by an Amazon EventBridge scheduled event.

D.  

Use an AWS Batch job that is submitted to an AWS Batch job queue.
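
For reference, option C can be set up with a few boto3 calls. The rule name, schedule, and function ARN are placeholders:

import boto3

events = boto3.client("events")
lambda_client = boto3.client("lambda")

FUNCTION_ARN = "arn:aws:lambda:us-east-1:111122223333:function:daily-api-call"  # placeholder

# Run once per day at 12:00 UTC.
rule = events.put_rule(Name="daily-api-call", ScheduleExpression="cron(0 12 * * ? *)")

# Allow EventBridge to invoke the function, then attach the function as the rule target.
lambda_client.add_permission(
    FunctionName="daily-api-call",
    StatementId="AllowEventBridgeInvoke",
    Action="lambda:InvokeFunction",
    Principal="events.amazonaws.com",
    SourceArn=rule["RuleArn"],
)
events.put_targets(Rule="daily-api-call", Targets=[{"Id": "daily-target", "Arn": FUNCTION_ARN}])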

Discussion 0
Questions 38

A data visualization company wants to strengthen the security of its core applications. The applications are deployed on AWS across its development, staging, pre-production, and production environments. The company needs to encrypt all of its stored sensitive credentials. The sensitive credentials need to be automatically rotated. A version of the sensitive credentials needs to be stored for each environment.

Which solution will meet these requirements in the MOST operationally efficient way?

Options:

A.  

Configure AWS Secrets Manager versions to store different copies of the same credentials across multiple environments

B.  

Create a new parameter version in AWS Systems Manager Parameter Store for each environment. Store the environment-specific credentials in the parameter version.

C.  

Configure the environment variables in the application code. Use different names for each environment type.

D.  

Configure AWS Secrets Manager to create a new secret for each environment type. Store the environment-specific credentials in the secret

Discussion 0
Questions 39

A company is building a new application that runs on AWS and uses Amazon API Gateway to expose APIs. Teams of developers are working on separate components of the application in parallel. The company wants to publish an API without an integrated backend so that teams that depend on the application backend can continue the development work before the API backend development is complete.

Which solution will meet these requirements?

Options:

A.  

Create API Gateway resources and set the integration type value to MOCK. Configure the method integration request and integration response to associate a response with an HTTP status code. Create an API Gateway stage and deploy the API.

B.  

Create an AWS Lambda function that returns mocked responses and various HTTP status codes. Create API Gateway resources and set the integration type value to AWS_PROXY. Deploy the API.

C.  

Create an EC2 application that returns mocked HTTP responses. Create API Gateway resources and set the integration type value to AWS. Create an API Gateway stage and deploy the API.

D.  

Create API Gateway resources and set the integration type value to HTTP_PROXY. Add mapping templates and deploy the API. Create an AWS Lambda layer that returns various HTTP status codes. Associate the Lambda layer with the API deployment.
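
For reference, a MOCK integration (option A) is wired up per method. The hedged sketch below assumes an existing REST API and resource whose IDs are placeholders:

import json
import boto3

apigateway = boto3.client("apigateway")

API_ID = "a1b2c3d4e5"      # placeholder REST API ID
RESOURCE_ID = "f6g7h8"     # placeholder resource ID

apigateway.put_method(restApiId=API_ID, resourceId=RESOURCE_ID,
                      httpMethod="GET", authorizationType="NONE")

# The MOCK integration returns a response from API Gateway without calling a backend.
apigateway.put_integration(restApiId=API_ID, resourceId=RESOURCE_ID, httpMethod="GET",
                           type="MOCK",
                           requestTemplates={"application/json": '{"statusCode": 200}'})

apigateway.put_method_response(restApiId=API_ID, resourceId=RESOURCE_ID,
                               httpMethod="GET", statusCode="200")
apigateway.put_integration_response(
    restApiId=API_ID, resourceId=RESOURCE_ID, httpMethod="GET", statusCode="200",
    responseTemplates={"application/json": json.dumps({"message": "mocked response"})},
)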

Discussion 0
Questions 40

A financial company must store original customer records for 10 years for legal reasons. A complete record contains personally identifiable information (PII). According to local regulations, PII is available to only certain people in the company and must not be shared with third parties. The company needs to make the records available to third-party organizations for statistical analysis without sharing the PII.

A developer wants to store the original immutable record in Amazon S3. Depending on who accesses the S3 document, the document should be returned as is or with all the PII removed. The developer has written an AWS Lambda function to remove the PII from the document. The function is named removePii.

What should the developer do so that the company can meet the PII requirements while maintaining only one copy of the document?

Options:

A.  

Set up an S3 event notification that invokes the removePii function when an S3 GET request is made. Call Amazon S3 by using a GET request to access the object without PII.

B.  

Set up an S3 event notification that invokes the removePii function when an S3 PUT request is made. Call Amazon S3 by using a PUT request to access the object without PII.

C.  

Create an S3 Object Lambda access point from the S3 console. Select the removePii function. Use S3 Access Points to access the object without PII.

D.  

Create an S3 access point from the S3 console. Use the access point name to call the GetObjectLegalHold S3 API function. Pass in the removePii function name to access the object without PII.
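
For reference, an S3 Object Lambda access point (option C) invokes a function such as removePii on every GET. The handler shape below is a hedged sketch in which the redact() helper is a placeholder for the actual PII-removal logic:

import urllib.request
import boto3

s3 = boto3.client("s3")

def redact(document: bytes) -> bytes:
    # Placeholder for the real PII-removal logic in removePii.
    return document

def handler(event, context):
    ctx = event["getObjectContext"]

    # Fetch the original object through the presigned URL that S3 supplies.
    original = urllib.request.urlopen(ctx["inputS3Url"]).read()

    # Return the transformed object to the caller of the Object Lambda access point.
    s3.write_get_object_response(
        RequestRoute=ctx["outputRoute"],
        RequestToken=ctx["outputToken"],
        Body=redact(original),
    )
    return {"statusCode": 200}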

Discussion 0
Questions 41

A developer has an application that stores data in an Amazon S3 bucket. The application uses an HTTP API to store and retrieve objects. When the PutObject API operation adds objects to the S3 bucket, the developer must encrypt these objects at rest by using server-side encryption with Amazon S3 managed keys (SSE-S3).

Which solution will meet this requirement?

Options:

A.  

Create an AWS Key Management Service (AWS KMS) key. Assign the KMS key to the S3 bucket.

B.  

Set the x-amz-server-side-encryption header when invoking the PutObject API operation.

C.  

Provide the encryption key in the HTTP header of every request.

D.  

Apply TLS to encrypt the traffic to the S3 bucket.
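
For reference, option B's header corresponds to the ServerSideEncryption argument in the SDKs. A minimal boto3 sketch with placeholder bucket and key names:

import boto3

s3 = boto3.client("s3")

# Equivalent to sending the x-amz-server-side-encryption: AES256 header,
# which requests SSE-S3 for the stored object.
s3.put_object(
    Bucket="app-data-bucket",      # placeholder
    Key="uploads/report.json",     # placeholder
    Body=b'{"status": "ok"}',
    ServerSideEncryption="AES256",
)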

Discussion 0
Questions 42

A developer is deploying a new application to Amazon Elastic Container Service (Amazon ECS). The developer needs to securely store and retrieve different types of variables. These variables include authentication information for a remote API, the URL for the API, and credentials. The authentication information and API URL must be available to all current and future deployed versions of the application across development, testing, and production environments.

How should the developer retrieve the variables with the FEWEST application changes?

Options:

A.  

Update the application to retrieve the variables from AWS Systems Manager Parameter Store. Use unique paths in Parameter Store for each variable in each environment. Store the credentials in AWS Secrets Manager in each environment.

B.  

Update the application to retrieve the variables from AWS Key Management Service (AWS KMS). Store the API URL and credentials as unique keys for each environment.

C.  

Update the application to retrieve the variables from an encrypted file that is stored with the application. Store the API URL and credentials in unique files for each environment.

D.  

Update the application to retrieve the variables from each of the deployed environments. Define the authentication information and API URL in the ECS task definition as unique names during the deployment process.
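
For reference, the per-environment path layout described in option A lets the application load its settings with one call. The paths, parameter names, and secret name below are placeholders:

import boto3

ssm = boto3.client("ssm")
secrets = boto3.client("secretsmanager")

def load_config(environment: str) -> dict:
    # For example: /myapp/production/api_url, /myapp/production/api_auth_mode
    response = ssm.get_parameters_by_path(
        Path=f"/myapp/{environment}/",     # placeholder path convention
        Recursive=True,
        WithDecryption=True,
    )
    config = {p["Name"].split("/")[-1]: p["Value"] for p in response["Parameters"]}

    # Credentials stay in Secrets Manager, keyed per environment.
    config["credentials"] = secrets.get_secret_value(
        SecretId=f"myapp/{environment}/credentials"   # placeholder secret name
    )["SecretString"]
    return config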

Discussion 0
Questions 43

A developer needs to build an AWS CloudFormation template that self-populates the AWS Region variable that deploys the CloudFormation template

What is the MOST operationally efficient way to determine the Region in which the template is being deployed?

Options:

A.  

Use the AWS::Region pseudo parameter.

B.  

Require the Region as a CloudFormation parameter.

C.  

Find the Region from the AWS::StackId pseudo parameter by using the Fn::Split intrinsic function.

D.  

Dynamically import the Region by referencing the relevant parameter in AWS Systems Manager Parameter Store.

Discussion 0
Questions 44

A developer has observed an increase in bugs in the AWS Lambda functions that a development team has deployed in its Node.js application. To minimize these bugs, the developer wants to implement automated testing of Lambda functions in an environment that closely simulates the Lambda environment.

The developer needs to give other developers the ability to run the tests locally. The developer also needs to integrate the tests into the team's continuous integration and continuous delivery (CI/CD) pipeline before the AWS Cloud Development Kit (AWS CDK) deployment.

Which solution will meet these requirements?

Options:

A.  

Create sample events based on the Lambda documentation. Create automated test scripts that use the cdk local invoke command to invoke the Lambda functions. Check the response. Document the test scripts for the other developers on the team. Update the CI/CD pipeline to run the test scripts.

B.  

Install a unit testing framework that reproduces the Lambda execution environment. Create sample events based on the Lambda documentation. Invoke the handler function by using the unit testing framework. Check the response. Document how to run the unit testing framework for the other developers on the team. Update the CI/CD pipeline to run the unit testing framework.

C.  

Install the AWS Serverless Application Model (AWS SAM) CLI tool. Use the sam local generate-event command to generate sample events for the automated tests. Create automated test scripts that use the sam local invoke command to invoke the Lambda functions. Check the response. Document the test scripts for the other developers on the team. Update the CI/CD pipeline to run the test scripts.

D.  

Create sample events based on the Lambda documentation. Create a Docker container from the Node.js base image to invoke the Lambda functions. Check the response. Document how to run the Docker container for the other developers on the team. Update the CI/CD pipeline to run the Docker container.

Discussion 0
Questions 45

A company has an application that runs as a series of AWS Lambda functions. Each Lambda function receives data from an Amazon Simple Notification Service (Amazon SNS) topic and writes the data to an Amazon Aurora DB instance.

To comply with an information security policy, the company must ensure that the Lambda functions all use a single securely encrypted database connection string to access Aurora.

Which solution will meet these requirements?

Options:

A.  

Use IAM database authentication for Aurora to enable secure database connections for all the Lambda functions.

B.  

Store the credentials and read the credentials from an encrypted Amazon RDS DB instance.

C.  

Store the credentials in AWS Systems Manager Parameter Store as a secure string parameter.

D.  

Use Lambda environment variables with a shared AWS Key Management Service (AWS KMS) key for encryption.
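
For reference (and without asserting which option is correct), the IAM database authentication mechanism named in option A replaces a stored password with a short-lived token that each function generates itself. A hedged sketch with placeholder endpoint and user names:

import boto3

rds = boto3.client("rds")

def get_auth_token() -> str:
    # The token is valid for 15 minutes and is signed with the Lambda
    # execution role's credentials, so no password needs to be stored.
    return rds.generate_db_auth_token(
        DBHostname="my-aurora-cluster.cluster-abc123.us-east-1.rds.amazonaws.com",  # placeholder
        Port=3306,
        DBUsername="lambda_app_user",   # placeholder database user
    )

# The token is then passed as the password when the database driver connects over SSL.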

Discussion 0
Questions 46

A developer is testing an application that invokes an AWS Lambda function asynchronously. During the testing phase the Lambda function fails to process after two retries.

How can the developer troubleshoot the failure?

Options:

A.  

Configure AWS CloudTrail logging to investigate the invocation failures.

B.  

Configure Dead Letter Queues by sending events to Amazon SQS for investigation.

C.  

Configure Amazon Simple Workflow Service to process any direct unprocessed events.

D.  

Configure AWS Config to process any direct unprocessed events.

Discussion 0
Questions 47

A developer is migrating some features from a legacy monolithic application to use AWS Lambda functions instead. The application currently stores data in an Amazon Aurora DB cluster that runs in private subnets in a VPC. The AWS account has one VPC deployed. The Lambda functions and the DB cluster are deployed in the same AWS Region in the same AWS account.

The developer needs to ensure that the Lambda functions can securely access the DB cluster without crossing the public internet.

Which solution will meet these requirements?

Options:

A.  

Configure the DB cluster's public access setting to Yes.

B.  

Configure an Amazon RDS database proxy for the Lambda functions.

C.  

Configure a NAT gateway and a security group for the Lambda functions.

D.  

Configure the VPC, subnets, and a security group for the Lambda functions.

Discussion 0
Questions 48

An online sales company is developing a serverless application that runs on AWS. The application uses an AWS Lambda function that calculates order success rates and stores the data in an Amazon DynamoDB table. A developer wants an efficient way to invoke the Lambda function every 15 minutes.

Which solution will meet this requirement with the LEAST development effort?

Options:

A.  

Create an Amazon EventBridge rule that has a rate expression that will run the rule every 15 minutes. Add the Lambda function as the target of the EventBridge rule.

B.  

Create an AWS Systems Manager document that has a script that will invoke the Lambda function on Amazon EC2. Use a Systems Manager Run Command task to run the shell script every 15 minutes.

C.  

Create an AWS Step Functions state machine. Configure the state machine to invoke the Lambda function execution role at a specified interval by using a Wait state. Set the interval to 15 minutes.

D.  

Provision a small Amazon EC2 instance. Set up a cron job that invokes the Lambda function every 15 minutes.

Discussion 0
Questions 49

An ecommerce company is using an AWS Lambda function behind Amazon API Gateway as its application tier. To process orders during checkout, the application calls a POST API from the frontend. The POST API invokes the Lambda function asynchronously. In rare situations, the application has not processed orders. The Lambda application logs show no errors or failures.

What should a developer do to solve this problem?

Options:

A.  

Inspect the frontend logs for API failures. Call the POST API manually by using the requests from the log file.

B.  

Create and inspect the Lambda dead-letter queue. Troubleshoot the failed functions. Reprocess the events.

C.  

Inspect the Lambda logs in Amazon CloudWatch for possible errors. Fix the errors.

D.  

Make sure that caching is disabled for the POST API in API Gateway.

Discussion 0
Questions 50

A developer is creating an application that will store personal health information (PHI). The PHI needs to be encrypted at all times. An encrypted Amazon RDS for MySQL DB instance is storing the data. The developer wants to increase the performance of the application by caching frequently accessed data while adding the ability to sort or rank the cached datasets.

Which solution will meet these requirements?

Options:

A.  

Create an Amazon ElastiCache for Redis instance. Enable encryption of data in transit and at rest. Store frequently accessed data in the cache.

B.  

Create an Amazon ElastiCache for Memcached instance. Enable encryption of data in transit and at rest. Store frequently accessed data in the cache.

C.  

Create an Amazon RDS for MySQL read replica. Connect to the read replica by using SSL. Configure the read replica to store frequently accessed data.

D.  

Create an Amazon DynamoDB table and a DynamoDB Accelerator (DAX) cluster for the table. Store frequently accessed data in the DynamoDB table.
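
For reference, the sorting and ranking capability highlighted in this question comes from Redis sorted sets. This is a hedged sketch using the third-party redis Python package with a placeholder endpoint; TLS and auth settings depend on how the cluster is configured:

import redis

# Connect to the ElastiCache for Redis endpoint with in-transit encryption.
cache = redis.Redis(
    host="my-redis.abc123.use1.cache.amazonaws.com",  # placeholder endpoint
    port=6379,
    ssl=True,
)

# Cache frequently accessed scores in a sorted set keyed by record ID.
cache.zadd("recent_scores", {"record-1001": 87.5, "record-1002": 91.2, "record-1003": 78.9})

# Rank the cached data set: top 10 entries by score, highest first.
top_ten = cache.zrevrange("recent_scores", 0, 9, withscores=True)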

Discussion 0
Questions 51

A company is migrating legacy internal applications to AWS. Leadership wants to rewrite the internal employee directory to use native AWS services. A developer needs to create a solution for storing employee contact details and high-resolution photos for use with the new application.

Which solution will enable the search and retrieval of each employee's individual details and high-resolution photos using AWS APIs?

Options:

A.  

Encode each employee's contact information and photos using Base64. Store the information in an Amazon DynamoDB table using a sort key.

B.  

Store each employee's contact information in an Amazon DynamoDB table along with the object keys for the photos stored in Amazon S3. (See the sketch after the options.)

C.  

Use Amazon Cognito user pools to implement the employee directory in a fully managed software-as-a-service (SaaS) method.

D.  

Store employee contact information in an Amazon RDS DB instance with the photos stored in Amazon Elastic File System (Amazon EFS).
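
A sketch of option B, using hypothetical table, bucket, and key names: contact details go into DynamoDB alongside the S3 object key of the photo, and the photo is then retrieved through a presigned URL.

import boto3

dynamodb = boto3.resource("dynamodb")
s3 = boto3.client("s3")

# Hypothetical table and bucket names.
table = dynamodb.Table("EmployeeDirectory")
PHOTO_BUCKET = "employee-photos-example"

# Store contact details alongside the S3 object key of the photo.
table.put_item(
    Item={
        "employee_id": "e-1001",
        "name": "Jane Doe",
        "email": "jane.doe@example.com",
        "photo_key": "photos/e-1001.jpg",
    }
)

# Retrieve the record, then generate a temporary URL for the photo.
item = table.get_item(Key={"employee_id": "e-1001"})["Item"]
photo_url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": PHOTO_BUCKET, "Key": item["photo_key"]},
    ExpiresIn=300,
)
print(photo_url)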

Discussion 0
Questions 52

A developer wants to store information about movies. Each movie has a title, release year, and genre. The movie information also can include additional properties about the cast and production crew. This additional information is inconsistent across movies. For example, one movie might have an assistant director, and another movie might have an animal trainer.

The developer needs to implement a solution to support the following use cases:

For a given title and release year, get all details about the movie that has that title and release year.

For a given title, get all details about all movies that have that title.

For a given genre, get all details about all movies in that genre.

Which data store configuration will meet these requirements?

Options:

A.  

Create an Amazon DynamoDB table. Configure the table with a primary key that consists of the title as the partition key and the release year as the sort key. Create a global secondary index that uses the genre as the partition key and the title as the sort key. (See the sketch after the options.)

B.  

Create an Amazon DynamoDB table. Configure the table with a primary key that consists of the genre as the partition key and the release year as the sort key. Create a global secondary index that uses the title as the partition key.

C.  

On an Amazon RDS DB instance, create a table that contains columns for title, release year, and genre. Configure the title as the primary key.

D.  

On an Amazon RDS DB instance, create a table where the primary key is the title and all other data is encoded into JSON format as one additional column.
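
The sketch below models option A: a DynamoDB table keyed on title (partition) and release year (sort), plus a genre/title global secondary index that serves the by-genre lookup. Table, index, and attribute names are illustrative only.

import boto3
from boto3.dynamodb.conditions import Key

dynamodb = boto3.resource("dynamodb")

# On-demand billing keeps the sketch simple.
table = dynamodb.create_table(
    TableName="Movies",
    AttributeDefinitions=[
        {"AttributeName": "title", "AttributeType": "S"},
        {"AttributeName": "release_year", "AttributeType": "N"},
        {"AttributeName": "genre", "AttributeType": "S"},
    ],
    KeySchema=[
        {"AttributeName": "title", "KeyType": "HASH"},
        {"AttributeName": "release_year", "KeyType": "RANGE"},
    ],
    GlobalSecondaryIndexes=[
        {
            "IndexName": "genre-title-index",
            "KeySchema": [
                {"AttributeName": "genre", "KeyType": "HASH"},
                {"AttributeName": "title", "KeyType": "RANGE"},
            ],
            "Projection": {"ProjectionType": "ALL"},
        }
    ],
    BillingMode="PAY_PER_REQUEST",
)
table.wait_until_exists()

# Title + year -> one item; title alone -> all years; genre -> query the GSI.
by_title = table.query(KeyConditionExpression=Key("title").eq("Heat"))
by_genre = table.query(
    IndexName="genre-title-index",
    KeyConditionExpression=Key("genre").eq("Thriller"),
)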

Discussion 0
Questions 53

A company has deployed infrastructure on AWS. A development team wants to create an AWS Lambda function that will retrieve data from an Amazon Aurora database. The Amazon Aurora database is in a private subnet in the company's VPC. The VPC is named VPC1. The data is relational in nature. The Lambda function needs to access the data securely. (See the sketch after the options.)

Which solution will meet these requirements?

Options:

A.  

Create the Lambda function. Configure VPC1 access for the function. Attach a security group named SG1 to both the Lambda function and the database. Configure the security group inbound and outbound rules to allow TCP traffic on Port 3306.

B.  

Create and launch a Lambda function in a new public subnet that is in a new VPC named VPC2. Create a peering connection between VPC1 and VPC2.

C.  

Create the Lambda function. Configure VPC1 access for the function. Assign a security group named SG1 to the Lambda function. Assign a second security group named SG2 to the database. Add an inbound rule to SG2 to allow TCP traffic on port 3306 from SG1.

D.  

Export the data from the Aurora database to Amazon S3. Create and launch a Lambda function in VPC1. Configure the Lambda function to query the data from Amazon S3.
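
As a rough sketch of the VPC-attachment approach described in options A and C, the code below attaches an existing Lambda function to private subnets in VPC1 and opens port 3306 on the database security group to the function's security group. All IDs and names are placeholders.

import boto3

lambda_client = boto3.client("lambda")
ec2 = boto3.client("ec2")

# Hypothetical subnet and security group IDs in VPC1.
PRIVATE_SUBNET_IDS = ["subnet-0abc1234", "subnet-0def5678"]
LAMBDA_SECURITY_GROUP = "sg-0lambda123"
DB_SECURITY_GROUP = "sg-0database456"

# Attach the function to VPC1 so it can reach the Aurora endpoint privately.
lambda_client.update_function_configuration(
    FunctionName="aurora-report-reader",  # placeholder function name
    VpcConfig={
        "SubnetIds": PRIVATE_SUBNET_IDS,
        "SecurityGroupIds": [LAMBDA_SECURITY_GROUP],
    },
)

# Allow MySQL/Aurora traffic from the Lambda security group into the DB security group.
ec2.authorize_security_group_ingress(
    GroupId=DB_SECURITY_GROUP,
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 3306,
        "ToPort": 3306,
        "UserIdGroupPairs": [{"GroupId": LAMBDA_SECURITY_GROUP}],
    }],
)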

Discussion 0
Questions 54

A developer accesses AWS CodeCommit over SSH. The SSH keys configured to access AWS CodeCommit are tied to a user with the following permissions:

The developer needs to create/delete branches

Which specific IAM permissions need to be added based on the principle of least privilege? (A hedged sketch of such a policy follows the options.)

Options:

A.  

Option A

B.  

Option B

C.  

Option C

D.  

Option D
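
The option images are not reproduced here. As a hedged illustration of least privilege for branch management, the sketch below grants only codecommit:CreateBranch and codecommit:DeleteBranch on a single repository; the user, policy, repository, and account values are assumptions, not the actual option content.

import json
import boto3

iam = boto3.client("iam")

# Hypothetical user, policy, and repository names for illustration only.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["codecommit:CreateBranch", "codecommit:DeleteBranch"],
            "Resource": "arn:aws:codecommit:us-east-1:123456789012:example-repo",
        }
    ],
}

iam.put_user_policy(
    UserName="codecommit-developer",
    PolicyName="AllowBranchManagement",
    PolicyDocument=json.dumps(policy),
)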

Discussion 0
Questions 55

A developer creates a static website for their department. The developer deploys the static assets for the website to an Amazon S3 bucket and serves the assets with Amazon CloudFront. The developer uses origin access control (OAC) on the CloudFront distribution to access the S3 bucket.

The developer notices that users can access the root URL and specific pages but cannot access directories without specifying a file name. For example, /products/index.html works, but /products returns an error. The developer needs to enable access to directories without specifying a file name, without exposing the S3 bucket publicly.

Which solution will meet these requirements?

Options:

A.  

Update the CloudFront distribution's settings so that index.html is set as the default root object.

B.  

Update the Amazon S3 bucket settings and enable static website hosting. Specify index.html as the index document. Update the S3 bucket policy to enable access. Update the CloudFront distribution's origin to use the S3 website endpoint.

C.  

Create a CloudFront function that examines the request URL and appends index.html when directories are being accessed. Add the function as a viewer request CloudFront function to the CloudFront distribution's behavior. (See the sketch after the options.)

D.  

Create a custom error response on the CloudFront distribution with the HTTP error code set to the HTTP 404 Not Found response code and the response page path set to /index.html. Set the HTTP response code to the HTTP 200 OK response code.
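
The sketch below approximates option C: a CloudFront function, written in CloudFront Functions JavaScript and uploaded from Python, that appends index.html to directory-style URIs. The function name is a placeholder, and associating the published function with the distribution's viewer-request behavior is left out for brevity.

import boto3

cloudfront = boto3.client("cloudfront")

# Standard "append index.html" URL-rewrite pattern in CloudFront Functions JavaScript.
REWRITE_JS = b"""
function handler(event) {
    var request = event.request;
    var uri = request.uri;
    if (uri.endsWith('/')) {
        request.uri += 'index.html';
    } else if (!uri.includes('.')) {
        request.uri += '/index.html';
    }
    return request;
}
"""

created = cloudfront.create_function(
    Name="append-index-html",  # placeholder function name
    FunctionConfig={
        "Comment": "Rewrite directory requests to index.html",
        "Runtime": "cloudfront-js-2.0",
    },
    FunctionCode=REWRITE_JS,
)

# Publish the function so it can be attached to the distribution's
# default cache behavior as a viewer-request function association.
cloudfront.publish_function(Name="append-index-html", IfMatch=created["ETag"])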

Discussion 0
Questions 56

A developer is writing a serverless application that requires an AWS Lambda function to be invoked every 10 minutes.

What is an automated and serverless way to invoke the function?

Options:

A.  

Deploy an Amazon EC2 instance based on Linux, and edit its /etc/crontab file by adding a command to periodically invoke the Lambda function.

B.  

Configure an environment variable named PERIOD for the Lambda function. Set the value to 600.

C.  

Create an Amazon EventBridge rule that runs on a regular schedule to invoke the Lambda function.

D.  

Create an Amazon Simple Notification Service (Amazon SNS) topic that has a subscription to the Lambda function with a 600-second timer.

Discussion 0
Questions 57

An online food company provides an Amazon API Gateway HTTP API to receive orders from partners. The API is integrated with an AWS Lambda function. The Lambda function stores the orders in an Amazon DynamoDB table.

The company expects to onboard additional partners. Some of the partners require an additional Lambda function to receive orders. The company has created an Amazon S3 bucket. The company needs to store all orders and updates in the S3 bucket for future analysis.

How can the developer ensure that all orders and updates are stored in Amazon S3 with the LEAST development effort?

Options:

A.  

Create a new Lambda function and a new API Gateway API endpoint. Configure the new Lambda function to write to the S3 bucket. Modify the original Lambda function to post updates to the new API endpoint.

B.  

Use Amazon Kinesis Data Streams to create a new data stream. Modify the Lambda function to publish orders to the data stream. Configure the data stream to write to the S3 bucket.

C.  

Enable DynamoDB Streams on the DynamoDB table. Create a new Lambda function. Associate the stream's Amazon Resource Name (ARN) with the Lambda function. Configure the Lambda function to write to the S3 bucket as records appear in the table's stream. (See the sketch after the options.)

D.  

Modify the Lambda function to publish to a new Amazon Simple Notification Service (Amazon SNS) topic as the Lambda function receives orders. Subscribe a new Lambda function to the topic. Configure the new Lambda function to write to the S3 bucket as updates come through the topic.
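
A minimal sketch of the handler that option C implies: a Lambda function attached to the table's stream that archives each insert or update to S3. The bucket name and object key layout are hypothetical.

import json
import boto3

s3 = boto3.client("s3")
BUCKET = "order-history-archive-example"  # placeholder bucket name

def handler(event, context):
    # Lambda handler for a DynamoDB Streams event source mapping.
    # Each insert or update record is written to S3 as a JSON object
    # keyed by the stream event ID, so every order and update is archived.
    for record in event.get("Records", []):
        if record["eventName"] in ("INSERT", "MODIFY"):
            image = record["dynamodb"].get("NewImage", {})
            s3.put_object(
                Bucket=BUCKET,
                Key=f"orders/{record['eventID']}.json",
                Body=json.dumps(image).encode("utf-8"),
                ContentType="application/json",
            )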

Discussion 0
Questions 58

A developer is deploying a company's application to Amazon EC2 instances. The application generates gigabytes of data files each day. The files are rarely accessed, but the files must be available to the application's users within minutes of a request during the first year of storage. The company must retain the files for 7 years.

How can the developer implement the application to meet these requirements MOST cost-effectively?

Options:

A.  

Store the files in an Amazon S3 bucket. Use the S3 Glacier Instant Retrieval storage class. Create an S3 Lifecycle policy to transition the files to the S3 Glacier Deep Archive storage class after 1 year. (See the sketch after the options.)

B.  

Store the files in an Amazon S3 bucket. Use the S3 Standard storage class. Create an S3 Lifecycle policy to transition the files to the S3 Glacier Flexible Retrieval storage class after 1 year.

C.  

Store the files on an Amazon Elastic Block Store (Amazon EBS) volume. Use Amazon Data Lifecycle Manager (Amazon DLM) to create snapshots of the EBS volumes and to store those snapshots in Amazon S3.

D.  

Store the files on an Amazon Elastic File System (Amazon EFS) mount. Configure EFS lifecycle management to transition the files to the EFS Standard-Infrequent Access (Standard-IA) storage class after 1 year.
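
The sketch below illustrates option A with a hypothetical bucket: new objects are uploaded in the S3 Glacier Instant Retrieval storage class, transition to S3 Glacier Deep Archive after 1 year, and expire after roughly 7 years (assuming the files may be deleted once the retention period ends).

import boto3

s3 = boto3.client("s3")
BUCKET = "daily-data-files-example"  # placeholder bucket name

# Transition to Deep Archive after 1 year; expire after about 7 years.
s3.put_bucket_lifecycle_configuration(
    Bucket=BUCKET,
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-then-expire",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},
                "Transitions": [
                    {"Days": 365, "StorageClass": "DEEP_ARCHIVE"},
                ],
                "Expiration": {"Days": 2555},  # ~7 years; assumes deletion is allowed afterward
            }
        ]
    },
)

# New uploads go straight to Glacier Instant Retrieval via the StorageClass argument.
s3.put_object(
    Bucket=BUCKET,
    Key="2024/04/26/data-file.bin",
    Body=b"...",
    StorageClass="GLACIER_IR",
)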

Discussion 0
Questions 59

A developer is building a web application that uses Amazon API Gateway to expose an AWS Lambda function to process requests from clients. During testing, the developer notices that the API Gateway times out even though the Lambda function finishes under the set time limit.

Which of the following API Gateway metrics in Amazon CloudWatch can help the developer troubleshoot the issue? (Choose two.) A sketch of retrieving these metrics follows the options.

Options:

A.  

CacheHitCount

B.  

IntegrationLatency

C.  

CacheMissCount

D.  

Latency

E.  

Count
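
To act on the Latency and IntegrationLatency metrics, a developer could pull both from CloudWatch and compare them, as in the sketch below; the API name and stage are placeholders.

import boto3
from datetime import datetime, timedelta, timezone

cloudwatch = boto3.client("cloudwatch")

end = datetime.now(timezone.utc)
start = end - timedelta(hours=1)

# A large gap between Latency and IntegrationLatency points at API Gateway
# overhead rather than the Lambda integration itself.
for metric in ("Latency", "IntegrationLatency"):
    stats = cloudwatch.get_metric_statistics(
        Namespace="AWS/ApiGateway",
        MetricName=metric,
        Dimensions=[
            {"Name": "ApiName", "Value": "orders-api"},  # placeholder API name
            {"Name": "Stage", "Value": "prod"},          # placeholder stage
        ],
        StartTime=start,
        EndTime=end,
        Period=300,
        Statistics=["Average", "Maximum"],
        Unit="Milliseconds",
    )
    print(metric, stats["Datapoints"])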

Discussion 0