
Amazon Exam DVA-C02 Topic 2 Question 40 Discussion

Actual exam question for Amazon's DVA-C02 exam
Question #: 40
Topic #: 2

A developer is creating an AWS Lambda function that consumes messages from an Amazon Simple Queue Service (Amazon SQS) standard queue. The developer notices that the Lambda function processes some messages multiple times.

How should the developer resolve this issue MOST cost-effectively?
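For context: standard SQS queues provide at-least-once delivery, so occasional duplicate deliveries are expected behavior, and the consumer can compensate by being idempotent about message IDs it has already handled. A minimal sketch of that idea (the in-memory set is a hypothetical stand-in for a durable store such as a DynamoDB table, since real Lambda execution environments do not share state across invocations):

```python
# Idempotent handler sketch for SQS-triggered Lambda-style code.
# NOTE: processed_ids is a hypothetical stand-in for a durable
# idempotency store (e.g. DynamoDB with a conditional put); an
# in-memory set does NOT survive across Lambda environments.

processed_ids = set()

def handler(event, context=None):
    results = []
    for record in event["Records"]:
        message_id = record["messageId"]
        if message_id in processed_ids:
            continue  # duplicate delivery: skip reprocessing
        processed_ids.add(message_id)
        results.append(record["body"].upper())  # placeholder "work"
    return results
```

Calling the handler twice with the same record processes it only once; the second delivery is recognized by its message ID and skipped.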

Suggested Answer: D


Contribute your Thoughts:

Ashton
1 month ago
Option A: 'Change the queue type? Ain't nobody got time for that!' (in a humorous tone)
upvoted 0 times
Toshia
15 days ago
C: Set the maximum concurrency limit of the AWS Lambda function to 1
upvoted 0 times
Beula
16 days ago
B: Set up a dead-letter queue.
upvoted 0 times
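On the dead-letter-queue suggestion: a DLQ does not by itself prevent at-least-once redelivery, but it captures messages that repeatedly fail processing so they can be inspected and replayed. A sketch of the redrive-policy attribute involved, assuming boto3 and a placeholder queue ARN:

```python
import json

# Sketch of the RedrivePolicy attribute used to attach a dead-letter
# queue to a source SQS queue. The ARN below is a placeholder.
dead_letter_arn = "arn:aws:sqs:us-east-1:123456789012:my-dlq"  # hypothetical

redrive_policy = json.dumps({
    "deadLetterTargetArn": dead_letter_arn,
    "maxReceiveCount": "5",  # move to the DLQ after 5 failed receives
})

attributes = {"RedrivePolicy": redrive_policy}
# Applied with e.g.:
# sqs.set_queue_attributes(QueueUrl=queue_url, Attributes=attributes)
```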
Winfred
17 days ago
A: Change the queue type? Ain't nobody got time for that! (in a humorous tone)
upvoted 0 times
Renea
1 month ago
Option D is an interesting idea, but using Kinesis Data Streams might be overkill for this use case. It could end up being more complex and expensive than necessary.
upvoted 0 times
Bobbye
1 month ago
Limiting the concurrency to 1 in Option C might work, but it could impact the overall performance of the Lambda function. I'm not sure if it's the best approach.
upvoted 0 times
Reita
2 days ago
A: True, but changing the message processing to use Amazon Kinesis Data Streams could be another effective option to consider.
upvoted 0 times
Skye
22 days ago
B: Setting up a dead-letter queue could also help in handling the duplicate messages.
upvoted 0 times
Deeanna
24 days ago
A: I think changing the Amazon SQS standard queue to an Amazon SQS FIFO queue with deduplication ID might be the best solution.
upvoted 0 times
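On the FIFO suggestion: a FIFO queue discards messages that share a MessageDeduplicationId within a five-minute deduplication interval, and content-based deduplication derives that ID from a SHA-256 hash of the message body. A sketch of computing such an ID explicitly (the queue URL and group ID in the comment are hypothetical):

```python
import hashlib

# Mirror of SQS FIFO content-based deduplication: the implicit
# MessageDeduplicationId is a SHA-256 hash of the message body.

def dedup_id(body: str) -> str:
    return hashlib.sha256(body.encode("utf-8")).hexdigest()

# Two sends with identical bodies inside the 5-minute deduplication
# interval share an ID, so the queue accepts only the first, e.g.:
# sqs.send_message(QueueUrl=fifo_queue_url, MessageBody=body,
#                  MessageGroupId="orders",
#                  MessageDeduplicationId=dedup_id(body))
```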
Lizbeth
2 months ago
I like the idea of setting up a dead-letter queue in Option B. That way we can identify and re-process the problematic messages.
upvoted 0 times
Chauncey
5 days ago
Setting up a dead-letter queue seems like the most practical solution to prevent processing messages multiple times.
upvoted 0 times
Allene
8 days ago
We should definitely consider implementing a dead-letter queue to improve message processing.
upvoted 0 times
Dominque
1 month ago
Let's go with Option B then. It seems like the most practical solution for our problem.
upvoted 0 times
Isaac
1 month ago
I agree, setting up a dead-letter queue is a cost-effective way to handle these issues.
upvoted 0 times
Phuong
1 month ago
I agree, setting up a dead-letter queue is a cost-effective way to handle the issue.
upvoted 0 times
Brandon
1 month ago
Option B sounds like a good solution. It will help us identify and re-process problematic messages.
upvoted 0 times
Kayleigh
2 months ago
Option B sounds like a good solution. We can easily identify and re-process problematic messages.
upvoted 0 times
Brett
2 months ago
But wouldn't setting the maximum concurrency limit of the Lambda function to 1 be a more cost-effective solution?
upvoted 0 times
Annabelle
2 months ago
Option A seems like the easiest solution, but I'm not sure if it's the most cost-effective. FIFO queues can be more expensive than standard queues.
upvoted 0 times
Basilia
1 month ago
B: That's a good point, but changing the Amazon SQS standard queue to a FIFO queue with deduplication ID might be more cost-effective in the long run.
upvoted 0 times
Lakeesha
1 month ago
A: I think setting up a dead-letter queue could help prevent processing the same messages multiple times.
upvoted 0 times
Gladys
2 months ago
I disagree, setting up a dead-letter queue could also help in resolving the issue.
upvoted 0 times
Brett
3 months ago
I think the best option is to change the Amazon SQS standard queue to an Amazon SQS FIFO queue.
upvoted 0 times
