
Google Exam Professional Cloud DevOps Engineer Topic 1 Question 66 Discussion

Actual exam question for Google's Professional Cloud DevOps Engineer exam
Question #: 66
Topic #: 1

You need to create a Cloud Monitoring SLO for a service that will be published soon. You want to verify that requests to the service will be addressed in less than 300 ms at least 90% of the time per calendar month. You need to identify the metric and evaluation method to use. What should you do?

Suggested Answer: B
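
For reference, an SLO of the shape the question describes (a latency threshold evaluated against the fraction of individual requests, per calendar month) can be created through the Cloud Monitoring API. The sketch below uses the google-cloud-monitoring Python client; the project ID, service ID, and the latency metric in the filter are placeholders rather than values from the question, and the code assumes the service is already registered in Cloud Monitoring.

from google.cloud import monitoring_v3
from google.type import calendar_period_pb2

client = monitoring_v3.ServiceMonitoringServiceClient()

# The SLO is attached to an existing Cloud Monitoring service.
# PROJECT_ID and SERVICE_ID are placeholders.
parent = "projects/PROJECT_ID/services/SERVICE_ID"

slo = {
    "display_name": "90% of requests under 300 ms per calendar month",
    "service_level_indicator": {
        # Request-based SLI: every request is classified as good or bad.
        "request_based": {
            "distribution_cut": {
                # Placeholder distribution-valued latency metric, assumed to
                # record request latency in seconds.
                "distribution_filter": (
                    'metric.type="custom.googleapis.com/request_latency" '
                    'resource.type="gce_instance"'
                ),
                # Requests completing in under 0.3 s (300 ms) count as good.
                "range": {"min": 0, "max": 0.3},
            }
        }
    },
    "goal": 0.9,  # at least 90% of requests must be good
    "calendar_period": calendar_period_pb2.CalendarPeriod.MONTH,
}

created = client.create_service_level_objective(
    request={"parent": parent, "service_level_objective": slo}
)
print("Created SLO:", created.name)

The request_based / distribution_cut indicator is what makes the evaluation request-based, the range caps good requests at 300 ms, and goal plus calendar_period express "at least 90% per calendar month."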

Contribute your Thoughts:

Tommy
29 days ago
A) is the way to go. Gotta love these exam questions that are pretty much fill-in-the-blank. Though I'm still trying to figure out why the service will be 'published soon' - is it going to be a bestselling novel or something?
upvoted 0 times
...
Cruz
1 month ago
Hmm, this one's a no-brainer. A) is the right choice - latency metric and request-based evaluation. Though I'm a bit curious why they didn't just say 'select option A' instead of all this mumbo-jumbo.
upvoted 0 times
...
Domingo
1 month ago
A) is the clear winner here. Latency is the metric you need, and a request-based method is the way to go to meet the 90% threshold per calendar month.
upvoted 0 times
...
Loren
1 month ago
Definitely go with A) Select a latency metric for a request-based method of evaluation. That's the only option that matches the criteria of verifying requests are addressed in fewer than 300 ms at least 90% of the time.
upvoted 0 times
Jonell
15 days ago
B) Choose a throughput metric for a time-based method of evaluation.
upvoted 0 times
...
Magnolia
20 days ago
A) Select a latency metric for a request-based method of evaluation.
upvoted 0 times
...
...
Kate
2 months ago
I disagree, I believe we should select an availability metric for a window-based method of evaluation.
upvoted 0 times
...
Ashton
2 months ago
I agree with Lashanda, it makes sense to track latency for this type of service.
upvoted 0 times
...
Lashanda
2 months ago
I think we should select a latency metric for a request-based method of evaluation.
upvoted 0 times
...
Elena
2 months ago
I disagree, I believe we should select an availability metric for a window-based method of evaluation.
upvoted 0 times
...
Isaac
3 months ago
I agree with Dottie, it makes sense to track latency for this type of service.
upvoted 0 times
...
Dottie
3 months ago
I think we should select a latency metric for a request-based method of evaluation.
upvoted 0 times
...
