
Confluent Exam CCDAK Topic 1 Question 52 Discussion

Actual exam question for Confluent's CCDAK exam
Question #: 52
Topic #: 1

We have a store selling shoes. What dataset is a great candidate to be modeled as a KTable in Kafka Streams?

Suggested Answer: D

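For context on the debate below: a KTable keeps only the latest value per key (changelog/upsert semantics), so it suits datasets that describe current state, such as the inventory on hand right now, while an append-only stream of events like the sales transactions is usually read as a KStream. Here is a minimal Kafka Streams sketch illustrating that split; the topic names shoe-transactions and shoe-inventory are placeholders for illustration, not taken from the question.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;

public class ShoeStoreTopology {
    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // The transaction stream is an ever-growing sequence of immutable
        // events (every sale matters on its own), so it is read as a KStream.
        // Topic name is a placeholder.
        KStream<String, String> transactions = builder.stream(
                "shoe-transactions",
                Consumed.with(Serdes.String(), Serdes.String()));

        // Current inventory is "latest value per key" data: a new record for
        // a shoe model simply replaces the previous count. That upsert
        // semantic is exactly what a KTable models. Topic name is a placeholder.
        KTable<String, Long> inventory = builder.table(
                "shoe-inventory",
                Consumed.with(Serdes.String(), Serdes.Long()));

        // Build and start a KafkaStreams instance with this topology as usual.
    }
}
```

With this layout, any lookup or join against the inventory KTable always sees the most recent count per shoe model, which is the behavior the question is probing.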


Contribute your Thoughts:

Gladis
10 days ago
The transaction stream seems like the obvious choice here. That's the core of our business - we need to keep track of all the shoes that are being bought and sold.
upvoted 0 times
...
Hubert
20 days ago
I disagree. I think the inventory contents right now would be more useful for tracking available shoe sizes.
upvoted 0 times
...
Avery
21 days ago
I agree with Maile. The transaction stream would provide real-time data for shoe sales.
upvoted 0 times
...
Maile
24 days ago
I think the transaction stream would be a great candidate for a KTable.
upvoted 0 times
...
