
Confluent Exam CCDAK Topic 1 Question 68 Discussion

Actual exam question for Confluent's CCDAK exam
Question #: 68
Topic #: 1
[All CCDAK Questions]

I am producing Avro data to my Kafka cluster, which is integrated with the Confluent Schema Registry. After an incompatible schema change, I know my data will be rejected. Which component will reject the data?

Suggested Answer: D

The Schema Registry only stores and validates schemas; when the producer's serializer tries to register the incompatible schema, the registry rejects the registration (HTTP 409 Conflict). The producer's Avro serializer then fails with a SerializationException before the record is ever sent, so it is the Kafka producer itself that rejects the data -- the broker never sees it.
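To make the rejection concrete, here is a minimal toy model of the kind of BACKWARD-compatibility check the Schema Registry performs. This is an illustrative sketch, not Confluent's actual implementation; the function name `is_backward_compatible` and the dict-based field representation are assumptions made for the example. Under BACKWARD compatibility, the new (reader) schema must be able to read data written with the old schema, so any field the new schema adds must carry a default.

```python
def is_backward_compatible(old_fields: dict, new_fields: dict) -> bool:
    """Toy BACKWARD-compatibility check (illustrative only).

    old_fields / new_fields map field name -> default value
    (None means the field has no default, i.e. it is required).
    """
    for name, default in new_fields.items():
        # A field that is new AND has no default breaks backward
        # compatibility: old records cannot be decoded with it.
        if name not in old_fields and default is None:
            return False
    return True


# Old schema: {"name": string, "age": int}, both required
old = {"name": None, "age": None}

# Adding an optional field with a default is compatible:
print(is_backward_compatible(old, {"name": None, "age": None, "email": "n/a"}))

# Adding a required field (no default) is incompatible -- in the real
# system the registry would return 409 and the producer's serializer
# would raise before sending:
print(is_backward_compatible(old, {"name": None, "age": None, "ssn": None}))
```

In the real flow, this check runs inside the Schema Registry at registration time, but the failure surfaces in the producer: the serializer's registration call fails, and the record is never handed to the broker.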


Contribute your Thoughts:

Nan
19 hours ago
Definitely the Kafka Producer itself. It's the one sending the data, so it should be the one to handle any schema validation issues.
upvoted 0 times
Veronika
4 days ago
I think it's the Kafka Broker. That's where the data gets processed, so it makes sense that the broker would reject the data if the schema is incompatible.
upvoted 0 times
Tamera
17 days ago
The Confluent Schema Registry, of course! It's the component responsible for managing and enforcing schema compatibility, so it'll reject any data that doesn't match the registered schema.
upvoted 0 times
Dorsey
4 days ago
The Confluent Schema Registry
upvoted 0 times
Oretha
19 days ago
I agree with Jeannetta; the Confluent Schema Registry will reject the data if it's incompatible.
upvoted 0 times
Benedict
20 days ago
I think it's the Kafka Producer itself because it's the one sending the data.
upvoted 0 times
Jeannetta
22 days ago
A) The Confluent Schema Registry
upvoted 0 times
