
Confluent Certified Developer for Apache Kafka Certification Examination Questions and Answers

Confluent Certified Developer for Apache Kafka Certification Examination

Last Update: May 18, 2024
Total Questions: 150

We are offering free CCDAK Confluent exam questions. All you have to do is sign up and give your details, prepare with the free CCDAK exam questions, and then move on to the complete pool of Confluent Certified Developer for Apache Kafka Certification Examination test questions for further practice.

CCDAK PDF: $35 (regular price $99.99)
CCDAK Testing Engine: $42 (regular price $119.99)
CCDAK PDF + Testing Engine: $56 (regular price $159.99)
Question 1

How does a consumer commit offsets in Kafka?

Options:

A. It directly sends a message to the __consumer_offsets topic
B. It interacts with the Group Coordinator broker
C. It directly commits the offsets in Zookeeper
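
For background on this question, here is a minimal sketch (not taken from the exam) of a consumer committing offsets manually after a poll; the broker address, group id and topic name are placeholder values. When commitSync() is called, the Java client sends the commit request to the broker acting as the Group Coordinator for the consumer group.

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class ManualCommitSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // placeholder broker address
        props.put("group.id", "demo-group");                 // placeholder consumer group
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("enable.auto.commit", "false");            // commit explicitly instead of automatically

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("topic1"));   // placeholder topic
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("partition=%d offset=%d%n", record.partition(), record.offset());
            }
            consumer.commitSync();   // commit the offsets returned by the last poll
        }
    }
}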

Question 2

An ecommerce website sells custom-made goods. What is the natural way of modeling this data in Kafka Streams?

Options:

A. Purchase as stream, Product as stream, Customer as stream
B. Purchase as stream, Product as table, Customer as table
C. Purchase as table, Product as table, Customer as table
D. Purchase as stream, Product as table, Customer as stream
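
As a point of reference, the sketch below (not part of the exam question) shows the difference between reading a topic as a KStream and as a KTable in the Streams DSL; the topic names are hypothetical. Event-like feeds such as purchases are usually modelled as streams, while entity data that is updated in place, such as a product catalogue or customer profiles, is usually modelled as tables.

import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;

public class ModelingSketch {
    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();
        // Each purchase is an independent, append-only event, so it is read as a stream.
        KStream<String, String> purchases = builder.stream("purchases");
        // Products and customers are entities keyed by id where only the latest value matters,
        // so they are read as tables (changelog semantics).
        KTable<String, String> products = builder.table("products");
        KTable<String, String> customers = builder.table("customers");
        builder.build();
    }
}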

Question 3

What exceptions may be caught by the following producer? (select two)

ProducerRecord record =
    new ProducerRecord<>("topic1", "key1", "value1");
try {
    producer.send(record);
} catch (Exception e) {
    e.printStackTrace();
}

Options:

A. BrokerNotAvailableException
B. SerializationException
C. InvalidPartitionsException
D. BufferExhaustedException

Question 4

Suppose you have 6 brokers and you decide to create a topic with 10 partitions and a replication factor of 3. The brokers 0 and 1 are on rack A, the brokers 2 and 3 are on rack B, and the brokers 4 and 5 are on rack C. If the leader for partition 0 is on broker 4, and the first replica is on broker 2, which broker can host the last replica? (select two)

Options:

A. 6
B. 1
C. 2
D. 5
E. 0
F. 3

Question 5

Which Kafka CLI should you use to consume from a topic?

Options:

A. kafka-console-consumer
B. kafka-topics
C. kafka-console
D. kafka-consumer-groups

Question 6

What is a generic unique id that I can use for messages I receive from a consumer?

Options:

A. topic + partition + timestamp
B. topic + partition + offset
C. topic + timestamp
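
For illustration only (the helper name is hypothetical and not part of the exam question), a record consumed from Kafka can be identified by the coordinates the client already exposes: within a topic, a (partition, offset) pair is unique, so combining it with the topic name gives an id that is unique across topics.

import org.apache.kafka.clients.consumer.ConsumerRecord;

public class RecordIdSketch {
    // Builds an id of the form topic-partition-offset for a consumed record.
    static String recordId(ConsumerRecord<?, ?> record) {
        return record.topic() + "-" + record.partition() + "-" + record.offset();
    }
}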

Question 7

A producer application on a developer's machine was able to send messages to a Kafka topic. After the producer application was copied to another developer's machine, it can connect to Kafka but is unable to produce to the same topic because of an authorization issue. What is the likely issue?

Options:

A. Broker configuration needs to be changed to allow a different producer
B. You cannot copy a producer application from one machine to another
C. The Kafka ACL does not allow another machine IP
D. The Kafka Broker needs to be rebooted

Question 8

To produce data to a topic, a producer must provide the Kafka client with...

Options:

A. the list of brokers that have the data, the topic name and the partitions list
B. any broker from the cluster and the topic name and the partitions list
C. all the brokers from the cluster and the topic name
D. any broker from the cluster and the topic name
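
For context, here is a minimal producer sketch (not part of the exam question); the broker address and topic name are placeholders. The producer is configured with one or more bootstrap brokers and uses metadata requests to discover the rest of the cluster and the partition layout, so only a reachable broker and the topic name need to be supplied.

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class ProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");  // any reachable broker; used to discover the cluster
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Only the topic name is given; the partition is chosen by the partitioner.
            producer.send(new ProducerRecord<>("orders", "key1", "value1"));
        }
    }
}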

Question 9

A consumer wants to read messages from a specific partition of a topic. How can this be achieved?

Options:

A. Call subscribe(String topic, int partition) passing the topic and partition number as the arguments
B. Call assign() passing a Collection of TopicPartitions as the argument
C. Call subscribe() passing TopicPartition as the argument
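
For reference, a sketch (not part of the exam question) of attaching a consumer to one specific partition with assign(); the topic name and partition number are placeholders. assign() takes a collection of TopicPartition objects and bypasses the group subscription mechanism used by subscribe().

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;

public class AssignSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");  // placeholder broker address
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // Read only partition 0 of the topic instead of subscribing to the whole topic.
            consumer.assign(Collections.singletonList(new TopicPartition("orders", 0)));
            consumer.poll(Duration.ofMillis(500));
        }
    }
}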

Question 10

Which KSQL queries write to Kafka?

Options:

A. COUNT and JOIN
B. SHOW STREAMS and EXPLAIN statements
C. CREATE STREAM WITH and CREATE TABLE WITH
D. CREATE STREAM AS SELECT and CREATE TABLE AS SELECT

Question 11

Is KSQL ANSI SQL compliant?

Options:

A. Yes
B. No

Question 12

Using the Confluent Schema Registry, where are Avro schemas stored?

Options:

A. In the Schema Registry embedded SQL database
B. In the Zookeeper node /schemas
C. In the message bytes themselves
D. In the _schemas topic

Question 13

There are 3 producers writing to a topic with 5 partitions. There are 10 consumers consuming from the topic as part of the same group. How many consumers will remain idle?

Options:

A. 10
B. 3
C. None
D. 5

Question 14

How do you create a topic named test with 3 partitions and 3 replicas using the Kafka CLI?

Options:

A. bin/kafka-topics.sh --create --broker-list localhost:9092 --replication-factor 3 --partitions 3 --topic test
B. bin/kafka-topics-create.sh --zookeeper localhost:9092 --replication-factor 3 --partitions 3 --topic test
C. bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 3 --partitions 3 --topic test
D. bin/kafka-topics.sh --create --bootstrap-server localhost:2181 --replication-factor 3 --partitions 3 --topic test

Question 15

Which of the following errors are retriable from a producer perspective? (select two)

Options:

A. MESSAGE_TOO_LARGE
B. INVALID_REQUIRED_ACKS
C. NOT_ENOUGH_REPLICAS
D. NOT_LEADER_FOR_PARTITION
E. TOPIC_AUTHORIZATION_FAILED
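
As background (and not an answer key), the sketch below shows the producer settings that govern how retriable errors are handled; the values are placeholders. Transient broker-side errors are retried automatically by the producer within these limits, while non-retriable errors fail the send immediately.

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;

public class RetryConfigSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // placeholder broker address
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("retries", "2147483647");                  // how many times retriable errors are retried
        props.put("retry.backoff.ms", "100");                // wait between retries
        props.put("delivery.timeout.ms", "120000");          // upper bound on the total time spent retrying

        // A producer built from these properties retries retriable errors transparently.
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // sends would go here
        }
    }
}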

Question 16

How do Kafka brokers achieve high performance when serving producers and consumers? (select two)

Options:

A. It compresses the messages as it writes to the disk
B. It leverages zero-copy optimisations to send data straight from the page-cache
C. It buffers the messages on disk, and sends messages from the disk reads
D. It transforms the messages into a binary format
E. It does not transform the messages

Question 17

A topic receives all the orders for the products that are available on a commerce site. Two applications want to process all the messages independently: order fulfilment and monitoring. The topic has 4 partitions. How would you organise the consumers for optimal performance and resource usage?

Options:

A. Create 8 consumers in the same group with 4 consumers for each application
B. Create two consumer groups for two applications with 8 consumers in each
C. Create two consumer groups for two applications with 4 consumers in each
D. Create four consumers in the same group, one for each partition - two for fulfilment and two for monitoring
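
For reference (not part of the exam question), a sketch of how two independent applications each receive every message by using their own group.id; the group names and topic are placeholders. Within a single group, at most one consumer reads a given partition, so with 4 partitions a group cannot usefully contain more than 4 consumers.

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class GroupsSketch {
    // Builds a consumer belonging to the given group; each application uses its own group.id
    // and therefore gets its own full copy of the topic.
    static KafkaConsumer<String, String> consumerFor(String groupId) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");  // placeholder broker address
        props.put("group.id", groupId);
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
        consumer.subscribe(Collections.singletonList("orders"));  // placeholder topic with 4 partitions
        return consumer;
    }

    public static void main(String[] args) {
        KafkaConsumer<String, String> fulfilment = consumerFor("fulfilment-group");
        KafkaConsumer<String, String> monitoring = consumerFor("monitoring-group");
        fulfilment.close();
        monitoring.close();
    }
}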

Question 18

Which of the following Kafka Streams operators are stateful? (select all that apply)

Options:

A. flatmap
B. reduce
C. joining
D. count
E. peek
F. aggregate
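
As a point of reference (not part of the exam question), a small topology contrasting a stateless operator with a stateful one; the topic and store names are placeholders. Record-at-a-time operators such as peek, map and flatMap need no state store, whereas aggregations (count, reduce, aggregate) and joins are backed by state stores.

import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KTable;
import org.apache.kafka.streams.kstream.Materialized;

public class StatefulnessSketch {
    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> events = builder.stream("events-input");   // placeholder topic

        KTable<String, Long> counts = events
                .peek((key, value) -> System.out.println(key + "=" + value))  // stateless: no state store
                .groupByKey()
                .count(Materialized.as("event-counts"));                      // stateful: backed by a state store

        builder.build();
    }
}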

Question 19

If I want to send binary data through the REST Proxy, it needs to be base64-encoded. Which component needs to encode the binary data into base64?

Options:

A. The Producer
B. The Kafka Broker
C. Zookeeper
D. The REST Proxy

Question 20

What should the heap size of a broker be in a production setup, on a machine with 256 GB of RAM, in PLAINTEXT mode?

Options:

A. 4 GB
B. 128 GB
C. 16 GB
D. 512 MB

Question 21

Which client protocols are supported by the Schema Registry? (select two)

Options:

A. HTTP
B. HTTPS
C. JDBC
D. Websocket
E. SASL

Question 22

StreamsBuilder builder = new StreamsBuilder();
KStream<String, String> textLines = builder.stream("word-count-input");
KTable<String, Long> wordCounts = textLines
    .mapValues(textLine -> textLine.toLowerCase())
    .flatMapValues(textLine -> Arrays.asList(textLine.split("\\W+")))
    .selectKey((key, word) -> word)
    .groupByKey()
    .count(Materialized.as("Counts"));
wordCounts.toStream().to("word-count-output", Produced.with(Serdes.String(), Serdes.Long()));
builder.build();

What is an adequate topic configuration for the topic word-count-output?

Options:

A. max.message.bytes=10000000
B. cleanup.policy=delete
C. compression.type=lz4
D. cleanup.policy=compact
