7 changes: 0 additions & 7 deletions _includes/shared/markup/ccloud/ccloud-sr-consume.adoc

This file was deleted.

8 changes: 0 additions & 8 deletions _includes/shared/markup/ccloud/ccloud-sr-produce.adoc
@@ -1,9 +1 @@
-You will be prompted for the Confluent Cloud Schema Registry credentials as shown below, which you can find in the `configuration/ccloud.properties` configuration file.
-Look for the configuration parameter `basic.auth.user.info`, whereby the ":" is the delimiter between the key and secret.
-
-```
-Enter your Schema Registry API key:
-Enter your Schema Registry API secret:
-```
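
For reference, the `basic.auth.user.info` parameter that the removed prose points to joins the Schema Registry API key and secret with the ":" delimiter. In a `configuration/ccloud.properties` file it looks something like this (the key and secret below are placeholders, not real credentials):

```
# Locate the Schema Registry credentials in the client configuration.
# SR_KEY and SR_SECRET are illustrative placeholders.
$ grep basic.auth.user.info configuration/ccloud.properties
basic.auth.user.info=SR_KEY:SR_SECRET
```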

When the console producer starts, it will log some messages and hang, waiting for your input. Type in one line at a time and press enter to send it. Each line represents an event. To send all of the events below, paste the following into the prompt and press enter:
@@ -1,18 +1,9 @@
Next, let's open up a consumer to read records from the new topic.

-From the same terminal you used to create the topic above, run the following command to start a console consumer with the `ccloud` CLI:
+From the same terminal you used to create the topic above, run the following command to start a console consumer with the Confluent CLI:

+++++
<pre class="snippet"><code class="shell">{% include_raw tutorials/console-consumer-producer-avro/confluent/code/tutorial-steps/dev/harness-console-consumer-keys.sh %}</code></pre>
+++++

You will be prompted for the Confluent Cloud Schema Registry credentials as shown below.
Enter the values you received when you enabled Schema Registry in the Confluent Cloud Console.

```
Enter your Schema Registry API key:
Enter your Schema Registry API secret:
```

The consumer will start up and block waiting for records; you won't see any output until after the next step.
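
The included consumer script isn't shown in this diff. A minimal sketch of an equivalent Confluent CLI invocation, assuming a hypothetical topic name `orders-avro`, would be:

```
# Sketch only: the topic name is a placeholder; the tutorial's script may differ.
confluent kafka topic consume orders-avro --value-format avro --from-beginning
```

Because `--value-format avro` requires Schema Registry, the CLI prompts for the API key and secret described above.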

@@ -1,4 +1,4 @@
-Now that the Kafka Streams application is running, run a command line consumer using the `ccloud` CLI to view the events (your `ccloud` context should be set to the proper environment, cluster, and API Key (see Step 4 above and https://docs.confluent.io/ccloud-cli/current/command-reference/index.html[Confluent CLI Reference] for additional details).
+Now that the Kafka Streams application is running, run a command line consumer using the Confluent CLI to view the events (your `confluent` context should be set to the proper environment, cluster, and API key; see Step 4 above and https://docs.confluent.io/confluent-cli/current/command-reference/overview.html[Confluent CLI Reference] for additional details).

Then, in a new terminal window, run the following console consumer to view the events being generated by the data generator and produced to the `random-strings` topic from the `Randomizer` class in your Kafka Streams application. These are the events that have been streamed into the topology (`.stream(inputTopic, Consumed.with(stringSerde, stringSerde))`).
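
Assuming defaults, that console consumer invocation could look like the following (a sketch, not necessarily the tutorial's exact script):

```
# Read the randomly generated strings from the beginning of the topic.
confluent kafka topic consume random-strings --from-beginning
```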

@@ -4,14 +4,6 @@ In a new terminal window, run the following command to start a Confluent CLI producer
<pre class="snippet"><code class="bash">{% include_raw tutorials/finding-distinct/confluent/code/tutorial-steps/dev/ccloud-produce-events.sh %}</code></pre>
+++++

-You will be prompted for the Confluent Cloud Schema Registry credentials as shown below, which you can find in the `configuration/ccloud.properties` configuration file.
-Look for the configuration parameter `basic.auth.user.info`, whereby the ":" is the delimiter between the key and secret.
-
-```
-Enter your Schema Registry API key:
-Enter your Schema Registry API secret:
-```

When the producer starts, it will log some messages and hang, waiting for your input. Each line represents input data for the Kafka Streams application.
To send all of the events below, paste the following into the prompt and press enter:

@@ -21,4 +13,4 @@ To send all of the events below, paste the following into the prompt and press enter:

Enter `Ctrl-C` to exit.

In the next steps we will run a consumer to observe the distinct click events. You can experiment with various orderings of the records to observe what makes a click event distinct. By default, the distinct-event window store looks for distinct clicks over a 2-minute duration.
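
For intuition, here is a hypothetical input sequence (the field names are illustrative, not the tutorial's actual schema): two clicks from the same IP inside the 2-minute window collapse to one distinct event, while a later click outside the window is emitted again.

```
{"ip": "10.0.0.1", "url": "https://acme.com/index.html", "timestamp": "2021-01-01T12:00:00Z"}
{"ip": "10.0.0.1", "url": "https://acme.com/index.html", "timestamp": "2021-01-01T12:01:00Z"}
{"ip": "10.0.0.1", "url": "https://acme.com/index.html", "timestamp": "2021-01-01T12:05:00Z"}
```

Here the second record would be suppressed as a duplicate of the first, while the third would be emitted as distinct again because it falls outside the window.
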
@@ -10,4 +10,4 @@ include::_includes/shared/markup/ccloud/ccloud-sr-produce.adoc[]
<pre class="snippet"><code class="json">{% include_raw tutorials/joining-stream-table/kstreams/code/tutorial-steps/dev/movies.json %}</code></pre>
+++++

-In this case the table data originates from a Kafka topic that was populated by a console producer using `ccloud` CLI but this doesn't always have to be the case. You can use Kafka Connect to stream data from a source system (such as a database) into a Kafka topic, which could then be the foundation for a lookup table. For further reading checkout this tutorial on link:{{ "connect-add-key-to-source/kstreams.html" | relative_url }}[creating a Kafka Streams table from SQLite data using Kafka Connect].
+In this case the table data originates from a Kafka topic that was populated by a console producer using the Confluent CLI, but this doesn't always have to be the case. You can use Kafka Connect to stream data from a source system (such as a database) into a Kafka topic, which could then be the foundation for a lookup table. For further reading, check out this tutorial on link:{{ "connect-add-key-to-source/kstreams.html" | relative_url }}[creating a Kafka Streams table from SQLite data using Kafka Connect].
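
As a sketch of that pattern, here is what registering a hypothetical JDBC source connector against a self-managed Connect worker might look like (the worker URL, connector name, and settings are illustrative assumptions):

```
# Register a hypothetical JDBC source connector via the Connect REST API.
curl -X POST -H "Content-Type: application/json" http://localhost:8083/connectors \
  -d '{
        "name": "movies-db-source",
        "config": {
          "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
          "connection.url": "jdbc:sqlite:/tmp/movies.db",
          "mode": "incrementing",
          "incrementing.column.name": "id",
          "topic.prefix": "movies-"
        }
      }'
```

Each table picked up by the connector lands in a topic named with the `topic.prefix`, which could then back a lookup table.
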
@@ -1,6 +1,6 @@
To check the status of the connector from the command line, you have the same two options as for provisioning.

-*Option 1.* Using the `ccloud` CLI.
+*Option 1.* Using the Confluent CLI.

+++++
<pre class="snippet"><code class="shell">{% include_raw tutorials/kafka-connect-datagen/confluent/code/tutorial-steps/dev/check-connector.sh %}</code></pre>
+++++
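
The contents of `check-connector.sh` aren't shown in this diff; at the time of this change, a status check with the Confluent CLI looked roughly like the following (a sketch, not the tutorial's exact script):

```
# List managed connectors along with their status (Running, Paused, Failed, ...).
confluent connect list
```
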
@@ -4,8 +4,6 @@ Now that your Kafka Streams application is running, open a new terminal window,
```
confluent kafka topic consume output-topic --from-beginning --print-key
```

-include::_includes/shared/markup/ccloud/ccloud-sr-consume.adoc[]

Your results should look something like this:
++++
<pre class="snippet"><code class="shell">