From ecf85996405f19092fdb900d2d7d9332fbe1807d Mon Sep 17 00:00:00 2001 From: Brian Strauch Date: Thu, 11 Jan 2024 09:05:12 -0800 Subject: [PATCH] remove sr api key prompt --- .../shared/markup/ccloud/ccloud-sr-consume.adoc | 7 ------- .../shared/markup/ccloud/ccloud-sr-produce.adoc | 8 -------- .../confluent/markup/dev/consume-topic.adoc | 13 ++----------- .../confluent/markup/dev/run-consumer.adoc | 2 +- .../confluent/markup/dev/ccloud-run-produce.adoc | 10 +--------- .../confluent/markup/dev/ccloud-produce-movies.adoc | 2 +- .../confluent/markup/dev/check-connector.adoc | 2 +- .../confluent/markup/dev/ccloud-run-consumer.adoc | 2 -- 8 files changed, 6 insertions(+), 40 deletions(-) delete mode 100644 _includes/shared/markup/ccloud/ccloud-sr-consume.adoc diff --git a/_includes/shared/markup/ccloud/ccloud-sr-consume.adoc b/_includes/shared/markup/ccloud/ccloud-sr-consume.adoc deleted file mode 100644 index 42edd31d57..0000000000 --- a/_includes/shared/markup/ccloud/ccloud-sr-consume.adoc +++ /dev/null @@ -1,7 +0,0 @@ -You will be prompted for the Confluent Cloud Schema Registry credentials as shown below, which you can find in the `configuration/ccloud.properties` configuration file. -Look for the configuration parameter `basic.auth.user.info`, whereby the ":" is the delimiter between the key and secret. - -``` -Enter your Schema Registry API key: -Enter your Schema Registry API secret: -``` \ No newline at end of file diff --git a/_includes/shared/markup/ccloud/ccloud-sr-produce.adoc b/_includes/shared/markup/ccloud/ccloud-sr-produce.adoc index 1079d592cc..4b7a77e67f 100644 --- a/_includes/shared/markup/ccloud/ccloud-sr-produce.adoc +++ b/_includes/shared/markup/ccloud/ccloud-sr-produce.adoc @@ -1,9 +1 @@ -You will be prompted for the Confluent Cloud Schema Registry credentials as shown below, which you can find in the `configuration/ccloud.properties` configuration file. 
-Look for the configuration parameter `basic.auth.user.info`, whereby the ":" is the delimiter between the key and secret. - -``` -Enter your Schema Registry API key: -Enter your Schema Registry API secret: -``` - When the console producer starts, it will log some messages and hang, waiting for your input. Type in one line at a time and press enter to send it. Each line represents an event. To send all of the events below, paste the following into the prompt and press enter: diff --git a/_includes/tutorials/console-consumer-producer-avro/confluent/markup/dev/consume-topic.adoc b/_includes/tutorials/console-consumer-producer-avro/confluent/markup/dev/consume-topic.adoc index cb1a071ba7..93933eae93 100644 --- a/_includes/tutorials/console-consumer-producer-avro/confluent/markup/dev/consume-topic.adoc +++ b/_includes/tutorials/console-consumer-producer-avro/confluent/markup/dev/consume-topic.adoc @@ -1,18 +1,9 @@ -Next, let's open up a consumer to read records from the new topic. +Next, let's open up a consumer to read records from the new topic. -From the same terminal you used to create the topic above, run the following command to start a console consumer with the `ccloud` CLI: +From the same terminal you used to create the topic above, run the following command to start a console consumer with the Confluent CLI: +++++
{% include_raw tutorials/console-consumer-producer-avro/confluent/code/tutorial-steps/dev/harness-console-consumer-keys.sh %}
+++++ -You will be prompted for the Confluent Cloud Schema Registry credentials as shown below. -Enter the values you got from when you enabled Schema Registry in the Confluent Cloud Console. - -``` -Enter your Schema Registry API key: -Enter your Schema Registry API secret: -``` - The consumer will start up and block waiting for records, you won't see any output until after the next step. - diff --git a/_includes/tutorials/creating-first-apache-kafka-streams-application/confluent/markup/dev/run-consumer.adoc b/_includes/tutorials/creating-first-apache-kafka-streams-application/confluent/markup/dev/run-consumer.adoc index 061f3c3c32..a0d5bc193c 100644 --- a/_includes/tutorials/creating-first-apache-kafka-streams-application/confluent/markup/dev/run-consumer.adoc +++ b/_includes/tutorials/creating-first-apache-kafka-streams-application/confluent/markup/dev/run-consumer.adoc @@ -1,4 +1,4 @@ -Now that the Kafka Streams application is running, run a command line consumer using the `ccloud` CLI to view the events (your `ccloud` context should be set to the proper environment, cluster, and API Key (see Step 4 above and https://docs.confluent.io/ccloud-cli/current/command-reference/index.html[Confluent CLI Reference] for additional details). +Now that the Kafka Streams application is running, run a command line consumer using the Confluent CLI to view the events (your `confluent` context should be set to the proper environment, cluster, and API key; see Step 4 above and https://docs.confluent.io/confluent-cli/current/command-reference/overview.html[Confluent CLI Reference] for additional details). Then, in a new terminal window, run the following console consumer to view the events being generated by the data generator and produced to the `random-strings` topic from the `Randomizer` class in your Kafka Streams application. These are the events that have been streamed into the topology (`.stream(inputTopic, Consumed.with(stringSerde, stringSerde)`).
diff --git a/_includes/tutorials/finding-distinct/confluent/markup/dev/ccloud-run-produce.adoc b/_includes/tutorials/finding-distinct/confluent/markup/dev/ccloud-run-produce.adoc index 7aab132fa1..11edff6820 100644 --- a/_includes/tutorials/finding-distinct/confluent/markup/dev/ccloud-run-produce.adoc +++ b/_includes/tutorials/finding-distinct/confluent/markup/dev/ccloud-run-produce.adoc @@ -4,14 +4,6 @@ In a new terminal window, run the following command to start a Confluent CLI pro
{% include_raw tutorials/finding-distinct/confluent/code/tutorial-steps/dev/ccloud-produce-events.sh %}
+++++ -You will be prompted for the Confluent Cloud Schema Registry credentials as shown below, which you can find in the `configuration/ccloud.properties` configuration file. -Look for the configuration parameter `basic.auth.user.info`, whereby the ":" is the delimiter between the key and secret. - -``` -Enter your Schema Registry API key: -Enter your Schema Registry API secret: -``` - When the producer starts, it will log some messages and hang, waiting for your input. Each line represents input data for the Kafka Streams application. To send all of the events below, paste the following into the prompt and press enter: @@ -21,4 +13,4 @@ To send all of the events below, paste the following into the prompt and press e Enter `Ctrl-C` to exit. -In the next steps we will run a consumer to observe the distinct click events. You can experiment with various orderings of the records in order to observe what makes a click event distinct. By default the distinct event window store looks for distinct clicks over a 2-minute duration. \ No newline at end of file +In the next steps we will run a consumer to observe the distinct click events. You can experiment with various orderings of the records in order to observe what makes a click event distinct. By default the distinct event window store looks for distinct clicks over a 2-minute duration. diff --git a/_includes/tutorials/joining-stream-table/confluent/markup/dev/ccloud-produce-movies.adoc b/_includes/tutorials/joining-stream-table/confluent/markup/dev/ccloud-produce-movies.adoc index 57f774a803..3e6ada75a2 100644 --- a/_includes/tutorials/joining-stream-table/confluent/markup/dev/ccloud-produce-movies.adoc +++ b/_includes/tutorials/joining-stream-table/confluent/markup/dev/ccloud-produce-movies.adoc @@ -10,4 +10,4 @@ include::_includes/shared/markup/ccloud/ccloud-sr-produce.adoc[]
{% include_raw tutorials/joining-stream-table/kstreams/code/tutorial-steps/dev/movies.json %}
+++++ -In this case the table data originates from a Kafka topic that was populated by a console producer using `ccloud` CLI but this doesn't always have to be the case. You can use Kafka Connect to stream data from a source system (such as a database) into a Kafka topic, which could then be the foundation for a lookup table. For further reading checkout this tutorial on link:{{ "connect-add-key-to-source/kstreams.html" | relative_url }}[creating a Kafka Streams table from SQLite data using Kafka Connect]. +In this case the table data originates from a Kafka topic that was populated by a console producer using the Confluent CLI, but this doesn't always have to be the case. You can use Kafka Connect to stream data from a source system (such as a database) into a Kafka topic, which could then be the foundation for a lookup table. For further reading, check out this tutorial on link:{{ "connect-add-key-to-source/kstreams.html" | relative_url }}[creating a Kafka Streams table from SQLite data using Kafka Connect]. diff --git a/_includes/tutorials/kafka-connect-datagen/confluent/markup/dev/check-connector.adoc b/_includes/tutorials/kafka-connect-datagen/confluent/markup/dev/check-connector.adoc index 8ff37105e8..91e4337ac9 100644 --- a/_includes/tutorials/kafka-connect-datagen/confluent/markup/dev/check-connector.adoc +++ b/_includes/tutorials/kafka-connect-datagen/confluent/markup/dev/check-connector.adoc @@ -1,6 +1,6 @@ To check the status of the connector from the command line, you have the same two options as provisioning. -*Option 1.* Using the `ccloud` CLI. +*Option 1.* Using the Confluent CLI. +++++
{% include_raw tutorials/kafka-connect-datagen/confluent/code/tutorial-steps/dev/check-connector.sh %}
diff --git a/_includes/tutorials/session-windows/confluent/markup/dev/ccloud-run-consumer.adoc b/_includes/tutorials/session-windows/confluent/markup/dev/ccloud-run-consumer.adoc index 3f8a0e23be..16d4982ddd 100644 --- a/_includes/tutorials/session-windows/confluent/markup/dev/ccloud-run-consumer.adoc +++ b/_includes/tutorials/session-windows/confluent/markup/dev/ccloud-run-consumer.adoc @@ -4,8 +4,6 @@ Now that your Kafka Streams application is running, open a new terminal window, confluent kafka topic consume output-topic --from-beginning --print-key ``` -include::_includes/shared/markup/ccloud/ccloud-sr-consume.adoc[] - Your results should look something like this: ++++
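
Context for the removed prompts: the deleted doc text pointed readers at the `basic.auth.user.info` parameter in `configuration/ccloud.properties`, where ":" delimits the Schema Registry API key from the secret. A minimal shell sketch of that parsing; the sample credential value below is hypothetical, not from any real config:

```shell
# Hypothetical line as it would appear in configuration/ccloud.properties;
# in practice you would grep it out of the real file.
props_line='basic.auth.user.info=ABC123:s3cr3t'

creds="${props_line#*=}"      # strip "basic.auth.user.info=" -> "ABC123:s3cr3t"
sr_key="${creds%%:*}"         # text before the first ":" is the API key
sr_secret="${creds#*:}"       # text after the first ":" is the API secret

echo "$sr_key"
echo "$sr_secret"
```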