Conversation

@WadeWaldron
Contributor

Description

Implementing Exercise 04 - Connecting to Confluent Cloud.

Currently, there are no tests or testing framework. I will investigate adding that in future exercises.

Checklist

  • Unit tests created/updated for any new code (where applicable).
  • Run all tests with ./build.sh validate.
  • Update the CHANGELOG.md.
  • Update the README.md if necessary.

@WadeWaldron WadeWaldron requested a review from a team as a code owner June 13, 2024 18:47
@cla-assistant

cla-assistant bot commented Jun 25, 2024

CLA assistant check
All committers have signed the CLA.

Member

@pmoskovi pmoskovi left a comment

Hey Wade - Looks good, only a few comments. Thank you!

Member

The color of the rectangle highlighting the "point of interest" in the screenshot is a bit subtle to me. Is this part of a template/guidance we follow? If not, I suggest changing the color to red (#f00). Here's an example in the GitHub documentation (granted their red is a bit darker than #f00): https://github.com/confluentinc/learn-apache-flink-table-api-for-java-exercises/pull/3/files

Contributor Author

@pmoskovi I think you posted the wrong link above.

I have used Confluent colors for the highlight. Sadly red is not in those colors (mostly just blue). However, if we want to break from Confluent Branding, I can definitely do something different.

Set the active environment.

```
confluent environment use <environment id>
```
Member

Where does the <environment id> come from? (I haven't run the command line, so this may be a dumb question.)

Contributor Author

When you run this command, you receive the environment id.

```
confluent environment create flink-table-api-java
```
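
A quick sketch of capturing that id for the `confluent environment use` step (the sample output below is hypothetical; the real command prints a table that includes the new environment's ID):

```shell
# Hypothetical sample of the table printed by:
#   confluent environment create flink-table-api-java
sample_output='+------+----------------------+
| ID   | env-abc123           |
| Name | flink-table-api-java |
+------+----------------------+'

# Environment ids look like "env-xxxxxx"; extract the first match so it
# can be passed to `confluent environment use <environment id>`.
env_id=$(printf '%s\n' "$sample_output" | grep -oE 'env-[a-z0-9]+' | head -n 1)
echo "$env_id"
```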


You will need a suitable Java development environment including:

- Java 17
Member

I assume Java 17 is a Flink requirement/dependency. What if someone has a more recent version? Will things fail? Is it still true that you can have multiple versions installed, as long as you have your JAVA_HOME set properly?

Contributor Author

Actually, we are targeting Java 21 for this. I will update it.

- Maven
- An IDE such as IntelliJ, Eclipse, or VS Code.

To easily switch between Java versions, you can use [SDKMAN](https://sdkman.io/).
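
A small sketch for sanity-checking the active Java version before building (the version string below is a hypothetical sample of `java -version` output, and `21-tem` is just one example SDKMAN identifier):

```shell
# Hypothetical sample line from `java -version` output.
version_line='openjdk version "21.0.2" 2024-01-16'

# Pull out the major version number from the quoted version string.
major=$(printf '%s\n' "$version_line" | sed -E 's/.*version "([0-9]+)\..*/\1/')

if [ "$major" -ge 21 ]; then
  echo "Java $major looks fine for the exercises"
else
  echo "Java $major is too old; e.g. with SDKMAN: sdk install java 21-tem"
fi
```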
Member

OK - you just answered my question above. Thank you... ;)


@alpinegizmo alpinegizmo left a comment

I had a lot of trouble getting this running. Some of the problems were of my own making, but some changes would help.

I found myself repeatedly banging my head against the exercise.sh and build.sh scripts and the exercise staging/solving setup. This gets in the way of how I would prefer to work. In this case, what I wanted to do was:

  • clone the repo
  • import the repo as a project in my IDE
  • using the IDE, configure things, run the solution
  • using the IDE, study the code more carefully

Later, when things got messed up, I found myself having to deal with those scripts much more than I wanted to.

The other problems I ran into:

  • I expected the README to tell me everything I needed to know, and didn't discover the instructions until much later.
  • I had problems building the confluent-table-planner, which I solved by disabling spotless.
  • The current version of the confluent-table-planner doesn't match the version that the exercise is configured to use.
  • Before trying to run exercise 4, I had done things with exercise 7. This left tests lying around that weren't deleted when I switched back to exercise 4. This prevented me from being able to build exercise 4.
  • I now have everything built, but it still doesn't work:

```
❯ java -jar target/flink-table-api-marketplace-0.1.jar
WARNING: sun.reflect.Reflection.getCallerClass is not supported. This will impact performance.
Exception in thread "main" java.lang.Error: Unresolved compilation problem:
Annotation types that do not specify explicit target element types cannot be applied here

    at io.confluent.flink.plugin.internal.DefaultPluginContext.waitForCondition(DefaultPluginContext.java:371)
    at io.confluent.flink.plugin.internal.DefaultPluginContext.submitStatement(DefaultPluginContext.java:325)
    at io.confluent.flink.plugin.internal.DefaultPluginContext.queryBoundedInternal(DefaultPluginContext.java:147)
    at io.confluent.flink.plugin.internal.PluginContext.queryBoundedInternal(PluginContext.java:60)
    at io.confluent.flink.plugin.internal.Utils.queryAddressableCatalogObjects(Utils.java:80)
    at io.confluent.flink.plugin.internal.Utils.queryAddressableCatalogs(Utils.java:55)
    at io.confluent.flink.plugin.internal.ConfluentCatalogStore.open(ConfluentCatalogStore.java:34)
    at org.apache.flink.table.catalog.CatalogStoreHolder.open(CatalogStoreHolder.java:124)
    at org.apache.flink.table.catalog.CatalogStoreHolder$Builder.build(CatalogStoreHolder.java:106)
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.create(TableEnvironmentImpl.java:283)
    at org.apache.flink.table.api.TableEnvironment.create(TableEnvironment.java:99)
    at marketplace.Marketplace.main(Marketplace.java:14)
```

@alpinegizmo

Using flink-1.20-SNAPSHOT seems highly suspect to me. Is this necessary?

@alpinegizmo

I decided to re-stage the exercise and try again. Now it runs without failure, but it doesn't produce any output. Not sure how to debug that.

And I unfortunately didn't think to put my cloud.properties file somewhere safe before re-staging the exercise, so I had the fun of recreating it.

@WadeWaldron
Contributor Author

WadeWaldron commented Jul 25, 2024

@alpinegizmo

> I expected the README to tell me everything I needed to know, and didn't discover the instructions until much later.

Yeah, I'm used to these instructions just ending up in Contentful. I didn't consider the flow when they are in Github. I'll put some links into the README to help with that.

> I had problems building the confluent-table-planner, which I solved by disabling spotless.

That does work. However, I believe this can also be resolved by changing Java versions. Honestly, I'd much prefer if they just had a published version of this so that we could avoid these issues. In the meantime, I'll run through the instructions on my machine and try to sort out the "exact" process for getting it to work.

> The current version of the confluent-table-planner doesn't match the version that the exercise is configured to use.

See above. I'll see if I can get more specific instructions to make sure everything works.

> Before trying to run exercise 4, I had done things with exercise 7. This left tests lying around that weren't deleted when I switched back to exercise 4. This prevented me from being able to build exercise 4.

The exercises are designed for a linear flow. Jumping back and forth could cause issues.

> Using flink-1.20-SNAPSHOT seems highly suspect to me. Is this necessary?

Probably not. I think this might be an artifact of using a sample repo when I created the project. I'll see if I can switch it to a stable version.

> I decided to re-stage the exercise and try again. Now it runs without failure, but it doesn't produce any output. Not sure how to debug that.

Did you print the results of the query?
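
For example, a minimal sketch of printing a bounded query's results with the Flink Table API (the query and settings here are hypothetical placeholders; the exercise builds its environment from cloud.properties via the Confluent plugin):

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;

public class PrintQueryResults {
    public static void main(String[] args) {
        // Hypothetical local setup; the exercise uses Confluent-specific settings.
        TableEnvironment env = TableEnvironment.create(
                EnvironmentSettings.newInstance().build());

        // Without a sink or an explicit print, a query produces no visible
        // output. execute().print() collects the rows and prints them.
        Table result = env.sqlQuery("SELECT 'hello' AS greeting");
        result.execute().print();
    }
}
```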

> And I unfortunately didn't think to put my cloud.properties file somewhere safe before re-staging the exercise, so I had the fun of recreating it.

Yeah, I've learned to be careful with that. But maybe I can fix it so that we don't have to be careful. I have an idea on how to make this easier.

@WadeWaldron
Contributor Author

Tried switching to Flink 1.19.1 from Flink 1.20-SNAPSHOT. Unfortunately, this resulted in an error:

```
java.lang.NoSuchMethodError: 'org.apache.flink.table.catalog.CatalogTable$Builder org.apache.flink.table.catalog.CatalogTable.newBuilder()'
```

Seems like the SNAPSHOT version is currently required.
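
For reference, pinning the snapshot in Maven might look roughly like this (the property name and repository id are assumptions, not taken from the exercise repo; snapshot artifacts also require a snapshot repository to be enabled):

```xml
<!-- Hypothetical pom.xml fragment. -->
<properties>
  <flink.version>1.20-SNAPSHOT</flink.version>
</properties>

<repositories>
  <repository>
    <id>apache-snapshots</id>
    <url>https://repository.apache.org/content/repositories/snapshots/</url>
    <snapshots><enabled>true</enabled></snapshots>
  </repository>
</repositories>
```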

@WadeWaldron WadeWaldron deleted the exercise-04 branch August 21, 2024 15:30