Implementing Exercise 04 - Connecting to Confluent Cloud #3
Conversation
pmoskovi left a comment:
Hey Wade - Looks good, only a few comments. Thank you!
The color of the rectangle highlighting the "point of interest" in the screenshot is a bit subtle to me. Is this part of a template/guidance we follow? If not, I suggest changing the color to red (#f00). Here's an example in the GitHub documentation (granted their red is a bit darker than #f00): https://github.com/confluentinc/learn-apache-flink-table-api-for-java-exercises/pull/3/files
@pmoskovi I think you posted the wrong link above.
I have used Confluent colors for the highlight. Sadly red is not in those colors (mostly just blue). However, if we want to break from Confluent Branding, I can definitely do something different.
Set the active environment.

```
confluent environment use <environment id>
```
Where does the <environment id> come from? (I haven't run the command line, so this may be a dumb question.)
When you run this command, you receive the environment id:

```
confluent environment create flink-table-api-java
```
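As a sketch of the full flow (the create and use commands are the ones from this thread; `confluent environment list` is an additional lookup the Confluent CLI provides, shown here as a convenience):

```
# Create the environment; the CLI prints the new environment's id in its output.
confluent environment create flink-table-api-java

# If you missed it, you can look the id up again later:
confluent environment list

# Make it the active environment, substituting the id from above:
confluent environment use <environment id>
```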
You will need a suitable Java development environment including:

- Java 17
I assume Java 17 is a Flink requirement/dependency. What if someone has a more recent version? Will things fail? Is it still true that you can have multiple versions installed, as long as you have your JAVA_HOME set properly?
Actually, we are targeting Java 21 for this. I will update it.
- Maven
- An IDE such as IntelliJ, Eclipse, or VS Code.

To easily switch between Java versions, you can use [SDKMAN](https://sdkman.io/).
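For readers who haven't used SDKMAN before, a minimal sketch of switching versions (the exact version identifier below is illustrative; run `sdk list java` to see what's actually available on your machine):

```
# List the Java distributions SDKMAN knows about:
sdk list java

# Install and switch to a Java 21 build (Temurin in this example):
sdk install java 21.0.2-tem
sdk use java 21.0.2-tem

# SDKMAN points JAVA_HOME at the selected version; verify with:
java -version
```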
OK - you just answered my question above. Thank you... ;)
alpinegizmo left a comment:
I had a lot of trouble getting this running. Some of the problems were of my own making, but some changes would help.
I found myself repeatedly banging my head against the exercise.sh and build.sh scripts and the exercise staging/solving setup. This gets in the way of how I would prefer to work. In this case, what I wanted to do was:
- clone the repo
- import the repo as a project into my IDE
- using the IDE, configure things, run the solution
- using the IDE, study the code more carefully
Later, when things got messed up, I found myself having to deal with those scripts much more than I wanted to.
The other problems I ran into:
- I expected the README to tell me everything I needed to know, and didn't discover the instructions until much later.
- I had problems building the confluent-table-planner, which I solved by disabling spotless.
- The current version of the confluent-table-planner doesn't match the version that the exercise is configured to use.
- Before trying to run exercise 4, I had done things with exercise 7. This left tests lying around that weren't deleted when I switched back to exercise 4. This prevented me from being able to build exercise 4.
- I now have everything built, but it still doesn't work:
```
❯ java -jar target/flink-table-api-marketplace-0.1.jar
WARNING: sun.reflect.Reflection.getCallerClass is not supported. This will impact performance.
Exception in thread "main" java.lang.Error: Unresolved compilation problem:
	Annotation types that do not specify explicit target element types cannot be applied here

	at io.confluent.flink.plugin.internal.DefaultPluginContext.waitForCondition(DefaultPluginContext.java:371)
	at io.confluent.flink.plugin.internal.DefaultPluginContext.submitStatement(DefaultPluginContext.java:325)
	at io.confluent.flink.plugin.internal.DefaultPluginContext.queryBoundedInternal(DefaultPluginContext.java:147)
	at io.confluent.flink.plugin.internal.PluginContext.queryBoundedInternal(PluginContext.java:60)
	at io.confluent.flink.plugin.internal.Utils.queryAddressableCatalogObjects(Utils.java:80)
	at io.confluent.flink.plugin.internal.Utils.queryAddressableCatalogs(Utils.java:55)
	at io.confluent.flink.plugin.internal.ConfluentCatalogStore.open(ConfluentCatalogStore.java:34)
	at org.apache.flink.table.catalog.CatalogStoreHolder.open(CatalogStoreHolder.java:124)
	at org.apache.flink.table.catalog.CatalogStoreHolder$Builder.build(CatalogStoreHolder.java:106)
	at org.apache.flink.table.api.internal.TableEnvironmentImpl.create(TableEnvironmentImpl.java:283)
	at org.apache.flink.table.api.TableEnvironment.create(TableEnvironment.java:99)
	at marketplace.Marketplace.main(Marketplace.java:14)
```
Using flink-1.20-SNAPSHOT seems highly suspect to me. Is this necessary?
I decided to re-stage the exercise and try again. Now it runs without failure, but it doesn't produce any output. Not sure how to debug that. And I unfortunately didn't think to put my cloud.properties file somewhere safe before re-staging the exercise, so I had the fun of recreating it.
Yeah, I'm used to these instructions just ending up in Contentful. I didn't consider the flow when they are in GitHub. I'll put some links into the README to help with that.
That does work. However, I believe this can also be resolved by changing Java versions. Honestly, I'd much prefer if they just had a published version of this so that we could avoid these issues. In the meantime, I'll see if I can run through the instructions on my machine and try and sort out the "exact" process for getting it to work.
See above. I'll see if I can get more specific instructions to make sure everything works.
The exercises are designed for a linear flow. Jumping back and forth could cause issues.
Probably not. I think this might be an artifact of using a sample repo when I created the project. I'll see if I can switch it to a stable version.
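For context, the pin being discussed would presumably live in the exercise's pom.xml. A purely illustrative sketch, not the repo's actual build file (the property name is an assumption, and the 1.19.1 version is the one tried later in this thread):

```xml
<!-- Illustrative only: pinning Flink to a released version instead of a
     SNAPSHOT. Property and artifact names here are assumptions. -->
<properties>
  <flink.version>1.19.1</flink.version>
</properties>

<dependencies>
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-api-java</artifactId>
    <version>${flink.version}</version>
  </dependency>
</dependencies>
```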
Did you
Yeah, I've learned to be careful with that. But maybe I can fix it so that we don't have to be careful. I have an idea on how to make this easier.
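Until the scripts handle this automatically, one simple workaround is to copy the file out of the working tree before re-staging. This is a hypothetical sketch: the resource path is an assumption based on a typical Maven layout, and the sketch stages a fake file so it can be run standalone.

```shell
# Hypothetical workaround: keep cloud.properties outside the working tree so
# that re-staging an exercise cannot clobber it. Paths are assumptions; the
# next two lines just stage a stand-in file for illustration.
mkdir -p src/main/resources
echo "client.flink-api-key=example" > src/main/resources/cloud.properties

# Before re-staging, copy the configuration somewhere safe:
cp src/main/resources/cloud.properties ./cloud.properties.bak

# ... re-stage the exercise (e.g. with exercise.sh), which may remove it ...
rm -f src/main/resources/cloud.properties

# Afterwards, restore the configuration:
cp ./cloud.properties.bak src/main/resources/cloud.properties
```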
Tried switching to Flink 1.19.1 from Flink 1.20-SNAPSHOT. Unfortunately, this resulted in an error. It seems the SNAPSHOT version is currently required.
Description
Implementing Exercise 04 - Connecting to Confluent Cloud.
Currently, there are no tests or testing framework. I will investigate adding that in future exercises.
Checklist
- `./build.sh validate`