Not able to create flink-streaming-connectors jar #2058
Conversation
…h restart Temporary workaround to restore initial state on failure during recovery, as required by a user. Will be superseded by FLINK-3397 with better handling of checkpoint and savepoint restoring. A failure during recovery resulted in restarting a job without its savepoint state. This temporary workaround makes sure that if the savepoint coordinator ever restored a savepoint and there was no checkpoint after the savepoint, the savepoint state will be restored again. This closes #1720.
This also includes some minor cleanups. This closes #1689
… when invoked without arguments
…odicWatermarksOperator
…rofile. This closes #1719
The new flink-gelly-examples module contains all Java and Scala Gelly examples. The module contains compile scope dependencies on flink-java, flink-scala and flink-clients so that the examples can be conveniently run from within the IDE.
This makes AllWindowedStream.fold() take constant space, just like the keyed WindowOperator. Also this adds a new test case in EventTimeAllWindowCheckpointingITCase to verify that the FoldingWindowBuffer works. This also renames the preexisting window buffers to ReducingWindowBuffer and ListWindowBuffer to make the naming scheme consistent.
This implicitly adds KeyedStream.transform() and also explicitly ConnectedStreams.transform(). This also removes the transform exclusions from the API completeness tests.
…ionService This closes #1700.
The current flink-gelly-examples artifact id wrongly used an underscore to separate examples from flink-gelly. This commit replaces the underscore with a hyphen. This closes #1731.
…Coordinator This closes #1732.
This enforces that the user always has to specify keys for both inputs before .window() can be called.
…ng time This brings it more in line with *ProcessingTimeWindows and makes it clear what type of window assigner it is. The old names, i.e. SlidingTimeWindows and TumblingTimeWindows, are still available but deprecated.
…s Guava dependency This closes #1737
… SingleOutputStreamOperator
This reverts commit 014a686.
…urceTest This closes #1964.
… section This closes #1991
… BlobUtils. This closes #2000
…ldIndex() This closes #2004
The Flink documentation build process is currently quite messy. These changes move us to a new build process with proper dependency handling. It ensures that we all use the same dependency versions for consistent build output. It also eases automated building on other systems (like the ASF Buildbot). The goal was to make the documentation build process easier and self-contained.
- use Ruby's Bundler gem to install dependencies
- update README
- adapt Dockerfile
- add additional rules to .gitignore
- change default doc output path from /target to /content (the default path of the flink-web repository)
This closes #2033
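With Bundler managing the dependencies, the local docs build reduces to a few commands. This is a sketch, assuming the docs directory contains the Gemfile added by this change and that the docs are built with Jekyll (the exact build script name may differ):

```shell
cd docs
gem install bundler        # one-time setup
bundle install             # install the pinned dependency versions from the Gemfile
bundle exec jekyll build   # build the docs; output lands in ./content rather than ./target
```

Because `bundle exec` resolves gems against the lockfile, every contributor and the ASF Buildbot build with identical dependency versions.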
Can you please close this pull request and pose this as a question on the mailing list?
I am trying to write streaming data to Kinesis, so I am using the flink-streaming-kinesis connector jar, which requires the flink-streaming-connectors jar. The error is: Could not find artifact org.apache.flink:flink-streaming-connectors:pom:1.1-SNAPSHOT. So I need the flink-streaming-connectors jar. Do I need to build the jar myself, or can I resolve that dependency some other way? How can I do this?
Can you close the pull request and continue the discussion on JIRA?
Trying to prevent failures like [1] from happening again. I could not explain who deleted the savepoint file concurrently with the exists check. The savepoint is triggered and retrieved successfully. Shutting down the cluster does not remove any savepoints. [1] https://s3.amazonaws.com/archive.travis-ci.org/jobs/136396433/log.txt
Jar arguments with a single '-' were not parsed correctly if options were present. For example, in `./flink run <options> file.jar -arg value` the jar arguments would be parsed as "arg" and "value". Interestingly, this only happened when <options> were present. The issue has been fixed in commons-cli 1.3.1. A test case was added to test for regressions. This closes #2139
Guard test for ChainedAllReduceDriver This closes #2156.
Hello, I am not able to create the jar of flink-streaming-connectors. I am able to create jars of the others, like twitter, kafka, and flume, but I am not able to create the flink-streaming-connectors jar. How can I create it?
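The missing `org.apache.flink:flink-streaming-connectors:pom:1.1-SNAPSHOT` artifact is an aggregator pom, so it usually has to be installed into the local Maven repository from a source checkout rather than fetched from Maven Central. A rough sketch of how one might do that, assuming a checkout of the Flink source tree at the 1.1-SNAPSHOT line (the module path may differ between versions):

```shell
git clone https://github.com/apache/flink.git
cd flink
# Build the connectors aggregator module plus everything it depends on,
# installing the poms and jars into ~/.m2 so the
# org.apache.flink:flink-streaming-connectors:pom dependency resolves:
mvn clean install -DskipTests -pl flink-streaming-connectors -am
```

The `-pl` flag restricts the build to the named module and `-am` ("also make") pulls in its upstream dependencies, so a full `mvn clean install` of the whole tree is not required.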