
Not able to create flink-streaming-connectors jar #2058

Closed · wants to merge 148 commits

Conversation


@mrakshay commented Jun 1, 2016

Hello, I am not able to create a jar of flink-streaming-connectors. I can build the jars of other modules like twitter, kafka, and flume, but not flink-streaming-connectors. How can I create this jar?

uce and others added 30 commits February 26, 2016 20:56
…h restart

Temporary workaround to restore the initial state on failure during recovery, as
required by a user. This will be superseded by FLINK-3397 with better handling of
checkpoint and savepoint restoring.

A failure during recovery previously resulted in restarting a job without its
savepoint state. This temporary workaround makes sure that if the savepoint
coordinator ever restored a savepoint and there was no checkpoint after the
savepoint, the savepoint state will be restored again.

This closes #1720.
This also includes some minor cleanups

This closes #1689
The new flink-gelly-examples module contains all Java and Scala Gelly examples. The module
contains compile scope dependencies on flink-java, flink-scala and flink-clients so that
the examples can be conveniently run from within the IDE.
This makes AllWindowedStream.fold() take constant space, just like the
keyed WindowOperator.

Also this adds a new test case in EventTimeAllWindowCheckpointingITCase
to verify that the FoldingWindowBuffer works.

This also renames the preexisting window buffers to ReducingWindowBuffer
and ListWindowBuffer to make the naming scheme consistent.
This implicitly adds KeyedStream.transform() and also explicitly
ConnectedStreams.transform().

This also removes the transform exclusions from the API completeness
tests.
The current flink-gelly-examples artifact id wrongly used an underscore to separate
examples from flink-gelly. This commit replaces the underscore with a hyphen.

This closes #1731.
This enforces that the user always has to specify keys for both inputs
before .window() can be called.
…ng time

This brings it more in line with *ProcessingTimeWindows and makes it
clear what type of window assigner it is.

The old names, SlidingTimeWindows and TumblingTimeWindows, are still
available but deprecated.
gyfora and others added 15 commits April 28, 2016 21:54
The Flink documentation build process is currently quite messy. These
changes move us to a new build process with proper dependency
handling. It ensures that everyone uses the same dependency versions for
consistent build output. Also, it eases the automated building process
on other systems (like the ASF Buildbot). The goal was to make the
documentation build process easier and self-contained.

- use Ruby's Bundler Gem to install dependencies
- update README
- adapt Dockerfile
- add additional rules to .gitignore
- change default doc output path from /target to /content
(default path of the flink-web repository)

This closes #2033
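The Bundler-based flow described above can be sketched as follows. The exact directory name (docs/) and the use of Jekyll are assumptions based on a typical Bundler documentation workflow, not details stated in this thread:

```shell
# From the documentation directory: install the gem versions pinned in
# the Gemfile/Gemfile.lock, so every machine builds with identical
# dependencies, then build through Bundler with those exact gems.
cd docs
gem install bundler
bundle install
bundle exec jekyll build --destination content   # output path assumed per the commit note
```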
@StephanEwen (Contributor)
Can you please close this pull request and post this as a question on the mailing list?

@mrakshay (Author) commented Jun 2, 2016

I have filed an issue on Flink ASF JIRA

@mrakshay (Author) commented Jun 2, 2016

I am trying to put streaming data into Kinesis, so I am using the flink-streaming-kinesis connector jar, which requires the flink-streaming-connectors jar. The error is: Could not find artifact org.apache.flink:flink-streaming-connectors:pom:1.1-SNAPSHOT. So I either need to create that jar or resolve the dependency. How can I do this?
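The error message names the missing artifact with type pom, which suggests flink-streaming-connectors is a parent (aggregator) module that is installed into the local Maven repository rather than packaged as a standalone jar. A minimal sketch of one way to make it resolvable, assuming a checkout of the Flink source tree at the matching 1.1-SNAPSHOT version (module name and flags are assumptions, not confirmed in this thread):

```shell
# Build and install the connectors parent module, plus the modules it
# depends on (-am), into the local Maven repository, skipping tests.
git clone https://github.com/apache/flink.git
cd flink
mvn clean install -DskipTests -pl flink-streaming-connectors -am
```

After this, downstream builds such as the Kinesis connector should resolve org.apache.flink:flink-streaming-connectors:pom:1.1-SNAPSHOT from the local repository.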

@StephanEwen (Contributor)
Can you close the pull request and continue the discussion on JIRA?
This is not the right place for such a discussion.

uce and others added 10 commits June 17, 2016 13:40
Trying to prevent failures like [1] from happening again. I could not
determine what deleted the savepoint file concurrently with the exists
check. The savepoint is triggered and retrieved successfully. Shutting
down the cluster does not remove any savepoints.

[1] https://s3.amazonaws.com/archive.travis-ci.org/jobs/136396433/log.txt
Jar arguments with a single '-' were not parsed correctly if options were
present. For example, in `./flink run <options> file.jar -arg value` the
jar arguments would be parsed as "arg" and "value". Interestingly, this only
happened when <options> were present.

The issue has been fixed in commons-cli 1.3.1.

A test case was added to test for regressions.

This closes #2139
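The invocation shape from the commit message, under the fixed parser (the -p option before the jar path is an illustrative assumption):

```shell
# With the commons-cli 1.3.1 upgrade, everything after the jar path is
# passed through to the user program verbatim, even when CLI options
# precede it:
./bin/flink run -p 4 file.jar -arg value
# the program now receives "-arg value", not the mangled tokens "arg" "value"
```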
@asfgit closed this in 7206b0e on Jul 4, 2016