[FLINK-4520][flink-siddhi] Integrate Siddhi as a light-weight Streaming CEP Library #2486
Closed
Conversation
…urn Map<String,Object>
haoch changed the title from "FLINK-4520 Integrate Siddhi as a light-weight Streaming CEP Library" to "[FLINK-4520][flink-siddhi] Integrate Siddhi as a light-weight Streaming CEP Library" on Sep 9, 2016.
…t() throws UnknownHostException
- If InetAddress.getLocalHost() throws UnknownHostException when attempting to connect with the LOCAL_HOST strategy, the code will move on to try the other strategies instead of immediately failing.
- Also made minor code style improvements for trying the different strategies.
This closes apache#2383
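The fallback behaviour described in that commit can be sketched as follows. This is a hypothetical illustration of the "try each strategy, don't fail fast" pattern, not Flink's actual code; the `AddressStrategyDemo` class and `Strategy` interface are made up for the example:

```java
import java.net.InetAddress;
import java.net.UnknownHostException;
import java.util.Arrays;
import java.util.List;

// Hypothetical sketch: try each address-detection strategy in order and
// fall through on UnknownHostException instead of aborting immediately.
public class AddressStrategyDemo {
    interface Strategy {
        InetAddress detect() throws UnknownHostException;
    }

    static InetAddress findAddress(List<Strategy> strategies) {
        for (Strategy s : strategies) {
            try {
                InetAddress a = s.detect();
                if (a != null) {
                    return a; // first strategy that works wins
                }
            } catch (UnknownHostException e) {
                // e.g. InetAddress.getLocalHost() failed: move on to the
                // next strategy instead of propagating the exception.
            }
        }
        return InetAddress.getLoopbackAddress(); // last resort
    }

    public static void main(String[] args) {
        InetAddress result = findAddress(Arrays.asList(
            () -> { throw new UnknownHostException("LOCAL_HOST failed"); },
            () -> InetAddress.getLoopbackAddress()));
        System.out.println(result.getHostAddress());
    }
}
```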
…mentation This also includes minor code and test cleanups. This closes apache#2416
… fail on resharding This no longer allows the Kinesis consumer to transparently handle resharding. This is a short-term workaround until we have a min-watermark notification service available in the JobManager. This closes apache#2414
…n the table This closes apache#2384.
…and 'PredeterminedAssignment'
Implemented Mesos AppMaster including:
- runners for AppMaster and TaskManager
- MesosFlinkResourceManager as a Mesos framework
- ZK persistent storage for Mesos tasks
- reusable scheduler actors for:
  - offer handling using Netflix Fenzo (LaunchCoordinator)
  - reconciliation (ReconciliationCoordinator)
  - task monitoring (TaskMonitor)
  - connection monitoring (ConnectionMonitor)
- lightweight HTTP server to serve artifacts to the Mesos fetcher (ArtifactServer)
- scenario-based logging for:
  - connectivity issues
  - offer handling (receive, process, decline, rescind, accept)
- incorporated FLINK-4152, FLINK-3904, FLINK-4141, FLINK-3675, FLINK-4166
Add license information
- Fenzo usage fix: always call scheduleOnce after expireAllLeases
- increased aggressiveness of task scheduler
- factored YarnJobManager and MesosJobManager to share base class `ContaineredJobManager`
- improved supervision for task actors, unit tests
- support for zombie tasks (i.e. non-strict slave registry)
- improved javadocs
- fix for style violations (e.g. line length)
- completed the SchedulerProxy
- final fields
- improved preconditions
- log lines to use {}
- cleanup ZK state
- serializable messages
The version change didn't cause the Scalastyle errors. Seems like the only viable solution to prevent random failures of the Scalastyle plugin is to disable Scalastyle checks for the affected source file.
… ProducerFailedException
…ctionInfo to TaskManagerLocation This adds the ResourceId to the TaskManagerLocation
…nt of 'Instance'. To allow for a future dynamic slot allocation and release model, the slots should not depend on 'Instance'. With this change, the slots hold most of the necessary information directly (location, gateway) and interact with the Instance only via a 'SlotOwner' interface.
…nstance' to have more intuitive names:
- getResourceID() --> getTaskManagerID()
- getInstanceConnectionInfo() --> getTaskManagerLocation()
…e-defined strictly local assignments.
This caused Scalastyle to fail, presumably depending on the locale used. After a bit of debugging on the Scalastyle plugin I found out that the number in the error is the byte position. "Expected identifier, but got Token(COMMA,,,1772,,)" head -c 1772 flink-mesos/src/test/scala/org/apache/flink/mesos/Utils.scala pointed to the Unicode character '⇒' which causes Scalastyle to fail in certain environments. This closes apache#2466
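The byte-versus-character arithmetic behind that diagnosis can be illustrated with a minimal sketch. The Scala line below is a made-up example, not the actual contents of Utils.scala; the point is only that '⇒' (U+21D2) occupies 3 bytes in UTF-8, so byte offsets drift past character positions after each occurrence:

```java
import java.nio.charset.StandardCharsets;

// Shows why a byte-offset error position looks odd in a file containing
// '⇒': character count and UTF-8 byte count diverge.
public class ByteOffsetDemo {
    public static void main(String[] args) {
        String line = "val f: Int ⇒ Int = x ⇒ x + 1"; // made-up example line
        byte[] utf8 = line.getBytes(StandardCharsets.UTF_8);
        System.out.println(line.length()); // 28 characters
        System.out.println(utf8.length);   // 32 bytes: each '⇒' adds 2 extra
        // Equivalent of `head -c 11`: the first 11 bytes end just before '⇒'
        System.out.println(new String(utf8, 0, 11, StandardCharsets.UTF_8));
    }
}
```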
There is RocksDBAsyncSnapshotTest which tests async snapshots for the RocksDB state backend. Operators themselves cannot do asynchronous checkpoints right now.
Yarn reports null or (1, maxVcores) depending on its internal logic. The test only worked in the past because it summed up the used vcores of the RM and the TM containers. We have checks in place to ensure the vcores config value is passed on to the Flink ResourceManager.
…to not return null Return a DefaultAWSCredentialsProviderChain instead of null when AWS_CREDENTIALS_PROVIDER config is set to "AUTO" This closes apache#2470
Adds a NoOpOperator which is unwound in OperatorTranslation.translate. This will be first used by Gelly as a placeholder to support implicit operator reuse. This closes apache#2294
Rename _configuration to originalConfiguration. Remove testing classes from the main scope in flink-runtime. Previously, the ForkableFlinkMiniCluster, which resided in flink-test-utils, required these files to be in the main scope of flink-runtime. With the removal of the ForkableFlinkMiniCluster, these classes are no longer needed there and can be moved back to the test scope. This closes apache#2450.
- Introduce TaskExecutionStateListener for Task
- Replace JobManagerGateway in Task by InputSplitProvider and CheckpointNotifier
- Replace the TaskManager ActorGateway by TaskManagerConnection in Task
- Rename taskmanager.CheckpointNotifier into CheckpointResponder; rename TaskExecutionStateListener.notifyTaskExecutionState into notifyTaskExecutionStateChanged
- Remove InputSplitProvider.start; add ClassLoader parameter to InputSplitProvider.getNextInputSplit
- Remove the unused class InputSplitIterator
- Update InputSplitProvider JavaDocs
This closes apache#2456.
The Gelly documentation was recently split into multiple pages in FLINK-4104 but was missing a redirect. This commit updates the Gelly redirect to point to the old page. This closes apache#2464
Replaces Delegate with NoOpOperator. This closes apache#2474
Thanks for contributing to Apache Flink. Before you open your pull request, please take the following check list into consideration.
If your changes take all of the items into account, feel free to open your pull request. For more information and/or questions please refer to the How To Contribute guide.
In addition to going through the list, please provide a meaningful description of your changes.
mvn clean verify
has been executed successfully locally or a Travis build has passed.

Abstraction
Siddhi CEP is a lightweight and easy-to-use open-source Complex Event Processing (CEP) engine released as a Java library under the Apache Software License v2.0. Siddhi CEP processes events generated by various event sources, analyses them, and notifies appropriate complex events according to user-specified queries. It would be very helpful for Flink users (especially streaming application developers) to provide a library to run Siddhi CEP queries directly in Flink streaming applications.
Features
- … (TupleStreamSiddhiOperator), supporting rich CEP features like …
- … (SiddhiCEP and SiddhiStream)
- … (AbstractSiddhiOperator)
- … (SiddhiCEP#registerExtension)

Test Cases
- org.apache.flink.contrib.siddhi.SiddhiCEPITCase
Example
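The example code is truncated in this capture. As a rough sketch only, assuming the SiddhiCEP API surface named in the Features list above, usage might look like the following; the method names (`define`, `cql`, `returnAsMap`) and the field list are illustrative assumptions and may not match the PR exactly:

```java
// Hypothetical usage sketch; assumes the SiddhiCEP/SiddhiStream API from
// this PR. Method names are illustrative and may differ from the real API.
StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
DataStream<Tuple3<Integer, String, Double>> input = env.fromElements(
    Tuple3.of(1, "event-a", 95.0),
    Tuple3.of(2, "event-b", 120.0));

// Register the stream under a name, list its fields, run a Siddhi CQL
// query against it, and select the output stream back as a DataStream.
DataStream<Map<String, Object>> output = SiddhiCEP
    .define("inputStream", input, "id", "name", "price")
    .cql("from inputStream[price > 100] select id, name, price insert into outputStream")
    .returnAsMap("outputStream");
```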