Commit
Rename License file and add Contributing sections
* Make it clear from project root we are using MIT-LICENSE
* Add contributing section w/ overview of how to submit code to repo
harlow committed May 23, 2015
1 parent bde3e96 commit 9077d28
Showing 3 changed files with 34 additions and 5 deletions.
13 changes: 13 additions & 0 deletions CONTRIBUTING.md
@@ -0,0 +1,13 @@
# Contributing to Golang Kinesis Connectors

Pull requests are much appreciated. Please help make the project a success!

To contribute:

1. Fork the [official repository][1].
2. Make your changes in a topic branch.
3. Squash commits and add an [excellent commit message][2].
4. Send a pull request.

[1]: https://github.com/harlow/kinesis-connectors/tree/master
[2]: http://tbaggery.com/2008/04/19/a-note-about-git-commit-messages.html
File renamed without changes.
26 changes: 21 additions & 5 deletions README.md
@@ -1,12 +1,15 @@
# Golang Kinesis Connectors

__Note:__ _This codebase is under active development. Expect breaking changes until 1.0 version release._
__Note:__ _This codebase is under active development._

### Kinesis connector applications written in Go

Inspired by the [Amazon Kinesis Connector Library][1]. These components are used to extract streaming event data
into S3, Redshift, DynamoDB, and more. See the [API Docs][2] for package documentation.

[1]: https://github.com/awslabs/amazon-kinesis-connectors
[2]: http://godoc.org/github.com/harlow/kinesis-connectors

## Overview

Each Amazon Kinesis connector application is a pipeline that determines how records from an Amazon Kinesis stream will be handled. Records are retrieved from the stream, transformed according to a user-defined data model, buffered for batch processing, and then emitted to the appropriate AWS service.
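The retrieve → transform → buffer → emit flow described above can be sketched with small Go interfaces. This is a minimal illustration with hypothetical names, not the library's actual API:

```go
package main

import "fmt"

// Record is a simplified stand-in for a Kinesis record.
type Record struct{ Data string }

// Transformer maps a raw record into a user-defined data model.
type Transformer interface{ Transform(r Record) Record }

// Buffer accumulates transformed records until a flush threshold is reached.
type Buffer struct {
	records []Record
	limit   int
}

func (b *Buffer) Add(r Record)      { b.records = append(b.records, r) }
func (b *Buffer) ShouldFlush() bool { return len(b.records) >= b.limit }

// Emitter delivers a buffered batch to a destination (S3, Redshift, ...).
type Emitter interface{ Emit(batch []Record) }

// prefixer is a toy Transformer that tags each record's payload.
type prefixer struct{}

func (prefixer) Transform(r Record) Record { return Record{Data: "event:" + r.Data} }

// printEmitter is a toy Emitter that just reports the batch size.
type printEmitter struct{}

func (printEmitter) Emit(batch []Record) { fmt.Println("emitting", len(batch), "records") }

func main() {
	var t Transformer = prefixer{}
	var e Emitter = printEmitter{}
	buf := &Buffer{limit: 2}
	for _, raw := range []string{"a", "b"} {
		buf.Add(t.Transform(Record{Data: raw}))
		if buf.ShouldFlush() {
			e.Emit(buf.records)
			buf.records = nil
		}
	}
}
```

A real pipeline would swap `printEmitter` for an S3- or Redshift-backed implementation; the interfaces are what let the same buffering loop drive any destination.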
@@ -35,7 +38,9 @@
The S3 Pipeline performs the following steps:
2. Upload the records to an S3 bucket.
3. Set the current Shard checkpoint in Redis.
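Step 3 above could be modeled with a small checkpoint interface. In this sketch an in-memory map stands in for Redis, and the type and key names are hypothetical, not the project's actual implementation:

```go
package main

import "fmt"

// Checkpointer records the last processed sequence number per shard.
type Checkpointer interface {
	Set(shardID, seqNum string)
	Get(shardID string) string
}

// memCheckpoint is an in-memory stand-in for a Redis-backed checkpoint store.
type memCheckpoint struct{ m map[string]string }

func newMemCheckpoint() *memCheckpoint { return &memCheckpoint{m: map[string]string{}} }

func (c *memCheckpoint) Set(shardID, seqNum string) { c.m["checkpoint:"+shardID] = seqNum }
func (c *memCheckpoint) Get(shardID string) string  { return c.m["checkpoint:"+shardID] }

func main() {
	cp := newMemCheckpoint()
	cp.Set("shardId-000000000000", "49540528963")
	fmt.Println(cp.Get("shardId-000000000000"))
}
```

On restart, a consumer would read the stored sequence number and resume the shard iterator from there instead of reprocessing the whole stream.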

The config vars are loaded with [gcfg][3].
The config vars are loaded with [gcfg].

[gcfg]: https://code.google.com/p/gcfg/
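gcfg reads git-config style files into Go structs. A pipeline config file might look like the fragment below; the section and variable names are illustrative, not the project's actual schema:

```ini
[kinesis]
stream = eventStream

[s3]
bucket = event-archive

[redis]
host = 127.0.0.1:6379
```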

```go
package main
func main() {
}
```

[1]: https://github.com/awslabs/amazon-kinesis-connectors
[2]: http://godoc.org/github.com/harlow/kinesis-connectors
[3]: https://code.google.com/p/gcfg/
## Contributing

Please see [CONTRIBUTING.md].
Thank you, [contributors]!

[LICENSE]: /MIT-LICENSE
[CONTRIBUTING.md]: /CONTRIBUTING.md

## License

Copyright (c) 2015 Harlow Ward. It is free software, and may
be redistributed under the terms specified in the [LICENSE] file.

[contributors]: https://github.com/harlow/kinesis-connectors/graphs/contributors
