Commit a1058f2: improved README
JuliaFernee committed Mar 31, 2017 (1 parent: db37964)
Showing 1 changed file with 37 additions and 21 deletions: README.md

# Generic Reader/Writer for S3

[![Circle CI](https://circleci.com/gh/Financial-Times/generic-rw-s3.svg?style=shield)](https://circleci.com/gh/Financial-Times/generic-rw-s3)[![Go Report Card](https://goreportcard.com/badge/github.com/Financial-Times/generic-rw-s3)](https://goreportcard.com/report/github.com/Financial-Times/generic-rw-s3) [![Coverage Status](https://coveralls.io/repos/github/Financial-Times/generic-rw-s3/badge.svg)](https://coveralls.io/github/Financial-Times/generic-rw-s3)

## system-code: upp-generic-s3-rw
## Introduction
An API for reading and writing generic payloads to S3. It can be set up to read those payloads off Kafka.

## Installation

Expand All @@ -14,22 +17,14 @@ or update:
`go get -u github.com/Financial-Times/generic-rw-s3`


## Running locally


```
export|set PORT=8080
export|set BUCKET_NAME="bucketName"
export|set AWS_REGION="eu-west-1"
$GOPATH/bin/generic-rw-s3
```

The app assumes that you have correctly set up your AWS credentials by either using the `~/.aws/credentials` file:
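
A minimal `~/.aws/credentials` file in the standard AWS SDK format looks like this (placeholder values, for illustration only):

```
# Illustrative placeholder credentials; replace with your own values
[default]
aws_access_key_id = <your-access-key-id>
aws_secret_access_key = <your-secret-access-key>
```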

Additional configuration is read from the following environment variables:

```
export|set WORKERS=10 # Number of concurrent downloads when downloading all items
export|set SRC_CONCURRENT_PROCESSING=true # Whether the consumer uses concurrent processing for the messages
```

### Run locally with read from Kafka enabled
`$GOPATH/bin/generic-rw-s3 --port=8080 --bucketName="bucketName" --bucketPrefix="bucketPrefix" --awsRegion="eu-west-1" --source-addresses="<proxy_address>" --source-group="<consumer_group>" --source-topic="<topic_to_read>" --source-queue="kafka"`

### Run locally with specified resource path
`$GOPATH/bin/generic-rw-s3 --port=8080 --resourcePath="concepts" --bucketName="bucketName" --bucketPrefix="bucketPrefix" --awsRegion="eu-west-1"`

## Test locally
See the Service Endpoints section below.

## Build and deployment
* Docker Hub builds: [coco/generic-rw-s3](https://hub.docker.com/r/coco/generic-rw-s3/)
* Cluster deployment: [concepts-rw-s3@.service](https://github.com/Financial-Times/pub-service-files), [generic-rw-s3@.service](https://github.com/Financial-Times/up-service-files)
* CI provided by CircleCI: [generic-rw-s3](https://circleci.com/gh/Financial-Times/generic-rw-s3)
* Code coverage provided by Coveralls: [generic-rw-s3](https://coveralls.io/github/Financial-Times/generic-rw-s3)

## Service Endpoints
For complete API specification see [S3 Read/Write API Endpoint](https://docs.google.com/document/d/1Ck-o0Le9cXOfm-aVjiGmOT7ZTB5W5fDTsPqGkhzfa-U/edit#)

### PUT /UUID
```
curl -H 'Content-Type: application/json' -X PUT -d '{"tags":["tag1","tag2"],"que
```

The `Content-Type` is important as that will be what the file will be stored as.
In addition, we also store the transaction ID in S3. It is either provided as a request header or, if not, it is auto-generated.
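
If you want to supply the transaction ID explicitly, a hypothetical request could look like the sketch below; the exact header name is not documented here, so `X-Request-Id` is only an assumption:

```
# Hypothetical example: the X-Request-Id header name and the payload are illustrative assumptions
curl -H 'Content-Type: application/json' \
     -H 'X-Request-Id: tid_example123' \
     -X PUT \
     -d '{"tags":["tag1","tag2"]}' \
     http://localhost:8080/123e4567-e89b-12d3-a456-426655440000
```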

When the content is uploaded, the key generated for the item is converted from
`123e4567-e89b-12d3-a456-426655440000` to `<bucket_prefix>/123e4567/e89b/12d3/a456/426655440000`.
We do this so that it is easier to manage and browse content in the AWS console.
It is also good practice because it means files are spread across different partitions.
This is important if you're writing and pulling content from S3, as it means content will be written to and read from different partitions on S3.
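
The mapping is simply the bucket prefix followed by the UUID with its dashes replaced by slashes; a quick shell sketch of the same conversion (the prefix value is just an example):

```
# Illustrative only: derive the S3 key from a UUID by swapping dashes for slashes
UUID="123e4567-e89b-12d3-a456-426655440000"
BUCKET_PREFIX="bucketPrefix"
echo "${BUCKET_PREFIX}/$(echo "${UUID}" | tr '-' '/')"
# -> bucketPrefix/123e4567/e89b/12d3/a456/426655440000
```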

### GET /UUID
This internal read should return what was written to S3.

If not found, you'll get a 404 response.

```
curl http://localhost:8080/bcac6326-dd23-4b6a-9dfa-c2fbeb9737d9
```

### DELETE /UUID
Will return 204 if successful, 404 if not found.
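
For example (reusing the UUID from the read example above):

```
curl -X DELETE http://localhost:8080/bcac6326-dd23-4b6a-9dfa-c2fbeb9737d9
```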

## Utility endpoints

### GET /
Streams all payloads in a given bucket.
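
For example, against a locally running instance (same port as in the examples above):

```
curl http://localhost:8080/
```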

The return payload will look like:

```
...
```

### Admin endpoints

* Healthchecks: [http://localhost:8080/__health](http://localhost:8080/__health)
* Build Info: [http://localhost:8080/__build-info](http://localhost:8080/__build-info) or [http://localhost:8080/build-info](http://localhost:8080/build-info)
* GTG: [http://localhost:8080/__gtg](http://localhost:8080/__gtg)


### Other Information

#### S3 buckets

