
Commit e83747d

Rafael Mota committed: Add DynamoDB example
1 parent 90353d8 commit e83747d

File tree

1 file changed: +44 −0 lines changed


examples/DynamoDB.md

Lines changed: 44 additions & 0 deletions
# Using DataLoader with DynamoDB

DynamoDB is a serverless key-value and document database where you pay only for the requests you make. It also supports [batch get operations](https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/DynamoDB/DocumentClient.html#batchGet-property), which makes it a good fit for DataLoader. However, those batch operations have some [limitations](https://docs.aws.amazon.com/amazondynamodb/latest/APIReference/API_BatchGetItem.html) that must be accounted for.

Here is an example of building a DynamoDB DataLoader using the AWS SDK's [DocumentClient](https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/DynamoDB/DocumentClient.html). The table used is the Movies table described in DynamoDB's official JavaScript [documentation](https://docs.aws.amazon.com/pt_br/amazondynamodb/latest/developerguide/GettingStarted.JavaScript.html).

```js
const AWS = require("aws-sdk");
const DataLoader = require("dataloader");

const documentClient = new AWS.DynamoDB.DocumentClient({
  region: AWS_REGION // the AWS region where the DynamoDB table was created, e.g. "us-east-1"
});

const movieLoader = new DataLoader(
  async ids => {
    let results = [];
    let batchGetKeys = ids;

    // Retry loop: a single BatchGetItem call can return at most 16 MB of data, so any keys
    // it could not process are returned in UnprocessedKeys and must be requested again.
    // If your item sizes are predictable, an alternative is to lower maxBatchSize so that
    // a single batch can never exceed the 16 MB limit.
    while (batchGetKeys.length) {
      const dynamoCallResult = await documentClient.batchGet({ RequestItems: { Movies: { Keys: batchGetKeys } } }).promise();

      batchGetKeys =
        dynamoCallResult.UnprocessedKeys &&
        dynamoCallResult.UnprocessedKeys.Movies &&
        Array.isArray(dynamoCallResult.UnprocessedKeys.Movies.Keys)
          ? dynamoCallResult.UnprocessedKeys.Movies.Keys
          : [];

      results = results.concat(dynamoCallResult.Responses.Movies);
    }

    // Re-order the results to match the order of the requested keys.
    const itemsHashmap = {};
    results.forEach(item => {
      itemsHashmap[JSON.stringify({ title: item.title, year: item.year })] = item;
    });
    return ids.map(id => itemsHashmap[JSON.stringify(id)] || null);
  },
  {
    // Needed because DynamoDB uses an object (partition key + sort key) to represent its keys.
    cacheKeyFn: key => JSON.stringify(key),
    // Needed because BatchGetItem accepts at most 100 keys per call.
    maxBatchSize: 100
  }
);
```
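
For context, here is a minimal usage sketch, assuming the loader defined above. The Movies table uses the composite key `{ title, year }`; the specific title and year values below are illustrative placeholders, not items guaranteed to exist in your table.

```js
// Inside an async function or GraphQL resolver; key values are illustrative only.
const movie = await movieLoader.load({ title: "The Big New Movie", year: 2015 });

// Loads issued in the same tick are batched into a single BatchGetItem request
// (up to maxBatchSize keys), and repeated keys are served from the cache via cacheKeyFn.
const movies = await Promise.all([
  movieLoader.load({ title: "The Big New Movie", year: 2015 }),
  movieLoader.load({ title: "Another Movie", year: 2012 })
]);
```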
