Utils and notebooks for building deep-learning models with data generated from the superpaper web app.
P.S.: Most images in the dataset are low quality, as that's about as good as my art skills get.
The stroke data is exported as a JSON file. Use the ExploreStrokes.ipynb notebook for utilities to explore the data.
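As a quick sketch of what loading an export looks like, the snippet below summarizes one record. The filename and the `summarize` helper are illustrative, not part of the repo; a tiny inline record mimicking the schema stands in for a real file.

```python
import json

# Hypothetical path; point this at your own superpaper export instead:
# record = json.load(open("export.json"))

# Minimal inline record mimicking the export schema, for illustration:
record = {
    "description": "demo",
    "stroke": [{"layer": "Layer 1", "type": "mouseup",
                "memento": [[811, 126, -310, -10], []]}],
    "device_type": "pc",
    "canvas_h": 649,
    "canvas_w": 999,
}

def summarize(rec):
    """Return a one-line summary of an exported record."""
    return (f"{rec['description']!r}: {len(rec['stroke'])} stroke segment(s), "
            f"canvas {rec['canvas_w']}x{rec['canvas_h']} ({rec['device_type']})")

print(summarize(record))
# -> 'demo': 1 stroke segment(s), canvas 999x649 (pc)
```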
Currently, a single data point comprises three elements:
- Stroke JSON
- Base image
- Sketch Layer
```json
{
  "description": "<str: title of the artwork>",
  "stroke": [
    {
      "layer": "<str: layer name>",
      "type": "<str: mouseup or penup>",
      "memento": [
        [<int: x-cor>, <int: y-cor>, <int: offset-x>, <int: offset-y>],
        [809, 124, -310, -10],
        ...
        []
      ]
    },
    {
      "layer": "Layer 1",
      "type": "mouseup",
      "memento": [
        [811, 126, -310, -10],
        [811, 126, -310, -10],
        ...
        []
      ]
    },
    ...
  ],
  "device_type": "pc",
  "canvas_h": 649,
  "canvas_w": 999
}
```
Note that an empty entry in `memento` (i.e. `[]`) signifies the end of a stroke.
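Given that convention, a flat `memento` list can be split into individual strokes at each `[]` boundary. The helper below is a minimal sketch of that, not a utility shipped with the repo:

```python
def split_strokes(memento):
    """Split a flat memento list into individual strokes.

    An empty entry ([]) marks the end of a stroke, per the export format.
    """
    strokes, current = [], []
    for point in memento:
        if point:                 # [x, y, offset-x, offset-y]
            current.append(point)
        else:                     # [] -> stroke boundary
            if current:
                strokes.append(current)
            current = []
    if current:                   # trailing points without a closing []
        strokes.append(current)
    return strokes

memento = [[811, 126, -310, -10], [809, 124, -310, -10], [],
           [700, 200, -310, -10], []]
print(split_strokes(memento))
# -> [[[811, 126, -310, -10], [809, 124, -310, -10]], [[700, 200, -310, -10]]]
```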
This is the reference image used for drawing the data.
This is the image that was exported after drawing (P.S. forgive me for this garbage trace).
TODO:
- Add Sketch-RNN wrapper
- Create section for comic