Are you working with data of similar modalities, where you often have to apply the same function to multiple elements? Are you using something similar to this:
batch = {
"image_a": image_a,
"image_b": image_b,
"image_c": image_c
}
# Move to another device
for key in batch:
batch[key] = batch[key].to(device)
# Transform
for key in batch:
batch[key] = batch[key] * 2 + 1
# Combine
for key in batch:
batch[key] = batch[key] + batch_2[key]
# Process
for key in batch:
batch[key] = batch[key].max()
If the answer is yes, then this module is for you!
Our Batch package is a generic wrapper for dictionary-based batch data. It provides a simple way to apply the same function or operator to the whole batch. The module is completely device and container independent; you can use it with PyTorch, NumPy, or any other library.
batch = Batch(
image_a=image_a,
image_b=image_b,
image_c=image_c)
# Move to another device
batch = batch.to(device)
# Transform
batch = batch * 2 + 1
# Combine
batch = batch + batch_2
# Process
batch = batch.max()
Install the package from PyPI:
pip install batch-dev
The examples below demonstrate a few basic use cases with NumPy; PyTorch or other containers can be used in the same way.
import numpy as np
from batch import Batch
# Create a batch directly
batch = Batch(
image_a=np.random.rand(256, 256, 3),
image_b=np.random.rand(256, 256, 3),
image_c=np.random.rand(256, 256, 3))
batch = {
"image_a": np.random.rand(256, 256, 3),
"image_b": np.random.rand(256, 256, 3),
"image_c": np.random.rand(256, 256, 3)}
# Create a batch from a dictionary
batch = Batch.from_dict(batch)
image = np.random.rand(256, 256, 9)
# Create a batch from a tensor by splitting the tensor along one dimension and storing the splits
data_splits = {
"image_a": 1,
"image_b": 1,
"image_c": 1}
dim = 2
batch = Batch.from_tensor(image, data_splits, dim=dim)
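# After this call, batch["image_a"], batch["image_b"], and batch["image_c"]
# should each hold one slice of image along axis 2 (the exact split semantics
# depend on from_tensor; this comment is an illustration, not a guarantee).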
A Batch is a string-keyed dictionary whose values may themselves be mappings or iterables.
When a string index is given, it is always interpreted as a key.
Querying a single key returns the associated value:
image_a = batch["image_a"]
You can even index deeper, using . as a separator:
batch_2 = Batch(input=batch)
image_a = batch_2["input.image_a"]
Querying multiple keys (as a tuple or list) returns a new batch with the selected keys:
batch_out = batch["image_a", "image_b"]
Wildcard query is also supported and returns a new batch with the matching keys:
batch_out = batch.query_wildcard("image_*")
When an integer index is given, it is always interpreted as an index into the elements, and a new batch with the indexed elements is returned:
batch_out = batch[:,:,0]
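# With the NumPy batch above, each value of batch_out should be a (256, 256)
# slice of the corresponding array (assuming the indexing is simply forwarded
# to every value).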
You can use the following unary, binary, reverse, and in-place operators:
# Unary operators
"__not__", "__abs__", "__index__", "__inv__", "__invert__", "__neg__", "__pos__",
# Binary operators
"__add__", "__and__", "__concat__", "__floordiv__", "__lshift__", "__mod__", "__mul__",
"__or__", "__pow__", "__rshift__", "__sub__", "__truediv__", "__xor__", "__eq__",
# Reverse operators
"__radd__", "__rand__", "__rmul__",
"__ror__", "__rsub__", "__rxor__",
# In-place operators
"__iadd__", "__iand__", "__iconcat__", "__ifloordiv__", "__ilshift__", "__imod__", "__imul__",
"__ior__", "__ipow__", "__irshift__", "__isub__", "__itruediv__", "__ixor__"
Example:
# Use operators
batch_out = batch + batch_2 * 2
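The unary, reverse, and in-place families work the same way; a minimal sketch, assuming each operator is forwarded element-wise to the underlying arrays:
# Unary operator
batch_neg = -batch
# Reverse operator (scalar on the left-hand side)
batch_out = 2 * batch
# In-place operator
batch += 1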
You can use any member functions of the underlying container, for example:
# Use member functions
batch_out = batch.mean(axis=2)
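Any other method of the wrapped container can be called the same way; for example, with NumPy values (a sketch assuming the forwarding covers ndarray methods such as astype):
# Cast every array in the batch to float32
batch_out = batch.astype(np.float32)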
You can easily apply a function to the whole batch:
batch = batch.map(list)  # Convert all elements to lists
batch = batch.map(np.stack, axis=0)  # Stack all elements into a single tensor
You can also apply a function to the keys:
batch = batch.map_keys(lambda x: f"{x}_2") # Add suffix
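For example, renaming keys is handy when you want to keep an original and a transformed batch side by side; the renamed keys are then used for lookups (a small sketch using only the calls shown above):
batch_aug = batch.map_keys(lambda x: f"{x}_aug")  # image_a -> image_a_aug, ...
image_a_aug = batch_aug["image_a_aug"]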
A few limitations to consider when using this module:
- Use only string keys for the batch.
- Don't use keys starting with an underscore (_).
- Slice indexing is not implemented yet.
- Generic iterable indexing is not implemented; only tuples and lists are supported.
- Code documentation is in progress.
- Some features are not yet documented here, please refer to the code directly.
If you have any ideas or requests, feel free to open an issue or a pull request.