
v1.11.0

Released by @beam2d on 12 Jul 05:53

This is a minor release that contains the following updates:

feature #914 #1142 #1168 #1204 #1217 #1285 #1286 #1289 #1329 #1330 #1331 #1358 #1359 #1373
other #1357
enhancement #886 #1112 #1302 #1311 #1312 #1342
test #1302
document #1328 #1341 #1354 #1355 #1371
bug #1297 #1336 #1341 #1346 #1352 #1355 #1364

Summary:

  • Chainer now supports dataset and training loop abstraction (#1285). A minimal usage sketch is given after this list.
    • It contains the following components.
      • Dataset and Iterator to extract mini-batches by iterating over datasets
      • Trainer, Updater, and Extension to customize the training loop at low cost
      • Reporter to collect statistics from inside the models
    • It also contains off-the-shelf support for data-parallel learning with multiple GPUs (#1358, thanks @amitibo!!).
    • The MNIST, PTB, and ImageNet examples are updated; they now use Trainer.
    • The tutorial is also updated to match the new examples.
  • New links and functions:
  • Enhancements to links and functions:
    • BatchNormalization supports flexible initialization (#1289, thanks @fukutani!!).
    • Classifier is improved: the function used to compute accuracy is now configurable (#1286), and it can accept more than two Variables when called (#1204). See the second sketch after this list.
  • Asynchronous data transfer using a CUDA stream is now supported (#1112).
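The sketch below shows how the new training loop components listed above (Dataset, Iterator, Updater, Trainer, Extension, Reporter) fit together. It follows the updated MNIST example only loosely; the MLP definition, layer sizes, and choice of extensions are illustrative assumptions rather than part of this release.

```python
import chainer
import chainer.functions as F
import chainer.links as L
from chainer import training
from chainer.training import extensions


class MLP(chainer.Chain):
    """Small illustrative multi-layer perceptron (layer sizes are arbitrary)."""

    def __init__(self):
        super(MLP, self).__init__(
            l1=L.Linear(784, 100),
            l2=L.Linear(100, 10),
        )

    def __call__(self, x):
        return self.l2(F.relu(self.l1(x)))


# Dataset + Iterator feed mini-batches into the training loop.
train, test = chainer.datasets.get_mnist()
train_iter = chainer.iterators.SerialIterator(train, batch_size=100)

# Classifier wraps the predictor and reports loss/accuracy through Reporter.
model = L.Classifier(MLP())
optimizer = chainer.optimizers.SGD()
optimizer.setup(model)

# Updater runs one optimization step per mini-batch; Trainer drives the loop.
updater = training.StandardUpdater(train_iter, optimizer)
trainer = training.Trainer(updater, (5, 'epoch'), out='result')

# Extensions customize the loop; values collected by Reporter appear here.
trainer.extend(extensions.LogReport())
trainer.extend(extensions.PrintReport(['epoch', 'main/loss', 'main/accuracy']))

trainer.run()
```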
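As a smaller illustration of the Classifier changes, the snippet below passes a custom accuracy function. The `accfun` keyword name is inferred from #1286, and the placeholder predictor and metric are assumptions, so check the PR for the exact interface.

```python
import chainer.functions as F
import chainer.links as L


def my_accuracy(y, t):
    # Illustrative stand-in metric: any callable taking (prediction, target)
    # Variables and returning a Variable should be usable here.
    return F.accuracy(y, t)


# Assumed keyword name `accfun` for the configurable accuracy function (#1286).
# The predictor is a placeholder single linear layer.
model = L.Classifier(L.Linear(784, 10), accfun=my_accuracy)

# With #1204, the call may also take more than two arguments, e.g.
# model(x1, x2, t), where the last argument is treated as the label.
```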