
Dynamic Batch Scheduler Implementation #1200

Closed

wants to merge 1 commit

Conversation

AkshatSh

Summary:
This diff adds support for dynamic batch training in pytext. It creates a new batcher that computes the batch size based on the current epoch.

The diff implements two schedulers (see the sketch after this list):

  • Linear: increases the batch size linearly from the start batch size to the end batch size over the epoch period
  • Exponential: increases the batch size exponentially from the start batch size to the end batch size over the epoch period
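
For intuition, here is a minimal sketch of how the two schedules could map an epoch index to a batch size. The function names are hypothetical and the exact interpolation is an assumption; the configuration fields (`start_batch_size`, `end_batch_size`, `epoch_period`, `step_size`) are the ones defined in the API section below.

```
# Hypothetical sketch of the two scheduling policies; the actual
# implementation in pytext may differ in its details.

def linear_batch_size(epoch, start_batch_size=32, end_batch_size=256,
                      epoch_period=10, step_size=1):
    # Quantize the epoch so the batch size stays constant for
    # `step_size` epochs at a time, then interpolate linearly and
    # clamp at `end_batch_size`.
    step = (epoch // step_size) * step_size
    frac = min(step / epoch_period, 1.0)
    return int(start_batch_size + frac * (end_batch_size - start_batch_size))


def exponential_batch_size(epoch, start_batch_size=32, end_batch_size=256,
                           epoch_period=10, step_size=1):
    # Grow by a constant multiplicative factor per step so the size
    # reaches `end_batch_size` after `epoch_period` epochs.
    step = (epoch // step_size) * step_size
    frac = min(step / epoch_period, 1.0)
    growth = end_batch_size / start_batch_size
    return int(start_batch_size * growth ** frac)
```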

API

The dynamic batcher extends the pooling batcher, so most of the arguments carry over and stay consistent. It is important to note that dynamic batch sizing only affects the training batch size, not the eval or test batch sizes.

The dynamic batcher holds a new configuration object, `scheduler_config`, which contains the information needed to compute dynamic batch sizes, namely:

```
class SchedulerConfig(ModuleConfig):
    # the initial batch size used for training
    start_batch_size: int = 32

    # the final or max batch size to use; any scheduler should
    # not go over this batch size
    end_batch_size: int = 256

    # the number of epochs to increase the batch size over
    epoch_period: int = 10

    # the batch size is kept constant for `step_size` number of epochs
    step_size: int = 1
```
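
With the default values above, the hypothetical sketch from earlier would produce the following schedule (values rounded down to integers):

```
for epoch in range(0, 11, 2):
    print(epoch, linear_batch_size(epoch), exponential_batch_size(epoch))

# prints (epoch, linear, exponential):
# 0 32 32
# 2 76 48
# 4 121 73
# 6 166 111
# 8 211 168
# 10 256 256
```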

Paper: https://arxiv.org/abs/1711.00489

Reviewed By: seayoung1112, ArmenAg

Differential Revision: D18900677

@facebook-github-bot (Contributor)

This pull request was exported from Phabricator. Differential Revision: D18900677

AkshatSh pushed a commit to AkshatSh/pytext that referenced this pull request Dec 17, 2019
fbshipit-source-id: 4e0640bfa537c199420e07510387cde2fa217f57

AkshatSh pushed a commit to AkshatSh/pytext that referenced this pull request Dec 17, 2019
fbshipit-source-id: 63cff4597e04922804c3d551afb6435a6f719591

AkshatSh pushed a commit to AkshatSh/pytext that referenced this pull request Dec 17, 2019
fbshipit-source-id: 1f1e8ec850b84778a3e7a2c505a7c500de0ad2d5

AkshatSh pushed a commit to AkshatSh/pytext that referenced this pull request Dec 17, 2019
fbshipit-source-id: 65f3ce6d3ea5834a770ffa365c6dda371bbfc77b


@facebook-github-bot (Contributor)

This pull request has been merged in 3f8d293.
