Fixes #275 #1025

Open · wants to merge 2 commits into master
67 changes: 67 additions & 0 deletions docs/user/learningratescheduler.rst
==============================================
Support For Learning Rate Schedulers in Skorch
==============================================

Skorch, a powerful library for training PyTorch models, offers seamless integration with learning rate schedulers. Learning rate schedulers allow you to adapt the learning rate during training, leading to faster convergence and improved model performance. In this section, we'll explore how to use learning rate schedulers with Skorch to fine-tune your neural network training process.
Collaborator comment:

Suggested change: drop the opening clause "a powerful library for training PyTorch models," so the paragraph begins "Skorch offers seamless integration with learning rate schedulers."

This is very flattering but not necessary ;)


What is a Learning Rate Scheduler?
----------------------------------

A learning rate scheduler dynamically adjusts the learning rate during training. It can be a crucial component of your training pipeline, enabling you to control the step size for updating the model's weights as the training progresses.
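
A minimal plain-PyTorch sketch illustrates the idea (the tiny linear model, the 20-dimensional input, and the 30-epoch loop are arbitrary choices for illustration): ``StepLR`` halves the learning rate every 10 epochs whenever ``scheduler.step()`` is called.

.. code:: python

    import torch
    from torch.optim import SGD
    from torch.optim.lr_scheduler import StepLR

    model = torch.nn.Linear(20, 2)  # toy model, stands in for your real network
    optimizer = SGD(model.parameters(), lr=0.01)
    scheduler = StepLR(optimizer, step_size=10, gamma=0.5)

    for epoch in range(30):
        # ... forward pass, loss computation, backward pass, optimizer.step() ...
        scheduler.step()  # adjust the learning rate once per epoch
        print(epoch, optimizer.param_groups[0]["lr"])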

Using Learning Rate Schedulers in Skorch
Collaborator comment: Should this be a section header?

Skorch allows you to integrate PyTorch learning rate schedulers seamlessly into your training process. Here's a step-by-step guide on how to use them:

1. Create Your Neural Network Model

Before you can use a learning rate scheduler, you need to define your neural network model using PyTorch. For example:

.. code:: python

    import torch
    import torch.nn as nn

    class YourModel(nn.Module):
        def __init__(self):
            super(YourModel, self).__init__()
            # Define your layers here
Collaborator comment on lines +22 to +25:

Suggested change::

    class MyModule(nn.Module):
        def __init__(self):
            super().__init__()
            self.lin = torch.nn.Linear(20, 2)

        def forward(self, x):
            return self.lin(x)

Let's use a working example, it's not much longer. Also, the naming is more consistent with the other skorch examples.



2. Create Your Skorch NeuralNet

Now, create a Skorch NeuralNet that wraps your PyTorch model. Make sure to specify the optimizer and learning rate scheduler in the NeuralNet constructor. Below is an example using the StepLR learning rate scheduler:

.. code:: python

    from skorch import NeuralNet
    from skorch.callbacks import LRScheduler

    from torch.optim import SGD
    from torch.optim.lr_scheduler import StepLR

    net = NeuralNet(
        YourModel,
        criterion=nn.CrossEntropyLoss,
        optimizer=SGD,
        optimizer__lr=0.01,
        optimizer__momentum=0.9,
        iterator_train__shuffle=True,
        callbacks=[
            ('scheduler', LRScheduler(StepLR, step_size=10, gamma=0.5)),
        ],
    )

Collaborator comment on the ``YourModel,`` line:

Suggested change::

    MyModule,

Also, how about setting max_epochs=20? This way, the effect of the LR scheduler can be more clearly seen (since it steps at epoch 10).


In the example above, we set the optimizer to Stochastic Gradient Descent (SGD) and attach a StepLR learning rate scheduler with a step size of 10 and a decay factor of 0.5. You can customize the scheduler parameters to suit your needs.
Collaborator comment: Let's add that, as with any skorch callback, the parameters of the LRScheduler can be optimized via hyper-parameter search, e.g. to find the best step_size and gamma.
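
Picking up the reviewer's point, here is a hedged sketch of such a search. It reuses the working ``MyModule`` from the suggestion above, swaps in ``NeuralNetClassifier`` so that scikit-learn's default accuracy scoring is available, sets ``max_epochs=20`` as suggested, and assumes training data ``X_train``/``y_train`` like the arrays created in the reviewer's snippet under step 3 below; adjust these choices to your own setup.

.. code:: python

    from sklearn.model_selection import GridSearchCV
    from skorch import NeuralNetClassifier

    net = NeuralNetClassifier(
        MyModule,
        criterion=nn.CrossEntropyLoss,
        optimizer=SGD,
        optimizer__lr=0.01,
        max_epochs=20,
        callbacks=[
            ('scheduler', LRScheduler(StepLR, step_size=10, gamma=0.5)),
        ],
    )

    # Callback parameters are addressed as callbacks__<name>__<param>, so the
    # scheduler's settings can be searched over like any other hyper-parameter.
    param_grid = {
        'callbacks__scheduler__step_size': [5, 10, 20],
        'callbacks__scheduler__gamma': [0.3, 0.5, 0.7],
    }

    search = GridSearchCV(net, param_grid, cv=3)
    search.fit(X_train, y_train)
    print(search.best_params_)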


3. Train Your Model

With your Skorch NeuralNet defined and the learning rate scheduler attached, you can start training your model as you normally would with scikit-learn:

.. code:: python

    net.fit(X_train, y_train)
Collaborator comment:

If we add a tiny amount of code, the example you show can actually be run::

    import numpy as np
    from sklearn.datasets import make_classification

    X_train, y_train = make_classification(1000, 20, n_informative=10, random_state=0)
    X_train = X_train.astype(np.float32)
    y_train = y_train.astype(np.int64)


The learning rate scheduler will automatically adjust the learning rate during training based on the specified schedule. With the ``StepLR`` settings above, for example, the initial learning rate of 0.01 is halved to 0.005 at epoch 10, and would be halved again to 0.0025 at epoch 20 if training runs that long.

4. Monitor Training Progress

During training, Skorch will automatically keep you informed about the learning rate changes, allowing you to monitor the effect of the learning rate scheduler on your model's performance.
Collaborator comment: I think this needs to be extended a bit, as it doesn't explain how that works.

Learning rate schedulers are a valuable tool for fine-tuning neural network training, and Skorch simplifies their integration into your training pipeline. Experiment with different schedulers and monitor your model's progress to find the best strategy for your specific task. With Skorch, you have the flexibility to choose the scheduler that suits your needs, and you can easily adjust its parameters for optimal results.
Collaborator comment: I think this paragraph can be safely removed.