
first pass on the module #14

Merged
merged 5 commits into from
Jun 28, 2022

Conversation

paraynaud
Member

No description provided.

README.md Outdated
pkg> test KnetNLPModels
```

This step-by-step example suppose prior knowledge [julia](https://julialang.org/) and [Knet.jl](https://github.com/denizyuret/Knet.jl.git).

Suggested change
This step-by-step example suppose prior knowledge [julia](https://julialang.org/) and [Knet.jl](https://github.com/denizyuret/Knet.jl.git).
## Example
This step-by-step example assumes prior knowledge of [julia](https://julialang.org/) and [Knet.jl](https://github.com/denizyuret/Knet.jl.git).

- the values of the neural network variable $w$;
- the objective function $\mathcal{L}(X,Y;w)$ of the loss function $\mathcal{L}$ at the point $w$ for a given minibatch $X,Y$
- the gradient $\nabla \mathcal{L}(X,Y;w)$ of the loss function at the point $w$ for a given mini-batch $X,Y$
A `KnetNLPModel` gives the user access to:

Suggested change
A `KnetNLPModel` gives the user access to:
## Synopsis
A `KnetNLPModel` gives the user access to:


Move the "Synopsis" section above the "Example" header.

README.md Outdated
- the gradient $\nabla \mathcal{L}(X,Y;w)$ of the loss function at the point $w$ for a given mini-batch $X,Y$
A `KnetNLPModel` gives the user access to:
- the values of the neural network variables/weights `w`;
- the objective/loss function `L(X, Y; w)` of the loss function `L` at the point `w` for a given minibatch `(X,Y)`

Suggested change
- the objective/loss function `L(X, Y; w)` of the loss function `L` at the point `w` for a given minibatch `(X,Y)`
- the value of the objective/loss function `L(X, Y; w)` at `w` for a given minibatch `(X,Y)`;

README.md Outdated
A `KnetNLPModel` gives the user access to:
- the values of the neural network variables/weights `w`;
- the objective/loss function `L(X, Y; w)` of the loss function `L` at the point `w` for a given minibatch `(X,Y)`
- the gradient `∇L(X, Y; w)` of the objective/loss function at the point `w` for a given mini-batch `(X,Y)`

Suggested change
- the gradient `∇L(X, Y; w)` of the objective/loss function at the point `w` for a given mini-batch `(X,Y)`
- the gradient `∇L(X, Y; w)` of the objective/loss function at `w` for a given mini-batch `(X,Y)`.
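For context, these quantities are reached through the standard NLPModels.jl API; a minimal sketch, assuming a model named `DenseNetNLPModel` has already been built (the accessors below come from NLPModels.jl, not from this README):

```julia
using NLPModels

# Hypothetical sketch: `DenseNetNLPModel` is assumed to exist already.
w = DenseNetNLPModel.meta.x0    # initial vector of weights w
fw = obj(DenseNetNLPModel, w)   # loss L(X, Y; w) on the current minibatch (X, Y)
gw = grad(DenseNetNLPModel, w)  # gradient ∇L(X, Y; w) on the same minibatch
```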

README.md Outdated
- Switch the minibatch used to evaluate the neural network
- Measure the neural network's accuracy at the current point for a given testing mini-batch
In addition, it provides tools to:
- Switch the minibatch used to evaluate the neural network;

Suggested change
- Switch the minibatch used to evaluate the neural network;
- switch the minibatch used to evaluate the neural network;
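As an illustration of those two tools, a hedged sketch (the function names `reset_minibatch_train!` and `accuracy` are assumptions about the KnetNLPModels API, not taken from this diff):

```julia
# Hypothetical sketch; assumes `DenseNetNLPModel` was built earlier.
reset_minibatch_train!(DenseNetNLPModel)        # switch to a new training minibatch
acc = KnetNLPModels.accuracy(DenseNetNLPModel)  # accuracy on the current test minibatch
```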

README.md Outdated

## Default behavior
By default, the training minibatch that evaluates the neural network doesn't change between evaluations.
To change the training minibatch use:

Suggested change
To change the training minibatch use:
To change the training minibatch, use:

README.md Outdated
```
The size of the new minibatch is the size define previously.

Suggested change
The size of the new minibatch is the size define previously.
The size of the new minibatch is the size defined earlier.

README.md Outdated

The size of the training and testing minibatch can be set to `1/denominator` the size of the dataset with:

Suggested change
The size of the training and testing minibatch can be set to `1/denominator` the size of the dataset with:
The size of the training and test minibatches can be set to `1/p` the size of the dataset with:

README.md Outdated

The size of the training and testing minibatch can be set to `1/denominator` the size of the dataset with:
```julia
set_size_minibatch!(DenseNetNLPModel, denominator) # denominator::Int > 1

Suggested change
set_size_minibatch!(DenseNetNLPModel, denominator) # denominator::Int > 1
set_size_minibatch!(DenseNetNLPModel, p) # p::Int > 1
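To make the role of `p` concrete: with a training set of 60 000 samples (the MNIST size, used here purely as an illustration), `p = 100` would produce minibatches of 600 samples:

```julia
# Hypothetical illustration; assumes `DenseNetNLPModel` wraps a 60 000-sample dataset.
set_size_minibatch!(DenseNetNLPModel, 100)  # each minibatch holds 60_000 ÷ 100 = 600 samples
```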

@@ -1,118 +1,118 @@
# KnetNLPModels.jl Tutorial

Same changes here.

RAYNAUD Paul (raynaudp) added 2 commits June 27, 2022 14:43
@dpo dpo left a comment


You skipped a few of my comments. Please also apply them to the tutorial.

README.md Outdated

A `KnetNLPModel` gives the user access to:
- the values of the neural network variables/weights `w`;
- the value of the objective/loss function `L(X, Y; w)` at `w` for a given minibatch `(X,Y)`

Suggested change
- the value of the objective/loss function `L(X, Y; w)` at `w` for a given minibatch `(X,Y)`
- the value of the objective/loss function `L(X, Y; w)` at `w` for a given minibatch `(X,Y)`;

README.md Outdated
A `KnetNLPModel` gives the user access to:
- the values of the neural network variables/weights `w`;
- the value of the objective/loss function `L(X, Y; w)` at `w` for a given minibatch `(X,Y)`
- the gradient `∇L(X, Y; w)` of the objective/loss function at `w` for a given mini-batch `(X,Y)`

Suggested change
- the gradient `∇L(X, Y; w)` of the objective/loss function at `w` for a given mini-batch `(X,Y)`
- the gradient `∇L(X, Y; w)` of the objective/loss function at `w` for a given mini-batch `(X,Y)`.


I already suggested the changes above.

README.md Outdated

## Define the layers of interest
The following code define a dense layer as an evaluable julia structure.
This step-by-step example suppose prior knowledge [julia](https://julialang.org/) and [Knet.jl](https://github.com/denizyuret/Knet.jl.git).

Suggested change
This step-by-step example suppose prior knowledge [julia](https://julialang.org/) and [Knet.jl](https://github.com/denizyuret/Knet.jl.git).
This step-by-step example assumes prior knowledge of [julia](https://julialang.org/) and [Knet.jl](https://github.com/denizyuret/Knet.jl.git).

Also suggested before.
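
For reference, the dense layer this hunk refers to is usually written as a callable Julia struct, in the style of the Knet.jl tutorials; a minimal sketch (field and constructor names are illustrative):

```julia
using Knet

# A dense layer as an evaluable (callable) Julia structure.
struct Dense
  w   # weight matrix
  b   # bias vector
  f   # activation function
end

# Constructor: i inputs, o outputs, ReLU activation by default.
Dense(i::Int, o::Int, f = relu) = Dense(param(o, i), param0(o), f)

# Applying the layer: affine map followed by the elementwise activation.
(d::Dense)(x) = d.f.(d.w * x .+ d.b)
```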

@paraynaud
Member Author

Sorry, I did not see those.

@paraynaud
Member Author

I made PR #16 from this PR.
It incorporates the last remarks, and the CPU/GPU support that Farhad and I implemented.

@dpo
Member

dpo commented Jun 27, 2022

I made the PR #16 from this PR.
It incorporates the last remarks,

Please don't do that. Keep PRs minimal and orthogonal. We're ready to merge this one.

@dpo dpo mentioned this pull request Jun 27, 2022
@paraynaud paraynaud merged commit a41288d into main Jun 28, 2022
@paraynaud paraynaud deleted the repo_form branch January 9, 2023 22:33