Sync with python, Ftrl optimizer, new examples, bugfixes, tutorials, expanded/fixed docs, docker files for perl. (#6042)
sergeykolychev authored and piiswrong committed May 1, 2017
1 parent 3378bf5 commit fd43d68
Showing 33 changed files with 681 additions and 41 deletions.
8 changes: 8 additions & 0 deletions docker/Dockerfiles/Dockerfile.in.perl
@@ -0,0 +1,8 @@
# -*- mode: dockerfile -*-
# part of the dockerfile to install the perl binding

COPY install/perl.sh install/
RUN install/perl.sh && \
cd /mxnet/perl-package/AI-MXNetCAPI/ && perl Makefile.PL && make install && \
cd /mxnet/perl-package/AI-NNVMCAPI/ && perl Makefile.PL && make install && \
cd /mxnet/perl-package/AI-MXNet/ && perl Makefile.PL && make install
12 changes: 12 additions & 0 deletions docker/README.md
@@ -90,6 +90,18 @@ Available tags:

- mxnet/scala

### Perl

Hosted at https://hub.docker.com/r/mxnet/perl/

Perl version: 5.18.2

Available tags:

- mxnet/perl
- mxnet/perl:gpu


## How to build

The following command builds the default Python package
4 changes: 4 additions & 0 deletions docker/install/perl.sh
@@ -0,0 +1,4 @@
#!/usr/bin/env bash
# install libraries for mxnet's perl package on ubuntu
apt-get update && apt-get install -y libmouse-perl pdl cpanminus swig libgraphviz-perl
cpanm -q Function::Parameters
2 changes: 1 addition & 1 deletion docker/run.sh
@@ -2,7 +2,7 @@
# Build and push all docker containers

DEVICES=('cpu' 'gpu')
LANGUAGES=('python' 'julia' 'r-lang' 'scala')
LANGUAGES=('python' 'julia' 'r-lang' 'scala' 'perl')
for DEV in "${DEVICES[@]}"; do
for LANG in "${LANGUAGES[@]}"; do
./tool.sh build ${LANG} ${DEV}
2 changes: 1 addition & 1 deletion docker/tool.sh
@@ -10,7 +10,7 @@ function show_usage() {
echo ""
echo " COMMAND: build or commit."
echo " commit needs logined in docker hub"
echo " LANGUAGE: the language binding to buld, e.g. python, r-lang, julia, or scala"
echo " LANGUAGE: the language binding to buld, e.g. python, r-lang, julia, scala or perl"
echo " DEVICE: targed device, e.g. cpu, or gpu"
echo ""
}
25 changes: 25 additions & 0 deletions docs/get_started/build_from_source.md
@@ -462,3 +462,28 @@ Install the Julia package for MXNet with:
```bash
julia -e 'Pkg.add("MXNet")'
```

### Build the Perl package

Run the following commands from the MXNet source root directory to build the MXNet Perl package:

```bash
sudo apt-get install libmouse-perl pdl cpanminus swig libgraphviz-perl
cpanm -q -L "${HOME}/perl5" Function::Parameters

MXNET_HOME=${PWD}
export LD_LIBRARY_PATH=${MXNET_HOME}/lib
export PERL5LIB=${HOME}/perl5/lib/perl5

cd ${MXNET_HOME}/perl-package/AI-MXNetCAPI/
perl Makefile.PL INSTALL_BASE=${HOME}/perl5
make install

cd ${MXNET_HOME}/perl-package/AI-NNVMCAPI/
perl Makefile.PL INSTALL_BASE=${HOME}/perl5
make install

cd ${MXNET_HOME}/perl-package/AI-MXNet/
perl Makefile.PL INSTALL_BASE=${HOME}/perl5
make install
```
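
Once all three packages are installed, a minimal smoke test that the binding loads and can create arrays (a sketch; the calls mirror the `ones()` example used elsewhere in these docs):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use AI::MXNet qw(mx);

## create a 2x3 NDArray of ones and print it as a PDL object
my $x = mx->nd->ones([2, 3]);
print $x->aspdl;
```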
25 changes: 25 additions & 0 deletions docs/get_started/index.md
@@ -215,6 +215,18 @@ array([[ 3., 3., 3.],
[ 3., 3., 3.]], dtype=float32)
```

```perl
pdl> use AI::MXNet qw(mx)
pdl> $a = mx->sym->var('a')
pdl> $b = $a * 2 + 1
pdl> $c = $b->eval(args => { a => mx->nd->ones([2,3]) })
pdl> print @{$c}[0]->aspdl
[
[3 3 3]
[3 3 3]
]
```

Running the above code on a GPU is straightforward:

```python
@@ -230,6 +242,9 @@ Running the above code on a GPU is straightforward:
julia> a = mx.ones((2,3), mx.gpu())
```

```perl
pdl> $a = mx->nd->ones([2,3], ctx => mx->gpu())
```
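
As a hedged aside: if the Perl NDArray API mirrors the Python API this commit syncs with, an existing array can also be copied between devices:

```perl
pdl> $b = $a->copyto(mx->cpu(0))  ## assumed API, mirroring Python's NDArray copyto
```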
In addition, MXNet provides a large number of neural network layers and
training modules to facilitate developing deep learning algorithms.

@@ -243,6 +258,16 @@ training modules to facilitate developing deep learning algorithms.
>>> mod.fit(train_data, ctx=[mx.gpu(0), mx.gpu(1)]) # fit on the training data by using 2 GPUs
```

```perl
pdl> $data = mx->sym->var('data')
pdl> $fc1 = mx->sym->FullyConnected($data, num_hidden=>128)
pdl> $act1 = mx->sym->Activation($fc1, act_type=>"relu")
pdl> $fc2 = mx->sym->FullyConnected($act1, num_hidden=>10)
pdl> $loss = mx->sym->SoftmaxOutput($fc2)
pdl> $mod = mx->mod->Module($loss)
pdl> $mod->fit($train_data, ctx=>[mx->gpu(0), mx->gpu(1)]) # fit on the training data by using 2 GPUs
```
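
After fitting, the same module can produce predictions; a minimal sketch modeled on the `predict()` call in the calculator.pl example added by this commit (`$val_iter` is a placeholder data iterator):

```perl
pdl> $outputs = $mod->predict($val_iter)  ## forward pass over the iterator
```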

## Next Steps

* [Tutorials](http://mxnet.io/tutorials/index.html)
2 changes: 1 addition & 1 deletion docs/get_started/ubuntu_setup.md
@@ -286,7 +286,7 @@ To install the MXNet Scala package into your local Maven repository, run the fol
```
### Install the MXNet Package for Perl

Before you build MXNet for Scala from source code, you must complete [building the shared library](#build-the-shared-library). After you build the shared library, run the following command from the MXNet source root directory to build the MXNet Scala package:
Before you build MXNet for Perl from source code, you must complete [building the shared library](#build-the-shared-library). After you build the shared library, run the following commands from the MXNet source root directory to build the MXNet Perl package:

```bash
sudo apt-get install libmouse-perl pdl cpanminus swig libgraphviz-perl
4 changes: 4 additions & 0 deletions docs/tutorials/index.md
@@ -80,6 +80,10 @@ These tutorials introduce fundamental concepts in deep learning and their realiz

- [Basics](http://mxnet.io/tutorials/c++/basics.html)

### Perl

- [Calculator, handwritten digits and roboshakespeare](http://blogs.perl.org/users/sergey_kolychev/2017/04/machine-learning-in-perl-part2-a-calculator-handwritten-digits-and-roboshakespeare.html)

## Contributing Tutorials

Want to contribute an MXNet tutorial? To get started, download the [tutorial template](https://github.com/dmlc/mxnet/tree/master/example/MXNetTutorialTemplate.ipynb).
8 changes: 7 additions & 1 deletion perl-package/AI-MXNet/Changes
@@ -1,6 +1,12 @@
Revision history for Perl extension AI::MXNet

0.9504 18:59:45 PDT 2017
0.9506 Sat Apr 29 20:26:50 PDT 2017
- Ftrl optimizer, new tests, bugfixes.

0.9505 Sun Apr 23 21:26:04 PDT 2017
- Perplexity bugfix, two new examples.

0.9504 Wed Apr 19 18:59:45 PDT 2017
- LR Scheduler bugfix.

0.9503 Wed Apr 19 13:33:57 PDT 2017
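
The headline item in 0.9506 is the Ftrl optimizer. A minimal sketch of selecting it through `fit()`, assuming the Perl port registers it under the name 'ftrl' like the Python optimizer it syncs with (option values are illustrative):

```perl
## hypothetical usage; the fit() options follow the calculator.pl example below
$model->fit($train_iter,
    eval_data        => $eval_iter,
    optimizer        => 'ftrl',    ## assumed registered name, mirroring Python
    optimizer_params => { learning_rate => 0.1 },
    num_epoch        => 10,
);
```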
2 changes: 2 additions & 0 deletions perl-package/AI-MXNet/MANIFEST
@@ -1,9 +1,11 @@
META.yml
MANIFEST
examples/calculator.pl
examples/plot_network.pl
examples/char_lstm.pl
examples/get_ptb_data.sh
examples/lstm_bucketing.pl
examples/mnist.pl
examples/cudnn_lstm_bucketing.pl
Makefile.PL
Changes
2 changes: 1 addition & 1 deletion perl-package/AI-MXNet/META.json
@@ -43,5 +43,5 @@
}
},
"release_status" : "stable",
"version" : "0.9504"
"version" : "0.9506"
}
2 changes: 1 addition & 1 deletion perl-package/AI-MXNet/META.yml
@@ -23,4 +23,4 @@ requires:
GraphViz: '2.14'
Mouse: v2.1.0
PDL: '2.007'
version: '0.9504'
version: '0.9506'
2 changes: 1 addition & 1 deletion perl-package/AI-MXNet/Makefile.PL
@@ -27,7 +27,7 @@ my %WriteMakefileArgs = (
"GraphViz" => "2.14"
},
"TEST_REQUIRES" => {},
"VERSION" => "0.9504",
"VERSION" => "0.9506",
"test" => {
"TESTS" => "t/*.t"
}
2 changes: 1 addition & 1 deletion perl-package/AI-MXNet/README
@@ -1,5 +1,5 @@
This archive contains the distribution AI-MXNet,
version 0.9504:
version 0.9506:

Perl interface to MXNet machine learning library

138 changes: 138 additions & 0 deletions perl-package/AI-MXNet/examples/calculator.pl
@@ -0,0 +1,138 @@
#!/usr/bin/perl
use strict;
use warnings;
use AI::MXNet ('mx');

## preparing the samples
## to train our network
sub samples {
my($batch_size, $func) = @_;
# get samples
my $n = 16384;
## creates a pdl with $n rows and two columns with random
## floats in the range between 0 and 1
my $data = PDL->random(2, $n);
## creates the pdl with $n rows and one column of labels;
## each label is the sum, product, etc. of the two random
## values in the corresponding row of the data pdl
my $label = $func->($data->slice('0,:'), $data->slice('1,:'));
# partition into train/eval sets
my $edge = int($n / 8);
my $validation_data = $data->slice(":,0:@{[ $edge - 1 ]}");
my $validation_label = $label->slice(":,0:@{[ $edge - 1 ]}");
my $train_data = $data->slice(":,$edge:");
my $train_label = $label->slice(":,$edge:");
# build iterators around the sets
return(mx->io->NDArrayIter(
batch_size => $batch_size,
data => $train_data,
label => $train_label,
), mx->io->NDArrayIter(
batch_size => $batch_size,
data => $validation_data,
label => $validation_label,
));
}

## the network model
sub nn_fc {
my $data = mx->sym->Variable('data');
my $ln = mx->sym->exp(mx->sym->FullyConnected(
data => mx->sym->log($data),
num_hidden => 1,
));
my $wide = mx->sym->Concat($data, $ln);
my $fc = mx->sym->FullyConnected(
$wide,
num_hidden => 1
);
return mx->sym->MAERegressionOutput(data => $fc, name => 'softmax');
}

sub learn_function {
my(%args) = @_;
my $func = $args{func};
my $batch_size = $args{batch_size}//128;
my($train_iter, $eval_iter) = samples($batch_size, $func);
my $sym = nn_fc();

## call as ./calculator.pl 1 to just print the model and exit
if($ARGV[0]) {
my @dsz = @{$train_iter->data->[0][1]->shape};
my @lsz = @{$train_iter->label->[0][1]->shape};
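## splice below drops the leading batch dimension from the iterator
## shapes so our own $batch_size can be substituted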
my $shape = {
data => [ $batch_size, splice @dsz, 1 ],
softmax_label => [ $batch_size, splice @lsz, 1 ],
};
print mx->viz->plot_network($sym, shape => $shape)->graph->as_png;
exit;
}

my $model = mx->mod->Module(
symbol => $sym,
context => mx->cpu(),
);
$model->fit($train_iter,
eval_data => $eval_iter,
optimizer => 'adam',
optimizer_params => {
learning_rate => $args{lr}//0.01,
rescale_grad => 1/$batch_size,
lr_scheduler => AI::MXNet::FactorScheduler->new(
step => 100,
factor => 0.99
)
},
eval_metric => 'mse',
num_epoch => $args{epoch}//25,
);

# reshape the model so it can be called on 1 sample at a time
my $iter = mx->io->NDArrayIter(
batch_size => 1,
data => PDL->pdl([[ 0, 0 ]]),
label => PDL->pdl([[ 0 ]]),
);
$model->reshape(
data_shapes => $iter->provide_data,
label_shapes => $iter->provide_label,
);

# print the learned parameters, then wrap a helper around making predictions
my ($arg_params) = $model->get_params;
for my $k (sort keys %$arg_params)
{
print "$k -> ". $arg_params->{$k}->aspdl."\n";
}
return sub {
my($n, $m) = @_;
return $model->predict(mx->io->NDArrayIter(
batch_size => 1,
data => PDL->new([[ $n, $m ]]),
))->aspdl->list;
};
}

my $add = learn_function(func => sub {
my($n, $m) = @_;
return $n + $m;
});
my $sub = learn_function(func => sub {
my($n, $m) = @_;
return $n - $m;
}, batch_size => 50, epoch => 40);
my $mul = learn_function(func => sub {
my($n, $m) = @_;
return $n * $m;
}, batch_size => 50, epoch => 40);
my $div = learn_function(func => sub {
my($n, $m) = @_;
return $n / $m;
}, batch_size => 10, epoch => 80);


print "12345 + 54321 ≈ ", $add->(12345, 54321), "\n";
print "188 - 88 ≈ ", $sub->(188, 88), "\n";
print "250 * 2 ≈ ", $mul->(250, 2), "\n";
print "250 / 2 ≈ ", $div->(250, 2), "\n";

7 changes: 4 additions & 3 deletions perl-package/AI-MXNet/examples/char_lstm.pl
@@ -15,7 +15,7 @@
'gpus=s' => \(my $gpus ),
'kv-store=s' => \(my $kv_store = 'device'),
'num-epoch=i' => \(my $num_epoch = 25 ),
'lr=f' => \(my $lr = 0.01 ),
'lr=f' => \(my $lr = 0.001 ),
'optimizer=s' => \(my $optimizer = 'adam' ),
'mom=f' => \(my $mom = 0 ),
'wd=f' => \(my $wd = 0.00001 ),
@@ -208,8 +208,9 @@ package main;
learning_rate => $lr,
momentum => $mom,
wd => $wd,
clip_gradient => 1,
rescale_grad => 1/$batch_size
clip_gradient => 5,
rescale_grad => 1/$batch_size,
lr_scheduler => AI::MXNet::FactorScheduler->new(step => 1000, factor => 0.99)
},
initializer => mx->init->Xavier(factor_type => "in", magnitude => 2.34),
num_epoch => $num_epoch,
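
For context on the scheduler added above: assuming AI::MXNet::FactorScheduler matches the Python FactorScheduler it is ported from, the learning rate is multiplied by `factor` every `step` updates, i.e. lr = base_lr * factor**floor(num_update / step). A standalone construction sketch:

```perl
use AI::MXNet qw(mx);

## with step => 1000 and factor => 0.99 the learning rate shrinks by 1%
## every 1000 parameter updates (after 10,000 updates it is ~0.90 of base)
my $scheduler = AI::MXNet::FactorScheduler->new(step => 1000, factor => 0.99);
```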
