@@ -21,8 +21,9 @@ Read the paper [here](https://arxiv.org/abs/1902.06714).
  RMSProp, Adagrad, Adam, AdamW
* More than a dozen activation functions and their derivatives
* Loss functions and metrics: Quadratic, Mean Squared Error, Pearson Correlation, etc.
- * Loading dense and convolutional models from Keras HDF5 (.h5) files
* Data-based parallelism
+ * Loading dense and convolutional models from Keras HDF5 (.h5) files
+   (see the [nf-keras-hdf5](https://github.com/neural-fortran/nf-keras-hdf5) add-on)

### Available layers

@@ -51,14 +52,8 @@ cd neural-fortran
Required dependencies are:

* A Fortran compiler
- * [HDF5](https://www.hdfgroup.org/downloads/hdf5/)
-   (must be provided by the OS package manager or your own build from source)
- * [functional-fortran](https://github.com/wavebitscientific/functional-fortran),
-   [h5fortran](https://github.com/geospace-code/h5fortran),
-   [json-fortran](https://github.com/jacobwilliams/json-fortran)
-   (all handled by neural-fortran's build systems, no need for a manual install)
* [fpm](https://github.com/fortran-lang/fpm) or
-   [CMake](https://cmake.org) for building the code
+   [CMake](https://cmake.org) to build the code

Optional dependencies are:

@@ -79,23 +74,7 @@ Compilers tested include:
With gfortran, the following will create an optimized build of neural-fortran:

```
- fpm build \
-   --profile release \
-   --flag "-I$HDF5INC -L$HDF5LIB"
- ```
-
- HDF5 is now a required dependency, so you have to provide it to fpm.
- The above command assumes that the `HDF5INC` and `HDF5LIB` environment
- variables are set to the include and library paths, respectively, of your
- HDF5 install.
-
- If you use Conda, the following instructions work:
-
- ```
- conda create -n nf hdf5
- conda activate nf
- fpm build --profile release --flag "-I$CONDA_PREFIX/include -L$CONDA_PREFIX/lib -Wl,-rpath -Wl,$CONDA_PREFIX/lib"
- fpm test --profile release --flag "-I$CONDA_PREFIX/include -L$CONDA_PREFIX/lib -Wl,-rpath -Wl,$CONDA_PREFIX/lib"
+ fpm build --profile release
```

#### Building in parallel mode
@@ -106,25 +85,20 @@ Once installed, use the compiler wrappers `caf` and `cafrun` to build and execut
in parallel, respectively:

```
- fpm build \
-   --compiler caf \
-   --profile release \
-   --flag "-I$HDF5INC -L$HDF5LIB"
+ fpm build --compiler caf --profile release
```
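
To then run an example on multiple images, something along these lines may
work, assuming your fpm version supports the `--example` and `--runner`
options (the example name is only an illustration):

```
fpm run --example cnn_mnist --compiler caf --profile release --runner "cafrun -n 4"
```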

#### Testing with fpm

```
- fpm test \
-   --profile release \
-   --flag "-I$HDF5INC -L$HDF5LIB"
+ fpm test --profile release
```

For the time being, you need to specify the same compiler flags to `fpm test`
as you did in `fpm build` so that fpm knows it should use the same build
profile.
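
For example, with the release profile used above, the two commands should
match:

```
fpm build --profile release
fpm test --profile release
```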

- See [Fortran Package Manager](https://github.com/fortran-lang/fpm) for more info on fpm.
+ See the [Fortran Package Manager](https://github.com/fortran-lang/fpm) for more info on fpm.

### Building with CMake

@@ -156,8 +130,7 @@ cafrun -n 4 bin/mnist # run MNIST example on 4 cores
#### Building with a different compiler

If you want to build with a different compiler, such as Intel Fortran,
- set the `HDF5_ROOT` environment variable to the root path of your
- Intel HDF5 build, and specify `FC` when issuing `cmake`:
+ specify `FC` when issuing `cmake`:

```
FC=ifort cmake ..
@@ -213,6 +186,7 @@ You can configure neural-fortran by setting the appropriate options before
including the subproject.

The following should be added to the CMake file of your project:
+
```cmake
if(NOT TARGET "neural-fortran::neural-fortran")
  find_package("neural-fortran" REQUIRED)
@@ -230,11 +204,7 @@ examples, in increasing level of complexity:
3. [dense_mnist](example/dense_mnist.f90): Hand-written digit recognition
  (MNIST dataset) using a dense (fully-connected) network
4. [cnn_mnist](example/cnn_mnist.f90): Training a CNN on the MNIST dataset
- 5. [dense_from_keras](example/dense_from_keras.f90): Creating a pre-trained
-   dense model from a Keras HDF5 file and running the inference.
- 6. [cnn_from_keras](example/cnn_from_keras.f90): Creating a pre-trained
-   convolutional model from a Keras HDF5 file and running the inference.
- 7. [get_set_network_params](example/get_set_network_params.f90): Getting and
+ 5. [get_set_network_params](example/get_set_network_params.f90): Getting and
  setting hyperparameters of a network.

The examples also show you the extent of the public API that's meant to be
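For a quick flavor of the public API these examples exercise, here is a
minimal, hypothetical sketch (the names `nf`, `network`, `input`, `dense`,
`sgd`, `train`, and `predict` are assumed from the library's public interface
and may differ between versions; the examples above are the authoritative
reference):

```fortran
program api_sketch
  ! Hypothetical sketch only; procedure names and signatures are assumed
  ! and may differ between neural-fortran versions.
  use nf, only: network, input, dense, sgd
  implicit none

  type(network) :: net
  real :: x(3, 100), y(1, 100)

  ! Toy data: learn to predict the mean of 3 random inputs
  call random_number(x)
  y(1,:) = sum(x, dim=1) / 3

  ! A small dense network: 3 inputs, 5 hidden neurons, 1 output
  net = network([input(3), dense(5), dense(1)])

  ! Train with plain SGD for a few epochs
  call net % train(x, y, batch_size=10, epochs=20, &
                   optimizer=sgd(learning_rate=1.))

  print *, net % predict([0.5, 0.5, 0.5])
end program api_sketch
```
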