OMLT: Optimization and Machine Learning Toolkit
===============================================

Keras is a deep learning framework that integrates with TensorFlow's structure for
building and training artificial neural networks, and minimizes the number of user
actions required to construct accurate networks. OMLT (Optimization and Machine
Learning Toolkit) provides an interface to formulate machine learning models and
import Keras or ONNX models as Pyomo blocks. The provided tools include an interface
for accessing the Keras module (via the publicly available Python package) within
IDAES flowsheets.

.. toctree::
    :maxdepth: 1

    omlt-keras-options

Basic Usage
-----------

The main Keras-OMLT entry point is **keras_surrogate.KerasSurrogate**, which populates an IDAES `SurrogateObject` with the OMLT model. This object may then be passed directly to other IDAES methods for visualization or flowsheet integration (see the Visualizing Surrogate Model Results and OMLT Example sections below).

Data can be read in or simulated using available Python packages. The main arguments of the `keras_surrogate.KerasSurrogate` class are the inputs and outputs. For example,

.. code-block:: python

    import tensorflow as tf

    from idaes.core.surrogate.keras_surrogate import KerasSurrogate

    # selected settings for regression
    activation, optimizer, n_hidden_layers, n_nodes_per_layer = "tanh", "Adam", 2, 40
    loss, metrics = "mse", ["mae", "mse"]

    # build the Keras Sequential model
    model = tf.keras.Sequential()
    model.add(
        tf.keras.layers.Dense(
            units=n_nodes_per_layer, input_dim=len(input_labels), activation=activation
        )
    )
    # create n hidden layers
    for i in range(1, n_hidden_layers):
        model.add(tf.keras.layers.Dense(units=n_nodes_per_layer, activation=activation))
    model.add(tf.keras.layers.Dense(units=len(output_labels)))

    # train surrogate (calls optimizer on neural network and solves for weights)
    model.compile(loss=loss, optimizer=optimizer, metrics=metrics)
    # checkpoint callback that saves the best weights found during training
    mcp_save = tf.keras.callbacks.ModelCheckpoint(
        ".mdl_wts.hdf5", save_best_only=True, monitor="val_loss", mode="min"
    )
    history = model.fit(
        x=x, y=y, validation_split=0.2, verbose=1, epochs=1000, callbacks=[mcp_save]
    )

    keras_surrogate = KerasSurrogate(
        model,
        input_labels=list(input_labels),
        output_labels=list(output_labels),
        input_bounds=input_bounds,
        input_scaler=input_scaler,
        output_scaler=output_scaler,
    )

For an example on inputs, outputs, bounds and scalers see the `Autothermal Reformer Flowsheet Optimization with OMLT (TensorFlow Keras) Surrogate Object <https://github.com/IDAES/examples/blob/main/idaes_examples/notebooks/docs/surrogates/omlt/keras_flowsheet_optimization_src.ipynb>`_.

Saving and Loading OMLT-Keras Models
------------------------------------

The user may save their neural network objects by serializing to JSON, and load them into a different script, notebook or environment. For example,

.. code-block:: python

    # save the trained surrogate to a folder
    keras_surrogate.save_to_folder("keras_surrogate")

    # load the surrogate in another script, notebook or environment
    keras_surrogate = KerasSurrogate.load_from_folder("keras_surrogate")

Visualizing Surrogate Model Results
-----------------------------------

For visualizing TensorFlow Keras neural networks via parity and residual plots, see :ref:`Visualizing Surrogate Model Results<explanations/modeling_extensions/surrogate/plotting/index:Visualizing Surrogate Model Results>`.


OMLT Example
------------

For an example of optimizing a flowsheet containing TensorFlow Keras neural networks utilizing the OMLT package, see the `Autothermal Reformer Flowsheet Optimization with OMLT (TensorFlow Keras) Surrogate Object <https://github.com/IDAES/examples/blob/main/idaes_examples/notebooks/docs/surrogates/omlt/keras_flowsheet_optimization_src.ipynb>`_.
keras_surrogate Options
=======================

This page describes the Keras-OMLT options in more detail.

.. contents::
:depth: 2

Installing OMLT
----------------

OMLT (Optimization and Machine Learning Toolkit) is an optional dependency. The OMLT project provides an installation guide, a user manual and specific examples; users may access the user guide directly here: https://omlt.readthedocs.io/en/latest/index.html.

More details on OMLT options may be found in the user guide documentation linked above. If users encounter specific error codes while running the OMLT tool in IDAES, the user guide contains detailed descriptions of each termination condition and error message.
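
OMLT is distributed on PyPI, so in a typical environment with pip available it can be installed with:

.. code-block:: shell

    pip install omlt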

Basic Keras surrogate options
-----------------------------

Data Arguments
^^^^^^^^^^^^^^

The following arguments are required by the `KerasSurrogate` constructor:

* **keras_model**: this is the Keras Sequential model that will be loaded. Note that specialized layers may not be supported at this time
* **input_labels**: user-specified labels given to the inputs
* **output_labels**: user-specified labels given to the outputs
* **input_bounds**: minimum/maximum bounds for each input variable, used to constrain the inputs
* **input_scaler**: the scaler to be used for the inputs
* **output_scaler**: the scaler to be used for the outputs

.. code-block:: python

    keras_model = model
    input_labels = input_data.columns
    output_labels = output_data.columns
    input_bounds = {input_labels[i]: (xmin[i], xmax[i]) for i in range(len(input_labels))}
    input_scaler = OffsetScaler.create_normalizing_scaler(input_data)
    output_scaler = OffsetScaler.create_normalizing_scaler(output_data)

Provided Formulations
^^^^^^^^^^^^^^^^^^^^^

OMLT can formulate both full-space and reduced-space neural network representations, selected via the **FULL_SPACE** and **REDUCED_SPACE** formulation objects. The full-space formulation supports the non-smooth ReLU activation function via **RELU_BIGM**, and OMLT uses **RELU_COMPLEMENTARITY** to specify that ReLU activation functions should be formulated using complementarity conditions. For example,

.. code-block:: python

    m.fs.surrogate.build_model(
        keras_surrogate,
        formulation=KerasSurrogate.Formulation.FULL_SPACE,
        input_vars=inputs,
        output_vars=outputs,
    )


OMLT Layers
^^^^^^^^^^^

* **ConvLayer2D**: two-dimensional convolutional layer
* **DenseLayer**: dense layer implementing ``output = activation(dot(input, weights) + biases)``
* **IndexMapper**: maps indices from one layer to another
* **InputLayer**: the first layer in any network
* **Layer2D**: abstract two-dimensional layer that downsamples values in a kernel to a single value
* **PoolingLayer2D**: two-dimensional pooling layer
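
The **DenseLayer** computation above can be illustrated in plain NumPy; this is a minimal sketch of the stated formula with made-up weights and inputs, not OMLT's internal implementation:

.. code-block:: python

    import numpy as np

    def dense_layer(x, weights, biases, activation=np.tanh):
        # output = activation(dot(input, weights) + biases), as in DenseLayer
        return activation(np.dot(x, weights) + biases)

    # a 2-input, 3-node layer with made-up weights and biases
    weights = np.array([[0.5, -0.2, 0.1],
                        [0.3, 0.8, -0.4]])
    biases = np.array([0.0, 0.1, -0.1])
    x = np.array([1.0, 2.0])

    y = dense_layer(x, weights, biases)  # y.shape == (3,)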


Layer Functions
^^^^^^^^^^^^^^^

* **full_space.full_space_conv2d_layer**
* **full_space.full_space_dense_layer**
* **full_space.full_space_maxpool2d_layer**
* **reduced_space.reduced_space_dense_layer**
* **partition_based.default_partition_split_func**
* **partition_based.partition_based_dense_relu_layer**


Activation Functions
^^^^^^^^^^^^^^^^^^^^

* **linear**: applies the linear activation function
* **sigmoid**: applies the sigmoid function
* **softplus**: applies the softplus function
* **tanh**: applies the tanh function
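
As a quick reference, these activation functions can be written out in NumPy; this is a sketch of the standard mathematical definitions, not OMLT's formulation code:

.. code-block:: python

    import numpy as np

    def linear(x):
        return x

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def softplus(x):
        # log(1 + exp(x)), a smooth approximation of ReLU
        return np.log1p(np.exp(x))

    # tanh is available directly as np.tanh
    print(linear(2.0), sigmoid(0.0), softplus(0.0), np.tanh(0.0))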