
API - Layers

tensorlayer.layers

Layer list

Layer

Input

OneHot, Word2vecEmbedding, Embedding, AverageEmbedding

Dense, Dropout, GaussianNoise, DropconnectDense

UpSampling2d, DownSampling2d

Conv1d, Conv2d, Conv3d, DeConv2d, DeConv3d, DepthwiseConv2d, SeparableConv1d, SeparableConv2d, DeformableConv2d, GroupConv2d

PadLayer, PoolLayer, ZeroPad1d, ZeroPad2d, ZeroPad3d, MaxPool1d, MeanPool1d, MaxPool2d, MeanPool2d, MaxPool3d, MeanPool3d, GlobalMaxPool1d, GlobalMeanPool1d, GlobalMaxPool2d, GlobalMeanPool2d, GlobalMaxPool3d, GlobalMeanPool3d, CornerPool2d

SubpixelConv1d, SubpixelConv2d

SpatialTransformer2dAffine, transformer, batch_transformer

BatchNorm, BatchNorm1d, BatchNorm2d, BatchNorm3d, LocalResponseNorm, InstanceNorm, InstanceNorm1d, InstanceNorm2d, InstanceNorm3d, LayerNorm, GroupNorm, SwitchNorm

RNN, SimpleRNN, GRURNN, LSTMRNN, BiRNN

retrieve_seq_length_op, retrieve_seq_length_op2, retrieve_seq_length_op3, target_mask_op

Flatten, Reshape, Transpose, Shuffle

Lambda

Concat, Elementwise, ElementwiseLambda

ExpandDims, Tile

Stack, UnStack

Sign, Scale, BinaryDense, BinaryConv2d, TernaryDense, TernaryConv2d, DorefaDense, DorefaConv2d

PRelu, PRelu6, PTRelu6

flatten_reshape, initialize_rnn_state, list_remove_repeat

Base Layer

Layer

Input Layers

Input Layer

Input

One-hot Layer

OneHot

Word2Vec Embedding Layer

Word2vecEmbedding

Embedding Layer

Embedding

Average Embedding Layer

AverageEmbedding

Activation Layers

PReLU Layer

PRelu

PReLU6 Layer

PRelu6

PTReLU6 Layer

PTRelu6

Convolutional Layers

Convolutions

Conv1d

Conv1d

Conv2d

Conv2d

Conv3d

Conv3d

Deconvolutions

DeConv2d

DeConv2d

DeConv3d

DeConv3d

Deformable Convolutions

DeformableConv2d

DeformableConv2d

Depthwise Convolutions

DepthwiseConv2d

DepthwiseConv2d

Group Convolutions

GroupConv2d

GroupConv2d

Separable Convolutions

SeparableConv1d

SeparableConv1d

SeparableConv2d

SeparableConv2d

SubPixel Convolutions

SubpixelConv1d

SubpixelConv1d

SubpixelConv2d

SubpixelConv2d

Dense Layers

Dense Layer

Dense

Drop Connect Dense Layer

DropconnectDense

Dropout Layers

Dropout

Extend Layers

Expand Dims Layer

ExpandDims

Tile Layer

Tile

Image Resampling Layers

2D UpSampling

UpSampling2d

2D DownSampling

DownSampling2d

Lambda Layers

Lambda Layer

Lambda

ElementWise Lambda Layer

ElementwiseLambda

Merge Layers

Concat Layer

Concat

ElementWise Layer

Elementwise

Noise Layer

GaussianNoise

Normalization Layers

Batch Normalization

BatchNorm

Batch Normalization 1D

BatchNorm1d

Batch Normalization 2D

BatchNorm2d

Batch Normalization 3D

BatchNorm3d

Local Response Normalization

LocalResponseNorm

Instance Normalization

InstanceNorm

Instance Normalization 1D

InstanceNorm1d

Instance Normalization 2D

InstanceNorm2d

Instance Normalization 3D

InstanceNorm3d

Layer Normalization

LayerNorm

Group Normalization

GroupNorm

Switch Normalization

SwitchNorm

Padding Layers

Pad Layer (Expert API)

Padding layer for any mode and dimension (see the sketch below).

PadLayer
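A hypothetical usage sketch, assuming the TensorLayer 2.x functional API and that PadLayer forwards its padding and mode arguments to tf.pad::

    import tensorflow as tf
    import tensorlayer as tl

    ni = tl.layers.Input([8, 28, 28, 3])
    # Pad 2 rows/columns on each spatial side; the mode can be any
    # tf.pad mode: 'CONSTANT', 'REFLECT' or 'SYMMETRIC'.
    nn = tl.layers.PadLayer(padding=[[0, 0], [2, 2], [2, 2], [0, 0]],
                            mode='REFLECT')(ni)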

1D Zero padding

ZeroPad1d

2D Zero padding

ZeroPad2d

3D Zero padding

ZeroPad3d

Pooling Layers

Pool Layer (Expert API)

Pooling layer for any dimension and any pooling function (see the sketch below).

PoolLayer
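A hypothetical usage sketch, assuming the TensorLayer 2.x functional API: the pooling function itself is a parameter, so any compatible tf.nn pooling op can be plugged in::

    import tensorflow as tf
    import tensorlayer as tl

    ni = tl.layers.Input([8, 28, 28, 3])
    # Swap tf.nn.max_pool for tf.nn.avg_pool (or another compatible
    # pooling function) without changing the layer itself.
    nn = tl.layers.PoolLayer(filter_size=(1, 2, 2, 1),
                             strides=(1, 2, 2, 1),
                             padding='SAME',
                             pool=tf.nn.max_pool)(ni)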

1D Max pooling

MaxPool1d

1D Mean pooling

MeanPool1d

2D Max pooling

MaxPool2d

2D Mean pooling

MeanPool2d

3D Max pooling

MaxPool3d

3D Mean pooling

MeanPool3d

1D Global Max pooling

GlobalMaxPool1d

1D Global Mean pooling

GlobalMeanPool1d

2D Global Max pooling

GlobalMaxPool2d

2D Global Mean pooling

GlobalMeanPool2d

3D Global Max pooling

GlobalMaxPool3d

3D Global Mean pooling

GlobalMeanPool3d

2D Corner pooling

CornerPool2d

Quantized Nets

This is an experimental API package for building quantized neural networks. At the moment it uses matrix multiplication rather than add/minus and bit-count operations, so these APIs do not speed up inference. For production, you can train a model with TensorLayer and then deploy it in a customized C/C++ implementation (we may provide an extra C/C++ binary-net framework that can load models from TensorLayer).

Note that these experimental APIs may change in the future.
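As a rough illustration of the idea (not TensorLayer's exact implementation), the sign quantizer at the heart of these layers is usually built with a straight-through estimator, and the "binary" matmul is still an ordinary float matmul on quantized values::

    import tensorflow as tf

    @tf.custom_gradient
    def sign_ste(x):
        # Forward: quantize with tf.sign. Backward: straight-through
        # estimator that passes the gradient where |x| <= 1, since
        # tf.sign itself has zero gradient almost everywhere.
        def grad(dy):
            return dy * tf.cast(tf.abs(x) <= 1.0, dy.dtype)
        return tf.sign(x), grad

    # A "binarized" dense layer is then a float matmul on the
    # quantized weights, which is why these APIs alone do not
    # speed up inference.
    w = tf.Variable(tf.random.normal([784, 100]))
    x = tf.random.normal([32, 784])
    y = tf.matmul(x, sign_ste(w))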

Sign

Sign

Scale

Scale

Binary Dense Layer

BinaryDense

Binary Convolutions

BinaryConv2d

BinaryConv2d

Ternary Dense Layer

TernaryDense

TernaryDense

Ternary Convolutions

TernaryConv2d

TernaryConv2d

DoReFa Dense Layer

DorefaDense

DorefaDense

DoReFa Convolutions

DorefaConv2d

DorefaConv2d

Recurrent Layers

Common Recurrent layer

All recurrent layers can implement any type of RNN cell by being fed a different cell function (LSTM, GRU, etc.).
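A hypothetical sketch, assuming the TensorLayer 2.x functional API and its cell / return_last_output / return_last_state parameters::

    import tensorflow as tf
    import tensorlayer as tl

    ni = tl.layers.Input([32, 10, 8])  # (batch, time, features)
    # The same RNN wrapper runs different TensorFlow cells.
    lstm = tl.layers.RNN(cell=tf.keras.layers.LSTMCell(units=16),
                         return_last_output=True,
                         return_last_state=False)(ni)
    gru = tl.layers.RNN(cell=tf.keras.layers.GRUCell(units=16),
                        return_last_output=True,
                        return_last_state=False)(ni)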

RNN layer

RNN

RNN layer with Simple RNN Cell

SimpleRNN

RNN layer with GRU Cell

GRURNN

RNN layer with LSTM Cell

LSTMRNN

Bidirectional layer

BiRNN

Advanced Ops for Dynamic RNN

These operations are usually used inside a dynamic RNN layer; they compute sequence lengths for different situations and retrieve the last RNN outputs by indexing.
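As a rough sketch of what such an op computes (not the library's exact implementation): for a zero-padded batch of shape (batch, time, features), count the timesteps whose feature vector is not all zeros, then use the lengths to index the last valid output::

    import tensorflow as tf

    def seq_length(x):
        # Mark a timestep as used if any of its features is non-zero.
        used = tf.sign(tf.reduce_max(tf.abs(x), axis=2))  # (batch, time)
        return tf.cast(tf.reduce_sum(used, axis=1), tf.int32)

    x = tf.constant([[[1.0], [2.0], [0.0]],
                     [[3.0], [0.0], [0.0]]])  # lengths: [2, 1]
    lengths = seq_length(x)
    # The lengths can then index the last valid RNN output, e.g.:
    # last = tf.gather(outputs, lengths - 1, batch_dims=1)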

Compute Sequence length 1

retrieve_seq_length_op

Compute Sequence length 2

retrieve_seq_length_op2

Compute Sequence length 3

retrieve_seq_length_op3

Compute mask of the target sequence

target_mask_op

Shape Layers

Flatten Layer

Flatten

Reshape Layer

Reshape

Transpose Layer

Transpose

Shuffle Layer

Shuffle

Spatial Transformer

2D Affine Transformation

SpatialTransformer2dAffine

2D Affine Transformation function

transformer

Batch 2D Affine Transformation function

batch_transformer

Stack Layer

Stack Layer

Stack

Unstack Layer

UnStack

Helper Functions

Flatten tensor

flatten_reshape

Initialize RNN state

initialize_rnn_state

Remove repeated items in a list

list_remove_repeat