Merkurius

Merkurius is a portable deep learning (deep neural network) library for the .NET platform, written in C#.

Installation

You can install the Merkurius NuGet package using the .NET CLI:

> dotnet add package Merkurius

or from the NuGet Package Manager console:

PM> Install-Package Merkurius

Build

To build Merkurius, run the following .NET CLI command:

> dotnet build Merkurius.csproj

Example

A convolutional neural network (CNN):

// Placeholder dimensions (to be supplied by the caller): ch = input channels,
// iw/ih = input width/height, f = number of filters, fw/fh = filter width/height,
// mw/mh = feature-map width/height, pw/ph = pooling width/height,
// ow/oh = output width/height after pooling.
var model = new Model(
  new Convolution(ch, iw, ih, f, fw, fh, (fanIn, fanOut) => Initializers.HeNormal(fanIn),
  new Activation(new ReLU(),
  new MaxPooling(f, mw, mh, pw, ph,
  new FullyConnected(f * ow * oh, (fanIn, fanOut) => Initializers.HeNormal(fanIn),
  new Activation(new ReLU(),
  new Softmax(100, 10, (fanIn, fanOut) => Initializers.GlorotNormal(fanIn, fanOut))))))),
  new Adam(), new SoftmaxCrossEntropy());

model.Fit(trainingList, 50); // train the model on trainingList
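
The same building blocks can be combined into simpler architectures. The sketch below is a hypothetical two-layer fully connected classifier that reuses only the constructors shown above; the layer sizes (784 inputs, 100 hidden units, 10 outputs) are illustrative assumptions, not values taken from the library's samples.

```csharp
// Hypothetical sketch, not an official sample: a fully connected network
// built from the same constructors as the CNN example above.
var mlp = new Model(
  new FullyConnected(784, (fanIn, fanOut) => Initializers.HeNormal(fanIn), // 784 inputs (assumed)
  new Activation(new ReLU(),
  new Softmax(100, 10, (fanIn, fanOut) => Initializers.GlorotNormal(fanIn, fanOut)))),
  new Adam(), new SoftmaxCrossEntropy());

mlp.Fit(trainingList, 50);
```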

Features

  • .NET Standard 2.1 library
  • Code first modeling
  • Dependency-free

Activation Functions

  • ELU (Exponential linear unit)
  • Hyperbolic tangent
  • Identity
  • ReLU (Rectified linear unit)
  • SELU (Scaled exponential linear unit)
  • Sigmoid
  • Softmax
  • SoftPlus
  • Softsign
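
For reference, the less familiar of these activations have the following standard definitions (textbook forms, not excerpted from the library source):

```latex
\mathrm{ELU}(x) = \begin{cases} x & x > 0 \\ \alpha\,(e^{x} - 1) & x \le 0 \end{cases}
\qquad
\mathrm{SELU}(x) = \lambda \cdot \mathrm{ELU}(x), \quad \lambda \approx 1.0507,\ \alpha \approx 1.6733
```

```latex
\mathrm{softplus}(x) = \ln(1 + e^{x})
\qquad
\mathrm{softsign}(x) = \frac{x}{1 + |x|}
```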

Layers

  • Batch normalization
  • Convolution
  • Dropout
  • Embedding
  • Fully connected
  • GRU (Gated recurrent unit)
  • LSTM (Long short-term memory)
  • Max pooling
  • Recurrent

Loss Functions

  • Cross-entropy
  • Mean squared error (MSE)
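
Both losses follow their standard definitions, with $y_i$ the prediction, $t_i$ the target, and $N$ the number of samples (textbook forms, not excerpted from the library source):

```latex
\mathrm{MSE} = \frac{1}{N} \sum_{i=1}^{N} (y_i - t_i)^2
\qquad
\mathrm{CE} = -\sum_{i} t_i \log y_i
```

The SoftmaxCrossEntropy used in the example above pairs the cross-entropy loss with a softmax output layer.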

Optimizers

  • AdaDelta
  • AdaGrad
  • Adam
  • Momentum
  • Nesterov
  • RMSprop
  • SGD
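
As a quick comparison, the textbook update rules for three of these optimizers, with learning rate $\eta$, parameters $\theta$, and gradient $g_t = \nabla_\theta L$ (standard formulations; the library's hyperparameter defaults may differ):

```latex
\text{SGD:} \quad \theta \leftarrow \theta - \eta\, g_t
\\[4pt]
\text{Momentum:} \quad v \leftarrow \gamma v - \eta\, g_t, \qquad \theta \leftarrow \theta + v
\\[4pt]
\text{Adam:} \quad m_t = \beta_1 m_{t-1} + (1-\beta_1)\, g_t, \quad
v_t = \beta_2 v_{t-1} + (1-\beta_2)\, g_t^2, \quad
\theta \leftarrow \theta - \eta\, \frac{\hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}
```

Here $\hat{m}_t$ and $\hat{v}_t$ are the bias-corrected moment estimates $m_t / (1-\beta_1^t)$ and $v_t / (1-\beta_2^t)$.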