Merged
@@ -17,6 +17,11 @@
* SPDX-License-Identifier: Apache-2.0
******************************************************************************/


// MNISTSingleLayer.java
// Simple single-hidden-layer MLP for MNIST digit classification.
// Demonstrates basic feedforward networks in DL4J.

package org.deeplearning4j.examples.quickstart.modeling.feedforward.classification;

import org.deeplearning4j.datasets.iterator.impl.MnistDataSetIterator;
@@ -72,6 +77,9 @@ public static void main(String[] args) throws Exception {


log.info("Build model....");

// Build a single-hidden-layer MLP for MNIST (28x28 images flattened to 784 inputs)

MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
.seed(rngSeed) //include a random seed for reproducibility
// use stochastic gradient descent as an optimization algorithm
@@ -100,6 +108,7 @@ public static void main(String[] args) throws Exception {
log.info("Train model....");
model.fit(mnistTrain, numEpochs);

// Evaluate the model on the MNIST test dataset

log.info("Evaluate model....");
Evaluation eval = model.evaluate(mnistTest);
@@ -17,6 +17,11 @@
* SPDX-License-Identifier: Apache-2.0
******************************************************************************/


// ModelXOR.java
// Demonstrates solving the XOR problem using a small MLP.
// XOR is not linearly separable, so it requires at least one hidden layer.

package org.deeplearning4j.examples.quickstart.modeling.feedforward.classification;

import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
@@ -110,6 +115,9 @@ public static void main(String[] args) {

log.info("Network configuration and training...");

// Build a small 2-layer MLP for XOR classification


MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
.updater(new Sgd(0.1))
.seed(seed)
@@ -0,0 +1,84 @@
# Feedforward Neural Network Classification Examples – DeepLearning4J

This folder contains several feedforward neural network (MLP) classification examples using DeepLearning4J.
They demonstrate how to train neural networks on classic datasets such as MNIST, Iris, XOR, and synthetic datasets.

---

## 🧠 MNISTSingleLayer.java
A simple single-hidden-layer MLP for MNIST digit classification.

### What this example shows
- Loading MNIST data
- Building a minimal feedforward model
- Backpropagation training
- Evaluating test accuracy
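
A single-hidden-layer MLP for MNIST can be sketched with a DL4J builder configuration roughly like the one this example uses. The hyperparameters below (hidden size, learning rate, regularization) are illustrative, not necessarily the example's exact values:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.deeplearning4j.nn.weights.WeightInit;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.learning.config.Nesterovs;
import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction;

// 784 inputs (28x28 pixels flattened) -> one hidden layer -> 10 softmax outputs
MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
        .seed(123)                              // reproducible initialization
        .updater(new Nesterovs(0.006, 0.9))     // SGD with Nesterov momentum
        .l2(1e-4)                               // weight decay
        .list()
        .layer(new DenseLayer.Builder()
                .nIn(28 * 28).nOut(1000)
                .activation(Activation.RELU)
                .weightInit(WeightInit.XAVIER)
                .build())
        .layer(new OutputLayer.Builder(LossFunction.NEGATIVELOGLIKELIHOOD)
                .nIn(1000).nOut(10)
                .activation(Activation.SOFTMAX)
                .build())
        .build();
```

The configuration is then wrapped in a `MultiLayerNetwork` and trained with `model.fit(...)`, as in the example source.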

---

## 🧠 MNISTDoubleLayer.java
A deeper MLP with two hidden layers for MNIST.

### Why it's useful
- Shows the impact of depth on accuracy
- Good introduction to multi-layer feedforward networks

---

## 🌸 IrisClassifier.java
A classifier for the Iris flower dataset.

### What you learn
- Basic classification with a very small dataset
- How to use evaluation metrics
- Simple preprocessing
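
To make the evaluation metrics concrete, here is a from-scratch sketch (independent of DL4J, and the label arrays are hypothetical, not output of the Iris example) of accuracy plus per-class precision and recall, the quantities DL4J's evaluation reports:

```java
// Minimal accuracy / per-class precision & recall from predicted vs. true labels.
// The label arrays in main() are hypothetical, not output of the Iris example.
public class MetricsSketch {
    public static double accuracy(int[] truth, int[] pred) {
        int correct = 0;
        for (int i = 0; i < truth.length; i++) if (truth[i] == pred[i]) correct++;
        return (double) correct / truth.length;
    }

    // Precision for class c: TP / (TP + FP)
    public static double precision(int[] truth, int[] pred, int c) {
        int tp = 0, fp = 0;
        for (int i = 0; i < truth.length; i++)
            if (pred[i] == c) { if (truth[i] == c) tp++; else fp++; }
        return tp + fp == 0 ? 0 : (double) tp / (tp + fp);
    }

    // Recall for class c: TP / (TP + FN)
    public static double recall(int[] truth, int[] pred, int c) {
        int tp = 0, fn = 0;
        for (int i = 0; i < truth.length; i++)
            if (truth[i] == c) { if (pred[i] == c) tp++; else fn++; }
        return tp + fn == 0 ? 0 : (double) tp / (tp + fn);
    }

    public static void main(String[] args) {
        int[] truth = {0, 0, 1, 1, 2, 2};
        int[] pred  = {0, 1, 1, 1, 2, 0};
        System.out.printf("accuracy=%.3f precision(1)=%.3f recall(1)=%.3f%n",
                accuracy(truth, pred), precision(truth, pred, 1), recall(truth, pred, 1));
    }
}
```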

---

## 🧪 ModelXOR.java
A classic MLP solving the XOR problem.

### Why XOR?
- Not linearly separable
- Demonstrates why hidden layers and non-linear activations are needed

---

## 🌙 MoonClassifier.java
Binary classification on a synthetic two-moon dataset.

### What you learn
- Handling noisy 2D datasets
- Visualizing classification boundaries

---

## 🪐 SaturnClassifier.java
Classification of Saturn (concentric circles) synthetic dataset.

### What it shows
- Decision boundaries
- How MLPs learn non-linear patterns

---

## ✔ How to Run Any Example

Use the following command template:

```bash
mvn -q exec:java -Dexec.mainClass="org.deeplearning4j.examples.quickstart.modeling.feedforward.classification.<ClassName>"
```

Example:

```bash
mvn -q exec:java -Dexec.mainClass="org.deeplearning4j.examples.quickstart.modeling.feedforward.classification.MNISTSingleLayer"
```
---

## 🙌 Why This README Helps
These classification examples previously had no documentation.
This README improves clarity, explains datasets, and helps beginners understand each example.
@@ -17,6 +17,11 @@
* SPDX-License-Identifier: Apache-2.0
******************************************************************************/


// MNISTAutoencoder.java
// Demonstrates training an autoencoder on MNIST digit images.
// Autoencoders learn compressed representations (unsupervised learning).

package org.deeplearning4j.examples.quickstart.modeling.feedforward.unsupervised;

import org.apache.commons.lang3.tuple.ImmutablePair;
@@ -81,6 +86,8 @@ public static void main(String[] args) throws Exception {
.build())
.build();

// Build a simple autoencoder: Encoder → Bottleneck → Decoder

MultiLayerNetwork net = new MultiLayerNetwork(conf);
net.setListeners(Collections.singletonList(new ScoreIterationListener(10)));

@@ -101,6 +108,8 @@ public static void main(String[] args) throws Exception {
INDArray indexes = Nd4j.argMax(dsTest.getLabels(),1); //Convert from one-hot representation -> index
labelsTest.add(indexes);
}

// Train the autoencoder to minimize reconstruction loss

//Train model:
int nEpochs = 3;
@@ -154,8 +163,10 @@ public int compare(Pair<Double, INDArray> o1, Pair<Double, INDArray> o2) {
worst.add(list.get(list.size()-j-1).getRight());
}
}

// Evaluate reconstruction quality or print sample reconstructions

//Visualize by default
if (visualize) {
//Visualize the best and worst digits
MNISTVisualizer bestVisualizer = new MNISTVisualizer(2.0, best, "Best (Low Rec. Error)");
@@ -0,0 +1,43 @@
# Unsupervised Learning Examples – Autoencoder (DeepLearning4J)

This folder contains unsupervised learning examples implemented using DeepLearning4J.
The primary example in this directory demonstrates how to train an autoencoder on MNIST digits to perform dimensionality reduction and reconstruction.

---

## 🧠 MNISTAutoencoder.java

A simple autoencoder trained on the MNIST dataset (28×28 grayscale digit images).
Autoencoders learn to compress input data into a lower-dimensional representation and then reconstruct it.

### What this example shows
- How autoencoders work
- How to compress images into a bottleneck latent space
- How to reconstruct input images
- How unsupervised neural networks are trained

### Key Concepts
- **Encoder:** Compresses image → latent representation
- **Decoder:** Reconstructs latent representation → image
- **Loss Function:** Measures reconstruction quality
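
The encoder/bottleneck/decoder stack can be sketched as a DL4J builder configuration. The layer sizes and updater below are illustrative of the shape of this example's network, not guaranteed to match its exact values:

```java
import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.learning.config.AdaGrad;
import org.nd4j.linalg.lossfunctions.LossFunctions.LossFunction;

// 784 -> 250 -> 10 (bottleneck) -> 250 -> 784, trained to reproduce its input
MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
        .seed(12345)
        .updater(new AdaGrad(0.05))
        .list()
        .layer(new DenseLayer.Builder().nIn(784).nOut(250)
                .activation(Activation.RELU).build())        // encoder
        .layer(new DenseLayer.Builder().nIn(250).nOut(10)
                .activation(Activation.RELU).build())        // bottleneck
        .layer(new DenseLayer.Builder().nIn(10).nOut(250)
                .activation(Activation.RELU).build())        // decoder
        .layer(new OutputLayer.Builder(LossFunction.MSE)     // reconstruction loss
                .nIn(250).nOut(784)
                .activation(Activation.SIGMOID).build())
        .build();
```

Because the output layer targets the input itself (mean squared error between input and reconstruction), no labels are needed during training.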

### Expected Behavior
The autoencoder gradually learns to:
- Rebuild digit outlines
- Capture key features
- Reduce noise

This is not a classifier — it learns **patterns** without labels.

---

## ✔ How to Run

```bash
mvn -q exec:java -Dexec.mainClass="org.deeplearning4j.examples.quickstart.modeling.feedforward.unsupervised.MNISTAutoencoder"
```


---

## 🙌 Why This README Helps
The unsupervised folder previously had no explanation, run instructions, or conceptual overview.
This documentation improves clarity and helps beginners understand autoencoders and unsupervised learning techniques in DL4J.