NullLabTests/MNIST-train-DPL

MNIST Train DPL

In this example, we use DeepProbLog to train a neural network for MNIST digit recognition.

Running the Experiment

To run the experiment, execute the addition.py file:

python addition.py

Some experiments use a pre-trained neural network. To generate the pre-trained models, run the models/pretrained/create_pretrained.py script:

python models/pretrained/create_pretrained.py

Example Outputs

Pre-training

This image shows the initial state before training: the network has not yet learned to recognize the digits accurately.

Training Epochs

This image shows the training process over multiple epochs: as training progresses, the network's accuracy in recognizing the digits improves.

DeepProbLog: Neurosymbolic AI

DeepProbLog (DPL) qualifies as neurosymbolic AI when combined with a neural network (NN) because it integrates symbolic reasoning (logic programming) with subsymbolic learning (deep learning) in a unified framework.

Breaking It Down: How DeepProbLog is Neurosymbolic

To be considered a neurosymbolic system, a framework must combine:

  • Neural Networks (NNs) for handling raw data (e.g., images, speech, text).
  • Symbolic Reasoning (Logic-based AI) for structured reasoning, common sense, and knowledge representation.

DeepProbLog achieves this by:

  • Using Prolog (symbolic) for logical inference and knowledge representation.
  • Using Neural Networks (subsymbolic) as differentiable components that provide probabilistic evidence for logic-based reasoning.

How Does DeepProbLog Work?

DeepProbLog extends ProbLog (a probabilistic logic programming framework) by integrating deep neural networks as probabilistic predicates.

Symbolic Component (Logic Programming) – DeepProbLog (DPL)

  • You define logical rules in ProbLog, a probabilistic version of Prolog.
  • The logic engine reasons over these rules, applying symbolic inference.
  • The program can assign probabilities to logical facts based on neural network outputs.
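To make this concrete, a DeepProbLog program declares a *neural predicate* whose probabilities come from a network. The snippet below follows the neural-predicate syntax introduced in the DeepProbLog paper for the MNIST addition task; the network name `mnist_net` is the conventional one from that example, and the exact program shipped in this repository may differ:

```
% The nn/4 annotation binds the output distribution of the network
% mnist_net (over labels 0-9) to the probabilistic predicate digit/2.
nn(mnist_net, [X], Y, [0,1,2,3,4,5,6,7,8,9]) :: digit(X, Y).

% Purely symbolic rule: the sum of two images is the sum of their digits.
addition(X, Y, Z) :- digit(X, N1), digit(Y, N2), Z is N1 + N2.
```

Note how the logic rule never touches pixels: perception is delegated entirely to the neural predicate, while addition stays symbolic.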

Subsymbolic Component (Deep Learning) – Neural Network

  • A NN processes raw input data (e.g., images, text, speech).
  • The NN outputs a probability distribution over possible labels (e.g., classification results).
  • DeepProbLog treats these NN outputs as probabilistic facts in logic reasoning.
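As a minimal sketch of that last point, the snippet below shows how raw network scores (logits) become a probability distribution via a softmax, which DeepProbLog can then read as probabilistic facts. The logit values are made up for illustration; a real run would use the CNN defined in network.py:

```python
import math

def softmax(logits):
    """Convert raw NN scores into a probability distribution."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for one MNIST image (10 classes, digits 0-9).
logits = [0.1, 0.2, 0.1, 4.0, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1]
probs = softmax(logits)

# Each entry becomes a probabilistic fact: P(digit(img, d)) = probs[d].
most_likely = max(range(10), key=lambda d: probs[d])
print(most_likely)  # → 3
```

The key point is that the logic engine never sees the logits, only the normalized distribution over labels.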

Example: MNIST Addition Task in DeepProbLog

This is an arithmetic task using MNIST digits. The neurosymbolic reasoning works like this:

  1. Neural Network (NN) for Perception

    • The NN (CNN-based) recognizes individual handwritten digits.
    • It outputs a probability distribution over digit labels (0-9).
  2. Symbolic Reasoning for Addition

    • The logic program uses the NN outputs to perform arithmetic operations.
    • It combines the recognized digits to solve addition problems.
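The probabilistic reasoning in step 2 can be sketched in plain Python: the probability of a given sum is obtained by marginalizing over all digit pairs that produce it. This is a simplified illustration of the inference DeepProbLog performs, with made-up network outputs, not the library's actual API:

```python
from itertools import product

def addition_distribution(p1, p2):
    """P(sum = s) = sum over all d1 + d2 = s of P(digit1 = d1) * P(digit2 = d2)."""
    dist = [0.0] * 19  # possible sums of two digits: 0..18
    for d1, d2 in product(range(10), range(10)):
        dist[d1 + d2] += p1[d1] * p2[d2]
    return dist

# Hypothetical NN outputs: first image is almost surely a 3,
# second image is almost surely a 5.
p1 = [0.01] * 10; p1[3] = 0.91
p2 = [0.01] * 10; p2[5] = 0.91

dist = addition_distribution(p1, p2)
best_sum = max(range(19), key=lambda s: dist[s])
print(best_sum)  # → 8
```

Because the whole computation is differentiable, the loss on the sum can be backpropagated through this marginalization into the digit classifier, which is how DeepProbLog trains the network without per-digit labels.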

Directory Structure

  • addition_noisy.py: Script for noisy addition experiments.
  • addition.py: Main script for MNIST addition experiments.
  • data/: Directory containing data files.
  • models/: Directory containing model files.
  • neural_baseline/: Directory containing neural baseline files.
  • network.py: Script defining the neural network.
  • README.md: This file.
