be-thomas/darkai


DarkAI

This is a work-in-progress Artificial Intelligence library. It is merely an experiment, but I will keep building on it as and when I have free time.

Example code:

```python
import darkai
from darkai import backends

from darkai.observer import accuracy_observer, mean_squared_error_observer
from darkai.optimizer import gradient_descent
from darkai.supervised import perceptron
from darkai.plugins import numpy_support

import numpy as np

# Enable NumPy support before using numpy as a backend
numpy_support.enable(darkai)

# Training data (inputs for the logical AND function)
training_data = np.array([[0, 1], [1, 0], [0, 0], [1, 1]], np.float32)

# Expected output
expected_output = np.array([0, 0, 0, 1], np.float32)

# Create a perceptron with backend "numpy" (this library will be used for math)
p = perceptron(backends["numpy"])

# The activation function can be changed,
# e.g. p.set_activation_fn("sigmoid")
p.set_activation_fn(lambda a: a)

# Use gradient descent as the optimizer
p.set_optimizer(gradient_descent)

# We will observe the accuracy and MSE
accuracy = accuracy_observer()
accuracy.set_threshold(0.5)
mse = mean_squared_error_observer()

# Add the observers to the model
p.add_observer(accuracy)
p.add_observer(mse)

# Set the learning rate
p.optimizer.set_learning_rate(0.1)

# Train for 10,000 iterations
p.train_iters(10000, training_data, expected_output)

# Print training accuracy
print("Accuracy: ")
print(accuracy.data)
print()

# Print training MSE
print("MSE: ")
print(mse.data)
print()

# Predict
out = p.predict(training_data)
print("Predicted: ", out)
```
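For reference, the example above amounts to fitting a single linear unit to the AND truth table by gradient descent on the mean squared error. A minimal sketch in plain NumPy (independent of darkai; the variable names here are illustrative, not part of the library's API):

```python
import numpy as np

# AND-gate data, matching the darkai example above
X = np.array([[0, 1], [1, 0], [0, 0], [1, 1]], np.float32)
y = np.array([0, 0, 0, 1], np.float32)

rng = np.random.default_rng(0)
w = rng.normal(size=2).astype(np.float32)  # weights
b = np.float32(0.0)                        # bias
lr = 0.1                                   # learning rate

for _ in range(10000):
    pred = X @ w + b                 # identity activation, as in the example
    err = pred - y                   # prediction error
    w -= lr * (X.T @ err) / len(X)   # gradient of MSE w.r.t. weights
    b -= lr * err.mean()             # gradient of MSE w.r.t. bias

print("Predicted:", X @ w + b)
# Thresholding at 0.5 (the accuracy observer's threshold) recovers AND
print("Classified:", ((X @ w + b) > 0.5).astype(int))
```

With an identity activation the best MSE fit is the least-squares solution (roughly 0.25, 0.25, -0.25, 0.75 on these four inputs), so the raw outputs never reach exactly 0 and 1; thresholding at 0.5 is what turns them into correct class labels.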

About

A venture to gradually build an Artificial Intelligence library while learning the internals of various ML & DL models.

License

MIT (see LICENSE and LICENSE.txt)
