Implementation of classic machine learning concepts and algorithms from scratch, with the math behind their implementation. Written in Python (Jupyter Notebook).
Encrypted Search: A Probabilistic Estimator of Confidentiality
Entropy vs self-sequence alignment is a JavaScript implementation of a scanner that compares two methods: Shannon entropy (information entropy) and self-sequence alignment (information content). Information entropy (IE) and information content (IC) are two methods that quantitatively measure information.
A classification task in which LDA and DBSCAN are combined to perform intraclass outlier detection; an ad hoc feature selection process is then executed to reduce the high-dimensional (continuous and discrete) feature space.
Entropy is a measure of the uncertainty in a random variable. This application calculates the entropy of text; the current example calculates the entropy of the sequence "TTTAAGCC". In the context of information theory, the term "entropy" refers to the Shannon entropy.
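The repository's own code isn't shown here, but the calculation it describes is straightforward: count each symbol's frequency, convert to probabilities, and sum -p·log2(p). A minimal Python sketch on the same "TTTAAGCC" example:

```python
from collections import Counter
from math import log2

def shannon_entropy(seq):
    """Shannon entropy H = -sum(p * log2(p)) over the symbol frequencies."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# The sequence from the description above (T:3, A:2, G:1, C:2 over 8 symbols):
print(round(shannon_entropy("TTTAAGCC"), 3))  # 1.906
```

A sequence of four equally frequent symbols, e.g. "ACGT", gives exactly 2.0 bits, the maximum for a four-symbol alphabet.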
Using a dataset, we calculate and experiment with distances and selected features. Written in Python (Jupyter notebooks) using NumPy and pandas.
A single page app to showcase password entropy and how it is calculated
How does random number generation work on Linux systems?
EntroCalc - a password entropy calculator
Advanced statistical analysis accelerated with PyTorch.
A project exploring different algorithms and compression approaches, such as Huffman coding and entropy coding. The symbol distribution of the compressed files (images, e-books, etc.) is also analyzed.
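The project's code isn't shown, but Huffman coding itself is compact enough to sketch: repeatedly merge the two lowest-weight subtrees so that frequent symbols end up with shorter codes. A minimal Python version using a heap (an illustrative sketch, not the repository's implementation):

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman code table {symbol: bitstring} for the given text."""
    freq = Counter(text)
    # Heap entries: (weight, tiebreaker, partial code table for that subtree).
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate single-symbol input
        return {sym: "0" for sym in heap[0][2]}
    tie = len(heap)
    while len(heap) > 1:
        w1, _, left = heapq.heappop(heap)
        w2, _, right = heapq.heappop(heap)
        # Prefix each side's codes with 0/1 and merge the subtrees.
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (w1 + w2, tie, merged))
        tie += 1
    return heap[0][2]

codes = huffman_codes("TTTAAGCC")
encoded = "".join(codes[s] for s in "TTTAAGCC")
```

The resulting code is prefix-free, so the bitstream decodes unambiguously; its average length per symbol is bounded below by the Shannon entropy of the source.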
This MATLAB code finds the entropy of plain and cipher images.
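The MATLAB source isn't included here, but the usual approach is to take the entropy of the pixel-value histogram: a well-encrypted 8-bit image has a near-uniform histogram and entropy close to 8 bits/pixel, while a plain image scores much lower. A Python sketch using toy pixel lists as stand-ins for real images:

```python
from collections import Counter
from math import log2

def image_entropy(pixels):
    """Histogram entropy of an 8-bit image: -sum p(v) * log2 p(v) over
    the observed pixel values. Maximum is 8.0 for 256 uniform values."""
    n = len(pixels)
    hist = Counter(pixels)
    return -sum((c / n) * log2(c / n) for c in hist.values())

# Toy stand-in "images" (flat lists of 8-bit values, not real image files):
plain = [0] * 192 + [255] * 64    # mostly dark: low entropy
cipher = list(range(256))         # every value equally often: maximal entropy
print(round(image_entropy(plain), 3), image_entropy(cipher))  # 0.811 8.0
```

For real files one would flatten the grayscale pixel array first; the entropy gap between the plain and cipher histograms is the confidentiality indicator such analyses report.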
This is an application written in HTML5/JavaScript: a scanner that compares two methods, namely Shannon entropy (information entropy) and self-sequence alignment (information content). Information entropy (IE) and information content (IC) are two methods that quantitatively measure information.
SOFTWARE ENGINEERING: Introduction to the construction of reliable software. Topics may include software tools, software testing methodologies, retrofitting, regression testing, structured design and structured programming, software characteristics and quality, complexity, entropy, deadlock, fault tolerance, formal proofs of program correctness,…
Documentation website of Entropy framework for Deno.