
Graph Neural Network

Graph Neural Network in Natural Language Processing with SpaceX dataset Example

Text Classification using Graph Neural Network Example

LightGCN Supervised Task using GNN and PyTorch Example

Graph Neural Network in NLP with Movie Dataset Example

Sound Classification

  1. Sound of Notes Classification Example
  2. Guitar-chords dataset Sound Classification Example

Image Caption Generator

Image Caption Generator - Animal Dataset Example

Knowledge Graph

  1. Knowledge Graph Learning with spaCy Example
  2. Knowledge Graph with Machine Learning Example

Transformers

Transformers with Natural Language Processing Example

GPT2/GPT3

GPT2 generated Tweet Example

Natural Language Processing (NLP) using PyTorch

  1. PyTorch: Review Universal studio analysis with Hugging Face using PyTorch Example
  2. PyTorch: Sarcastic analysis with Hugging Face using PyTorch Example
  3. PyTorch: Luxury Product Apparel analysis with Hugging Face using PyTorch Example
  4. PyTorch: Sap press analysis with Hugging Face using PyTorch Example
  5. PyTorch: Tweet emotions analysis with Hugging Face using PyTorch Example
  6. PyTorch: Climate-change analysis with Hugging Face using PyTorch Example
  7. PyTorch: Sentiment analysis with Hugging Face using PyTorch Example
  8. PyTorch: Natural Language Processing with PyTorch Example
  9. PyTorch: Sentiment analysis with Hugging Face and BERT using PyTorch Example
  10. PyTorch: Emotion analysis with Hugging Face and BERT using PyTorch Example
  11. PyTorch: Training Word Embedding, Gensim and PyTorch Example
  12. PyTorch: Sentiment analysis physics vs chemistry vs biology with Gensim using sklearn Example
  13. PyTorch: Twitter sentiment analysis with Gensim using sklearn Example

Natural Language Processing (NLP) with TensorFlow

  1. TensorFlow: Netflix_titles using Deep Learning in Keras Example
  2. TensorFlow: Deep learning for NLP Example
  3. TensorFlow: Deep learning for NLP Example
  4. TensorFlow: Generating text using LSTM Example
  5. TensorFlow: Text Generation using LSTM Example
  6. TensorFlow: Text generation using Recurrent Neural Network with 034.txt Example
  7. TensorFlow: Logistic regression and Neural Network in Text classification Example
  8. TensorFlow: Text Classification with Deep Neural Network, Logistic regression Example
  9. TensorFlow: with IMDB Dataset Example
  10. TensorFlow: using LSTM with Vaccination tweets Example

Natural Language Processing with Python

  1. Natural Language Processing: Spark NLP & ML for Text Classification Example
  2. Natural Language Processing: using DataAnalyst.csv Example
  3. Natural Language Processing: using Full-Economic-News-DFE-839861.csv Example
  4. Natural Language Processing: Natural Language Processing with Python Example
  5. Natural Language Processing: Natural Language Processing Example
  6. Natural Language Processing: with SpaCy Example
  7. Natural Language Processing: using SpaCy Example
  8. Natural Language Processing: using KMeans method Example
  9. Natural Language Processing: using KMeans method with internet news data Example
  10. Natural Language Processing: Text Classification with Football-Scenarios-DFE-832307.csv Example
  11. Natural Language Processing: Text Classification using agreement-sentence-agreement-DFE dataset Example
  12. Natural Language Processing: Text Classification using vaccination_tweets Example
  13. Natural Language Processing: Text Classifications with covid19_tweets Example
  14. Natural Language Processing: using tripadvisor_hotel_review dataset Example
  15. Natural Language Processing: using indian_food dataset Example
  16. Natural Language Processing: using Apple-Twitter-Sentiment_DFE Example
  17. Natural Language Processing: using user_reviews_g1 Example

Machine learning with scikit-learn

  1. Scikit Learn: Regression Models with Decision Tree, Random Forest and XGBoost Example
  2. Scikit Learn: USA real estate dataset Example
  3. Scikit Learn: House price of Bengaluru_House_Data using Regression model Example
  4. Scikit Learn: kc house price prediction using regression Example

Within the field of machine learning, there are two main types of tasks: supervised and unsupervised. Scikit-learn provides a wide selection of supervised and unsupervised learning algorithms.

Supervised Learning

Supervised learning algorithms can be used to solve both classification and regression problems.

Linear Regression: This supervised learning algorithm is used to predict continuous, numerical values from the given input data. Linear regression tries to find the parameters of a linear function so that the distance between all the points and the line is as small as possible. Example
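
A minimal scikit-learn sketch of this idea, using synthetic data for illustration (not one of the repository's notebooks):

```python
# Fit a line y = a*x + b to noisy synthetic points and recover the parameters.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))              # single numeric feature
y = 3.0 * X[:, 0] + 2.0 + rng.normal(0, 1, 100)    # linear signal plus noise

model = LinearRegression().fit(X, y)
print(model.coef_, model.intercept_)               # recovered slope and intercept
print(model.predict([[5.0]]))                      # prediction for a new input
```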

Logistic regression: a tool for building models that predict a discrete set of classes. Logistic regression is commonly used when the response variable is categorical, and it relies on a more complex cost function based on the 'sigmoid function'. Example 1 and see also Example 2
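
A minimal scikit-learn sketch on a toy two-class problem, assuming synthetic data for illustration:

```python
# Binary classification with logistic regression; predict_proba exposes the
# sigmoid-based class probabilities.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=200, n_features=4, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(clf.predict_proba(X_test[:3]))   # class probabilities for a few test points
print(clf.score(X_test, y_test))       # accuracy on held-out data
```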

k-nearest neighbors: estimates how likely a point is to belong to one class or another depending on which class its 'k' nearest instances belong to. The k-nearest neighbors algorithm (KNN) is a non-parametric method and can be used for both classification and regression problems. Example
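
A minimal scikit-learn sketch with k = 5, using the bundled Iris dataset for illustration:

```python
# Each test point is labeled by a majority vote of its 5 nearest training points.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
print(knn.score(X_test, y_test))   # accuracy of the neighbor vote on held-out data
```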

Support vector machines (SVM): SVMs are a set of related supervised learning methods used for classification and regression. A support vector machine can be defined as a system that uses a hypothesis space of linear functions in a high-dimensional feature space, trained with a learning algorithm from optimization theory that implements a learning bias derived from statistical learning theory. Example
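
A minimal scikit-learn sketch, assuming a toy non-linear dataset; the RBF kernel plays the role of the high-dimensional feature space mentioned above:

```python
# SVM classification: the RBF kernel implicitly maps the moons data into a
# higher-dimensional space where a linear separator is learned.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_moons(n_samples=300, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

svm = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X_train, y_train)
print(svm.score(X_test, y_test))   # accuracy on held-out data
```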

Random Forest: Random forest is a supervised learning algorithm that can be used for both classification and regression. A random forest is made of many decision trees. Random forest has a variety of applications, such as recommendation engines, image classification and feature selection. Example
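
A minimal scikit-learn sketch, using the bundled Wine dataset for illustration; the feature importances hint at the feature-selection use case mentioned above:

```python
# Many decision trees vote on the class label; per-feature importances fall out
# of the ensemble for free.
from sklearn.datasets import load_wine
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(forest.score(X_test, y_test))      # accuracy on held-out data
print(forest.feature_importances_)       # relative importance of each input feature
```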

Naive Bayes: a supervised classification algorithm based on Bayes' theorem. Example
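
A minimal scikit-learn sketch with the Gaussian variant, using the Iris dataset for illustration:

```python
# Class-conditional feature likelihoods are combined with class priors via
# Bayes' theorem to score each class.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

nb = GaussianNB().fit(X_train, y_train)
print(nb.score(X_test, y_test))   # accuracy on held-out data
```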

Artificial Neural Networks (ANNs): ANNs are among the most commonly used tools in machine learning. A neural network is a statistical tool that interprets a set of features in the input data and tries to either classify the input (classification) or predict the output from a continuous input (regression). The process of creating a neural network in Python begins with the most basic form, a single perceptron, and extends to multilayer perceptrons, more commonly known as artificial neural networks. Example 1 and Example 2 and see also GitHub
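
A minimal sketch of a multilayer perceptron with scikit-learn's MLPClassifier (a small feed-forward network, used here for illustration rather than the repository's PyTorch examples):

```python
# One hidden layer of 64 units classifies the 8x8 digit images.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

mlp = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
mlp.fit(X_train, y_train)
print(mlp.score(X_test, y_test))   # accuracy on held-out digits
```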

Unsupervised learning

Unsupervised learning algorithms use a metric such as distance to identify how close a set of points are to each other and how far apart two such groups are.

k-Means clustering: K-means is one of the most commonly used clustering algorithms, which belong to the family of unsupervised machine learning models. It tries to find cluster centers that are representative of certain regions of the data. The specific algorithm you want to use may depend on the problem you are trying to solve and on which algorithms are available in the package you are using. Some of the first clustering algorithms consisted of simply finding the centroid positions that minimize the distances to all the points in each cluster, so that the points in each cluster are closer to their own centroid than to other cluster centroids. The hardest part is usually figuring out how many clusters there are. Example
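
A minimal scikit-learn sketch on synthetic blobs; the number of clusters (n_clusters=3) is assumed here to match how the toy data was generated:

```python
# k-means finds 3 centroids and assigns each point to its nearest centroid.
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=3, random_state=0)

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print(kmeans.cluster_centers_)   # learned centroid positions
print(kmeans.labels_[:10])       # cluster assignments of the first points
```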

Principal Component Analysis (PCA): PCA is an unsupervised learning method and an important technique to understand in the fields of statistics and data science/machine learning. PCA simplifies the complexity of high-dimensional data by transforming it into fewer dimensions that act as summaries of the features. PCA is fundamentally a dimensionality-reduction algorithm, but it can also be useful as a tool for visualization, noise filtering, feature extraction and engineering, and much more. Example
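
A minimal scikit-learn sketch, projecting the 64-dimensional digit images onto 2 principal components for visualization or as compact features:

```python
# PCA reduces 64 features to 2 while reporting how much variance each
# component captures.
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA

X, _ = load_digits(return_X_y=True)

pca = PCA(n_components=2)
X_2d = pca.fit_transform(X)
print(X_2d.shape)                      # (1797, 2): every image now has 2 coordinates
print(pca.explained_variance_ratio_)   # variance captured by each component
```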

Gaussian Mixture Models (GMM): a GMM is a probabilistic model which assumes that all generated data points are drawn from a mixture of a finite number of Gaussian distributions with unknown parameters. Gaussian mixture models are very useful for modeling data, especially data that comes from several groups. Example
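
A minimal scikit-learn sketch on synthetic blobs; three mixture components are assumed here to match the toy data, and the EM algorithm estimates the unknown means, covariances and mixing weights:

```python
# Fit a 3-component Gaussian mixture and inspect hard and soft cluster assignments.
from sklearn.datasets import make_blobs
from sklearn.mixture import GaussianMixture

X, _ = make_blobs(n_samples=300, centers=3, random_state=0)

gmm = GaussianMixture(n_components=3, random_state=0).fit(X)
print(gmm.means_)                # estimated component means
print(gmm.predict(X[:5]))        # hard cluster assignments
print(gmm.predict_proba(X[:5]))  # soft (probabilistic) assignments
```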
