Quantum Machine Learning Projects

This directory contains implementations of quantum machine learning algorithms that leverage quantum computing for pattern recognition, classification, and data analysis.

Projects

1. Variational Quantum Classifier (VQC) (variational_quantum_classifier.py)

Difficulty: Advanced
Description: Quantum machine learning classifier using parameterized quantum circuits for supervised learning tasks.

Key Features:

  • Multiple quantum feature maps (ZZ, angle encoding, basis encoding)
  • Various ansatz types (EfficientSU2, RealAmplitudes, TwoLocal)
  • Classical optimizers (ADAM, SPSA, COBYLA, L-BFGS-B)
  • Automatic dimensionality reduction with PCA
  • Comprehensive evaluation metrics and visualization
  • Support for multiple datasets (Iris, Wine, Breast Cancer, synthetic)
  • Decision boundary visualization for 2D datasets
  • Training convergence analysis

Supported Datasets:

  • Iris: Classic flower classification (binary version)
  • Wine: Wine classification dataset
  • Breast Cancer: Medical diagnosis classification
  • Moons: Synthetic 2D crescent-shaped clusters
  • Circles: Synthetic 2D concentric circles
  • Synthetic: Custom classification problems
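
The synthetic datasets correspond to scikit-learn's standard generators; a minimal sketch of how such data could be produced (the sample counts and noise levels here are illustrative, not the values hard-coded in the loader):

from sklearn.datasets import make_moons, make_circles, make_classification

# Crescent-shaped clusters with a little label noise
X_moons, y_moons = make_moons(n_samples=200, noise=0.1, random_state=42)

# Concentric circles, not linearly separable
X_circles, y_circles = make_circles(n_samples=200, noise=0.05, factor=0.5, random_state=42)

# Generic classification problem with a configurable number of features
X_synth, y_synth = make_classification(n_samples=200, n_features=4,
                                       n_informative=2, n_redundant=0,
                                       random_state=42)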

Usage:

from sklearn.model_selection import train_test_split
from variational_quantum_classifier import VariationalQuantumClassifier, QuantumDatasetLoader

# Load dataset
X, y = QuantumDatasetLoader.load_iris(binary=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3)

# Create VQC
vqc = VariationalQuantumClassifier(
    num_qubits=2,  # 2 features
    num_classes=2,
    feature_map_type='zz',
    ansatz_type='efficient_su2',
    optimizer='adam',
    reps=2
)

# Train and evaluate
vqc.train(X_train, y_train, max_iter=100)
evaluation = vqc.evaluate(X_test, y_test)
print(f"Accuracy: {evaluation['accuracy']:.3f}")

Feature Maps:

  • ZZ Feature Map: Encodes data using ZZ interactions
  • Angle Encoding: Encodes features as rotation angles
  • Basis Encoding: Encodes features in computational basis
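
For reference, the ZZ and angle encodings can be built directly with Qiskit's circuit library; a minimal sketch (the qubit count and `reps` value are illustrative, and this is not the classifier's internal code):

import numpy as np
from qiskit import QuantumCircuit
from qiskit.circuit import ParameterVector
from qiskit.circuit.library import ZZFeatureMap

num_features = 2

# ZZ feature map: Hadamards plus ZZ-type entangling phases between feature pairs
zz_map = ZZFeatureMap(feature_dimension=num_features, reps=2)

# Angle encoding: each feature becomes a single-qubit rotation angle
x = ParameterVector('x', num_features)
angle_map = QuantumCircuit(num_features)
for i in range(num_features):
    angle_map.ry(x[i], i)

# Bind concrete data to the parameterized circuits
sample = np.array([0.3, -1.2])
bound_zz = zz_map.assign_parameters(sample)
bound_angle = angle_map.assign_parameters(sample)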

2. Quantum Kernel Estimation (Coming Soon)

Difficulty: Advanced
Description: Quantum kernel methods for support vector machines and other kernel-based algorithms.

Planned Features:

  • Quantum kernel estimation
  • Quantum support vector machines
  • Kernel alignment optimization
  • Quantum advantage analysis

3. Quantum Generative Models (Coming Soon)

Difficulty: Advanced
Description: Quantum generative adversarial networks and variational autoencoders.

Planned Features:

  • Quantum GANs
  • Quantum variational autoencoders
  • Quantum Boltzmann machines
  • Quantum circuit Born machines

Quantum Feature Encoding

Feature Map Selection

# ZZ feature map (good for pairwise interactions)
feature_map = 'zz'

# Angle encoding (simple and effective)
feature_map = 'angle'

# Basis encoding (for specific applications)
feature_map = 'basis'

Ansatz Selection

# Hardware-efficient ansatz
ansatz = 'efficient_su2'

# Real-valued rotations
ansatz = 'real_amplitudes'

# Customizable two-local
ansatz = 'two_local'
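
These names correspond to standard ansätze in Qiskit's circuit library; a minimal sketch of how they might be instantiated (the qubit count and `reps` are illustrative):

from qiskit.circuit.library import EfficientSU2, RealAmplitudes, TwoLocal

num_qubits = 2

# Hardware-efficient layers of single-qubit SU(2) rotations with CX entanglers
efficient_su2 = EfficientSU2(num_qubits, reps=2)

# RY rotations only, so the prepared amplitudes stay real
real_amplitudes = RealAmplitudes(num_qubits, reps=2)

# Fully customizable rotation and entangling blocks
two_local = TwoLocal(num_qubits, rotation_blocks='ry',
                     entanglement_blocks='cz', entanglement='linear', reps=2)

print(efficient_su2.num_parameters, real_amplitudes.num_parameters, two_local.num_parameters)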

Data Preprocessing

Automatic Dimensionality Reduction

# VQC automatically handles high-dimensional data
# Uses PCA to reduce to number of qubits
# Scales features to [-π, π] range
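
A minimal sketch of what that preprocessing amounts to, written as a hypothetical standalone helper (one feature per qubit; this is not the classifier's internal implementation):

import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import MinMaxScaler

def reduce_for_qubits(X, num_qubits):
    """Project X onto `num_qubits` principal components and scale to [-pi, pi]."""
    X_reduced = PCA(n_components=num_qubits).fit_transform(X)
    return MinMaxScaler(feature_range=(-np.pi, np.pi)).fit_transform(X_reduced)

# Example: 30-feature Breast Cancer data squeezed onto 4 qubits
# X_4q = reduce_for_qubits(X, num_qubits=4)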

Manual Preprocessing

import numpy as np
from sklearn.preprocessing import StandardScaler, MinMaxScaler

# Standard scaling
scaler = StandardScaler()
X_scaled = scaler.fit_transform(X)

# Min-max scaling to [-π, π]
scaler = MinMaxScaler(feature_range=(-np.pi, np.pi))
X_scaled = scaler.fit_transform(X)

Requirements

  • Qiskit >= 0.45.0
  • Qiskit Machine Learning >= 0.6.0
  • Scikit-learn >= 1.3.0
  • NumPy >= 1.24.0
  • Matplotlib >= 3.7.0
  • Seaborn >= 0.12.0

Running the Projects

# Run VQC demonstration
python variational_quantum_classifier.py

# Test specific dataset
python -c "from variational_quantum_classifier import VQCDemo; 
          demo = VQCDemo(); 
          results = demo.run_iris_classification(); 
          print(f'Accuracy: {results[\"accuracy\"]:.3f}')"

Learning Objectives

  1. Understand quantum feature encoding
  2. Implement variational quantum algorithms
  3. Compare quantum vs classical ML performance
  4. Optimize quantum circuits for ML
  5. Analyze quantum ML advantages and limitations

Advanced Topics

Quantum Advantage Analysis

  • Kernel methods: Exponential feature space
  • Expressivity: Quantum circuit capacity
  • Trainability: Barren plateaus and gradients
  • Generalization: Quantum model complexity
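
One common diagnostic for barren plateaus is to sample random parameter vectors and check whether the variance of the cost gradient collapses toward zero as the circuit grows; a rough sketch, assuming a generic `cost_fn(params)` such as the VQC training loss (the function name is a placeholder):

import numpy as np

def gradient_variance(cost_fn, num_params, num_samples=50, eps=1e-3, seed=0):
    """Estimate Var[dC/dtheta_0] over random parameter initializations."""
    rng = np.random.default_rng(seed)
    grads = []
    for _ in range(num_samples):
        theta = rng.uniform(-np.pi, np.pi, num_params)
        shift = np.zeros(num_params)
        shift[0] = eps
        # Central finite difference of the cost along the first parameter
        grads.append((cost_fn(theta + shift) - cost_fn(theta - shift)) / (2 * eps))
    return np.var(grads)

# A variance that shrinks rapidly as qubits are added suggests a barren plateau.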

Error Mitigation

  • Zero-noise extrapolation: Reduce noise impact
  • Symmetry verification: Enforce constraints
  • Measurement error mitigation: Improve readout
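
As an illustration of the idea behind zero-noise extrapolation (not the mitigation API of any particular framework): the same observable is measured at artificially amplified noise levels, then the results are fit and extrapolated back to the zero-noise limit.

import numpy as np

def zero_noise_extrapolate(scale_factors, expectation_values, degree=1):
    """Fit <O>(lambda) over noise scale factors and evaluate the fit at lambda = 0."""
    coeffs = np.polyfit(scale_factors, expectation_values, degree)
    return np.polyval(coeffs, 0.0)

# Usage: measure with the circuit's noise scaled by 1x, 2x, 3x (e.g. via gate
# folding), then extrapolate:
# mitigated = zero_noise_extrapolate([1, 2, 3], [e1, e2, e3])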

Quantum Data

  • Quantum datasets: Naturally quantum data
  • Quantum feature maps: Quantum-to-quantum encoding
  • Hybrid classical-quantum: Best of both worlds

Performance Analysis

Classical vs Quantum Comparison

# Compare with a classical SVM baseline
from sklearn.svm import SVC
classical_svm = SVC(kernel='rbf')
classical_svm.fit(X_train, y_train)
svm_accuracy = classical_svm.score(X_test, y_test)

# Compare with a classical neural network baseline
from sklearn.neural_network import MLPClassifier
classical_nn = MLPClassifier(hidden_layer_sizes=(10, 10), max_iter=1000)
classical_nn.fit(X_train, y_train)
nn_accuracy = classical_nn.score(X_test, y_test)
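
To put the numbers side by side, the baseline scores above can be printed next to the VQC result (this assumes the `evaluation` dictionary from the Usage example earlier in this README):

print(f"VQC accuracy:           {evaluation['accuracy']:.3f}")
print(f"Classical SVM accuracy: {svm_accuracy:.3f}")
print(f"Classical MLP accuracy: {nn_accuracy:.3f}")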

Quantum Resource Analysis

  • Circuit depth: Affects noise tolerance
  • Parameter count: Training complexity
  • Gate count: Hardware requirements
  • Measurement shots: Statistical accuracy
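
Most of these quantities can be read off a candidate ansatz before training; a minimal sketch using Qiskit's circuit introspection (the 4-qubit EfficientSU2 is just an example):

from qiskit.circuit.library import EfficientSU2

ansatz = EfficientSU2(4, reps=2).decompose()  # expand the library blueprint into gates

print("Circuit depth:   ", ansatz.depth())
print("Parameter count: ", ansatz.num_parameters)
print("Gate counts:     ", dict(ansatz.count_ops()))
# Shot count is chosen at execution time and trades runtime for statistical accuracy.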

Next Steps

After mastering VQC:

  1. Implement quantum kernel methods
  2. Try quantum transfer learning
  3. Explore quantum federated learning
  4. Study quantum reinforcement learning
  5. Investigate quantum natural language processing

Dataset Guidelines

Small Datasets (< 1000 samples)

  • Iris: 150 samples, 4 features
  • Wine: 178 samples, 13 features
  • Breast Cancer: 569 samples, 30 features

Synthetic Datasets

  • Moons: Adjustable complexity and noise
  • Circles: Non-linearly separable
  • Classification: Generic synthetic problems with customizable features, classes, and noise

Best Practices

  1. Start with small datasets for faster training
  2. Use 2-4 qubits initially to reduce complexity
  3. Compare with classical baselines to assess quantum advantage
  4. Monitor training convergence to detect barren plateaus
  5. Use appropriate feature scaling for quantum encoding

Quantum vs Classical Performance

When Quantum Helps

  • High-dimensional feature spaces: Kernel advantage
  • Small datasets: Quantum feature maps
  • Specific problem structures: Quantum correlations

When Classical is Better

  • Large datasets: Classical scalability
  • Simple problems: Quantum overhead
  • Limited quantum resources: Noise limitations

Future Directions

Near-term (NISQ) Applications

  • Quantum kernels: Exponential feature spaces
  • Hybrid models: Classical + quantum components
  • Small-scale problems: Proof-of-principle demonstrations

Long-term (Fault-tolerant) Applications

  • Quantum neural networks: Deep quantum circuits
  • Quantum optimization: Learning as optimization
  • Quantum data: Naturally quantum datasets
