This directory contains implementations of quantum machine learning algorithms that leverage quantum computing for pattern recognition, classification, and data analysis.
Variational Quantum Classifier (`variational_quantum_classifier.py`)

Difficulty: Advanced
Description: Quantum machine learning classifier using parameterized quantum circuits for supervised learning tasks.
Key Features:
- Multiple quantum feature maps (ZZ, angle encoding, basis encoding)
- Various ansatz types (EfficientSU2, RealAmplitudes, TwoLocal)
- Classical optimizers (ADAM, SPSA, COBYLA, L-BFGS-B)
- Automatic dimensionality reduction with PCA
- Comprehensive evaluation metrics and visualization
- Support for multiple datasets (Iris, Wine, Breast Cancer, synthetic)
- Decision boundary visualization for 2D datasets
- Training convergence analysis
Supported Datasets:
- Iris: Classic flower classification (binary version)
- Wine: Wine classification dataset
- Breast Cancer: Medical diagnosis classification
- Moons: Synthetic 2D crescent-shaped clusters
- Circles: Synthetic 2D concentric circles
- Synthetic: Custom classification problems
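The synthetic 2D datasets above correspond to standard scikit-learn generators; a minimal sketch of producing them directly (assuming the loader wraps these generators):

```python
from sklearn.datasets import make_moons, make_circles, make_classification

# Crescent-shaped clusters with tunable noise
X_moons, y_moons = make_moons(n_samples=200, noise=0.15, random_state=0)

# Concentric circles (not linearly separable)
X_circles, y_circles = make_circles(n_samples=200, noise=0.05, factor=0.5, random_state=0)

# Custom classification problem with adjustable parameters
X_syn, y_syn = make_classification(n_samples=200, n_features=2, n_informative=2,
                                   n_redundant=0, random_state=0)
print(X_moons.shape, X_circles.shape, X_syn.shape)  # (200, 2) each
```

The `noise` and `factor` arguments control how hard the decision boundary is to learn, which is useful when probing where a quantum model starts to outperform a linear classical one.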
Usage:

```python
from sklearn.model_selection import train_test_split
from variational_quantum_classifier import VariationalQuantumClassifier, QuantumDatasetLoader

# Load dataset
X, y = QuantumDatasetLoader.load_iris(binary=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3)

# Create VQC
vqc = VariationalQuantumClassifier(
    num_qubits=2,  # 2 features
    num_classes=2,
    feature_map_type='zz',
    ansatz_type='efficient_su2',
    optimizer='adam',
    reps=2
)

# Train and evaluate
vqc.train(X_train, y_train, max_iter=100)
evaluation = vqc.evaluate(X_test, y_test)
print(f"Accuracy: {evaluation['accuracy']:.3f}")
```

Feature Maps:
- ZZ Feature Map: Encodes data using ZZ interactions
- Angle Encoding: Encodes features as rotation angles
- Basis Encoding: Encodes features in computational basis
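As a toy illustration of angle encoding (a state-vector sketch in plain NumPy, no quantum SDK required), each feature is mapped to a single-qubit RY rotation and the encoded state is the tensor product of the rotated qubits:

```python
import numpy as np

def ry(theta):
    # Single-qubit RY rotation matrix (real-valued)
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def angle_encode(features):
    # |psi> = RY(x1)|0> (x) RY(x2)|0> (x) ...
    state = np.array([1.0])
    for x in features:
        state = np.kron(state, ry(x) @ np.array([1.0, 0.0]))
    return state

psi = angle_encode([np.pi / 2, np.pi])
# Measurement probabilities: [0, 0.5, 0, 0.5]
print(np.round(np.abs(psi) ** 2, 3))
```

The first feature (π/2) puts qubit 0 into an equal superposition, while the second (π) flips qubit 1 to |1⟩, so only the basis states |01⟩ and |11⟩ carry probability.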
Quantum Kernel Methods (planned)

Difficulty: Advanced
Description: Quantum kernel methods for support vector machines and other kernel-based algorithms.
Planned Features:
- Quantum kernel estimation
- Quantum support vector machines
- Kernel alignment optimization
- Quantum advantage analysis
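Quantum kernel estimation evaluates the fidelity k(x, y) = |⟨φ(x)|φ(y)⟩|² between encoded states. A state-vector sketch of this idea using the simple RY angle encoding (a toy illustration, not the planned implementation):

```python
import numpy as np

def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def encode(x):
    # Angle-encode a feature vector as a product state
    state = np.array([1.0])
    for xi in x:
        state = np.kron(state, ry(xi) @ np.array([1.0, 0.0]))
    return state

def quantum_kernel(x, y):
    # Fidelity kernel: k(x, y) = |<phi(x)|phi(y)>|^2
    return np.abs(encode(x) @ encode(y)) ** 2

X = np.array([[0.1, 0.5], [1.2, -0.3], [0.1, 0.5]])
K = np.array([[quantum_kernel(a, b) for b in X] for a in X])
print(np.round(K, 3))  # identical points give k = 1
```

The resulting Gram matrix `K` can be passed to a classical SVM via `sklearn.svm.SVC(kernel='precomputed')`, which is the usual structure of a quantum support vector machine.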
Quantum Generative Models (planned)

Difficulty: Advanced
Description: Quantum generative adversarial networks and variational autoencoders.
Planned Features:
- Quantum GANs
- Quantum variational autoencoders
- Quantum Boltzmann machines
- Quantum circuit Born machines
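A quantum circuit Born machine generates samples by measuring a parameterized state, drawing bitstring x with probability |ψ(x)|². As a toy illustration, sampling from a fixed 2-qubit state vector standing in for a parameterized circuit (hypothetical names, no quantum SDK):

```python
import numpy as np

rng = np.random.default_rng(0)

def born_sample(amplitudes, n_samples):
    # Born rule: p(x) = |psi(x)|^2 over computational basis states
    probs = np.abs(amplitudes) ** 2
    probs /= probs.sum()
    return rng.choice(len(probs), size=n_samples, p=probs)

# Toy 2-qubit state 0.6|00> + 0.8|11> (would come from a trained circuit)
psi = np.array([0.6, 0.0, 0.0, 0.8])
samples = born_sample(psi, 1000)
print(np.bincount(samples, minlength=4))  # roughly 36% |00>, 64% |11>
```

Training a real Born machine would adjust the circuit parameters so the sampled distribution matches a target dataset, typically via a maximum mean discrepancy or adversarial loss.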
Feature map selection:

```python
# ZZ feature map (good for pairwise interactions)
feature_map = 'zz'

# Angle encoding (simple and effective)
feature_map = 'angle'

# Basis encoding (for specific applications)
feature_map = 'basis'
```

Ansatz selection:

```python
# Hardware-efficient ansatz
ansatz = 'efficient_su2'

# Real-valued rotations
ansatz = 'real_amplitudes'

# Customizable two-local
ansatz = 'two_local'
```

The VQC automatically handles high-dimensional data: PCA reduces the feature count to the number of qubits, and features are scaled to the [-π, π] range.

Data preprocessing:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler, MinMaxScaler

# Standard scaling (zero mean, unit variance)
scaler = StandardScaler()
X_scaled = scaler.fit_transform(X)

# Min-max scaling to [-π, π] (matches rotation-angle encoding)
scaler = MinMaxScaler(feature_range=(-np.pi, np.pi))
X_scaled = scaler.fit_transform(X)
```

Dependencies:

- Qiskit >= 0.45.0
- Qiskit Machine Learning >= 0.6.0
- Scikit-learn >= 1.3.0
- NumPy >= 1.24.0
- Matplotlib >= 3.7.0
- Seaborn >= 0.12.0
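The automatic PCA reduction described in the preprocessing notes can be sketched as a standalone pipeline (an illustration of the idea; the VQC performs this internally):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import MinMaxScaler

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 13))  # e.g. Wine-sized data: 13 features

num_qubits = 4
# One PCA component per qubit, then scale into the rotation-angle range
X_reduced = PCA(n_components=num_qubits).fit_transform(X)
X_angles = MinMaxScaler(feature_range=(-np.pi, np.pi)).fit_transform(X_reduced)
print(X_angles.shape)  # (100, 4)
```

Each row of `X_angles` now supplies one rotation angle per qubit, which is what the feature map consumes.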
Running:

```shell
# Run VQC demonstration
python variational_quantum_classifier.py

# Test specific dataset
python -c "from variational_quantum_classifier import VQCDemo; demo = VQCDemo(); results = demo.run_iris_classification(); print(f'Accuracy: {results[\"accuracy\"]:.3f}')"
```

Learning Objectives:

- Understand quantum feature encoding
- Implement variational quantum algorithms
- Compare quantum vs classical ML performance
- Optimize quantum circuits for ML
- Analyze quantum ML advantages and limitations
Key Concepts:

- Kernel methods: Exponential feature space
- Expressivity: Quantum circuit capacity
- Trainability: Barren plateaus and gradients
- Generalization: Quantum model complexity

Error Mitigation:

- Zero-noise extrapolation: Reduce noise impact
- Symmetry verification: Enforce constraints
- Measurement error mitigation: Improve readout

Paths to Quantum Advantage:

- Quantum datasets: Naturally quantum data
- Quantum feature maps: Quantum-to-quantum encoding
- Hybrid classical-quantum: Best of both worlds
Classical baselines:

```python
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

# Compare with a classical SVM
classical_svm = SVC(kernel='rbf')
classical_svm.fit(X_train, y_train)
svm_accuracy = classical_svm.score(X_test, y_test)

# Compare with a classical neural network
classical_nn = MLPClassifier(hidden_layer_sizes=(10, 10))
classical_nn.fit(X_train, y_train)
nn_accuracy = classical_nn.score(X_test, y_test)
```

Circuit Considerations:

- Circuit depth: Affects noise tolerance
- Parameter count: Training complexity
- Gate count: Hardware requirements
- Measurement shots: Statistical accuracy
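The measurement-shot point can be made concrete: estimating an expectation value from N shots carries statistical error on the order of 1/√N. A quick NumPy simulation of shot noise (a sketch, not a circuit run):

```python
import numpy as np

rng = np.random.default_rng(7)
p = 0.7  # true probability of measuring |1>
errors = {}

for shots in (100, 1000, 10000):
    # Repeat the shot-limited experiment 2000 times and record the spread
    estimates = rng.binomial(shots, p, size=2000) / shots
    errors[shots] = estimates.std()
    # Empirical std tracks the binomial prediction sqrt(p(1-p)/shots)
    print(shots, round(errors[shots], 4), round(np.sqrt(p * (1 - p) / shots), 4))
```

Every 10x increase in shots buys only about a 3x reduction in error, which is why shot budgets dominate the runtime of variational training loops.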
After mastering VQC:
- Implement quantum kernel methods
- Try quantum transfer learning
- Explore quantum federated learning
- Study quantum reinforcement learning
- Investigate quantum natural language processing
Resources:

- Qiskit Machine Learning
- Quantum Machine Learning Review
- Variational Quantum Algorithms
- Quantum Machine Learning Dataset
- Barren Plateaus in QML
Dataset Details:

- Iris: 150 samples, 4 features
- Wine: 178 samples, 13 features
- Breast Cancer: 569 samples, 30 features
- Moons: Adjustable complexity and noise
- Circles: Non-linearly separable
- Classification: Customizable parameters
Practical Tips:

- Start with small datasets for faster training
- Use 2-4 qubits initially to reduce complexity
- Compare with classical baselines to assess quantum advantage
- Monitor training convergence to detect barren plateaus
- Use appropriate feature scaling for quantum encoding
Where quantum may help:

- High-dimensional feature spaces: Kernel advantage
- Small datasets: Quantum feature maps
- Specific problem structures: Quantum correlations
Where classical methods win:

- Large datasets: Classical scalability
- Simple problems: Quantum overhead
- Limited quantum resources: Noise limitations
Current research directions:

- Quantum kernels: Exponential feature spaces
- Hybrid models: Classical + quantum components
- Small-scale problems: Proof-of-principle demonstrations
- Quantum neural networks: Deep quantum circuits
- Quantum optimization: Learning as optimization
- Quantum data: Naturally quantum datasets