Building mathematical foundations for AI/ML research through systematic daily practice and collaborative learning.
Refined version of this repository: Notion Page
Daily Math is a structured learning initiative where participants solve one mathematical problem daily, document solutions, and engage in peer review. We focus on deep understanding of the mathematical concepts that underpin Machine Learning, Deep Learning, and AI.
Project stats (problems solved, active contributors, and timeline) can be checked at this link: Key Performance Information
Currently this project demonstrates mastery across:
Mathematical Foundations
- Linear Algebra: Vector spaces, matrix operations, eigenvalues, SVD
- Vector Calculus: Parametric curves, tangent vectors, multivariable analysis
- Complex Analysis: Complex numbers, complex vector spaces
- Optimization Theory: Gradient methods, convex optimization
Technical Skills
- Mathematical Documentation (LaTeX, Markdown)
- Project Management
- Collaborative Problem-Solving
```
daily-math/
├── November 2025/        # Daily problems by date
│   ├── Day 1/            # Vector Equality in ℝⁿ
│   ├── Day 2/            # Basic Vector Operations
│   └── ...
├── Solutions/            # Individual solutions by contributor
│   ├── Phanie/
│   ├── Phanie's Mom/
│   └── ...
├── Tracker/              # KPIs and performance metrics
└── Resources/            # Textbooks and reference materials
```
- Compilation of Learning Notes on the Problem's Topic
- Gantt Chart | Project Timeline
- Progress Tracker
Week 1 | 11 - 17 November 2025
| Day | Topic | Category | Problem | Who Solved | Notes / Insights |
|---|---|---|---|---|---|
| #1 | Vector Equality in ℝⁿ | Linear Algebra (Machine Learning Math) | Daily Math - DAY 1 | Phanie's Mom, Phanie | Chapter 1 - Phanie's Note; requires access request |
| #2 | Basic Vector Operations in ℝ³ | Linear Algebra (Machine Learning Math) | Daily Math - DAY 2 | Phanie's Mom, Phanie | Still in Chapter 1 |
| #3 | Linear Equations Practice Set | Linear Algebra (Machine Learning Math) | Daily Math - DAY 3 | Phanie's Mom, Phanie | Still in Chapter 1 |
| #4 | Dot Product, Orthogonality, Distance, Angle, Projection in ℝ³/ℝ⁴ | Linear Algebra (Machine Learning Math) | Daily Math - DAY 4 | Phanie's Mom, Phanie | Still in Chapter 1 |
| #5 | Geometry of Hyperplanes and Lines in ℝⁿ | Linear Algebra (Machine Learning Math) | Daily Math - DAY 5 | Phanie's Mom, Phanie | Still in Chapter 1 |
| #6 | Unit Tangent Vector of a Parametric Curve in ℝ³ | Linear Algebra (Machine Learning Math) | Daily Math - DAY 6 | TBA | Still in Chapter 1 |
| | Supplementary Problems for DAY 6 | | TBA | | Parallel to Calculus 3 |
| #7 | Vector Algebra (Component-wise Operations, Dot Product, Norm) | Linear Algebra (Machine Learning Math) | Daily Math - DAY 7 | TBA | Still in Chapter 1 |
| | Supplementary Problems for DAY 7 | | TBA | | |
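To make the Week 1 topics concrete, here is a minimal NumPy sketch (the vectors and the curve are illustrative values of my own choosing, not any contributor's solution) covering the Day 4 dot-product/projection material and the Day 6 unit tangent vector:

```python
import numpy as np

# Day 4 topics: dot product, angle, and projection in R^3
a = np.array([1.0, 2.0, 2.0])
b = np.array([3.0, 0.0, 4.0])

dot = a @ b                                   # 1*3 + 2*0 + 2*4 = 11
angle = np.arccos(dot / (np.linalg.norm(a) * np.linalg.norm(b)))
proj_b_onto_a = (dot / (a @ a)) * a           # proj_a(b) = (a·b / a·a) a

# Day 6 topic: unit tangent vector of the helix r(t) = (cos t, sin t, t)
def unit_tangent(t):
    rprime = np.array([-np.sin(t), np.cos(t), 1.0])  # r'(t)
    return rprime / np.linalg.norm(rprime)           # T(t) = r'(t) / ||r'(t)||

print(dot)                     # 11.0
print(unit_tangent(0.0))       # [0, 1/sqrt(2), 1/sqrt(2)]
```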
Week 2 | 18 - 24 November 2025
| Day | Topic | Category | Problem | Who Solved | Notes / Insights |
|---|---|---|---|---|---|
| #8 | Vector Algebra: Cross Product (Determinant Method & Component Formula) | Linear Algebra (Machine Learning Math) | Daily Math - DAY 8 | TBA | Chapter 1 - Phanie's Note |
| #9 | Complex Numbers (Algebraic Form, Conjugate, Division, Magnitude) | Linear Algebra (Machine Learning Math) | Daily Math - DAY 9 | TBA | Still in Chapter 1 |
| #10 | Complex Vectors in ℂ³ (Addition, Scalar Multiplication) | Linear Algebra (Machine Learning Math) | Daily Math - DAY 10 | TBA | Still in Chapter 1 |
| #11 | Complex Vector Algebra: Dot Product & Norm in ℂ³ | Linear Algebra (Machine Learning Math) | Daily Math - DAY 11 | TBA | Last for Chapter 1 |
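Similarly, a small NumPy sketch of the Week 2 material, cross products and complex vector algebra (the vectors are made-up examples):

```python
import numpy as np

# Day 8: cross product via the component formula
u = np.array([1.0, 0.0, 0.0])
v = np.array([0.0, 1.0, 0.0])
print(np.cross(u, v))          # [0. 0. 1.] -- e1 x e2 = e3

# Days 9-11: complex numbers and complex vectors in C^3
z = 3 + 4j
print(z.conjugate(), abs(z))   # (3-4j) 5.0

w1 = np.array([1 + 1j, 2j, 3.0])
# Hermitian dot product: conjugate the first argument (np.vdot does this),
# so <w1, w1> is the squared norm: |1+i|^2 + |2i|^2 + |3|^2 = 2 + 4 + 9 = 15
print(np.vdot(w1, w1).real)    # 15.0
```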
| Name | GitHub Username | Greeting | Status - Role | Solution Link |
|---|---|---|---|---|
| Phanie | @lymphoidcell | Ola! | Active - Host | View |
| My Mom | N/A (I will upload my mom's solutions) | N/A | Active - Member | View |
| Aisyah | @aisyahkhns | Ola! | TBA | View |
| Aurelia | @Roring-Aurelia | Ola! | TBA | View |
| Jessica | N/A (I will upload her solutions) | Ola! | TBA | View |
| Oci | @rosessea | TBA | TBA | View |
| TBA | TBA | TBA | TBA | TBA |
This category focuses on the mathematical foundations essential for understanding and implementing machine learning algorithms:
- Linear Algebra: Vector spaces, matrix operations, eigenvalues and eigenvectors, singular value decomposition, projections
- Probability Theory: Probability distributions, conditional probability, Bayes' theorem, random variables, expectation and variance
- Statistics: Statistical inference, hypothesis testing, maximum likelihood estimation, confidence intervals, regression analysis
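As a taste of this category, here is a short NumPy sketch (toy numbers chosen purely for illustration) showing a singular value decomposition and a Bayes' theorem calculation:

```python
import numpy as np

# Linear algebra: SVD factors A into U @ diag(s) @ Vt
A = np.array([[3.0, 0.0],
              [0.0, 2.0],
              [0.0, 0.0]])
U, s, Vt = np.linalg.svd(A, full_matrices=False)
print(s)  # singular values in descending order: [3. 2.]

# Probability: Bayes' theorem, P(D|+) = P(+|D) P(D) / P(+)
p_disease = 0.01
p_pos_given_disease = 0.99
p_pos_given_healthy = 0.05
p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 3))  # a positive test is far from certain
```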
This category covers the mathematical techniques required for deep learning and neural network optimization:
- Calculus: Differentiation, partial derivatives, chain rule, Taylor series, multivariable calculus
- Optimization: Gradient descent, convex optimization, Lagrange multipliers, constraint optimization, convergence analysis
- Matrix Calculus: Jacobians, Hessians, matrix derivatives, backpropagation mathematics
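For instance, plain gradient descent on a toy quadratic can be sketched as follows (the objective, starting point, and learning rate are arbitrary illustrative choices):

```python
import numpy as np

# Minimize f(x, y) = (x - 1)^2 + 4*(y + 2)^2 by gradient descent
def grad(p):
    x, y = p
    return np.array([2 * (x - 1), 8 * (y + 2)])  # analytic gradient of f

p = np.array([5.0, 5.0])   # starting point
lr = 0.1                   # learning rate (step size)
for _ in range(200):
    p = p - lr * grad(p)   # step opposite the gradient

print(np.round(p, 4))      # converges to the minimizer (1, -2)
```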
This category explores theoretical foundations and computational aspects of artificial intelligence:
- Information Theory: Entropy, mutual information, KL divergence, cross-entropy, information gain
- Graph Theory: Graph representations, traversal algorithms, network flows, spectral graph theory
- Logic: Propositional and predicate logic, proof theory, computational logic
- Algorithms: Complexity analysis, dynamic programming, greedy algorithms, divide and conquer
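A small sketch of two of these quantities, Shannon entropy and KL divergence in bits (the helper names are my own, not from the project):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits; zero-probability terms contribute 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def kl_divergence(p, q):
    """KL(p || q) in bits; assumes q > 0 wherever p > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

print(entropy([0.5, 0.5]))                     # 1.0 bit: a fair coin
print(kl_divergence([1.0, 0.0], [0.5, 0.5]))   # 1.0 bit
```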
This category delves into higher-level mathematical topics that provide deeper theoretical insights:
- Measure Theory: Measurable spaces, integration theory, probability measures, convergence theorems
- Topology: Topological spaces, continuity, compactness, connectedness, metric spaces
- Functional Analysis: Normed spaces, Banach and Hilbert spaces, operators, spectral theory
- Advanced Linear Algebra: Tensor products, Jordan canonical form, spectral theorem, operator theory
Reference: How to Actually Get Better at Math
- Consistent Practice – Daily problem-solving for continuous learning momentum
- Deep Understanding – Build intuitive and rigorous comprehension beyond surface-level knowledge
- Collaborative Learning – Share diverse problem-solving approaches and insights
- Quality Documentation – Maintain detailed solution records for future reference
- Applied Focus – Connect theory to practical ML/DL/AI implementations
- Progressive Challenge – Systematically increase problem complexity
- Commitment to daily participation
- Focus on deep understanding over speed
- Constructive peer review engagement
- Clear technical documentation
Beyond the goals listed above, the most fascinating part of doing math daily is developing a deep and comprehensive understanding of the concepts. I promise this will help in many aspects of life, especially if you work in STEM or finance.
Math isn't just about solving problems; it's about training the mind to think with structure, clarity, and logic.
- Fork this repository
- Add a new row to the Daily Progress table with the current date
- Specify the topic, category, and problem source
- If you've solved the problem, add your name and key insights
- Create a pull request with a clear description of your contribution
When documenting solutions, you may include:
- Problem Statement: Clear mathematical formulation
- Approach: Solution strategy and methodology
- Derivation: Step-by-step proof or computation
- Verification: Result validation (where applicable)
- Insights: Key learnings and broader connections
Formats: Markdown, LaTeX, or Jupyter notebooks depending on complexity
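The checklist above could be instantiated as a simple Markdown skeleton (a suggestion only; the headings and the sample problem statement are placeholders, not a required format):

```markdown
# Day N — <Topic>

## Problem Statement
State the problem precisely, e.g. show that ||u + v||² = ||u||² + ||v||² when u · v = 0.

## Approach
Outline the strategy before computing.

## Derivation
Step-by-step proof or computation, with each step justified.

## Verification
Check the result numerically or against a special case.

## Insights
Connections to ML/DL/AI and to earlier days' problems.
```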
- Use GitHub Issues to propose problems or discuss specific topics
- Review others' solutions and provide constructive feedback
- Suggest improvements or alternative approaches through comments or pull requests
- Linear Algebra Done Wrong by Sergei Treil
- Matrices and Linear Transformations (21-241) Spring 2023 Lecture Notes by Elisa Bellah (Online PDF)
GitHub repos from other users:
- https://github.com/shreeprasadbhat/matrix-theory
- https://github.com/RePlasma/IntroRandomMatrixLivan
- https://github.com/leogaudin/matrix
- Matrix Theory
| Resource | Topics Covered | File Format | Solutions/Answers | License | Source Link |
|---|---|---|---|---|---|
| Fundamentals of Matrix Algebra (Hartman) | Matrix arithmetic, inverses, determinants, eigenvalues/vectors, systems | PDF, online | End-of-section | CC BY-NC, free adaptation | LibreTexts |
| Matrix Algebra for Engineers (HKUST) | Matrix operations, projections, applications in engineering/statistics | | Problems with answers | Free download, attribution | HKUST Math |
| Random Matrices: Theory & Practice | Random matrix theory, research/AI/stats focus | PDF (arXiv) | Examples, theory | arXiv (CC BY/NC) | arXiv preprint |
| OpenStax College Algebra 2e β Ch 7.5 | Intro matrix operations, inverse, applications (college algebra level) | Online, PDF | Practice problems | CC BY | OpenStax |
- Linear Algebra
| Resource | Topics Covered | File Format | Solutions/Answers | License | Source Link |
|---|---|---|---|---|---|
| MIT OCW 18.06 | Matrices, eigenvalues, SVD | | Full solutions | CC BY-NC-SA | MIT OCW 18.06 |
| Hefferon's Linear Algebra | All undergraduate topics | PDF, HTML | Full solutions | CC BY-SA | Hefferon's Linear Algebra |
| Grasple/TU Delft | Complete course, SVD, QR | Interactive | Built-in feedback | CC (attribution) | Grasple Exercises |
| WeBWorK OER | Core topics, applications | Interactive | Automated grading | CC-BY (attribution required) | WeBWorK Problems |
| Erdman Problems | Systems, eigenvalues, det. | PDF, LaTeX | Odd-numbered ans. | Free non-commercial | Cannot paste the link. Try Googling for the resource keywords, thanks! |
| Dalhousie (Selinger) | Core, extra problems | PDF, online | Selected answers | CC BY 4.0 International | Dalhousie Open Text |
- TBA
This project is open source and available under the MIT License. Contributions are welcome from anyone interested in mathematical learning and collaboration.
Project Lead: Phanie | π§ phaniesql@gmail.com
For collaboration inquiries, questions, or suggestions, feel free to open an issue or reach out directly.