I'm a passionate and curious learner with a deep interest in Machine Learning, Deep Learning, and especially the rapidly evolving field of Large Language Models (LLMs). I enjoy exploring the intersection of theory and application, from understanding model architectures to building real-world AI systems.
- Large Language Models (LLMs): Architecture, fine-tuning, prompting, and function calling
- Transformer-based Models: Attention mechanisms
- Natural Language Processing (NLP): Semantic understanding, embeddings, generative text
- Mathematics of Deep Learning: Optimization and explainability of results
- Design and deployment of LLM-powered applications
- Prompt engineering and multi-agent orchestration
- Emerging paradigms like tool-augmented LLMs, retrieval-augmented generation, and structured function calling
- Keeping pace with cutting-edge research and model developments in the LLM space
- Collaborations on LLM/NLP-based projects or research
- Contributing to open-source tools around language models
- Discussions on trends in AI research, tools, and systems
- Architectures of LLMs and how they work under the hood
- Getting started with building applications using LLMs
- Transformer theory or NLP concepts you're curious about
- Anything related to ML/DL, happy to share what I know!
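As a small taste of the attention mechanisms listed above, here is a minimal sketch of scaled dot-product attention (the core operation inside Transformer-based LLMs) in plain NumPy. The shapes and random toy inputs are illustrative, not from any particular model:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for 2-D query/key/value matrices."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                               # weighted sum of values

# toy example: 3 tokens, head dimension 4
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4): one output vector per token
```

Each output row is a convex combination of the value vectors, with weights determined by how strongly each query attends to each key.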
I love astronomy and spend a fair amount of my day listening to music.