
Introduced tokenization, decoding, and prompt-engineering fundamentals for text generation. Demonstrated temperature, top-k/top-p sampling, few-shot prompts, and instruction-based generation, laying the groundwork for efficient and controlled LLM inference.


LLM Fundamentals — Prompting & Text Generation Basics

Problem. Understand how large language models generate text and how to structure prompts for meaningful, controllable responses.

Approach.

  • Introduced core LLM concepts: tokenization, context windows, temperature, top-k/top-p sampling.
  • Demonstrated local or API-based text generation.
  • Explored few-shot and zero-shot prompting patterns.
  • Tested basic instruction-following behavior using simple conversational inputs.
  • Highlighted trade-offs between creativity, determinism, and coherence.
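The decoding controls listed above (temperature, top-k, top-p) can be sketched in a few lines of NumPy. This is an illustrative sampler, not code from the notebooks; the function name `sample_token` and its defaults are assumptions:

```python
import numpy as np

def sample_token(logits, temperature=1.0, top_k=None, top_p=None, rng=None):
    """Sample one token id from raw logits with temperature, top-k, and top-p filtering."""
    rng = rng or np.random.default_rng()
    logits = np.asarray(logits, dtype=np.float64) / temperature  # higher T = flatter, more random

    # Top-k: keep only the k highest-scoring tokens.
    if top_k is not None:
        cutoff = np.sort(logits)[-top_k]
        logits = np.where(logits < cutoff, -np.inf, logits)

    # Softmax to probabilities.
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()

    # Top-p (nucleus): keep the smallest set of tokens whose cumulative mass reaches p.
    if top_p is not None:
        order = np.argsort(probs)[::-1]
        cum = np.cumsum(probs[order])
        keep = order[: np.searchsorted(cum, top_p) + 1]
        mask = np.zeros_like(probs)
        mask[keep] = probs[keep]
        probs = mask / mask.sum()

    return rng.choice(len(probs), p=probs)
```

Low temperature plus small top-k pushes the sampler toward greedy, deterministic decoding; high temperature with top-p near 1.0 trades coherence for variety.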

Results (qualitative).

  • Identified how decoding parameters shape response variety.
  • Observed improved contextual consistency with few-shot examples.
  • Showcased reproducibility of responses under fixed random seeds.
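The seed-based reproducibility noted above can be illustrated with a toy seeded generator; the random choice here stands in for real LLM sampling, and `VOCAB` and `generate` are hypothetical names, not the repo's API:

```python
import random

VOCAB = ["the", "cat", "sat", "on", "mat", "."]

def generate(prompt_tokens, n_tokens, seed):
    """Toy generator: a seeded RNG stands in for sampled LLM decoding."""
    rng = random.Random(seed)          # fixed seed = fixed sampling trajectory
    out = list(prompt_tokens)
    for _ in range(n_tokens):
        out.append(rng.choice(VOCAB))  # in a real model this would be a sampled token
    return out

a = generate(["the"], 5, seed=42)
b = generate(["the"], 5, seed=42)
assert a == b  # identical outputs under the same seed
```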

What I Learned.

  • Practical control of LLM inference parameters.
  • Prompt design strategies for clarity and reliability.
  • Relationship between token limits, context management, and output quality.
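The relationship between token limits and context management can be sketched as a budget-trimming helper: drop the oldest conversation turns until the prompt fits. `count_tokens` is a whitespace stand-in for a real tokenizer (e.g. tiktoken), and `fit_context` is a hypothetical name, not part of the repo:

```python
def count_tokens(text):
    # Toy stand-in for a real tokenizer: one token per whitespace-separated word.
    return len(text.split())

def fit_context(system, history, user_msg, max_tokens=50):
    """Evict the oldest history turns until the whole prompt fits the token budget."""
    fixed = count_tokens(system) + count_tokens(user_msg)  # parts we must always keep
    kept = list(history)
    while kept and fixed + sum(count_tokens(t) for t in kept) > max_tokens:
        kept.pop(0)  # drop the oldest turn first
    return [system, *kept, user_msg]
```

The same eviction idea applies with a real tokenizer: the budget shrinks, but the oldest-first policy and the always-kept system/user parts stay the same.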

Quick Start

git clone https://github.com/Joe-Naz01/llm_basics.git
cd llm_basics

# Option A: conda
conda env create -f environment.yml
conda activate llm_basics

# Option B: pip
pip install -r requirements.txt

jupyter notebook
