Blog Generation using LLMs

Overview

This project explores blog generation with Large Language Models (LLMs). It builds on the Llama 2 release, a family of pretrained and fine-tuned LLMs at three parameter scales (7B, 13B, and 70B). These models improve significantly on the earlier Llama 1 models: they are trained on 40% more tokens, support a longer context length of 4k tokens, and use grouped-query attention for faster inference in the 70B model.
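The grouped-query attention mentioned above can be illustrated with a minimal NumPy sketch (this is an educational toy, not the project's or Llama 2's actual implementation): several query heads share a single key/value head, which shrinks the KV cache and speeds up inference.

```python
import numpy as np

def grouped_query_attention(q, k, v, n_kv_heads):
    """Toy single-layer GQA: groups of query heads share one K/V head.

    Shapes: q is (n_heads, seq, d); k and v are (n_kv_heads, seq, d).
    """
    n_heads, _, d = q.shape
    group = n_heads // n_kv_heads  # query heads per shared K/V head
    # Broadcast each K/V head to its group of query heads.
    k = np.repeat(k, group, axis=0)
    v = np.repeat(v, group, axis=0)
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d)
    # Numerically stable softmax over the key axis.
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v  # (n_heads, seq, d)

# 8 query heads attend through only 2 shared K/V heads.
out = grouped_query_attention(
    np.random.randn(8, 4, 16),
    np.random.randn(2, 4, 16),
    np.random.randn(2, 4, 16),
    n_kv_heads=2,
)
print(out.shape)  # (8, 4, 16)
```

With `n_kv_heads=1` this reduces to multi-query attention; with `n_kv_heads == n_heads` it is ordinary multi-head attention.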

Features

  • Pretrained Models: The project uses the pretrained 7B-parameter Llama 2 model.
  • Improved Performance: Llama 2 models improve on Llama 1 in training data volume, context length, and inference speed via grouped-query attention.

If you like this, please leave a ⭐! Thank you!

Getting Started

Prerequisites

  • Python 3.x
  • Dependencies listed in requirements.txt

Installation

pip install -r requirements.txt

Usage

  1. Clone the repository:
git clone https://github.com/AshrithaB/Blog-Generation-using-LLMs.git
cd Blog-Generation-using-LLMs
  2. Install dependencies:
pip install -r requirements.txt
  3. Run the Streamlit app:
streamlit run app.py
  4. Explore the pretrained models and use them for blog generation.
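Under the hood, apps like this typically build a Llama 2 instruction-style prompt and run a local quantized checkpoint. The sketch below shows one plausible shape for that logic; the model filename, the `ctransformers` dependency, and the prompt wording are assumptions, not taken from this repository's app.py.

```python
def build_blog_prompt(topic: str, audience: str, word_count: int) -> str:
    """Build a Llama 2 [INST]-style prompt for blog generation.

    The bracketed instruction format follows the Llama 2 chat convention;
    the exact wording here is illustrative, not the repository's template.
    """
    return (
        f"[INST] Write a blog post on the topic '{topic}' "
        f"for an audience of {audience}, in roughly {word_count} words. [/INST]"
    )

def generate_blog(topic: str, audience: str, word_count: int = 300) -> str:
    # Hypothetical call path: assumes a local GGML checkpoint and the
    # ctransformers package, as commonly paired with Streamlit Llama 2 demos.
    from ctransformers import AutoModelForCausalLM  # lazy import: optional dep
    llm = AutoModelForCausalLM.from_pretrained(
        "llama-2-7b-chat.ggmlv3.q8_0.bin",  # assumed local model file
        model_type="llama",
        max_new_tokens=512,
        temperature=0.7,
    )
    return llm(build_blog_prompt(topic, audience, word_count))
```

Importing the model library lazily inside `generate_blog` lets the Streamlit page load even when the model file or dependency is missing; the failure only surfaces when generation is actually requested.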

Contribute

Contributions from the open-source community are welcome. If you'd like to contribute to this project, please follow these steps:

  1. Fork the repository.
  2. Create a new branch for your feature or bug fix.
  3. Make your changes and commit them.
  4. Push your changes to your fork.
  5. Create a pull request, explaining the changes you've made.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments

  • Thanks to the community for continuous support and contributions.
  • Llama 2 release authors and contributors.
