
DevOps Assistant 🤖

The DevOps Assistant is an AI-powered tool designed to help DevOps engineers and system administrators automate tasks, execute commands on remote servers, and generate accurate Bash commands using a local LLM (Large Language Model). It integrates with Streamlit for a user-friendly interface and uses SQLite for command history and caching.


Features ✨

  • SSH Integration: Connect to remote servers securely via SSH.
  • AI-Powered Command Generation: Use a local LLM (e.g., Ollama) to generate accurate Bash commands.
  • Command Execution: Execute commands on remote servers and view results in real-time.
  • Command History: Store and retrieve past commands and responses for future reference.
  • Caching Mechanism: Cache frequently used commands to improve response times.
  • User Authentication: Optionally restrict access with a login step.
  • Streamlit UI: Intuitive and interactive web-based interface.
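The caching mechanism mentioned above can be sketched with Python's built-in sqlite3 module. This is a minimal illustration only; the table and function names here are hypothetical, not the project's actual schema:

```python
import sqlite3

def init_cache(conn: sqlite3.Connection) -> None:
    # Hypothetical cache table: one row per (question, generated command).
    conn.execute(
        "CREATE TABLE IF NOT EXISTS command_cache ("
        "question TEXT PRIMARY KEY, command TEXT)"
    )

def cached_command(conn, question):
    # Return a previously generated command, or None on a cache miss.
    row = conn.execute(
        "SELECT command FROM command_cache WHERE question = ?", (question,)
    ).fetchone()
    return row[0] if row else None

def store_command(conn, question, command):
    conn.execute(
        "INSERT OR REPLACE INTO command_cache VALUES (?, ?)",
        (question, command),
    )

conn = sqlite3.connect(":memory:")
init_cache(conn)
store_command(conn, "check disk usage", "df -h")
print(cached_command(conn, "check disk usage"))  # df -h
```

Checking the cache before calling the LLM is what lets repeated questions return instantly instead of waiting on model inference.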

Prerequisites 📊

Before running the DevOps Assistant, ensure you have the following installed:

  • Python 3.8+: Download Python
  • Ollama: A local LLM server. Install Ollama
  • Streamlit: For the web interface.
  • Paramiko: For SSH connections.
  • SQLite3: For database storage (included with Python).

Installation 🛠️

  1. Clone the repository:

    git clone https://github.com/Rhul27/DevOpsAssistant.git
    cd DevOpsAssistant
  2. Install dependencies:

    pip install -r requirements.txt
  3. Set up Ollama:

    Install Ollama and start the server:

    ollama serve

    Download a model (e.g., llama3.2):

    ollama pull llama3.2
  4. Run the Streamlit app:

    streamlit run main.py
  5. Access the app: Open your browser and navigate to http://localhost:8501.


Usage 🚀

Connect to the Server:

  1. Enter the server's IP address, username, and password in the sidebar.
  2. Click "Connect to Server".

Connect to the LLM Model:

  1. Select a model from the dropdown in the sidebar.
  2. Click "Connect to LLM Model".

Ask a Question:

  1. Enter your question in the main input box (e.g., "How do I check disk usage on Linux?").
  2. Click "Submit" to get a response.

View Command History:

All executed commands and responses are stored in the database and displayed in the Command History section.
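A minimal sketch of how such a history table might work with sqlite3 (column and function names are illustrative; see models/command_history.py for the real model):

```python
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect(":memory:")  # the app persists to devops_assistant.db
conn.execute(
    "CREATE TABLE IF NOT EXISTS command_history ("
    "id INTEGER PRIMARY KEY AUTOINCREMENT, "
    "command TEXT, response TEXT, executed_at TEXT)"
)

def log_command(command, response):
    # Store each executed command with its output and a UTC timestamp.
    conn.execute(
        "INSERT INTO command_history (command, response, executed_at) "
        "VALUES (?, ?, ?)",
        (command, response, datetime.now(timezone.utc).isoformat()),
    )

def history():
    # Newest entries first, as displayed in the Command History section.
    return conn.execute(
        "SELECT command, response FROM command_history ORDER BY id DESC"
    ).fetchall()

log_command("df -h", "Filesystem  Size  Used  Avail  Use%")
print(history()[0][0])  # df -h
```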


Folder Structure 🗂️

DevOpsAssistant/
├── Core/                     # Core functionality
│   ├── func.py               # SSH, LLM, and command execution
│   ├── database.py           # Database operations
│   ├── auth.py               # User authentication
│   └── utils.py              # Utility functions
├── models/                   # Data models
│   └── command_history.py    # SQLite model for command history
├── main.py                   # Streamlit app entry point
├── requirements.txt          # Python dependencies
└── devops_assistant.db       # SQLite database file

Configuration ⚙️

  • Ollama Server URL: Default is http://localhost:11434. Update in the sidebar if needed.
  • Default Model: Set to llama3.2. Change in func.py if required.
  • SSH Timeout: Default is 10 seconds. Adjust in func.py.
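Grouped together, these defaults might look like the following module-level constants (the names are illustrative; check func.py for the actual variables):

```python
# Illustrative defaults; the real names in func.py may differ.
OLLAMA_SERVER_URL = "http://localhost:11434"  # local Ollama endpoint
DEFAULT_MODEL = "llama3.2"                    # model pulled during setup
SSH_TIMEOUT_SECONDS = 10                      # passed to the SSH connect call

print(OLLAMA_SERVER_URL)  # http://localhost:11434
```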

Contributing 🤝

Contributions are welcome! If you'd like to contribute, please follow these steps:

  1. Fork the repository.
  2. Create a new branch:
    git checkout -b feature/YourFeatureName
  3. Commit your changes:
    git commit -m 'Add some feature'
  4. Push to the branch:
    git push origin feature/YourFeatureName
  5. Open a pull request.

License 📝

This project is licensed under the MIT License. See the LICENSE file for details.


Acknowledgments 🙏

  • Ollama: For providing the local LLM server.
  • Streamlit: For the easy-to-use web interface.
  • Paramiko: For SSH connectivity.
