Architextor


A Python package for processing and interpreting structured descriptions of neural network architectures from textual input. Architextor uses pattern matching with retries to parse user-provided text about ANN designs (layer types, connections, parameters, etc.) and returns a formalized, machine-readable representation (e.g., a JSON schema or graph outline).

Installation

pip install architextor

Usage

Basic Example

from architextor import architextor

user_input = "A neural network with two dense layers: first layer has 128 units and ReLU activation, second has 10 units and softmax activation."
response = architextor(user_input)
print(response)
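The package is described as returning a machine-readable representation such as JSON. As a sketch of how you might consume such a result, assuming a hypothetical schema with a top-level "layers" list (the actual schema is not documented here):

```python
import json

# Hypothetical response matching the basic example above; the real
# schema returned by architextor may differ.
example_response = json.dumps({
    "layers": [
        {"type": "Dense", "units": 128, "activation": "relu"},
        {"type": "Dense", "units": 10, "activation": "softmax"},
    ]
})

# Parse the JSON string and walk the layer definitions.
arch = json.loads(example_response)
for layer in arch["layers"]:
    print(layer["type"], layer["units"], layer["activation"])
```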

Using a Custom LLM

You can pass any LangChain-compatible LLM instance. For example, with OpenAI:

from langchain_openai import ChatOpenAI
from architextor import architextor

llm = ChatOpenAI()
response = architextor(user_input, llm=llm)

Or with Anthropic:

from langchain_anthropic import ChatAnthropic
from architextor import architextor

llm = ChatAnthropic(model="claude-3-5-sonnet-latest")  # a model name is required
response = architextor(user_input, llm=llm)

Or with Google:

from langchain_google_genai import ChatGoogleGenerativeAI
from architextor import architextor

llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash")  # a model name is required
response = architextor(user_input, llm=llm)

Using a Custom API Key

The default LLM is ChatLLM7 (from langchain_llm7). You can provide your own API key:

Via environment variable:

export LLM7_API_KEY="your_api_key_here"

Or directly in code:

response = architextor(user_input, api_key="your_api_key_here")

Get a free API key by registering at https://token.llm7.io/.

Parameters

  • user_input (str): The user input text describing the neural network architecture.
  • llm (Optional[BaseChatModel]): A LangChain LLM instance. If not provided, defaults to ChatLLM7.
  • api_key (Optional[str]): API key for LLM7. If not provided, defaults to the LLM7_API_KEY environment variable.
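The api_key fallback described above can be sketched as follows. Note that resolve_api_key is a hypothetical helper written for illustration, not a function exported by the package:

```python
import os

def resolve_api_key(api_key=None):
    # Documented precedence: an explicit api_key argument wins;
    # otherwise fall back to the LLM7_API_KEY environment variable.
    return api_key if api_key is not None else os.environ.get("LLM7_API_KEY")

# Demonstrate both paths.
os.environ["LLM7_API_KEY"] = "env_key"
print(resolve_api_key())            # env_key
print(resolve_api_key("explicit"))  # explicit
```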

Default Rate Limits

The default rate limits for LLM7 free tier are sufficient for most use cases. For higher rate limits, provide your own API key.

Issues

Report issues or feature requests on GitHub.

Author

Eugene Evstafev
Email: hi@euegne.plus
GitHub: chigwell
