Integrates ChatGPT into Google Sheets, with no middleman reading your prompts. Supports local deployment and AWS deployment using AWS CDK.
As simple as it gets: just ask your question in the function parameter and get an answer.
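For example, in a sheet cell (the function name `GPT` is an assumption here — check the actual name exported by `sheets.js`):

```
=GPT("Summarize the plot of Hamlet in one sentence")
```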
This function is the purpose of the whole endeavor. It is meant for the following case: you have a question that takes several steps to get the best possible answer. During each step, you:
- Ask ChatGPT for an intermediate answer and for clarifying questions that would improve it
- (optional) Ask ChatGPT for examples of "good" answers to the clarifying questions
- Answer the clarifying questions and get a new version of the end result, after which the process repeats as needed
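In a sheet, that loop might look like a chain of cells (the function name below is illustrative, not necessarily the one `sheets.js` exports):

```
A1: What makes a good cover letter?
B1: =GPT_REFINE(A1)          intermediate answer plus clarifying questions
A2: your answers to the clarifying questions
B2: =GPT_REFINE(A1, A2)      refined answer; repeat with A3, A4, ...
```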
- Clone this repo
- `python3 -m venv .venv && source .venv/bin/activate`
- `pip install -r requirements.txt`
- `cp .env.example .env`
- Edit `.env` and put your OpenAI key there (you can get your API key on platform.openai.com)
- Run `python sheetgpt.py` and go to localhost:5000, you should see a `working` message
- (optional) If you want to run SheetGPT from your local computer, use ngrok or localhost.run to forward port 5000 and use the resulting URL in the Google Apps Script later
In AWS using AWS CDK
- Do the local setup part
- `pip install -r requirements-cdk.txt`
- `cdk bootstrap && cdk deploy`; as output you will get a Lambda Function URL like this: `https://[some letters].lambda-url.eu-central-1.on.aws`
- Check that the Lambda Function works by visiting that address, you should see a `working` message. If not, use the supplied `logs.sh` script, which tails function logs to the terminal, and fix the errors
After either kind of function setup, proceed to the Google Sheets setup:
- Create a new or open an existing Google Sheet
- Go to Extensions -> Apps Script
- Copy/paste the code from `sheets.js` to `Code.gs`
- Change the `baseUrl` variable to point to your endpoint and save the project
You are all set! Now you can query ChatGPT from Google Sheets!
Sometimes ChatGPT refuses to answer in JSON only. In most cases, slightly rewording the request fixes the problem.
You can retrigger the request by adding a dummy parameter to the function call, then trace the actual answer in the logs, either manually or with the provided `logs.sh` script, which tails logs from CloudWatch.
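The dummy parameter works because Google Sheets only recalculates a cell when its inputs change, so any extra argument forces a fresh request past the cache (again, the function name `GPT` is an assumption):

```
=GPT("List three colors as JSON")
=GPT("List three colors as JSON", "retry-1")
```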
It feels natural to treat complex problems as problem trees (or graphs), where you start from the root problem statement and dig into more and more detail as the solution process unfolds.
The original ChatGPT interface doesn't feel comfortable for tackling such problems, because you are limited to a single linear flow. And when you go deep into one single problem, ChatGPT tends to forget previous corrections, at least in my experiments.
To explore the possibilities of such a tree-structured writing flow, I created a simple adapter of ChatGPT to Google Sheets.
The architecture involves a caching layer function written in Python. It is unavoidable for several reasons:
- Responsiveness. The OpenAI API is really unstable: response times range from milliseconds to tens of seconds, and it can even time out from time to time. You'd better lean on cached responses as hard as you can.
- Money. Google Sheets re-evaluates a sheet's data from time to time. Why would you pay for repeated requests?
- Cost. The caching layer will stay within the AWS free tier (unless you share your Google Sheet with millions of users).
- Security. Your endpoint URL is randomly generated, both when you use local deployment and when you use a Lambda Function URL.
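A minimal sketch of the caching idea (the real layer in this repo may key and store responses differently; `call_openai` is a hypothetical stand-in for the actual API call):

```python
import hashlib

# In-memory cache; a real deployment would need persistent storage
# (e.g. a file or DynamoDB) to survive Lambda cold starts.
_cache = {}

def cached_completion(prompt, call_openai):
    """Return a cached answer for `prompt`, calling the API only on a miss."""
    key = hashlib.sha256(prompt.encode("utf-8")).hexdigest()
    if key not in _cache:
        _cache[key] = call_openai(prompt)
    return _cache[key]
```

With a wrapper like this, Google Sheets' periodic re-evaluations hit the cache instead of the paid, slow API.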