
OpenAI API Error: 400 - Bad Request #137

Closed
eduardohilariodev opened this issue Mar 6, 2023 · 17 comments · Fixed by #147
Labels
bug Something isn't working

Comments

@eduardohilariodev commented Mar 6, 2023

Bug description

OpenAI API Error: 400 - Bad Request

This started happening as soon as I updated from 1.2.0 to 1.3.0. It's a strange one because it occurs only some of the time, not always, so reproducing it comes down to luck. I even changed the API key, but that doesn't seem to be the issue.

aicommits version

v1.3.0

Environment

System:
    OS: Windows 10 10.0.22621
    CPU: (16) x64 Intel(R) Core(TM) i7-10875H CPU @ 2.30GHz
    Memory: 42.57 GB / 63.83 GB
    Shell: PowerShell 7.3.3
  Binaries:
    Node: 14.21.2 - C:\Program Files\nodejs\node.EXE    
    Yarn: 1.22.19 - ~\AppData\Roaming\npm\yarn.CMD      
    npm: 6.14.17 - C:\Program Files\nodejs\npm.CMD

Can you contribute a fix?

  • I’m interested in opening a pull request for this issue.
@eduardohilariodev eduardohilariodev added bug Something isn't working pending triage labels Mar 6, 2023
@eduardohilariodev eduardohilariodev changed the title 400 Bad Request error OpenAI API Error: 400 - Bad Request Mar 6, 2023
@privatenumber
Collaborator

Hmm it's a server-side error so it could be a problem with OpenAI.

Perhaps it's because we changed the way we count tokens: https://community.openai.com/t/error-retrieving-completions-400-bad-request/34004/6

Are your diffs really large?
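For illustration of why an oversized diff would trigger a 400: if the client under-counts tokens, the request can exceed the model's context window and the API rejects it. A rough client-side guard could use the common ~4 characters-per-token heuristic; the function names below are hypothetical sketches, not aicommits internals.

```javascript
// Rough token estimate: English text averages about 4 characters per
// token for OpenAI models. This is a heuristic, not the real tokenizer.
function estimateTokens(text) {
  return Math.ceil(text.length / 4);
}

// Hypothetical guard: flag diffs likely to exceed the context window
// (e.g. ~4096 tokens), leaving headroom for the prompt and completion.
function isDiffTooLarge(diff, maxTokens = 4096, reservedTokens = 500) {
  return estimateTokens(diff) > maxTokens - reservedTokens;
}

console.log(isDiffTooLarge('short diff'));      // false
console.log(isDiffTooLarge('x'.repeat(20000))); // true (~5000 tokens)
```

A real implementation would use the model's actual tokenizer (e.g. a tiktoken port) rather than a character heuristic, since the two can diverge badly on diffs full of punctuation.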

@gabrielmoris

I have exactly the same problem, but I always get this 400 error.

@privatenumber
Collaborator

It would be helpful if you could share more info. What diff are you sending?

@eduardohilariodev
Author

eduardohilariodev commented Mar 7, 2023

Hmm it's a server-side error so it could be a problem with OpenAI.

Perhaps it's because we changed the way we count tokens: https://community.openai.com/t/error-retrieving-completions-400-bad-request/34004/6

Are your diffs really large?

No, not really. The size of the diff doesn't seem to make a difference; the error occurs on both small and big diffs.

@privatenumber
Collaborator

Can you provide it?

I can't do much with the little information provided right now.

@Mobilpadde

Just chiming in, I'm getting 404s all the time 🤷

@mahamaad

mahamaad commented Mar 8, 2023

I have the same issue.

@eduardohilariodev
Author

It would be helpful if you could share more info. What diff are you sending?

I tested it, and it seems to happen with big diffs where the "it's a big diff" feedback message isn't shown. It's probably a good idea to show an explanatory message for this 400 error, such as "Your plan has limited access to the API, consider upgrading to ChatGPT Plus" or similar.
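A status-to-message mapping along those lines could be sketched like this; the wording and the function name are made up for illustration and are not the actual aicommits code:

```javascript
// Hypothetical mapping from OpenAI API HTTP status codes to actionable
// messages for the user; the wording here is illustrative only.
function explainApiError(status) {
  switch (status) {
    case 400:
      return 'OpenAI API Error: 400 - Bad Request. The diff may exceed the model token limit; try staging a smaller change.';
    case 401:
      return 'OpenAI API Error: 401 - Unauthorized. Check that your API key is valid.';
    case 404:
      return 'OpenAI API Error: 404 - Not Found. Check your network or proxy configuration.';
    case 429:
      return 'OpenAI API Error: 429 - Too Many Requests. Wait a moment and retry.';
    default:
      return `OpenAI API Error: ${status}`;
  }
}

console.log(explainApiError(400));
```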

@gabrielmoris

I have exactly the same Problem, but I always get this 400 Error.

EDIT: problem solved after running npm update -g aicommits

@privatenumber
Collaborator

@Mobilpadde 404 is a completely different error and a more problematic one. It means the server you're reaching is returning a "Not found". Do you have some sort of network configuration (e.g. VPN)?

For anyone else experiencing a 400 error, an example of the diff you're sending would be very helpful so I can reproduce the problem on my end.

@mael-queau

mael-queau commented Mar 8, 2023

@Mobilpadde 404 is a completely different error and a more problematic one. It means the server you're reaching is returning a "Not found". Do you have some sort of network configuration (e.g. VPN)?

For anyone else experiencing a 400 error, an example of the diff you're sending would be very helpful so I can reproduce the problem on my end.

I am experiencing this as well. In my case, I am consistently getting 400s on this:

diff --git a/tracabilite.ipynb b/tracabilite.ipynb
new file mode 100644
index 0000000..6301d1f
--- /dev/null
+++ b/tracabilite.ipynb
@@ -0,0 +1,121 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "source": [
+    "# Script utilitaire pour la traçabilité des contenants Berny"
+   ],
+   "metadata": {
+    "collapsed": false
+   }
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 30,
+   "outputs": [],
+   "source": [
+    "file_items = \"export_public_Item_2023_03_08_16_03_42_201.csv\"\n",
+    "file_cleaning = \"export_public_Cleaning_2023_03_08_16_43_10_595.csv\"\n",
+    "file_delivery = \"export_public_Delivery_2023_03_08_16_43_24_349.csv\"\n",
+    "file_returns = \"export_public_Delivery_back_2023_03_08_16_43_32_493.csv\""
+   ],
+   "metadata": {
+    "collapsed": false
+   }
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 31,
+   "outputs": [],
+   "source": [
+    "import csv\n",
+    "import pandas as pd\n",
+    "import numpy as np\n",
+    "import matplotlib.pyplot as plt\n",
+    "\n",
+    "items = pd.read_csv(file_items, sep=\",\", encoding=\"utf-8\")\n",
+    "cleanings = pd.read_csv(file_cleaning, sep=\",\", encoding=\"utf-8\")\n",
+    "deliveries = pd.read_csv(file_delivery, sep=\",\", encoding=\"utf-8\")\n",
+    "returns = pd.read_csv(file_returns, sep=\",\", encoding=\"utf-8\")"
+   ],
+   "metadata": {
+    "collapsed": false
+   }
+  },
+  {
+   "cell_type": "markdown",
+   "source": [
+    "Items: ['id', 'size', 'arrival_date', 'arrival_country', 'in_transit', 'type', 'is_defunct']\n",
+    "Cleanings: ['item_id', 'created_date', 'cleaning_date', 'cleaner', 'id']\n",
+    "Deliveries: ['item_id', 'transportedIn', 'createdDate', 'delivery_date', 'id', 'shop']\n",
+    "Returns: ['is_notified', 'is_unsold', 'actor', 'item_id', 'date', 'lot', 'id', 'shop']"
+   ],
+   "metadata": {
+    "collapsed": false
+   }
+  },
+  {
+   "cell_type": "markdown",
+   "source": [
+    "# 1. Quels sont les contenants qui ont été utilisés au moins une fois ?"
+   ],
+   "metadata": {
+    "collapsed": false
+   }
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 32,
+   "outputs": [],
+   "source": [
+    "items_cleaned = cleanings['item_id'].unique()\n",
+    "items_delivered = deliveries['item_id'].unique()\n",
+    "items_returned = returns['item_id'].unique()\n",
+    "items_used = np.unique(np.concatenate((items_cleaned, items_delivered, items_returned)))"
+   ],
+   "metadata": {
+    "collapsed": false
+   }
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 33,
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "Nombre de contenants utilisés: 40857\n"
+     ]
+    }
+   ],
+   "source": [
+    "print(\"Nombre de contenants utilisés: \" + str(len(items_used)))"
+   ],
+   "metadata": {
+    "collapsed": false
+   }
+  }
+ ],
+ "metadata": {
+  "kernelspec": {
+   "display_name": "Python 3",
+   "language": "python",
+   "name": "python3"
+  },
+  "language_info": {
+   "codemirror_mode": {
+    "name": "ipython",
+    "version": 2
+   },
+   "file_extension": ".py",
+   "mimetype": "text/x-python",
+   "name": "python",
+   "nbconvert_exporter": "python",
+   "pygments_lexer": "ipython2",
+   "version": "2.7.6"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 0
+}

@eduardohilariodev
Author

@Mobilpadde 404 is a completely different error and a more problematic one. It means the server you're reaching is returning a "Not found". Do you have some sort of network configuration (e.g. VPN)?
For anyone else experiencing a 400 error, an example of the diff you're sending would be very helpful so I can reproduce the problem on my end.

I am experiencing this as well. In my case, I am consistently getting 400s on this:

(same diff as quoted above)

It's probably because it's a big diff and the package isn't throwing the correct message. Try using aicommits only on apparently small diffs.

@TunaKHH

TunaKHH commented Mar 9, 2023

I'm using the previous version for now to get this working.
npm install -g aicommits@1.2.0

@mael-queau

@edu-hilario I don't know if this is any indication of how it should perform, but trying the prompt in the playground with that diff didn't give me a length error; it worked just fine. If I'm not mistaken, the token limit should be the same there, right?

@jumang4423

Same here: @1.2.0 does work, but newer versions are broken.

@privatenumber
Collaborator

I was finally able to reproduce this. Fix coming soon via #147

@sebastienfi

1.5.0 solved this for me; try running npm update -g aicommits
Thanks @privatenumber
