Add support for interactive mode / instructionGPT? (Or add this workaround in the readme?) #126
Comments
Is this why the responses are so bland and sometimes short? If so, I would love to see this added!
It would be useful to have instructions on how to apply this workaround. You don't really explain which exact files were changed, or how to run it.
The workaround doesn't actually involve modifying anything in dalai (that didn't work well). The trick I used is in the Node.js example in the code block: it just prepends an additional prompt before the actual instruction you're asking Alpaca to do, that's all!
Specifically: "Below is an instruction that describes a task. Write a response that appropriately completes the request." I also linked back to the resource where I found this trick, and it has further instructions!
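For anyone who wants to try the trick described above, here is a minimal sketch of the prepending. The `alpacaPrompt` helper name is my own invention; the first sentence of the template comes from the comment above, and the `### Instruction:` / `### Response:` section headers follow the standard Alpaca fine-tuning template:

```javascript
// Hypothetical helper: wrap a plain instruction in the Alpaca
// instruction template before sending it to the model.
function alpacaPrompt(instruction) {
  return [
    "Below is an instruction that describes a task. " +
      "Write a response that appropriately completes the request.",
    "",
    "### Instruction:",
    instruction,
    "",
    "### Response:",
  ].join("\n");
}

// The wrapped prompt is what you'd pass to the model instead of
// the bare instruction.
console.log(alpacaPrompt("Explain what a linked list is."));
```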
Thank you. The only change is the one in dalai/index.js, I see.
I actually went ahead and opened a pull request to add prompt templates straight into the web UI so it's more accessible! #199 <- check it out
I added the interactive mode in this PR. |
@tibzejoker your fork is amazing! Running on Windows 10, I'm getting it to do some coding, in JavaScript! Unfortunately, I seem to encounter a bug where stray X characters get added randomly to my input and output. Here is an example. In chat mode I can use one-word prompts with no problem, but if I add a second word it gets interpreted as a parameter and it fails with this error:

```
PS C:\Users\XXXX\dalai\alpaca> [System.Console]::OutputEncoding=[System.Console]::InputEncoding=[System.Text.Encoding]::UTF8; C:\Users\XXXX\dalai\alpaca\build\Release\main --seed -1 --threads 30 --n_predict 200 --model models/30B/ggml-model-q4_0.bin --top_k 420 --top_p 0.9 --temp 0.9 --repeat_last_n 64 --repeat_penalty 1.3 -p "The expected response for a highly intelligent chatbot to
usage: C:\Users\XXXX\dalai\alpaca\build\Release\main.exe [options]
options:
```

As you can see, there are many X characters added at the end. If I run it in normal mode I can use prompts, but I still get random X letters after a few lines are generated. Example:

```
Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.

Instruction: write a pong game in javascript

Response:
// JavaScript Pong Game
import * as p5 from 'p5js'; // import library
let canvas;X // declare variable for DOM element -canvas
const width = window.innerWidth /2, height=window.innertwidth/10 ; // dimensions of game field
let xPosition, yPosistion ,xSpeed,ySpeed;X // ball position and speed variables
// ---initialize--
const playerX=375,playerY=468 ;X // initial values for X axis
canvas = createCanvas(width, height);X // creates canvas object using window.innerWidth/2 (as width) and Window innerHeight/10 as Height
// ---draw setup--
background("#EEE");X // sets the backgound color to a light gray
let rightBoundary=canvas.parentElement.offsetWidth -width; // Sets boundary limit for ball on X axis
const leftBoundary=-width ;X // setting values of boundaries (X and Y)
// ---draw setup--
player1 = createPaddle(0, 625); // creates object with Paddel's coordinate
player2=createPaddle(-87.5,-304 );X // sets the coordinates for the second paddle
//---controls initialization-----------------
let keysPressed;XX //declaring variables
initControls();
let ballVelocity = 16,leftScore ,rightscore ;X // defines velocity of a Ball (this variable is not changed in function)
if (!keysPressed.up){ playerY=player2 .y; }XX // Up key pressed
else if(!keysPressed.down){ ballVelocity=-16, xSpeed =50 ; // Down Key Pressed }
else {xPosition+=ballvelocity ; }X //Moving Ball to left (X axis)
// ---controls initialization-----------------
// Up key pressed
if(!keysPressed.left){ yPosistion=player1 .y; }XX // Left Key Pressed
else if (! keysPressed.right ){ xSpeed = -50 ; ballVelocity=-ballvelocity}
// ---controls initialization-----------------
else {xPosition+=bv; yPosistion+=(up/down) }
// Right Key Pressed
else if (keysPressed.right){ xSpeed = 50 ; ballvelocity= -16;}XX // Down key pressed
// ---controls initialization-----------------
else {xPosition+=bv; yPosistion+=(up/down) } // Left Key Pressed
```
Context
I'm using alpaca.7b on an M1 MacBook Pro with 32 GB of RAM. I'm trying to get a "ChatGPT"-like instruction-following interactive mode.
The Issue
I tried passing in
interactive: true
into the Node.js API, but it didn't work very well, likely due to the following piece of code in dalai/index.js:
Lines 182 to 187 in 56850c7
Potential Solution
I've gotten the model to trigger interactive mode by updating that code, since llama.cpp has support for it (https://github.com/ggerganov/llama.cpp#interactive-mode).
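I haven't seen the exact lines referenced above, but conceptually the change is just to forward the option when dalai builds the argument list for the llama.cpp binary. `--interactive` (short form `-i`) is a real llama.cpp flag; everything else in this sketch is illustrative, not dalai's actual code:

```javascript
// Hypothetical sketch of forwarding an `interactive` option when
// assembling the llama.cpp command line.
function buildLlamaArgs(options) {
  const args = ["--model", options.model, "-p", options.prompt];
  if (options.interactive) {
    args.push("--interactive"); // real llama.cpp flag (-i / --interactive)
  }
  return args;
}

console.log(
  buildLlamaArgs({
    model: "models/7B/ggml-model-q4_0.bin",
    prompt: "Hello",
    interactive: true,
  })
);
```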
This still doesn't work very well, though. Any chance we could add proper support for interactive mode? Or maybe I could get some pointers and try to figure it out myself.
Current Workaround
I've worked around this (and hopefully this will be helpful for anyone stumbling upon this issue) by following the instructions for instruction mode with Alpaca that I found in llama.cpp (https://github.com/ggerganov/llama.cpp#instruction-mode-with-alpaca), using alpaca.cpp's default parameters.
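Putting the workaround together, here is a sketch of what the call could look like through dalai's Node.js API. The `request(config, callback)` shape and the `"alpaca.7B"` model name follow dalai's README as I recall it, so treat both as assumptions; `buildConfig` is a hypothetical helper:

```javascript
// Sketch: wrap the instruction in the Alpaca template, then send it
// through dalai's Node.js API (API shape assumed, not verified here).
const template =
  "Below is an instruction that describes a task. " +
  "Write a response that appropriately completes the request.\n\n" +
  "### Instruction:\n%INSTRUCTION%\n\n### Response:\n";

function buildConfig(instruction) {
  return {
    model: "alpaca.7B", // assumed model identifier
    prompt: template.replace("%INSTRUCTION%", instruction),
  };
}

const config = buildConfig("Summarize what a binary heap is.");
console.log(config.prompt);

// Real call (requires the dalai package; shown only as an assumption):
// const Dalai = require("dalai");
// new Dalai().request(config, (token) => process.stdout.write(token));
```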
Should we add this to the README? And hopefully we can start a discussion about interactive/instruction mode and getting it implemented straight into the Node.js API!