Stop keywords #365


Closed
wants to merge 3 commits into from

Conversation

joshmackwilliams

Implements #57.

Stop keywords can be specified using the "--stop" parameter. Upon seeing one of these keywords in the generated output, the model will terminate generation immediately. Like reverse prompts, multiple stop keywords can be specified by passing the --stop argument multiple times.

The implementation is heavily based on the reverse prompt implementation to keep things simple. Tested using 7B (quantized) in both interactive and non-interactive modes.
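For illustration, an invocation might look like the following. The model path and prompt are placeholders, and -m / -p are the existing model and prompt flags; --stop is the flag this PR adds:

```shell
# Hypothetical invocation: terminate generation as soon as either keyword
# appears in the output. The model path is a placeholder; --stop may be
# passed multiple times to register multiple stop keywords.
./main -m ./models/7B/ggml-model-q4_0.bin \
    -p "Transcript of a game session:" \
    --stop "GAME OVER" \
    --stop "YOU WIN"
```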

@gjmulder added the labels "enhancement" (New feature or request) and "generation quality" (Quality of model output) on Mar 21, 2023
@anzz1 (Contributor) commented Mar 21, 2023

Great feature!

Somewhat related to this, but for input: #71 (comment)

@SpeedyCraftah

Perfect!

@KevinColemanInc mentioned this pull request Mar 26, 2023
@prusnak (Collaborator) commented Mar 30, 2023

Why are multiple keywords needed? Isn't just one enough (for example [end of text])?

@joshmackwilliams (Author)

Sometimes we want to end before Llama decides to generate [end of text], which is the reason this feature was requested. But, sometimes we also might want multiple stop conditions. An example, off the top of my head, might be to stop at the text "YOU WIN" or "GAME OVER" (obviously fairly contrived, but it makes the point that there could be multiple ways in which a generation could terminate).

Also, it's not too hard to implement since the reverse prompt logic already does the heavy lifting. It doesn't really hurt anything to allow multiple keywords, so it seemed like a worthwhile investment.

Unrelatedly, I've just realized that a lot of merge conflicts have popped up in this PR. I'll try to resolve those over the next few days.

@joshmackwilliams (Author)

Closing in favor of #769

@ejones mentioned this pull request May 10, 2023
Deadsg pushed a commit to Deadsg/llama.cpp that referenced this pull request on Dec 19, 2023: "Bump fastapi from 0.96.0 to 0.97.0"
5 participants