Add script to convert old ggml files to newer version #539
Conversation
(magic, vocab_size, dim, multiple_of, n_heads, n_layers, rot, ftype) = header

if magic != 0x67676d6c:
    raise Exception('Invalid file magic. Must be an old style ggml file.')
Wait, I'm confused, isn't it supposed to be an old style ggml file for this?
It expects the unversioned GGML model and produces the versioned one.
#define LLAMA_FILE_MAGIC_UNVERSIONED 0x67676d6c // pre-versioned files
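For reference, a minimal sketch of that idea in Python: read the eight-field unversioned header, check the old magic, and re-emit it with a new magic plus a version field. The constant names, the 'ggmf' magic value, the version number, and the function name here are illustrative assumptions, not the actual script.

```python
import struct

GGML_MAGIC_UNVERSIONED = 0x67676d6c  # 'ggml', pre-versioned files
GGML_MAGIC_VERSIONED   = 0x67676d66  # assumed 'ggmf' magic of the versioned format
GGML_FORMAT_VERSION    = 1           # assumed file version

def convert_header(fin, fout):
    # Read the old eight-field header: magic, vocab_size, dim, multiple_of,
    # n_heads, n_layers, rot, ftype (all little-endian int32).
    header = struct.unpack('<8i', fin.read(8 * 4))
    magic = header[0]
    if magic != GGML_MAGIC_UNVERSIONED:
        raise Exception('Invalid file magic. Must be an old style ggml file.')

    # Write the new magic plus a version field, then the remaining header
    # fields unchanged. The vocabulary and tensor data still have to be
    # rewritten afterwards (the new vocab adds a score per token).
    fout.write(struct.pack('<ii', GGML_MAGIC_VERSIONED, GGML_FORMAT_VERSION))
    fout.write(struct.pack('<7i', *header[1:]))
```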
@@ -320,7 +320,7 @@ static bool llama_model_load(
     uint32_t magic;
     fin.read((char *) &magic, sizeof(magic));
     if (magic == LLAMA_FILE_MAGIC_UNVERSIONED) {
-        fprintf(stderr, "%s: invalid model file '%s' (too old, regenerate your model files!)\n",
+        fprintf(stderr, "%s: invalid model file '%s' (too old, regenerate your model files or convert them with convert-unversioned-ggml-to-ggml.py!)\n",
this shouldn't be changed, as the conversion script is unsupported.
I think it's user-friendly if it gives the user a hint on how to resolve the situation with an incompatible model.
I think it would be good to have the script print a disclaimer that it is unsupported and results are not guaranteed, and asking users not to post any issues regarding models generated with it.
Or another idea: post it in the discussions as an attachment? That way it can be found there when needed, but at the same time it wouldn't promote using old model conversions. Idk really, I'd like to hear other opinions too 😄
The model isn't all that different. The only missing thing is the score in the vocabulary, and I could theoretically fill that in from …
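For illustration, a rough sketch of how the scores could be filled in, assuming they are read from the SentencePiece tokenizer.model file; the helper name and the exact output layout (token length, token bytes, then a float score) are assumptions, not necessarily what the script ends up doing.

```python
import struct
from sentencepiece import SentencePieceProcessor

def write_vocab_with_scores(fout, tokenizer_model_path):
    # Hypothetical helper: emit the vocabulary in a versioned-style layout
    # that stores a float score after each token's bytes.
    sp = SentencePieceProcessor(tokenizer_model_path)
    for token_id in range(sp.vocab_size()):
        piece = sp.id_to_piece(token_id).encode('utf-8')
        fout.write(struct.pack('<i', len(piece)))
        fout.write(piece)
        fout.write(struct.pack('<f', sp.get_score(token_id)))
```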
Let's add this for now, but it will eventually be removed once everyone updates their models
Follow-up to: #526