
Use safe loading for .pth checkpoint #695


Closed · wants to merge 2 commits
Conversation

@albanD commented Apr 2, 2023

This change ensures that only weights can be loaded from the checkpoints, which is the case for the official llama checkpoints. It reduces the potential impact of a malicious checkpoint.

This argument has been available since at least PyTorch 1.13, but I haven't checked earlier versions. If you need to support older versions of PyTorch, please let me know and I can modify the code to be backward compatible.
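The underlying risk is that `.pth` checkpoints are pickle files, and unpickling can execute arbitrary code. Below is a stdlib-only sketch of the idea (not the PR's actual diff): `torch.load(weights_only=True)` does essentially this internally, with an allowlist of tensor-related globals.

```python
import io
import pickle

class Exploit:
    # A malicious checkpoint can embed an object whose __reduce__
    # makes the unpickler call an arbitrary callable during load.
    def __reduce__(self):
        return (print, ("arbitrary code ran during load",))

class RestrictedUnpickler(pickle.Unpickler):
    # Roughly what a weights-only loader does: only allowlisted
    # globals may be reconstructed; everything else is refused.
    ALLOWED = {("collections", "OrderedDict")}

    def find_class(self, module, name):
        if (module, name) in self.ALLOWED:
            return super().find_class(module, name)
        raise pickle.UnpicklingError(f"blocked: {module}.{name}")

payload = pickle.dumps(Exploit())

# The plain unpickler happily calls print() during load:
pickle.loads(payload)

# The restricted unpickler refuses the non-allowlisted global:
try:
    RestrictedUnpickler(io.BytesIO(payload)).load()
    blocked = False
except pickle.UnpicklingError:
    blocked = True

print(blocked)  # True
```

The allowlist here (`collections.OrderedDict`) is just an illustrative stand-in; the real implementation allows the set of types needed to rebuild tensors and state dicts.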

@prusnak (Collaborator) commented Apr 2, 2023

Can you please also update convert-gptq-to-ggml.py?

@albanD (Author) commented Apr 2, 2023

I can, for sure.
I don't have a GPTQ file on hand, so I couldn't check whether regular GPTQ files contain anything that isn't weights.
If you could try it on your end, or give me instructions for getting one, I can check.

@albanD (Author) commented Apr 2, 2023

Added the update, but this needs to be tested before merging!

@ggerganov (Member) commented
This change was added before, and it was a real pain because my installed version wasn't recent enough.
Unless this can somehow check whether the version in use supports this flag, I prefer not to add it, because it will cause more trouble than anything else.

Please reopen if you can make it backwards compatible
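For what it's worth, a backwards-compatible wrapper is possible by feature-detecting the keyword argument rather than checking version strings. A sketch, using hypothetical stand-in load functions since `torch` isn't imported here:

```python
import inspect

def load_compat(load_fn, path, **kwargs):
    """Call load_fn(path), opting into weights_only=True only when
    the installed version actually accepts that keyword."""
    try:
        accepts = "weights_only" in inspect.signature(load_fn).parameters
    except (TypeError, ValueError):  # signature not introspectable
        accepts = False
    if accepts:
        kwargs.setdefault("weights_only", True)
    return load_fn(path, **kwargs)

# Hypothetical stand-ins for an old and a new torch.load:
def old_load(path):
    return {"path": path}

def new_load(path, weights_only=False):
    return {"path": path, "weights_only": weights_only}

print(load_compat(old_load, "ckpt.pth"))  # keyword silently omitted
print(load_compat(new_load, "ckpt.pth"))  # weights_only=True passed
```

Detecting the parameter instead of the version also covers vendored or patched builds where version numbers are misleading.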

@ggerganov closed this Apr 13, 2023
@prusnak (Collaborator) commented Apr 13, 2023

The new conversion script in #545 does not use torch.load, so we won't need this anymore if it is merged.

@albanD (Author) commented Apr 13, 2023

You do run the regular Python unpickler on it, so it is indeed unsafe.
But I guess safetensors is the right way to go for people who want this extra safety.
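The reason safetensors sidesteps this class of problem is that the format is pure data: an 8-byte little-endian header length, a JSON header describing each tensor, then raw bytes, with nothing executable to run at load time. A simplified stdlib sketch of that kind of layout (not the real safetensors library):

```python
import json
import os
import struct
import tempfile

def write_flat(path, tensors):
    # tensors: name -> (dtype, shape, raw little-endian bytes)
    header, offset = {}, 0
    for name, (dtype, shape, raw) in tensors.items():
        header[name] = {"dtype": dtype, "shape": shape,
                        "data_offsets": [offset, offset + len(raw)]}
        offset += len(raw)
    hdr = json.dumps(header).encode("utf-8")
    with open(path, "wb") as f:
        f.write(struct.pack("<Q", len(hdr)))  # 8-byte header size
        f.write(hdr)                          # JSON metadata only
        for _, _, raw in tensors.values():
            f.write(raw)                      # raw tensor bytes

def read_flat(path):
    # Loading is parsing, not unpickling: no code can execute here.
    with open(path, "rb") as f:
        (hdr_len,) = struct.unpack("<Q", f.read(8))
        header = json.loads(f.read(hdr_len))
        data = f.read()
    return {name: data[m["data_offsets"][0]:m["data_offsets"][1]]
            for name, m in header.items()}

path = os.path.join(tempfile.gettempdir(), "demo_flat.bin")
write_flat(path, {"w": ("F32", [2], struct.pack("<2f", 1.0, 2.0))})
print(read_flat(path))  # {'w': b'\x00\x00\x80?\x00\x00\x00@'}
```

A hostile file can at worst produce malformed metadata, which a reader rejects; it cannot make the loader call arbitrary functions the way a pickle payload can.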
