llama_load_session_file throws exception without any handler #1716


Closed
prsyahmi opened this issue Jun 6, 2023 · 3 comments
Labels
good first issue Good for newcomers

Comments


prsyahmi commented Jun 6, 2023

The code inside llama_load_session_file uses llama_file::read_raw, which throws an exception when a read error occurs (in my case, a 0-byte file caused by #1699). The exception is not caught by any handler, so the program simply crashes. The rest of llama_load_session_file returns false for anything invalid.

I propose converting all of the checks to throws and handling them in one place.

It might also be a good idea to offer an option to continue and overwrite the prompt cache file when it is invalid.

@ggerganov ggerganov added the good first issue Good for newcomers label Jun 10, 2023

randxie commented Jun 27, 2023

I am looking into this issue and have a question about the proposal. If we convert all the checks to throws, e.g.

Original:

if (magic != LLAMA_SESSION_MAGIC || version != LLAMA_SESSION_VERSION) {
    fprintf(stderr, "%s : unknown (magic, version) for session file: %08x, %08x\n", __func__, magic, version);
    return false;
}

becomes:

if (magic != LLAMA_SESSION_MAGIC || version != LLAMA_SESSION_VERSION) {
    throw std::runtime_error(format("%s : unknown (magic, version) for session file: %08x, %08x\n", __func__, magic, version));
}

Should we also change the function signature to be void instead of returning a boolean?

bool llama_load_session_file(...) -> void llama_load_session_file(...)


prsyahmi commented Jun 27, 2023

@randxie If llama_load_session_file is part of the public API, then it should keep returning bool: catch the exception inside llama_load_session_file and return false.

Edit:
It is an API https://github.com/ggerganov/llama.cpp/blob/0be54f75a6c3e9a09ea71bdfcdabf9a996a0549b/llama.h#L215
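That approach can be sketched as follows. This is a minimal, self-contained illustration of the pattern (not the actual llama.cpp implementation): an internal loader throws on any failure, and the public entry point catches everything and converts it to a bool. The function names `load_session_file_internal` and `llama_load_session_file_sketch`, and the header-only validation, are hypothetical simplifications.

```cpp
#include <cstdint>
#include <cstdio>
#include <stdexcept>
#include <string>

// Hypothetical internal loader: performs validation and throws
// std::runtime_error on any failure (missing file, short read, etc.).
static void load_session_file_internal(const std::string & path) {
    std::FILE * fp = std::fopen(path.c_str(), "rb");
    if (!fp) {
        throw std::runtime_error("failed to open session file: " + path);
    }
    uint32_t magic = 0;
    size_t n = std::fread(&magic, sizeof(magic), 1, fp);
    std::fclose(fp);
    if (n != 1) {
        // e.g. a 0-byte file left behind by an interrupted save
        throw std::runtime_error("failed to read session header: " + path);
    }
    // ... further magic/version checks and tensor loading would go here ...
}

// Public API keeps its bool signature; no exception escapes to the caller.
bool llama_load_session_file_sketch(const char * path) {
    try {
        load_session_file_internal(path);
        return true;
    } catch (const std::exception & err) {
        std::fprintf(stderr, "error loading session file: %s\n", err.what());
        return false;
    }
}
```

With this shape, a corrupt or 0-byte cache file makes the API return false instead of terminating the process, and the caller can decide whether to regenerate the cache.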

@ggerganov
Member

Fixed by #2022
