terminate called after throwing an instance of 'std::runtime_error' | what(): unexpectedly reached end of file | Aborted (core dumped) #6

Description

@anhbsn

Hello, I am running the llama-2-7b-chat.ggmlv3.q4_0.bin model with Run_llama2_local_cpu_upload.
The server runs Ubuntu 20.04. When I run the code on my local computer (Windows), it works very well, but when I run it on the other machine (the Ubuntu server), it does not work.

I use this model with the code from https://github.com/MuhammadMoinFaisal/LargeLanguageModelsProjects/tree/main/Run_llama2_local_cpu_upload
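For reference, the model-loading step I am running is roughly equivalent to the minimal sketch below. This is only an illustration, assuming the ctransformers backend commonly used for GGML files; the model path and prompt are examples, not the repo's exact code.

import os
from ctransformers import AutoModelForCausalLM

# Example path to the quantized GGML model file (adjust to your setup).
model_path = "llama-2-7b-chat.ggmlv3.q4_0.bin"

# "unexpectedly reached end of file" is often raised when the model file is
# truncated, so printing the on-disk size is a quick sanity check against the
# published file size.
print("model size (bytes):", os.path.getsize(model_path))

# Load the GGML model on CPU and run a short prompt.
llm = AutoModelForCausalLM.from_pretrained(model_path, model_type="llama")
print(llm("Hello, how are you?"))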

Error:
terminate called after throwing an instance of 'std::runtime_error' what(): unexpectedly reached end of file Aborted (core dumped)

If you have any solution, please share it with me. Thank you so much!
