>>> from gpt4all import GPT4All
>>> model = GPT4All('orca-mini-3b.ggmlv3.q4_0', model_path='./', n_threads=8)
Found model file at ./orca-mini-3b.ggmlv3.q4_0.bin
llama.cpp: loading model from ./orca-mini-3b.ggmlv3.q4_0.bin
llama_model_load_internal: format = ggjt v3 (latest)
llama_model_load_internal: n_vocab = 32000
llama_model_load_internal: n_ctx = 2048
llama_model_load_internal: n_embd = 3200
llama_model_load_internal: n_mult = 240
llama_model_load_internal: n_head = 32
llama_model_load_internal: n_layer = 26
llama_model_load_internal: n_rot = 100
llama_model_load_internal: ftype = 2 (mostly Q4_0)
llama_model_load_internal: n_ff = 8640
llama_model_load_internal: n_parts = 1
llama_model_load_internal: model size = 3B
llama_model_load_internal: ggml ctx size = 0.06 MB
llama_model_load_internal: mem required = 2862.72 MB (+ 682.00 MB per state)
llama_new_context_with_model: kv self size = 650.00 MB
>>> with model.chat_session():
...     while True:
...         user_message = input("Human: ")
...         for token in model.generate(user_message, streaming=True, n_batch=100, reverse_prompts=[" Biden"]):
...             print(token, end="", flush=True)
...         print()
...
Human: Who is the president of the US?
As of August 2021, the President of the United States is Joe
Human:
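The transcript above shows generation stopping just before the reverse prompt " Biden" is emitted. A minimal sketch of how such stop-sequence filtering over a streaming token generator might work in plain Python is below; `stream_until` and the hard-coded token list are illustrative stand-ins, not part of the gpt4all API.

```python
# Hedged sketch of reverse-prompt (stop-sequence) filtering on a token
# stream. The function name and sample tokens are hypothetical; in the
# transcript this role is played by model.generate(..., streaming=True).
from typing import Iterable, Iterator


def stream_until(tokens: Iterable[str], stop_sequences: list[str]) -> Iterator[str]:
    """Yield tokens until the accumulated text ends with a stop sequence.

    The matching stop sequence itself is withheld, which mirrors the
    transcript: output halts at "... is Joe" and " Biden" never appears.
    Note: tokens emitted before the match is complete are not retracted.
    """
    buffer = ""
    for token in tokens:
        buffer += token
        if any(buffer.endswith(stop) for stop in stop_sequences):
            return
        yield token


# Simulated token stream in place of a live model.
tokens = [
    "As of August 2021, the President",
    " of the United States is Joe",
    " Biden",
    ".",
]
print("".join(stream_until(tokens, [" Biden"])))
# -> As of August 2021, the President of the United States is Joe
```

Checking the whole accumulated buffer (rather than each token alone) lets the filter catch a stop sequence even when it is split across token boundaries.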