Code showing when running. #717
Comments
What are you trying to achieve? If you want to chat, you must provide an example/context for that. If you want a story, you must introduce it before starting the inference. For example, try adding this argument:
Yes, you forgot to set a prompt, so Llama just came up with its own text completely on its own! :-)
I was used to alpaca, where I didn't have to. When I run llama with -i it does this too.
That's because alpaca.cpp adds the context without showing you; it's a "wrapper" around the Alpaca finetuning.
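For reference, a minimal sketch of how a prompt/context can be supplied to llama.cpp explicitly. The flags -p, -f, -i and -r are the ones documented in the project's README; the model path mirrors the log below and prompts/chat-with-bob.txt is the example prompt shipped in the repo, so adjust both to your setup (on Windows the binary may be main.exe or chat.exe).

# One-shot: seed the context with an inline prompt instead of starting empty
./main -m models/llama-7B/ggml-model.bin -p "Write a short story about a robot." -n 128

# Chat-style: load the context from a prompt file and hand control back to
# the user whenever the reverse prompt "User:" is generated
./main -m models/llama-7B/ggml-model.bin --color -i -r "User:" -f prompts/chat-with-bob.txt

alpaca.cpp effectively does the second form for you, silently prepending its instruction template, which is why it never needed an explicit prompt.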
When I start chat.exe with an alpaca bin I get:
main: seed = 1680456908
llama_model_load: loading model from 'models/llama-7B/ggml-model.bin' - please wait ...
llama_model_load: n_vocab = 32000
llama_model_load: n_ctx = 512
llama_model_load: n_embd = 4096
llama_model_load: n_mult = 256
llama_model_load: n_head = 32
llama_model_load: n_layer = 32
llama_model_load: n_rot = 128
llama_model_load: f16 = 3
llama_model_load: n_ff = 11008
llama_model_load: n_parts = 1
llama_model_load: type = 1
llama_model_load: ggml map size = 4820.95 MB
llama_model_load: ggml ctx size = 81.25 KB
llama_model_load: mem required = 6613.03 MB (+ 1026.00 MB per state)
llama_model_load: loading tensors from 'models/llama-7B/ggml-model.bin'
llama_model_load: model size = 4820.52 MB / num tensors = 291
llama_init_from_file: kv self size = 256.00 MB
system_info: n_threads = 4 / 8 | AVX = 1 | AVX2 = 0 | AVX512 = 0 | FMA = 0 | NEON = 0 | ARM_FMA = 0 | F16C = 0 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 0 | SSE3 = 0 | VSX = 0 |
main: interactive mode on.
sampling: temp = 0.800000, top_k = 40, top_p = 0.950000, repeat_last_n = 64, repeat_penalty = 1.100000
generate: n_ctx = 512, n_batch = 8, n_predict = 128, n_keep = 0
== Running in interactive mode. ==
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
namespace _1.Write_a_program_to_find_the_minimum_element_in_an_array
{
class Program
{
static void Main()
{
//Create an array of five elements
int[] arr = new int[5];