fromthesky committed
Commit 0950b63 · 1 Parent(s): 7c2160a

Updated readme.

Files changed (1): README.md (+9 -9)
README.md CHANGED
@@ -45,18 +45,18 @@ Using `pipeline`:
 ```python
 from transformers import pipeline
 
-pipeline = pipeline(
-    task="text-generation",
-    model="fromthesky/PLDR-LLM-v51G-106M-1",
-    device="cuda",  # or "cpu"
-    trust_remote_code=True
-)
+text_generator = pipeline(
+    task="text-generation",
+    model="fromthesky/PLDR-LLM-v51G-106M-1",
+    device="cuda",  # or "cpu"
+    trust_remote_code=True
+)
 
 prompt="The quick brown fox jumps over the lazy dog."
 
-output=pipeline(prompt, top_p=0.6, top_k=0, temperature=1, do_sample=True,
-                tokenizer_encode_kwargs={"add_special_tokens":False},
-                use_cache=True, max_new_tokens=100)
+output=text_generator(prompt, top_p=0.6, top_k=0, temperature=1, do_sample=True,
+                      tokenizer_encode_kwargs={"add_special_tokens":False},
+                      use_cache=True, max_new_tokens=100)
 print(output[0]["generated_text"])
 ```
 
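The rename this commit makes is more than cosmetic: the old README bound the object returned by `pipeline(...)` to the name `pipeline`, shadowing the `transformers.pipeline` factory for the rest of the script. The sketch below illustrates the problem without downloading the model; the `pipeline` function here is a hypothetical stand-in for `transformers.pipeline`, not the real API.

```python
def pipeline(task, model=None, device=None, trust_remote_code=False):
    """Hypothetical stand-in for transformers.pipeline (no model download)."""
    def generate(prompt, **generate_kwargs):
        # A real pipeline would run the model; here we echo the prompt.
        return [{"generated_text": prompt}]
    return generate

factory = pipeline  # keep a reference so the commit's pattern can be shown below

# Old README pattern: the returned pipeline object rebinds the name
# `pipeline`, shadowing the factory for the rest of the script.
pipeline = pipeline(task="text-generation")
try:
    pipeline(task="summarization")  # meant as a second factory call
except TypeError:
    print("factory shadowed: can't build another pipeline")

# The commit's pattern: a distinct name leaves the factory usable.
text_generator = factory(task="text-generation")
output = text_generator("The quick brown fox jumps over the lazy dog.")
print(output[0]["generated_text"])
```

With the distinct name, later calls such as `pipeline(task="summarization", ...)` keep working, which is why the commit touches both the assignment and the call site.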