microsoft / transformerviz

Investigations on Transformer Visualizations by the Aether Prototyping and Incubation team

ValueError: `top_p` has to be a float > 0 and < 1, but is 0

jlema opened this issue

This occurs with the default values on the text generation prompt. Stack trace below.

127.0.0.1 - - [25/May/2021 09:43:22] "OPTIONS /api/v1/generate_text HTTP/1.1" 200 -
[2021-05-25 09:43:22,350] ERROR in app: Exception on /api/v1/generate_text [POST]
Traceback (most recent call last):
File "c:\users\jlema\miniconda3\envs\transformerviz\lib\site-packages\flask\app.py", line 2447, in wsgi_app
response = self.full_dispatch_request()
File "c:\users\jlema\miniconda3\envs\transformerviz\lib\site-packages\flask\app.py", line 1952, in full_dispatch_request
rv = self.handle_user_exception(e)
File "c:\users\jlema\miniconda3\envs\transformerviz\lib\site-packages\flask_cors\extension.py", line 165, in wrapped_function
return cors_after_request(app.make_response(f(*args, **kwargs)))
File "c:\users\jlema\miniconda3\envs\transformerviz\lib\site-packages\flask\app.py", line 1821, in handle_user_exception
reraise(exc_type, exc_value, tb)
File "c:\users\jlema\miniconda3\envs\transformerviz\lib\site-packages\flask\_compat.py", line 39, in reraise
raise value
File "c:\users\jlema\miniconda3\envs\transformerviz\lib\site-packages\flask\app.py", line 1950, in full_dispatch_request
rv = self.dispatch_request()
File "c:\users\jlema\miniconda3\envs\transformerviz\lib\site-packages\flask\app.py", line 1936, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "C:\Users\jlema\git\transformerviz\transformerviz\server.py", line 92, in generate_text
num_beams=num_beams)
File "C:\Users\jlema\git\transformerviz\transformerviz\helpers\utils.py", line 64, in generate_text
temperature=temperature, num_beams=num_beams)
File "C:\Users\jlema\git\transformerviz\transformerviz\helpers\utils.py", line 36, in generate_sentences_hf
return_dict_in_generate=True
File "c:\users\jlema\miniconda3\envs\transformerviz\lib\site-packages\torch\autograd\grad_mode.py", line 15, in decorate_context
return func(*args, **kwargs)
File "c:\users\jlema\miniconda3\envs\transformerviz\lib\site-packages\transformers\generation_utils.py", line 963, in generate
top_k=top_k, top_p=top_p, temperature=temperature, num_beams=num_beams
File "c:\users\jlema\miniconda3\envs\transformerviz\lib\site-packages\transformers\generation_utils.py", line 541, in _get_logits_warper
warpers.append(TopPLogitsWarper(top_p=top_p, min_tokens_to_keep=(2 if num_beams > 1 else 1)))
File "c:\users\jlema\miniconda3\envs\transformerviz\lib\site-packages\transformers\generation_logits_process.py", line 185, in __init__
raise ValueError(f"`top_p` has to be a float > 0 and < 1, but is {top_p}")
ValueError: `top_p` has to be a float > 0 and < 1, but is 0
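The traceback shows the UI's default `top_p` of `0` reaching transformers' `TopPLogitsWarper`, which only accepts `0 < top_p < 1`. One way the server could avoid this is to validate the value before calling `model.generate`. The sketch below is a hypothetical server-side guard, not code from this repo; the fallback value of `0.9` is an illustrative choice, and `sanitize_top_p` is not an existing helper in transformerviz.

```python
def sanitize_top_p(raw, default=0.9):
    """Return a usable top_p for sampling.

    Mirrors the constraint enforced by transformers' TopPLogitsWarper
    (0 < top_p < 1). Missing, non-numeric, or out-of-range values
    (such as the UI's default of 0) fall back to `default`, which is
    an assumption of this sketch rather than the repo's behavior.
    """
    try:
        top_p = float(raw)
    except (TypeError, ValueError):
        return default
    if not 0.0 < top_p < 1.0:
        return default
    return top_p


# The request handler could then pass a safe value through, e.g.:
# top_p = sanitize_top_p(request.json.get("top_p"))
```

Alternatively, the server could reject the request with a 400 and a clear message, so the UI surfaces the invalid default instead of a 500 from deep inside transformers.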