[fix]update apply_chat_template #4137
Conversation
Thanks for your contribution!
| """ | ||
|
|
||
| task["preprocess_start_time"] = time.time() | ||
| if task.get("chat_template_kwargs"): |
Even when chat_template_kwargs is empty, the two parameters below still apply and need to be appended.
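A minimal sketch of what the reviewer appears to be asking for: fold the request-level options into chat_template_kwargs even when it arrives empty. The helper name merge_chat_template_kwargs and the exact task/kwargs shapes are assumptions for illustration, not the PR's actual code.

```python
def merge_chat_template_kwargs(task: dict, **kwargs) -> dict:
    """Hypothetical helper: ensure chat_template and enable_thinking are set
    even when chat_template_kwargs is missing or empty."""
    chat_template_kwargs = task.get("chat_template_kwargs") or {}
    if not isinstance(chat_template_kwargs, dict):
        raise ValueError("Invalid input: chat_template_kwargs must be a dict")
    # The two parameters below still have to be appended when the dict is empty.
    chat_template_kwargs.setdefault("chat_template", kwargs.get("chat_template"))
    chat_template_kwargs.setdefault("enable_thinking", True)
    task["chat_template_kwargs"] = chat_template_kwargs
    return task
```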
```python
else:
    raise ValueError("Invalid input: chat_template_kwargs must be a dict")
request.prompt_token_ids = self.messages2ids(task)
chat_template_kwargs["chat_template"] = kwargs.get("chat_template")
```
Keep this assignment at the same level as in the online path: put both at the request level.
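A sketch of the layering the reviewer suggests, assuming "request level" means carrying chat_template on the request object itself (as the online serving path does) rather than inside chat_template_kwargs; the Request fields and builder below are illustrative, not FastDeploy's real classes.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Request:
    # Illustrative request shape; the real FastDeploy Request class differs.
    messages: list
    chat_template: Optional[str] = None  # request-level, matching the online path
    chat_template_kwargs: dict = field(default_factory=dict)
    prompt_token_ids: Optional[list] = None

def build_offline_request(messages: list, chat_template: Optional[str] = None) -> Request:
    # Keep the offline path at the same level as the online one:
    # the template is assigned on the request, not buried inside chat_template_kwargs.
    return Request(messages=messages, chat_template=chat_template)
```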
fastdeploy/input/text_processor.py (outdated)
```python
raise ValueError("Invalid input: chat_template_kwargs must be a dict")
task.setdefault("enable_thinking", True)
request.prompt_token_ids = self.messages2ids(task)
chat_template_kwargs["chat_template"] = kwargs.get("chat_template")
```
Make the same adjustment for the ernie processor as well.
Force-pushed from 17375eb to 64e6344
Force-pushed from 18970d9 to d44972d
* update apply_chat_template
* fix unittest
* fix unittest
* fix
* fix
* fix unit test
* fix
* fix unit test
* add unit test
* [fix] Modify follow-up push parameters and modify the verification method for thinking length (#4086)
  * Rename the follow-up push parameter generated_token_ids to completion_token_ids; change how the thinking length is validated (this message is repeated across four consecutive commits)
  * add completion_token_ids
  * add logger
  * fix reasoning_max_tokens ParameterError
  * add unittest (five commits)
  * add unit test
  * fix
* [fix] update apply_chat_template (#4137)
  * update apply_chat_template
  * fix unittest
  * fix unittest
  * fix
  * fix
  * fix unit test
  * fix
  * fix unit test
  * add unit test
* (…4294)
  * [fix] Modify follow-up push parameters and modify the verification method for thinking length (#4086), with the same commit list as above
  * [fix] update apply_chat_template (#4137), with the same commit list as above
  * fix reasoning_max_tokens
Because PaddleFormers was upgraded to 0.2.3, its AutoTokenizer was reworked to inherit directly from Hugging Face's AutoTokenizer, which changed how the apply_chat_template method is used. This PR adapts the code to the new apply_chat_template usage.
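For context, a hedged sketch of the new-style call after the upgrade, assuming PaddleFormers 0.2.3 exposes the Hugging Face-compatible apply_chat_template signature; the import path, model name, and keyword arguments below are illustrative rather than FastDeploy's actual code.

```python
# Illustrative only: assumes paddleformers.transformers.AutoTokenizer follows the
# Hugging Face apply_chat_template signature after the 0.2.3 upgrade.
from paddleformers.transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("baidu/ERNIE-4.5-0.3B-Paddle")

messages = [{"role": "user", "content": "Hello!"}]

# Extra keyword arguments (e.g. enable_thinking) are forwarded into the Jinja
# chat template; tokenize/add_generation_prompt control the output form.
prompt = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
    enable_thinking=True,
)
prompt_token_ids = tokenizer(prompt)["input_ids"]
```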