
Conversation

luukunn (Collaborator) commented Sep 16, 2025

Because PaddleFormers was upgraded to 0.2.3, AutoTokenizer was reworked to inherit directly from the Hugging Face AutoTokenizer, which changed how the apply_chat_template method is used. This PR adapts the code to the new apply_chat_template usage.
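For context, a minimal sketch of the Hugging Face-style apply_chat_template call that the inherited tokenizer exposes; the import path and model name below are assumptions for illustration, not code from this PR:

```python
# Illustration only: assumes PaddleFormers >= 0.2.3, where AutoTokenizer
# inherits from the Hugging Face AutoTokenizer; the import path and model
# name are placeholders, not the code touched by this PR.
from paddleformers.transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("baidu/ERNIE-4.5-0.3B-Paddle")

messages = [{"role": "user", "content": "Hello!"}]

# The HF-style API renders the chat template and tokenizes in one call;
# template options (e.g. a custom chat_template) are passed as keyword args.
prompt_token_ids = tokenizer.apply_chat_template(
    messages,
    tokenize=True,
    add_generation_prompt=True,
)
```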

paddle-bot bot commented Sep 16, 2025

Thanks for your contribution!

paddle-bot bot added the contributor (External developers) label Sep 16, 2025
"""

task["preprocess_start_time"] = time.time()
if task.get("chat_template_kwargs"):
Collaborator: Even when chat_template_kwargs is empty, the two parameters below still apply and need to be appended.
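A minimal sketch of what this suggestion seems to amount to, reusing the names from the snippet above; it illustrates the review comment and is not the merged implementation:

```python
# Illustration only: build chat_template_kwargs even when the request did not
# supply one, so the extra fields are always attached to the task.
chat_template_kwargs = task.get("chat_template_kwargs") or {}
if not isinstance(chat_template_kwargs, dict):
    raise ValueError("Invalid input: chat_template_kwargs must be a dict")
chat_template_kwargs["chat_template"] = kwargs.get("chat_template")
task.setdefault("enable_thinking", True)
task["chat_template_kwargs"] = chat_template_kwargs
```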

else:
raise ValueError("Invalid input: chat_template_kwargs must be a dict")
request.prompt_token_ids = self.messages2ids(task)
chat_template_kwargs["chat_template"] = kwargs.get("chat_template")
Collaborator: Keep the assignment at the same level as in the online (serving) path; put both at the request level.
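A hypothetical one-line illustration of what assigning the value at the request level could look like; the attribute name is an assumption, not taken from this PR:

```python
# Hypothetical: assign chat_template on the request object itself, mirroring
# the online/serving path, instead of nesting it in chat_template_kwargs.
request.chat_template = kwargs.get("chat_template")
```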

raise ValueError("Invalid input: chat_template_kwargs must be a dict")
task.setdefault("enable_thinking", True)
request.prompt_token_ids = self.messages2ids(task)
chat_template_kwargs["chat_template"] = kwargs.get("chat_template")
Collaborator: Apply the same adjustment to ernie as well.

luukunn force-pushed the update_chat_template branch from 17375eb to 64e6344 on September 18, 2025 03:14
luukunn force-pushed the update_chat_template branch from 18970d9 to d44972d on September 19, 2025 07:29
LiqinruiG merged commit 18f4977 into PaddlePaddle:develop Sep 24, 2025
25 of 28 checks passed
luukunn added a commit to luukunn/FastDeploy that referenced this pull request Sep 24, 2025
* update apply_chat_template

* fix unittest

* fix unittest

* fix

* fix

* fix unit test

* fix

* fix unit test

* add unit test
Jiang-Jia-Jun pushed a commit that referenced this pull request Sep 25, 2025
* [fix] Modify follow-up push parameters and the verification method for thinking length (#4086)

* Rename the follow-up push parameter generated_token_ids to completion_token_ids; change the verification method for thinking length

* Rename the follow-up push parameter generated_token_ids to completion_token_ids; change the verification method for thinking length

* Rename the follow-up push parameter generated_token_ids to completion_token_ids; change the verification method for thinking length

* Rename the follow-up push parameter generated_token_ids to completion_token_ids; change the verification method for thinking length

* add completion_token_ids

* add logger

* fix reasoning_max_tokens ParameterError

* add unittest

* add unittest

* add unittest

* add unittest

* add unittest

* add unit test

* fix

* [fix]update apply_chat_template (#4137)

* update apply_chat_template

* fix unittest

* fix unittest

* fix

* fix

* fix unit test

* fix

* fix unit test

* add unit test
Jiang-Jia-Jun pushed a commit that referenced this pull request Sep 28, 2025
…4294)

* [fix] Modify follow-up push parameters and the verification method for thinking length (#4086)

* Rename the follow-up push parameter generated_token_ids to completion_token_ids; change the verification method for thinking length

* Rename the follow-up push parameter generated_token_ids to completion_token_ids; change the verification method for thinking length

* Rename the follow-up push parameter generated_token_ids to completion_token_ids; change the verification method for thinking length

* Rename the follow-up push parameter generated_token_ids to completion_token_ids; change the verification method for thinking length

* add completion_token_ids

* add logger

* fix reasoning_max_tokens ParameterError

* add unittest

* add unittest

* add unittest

* add unittest

* add unittest

* add unit test

* fix

* [fix]update apply_chat_template (#4137)

* update apply_chat_template

* fix unittest

* fix unittest

* fix

* fix

* fix unit test

* fix

* fix unit test

* add unit test

* fix reasoning_max_tokens
luukunn deleted the update_chat_template branch December 3, 2025 09:40