Cannot pass in PEFT configs when creating a finetuning job #325
Description
Describe the bug
I'm unable to pass in a PEFT config when calling FineTune.create(). The documentation states that hyperparameters should be a "dict of parameters", but when I pass in a dictionary I get the following error:
llmengine.errors.UnknownError: Internal Server Error: <class 'launch.api_client.exceptions.ApiValueError'>: Invalid inputs given to generate an instance of <class 'launch.api_client.model.create_fine_tune_request.CreateFineTuneRequest.MetaOapg.properties.hyperparameters.MetaOapg.additional_properties'>. None of the anyOf schemas matched the input data.
Looking through the code, it seems only strings, ints, or floats are accepted as hyperparameter values, so I tried converting the dictionary to a string. That gets past the client, but fails later in the pipeline, when the fine-tuning actually runs, with this error:
'{"events": [{"timestamp": 1697248409.2646654, "message": "\'str\' object has no attribute \'get\'", "level": "error"}]}'
I also tried passing in a LoraConfig object from Hugging Face, but that wasn't accepted either. Am I missing something obvious? How do I pass in my PEFT config?
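For reference, this is roughly what the string workaround looked like (a sketch; json.dumps is just one way to stringify the nested config, and the hyperparameter names follow the snippet below):

```python
import json

# PEFT settings that FineTune.create() rejects when nested as a plain dict
peft_config = {"r": 8, "lora_alpha": 8, "lora_dropout": 0.0}

# Workaround attempt: serialize the nested dict to a string so the value
# passes the str/int/float validation on hyperparameters
hyperparameters = {
    "lr": 2e-3,
    "epochs": 5,
    "peft_config": json.dumps(peft_config),
}

# The client accepts this, but the training job later fails with
# "'str' object has no attribute 'get'" -- presumably the server expects
# a dict here and never deserializes the string.
```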
LLM Engine Version
- LLM Engine Version: 0.0.0b19
System Version
- Python Version: 3.11.5
- Operating System: MacOS
Minimal Reproducible Example
Running the code snippet as-is should reproduce it. The file IDs are dummy values I made up for this example.
from llmengine import FineTune

response = FineTune.create(
    model="llama-2-7b",
    training_file="file-963KxGaPu8WET25",
    validation_file="file-qU5_mPTaRlmIuLK",
    hyperparameters={
        "lr": 2e-3,
        "epochs": 5,
        "peft_config": {
            "r": 8,
            "lora_alpha": 8,
            "lora_dropout": 0.0,
        },
    },
    suffix="foofoobarbar",
)
Please advise, thank you!