Conversation
projects/book_chapter/folds.jl (outdated)

```julia
hybrid_model,
ds,
();
folds = folds,
```
I'm not sure; maybe it's better to do an outer loop over the folds? That way we could use pmap, distributing over them.
It should be pmap-able; see the for loop in the last commit.
I mean that we can pass the k-fold split directly into train; there's no need for the new folds argument.
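For context, the outer-loop idea discussed above could look roughly like this. `split_data`, `train`, `ds`, `hybrid_model`, `folds`, and `k` are the names used in this PR, but the exact signatures here are assumptions, not the package's actual API:

```julia
using Distributed  # provides pmap

# Sketch only: run each fold's training as one pmap task, so the folds are
# distributed over the available workers. Signatures are assumed.
results = pmap(1:k) do val_fold
    sdata = split_data(ds, hybrid_model; val_fold = val_fold, folds = folds)
    train(hybrid_model, sdata, ())
end
```

Since the folds are independent, this parallelizes with no changes to the per-fold training logic.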
projects/book_chapter/folds.jl (outdated)

```julia
for val_fold in 1:k
    @info "Split data outside of train function. Training fold $val_fold of $k"
    sdata = split_data(ds, hybrid_model; val_fold = val_fold, folds = folds)
```
Yes, we can also do the split outside of train. I guess then we don't blow up train with more keyword arguments and can get rid of the additional ones I added for k-fold. @lazarusA
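A minimal sketch of that shape, assuming a hypothetical `train` that accepts pre-split data (so the k-fold keyword arguments disappear from its signature; none of these signatures are the package's real API):

```julia
# Hypothetical signature: train receives the already-split data, so it
# no longer needs val_fold/folds keyword arguments of its own.
function train(model, sdata)
    (x_train, y_train), (x_val, y_val) = sdata
    # ... fit `model` on the training split, monitor on the validation split ...
end

# The fold loop lives entirely outside train.
for val_fold in 1:k
    sdata = split_data(ds, hybrid_model; val_fold = val_fold, folds = folds)
    train(hybrid_model, sdata)
end
```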
```diff
-# ? split training and validation data
-(x_train, y_train), (x_val, y_val) = split_data(data, hybridModel; split_by_id=split_by_id, shuffleobs=shuffleobs, split_data_at=split_data_at)
+(x_train, y_train), (x_val, y_val) = split_data(data, hybridModel; kwargs...)
```
pass everything for data handling via kwargs...?
Maybe; we need to make sure that works (I tried this already and somehow it was failing in some cases), so some tests will be needed. Let's do these updates to split_data and train in a new PR that deals only with that.
Do you remember which cases were failing?
```diff
-    shuffleobs=false,
-    split_by_id=nothing,
-    split_data_at=0.8,
+    # Data handling parameters are now passed via kwargs...
```
This would not blow up the length of train's argument list further; it would even decrease it.
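As a self-contained sketch of the `kwargs...` forwarding under discussion (names and defaults are illustrative, not the package's real API):

```julia
# split_data keeps the defaults; train just forwards whatever it receives.
function split_data(data; shuffleobs = false, split_by_id = nothing, split_data_at = 0.8)
    # ... real splitting logic would go here ...
    return (; shuffleobs, split_by_id, split_data_at)
end

train(model, data; kwargs...) = split_data(data; kwargs...)

# Callers set only what they need. Note that a keyword split_data does not
# accept will raise an error only when the forwarded call reaches split_data,
# which may be related to the failing cases mentioned above.
train(:model, :data; split_data_at = 0.9)
```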
Force-pushed from 29d4031 to 81c1046.
@lazarusA I think this is more or less finished; I would like to merge it soon. Any objections? Sorry that it is a PR on multiple things.

Great, as usual: if it works, merge! 🙂