Redesign multi mlp

choucl requested to merge wip/redesign-multi-mlp into main

After discussion with Parker and YuYu, the multi-MLP model's preprocessing and training method is being redesigned. When training the multi-MLP model, 4 different datasets will be created, each collecting either branch data or prefetch data. The model will be trained on all of these datasets. While training on one dataset, all the other MLP parts responsible for the other predictions will be frozen.
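The per-dataset freezing scheme above can be sketched roughly as follows. This is a minimal illustration, not the code in this MR: the head names (`branch_a`, `branch_b`, `prefetch_a`, `prefetch_b`), dimensions, and the `freeze_all_except` helper are all assumptions.

```python
# Hypothetical sketch of freezing all MLP parts except the one
# matching the dataset currently being trained on.
import torch.nn as nn


class MultiMLP(nn.Module):
    def __init__(self, in_dim=32, hidden=64):
        super().__init__()
        # One MLP part (head) per prediction target; names are placeholders.
        self.heads = nn.ModuleDict({
            name: nn.Sequential(
                nn.Linear(in_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1)
            )
            for name in ("branch_a", "branch_b", "prefetch_a", "prefetch_b")
        })

    def freeze_all_except(self, active):
        # Only the head matching the active dataset keeps gradients enabled.
        for name, head in self.heads.items():
            for p in head.parameters():
                p.requires_grad = (name == active)


model = MultiMLP()
model.freeze_all_except("prefetch_a")
trainable = [n for n, h in model.heads.items()
             if all(p.requires_grad for p in h.parameters())]
print(trainable)  # only the head for the active dataset remains trainable
```

The training loop would then call `freeze_all_except` before iterating over each of the 4 datasets, so each pass only updates the MLP part tied to that dataset's prediction.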

In this MR, the problems of page number overflow and data preprocessing stopping early are fixed.

Edited by choucl