Maize (Zea mays L.) has been shown to be sensitive to temperature deviations, which influence its yield
potential. The development of new maize hybrids resilient to unfavourable weather is a desirable aim
for crop breeders. In this paper, we showcase the development of a multimodal deep learning model
that combines RGB images, phenotypic traits, and weather data, accounting for temporal effects, to predict the yield
potential of maize before or during the anthesis and silking stages. The main objective of this study was to assess
whether including historical weather data, maize growth captured through imagery, and key
phenotypic traits would improve the predictive power of an established multimodal deep learning
model. When trained from scratch, the model correctly predicted ~89% of hybrids with high yield
potential and demonstrated greater explanatory power than previously published models.