Monocular depth estimation in outdoor scenes presents significant challenges due to ambiguity arising from occlusions and structural variations. One key challenge lies in effectively combining loss functions while accounting for the distribution of ground-truth pixels and the structural variation of the scene. Using conventional loss functions, such as the scale-invariant loss and the gradient loss, without considering each loss's contribution relative to the structural variation of the scene may lead to suboptimal results. To address this problem, we propose Adaptive Momentum-based Loss Rebalancing (AMLR), which balances loss functions for monocular depth estimation in outdoor scenes. Our method combines the scale-invariant loss and the gradient loss with a balancing term inspired by the Adam weight optimizer. By dynamically updating the loss weights using momentum and accounting for the increase and decrease of the individual losses, we facilitate convergence of the total loss and consequently obtain more accurate results. We observe that the gradient loss, when appropriately weighted, assists the convergence of the overall loss. Experimental results on the KITTI benchmark demonstrate that our approach achieves performance comparable to the state of the art, with an absolute relative difference of 0.049. This work contributes to advancing monocular depth estimation in challenging outdoor scenes.
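For concreteness, the sketch below illustrates what an Adam-inspired loss-rebalancing term might look like in code. It is a minimal, hypothetical implementation: the class name AMLRBalancer, the use of the per-step change in each loss as the "gradient" signal, and the hyperparameters (beta1, beta2, lr, eps) are illustrative assumptions, not the paper's exact formulation.

```python
import math

class AMLRBalancer:
    """Illustrative Adam-style rebalancer for multiple loss terms.

    Hypothetical sketch (not the authors' exact rule): tracks first and
    second moments of each loss's change between steps and lowers the
    weight of a loss whose momentum indicates it is rising, while a loss
    that is steadily decreasing keeps contributing to the total.
    """

    def __init__(self, n_losses=2, beta1=0.9, beta2=0.999, lr=0.01, eps=1e-8):
        self.beta1, self.beta2, self.lr, self.eps = beta1, beta2, lr, eps
        self.weights = [1.0 / n_losses] * n_losses  # start from equal weights
        self.m = [0.0] * n_losses   # first moment of each loss's change
        self.v = [0.0] * n_losses   # second moment of each loss's change
        self.prev = None            # loss values from the previous step
        self.t = 0                  # step counter for bias correction

    def step(self, losses):
        """Update the weights from scalar loss values; return the weighted sum."""
        if self.prev is None:
            self.prev = list(losses)
        self.t += 1
        for i, (cur, prev) in enumerate(zip(losses, self.prev)):
            delta = cur - prev  # positive if this loss increased
            self.m[i] = self.beta1 * self.m[i] + (1 - self.beta1) * delta
            self.v[i] = self.beta2 * self.v[i] + (1 - self.beta2) * delta ** 2
            m_hat = self.m[i] / (1 - self.beta1 ** self.t)   # bias-corrected moments,
            v_hat = self.v[i] / (1 - self.beta2 ** self.t)   # as in Adam
            # Damp the weight of a loss whose momentum says it is rising.
            self.weights[i] -= self.lr * m_hat / (math.sqrt(v_hat) + self.eps)
            self.weights[i] = max(self.weights[i], 0.0)
        # Renormalize so the weights remain a convex combination.
        total = sum(self.weights) or 1.0
        self.weights = [w / total for w in self.weights]
        self.prev = list(losses)
        return sum(w * l for w, l in zip(self.weights, losses))
```

In an actual training loop, `losses` would be the detached scalar values of the scale-invariant and gradient losses at each iteration, and the updated weights would then multiply the corresponding loss tensors to form the training objective.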
This work was supported in part by the Brain Korea 21 (BK21) FOUR Program of the National Research Foundation of Korea through the Ministry of Education under Grant NRF5199991014091; in part by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education under Grant 2022R1F1A1065702; and in part by the Ministry of Science and Information and Communications Technology (MSIT), South Korea, through the Information Technology Research Center (ITRC) Support Program, supervised by the Institute for Information and Communications Technology Promotion (IITP) under Grant IITP-2023-2018-0-01424.