What is it about?

We address one of the major challenges faced by the research community when handling data with rare events: class imbalance. Existing methodologies rely on re-sampling strategies or algorithmic modifications. In this work, we propose a dynamically weighted balanced loss function that can be readily applied to any deep neural network architecture. Comprehensive experiments justify the proposed strategy: networks trained with the proposed loss function achieve significant performance gains on highly imbalanced data.
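The exact weighting scheme is given in the full paper; as a rough illustration of the general idea (not the authors' formulation), the sketch below weights a cross-entropy loss by inverse class frequency, with weights recomputed dynamically from the class counts observed so far, so minority-class errors contribute more to the loss. All function names and the normalization choice here are illustrative assumptions.

```python
import numpy as np

def dynamic_class_weights(counts, eps=1e-8):
    """Illustrative inverse-frequency weights, normalized to mean 1.

    `counts` holds the number of examples seen per class so far; in
    training, these would be updated as batches arrive, so the weights
    adapt dynamically to the observed imbalance. This is a sketch, not
    the paper's weighting scheme.
    """
    freq = counts / counts.sum()
    w = 1.0 / (freq + eps)          # rare classes get large weights
    return w / w.mean()             # keep the overall loss scale stable

def weighted_cross_entropy(probs, labels, weights):
    """Cross-entropy where each sample is scaled by its true class's weight."""
    n = len(labels)
    ce = -np.log(probs[np.arange(n), labels] + 1e-12)
    return float(np.mean(weights[labels] * ce))
```

For example, with a 9:1 imbalance the minority class receives a weight nine times that of the majority class, so a misclassified rare event costs the network correspondingly more.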

Read the Original

This page is a summary of: Dynamically Weighted Balanced Loss: Class Imbalanced Learning and Confidence Calibration of Deep Neural Networks, IEEE Transactions on Neural Networks and Learning Systems, January 2021, Institute of Electrical & Electronics Engineers (IEEE),
DOI: 10.1109/tnnls.2020.3047335.