What is it about?

Traditional machine learning, mainly supervised learning, follows the closed-world assumption: every class encountered at test time has also been seen at training time. Such models therefore fail to identify classes that were not available during training, referred to as unseen classes. Open-world Machine Learning (OWML) is a technique that deals with unseen classes. Although OWML has been around for a few years and many significant research works have been carried out in this domain, there is no comprehensive survey of the characteristics, applications, and impact of OWML on the major research areas. In this paper, we aim to capture the different dimensions of OWML with respect to traditional ML models. We thoroughly analyze the existing literature and provide a novel taxonomy of OWML, considering its two major application domains: Computer Vision and Image Processing (CVIP) and Natural Language Processing (NLP). We also list the available software packages and open datasets in OWML for future researchers. Finally, the paper concludes with a set of research gaps, open challenges, and future directions.


Why is it important?

Traditional machine learning approaches have produced promising outcomes for decades across every data analysis domain. However, conventional machine learning, mainly supervised learning, has some limitations: 1) it works with isolated data and learns without using previous knowledge, and 2) a trained model can only handle input instances similar to those used for training. To correctly identify classes that did not occur during the training phase (known as 'unseen' classes), we need Open-world Machine Learning (OWML).
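To make the distinction concrete, here is a minimal sketch (not from the paper) of one common open-world strategy: take a closed-world classifier's softmax scores over the known classes and reject low-confidence inputs as potential unseen-class instances. The function name and the threshold value are illustrative assumptions, not part of the survey.

```python
import numpy as np

def predict_open_world(logits, threshold=0.7):
    """Label each sample with a known class, or -1 ("unseen") if the
    classifier is not confident enough.

    Illustrative sketch: `logits` are raw scores over the known classes
    from any closed-world model; samples whose top softmax probability
    falls below `threshold` are rejected as possible unseen classes.
    """
    logits = np.asarray(logits, dtype=float)
    # Numerically stable softmax over the known classes.
    z = logits - logits.max(axis=1, keepdims=True)
    probs = np.exp(z) / np.exp(z).sum(axis=1, keepdims=True)
    labels = probs.argmax(axis=1)
    confident = probs.max(axis=1) >= threshold
    # Keep the known-class label where confident, otherwise flag as unseen.
    return np.where(confident, labels, -1)

# A sharply peaked sample keeps its label; a near-uniform one is rejected.
preds = predict_open_world([[5.0, 0.0, 0.0],    # clearly known class 0
                            [0.1, 0.0, 0.1]])   # ambiguous -> unseen (-1)
```

A plain closed-world model would be forced to assign the second sample to one of the known classes; the rejection step is what opens the door to detecting, and later learning, unseen classes.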

Read the Original

This page is a summary of: Open-world Machine Learning: Applications, Challenges, and Opportunities, ACM Computing Surveys, September 2022, ACM (Association for Computing Machinery),
DOI: 10.1145/3561381.
You can read the full text:

