What is it about?

Neural Architecture Search (NAS) has recently gained increased attention as a class of approaches that automatically search over a space of network architectures. A crucial part of the NAS pipeline is the encoding of the architecture, which consists of the applied computational blocks, namely the operations and the links between them. Most existing approaches either fail to capture the structural properties of the architectures or use hand-engineered vectors to encode the operator information. In this paper, the authors propose replacing the fixed operator encoding with representations that are learned during the optimization process. This approach effectively captures the relations between different operations, leading to smoother and more accurate representations of the architectures and, consequently, to improved performance on the end task. An extensive evaluation on the ENAS benchmark demonstrates the effectiveness of the proposed operation embeddings in generating highly accurate models, achieving state-of-the-art performance.
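The core idea can be illustrated with a minimal sketch (not the authors' code; the class name, number of operations, and embedding dimension below are hypothetical): instead of assigning each operation a fixed one-hot vector, each operation index looks up a row in an embedding table that is trained jointly with the rest of the graph encoder, so related operations can end up with similar vectors.

```python
# Minimal sketch of learnable operation embeddings vs. a fixed one-hot
# encoding. Hypothetical illustration only, not the paper's implementation.
import torch
import torch.nn as nn

NUM_OPS = 6    # hypothetical number of candidate operations (conv3x3, conv5x5, ...)
EMB_DIM = 16   # hypothetical embedding dimension

class NodeFeaturizer(nn.Module):
    def __init__(self, num_ops=NUM_OPS, emb_dim=EMB_DIM, learnable=True):
        super().__init__()
        self.learnable = learnable
        if learnable:
            # Learnable operation embeddings: rows are updated by backprop,
            # so operations with similar roles can drift toward similar vectors.
            self.emb = nn.Embedding(num_ops, emb_dim)
        else:
            # Fixed one-hot encoding: hand-engineered, never updated.
            # (Output dimension is num_ops here, not emb_dim.)
            self.register_buffer("emb_table", torch.eye(num_ops))

    def forward(self, op_ids):
        # op_ids: (num_nodes,) integer operation labels of one architecture
        if self.learnable:
            return self.emb(op_ids)
        return self.emb_table[op_ids]

# Usage: the featurizer output becomes the initial node features of a
# graph autoencoder over the architecture DAG.
ops = torch.tensor([0, 2, 2, 5])   # operations at each node of a toy architecture
feats = NodeFeaturizer()(ops)      # shape: (4, EMB_DIM)
print(feats.shape)
```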


Why is it important?

Neural Architecture Search (NAS) has emerged as one of the most promising fields for the efficient, automated search and generation of state-of-the-art models. However, most existing approaches either fail to capture the structural properties of the architectures or encode the operators with fixed, hand-engineered vectors that cannot exploit information from the data. In this paper, the authors propose operation embeddings: continuous representations of the applied operators that can be integrated into various graph autoencoders as learnable parameters. They demonstrate that these learnable representations of the operations lead to the generation of state-of-the-art architectures. They also observe that the top-performing architectures share similar structural patterns with respect to two graph properties: the clustering coefficient and the average path length.
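Both properties are standard graph statistics and can be computed on an architecture's graph with off-the-shelf tools. The snippet below is a hypothetical illustration of how one might measure them (the toy DAG is invented, not taken from the paper):

```python
# Illustration of the two structural properties on a toy architecture DAG.
# Hypothetical example graph; not data from the paper.
import networkx as nx

# Toy architecture DAG: node 0 = input, node 4 = output.
arch = nx.DiGraph([(0, 1), (0, 2), (1, 2), (1, 3), (2, 3), (3, 4)])

# Both metrics are computed here on the undirected skeleton of the DAG.
g = arch.to_undirected()
print("clustering coefficient:", nx.average_clustering(g))
print("average path length:  ", nx.average_shortest_path_length(g))
```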

Read the Original

This page is a summary of: Graph-based Neural Architecture Search with Operation Embeddings, October 2021, Institute of Electrical & Electronics Engineers (IEEE), DOI: 10.1109/ICCVW54120.2021.00048.
