Order Matters: Sequence to Sequence for Sets

Sequences have become first-class citizens in supervised learning thanks to the resurgence of recurrent neural networks. Many complex tasks that require mapping from or to a sequence of observations can now be formulated with the sequence-to-sequence (seq2seq) framework, which employs the chain rule to efficiently represent the joint probability of sequences. In many cases, however, variable-sized inputs and/or outputs might not be naturally expressed as sequences. For instance, it is not clear how to input a set of numbers into a model where the task is to sort them; similarly, we do not know how to organize outputs when they correspond to random variables and the task is to model their unknown joint probability. In this paper, we first show using various examples that the order in which we organize input and/or output data matters significantly when learning an underlying model. We then discuss an extension of the seq2seq framework that goes beyond sequences and handles input sets in a principled way. In addition, we propose a loss which, by searching over possible orders during training, deals with the lack of structure of output sets. We show empirical evidence of our claims regarding ordering, and on the modifications to the seq2seq framework, on benchmark language modeling and parsing tasks, as well as two artificial tasks – sorting numbers and estimating the joint probability of unknown graphical models.
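Two ideas from the abstract can be made concrete with a short sketch. The notation below is ours, chosen only for illustration; the paper's exact objective may differ (it also considers sampling orders rather than always taking the best one):

```latex
% Chain-rule factorization used by seq2seq models to represent a joint probability:
P(Y \mid X) = \prod_{t=1}^{T} P(y_t \mid y_1, \ldots, y_{t-1}, X)

% One hedged reading of "searching over possible orders during training":
% choose, per example, the ordering \pi of the n unordered targets that the
% current model scores highest, and train on that ordering.
\mathcal{L}(\theta) = - \sum_{(X, Y)} \max_{\pi} \log P\!\left(Y_{\pi(1)}, \ldots, Y_{\pi(n)} \mid X; \theta\right)
```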
This paper explores how the order of inputs and outputs affects the performance of sequence-to-sequence (seq2seq) models, even when the underlying data is unordered (e.g., sets). It introduces architectural extensions such as the Read-Process-Write model and proposes a training approach that searches over output permutations to improve learning. The paper shows that the choice of ordering has a significant impact on performance in tasks such as language modeling, parsing, and combinatorial problems like sorting. This work highlights the importance of considering input/output ordering in model design and has influenced further research in permutation-invariant architectures.
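A minimal sketch of the content-based attention that makes the "Process" step insensitive to input order; the dot-product scoring, toy shapes, and all names here are illustrative assumptions rather than the paper's actual code:

```python
import numpy as np

def process_step(memories, query):
    """One content-based attention step over an unordered set of memories.

    memories: (n, d) embeddings of the input set elements
    query:    (d,)   current recurrent state used to attend over the set
    """
    scores = memories @ query                  # similarity of each element to the query
    attn = np.exp(scores - scores.max())
    attn /= attn.sum()                         # softmax over the set elements
    readout = attn @ memories                  # weighted summary of the set
    return np.concatenate([query, readout])    # fed back to the recurrent core

# Shuffling the set leaves the result unchanged (up to float error),
# which is the order-invariance property the architecture relies on.
mem = np.random.randn(5, 4)
q = np.random.randn(4)
out_a = process_step(mem, q)
out_b = process_step(mem[np.random.permutation(5)], q)
assert np.allclose(out_a, out_b)
```

Because the softmax-weighted readout sums over all elements of the set, the summary depends only on the contents of the set, not on the order in which its elements are listed.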
Information
- Website: arxiv.org
- Authors: Oriol Vinyals, Samy Bengio, Manjunath Kudlur
- Published date: 2015/11/19