Forthcoming
Structured Representation Learning
Yue Song • Thomas Anderson Keller • Nicu Sebe • Max Welling
Hardcover
SEK 999
This book introduces approaches that generalize the benefits of equivariant deep learning to a broader set of learned structures through learned homomorphisms. In machine learning, the idea of incorporating knowledge of data symmetries into artificial neural networks is known as equivariant deep learning and has led to cutting-edge architectures for processing image and physical data. The power of these models comes from data-specific structures built into them through careful engineering. To date, however, practitioners can build such structure into models only when the data exactly obey specific mathematical symmetries. The authors discuss naturally inspired inductive biases, specifically those that may provide efficiency and generalization benefits through what are known as homomorphic representations, a new general type of structured representation inspired by techniques from physics and neuroscience. Some of the first attempts at building models with learned homomorphic representations are reviewed. The authors demonstrate that these inductive biases improve the ability of models to represent natural transformations and ultimately pave the way toward efficient and effective artificial neural networks.
- Format: Hardcover
- ISBN: 9783031881107
- Language: English
- Number of pages: 140
- Publication date: 2025-05-08
- Publisher: Springer International Publishing AG