- Authors:
- Irina Higgins, DeepMind, London
- Peter Wirnsberger, DeepMind, London
- Andrew Jaegle, DeepMind, London
- Aleksandar Botev, DeepMind, London
NIPS '21: Proceedings of the 35th International Conference on Neural Information Processing Systems, December 2021, Article No. 1960, Pages 25591–25605
Published: 10 June 2024
SyMetric: Measuring the Quality of Learnt Hamiltonian Dynamics Inferred from Vision
ABSTRACT
A recently proposed class of models attempts to learn latent dynamics from high-dimensional observations, like images, using priors informed by Hamiltonian mechanics. While these models have important potential applications in areas like robotics or autonomous driving, there is currently no good way to evaluate their performance: existing methods primarily rely on image reconstruction quality, which does not always reflect the quality of the learnt latent dynamics. In this work, we empirically highlight the problems with the existing measures and develop a set of new measures, including a binary indicator of whether the underlying Hamiltonian dynamics have been faithfully captured, which we call Symplecticity Metric or SyMetric. Our measures take advantage of the known properties of Hamiltonian dynamics and are more discriminative of the model's ability to capture the underlying dynamics than reconstruction error. Using SyMetric, we identify a set of architectural choices that significantly improve the performance of a previously proposed model for inferring latent dynamics from pixels, the Hamiltonian Generative Network (HGN). Unlike the original HGN, the new HGN++ is able to discover an interpretable phase space with physically meaningful latents on some datasets. Furthermore, it is stable for significantly longer rollouts on a diverse range of 13 datasets, producing rollouts of essentially infinite length both forward and backwards in time with no degradation in quality on a subset of the datasets.
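The SyMetric indicator exploits a defining property of Hamiltonian flows: they are symplectic, meaning the Jacobian J of the time-t flow map satisfies JᵀΩJ = Ω, where Ω is the canonical symplectic form. As a minimal illustrative sketch (not the paper's actual metric, which operates on latents inferred from pixels), one can verify this condition numerically for a known symplectic integrator, here a leapfrog step for a unit-mass harmonic oscillator:

```python
import numpy as np

def leapfrog(q, p, dt=0.1):
    """One leapfrog step for H(q, p) = p**2/2 + q**2/2."""
    p = p - 0.5 * dt * q   # half kick: dp = -dH/dq * dt/2
    q = q + dt * p         # drift:     dq =  dH/dp * dt
    p = p - 0.5 * dt * q   # half kick
    return q, p

def flow_jacobian(step, q, p, eps=1e-6):
    """Finite-difference Jacobian of the flow map (q, p) -> step(q, p)."""
    base = np.array(step(q, p))
    J = np.zeros((2, 2))
    for i, dz in enumerate(np.eye(2) * eps):
        pert = np.array(step(q + dz[0], p + dz[1]))
        J[:, i] = (pert - base) / eps
    return J

# Canonical symplectic form for one degree of freedom.
Omega = np.array([[0.0, 1.0], [-1.0, 0.0]])

J = flow_jacobian(leapfrog, q=0.3, p=-0.7)
residual = np.linalg.norm(J.T @ Omega @ J - Omega)
print(residual)  # near zero: the leapfrog map is symplectic
```

A non-symplectic map (e.g. forward Euler for the same system) would leave a residual that grows with the step size, which is the kind of signal a symplecticity-based score can detect where pixel reconstruction error cannot.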
Supplemental Material
Available for Download
3540261.3542221_supp.pdf (521.6 KB)
Supplemental material.
Published in
NIPS '21: Proceedings of the 35th International Conference on Neural Information Processing Systems
December 2021, 30517 pages
ISBN: 9781713845393
Editors: M. Ranzato, A. Beygelzimer, Y. Dauphin, P.S. Liang, J. Wortman Vaughan
Copyright © 2021 Neural Information Processing Systems Foundation, Inc.
Publisher: Curran Associates Inc., Red Hook, NY, United States
Qualifiers
- research-article
- Research
- Refereed limited