Large-step neural network for learning the symplectic evolution from partitioned data

Xin Li, Jian Li*, Zhihong Jeff Xia, Nikolaos Georgakarakos

*Corresponding author for this work

Research output: Contribution to journal · Article · peer-review

1 Scopus citation


In this study, we focus on learning Hamiltonian systems, which involves predicting the coordinate (q) and momentum (p) variables generated by a symplectic mapping. Following Chen & Tao (2021), the symplectic mapping is represented by a generating function. To extend the prediction time span, we develop a new learning scheme by splitting the time series (q, p) into several partitions. We then train a large-step neural network (LSNN) to approximate the generating function between the first partition (i.e. the initial condition) and each one of the remaining partitions. This partition approach enables our LSNN to effectively suppress the accumulated error when predicting the system evolution. We then train the LSNN to learn the motions of the 2:3 resonant Kuiper belt objects over a long time period of 25 000 yr. The results show two significant improvements over the neural network constructed in our previous work: (1) the conservation of the Jacobi integral, and (2) highly accurate predictions of the orbital evolution. Overall, we propose that the designed LSNN has the potential to considerably improve predictions of the long-term evolution of more general Hamiltonian systems.
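The partitioned training layout described above can be sketched as follows. This is a minimal illustrative assumption, not the authors' code: the time series of (q, p) states is split into K partitions, and a separate large-step map is fitted from the first partition to each later one. A toy linear least-squares fit stands in for the LSNN/generating-function model, purely to show how the data pairs are formed.

```python
import numpy as np

# Hypothetical sketch of the partitioned-data scheme: split a (q, p)
# trajectory into K partitions along the time axis, then learn one
# large-step map from the first partition to each remaining partition.
# The array names, sizes, and the linear stand-in model are assumptions.

rng = np.random.default_rng(0)

T, dim = 1000, 2                       # time steps, phase-space dimension (q, p)
K = 5                                  # number of partitions
series = rng.normal(size=(T, dim))     # stand-in for a simulated trajectory

# Split the series into K equal partitions along the time axis.
parts = np.array_split(series, K)

# For each later partition k, fit a map from the first partition to it.
# A real LSNN would learn a generating function here; a linear
# least-squares fit merely illustrates the (initial, target) pairing
# that suppresses accumulated error: every map starts from partition 0.
first = parts[0]
maps = []
for k in range(1, K):
    target = parts[k][: len(first)]
    W, *_ = np.linalg.lstsq(first, target, rcond=None)
    maps.append(W)

print(len(maps))  # one learned map per later partition
```

Because every map is anchored to the initial partition rather than chained from the previous prediction, errors do not compound step by step, which is the key property the abstract attributes to the partition approach.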

Original language: English (US)
Pages (from-to): 1374-1385
Number of pages: 12
Journal: Monthly Notices of the Royal Astronomical Society
Issue number: 1
State: Published - Sep 1 2023


Keywords

  • celestial mechanics
  • Kuiper belt: general
  • methods: miscellaneous
  • planets and satellites: dynamical evolution and stability

ASJC Scopus subject areas

  • Astronomy and Astrophysics
  • Space and Planetary Science


