Poster Title:  Recurrent Neural Network-based linear embeddings for time evolution of non-linear dynamics
Poster Abstract: 

In modern dynamical system modelling, finding coordinate transformations that represent highly non-linear dynamics in terms of approximately linear dynamics is crucial for enabling non-linear control, estimation, and prediction. Renewed interest in Koopman operator theory has shown that its eigenfunctions can provide coordinates that intrinsically linearize the global dynamics, but finding and representing such eigenfunctions remains challenging. The present work leverages deep learning methods, specifically Recurrent Neural Networks (RNNs), to discover Koopman eigenfunction representations, exploiting the ability of RNNs to model temporal dependencies and thereby allow multi-step evolution of the dynamics, since long-horizon forecasting for such systems remains a major challenge. This work builds incrementally on a network architecture that is interpretable in terms of Koopman theory and parsimonious, adding the interpretability that deep learning architectures typically lack while capturing the fewest meaningful eigenfunctions. Further challenges related to modelling such architectures are discussed as future work.
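The poster does not include code, but the general idea it describes can be illustrated with a minimal sketch: an RNN encoder maps a short history of states into latent coordinates intended to play the role of Koopman eigenfunction values, a single linear operator advances those coordinates in time, and a decoder maps them back to the original state space for multi-step forecasts. The sketch below assumes PyTorch; all module names, dimensions, and the toy training data are hypothetical and are not taken from the poster.

import torch
import torch.nn as nn


class RNNKoopmanEmbedding(nn.Module):
    """Illustrative RNN-based linear embedding (not the authors' implementation)."""

    def __init__(self, state_dim: int, latent_dim: int, hidden_dim: int = 64):
        super().__init__()
        # RNN encoder: consumes a window of past states, producing a summary of
        # the temporal dependencies, which is projected to latent coordinates.
        self.rnn = nn.GRU(state_dim, hidden_dim, batch_first=True)
        self.to_latent = nn.Linear(hidden_dim, latent_dim)
        # Linear Koopman-like operator acting on the latent coordinates.
        self.K = nn.Linear(latent_dim, latent_dim, bias=False)
        # Decoder back to the original (non-linear) state space.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, state_dim),
        )

    def encode(self, x_window: torch.Tensor) -> torch.Tensor:
        # x_window: (batch, window, state_dim) -> y: (batch, latent_dim)
        _, h = self.rnn(x_window)
        return self.to_latent(h[-1])

    def forecast(self, x_window: torch.Tensor, steps: int) -> torch.Tensor:
        # Multi-step prediction: evolve linearly in latent space, decode each step.
        y = self.encode(x_window)
        preds = []
        for _ in range(steps):
            y = self.K(y)                  # linear time evolution in the embedding
            preds.append(self.decoder(y))  # map back to the physical state
        return torch.stack(preds, dim=1)   # (batch, steps, state_dim)


if __name__ == "__main__":
    # Toy usage: random tensors stand in for trajectories of a non-linear system.
    model = RNNKoopmanEmbedding(state_dim=2, latent_dim=3)
    x_hist = torch.randn(8, 10, 2)    # batch of 8 history windows of length 10
    x_future = torch.randn(8, 5, 2)   # 5 future states to fit against
    pred = model.forecast(x_hist, steps=5)
    loss = nn.functional.mse_loss(pred, x_future)  # multi-step prediction loss
    loss.backward()
    print(pred.shape, float(loss))

In practice, such architectures are typically trained with additional loss terms (e.g. reconstruction and latent-linearity losses) and with regularization that encourages a small number of meaningful eigenfunctions; those choices are design decisions beyond this sketch.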


Poster ID:  D-15
Poster File:  PDF document IHPCSS_Poster.pdf