What is it about?

Traditionally, research on Markov chains has focused primarily on theoretical properties, such as the hitting-time distribution for a given generator matrix. In practical applications, however, the dynamics of the underlying Markov system (i.e., its generator matrix) must be recovered from observable data, often gathered from only a subset of states. This raises the question of whether such partially observable data suffice to characterize the system's dynamics, and how they can be used to determine the generator matrix, which constitutes an inverse problem for Markov chains. This paper establishes a systematic framework for the inverse problem of stationary continuous- and discrete-time Markov systems, whether or not they are reversible. First, the hitting-time distribution of a stationary irreversible Markov system is derived by generalizing the reversible case. The hitting-time distribution is then used, via the taboo rate, to infer the probable n-step transitions originating from a given state. The method shows that partially observable data can identify the dynamics of a reversible Markov system, or of a subclass of stationary irreversible Markov systems. Remarkably, the approach remains applicable even when only one, or at most two, states are observable.
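To make the idea of "stay and leave data of a single state" concrete, here is a minimal simulation sketch. It uses a hypothetical 3-state generator matrix (not one from the paper) to simulate a continuous-time Markov chain, then extracts the sojourn (stay) times in one observable state and the return (hitting) times back to it; these are exactly the kinds of single-state observations the framework builds on. Note that the mean sojourn time in state 0 directly identifies the diagonal generator entry, since sojourn times are exponential with rate -Q[0,0].

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-state generator matrix (rows sum to zero);
# an illustrative toy example, not taken from the paper.
Q = np.array([[-1.0, 0.6, 0.4],
              [0.5, -1.2, 0.7],
              [0.3, 0.9, -1.2]])

def simulate_ctmc(Q, x0, n_jumps):
    """Simulate jump states and jump times of a CTMC (Gillespie scheme)."""
    states, times = [x0], [0.0]
    x = x0
    for _ in range(n_jumps):
        rate = -Q[x, x]                          # total exit rate from x
        times.append(times[-1] + rng.exponential(1.0 / rate))
        probs = Q[x].copy()
        probs[x] = 0.0                           # jump to a different state
        x = rng.choice(len(Q), p=probs / rate)
        states.append(x)
    return np.array(states), np.array(times)

def sojourn_and_return_times(states, times, s):
    """Stay (sojourn) lengths in state s and times to return to s,
    i.e. the 'stay and leave data' observable for a single state."""
    sojourns, returns = [], []
    i = 0
    while i < len(states) - 1:
        if states[i] == s:
            sojourns.append(times[i + 1] - times[i])
            j = i + 1
            while j < len(states) and states[j] != s:
                j += 1                           # wait until s is revisited
            if j < len(states):
                returns.append(times[j] - times[i + 1])
            i = j
        else:
            i += 1
    return np.array(sojourns), np.array(returns)

states, times = simulate_ctmc(Q, x0=0, n_jumps=20000)
stay, ret = sojourn_and_return_times(states, times, s=0)
# Sojourn times in state 0 are exponential with rate -Q[0,0] = 1.0,
# so the empirical mean should be close to 1.0.
print(round(stay.mean(), 2))
```

The empirical sojourn-time mean recovers -1/Q[0,0]; recovering the full generator from such single-state hitting-time data is the substantially harder problem the paper addresses.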


Why is it important?

The rapid development and application of observation methods across fields, ranging from the natural sciences to the humanities, has made large datasets widely available, and this in turn has motivated new learning algorithms and analysis approaches for studying complex systems. Accordingly, using big data to explore the dynamic properties of these systems, including their stochastic dynamics, has emerged as a new trend. Previous studies in this area have mostly used Markov processes to characterize complex systems as time-homogeneous Markov chains on a finite state space. Beyond their systematic theoretical treatment, Markov models have been applied in disciplines including chemistry, finance, epidemiology, and engineering to model stochastic systems. Remarkably, the present approach also sheds light on identifiability, clarifying which types of observable information are suitable for experimental setups or applications.

Perspectives

The article presents a systematic theory for observing Markov chains: determining the dynamics of a Markov chain from partially observable data. The significance of this approach lies in the fact that observing the stay and leave data of just one or two states is typically sufficient to infer the overall dynamic information of the system. The methodology also provides valuable insight into identifiability issues, enhances the observation and identification of stochastic dynamics, and offers guidance for experimental setups by identifying which types of information are relevant for a given application.

Xuyan Xiang
Hunan University of Arts and Science

Read the Original

This page is a summary of: Identifying the generator matrix of a stationary Markov chain using partially observable data, Chaos: An Interdisciplinary Journal of Nonlinear Science, February 2024, American Institute of Physics. DOI: 10.1063/5.0156458.
