
An overview of parameter estimation methods in the Ising model

Sara Keetelaar
University of Amsterdam

In recent years, graphical models have gained interest in psychology for modelling relationships between variables. One of the most popular models is the Ising model for binary data. The model contains parameters for the main effects of the variables as well as interaction parameters between the variables. Estimating these parameters is challenging due to the intractable normalizing constant in the probability density function. For this reason, the straightforward method of maximum likelihood estimation can only be used for small networks (up to 15 nodes). For larger networks, approximate methods are used; in particular, the joint pseudo-likelihood method is common. This method is known to be consistent, but for finite samples little is known about how well it estimates parameters. Other approximate likelihood methods are also used for parameter estimation, such as the independent conditional likelihood method, which evaluates the conditional likelihood of each variable given the rest of the variables separately, or methods that try to simplify the normalizing constant, which we call the observed population method. In this talk, an overview of all these parameter estimation methods is provided: first theoretically, by discussing the advantages and shortcomings of each method; and second through a simulation study, which provides insights into how the methods actually perform against each other under different circumstances, such as varying network structures and different graph and sample sizes.
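The abstract does not spell out the pseudo-likelihood objective, so the following is a minimal sketch of the idea, assuming binary data coded 0/1: each node's conditional probability given the remaining nodes is logistic in that node's main effect plus its weighted neighbours, and the joint pseudo-likelihood sums these conditional log-probabilities over nodes and observations. The function name and parameterization are illustrative, not taken from the talk.

```python
import numpy as np

def neg_log_pseudolikelihood(mu, sigma, X):
    """Negative joint log pseudo-likelihood of an Ising model.

    mu    : (p,) vector of main effects
    sigma : (p, p) symmetric interaction matrix with zero diagonal
    X     : (n, p) binary data matrix with entries in {0, 1}

    Illustrative sketch: under this 0/1 coding,
    P(x_i = 1 | x_rest) = logistic(mu_i + sum_j sigma_ij * x_j).
    """
    # Linear predictor of each node given all other nodes
    # (zero diagonal of sigma excludes the node itself).
    eta = mu + X @ sigma                      # shape (n, p)
    # Bernoulli log-likelihood per node and observation:
    # x_i * eta_i - log(1 + exp(eta_i)), computed stably.
    ll = X * eta - np.logaddexp(0.0, eta)
    return -ll.sum()
```

Because this objective avoids the intractable normalizing constant entirely, it can be minimized with any generic optimizer (e.g. gradient-based routines) even for networks far larger than the roughly 15-node limit of full maximum likelihood.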



Graphical models
Ising model
maximum likelihood
parameter estimation


Cite this as:

Keetelaar, S. (2023, July). An overview of parameter estimation methods in the Ising model. Abstract published at MathPsych/ICCM/EMPG 2023. Via