I have been reading the paper “Multilevel Bayesian Models of Categorical Data Annotation” by Bob Carpenter, and have been enjoying learning about Bayesian methods for modeling sets of categorizations produced by multiple raters.
The paper analyzes a dataset comprising 5 raters’ classifications of 3,869 dental x-rays for the presence or absence of dental caries. By googling around, I managed to source what I think is the same dataset from the randomLCA package in R.
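For reference, here is a sketch of how I got the data into a long format suitable for brms. This assumes the randomLCA layout of five 0/1 rating columns (V1–V5, one per dentist) plus a freq column counting how often each response pattern occurred — worth double-checking with str(dentistry) before relying on it:

```r
library(randomLCA)  # provides the dentistry dataset
library(dplyr)
library(tidyr)

data(dentistry)

# Expand frequency-weighted response patterns into one row per x-ray (item),
# then pivot to one row per (item, annotator) rating.
dentistry_long <- dentistry |>
  mutate(pattern = row_number()) |>
  uncount(freq, .id = "copy") |>
  mutate(item = paste(pattern, copy, sep = "_")) |>
  pivot_longer(V1:V5, names_to = "annotator", values_to = "y") |>
  mutate(n = 1)
```

With one rating per row, y is 0/1 and n = 1; for the beta-binomial models you may instead want to aggregate y and n per annotator or per item before fitting.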
To fit the four models described in the paper using brms, you can use the following code:
Model 4.1: Binomial model fit
library(brms)
# Define the formula for the model
formula_4.1 <- bf(y | trials(n) ~ 1)

# Fit the model using brms
fit_4.1 <- brm(
  formula_4.1,
  data = dentistry,
  family = binomial(),
  prior = set_prior("normal(0, 1)", class = "Intercept")
)
Model 4.2: Beta-Binomial by Annotator Fit
# Define the formula for the model
formula_4.2 <- bf(y | trials(n) ~ 1 + (1 | annotator))

# Fit the model using brms
fit_4.2 <- brm(
  formula_4.2,
  data = dentistry,
  family = betabinomial(),
  prior = set_prior("normal(0, 1)", class = "Intercept"),
  control = list(adapt_delta = 0.99)
)
Model 4.3: Beta-Binomial by Item Fit
# Define the formula for the model
formula_4.3 <- bf(y | trials(n) ~ 1 + (1 | item))

# Fit the model using brms
fit_4.3 <- brm(
  formula_4.3,
  data = dentistry,
  family = betabinomial(),
  prior = set_prior("normal(0, 1)", class = "Intercept"),
  control = list(adapt_delta = 0.99)
)
Model 4.4: Logistic Fit
# Define the formula for the model
formula_4.4 <- bf(y | trials(n) ~ 1 + (1 | item) + (1 | annotator))

# Fit the model using brms
fit_4.4 <- brm(
  formula_4.4,
  data = dentistry,
  family = binomial(),
  prior = set_prior("normal(0, 1)", class = "Intercept"),
  control = list(adapt_delta = 0.99)
)
Note that for the beta-binomial models (4.2 and 4.3), we use the betabinomial() family function in brms. Also, we set adapt_delta = 0.99 in the control argument to reduce divergent transitions and help the sampler explore the posterior reliably.
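Once the four fits have finished sampling, a natural next step is to compare them with approximate leave-one-out cross-validation via brms's loo() and loo_compare() (this assumes the fit objects fit_4.1 through fit_4.4 from above):

```r
library(brms)

# Approximate leave-one-out cross-validation for each fitted model
loo_4.1 <- loo(fit_4.1)
loo_4.2 <- loo(fit_4.2)
loo_4.3 <- loo(fit_4.3)
loo_4.4 <- loo(fit_4.4)

# Rank the models by expected log predictive density;
# the top row is the model with the best estimated predictive performance
loo_compare(loo_4.1, loo_4.2, loo_4.3, loo_4.4)
```

Keep an eye on the Pareto k diagnostics that loo() reports; high values would suggest the approximation is unreliable for some observations.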
I hope this helps you with your learning! Let me know if you have any further questions.