Scale analysis (statistics)

In statistics, scale analysis is a set of methods for analyzing survey data in which responses to questions are combined to measure a latent variable. The items can be dichotomous (e.g. yes/no, agree/disagree, correct/incorrect) or polytomous (e.g. disagree strongly/disagree/neutral/agree/agree strongly). Any measurement based on such data is required to be reliable, valid, and homogeneous, with comparable results across different studies.

Constructing scales

The item-total correlation approach is a way of identifying a group of questions whose responses can be combined into a single measure or scale. This simple approach works by checking that, across the whole sample, responses to the questions in the group tend to vary together and, in particular, that no individual question's responses are poorly correlated with the total (or average) computed from the remaining questions.
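
A minimal sketch of this check in Python is shown below, assuming responses is a respondents-by-items NumPy array of numeric item scores; the array name and the 0.3 cut-off are illustrative assumptions, not part of any standard.

    import numpy as np

    def corrected_item_total_correlations(responses):
        """Correlate each item with the sum of the remaining items."""
        n_items = responses.shape[1]
        correlations = np.empty(n_items)
        for i in range(n_items):
            # Total score over all items except item i (the "corrected" total).
            rest_total = np.delete(responses, i, axis=1).sum(axis=1)
            correlations[i] = np.corrcoef(responses[:, i], rest_total)[0, 1]
        return correlations

    # Illustrative use: flag items that relate poorly to the rest of the scale.
    responses = np.array([[1, 0, 1, 1],
                          [0, 0, 0, 1],
                          [1, 1, 1, 0],
                          [1, 1, 1, 1],
                          [0, 0, 1, 0]])
    poorly_related = np.where(corrected_item_total_correlations(responses) < 0.3)[0]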

Measurement models

Measurement is the assignment of numbers to objects in such a way that the relations between the objects are represented by the relations between the numbers (Michell, 1990).

Traditional models

  • Likert scale
  • Reliability analysis, see also Classical test theory and Cronbach's alpha (a computational sketch follows this list)
  • Factor analysis
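
As a concrete illustration of reliability analysis, Cronbach's alpha can be computed as alpha = k/(k - 1) * (1 - sum of item variances / variance of the total score), where k is the number of items. The Python sketch below assumes the same kind of respondents-by-items array as above; the function and variable names are illustrative.

    import numpy as np

    def cronbach_alpha(responses):
        """Cronbach's alpha for a respondents-by-items score matrix."""
        k = responses.shape[1]                         # number of items
        item_variances = responses.var(axis=0, ddof=1)
        total_variance = responses.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)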

Modern models based on item response theory

  • Guttman scale
  • Mokken scale
  • Rasch model (a sketch of its item response function follows this list)
  • (Circular) Unfolding analysis
  • Circumplex model
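
To make the item response theory models above more concrete, the dichotomous Rasch model gives the probability of a positive response as P(X = 1) = exp(theta - b) / (1 + exp(theta - b)), where theta is the person's ability and b is the item's difficulty. A minimal sketch, with illustrative parameter values only:

    import numpy as np

    def rasch_probability(theta, difficulty):
        """Response probability under the dichotomous Rasch model."""
        return 1.0 / (1.0 + np.exp(-(theta - difficulty)))

    # Illustrative values: one person of ability 0.5, four items of increasing difficulty.
    probabilities = rasch_probability(0.5, np.array([-1.0, 0.0, 1.0, 2.0]))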

Other models

  • Latent class analysis
  • Multidimensional scaling
  • NOMINATE (scaling method)