Tuesday 10 January 2017

Meng's Test for the Heterogeneity of Correlated Correlation Coefficients Made Easier in R


Meng's test for the heterogeneity of correlated correlation coefficients (which appears in Meng, Rosenthal, & Rubin, 1992) is potentially of interest to anyone with multiple predictor variables (Xs) and a single outcome variable (Y). That's basically all and sundry. Unfortunately, the test is not, to my knowledge, implemented in any of the common statistical programs used in research psychology (e.g. SPSS, SAS), or in R, which means one has no option but to do it by hand (although I have heard it on the internet message board grapevine that someone may have written MATLAB code for it). This just isn't cricket (a game a bit, but not much, like baseball, if you're American) in 2017.

Now, I really wanted to use this test in my research. So, with necessity being the mother of invention and all, I set about writing an R script to make it easier. After some teething problems, below is that code (slightly rough and ready, but it works). It's an R script for the example test provided in Meng, Rosenthal, & Rubin (1992), following equations 2, 3, and 5 in that paper. It's contingent on the psych package being installed and loaded (install.packages("psych"); library(psych)), so make sure you do that first. If you put it into R, you will see that you get the same result as Meng et al., which makes me pretty confident it's true to the equations. Modifications to this code will allow you to carry out the test in a fraction of the time it would take by hand. For now, all you need to enter is the collection of correlation coefficients, the median inter-correlation between the Xs, and the sample size. I aim to automate this from a .csv file, but for now you will have to do some of the work yourself. Work the first two things out using your favored data analysis program (Update 26/01/17: see my other blog post to find out how to do these things quickly). The sample size is something you will already know.
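To show the shape of the calculation, here is a minimal sketch following equations 2, 3, and 5 as I read them. The function name and argument names are my own inventions, and I use base R's atanh() in place of psych's fisherz() (they compute the same transform), so this sketch runs without any packages:

```r
# Sketch of Meng, Rosenthal, & Rubin's (1992) heterogeneity test.
# Function and argument names are my own; atanh() is Fisher's z transform.
meng_heterogeneity <- function(rs, rx, n) {
  # rs: correlations between each predictor (X) and the outcome (Y)
  # rx: median inter-correlation among the Xs
  # n : sample size
  z     <- atanh(rs)                             # Fisher's z of each r
  rbar2 <- mean(rs^2)                            # mean squared correlation
  f     <- min((1 - rx) / (2 * (1 - rbar2)), 1)  # f, capped at 1
  h     <- (1 - f * rbar2) / (1 - rbar2)
  chisq <- (n - 3) * sum((z - mean(z))^2) / ((1 - rx) * h)
  df    <- length(rs) - 1
  list(chisq = chisq, df = df,
       p = pchisq(chisq, df, lower.tail = FALSE))
}
```

A quick sanity check: identical correlations give a statistic of exactly zero, and any spread among the correlations gives a positive statistic.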

Then use a chi-square checker (plenty are available online) to check the significance level of the chi-square that the procedure spews out at the end (which will be 5.71 for the Meng example).
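Alternatively, R's own pchisq() gives you the p-value directly, so you never need to leave R. The degrees of freedom are k - 1, where k is the number of correlations; the df = 3 below is just an assumption for illustration:

```r
# Upper-tail p-value for the chi-square from the heterogeneity test.
# df = k - 1, where k is the number of correlated correlation coefficients;
# df = 3 here is an illustrative assumption.
pchisq(5.71, df = 3, lower.tail = FALSE)
```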

8 comments:

  1. So after you have the chi-square, do you follow it up by comparing pairs of correlations? What is the proper thing to do?

    1. Hello Julie,

      Thanks for the comment! In the paper, Meng and colleagues provide formulas for z-tests for the difference between correlated correlation coefficients that can be used for follow-up analyses.

      That formula and an R script for its implementation can be found on my other blog here http://adampegler.blogspot.co.uk/2017/02/test-contrasts-among-set-of-correlated.html

      This chi-square test can be thought of a bit like the F-test in ANOVA, I suppose, and you can follow it up with post-hoc tests. But I think your analysis should be guided by your hypotheses. If you only hypothesize that the correlations will be heterogeneous, perhaps the heterogeneity test is enough. Have specific hypotheses about which correlations in particular should differ? Do a follow-up z-test as well.
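      As a taster, here is a rough sketch of that z-test for two correlated correlations (both Xs correlated with the same Y), following equation 1 of the paper as I read it; the function and argument names are my own:

```r
# Sketch of the z-test for the difference between two correlated
# correlation coefficients. Function and argument names are my own.
meng_ztest <- function(r1, r2, rx, n) {
  # r1, r2: correlations of X1 and X2 with the same Y
  # rx    : correlation between X1 and X2
  # n     : sample size
  rbar2 <- mean(c(r1, r2)^2)
  f     <- min((1 - rx) / (2 * (1 - rbar2)), 1)
  h     <- (1 - f * rbar2) / (1 - rbar2)
  z     <- (atanh(r1) - atanh(r2)) * sqrt((n - 3) / (2 * (1 - rx) * h))
  c(z = z, p = 2 * pnorm(-abs(z)))  # two-tailed p-value
}
```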

      A classic paper for all things to do with running tests between correlation coefficients is Steiger (1980), which you can get here http://l.academicdirect.org/Horticulture/GAs/Refs/Steiger_1980.pdf

      Hope that helps. Good luck with your research

      Adam

  2. Hi Adam, your blog is very helpful.
    I calculated contrasts (z-tests) using your syntax. I have three correlations that I want to compare. So after I did the first comparison, do I just rerun the syntax with changed weights?
    Also, how do I know the df for my chi-square for heterogeneity? I know these questions sound stupid, I just don't want to make a mistake.
    Thanks for the advice and for pointing out that paper :-)

  3. Hello again, Julie!

    Yes, just rerun the syntax, changing the weights as appropriate. As you probably already know, they need to sum to 0. If you have 3 correlations, say .45, .47, and .20, and want to contrast the first two with the last, the weights would be 1, 1, -2 (for example). Or, to contrast only the first with the last: 1, 0, -1 (for example).

    Your degrees of freedom are k - 1, where k is the number of correlated correlation coefficients you have.
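    Putting those pieces together, here is a sketch of the contrast z-test as I read the paper. Function and argument names are my own, and the rx and n in the example call are made up for illustration, since they depend on your data:

```r
# Sketch of the contrast z-test among k correlated correlations.
# The weights must sum to zero. Names are my own.
meng_contrast <- function(rs, weights, rx, n) {
  stopifnot(abs(sum(weights)) < 1e-10)  # contrast weights sum to 0
  z     <- atanh(rs)                    # Fisher's z of each r
  rbar2 <- mean(rs^2)
  f     <- min((1 - rx) / (2 * (1 - rbar2)), 1)
  h     <- (1 - f * rbar2) / (1 - rbar2)
  zstat <- sum(weights * z) /
           sqrt((1 - rx) * h * sum(weights^2) / (n - 3))
  c(z = zstat, p = 2 * pnorm(-abs(zstat)))  # two-tailed p-value
}

# e.g. contrasting the first two correlations with the third;
# rx and n here are made up, since they depend on your data:
meng_contrast(c(.45, .47, .20), c(1, 1, -2), rx = .30, n = 100)
```

    Note that with weights 1 and -1 this reduces to the two-correlation z-test from the paper.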

    Have a good rest of the week!

    Adam

    1. The Wikipedia page for contrasts is quite good, as a refresher. https://en.wikipedia.org/wiki/Contrast_(statistics)

    2. Thank you very much! For refreshing and gaining new knowledge I'm using Andy Field's textbook, which I totally recommend.
      Have a nice week too :-)
      Best,
      J.

    3. Yes! Much better idea. Field's textbooks are very good!
