
Part of the book series: Springer-Lehrbuch (SLB)


Abstract

This chapter conveys the following learning objectives: know the aims and possible applications of meta-analysis; be able to distinguish meta-analysis from narrative reviews; be able to critically evaluate published meta-analyses; be able to explain the steps in conducting a quantitative meta-analysis; and be able to explain possible errors in conducting a meta-analysis, along with corresponding countermeasures.
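The learning objectives above center on the steps of a quantitative meta-analysis. As a minimal illustrative sketch (not taken from the chapter itself), the classic fixed-effect, inverse-variance-weighted pooling of study effect sizes can be written as follows; the function name and the example data are hypothetical:

```python
import math

def fixed_effect_meta(effects, variances):
    """Fixed-effect pooling: weight each study's effect size by the
    inverse of its sampling variance, then combine."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))          # standard error of the pooled estimate
    ci = (pooled - 1.96 * se, pooled + 1.96 * se)  # approximate 95% confidence interval
    return pooled, se, ci

# Three hypothetical studies: standardized mean differences and their variances
d = [0.30, 0.50, 0.10]
v = [0.05, 0.10, 0.02]
est, se, (lo, hi) = fixed_effect_meta(d, v)
print(f"pooled d = {est:.3f}, SE = {se:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```

This sketch assumes a fixed-effect model (all studies estimate one common effect); a random-effects model would additionally estimate between-study heterogeneity before weighting.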




Copyright information

© 2016 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Döring, N., Bortz, J. (2016). Metaanalyse. In: Forschungsmethoden und Evaluation in den Sozial- und Humanwissenschaften. Springer-Lehrbuch. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-41089-5_16


  • DOI: https://doi.org/10.1007/978-3-642-41089-5_16

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-41088-8

  • Online ISBN: 978-3-642-41089-5

  • eBook Packages: Psychology (German Language)
