

Psychology articles from across Nature Portfolio

Psychology is a scientific discipline that focuses on understanding mental functions and the behaviour of individuals and groups.

Related Subjects

  • Human behaviour

Latest Research and Reviews


Dimensional Affective Sensitivity to Hormones across the Menstrual Cycle (DASH-MC): A transdiagnostic framework for ovarian steroid influences on psychopathology

  • Jessica R. Peters
  • Katja M. Schmalenberger
  • Tory A. Eisenlohr-Moul


A theoretical framework for polarization as the gradual fragmentation of a divided society

A theoretical framework informed by computational social science and social psychology explains the process of polarization as the gradual fragmentation of a divided society.

  • Ana-Maria Bliuc
  • John M. Betts
  • Mioara Cristea


Prolonged exposure to mixed reality alters task performance in the unmediated environment

  • Xiaoye Michael Wang
  • Daniel Southwick
  • Timothy N. Welsh


Prevalence and influencing factors of kinesiophobia in patients with heart disease: a meta-analysis and systematic review


Differential patch-leaving behavior during probabilistic foraging in humans and gerbils

A study comparing behavioral heuristics that influence the onset of exploratory behavior in humans and rodents during probabilistic foraging.

  • Lasse Güldener
  • Parthiban Saravanakumar
  • Stefan Pollmann


Immediate and evolving emotions among directly exposed survivors 7 years after the Oklahoma City bombing

  • E. Whitney Pollio
  • Helena Zhang
  • Carol S. North


News and Comment


Understanding cultural variation in cognition one child at a time

Cross-cultural developmental research is crucial for understanding the roots of human cognition. Although group-level analyses can reveal how culture co-varies with cognition, individual-level analyses are needed to discern how specific cultural and ecological factors influence cognitive development.

  • Manuel Bohn
  • Frankie T. K. Fong
  • Daniel B. M. Haun

Reply to ‘The language network is topographically diverse and driven by rapid syntactic inferences’

  • Evelina Fedorenko
  • Anna A. Ivanova
  • Tamar I. Regev

The language network is topographically diverse and driven by rapid syntactic inferences

  • Elliot Murphy
  • Oscar Woolnough

Commentary to the article “Compulsive sexual behavior and paraphilic interests in adults with chronic tic disorders and Tourette syndrome: a survey-based study”

  • Natalia Szejko
  • Anna Dunalska
  • Kamila Saramak


How pregnancy transforms the brain to prepare it for parenthood

It’s a transformational time long neglected by neuroscience. That is starting to change.


Loneliness limits recall of positive social experiences

  • Jenn Richler



Measurement of the Effects of School Psychological Services: A Scoping Review

School psychologists are asked to systematically evaluate the effects of their work to ensure quality standards. Given the different types of methods applied to the different users of school psychology, measuring the effects of school psychological services is a complex task. Thus, the focus of our scoping review was to systematically investigate the state of past research on the measurement of the effects of school psychological services published between 1998 and 2018 in eight major school psychology journals. Of the 5,048 peer-reviewed articles published within this period, 623 were coded by two independent raters as explicitly referring to school psychology or counseling in the school context in their titles or abstracts. However, based on full-text screening, only 22 included definitions of the effects of school psychological services or described outcomes used to evaluate such services. These findings reveal that the measurement of the effects of school psychological services has not been a focus of research, despite its relevance in guidelines for school psychological practice.

Introduction

School psychology is an independent and applied field of psychology concerned with providing mental health services to students, their families, teachers, school principals and other school staff (Jimerson et al., 2007; Bundesverband Deutscher Psychologinnen und Psychologen, 2015; American Psychological Association, 2020). According to the APA, “school psychologists are prepared to intervene at the individual and system level, and develop, implement and evaluate programs to promote positive learning environments for children and youth from diverse backgrounds, and to ensure equal access to effective educational and psychological services that promote health development” (American Psychological Association, 2020). They promote “healthy growth and development of children and youth in a range of contexts” that impact instruction, learning or school behavior by providing services such as “assessment and evaluation of individuals and instructional and organizational environments,” “prevention and intervention programs,” “crisis intervention,” “consultations with teachers, parents, administrators, and other health service providers,” “supervision of psychological services” and “professional development programs” (American Psychological Association, 2020).

While much of the available scientific literature on the scope of school psychology is consistent with the above-mentioned definition of the APA and reflects common practice in the U.S., there are considerable differences among countries (see Jimerson et al., 2007 for an extensive overview of school psychology across 43 countries). Consistent with this, the International School Psychology Association (ISPA) states that “the term school psychology is used in a general form to refer to professionals prepared in psychology and education and who are recognized as specialists in the provision of psychological services to children and youth within the context of schools, families, and other settings that impact their growth and development. As such, the term also refers to and is meant to include educational psychologists (…)” (International School Psychology Association, 2021). For example, in England, Wales, and Hong Kong the term “educational psychologists” is used as an equivalent to the term “school psychologists” (Lam, 2007; Squires and Farrell, 2007). In this review we use the term “school psychology” in accordance with the above-mentioned definition provided by the International School Psychology Association (2021).

A particular characteristic of school psychology is its multifaceted nature. Practitioners in this discipline cater for the needs of different types of users (e.g., students, teachers, parents) by relying on diverse methods (e.g., counseling of individuals or groups of students or teachers, screening and diagnostics, training, consultation). Moreover, school psychologists address a broad range of psychological needs (e.g., academic achievement, mental health, and behavior) and support conducive learning environments and practices (Bundesverband Deutscher Psychologinnen und Psychologen, 2015; American Psychological Association, 2020). They provide student-level services (i.e., educational and mental health interventions and services offered for small groups or individuals) as well as system-level services (e.g., implementation and evaluation of programs for teachers). Student-level services are, however, not limited to direct interventions with students. In order to support students effectively, it is often necessary to work with the adults who play a significant role in students' lives. School psychologists, therefore, often rely on indirect services with parents and teachers to promote well-being in students. These mediated actions that underlie indirect service delivery models of school psychology are often referred to as the “paradox of school psychology” following Gutkin and Conoley (1990). Again, there are considerable differences among and within countries with respect to the extent to which school psychologists engage in these direct and indirect services. For instance, in some regions of the U.S., school psychologists are mostly responsible for providing direct services like psychoeducational assessments, while a related professional group of so-called “school counselors” engages in indirect services such as consultation. In contrast, in a study by Bahr et al. (2017), school psychologists from three Midwestern U.S. states reported that “problem-solving consultation was the activity on which they spent the greatest amount of their time” (p. 581). Recent developments have extended the role of school psychologists to also provide system-level services that aim to support the organizational development of schools. In this context, a lot of emphasis is being placed on preventive work and on interventions with parents, educators, and other professionals that intend to create supportive learning and social environments for students (Burns, 2011; Skalski et al., 2015).

Professional associations in different countries have attempted to summarize the above-mentioned multifaceted nature of school psychology in practical frameworks and/or models. For instance, the Model for Services by School Psychologists by the National Association of School Psychologists in the U.S. (Skalski et al., 2015) distinguishes between student-level services (i.e., interventions and instructional support to develop academic skills and interventions and mental health services to develop social and life skills) and systems-level services (i.e., school-wide practices to promote learning, prevention, responsive services and family-school collaboration services). Similarly, the Department of School Psychology of the Professional Association of German Psychologists (Bundesverband Deutscher Psychologinnen und Psychologen, 2015) states that school psychologists provide support for students through individual counseling in cases of learning, developmental, and behavioral problems (e.g., fostering gifted students, identifying special needs in inclusive schools, etc.), as well as for schools through system-level consultation (e.g., development toward inclusive schools, violence prevention, etc.).

There have also been several theoretical proposals to conceptualize the manifold field of school psychology. For example, Nastasi (2000) suggested that this subdiscipline should be understood as a comprehensive health care service that ranges from prevention to treatment. Also, Sheridan and Gutkin (2000) advocated for an ecological framework of service delivery that takes into account multiple eco-systemic levels. Moreover, Strein et al. (2003) acknowledged that thinking of school psychology in terms of direct and indirect services has advanced the field notably. They suggested broadening the framework even further to adopt a public health perspective that evaluates both individual- and system-level outcomes. Although there are, undoubtedly, many differences between the way school psychology is understood and practiced in different countries, there seems to be consensus that the broad distinctions between individual- vs. system-level and direct vs. indirect services represent an essential part of the scope of school psychological services (cf. Jimerson et al., 2007).

Just like other health professionals, school psychologists are expected to rely on evidence-based practices and evaluate the effects of their interventions to ensure quality standards (Hoagwood and Johnson, 2003; Kratochwill and Shernoff, 2004; White and Kratochwill, 2005; Forman et al., 2013; Morrison, 2013). For instance, some of the examples of professional practice included in the domain of Research and Program Evaluation of the Model for Services by School Psychologists by the NASP (Skalski et al., 2015) in the U.S. are “using research findings as the foundation for effective service delivery” and “using techniques of data collection to evaluate services at the individual, group, and systems levels” (Skalski et al., 2015, p. I-5). Similarly, the professional profile of the Department of School Psychology of the Professional Association of German Psychologists (Bundesverband Deutscher Psychologinnen und Psychologen, 2015) mentions that school psychologists engage in regular measures of documentation, evaluation and quality assurance of their work in collaboration with their superiors (Bundesverband Deutscher Psychologinnen und Psychologen, 2015, p. 6).

Measuring the effects of service delivery is, however, a very complex task that needs to encompass the multiple dimensions involved in school psychological practice, given the multifaceted nature of school psychology described above. This makes it difficult to define the effects of school psychological services and to derive recommendations on how to operationalize and measure them in practice. Practical guidelines and/or models, such as the ones mentioned above, rarely provide any specifications about the designs, instruments, or outcome variables that should be used to implement service evaluations. Results of a survey on contemporary practices in U.S. school psychological services showed that 40% of respondents reported using teacher or student reports (verbal and written), observational data, and single-subject design procedures to evaluate their services (Bramlett et al., 2002). In contrast, the results of the International School Psychology Survey indicate that this is not the international standard. School psychologists in many countries complain about the lack of research on and evaluation of school psychological services in their country (Jimerson et al., 2004) or even express the need for more studies on service evaluation (Jimerson et al., 2006). Although the survey did not collect information about (self-)evaluation practices, its results suggest huge variability in evaluation and, thereby, in the understanding of the effects of school psychological services.

Furthermore, attempts to define the term “effects of school psychological services” vary considerably and imply different ways of operationalizing the measurement of effects. Some approaches define effects as significant changes in outcome variables that are thought to be a consequence of service delivery. Other approaches define effects as the impact of interventions as perceived by school psychologists themselves and/or clients (i.e., consumer satisfaction), or follow an economic point of view describing effects in relation to costs. These diverse perspectives seem to be used in parallel or even as synonyms, although they refer to different aspects of the evaluation of school psychological services. For instance, Phillips (1998) suggested that the methods used by practicing school psychologists need to be investigated through controlled experimental evaluation studies conducted by researchers to determine changes in students, teachers, and/or other users of services. In this perspective, an effect refers to the extent to which services can achieve the expected outcome. According to Andrews (1999), this should be labeled efficacy and measured in experimental settings, ideally randomized controlled trials. In contrast to the standardized designs commonly used in empirical research, school psychological practice is often characterized by the simultaneous application of heterogeneous methods to a broad range of psychological needs with varying frequencies. Thus, as a next step, methods that showed significant results in experimental designs need to be applied by school psychologists in practice to confirm their effectiveness in real-world settings where ideal conditions cannot be assured (Andrews, 1999; White and Kratochwill, 2005; Forman et al., 2013).

From a different perspective, some authors define the effects of school psychological services as the perceived impact of services on students, teachers, and/or other users, either from the perspective of school psychologists (e.g., Manz et al., 2009) or from the users' perspectives (e.g., Sandoval and Lambert, 1977; Anthun, 2000; Farrell et al., 2005). Yet another group of studies argues that evaluation results are necessary to justify the cost-effectiveness of school psychological services. For example, Phillips (1998) stated that the “value of psychological service is measured by its efficiency, i.e., its desirable outcomes in relation to its costs” (p. 269; cf. Andrews, 1999). For this purpose, effects are often operationalized via frequency counts, like the number of tests used or the number of children screened for special education per month (Sandoval and Lambert, 1977), the number of students participating in services offered by school psychologists (Braden et al., 2001), or the time dedicated to certain activities.

Taken together, what exactly is meant when school psychologists are asked to systematically evaluate the effects of their work, and in this way ensure quality standards, seems to depend on the (often implicit) definitions and operationalizations of the measurement of effects each guideline or model adheres to. A possible reason for this variability might be the broad scope of the field of school psychology. As described in the paradox of school psychology (Gutkin and Conoley, 1990), it is mostly necessary to work with the adults playing significant roles in students' lives to support students effectively. Thus, the impact of school psychological services on students is often indirect, mediated by the actions of adults like teachers or parents who interact directly with school psychologists (Gutkin and Curtis, 2008). This poses methodological challenges for the measurement of effects of school psychological services, since both the development of students and the changes in adults can be used as outcomes to define and operationalize effects in the three aforementioned perspectives.

One way of shedding light on this matter is to rely on evidence synthesis methodologies to reach a better understanding of how effects of school psychological services have been conceptualized to date, as Burns et al. (2013) propose. These authors conducted a mega-analysis summarizing past research from 47 meta-analyses to inform practice and policy on the effectiveness of school psychological services. While their findings reveal moderate and large effects for various types of school psychological services, several questions remain unanswered with respect to the measurement of the above-mentioned effects. Burns et al. (2013) charted the available evidence in great detail, focusing on the type of intervention (e.g., domain of practice: academic, social and health wellness, etc.; tier of intervention: universal, targeted, intensive, etc.) and target population (e.g., preschool, elementary, middle or high school students). However, their focus was not on the measurement of effects itself, such as distinguishing between effects measured from the perspective of school psychologists, effects measured from the perspective of service users, and effects captured with objective assessment measures. This is important, because the efficacy of an intervention might depend on the measurement used to capture results (e.g., consumer satisfaction survey vs. standardized test). Furthermore, the evidence summarized by Burns et al. (2013) was not limited to research explicitly conducted in the field of school psychology, but extended to the broader context of educational psychology. While there are undoubtedly several relevant studies that nurture the field of school psychology without explicitly mentioning it, it is also important to highlight research that explicitly focuses on school psychology to strengthen the adoption of evidence-based practices in this field.

Building on this work, in the present review we aimed to address these gaps and contribute to a better understanding of the measurement of effects of school psychological services by systematically investigating the state of past research on this topic. We decided to adopt a scoping review approach, a type of knowledge synthesis used to summarize research findings, identify the main concepts, definitions, and studies on a topic, and determine research gaps in order to recommend or plan future research (Peters et al., 2015; Tricco et al., 2018). Compared to systematic reviews (e.g., Liberati et al., 2009), scoping reviews mostly address a broader research question and summarize different types of evidence based on heterogeneous methods and designs, and there is no need to examine the risk of bias of the included evidence in order to characterize the extent and content of research on a topic (Peters et al., 2015; Tricco et al., 2018). Thus, scoping reviews can be used to generate specific research questions and hypotheses for future systematic reviews (Tricco et al., 2016). Our review was guided by four research questions: (1) What percentage of articles in major school psychology journals focuses on the measurement of effects of school psychological services? (2) What types of articles (i.e., empirical, theoretical/conceptual, meta-analysis) have been published on this topic and what is their content? (3) How did the authors define effects of school psychological services in past research? and (4) Which instruments are used to operationalize the measurement of effects?

Methods and Procedure

We followed the methodological guidelines of the PRISMA extension for scoping reviews (PRISMA-ScR; Tricco et al., 2018) and provide details in the following sections.

Eligibility Criteria

We followed a similar procedure as Villarreal et al. (2017) and limited our search to eight major peer-reviewed school psychology journals: Canadian Journal of School Psychology, International Journal of School and Educational Psychology, Journal of Applied School Psychology, Journal of School Psychology, Psychology in the Schools, School Psychology International, School Psychology Quarterly, and School Psychology Review. Furthermore, following common practice in systematic review methodology, we focused on articles published between 1998 (the year after the National Association of School Psychologists of the U.S. published the Blueprint for Training to guide training and practice; Ysseldyke et al., 1997) and September 2018. In this way, we aimed to capture past research published in the last 20 years. We only considered reports written in English, as the review team is not proficient enough in other languages (e.g., French reports in the Canadian Journal of School Psychology) to assess eligibility for this review.

Empirical research studies (i.e., articles reporting new data from quantitative or qualitative studies using either a sample of school psychologists or a sample of teachers, principals, students or other users of school psychological services), theoretical/conceptual contributions (i.e., articles on school psychological services based on the literature), and meta-analyses/systematic reviews (i.e., articles systematically analyzing results from multiple studies within the field of school psychology) were included. Editorials, commentaries, and book and test reviews were excluded. Furthermore, to meet the eligibility criteria for this review, articles had to explicitly refer to “school psychology” or “counseling” in the school context in their title and/or abstract, and needed to address effects of school psychological services by naming at least one of the following keywords in the abstract: “evaluation/evaluate,” “effect/effects,” “effectivity,” “efficacy” or “effectiveness.” Abstracts on school psychology in which such text was limited to the discussion (e.g., “consequences for school psychology will be discussed”) were excluded. To address our third and fourth research questions, at the full-text screening stage only articles that provided definitions and/or operationalizations of the effects of school psychological services were included.
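For illustration, this two-part keyword rule can be written as a simple predicate. The actual screening was performed by hand (see the selection process below); the following Python sketch merely formalizes the stated criteria, and the pattern and function names are our own, not the authors':

    import re

    # Topic keywords: must appear in the title and/or abstract.
    TOPIC = re.compile(r"school psycholog\w*|counseling", re.IGNORECASE)
    # Effect keywords: at least one must appear in the abstract.
    EFFECT = re.compile(r"\bevaluat\w+|\beffect\w*|\befficacy\b", re.IGNORECASE)

    def passes_screening(title: str, abstract: str) -> bool:
        """Hypothetical predicate mirroring the stated eligibility keywords."""
        topic_ok = bool(TOPIC.search(title) or TOPIC.search(abstract))
        effect_ok = bool(EFFECT.search(abstract))
        return topic_ok and effect_ok

    # Example: an abstract naming counseling and effectiveness passes both checks.
    print(passes_screening(
        "Evaluating school psychological services",
        "We examine the effectiveness of counseling in schools."))  # True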

To identify the full range of evidence published on the effects of school psychological services, we included studies conducted with school psychologists as well as articles related to all kinds of direct and indirect services. No selection limitations were imposed with respect to the method, design, participants, or outcomes of the studies.

Information Sources, Search, and Selection Process

The first author and a research assistant hand-screened the journals' homepages from June to September 2018. The selection process consisted of three steps. First, screening of the titles and abstracts of all published articles was completed through independent double ratings by the first author and a research assistant to identify papers explicitly related to school psychology (search strategy: “school psychology” OR “counseling” in the school context). Articles were coded as included or excluded in an Excel sheet. Moreover, the included articles were coded as empirical, theoretical/conceptual, or meta-analysis/systematic review. Inter-rater reliability values based on Cohen's kappa were estimated with the software package psych for R (Revelle, 2018). Second, the abstracts of the selected articles, even those with inclusion ratings from only one rater, were screened once again by the first author to identify articles on effects related to school psychological services (search strategy: evaluat* OR effect* OR efficacy). Third, the full texts of all included articles were screened by the first author.
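Cohen's kappa corrects raw percent agreement for the agreement expected by chance. The authors used the psych package in R; purely as an illustration, an equivalent from-scratch computation in Python looks like this (toy data, not the review's actual ratings):

    from collections import Counter

    def cohen_kappa(rater_a, rater_b):
        """Cohen's kappa for two raters coding the same items (e.g., include/exclude)."""
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n  # percent agreement
        freq_a, freq_b = Counter(rater_a), Counter(rater_b)
        expected = sum((freq_a[c] / n) * (freq_b[c] / n)  # chance agreement
                       for c in set(freq_a) | set(freq_b))
        return (observed - expected) / (1 - expected)

    # Toy example: two raters screening five abstracts.
    a = ["include", "include", "exclude", "exclude", "include"]
    b = ["include", "exclude", "exclude", "exclude", "include"]
    print(round(cohen_kappa(a, b), 2))  # 0.62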

If access to full texts was restricted through the journals' homepages, we used electronic databases (e.g., PsycARTICLES), search engines (e.g., Google), interlibrary loan services (e.g., the Subito document delivery service), or contacted the authors via ResearchGate to gain full access.

Data Charting Process, Data Items, and Synthesis of Results

The first author charted data from the included articles in an Excel sheet consisting of the following data items: type of article (i.e., empirical, theoretical/conceptual, or meta-analytical), authors, year of publication, content (i.e., summary of the study objective, participants, and country of data collection or context referred to), definition of effects of school psychological services (i.e., explanation of outcomes used or recommended to measure effects of school psychological services), and operationalization of effects of school psychological services (i.e., instruments used or recommended to measure outcomes). Table 1 provides further details on each of these data items.

Table 1. Definitions of data items.

Data item | Definition
Empirical | The article reports on empirical data collected specifically in this study or on a secondary analysis of an existing dataset.
Theoretical/conceptual | The article does not report on an empirical research study, but rather proposes a theoretical model or framework to guide future research or practice.
Meta-analytical | The article systematically summarizes the available evidence across a set of empirical research studies.
Authors | All author names as mentioned in the article.
Year of publication | Year the study was published in the target journal, as mentioned in the article.
Summary of the study objective | Aims or objectives of the study or article as expressed by the authors.
Participants | Type of providers or users of school psychological services targeted by the article (e.g., students, parents, teachers, etc.).
Country of data collection | Country the data was collected in, or restriction of eligibility criteria to studies from certain countries (for meta-analyses).
Definition of effects of school psychological services | Outcomes used or recommended to measure effects of school psychological services.
Operationalization of effects of school psychological services | Instruments used or recommended to measure outcomes of school psychological services.
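To make the charting form concrete, the record kept for each included article can be pictured as the following sketch (a Python dataclass with illustrative field names of our own; the authors worked in an Excel sheet, not in code):

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class ChartedArticle:
        """One row of the data-charting sheet; fields mirror Table 1."""
        article_type: str            # "empirical", "theoretical/conceptual", or "meta-analytical"
        authors: str                 # all author names as mentioned in the article
        year: int                    # year of publication in the target journal
        objective: str               # summary of the study objective
        participants: Optional[str]  # providers/users targeted (students, parents, teachers, ...)
        country: Optional[str]       # country of data collection
        effect_definition: str       # outcomes used or recommended to measure effects
        operationalization: str      # instruments used or recommended to measure outcomes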

Based on the data charting form, we summarized our findings in a narrative manner and identified research clusters and gaps addressing our research questions. Specifically, for the category “effects of school psychological services,” we proposed a first categorization into thematic clusters grouping different types of definitions, based on the information we extracted from the full texts during data charting. This categorization was then double-checked against the full texts to make sure that each of the included articles was accurately allocated to one of the proposed thematic clusters.

Results

Search and Selection Process

Figure 1 presents the results of the search and selection process. Of the 5,048 references that we initially identified, 4,425 were excluded during the first round of title and abstract screening.

Figure 1. Flowchart of the scoping review procedure.

Table 2 presents further information on the articles screened from each of the eight journals targeted in this review, as well as inter-rater reliability values based on percent agreement and Cohen's kappa (two independent ratings on 5,048 reports). There was almost perfect agreement between the two raters' judgements for two journals, κ = 0.89–0.90, substantial agreement for four journals, κ = 0.72–0.79, and moderate agreement for the remaining two journals, κ = 0.44 and 0.58 (Landis and Koch, 1977). Percent agreement ranged from 68 to 98%.

Table 2. Summary of the procedure per journal (period: January 1998–September 2018).

Journal | Articles screened | Included after 1st screening | Agreement (%) | κ (CI) | Included after 2nd screening | Included after full-text screening
Canadian Journal of School Psychology | 307 | 81 | 98 | 0.90 (0.94; 0.98) | 15 | 2
International Journal of School and Educational Psychology | 149 | 43 | 98 | 0.89 (0.95; 0.99) | 14 | 1
Journal of Applied School Psychology | 344 | 48 | 88 | 0.78 (0.63; 0.93) | 6 | 2
Journal of School Psychology | 685 | 32 | 88 | 0.79 (0.61; 0.96) | 8 | 2
Psychology in the Schools | 1,430 | 220 | 87 | 0.76 (0.69; 0.83) | 20 | 7
School Psychology International | 732 | 102 | 84 | 0.72 (0.60; 0.83) | 13 | 4
School Psychology Quarterly | 645 | 38 | 74 | 0.58 (0.38; 0.78) | 9 | 3
School Psychology Review | 756 | 59 | 68 | 0.44 (0.26; 0.62) | 5 | 1
Total | 5,048 | 623 | | | 90 | 22

CI, confidence interval.
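As a reading aid for Table 2, the Landis and Koch (1977) benchmarks cited above map kappa values onto verbal labels; a small sketch (the function name is our own):

    def landis_koch_label(kappa):
        """Verbal benchmarks for Cohen's kappa following Landis and Koch (1977)."""
        if kappa < 0.0:
            return "poor"
        if kappa <= 0.20:
            return "slight"
        if kappa <= 0.40:
            return "fair"
        if kappa <= 0.60:
            return "moderate"
        if kappa <= 0.80:
            return "substantial"
        return "almost perfect"

    # Per-journal values from Table 2:
    for k in (0.90, 0.89, 0.79, 0.78, 0.76, 0.72, 0.58, 0.44):
        print(k, landis_koch_label(k))  # 0.90/0.89 almost perfect, ..., 0.58/0.44 moderate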

In the second screening stage, a further 533 articles were excluded as they did not meet the inclusion criteria for this review (see Figure 1). The remaining 90 articles moved on to the full-text screening stage. In this final step of the selection process, we identified 22 articles with explicit definitions or mentions of effects of school psychological services. The remaining 68 articles were excluded (see Figure 1).

Percentage of Articles on the Effects of School Psychological Services

To answer research question 1, two rounds of title and abstract screening revealed that 1.78% of the articles (i.e., 90 of 5,048) published between January 1998 and September 2018 in the eight peer-reviewed journals targeted by this review explicitly referred to the measurement of effects of school psychological services in their title or abstract. However, based on full-text screening, only 0.43% (i.e., 22 of 5,048) included definitions of effects of school psychological services or described outcomes used to evaluate school psychological services. See Table 2 for the number of included articles per journal.

Type and Content of Articles on the Effects of School Psychological Services

Research question 2 explored what types of articles had been published on the measurement of effects of school psychological services. Of the 22 articles that met the inclusion criteria for this review, we identified 12 empirical research reports, six theoretical/conceptual articles, and four meta-analyses published between 1998 and 2015. The content covered varied considerably from article to article and is described with a short summary of each report in Tables 3A–C.

Table 3A. Summary of included empirical articles.

Study | Design | Sample | Country | Focus | Definition of effects | Operationalization of effects
Chafouleas et al. (2002) | Survey | School psychologists (N = 189) | U.S. | Supervision and evaluation of school psychological work | Evaluation as a determination of the significance of an individual's professional skill (implicit definition) | Non-standardized questionnaire on perception of and satisfaction with current evaluation practice (unspecified number of items)
Kikas (2003) | Survey | 9th- and 12th-graders (N = 433) | Estonia | Satisfaction with school psychological service, characteristics of students visiting school psychologists | Effect = perceived quality from students' perspective; no specific criterion mentioned | Non-standardized questionnaire assessing the quality of school psychological work (one item)
Proctor and Steadman (2003) | Survey | School psychologists (N = 63) | U.S. | Perceived effectiveness of counseling (one aspect of the survey) | Effectiveness = school psychologists' self-disclosure on belief in making a significant difference at school/with students/with teachers | Non-standardized questionnaire on perceived effectiveness (six items)
Gilman and Gabriel (2004) | Survey | Teachers (N = 1,533) and administrators (N = 90) | U.S. | Satisfaction and helpfulness of school psychological services for students and educators | Effect = perceived quality from teachers' and administrators' perspective; no specific criterion mentioned | Non-standardized questionnaire assessing satisfaction with and helpfulness of school psychological work (two items)
Lepage et al. (2004) | Pre-/post-test follow-up design without control group | School psychology graduate students (N = 24) | U.S. | Evaluation of a 4-year consultation training program | Effect = behavioral change in clients (several behavioral problems were included), and preschool teachers' as well as parents' satisfaction with counseling | Non-standardized measures: repeated observations of client behavior rated by school psychologists, and questionnaire on clients' satisfaction and perceived quality (12 items)
Pérez-González et al. (2004) | Instrument development/survey | Teachers (N = 157) | Spain | Construction and validation of an instrument to evaluate school psychologists' skills relevant for teacher consultation from teachers' perception | From teachers' perspective a school psychologist should offer expert knowledge concerning interventions, and coordinate, initiate, and follow up interventions (implicit definition) | –
Stobie et al. (2005) | Survey | School psychologists (N = 31) | U.K. | Use and evaluation of solution-focused practices | Effect = attainment of short- and long-term goals, satisfaction of the client with the outcome, and client-therapist relationship | Non-standardized questionnaire concerning the criteria used to evaluate effectiveness of counseling (nine items)
Hawken (2006) | Pre-/post-test design without control group | 6th- to 8th-graders at risk for poor peer relations and low academic achievement (N = 10) | U.S. | Evaluation of a school-wide prevention program after at least 6 weeks of intervention; program was implemented by teachers, administrators, and school psychologists | Effect = decrease of discipline referrals and increase in academic achievement; no data on achievement measures presented | Number of discipline referrals per week; formative standardized measurements (e.g., teacher interviews, self-disclosure) to modify the intervention (only recommended by the authors, no actual data reported)
Yeo and Choi (2011) | Pre-/post-test follow-up control-group design | 8- to 12-year-old students at risk (N = 95) | Singapore | Evaluation of a 12-session cognitive-behavioral group therapy delivered by one school psychologist | Effects = increase of self-esteem, self-control, peer relationships, social skills, and classroom behavior | Standardized problem-specific questionnaires from several data sources at three measurement occasions [rating scales on self-esteem, impulsive and self-controlled behavior, student report (in sum 70 items) and teacher report (in sum 33 items)]
Millar et al. (2013) | Survey | School psychologists (N = 34) | Canada | Feasibility examination of a professional development training for school psychologists to provide school-based intervention services for students with anxiety symptoms | Effects = capacity of school psychologists to provide mental health services; number of students receiving evidence-based intervention | Non-standardized survey on the perceived impact of the professional development practice on school psychologists' own professional practice
Ogg et al. (2013) | Survey | School psychologists (N = 217) | U.S. | Evaluation of assessment practice with students with ADHD (one aspect of the survey) | Effect = results of progress monitoring and development of outcome variables (both formative and summative), and assessment of intervention integrity | Non-standardized questionnaire on assessment practices with ADHD students, including three items concerning outcome evaluation; standardized measurements of adaptive functions and functional behavior analysis (only recommended by the authors, no actual data reported)
Vu et al. (2013) | Randomized controlled study with control group | Teachers (N = 1,440) | U.S. | Evaluation of instruction consultation teams to support teachers with struggling students | Effect = teachers' perception of their own instructional practices, collaboration among school staff, own efficacy to affect student performance, and job satisfaction | Four adapted or self-developed scales to measure teachers' perceptions of teaching practices (18 items), collaboration (10 items), self-efficacy (16 items), and job satisfaction (4 items)

Definitions = descriptions used by the authors. If there was no explicit definition in the article, an implicit definition was concluded based on the procedure used.

Table 3B. Summary of included meta-analyses.

Study | Included studies | Period | Scope | Definition of effects | Outcome measures
Prout and Prout (1998) | School-based psychological intervention studies with pre-/post-test control-group designs (17 studies) | 1985–1994 | Direct counseling and psychotherapy group interventions; no limitations on the person carrying out the intervention; comparison of treatment types (cognitive-behavioral, relaxation, and skills) | Effect = development of symptoms related to problems to function in school | Disorder- and problem-specific outcomes (e.g., anxiety, depression, social skills) from several data sources (e.g., self-disclosure, grade or test scores, behavior observations); no limitation to standardized measures
Reddy et al. (2000) | Child and adolescent consultation studies (35 studies) | 1986–1997 | Comparison of behavioral, organizational developmental, and mental health consultation in school, home, and/or community for children from birth to 18 years of age; no limitations on design or the person carrying out the consultation | Effect = changes in behavior, mental health, or organizational communication, depending on the type of consultation, after indirect services received by parents, teachers, and other professionals | Problem-specific outcomes of clients, consultation, and systems; no limitation to standardized measures
Reese et al. (2010) | School-based pre-/post-test control-group designs reported in dissertations (65 studies) | 1998–2008 | School-based psychotherapy and counseling; theses published electronically on a PhD server; no limitations on the persons carrying out the intervention; comparison of treatment types (cognitive-behavioral, relaxation, and skills) | Effect = development of symptoms after attending counseling understood as a psychotherapeutic intervention [analogous to Prout and Prout (1998)] | Disorder- and problem-specific outcome measurements from several data sources (e.g., self-disclosure of students; reports of psychologists, teachers, parents); no limitation to standardized measures
Boudreau et al. (2015) | Studies measuring the effectiveness of peer-mediated pivotal response treatment to increase social-communication skills for children with autism spectrum disorders | 1995–2008 | Peer-mediated pivotal response treatment for school-aged children with an autism spectrum disorder diagnosis; single-subject design studies, although inclusion criteria were not limited to this study type | Effect = increases in the frequency of social-communication behavior of the target child with autism spectrum disorder | Direct behavioral observations using behavioral coding schemes, teacher questionnaires, and pre-/post-language samples

Table 3C. Summary of included theoretical or conceptual articles.

Study | Content | Definition of effects | Recommended operationalization
Bradley-Johnson and Dean (2000) | Recommendation for school psychological services to shift from single-case work to preventive system-level work with teachers and the use of a systematic evaluation of services | Effect = evaluating teaching materials and procedures, intervention programs, and services (implicit definition) | –
Durlak (2002) | Summary of the procedure used by a task force in the U.S. to evaluate the magnitude of the effects of school-related interventions | Effect of an intervention = changes in child adjustment (e.g., problems, symptoms, competences, grades) reaching important effect sizes | Recommendation to use standardized measurements of self-, parent-, and teacher ratings of symptoms and competencies as well as grades, achievement test scores, and clinical diagnostics
Strein et al. (2003) | Application of a public health model to school psychological service and research; proposed shift from single-case to system-level work | Recommendation to define effects as increase or decrease in the incidence and prevalence of outcomes at the school level (e.g., grade retention, disciplinary referrals, performance on tests) instead of evaluation of effects with students with academic and behavioral problems | –
Hughes and Theodore (2009) | Conceptual framework for how school psychologists can implement psychotherapy in school settings | Definition of school psychology as psychotherapeutic intervention to support academic and social development; effects = development of symptoms | Recommendation to use students' self-disclosure on symptom distress, behavioral observation by teachers, and standardized questionnaires
Nicholson et al. (2009) | Summary of possible negative effects of psychotherapy and counseling used by school psychologists based on reviews, intervention studies, and meta-analyses | Effects = development of symptoms | Disorder- and problem-specific outcomes (e.g., anxiety, depression, substance abuse); no mention of standardized measures
Morrison (2013) | Description of principles and methods in performance evaluation of school psychologists based on recommendations by the National Association of School Psychologists (NASP) | Effect = positive impact of school psychological services on student outcomes; recommendations for evaluating student outcomes and school psychologists' performance | Performance appraisal rubrics and rating scales to evaluate school psychologists adapted from instruments for teachers; repeated measures of students' outcomes using standardized items in single-case designs

Within the 12 empirical articles, eight used a survey technique (in one article the survey was part of an instrument development method) and four described longitudinal intervention studies. The participants of the surveys were mostly school psychologists (five of the eight articles); one survey was conducted with students and two with teachers. The four (quasi-)experimental longitudinal studies investigated the effects of interventions for students (two of four), school psychology graduate students (one of four), and teachers (one of four). Based on the different aspects of the effects of school psychological services investigated by each study, the content covered across articles can be summarized as follows: (1) articles about school psychological evaluation practices in general from school psychologists' perspective (4 of 12: Chafouleas et al., 2002; Stobie et al., 2005; Millar et al., 2013; Ogg et al., 2013), (2) one article on the evaluation of school psychological skills from teachers' perspective (Pérez-González et al., 2004), (3) articles on the efficacy and/or effectiveness of specific interventions or trainings (3 of 12: Lepage et al., 2004; Hawken, 2006; Yeo and Choi, 2011), and (4) articles about satisfaction with or perceived effectiveness of school psychological services (4 of 12: Kikas, 2003; Proctor and Steadman, 2003; Gilman and Gabriel, 2004; Vu et al., 2013).

Of the six theoretical/conceptual articles, three contained conceptual frameworks of the effects of school psychological services (Bradley-Johnson and Dean, 2000; Strein et al., 2003; Hughes and Theodore, 2009), two made methodological proposals for assessing the effects of services (Durlak, 2002; Morrison, 2013), and one critically discussed possible positive and negative effects of school psychological services (Nicholson et al., 2009).

Within the category meta-analyses, three reports summarized effect sizes from 17 empirical group studies (Prout and Prout, 1998), 65 empirical group studies (Reese et al., 2010), and five single-subject-design studies (Boudreau et al., 2015) on direct school-based psychological services, and one investigated the overall effect of 35 studies on indirect services (Reddy et al., 2000). No limitation was placed on the persons carrying out the intervention, the type of services, or the setting. Hence, there was no explicit focus on school psychology in any of the meta-analyses.

Definitions of Effects of School Psychological Services

The definitions used to operationalize effects of school psychological services within the 22 included articles (see Table 3) indicate four thematic clusters (research question 3): (1) development of various psychological outcomes at the student level, (2) development of school/system-level outcomes, (3) consumer satisfaction, and (4) no explicit definition.

In relation to the first thematic cluster, most articles (i.e., 11 of 22) defined the effects of school psychological services as the development of psychological outcomes at the student level. Three empirical articles (Lepage et al., 2004; Yeo and Choi, 2011; Ogg et al., 2013), four theoretical/conceptual articles (Durlak, 2002; Hughes and Theodore, 2009; Nicholson et al., 2009; Morrison, 2013), and all four meta-analyses (Prout and Prout, 1998; Reddy et al., 2000; Reese et al., 2010; Boudreau et al., 2015) fell into this category. However, given that the four meta-analyses included counseling and consultation studies not necessarily delivered by school psychologists, these results should be interpreted with caution. The psychological outcomes addressed in the included articles predominantly focused on symptoms of mental health and/or academic or social competencies of students with disorders or at-risk students, depending on the context of the study. In most cases, development was equated with a positive change in these outcomes, evidenced by an increase or decrease of outcome scores based on data provided by pre-/post-test study designs. While some articles focused on development in a single subject or participant group receiving an intervention, others compared the development of an intervention group to that of a control group.

The second thematic cluster defined effects of school psychological services as the development of school/system-level outcomes and was represented by four of the 22 included articles (empirical: Hawken, 2006; Millar et al., 2013; Vu et al., 2013; theoretical/conceptual: Strein et al., 2003). These articles focused on the number of disciplinary referrals, grade retention frequency data, the capacity of school psychologists to provide services and the number of students receiving evidence-based interventions, and teachers' perceptions of the effects of consultation teams on their work and job satisfaction. A positive development indicating the effectiveness of school psychological services was therefore equated with a decrease or increase of these variables. The meta-analysis by Reddy et al. (2000), which was already categorized in the first cluster, can also be included within this category of school-level effects, as it measured the overall effect of consultation on organizational development (e.g., improvement in communication or climate) in addition to other outcomes.

The third thematic cluster, which defined effects as consumer satisfaction, was represented solely by three empirical articles. While two studies focused on clients' perceived quality of the received service (Kikas, 2003; Gilman and Gabriel, 2004), the remaining study used school psychologists' self-disclosure of service efficacy as an outcome measure (Proctor and Steadman, 2003).

Finally, three of the 22 articles did not give any explicit definition of the effects of school psychological services (empirical: Chafouleas et al., 2002; Pérez-González et al., 2004; theoretical/conceptual: Bradley-Johnson and Dean, 2000). Implicitly, these articles referred to consumer satisfaction and to evaluation practices as part of school psychological professional standards. Another article that we were unable to allocate to one of the categories was the study by Stobie et al. (2005); the authors defined effects of school psychological services in a broad manner as the attainment of goals, without naming specific outcome variables.

Operationalization of Effects of School Psychological Services

The instruments used to investigate effects (research question 4) were mainly disorder- and problem-specific measurements or satisfaction surveys. Questionnaires with rating scales were the most commonly used instruments. Common questions, for example, asked users about the helpfulness of and satisfaction with school psychological services for children and/or educators (e.g., Gilman and Gabriel, 2004) or about the overall quality of the work of school psychologists (e.g., Kikas, 2003). Similarly, school psychologists' self-perception was measured by investigating, for example, whether practitioners believed that they were making a difference at their school, that they were effective with students and teachers, and whether they thought that teachers, parents, and administrators were knowledgeable about their skills and abilities (e.g., Proctor and Steadman, 2003). Ten of the 12 empirical articles used non-standardized questionnaires specifically developed or adapted for each study. The number of items used to evaluate effects varied widely, from one item in Kikas (2003) to 18 items in Vu et al. (2013). Only one empirical paper used standardized questionnaires (Yeo and Choi, 2011), and one study used a behavioral observation technique (Lepage et al., 2004). All four meta-analyses included standardized and non-standardized outcome measures from several data sources (questionnaires, behavioral observation, school or clinical reports). The two theoretical/conceptual articles describing procedures and methods for evaluating school psychological services and school-related interventions recommended the use of standardized measurements (Durlak, 2002; Morrison, 2013).

Discussion

The present study aimed to contribute to a better understanding of the measurement of effects of school psychological services by conducting a scoping review of scientific articles published between 1998 and 2018 in eight leading school psychology journals. Only 0.43% of all articles published within this period (i.e., 22 of 5,048) met the inclusion criteria for our review of providing definitions and/or operationalizations of the effects of school psychological services. This shows that the measurement of effects of school psychological services has, to date, been addressed in a very limited way by the scholarly community in the field of school psychology. This is surprising, given that professional practice guidelines ask school psychologists to evaluate the efficacy of their services to ensure quality standards. However, addressing this issue is a complex task given the multiple users, intervention methods, and psychological needs involved in school psychological practice (e.g., the paradox of school psychology; Gutkin and Conoley, 1990). This might explain why such a limited percentage of past research has tried to tackle this issue. Moreover, given these difficulties, researchers might neglect to collect primary data on the effects of school psychology, opting instead to focus on aspects linked to school psychological work, such as the effects of interventions or the usability of diagnostic tools. Consequently, if such studies do not use our keywords in their abstracts, they were not included in our review.

Nevertheless, the 22 studies that we identified provide a starting point to guide school psychologists in evaluating the effects of some of the dimensions of their work. The six theoretical/conceptual articles lay the groundwork by proposing frameworks, recommending methodological steps, and highlighting critical aspects that need to be considered when evaluating the effects of school psychological services. Three empirical studies conducted by researchers in experimental settings, as well as four meta-analyses, report information on measuring effects of school psychological services, with the limitation that the interventions were not necessarily carried out by school psychologists. Consistent with Andrews (1999), this evidence describes the impact school psychological services may have under ideal conditions in experimental settings. In fact, only nine of the 22 studies included in our review were conducted with school psychologists. This result highlights a research gap in the field of “effectiveness studies” (Andrews, 1999), that is, the application of methods supported by experimental research findings in practical, non-experimental, real-world settings. If school psychologists are required to engage in evidence-based practices and evaluate the efficacy of their work (White and Kratochwill, 2005; Skalski et al., 2015), it is imperative that they can rely on scientific evidence obtained from real-world studies conducted with practicing school psychologists, not only on findings from experimental studies implemented by researchers. Empirical (quasi-)experimental studies conducted by researchers and/or practitioners with school-aged samples were the most frequently published papers in the eight leading school psychology journals from 2000 onward (Villarreal et al., 2017; cf. Begeny et al., 2018). Further research should aim to confirm the effects found in these (quasi-)experimental studies in non-experimental settings by including practicing school psychologists as potential professionals delivering the investigated interventions. The contributions by Forman et al. (2013) with respect to the development of implementation science in school psychology can provide a valuable starting point in this respect. Also, the ideas proposed by Nastasi (2000, 2004) of creating researcher-practitioner partnerships are important to move the field forward and enable school psychologists to function as scientist-practitioners (Huber, 2007). Only if researchers and practitioners work together to increase the adoption of evidence-based practices can common obstacles, such as the reduced importance attributed to research in school psychological practice, be tackled to improve service delivery (Mendes et al., 2014, 2017).

In terms of the definitions used to refer to effects of school psychological services, most articles focused on the development of psychological outcomes at the student level, such as symptoms, behaviors, or competences (i.e., 11 of 22). Only a few studies understood effects as the development of school/system-level outcomes such as discipline referrals (i.e., 4 of 22) or as the consumer satisfaction of service users (i.e., 3 of 22), and some provided no explicit definitions (i.e., 4 of 22). No studies with an economic perspective on the definition of effects were identified. Thus, the available definitions capture only single aspects of the multiple dimensions involved in school psychological practice. The effects of direct student-level services are undoubtedly a core component of the efficacy of school psychological services, but more research is needed to guide practitioners with respect to the evaluation of indirect services at the school level and the indirect, mediated effects on student-level outcomes achieved through the adults who work with school psychologists (cf. Gutkin and Curtis, 2008). Following the suggestion by Strein et al. (2003) of adopting a public health perspective may aid researchers in adopting a broader focus in the evaluation of effects of school psychological services.

Regarding the operationalization of the effects of school psychological services, we identified a mismatch between theoretical and conceptual proposals of what should be done and what has actually been done according to the available evidence. While the theoretical/conceptual articles recommended the use of standardized measures, ten of the 12 empirical studies used non-standardized instruments, only some of which reported reliability information, and several relied on a single item. Likewise, the four meta-analyses did not limit their inclusion criteria to studies using standardized measures. This finding is concerning, as the methodological quality of the instruments determines the extent to which the effects of school psychological services can be captured. Empirical research in this field should therefore lead by example and use reliable and valid assessment instruments.
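To illustrate the kind of reporting we are recommending, the sketch below shows one way to compute and report the internal consistency of a multi-item instrument in R with the psych package (Revelle, 2018), which is cited in the reference list. This is a minimal, hedged example: the data frame "ratings" and its five items are simulated for illustration only, and a real evaluation would complement such a coefficient with validity evidence.

# A minimal sketch in R, assuming a hypothetical data frame "ratings" whose
# columns are the items of a multi-item outcome scale; the data are simulated
# purely for illustration.
library(psych)  # Revelle (2018)

set.seed(1)
# Simulate 100 respondents answering five items on a 1-5 scale
ratings <- as.data.frame(replicate(5, sample(1:5, 100, replace = TRUE)))

# Cronbach's alpha as one common index of internal consistency
reliability <- alpha(ratings)
reliability$total$raw_alpha

Agreement between coders of categorical decisions could be reported in a similar spirit (e.g., Cohen's kappa, interpreted against the benchmarks of Landis and Koch, 1977).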

Our results need to be interpreted with caution, taking at least two limitations into account. First, to be included in our scoping review, studies had to explicitly refer to the concepts "school psychology" or "counseling" in the school context in their title and/or abstract. We were interested in exploring how the effects of school psychological services are conceptualized and measured within this applied field of psychological research. However, school psychology often draws on contributions from related research fields (e.g., counseling research, clinical psychology). Given our eligibility criteria, this kind of research was not included in our review, although it also plays a role in guiding evaluations of the effects of school psychological services. Readers should therefore keep in mind that the evidence summarized in this review only refers to research explicitly associated with the field of school psychology. This limitation is especially important for international readers, as different professional titles are used worldwide to refer to psychologists who serve the needs of students, teachers, and other school staff. In European countries, for example, the term "educational psychologist" is often used as a synonym for "school psychologist." As our review focused on articles that explicitly mentioned the terms "school psychology" or "counseling" in the school context in their title and/or abstract, it is possible that we excluded relevant publications using derivations of the term "educational psychology."

Second, our review is limited to past research published in eight leading school psychology journals that predominantly feature contributions by researchers and samples located in the U.S. (see Begeny et al., 2018 for a similar approach). This decision limits the generalizability of our results. Our findings may therefore present an incomplete picture, and we encourage future research to replicate our study with additional information sources.

This scoping review contributes to a better understanding of how the measurement of the effects of school psychological services has been conceptualized by past research in major school psychology journals. The results show very limited research on this topic despite the prominence of evaluation in guidelines for school psychological practice. We systematically identified, summarized, and critically discussed the information from 22 articles that may serve as a guide for policymakers and school psychologists aiming to evaluate the effects of their services on the basis of scientific evidence. According to our results, the available definitions of the effects of school psychological services capture only some of the multiple dimensions involved in school psychological practice, and the methodological quality of the instruments used to assess efficacy needs to be improved in future studies. School psychologists can draw on a large number of interventions with experimental research support, but studies investigating the fidelity and effects of these interventions in practical, non-experimental school psychological settings are less common. Overall, the results represent a starting point for conceptualizing the measurement of the effects of school psychological services. More research is urgently needed to provide school psychologists with evidence-based tools and procedures to assess the effects of their work and ensure the quality of school psychological services.

Author Contributions

The initial identification process on the journals' homepages, the first screening of titles and abstracts, the second screening of the abstracts, the full-text screening, and the data charting were done by BM. Syntheses of our results were done by BM, AH, NV, and GB. All authors contributed to the article and approved the submitted version.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

1. Reports published in the Canadian Journal of School Psychology and the International Journal of School and Educational Psychology were screened in January 2021 during the review process, as recommended by one of the reviewers, to extend our search.

Funding

The review was supported by the authors' positions at the Competence Centre School Psychology Hesse at the Goethe-University Frankfurt, Germany.

  • American Psychological Association (2020). School Psychology. Available online at: https://www.apa.org/ed/graduate/specialize/school (accessed September 10, 2020).
  • Andrews G. (1999). Efficacy, effectiveness and efficiency in mental health service delivery. Aust. N. Z. J. Psychiatry 33, 316–322. 10.1046/j.1440-1614.1999.00581.x
  • Anthun R. (2000). Quality dimensions for school psychology services. Scand. J. Psychol. 41, 181–187. 10.1111/1467-9450.00186
  • Bahr M. W., Leduc J. D., Hild M. A., Davis S. E., Summers J. K., Mcneal B. (2017). Evidence for the expanding role of consultation in the practice of school psychologists. Psychol. Schools 54, 581–595. 10.1002/pits.22020
  • Begeny J. C., Levy R. A., Hida R., Norwalk K. (2018). Experimental research in school psychology internationally: an assessment of journal publications and implications for internationalization. Psychol. Schools 55, 120–136. 10.1002/pits.22070
  • Boudreau A. M., Corkum P., Meko K., Smith I. M. (2015). Peer-mediated pivotal response treatment for young children with autism spectrum disorders: a systematic review. Can. J. School Psychol. 30, 218–235. 10.1177/0829573515581156
  • Braden J. S., DiMarino-Linnen E., Good T. L. (2001). Schools, society, and school psychologists: history and future directions. J. School Psychol. 39, 203–219. 10.1016/S0022-4405(01)00056-5
  • Bradley-Johnson S., Dean V. J. (2000). Role change for school psychology: the challenge continues in the new millennium. Psychol. Schools 37, 1–5. 10.1002/(SICI)1520-6807(200001)37:1<1::AID-PITS1>3.0.CO;2-Q
  • Bramlett R. K., Murphy J. J., Johnson J., Wallingsford L., Hall J. D. (2002). Contemporary practices in school psychology: a national survey of roles and referral problems. Psychol. Schools 39, 327–335. 10.1002/pits.10022
  • Bundesverband Deutscher Psychologinnen und Psychologen (Professional Association of German Psychologists) (2015). Schulpsychologie in Deutschland – Berufsprofil [School psychology in Germany – job profile]. Available online at: https://www.bdp-schulpsychologie.de/aktuell/2018/180914_berufsprofil.pdf (accessed September 10, 2020).
  • Burns M. K. (2011). School psychology research: combining ecological theory and prevention science. School Psychol. Rev. 40, 132–139. 10.1080/02796015.2011.12087732
  • Burns M. K., Kanive R., Zaslofsky A. F., Parker D. C. (2013). Mega-analysis of school psychology blueprint for training and practice domains. School Psychol. Forum 7, 13–28.
  • Chafouleas S. M., Clonan S. M., Vanauken T. L. (2002). A national survey of current supervision and evaluation practices of school psychologists. Psychol. Schools 39, 317–325. 10.1002/pits.10021
  • Durlak J. A. (2002). Evaluating evidence-based interventions in school psychology. School Psychol. Q. 17, 475–482. 10.1521/scpq.17.4.475.20873
  • Farrell P., Jimerson S. R., Kalambouka A., Benoit J. (2005). Teachers' perception of school psychologists in different countries. School Psychol. Int. 26, 525–544. 10.1177/0143034305060787
  • Forman S. G., Shapiro E. S., Codding R. S., Gonzales J. E., Reddy L. A., Rosenfield S. A., et al. (2013). Implementation science and school psychology. School Psychol. Q. 28, 77–100. 10.1037/spq0000019
  • Gilman R., Gabriel S. (2004). Perceptions of school psychological services by education professionals: results from a multi-state survey pilot study. School Psychol. Rev. 33, 271–286. 10.1080/02796015.2004.12086248
  • Gutkin T. B., Conoley J. C. (1990). Reconceptualizing school psychology from a service delivery perspective: implications for practice, training, and research. J. School Psychol. 28, 203–223. 10.1016/0022-4405(90)90012-V
  • Gutkin T. B., Curtis M. J. (2008). School-based consultation: the science and practice of indirect service delivery, in The Handbook of School Psychology, 4th Edn., eds T. B. Gutkin and C. R. Reynolds (Hoboken, NJ: John Wiley & Sons), 591–635.
  • Hawken L. S. (2006). School psychologists as leaders in the implementation of a targeted intervention. School Psychol. Q. 21, 91–111. 10.1521/scpq.2006.21.1.91
  • Hoagwood K., Johnson J. (2003). School psychology: a public health framework: I. From evidence-based practices to evidence-based policies. J. School Psychol. 41, 3–21. 10.1016/S0022-4405(02)00141-3
  • Huber D. R. (2007). Is the scientist-practitioner model viable for school psychology practice? Am. Behav. Sci. 50, 778–788. 10.1177/0002764206296456
  • Hughes T. L., Theodore L. A. (2009). Conceptual framework for selecting individual psychotherapy in the schools. Psychol. Schools 46, 218–224. 10.1002/pits.20366
  • International School Psychology Association (2021). A Definition of School Psychology. Available online at: https://www.ispaweb.org/a-definition-of-school-psychology/ (accessed September 10, 2020).
  • Jimerson S. R., Graydon K., Farrell P., Kikas E., Hatzichristou C., Boce E., et al. (2004). The international school psychology survey: development and data from Albania, Cyprus, Estonia, Greece and Northern England. School Psychol. Int. 25, 259–286. 10.1177/0143034304046901
  • Jimerson S. R., Graydon K., Yuen M., Lam S.-F., Thurm J.-M., Klueva N., et al. (2006). The international school psychology survey: data from Australia, China, Germany, Italy and Russia. School Psychol. Int. 27, 5–32. 10.1177/0143034306062813
  • Jimerson S. R., Oakland T. D., Farrell P. T. (2007). The Handbook of International School Psychology. Thousand Oaks, CA: Sage.
  • Kikas E. (2003). Pupils as consumers of school psychological services. School Psychol. Int. 24, 20–32. 10.1177/0143034303024001579
  • Kratochwill T. R., Shernoff E. S. (2004). Evidence-based practice: promoting evidence-based interventions in school psychology. School Psychol. Rev. 33, 34–48. 10.1080/02796015.2004.12086229
  • Lam S. F. (2007). Educational psychology in Hong Kong, in The Handbook of International School Psychology, eds S. Jimerson, T. Oakland, and P. Farrell (Thousand Oaks, CA: Sage), 147–157.
  • Landis J. R., Koch G. G. (1977). The measurement of observer agreement for categorical data. Biometrics 33, 159–174. 10.2307/2529310
  • Lepage K., Kratochwill T. R., Elliott S. N. (2004). Competency-based behaviour consultation training: an evaluation of consultant outcomes, treatment effects, and consumer satisfaction. School Psychol. Q. 19, 1–28. 10.1521/scpq.19.1.1.29406
  • Liberati A., Altman D. G., Tetzlaff J., Mulrow C., Gøtzsche P. C., Ioannidis J. P. A., et al. (2009). The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. PLoS Med. 6:e1000100. 10.1371/journal.pmed.1000100
  • Manz P. H., Mautone J. A., Martin S. D. (2009). School psychologists' collaborations with families: an exploratory study of the interrelationships of their perceptions of professional efficacy and school climate, and demographic and training variables. J. Appl. School Psychol. 25, 47–70. 10.1080/15377900802484158
  • Mendes S. A., Abreu-Lima I., Almeida L. S., Simeonsson R. J. (2014). School psychology in Portugal: practitioners' characteristics and practices. Int. J. School Educ. Psychol. 2, 115–125. 10.1080/21683603.2013.863171
  • Mendes S. A., Lasser J., Abreu-Lima I. M., Almeida L. S. (2017). All different or all the same? Exploring the diversity of professional practices in Portuguese school psychology. Eur. J. Psychol. Educ. 32, 251–269. 10.1007/s10212-016-0297-6
  • Millar G. M., Lean D., Sweet S. D., Moraes S. C., Nelson V. (2013). The psychology school mental health initiative: an innovative approach to the delivery of school-based intervention services. Can. J. School Psychol. 28, 103–118. 10.1177/0829573512468858
  • Morrison J. Q. (2013). Performance evaluation and accountability for school psychologists: challenges and opportunities. Psychol. Schools 50, 314–324. 10.1002/pits.21670
  • Nastasi B. K. (2000). School psychologists as health-care providers in the 21st century: conceptual framework, professional identity, and professional practice. School Psychol. Rev. 29, 540–554. 10.1080/02796015.2000.12086040
  • Nastasi B. K. (2004). Meeting the challenges of the future: integrating public health and public education for mental health promotion. J. Educ. Psychol. Consult. 15, 295–312. 10.1207/s1532768xjepc153&4_6
  • Nicholson H., Foote C., Grigerick S. (2009). Deleterious effects of psychotherapy and counseling in the schools. Psychol. Schools 46, 232–237. 10.1002/pits.20367
  • Ogg J., Fefer S., Sundman-Wheat A., McMahan M., Stewart T., Chappel A., et al. (2013). School-based assessment of ADHD: purpose, alignment with best practice guidelines, and training. J. Appl. School Psychol. 29, 305–327. 10.1080/15377903.2013.836775
  • Pérez-González F., Garcia-Ros R., Gómez-Artiga A. (2004). A survey of teacher perceptions of the school psychologist's skills in the consultation process: an exploratory factor analysis. School Psychol. Int. 25, 30–41. 10.1177/0143034304026778
  • Peters M. D. J., Godfrey C. M., Khalil H., McInerney P., Parker D., Soares C. B. (2015). Guidance for conducting systematic scoping reviews. Int. J. Evid. Based Healthc. 13, 141–146. 10.1097/XEB.0000000000000050
  • Phillips B. N. (1998). Scientific standards of practice in school psychology: an emerging perspective. School Psychol. Int. 19, 267–278. 10.1177/0143034398193006
  • Proctor B. E., Steadman T. (2003). Job satisfaction, burnout, and perceived effectiveness of in-house versus traditional school psychologists. Psychol. Schools 40, 237–243. 10.1002/pits.10082
  • Prout S. M., Prout H. T. (1998). A meta-analysis of school-based studies of counseling and psychotherapy: an update. J. School Psychol. 36, 121–136. 10.1016/S0022-4405(98)00007-7
  • Reddy L. A., Barboza-Whitehead S., Files T., Rubel E. (2000). Clinical focus of consultation outcome research with children and adolescents. Special Serv. Schools 16, 1–22. 10.1300/J008v16n01_01
  • Reese R. J., Prout H. T., Zirkelback E. A., Anderson C. R. (2010). Effectiveness of school-based psychotherapy: a meta-analysis of dissertation research. Psychol. Schools 47, 1035–1045. 10.1002/pits.20522
  • Revelle W. (2018). psych: Procedures for Personality and Psychological Research (Version 1.8.4). Evanston, IL: Northwestern University. Available online at: https://CRAN.R-project.org/package=psych (accessed September 10, 2020).
  • Sandoval J., Lambert N. M. (1977). Instruments for evaluating school psychologists' functioning and service. Psychol. Schools 14, 172–179. 10.1002/1520-6807(197704)14:2<172::AID-PITS2310140209>3.0.CO;2-1
  • Sheridan S. M., Gutkin T. B. (2000). The ecology of school psychology: examining and changing our paradigm for the 21st century. School Psychol. Rev. 29, 485–502. 10.1080/02796015.2000.12086032
  • Skalski A. K., Minke K., Rossen E., Cowan K. C., Kelly J., Armistead R., et al. (2015). NASP Practice Model Implementation Guide. Bethesda, MD: National Association of School Psychologists.
  • Squires G., Farrell P. (2007). Educational psychology in England and Wales, in The Handbook of International School Psychology, eds S. Jimerson, T. Oakland, and P. Farrell (Thousand Oaks, CA: Sage), 81–90.
  • Stobie I., Boyle J., Woolfson L. (2005). Solution-focused approaches in the practice of UK educational psychologists: a study of the nature of their application and evidence of their effectiveness. School Psychol. Int. 26, 5–28. 10.1177/0143034305050890
  • Strein W., Hoagwood K., Cohn A. (2003). School psychology: a public health perspective. Prevention, populations, and systems change. J. School Psychol. 41, 23–38. 10.1016/S0022-4405(02)00142-5
  • Tricco A. C., Lillie E., Zarin W., O'Brien K. K., Colquhoun H., Kastner M., et al. (2016). A scoping review on the conduct and reporting of scoping reviews. BMC Med. Res. Methodol. 16:15. 10.1186/s12874-016-0116-4
  • Tricco A. C., Lillie E., Zarin W., O'Brien K. K., Colquhoun H., Levac D., et al. (2018). PRISMA extension for scoping reviews (PRISMA-ScR): checklist and explanation. Ann. Intern. Med. 169, 467–473. 10.7326/M18-0850
  • Villarreal V., Castro M. J., Umana I., Sullivan J. R. (2017). Characteristics of intervention research in school psychology journals: 2010–2014. Psychol. Schools 54, 548–559. 10.1002/pits.22012
  • Vu P., Shanahan K. B., Rosenfield S., Gravois T., Koehler J., Kaiser L., et al. (2013). Experimental evaluation of instructional consultation teams on teacher beliefs and practices. Int. J. School Educ. Psychol. 1, 67–81. 10.1080/21683603.2013.790774
  • White J. L., Kratochwill T. R. (2005). Practice guidelines in school psychology: issues and directions for evidence-based intervention in practice and training. J. School Psychol. 43, 99–115. 10.1016/j.jsp.2005.01.001
  • Yeo L. S., Choi P. M. (2011). Cognitive-behavioural therapy for children with behavioural difficulties in the Singapore mainstream school setting. School Psychol. Int. 32, 616–631. 10.1177/0143034311406820
  • Ysseldyke J., Dawson P., Lehr C., Reschly D., Reynolds M., Telzrow C. (1997). School Psychology: A Blueprint for Training and Practice II. Bethesda, MD: National Association of School Psychologists.


Journal of Applied Psychology


Journal scope statement

The Journal of Applied Psychology® emphasizes the publication of original investigations that contribute new knowledge and understanding to fields of applied psychology (other than clinical and applied experimental or human factors, which are more appropriate for other APA journals).

The journal primarily considers empirical and theoretical investigations that enhance understanding of cognitive, motivational, affective, and behavioral psychological phenomena in work and organizational settings, broadly defined.

Those psychological phenomena can be

  • at one or multiple levels—individuals, groups, organizations, or cultures;
  • in work settings such as business, education, training, health, service, government, or military institutions; and
  • in the public or private sector, for-profit or nonprofit organizations.

The journal publishes several types of articles, including:

  • Rigorously conducted empirical investigations that extend conceptual understanding (original investigations or meta-analyses);
  • Theory development articles as well as integrative conceptual reviews that synthesize literature and create new theory of psychological phenomena that will stimulate novel research;
  • Rigorously conducted qualitative research on phenomena that are difficult to capture with quantitative methods, or on phenomena that warrant inductive theory building.

The journal accepts work that is conducted in the field or in the laboratory, where the data (quantitative or qualitative) are analyzed with elegant or simple statistics, so long as the data or theoretical synthesis advances understanding of psychological phenomena and human behavior that have direct or indirect practical implications.

A nonexhaustive sampling of topics appropriate for the Journal of Applied Psychology includes:

  • individual differences in abilities, personality, and other characteristics;
  • testing and personnel selection;
  • performance measurement and management;
  • training, learning, and skill acquisition;
  • work motivation;
  • job attitudes, affect, and emotions;
  • leadership;
  • team development, processes, and effectiveness;
  • career development;
  • work–family interface;
  • work stress, health, and well-being;
  • positive and negative work behaviors;
  • diversity and cross-cultural differences in work behavior and attitudes;
  • technology and work systems;
  • expertise and knowledge management;
  • creativity, innovation, and adaptation;
  • organizational culture and climate; and
  • organizational design, change, and interventions.

The journal also encourages studies of human behavior in novel situations, and integration of basic psychological principles and theories with applied work and organizational phenomena. Specific topics of interest, however, change as organizations evolve and societal views of work change.

Disclaimer: APA and the editors of the Journal of Applied Psychology® assume no responsibility for statements and opinions advanced by the authors of its articles.

Equity, diversity, and inclusion

Journal of Applied Psychology supports equity, diversity, and inclusion (EDI) in its practices. More information on these initiatives is available under EDI Efforts .

Open science

The APA Journals Program is committed to publishing transparent, rigorous research; improving reproducibility in science; and aiding research discovery. Open science practices vary per editor discretion. View the initiatives implemented by this journal .

Editor's Choice

Each issue of the Journal of Applied Psychology honors one accepted manuscript by selecting it as an "Editor's Choice" paper. Selection is at the discretion of the editor and is determined by considering societal relevance, potential for practically improving employee and organizational outcomes, and potential for advancing science in new directions.

Author and editor spotlights

Explore journal highlights: free article summaries, editor interviews and editorials, journal awards, mentorship opportunities, and more.

Prior to submission, please carefully read and follow the submission guidelines detailed below. Manuscripts that do not conform to the submission guidelines may be returned without review.

The completion of a submission checklist that signifies that authors have read this material and agree to adhere to the guidelines is required as part of the submission upload process (authors can find this checklist in the "Additional Information" section). If you need to include a data transparency appendix, please append this as the last page of your manuscript file. Revisions should also include the data transparency appendix and include any necessary updates based on the revisions.

To submit to the editorial office of Lillian T. Eby, please submit manuscripts electronically through the Manuscript Submission Portal in Microsoft Word (.docx) or LaTeX (.tex, submitted as a zip file) with an accompanying Portable Document Format (.pdf) of the manuscript file.

Prepare manuscripts according to the Publication Manual of the American Psychological Association using the 7th edition. Manuscripts may be copyedited for bias-free language (see Chapter 5 of the Publication Manual and APA's Brief Guide to Bias-Free and Inclusive Language). APA Style and Grammar Guidelines for the 7th edition are available.


Lillian T. Eby, Editor
Department of Psychology
University of Georgia
Athens, GA 30602

General correspondence may be directed to the editor's office .

In addition to addresses and phone numbers, please supply email addresses and fax numbers, if available, for potential use by the editorial office and later by the production office.

Keep a copy of the manuscript to guard against loss.

Manuscripts submitted for publication consideration in the Journal of Applied Psychology are evaluated according to the following criteria:

  • degree to which the manuscript fits the mission of the journal as described on the journal website
  • significance of the theoretical, practical and/or methodological contributions
  • quality of the literature review
  • articulation and explication of the conceptual rationale, constructs, and psychological processes
  • rigor of the design and execution of the study
  • appropriateness of the analysis and interpretation of the results
  • discussion of implications for theory, research, and application
  • clarity of presentation

Manuscripts should be logically organized and clearly written in concise and unambiguous language. The goal of APA primary journals is to publish useful information that is accurate and clear.

Three primary types of articles will be published:

  • Feature Articles, which are full-length articles that focus on a conceptually or theoretically driven empirical contribution (all research strategies and methods, quantitative and qualitative, are considered) or on a theoretical contribution that can shape future research in applied psychology.
  • Research Reports, which are narrower in scope and are limited to no more than 19 manuscript pages (excluding title page, abstract, references, tables, and figures). In determining the length limits of Research Reports, authors should count 25 lines of 12-point text with 1-inch margins as the equivalent of one page (see the worked example below).
  • Integrative Conceptual Reviews, which are full-length articles designed to synthesize relevant literature, extend theoretical development, and propose new directions for future research. Submissions start with a 10-page prospectus that describes the contribution, structure, and coverage of the proposed paper and is evaluated by the editorial team. Additional information can be found at the Call for Integrative Conceptual Reviews.

The review process is the same for Feature Articles and Research Reports. The two types of manuscripts differ only in length, commensurate to different scope of intended contribution. Authors should indicate in the cover letter that they would like the submission considered as a Research Report. However, the action editor (with input from the review team) may suggest that a Feature Article submission be pared down to Research Report length through the review process.

Authors should refer to recent issues of the journal for approximate length of Feature Articles, Integrative Conceptual Reviews, and Research Reports. (Total manuscript pages divided by three provides an estimate of total printed pages.)
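As a hedged worked example of the two length rules above (the counts are invented; only the 25-lines-per-page equivalence and the divide-by-three estimate come from the journal's guidance), in R:

# Hypothetical counts for illustration; only the two rules come from the text
text_lines <- 500                     # lines of 12-point text with 1-inch margins
page_equivalents <- text_lines / 25   # 20 page equivalents: over the 19-page Research Report limit
manuscript_pages <- 33                # total manuscript pages of a submission
printed_pages <- manuscript_pages / 3 # an estimate of roughly 11 printed pages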

The journal also has a history of publishing Monographs on occasion. Monographs are substantial and significant contributions (as determined by the editorial team).

In addition, the journal occasionally publishes Commentaries (see Kozlowski, S. W. J. (2011). Comment policy. Journal of Applied Psychology, 96(2), 231–232) (PDF, 36KB).

The Journal of Applied Psychology also publishes replications that add important value and insight to the literature. Submissions should include “A Replication of XX Study” in the subtitle of the manuscript as well as in the abstract.

Journal of Applied Psychology is now using a software system to screen submitted content for similarity with other published content. The system compares the initial version of each submitted manuscript against a database of 40+ million scholarly documents, as well as content appearing on the open web. This allows APA to check submissions for potential overlap with material previously published in scholarly journals (e.g., lifted or republished material).

Author contribution statements using CRediT

The APA Publication Manual (7th ed.) stipulates that "authorship encompasses…not only persons who do the writing but also those who have made substantial scientific contributions to a study." In the spirit of transparency and openness, the Journal of Applied Psychology has adopted the Contributor Roles Taxonomy (CRediT) to describe each author's individual contributions to the work. CRediT offers authors the opportunity to share an accurate and detailed description of their diverse contributions to a manuscript.

Submitting authors will be asked to identify the contributions of all authors at initial submission according to this taxonomy. If the manuscript is accepted for publication, the CRediT designations will be published as an Author Contributions Statement in the author note of the final article. All authors should have reviewed and agreed to their individual contribution(s) before submission.

CRediT includes 14 contributor roles, as described below:

  • Conceptualization: Ideas; formulation or evolution of overarching research goals and aims.
  • Data curation: Management activities to annotate (produce metadata), scrub data and maintain research data (including software code, where it is necessary for interpreting the data itself) for initial use and later reuse.
  • Formal analysis: Application of statistical, mathematical, computational, or other formal techniques to analyze or synthesize study data.
  • Funding acquisition: Acquisition of the financial support for the project leading to this publication.
  • Investigation: Conducting a research and investigation process, specifically performing the experiments, or data/evidence collection.
  • Methodology: Development or design of methodology; creation of models.
  • Project administration: Management and coordination responsibility for the research activity planning and execution.
  • Resources: Provision of study materials, reagents, materials, patients, laboratory samples, animals, instrumentation, computing resources, or other analysis tools.
  • Software: Programming, software development; designing computer programs; implementation of the computer code and supporting algorithms; testing of existing code components.
  • Supervision: Oversight and leadership responsibility for the research activity planning and execution, including mentorship external to the core team.
  • Validation: Verification, whether as a part of the activity or separate, of the overall replication/reproducibility of results/experiments and other research outputs.
  • Visualization: Preparation, creation and/or presentation of the published work, specifically visualization/data presentation.
  • Writing—original draft: Preparation, creation and/or presentation of the published work, specifically writing the initial draft (including substantive translation).
  • Writing—review and editing: Preparation, creation and/or presentation of the published work by those from the original research group, specifically critical review, commentary or revision—including pre- or post-publication stages.

Authors can claim credit for more than one contributor role, and the same role can be attributed to more than one author. Not all roles will be applicable to a particular scholarly work.

Masked review policy

The journal will accept submissions in masked review format only.

Author names and affiliations should appear in the cover letter but not anywhere on the manuscript. Authors should make every reasonable effort to see that the manuscript itself contains no clues to their identities, including grant numbers, names of institutions providing IRB approval, self-citations, and links to online repositories for data, materials, code, or preregistrations (e.g., create a view-only link for a project ). Manuscripts not in masked format will be returned to authors for revision prior to being reviewed.

Please ensure that the final version for production includes a byline and full author note for typesetting.

Original use of data

APA requires that all data in their published articles be an original use. Along with determining the appropriateness of any submission, the editorial team (editor and reviewers) also have a role in determining what constitutes "original use." Key to this determination is the extent to which reported data and results serve to promote cumulative knowledge and insight to the literature.

In order to preserve masked review, authors should include a data transparency appendix in the manuscript which details how and where the data collected were (or potentially will soon be) used. Any previous, concurrent, or near future use of data (and/or sample) reported in a submitted manuscript must be brought to the editorial team's attention (i.e., any paper(s) previously published, in press, or currently under review at any journals, as well as any paper(s) that foreseeably will be under review before an editorial decision is made on the current submitted manuscript). This includes variables that overlap as well as those that may not overlap with those in the submitted article. Authors may also put in any other clarifying information they wish, as long as it can be done anonymously. Any identifying information, such as authors' names or titles of journal articles that the authors wish to share can be included in the cover letter where only the editorial staff will see it.

When providing information in the paper itself and/or in the appendix, authors should ensure there is enough detail for reviewers to assess whether data presented constitute original use and unique knowledge and insights.

For more information on APA's data policies, please see "Duplicate and Piecemeal Publication of Data," APA Publication Manual (Section 1.16, 7th ed., pp. 17–20).

Access  data transparency appendix examples .

Transparency and openness

APA endorses the Transparency and Openness Promotion (TOP) Guidelines developed by a community working group in conjunction with the Center for Open Science (Nosek et al., 2015). All articles submitted to the Journal of Applied Psychology must, at a minimum, adhere to the TOP levels as noted in the list below, which details the domains of research planning and reporting, the TOP level required, and a brief description of the journal's policy. Submissions that do not include (1) qualitative, quantitative, or simulated data, (2) a systematic narrative or meta-analytic review of the literature, or (3) re-analysis of existing data must also include a statement explaining why the TOP guidelines related to data sharing, code sharing, hypothesis preregistration, analysis preregistration, and materials sharing are not applicable. All articles must comply with the TOP guideline of citing any materials not original to the submitted work.

  • Citation: Level 2, Requirement—All data, program code, and other methods not original to the submitted work (developed by others) must be cited in the text and listed in the References section.
  • Data Transparency: Level 1, Disclosure—Article states whether or not the raw and/or processed data on which study conclusions are based are posted to a trusted repository and, if so, how to access the data.
  • Analytic Methods (Code) Transparency: Level 1, Disclosure—Article states whether or not the computer code or syntax needed to reproduce analyses in an article is posted to a trusted repository and if so, how to access it.
  • Research Materials Transparency: Level 1, Disclosure—Article states whether materials described in the Method section are posted to a trusted repository and, if so, how to access them. It is strongly encouraged to include online supplements for study materials.
  • Design and Analysis Transparency (Reporting Standards): Level 2, Requirement—Article must follow the Journal of Applied Psychology methods checklist and all relevant aspects of the APA Style Journal Reporting Standards (JARS).
  • Study Preregistration: Level 2, Requirement—Article states whether the study design and (if applicable) hypotheses of any of the work reported was preregistered and, if so, how to access it. Access to a masked version of the preregistered materials must be available at submission via stable link or supplemental material.
  • Analysis Plan Preregistration: Level 2, Requirement—Article states whether any of the work reported was preregistered with an analysis plan and, if so, how to access it. Access to a masked version of the preregistered analysis plan must be available at submission via stable link or supplemental material.

Authors must include a brief (one short paragraph) subsection in the method description titled “Transparency and openness” that indicates how they complied with the TOP guidelines adopted at the Journal of Applied Psychology .

For example:

  • We describe our sampling plan, all data exclusions (if any), all manipulations, and all measures in the study, and we adhered to the Journal of Applied Psychology methodological checklist. All data, analysis code, and research materials are available at [stable masked link to repository]. Data were analyzed using R, version 4.0.0 (R Core Team, 2020) and the package ggplot2, version 3.2.1 (Wickham, 2016). This study's design and its analysis were not preregistered.
  • We describe our sampling plan, all data exclusions (if any), all manipulations, and all measures in the study, and we adhered to the Journal of Applied Psychology methodological checklist. Analysis code and research materials are available at [stable masked link to repository]. Data are not available due to their proprietary nature. Data were analyzed using R, version 4.0.0 (R Core Team, 2020) and the package ggplot2, version 3.2.1 (Wickham, 2016). The study design was not preregistered because the data were collected for an applied selection project. The hypotheses and analysis were preregistered [masked OSF link].

Data and materials

Authors must state whether data and study materials are available and, if so, where to access them. Recommended repositories include APA’s repository on the Open Science Framework (OSF), or authors can access a full list of other recommended repositories . Trusted repositories adhere to policies that make data discoverable, accessible, usable, and preserved for the long term. Trusted repositories also assign unique and persistent identifiers. Original materials that are not in English do not require translation.

In a subsection at the end of the method subsection titled “Transparency and openness,” specify whether and where the data and material will be available or include a statement noting that they are not available.

  • All data have been made publicly available at the [trusted repository name] and can be accessed at [persistent URL or DOI].
  • Data for this study are not available because they are proprietary.
  • Data for this study are not available because they are protected under HIPAA.
  • Data for this study are not available because they are protected under FERPA.
  • Materials for this study can be found [in the Appendix; in the online supplement].
  • Materials for this study can be found at the [trusted repository name] and can be accessed at [masked persistent URL or DOI].
  • Materials for this study are not available because they are not owned by the authors.

Please refer to the APA Publication Manual (7th ed.) regarding APA guidelines on data retention and sharing (section 1.14) as well as the APA Ethics Code regarding conditions and exceptions to data sharing (e.g., 2.01 Boundaries of Competence, 8.14 Sharing Research Data for Verification).

Information regarding reproducing material from previously published works

If you have reproduced full scales or other material (e.g., tables, figures, experimental materials) from previously published articles, please note that it is your responsibility to ensure that doing so does not constitute a copyright infringement with the original publisher. You can find information on fair use of previously published material and on APA’s permissions policy in the Permissions Alert Form for APA Authors .

Analytic methods (code)

For submissions with quantitative or simulation analytic methods, state whether the study analysis code is posted to a trusted repository and, if so, how to access it, including its location and any limitations on use. Trusted repositories adhere to policies that make data discoverable, accessible, usable, and preserved for the long term. Trusted repositories also assign unique and persistent identifiers. Recommended repositories include APA's repository on the Open Science Framework (OSF), or authors can access a full list of other recommended repositories.

  • The code behind this analysis/simulation has been made publicly available at the [trusted repository name] and can be accessed at [persistent URL or DOI].
  • The code behind this analysis/simulation is not available because it is proprietary.
  • Analysis code is available at [stable masked link to trusted repository], except for the predictor scoring algorithm which is proprietary.

Preregistration of studies and analysis plans

Preregistration of studies and specific hypotheses can be a useful tool for making strong theoretical claims. Likewise, preregistration of analysis plans can be useful for distinguishing confirmatory and exploratory analyses. Investigators are encouraged to preregister their studies and analysis plans prior to conducting the research via a publicly accessible registry system (e.g., OSF , ClinicalTrials.gov, or other trial registries in the WHO Registry Network).

There are many preregistration forms and templates available (e.g., on OSF, ClinicalTrials.gov, or other trial registries in the WHO Registry Network), including the Preregistration for Quantitative Research in Psychology template (Bosnjak et al., 2022). Completed preregistration forms should be time-stamped.

Articles must state whether or not any work was preregistered and, if so, where to access the preregistration. Preregistrations must be available to reviewers; authors should submit a masked copy via stable link or supplemental material. Links in the method section should be replaced with an identifiable copy on acceptance.

  • This study’s design was preregistered; see [STABLE LINK OR DOI].
  • This study’s design and hypotheses were preregistered; see [STABLE LINK OR DOI].
  • This study’s analysis plan was preregistered; see [STABLE LINK OR DOI].
  • This study was not preregistered.

Manuscript preparation

Prepare manuscripts according to the Publication Manual of the American Psychological Association. Manuscripts may be copyedited for bias-free language (see Chapter 5 of the 7th edition).

Review APA's Journal Manuscript Preparation Guidelines before submitting your article.

Double-space all copy. Other formatting instructions, as well as instructions on preparing tables, figures, references, metrics, and abstracts, appear in the Manual . Additional guidance on APA Style is available on the APA Style website .

Methodological reporting guidelines and checklists

In an effort to enhance transparency, reproducibility, and replicability, we have developed methodological reporting guidelines and checklists for the Journal of Applied Psychology (PDF, 166KB)  that authors should follow (as appropriate for their submitted work) when preparing manuscripts for submission to the Journal of Applied Psychology . Authors should also consult the APA Style Journal Article Reporting Standards (JARS) for quantitative , qualitative , and mixed methods research .

Authors should review the new Journal Article Reporting Standards for Race, Ethnicity, and Culture (JARS–REC). Meant for all authors, regardless of research topic, JARS–REC include standards for all stages of research and manuscript writing.

For more, see the Guidance for Authors sections of the table (PDF, 184KB) .

Below are additional instructions regarding the preparation of display equations, computer code, and tables.

Display equations

We strongly encourage you to use MathType (third-party software) or Equation Editor 3.0 (built into pre-2007 versions of Word) to construct your equations, rather than the equation support that is built into Word 2007 and Word 2010. Equations composed with the built-in Word 2007/Word 2010 equation support are converted to low-resolution graphics when they enter the production process and must be rekeyed by the typesetter, which may introduce errors.

To construct your equations with MathType or Equation Editor 3.0:

  • Go to the Text section of the Insert tab and select Object.
  • Select MathType or Equation Editor 3.0 in the drop-down menu.

If you have an equation that has already been produced using Microsoft Word 2007 or 2010 and you have access to the full version of MathType 6.5 or later, you can convert this equation to MathType by clicking on MathType Insert Equation. Copy the equation from Microsoft Word and paste it into the MathType box. Verify that your equation is correct, click File, and then click Update. Your equation has now been inserted into your Word file as a MathType Equation.

Use Equation Editor 3.0 or MathType only for equations or for formulas that cannot be produced as Word text using the Times or Symbol font.

Computer code

Because altering computer code in any way (e.g., indents, line spacing, line breaks, page breaks) during the typesetting process could alter its meaning, we treat computer code differently from the rest of your article in our production process. To that end, we request separate files for computer code.

In online supplemental material

We request that runnable source code be included as supplemental material to the article. For more information, visit Supplementing Your Article With Online Material .

In the text of the article

If you would like to include code in the text of your published manuscript, please submit a separate file with your code exactly as you want it to appear, using Courier New font with a type size of 8 points. We will make an image of each segment of code in your article that exceeds 40 characters in length. (Shorter snippets of code that appear in text will be typeset in Courier New and run in with the rest of the text.) If an appendix contains a mix of code and explanatory text, please submit a file that contains the entire appendix, with the code keyed in 8-point Courier New.

Tables

Use Word's insert table function when you create tables. Using spaces or tabs in your table will create problems when the table is typeset and may result in errors.

Academic writing and English language editing services

Authors who feel that their manuscript may benefit from additional academic writing or language editing support prior to submission are encouraged to seek out such services at their host institutions, engage with colleagues and subject matter experts, and/or consider several vendors that offer discounts to APA authors .

Please note that APA does not endorse or take responsibility for the service providers listed. It is strictly a referral service.

Use of such service is not mandatory for publication in an APA journal. Use of one or more of these services does not guarantee selection for peer review, manuscript acceptance, or preference for publication in any APA journal.

Submitting supplemental materials

APA can place supplemental materials online, available via the published article in the PsycArticles ® database. Please see Supplementing Your Article With Online Material for more details.

Abstract and keywords

All manuscripts must include an abstract containing a maximum of 250 words typed on a separate page. After the abstract, please supply up to five keywords or brief phrases.

References

List references in alphabetical order. Each listed reference should be cited in text, and each text citation should be listed in the references section.

Examples of basic reference formats:

Journal article

McCauley, S. M., & Christiansen, M. H. (2019). Language learning as language use: A cross-linguistic model of child language development. Psychological Review, 126(1), 1–51. https://doi.org/10.1037/rev0000126

Authored book

Brown, L. S. (2018). Feminist therapy (2nd ed.). American Psychological Association. https://doi.org/10.1037/0000092-000

Chapter in an edited book

Balsam, K. F., Martell, C. R., Jones. K. P., & Safren, S. A. (2019). Affirmative cognitive behavior therapy with sexual and gender minority people. In G. Y. Iwamasa & P. A. Hays (Eds.), Culturally responsive cognitive behavior therapy: Practice and supervision (2nd ed., pp. 287–314). American Psychological Association. https://doi.org/10.1037/0000119-012

Data set citation

Alegria, M., Jackson, J. S., Kessler, R. C., & Takeuchi, D. (2016). Collaborative Psychiatric Epidemiology Surveys (CPES), 2001–2003 [Data set]. Inter-university Consortium for Political and Social Research. https://doi.org/10.3886/ICPSR20240.v8

All data, program code and other methods not original to the submitted work (developed by others) must be cited in the text and listed in the references section.

Figures

Preferred formats for graphics files are TIFF and JPG, and the preferred format for vector-based files is EPS. Graphics downloaded or saved from web pages are not acceptable for publication. Multipanel figures (i.e., figures with parts labeled a, b, c, d, etc.) should be assembled into one file. When possible, please place symbol legends below the figure instead of to the side.

  • All color line art and halftones: 300 DPI
  • Black and white line tone and gray halftone images: 600 DPI

Line weights

  • Color (RGB, CMYK) images: 2 pixels
  • Grayscale images: 4 pixels
  • Stroke weight: 0.5 points
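As a hedged illustration of meeting these specifications, the sketch below exports a plot as a 300-DPI TIFF using ggplot2, the R package named in the journal's example transparency statements; the file name, plot, and dimensions are invented for the example and do not represent a required workflow.

# A minimal sketch, assuming a figure built with ggplot2; the 300-DPI TIFF
# target comes from the resolution requirements listed above
library(ggplot2)

p <- ggplot(mtcars, aes(x = wt, y = mpg)) +
  geom_point()

# Export at 300 DPI, the resolution listed for color line art and halftones
ggsave("figure1.tiff", plot = p, width = 6, height = 4, units = "in", dpi = 300)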

APA offers authors the option to publish their figures online in color without the costs associated with print publication of color figures.

The same caption will appear on both the online (color) and print (black and white) versions. To ensure that the figure can be understood in both formats, authors should add alternative wording (e.g., “the red (dark gray) bars represent”) as needed.

For authors who prefer their figures to be published in color both in print and online, original color figures can be printed in color at the editor's and publisher's discretion provided the author agrees to pay:

  • $900 for one figure
  • An additional $600 for the second figure
  • An additional $450 for each subsequent figure

Permissions

Authors of accepted papers must obtain and provide to the editor on final acceptance all necessary permissions to reproduce in print and electronic form any copyrighted work, including test materials (or portions thereof), photographs, and other graphic images (including those used as stimuli in experiments).

On advice of counsel, APA may decline to publish any image whose copyright status is unknown.

  • Download Permissions Alert Form (PDF, 13KB)

Publication policies

For full details on publication policies, including use of Artificial Intelligence tools, please see APA Publishing Policies .

APA policy prohibits an author from submitting the same manuscript for concurrent consideration by two or more publications.

See also APA Journals® Internet Posting Guidelines.

APA requires authors to reveal any possible conflict of interest in the conduct and reporting of research (e.g., financial interests in a test or procedure, funding by pharmaceutical companies for drug research).

  • Download Full Disclosure of Interests Form (PDF, 41KB)

In light of changing patterns of scientific knowledge dissemination, APA requires authors to provide information on prior dissemination of the data and narrative interpretations of the data/research appearing in the manuscript (e.g., if some or all were presented at a conference or meeting, posted on a listserv, shared on a website, including academic social networks like ResearchGate, etc.). This information (2–4 sentences) must be provided as part of the Author Note.

Ethical Principles

It is a violation of APA Ethical Principles to publish "as original data, data that have been previously published" (Standard 8.13).

In addition, APA Ethical Principles specify that "after research results are published, psychologists do not withhold the data on which their conclusions are based from other competent professionals who seek to verify the substantive claims through reanalysis and who intend to use such data only for that purpose, provided that the confidentiality of the participants can be protected and unless legal rights concerning proprietary data preclude their release" (Standard 8.14).

APA expects authors to adhere to these standards. Specifically, APA expects authors to have their data available throughout the editorial review process and for at least 5 years after the date of publication.

Authors are required to state in writing that they have complied with APA ethical standards in the treatment of their sample, human or animal, or to describe the details of treatment.

  • Download Certification of Compliance With APA Ethical Principles Form (PDF, 26KB)

The APA Ethics Office provides the full Ethical Principles of Psychologists and Code of Conduct electronically on its website in HTML, PDF, and Word format. You may also request a copy by emailing or calling the APA Ethics Office (202-336-5930). You may also read "Ethical Principles," December 1992, American Psychologist , Vol. 47, pp. 1597–1611.

Other information

See APA’s Publishing Policies page for more information on publication policies, including information on author contributorship and responsibilities of authors, author name changes after publication, the use of generative artificial intelligence, funder information and conflict-of-interest disclosures, duplicate publication, data publication and reuse, and preprints.

Visit the Journals Publishing Resource Center for more resources for writing, reviewing, and editing articles for publishing in APA journals.

Editor

Lillian T. Eby, PhD University of Georgia, United States

Associate editors

Talya N. Bauer, PhD Portland State University, United States

Wendy J. Casper, PhD University of Texas at Arlington, United States

Bryan D. Edwards, PhD Oklahoma State University, United States

Allison S. Gabriel, PhD Purdue University, United States

Alicia A. Grandey, PhD Pennsylvania State University, United States

Astrid C. Homan, PhD University of Amsterdam, Amsterdam, the Netherlands

Jenny M. Hoobler, PhD Nova School of Business and Economics, Carcavelos, Portugal

Jia (Jasmine) Hu, PhD Ohio State University, United States, and Tsinghua University, Beijing, China

Scott B. Morris, PhD Illinois Institute of Technology, United States

In-Sue Oh, PhD Temple University, United States

Fred Oswald, PhD Rice University, United States

Christopher O. L. H. Porter, PhD Virginia Polytechnic Institute and State University, United States

Kristen M. Shockley, PhD Auburn University, United States

Scott Tonidandel, PhD University of North Carolina at Charlotte, United States

Gillian B. Yeo, PhD University of Western Australia, Perth, Western Australia, Australia

Editorial fellows

Edwyna Hill, PhD University of South Carolina, United States

William Luse, PhD University of La Verne, United States

Tyree Mitchell, PhD Louisiana State University, United States

Emily Poulton, PhD Indiana University–Purdue University Indianapolis, United States

Editorial fellow mentors

Contributing editors

Herman Aguinis, PhD George Washington University, United States

Ramon J. Aldag, PhD University of Wisconsin–Madison, United States

David G. Allen, PhD Texas Christian University, United States

Tammy D. Allen, PhD University of South Florida, United States

Frederik Anseel, PhD University of New South Wales, Sydney, New South Wales, Australia

Samuel Aryee, PhD University of Surrey, Surrey, United Kingdom

Bruce Avolio, PhD University of Washington, United States

Daniel G. Bachrach, PhD University of Alabama, United States

Katie L. Badura, PhD Georgia Institute of Technology, United States

Timothy Ballard, PhD University of Queensland, St. Lucia, Queensland, Australia

Gary A. Ballinger, PhD University of Virginia, United States

Peter A. Bamberger, PhD Tel Aviv University, Ramat Aviv, Israel

Laurie J. Barclay, PhD University of Guelph, Guelph, Ontario, Canada

Christopher M. Barnes, PhD University of Washington, United States

Kathryn M. Bartol, PhD University of Maryland, United States

James W. Beck, PhD University of Waterloo, Waterloo, Ontario, Canada

Bradford S. Bell, PhD Cornell University, United States

Suzanne T. Bell, PhD DePaul University, United States

Frederick Scott Bentley, PhD University of Delaware, United States

Christopher M. Berry, PhD Indiana University, United States

Jeremy M. Beus, PhD Washington State University, United States

Barnini Bhattacharyya, PhD University of Western Ontario, Canada

Uta K. Bindl, PhD King’s College, London, United Kingdom

John F. Binning, PhD Illinois State University and DeGarmo Inc, Bloomington, IL, United States

Ronald Bledow, PhD Singapore Management University, Singapore

Paul D. Bliese, PhD University of South Carolina, United States

Mark Bolino, PhD University of Oklahoma, United States

Wendy R. Boswell, PhD Texas A&M University, United States

James A. Breaugh, PhD University of Missouri, Saint Louis, United States

Claudia Buengeler, PhD Kiel University, Kiel, Germany

Anne Burmeister, PhD Rotterdam School of Management, Rotterdam, The Netherlands

Marcus M. Butts, PhD Southern Methodist University, United States

Charles Calderwood, PhD Virginia Polytechnic Institute and State University, United States

Elizabeth M. Campbell, PhD University of Minnesota, Twin Cities, United States

Emily D. Campion, PhD University of Iowa, United States

Nichelle C. Carpenter, PhD Rutgers University, United States

Dorothy R. Carter, PhD Michigan State University, United States

Min Z. Carter, PhD Southern Illinois University, United States

Georgia T. Chao, PhD University of South Florida, United States

Prithviraj Chattopadhyay, PhD University of Cambridge, Cambridge, United Kingdom

Nitya Chawla, PhD Texas A&M University, United States

Gilad Chen, PhD University of Maryland—College Park, United States

Nai-Wen Chi, PhD National Sun Yat-Sen University, Kaohsiung, Taiwan

Jaepil Choi, PhD Sungkyunkwan University, Seoul, Korea

SinHui Chong, PhD Nanyang Technological University, Singapore

Michael S. Christian, PhD University of North Carolina at Chapel Hill, United States

Malissa A. Clark, PhD University of Georgia, United States

Stephen M. Colarelli, PhD Central Michigan University, United States

Michael S. Cole, PhD Texas Christian University, United States

Adrienne Colella, PhD Tulane University, United States

Patrick D. Converse, PhD Florida Institute of Technology, United States

Jose M. Cortina, PhD Virginia Commonwealth University, United States

Stephen H. Courtright, PhD University of Iowa, United States

Eean R. Crawford, PhD University of Iowa, United States

Kristen L. Cullen-Lester, PhD University of Mississippi, United States

Jeremy F. Dawson, PhD University of Sheffield, Sheffield, United Kingdom

Eric Anthony Day, PhD University of Oklahoma, United States

Serge P. da Motta Veiga, PhD NEOMA Business School, Reims, France

Katherine DeCelles, PhD University of Toronto, Toronto, Ontario, Canada

Bart A. de Jong, PhD Australian Catholic University, Sydney, New South Wales, Australia

Evangelia Demerouti, PhD Eindhoven University of Technology, Eindhoven, the Netherlands

Rellie R. Derfler-Rozin, PhD University of Maryland, United States

Lindsay Y. Dhanani, PhD Rutgers, The State University of New Jersey, United States

James M. Diefendorff, PhD University of Akron, United States

Erich C. Dierdorff, PhD DePaul University, United States

Nikolaos Dimotakis, PhD Oklahoma State University, United States

Brian R. Dineen, PhD Purdue University, United States

Michelle K. Duffy, PhD University of Minnesota, Twin Cities, United States

Mark G. Ehrhart, PhD University of Central Florida, United States

Jill Ellingson, PhD University of Kansas, United States

Aleksander P.J. Ellis, PhD University of Arizona, United States

Berrin Erdogan, PhD Portland State University, United States, and University of Exeter, United Kingdom

Jinyan Fan, PhD Auburn University, United States

Brady M. Firth, PhD Portland State University, United States

Michael T. Ford, PhD University of Alabama, United States

Trevor A. Foulk, PhD University of Maryland, United States

Kimberly A. French, PhD Georgia Institute of Technology, United States

Michael Frese, PhD Asia School of Management, Kuala Lumpur, Malaysia, and Leuphana University, Lüneburg, Germany

Ravi Shanker Gajendran, PhD Florida International University, United States

Ian R. Gellatly, PhD University of Alberta, Edmonton, Alberta, Canada

Steffen R. Giessner, PhD Erasmus University, Rotterdam School of Management, Rotterdam, the Netherlands

Stephen W. Gilliland, PhD Claremont Graduate University, United States

Erik Gonzalez-Mulé, PhD Indiana University, United States

Vicente González-Romá, PhD University of Valencia, Valencia, Spain

Lindsey M. Greco, PhD Oklahoma State University, United States

Rebecca L. Greenbaum, PhD Rutgers University, United States

Lindred Leura Greer, PhD University of Michigan, United States

Travis J. Grosser, PhD University of Connecticut, United States

Gudela Grote, PhD ETH Zürich, Zürich, Switzerland

Markus Groth, PhD University of New South Wales Australia, Sydney, New South Wales, Australia

Andrew C. Hafenbrack, PhD University of Washington, United States

Leslie B. Hammer, PhD Portland State University and Oregon Health & Science University, United States

Joo Hun Han, PhD Korea Advanced Institute of Science and Technology, Daejeon, South Korea

Samantha D. Hansen, PhD University of Toronto, Toronto, Ontario, Canada

Crystal M. Harold, PhD Temple University, United States

T. Brad Harris, PhD HEC Paris, Jouy-en-Josas, France

John P. Hausknecht, PhD Cornell University, United States

Madeline E. Heilman, PhD New York University, United States

Morela Hernandez, PhD University of Michigan, United States

Louis Hickman, PhD Virginia Tech, United States

Giles Hirst, PhD Australian National University, Canberra, Australian Capital Territory, Australia

Inga J. Hoever, PhD Erasmus University Rotterdam, Rotterdam, the Netherlands

Brian C. Holtz, PhD Temple University, United States

Peter W. Hom, PhD Arizona State University, United States

Michael Horvath, PhD Cleveland State University, United States

Jason L. Huang, PhD Michigan State University, United States

Stephen E. Humphrey, PhD Pennsylvania State University, United States

Ilke Inceoglu, PhD University of Exeter, Exeter, United Kingdom

Kaifeng Jiang, PhD The Ohio State University, United States

Michael A. Johnson, PhD Louisiana State University, United States

Michael D. Johnson, PhD University of Washington, United States

Kristen P. Jones, PhD University of Memphis, United States

Dana L. Joseph, PhD University of Central Florida, United States 

Sora Jun, PhD Rice University, United States

John D. Kammeyer-Mueller, PhD University of Minnesota, Twin Cities, United States

Seth Kaplan, PhD George Mason University, United States

Nina Keith, PhD Technische Universität Darmstadt, Darmstadt, Germany

Anita C. Keller, PhD University of Groningen, Groningen, the Netherlands

Gavin Kilduff, PhD New York University, United States

Eugene Kim, PhD Georgia Institute of Technology, United States

Bradley L. Kirkman, PhD North Carolina State University, United States

Anthony C. Klotz, PhD University College London, United Kingdom

Donald H. Kluemper, PhD University of Illinois at Chicago, United States

Joel Koopman, PhD Texas A&M University, United States

Jaclyn Koopmann, PhD Auburn University, United States

Maria L. Kraimer, PhD Rutgers University, New Brunswick, United States

Dina V. Krasikova, PhD University of Texas at San Antonio, United States

Jana Kühnel, PhD University of Vienna, Vienna, Austria

Timothy G. Kundro, PhD University of North Carolina at Chapel Hill, United States

Giuseppe (Joe) Labianca, PhD University of Kentucky, United States

Jamie J. Ladge, PhD Northeastern University, United States and University of Exeter Business School, Exeter, United Kingdom

Chak Fu Lam, PhD City University of Hong Kong, Kowloon, Hong Kong

Klodiana Lanaj, PhD University of Florida, United States

Blaine Landis, PhD University College London, London, United Kingdom

Keith Leavitt, PhD Oregon State University, United States

James M. LeBreton, PhD Pennsylvania State University, United States

Cynthia Lee, PhD Northeastern University, United States

Hun Whee Lee, PhD Ohio State University, United States

KiYoung Lee, PhD Yonsei University, Seoul, Korea

Anna C. Lennard, PhD Oklahoma State University, United States

Jeff LePine, PhD Arizona State University, United States

Hannes Leroy, PhD Erasmus University, Rotterdam School of Management, Rotterdam, the Netherlands

Lisa M. Leslie, PhD New York University, United States

Alex Ning Li, PhD Texas Christian University, United States

Andrew Li, PhD West Texas A&M University, United States

Ning Li, PhD Tsinghua University, Beijing, China

Wen-Dong Li, PhD Chinese University of Hong Kong, Shatin, New Territories, Hong Kong

Yixuan Li, PhD University of Florida, United States

Huiwen Lian, PhD Texas A&M University, United States

Zhenyu Liao, PhD Northeastern University, United States     

Robert C. Liden, PhD University of Illinois, Chicago, United States

Filip Lievens, PhD Singapore Management University, Singapore

Szu-Han (Joanna) Lin, PhD University of Georgia, United States

Weipeng Lin, PhD Shandong University, Jinan, Shandong, China

Laura M. Little, PhD University of Georgia, United States

Songqi Liu, PhD Georgia State University, United States

Yihao Liu, PhD University of Illinois at Urbana-Champaign, United States

Beth A. Livingston, PhD University of Iowa, United States

Margaret M. Luciano, PhD Pennsylvania State University, United States

Aleksandra Luksyte, PhD University of Western Australia, Perth, Western Australia, Australia

John Wiley Lynch, PhD University of Illinois at Chicago, United States

Brent J. Lyons, PhD York University, Toronto, Ontario, Canada

Jeremy D. Mackey, PhD Auburn University, United States

Mark A. Maltarich, PhD University of South Carolina, United States

Suzanne S. Masterson, PhD University of Cincinnati, United States

John E. Mathieu, PhD University of Connecticut, United States

Fadel K. Matta, PhD University of Georgia, United States

Russell A. Matthews, PhD University of Alabama, United States

Mary B. Mawritz, PhD Drexel University, United States

Daniel J. McAllister, PhD National University of Singapore, Singapore

Julie M. McCarthy, PhD University of Toronto, Toronto, Ontario, Canada

Brian William McCormick, PhD Northern Illinois University, United States

Lynn A. McFarland, PhD University of South Carolina, United States

Adam W. Meade, PhD North Carolina State University, United States

Laurenz L. Meier, PhD University of Neuchâtel, Neuchâtel, Switzerland

Jesse S. Michel, PhD Auburn University, United States

Christian Yarid Ayala Millán, PhD Tecnológico de Monterrey, Mexico

Marie S. Mitchell, PhD University of North Carolina at Chapel Hill, United States

Frederick P. Morgeson, PhD Michigan State University, United States

Elizabeth Wolfe Morrison, PhD New York University, United States

Cindy P. Muir (Zapata), PhD University of Notre Dame, United States

Kevin R. Murphy, PhD University of Limerick, Limerick, Ireland

Jennifer D. Nahrgang, PhD University of Iowa, United States

Andrew Neal, PhD University of Queensland, St. Lucia, Queensland, Australia

Barbara Nevicka, PhD University of Amsterdam, Amsterdam, the Netherlands

Daniel W. Newton, PhD University of Iowa, United States

Thomas W. H. Ng, PhD University of Hong Kong, Pok Fu Lam, Hong Kong

Cornelia Niessen, PhD Friedrich-Alexander University of Erlangen-Nürnberg, Erlangen, Germany

Bernard A. Nijstad, PhD University of Groningen, Groningen, the Netherlands

Raymond A. Noe, PhD Ohio State University, United States

Anthony J. Nyberg, PhD University of South Carolina, United States

Christopher D. Nye, PhD Michigan State University, United States

Ernest O'Boyle, PhD Indiana University, United States

Burak Oc, PhD Singapore Management University, Singapore

Babatunde Ogunfowora, PhD University of Calgary, Calgary, Alberta, Canada

Kyoungjo (Jo) Oh, PhD University of Connecticut, United States

Sandra Ohly, PhD University of Kassel, Kassel, Germany

Jane O'Reilly, PhD University of Ottawa, Ottawa, Ontario, Canada

Tae-Youn Park, PhD Cornell University, United States

Michael R. Parke, PhD University of Pennsylvania, United States

Stephanie C. Payne, PhD Texas A&M University, United States

Matthew J. Pearsall, PhD University of North Carolina at Chapel Hill, United States

Ann Chunyan Peng, PhD University of Missouri, United States

Jill Perry-Smith, PhD Emory University, United States

Robert Ployhart, PhD University of South Carolina, United States

Caitlin Porter, PhD University of Memphis, United States

Belle Rose Ragins, PhD University of Wisconsin–Milwaukee, United States

Christian J. Resick, PhD Drexel University, United States

Simon Lloyd D. Restubog, PhD University of Illinois at Urbana-Champaign, United States

Hettie A. Richardson, PhD Texas Christian University, United States

Andreas Richter, PhD University of Cambridge, Cambridge, United Kingdom

Melissa Robertson, PhD Purdue University, United States

Jessica B. Rodell, PhD University of Georgia, United States

Steven Gary Rogelberg, PhD University of North Carolina at Charlotte, United States

Christopher C. Rosen, PhD University of Arkansas, United States

Alex L. Rubenstein, PhD University of Central Florida, United States

Enrica N. Ruggs, PhD University of Houston, United States

Deborah E. Rupp, PhD George Mason University, United States

Paul R. Sackett, PhD University of Minnesota, Twin Cities, United States

Juan Ignacio Sanchez, PhD Florida International University, United States

Katina B. Sawyer, PhD The George Washington University, United States

John M. Schaubroeck, PhD University of Missouri, United States

Donald J. Schepker, PhD University of South Carolina, United States

Charles A. Scherbaum, PhD Baruch College, City University of New York, United States

Beth Schinoff, PhD University of Delaware, United States

Aaron M. Schmidt, PhD University of Minnesota, Twin Cities, United States

Benjamin Schneider, PhD University of Maryland, United States

Brent A. Scott, PhD Michigan State University, United States

Priti Pradhan Shah, PhD University of Minnesota, Twin Cities, United States

Ruodan Shao, PhD York University, Toronto, Ontario, Canada

Jason D. Shaw, PhD Nanyang Technological University, Singapore

Winny Shen, PhD York University, Toronto, Ontario, Canada

Junqi Shi, PhD Zhejiang University, Hangzhou, China 

Shung Jae Shin, PhD Portland State University, United States

Lynn M. Shore, PhD Colorado State University, United States

Mindy Shoss, PhD University of Central Florida, United States

Lauren S. Simon, PhD University of Arkansas, United States

Ruchi Sinha, PhD University of South Australia, Adelaide, South Australia, Australia

Traci Sitzmann, PhD University of Colorado Denver, United States

Jerel E. Slaughter, PhD University of Arizona, United States

David M. Sluss, PhD ESSEC Business School, Cergy-Pontoise, France

Alexis N. Smith (Washington), PhD Oklahoma State University, United States

Troy A. Smith, PhD University of Nebraska, United States

Q. Chelsea Song, PhD Indiana University, United States

Yifan Song, PhD Texas A&M University, United States

Sabine Sonnentag, PhD University of Mannheim, Mannheim, Germany

Andrew B. Speer, PhD Indiana University, United States

Alex Stajkovic, PhD University of Wisconsin–Madison, United States

Kayla Stajkovic, PhD University of California, Davis, United States

Greg L. Stewart, PhD University of Iowa, United States

Karoline Strauss, PhD ESSEC Business School, Cergy-Pontoise, France

Michael C. Sturman, PhD Rutgers University, United States

Rong Su, PhD University of Iowa, United States

Lorne M. Sulsky, PhD Memorial University of Newfoundland, St. John's, Newfoundland, Canada

Tianjun Sun, PhD Kansas State University, United States

Brian W. Swider, PhD University of Florida, United States

Kenneth Tai, PhD Singapore Management University, Singapore

Riki Takeuchi, PhD University of Texas at Dallas, United States

Subrahmaniam Tangirala, PhD University of Maryland, United States

Shannon G. Taylor, PhD University of Central Florida, United States

Stefan Thau, PhD INSEAD, Singapore

Phillip S. Thompson, PhD Virginia Tech, United States

Christian N. Thoroughgood, PhD Georgia State University, United States

March L. To, PhD RMIT University, Melbourne, Australia

Donald M. Truxillo, PhD University of Limerick, Limerick, Ireland

Kerrie L. Unsworth, PhD University of Leeds, Leeds, United Kingdom

Edwin A. J. Van Hooft, PhD University of Amsterdam, Amsterdam, the Netherlands

Gerben A. van Kleef, PhD University of Amsterdam, Amsterdam, the Netherlands

Jeffrey B. Vancouver, PhD Ohio University, United States

Robert J. Vandenberg, PhD University of Georgia, United States

Hoda Vaziri, PhD University of North Texas, United States

Vijaya Venkataramani, PhD University of Maryland—College Park, United States

Chockalingam Viswesvaran, PhD Florida International University, United States

Ryan M. Vogel, PhD Temple University, United States

David A. Waldman, PhD Arizona State University, United States

H. Jack Walker, PhD Auburn University, United States

Danni Wang, PhD Rutgers, The State University of New Jersey, United States

Gang Wang, PhD Florida State University, United States

Mo Wang, PhD University of Florida, United States

Julie Holliday Wayne, PhD Wake Forest University, United States

Elijah X. M. Wee, PhD University of Washington, United States

Serena Wee, PhD University of Western Australia, Perth, Western Australia, Australia

Justin M. Weinhardt, PhD University of Calgary, Calgary, Alberta, Canada

Bart Wille, PhD Ghent University, Ghent, Belgium

Kelly Schwind Wilson, PhD Purdue University, United States

David J. Woehr, PhD University of North Carolina at Charlotte, United States

Kin Fai Ellick Wong, PhD Hong Kong University of Science & Technology, Kowloon, Hong Kong

Junfeng Wu, PhD University of Texas at Dallas, United States

Kai Chi Yam, PhD National University of Singapore, Singapore

Liu-Qin Yang, PhD Portland State University, United States

Kang Yang Trevor Yu, PhD Nanyang Technological University, Singapore

Zhenyu Yuan, PhD University of Illinois at Chicago, United States

Yujie (Jessie) Zhan, PhD Wilfrid Laurier University, Waterloo, Ontario, Canada

Zhen Zhang, PhD Southern Methodist University, United States

Jing Zhou, PhD Rice University, United States

Le Zhou, PhD University of Minnesota, Twin Cities, United States

Michael J. Zickar, PhD Bowling Green State University, United States

Jonathan C. Ziegert, PhD Drexel University, United States

Kate P. Zipay, PhD Purdue University, United States

Editorial assistant

Jennifer Wood

Abstracting and indexing services providing coverage of Journal of Applied Psychology®

  • ABI/INFORM Complete
  • ABI/INFORM Global
  • ABI/INFORM Professional Advanced
  • ABI/INFORM Professional Standard
  • ABI/INFORM Research
  • Academic OneFile
  • Advanced Placement Psychology Collection
  • ASSIA: Applied Social Sciences Index & Abstracts
  • Biological Abstracts
  • Business & Company Profile ASAP
  • Business Source Alumni Edition
  • Business Source Complete
  • Business Source Corporate Plus
  • Business Source Elite
  • Business Source Index
  • Business Source Premier
  • Cabell’s Directory of Publishing Opportunities in Psychology
  • Chartered Association of Business Schools (CABS) Academic Journal Guide
  • CINAHL Complete
  • CINAHL Plus
  • Corporate ResourceNet
  • Current Abstracts
  • Current Contents: Social & Behavioral Sciences
  • EBSCO MegaFILE
  • Educational Administration Abstracts
  • Ergonomics Abstracts
  • ERIH (European Reference Index for the Humanities and Social Sciences)
  • Expanded Academic ASAP
  • General OneFile
  • General Reference Center International
  • Health Reference Center Academic
  • Human Resources Abstracts
  • Humanities and Social Sciences Index Retrospective
  • Humanities Index Retrospective
  • Humanities Source
  • IBZ / IBR (Internationale Bibliographie der Rezensionen Geistes- und Sozialwissenschaftlicher Literatur)
  • InfoTrac Custom
  • International Bibliography of the Social Sciences
  • Journal Citations Report: Social Sciences Edition
  • Military and Intelligence
  • MLA International Bibliography
  • Mosby's Nursing Consult
  • NSA Collection
  • Nursing and Allied Health Collection
  • Nursing Resource Center
  • OmniFile Full Text Mega
  • Professional ProQuest Central
  • ProQuest Central
  • ProQuest Discovery
  • ProQuest Health Management
  • ProQuest Platinum Periodicals
  • ProQuest Psychology Journals
  • ProQuest Research Library
  • ProQuest Social Science Journals
  • Psychology Collection
  • Public Affairs Index
  • Race Relations Abstracts
  • RILM Abstracts of Music Literature
  • Social Sciences Abstracts
  • Social Sciences Citation Index
  • Social Sciences Full Text
  • Social Sciences Index Retrospective
  • TOC Premier
  • Women's Studies International

Special issue of APA's Journal of Applied Psychology, Vol. 104, No. 3, March 2019. Includes articles about team membership, team performance, leadership, goals, and work teams.

Special issue of APA's Journal of Applied Psychology, Vol. 104, No. 1, January 2019. Includes articles about self-reflection, email demands, leader-member exchange, training, leader ratings, organizational performance, ethics, abusive supervision, leader morality, and psychopathy.

Special issue of APA's Journal of Applied Psychology, Vol. 102, No. 3, March 2017. Includes articles about the science of work and organizational psychology with subthemes of building the workforce, managing the workforce, managing differences between and within organizations, and exiting work.

Transparency and Openness Promotion

APA endorses the Transparency and Openness Promotion (TOP) Guidelines, developed by a community working group in conjunction with the Center for Open Science (Nosek et al., 2015). Effective November 1, 2021, all articles submitted to the Journal of Applied Psychology must, at a minimum, adhere to the TOP levels as noted in the list below, which details the domains of research planning and reporting, the TOP level required, and a brief description of the journal’s policy. Submissions that do not include (1) qualitative, quantitative, or simulated data, (2) a systematic narrative or meta-analytic review of the literature, or (3) reanalysis of existing data must also include a statement that TOP guidelines related to data sharing, code sharing, hypothesis preregistration, analysis preregistration, and materials sharing are not applicable and why (this can appear in the method section). All articles must comply with the TOP guideline of citing any materials not original to the submitted work.

  • Level 1: Disclosure—The article must disclose whether or not the materials are posted to a trusted repository.
  • Level 2: Requirement—The article must share materials via a trusted repository when legally and ethically permitted (or disclose the legal and/or ethical restriction when not permitted).
  • Level 3: Verification—A third party must verify that the standard is met.

Empirical research, including meta-analyses, submitted to the Journal of Applied Psychology must, at a minimum, adhere to the TOP levels as noted in the list below, which details the domains of research planning and reporting, the TOP level required by the Journal of Applied Psychology , and a brief description of the journal’s policy.

  • Citation: Level 2, Requirement—All data, program code, and other methods not original to the submitted work (developed by others) must be cited in the text and listed in the references section.
  • Data Transparency: Level 1, Disclosure—Article states whether or not the raw and/or processed data on which study conclusions are based are posted to a trusted repository and, if so, how to access data.
  • Analytic Methods (Code) Transparency: Level 1, Disclosure—Article states whether computer code or syntax needed to reproduce analyses in an article is posted to a trusted repository and if so how to access it.
  • Research Materials Transparency: Level 1, Disclosure—Article states whether materials described in the method section are posted to a trusted repository and, if so, how to access them.
  • Design and Analysis Transparency (Reporting Standards): Level 2, Requirement—Article must follow the JAP methods checklist and all relevant aspects of the APA Style Journal Reporting Standards (JARS).

Note: If you have reproduced full scales or other material (e.g., tables, figures, experimental materials) from previously published articles, please note that it is your responsibility to ensure that doing so does not constitute a copyright infringement with the original publisher. You can find information on fair use of previously published material and on APA’s permissions policy in the Permissions Alert Form for APA Authors . In addition, we strongly encourage the inclusion of online supplements for study materials.

Authors should include a brief (one short paragraph) subsection in their method section titled “Transparency and openness” that indicates how they complied with the TOP guidelines adopted at the Journal of Applied Psychology .

  • We describe our sampling plan, all data exclusions (if any), all manipulations, and all measures in the study, and we adhered to the Journal of Applied Psychology methodological checklist. All data, analysis code, and research materials are available at [stable masked link to trusted repository]. Data were analyzed using R, version 4.0.0 (R Core Team, 2020) and the package ggplot, version 3.2.1 (Wickham, 2016). This study’s design and its analysis were not preregistered.
  • We describe our sampling plan, all data exclusions (if any), all manipulations, and all measures in the study, and we adhered to the Journal of Applied Psychology methodological checklist. Analysis code and research materials are available at [stable masked link to trusted repository]. Data are not available due to their proprietary nature. Data were analyzed using R, version 4.0.0 (R Core Team, 2020) and the package ggplot, version 3.2.1 (Wickham, 2016). The study design was not preregistered because the data were collected for an applied selection project. The hypotheses and analysis were preregistered [masked OSF link].

Please refer to the Center for Open Science TOP guidelines for details, and contact the editor (Lillian T. Eby, PhD) with any further questions. APA recommends sharing data, materials, and code via trusted repositories (e.g., APA’s repository on the Open Science Framework (OSF)). Trusted repositories adhere to policies that make data discoverable, accessible, usable, and preserved for the long term. Trusted repositories also assign unique and persistent identifiers.

We encourage investigators to preregister their studies and to share protocols and analysis plans prior to conducting the research. There are many available preregistration forms (e.g., the APA Preregistration for Quantitative Research in Psychology template, ClinicalTrials.gov, aspredicted.org, or other preregistration templates available via OSF). Completed preregistration forms should be time-stamped and posted on a publicly accessible registry system (e.g., OSF, ClinicalTrials.gov, or other trial registries in the WHO Registry Network).

A list of participating journals is also available from APA.

Other open science initiatives

  • Open Science badges: Not offered
  • Public significance statements: Not offered
  • Author contribution statements using CRediT: Required
  • Registered Reports: Not published
  • Replications: Published

Explore open science at APA.

Inclusive study designs

  • Diverse samples

Definitions and further details on inclusive study designs are available on the Journals EDI homepage.

Inclusive reporting standards

  • Bias-free language and community-driven language guidelines (required)
  • Data sharing and data availability statements (required)

More information on this journal’s reporting standards is listed under the submission guidelines tab.

Pathways to authorship and editorship

Editorial fellowships

Editorial fellowships help early-career psychologists gain firsthand experience in scholarly publishing and editorial leadership roles. This journal offers an editorial fellowship program for early-career psychologists from historically excluded communities.

Paper development workshops and other resources

This journal holds annual paper development workshops, with priority given to underrepresented scholars. Many EDI resources, translational summaries focusing on EDI, and more are available from The Journal Editor's Corner.

Other EDI offerings

ORCID reviewer recognition

Open Researcher and Contributor ID (ORCID) Reviewer Recognition provides a visible and verifiable way for journals to publicly credit reviewers without compromising the confidentiality of the peer-review process. This journal has implemented the ORCID Reviewer Recognition feature in Editorial Manager, so reviewers can be recognized for their contributions.

Masked peer review

This journal offers masked peer review (where the authors’ and reviewers’ identities are concealed from one another). Research has shown that masked peer review can help reduce implicit bias against traditionally female names or early-career scientists with smaller publication records (Budden et al., 2008; Darling, 2015).

  • Reflections on the Journal of Applied Psychology in Times of Change, by Lillian Eby (January 2022)
  • Editorial (PDF, 47KB), by Lillian Eby (January 2021)

Editor Spotlight

  • Read an interview with Editor Lillian Eby, PhD

From APA Journals Article Spotlight®

  • You've got to have friends: The spillover of cross-race friendships to diversity training and education
  • The art of racing (deadlines) in the rain
  • Psychopaths in the C-suite?

Journal Alert

Sign up to receive email alerts on the latest content published.


Subscriptions and access

  • Pricing and individual access
  • APA PsycArticles database

Calls for Papers

Access options

  • APA publishing resources
  • Educators and students
  • Editor resource center

APA Publishing Insider

APA Publishing Insider is a free monthly newsletter with tips on APA Style, open science initiatives, active calls for papers, research summaries, and more.


Contact Journals


New Open Access Journal from APS and Sage Expands Publishing Opportunity for Psychological Scientists

Applications now open for role of inaugural editor.



August 13, 2024 — The Association for Psychological Science (APS) and Sage announce the launch of Advances in Psychological Science Open, a fully open access journal that will publish high-quality empirical, technical, theoretical, and review articles across the full range of areas and topics in psychological science. The journal will accept submissions in a variety of formats, including long-form articles and short reports, and APS encourages scientists to submit integrative and interdisciplinary research articles.

“APS is always working to identify new ways to catalyze advances in psychological science,” said APS CEO Robert Gropp. “We are excited to announce that we are launching Advances in Psychological Science Open to provide a publication option for scientists who want a fully open access journal in which to share their research findings.”  

APS has launched a search for the inaugural editor of the journal, with the goal of appointing an editor to begin work in January 2025 and the first manuscripts accepted in mid-2025. Nominations, including self-nominations, for founding editor are welcome. Nominations of members of groups underrepresented in psychology are especially encouraged. For more information on how to submit a nomination, please refer to the open call.

“Sage has been committed to opening pathways for social and behavioral scientists since our founding nearly 60 years ago,” said Bob Howard, executive vice president, research at Sage. “We’re thrilled to build on our long-standing partnership with APS to launch a journal publishing high-quality, impactful research that will help shape the future of psychological science.” 

The new title is the seventh journal that APS will publish in partnership with Sage. Advances in Psychological Science Open adds to the rich ecosystem of APS publications that collectively meet the needs of the psychological science community. APS members will receive a significant discount on the open access fee for this new journal, adding to the suite of benefits that members already receive. 

For more information, please contact Scott Sleek at [email protected] .

About Sage  

Sage  is a global academic publisher of books, journals, and library resources with a growing range of technologies to enable discovery, access, and engagement. Believing that research and education are critical in shaping society, 24-year-old Sara Miller McCune founded Sage in 1965. Today, we are controlled by a group of trustees charged with maintaining our independence and mission indefinitely.  

See the related announcement: Advances in Psychological Science Open  Coming Soon



Incoming PSPI Editor Colleen Seifert Outlines Her Goals for the Journal

Colleen Seifert aims to expand the range of topics covered in the APS publication.


Research Briefs

Recent highlights from APS journals articles.


Recent highlights from APS journals articles on the link between self-esteem and eating disorders, how to be liked in first encounters, the effects of stress on rigid learning, and much more.


Forming intervals of predicted total scores for cut-off scores evaluation: a generalizability theory application with Bootstrapping

  • Published: 14 August 2024

Cite this article


  • Zhehan Jiang 1,
  • Jinying Ouyang 1, 2,
  • Dexin Shi 3,
  • Junhao Pan 4,
  • Lingling Xu 1, 5 &
  • Fen Cai (ORCID: orcid.org/0000-0002-6734-2860) 1, 5

In routine Generalizability Theory (G-theory) research, establishing or assessing cut-off scores for performance evaluation is consistently sought after yet poses significant challenges. While many studies have presented diverse indices to evaluate potential cut-off scores, these indices are frequently design-specific, which limits their adaptability across assessment contexts. This paper reframes G-theory within the context of a Linear Mixed-Effects Model (LMM) and employs LMM-based bootstrapping techniques to generate Intervals of Predicted Total Scores (IPTS) for each individual, which are then used to evaluate candidate cut-off scores. We propose PredC to quantify the proportion of individuals whose 95% IPTS encompass a given candidate value of interest, thereby gauging the appropriateness of different cut-off choices. A lower PredC is preferable, since it signifies that the corresponding cut-off score effectively discriminates between individuals, categorizing them into distinct groups (e.g., pass or fail). The study comprises two parts. First, a Monte Carlo simulation study compared the performance of three bootstrapping techniques (parametric, semiparametric, and nonparametric) in constructing IPTS. Second, a real dataset illustrated the practical implementation of the proposed methodology. The simulation findings revealed that the distribution of scores greatly influences the efficacy of IPTS, with semiparametric and nonparametric bootstraps being the preferred methods in situations involving ordinal data points, as was the case in the real-world dataset. When assessing cut-off scores, PredC serves as a valuable complement to the existing suite of quantitative tools, thereby contributing to enhanced standard-setting practices.


Data availability

Data are available from the authors upon request.

Alles, A. A., Wiedmann, M., & Martin, N. H. (2018). Rapid detection and characterization of postpasteurization contaminants in pasteurized fluid milk. Journal of Dairy Science , 101 (9), 7746–7756.


Angoff, W. H. (1971). Scales, norms, and equivalent scores. In R. L. Thorndike (Ed.), Educational measurement (2nd ed., pp. 508–600). American Council on Education.


American Educational Research Association, American Psychological Association, & National Council on Measurement in Education. (2014). Standards for educational and psychological testing . American Educational Research Association.

Bates, D., Mächler, M., Bolker, B., & Walker, S. (2015). Fitting linear mixed-effects models using lme4. Journal of Statistical Software , 67 (1), 1–48.

Berk, R. A. (1986). A consumer’s guide to setting performance standards on criterion-referenced tests. Review of Educational Research , 56 , 137–172.

Brooks, M. E., Kristensen, K., van Benthem, K. J., Magnusson, A., Berg, C. W., Nielsen, A., ... Bolker, B. M. (2017). glmmTMB balances speed and flexibility among packages for zero-inflated generalized linear mixed modeling. The R Journal , 9 (2), 378–400.

Brennan, R. L. (1998). Raw-score conditional standard errors of measurement in generalizability theory. Applied Psychological Measurement , 22 (4), 307–331.


Brennan, R. L. (2000). Performance assessments from the perspective of generalizability theory. Applied Psychological Measurement , 24 (4), 339–353.

Cardinet, J., Johnson, S., & Pini, G. (2010). Applying generalizability theory using EduG . Routledge.

Cronbach, L. J., Gleser, G. C., Nanda, H., & Rajaratnam, N. (1972). The dependability of behavioral measurements: Theory of generalizability for scores and profiles . John Wiley & Sons.

Eberlein, E., & Hammerstein, E. A. (2003). Generalized hyperbolic and inverse Gaussian distributions: Limiting cases and approximation of processes . University of Freiburg. Nr. 80.

Erosheva, E. A., Martinková, P., & Lee, C. J. (2021). When zero may not be zero: A cautionary note on the use of inter-rater reliability in evaluating grant peer review. Journal of the Royal Statistical Society Series A: Statistics in Society , 184 (3), 904–919.

Fruehwald, J. (2016). The early influence of phonology on a phonetic change. Language , 92 (2), 376–410.

Hambleton, R. K., Jaeger, R. M., Plake, B. S., & Mills, C. (2000). Setting performance standards on complex educational assessments. Applied Psychological Measurement , 24 (4), 355–366.

Jaeger, R. M. (1989). Certification of student competence. In R. L. Linn (Ed.), Educational measurement (3rd ed.). New York: American Council on Education and Macmillan.

Jiang, Z. (2018). Using linear mixed-effect model framework to estimate generalizability variance component in R: A lme4 package application. Methodology, 14 (3), 133–142. https://doi.org/10.1027/1614-2241/a000149

Jiang, Z., & Skorupski, W. (2018). A Bayesian approach to estimating variance components within a multivariate generalizability theory framework. Behavior Research Methods, 50 (6), 2193–2214. https://doi.org/10.3758/s13428-017-0986-3

Jiang, Z., Walker, K., Shi, D., & Cao, J. (2018). Improving generalizability coefficient estimate accuracy: A way to incorporate auxiliary information. Methodological Innovations . https://doi.org/10.1177/2059799118791397

Jiang, Z., Raymond, M., Shi, D., & DiStefano, C. (2020). Using linear mixed-effect model framework to estimate multivariate generalizability theory in R. Behavior Research Methods, 52 (6), 2383–2393. https://doi.org/10.3758/s13428-020-01399-z

Jiang, Z., Shi, D., & DiStefano, C. (2021). A short note on optimizing cost-generalizability via a machine-learning approach. Educational and Psychological Measurement, 81 (6), 1221–1233. https://doi.org/10.1177/0013164421992112

Jiang, Z., Ouyang, J., Li, L., Han, Y., Xu, L., Liu, R., & Sun, J. (2022a). Cost-effectiveness analysis in performance assessments: A case study of the objective structured clinical examination. Medical Education Online, 27 (1). https://doi.org/10.1080/10872981.2022.2136559

Jiang, Z., Raymond, M., Shi, D., DiStefano, C., Liu, R., & Sun, J. (2022b). A Monte Carlo study of confidence interval methods for generalizability coefficient. Educational and Psychological Measurement, 82 (4), 705–718. https://doi.org/10.1177/00131644211033899

Jung, K., Lee, J., Gupta, V., & Cho, G. (2019). Comparison of bootstrap confidence interval methods for GSCA using a Monte Carlo simulation. Frontiers in Psychology , 10 , 2215.

Kane, M., Crooks, T., & Cohen, A. (1999). Validating measures of performance. Educational Measurement: Issues and Practice , 18 (2), 5–17.

Klatt, W. K., Mayer, B., & Lobmaier, J. S. (2020). Content matters: Cyclic effects on women’s voices depend on social context. Hormones and Behavior , 122 , 104762.

Knowles, J. E., & Frederick, C. (2016). Prediction intervals from merMod objects [R package vignette]. CRAN. https://cran.r-project.org/web/packages/merTools/vignettes/Using_predictInterval.html

Lane, S., Liu, M., Ankenmann, R. D., & Stone, C. A. (1996). Generalizability and validity of mathematics performance assessment. Journal of Educational Measurement , 33 (1), 71–92.

Lewis, D. M., Mitzel, H. C., & Green, D. R. (1996, June). Standard setting: A bookmark approach . Paper presented at the Council of Chief State School Officers National Conference on Large-Scale Assessment, Phoenix, AZ.

Li, G. (2023). Which method is optimal for estimating variance components and their variability in generalizability theory? Evidence from a set of unified rules for bootstrap method. PLoS ONE, 18 (7), e0288069. https://doi.org/10.1371/journal.pone.0288069

Li, G., & Zhang, M. (2012). Analysis of cross-distribution for estimating variance components in generalizability theory. Psychological Development and Education , 28 (6), 665–672.

Livingston, S. A., & Zieky, M. J. (1982). Passing scores: A manual for setting standards of performance on educational and occupational tests . Princeton NJ: Educational Testing Service.

LoPilato, A. C., Carter, N. T., & Wang, M. (2015). Updating generalizability theory in management research: Bayesian estimation of variance components. Journal of Management , 41 (2), 692–717.

Malau-Aduli, B. S., Teague, P. A., D’Souza, K., Heal, C., Turner, R., Garne, D. L., & van der Vleuten, C. (2017). A collaborative comparison of objective structured clinical examination (OSCE) standard setting methods at Australian medical schools. Medical Teacher , 39 (12), 1261–1267.

Mena, R. H., & Walker, S. G. (2007). On the stationary version of the generalized hyperbolic ARCH model. Annals of the Institute of Statistical Mathematics , 59 (2), 325–348.

Mitzel, H. C., Lewis, D. M., Patz, R. J., & Green, D. R. (2001). The bookmark procedure: Psychological perspectives. In G. J. Cizek (Ed.), Setting performance standards (pp. 249–281). Lawrence Erlbaum.

Moore, C. (2016). gtheory: Apply generalizability theory with R [Computer software version]. Retrieved from https://cran.r-project.org/web/packages/gtheory/gtheory.pdf

Nellhaus, J. (2000). States with NAEP-like performance standards . Washington DC: National Assessment Governing Board.

Parshall, C. G., Davey, T., & Pashley, P. J. (2000). Innovative item types for computerized testing. In W. J. van der Linden & C. Glas (Eds.), Computerized adaptive testing: Theory and practice . Kluwer.

Peitzman, S. J., & Cuddy, M. M. (2015). Performance in physical examination on the USMLE step 2 clinical skills examination. Academic Medicine , 90 (2), 209–213.

Plake, B. S. (1998). Setting performance standards for professional licensure and certification. Applied Measurement in Education , 11 , 65–80.

Puth, M. T., Neuhäuser, M., & Ruxton, G. D. (2015). On the variety of methods for calculating confidence intervals by bootstrapping. Journal of Animal Ecology , 84 (4), 892–897.

Rousselet, G. A., Pernet, C. R., & Wilcox, R. R. (2021). The percentile bootstrap: A primer with step-by-step instructions in R. Advances in Methods and Practices in Psychological Science , 4 (1), 2515245920911881.

Shavelson, R. J., Baxter, G. P., & Gao, X. (1993). Sampling variability of performance assessments. Journal of Educational Measurement , 30 (3), 215–232.

Thombs, L. A., & Schucany, W. R. (1990). Bootstrap prediction intervals for autoregression. Journal of the American Statistical Association , 85 (410), 486–492.

Tong, Y., & Brennan, R. L. (2007). Bootstrap estimates of standard errors in generalizability theory. Educational and Psychological Measurement , 67 (5), 804–817.

Vispoel, W. P., Morris, C. A., & Kilinc, M. (2018a). Applications of generalizability theory and their relations to classical test theory and structural equation modeling. Psychological Methods, 23 (1), 1–26. https://doi.org/10.1037/met0000107

Vispoel, W. P., Morris, C. A., & Kilinc, M. (2018b). Practical applications of generalizability theory for designing, evaluating, and improving psychological assessments. Journal of Personality Assessment, 100 , 53–67. https://doi.org/10.1080/00223891.2017.1296455

Vispoel, W. P., Morris, C. A., & Kilinc, M. (2018c). Using generalizability theory to disattenuate correlation coefficients for multiple sources of measurement error. Multivariate Behavioral Research, 53 , 481–501. https://doi.org/10.1080/00273171.2018.1457938

Vispoel, W. P., Lee, H., Xu, G., & Hong, H. (2022a). Expanding bifactor models of psychological traits to account for multiple sources of measurement error. Psychological Assessment, 34 , 1093–1111. https://doi.org/10.1037/pas0001170

Vispoel, W. P., Xu, G., & Schneider, W. S. (2022b). Interrelationships between latent state trait theory and generalizability theory in a structural equation modeling framework. Psychological Methods, 27 , 773–803. https://doi.org/10.1037/met0000290

Vispoel, W. P., Xu, G., & Schneider, W. S. (2022c). Using parallel splits with self-report and other measures to enhance precision in generalizability theory analyses. Journal of Personality Assessment, 104 , 303–319. https://doi.org/10.1080/00223891.2021.1938589

Vispoel, W. P., Hong, H., & Lee, H. (2023a). Benefits of doing generalizability theory analyses within structural equation modeling frameworks: Illustrations using the Rosenberg Self-Esteem Scale. Structural Equation Modeling: A Multidisciplinary Journal. Advance online publication. https://doi.org/10.1080/10705511.2023.2187734

Vispoel, W. P., Lee, H., Chen, T., & Hong, H. (2023b). Using structural equation modeling to reproduce and extend ANOVA based generalizability theory analyses for psychological assessments. Psych, 5 , 249–273. https://doi.org/10.3390/psych5020019

Yang, Y., Richards-Zawacki, C. L., Devar, A., & Dugas, M. B. (2016). Poison frog color morphs express assortative mate preferences in allopatry but not sympatry. Evolution , 70 (12), 2778–2788.

Yin, P., & Sconing, J. (2008). Estimating standard errors of cut scores for item rating and mapmark procedures: A generalizability theory approach. Educational and Psychological Measurement , 68 (1), 25–41.

Yousuf, N., Violato, C., & Zuberi, R. W. (2015). Standard setting methods for pass/fail decisions on high-stakes objective structured clinical examinations: A validity study. Teaching and Learning in Medicine , 27 (3), 280–291.

Funding

This work was supported by the National Natural Science Foundation of China for Young Scholars under Grant 72104006; Peking University Health Science Center under Grant BMU2021YJ010; the National Medical Examination Center of China for the project Examination Standards and Content Designs of National Medical Licensing Examination.

Author information

Authors and affiliations

Institute of Medical Education, Health Science Center, Peking University, Beijing, China

Zhehan Jiang, Jinying Ouyang, Lingling Xu & Fen Cai

School of Public Health, Peking University, Beijing, China

Jinying Ouyang

Department of Psychology, University of South Carolina, Columbia, SC, USA

Dexin Shi

Department of Psychology, Sun Yat-sen University, Guangzhou, China

Junhao Pan

Graduate School of Education, Peking University, Beijing, China

Lingling Xu & Fen Cai


Corresponding authors

Correspondence to Junhao Pan or Fen Cai .

Ethics declarations

Conflict of interest

No potential conflict of interest was reported by the author(s).

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Part 1. Simplified guide to key statistical techniques

Generalizability Theory (G-theory)

G-theory, developed by Lee Cronbach and others, is a statistical framework used to analyze the reliability and variability in test scores. It breaks down the total variance in scores into several components: facets (sources of variation like students, raters, or tasks), interactions between facets, and error. By doing so, G-theory helps identify which aspects of an assessment contribute most to score variability, aiding in the improvement of test design and interpretation.

Example: Imagine assessing surgical skill in medical students. G-theory would help discern if variations in scores are mainly due to differences between students, the specific surgical tasks, or the evaluators.
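To make the LMM reframing concrete, here is a minimal sketch in R (the language used in the study, though this is not the authors' code) of how variance components for a fully crossed persons × tasks × raters design can be estimated with lme4, in the spirit of the Jiang (2018) reference; the data frame scores and its column names are hypothetical.

library(lme4)

# long format: one row per person-task-rater observation, with factor
# columns person, task, and rater, and a numeric column score
fit <- lmer(score ~ 1 + (1 | person) + (1 | task) + (1 | rater) +
              (1 | person:task) + (1 | person:rater) + (1 | task:rater),
            data = scores)

# each random-intercept variance maps onto a G-theory facet or interaction;
# the residual absorbs the three-way interaction confounded with error
print(VarCorr(fit), comp = "Variance")

In the surgical-skill example, students, tasks, and evaluators would be the three facets, and the relative sizes of these variance components would show which source drives score variability.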

Linear Mixed-Effects Model (LMM)

An LMM is an advanced statistical model used when data are hierarchical or nested (e.g., students nested within classrooms). It accounts for both fixed effects (like time or treatment) and random effects (like differences between classrooms), providing a more nuanced understanding of the data structure.

Example: In a study of student performance, an LMM could model individual student performance (fixed effect) while also considering classroom-level differences in teaching quality (random effect).
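A minimal sketch of that classroom example in lme4, assuming a hypothetical data frame d with columns score, time, and classroom:

library(lme4)

# time enters as a fixed effect; classroom-to-classroom differences
# enter as a random intercept
m <- lmer(score ~ time + (1 | classroom), data = d)
summary(m)  # fixed-effect slope for time plus the classroom variance component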

Bootstrapping

Bootstrapping is a resampling technique used to estimate statistics on a population by repeatedly sampling from the original dataset. It provides a way to quantify the uncertainty around estimates and construct confidence intervals without making strong assumptions about the underlying data distribution.

Example: To estimate the variability of a cut-off score, bootstrapping involves randomly selecting subsets of data with replacement, calculating the cut-off for each subset, and then analyzing the distribution of these calculated cut-offs.
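The resampling loop itself is short. Below is a generic nonparametric bootstrap sketch in base R; the simulated total_scores vector and the use of the median as the statistic of interest are illustrative assumptions, not the study's actual procedure.

set.seed(1)
total_scores <- rnorm(200, mean = 60, sd = 8)  # placeholder data

B <- 2000
boot_stat <- replicate(B, {
  resample <- sample(total_scores, replace = TRUE)  # draw with replacement
  median(resample)                                  # recompute the statistic
})

quantile(boot_stat, c(0.025, 0.975))  # percentile 95% interval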

PredC Index

PredC is a metric introduced in this study to evaluate cut-off scores. It measures the proportion of individuals whose 95% Interval of Predicted Total Scores (IPTS) includes a specific cut-off value. A lower PredC indicates better discrimination, as it suggests the cut-off effectively separates individuals into distinct performance categories.

Example: If PredC for a cut-off score is 0.20, then only 20% of individuals’ IPTS contain that cut-off; the lower this proportion, the more cleanly the cut-off distinguishes high performers from low performers.
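Given precomputed intervals, PredC reduces to a one-line calculation. The helper below assumes a hypothetical matrix ipts with one row per examinee and columns lower and upper holding the 95% IPTS bounds.

    # Proportion of examinees whose 95% IPTS contains the cut-off.
    pred_c <- function(ipts, cutoff) {
      mean(ipts[, "lower"] <= cutoff & cutoff <= ipts[, "upper"])
    }

    # e.g., pred_c(ipts, 60) close to 0 means few intervals straddle 60,
    # so that cut-off separates examinees relatively cleanly.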

Part 2. The R code of the present study

Figure A: the full R code of the present study (presented as an image in the published article).

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article

Jiang, Z., Ouyang, J., Shi, D. et al. Forming intervals of predicted total scores for cut-off scores evaluation: a generalizability theory application with Bootstrapping. Curr Psychol (2024). https://doi.org/10.1007/s12144-024-06306-9


Accepted: 28 June 2024

Published: 14 August 2024

DOI: https://doi.org/10.1007/s12144-024-06306-9


Keywords

  • Performance assessment
  • Standard setting
  • Generalizability theory
  • Mixed-effect model

The Psychology of the Psychic


There is a hidden cause behind a fun little demonstration of an ostensibly paranormal experience that I often include in public talks on anomalistic psychology, especially when I have a reasonably large audience. I explain to my audience that an important part of proper skepticism is to always be open to the possibility that you may be wrong. In that spirit, I tell the audience that I would like to do a little experiment with them to see just how psychic they are. I tell them that I am going to try to telepathically send a simple message from my mind to theirs. “I’m thinking of a number between one and ten,” I say. “Not three, because that’s too obvious. I want you to make a mental note of the first number that comes into your mind now!”


I then explain that, with such a large audience, we would expect around 10 percent of them to guess the number correctly just by chance, so we should only get ecstatically excited if considerably more than 10 percent of the audience get it right. I then, apparently somewhat nervously, ask, “Did anybody think of the number seven? If you did, raise your hand.” With a large audience, I can, in fact, be very confident that around a third of them will put up their hand.

Feigning surprise, I will try another, slightly more complicated example. “This time I’m thinking of a two-digit number between one and fifty. They are both odd digits and they are not the same. So, it could be fifteen — one and five, both odd digits, not the same — but it could not be eleven — both odd digits but they are the same. What is the first number that fits that description that comes into your mind now?”

I then ask, as if expecting no one to have got it right this time, “Did anyone think of the number thirty-seven?” Once again, about a third of the audience will put up their hand. I will then add, “Did anyone think of thirty-five?” About a further quarter of the audience will raise their hand. “Sorry, that was my fault,” I explain. “I thought of thirty-five first and then I changed my mind.”

What is going on here? Any audience members who believe in ESP may well think that it has just been demonstrated. More skeptical members may be at a loss to explain what they have seen (and possibly directly experienced). Is it possible that I had simply conspired with all those members of the audience who got it right by telling them in advance to raise their hands in response to my questions? That would seem unlikely. Was it just a coincidence that so many more people guessed correctly than would be expected on the basis of chance alone? Again, possible but extremely unlikely.

The actual explanation is a phenomenon that psychologists refer to as population stereotypes. For most people faced with this task, when they are asked to make a mental note of the first number that comes into their head, they assume this is pretty much a random process. Therefore, they expect the frequencies of response to be more or less equal across the range of response options. In fact, this is not what happens. Responses tend to cluster in reliable and predictable ways, especially with large audiences.

In the first example, about a third of people will choose seven regardless of whatever number I may be thinking of (especially as I have ruled out three as a valid response, which otherwise would also be a popular choice). In the second example, about a third will pick 37 and about a further quarter will choose 35. Note that in neither example do the response rates approach 100 percent, but that is not a problem as people do not expect telepathy to be 100 percent reliable.

There are several other examples of population stereotypes that could be used to fool (at least some of) the unwary into believing that you possess amazing telepathic powers. Tell them your telepathic target is two simple geometric forms, one inside the other. Around 60 percent will choose a circle and a triangle. Tell them you are thinking of a simple line drawing. Around 10 to 12 percent will draw a little house. It makes for a fun demonstration of the fact that not everything that looks paranormal actually is. But would anyone ever seriously try to pass off such a demonstration as really involving telepathy?


The answer is yes. For example, in the mid-1990s Uri Geller took part in a TV program called Beyond Belief, presented by David Frost, in which, it was claimed, various paranormal phenomena would be demonstrated live for the millions of viewers at home. I was one of those viewers. Uri demonstrated his alleged telepathic powers by supposedly transmitting a message to viewers. Uri had chosen one of four symbols that were presented at the bottom of the screen in the following order: square, star, circle, cross. As the camera slowly zoomed in on his face, Uri said: “I’m visualizing the symbol in my mind . . . and you people at home, millions of you, I’m actually communicating now with millions of people, maybe eleven, twelve, thirteen million people. Here goes. I’m transmitting it to you. I’m visualizing it. Open up your minds. Try to see it. Try to feel it. I’m really strongly passing it to you! One more time . . . okay.” By this point, the upper half of Uri’s face pretty much filled the screen, with the four symbols still displayed across the bottom of the screen. Viewers were instructed to phone in using one of four different numbers to indicate their guess at Uri’s choice of symbol. Over 70,000 viewers did so.

My days of believing that Uri really did possess amazing psychic abilities were long gone by this stage, and I was therefore watching from the perspective of an informed skeptic. It was pretty easy to come up with nonparanormal explanations for all of the demonstrations featured in the program. With respect to this particular demonstration, I was rather pleased with myself for not only stating in advance what Uri’s telepathic target would be but for also correctly stating the order of popularity of the remaining symbols. It was very lucky for Uri that he chose to “telepathically” transmit the star symbol. It was by far the most popular choice of the viewers, with 47 percent of them indicating that this was the symbol that they had “received.” The second most popular was the circle, with 32 percent of the “votes,” followed by the cross (12 percent) and the square (10 percent). If the guesses were completely random, we would expect about 25 percent for each option, so 47 percent choosing the same symbol as Uri is a massively statistically significant outcome. The probability that almost half of the callers chose this symbol just by chance is astronomically low. So, was this really strong evidence of Uri’s psychic powers?
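For readers who want to check that claim, a one-line binomial test in R makes the point; the counts below are rounded from the figures reported above, so this is a rough illustration rather than an exact reanalysis.

    # Under pure guessing, each of the four symbols has probability 0.25.
    # Roughly 47 percent of about 70,000 callers chose the star.
    binom.test(x = round(0.47 * 70000), n = 70000, p = 0.25,
               alternative = "greater")

The resulting p-value is vanishingly small, which is exactly why chance alone is such an implausible explanation and why population stereotypes are needed to account for the result.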


Readers who are familiar with common techniques used to test for ESP will have recognized that the four symbols used in Uri’s demonstration are taken from the five symbols used on Zener cards (the missing symbol is three wavy lines). The cards are named after the person who designed them, perceptual psychologist Karl Zener. A full deck consists of twenty-five cards, five of each design. In a test of telepathy, a “sender” would take each card from a shuffled deck in turn and attempt to telepathically transmit the image on the card to a “receiver.” The receiver would record their guess of which card the sender was looking at. By chance alone, we would expect around five of the receiver’s guesses to be correct. If the receiver scores significantly more than five, this might be taken as evidence of ESP. However, it has been known for over eight decades that people are more likely to guess certain symbols compared to others. Back in 1939, Frederick Lund asked 596 people to each generate a random sequence of five symbols from the Zener set. By far the most popular symbol was — you’ve guessed it — the star, accounting for 32 percent of the responses compared to the 20 percent that would be expected by chance alone. So, as I said, it really was lucky for Uri that he chose the star as his telepathic target (assuming that it was just luck).

Chris French is Emeritus Professor and Head of the Anomalistic Psychology Research Unit in the Psychology Department at Goldsmiths, University of London. He is a Fellow of the British Psychological Society and of the Committee for Skeptical Inquiry and a Patron of Humanists UK. He is the coauthor of “Anomalistic Psychology: Exploring Paranormal Belief and Experience” and author of “The Science of Weird Shit,” from which this article is excerpted.


Yohannes Tsigab Publishes Research Paper in UCR’s Undergraduate Research Journal


Yohannes Tsigab

Yohannes Tsigab, a UDC psychology student and alum of the University of California, Riverside-UDC summer 2023 Pathways to Psychological Sciences program, recently published a paper in UCR’s Undergraduate Research Journal: “Antecedents of Procrastination: Examining the Role of Academic Identity and Self-Esteem.” Yohannes was mentored by Dr. Carolyn Murray, Professor of Psychology at UCR.



Revolutionizing the Study of Mental Disorders

March 27, 2024 • Feature Story • 75th Anniversary

At a Glance:

  • The Research Domain Criteria framework (RDoC) was created in 2010 by the National Institute of Mental Health.
  • The framework encourages researchers to examine functional processes that are implemented by the brain on a continuum from normal to abnormal.
  • This way of researching mental disorders can help overcome inherent limitations in using all-or-nothing diagnostic systems for research.
  • Researchers worldwide have taken up the principles of RDoC.
  • The framework continues to evolve and update as new information becomes available.

President George H. W. Bush proclaimed the 1990s “The Decade of the Brain,” urging the National Institutes of Health, the National Institute of Mental Health (NIMH), and others to raise awareness about the benefits of brain research.

“Over the years, our understanding of the brain—how it works, what goes wrong when it is injured or diseased—has increased dramatically. However, we still have much more to learn,” read the president’s proclamation. “The need for continued study of the brain is compelling: millions of Americans are affected each year by disorders of the brain…Today, these individuals and their families are justifiably hopeful, for a new era of discovery is dawning in brain research.”

An image showing an fMRI machine with computer screens showing brain images. Credit: iStock/patrickheagney.

Still, despite the explosion of new techniques and tools for studying the brain, such as functional magnetic resonance imaging (fMRI), many mental health researchers were growing frustrated that their field was not progressing as quickly as they had hoped.

For decades, researchers have studied mental disorders using diagnoses based on the Diagnostic and Statistical Manual of Mental Disorders (DSM)—a handbook that lists the symptoms of mental disorders and the criteria for diagnosing a person with a disorder. But, among many researchers, suspicion was growing that the system used to diagnose mental disorders may not be the best way to study them.

“There are many benefits to using the DSM in medical settings—it provides reliability and ease of diagnosis. It also provides a clear-cut diagnosis for patients, which can be necessary to request insurance-based coverage of healthcare or job- or school-based accommodations,” said Bruce Cuthbert, Ph.D., who headed the workgroup that developed NIMH’s Research Domain Criteria Initiative. “However, when used in research, this approach is not always ideal.”

Researchers would often test people with a specific diagnosed DSM disorder against those with a different disorder or with no disorder and see how the groups differed. However, different mental disorders can have similar symptoms, and people can be diagnosed with several different disorders simultaneously. In addition, a diagnosis using the DSM is all or none—patients either qualify for the disorder based on their number of symptoms, or they don’t. This black-and-white approach means there may be people who experience symptoms of a mental disorder but just miss the cutoff for diagnosis.

Dr. Cuthbert, who is now the senior member of the RDoC Unit, which orchestrates RDoC work, stated: “Diagnostic systems are based on clinical signs and symptoms, but signs and symptoms can’t really tell us much about what is going on in the brain or the underlying causes of a disorder. With modern neuroscience, we were seeing that information on genetic, pathophysiological, and psychological causes of mental disorders did not line up well with the current diagnostic disorder categories, suggesting that there were central processes that relate to mental disorders that were not being reflected in DSM-based research.”

Road to evolution

Concerned about the limits of using the DSM for research, Dr. Cuthbert, a professor of clinical psychology at the University of Minnesota at the time, approached Dr. Thomas Insel (then NIMH director) during a conference in the autumn of 2008. Dr. Cuthbert recalled saying, “I think it’s really important that we start looking at dimensions of functions related to mental disorders such as fear, working memory, and reward systems because we know that these dimensions cut across various disorders. I think NIMH really needs to think about mental disorders in this new way.”

Dr. Cuthbert didn’t know it then, but he was suggesting something similar to ideas that NIMH was considering. Just months earlier, Dr. Insel had spearheaded the inclusion of a goal in NIMH’s 2008 Strategic Plan for Research to “develop, for research purposes, new ways of classifying mental disorders based on dimensions of observable behavior and neurobiological measures.”

Unaware of the new strategic goal, Dr. Cuthbert was surprised when Dr. Insel's senior advisor, Marlene Guzman, called a few weeks later to ask if he’d be interested in taking a sabbatical to help lead this new effort. Dr. Cuthbert soon transitioned into a full-time NIMH employee, joining the Institute at an exciting time to lead the development of what became known as the Research Domain Criteria (RDoC) Framework. The effort began in 2009 with the creation of an internal working group of interdisciplinary NIMH staff who identified core functional areas that could be used as examples of what research using this new conceptual framework looked like.

The workgroup members conceived a bold change in how investigators studied mental disorders.

“We wanted researchers to transition from looking at mental disorders as all or none diagnoses based on groups of symptoms. Instead, we wanted to encourage researchers to understand how basic core functions of the brain—like fear processing and reward processing—work at a biological and behavioral level and how these core functions contribute to mental disorders,” said Dr. Cuthbert.

This approach would incorporate biological and behavioral measures of mental disorders and examine processes that cut across and apply to all mental disorders. From Dr. Cuthbert’s standpoint, this could help remedy some of the frustrations mental health researchers were experiencing.

Around the same time the workgroup was sharing its plans and organizing the first steps, Sarah Morris, Ph.D., was a researcher focusing on schizophrenia at the University of Maryland School of Medicine in Baltimore. When she first read these papers, she wondered what this new approach would mean for her research, her grants, and her lab.

She also remembered feeling that this new approach reflected what she was seeing in her data.

“When I grouped my participants by those with and without schizophrenia, there was a lot of overlap, and there was a lot of variability across the board, and so it felt like RDoC provided the pathway forward to dissect that and sort it out,” said Dr. Morris.

Later that year, Dr. Morris joined NIMH and the RDoC workgroup, saying, “I was bumping up against a wall every day in my own work and in the data in front of me. And the idea that someone would give the field permission to try something new—that was super exciting.”

The five original RDoC domains of functioning were introduced to the broader scientific community in a series of articles published in 2010.

To establish the new framework, the RDoC workgroup (including Drs. Cuthbert and Morris) began a series of workshops in 2011 to collect feedback from experts across the larger scientific community. Five workshops were held over the next two years, each devoted to a different broad domain of functioning grounded in prior basic behavioral neuroscience. The five domains were:

  • Negative valence (which included processes related to things like fear, threat, and loss)
  • Positive valence (which included processes related to working for rewards and appreciating rewards)
  • Cognitive processes
  • Social processes
  • Arousal and regulation processes (including arousal systems for the body and sleep).

At each workshop, experts defined several specific functions, termed constructs, that fell within the domain of interest. For instance, constructs in the cognitive processes domain included attention, memory, cognitive control, and others.

The result of these feedback sessions was a framework that described mental disorders as the interaction between different functional processes—processes that could occur on a continuum from normal to abnormal. Researchers could measure these functional processes in a variety of complementary ways—for example, by looking at genes associated with these processes, the brain circuits that implement these processes, tests or observations of behaviors that represent these functional processes, and what patients report about their concerns. Also included in the framework was an understanding that functional processes associated with mental disorders are impacted and altered by the environment and a person’s developmental stage.

Preserving momentum

An image depicting the RDoC Framework that includes four overlapping circles (titled: Lifespan, Domains, Units of Analysis, and Environment).

Over time, the Framework continued evolving and adapting to the changing science. In 2018, a sixth functional area called sensorimotor processes was added to the Framework, and in 2019, a workshop was held to better incorporate developmental and environmental processes into the Framework.

Since its creation, the use of RDoC principles in mental health research has spread across the U.S. and the rest of the world. For example, the Psychiatric Ratings using Intermediate Stratified Markers project (PRISM), which receives funding from the European Union’s Innovative Medicines Initiative, is seeking to link biological markers of social withdrawal with clinical diagnoses using RDoC-style principles. Similarly, the Roadmap for Mental Health Research in Europe (ROAMER) project by the European Commission sought to integrate mental health research across Europe using principles similar to those in the RDoC Framework.

Dr. Morris, who now heads the RDoC Unit, commented: “The fact that investigators and science funders outside the United States are also pursuing similar approaches gives me confidence that we’ve been on the right pathway. I just think that this has got to be how nature works and that we are in better alignment with the basic fundamental processes that are of interest to understanding mental disorders.”

The RDoC framework will continue to adapt and change with emerging science to remain relevant as a resource for researchers now and in the future. For instance, NIMH continues to work toward the development and optimization of tools to assess RDoC constructs and supports data-driven efforts to measure function within and across domains.

“For the millions of people impacted by mental disorders, research means hope. The RDoC framework helps us study mental disorders in a different way and has already driven considerable change in the field over the past decade,” said Joshua A. Gordon, M.D., Ph.D., director of NIMH. “We hope this and other innovative approaches will continue to accelerate research progress, paving the way for prevention, recovery, and cure.”

Publications

Cuthbert, B. N., & Insel, T. R. (2013). Toward the future of psychiatric diagnosis: The seven pillars of RDoC. BMC Medicine, 11, 126. https://doi.org/10.1186/1741-7015-11-126

Cuthbert, B. N. (2014). Translating intermediate phenotypes to psychopathology: The NIMH Research Domain Criteria. Psychophysiology, 51(12), 1205–1206. https://doi.org/10.1111/psyp.12342

Cuthbert, B., & Insel, T. (2010). The data of diagnosis: New approaches to psychiatric classification. Psychiatry, 73(4), 311–314. https://doi.org/10.1521/psyc.2010.73.4.311

Cuthbert, B. N., & Kozak, M. J. (2013). Constructing constructs for psychopathology: The NIMH research domain criteria. Journal of Abnormal Psychology, 122(3), 928–937. https://doi.org/10.1037/a0034028

Garvey, M. A., & Cuthbert, B. N. (2017). Developing a motor systems domain for the NIMH RDoC program. Schizophrenia Bulletin, 43(5), 935–936. https://doi.org/10.1093/schbul/sbx095

Kozak, M. J., & Cuthbert, B. N. (2016). The NIMH Research Domain Criteria initiative: Background, issues, and pragmatics. Psychophysiology, 53(3), 286–297. https://doi.org/10.1111/psyp.12518

Morris, S. E., & Cuthbert, B. N. (2012). Research Domain Criteria: Cognitive systems, neural circuits, and dimensions of behavior. Dialogues in Clinical Neuroscience, 14(1), 29–37. https://doi.org/10.31887/DCNS.2012.14.1/smorris

Sanislow, C. A., Pine, D. S., Quinn, K. J., Kozak, M. J., Garvey, M. A., Heinssen, R. K., Wang, P. S., & Cuthbert, B. N. (2010). Developing constructs for psychopathology research: Research domain criteria. Journal of Abnormal Psychology, 119(4), 631–639. https://doi.org/10.1037/a0020909

  • Presidential Proclamation 6158 (The Decade of the Brain) 
  • Research Domain Criteria Initiative website
  • Psychiatric Ratings using Intermediate Stratified Markers (PRISM)  

American Psychological Association

Title Page Setup

A title page is required for all APA Style papers. There are both student and professional versions of the title page. Students should use the student version of the title page unless their instructor or institution has requested they use the professional version. APA provides a student title page guide (PDF, 199KB) to assist students in creating their title pages.

Student title page

The student title page includes the paper title, author names (the byline), author affiliation, course number and name for which the paper is being submitted, instructor name, assignment due date, and page number, as shown in this example.

diagram of a student page

Title page setup is covered in the seventh edition APA Style manuals: Publication Manual Section 2.3 and Concise Guide Section 1.6.


Related handouts

  • Student Title Page Guide (PDF, 263KB)
  • Student Paper Setup Guide (PDF, 3MB)

Student papers do not include a running head unless requested by the instructor or institution.

Follow the guidelines described next to format each element of the student title page.

Paper title

Place the title three to four lines down from the top of the title page. Center it and type it in bold font. Capitalize major words of the title. Place the main title and any subtitle on separate double-spaced lines if desired. There is no maximum length for titles; however, keep titles focused and include key terms.

Author names

Place one double-spaced blank line between the paper title and the author names. Center author names on their own line. If there are two authors, use the word “and” between authors; if there are three or more authors, place a comma between author names and use the word “and” before the final author name.

Cecily J. Sinclair and Adam Gonzaga

Author affiliation

For a student paper, the affiliation is the institution where the student attends school. Include both the name of any department and the name of the college, university, or other institution, separated by a comma. Center the affiliation on the next double-spaced line after the author name(s).

Department of Psychology, University of Georgia

Course number and name

Provide the course number as shown on instructional materials, followed by a colon and the course name. Center the course number and name on the next double-spaced line after the author affiliation.

PSY 201: Introduction to Psychology

Instructor name

Provide the name of the instructor for the course using the format shown on instructional materials. Center the instructor name on the next double-spaced line after the course number and name.

Dr. Rowan J. Estes

Assignment due date

Provide the due date for the assignment. Center the due date on the next double-spaced line after the instructor name. Use the date format commonly used in your country.

October 18, 2020
18 October 2020

Page number

Use the page number 1 on the title page. Use the automatic page-numbering function of your word processing program to insert page numbers in the top right corner of the page header.

1
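Assembled, the student elements above produce a title page like the sketch below. The paper title here is invented for illustration; in an actual paper every element is centered and double-spaced, the title additionally appears in bold, and the page number sits in the top right corner of the header.

    1

    A Hypothetical Study of Sleep and Memory

    Cecily J. Sinclair and Adam Gonzaga
    Department of Psychology, University of Georgia
    PSY 201: Introduction to Psychology
    Dr. Rowan J. Estes
    October 18, 2020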

Professional title page

The professional title page includes the paper title, author names (the byline), author affiliation(s), author note, running head, and page number, as shown in the following example.

diagram of a professional title page

Follow the guidelines described next to format each element of the professional title page.

Paper title

Place the title three to four lines down from the top of the title page. Center it and type it in bold font. Capitalize major words of the title. Place the main title and any subtitle on separate double-spaced lines if desired. There is no maximum length for titles; however, keep titles focused and include key terms.

Author names

 

Place one double-spaced blank line between the paper title and the author names. Center author names on their own line. If there are two authors, use the word “and” between authors; if there are three or more authors, place a comma between author names and use the word “and” before the final author name.

Francesca Humboldt

When different authors have different affiliations, use superscript numerals after author names to connect the names to the appropriate affiliation(s). If all authors have the same affiliation, superscript numerals are not used (see Section 2.3 of the Publication Manual for more on how to set up bylines and affiliations).

Tracy Reuter¹, Arielle Borovsky², and Casey Lew-Williams¹

Author affiliation

 

For a professional paper, the affiliation is the institution at which the research was conducted. Include both the name of any department and the name of the college, university, or other institution, separated by a comma. Center the affiliation on the next double-spaced line after the author names; when there are multiple affiliations, center each affiliation on its own line.

 

Department of Nursing, Morrigan University

When different authors have different affiliations, use superscript numerals before affiliations to connect the affiliations to the appropriate author(s). Do not use superscript numerals if all authors share the same affiliations (see Section 2.3 of the Publication Manual for more).

¹Department of Psychology, Princeton University
²Department of Speech, Language, and Hearing Sciences, Purdue University

Author note

Place the author note in the bottom half of the title page. Center and bold the label “Author Note.” Align the paragraphs of the author note to the left. For further information on the contents of the author note, see Section 2.7 of the Publication Manual.


Running head

The running head appears in all-capital letters in the page header of all pages, including the title page. Align the running head to the left margin. Do not use the label “Running head:” before the running head.

PREDICTION ERRORS SUPPORT CHILDREN’S WORD LEARNING

Page number

Use the page number 1 on the title page. Use the automatic page-numbering function of your word processing program to insert page numbers in the top right corner of the page header.

1
