How to Write a Methodology Section for Research Papers


The methodology section explains how you conducted your research with enough detail for others to replicate your study. It includes: research design, participants/sample, materials/instruments, procedures, and data analysis methods. Write in past tense, be specific about numbers and procedures, justify your choices, and address ethical approvals. Follow APA 7th edition: “Method” heading (centered, bold) with bold, flush-left subheadings for Participants, Materials, Procedure, and Data Analysis.


What Is the Methodology Section? Purpose and Importance

The methodology section (often called “Methods” in APA format) is one of the most critical parts of a research paper. Its primary purpose is to provide a clear, detailed account of how you conducted your study so that:

  1. Other researchers can replicate your work exactly
  2. Readers can evaluate the validity and reliability of your findings
  3. Reviewers can assess whether your methods are appropriate for your research questions
  4. You demonstrate rigorous, ethical research practices

According to the American Psychological Association (APA) 7th edition, the Method section should contain “information that allows the reader to understand the procedures used to conduct the study and to evaluate the appropriateness of those procedures” (APA, 2020).

What belongs in methodology (and what doesn’t):

  • Included: Research design, sampling strategy, participant characteristics, materials/instruments, step-by-step procedures, data analysis techniques
  • Not included: Results, interpretations, background literature, theoretical frameworks (those belong in Introduction or Discussion)

Methodology vs. Methods: Understanding the Distinction

Many students confuse these terms, but they represent different levels of abstraction:

Methodology:

  • The overall strategy and theoretical framework guiding your research
  • Answers: Why did you choose this approach?
  • Includes: research paradigm, design rationale, epistemological assumptions
  • Example: “A mixed-methods approach was selected to provide both breadth and depth…”

Methods:

  • The specific tools and procedures used to collect and analyze data
  • Answers: How did you collect and analyze data?
  • Includes: surveys, interviews, statistical tests, software tools
  • Example: “Participants completed a 20-item Likert-scale survey (α = .87)…”

Your methodology section should briefly mention the overarching methodology (e.g., “quantitative experimental design”) but focus primarily on describing the concrete methods.


The Three Main Research Approaches

Your methodology section will differ substantially depending on your research paradigm.

1. Quantitative Research

Quantitative studies collect numerical data to test hypotheses, identify patterns, or generalize findings to larger populations.

Typical components:

  • Research design (experimental, quasi-experimental, correlational, descriptive)
  • Sample size and power analysis (if applicable)
  • Sampling method (random, stratified, convenience)
  • Instruments (surveys, tests, sensors) with reliability/validity evidence
  • Procedure (group assignment, administration, control conditions)
  • Data analysis (statistical tests: t-tests, ANOVA, regression, etc.)
  • Software used (SPSS, R, Stata)

Example phrasing: “A randomized controlled trial (RCT) design was employed with 80 participants randomly assigned to treatment (n = 40) and control (n = 40) groups using a computer-generated randomization sequence.”
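As a concrete illustration, the “computer-generated randomization sequence” in the phrasing above can be produced in a few lines of Python. This is a sketch, not the procedure any particular study used; the fixed seed is there only so the assignment can be reproduced and audited:

```python
import random

def assign_groups(participant_ids, seed=20230101):
    """Randomly split participants into equal-sized treatment and control groups."""
    rng = random.Random(seed)          # fixed seed -> reproducible, auditable sequence
    ids = list(participant_ids)
    rng.shuffle(ids)                   # random permutation of participant IDs
    half = len(ids) // 2
    return {"treatment": sorted(ids[:half]), "control": sorted(ids[half:])}

groups = assign_groups(range(1, 81))   # 80 participants -> n = 40 per group
print(len(groups["treatment"]), len(groups["control"]))  # 40 40
```

Recording the seed (or archiving the generated sequence) is what makes the allocation verifiable after the fact.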

2. Qualitative Research

Qualitative studies collect non-numerical data (words, images, observations) to understand meanings, experiences, and social processes.

Typical components:

  • Research approach (ethnography, case study, phenomenology, grounded theory, narrative inquiry)
  • Site/context description
  • Participant selection (purposive sampling, snowball sampling)
  • Data collection methods (interviews, focus groups, observations, document analysis)
  • Interview protocol or observation guide (often in appendix)
  • Data analysis approach (thematic analysis, content analysis, constant comparative method)
  • Software used (NVivo, ATLAS.ti, Dedoose)
  • Reflexivity (researcher positionality, potential biases)

Example phrasing: “Semi-structured interviews lasting 45–60 minutes were conducted with 25 participants. Interviews were audio-recorded, transcribed verbatim, and analyzed using Braun & Clarke’s (2006) six-step thematic analysis framework in NVivo 14.”

3. Mixed Methods

Mixed methods research combines quantitative and qualitative approaches to leverage the strengths of both.

Typical components:

  • Mixed methods design (convergent parallel, explanatory sequential, exploratory sequential)
  • Rationale for mixing methods (complementarity, development, expansion)
  • Timing and priority of quantitative vs qualitative strands
  • Integration point (where and how data were combined)
  • Additional considerations (e.g., the theoretical framework guiding the design)

Example phrasing: “An explanatory sequential mixed methods design was used. Quantitative data were collected first (n = 150 survey respondents), followed by qualitative interviews (n = 15 purposively selected from survey sample) to elaborate on unexpected quantitative findings. Integration occurred during interpretation.”


APA 7th Edition Structure for the Method Section

APA format prescribes a specific structure with clear headings. The main heading is “Method” (centered, bold). Subheadings are bold and flush-left.

Standard APA Method Section Organization

Method (centered, bold)

Participants (bold, flush-left)
[Description of sample, power analysis if applicable]

Materials (bold, flush-left)
[Instruments, stimuli, equipment, materials with citations]

Procedure (bold, flush-left)
[Step-by-step chronological description, including ethical approvals]

Data Analysis (bold, flush-left)
[Statistical or qualitative analysis methods, software]

Word count: Typically 300–800 words depending on study complexity.


Writing Each Subsection: Detailed Guidance

Participants (or Sample/Subjects)

This subsection describes who participated in your study and how they were selected.

Essential elements to include:

  1. Sample size: Total number of participants; separate groups if applicable
    • Example: “A total of 187 undergraduate students participated.”
    • Example for groups: “The experimental group (n = 42) and control group (n = 38) were compared.”
  2. Demographics: Age, gender, ethnicity, education level, relevant characteristics
    • Example: “Participants ranged in age from 18 to 35 years (M = 21.4, SD = 3.2). The sample was 62% female, 35% male, and 3% non-binary. All were native English speakers with no reported hearing impairments.”
  3. Sampling method: How participants were recruited and selected
    • Probability sampling: random, stratified, cluster
    • Non-probability: convenience, purposive, snowball
    • Example: “Participants were recruited via campus flyers and university research pool announcements. A convenience sampling approach was used.”
  4. Inclusion/exclusion criteria: What made someone eligible or ineligible
    • Example: “Inclusion criteria were: (a) age 18–65, (b) diagnosis of type 2 diabetes for at least 1 year, and (c) ability to read English at 8th-grade level. Excluded were individuals with severe complications requiring hospitalization.”
  5. Power analysis (for quantitative studies): How you determined sample size
    • Example: “A priori power analysis using G*Power 3.1 (Faul et al., 2009) indicated that 128 participants would provide 80% power to detect a medium effect size (f = 0.25, α = .05).”
  6. Compensation: Any incentives or payments
    • Example: “Participants received a $20 Amazon gift card or course credit for their participation.”

APA 7th tip: Use “participants” for human subjects; “subjects” for animal studies (APA, 2020).
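G*Power performs these power calculations exactly. For intuition, the textbook normal-approximation formula for a two-tailed independent-samples t-test can be coded directly. Note that this sketch uses Cohen's d (the t-test effect size) rather than the f from the ANOVA example above, and its result will differ slightly from G*Power's exact computation:

```python
from math import ceil
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    """Approximate per-group n for a two-tailed independent-samples t-test,
    using the standard normal approximation n = 2 * ((z_a + z_b) / d)^2."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # critical value for two-tailed alpha
    z_beta = NormalDist().inv_cdf(power)           # quantile for desired power
    return ceil(2 * ((z_alpha + z_beta) / d) ** 2)

print(n_per_group(0.5))  # medium effect (Cohen's d = 0.5) -> 63 per group
print(n_per_group(0.8))  # large effect (d = 0.8) -> 25 per group
```

Exact t-based calculations (what G*Power reports) give marginally larger numbers; for a paper, cite the tool and settings you actually used.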


Materials (or Instruments/Measures)

This subsection describes what tools or equipment you used to collect data.

Essential elements:

  1. Name and describe each instrument: Surveys, questionnaires, tests, equipment, stimuli
    • Example: “The Beck Depression Inventory-II (BDI-II; Beck et al., 1996) is a 21-item self-report measure of depressive symptoms…”
    • Example: “A calibrated sphygmomanometer (Omron Model HEM-907XL) was used to measure systolic and diastolic blood pressure…”
  2. Cite the source: Provide publication details and, if adapted, describe modifications
    • Example: “The Perceived Stress Scale (PSS-10; Cohen et al., 1983) was used with permission. Items were rated on a 5-point scale from 0 (never) to 4 (very often).”
  3. Report reliability and validity (if available for your instrument):
    • Example: “The scale demonstrated good internal consistency in our sample (α = .89) and has shown strong construct validity in previous studies (Smith & Jones, 2018).”
  4. Include specific details: Number of items, response scales, subscales
    • Example: “The 44-item Big Five Inventory (BFI; John et al., 1991) measures five personality dimensions using 5-point Likert scales from 1 (disagree strongly) to 5 (agree strongly).”
  5. For custom instruments: Provide enough information for readers to understand or replicate
    • Example: “We developed a 12-item checklist assessing digital literacy skills. Items were created based on the European Digital Competence Framework (DigComp 2.2) and reviewed by three subject matter experts.”

When to use “Materials” vs “Instruments”:

  • “Materials” is broader and includes physical items, stimuli, software
  • “Instruments” typically refers to measurement tools (surveys, tests)
  • Many researchers use them interchangeably; choose one and be consistent

Procedure

This subsection explains step-by-step what you did, from participant recruitment to data collection. It should be detailed enough for someone to exactly replicate your study.

Essential elements:

  1. Study design overview: Briefly restate the design (between-subjects, within-subjects, longitudinal, etc.)
    • Example: “A between-subjects experimental design with two conditions (stress vs. control) was used.”
  2. Recruitment and consent: How participants were contacted and gave consent
    • Example: “Potential participants were emailed a recruitment flyer. Interested individuals attended a 30-minute orientation session where the study was explained, questions were answered, and written informed consent was obtained in accordance with the university’s IRB protocol (#2023-045).”
  3. Setting: Where the study took place (lab, online, field)
    • Example: “Testing occurred in a sound-attenuated laboratory room with controlled lighting. All procedures were conducted individually.”
    • Example: “The survey was administered online via Qualtrics. Participants received a unique link via email.”
  4. Chronological sequence: Describe each step in the order participants experienced them
    • Example: “Upon arrival, participants completed a demographic questionnaire (5 minutes). They then received instructions for the memory task and completed three practice trials. The experimental phase consisted of 48 trials presented in randomized order…”
  5. Randomization and counterbalancing: How conditions were assigned and order controlled
    • Example: “Participants were randomly assigned to condition using a random number generator. Task order was counterbalanced across participants using a Latin square design.”
  6. Manipulations: If you manipulated an independent variable, describe exactly how
    • Example: “Stress condition participants received the following instruction: ‘You will have 5 minutes to solve these puzzles. Performance will be evaluated and compared to other participants.’ Control participants received neutral instructions.”
  7. Debriefing: What participants were told after participation (especially for deceptive studies)
    • Example: “Following completion, all participants were fully debriefed about the study’s purpose, including the deception used, and given the researcher’s contact information for follow-up questions.”
  8. Ethical compliance: Mention IRB approval and ethical procedures
    • Example: “All procedures were approved by the Institutional Review Board (IRB Protocol #2023-078) and conducted in accordance with the Declaration of Helsinki.”

Write in past tense and active voice when possible:

  • ✅ “Participants completed the survey on computers in a private booth.”
  • ❌ “The survey was completed by participants on computers…” (passive)
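The Latin square mentioned in step 5 can be generated programmatically. Below is a sketch of a cyclic Latin square, in which every task occupies each serial position exactly once across the k orders (a Williams design would additionally balance carry-over effects; this simpler version does not):

```python
def latin_square_orders(tasks):
    """Cyclic Latin square: row i is the task list rotated left by i positions."""
    k = len(tasks)
    return [tasks[i:] + tasks[:i] for i in range(k)]

orders = latin_square_orders(["A", "B", "C", "D"])
for i, order in enumerate(orders):
    print(f"Order {i + 1}: {order}")
# Participant n would then be assigned orders[n % len(orders)].
```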

Data Analysis

This subsection describes how you processed and analyzed the collected data.

Essential elements:

  1. Data preparation: Cleaning, screening, handling missing data
    • Example: “Data were screened for outliers using boxplots and Mahalanobis distance. Six participants with more than 10% missing data were excluded. Remaining missing values were imputed using the expectation-maximization (EM) algorithm.”
  2. Statistical tests (quantitative):
    • Tests used: Name each test (t-test, ANOVA, regression, chi-square, MANOVA, etc.)
    • Assumptions checked: Normality, homogeneity of variance, independence
    • Software and version: SPSS 28.0, R 4.2.2, Stata 17
    • Alpha level: Typically α = .05
    • Effect sizes: Cohen’s d, η², odds ratios (include these if reported)

    Example: “All analyses were conducted in R version 4.2.2 (R Core Team, 2022). Normality was assessed using Shapiro-Wilk tests and Q-Q plots. Homogeneity of variance was confirmed with Levene’s test. A 2 × 3 mixed ANOVA was performed with group (experimental, control) as the between-subjects factor and time (pre, post, follow-up) as the within-subjects factor. Partial eta squared (ηp²) is reported as the effect size measure. All tests were two-tailed with α = .05.”

  3. Qualitative analysis (qualitative):
    • Analytical approach: Thematic analysis, grounded theory, content analysis, phenomenology
    • Coding process: Inductive vs deductive, how codes were developed
    • Software: NVivo, ATLAS.ti, Dedoose, or manual
    • Reliability/validity measures: Inter-rater reliability, triangulation, member checking, audit trail

    Example: “Interviews were transcribed verbatim and analyzed using Braun & Clarke’s (2006) reflexive thematic analysis. Initial coding was performed by the first author using NVivo 14. Codes were discussed with the second author to establish an initial codebook. Themes were developed through iterative discussion, with disagreements resolved by consensus. Inter-rater reliability was calculated on 20% of transcripts (Cohen’s κ = .82).”

  4. Mixed methods integration: How quantitative and qualitative strands were combined
    • Example: “Quantitative and qualitative data were integrated during interpretation using a joint displays matrix to identify convergent and divergent findings.”
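Cohen's κ, reported in the qualitative example above, is simply observed agreement corrected for chance agreement: κ = (p_o − p_e) / (1 − p_e). A minimal pure-Python sketch for two coders (the theme labels below are invented for illustration):

```python
def cohens_kappa(coder1, coder2):
    """Cohen's kappa for two coders labeling the same n segments."""
    n = len(coder1)
    categories = set(coder1) | set(coder2)
    p_observed = sum(a == b for a, b in zip(coder1, coder2)) / n   # raw agreement
    p_expected = sum(                                              # chance agreement
        (coder1.count(c) / n) * (coder2.count(c) / n) for c in categories
    )
    return (p_observed - p_expected) / (1 - p_expected)

c1 = ["theme1", "theme1", "theme2", "theme2", "theme1", "theme2"]
c2 = ["theme1", "theme1", "theme2", "theme1", "theme1", "theme2"]
print(round(cohens_kappa(c1, c2), 2))  # 0.67
```

Values of κ ≥ .80, as in the example, are conventionally read as strong agreement.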

Common Mistakes to Avoid

Based on thousands of peer reviews and editorial comments, here are the most frequent methodology errors:

1. Vagueness and Lack of Specificity

Mistake: “Participants were recruited from the university” or “A survey was administered”

Why it’s wrong: Replication is impossible. How many participants? Which university? What kind of survey? How was it distributed?

Fix: “Eighty undergraduate students (60 female, 20 male; M age = 19.8 years) were recruited from the participant pool at Metropolitan University. Participants completed the 25-item Academic Motivation Scale via an online Qualtrics survey link.”

2. Insufficient Detail for Replication

Mistake: “Participants completed cognitive tasks” without explaining what tasks, how they were presented, duration, scoring.

Fix: “Participants completed the Stroop Color-Word Task (Golden & Freshwater, 2002) using E-Prime 3.0 software on a 24-inch monitor. The task consisted of 120 trials (40 congruent, 40 incongruent, 40 neutral). Each trial began with a 500 ms fixation cross, followed by the stimulus (color word printed in colored ink) for 2000 ms or until response. Reaction time and accuracy were recorded. The dependent variable was mean Stroop interference score (incongruent RT – congruent RT).”
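The dependent variable in that fix — mean Stroop interference, incongruent RT minus congruent RT over correct trials — might be computed like this (a sketch; the trial-record field names are hypothetical):

```python
from statistics import mean

def stroop_interference(trials):
    """Mean incongruent RT minus mean congruent RT, correct trials only.
    Each trial is a dict with 'condition', 'rt_ms', and 'correct' keys (hypothetical)."""
    def mean_rt(condition):
        return mean(t["rt_ms"] for t in trials
                    if t["condition"] == condition and t["correct"])
    return mean_rt("incongruent") - mean_rt("congruent")

trials = [
    {"condition": "congruent", "rt_ms": 520, "correct": True},
    {"condition": "congruent", "rt_ms": 480, "correct": True},
    {"condition": "incongruent", "rt_ms": 640, "correct": True},
    {"condition": "incongruent", "rt_ms": 700, "correct": False},  # error trial, excluded
    {"condition": "incongruent", "rt_ms": 610, "correct": True},
]
print(stroop_interference(trials))  # 625.0 - 500.0 = 125.0
```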

3. Mixing Methods with Results

Mistake: “We found that the treatment group scored higher… (p < .01)” (this is a result, not a method)

Fix: Save all findings for the Results section. Method section only describes what you did, not what you found.

4. Ignoring Limitations of Your Methods

Mistake: Not acknowledging potential biases or weaknesses in your approach.

Fix: Briefly note limitations in the Method section (or discuss in Discussion). Example: “A convenience sampling strategy limits generalizability beyond the university setting.” Or “Self-report measures may be subject to social desirability bias.”

5. Misalignment with Research Questions

Mistake: Using a method that doesn’t actually answer your research questions.

Fix: Ensure your design, measures, and analysis directly address each research question/hypothesis. If you have three RQs, your methods must be capable of answering all three.

6. Omitting Ethical Considerations

Mistake: Not mentioning IRB approval, informed consent, confidentiality, or data storage.

Fix: Include a statement: “All procedures were approved by the Institutional Review Board (Protocol #XXXX). Informed consent was obtained from all participants. Data were de-identified and stored on a password-protected server for 5 years.”

7. Poor Organization and Flow

Mistake: Jumping between topics; mixing participant descriptions with procedures.

Fix: Follow APA structure strictly. Keep subsections focused: Participants first, then Materials, then Procedure, then Data Analysis.

8. Using Present Tense for Completed Work

Mistake: “We conduct the experiment with 30 participants” (present tense implies current action).

Fix: Use past tense for everything that already happened: “We conducted the experiment with 30 participants.”

9. Excessive Jargon Without Explanation

Mistake: “We employed a Bayesian hierarchical multilevel model with weakly informative priors” without explaining what this is or why you chose it.

Fix: Either explain in lay terms or cite a source that explains the method. Example: “We used a multilevel model (random intercepts for participants) to account for within-subject dependence, implemented in R using the lme4 package (Bates et al., 2015).”


Ethical Considerations: IRB and Beyond

Ethical research practices are non-negotiable. Your methodology section must demonstrate compliance with ethical standards.

Required Ethical Statements

  1. IRB/ethics approval: “All procedures were approved by the Institutional Review Board (IRB Protocol #XXXX) / Ethics Committee (Reference #YYYY).”
  2. Informed consent: “Written informed consent was obtained from all participants prior to participation. Participants were informed of the study’s purpose (with deception if applicable, then debriefed), risks and benefits, right to withdraw at any time without penalty, and contact information for the researcher and IRB.”
  3. Confidentiality: “All data were anonymized/coded. Identifying information was stored separately from responses on an encrypted drive.”
  4. Data storage and retention: “Data will be retained for 5 years in a secure, password-protected server and then destroyed.”
  5. Special populations: Additional protections for minors, prisoners, cognitively impaired individuals.

Deception and Debriefing

If your study uses deception:

  • State: “The study employed deception regarding [what was concealed].”
  • Describe: “Participants were fully debriefed immediately following participation, explaining the true purpose and rationale for deception.”
  • Justify: “Deception was necessary because knowledge of the true hypothesis would have altered participants’ responses.”

Sample Methodology Section: Complete Example

Below is a well-structured example for a quantitative psychology study.


Method

Participants

A total of 124 undergraduate students (94 female, 29 male, 1 non-binary; M age = 20.3 years, SD = 1.8) were recruited from a large public university’s psychology participant pool. Participants received course credit for their involvement. Inclusion criteria were: (a) age 18–25, (b) enrollment as a full-time undergraduate, and (c) native or fluent English speaker. A priori power analysis using G*Power 3.1 (Faul et al., 2009) with α = .05, power = .80, and medium effect size (f = 0.25) indicated a minimum sample of 128. Our final sample of 124 provided adequate power (actual power = .78) to detect the expected effects.

Materials

Perceived Stress Scale (PSS-10). The 10-item Perceived Stress Scale (Cohen et al., 1983) measures the degree to which situations in one’s life are appraised as stressful. Items (e.g., “In the last month, how often have you felt nervous and stressed?”) are rated on a 5-point Likert scale from 0 (never) to 4 (very often). Total scores range from 0 to 40, with higher scores indicating greater perceived stress. The PSS-10 has demonstrated strong reliability (α = .85–.90) and validity across diverse populations (Lee, 2012). In our pilot sample (n = 20), Cronbach’s alpha was .88.

Mindful Attention Awareness Scale (MAAS). The 15-item MAAS (Brown & Ryan, 2003) assesses dispositional mindfulness, defined as “the receptive attention to and awareness of present-moment experiences” (p. 824). Items (e.g., “I find myself doing things without paying attention”) are rated from 1 (almost always) to 6 (almost never). Scores are averaged, with higher values indicating greater mindfulness. The scale has excellent internal consistency (α = .90–.94) and test-retest reliability (Brown & Ryan, 2003).

Procedure

After providing informed consent, participants completed the demographic questionnaire followed by the PSS-10 and MAAS in counterbalanced order to control for order effects. All materials were administered online via Qualtrics. Participants were informed that the study examined “daily experiences and well-being” to minimize demand characteristics. The entire session took approximately 15 minutes. Upon completion, participants were debriefed and received course credit. All procedures were approved by the University Institutional Review Board (Protocol #2023-156) and conducted in accordance with APA Ethical Principles.

Data Analysis

Data were screened for missing values, outliers (using boxplots and ±3 SD criteria), and assumption violations (normality, homoscedasticity). Less than 2% of data were missing, and listwise deletion was used (final N = 124). No univariate outliers were identified. Descriptive statistics and Pearson correlations were computed using SPSS 28.0. To test the primary hypothesis that mindfulness would negatively correlate with perceived stress, a one-tailed Pearson correlation analysis was performed. Effect sizes are reported as r, with 95% confidence intervals. The alpha level was set at .05 (two-tailed for descriptives, one-tailed for directional hypothesis).


How to Handle Special Research Designs

Longitudinal Studies

Add a subsection or paragraph describing:

  • Number of time points (e.g., “Data were collected at three waves: baseline (T1), 6-month follow-up (T2), and 12-month follow-up (T3)”)
  • Retention strategies: “Participants were contacted via email and phone reminders. Retention rates were 92% at T2 and 87% at T3.”
  • Attrition analysis: “We compared completers vs. dropouts on baseline demographics using independent samples t-tests; no significant differences emerged (p > .05).”

Qualitative Interviews/Focus Groups

Include:

  • Interview guide development (how questions were developed, whether pilot tested)
  • Interview format (semi-structured, unstructured; in-person, phone, video)
  • Interviewer training (if applicable): “All interviewers completed a 4-hour training on neutral probing and active listening.”
  • Duration: average interview length (e.g., “Interviews lasted 45–70 minutes (M = 58 min)”)
  • Transcription method: “Audio recordings were transcribed verbatim by a professional transcription service and checked for accuracy by the researchers.”

Experimental Manipulations

Describe the manipulation check:

  • “Manipulation checks confirmed that participants in the stress condition reported significantly higher state anxiety (M = 4.2, SD = 0.8) than those in the control condition (M = 2.1, SD = 0.5), t(122) = 5.67, p < .001.”

Online/Remote Research

Include:

  • Platform used: “Data were collected via Prolific Academic. Participants were required to have an approval rating above 95% and to be located in the United States.”
  • Attention checks: “Three attention-check items (e.g., ‘Select ‘strongly disagree’ for this item’) were embedded; participants failing any were excluded.”
  • Data quality measures: “Response time was monitored; participants completing in <5 minutes were excluded.”
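Exclusion rules like these are straightforward to apply programmatically once responses are exported. A minimal sketch (the field names and thresholds are hypothetical, mirroring the attention-check and under-5-minute criteria above):

```python
def apply_quality_filters(responses, min_seconds=300, max_failed_checks=0):
    """Drop respondents who failed any attention check or finished too fast.
    Each response is a dict with 'id', 'duration_s', 'failed_checks' (hypothetical)."""
    kept, excluded = [], []
    for r in responses:
        if r["failed_checks"] > max_failed_checks or r["duration_s"] < min_seconds:
            excluded.append(r["id"])
        else:
            kept.append(r)
    return kept, excluded

responses = [
    {"id": "P01", "duration_s": 612, "failed_checks": 0},
    {"id": "P02", "duration_s": 240, "failed_checks": 0},  # under 5 minutes
    {"id": "P03", "duration_s": 540, "failed_checks": 1},  # failed a check
]
kept, excluded = apply_quality_filters(responses)
print([r["id"] for r in kept], excluded)  # ['P01'] ['P02', 'P03']
```

Reporting the exact thresholds and the number excluded by each rule is what makes the procedure replicable.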

Frequently Asked Questions About Methodology Sections

Q: How much detail is enough?
A: Enough that a competent researcher could exactly replicate your study. If you’re unsure, ask: “Could someone at another university do exactly what I did using only my description?” Include exact numbers, brand names, software versions, timing, and specific procedures.

Q: Should I include the full survey/questions?
A: Usually not in the main text. Place full instruments in an appendix and reference: “The complete interview protocol is available in Appendix A.” Or provide a link to an online supplement if your journal allows.

Q: Can I combine Participants and Materials into one section?
A: APA 7th recommends separate subsections, but some journals allow combined “Participants and Materials” if it improves flow. Check your target journal’s author guidelines.

Q: How many participants are enough?
A: This depends on your design and analysis. Always justify your sample size with a power analysis for quantitative studies. For qualitative studies, justify based on saturation (e.g., “Data collection continued until thematic saturation was reached at n = 25”).

Q: Should I report statistical power or effect sizes in the Method?
A: Power analysis belongs in the Method (usually in Participants subsection). Observed effect sizes belong in the Results section.

Q: Do I need to describe my analysis software version?
A: Yes. For quantitative studies: “Analyses were conducted using SPSS 28.0 for Windows.” For qualitative: “Thematic analysis was performed in NVivo 14.”

Q: What if my methodology is unusual or novel?
A: Spend extra time explaining it. The more novel your approach, the more detail readers need. Consider including a diagram or flowchart if the journal permits.



Need Help with Your Research Paper?

Struggling to describe your methodology with the precision and detail required for academic publication? Writing a robust methods section requires deep understanding of research design, statistical analysis, and APA formatting—all while maintaining clarity and avoiding common pitfalls.

Our team of PhD-level research specialists can help you:

  • ✓ Design rigorous methodology appropriate for your research questions
  • ✓ Write a clear, detailed, and APA-compliant Methods section
  • ✓ Perform statistical analysis and interpret results correctly
  • ✓ Ensure ethical compliance and IRB documentation
  • ✓ Edit and polish your entire research paper for publication-ready quality

Why choose our research paper service:

  • Subject-specific experts with advanced degrees in your field
  • Original, methodologically sound research design
  • Adherence to any formatting style (APA, MLA, Chicago, Harvard, etc.)
  • On-time delivery, even under tight deadlines
  • Free revisions until your paper meets your expectations
  • 24/7 customer support

Get expert research assistance today and receive a 15% discount on your first order with code firstpaper15.

Order Your Custom Research Paper Now


Conclusion

The methodology section is the backbone of your research paper—it’s where you prove your study is rigorous, ethical, and replicable. By following APA 7th edition structure, providing sufficient detail for replication, justifying your methodological choices, and addressing ethical considerations, you’ll create a Methods section that withstands peer review and builds confidence in your findings.

Key takeaways:

  1. Structure matters: Follow APA format (Method heading → Participants, Materials, Procedure, Data Analysis)
  2. Be specific: Exact numbers, procedures, instruments, software versions
  3. Write in past tense: Everything that already happened
  4. Justify choices: Why this design? Why this sample? Why this analysis?
  5. Address ethics: IRB approval, informed consent, confidentiality
  6. Avoid common errors: Vagueness, mixing methods/results, insufficient detail
  7. Enable replication: Someone should be able to repeat your study using only your description

Master the methodology section, and you’ll have a solid foundation for a publishable, impactful research paper.


References

American Psychological Association. (2020). Publication manual of the American Psychological Association (7th ed.). https://doi.org/10.1037/0000165-000

Beck, A. T., Steer, R. A., & Brown, G. K. (1996). Manual for the Beck Depression Inventory-II. Psychological Corporation.

Brown, K. W., & Ryan, R. M. (2003). The benefits of being present: Mindfulness and its role in psychological well-being. Journal of Personality and Social Psychology, 84(4), 822–848. https://doi.org/10.1037/0022-3514.84.4.822

Cohen, S., Kamarck, T., & Mermelstein, R. (1983). A global measure of perceived stress. Journal of Health and Social Behavior, 24(4), 385–396. https://doi.org/10.2307/2136404

Faul, F., Erdfelder, E., Lang, A.-G., & Buchner, A. (2009). G*Power 3.1: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behavior Research Methods, 41(4), 1149–1160. https://doi.org/10.3758/BRM.41.4.1149

Lee, E. H. (2012). Review of the psychometric evidence of the Perceived Stress Scale. Asian Nursing Research, 6(4), 121–127. https://doi.org/10.1016/j.anr.2012.08.004
