Conducting Mixed-Methods Research: From Classical Social Sciences to the Age of Big Data and Analytics


CHAPTER 1. Mixed-Methods Research as the Third Research Approach

🧭 Overview

🧠 One-sentence thesis

Mixed-methods research has emerged as a legitimate third research approach that combines qualitative and quantitative methods to leverage complementary strengths and overcome the limitations inherent in single-method approaches.

📌 Key points (3–5)

  • What mixed-methods research is: An approach that combines qualitative and quantitative techniques, methods, concepts, and languages in a single study or series of studies, requiring integration of findings (not just parallel collection).
  • Why it matters: It addresses weaknesses of single methods, provides more complete understanding, enables simultaneous exploratory and confirmatory questions, and generates richer insights through integration.
  • Core characteristics: Uses both qualitative and quantitative methods with explicit integration; embraces methodological eclecticism and paradigm pluralism; addresses both exploratory and confirmatory questions; emphasizes diversity at all research levels.
  • Common confusion: Multi-method vs. mixed-methods—multi-method uses multiple qualitative OR multiple quantitative methods, while mixed-methods requires combining both qualitative AND quantitative approaches with integration.
  • Key challenge: Researchers trained in only one paradigm often struggle with the compatibility thesis (whether qualitative and quantitative methods can be legitimately combined), leading to contribution shrinkage when integration is missing.

🎯 Defining mixed-methods research

🔄 Evolution from method to methodology

  • Early definitions (1970s-1980s) focused on method: combining data collection techniques to examine the same research problem.
  • Later definitions (1990s-2000s) expanded to methodology: a broader inquiry logic and research approach, not just technical procedures.
  • Current view integrates multiple perspectives: method, methodology, research design, and philosophical orientation combined.

Mixed-methods research: Research in which investigators collect and analyze both qualitative and quantitative data, integrate the findings, and draw inferences using both approaches in a single study or program of inquiry.

📊 Multi-method vs. mixed-methods distinction

| Approach | Definition | Example |
|---|---|---|
| Multi-method | Uses two or more qualitative methods OR two or more quantitative methods | Diary study + laboratory observation (both qualitative) |
| Mixed-methods | Combines qualitative AND quantitative methods with integration | Survey data + follow-up interviews with explicit integration of findings |
  • Both fall under the umbrella of "multiple methods research."
  • The key differentiator: mixed-methods requires crossing the qualitative-quantitative boundary.
  • Example from excerpt: A study using four surveys plus twenty interviews to examine deadline pressures—the integration of numerical patterns with in-depth practitioner insights exemplifies mixed-methods.

🔑 Integration as the core element

  • Simply collecting both types of data is insufficient.
  • Researchers must explicitly specify:
    • Sequencing: which method comes first or whether they occur simultaneously.
    • Priority: which method receives more emphasis.
    • Integration: how findings from both methods are combined to produce meta-inferences.
  • Without integration, the study fails to achieve mixed-methods objectives and leads to "contribution shrinkage" (missing opportunities to develop or extend theory).

🧩 Five defining characteristics

🔬 Use of both qualitative and quantitative methods

  • Requires at least one qualitative and one quantitative method in the same study.
  • Goes beyond mere collection—must include clear integration of results.
  • Meta-inferences: The integrated conclusions drawn from combining both data types; this is what distinguishes mixed-methods from simply running two separate studies.
  • Don't confuse: Collecting both data types without integration is not true mixed-methods research.

🎨 Methodological eclecticism

Methodological eclecticism: Selecting and synergistically integrating the most appropriate techniques from qualitative, quantitative, and mixed methods to thoroughly investigate a phenomenon.

  • Rooted in rejecting the incompatibility thesis (the belief that qualitative and quantitative methods cannot be mixed due to paradigmatic differences).
  • Researchers are free to combine methods when they best answer research questions.
  • Important caveat: Sometimes the best approach for a study may be purely qualitative or purely quantitative—mixed-methods is not always superior.

🌈 Paradigm pluralism

  • Unlike single-method approaches tied to one paradigm (positivist for quantitative, constructivist/interpretivist for qualitative), mixed-methods allows multiple paradigms.
  • Alternative paradigms mentioned: pragmatism, critical realism, transformative emancipatory.
  • Challenge: Each paradigm has its own assumptions that may conflict when combined; researchers must be diligent in defending the compatibility thesis.
  • The excerpt acknowledges ongoing philosophical debates but argues pluralism leads to stronger inferences and richer insights.

❓ Focus on both exploratory and confirmatory questions

  • Exploratory questions: Broad questions seeking to discover what is happening, gain insights, or assess phenomena in new light (typically qualitative).
  • Confirmatory questions: Specific questions testing theory or hypotheses (typically quantitative).
  • Mixed-methods enables addressing both simultaneously—generating and evaluating theory in the same study.
  • Example: Using qualitative methods to identify main constructs and relationships, then quantitative methods to test the significance of those relationships.

🌍 Emphasis on diversity at all levels

  • Goes beyond just methodological and paradigmatic diversity.
  • Encompasses diverse range of questions, data sources, and analyses.
  • Provides opportunity for divergent findings from multiple sources.
  • Diversity is viewed as a strength, not a problem to be resolved.

💪 Value and advantages

⚖️ Compensating for weaknesses

Core principle: When different strategies are combined, the mixture results in complementary strengths and non-overlapping weaknesses.

| Method | Strengths | Weaknesses |
|---|---|---|
| Quantitative | Objective, generalizable, large samples, precise numerical results, systematic analysis | May not reflect participants' understanding; standardized instruments limit data types; controlled settings may not represent natural context |
| Qualitative | Rich depth, contextual understanding, responsive to changes, captures lived experiences | Not generalizable to populations; time-consuming; researcher bias; social desirability when researcher is present |
  • Example from excerpt: Survey reaches many respondents but may have unclear responses; follow-up interviews allow elaboration and develop richer understanding.
  • The weaknesses of surveys (standardization, interpretation ambiguity) are offset by interview strengths (depth, clarification).

🧩 More complete picture

  • Neither method alone is sufficient to understand complex phenomena.
  • Combining numerical data with narrative data provides complementary views.
  • Multiple data types analyzed with multiple techniques (statistical + content analysis) yield fuller understanding.
  • Example: In health science, gathering exploratory data before, during, and after a clinical trial augments the trial design and leads to more complete findings.

✅ Greater confidence and accuracy

  • Integration of findings from different methods leads to insights that cannot emerge from either alone.
  • Example from excerpt: A study of the global arms industry (1996-2007) first used interviews with industry experts to understand its categorical structure, then quantitative data on the 210 largest weapons providers for firm-level analysis.
  • The researcher noted that mixed-methods provided "in-depth knowledge of the industry rather than the researcher's prior belief" about which categories matter.
  • Complex, multifaceted phenomena require different data types and analyses to fully understand.

🎯 Answering questions single methods cannot

  • Major advantage: Simultaneously answer exploratory and confirmatory questions, thus verify and generate theory in the same study.
  • Single-method approaches can only address one goal per study.
  • Example: Study of team-based empowerment used longitudinal quasi-experiment to test hypothesis (confirmatory), then analyzed interview data to identify themes about why/how leaders facilitated or obstructed implementation (exploratory).
  • Such dual inquiry is impossible with either method alone.

🔀 Opportunity for divergent findings

  • Convergence of results strengthens validity and initial assumptions.
  • Divergence is also valuable: Provides opportunity to re-examine conceptual frameworks and theoretical assumptions.
  • Don't confuse: Divergent findings are not failures but opportunities for deeper understanding and theory refinement.

🚧 Challenges and obstacles

🎓 Training and skill gaps

  • Scholars are commonly trained in only one school of thought (either qualitative or quantitative).
  • Conducting mixed-methods research is difficult without collaborating with scholars from different traditions.
  • Requires understanding of both dominant paradigms plus strategies to accommodate differences and leverage complementarities.

🔗 Lack of integration in practice

  • Common problem: Little or no integration of findings within a single inquiry.
  • Leads to two negative outcomes:
    • Contribution shrinkage: Missing opportunities to discover, develop, or extend theory using findings from different approaches.
    • Communal disutility: The research community fails to learn intricacies of the phenomenon because an integrative view is not provided.

🤝 Paradigm compatibility debates

  • The two dominant research cultures (positivist/quantitative vs. constructivist/qualitative) have led to the belief that, if both methods are used, they should be executed independently.
  • The compatibility thesis (that mixing is legitimate) continues to face resistance from researchers aligned with particular paradigms.
  • Researchers must be "diligent and persistent" in defending compatibility while acknowledging philosophical differences.

📈 Growing recognition and support

🌟 Increased popularity indicators

  • Growing number of mixed-methods articles in journals, conference proceedings, and books across fields.
  • Field-specific guidelines published (e.g., nursing field).
  • Multiple textbooks dedicated to mixed-methods research.

💰 Funding agency encouragement

  • National Science Foundation (NSF): Task force noted evaluators should "rely on a mixed-methods approach whenever possible."
  • National Institutes of Health (NIH): Provided guidelines in 2010 for developing and evaluating mixed-methods research proposals.
  • Both agencies actively encourage scholars to use mixed-methods in their work.

📚 Terminology evolution

  • Early inconsistency: Terms included multi-method, triangulation, integrated, hybrid, combined, mixed methodology.
  • Current consensus: "Mixed-methods research" is the accepted term for combining qualitative and quantitative approaches.
  • The field is described as being in "adolescence"—disagreement about exact definitions indicates the term is important and the field is still developing.

CHAPTER 2. Philosophical Foundations of Mixed-Methods Research

🧭 Overview

🧠 One-sentence thesis

Mixed-methods research embraces paradigm pluralism—the belief that multiple philosophical paradigms (pragmatism, critical realism, transformative emancipatory, or a dialectic stance) can legitimately underpin the integration of qualitative and quantitative methods in a single study.

📌 Key points (3–5)

  • Paradigm pluralism: Mixed-methods research accepts that a variety of paradigms may serve as the underlying philosophy, unlike traditional qualitative or quantitative research which typically aligns with a single paradigm.
  • Two main stances: An alternative paradigm stance (use one paradigm to guide the entire study) versus a dialectic stance (engage multiple paradigms in dialogue within the same inquiry).
  • Three alternative paradigms: Pragmatism (focus on research questions and actionable knowledge), critical realism (focus on causal mechanisms across real/actual/empirical domains), and transformative emancipatory (focus on power, justice, and community involvement).
  • Common confusion: Alternative paradigm stance seeks coherent actionable knowledge from one framework, while dialectic stance seeks insightful understanding by respectfully integrating diverse lenses that could not be obtained using one framework alone.
  • Why it matters: Paradigmatic assumptions shape the research process, influence outcomes, and determine how researchers integrate methods, interpret findings, and draw conclusions.

🔄 Two Stances on Mixing Paradigms

🔄 Alternative paradigm stance

  • Uses a single paradigm to provide philosophical underpinning for the entire mixed-methods study.
  • Presents a coherent system of thought with its own philosophical assumptions.
  • Context and practicality (not philosophical frameworks) guide practice.
  • Meta-inferences represent actionable knowledge—knowledge that can be directly acted upon to improve the practical problem being studied.
  • Example: A researcher might use pragmatism throughout the study, letting research questions drive all method choices and focusing on practical solutions.

🔄 Dialectic stance

  • Uses multiple paradigms in a single inquiry, engaging assumptions from different traditions in respectful dialogue.
  • More than one philosophical, theoretical, or mental model framework intentionally guides the study.
  • Meta-inferences represent a respectful integration of diverse lenses, producing insightful understanding that could not be obtained using one framework/method alone.
  • Involves using data from one approach to directly inform the other in an iterative way.
  • Example: A researcher might address the same research question using two different methods, each rooted in a distinct paradigm, then compare results from different worldviews.

🔍 How to distinguish the two stances

| Dimension | Alternative paradigm stance | Dialectic stance |
|---|---|---|
| Number of paradigms | Single coherent paradigm | Multiple paradigms in dialogue |
| What guides practice | Context and practicality | Multiple philosophical frameworks |
| Meta-inferences | Actionable knowledge for practical problems | Integrated understanding from diverse lenses |
| Goal | Coherent system of thought | Generative engagement with difference |

🧩 Four Philosophical Elements of a Paradigm

🧩 Ontology

Ontology: how we view the existence of the world and society (reality).

  • Addresses the question "What is reality?"
  • Different paradigms hold different ontological assumptions about whether reality is single, multiple, or layered.
  • Example: Positivism assumes a single objective reality independent of our minds; constructivism assumes multiple realities constructed by individuals' minds.

🧩 Epistemology

Epistemology: what we believe about knowledge.

  • Addresses the question "How do we know / what counts as knowledge?"
  • Concerns the relationship between the knower and what can be known.
  • Example: Positivism views knowledge as objective and dispassionate; constructivism views knowledge as subjective, emerging from interaction between researchers and participants.

🧩 Methodology

Methodology: what systematic or theoretical approaches can we use to conduct a research study.

  • Addresses the question "What approach do we use to carry out research?"
  • Different paradigms favor different methods based on their ontological and epistemological assumptions.
  • Example: Positivism favors hypothetico-deductive methods (experiments, surveys); constructivism favors hermeneutic/dialectic methods (case studies, ethnography).

🧩 Axiology

Axiology: what role do values and ethics play in research.

  • Addresses the question "How do values come into play in an inquiry?"
  • Concerns whether research can or should be value-free.
  • Example: Positivism claims inquiry can be carried out without the influence of a value system; constructivism and transformative emancipatory paradigms view research as value-laden.

🎯 Pragmatism

🎯 Core principle: actions and consequences

  • Knowledge is acquired through the combination of action and reflection.
  • Knowledge is always about the relationship between action and consequences.
  • Different approaches generate different outcomes, so knowledge claims must be judged pragmatically in relation to the processes through which knowledge was generated.
  • Don't confuse: Pragmatism does not claim "anything goes"—it requires judging knowledge claims based on the particular methods and methodologies used.

🎯 Central importance of research questions

Dictatorship of the research questions: Research questions are more important than methods or methodologies.

  • The decision regarding research approach should depend on the research questions.
  • In some situations qualitative is more appropriate; in others quantitative is preferable; in many cases both can produce more complete understanding.
  • Pragmatism allows researchers to be free of mental and practical constraints imposed by the forced choice between postpositivism and constructivism.

🎯 Key characteristics of pragmatism

  • Rejects the incompatibility thesis: Supports using both qualitative and quantitative methods in the same study; is anti-dualist, questioning the dichotomy of positivism and constructivism.
  • Avoids metaphysical concepts: Believes in an "antirepresentational view of knowledge"—research should aim at utility for us, not to most accurately represent reality.
  • Practical and applied: Study what interests and is of value to you, study it in different ways you deem appropriate, and utilize results to bring about positive consequences within your value system.
  • Methodological appropriateness: Select the best methodologies as long as they help address research questions.

🎯 Ontology, epistemology, methodology, axiology in pragmatism

| Element | Pragmatism's position |
|---|---|
| Ontology | Both single and multiple realities |
| Epistemology | Knowledge is both constructed and based on the reality of the world we experience |
| Methodology | Methodological appropriateness—methods that can best address the research questions |
| Axiology | Both biased and unbiased perspectives |

🎯 Example: KMS use study (Zhang & Venkatesh, 2017)

  • Research questions: (1) What are the key KMS features that enhance job outcomes? (2) What are the drivers and consequences of use of the key KMS features?
  • Design: Sequential mixed-methods—qualitative study (interviews with 35 employees to identify key KMS features) followed by quantitative study (longitudinal survey with 1,441 responses to test nomological network).
  • Pragmatist principles applied:
    • Research questions drove method selection—no strong theoretical foundation existed, so qualitative feature validation followed by quantitative model validation was deemed appropriate.
    • Rejected forced choice between positivism and constructivism—relied on prior literature and used Miles and Huberman's approach to code interview data, then used PLS-SEM for quantitative analysis.
    • Established credibility by questioning theoretical assumptions and hypothesizing about consequences when hypotheses were unsupported.
    • Mixed perspectives from multiple data sources to enhance validity.

🔬 Critical Realism

🔬 Core principle: stratified reality

  • Reality exists independent of our knowledge (intransitive object of knowledge), but the generation of new knowledge depends on specific details and processes of its production (transitive object of knowledge).
  • Recognizes three forms of knowledge based on three domains:
    • Real domain: represents a reality that may or may not be observable (generative mechanisms, underlying structures).
    • Actual domain: a subset of the real (events that occur whether or not we observe them).
    • Empirical domain: a subclass of observable, experienced events and change.
  • Don't confuse: Critical realism is not the same as positivism—it accepts that our access to the world is always mediated by our perceptual and theoretical lenses.

🔬 Epistemological relativism

  • Accepts the possibility of alternative explanations to any phenomenon.
  • All theories about the world are grounded in a particular perspective and worldview.
  • All knowledge is partial, incomplete, and fallible.
  • However, maintains a strong realist and critical core—recognizes assumptions and limitations of both positivism and interpretivism.

🔬 Retroduction: the main methodology

Retroduction: the process of moving from descriptions of empirical events or regularities to potential causal mechanisms, the interaction of which could potentially have generated events.

  • Essentially the same as abduction.
  • Critical realists look for underlying generative mechanisms—the ways of acting of a thing, the causal powers and liabilities of objects or relations, capacities of behavior, tendencies of structures.
  • Causality is reformulated as "A generates B in context C" (not simply "A causes B").
  • Causality is a process of how causal powers are actualized in some particular context, where the generative mechanisms of that context shape the particular outcomes.
  • This approach allows researchers to use a wide variety of methods, integrating qualitative and quantitative approaches to hypothesize and identify generative mechanisms.

🔬 Ontology, epistemology, methodology, axiology in critical realism

| Element | Critical realism's position |
|---|---|
| Ontology | Much of reality exists and operates independent of our knowledge; consists of three domains: real, actual, and empirical |
| Epistemology | Epistemological relativism—scientific knowledge is viewed as historically emergent, political, and imperfect |
| Methodology | Methodological pluralism—retroductive, in-depth, historically situated analysis of pre-existing structures |
| Axiology | Value-laden |

🔬 Relevant aspects for mixed-methods research

  • Process approach to causality: Views causality as fundamentally referring to actual causal mechanisms and processes involved in particular events and situations; uses retroduction to move between knowledge of empirical phenomena and creation of explanations.
  • Seeing mind as part of reality: Recognizes mental events and processes as real phenomena that can be causes of behavior; supports the idea that individuals' social and physical contexts have causal influence on their beliefs and perspectives.
  • Validity and inference quality: Validity is not a matter of procedures but of the relationship between the claim and the phenomena in the specific context of a particular study; focuses on establishing whether the generative mechanism hypothesized is involved in the observed events.
  • Diversity as a real phenomenon: Believes diversity is itself a real phenomenon; mixed-methods research with critical realism helps overcome diversity limitation by examining different levels of abstraction using different methods.

🔬 Example: IT use study (Walsh, 2014)

  • Research question: How do users' IT culture and IT needs influence IT use?
  • Design: Exploratory (grounded theory) mixed-methods approach investigating use of Moodle e-learning platform.
  • Critical realist principles applied:
    • Used both secondary data from literature and primary data (quantitative surveys analyzed with PLS-SEM; qualitative interviews hand-coded).
    • Analyzed all data together, constantly comparing and analyzing as collected, letting data speak and guide emerging theory.
    • Assumed reality is multi-faceted and may be perceived differently by different individuals and in different contexts.
    • Focus on establishing causality—integrated qualitative and quantitative data to generate theoretical propositions through iterations, proposing new relationships as generative mechanisms.
    • Example proposition: "individual IT culture mostly has a positive influence on the individual's global IT needs" established as causal relationship in the substantive area and investigated context.

🔬 Example: Healthcare equity study (Smirthwaite & Swahnberg, 2016)

  • Research question: What causes gender differences in waiting times for cataract extraction?
  • Design: Two empirical studies—quantitative (database analysis of 102,532 patients) and qualitative (focus group interviews at two clinics).
  • Critical realist principles applied:
    • Differentiated three forms of knowledge:
      • Real domain: generative mechanisms affecting waiting times emerge from intersecting power structures related to gender, ethnicity, class, and age.
      • Actual domain: care-seeking behavior, shaped by intersecting gender norms, class norms, and norms related to ethnicity and age, interacts with the notions, values, and assumptions of health care staff.
      • Empirical domain: statistical analyses showed that women spent a longer time on the waiting list; focus group analyses showed how assertive behavior tended to be legitimized for male patients.
    • The quantitative study alone was unsatisfactory from a critical realist perspective because it could not conclude anything about what causes the differences in waiting times—it could not inform about causal mechanisms.
    • Qualitative study sought to identify factors (causal mechanisms) that contributed to longer waiting times for women.
    • Assumed that the opinions, notions, prejudices, and experiences expressed in the focus groups reflected ordinary opinions (discovered, rather than caused or constructed by the research).

🌍 Transformative Emancipatory Paradigm

🌍 Core principle: explicit recognition of power and values

Transformative emancipatory paradigm: reflects explicit recognition of values and knowledge of self and community that form a basis for methodological decisions.

  • Driven by dissatisfaction with dominant research paradigms that fail to serve the needs of those traditionally excluded from positions of power in the research world.
  • Focuses on discrimination, oppression, and marginalization.
  • Goal: create a more just and democratic society; research should serve the ends of creating justice and democracy throughout the entire research process.

🌍 Ontological assumption

  • Rejects cultural relativism.
  • Recognizes the influence of privilege in determining what is accepted as "real" and the consequences of accepting one version of reality over another.
  • Reality is socially constructed and shaped by social, political, ethnic, and cultural lenses.
  • Multiple realities exist, but researchers must be explicit about the social, political, cultural, economic, ethnic, age, and disability values that define realities.

🌍 Epistemological assumption

  • Understanding the culture and building trust are paramount.
  • Knowledge is not neutral and is influenced by human interests.
  • Knowledge reflects the power and social relationships within society.
  • The purpose of knowledge construction is to aid people to improve society.
  • The relationship between researchers and participants is a critical determinant in achieving understanding of valid knowledge.

🌍 Ontology, epistemology, methodology, axiology in transformative emancipatory

| Element | Transformative emancipatory's position |
|---|---|
| Ontology | Multiple realities that are socially constructed; researchers must be explicit about the social, political, cultural, economic, ethnic, age, and disability values that define realities |
| Epistemology | Knowledge is socially and historically located within a complex cultural context; respect for culture and awareness of power relations are critical |
| Methodology | Researchers can use qualitative, quantitative, or mixed methods, but there should be an interactive link between them and the participants in defining the focus of research |
| Axiology | Respect, beneficence, and justice |

🌍 Cyclical model for research

  • Community involvement is needed at the beginning, throughout, and at the end of each research study.
  • Cyclical use of data to inform decisions for next steps, whether related to additional research or program changes.
  • The researcher is someone who recognizes inequalities and injustices, strives to challenge the status quo, is a bit of a provocateur with overtones of humility, and possesses a shared sense of responsibility.

🌍 Example: Women's social capital study (Hodgkin, 2008)

  • Research questions: (1) Do men and women have different social capital profiles? (2) Why do women participate more in social and community activities than in civic activities?
  • Objectives: Draw attention to lack of gender focus in studies of social capital; make more visible women's contributions to society.
  • Design: Explanatory sequential mixed-methods—survey followed by in-depth interviews with 12 women.
  • Transformative emancipatory principles applied:
    • Purpose of mixed-methods was complementarity—elaboration, enhancement, and clarification from results of one method with those of the other.
    • Quantitative study showed difference between men and women in social and community participation, but did not reveal underlying motivation or experience.
    • The qualitative study, with close interaction between the author and participants, enabled understanding of why women became involved—findings revealed that the experience of motherhood influenced the decision to get involved.
    • Members of the community (local residents) were involved throughout the research process, allowing for richer understanding of gender and social capital.
    • Cyclical use of data—results from quantitative stage pointed researcher to gather qualitative data to understand how women construct meaning around their involvement.

🔀 Dialectic Stance

🔀 Core principle: engagement with difference

  • Involves using multiple paradigmatic traditions and mental models, along with more than one methodology and type of method within the same study.
  • Aims to generate important understanding through the juxtaposition of different lenses, perspectives, and stances.
  • Provides "a meaningful engagement with difference, an engagement intended to be fundamentally generative of insight and understanding that are of conceptual and practical consequence."
  • Represents "multiple ways of seeing and hearing."

🔀 Two major characteristics

  • Recognizes legitimacy of multiple paradigmatic traditions: Embraces paradigmatic and methodological differences and seeks to integrate them in a dialogic manner; emphasizes importance of different people with different mental models contributing ideas.
    • Example: Address the same research question using two different methods, each rooted in a distinct paradigm, then make multiple inferences corresponding to different worldviews and compare results.
  • Iterative use of data: Use data from one approach to directly inform the other in an iterative way.
    • Example: Examine results of a particular method and consider how the paradigm underlying the study impacted the results, then analyze how results from one method corresponded to results from other methods, each with their own philosophical assumptions.

🔀 Dialectical pragmatism

  • A combination of dialectic approach and pragmatism, proposed as an alternative philosophical stance for mixed-methods research.
  • Pragmatism embraces the use of multiple methods to meaningfully generate information to address research questions.
  • Dialectic stance suggests researchers should carefully listen to and consider opposing viewpoints when developing a workable solution.
  • This perspective can provide ways to understand and properly integrate multiple paradigms, values, and methodologies to produce more complete knowledge about the phenomenon under study.

🔀 Opportunities for engagement with diversity

  • Mixed-methods way of thinking offers opportunities to engage meaningfully with differences encountered in study context—differences in ethnicity, gender, religion, country, culture.
  • Because dialectic stance invites multiple ways of seeing and knowing, a multiplicity of different perspectives is engaged, as are diversity and variation in the substance of what is being studied.
3

Nature of Theory in Mixed-Methods Research

CHAPTER 3. Nature of Theory in Mixed-Methods Research

🧭 Overview

🧠 One-sentence thesis

Mixed-methods research enables researchers to both build and test theory within the same inquiry through an iterative, cyclical process that combines deductive and inductive logic, compensating for the weaknesses of using either quantitative or qualitative methods alone.

📌 Key points (3–5)

  • Theory's reciprocal relationship with research: theory development relies on research and research relies on theory; mixed-methods research allows integration of multiple theoretical perspectives in one study.
  • Both methods can build or test theory: contrary to common assumptions, qualitative research can be confirmatory and quantitative research can be exploratory—this compatibility is central to mixed-methods research.
  • Common confusion—method stereotypes: the misconception that qualitative = exploratory and quantitative = confirmatory limits understanding; both approaches can serve either purpose.
  • Cyclical, iterative process: mixed-methods research moves between specific observations and general inferences, allowing researchers to start at any point and refine theory continuously.
  • Five evaluation criteria: theory developed through mixed-methods can be assessed for simplicity, accuracy, generalizability, falsifiability, and utility—mixed methods help balance trade-offs among these criteria.

🔬 Theory in Quantitative Research

📐 What theory does in quantitative research

Theory in quantitative research: "a set of interrelated constructs (variables), definitions, and propositions that presents a systematic view of phenomenon by specifying relations among variables, with the purpose of explaining natural phenomenon" (Kerlinger, 1979, p. 64).

  • Theory explains what researchers expect to find.
  • It is used to explain, predict, generalize, and inform research questions and hypotheses.
  • Theory is defined in terms of relationships between independent and dependent variables.
  • Evaluated primarily by its ability to explain variance in a criterion variable of interest.

🔄 Building vs. testing in quantitative research

Theory building (exploratory):

  • Goal: examine patterns from collected data.
  • Most common method: hypothetico-deductive approach.
  • Can start with pre-existing theory or preliminary literature review.
  • Researchers use "careful observation, shrewd guesses, and scientific intuition" to arrive at postulates.
  • Over time, researchers add mediators, moderators, antecedents, and consequences to expand theory.

Theory testing (confirmatory):

  • Goal: use data to test hypotheses derived from theory.
  • Example: testing relationships between variables through experiments or surveys.
  • Subsequent tests can validate propositions and refine the theory.

Don't confuse: Building theory in quantitative research relies heavily on data but can still use pre-existing theory as a starting point—it's not purely inductive.

🎨 Theory in Qualitative Research

🔍 Theory as explanation and lens

  • Theory in qualitative research serves as both an explanation and a lens informing different research phases.
  • Process involves continuous comparison of data and theory beginning with data collection.

🌱 Starting points for qualitative theory building

With a priori theory:

  • Provides firm empirical grounding for emergent theory.
  • Helps identify what is lacking in existing literature.
  • Example: Nielsen et al. (2014) used institutionalization theory to identify gaps in understanding field-level dynamics and organizational processes.

Without a priori theory:

  • Allows researchers to avoid bias or limitations from existing frameworks.
  • Enables switching research focus as the study advances.
  • Permits generation of new concepts from open-ended data.
  • No construct is guaranteed a place in the resultant theory.

✅ Confirmatory qualitative research

  • Though not commonly associated with qualitative methods, theory testing is possible.
  • Example: using follow-up confirmatory interviews to validate quantitative findings.
  • "Confirmatory qualitative research is not an impossibility... neither is exploratory quantitative research."

Don't confuse: The absence of pre-existing theory in qualitative research doesn't mean lack of rigor—it can enable discovery of genuinely new concepts.

🔄 Theory in Mixed-Methods Research

🧩 Integration through theory

  • Theory plays an important role in integrating quantitative and qualitative approaches.
  • Philosophical perspectives should guide theoretical orientations (theory is nested within paradigms).
  • Mixed-methods offers opportunities to integrate various theoretical perspectives (e.g., structuration theory, social network theory).
  • Allows researchers to generate and test theory in the same research inquiry.

🔁 The cyclical approach

"An iterative, cyclical approach to research" (Teddlie & Tashakkori, 2012, p. 781) that includes both deductive and inductive logic in the same research inquiry.

How the cycle works:

  • Move from specific observations → general inferences → tentative hypotheses.
  • Researchers may start at any point in the cycle (from theories or from observations).
  • Involves creative insights that may lead to creation of new knowledge.
  • New theory emerges over time as research is extended using different methods.

The relationship between existing and new theory:

  • New theory may be influenced by existing theory (what are other possible answers in current literature).
  • New theory can serve as an alternative answer to the same phenomenon.
  • Confirmation or disconfirmation emerges only after observing similar phenomena using different methods.

🎯 Developmental purpose

  • Using qualitative methods before quantitative allows researchers to:
    • Develop or extend theory
    • Identify specific dependent and independent variables
    • Develop measurement instruments
    • Determine adequate level of analysis
    • Give more attention to process research

Example: A researcher conducts case studies to create theoretical constructs and propositions, then tests the theory using surveys and experiments—this sequential approach builds and validates in one research program.

📊 Evaluating Theory in Mixed-Methods Research

⚖️ Five primary criteria

| Criterion | Definition | How mixed-methods helps |
| --- | --- | --- |
| Simplicity | Ease of understanding | Quantitative methods can simplify complex qualitative frameworks |
| Accuracy | Conformity to truth | Qualitative methods provide rich, accurate insights into personal experiences |
| Generalizability | Extension to other domains | Quantitative large-sample studies enable broader generalization |
| Falsifiability | Refutation is possible | Testing hypotheses quantitatively after qualitative theory building |
| Utility | Usefulness of theoretical systems | Combining methods creates theory that is both rich and testable |

🔧 Compensating for weaknesses

  • Qualitative research: often accurate but potentially not generalizable, and often lacking simplicity.
  • Quantitative research: simple and generalizable, but lacking richness and insight into participants' personal experiences.
  • "Any single method of data collection results in trade-offs in the resulting theory's simplicity, generalizability, and accuracy."

Mixed-methods solution:

  • Start with qualitative methods to build an initial theoretical framework (accuracy, richness).
  • Use quantitative methods to simplify complexity and test the framework (generalizability, falsifiability).
  • Result: theory that is accurate, simple, generalizable, falsifiable, and useful.

🔬 Examples from Practice

📱 Example 1: Sarker et al. (2018) – Work-life conflict in distributed software development

Purpose: Developmental—qualitative study informed variable identification and hypothesis development.

Phase 1 (Qualitative):

  • Used exploratory case study with border theory as metatheoretical lens.
  • Identified key factors affecting work-life conflict (WLC) in globally distributed software development (GDSD).
  • Revealed categories: extent of physical border, temporal border, flexibility, permeability.
  • Developed theoretical model and hypotheses.

Phase 2 (Quantitative):

  • Conducted survey of IT professionals in US, UK, and India.
  • Used structural equation modeling to test hypotheses.
  • Most hypotheses were supported.

Contribution: Developed and empirically tested a theoretical model; validated border theory in a different context.

🏥 Example 2: Stewart et al. (2017) – Team-based empowerment barriers

Approach: Parallel mixed-methods—quantitative and qualitative data collected independently but simultaneously.

Quantitative component:

  • Tested hypothesis: teams with high-status leaders (physicians) are less effective at implementing team-based empowerment.
  • Longitudinal quasi-experimental design with 224 providers.
  • Used discontinuous growth modeling.
  • Results supported the hypothesis.

Qualitative component:

  • Semi-structured interviews conducted prior to quantitative analysis.
  • Focused on identifying facilitators and barriers.
  • Identified patterns: high-status leaders had difficulty embracing empowering leader identity, leading to ineffective delegation.

Contribution: Used quantitative approach to test relationships; used qualitative approach to uncover "why" and "how" mechanisms; refined theory by combining findings from both approaches.

Don't confuse: Parallel mixed-methods doesn't mean the findings are independent—integration happens during interpretation to refine and explain the theory.

🔗 Theory and Paradigms

🌐 The Kuhn Cycle relationship

  • Theory resides at the epistemological level.
  • Theory helps inform choice of methodology and methods.
  • Paradigms are constitutive of all normal-science activities through the theories they embody.
  • When paradigms change, significant shifts occur in criteria for determining legitimacy of problems and solutions.
  • Researchers using mixed-methods should take a theoretical position consistent with the chosen paradigm.

🤝 Pragmatic view

  • Both quantitative and qualitative studies can have exploratory or confirmatory objectives.
  • Both objectives are linked by theory.
  • Theory offers a conceptual framework and assumptions that guide researchers toward a common goal.
  • De-emphasizes differences and incompatibilities between quantitative and qualitative methods.
4

Appropriateness of Using a Mixed-Methods Research Approach

CHAPTER 4. Appropriateness of Using a Mixed-Methods Research Approach

🧭 Overview

🧠 One-sentence thesis

Mixed-methods research is appropriate when research questions require both qualitative and quantitative approaches to generate different kinds of knowledge, and the choice depends on clearly defined research questions and purposes that justify combining methodologies.

📌 Key points (3–5)

  • Foundation of design decisions: Research questions and purposes, together with theoretical perspectives and paradigms, form the basis for determining whether mixed-methods research is appropriate.
  • Research questions drive methods: In pragmatism, research questions determine the type of design, sampling, data collection, and analysis techniques used.
  • Seven purposes justify mixed-methods: Compensation, corroboration, diversity, developmental, complementarity, completeness, and expansion provide rationales for combining methods.
  • Common confusion: Don't confuse corroboration (assessing the same phenomenon with different methods) with complementarity (assessing different facets of the same phenomenon).
  • Explicit justification required: Researchers must clearly state the purpose and demonstrate why mixed-methods is necessary to answer their research questions.

🔍 Research Questions in Mixed-Methods

🔍 Why research questions matter

Research questions in mixed-methods research determine the type of research design, sampling design, and data collection and analysis techniques.

  • Research questions are shaped by the study's purpose and in turn shape the methods and design.
  • The compatibility thesis focuses on research questions as the basis for conducting mixed-methods research.
  • Researchers are free to combine methodologies most appropriate to address their research questions.
  • Good research questions create a research boundary and focus on finding answers coherently.
  • Poor questions create problems affecting all subsequent stages.

📝 Types of research questions

| Question Type | Characteristics | Example Focus |
| --- | --- | --- |
| Qualitative | Open-ended, evolving, non-directional | Why and how of human interactions |
| Quantitative | Focuses on relationships among variables | Descriptive, comparative, or associative |
| Mixed-methods | Embeds both qualitative and quantitative components | Requires both data types collected concurrently, sequentially, or iteratively |

🔄 Qualitative research questions

  • Generally open-ended, evolving, and non-directional.
  • Can change during the research process to reflect evolving understanding.
  • Iterative approach recommended: Start with a broad question, then narrow down into sub-questions.
  • First iteration should be tentative and exploratory.
  • Should capture basic goals in one or two overarching questions.

📊 Quantitative research questions

  • Focus on relationships among variables.
  • Three main types:
    • Descriptive: What is the perception of X among Y?
    • Comparative: What is the difference in X between groups A and B?
    • Associative: What is the relationship between X and Y?
  • Can be written as questions and/or hypotheses.
  • May emerge from theory or from identifying gaps in literature (exploratory quantitative research).
  • Can be based on a conceptual framework when a priori theory is insufficient.

🔗 Mixed-methods research questions

Mixed-methods research questions embed both a quantitative research question and a qualitative research question within the same question.

  • Necessitate that both qualitative and quantitative data be collected and analyzed.
  • A strong mixed-methods study should have a mixed-methods research question in addition to separate qualitative and quantitative questions.
  • Must demonstrate the need for interconnected qualitative and quantitative components.

📐 Four Dimensions of Research Questions

📐 Dimension 1: Rhetorical style—question format

Three formats available:

  1. Question format: Phrased as direct questions
  2. Aim/objective statement: Stated as research objectives
  3. Hypothesis: Formulated as testable predictions

Example: A study might use questions ("How do instructors interpret flipping?"), objectives ("explore the paradox of why leaders resist"), or hypotheses ("living abroad changes self-concept clarity").

🔗 Dimension 2: Rhetorical style—level of integration

Three presentation approaches:

  1. Independent questions: Write qualitative and quantitative questions separately
  2. Separate plus mixed: Write separate qualitative and quantitative questions, then add a mixed-methods question focusing on integration
  3. Overarching mixed question: Write one overarching mixed-methods question broken down into separate sub-questions

Don't confuse: Level of integration in question format vs. actual integration in the research design—these are related but distinct considerations.

🔀 Dimension 3: Relationship between research questions

| Relationship Type | Description | Example |
| --- | --- | --- |
| Independent | Questions focus on different aspects; can be answered concurrently | Q1: Impact of program on youth; Q2: How stakeholders experience program |
| Dependent | One question depends on results of another | Quantitative question depends on qualitative findings; mixed-methods question depends on both |

⏱️ Dimension 4: Relationship to research process

  • Pre-determined: Research questions based on theory or prior literature; questions are fixed from the start
  • Emergent: Research questions evolve throughout the research phases; questions develop as understanding deepens

Example: Pre-determined questions follow from literature review; emergent questions develop after initial data collection reveals new directions.

🎯 Seven Purposes of Mixed-Methods Research

🎯 Overview of purposes

Purposes of mixed-methods research provide the rationale for conducting mixed-methods research.

The seven purposes are listed in increasing order of value from methodological compensation to rich intervention:

  1. Compensation
  2. Corroboration
  3. Diversity
  4. Developmental
  5. Complementarity
  6. Completeness
  7. Expansion

  • The quality of a mixed-methods study is directly associated with whether the purpose is accomplished.
  • Mixed-methods research can serve multiple purposes.
  • Explicit recognition of purposes helps readers understand goals and outcomes.
  • Explication of purposes is considered mandatory in mixed-methods research practice.

🛠️ Purpose 1: Compensation

Used to address the weaknesses of one type of method using the other type of method.

How it works:

  • Offset weaknesses of one method by using the other method.
  • If errors occur in one type of data, they can be reduced by using another method.
  • Does not require that qualitative and quantitative methods be implemented equally.

Design approach:

  • Typically uses sequential mixed-methods approach.
  • Second strand conducted after findings from first strand are known.
  • Second study explicitly addresses weaknesses in the first study.

Example scenario: A researcher encounters a small sample size challenge in a quantitative survey. To compensate, they gather qualitative interview data to gain in-depth understanding and increase confidence in generating valid inferences.

✅ Purpose 2: Corroboration

Used to assess the credibility of inferences obtained from one approach using another approach.

How it works:

  • Aims to counteract or maximize heterogeneity of irrelevant variance sources.
  • Addresses method bias, inquirer bias, and theory bias.
  • Seeks convergence, triangulation, and correspondence of findings across methods.

Key distinction:

  • Corroboration: Different methods assess the same conceptual phenomenon
  • Complementarity: Different methods assess different facets of the same phenomenon

Design approach:

  • Concurrent mixed-methods design is well suited.
  • Both approaches implemented independently and concurrently to preserve counteracting biases.

Triangulation roots:

  • Traces back to Campbell and Fiske (1959).
  • Original triangulation used multiple methods within one paradigm (limited value).
  • Mixed-methods corroboration uses both quantitative and qualitative paradigms to address inherent weaknesses.

Example scenario: A researcher studies organizational identity using both implicit qualitative measures (troubling-event questions) and explicit quantitative measures (identity orientation scales) to triangulate findings and identify properties and predictors.

🔀 Purpose 3: Diversity

Used to obtain divergent views of a single phenomenon.

How it works:

  • Seeks to find new and better explanations for phenomena.
  • Divergent findings are compared and contrasted to generate stronger inferences.
  • Balance achieved when differences between qualitative and quantitative findings are properly reconciled.

Four strategies for handling divergence:

| Strategy | When to Use | What Happens |
| --- | --- | --- |
| Reconciliation | Divergent results can be interpreted plausibly | Reanalyze existing data; may lead to new perspective but not new research questions |
| Initiation | Conflicting results suggest new frameworks | Ask new research questions; collect and analyze new data |
| Bracketing | Findings are irreconcilable | Present extreme results (e.g., best-case vs. worst-case scenarios) |
| Exclusion | Inferences contradict and lack validity | One type of data or results excluded due to incomplete or invalid results |

Design approach:

  • Can use sequential or concurrent design.
  • Sequential if time order is critical variable.
  • Concurrent if investigating differences among populations in same time period.

Example scenario: A study compares IT and general management perceptions of ERP systems. Qualitative interviews show divergence in views, but quantitative survey shows similar perceptions with only small inconsistencies, providing divergent views of the same phenomenon.

🔨 Purpose 4: Developmental

Results from one method help develop or inform the design and development of the subsequent study using the other method.

How it works:

  • First method/study informs the design and development of second method/study.
  • Development broadly includes sampling designs and measurement decisions.
  • Increases validity of constructs and inquiries by compensating weaknesses of one method with strengths of another.

Design approach:

  • Always involves sequential use of methods.
  • First study (qualitative or quantitative) informs the second.

Example scenario: Researchers conduct in-depth interviews to identify critical customer-supplier obligations in IT outsourcing (qualitative phase), then use those findings to develop survey constructs and test their impact on success (quantitative phase).

Don't confuse: Developmental purpose involves more than just addressing methodological weaknesses—it's about using one study's results to build the next study's design and measures.

🧩 Purpose 5: Complementarity

Different methods assess different facets of a phenomenon, yielding an enriched, elaborated understanding.

How it works:

  • Seeks elaboration, enhancement, illustration, and clarification of findings from one method with results from the other.
  • Focuses on related but distinct aspects of the same phenomenon or relationship.
  • Measures overlapping but also different facets.

Key distinction from corroboration:

  • Corroboration: Same phenomenon measured with different methods
  • Complementarity: Different but related aspects of same phenomenon measured

Design approach:

  • Generally pursued using concurrent mixed-methods design (both studies conducted simultaneously).
  • Sequential design also possible.
  • Research questions should address related aspects of the same phenomenon.

Example scenario: A study examines environmental issue supporters' self-evaluations. Qualitative study (study 1) develops theory about how supporters evaluate themselves (self-assets and self-doubts). Quantitative study (study 2) derives profiles of supporters and shows how profiles relate to actual behaviors. Together, they investigate complementary aspects: how experiences influence self-evaluations and why these evaluations matter for predicting actions.

🧩 Purpose 6: Completeness

Provides a holistic view of the phenomenon that cannot be achieved by one approach only.

How it works:

  • Not used to confirm existing data but to ensure a complete picture is obtained.
  • The complete picture is more meaningful than findings from either method alone.
  • Uses different theoretical views to study all aspects of a phenomenon.

Metaphor: Think of putting a puzzle together—solving the puzzle is best achieved when different pieces are presented at the same time.

Design approach:

  • Generally conducted using concurrent mixed-methods approach.
  • Conducting both studies simultaneously enables researchers to discover different aspects and integrate them.

Completeness in qualitative and quantitative traditions:

  • Qualitative: Allows recognition of multiple realities.
  • Quantitative: Ensures all theoretical and substantive components come together.
  • Mixed-methods: Combines both perspectives for holistic understanding.

Example scenario: A study explores why leaders resist team-based empowerment in Veterans Health Administration. Quantitative longitudinal data test the hypothesis that high-status leaders are less effective in implementing empowerment. Qualitative interview data develop theory about the processes underlying why high-status leaders resist. Together, they provide a complete picture of both the effect and the underlying processes.

🌐 Purpose 7: Expansion

Extends the breadth and range of the phenomenon by using different methods for different components.

How it works:

  • Seeks to enrich understanding obtained in one strand with findings from the next strand.
  • Different methods address different components of the phenomenon.
  • Extends beyond initial findings to explore new dimensions.

Design approach:

  • Typically uses sequential mixed-methods design.
  • Qualitative study expands on quantitative findings or vice versa.

Common application:

  • In evaluation studies: Quantitative methods assess program outcomes; qualitative methods assess program processes.

Example scenario: A study examines social performance of hybrid organizations. First, quantitative survey data test the paradoxical relationship between social imprinting and social performance. Second, qualitative comparative case studies explore how organizations might resolve the paradox tested in the first study. The qualitative study expands to a different aspect (resolution strategies) of the same phenomenon.

🔗 Connecting Purposes to Design

🔗 Importance of explicit purpose

  • Researchers should clearly specify and acknowledge the rationale for using mixed-methods.
  • The seven purposes serve as a conceptual tool to systematically identify the most appropriate research design.
  • Identifying research purpose helps researchers refine their research questions.

Example of purpose-question alignment:

  • If purpose is developmental, the quantitative research question should be conditional on the qualitative research question or vice versa.

🎯 Purpose improves research quality

  • Examining research questions from the perspective of mixed-methods purpose clarifies the need for mixed-methods research.
  • Explicitly identifying purpose(s) improves likelihood of doing research with greater meaning.
  • More likely to lead to valuable implications.

Don't confuse: Purpose is not the question, not the design, not the methodology, not data collection or analysis—purpose is the focus on the reasons why the researcher is undertaking the study.

5

Basic Strategies for Mixed-Methods Research

CHAPTER 5. Basic Strategies for Mixed-Methods Research

🧭 Overview

🧠 One-sentence thesis

The purpose of mixed-methods research should drive design decisions across five key properties—design investigation strategies, strands, time orientation, priority, and mixing strategies—to ensure the qualitative and quantitative components work together to answer research questions effectively.

📌 Key points (3–5)

  • Five design properties must be considered: design investigation strategies (exploratory/confirmatory/both), strands (monostrand/multistrand), time orientation (concurrent/sequential), priority (equal/dominant-less dominant), and mixing strategies (fully/partially mixed).
  • Purpose drives design: the research purpose (corroboration, complementarity, compensation, developmental, expansion, diversity, or completeness) determines which combination of design properties is most appropriate.
  • Common confusion—concurrent vs. sequential: concurrent designs collect qualitative and quantitative data simultaneously; sequential designs collect one type first and use results to inform the next strand.
  • Monostrand vs. multistrand distinction: monostrand designs involve data conversion (quantitizing or qualitizing) within a single phase; multistrand designs contain at least two separate research strands with distinct conceptualization-experiential-inferential stages.
  • Integration timing matters: fully mixed designs integrate qualitative and quantitative approaches at all stages; partially mixed designs integrate at specific stages (often experiential or inferential).

🔬 Design Investigation Strategies

🔍 Exploratory design

Exploratory design: researchers seek to develop or generate a new theory.

  • Research questions need not be specified in advance; they unfold as researchers learn about the phenomenon.
  • Example: an exploratory study used ethnographic observations and interviews to develop a survey examining graduate engineering student retention.
  • Don't confuse: exploratory questions can appear in quantitative research, just as confirmatory questions can appear in qualitative research.

✅ Confirmatory design

Confirmatory design: researchers test an existing theory using hypotheses established a priori.

  • Research questions are generally determined at the beginning.
  • Follows traditional scientific method: theory → hypotheses → data analysis.
  • Example: a study tested the hypothesis that changes in users' beliefs and attitudes toward a system over time could explain continued use intentions.

🔄 Combined exploratory and confirmatory

  • The most common type from the design investigation perspective.
  • One strength of mixed-methods research is addressing both exploratory and confirmatory questions in the same study or research program.
  • Example: a study tested the hypothesis that higher-status physician leaders are less successful at implementing team-based empowerment, then explored why and how leaders facilitated or obstructed implementation.

📊 Strands of the Study

🧵 What strands encompass

Strands or phases: three stages—(1) conceptualization (theoretical foundations, purposes, research methods); (2) experiential (data collection and analysis); (3) inferential (data interpretation and application).

  • Mixed-methods designs are classified by strands: monostrand or multistrand.
  • This property determines how many phases of the conceptualization-experiential-inferential process occur.

🔄 Monostrand design

Monostrand design: a single phase of the conceptualization-experiential-inferential process, yet the study has both qualitative and quantitative components.

  • Also called quasi-mixed design because only one type of data is analyzed and only one type of inference is made.
  • Quantitizing: converting qualitative data to quantitative data (e.g., coding interview themes and counting frequencies).
  • Qualitizing: converting quantitative data to qualitative data (e.g., creating profiles from numeric data).
  • Mixing takes place in the experiential stage when data are transformed.

Limitations of monostrand conversion designs:

  • Numerical data may not capture the complexity of qualitative data.
  • Quantitizing may cause loss of depth and flexibility.
  • Qualitizing may overgeneralize numeric data or yield unrealistic profiles.
  • Quantitized data are vulnerable to multicollinearity problems.

Example: a study gathered abstracts (qualitative data), then quantitized them by assigning "1" to developed abstracts and "0" to underdeveloped ones, allowing correlation with other variables.

🔗 Multistrand design

Multistrand design: contains at least two research strands.

  • Can mix qualitative and quantitative components in or across all stages.
  • May also contain data conversion.
  • Most published mixed-methods articles adopt this design.
  • Requires decisions about time orientation, mixing strategies, and priority of methodological approaches.

⏱️ Time Orientation

⏩ Concurrent mixed-methods design

Concurrent design: conducting the qualitative and quantitative components at the same time.

  • Researchers gather both data types, analyze them separately, then compare results.
  • Used to validate one form of data with another, transform data for comparison, or address different types of questions.

Three types of concurrent designs:

🔺 Concurrent triangulation design

  • Both components conducted concurrently in the same phase.
  • Usually given equal weight (though not always).
  • Example: a study of team-based empowerment implementation collected qualitative and quantitative data concurrently with equal status; quantitative strand tested hypotheses about leader status effects while qualitative strand explored identity differences as underlying explanations.

🪆 Concurrent nested design

  • Both components conducted simultaneously, but one form receives more weight.
  • The lower-priority strand is embedded to provide a supporting role.
  • Mixing usually occurs during the analysis/experiential phase.
  • Example: a psychological contract violation study conducted a two-stage survey (quantitative dominant) with open-ended responses (qualitative nested) to identify how employees experienced violations.

🌐 Concurrent transformative design

  • Combines features of both triangulation and nested designs.
  • Uses a specific theoretical perspective based on ideologies such as critical theory, advocacy, or participatory research.
  • The perspective reflects researchers' personal stance and is reflected in purpose or research questions.
  • Example: a study examining disability and transition to higher education used a transformative emancipatory paradigm to investigate life experiences of marginalized groups.

Challenges of concurrent designs:

  • Complexity of running multiple strands simultaneously.
  • Requires considerable expertise to examine the same phenomenon using two different approaches.
  • Involves integrating findings from simultaneous data analysis into coherent inferences.
  • Best achieved using a collaborative team approach.

⏭️ Sequential mixed-methods design

Sequential design: at least two strands that occur chronologically.

  • Researchers first collect and analyze one form of data, then use conclusions to formulate research questions for the next strand.
  • Final inferences based on results of both strands.

📊➡️📝 Sequential (quantitative-qualitative) design

  • Collect and analyze quantitative data first, then qualitative data to explain or elaborate.
  • Rationale: quantitative data provide general understanding; qualitative data refine and explain by drawing on participants' views in depth.
  • Advantages: easy to implement (clear separate stages); provides opportunities for deeper exploration.
  • Weaknesses: lengthy time for two phases; significant resources required.
  • Example: a study of work integration social enterprises first tested a paradox quantitatively (social imprinting enhanced social performance but indirectly weakened it through negative effects on economic productivity), then gathered qualitative data to explore how organizations could resolve this paradox.

📝➡️📊 Sequential (qualitative-quantitative) design

  • Begins with qualitative data collection and analysis.
  • Building on qualitative findings, researchers conduct a developmental phase (e.g., generating new variables, designing instruments, developing intervention plans).
  • Researchers interpret how quantitative results build on initial qualitative findings.
  • Example: a study gathered interview data to identify critical customer-supplier obligations in IT outsourcing, then used these obligations to develop a research model tested quantitatively using hierarchical regression.

🌐 Sequential transformative design

  • Similar to concurrent transformative design but sequential.
  • Can use either qualitative or quantitative as first strand.
  • Must use a theoretical perspective to guide the study.
  • The theoretical perspective is more important than methods alone.
  • Example: a study of health security for disaster resilience in Bangladesh used capability approach and dominant paradigm perspectives, employing various qualitative and quantitative techniques sequentially to incorporate different perspectives.

⚖️ Priority of Methodological Approach

🟰 Equal status design

Equal status design: researchers place equal weight on both qualitative and quantitative strands.

  • In concurrent design, must be implemented using concurrent triangulation.
  • In sequential design, equal weight can be given regardless of whether the design is exploratory or explanatory.
  • Example: a study tested a paradox quantitatively, then explored resolution qualitatively, emphasizing both strands equally.

⚖️ Dominant-less dominant status design

Dominant-less dominant status design: researchers conduct a study in a single dominant strand with a small component from an alternative strand.

Two types:

📊 Quantitative-dominant design

  • Quantitative data provide main view; qualitative data fill gaps.
  • May be preferable for those still learning qualitative methods.
  • Example: a psychological contract violation study used quantitative data (dominant) to test relationships, with qualitative data (less dominant) from two open-ended questions to identify how employees experienced violations.

📝 Qualitative-dominant design

  • Researchers rely on qualitative data to answer questions while recognizing that quantitative data can strengthen some parts of the study.
  • Example: an ethnographic study used qualitative data as primary, but also used quantitative techniques to test predictions derived from qualitative analysis (e.g., regressing centralization scores on day of operation to test for linear and curvilinear trends).
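The quantitative step in this ethnographic example, testing linear and curvilinear trends, can be sketched as follows. The centralization scores here are simulated, not the study's data; the comparison simply checks whether a quadratic term improves the fit.

```python
# Simulated sketch: regress a score on day of operation and compare
# linear vs. quadratic (curvilinear) fits by residual sum of squares.
import numpy as np

rng = np.random.default_rng(0)
day = np.arange(1, 21)                                         # day of operation
score = 0.05 * (day - 10) ** 2 + rng.normal(0, 0.3, day.size)  # U-shaped trend

lin = np.polyfit(day, score, 1)       # linear model
quad = np.polyfit(day, score, 2)      # quadratic model
rss_lin = float(np.sum((score - np.polyval(lin, day)) ** 2))
rss_quad = float(np.sum((score - np.polyval(quad, day)) ** 2))
print(f"linear RSS = {rss_lin:.2f}, quadratic RSS = {rss_quad:.2f}")
```

Because the quadratic model nests the linear one, its RSS can only be lower or equal; an F-test on the RSS difference would decide whether the curvilinear term is warranted.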

🔀 Mixing Strategies

🌀 Fully mixed-methods design

Fully mixed design: mixing of qualitative and quantitative approaches occurs in an interactive manner in all stages of the study.

  • Commonly associated with equal status designs in which both strands are conducted concurrently.
  • At each stage, qualitative and quantitative strands are intertwined.
  • Side-by-side comparison of research questions, data, findings, and inferences.

Example of fully mixed design:

  • A study examined affect-creativity relationship using daily diaries.
  • Conceptualization stage: quantitative and qualitative research questions were related (e.g., "Is there a positive or negative linear relationship?" alongside exploring temporal patterns).
  • Experiential stage: quantitative and qualitative data collected concurrently; qualitative data were quantitized and analyzed.
  • Inferential stage: results integrated and interpreted simultaneously.

🧩 Partially mixed-methods design

Partially mixed design: qualitative and quantitative approaches are not integrated across all stages of the study.

  • Mixing takes place at specific stages (most commonly experiential or inferential stages).
  • Popular because well-described in literature and less demanding than fully mixed.
  • Example: a study tested hypotheses quantitatively about social imprinting and economic productivity, then examined mechanisms qualitatively; integration occurred at the inferential stage when interpreting how findings from both strands related.

🎯 Linking Purpose to Design Decisions

🗺️ Decision model overview

The purpose of mixed-methods research plays a major role in guiding design decisions across three properties: time orientation, priority of methodological approach, and mixing strategies.

Why two properties are excluded from the decision model:

  1. Design investigation strategies: researchers should address both exploratory and confirmatory questions (a primary strength of mixed-methods).
  2. Strands/phases: researchers should adopt multistrand design to fully harness advantages; monostrand conversion should only be used if it brings additional value.

🔗 Purpose-to-design recommendations

| Purpose | Recommended time orientation | Rationale |
| --- | --- | --- |
| Corroboration | Concurrent | Both strands executed independently and concurrently so that their biases counteract one another |
| Complementarity | Concurrent | Allows measuring overlapping but different aspects simultaneously; findings are independent and integrated at the inferential stage |
| Compensation | Sequential | The next strand overcomes limitations of the previous strand; some issues may not be known until the first strand concludes |
| Developmental | Sequential | Results from one approach develop research questions for the other approach |
| Expansion | Sequential | One study expands on initial findings from the other |
| Diversity | Either concurrent or sequential | Diverse findings can emerge at any stage |
| Completeness | Either concurrent or sequential | Concurrent shortens data collection time; sequential allows using first-strand findings to develop second-strand questions |

📋 Design option tables

For corroboration, complementarity, or completeness purposes:

  • Partially mixed concurrent equal status design
  • Partially mixed concurrent dominant status design
  • Fully mixed concurrent equal status design

For compensation, developmental, or expansion purposes:

  • Partially mixed sequential equal status design
  • Partially mixed sequential dominant status design
  • Fully mixed sequential equal status design

For diversity or completeness purposes:

  • Any of the above options

⚠️ Flexibility in design choices

The possible design options are recommendations based on the most likely scenarios, not exhaustive rules. Researchers can use other designs not listed if they can justify why such a design is useful and appropriate for the given context.

Example of flexibility: although complementarity is best achieved using concurrent design, a study examining support of environmental issues used a partially mixed sequential equal status design (qualitative followed by quantitative) to investigate two complementary aspects—how everyday experiences influenced self-evaluations (study 1) and why these self-evaluations mattered through predicting actions (study 2).

📚 Illustrative Examples by Purpose

💰 Compensation purpose example

A study determined differences in ethical decision-making of U.S. vs. non-U.S. leaders using a partially mixed sequential dominant status design:

  • Quantitative study (dominant) followed by qualitative interviews (less dominant).
  • Qualitative data compensated for small quantitative sample size.
  • Mixing at interpretation stage: no evidence of difference in profiles, but qualitative themes indicated incompatibility between luxury strategy values and responsible leadership principles.

🔍 Corroboration purpose example

A study of organizational identity and stakeholder relations used a fully mixed concurrent equal status design:

  • Qualitative data projected general perceptions; quantitative data measured identity orientation.
  • At conceptualization: research questions informed one another.
  • At experiential: data converted both ways (quantitized and qualitized) and analyzed simultaneously with equal weight.
  • At inferential: findings complemented each other; interpretations were interdependent.

🌈 Diversity purpose example

A study of enterprise information systems used a partially mixed sequential equal status design:

  • Qualitative strand investigated operational scope and success factors.
  • Quantitative strand identified differences between IT and general management perceptions.
  • Also had developmental element: case study results developed questionnaire.
  • Mixing at interpretation: despite interview-phase divergence, quantitative comparison showed similar perceptions with few significant differences.

🔨 Developmental purpose example

A study of outsourcing obligations used a partially mixed sequential (qualitative-quantitative) equal status design:

  • Qualitative study identified customer-supplier obligations.
  • Quantitative study assessed impact of fulfilling obligations on success.
  • Both approaches given equal weight.
  • Mixing at interpretation: results showed existence of psychological contract and that fulfilling obligations explained significant variance in outsourcing success.

🧩 Complementarity purpose example

A study of women's multiple roles used a partially mixed concurrent equal status design:

  • Qualitative study explored transfer of skills from personal roles to managerial role.
  • Quantitative study examined relationship from different perspective.
  • Both data types collected simultaneously using different samples.
  • Both strands received equal weight.
  • Mixing at interpretation: both studies supported role accumulation perspective that multiple roles can be enriching rather than depleting.

🎯 Completeness purpose example

A study of team-based empowerment barriers used a partially mixed concurrent equal status design:

  • Each strand investigated different aspects of the same phenomenon.
  • Quantitative and qualitative data collected independently but simultaneously.
  • Each strand received equal weight.
  • Mixing at interpretation: leader status supported as moderator quantitatively; qualitative analysis provided evidence of identity and delegation as mediators explaining the "why" and "how."

📈 Expansion purpose example

A study of social imprinting and social performance used a partially mixed sequential equal status design:

  • Quantitative approach tested existence of hypothesized paradox.
  • Qualitative approach explored how organizations could resolve the paradox.
  • Sequential (quantitative-qualitative) with equal emphasis.
  • Mixing at interpretation: social imprinting enhanced social performance but indirectly weakened it through negative relationship with economic productivity; qualitative findings suggested partial unfreezing of social imprint as critical condition to overcome paradox.

CHAPTER 6. Mixed-Methods Data Collection Strategies

🧭 Overview

🧠 One-sentence thesis

Mixed-methods data collection requires researchers to integrate qualitative and quantitative sampling designs and data sources in ways that determine what types of generalizations—statistical, analytical, or case-to-case—can be legitimately drawn from the study.

📌 Key points (3–5)

  • What mixed-methods sampling involves: selecting samples for both qualitative and quantitative strands, considering schema (how to select) and size (how many), with each strand potentially using different approaches.
  • How sampling determines generalization: probability sampling supports statistical generalization to populations; purposive sampling supports analytical generalization to theory or case-to-case transfer.
  • Common confusion—sampling alignment: the relationship between qualitative and quantitative samples can be identical (same members), parallel (different members, same population), nested (one subset of the other), or multilevel (different levels like organization vs. individual).
  • Why sampling decisions matter: sampling designs directly shape the types of inferences (meta-inferences) researchers can make and the scope of generalizability.
  • Data collection integration: qualitative methods (interviews, focus groups, observation) yield language/text data for depth; quantitative methods yield numerical data for breadth; mixed-methods research combines both to answer research questions more fully.

🎯 Sampling fundamentals in qualitative and quantitative research

🎯 What sampling determines

  • Sampling is a plan for selecting participants or units from a population.
  • It directly affects inference quality: how confidently researchers can draw conclusions.
  • Two core decisions:
    • Sample size: "How many participants do I need?"
    • Sampling schema: "How do I select these members?"
  • Don't confuse: schema (the method of selection) vs. size (the number selected).

🔀 Probability vs. non-probability sampling

| Dimension | Probability (random) | Non-probability (non-random) |
| --- | --- | --- |
| Selection principle | Every unit has an equal chance of being chosen | Subjective methods; not all units have an equal chance |
| Primary goal | Statistical generalizability to a defined population | Insight, depth, or theoretical understanding |
| Typical use | Quantitative research (but can be used in qualitative) | Qualitative research (but can be used in quantitative) |
| Example methods | Simple random, systematic, stratified, cluster, multistage | Convenience, purposive (modal, expert, snowball, criterion, quota, etc.) |
  • Example: Simple random sampling gives every individual an equal chance; systematic sampling picks every kth member from a list.
  • Example: Purposive sampling targets specific groups (e.g., IT professionals with 3+ years experience) or aims for diversity (heterogeneous sampling).
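The simple random and systematic schemas in the first example can be sketched with Python's standard library; the sampling frame below is hypothetical.

```python
# Hypothetical frame of 300 employees; draw 30 by two probability schemas.
import random

population = [f"employee_{i:03d}" for i in range(1, 301)]
random.seed(42)  # fixed seed so the sketch is reproducible

# Simple random sampling: every member has an equal chance of selection.
srs = random.sample(population, k=30)

# Systematic sampling: every k-th member after a random start.
k = len(population) // 30          # sampling interval (here, 10)
start = random.randrange(k)        # random start within the first interval
systematic = population[start::k]

print(len(srs), len(systematic))
```

Systematic sampling is only as random as the list order: if the frame has a periodic pattern matching the interval, the sample can be biased even though selection looks mechanical.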

📏 Sample size guidelines

Sample size: the number of participants recruited for a study.

  • Quantitative research: based on probability computations (power analysis, effect size).
    • Example: t-test or ANOVA needs ~30 per cell for 80% power; regression needs ~50; factor analysis ~300.
    • Required sample sizes for structural equation models range from 30 (a simple one-factor CFA) to 460 (a complex two-factor CFA with weaker loadings).
  • Qualitative research: based on expert opinion and saturation (sampling until no new information emerges).
    • Example: case study 3–5 participants; phenomenology 6–10 interviews; grounded theory 15–20 cases; ethnography 30–50 interviews.
  • Don't confuse: quantitative size is about statistical power; qualitative size is about depth and saturation.
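The "per cell" rules of thumb above follow from power computations. The sketch below uses the normal approximation n ≈ 2((z₁₋α/₂ + z_power)/d)² per group, an assumption for illustration; exact power analyses use the noncentral t distribution and give slightly larger numbers.

```python
# Per-group n for a two-sample t-test, normal approximation:
# n ≈ 2 * ((z_{1-alpha/2} + z_{power}) / d)^2, where d is Cohen's d.
from math import ceil
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    z = NormalDist()
    return ceil(2 * ((z.inv_cdf(1 - alpha / 2) + z.inv_cdf(power)) / d) ** 2)

# A large effect (d = 0.8) lands near the "~30 per cell" rule of thumb;
# a medium effect (d = 0.5) needs considerably more per group.
print(n_per_group(0.8), n_per_group(0.5))  # → 25 63
```

The formula makes the trade-off explicit: halving the expected effect size roughly quadruples the required sample per group.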

🧩 Generalization types and their sampling requirements

🧩 Three types of generalization

Statistical generalization: making inferences from a representative sample to the population from which it was drawn.

Analytical generalization: making inferences to wider theory based on how selected cases fit with general constructs.

Case-to-case transfer: making generalizations from one case to another similar case.

  • Large, random samples → statistical generalization.
  • Small, purposive samples → analytical generalization or case-to-case transfer.
  • Example: A random sample of 300 employees allows statistical claims about the entire workforce; a purposive sample of 10 expert managers allows theoretical insights about leadership mechanisms.

🔍 When generalization is appropriate

The excerpt cites Lee and Baskerville's four judgment calls for generalizing a theory to a new setting without retesting:

  1. Uniformity of nature: similar situations will produce similar effects in the future.
  2. Sufficient similarity: the original and new settings are sufficiently alike in relevant conditions.
  3. Successful identification of variables: the theory already contains all significant variables.
  4. Theory is true: one accepts the theory as valid in the new context.
  • Implication: sampling strategies must be designed carefully because they determine where and how a theory is tested, which in turn affects generalizability.

⚖️ Qualitative vs. quantitative sampling priorities

| Aspect | Qualitative | Quantitative |
| --- | --- | --- |
| Primary emphasis | Saturation (comprehensive understanding; sample until no new information) | Generalizability (representative of the population) |
| Focus | Scope and nature of the system | How many cases are needed to reliably represent the system |
| Trade-off | Depth over breadth | Breadth and minimizing false positives/negatives |
  • Don't confuse: qualitative saturation (information completeness) with quantitative power (statistical reliability).

🔗 Mixed-methods sampling strategies

🔗 Why mixed-methods sampling is complex

  • Researchers must make sampling decisions for both qualitative and quantitative strands.
  • Goal: simultaneously increase inference quality (depth) and generalizability (breadth).
  • Often requires two types of samples: probability (for statistical generalization) and purposive (for analytical generalization).

🗂️ Four mixed-methods sampling designs

  1. Basic sampling designs: use any of the standard qualitative or quantitative schemas (Tables 6-1, 6-2).
  2. Sequential sampling designs: findings from one phase drive the sample selection for the next phase.
    • Identical: same sample members in both strands.
    • Parallel: different samples from the same population.
    • Nested: one sample is a subset of the other.
    • Multilevel: samples from different levels (e.g., organization and individual).
  3. Concurrent sampling designs: separate qualitative and quantitative components run simultaneously; results are triangulated to confirm or cross-validate findings.
    • Can also be identical, parallel, nested, or multilevel.
  4. Multiple sampling designs: use more than one sampling technique (e.g., simple random for quantitative, expert sampling for qualitative).
  • Example: Stewart et al. used nested concurrent sampling—qualitative interviews with a subset of the quantitative survey participants.
  • Example: Koh et al. used parallel sequential sampling—different managers from the same professional associations for qualitative interviews and quantitative surveys.

📊 Generalization quadrants in mixed-methods research

The excerpt presents a two-dimensional framework (Figure 6-1) showing how sampling designs determine meta-inferences:

| Quadrant | Qualitative strand | Quantitative strand | Meta-inference type |
| --- | --- | --- | --- |
| 1 | Non-probability | Non-probability | Purely analytical generalization / case-to-case transfer |
| 2 | Probability | Non-probability | Statistical (qual) + analytical (quant) |
| 3 | Non-probability | Probability | Analytical (qual) + statistical (quant) |
| 4 | Probability | Probability | Purely statistical generalization |
  • Implication: the choice of sampling designs in each strand constrains the types of inferences researchers can legitimately claim.

✅ Five criteria for mixed-methods sampling decisions

Collins (2010) proposed:

  1. Reflect the dependent relationship between samples and the time orientation (concurrent or sequential).
  2. Reflect the relationship between qualitative and quantitative samples (identical, parallel, nested, multilevel).
  3. Reflect the relationship between sampling schemes and the chosen type of generalization.
  4. Reflect the relationship between varying types of data collected and how they address research questions.
  5. Identify the relationship between emphasis of approach (dominant, less-dominant, equal) and the formulation of appropriate meta-inferences.
  • Example: If the goal is statistical generalization from the quantitative strand and deep understanding from the qualitative strand, use probability sampling for the former and purposive sampling for the latter, ensuring the relationship (e.g., nested) aligns with the research questions.

🛠️ Qualitative and quantitative sampling guidelines

🛠️ Six criteria for qualitative sampling

Curtis et al. (2000) and Miles et al. (2020) recommend:

  1. Relevance: sampling strategy should align with the conceptual framework and research questions.
  2. Rich information: sample should generate deep data on the phenomenon.
  3. Generalizability: enhance analytical generalization (fit with theory) rather than statistical generalization.
  4. Believability: produce credible descriptions and explanations.
  5. Ethics: ensure informed consent, assess benefits/risks, protect participants.
  6. Feasibility: consider resource costs (money, time), accessibility, and researcher competency.
  • Example: If studying expert decision-making, purposively sample experienced managers (relevance, rich information) and ensure they can consent and are accessible (ethics, feasibility).

🛠️ Six principles for quantitative sampling

  1. Theory-driven: sampling choices must align with research questions and objectives.
  2. Design first: study design (e.g., experimental, correlational) should be planned before sampling decisions.
  3. Detailed planning: minimum/appropriate sample size depends on careful planning of all research stages.
  4. Good technique: describe the population, list members, identify sampling type, determine size, test representativeness.
  5. Statistical generalization: aim to apply variable relationships to a general population.
  6. (Implicit) Power and precision: ensure adequate sample size for detecting effects and minimizing error.
  • Example: For a regression study, plan the design, identify the population, use simple random sampling, calculate required sample size (~50), and verify the sample represents the population.

📦 Data collection methods in mixed-methods research

📦 Qualitative data collection methods

Five qualitative approaches (Creswell & Poth, 2018):

  • Biography: narrative study of lives.
  • Phenomenology: study of structures of consciousness from the first-person perspective.
  • Grounded theory: inductive methodology for theory construction from data.
  • Ethnography: direct and sustained social contact with agents in their context.
  • Case study: empirical inquiry into a contemporary phenomenon in its real-life context.

Common qualitative data sources:

  • Qualitative surveys (open-ended questions).
  • Open-ended interviews.
  • Focus groups.
  • Observation.
  • Secondary data (documents, visual/audio data, archival data).

Purpose of qualitative data gathering: provide evidence tied to the phenomenon being investigated, typically in the form of words or texts.

  • Strength: deep understanding of participants' experiences.
  • Challenge: purposive sampling is most common because probability sampling is difficult when focusing on experience data.
  • Don't confuse: qualitative data (language, text) with quantitative data (numerical).

📦 Quantitative data collection methods

  • The excerpt does not detail quantitative methods but implies they focus on numerical data for breadth and statistical analysis.
  • Example: surveys with closed-ended questions, experiments with measured outcomes, secondary numerical datasets.

📦 Integrating qualitative and quantitative data

  • Mixed-methods research: combines language data (qualitative) and numerical data (quantitative) to answer research questions more fully.
  • Researchers must consider:
    • What types of data are needed (language vs. numerical).
    • What sources will best answer the research questions.
    • How to align data collection with the sampling design (concurrent, sequential, nested, etc.).
  • Example: Stewart et al. collected longitudinal quantitative data (224 providers) and in-depth qualitative interviews (30 team members) concurrently to explore barriers to team-based empowerment.
  • Example: Koh et al. collected qualitative interviews first (15 managers) to inform the design of a quantitative survey (370 managers) sequentially.

📦 Strengths and weaknesses of qualitative methods

The excerpt references Table 6-4 (not fully reproduced) summarizing strengths and weaknesses of qualitative methods:

  • Major strength: ability to provide much deeper understanding by asking participants about their experiences.
  • Challenge: sometimes difficult to use probability sampling; purposive sampling is most common.
  • Implication: qualitative methods excel at depth and context but may sacrifice breadth and statistical generalizability.

🧪 Mixed-methods sampling examples

🧪 Example 1: Stewart et al. (2017)—concurrent nested design

  • Design: partially mixed concurrent equal status.
  • Goal: explore barriers to team-based empowerment in Veterans Health Administration.
  • Sampling relationship: nested—qualitative sample was a subset of the quantitative sample.
  • Quantitative strand:
    • Longitudinal quasi-experimental design.
    • Convenience sampling: all providers in 21 geographical divisions who worked before the intervention.
    • Sample size: 224 providers.
  • Qualitative strand:
    • In-depth interviews during and one year after intervention.
    • Purposive sampling: team members in the same geographical division.
    • Sample size: 30 participants (8 providers, 10 care managers, 6 clinical associates, 6 administrative associates).
  • Implication: nested design allowed triangulation of quantitative trends with qualitative insights from a subset of the same population.

🧪 Example 2: Koh et al. (2004)—sequential parallel design

  • Design: partially mixed sequential equal status.
  • Goal: study IT outsourcing in Singapore.
  • Sampling relationship: parallel—different samples from the same population.
  • Qualitative strand (first):
    • Criterion sampling: managers with 3+ years outsourcing experience from IT Management Association (ITMA) and Singapore IT Federation (SITF).
    • Sample size: 15 managers (9 customer, 6 supplier).
  • Quantitative strand (second):
    • Simple random sampling: all ITMA and SITF members, excluding qualitative participants to avoid primacy effects.
    • Sample size: 370 managers (179 customer, 191 supplier) from 158 organizations.
  • Implication: qualitative findings informed the quantitative survey design; parallel sampling ensured both strands drew from the same professional population without overlap.

Note: The excerpt is a chapter on mixed-methods data collection strategies. It emphasizes that sampling decisions—schema, size, and the relationship between qualitative and quantitative samples—determine the types of generalizations (statistical, analytical, case-to-case) researchers can make. Mixed-methods research requires careful integration of both strands to maximize inference quality and generalizability. The chapter provides detailed guidelines for qualitative, quantitative, and mixed-methods sampling, and illustrates these principles with two empirical examples.


CHAPTER 7. Qualitative and Quantitative Data Analysis Strategies

🧭 Overview

🧠 One-sentence thesis

Mixed-methods research requires careful integration of qualitative and quantitative data collection strategies, guided by research questions and purposes, to achieve complementary strengths and non-overlapping weaknesses across methods.

📌 Key points (3–5)

  • Five qualitative approaches: biography, phenomenology, grounded theory, ethnography, and case study—all aim to provide evidence tied to the phenomenon by uncovering experiential accounts.
  • Common qualitative sources: qualitative surveys, open-ended interviews, focus groups, observations, and secondary data; each has distinct strengths (depth, exploration) and weaknesses (time-consuming, generalizability challenges).
  • Common quantitative sources: quantitative surveys, experiments (laboratory, field, natural), quantitative observations, quantitative interviews, and secondary data; strengths lie in standardization and replicability, but outcomes are limited by structured formats.
  • Common confusion—intramethod vs. intermethod mixing: intramethod uses a single method with both qualitative and quantitative components (e.g., open- and closed-ended items on one survey); intermethod combines different approaches (e.g., surveys + interviews).
  • Fundamental principle: methods should be mixed to have complementary strengths and non-overlapping weaknesses, with sampling and data collection strategies aligned to research questions and purposes.

🔍 Qualitative data collection methods

🔍 Five qualitative approaches

The excerpt identifies five types of qualitative approaches:

| Approach | Definition |
| --- | --- |
| Biography | Narrative study of lives |
| Phenomenology | Study of structures of consciousness as experienced from the first-person perspective |
| Grounded theory | Inductive methodology providing systematic guidelines for gathering, synthesizing, analyzing, and conceptualizing qualitative data for theory construction |
| Ethnography | Methodology involving direct and sustained social contact with agents |
| Case study | Empirical inquiry investigating a contemporary phenomenon within its real-life context |
  • Regardless of approach, the purpose is to provide evidence tied to the phenomenon being investigated.
  • Data are not on the surface; researchers must dig below to unearth experiential accounts.
  • Don't confuse: qualitative data and induction are often seen as interpretivist, but induction can also be positivist; inductive methods can use literature as data (top-down induction).

📋 Common qualitative data sources

The most common sources are:

  • Qualitative surveys: gain in-depth information about reasoning and motivations.
  • Open-ended interviews: allow participants to describe what is important in their own words; researchers can ask follow-up questions.
  • Focus groups: capitalize on group interaction to explore ideas and tap into different forms of communication.
  • Qualitative observations: directly see what people do in natural or structured settings without relying on self-reports.
  • Secondary data: personal and official documents, visual/audio data, archival data—useful for corroboration and studying past time periods.

⚖️ Strengths and weaknesses of qualitative methods

Each method has trade-offs (adapted from Table 6-4):

  • Qualitative surveys:

    • Strengths: reach broader audience, inexpensive (especially online), perceived anonymity.
    • Weaknesses: require more participant time/effort, no follow-up questions, vague answers possible.
  • Open-ended interviews:

    • Strengths: in-depth information, high credibility, useful for exploration and confirmation.
    • Weaknesses: time-consuming, reactive to personalities, difficult to generalize with small samples.
  • Focus groups:

    • Strengths: explore ideas, capture content quickly, allow participants to respond to each other.
    • Weaknesses: can be expensive, may be dominated by one or two participants, confidentiality concerns.
  • Qualitative observations:

    • Strengths: directly see behavior, good for description, access to contextual factors.
    • Weaknesses: reason for behavior unclear, possible observer bias, cannot observe large populations.
  • Secondary data:

    • Strengths: insight into past time periods, useful for corroboration and exploration.
    • Weaknesses: may be incomplete, access difficult, low interpretive validity.

🗣️ Interview designs

Three common forms:

  • Informal conversational interviews: spontaneous questions in natural interaction, typically during participant observation.
  • General interview guide approach: more structured than conversational but wording depends on the researcher.
  • Standardized open-ended interviews: identical questions for all participants but worded to elicit open-ended responses.

👥 Focus groups characteristics

"A form of group interview that capitalises on communication between research participants in order to generate data."

Five major characteristics:

  1. A small group of people
  2. Who possess certain characteristics
  3. Provide qualitative data
  4. In a focused discussion
  5. To help understand the topic of interest
  • The idea: group processes help people explore and clarify views in ways less accessible in one-on-one interviews.

👁️ Observational methods

  • Described as "naturalistic research."
  • Involve systematic, detailed observation of behavior and talk in natural environments.
  • Observers may adopt a "participant observer" role, becoming involved while observing.
  • Goal: minimize impact on the environment, remain objective, avoid contaminating data with preconceptions.
  • Note: the Hawthorne effect (participants changing behavior when observed) cannot be ruled out.

📂 Secondary data sources

"Data that were originally recorded or 'left behind' or collected at an earlier time by someone other than the current researcher, often for an entirely different purpose from the current research purpose."

  • May be internal (e.g., corporate annual reports) or external/personal (e.g., personal notes, photographs).
  • Archival research files: originally used for research, then stored for later use.
  • Social media data: offers massive, diverse content without intrusive data collection; volume continues to grow.

🔢 Quantitative data collection methods

🔢 Focus on numerical data

Quantitative methods gather numerical data using standardized techniques. Common types:

  • Quantitative surveys
  • Experiments (laboratory, field, natural)
  • Quantitative observations
  • Quantitative interviews
  • Secondary data

⚖️ Strengths and weaknesses of quantitative methods

From Table 6-5:

  • Quantitative surveys:
    • Strengths: good for measuring attitudes, inexpensive, quick turnaround, standardized questions (easy replication), high perceived anonymity, relatively easy data analysis.
    • Weaknesses: validation needed, must be brief, missing data issues, possible reactive effects
8

Mixed-Methods Data Analysis Strategies

CHAPTER 8. Mixed-Methods Data Analysis Strategies

🧭 Overview

🧠 One-sentence thesis

Mixed-methods data analysis involves separately analyzing qualitative and quantitative data using appropriate techniques for each, then integrating both types to yield meta-inferences that provide richer insights than either method alone could achieve.

📌 Key points (3–5)

  • Core requirement: Mixed-methods analysis requires competence in both qualitative and quantitative techniques, plus knowledge of how to integrate findings to generate high-quality meta-inferences.
  • Analysis must align with purpose: Data analysis strategies should be consistent with the study's purpose (e.g., completeness, complementarity) and address the research questions.
  • Six critical criteria: Number of data types analyzed, number of analysis types used, time sequence, priority of components, number of phases, and analysis orientation.
  • Common confusion: Integration vs. separate analysis—mixed-methods doesn't just mean doing both types separately; the key value comes from thoughtfully integrating findings.
  • Quality assessment is essential: Before considering mixed-methods quality, researchers must establish validity/reliability for both qualitative and quantitative strands using accepted criteria for each.

🎯 Core criteria for mixed-methods data analysis

📊 Number of data types analyzed

  • Traditional approach: Analyze qualitative data with qualitative methods, quantitative data with quantitative methods.
  • Conversion designs: Sequential analysis where one data type is converted to another (monostrand design).
    • Quantitizing: Transform qualitative data (e.g., narrative) into numeric form for statistical analysis.
    • Qualitizing: Transform quantitative data into narrative form for qualitative analysis.
  • Example: Content analysis of interviews followed by exploratory factor analysis of the coded themes (quantitizing).
  • Don't confuse: Conversion designs with multistrand designs—conversion involves only one original data type that gets transformed, while multistrand designs collect both types independently.
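As a concrete illustration of quantitizing, the sketch below converts coded interview themes into a binary indicator matrix whose theme frequencies can feed statistical analysis. The participants and theme labels are invented for illustration, not drawn from any study cited here.

```python
# Quantitizing sketch: convert coded interview themes (qualitative)
# into a binary indicator matrix (quantitative). Transcripts and
# theme labels below are hypothetical.

transcripts = {
    "participant_1": {"work_pressure", "family_support"},
    "participant_2": {"work_pressure"},
    "participant_3": {"family_support", "career_growth"},
}

themes = sorted({t for codes in transcripts.values() for t in codes})

# One row per participant, one 0/1 column per theme.
matrix = {
    pid: [1 if theme in codes else 0 for theme in themes]
    for pid, codes in transcripts.items()
}

# Theme frequencies across participants (simple descriptive statistics).
frequencies = {theme: sum(row[i] for row in matrix.values())
               for i, theme in enumerate(themes)}
print(frequencies)  # {'career_growth': 1, 'family_support': 2, 'work_pressure': 2}
```

The resulting matrix could then be passed to frequency counts, cross-tabulations, or (with enough cases) factor analysis, as the conversion-design example describes.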

🔢 Number of analysis types used

  • Researchers must use at least one qualitative and one quantitative analysis technique.
  • The decision depends on: (1) types of data collected; (2) purposes of the study.
  • Example: If the purpose is to explain why a phenomenon occurs, descriptive statistics are insufficient—need interviews/grounded theory plus causal analysis techniques (regression, SEM).

⏱️ Time sequence of analysis

Three main strategies:

  1. Concurrent: Analyze qualitative and quantitative data simultaneously.
  2. Sequential qualitative→quantitative: Qualitative analysis informs quantitative analysis.
  3. Sequential quantitative→qualitative: Quantitative analysis informs qualitative analysis.

🎭 Priority of analytical components

  • Equal-status design: Both qualitative and quantitative components receive equal emphasis.
  • Dominant-less dominant design: One component has greater priority.
  • If qualitative has significantly higher priority → qualitative-dominant mixed analysis.
  • If quantitative has significantly higher priority → quantitative-dominant mixed analysis.
  • Priority affects complexity of analysis—less priority typically means less sophisticated techniques for that component.

📈 Number of analytical phases

Seven phases proposed (not all required in every study):

  1. Data reduction: Reduce dimensionality (e.g., coding for qualitative, factor analysis for quantitative).
  2. Data display: Visually describe data (tables, diagrams, matrices).
  3. Data transformation: Use quantitizing or qualitizing methods.
  4. Data correlation/comparison: Correlate and compare quantitative with qualitative data.
  5. Data consolidation: Combine both types to create new/consolidated variables.
  6. Data comparison: Compare data from different sources.
  7. Data integration: Integrate both into a coherent whole.

🧩 Analysis orientation

Three approaches:

  1. Case-oriented: Focus on selected cases to analyze meanings, experiences, perceptions—goal is particularizing and analytical generalization.
  2. Variable-oriented: Identify probabilistic relationships among constructs treated as variables—goal is external generalization.
  3. Process-oriented: Evaluate processes/experiences over time, linking processes to variables and experiences to cases—combines case and variable approaches.

Don't confuse: Qualitative methods generally use case-oriented approaches, quantitative methods use variable-oriented approaches, but mixed-methods can combine both in process-oriented analysis.

✅ Validation in mixed-methods research

🔍 Validation in qualitative research

Design validity:

  • Descriptive validity: Accuracy of what is reported (events, behaviors, settings).
  • Credibility: Confidence in truth of findings—use prolonged engagement, peer debriefing, triangulation, member checks.
  • Transferability: Degree results can generalize to other contexts—use thick descriptions, theoretical sampling.

Analytical validity:

  • Theoretical validity: Extent theoretical explanation fits the data—use data collection over extended time, theory triangulation, pattern matching.
  • Dependability: Describe changes in setting and how they affect the study—use audit trail, stepwise replication, peer examination.
  • Consistency: Verify steps through examination of raw data, data reduction products, process notes.

Inferential validity:

  • Interpretive validity: Accuracy of interpreting participants' views, thoughts, feelings—use participant feedback, confirmability audit.
  • Confirmability: Degree results could be confirmed by others—use reflective journal, theory in single-case studies.

🔬 Validation in quantitative research

Measurement validity:

  • Reliability: Repeatability/consistency—use split-half method, compute inter-rater, test-retest, internal consistency reliability.
  • Construct validity: Degree inferences can be made from operationalizations to theoretical constructs—use multitrait-multimethod matrix, demonstrate convergent/discriminant validity.
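The inter-rater reliability strategy mentioned above can be sketched as Cohen's kappa for two coders who categorized the same items; the ratings below are hypothetical.

```python
# Inter-rater reliability sketch: Cohen's kappa for two coders who
# assigned the same items to categories. Ratings are hypothetical.

def cohens_kappa(rater_a, rater_b):
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    # Observed agreement: proportion of items both raters coded the same.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's marginal frequencies.
    p_e = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
              for c in categories)
    return (p_o - p_e) / (1 - p_e)

a = ["theme1", "theme1", "theme2", "theme2", "theme1", "theme2"]
b = ["theme1", "theme2", "theme2", "theme2", "theme1", "theme2"]
print(round(cohens_kappa(a, b), 3))  # → 0.667
```

Kappa corrects raw percent agreement for agreement expected by chance, which is why it is preferred over a simple match rate when coding schemes have few categories.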

Design validity:

  • Internal validity: Whether observed covariation reflects causal relationship—demonstrate temporal precedence, use randomization, control groups.
  • External validity: Whether cause-effect relationship holds across variations—use random selection/probability sampling.

Inferential validity:

  • Statistical conclusion validity: Validity of inferences about correlation between variables—assess statistical significance, Type I/II errors.

🎯 Quality criteria in mixed-methods research

Two domains:

Design quality:

  1. Design suitability: Are methods appropriate for answering research questions? Does design match purpose?
  2. Design fidelity: Are methods implemented with necessary quality and rigor?
  3. Within-design consistency: Do components fit together seamlessly?
  4. Analytical adequacy: Are analysis procedures appropriate and adequate?

Interpretive rigor:

  1. Interpretive consistency: Do results from one strand match results from another?
  2. Theoretical consistency: Are meta-inferences consistent with theory?
  3. Interpretive agreement: Are other scholars likely to reach same conclusions?
  4. Interpretive distinctiveness: Is each inference more credible than other possible conclusions?
  5. Integrative efficacy: Do meta-inferences adequately incorporate inferences from each strand?
  6. Interpretive correspondence: Do inferences meet stated purposes/questions?

⚠️ Threats to quality

🚨 Threats to internal validity/credibility

Research design stage:

  • Insufficient/biased knowledge of earlier studies and theories.
  • Contradictions in logic among research questions, theory, hypotheses, tests.

Data collection stage:

  • Observer-caused effects (subjects behave differently when observed).
  • Observer bias (insufficient data, interpretation gaps filled with researcher's values).
  • Researcher bias (personal biases/a priori assumptions not ruled out).
  • Data access limitations.
  • Serious reactivity (changes in responses from being conscious of participating).

Data analysis/interpretation stage:

  • Lack of descriptive, interpretive, explanatory, or theoretical legitimation.
  • Lack of generalizability.
  • Confirmation bias (interpretations overly congruent with a priori hypotheses).
  • Illusory correlation (identifying relationships that don't exist).
  • Causal error (providing causal explanations without verification).

🌍 Threats to external validity/credibility

Qualitative research:

  • Catalytic legitimation issues (degree study empowers research community).
  • Communicative legitimation (disagreement on knowledge claims).
  • Interpretive legitimation (extent interpretation represents group's perspective).
  • Population generalizability (tendency to over-generalize findings).

Quantitative research:

  • Selection bias (inadequate/non-random sample).
  • Interaction of history and treatment effect (conditions change over time).

🔗 Threats to mixed-methods quality

Design quality threats:

  • Design inconsistent with research questions/purpose.
  • Observations/measures lack validity.
  • Data analysis techniques insufficient/inappropriate.
  • Results lack necessary strength for high-quality meta-inferences.
  • Inferences inconsistent with results or research questions.

Interpretive rigor threats:

  • Inconsistency between inferences and findings from each strand.
  • Inconsistency with empirical findings of other studies.
  • Inconsistency across scholars/participants' construction of reality.
  • Failure to distinguish inferences from other possible interpretations.
  • Failure to adequately integrate findings.

📋 Practical examples

🔬 Example: Sonenshein et al. (2014)

Purpose: Complementarity (with developmental element). Design: Sequential qualitative→quantitative.

Qualitative phase:

  • Interviewed 29 participants from Environment and Business Program.
  • Used grounded theory approach: initial coding → theoretical categorization → theory induction.
  • Developed causal network showing first-order categories → second-order themes → aggregate dimensions.
  • Qualitative inference example: "Appraisal of characteristics related to promoting job performance."

Quantitative phase:

  • Collected survey data from 402 environmental issue supporters.
  • Conducted confirmatory factor analysis, then cluster analysis.
  • Identified three profiles: self-affirmers, self-equivocators, self-critics.
  • Used one-way ANOVA to examine how profiles related to issue-supportive behaviors.

Integration: Quantitized qualitative observation data (binary coding), used multiple analysis methods to ensure high-quality meta-inferences.
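The profile-comparison step (clustering followed by one-way ANOVA) can be sketched as follows; the behavior counts per profile are invented for illustration and do not reproduce Sonenshein et al.'s data.

```python
# One-way ANOVA sketch: compare mean counts of issue-supportive
# behaviors across three (hypothetical) profiles, pure Python.

def one_way_anova_f(groups):
    # Grand mean over all observations.
    all_obs = [x for g in groups for x in g]
    grand = sum(all_obs) / len(all_obs)
    # Between-group sum of squares (df = k - 1).
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares (df = N - k).
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_between = len(groups) - 1
    df_within = len(all_obs) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)

self_affirmers    = [5, 6, 7, 6, 5]
self_equivocators = [3, 4, 4, 3, 4]
self_critics      = [1, 2, 1, 2, 2]
f = one_way_anova_f([self_affirmers, self_equivocators, self_critics])
print(round(f, 2))  # → 50.92
```

A large F relative to its critical value indicates that mean behavior counts differ across profiles, which is the inference the study's post-hoc comparisons then unpack.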

🏥 Example: Stewart et al. (2017)

Purpose: Completeness. Design: Concurrent (field experiment + open-ended interviews).

Quantitative study:

  • Longitudinal quasi-experiment with 224 providers.
  • Used discontinuous growth modeling to analyze same-day appointment access.
  • Found teams led by lower-status providers improved access faster than higher-status provider teams.

Qualitative study:

  • Semi-structured interviews during and one year after implementation.
  • Followed Miles & Huberman procedures: identifying themes → creating categories → connecting patterns.
  • Used explanatory effect matrix to display findings.
  • Found identity work and delegation behaviors facilitated team effectiveness.

Integration: Although data collected concurrently, analyzed sequentially. Integrated findings using causal model showing how status, identity, and delegation related to team effectiveness.

Don't confuse: Concurrent data collection with concurrent analysis—this study collected both types simultaneously but analyzed them sequentially to build on quantitative findings with qualitative insights.

Mixed-Methods Data Analysis Strategies

🧭 Overview

🧠 One-sentence thesis

Mixed-methods data analysis requires researchers to not only master both qualitative and quantitative analysis techniques but also to strategically integrate findings from both strands to generate high-quality meta-inferences that address the study's purpose and research questions.

📌 Key points (3–5)

  • Core requirement: Competence in qualitative and quantitative analyses plus knowledge of integration techniques to generate meta-inferences
  • Six critical criteria guide decisions: (1) number of data types; (2) number of analysis types; (3) time sequence; (4) priority of components; (5) number of phases; (6) analysis orientation
  • Quality assessment is two-tiered: First establish validity/reliability for each strand using traditional criteria, then assess mixed-methods-specific quality (design quality and interpretive rigor)
  • Common confusion—integration vs. juxtaposition: Simply conducting both analyses separately is insufficient; the value emerges from thoughtful integration that yields insights neither method alone could provide
  • Validation must address both strands and their integration: Traditional qualitative and quantitative validity criteria apply to respective strands, but additional mixed-methods quality criteria assess the integration itself

🎯 Critical criteria for mixed-methods data analysis

📊 Number of data types analyzed

Traditional approach: Analyze qualitative data qualitatively, quantitative data quantitatively—both datasets remain distinct.

Conversion designs (monostrand): One data type is transformed into the other.

  • Quantitizing: Qualitative → quantitative (e.g., content analysis of interviews → frequency counts → factor analysis)
  • Qualitizing: Quantitative → qualitative (e.g., regression results → narrative profiling)

Example: Ferguson & Hull (2018) gathered quantitative personality data, conducted CFA, then used latent profile analysis to identify three qualitative profiles (reserved, well-adjusted, excitable), discussing results from both perspectives.

When to use conversion: Only if transformation brings additional insights beyond what the original data type provides. Gathering both types is preferable when feasible to avoid loss of depth (qualitative) or overgeneralization (quantitative).

Don't confuse: Monostrand conversion designs with multistrand designs—conversion starts with one data type and transforms it; multistrand collects both independently.
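Qualitizing can be sketched as the reverse move: numeric scores become narrative profile labels. The cut-off rules and score dimensions below are hypothetical; only the label names echo the Ferguson & Hull example.

```python
# Qualitizing sketch: turn quantitative scale scores into narrative
# profile labels. Cut-offs, dimensions, and data are hypothetical.

def qualitize(anxiety, sociability):
    # Map two 0-10 scores onto a verbal profile label.
    if anxiety >= 7 and sociability <= 3:
        return "reserved"
    if anxiety <= 3 and sociability >= 7:
        return "excitable"
    return "well-adjusted"

participants = {"p1": (8, 2), "p2": (2, 9), "p3": (5, 5)}
profiles = {pid: qualitize(a, s) for pid, (a, s) in participants.items()}
print(profiles)  # {'p1': 'reserved', 'p2': 'excitable', 'p3': 'well-adjusted'}
```

In practice the thresholds would come from a statistical procedure such as latent profile analysis rather than fixed cut-offs; the point is only that numeric patterns are re-expressed as interpretable narrative categories.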

🔢 Number of analysis types used

Minimum requirement: At least one qualitative + one quantitative analysis technique.

Decision factors:

  • Types of data collected
  • Study purposes (e.g., if explaining why, descriptive statistics insufficient—need interviews/grounded theory + causal techniques like regression/SEM)

Examples of analysis techniques by strand:

  • Qualitative methods: case study, grounded theory, narrative analysis, content analysis, discourse analysis
  • Quantitative methods: descriptive statistics, regression, ANOVA, factor analysis, SEM, cluster analysis

⏱️ Time sequence of analysis

Three strategies:

  1. Concurrent: Analyze both simultaneously—appropriate for corroboration/complementarity purposes
  2. Sequential qual→quant: Qualitative findings inform quantitative phase—three variations:
    • Form groups via qualitative data, compare on quantitative measures (typology development)
    • Form themes qualitatively, confirm quantitatively (e.g., content analysis → factor analysis)
    • Establish theoretical order qualitatively, confirm quantitatively
  3. Sequential quant→qual: Quantitative findings inform qualitative phase—three variations:
    • Form groups quantitatively, compare qualitatively
    • Form themes quantitatively, confirm qualitatively
    • Establish theoretical model quantitatively, confirm qualitatively

Example: Sarker et al. (2018) used qualitative data to identify key variables affecting work-life conflict, then tested effects quantitatively.

🎭 Priority of analytical components

Equal-status design: Both components emphasized equally—more likely to involve sophisticated techniques for both.

Dominant-less dominant design: One component has greater priority.

  • Qualitative-dominant: Qualitative component significantly higher priority
  • Quantitative-dominant: Quantitative component significantly higher priority
  • Less dominant component typically uses less complex techniques, may only address limited aspects

Priority affects analysis complexity and depth of insights from each strand.

📈 Number of analytical phases

Seven phases (Onwuegbuzie & Teddlie, 2003)—not all required in every study:

  1. Data reduction: Reduce dimensionality
    • Qualitative: coding to create themes/categories
    • Quantitative: factor analysis
  2. Data display: Visual presentation
    • Qualitative: various matrices (see Table 7-2 in excerpt)
    • Quantitative: tables, graphs
  3. Data transformation: Quantitizing or qualitizing
  4. Data correlation/comparison: Correlate/compare quantitative with qualitative data
  5. Data consolidation: Combine to create new/consolidated variables
  6. Data comparison: Compare data from different sources
  7. Data integration: Integrate into coherent whole

Data display formats (examples from Miles et al.):

  • Within-case: context charts, time-ordered matrices, role-ordered matrices, conceptually clustered matrices, causal networks
  • Cross-case: partially ordered meta-matrices, case-ordered descriptive matrices, scatterplots, causal models

🧩 Analysis orientation

Three approaches:

Case-oriented: Focus on selected cases to analyze meanings, experiences, perceptions—goal is particularizing and analytical generalization. Typically used in qualitative research.

Variable-oriented: Identify probabilistic relationships among constructs—goal is external generalization. Typically used in quantitative research.

  • Strength: Good for finding relationships in large populations
  • Weakness: Poor at handling causal complexity, findings often very general

Process-oriented: Evaluate processes/experiences over time, linking processes to variables and experiences to cases—combines case and variable approaches. Ideal for mixed-methods.

Example: Researchers studying companies implementing AI tools might use case-oriented methods for several companies plus surveys analyzed quantitatively.

Don't confuse: The orientation with the method—qualitative methods generally use case-oriented approaches and quantitative generally use variable-oriented, but mixed-methods can strategically combine both.

✅ Validation in mixed-methods research

🔍 Validation in qualitative research

Design validity:

  • Descriptive validity: Accuracy of reported events, behaviors, settings
    • Strategy: Provide actual descriptions
  • Credibility: Confidence in truth of findings
    • Strategies: Prolonged engagement, peer debriefing, triangulation, member checks, negative case analysis
  • Transferability: Degree results generalize to other contexts
    • Strategies: Thick descriptions, theoretical/purposeful sampling

Analytical validity:

  • Theoretical validity: Extent theoretical explanation fits data
    • Strategies: Extended data collection, theory triangulation, pattern matching
  • Dependability: Describe changes in setting and their effects
    • Strategies: Audit trail, stepwise replication, code-recode, peer examination
  • Consistency: Verify research steps
    • Strategies: Well-established protocols, qualitative database
  • Plausibility: Whether findings fit data
    • Strategies: Explanation-building, pattern-matching, address rival explanations, logic models

Inferential validity:

  • Interpretive validity: Accuracy of interpreting participants' views/thoughts/feelings
    • Strategies: Participant feedback, confirmability audit
  • Confirmability: Degree results could be confirmed by others
    • Strategies: Reflective journal, use theory (single-case) or replication logic (multiple-case)

🔬 Validation in quantitative research

Measurement validity:

  • Reliability: Repeatability/consistency
    • Types: Inter-rater, test-retest, parallel-forms, internal consistency
    • Strategies: Split-half method, compute reliability scores
  • Construct validity: Degree inferences can be made from operationalizations to theoretical constructs
    • Types: Face, content, criterion-related, predictive, concurrent, convergent, discriminant, factorial
    • Strategies: Multitrait-multimethod matrix, demonstrate theoretically-related measures are interrelated, unrelated measures are not
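The split-half method listed above can be sketched as: correlate respondents' scores on two halves of a scale, then apply the Spearman-Brown correction to estimate full-length reliability. The item scores below are hypothetical.

```python
# Split-half reliability sketch: correlate odd- and even-item half
# scores, then apply the Spearman-Brown correction. Data hypothetical.

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def split_half_reliability(items):
    # items: one row of item scores per respondent; split odd/even items.
    odd  = [sum(row[0::2]) for row in items]
    even = [sum(row[1::2]) for row in items]
    r = pearson(odd, even)
    # Spearman-Brown correction projects the half-test correlation
    # up to the reliability of the full-length test.
    return 2 * r / (1 + r)

scores = [[4, 5, 4, 5], [2, 2, 3, 2], [5, 4, 5, 5], [3, 3, 2, 3]]
print(round(split_half_reliability(scores), 3))  # → 0.913
```

The correction matters because a half-length test is inherently less reliable than the full scale; reporting the uncorrected half-test correlation would understate reliability.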

Design validity:

  • Internal validity: Whether observed covariation reflects causal relationship
    • Strategies: Demonstrate temporal precedence, covariation, use randomization, control groups, rapid experiments
  • External validity: Whether cause-effect holds across variations in persons, settings, variables
    • Strategies: Random selection/probability sampling, describe population to which results generalize

Inferential validity:

  • Statistical conclusion validity: Validity of inferences about correlation between variables
    • Strategies: Assess statistical significance, assess Type I/II error possibilities
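A small simulation makes the Type I error idea concrete: when the null hypothesis is true, a test at α = .05 should falsely reject about 5% of the time. The z-test, sample sizes, trial count, and seed below are arbitrary choices for illustration.

```python
# Statistical-conclusion-validity sketch: simulate the Type I error
# rate of a two-sample z-test when the null hypothesis is true.
import math
import random

def normal_cdf(z):
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def two_sample_z_p(x, y):
    # Known-variance z-test (sigma = 1 by construction of the simulation).
    n = len(x)
    z = (sum(x) / n - sum(y) / n) / math.sqrt(2 / n)
    return 2 * (1 - normal_cdf(abs(z)))

random.seed(42)
alpha, trials, n = 0.05, 2000, 30
rejections = 0
for _ in range(trials):
    x = [random.gauss(0, 1) for _ in range(n)]
    y = [random.gauss(0, 1) for _ in range(n)]  # same population: null is true
    if two_sample_z_p(x, y) < alpha:
        rejections += 1
print(rejections / trials)  # close to alpha (~0.05)
```

A Type II error would be the mirror case: drawing the two samples from different populations and counting how often the test fails to reject.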

🎯 Quality criteria specific to mixed-methods

Two quality domains:

Design quality (degree researcher selected appropriate procedures):

  1. Design suitability: Are methods appropriate for research questions? Does design match purpose?
  2. Design fidelity: Are methods implemented with necessary quality/rigor?
  3. Within-design consistency: Do components fit together seamlessly? Do strands follow logically?
  4. Analytical adequacy: Are analysis procedures appropriate/adequate to answer questions? Are mixed-methods strategies implemented effectively?

Interpretive rigor (degree of credible interpretations):

  1. Interpretive consistency: Do inferences closely follow findings? Are multiple inferences from same findings consistent?
  2. Theoretical consistency: Are inferences consistent with theory and field knowledge?
  3. Interpretive agreement: Would other scholars reach same conclusions? Do inferences match participants' constructions?
  4. Interpretive distinctiveness: Is each inference more credible/plausible than other possible conclusions?
  5. Integrative efficacy: Do meta-inferences adequately incorporate inferences from each strand? Are inconsistencies explored with explanations offered?
  6. Interpretive correspondence: Do inferences address study purposes/questions? Do meta-inferences meet stated need for mixed-methods?

Additional quality criteria (Onwuegbuzie & Johnson, 2006):

  • Sample integration: For statistical generalizations from sample to population
  • Inside-outside: Accurately present/utilize insider and observer views
  • Weakness minimization: Weaknesses from one approach compensated by strengths from other
  • Sequential legitimation: Minimize problems from revising sequence of phases
  • Conversion legitimation: Quantitizing/qualitizing lead to interpretable data and high inference quality
  • Paradigmatic mixing: Successfully combine paradigmatic assumptions into usable package
  • Commensurability: Ability to make Gestalt switches between qualitative and quantitative lenses
  • Multiple validities: Use all relevant strategies and meet multiple validity criteria
  • Political legitimation: Consumers value meta-inferences from both components

⚠️ Threats to quality in mixed-methods research

🚨 Threats to internal validity/credibility

Research design stage:

  • Insufficient/biased knowledge of earlier studies/theories
  • Contradictions in logic among research questions, theory, hypotheses, statistical tests

Data collection stage (qualitative):

  • Observer-caused effect (subjects behave differently when observed)
  • Observer bias (insufficient data, gaps filled with researcher's values/projections)
  • Researcher bias (personal biases/a priori assumptions not ruled out)
  • Data access limitations (limited time on site, restricted access)
  • Complexities of the human mind (subjects may mislead; statements are subject to human fallibilities such as memory errors)
  • Serious reactivity (changes from being conscious of participating)

Data collection stage (quantitative):

  • Instrumentation issues
  • Selection bias
  • Researcher bias

Data analysis/interpretation stage (qualitative):

  • Lack of descriptive legitimation (settings/events)
  • Lack of interpretive legitimation (meanings/perspectives)
  • Lack of explanatory/theoretical legitimation (causal processes)
  • Lack of generalizability
  • Issues in various legitimation types (ironic, paralogical, rhizomatic, voluptuous)
  • Confidential information problems
  • Difficulty interpreting typicality
  • Not all data analyzed equally
  • Lack of structural corroboration (multiple data types to support/contradict)
  • Confirmation bias (interpretations overly congruent with a priori hypotheses)
  • Illusory correlation (identifying relationships that don't exist)
  • Causal error (providing causal explanations without verification)

Data analysis/interpretation stage (quantitative):

  • Statistical regression bias
  • Confirmation bias

🌍 Threats to external validity/credibility

Qualitative research:

  • Catalytic legitimation (degree study empowers/liberates research community)
  • Communicative legitimation (disagreement on knowledge claims in discourse)
  • Interpretive legitimation (extent interpretation represents group's perspective)
  • Population generalizability (tendency to generalize rather than obtain insights into specific processes/practices)

Quantitative research:

  • Selection bias (inadequate/non-random sample)
  • Interaction of history and treatment effect (conditions change over time)

🔗 Threats to mixed-methods quality

Design quality threats (within-design consistency):

  • Design inconsistent with research questions/purpose
  • Observations/measures lack validity
  • Data analysis techniques insufficient/inappropriate
  • Results lack necessary strength for high-quality meta-inferences
  • Inferences inconsistent with results
  • Inferences inconsistent with research questions/purposes

Interpretive rigor threats:

  • Inconsistency between inferences and findings from each strand
  • Inconsistency with empirical findings of other studies
  • Inconsistency across scholars and participants' construction of reality (other scholars disagree; interpretations don't make sense to participants)
  • Failure to distinguish inferences from other possible interpretations
  • Failure to adequately integrate findings

Addressing threats: Not every threat appears in every study, but researchers should consider each and address those that apply to ensure high-quality meta-inferences.

📋 Practical examples

🔬 Example 1: Sonenshein et al. (2014)

Purpose: Complementarity (with developmental element). Design: Sequential qualitative→quantitative

Qualitative phase (Study 1):

  • Sample: 29 participants (14 students, 15 alumni) from Environment and Business Program
  • Justification: "Highly deviant set of individuals" who took distinctive steps to address climate change
  • Data: Interviews, field observations, secondary documents
  • Analysis: Grounded theory approach (Strauss & Corbin)
    • Step 1: Initial data coding (open codes staying close to informants' interpretations)
    • Step 2: Theoretical categorization (abstract codes grouping self-meaning into generalizable categories)
    • Step 3: Theory induction (identified three aggregate dimensions: issue support challenges, self-assets, self-doubts)
  • Display: Causal network showing first-order categories → second-order themes → aggregate dimensions; construct table with representative quotes
  • Established credibility through mindfulness of different settings, theoretical validity through abstracting to match theoretical concepts

Quantitative phase (Study 2):

  • Sample: 91 environmental issue supporters, active members of environmental groups
  • Pre-test: Demonstrated construct validity for self-assets and self-doubts measures
  • Data: Surveys + concealed observations (composting behavior, Earth Hour pledge signing)
  • Analysis:
    • Confirmatory factor analysis (bootstrapped due to small sample; CFI=.96, RMSEA=.08)
    • Cluster analysis → identified three profiles: self-affirmers (low doubts, high assets), self-equivocators (high doubts, high assets), self-critics (high doubts, low assets)
    • Coded observed behaviors (composting, interpersonal influence, collective advocacy)
    • One-way ANOVA with post-hoc comparisons
  • Findings: Self-critics engaged in lowest number of actions, followed by self-equivocators, then self-affirmers

Integration:

  • Quantitized qualitative observation data (binary coding: 0 or 1)
  • Used multiple analysis methods to ensure high-quality meta-inferences
  • Components fit together seamlessly—Study 1 findings developed theory tested in Study 2
  • Although the authors did not explicitly discuss quality criteria, the methods were appropriate for the research questions, and a multi-method element was embedded in Study 2
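
The quantitize-and-compare logic of Study 2 can be sketched in a few lines. This is a simulated illustration (group sizes and action rates are invented, not the study's data): behaviors are binary-coded, summed into per-person action counts, and compared across the three profiles with a one-way ANOVA F statistic.

```python
import numpy as np

# Simulated action counts (out of 3 coded behaviors) per participant;
# profiles and rates are invented for illustration only.
rng = np.random.default_rng(0)
actions = {
    "self-affirmers":    rng.binomial(3, 0.8, size=30),
    "self-equivocators": rng.binomial(3, 0.5, size=30),
    "self-critics":      rng.binomial(3, 0.2, size=31),
}

def one_way_anova_f(samples):
    """F = mean square between groups / mean square within groups."""
    all_obs = np.concatenate(samples)
    grand_mean = all_obs.mean()
    k, n = len(samples), len(all_obs)
    ss_between = sum(len(s) * (s.mean() - grand_mean) ** 2 for s in samples)
    ss_within = sum(((s - s.mean()) ** 2).sum() for s in samples)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

f_stat = one_way_anova_f(list(actions.values()))
```

A large F here mirrors the reported pattern: the three profiles differ in how many actions they take.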

Don't confuse: This sequential design with pure quantitative follow-up—the qualitative phase was essential for theory development, not just exploratory; Study 2 embedded both quantitative (survey) and qualitative (observation) elements.

🏥 Example 2: Stewart et al. (2017)

Purpose: Completeness
Design: Concurrent (field experiment + open-ended interviews)

Quantitative study:

  • Design: Longitudinal quasi-experiment
  • Sample: 224 providers (142 physician, 82 nonphysician)
  • Data: Pre- and post-intervention data; monthly time series (7 months before, 37 months after team-based empowerment adoption)
  • Measure: Same-day appointment access (% of same-day requests granted)
  • Coding: Status as dichotomous (1=higher-status physician, 0=lower-status nonphysician); absolute coding for time
  • Analysis: Discontinuous growth modeling (using R nlme package)
    • Added covariates
    • Tested different time forms (linear, quadratic)
    • Tested variations for random effects
    • Added status as predictor
    • Added interaction terms (status × time indicators)
    • Calculated pseudo-R² values
  • Findings: Teams led by lower-status providers improved access faster than higher-status provider teams
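
The dummy-coding behind discontinuous growth modeling can be sketched on simulated data. The study itself fit mixed-effects models with R's nlme; this plain least-squares version ignores random effects, and every number, variable name, and effect size below is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
months = np.arange(44)                    # 7 pre + 37 post months
post = (months >= 7).astype(float)        # level change at adoption
time_post = np.clip(months - 7, 0, None)  # post-adoption recovery slope

rows, y = [], []
for status in (0, 1):                     # 0 = nonphysician, 1 = physician
    slope = 0.8 if status == 0 else 0.3   # lower-status teams improve faster
    access = 40 + 5 * post + slope * time_post + rng.normal(0, 1, months.size)
    for t in range(months.size):
        rows.append([1.0, months[t], post[t], time_post[t],
                     status, status * time_post[t]])
        y.append(access[t])

beta, *_ = np.linalg.lstsq(np.array(rows), np.array(y), rcond=None)
# beta[-1] is the status x post-slope interaction; it comes out negative,
# mirroring the finding that higher-status teams improved more slowly.
```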

Qualitative study:

  • Sample: Purposive sampling from larger VHA teams study
  • Data: Semi-structured interviews (initial months + one-year follow-up)
  • Focus: Facilitators and barriers to team-based empowerment implementation
  • Analysis: Miles & Huberman (1994) procedures
    • Step 1: Identifying themes (leader identity work, leader delegation)
    • Step 2: Creating categories within themes
      • Identity work: embracing empowering identity, protecting hierarchical identity
      • Delegation: insufficient, overabundant, balanced
    • Step 3: Connecting patterns (nonphysician providers associated with embracing new identity)
  • Guard against bias: Research assistant blind to findings/theory involved in coding
  • Display: Explanatory effect matrix

Integration:

  • Data collected concurrently but analyzed sequentially
  • Quantitative findings confirmed hypothesis (high-status leaders less effective)
  • Qualitative findings examined leader behaviors in-depth (identity and delegation facilitated effectiveness)
  • Integrated using causal model (joint display showing status → identity/delegation → team effectiveness)
  • The authors could have used additional visual formats (e.g., a time-ordered matrix to show flow/sequence)

Quality assessment (not explicitly stated but evident):

  • Methods appropriate for research questions
  • Strands addressed different questions, together provided complete picture
  • Integration well-reported through causal model
  • Components fit together—quantitative identified what happened, qualitative explained how/why

Don't confuse: Concurrent data collection with concurrent analysis—this study collected both simultaneously but analyzed sequentially to build on quantitative findings with qualitative depth; also don't confuse completeness purpose with complementarity—here, neither strand alone provided complete picture (completeness), whereas complementarity examines different aspects that could stand alone.


CHAPTER 9. Mixed-Methods Research and Big Data Analytics

🧭 Overview

🧠 One-sentence thesis

Mixed-methods research provides a powerful framework for big data analytics by combining qualitative and quantitative approaches to extract richer insights from diverse, large-scale datasets than either method could achieve alone.

📌 Key points (3–5)

  • Big data characteristics: Defined by volume (large-scale), variety (multiple formats/sources), velocity (high-speed generation), veracity (reliability), and value (economic insights); datasets with at least two of these properties qualify as big data.
  • Analytics techniques available: Include general analytics, text analytics (processing unstructured text), web analytics (clickstream data), network analytics (relationships as nodes/edges), and mobile analytics (sensor/location data).
  • Two main mixed-methods designs for analytics: Multistrand designs (separate qualitative and quantitative data strands analyzed independently then integrated) and monostrand conversion designs (one data type transformed into another, e.g., text mining converts qualitative text to quantitative data).
  • Common confusion—quantitizing vs. qualitizing: Quantitizing transforms qualitative data (e.g., text) into numerical data for statistical analysis; qualitizing transforms quantitative data into qualitative themes; both are valid conversion techniques but serve different purposes.
  • Visualization as integration tool: Visual analytics combine automated analysis with interactive visualization to handle massive data points, support pattern recognition, and facilitate integration of qualitative and quantitative findings.

📊 Big data fundamentals

📊 What qualifies as big data

Big data: "data whose scale and complexity go beyond typical database software tools, requiring new technical architectures and analytics to enable insights that unlock new sources of business value."

  • Not just "large" data—must have specific characteristics.
  • The excerpt defines five key properties (the "5 Vs"):
    • Volume: Large amounts consuming huge storage or containing many records
    • Variety: Multiple sources and formats, multidimensional fields
    • Velocity: High frequency of generation and/or delivery
    • Veracity: Inherent unpredictability requiring analysis for reliable prediction
    • Value: Generates economically worthy insights through extraction and transformation
  • A dataset needs at least two of these properties to qualify as big data.
  • Example: Online customer reviews have high volume, velocity, and variety → qualify as big data.
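
The "at least two of the five Vs" rule above can be written as a one-line predicate. The property names come from the text; the function itself is just an illustrative convenience.

```python
# The five Vs named in the text.
FIVE_VS = {"volume", "variety", "velocity", "veracity", "value"}

def is_big_data(properties):
    """A dataset qualifies as big data if it exhibits at least two of the five Vs."""
    return len(FIVE_VS & set(properties)) >= 2

# Online customer reviews show volume, velocity, and variety, so they qualify.
```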

📊 Data format types

Three categories of data formats in analytics:

| Format | Description | Example |
| --- | --- | --- |
| Structured | Specific format with relational structure | Database tables, spreadsheets |
| Unstructured | No specific format | Videos, text, time information, geographic location |
| Semi-structured | Mix of structured and unstructured elements | JSON files, XML documents |

📊 Common data sources

  • Internet clicks and web service data
  • Social media platforms (blogs, Facebook, Twitter, YouTube)
  • Mobile transactions and sensor data
  • Wearable technologies (smartwatches)
  • Internet of Things devices (power meters)
  • Business transactions (sales queries, purchase transactions)

🔧 Analytics technique categories

🔧 General analytics techniques

  • Process data from different sources in different formats.
  • Use mature commercial technologies: relational DBMS, data warehousing, ETL, OLAP.
  • Apply data mining algorithms: C4.5 (decision trees), k-means (clustering), support vector machines.
  • Cover classification, clustering, regression, association analysis, network analysis.
  • Common approaches: multivariate statistical analysis, optimization techniques, statistical machine learning, spatial data mining, high-speed data stream mining.
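
As a toy illustration of the clustering family named above, here is a minimal k-means in plain numpy. It is illustrative only; real analytics work would use a mature library implementation.

```python
import numpy as np

def kmeans(X, k, iters=50):
    # Seed centers with k spread-out rows (here: first and last points).
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)].copy()
    for _ in range(iters):
        # Assign each point to its nearest center (squared distance).
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(axis=1)
        # Move each center to the mean of its assigned points.
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

# Two well-separated synthetic blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (20, 2)),   # blob A near (0, 0)
               rng.normal(5, 0.3, (20, 2))])  # blob B near (5, 5)
labels, centers = kmeans(X, 2)
```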

🔧 Text analytics

Text analytics: The process of automatically extracting quantitative data from text (email, documents, webpages, social media content, online reviews).

  • Converts unstructured text into analyzable data.
  • Key techniques include:
    • Information retrieval (vector space model, Boolean retrieval, probabilistic retrieval)
    • Natural language processing (NLP) for lexical acquisition, word sense disambiguation, part-of-speech tagging
    • Named Entity Recognition (NER)
    • Supervised and unsupervised learning methods
    • Topic modeling and clustering
    • Probabilistic latent semantic analysis
  • This is quantitizing: transforming qualitative text data into numerical data for statistical analysis.
  • Example: Researchers analyze social media posts to gain insights on customer sentiment, opinions, and preferences.
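
The quantitizing step can be sketched minimally: unstructured review text becomes term-frequency vectors over a shared vocabulary. Real text analytics adds tokenization, stemming, NER, and so on; the reviews below are invented.

```python
from collections import Counter

docs = [
    "battery life is great screen is great",
    "battery drains fast screen cracked",
]
# Build a shared vocabulary, then count each word per document.
vocab = sorted({w for d in docs for w in d.split()})
vectors = [[Counter(d.split())[w] for w in vocab] for d in docs]
# Each review is now a numeric row, ready for statistical analysis.
```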

🔧 Web analytics

  • Measures and understands relationships between customers and websites.
  • Analyzes clickstream data logs to reveal browsing and purchasing patterns.
  • Built on data mining, statistical analysis, information retrieval, and NLP foundations.
  • Techniques: website crawling/spidering, webpage updating, website ranking, search log analysis.
  • Example: Analyzing bid history on auction sites to understand how auction rules influence bidding behaviors.

🔧 Network analytics

Network analytics: The process of analyzing networked data with entities modeled as nodes and their connections as edges, comprising large networks.

  • Investigates relations among relationships (e.g., cluster formation).
  • Examines influence of structural properties on social integration.
  • Built on citation-based bibliometric analytics extended to online communities.
  • Techniques include:
    • Social network analysis (centrality, betweenness, structural holes)
    • Exponential random graph models
    • Visualization tools for presenting large-scale network information
  • Example: Using h-index to measure scholar productivity; analyzing relationships among scientists.
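
The nodes-and-edges idea can be sketched with normalized degree centrality on an invented co-authorship network (scholars as nodes, joint papers as edges); real network analytics would use a dedicated library.

```python
# Invented co-authorship edges.
edges = [("ana", "bo"), ("ana", "caz"), ("ana", "dee"), ("bo", "caz")]

# Count each node's ties.
degree = {}
for u, v in edges:
    degree[u] = degree.get(u, 0) + 1
    degree[v] = degree.get(v, 0) + 1

n = len(degree)  # 4 scholars
# Normalized degree centrality: share of possible ties each node holds.
centrality = {node: d / (n - 1) for node, d in degree.items()}
# "ana" holds ties to every other scholar, so her centrality is 1.0.
```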

🔧 Mobile analytics

  • Analyzes clickstream data logs and sensor data (location data) from mobile devices.
  • Provides insights on mobile activities and movement patterns.
  • Data types: number of downloads, direct quantitative measurement of use, habit, engagement.
  • Techniques: location-aware and activity-sensitive mobile sensing apps, mobile social innovation (m-health, m-learning), mobile social networks, crowdsourcing, mobile visualization, personalization and behavior modeling.
  • Example: Gathering mobile app use data through sensor-equipped phones to understand which features customers used most at particular times.

🔀 Mixed-methods designs for analytics

🔀 Multistrand designs

  • Contains at least two complete research strands: one quantitative cycle (data collection + analysis) and one qualitative cycle (data collection + analysis).
  • Each strand analyzed independently, then integrated.
  • Integration can occur at: conceptualization, data collection, data analysis, or inferential stage.
  • Researchers mine patterns from large-scale data in various formats.
  • Use both traditional methods (ANOVA, regression, qualitative coding) and advanced techniques (data mining, machine learning algorithms).

Key decision points:

  • Sampling methods: Are samples for quantitative and qualitative strands identical or overlapping?
  • Timing: Simultaneous analysis (for corroboration/complementarity purposes) or sequential analysis (for compensation/developmental/expansion purposes)?

Example from excerpt: An organization analyzed supply chain risks by:

  • Gathering social media data and performing content analysis (qualitative)
  • Obtaining financial statements and operational data (quantitative)
  • Converting qualitative data to quantitative using fuzzy Delphi method (quantitizing)
  • Aggregating findings using decision-making matrix and causal diagrams
  • Result: Capacity and operations had greater influence; triggering event risks were difficult to diagnose

🔀 Monostrand conversion designs

  • Single strand study where one data type is transformed into another.
  • Most common: quantitizing (qualitative → quantitative) via text mining/text analytics.
  • Less common: qualitizing (quantitative → qualitative).
  • Transformed data then analyzed using computational techniques.

Text preprocessing workflow for quantitizing:

  1. Text indexing: Convert text into a list of words (nouns, verbs, adjectives)
  2. Text encoding: Map texts into numerical vectors
    • Assign values to selected features
    • Use techniques like wrapper approach or principal component analysis for dimension reduction
  3. Text association: Extract association rules (if-then templates) from word sets
    • Use algorithms such as the Apriori algorithm
    • Display directional relationships (A → B → C)
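
The association step (step 3) can be sketched as rule confidence: for a rule "A → B", the share of documents containing A that also contain B. This is a crude stand-in for Apriori-style mining, and the word sets are invented.

```python
from itertools import permutations

# Each document reduced to its set of indexed words (invented).
docs = [
    {"delay", "supplier", "risk"},
    {"delay", "supplier"},
    {"delay", "cost"},
]

def confidence(a, b, docs):
    """Confidence of the rule a -> b: P(b present | a present)."""
    with_a = [d for d in docs if a in d]
    return sum(b in d for d in with_a) / len(with_a)

rules = {
    (a, b): confidence(a, b, docs)
    for d in docs
    for a, b in permutations(sorted(d), 2)
}
# Every document with "supplier" also has "delay", so that rule has
# confidence 1.0; the reverse rule has confidence 2/3.
```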

Text categorization process:

  1. Preliminary task: Pre-define categories (often theory-driven); select classification algorithm; allocate sample text to each category
  2. Learning: Index sample text into feature candidates; construct classification capacity (equations, symbolic rules, optimized parameters)
  3. Classification: Classify new text data; evaluate classification performance; adjust algorithm if needed
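
The three-step categorization process can be sketched with a deliberately crude classifier: learn per-category word frequencies from labeled sample text (learning), then classify new text by which category's vocabulary it overlaps most (classification). The categories and sentences are invented, and this stands in for the real classification algorithms the text names.

```python
from collections import Counter, defaultdict

# Pre-defined categories with labeled sample text (preliminary task).
train = [
    ("shipment delayed at port", "logistics"),
    ("supplier missed delivery", "logistics"),
    ("invoice overdue payment", "finance"),
    ("budget payment approved", "finance"),
]

# Learning: index sample text into per-category word counts.
profiles = defaultdict(Counter)
for text, label in train:
    profiles[label].update(text.split())

# Classification: score new text against each category's profile.
def classify(text):
    words = text.split()
    return max(profiles, key=lambda c: sum(profiles[c][w] for w in words))
```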

Example from excerpt: Researchers studied online customer reviews' impact on product performance:

  • Collected 2.4 million customer reviews (qualitative text data)
  • Performed text preprocessing on reviews and product release notes
  • Represented documents as "bag of words" (nouns, verbs, adjectives)
  • Extracted keywords using TF-IDF scheme (statistical measure of word relevance)
  • Constructed keyword-by-keyword matrix with similarity measures
  • Used singular value decomposition (SVD) to reduce dimensionality
  • Calculated "customer agility" as a quantitative measure
  • Regressed review volume and product performance on customer agility
  • Found curvilinear relationships
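
The TF-IDF and SVD steps above can be sketched on toy snippets; the study's pipeline (2.4 million reviews, keyword-by-keyword matrices) is far larger, but the arithmetic has the same shape.

```python
import math
import numpy as np

# Invented review snippets, already tokenized into a bag of words.
docs = [
    ["battery", "great", "battery"],
    ["screen", "great"],
    ["battery", "screen", "cracked"],
]
vocab = sorted({w for d in docs for w in d})
n_docs = len(docs)

def tfidf(doc):
    # Term frequency x inverse document frequency, one value per keyword.
    return [
        doc.count(w) * math.log(n_docs / sum(w in d for d in docs))
        for w in vocab
    ]

M = np.array([tfidf(d) for d in docs])      # document-by-keyword matrix
U, s, Vt = np.linalg.svd(M, full_matrices=False)
reduced = U[:, :2] * s[:2]                  # 2-dimensional representation per doc
```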

Don't confuse: Monostrand conversion designs still qualify as mixed-methods because they involve both qualitative and quantitative elements, even though the original data type is transformed rather than keeping separate strands.

📈 Visualization in analytics research

📈 Why visualization matters

Visual analytics: "A combination of automated analysis techniques with interactive visualization for an effective understanding, reasoning and decision making on the basis of very large and complex datasets."

  • Brings qualitative and quantitative data together visually.
  • Presents essential information from vast amounts of data.
  • Aids complex analyses and pattern recognition.
  • Shows statistics-by-themes, side-by-side comparisons.
  • Connects findings to theoretical frameworks and recommendations.
  • Transforms researcher from passive transmitter to active co-constructor of knowledge.

📈 Handling massive data points

Methods for reducing data points, dimensions, and clutter:

| Method | Description | Purpose |
| --- | --- | --- |
| Aggregation/Simplification | Combine or simplify data | Reduce complexity |
| Data subset | Select only relevant portion | Focus on pertinent information |
| Jittering | Add random noise to data samples | Prevent points from plotting at identical locations |
| Data binning | Group N values into fewer than N discrete groups | Reduce granularity while preserving patterns |
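
Two of these methods, jittering and binning, can be sketched in numpy on invented rating data: jittering adds small noise so identical values separate on a plot, and binning collapses 1,000 points into a handful of buckets.

```python
import numpy as np

rng = np.random.default_rng(0)
# 1,000 ratings on a 1-5 scale: many identical values overlap when plotted.
ratings = rng.integers(1, 6, size=1000).astype(float)

# Jittering: small uniform noise so points no longer plot on top of each other.
jittered = ratings + rng.uniform(-0.2, 0.2, ratings.size)

# Binning: group the 1,000 values into 5 discrete buckets.
bins = np.array([1, 2, 3, 4, 5, 6])           # bucket edges
counts, _ = np.histogram(ratings, bins=bins)  # 1000 points -> 5 bars
```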

📈 Selecting visualization types

Six dimensions to consider when choosing visualization:

  1. Analysis goal: What are you trying to show?

    • Composition (parts of whole) → stacked column chart
    • Order (emphasize sequence) → alphabetical list
    • Relationship (correlation) → point graph
    • Comparison (similarities/differences) → column chart
    • Cluster (data grouping) → dendrogram
    • Distribution (dispersion in space) → histogram
    • Trend (general tendency) → line graph
    • Geospatial (geographic patterns) → choropleth map
  2. Interaction level: How much detail to show?

    • Overview (entire dataset) → dendrogram
    • Zoom (focus on items of interest) → network map
    • Filter (ignore unwanted items) → area chart
    • Details-on-demand (select item for details) → choropleth map
  3. User literacy: Who is the audience?

    • Lay (computer-literates with limited analytics understanding) → line graph
    • Tech (skilled users with deeper analytics understanding) → tree map
  4. Dimensionality: How many variables?

    • One-dimensional (single value/string) → gauge
    • Two-dimensional (one dependent, one independent) → single line graph
    • n-dimensional (point in n-dimensional space) → bubble graph
    • Tree (items linked to one parent) → dendrogram
    • Graph (items linked to arbitrary number of others) → network map
  5. Cardinality: How many items?

    • Low (few to few dozen) → pie chart
    • High (dozens or more) → heat map
  6. Data type: What kind of data?

    • Nominal (qualitative, assigned to categories) → pie chart
    • Ordinal (qualitative, sortable categories) → column chart
    • Interval (quantitative, equality of intervals) → line graph
    • Ratio (quantitative, non-arbitrary zero point) → point graph
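
The "analysis goal" dimension above lends itself to a simple lookup. The goal-to-chart pairs below come straight from the list; the function wrapping them is just an illustrative convenience.

```python
# Goal -> suggested visualization, as listed in the text.
GOAL_TO_CHART = {
    "composition": "stacked column chart",
    "order": "alphabetical list",
    "relationship": "point graph",
    "comparison": "column chart",
    "cluster": "dendrogram",
    "distribution": "histogram",
    "trend": "line graph",
    "geospatial": "choropleth map",
}

def suggest_chart(goal):
    """Suggest a chart type for an analysis goal (case-insensitive)."""
    return GOAL_TO_CHART.get(goal.lower(), "no suggestion for this goal")
```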

📈 Digital mixed-methods design with visualization

A proposed workflow integrating visualization:

Qualitative analysis → Transformation to quantitative data → Data mining (quantitative analysis) → Visualization → Qualitative exploration

  • Qualitative analyses of text, images, non-numeric data transformed using digital tools (e.g., multimodal annotation software)
  • Quantitized results analyzed using machine-learning techniques
  • Findings visualized to help qualitative interpretations of large datasets
  • Integration occurs at design, method, interpretation, and reporting stages

Example from excerpt: Researchers assessed corporate fraud risk using financial social media data:

  • Gathered social media content and financial statements (both qualitative and quantitative)
  • Used web crawling and Stanford CoreNLP toolkit
  • Extracted signals: sentiment, emotion, topic, lexical, social network features
  • Fed features into machine learning classifiers (SVM, logistic regression, neural networks, decision tree)
  • Used line graph to visualize model performance for each feature set
  • Graph enabled comparison showing topic features were most predictive of fraud
  • Visualization triggered cognitive responses difficult to obtain by other means

🏗️ Building theory with mixed-methods analytics

🏗️ Data-driven vs. theory-driven approaches

  • Theory-driven: Start with research questions from literature gaps or conflicts; use domain theory to guide analysis
  • Data-driven: Start from data; use exploratory approaches to extract scientifically interesting insights from big data
  • Combined approach: Strengthen outcomes by integrating both; even data-driven findings should be traced back to existing theory when possible

🏗️ Framework for theory building

Proposed workflow:

  1. Start with research questions (from literature or data-driven exploration)
  2. Consider domain theory (if applicable; skip if purely exploratory)
  3. Determine mixed-methods purpose (corroboration, complementarity, compensation, developmental, expansion)
  4. Select design: Monostrand or multistrand; concurrent or sequential; equal-status or dominant-less dominant
  5. Collect large volumes of data
  6. Data cleaning and transformation (e.g., quantitizing large volumes of text into themes)
  7. Validation and data analysis (state clearly when mixing occurs)
  8. Integration and meta-inference (theory development block)
    • Use visualization to draw theoretical models (especially for data-driven approach)
    • Extract correlations and patterns yielding insights into empirically interesting phenomena
    • Trace back to existing theory when possible
  9. Theory outcomes: Extend/refine existing theory or generate new theory

🏗️ Unique opportunities in analytics research

  • Observe psychological and behavioral elements not observable using conventional quantitative methods (e.g., surveys)
  • Include unconscious behaviors and emotional states hardly observable using qualitative methods (e.g., interviews)
  • Uncover factors or constructs from natural language processing for theoretical model development
  • Report not just observations but uncover hidden relationships in large-scale datasets
  • Analyze and interpret from pragmatic perspective
  • Offer knowledge and insights that single methods cannot provide

Don't confuse: Data-driven research is not atheoretical—correlations and patterns from big data analysis should still be connected to theory, even if results don't fit existing frameworks; this can lead to theory refinement or generation.


CHAPTER 10. Generating Meta-Inferences in Mixed-Methods Research

🧭 Overview

🧠 One-sentence thesis

Meta-inferences synthesize qualitative and quantitative findings through theoretical reasoning techniques (induction, deduction, abduction, or combinations) to produce integrated conclusions that answer mixed-methods research questions and address convergent, complementary, or divergent patterns in the data.

📌 Key points (3–5)

  • What meta-inferences are: integrative conclusions that synthesize findings from both qualitative and quantitative research strands, typically developed at the end of a study.
  • Three common analysis pathways: (1) merge qual + quant → meta-inferences; (2) quant → qual → meta-inferences; (3) qual → quant → meta-inferences.
  • Four theoretical reasoning approaches: induction (specific to general), deduction (general to specific), combination of both, and abduction (data to best explanation).
  • Three possible patterns: convergence (findings agree), complementarity (findings add different pieces), and divergence (findings contradict).
  • Common confusion: Don't confuse inferences (conclusions from evidence) with explanations (interpretations of conclusions)—induction separates these, while abduction integrates them.

🔍 What are meta-inferences?

🔍 Definition and purpose

Meta-inferences: "an integrative view of findings from qualitative and quantitative strands of mixed-methods research."

  • They are developed after analyzing both qualitative and quantitative data separately.
  • They represent a coherent conceptual framework that answers the research questions.
  • The process keeps research questions in the foreground—inferences are fundamentally answers to research questions.

🔍 Development process

  • Researchers examine each dataset's outcomes separately first.
  • They evaluate how effectively each answers its relevant research question(s).
  • Then they compare and contrast answers across questions.
  • They assess conceptual variations and similarities.
  • Finally, they synthesize into meta-inferences.

🛤️ Three analysis pathways

🛤️ Pathway selection

The pathway depends on the mixed-methods design strategy:

| Pathway | Process | When to use |
| --- | --- | --- |
| Merging | Qual + Quant findings → Meta-inferences | Concurrent designs |
| Quant-first | Quant inferences → Qual inferences → Meta-inferences | Sequential designs starting quantitative |
| Qual-first | Qual inferences → Quant inferences → Meta-inferences | Sequential designs starting qualitative |

🛤️ Practical recommendation

  • Use the merging pathway for concurrent mixed-methods designs.
  • Use sequential pathways (second or third) for sequential mixed-methods designs.
  • The choice should align with the study's paradigmatic stance and purposes.

🧠 Theoretical reasoning techniques

🧠 Inductive reasoning

Inductive reasoning: reasoning from particular observations to general perspectives; generalizing from specific instances to broader contexts.

How it works:

  • Start with specific observations (grounds/data).
  • Identify patterns across instances.
  • Develop generalizations about the phenomenon.
  • Build theory from observations.

Toulmin's framework for induction:

  • Grounds: premises (data, observations).
  • Claims: theoretical interpretations or generalizations.
  • Warrants: standards that justify bridging grounds to claims (the essence of inference).

Strengths:

  • Can uncover previously unknown aspects of phenomena.
  • Applicable to both qualitative and quantitative research.
  • Enables discovery.

Limitations:

  • Lacks the solid normative foundation of deduction.
  • Knowledge claims are probable, not certain ("improbable that premises be true and conclusion false").
  • Can only observe particular events, not generalities directly.

Example: A researcher interviews 81 individuals about person-environment misfit and identifies three broad response patterns (resolution, relief, resignation)—generalizing from specific cases to broader categories.

🧠 Deductive reasoning

Deductive reasoning: a theory-testing approach that moves from general explanations to specific predictions.

How it works:

  • Start with an abstract, general theory.
  • Develop hypotheses based on the theory.
  • Test hypotheses empirically.
  • Derive claims/results from empirical evidence.
  • If predictions are correct, the theory is deemed empirically adequate.

Strengths:

  • Uses existing theory with presumed universal applicability.
  • Produces certain knowledge claims ("impossible for premises to be true and conclusion false").
  • Guides study purposefully.

Limitations:

  • Self-evident and lacks flexibility.
  • Cannot easily adapt to empirical circumstances.
  • Does not provide criteria for choosing among alternative explanations.

Example: Researchers draw on dynamic capabilities theory, predict that big data analytics usage affects organizational value, test via survey, and find support—confirming the theory's predictions.

Don't confuse: Deduction can be used in qualitative research too (e.g., pattern matching—comparing case data to theoretical predictions and counter-theory predictions).

🧠 Combined inductive and deductive reasoning

Why combine both:

  • Balances inference and explanation (induction) with empirical testing (deduction).
  • When deduction limits justification using alternative explanations, induction can generate new theoretical insights.
  • Allows researchers to defend both their findings and their theoretical interpretations.

Example: A study on technostress used qualitative case study (inductive) to identify factors, then quantitatively validated the model (deductive). Qualitative inferences were derived inductively; quantitative inferences deductively; meta-inferences integrated both approaches.

🧠 Abductive reasoning

Abductive reasoning (inference to the best explanation): reasoning that moves from observations to a hypothesis that best explains those observations.

The abductive rule:

  1. D is a collection of data.
  2. H explains D (if H were true, it would explain D).
  3. No other alternative explains D as well as H.
  4. Therefore, H is probably true.

How it differs:

  • Unlike induction: doesn't rely solely on empirical data to move from specific to general.
  • Unlike deduction: theoretical premises don't guarantee results.
  • Integrates inference and explanation as a single process (whereas induction separates them).

Strengths:

  • Describes how empirical scientists actually reason in practice.
  • Allows discovery through data exploration.
  • Overcomes problems with both induction (can't observe generalities) and deduction (no guidance on choosing among alternatives).
  • Effective for theorizing about surprising events or anomalies.

Limitations:

  • Produces plausible but fallible knowledge claims.
  • Claims are untested, held tentatively, subject to continuous revision.
  • Researchers have authority to select "best explanation" based on pragmatic virtues (interestingness, usefulness, simplicity), not truth value.

Example: A study found mobile shopping value had a negative (nonsignificant) effect on omni-channel shopping value. Using abduction, researchers approached qualitative data with an open mindset, then used prior knowledge to understand the anomaly—concluding that lack of smartphone usability and specialized mobile usage explained why mobile didn't contribute to global shopping value.

🔀 Three patterns in meta-inferences

🔀 Convergence

  • Qualitative and quantitative inferences are consistent.
  • They agree with each other.
  • Meta-inferences strengthen the initial theoretical assumptions.
  • This pattern confirms and reinforces findings.

🔀 Complementarity

  • Qualitative and quantitative inferences complement each other.
  • Each adds different pieces of understanding.
  • Meta-inferences provide a more complete picture of the empirical domain.
  • This pattern enriches understanding through multiple perspectives.

🔀 Divergence

  • Qualitative and quantitative inferences contradict each other.
  • Findings conflict.
  • Researchers must resolve the divergent findings.
  • Conflicting evidence and its resolution are key values of mixed-methods research.

Don't confuse: Divergence is not a failure—it's an opportunity for deeper understanding and theory refinement.

🔧 Strategies for handling divergence

🔧 Three main strategies

1. Appraise quality (check for mistakes):

  • Review methodological procedures in both strands.
  • Determine if divergence stems from methodological errors.
  • Re-examine what issues caused divergent findings.

2. Reanalyze and revisit theory:

  • If divergence can be interpreted in a plausible manner, reanalyze existing data.
  • Revisit theoretical assumptions.
  • Look for the best possible explanation.

Example: A study found technical support reduced positive psychological response (contradicting qualitative findings). Re-examining qualitative data revealed healthcare workers felt calling help desks took time from patient care and led to frustrating exchanges due to communication gaps between nurses and technical support.

3. Collect additional data:

  • Ask new research questions.
  • Collect and analyze new data to examine conflicting findings.
  • Verify initial findings with follow-up studies.

Example: A study on welfare rights advice found divergent qual/quant results. Collecting additional data revealed the positive relationship extended beyond the small qualitative sample, and that qualitative outcomes contained dimensions not measured quantitatively.

🔧 Two approaches for contradictory inferences

Bracketing approach:

Bracketing: "the process of incorporating a diverse and/or opposing view of the phenomenon of interest."

  • Appropriate when findings are irreconcilable and suggest extreme results (best-case/worst-case scenarios).
  • Consistent with exploration and exploitation of breakdowns.
  • Use with concurrent mixed-methods designs.
  • Allows side-by-side, simultaneous comparison.

Bridging approach:

Bridging: "the process of developing a consensus between qualitative and quantitative findings."

  • Helps understand transitions and boundary conditions.
  • Can be used with any mixed-methods design.
  • Works with deductive, inductive, or abductive reasoning.
  • Acknowledges contradictory or confirmatory elements.
  • Leads to new understanding.

Example: After finding divergent results about technical support, researchers bridged the gap with the meta-inference: "healthcare workers who do not perceive end-user support as related to proper and helpful support activities will experience a lowered positive psychological response."

📊 Comparison of reasoning techniques

| Reasoning type | Direction | Knowledge claim | Evaluation criteria |
| --- | --- | --- | --- |
| Inductive | Specific cases → general explanation | Probable (relationship is consequential and reliable) | Does the general observation account for specific cases? Can relationships generalize? |
| Deductive | General explanation → specific prediction | Certain (pattern is predictable) | Was the premise validated? Can the prediction be replicated? |
| Abductive | Specific observations → particular explanation | Plausible (resolves an empirical anomaly) | Does the explanation cohere into a testable hypothesis? Does it identify new variables? Is the anomaly documented? |

💡 Practical implications

💡 For research design

  • Choose analysis pathway based on design (concurrent vs. sequential).
  • Select reasoning technique(s) based on research goals and data patterns.
  • Be prepared to use multiple reasoning approaches.

💡 For developing meta-inferences

  • Keep research questions in the foreground throughout.
  • Analyze each strand separately first.
  • Compare and contrast findings systematically.
  • Use appropriate reasoning technique for the pattern observed.
  • Don't avoid divergence—embrace it as an opportunity for deeper insight.

💡 For reporting

  • Clearly distinguish inferences from explanations.
  • Document the reasoning process used.
  • Explain how meta-inferences answer research questions.
  • Address any divergent findings transparently.
  • Show how integration produces insights beyond single-method studies.
11

Mixed-Methods Research Paper Templates

CHAPTER 11. Mixed-Methods Research Paper Templates

🧭 Overview

🧠 One-sentence thesis

Mixed-methods research papers require specific structural templates that align with the study's purpose (compensation, corroboration, diversity, developmental, complementarity, completeness, or expansion) and design characteristics (sequential vs. concurrent, equal vs. dominant status) to effectively communicate integrated qualitative and quantitative findings.

📌 Key points (3–5)

  • Core challenge: Despite growing interest in mixed-methods research, guidance on structuring and writing papers for publication has been insufficient.
  • Six main sections: Introduction, literature review/theory development, methods, results, discussion, and conclusions—but specific structure varies by purpose and design.
  • Integration is essential: Qualitative and quantitative findings must be integrated at one or multiple points, culminating in meta-inferences that answer research questions.
  • Common confusion: The structure is not one-size-fits-all; time orientation (concurrent vs. sequential) and status (equal vs. dominant-less dominant) significantly affect how sections are organized.
  • Purpose drives structure: Seven distinct purposes require different templates, each emphasizing different aspects of integration and presentation.

📝 Standard sections of mixed-methods papers

📝 Introduction section

  • Must establish methodological importance and appropriateness of mixed-methods research.
  • Four required components:
    1. Topic
    2. Research problems (practical and theoretical gaps)
    3. Research questions
    4. Purpose(s) of mixed-methods research

Research question approaches:

  • Write separate quantitative and qualitative questions
  • Write separate questions plus a mixed-methods question
  • Write only a mixed-methods question reflecting procedures or content

Purpose characteristics matter:

  • Complementarity: qualitative and quantitative examine overlapping or different aspects of the same phenomenon
  • Completeness: two distinct but related aspects
  • Expansion: two somewhat unrelated aspects
  • Other purposes (compensation, corroboration, diversity, developmental): same aspects examined differently

📚 Literature review and theory development

  • Researchers must be explicit about their worldview (paradigm) and how it informs the project.
  • Paradigms should not be the focus; research questions and purposes should drive the study.

Theory-driven vs. data-driven:

  • Theory-driven: describe theoretical basis in detail; explain how theory informs each phase
  • Data-driven: may lack strong theoretical basis; present relevant literature review; emphasize datasets, analyses, and implications

Hypotheses development (for theory-testing papers):

  • Each hypothesis must be stated with sufficient argumentation
  • Developed from theory (theory is precondition for deductive inferences)

🔬 Methods section

  • Structure varies by purpose, time orientation, and status.
  • Must include: sampling design, data collection, and data analysis strategies for both strands.
  • Explicitly discuss the link between qualitative and quantitative data analysis approaches.

Diagram of procedures (recommended), with five essential parts:

  1. Boxes showing data collection and analysis for both types
  2. Circle showing interpretation phase
  3. Procedures as bullet points alongside boxes
  4. Products resulting from each phase
  5. Arrows showing sequence

Example: The excerpt provides an illustrative study examining how online reviews influence purchase decisions, using both qualitative (data mining of customer reviews) and quantitative (2×2 experiment) methods concurrently.
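The five parts can be approximated even in plain text. The sketch below (Python, with illustrative labels loosely inspired by the online-reviews example; not the book's actual diagram) renders two concurrent strands as boxes feeding a parenthesized interpretation phase:

```python
# Minimal sketch (illustrative labels, assumed layout) of a concurrent
# diagram of procedures: boxes for each strand's data collection/analysis
# and a closing interpretation phase, joined by connector lines.

def box(label):
    """Return the three text lines of a simple ASCII box around a label."""
    bar = "+" + "-" * (len(label) + 2) + "+"
    return [bar, f"| {label} |", bar]

def render_concurrent(qual, quant, interpretation):
    """Place the two strand boxes side by side above the interpretation phase."""
    left, right = box(qual), box(quant)
    lines = [l + "   " + r for l, r in zip(left, right)]
    lines += ["      \\       /", f"   ( {interpretation} )"]
    return "\n".join(lines)

print(render_concurrent("QUAL: review mining",
                        "QUAN: 2x2 experiment",
                        "Interpretation: meta-inferences"))
```

In a paper, the bullet-point procedures and products would sit alongside each box; this sketch shows only the boxes, circle, and sequence.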

📊 Results section

  • Should have parallel structure with methods section.
  • Include sample characteristics and findings from both qualitative and quantitative analyses.
  • Integration should occur at one or multiple points.

Integration strategies by purpose:

  • Corroboration/complementarity: side-by-side comparison in results section
  • Compensation/developmental/expansion: present first strand results, then use them to explain second strand
  • Completeness/diversity: select integration method based on design (concurrent → integrate in analysis stage)

Visual representations:

  • Use figures and diagrams to simplify complex interrelationships
  • Enhances readability for audiences unfamiliar with complex procedures
  • Helps communicate results to practitioners

💬 Discussion section

  • Additional sections beyond standard empirical papers: qualitative inferences, quantitative inferences, and especially meta-inferences.
  • Qualitative and quantitative inferences can be combined with results or reported separately.

Meta-inferences (most important differentiator):

  • Integrate qualitative and quantitative inferences
  • Establish quality through design quality and interpretive rigor assessment
  • Provide answers to research questions
  • If inferences are divergent, investigate inconsistency and reconcile differences

Other discussion components:

  • Practical and theoretical implications
  • Limitations and threats encountered (and how overcome)
  • Future research directions

🎯 Conclusions section

  • State most important findings
  • Mention achievement of main mixed-methods purpose
  • Present key implications ("golden nuggets") without apologies

🎨 Seven purpose-based templates

🔄 Compensation purpose

Compensation purpose: conducted when initial data collection findings have been compromised due to weaknesses associated with the method used.

  • Based on sequential design (typically decided after initial study)
  • Not necessarily equal status; less dominant strand often contains errors/weaknesses
  • Present results in order of data collection and analysis
  • Show integration during analysis stage
  • Use table/figure to illustrate comparison and demonstrate compensation

🎯 Corroboration purpose

Corroboration purpose: follows triangulation logic to achieve convergence by using different methods with different strengths/limitations to understand the same phenomenon.

  • Concurrent design with both methods implemented independently and simultaneously
  • Recommend equal status design to optimize strengths of both methods
  • Use joint displays or side-by-side comparisons in results
  • Goal is convergence; if divergent, must reconcile inconsistency
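A joint display can be as simple as a side-by-side table of constructs, the qualitative and quantitative findings for each, and an assessment of fit. The sketch below (Python; the constructs, findings, and column labels are hypothetical illustrations, not from the source) renders one as plain text:

```python
# Minimal sketch of a joint display for a corroboration-purpose study:
# each row pairs a qualitative finding with the quantitative result for
# the same construct and labels the comparison outcome.

def build_joint_display(rows):
    """Render (construct, qual finding, quant result, assessment) tuples
    as an aligned plain-text table with a header and separator line."""
    header = ("Construct", "Qualitative finding", "Quantitative result", "Assessment")
    table = [header] + rows
    widths = [max(len(row[i]) for row in table) for i in range(4)]
    lines = [" | ".join(cell.ljust(w) for cell, w in zip(row, widths))
             for row in table]
    lines.insert(1, "-+-".join("-" * w for w in widths))
    return "\n".join(lines)

rows = [  # hypothetical example data
    ("Ease of use", "Users describe the tool as intuitive",
     "Mean rating 4.2/5 (n = 120)", "Convergent"),
    ("Support quality", "Help-desk calls seen as frustrating",
     "Support use predicts satisfaction", "Divergent"),
]
print(build_joint_display(rows))
```

Divergent rows in such a display flag exactly the inconsistencies the text says must be reconciled.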

🌈 Diversity purpose

Diversity purpose: aims to uncover divergent findings.

  • Use multiple theoretical frameworks to maximize possibility of inconsistent findings
  • Either sequential or concurrent design possible
  • Recommend equal status design to ensure both data types emphasized equally
  • Stronger inferences and balance when inconsistency exists

🔗 Developmental purpose

Developmental purpose: results from one method help develop or inform the other method.

  • Sequential design required (first component implemented first)
  • Recommend combining methods and results sections for each component
  • Order depends on investigation strategy (qualitative→quantitative or quantitative→qualitative)

Example scenarios:

  • Qualitative findings develop survey items for the quantitative component.
  • Quantitative findings determine the sampling strategy for the qualitative component.

Reporting recommendations:

  • Discuss the results of each strand independently, then discuss the merging point.
  • Develop inferences for each strand before generating meta-inferences.

🧩 Complementarity purpose

Complementarity purpose: investigates different aspects of a phenomenon.

  • Concurrent design is best approach
  • Structure influenced by priority (equal vs. dominant-less dominant)
  • Explain rationale for sampling strategy and how it's supported by theory
  • Meta-inferences should link back to research questions as answers

🔲 Completeness purpose

Completeness purpose: qualitative and quantitative components cannot stand alone; single method insufficient to provide complete picture.

  • Integrate qualitative and quantitative data in data analysis sub-section
  • Both sequential and concurrent designs possible
  • In sequential: complete first strand, generate inferences, use to inform next strand
  • In concurrent: collect data simultaneously, discuss integrated analysis results
  • Meta-inferences should provide complete picture of phenomenon

📐 Expansion purpose

Expansion purpose: second strand expands findings from first strand; aspects investigated must be distinct.

  • Sequential design based
  • Present methods and results of first strand, then use inferences to formulate research questions for second strand
  • Order determined by investigation nature (qualitative→quantitative or quantitative→qualitative)
  • Major challenge: mixing both methods to create meaningful meta-inferences
  • Recommendation: collect both data types at multiple points and analyze simultaneously for rich insights

🔑 Design elements affecting structure

⏱️ Time orientation

| Design type | Characteristics | Implications |
| --- | --- | --- |
| Concurrent | Both strands implemented simultaneously | Integration often in analysis or results |
| Sequential | One strand follows another | First strand informs second strand |

⚖️ Status considerations

| Status type | Characteristics | Implications |
| --- | --- | --- |
| Equal status | Both strands weighted equally | Both emphasized equally in paper |
| Dominant-less dominant | One strand has greater emphasis | Greater emphasis on dominant strand; less dominant provides limited contributions |

Don't confuse: Status is about emphasis and contribution weight, not about which method is "better"—it reflects the research design strategy.

🔄 Integration points

Integration can happen at multiple stages:

  • Data collection: e.g., open- and closed-ended questions in same survey
  • Data analysis: e.g., quantifying qualitative data (quantitizing)
  • Results: e.g., side-by-side comparison of findings
  • Interpretation: developing meta-inferences
12

Guidelines for Editors and Reviewers of Mixed-Methods Research

CHAPTER 12. Guidelines for Editors and Reviewers of Mixed-Methods Research

🧭 Overview

🧠 One-sentence thesis

Editors and reviewers can use a structured checklist to evaluate mixed-methods research papers by ensuring appropriate methodology selection, clear paradigmatic stance, rigorous data collection and analysis, and high-quality meta-inferences that advance understanding of the phenomenon under study.

📌 Key points (3–5)

  • Purpose of the checklist: Provides editors and reviewers—especially those unfamiliar with mixed-methods research—with criteria to assess the quality and appropriateness of mixed-methods empirical papers.
  • Core evaluation areas: Appropriateness of approach, paradigmatic stance, theory use, design strategies, data collection/analysis procedures, meta-inference development, and narrative structure.
  • Required vs. recommended elements: Some elements (e.g., research questions, justification, meta-inferences) are required for all mixed-methods papers; others (e.g., visual diagrams) are recommended but not mandatory.
  • Common confusion: Mixed-methods research is not always necessary—if the theoretical mechanism is clear and the goal is simple replication, a single method may suffice; the research questions and purpose should drive methodology selection.
  • Quality focus: Meta-inferences (the integrated findings from qualitative and quantitative strands) are the core outcome of mixed-methods research and must meet quality standards.

📋 Evaluating appropriateness and justification

🎯 When mixed-methods is appropriate

  • Editors and reviewers should first check whether mixed-methods research is actually suitable for the inquiry.
  • Not always necessary: If the theoretical or causal mechanism is already clear and the objective is simply to replicate an existing finding, mixed-methods may not be appropriate—or at least the "length to contribution ratio" should be assessed.
  • Example: A journal valuing robustness may accept a compensation-purpose study, while a novelty-focused journal may require purposes beyond compensation or corroboration.

❓ Required research questions

Authors should state quantitative questions, qualitative questions, and mixed-methods research questions where relevant.

  • Research questions must drive methodology selection, not the reverse.
  • Example: If the objective is to identify and test theoretical constructs in a new context, an exploratory qualitative phase followed by an explanatory quantitative phase (sequential design, developmental purpose) is appropriate.

🔍 Justification and purpose

Editors and reviewers should ensure:

  • Justification for mixed-methods: Authors must explain why mixed-methods is needed.
  • Purpose identification: One or more of the seven purposes (compensation, corroboration, diversity, developmental, complementarity, completeness, expansion) must be clearly identified and explained.
  • Benefits articulated: Authors should convey the specific benefits of using a mixed-methods approach, especially until norms and templates for such papers are widely understood.

🧩 Paradigmatic stance and theory

🌐 Paradigmatic stance evaluation

  • This is described as "perhaps the biggest challenge and associated risk" in moving mixed-methods papers to publication.
  • Open-mindedness required: Editors and reviewers should keep an open mind, as mixed-methods research is characterized by paradigm pluralism.
  • Common stances: pragmatism, critical realism, transformative emancipatory, and dialectic stance.
| Aspect | What to check |
| --- | --- |
| Identification | Is the paradigmatic approach and its underlying assumptions identified? |
| Multiple paradigms | If authors use positivism in the quantitative strand and interpretivism in the qualitative strand, evaluate based on both paradigms |
| Evaluation basis | Methodological selection should be evaluated based on the research questions, not rejected due to paradigm choice |

Don't confuse: The paradigmatic stance should not become an obstacle to conducting or publishing mixed-methods research; it should inform and guide the methodology.

📚 Theory in mixed-methods papers

  • Theory-driven approach: If used, prior theory is required; hypotheses should be developed based on that theory.
  • Data-driven approach: Reviewing relevant literature is still recommended (though not always necessary); authors may use techniques like big data analytics to generate theory.
  • Editors should ensure appropriate literature review is presented if applicable, even if prior work is insufficient to fully explain the phenomenon.

🔧 Design strategies and data procedures

🏗️ Mixed-methods design elements

Editors and reviewers should ensure:

  • Design elements explained: If authors use multiple design elements (time orientation, strands/phases, mixing strategies, priority of methodological approach), they must explain how elements are selected and related.
  • Nomenclature: Authors must use mixed-methods nomenclature (not just qualitative or quantitative terminology).
  • Alignment with purpose: Design strategies must align with the stated purpose(s) of mixed-methods research.

Example: If the purpose is developmental, a sequential design is more suitable than a concurrent design.

📊 Data collection and analysis

Required reporting includes:

  • Separate descriptions: Detailed descriptions of quantitative and qualitative methods, including specific forms of data collection and analysis techniques (procedures, sample size, types of analysis).
  • Mixed-methods analysis strategies: How both types of data are rigorously analyzed so useful and credible inferences can be derived.
  • Quality metrics for each strand:
    • Quantitative: reliability, internal validity, discriminant validity
    • Qualitative: credibility, dependability
    • Big data: data extraction techniques, algorithms, validation techniques (e.g., sensitivity analysis)

🖼️ Visual representation

  • Recommended (not required): A visual depiction or diagram (e.g., joint display, diagram of procedures) helps readers understand the findings and the sequencing of data collection.
  • A diagram of procedures should portray quantitative and qualitative data collection sequencing, integration, and outcomes of various stages.

🎯 Meta-inferences and quality

🔑 Centrality of meta-inferences

Meta-inferences are the outcomes of mixed-methods research.

  • If authors fail to provide and explain meta-inferences, the key objective of conducting mixed-methods research is not achieved.
  • Meta-inferences are generated by integrating qualitative and quantitative inferences.

✅ Quality assessment

Editors and reviewers should ensure:

  • All inferences meet quality criteria: Both qualitative and quantitative inferences, as well as meta-inferences, must meet quality standards.
  • Explicit integration discussion: Authors must provide an explicit discussion and assessment of how they integrated findings from the two strands.
  • Theoretical reasoning specified: Authors must explain how inferences are generated (induction, deduction, abduction).

🔀 Three possible outcomes when integrating findings

| Outcome | Meaning | What meta-inferences provide |
| --- | --- | --- |
| Convergent | Qualitative and quantitative results agree | Strengthen initial theoretical assumptions |
| Complementary | Results relate to different aspects of the same phenomenon | A more complete picture that cannot be achieved by a single method |
| Divergent | Results contradict each other | A means to find new, better, complete, and/or contingent explanations |

Critical requirement: When findings are divergent, authors are required to resolve or provide sufficient explanations (e.g., reanalyzing existing data, revisiting theoretical assumptions).

📝 Narrative structure and discussion

📖 Structure alignment

  • The narrative structure of the paper must relate to the design strategies.
  • Example: In a concurrent mixed-methods design, qualitative and quantitative strands should be discussed simultaneously.
  • The discussion section should follow the sequence of procedures used in the mixed-methods design.

💡 Discussion section requirements

Editors and reviewers should ensure the discussion:

  • Articulates advancement: Explains how the use of mixed-methods advances understanding of the phenomenon of interest.
  • Links back to research questions: Findings should be connected to the original research questions.
  • Includes implications: Discusses implications of the integrated findings.
  • Addresses contribution: For data-driven approaches, specifies how the contribution has refined existing theory or generated new theory.

🔄 Logical flow

  • The narrative should flow logically and make sense.
  • The discussion should reflect the implications of findings from the two different research strands.

📋 Summary checklist structure

The chapter provides a comprehensive checklist organized by guideline areas:

  1. Appropriateness of mixed-methods approach (required elements: research questions, justification, purpose identification, benefits explanation)
  2. Paradigmatic stance (required: identification of paradigmatic approach and assumptions)
  3. Theory (required for theory-driven; recommended literature review for data-driven)
  4. Design strategies (required: design elements, nomenclature, alignment with purpose)
  5. Data collection and analysis (required: procedures, quality metrics; recommended: visual model, joint display)
  6. Meta-inferences (required: proper development, quality assessment, explanation of generation process, reconciliation of divergent findings)
  7. Additional elements (required: narrative structure alignment, discussion articulating advancement)

Key distinction: The checklist is not a rigid standard but identifies recommended elements that represent the most important aspects of an empirical mixed-methods research paper, particularly helpful for those less familiar with this methodology.

13

Challenges and Strategies in Conducting, Writing, and Publishing Mixed-Methods Research

CHAPTER 13. Challenges and Strategies in Conducting, Writing, and Publishing Mixed-Methods Research

🧭 Overview

🧠 One-sentence thesis

Mixed-methods research faces multiple challenges—from paradigmatic debates to integration problems—but researchers can overcome them through pragmatic philosophical stances, careful planning, and systematic design choices.

📌 Key points (3–5)

  • Six major challenge categories: paradigmatic/philosophical issues, research question formulation, nomenclature/design complexity, integration of findings, team selection and resource requirements, and presentation of findings.
  • The incompatibility thesis debate: purists argue qualitative and quantitative methods cannot be mixed due to fundamental epistemological differences; pragmatism offers a solution by rejecting "either-or" thinking.
  • Integration is the hardest part: most studies integrate only at interpretation, not analysis; when findings don't cohere, diffraction methods allow contradictions to reveal different aspects of phenomena.
  • Common confusion—metaphysical vs. shared-belief paradigms: moving from rigid paradigm-method links to viewing paradigms as shared research community beliefs enables mixed-methods work.
  • Resource and scope expansion risks: mixed-methods requires more time, expertise, and resources than single-method research; careful planning and realistic scope management are critical.

🔥 Paradigmatic and Philosophical Challenges

🔥 The incompatibility thesis

Incompatibility thesis: the claim that it is inappropriate to mix qualitative and quantitative methods due to fundamental differences (e.g., epistemological) between the paradigms supposedly underlying those methods.

  • The core debate: purists from each tradition (constructivists, interpretivists, positivists) argue their methods must remain independent.
  • Why purists object: they believe different methods have distinct assumptions about phenomena; mixing leads to "vastly diverse, disparate, and totally antithetical ends."
  • Real-world impact: some journals have built reputations around one specific methodology (either-or, not both).

Example: A positivist might argue that objective measurement cannot be combined with subjective interpretation because they rest on incompatible views of reality.

🌉 Moving to pragmatism

Strategy: shift from "metaphysical paradigms" to "paradigms as shared beliefs in a research community."

  • Metaphysical paradigms: purists link assumptions of their chosen paradigm rigidly with methodological traditions.
  • Shared-belief paradigms: supersede qualitative/quantitative dichotomies of induction/deduction, subjectivity/objectivity, context/generality.
  • Pragmatism's role: rejects "either-or" decisions; accepts multiple methods in one project; emphasizes research questions over methodologies.

Contingency theory approach: accept that qualitative, quantitative, and mixed-methods are all appropriate under different circumstances; the researcher's task is to examine specific contingencies and decide which approach fits the situation.

⚖️ Handling contradictory findings

Challenge: when combining paradigms, what if findings from one method contradict findings from another?

Strategy: take a comprehensive, new epistemological position.

  • Knowledge from qualitative and quantitative methods should not be seen as irreconcilable pools.
  • Instead, view them as different positions on a continuum of knowledge.
  • Within complex reality, we need different and various types of knowledge.

🤝 Overcoming lack of commonality

Problem: researchers have different standpoints about mixed-methods research.

  • Some believe mixed-methods serves the quantitative paradigm while relegating qualitative to secondary status.
  • Others believe mixing should occur at all stages of the study.

Strategy: be more open-minded; review exemplary mixed-methods studies; select one or two as "gold standards" for guiding the research process.

🎯 Research Questions and Rationale

🎯 Assessing suitability

Challenge: some researchers adopt mixed-methods because they believe it's a "new approach," even when their research questions don't require it.

Key principle: research questions must drive method selection, not the other way around.

  • Provide solid reasons for choosing mixed-methods and the value it brings to answering research questions.
  • Don't use mixed-methods simply to improve publication probability—this creates a mismatch between stated rationale and actual objective.

📏 Managing scope expansion

Challenge: formulating appropriate research questions that leverage the synergy of integrating data without expanding scope beyond the optimal.

  • Combining qualitative and quantitative methods can easily broaden research scope.
  • Example from the excerpt: A researcher studying forest policy in Ghana had to conduct more interviews than planned because the target group in the field differed from the initially planned group.

Strategy: anticipate changes that may occur during the study (especially in sequential designs); be open to emerging research questions without expanding scope beyond optimal.

  • If expansion is necessary: gather more resources (if available) or reallocate existing resources.

🗺️ Nomenclature, Design, and Classification

🗺️ The taxonomy challenge

Problem: significant number of taxonomies, categorizations, and terminologies surrounding mixed-methods research.

  • Different typologies have overlapping and divergent components.
  • Different names and labels for the same thing (e.g., "convergent parallel design" = "concurrent design").
  • Researchers face the challenge of examining different types from various resources (books, journal articles) and selecting one that suits their purpose.

🧭 Navigation strategies

Strategy 1: Review different design elements; select those most salient to the study; explain those dimensions in the mixed-methods design section.

Strategy 2: Use a systematic decision choice model that covers most types of mixed-methods research.

  • Treat research questions and purposes as separate from design, data collection/analysis techniques, and inferential stages.
  • This separation flexibly guides researchers from one stage to the next without being trapped in complex definitions and design selections.

🔗 Integration of Findings

🔗 The integration challenge

Core problem: ensuring proper integration of findings across methods is one of the biggest challenges.

  • Most studies integrate only at the interpretation level.
  • Few studies report integration at the analysis stage (data integration).

Data integration: goes beyond merely reporting the findings of both approaches; it considers the interaction of the data during analysis to produce a more comprehensive picture.

🔧 Common integration techniques

Two most common data integration techniques:

  1. Combination of data types within different analyses: using categorical or continuous variables both for statistical analysis and for comparing coded qualitative data.
  2. Conversion of data: quantitizing (qualitative → quantitative) or qualitizing (quantitative → qualitative).

Important: there must be a clear rationale for using such techniques.
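As a minimal sketch of the conversion technique, the snippet below (Python, with hypothetical participant IDs and theme codes) quantitizes coded qualitative transcripts into per-participant code frequencies that can enter a statistical analysis as count or binary variables; all data and names are illustrative assumptions:

```python
from collections import Counter

# Minimal sketch of quantitizing (illustrative data): turn the theme codes
# assigned during qualitative coding into numeric variables per participant.

coded_transcripts = {
    # participant_id -> theme codes assigned during qualitative coding
    "P01": ["time_pressure", "family_saving", "time_pressure"],
    "P02": ["family_saving"],
    "P03": ["individual_saving", "time_pressure"],
}

def quantitize(transcripts, codes):
    """Return a per-participant frequency table over a fixed code list."""
    rows = {}
    for pid, assigned in transcripts.items():
        counts = Counter(assigned)
        rows[pid] = {code: counts.get(code, 0) for code in codes}
    return rows

codes = ["time_pressure", "family_saving", "individual_saving"]
freq = quantitize(coded_transcripts, codes)
# freq["P01"]["time_pressure"] is a count variable; int(count > 0) yields
# a binary indicator usable alongside the quantitative strand's measures.
```

The rationale requirement still applies: the chosen conversion (counts vs. binary indicators) should follow from the research questions, not convenience.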

💥 When integration produces "cuts"

Problem: data integration can be problematic when objects examined in qualitative and quantitative strands are different.

  • Different methods may produce "cuts"—findings that are completely different and do not fit together.

Example from the excerpt: A study of financial education with Hong Kong ethnic minority students found:

  • Quantitative: intervention group showed significant increase in long-term financial planning strategies.
  • Qualitative: students in intervention group showed no improvement in actual saving behavior.
  • Why the contradiction: quantitative strand targeted individual financial literacy; qualitative revealed family-level saving behaviors (cultural finance perspective).

Don't confuse: "cuts" are not research failures—they reveal different aspects of the same phenomenon (or multiple phenomena).

🌈 The diffraction approach

Diffraction: a method for integrating qualitative and quantitative data that are not coherent; involves reading data across methods while allowing data to cohere or not.

Key principles:

  • Mixed data, objects, and methods coproduce one another.
  • The ontology of data, objects, and methodological approach become as important as their epistemologies.
  • Precludes using mixed data merely to illustrate, enrich, or verify each other.
  • If data don't cohere, produce seemingly conflicting findings and explain using different narratives.

Why diffraction matters: it refuses to "hold still" one method and recognizes the research object as a messy, processual entity.

👥 Team Selection and Resources

👥 Expertise requirements

Challenge: mixed-methods research requires methodological and philosophical expertise in multiple areas.

  • Rather than just conducting interviews or just conducting surveys, researchers need to do both.

Strategy: form a strong research team consisting of individuals with backgrounds and expertise in different types of methods.

For Ph.D. students: gain sufficient knowledge in both qualitative and quantitative research (e.g., courses, workshops) before undertaking a mixed-methods project.

💰 Resource demands

Challenge: collecting two different types of data means greater resources required for data collection, management, and analysis.

| Design type | Time consideration |
| --- | --- |
| Sequential | Significantly more time required to collect and analyze the first data set before commencing subsequent data collection |
| Concurrent | Sufficient time and resources required to facilitate collection of two sets of data at the same time |

Strategy: careful planning and preparation.

  • Secure sufficient resources to collect and analyze both qualitative and quantitative data.
  • Create a timeline for data collection and analysis.

📝 Presentation of Findings

📝 Volume and reporting challenges

Challenge: large volume of data generated by mixed-methods research creates challenges in analysis and reporting.

Strategy 1: Use templates for publishing mixed-methods research (referenced in Chapter 11 of the source book).

Strategy 2—when facing word/page limits:

| Approach | Description |
| --- | --- |
| Appendixes/supplementary materials | Report study details in appendixes or supplementary materials for review |
| Separate publications (sub-optimal) | Publish results of each study separately, incorporating meta-inferences into the second study |
| Journal flexibility | Encourage journals to be flexible regarding page limits for mixed-methods papers |

Why flexibility matters: mixed-methods papers necessarily tend to be longer due to method details, analysis/results details, and meta-inferences; this ensures transparency and careful evaluation.

📋 Summary table of all challenges and strategies

| Category | Challenge | Strategy |
| --- | --- | --- |
| Paradigmatic/philosophical | Incompatibility thesis debate | Move to pragmatism; use a contingency approach; take a new epistemological position |
| Paradigmatic/philosophical | Lack of commonality in standpoints | Review exemplary studies; use "gold standards" |
| Research questions | Assessing suitability | Ensure questions drive method selection |
| Research questions | Scope expansion | Anticipate changes; manage scope; reallocate resources |
| Nomenclature/design | Too many taxonomies | Review design elements; use a decision choice model |
| Integration | Proper integration across methods | Use the diffraction approach for non-coherent data |
| Team selection | Multiple expertise areas needed | Form a strong team; gain sufficient knowledge first |
| Resources | Greater demands than single-method research | Careful planning; create a timeline |
| Presentation | Large data volume | Use templates; appendixes; encourage journal flexibility |