Research Design | Step-by-Step Guide with Examples

Published on 5 May 2022 by Shona McCombes. Revised on 20 March 2023.

A research design is a strategy for answering your research question using empirical data. Creating a research design means making decisions about:

  • Your overall aims and approach
  • The type of research design you’ll use
  • Your sampling methods or criteria for selecting subjects
  • Your data collection methods
  • The procedures you’ll follow to collect data
  • Your data analysis methods

A well-planned research design helps ensure that your methods match your research aims and that you use the right kind of analysis for your data.

Table of contents

  • Introduction
  • Step 1: Consider your aims and approach
  • Step 2: Choose a type of research design
  • Step 3: Identify your population and sampling method
  • Step 4: Choose your data collection methods
  • Step 5: Plan your data collection procedures
  • Step 6: Decide on your data analysis strategies
  • Frequently asked questions

Before you can start designing your research, you should already have a clear idea of the research question you want to investigate.

There are many different ways you could go about answering this question. Your research design choices should be driven by your aims and priorities – start by thinking carefully about what you want to achieve.

The first choice you need to make is whether you’ll take a qualitative or quantitative approach.

Qualitative research designs tend to be more flexible and inductive, allowing you to adjust your approach based on what you find throughout the research process.

Quantitative research designs tend to be more fixed and deductive, with variables and hypotheses clearly defined in advance of data collection.

It’s also possible to use a mixed methods design that integrates aspects of both approaches. By combining qualitative and quantitative insights, you can gain a more complete picture of the problem you’re studying and strengthen the credibility of your conclusions.

Practical and ethical considerations when designing research

As well as scientific considerations, you need to think practically when designing your research. If your research involves people or animals, you also need to consider research ethics.

  • How much time do you have to collect data and write up the research?
  • Will you be able to gain access to the data you need (e.g., by travelling to a specific location or contacting specific people)?
  • Do you have the necessary research skills (e.g., statistical analysis or interview techniques)?
  • Will you need ethical approval?

At each stage of the research design process, make sure that your choices are practically feasible.


Within both qualitative and quantitative approaches, there are several types of research design to choose from. Each type provides a framework for the overall shape of your research.

Types of quantitative research designs

Quantitative designs can be split into four main types. Experimental and quasi-experimental designs allow you to test cause-and-effect relationships, while descriptive and correlational designs allow you to measure variables and describe relationships between them.

With descriptive and correlational designs, you can get a clear picture of characteristics, trends, and relationships as they exist in the real world. However, you can’t draw conclusions about cause and effect (because correlation doesn’t imply causation).

Experiments are the strongest way to test cause-and-effect relationships without the risk of other variables influencing the results. However, their controlled conditions may not always reflect how things work in the real world. They’re often also more difficult and expensive to implement.

Types of qualitative research designs

Qualitative designs are less strictly defined. This approach is about gaining a rich, detailed understanding of a specific context or phenomenon, and you can often be more creative and flexible in designing your research.

Common types of qualitative design include case studies, ethnographies, grounded theory, and phenomenological research. These designs often take similar approaches to data collection, but focus on different aspects when analysing the data.

Your research design should clearly define who or what your research will focus on, and how you’ll go about choosing your participants or subjects.

In research, a population is the entire group that you want to draw conclusions about, while a sample is the smaller group of individuals you’ll actually collect data from.

Defining the population

A population can be made up of anything you want to study – plants, animals, organisations, texts, countries, etc. In the social sciences, it most often refers to a group of people.

For example, will you focus on people from a specific demographic, region, or background? Are you interested in people with a certain job or medical condition, or users of a particular product?

The more precisely you define your population, the easier it will be to gather a representative sample.

Sampling methods

Even with a narrowly defined population, it’s rarely possible to collect data from every individual. Instead, you’ll collect data from a sample.

To select a sample, there are two main approaches: probability sampling and non-probability sampling. The sampling method you use affects how confidently you can generalise your results to the population as a whole.

Probability sampling is the most statistically valid option, but it’s often difficult to achieve unless you’re dealing with a very small and accessible population.

For practical reasons, many studies use non-probability sampling, but it’s important to be aware of the limitations and carefully consider potential biases. You should always make an effort to gather a sample that’s as representative as possible of the population.
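To make these options concrete, here is a minimal sketch (not from the original article) of two probability sampling methods in Python; the numbered population is hypothetical.

```python
import random

# Hypothetical sampling frame: ID numbers for every member of the population.
population = list(range(1, 1001))  # 1,000 individuals

random.seed(42)  # fixed seed so the draw is reproducible

# Simple random sampling: every individual has an equal, known chance
# of being selected.
simple_random_sample = random.sample(population, k=100)

# Systematic sampling: a random start, then every 10th individual.
start = random.randrange(10)
systematic_sample = population[start::10]

print(len(simple_random_sample), len(systematic_sample))  # 100 100
```

Non-probability methods (such as convenience or snowball sampling) have no equivalent random draw, which is exactly why their results are harder to generalise.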

Case selection in qualitative research

In some types of qualitative designs, sampling may not be relevant.

For example, in an ethnography or a case study, your aim is to deeply understand a specific context, not to generalise to a population. Instead of sampling, you may simply aim to collect as much data as possible about the context you are studying.

In these types of design, you still have to carefully consider your choice of case or community. You should have a clear rationale for why this particular case is suitable for answering your research question.

For example, you might choose a case study that reveals an unusual or neglected aspect of your research problem, or you might choose several very similar or very different cases in order to compare them.

Data collection methods are ways of directly measuring variables and gathering information. They allow you to gain first-hand knowledge and original insights into your research problem.

You can choose just one data collection method, or use several methods in the same study.

Survey methods

Surveys allow you to collect data about opinions, behaviours, experiences, and characteristics by asking people directly. There are two main survey methods to choose from: questionnaires and interviews.

Observation methods

Observations allow you to collect data unobtrusively, observing characteristics, behaviours, or social interactions without relying on self-reporting.

Observations may be conducted in real time, taking notes as you observe, or you might make audiovisual recordings for later analysis. They can be qualitative or quantitative.

Other methods of data collection

There are many other ways you might collect data depending on your field and topic.

If you’re not sure which methods will work best for your research design, try reading some papers in your field to see what data collection methods they used.

Secondary data

If you don’t have the time or resources to collect data from the population you’re interested in, you can also choose to use secondary data that other researchers have already collected – for example, datasets from government surveys or previous studies on your topic.

With this raw data, you can do your own analysis to answer new research questions that weren’t addressed by the original study.

Using secondary data can expand the scope of your research, as you may be able to access much larger and more varied samples than you could collect yourself.

However, it also means you don’t have any control over which variables to measure or how to measure them, so the conclusions you can draw may be limited.

As well as deciding on your methods, you need to plan exactly how you’ll use these methods to collect data that’s consistent, accurate, and unbiased.

Planning systematic procedures is especially important in quantitative research, where you need to precisely define your variables and ensure your measurements are reliable and valid.

Operationalisation

Some variables, like height or age, are easily measured. But often you’ll be dealing with more abstract concepts, like satisfaction, anxiety, or competence. Operationalisation means turning these fuzzy ideas into measurable indicators.

If you’re using observations, which events or actions will you count?

If you’re using surveys, which questions will you ask and what range of responses will be offered?

You may also choose to use or adapt existing materials designed to measure the concept you’re interested in – for example, questionnaires or inventories whose reliability and validity have already been established.
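As a hedged illustration of operationalisation (not part of the original text), the sketch below turns the abstract concept of satisfaction into a measurable score by averaging hypothetical Likert-scale items.

```python
# Operationalising "satisfaction" as the mean of three 1-5 Likert items.
# The item names and responses below are hypothetical.
responses = {
    "enjoys_product": 4,
    "would_recommend": 5,
    "meets_expectations": 3,
}

satisfaction_score = sum(responses.values()) / len(responses)
print(f"Operationalised satisfaction score: {satisfaction_score:.2f}")  # 4.00
```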

Reliability and validity

Reliability means your results can be consistently reproduced, while validity means that you’re actually measuring the concept you’re interested in.

For valid and reliable results, your measurement materials should be thoroughly researched and carefully designed. Plan your procedures to make sure you carry out the same steps in the same way for each participant.

If you’re developing a new questionnaire or other instrument to measure a specific concept, running a pilot study allows you to check its validity and reliability in advance.
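One common way to check internal consistency in a pilot study is Cronbach's alpha. The sketch below is an illustration under assumed data (six hypothetical respondents, three Likert items), not a procedure prescribed by the article.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal-consistency reliability; rows = respondents, columns = items."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Hypothetical pilot data: 6 respondents answering 3 Likert items.
pilot = np.array([
    [4, 5, 4],
    [2, 3, 2],
    [5, 5, 4],
    [3, 3, 3],
    [4, 4, 5],
    [1, 2, 2],
])
print(f"Cronbach's alpha: {cronbach_alpha(pilot):.2f}")
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency, though the threshold depends on the field.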

Sampling procedures

As well as choosing an appropriate sampling method, you need a concrete plan for how you’ll actually contact and recruit your selected sample.

That means making decisions about things like:

  • How many participants do you need for an adequate sample size?
  • What inclusion and exclusion criteria will you use to identify eligible participants?
  • How will you contact your sample – by mail, online, by phone, or in person?

If you’re using a probability sampling method, it’s important that everyone who is randomly selected actually participates in the study. How will you ensure a high response rate?

If you’re using a non-probability method, how will you avoid bias and ensure a representative sample?
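The first question in the list above – sample size – is often approached with a rough power calculation. Here is a minimal sketch using the standard normal approximation for a two-group comparison; the effect size, alpha, and power values are illustrative assumptions, not recommendations.

```python
from math import ceil

from scipy.stats import norm

def n_per_group(effect_size: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate sample size per group for a two-sided, two-sample comparison,
    using the standard normal approximation."""
    z_alpha = norm.ppf(1 - alpha / 2)  # critical value for the significance level
    z_beta = norm.ppf(power)           # critical value for the desired power
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# With a medium effect (Cohen's d = 0.5), this gives roughly 63 per group.
print(n_per_group(0.5))
```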

Data management

It’s also important to create a data management plan for organising and storing your data.

Will you need to transcribe interviews or perform data entry for observations? You should anonymise and safeguard any sensitive data, and make sure it’s backed up regularly.

Keeping your data well organised will save time when it comes to analysing them. It can also help other researchers validate and add to your findings.

On their own, raw data can’t answer your research question. The last step of designing your research is planning how you’ll analyse the data.

Quantitative data analysis

In quantitative research, you’ll most likely use some form of statistical analysis. With statistics, you can summarise your sample data, make estimates, and test hypotheses.

Using descriptive statistics, you can summarise your sample data in terms of:

  • The distribution of the data (e.g., the frequency of each score on a test)
  • The central tendency of the data (e.g., the mean to describe the average score)
  • The variability of the data (e.g., the standard deviation to describe how spread out the scores are)

The specific calculations you can do depend on the level of measurement of your variables.
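For instance, all three summaries can be computed with Python's standard library; the test scores below are hypothetical.

```python
import statistics
from collections import Counter

# Hypothetical test scores from a sample of 12 students.
scores = [65, 70, 70, 72, 75, 75, 75, 80, 82, 85, 90, 95]

print("Distribution:", Counter(scores))                           # frequency of each score
print("Mean:", round(statistics.mean(scores), 2))                 # central tendency
print("Standard deviation:", round(statistics.stdev(scores), 2))  # variability
```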

Using inferential statistics, you can:

  • Make estimates about the population based on your sample data.
  • Test hypotheses about a relationship between variables.

Regression and correlation tests look for associations between two or more variables, while comparison tests (such as t tests and ANOVAs) look for differences in the outcomes of different groups.
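As a hedged sketch of both kinds of test, here is how they might look with SciPy; the group labels and values are invented for illustration.

```python
from scipy import stats

# Hypothetical outcome scores for two groups (e.g., treatment vs control).
group_a = [23, 25, 28, 30, 31, 27, 26]
group_b = [20, 22, 21, 24, 23, 19, 25]

# Comparison test: do the group means differ?
t_stat, p_value = stats.ttest_ind(group_a, group_b)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# Association test: are two variables linearly related?
hours_studied = [1, 2, 3, 4, 5, 6, 7]
r, p = stats.pearsonr(hours_studied, group_a)
print(f"r = {r:.2f}, p = {p:.4f}")
```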

Your choice of statistical test depends on various aspects of your research design, including the types of variables you’re dealing with and the distribution of your data.

Qualitative data analysis

In qualitative research, your data will usually be very dense with information and ideas. Instead of summing it up in numbers, you’ll need to comb through the data in detail, interpret its meanings, identify patterns, and extract the parts that are most relevant to your research question.

Two of the most common approaches to doing this are thematic analysis and discourse analysis.

There are many other ways of analysing qualitative data depending on the aims of your research. To get a sense of potential approaches, try reading some qualitative research papers in your field.
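Qualitative coding is interpretive work, usually done by hand or with dedicated software, but a toy sketch can show the mechanical part of tallying coded themes. The excerpts and keyword codebook below are hypothetical, and keyword matching is only a crude first pass, not thematic analysis itself.

```python
from collections import Counter

# Hypothetical interview excerpts and a toy keyword codebook.
excerpts = [
    "I felt anxious before the exam but my friends supported me.",
    "The workload was stressful, though the support from tutors helped.",
    "Deadlines make me anxious; I manage by planning ahead.",
]
codebook = {
    "anxiety": ["anxious", "stressful"],
    "social support": ["supported", "support"],
    "coping": ["manage", "planning"],
}

# Count how many excerpts touch each candidate theme.
theme_counts = Counter()
for text in excerpts:
    lowered = text.lower()
    for theme, keywords in codebook.items():
        if any(keyword in lowered for keyword in keywords):
            theme_counts[theme] += 1

print(theme_counts)  # Counter({'anxiety': 3, 'social support': 2, 'coping': 1})
```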

A sample is a subset of individuals from a larger population. Sampling means selecting the group that you will actually collect data from in your research.

For example, if you are researching the opinions of students in your university, you could survey a sample of 100 students.

Statistical sampling allows you to test a hypothesis about the characteristics of a population. There are various sampling methods you can use to ensure that your sample is representative of the population as a whole.

Operationalisation means turning abstract conceptual ideas into measurable observations.

For example, the concept of social anxiety isn’t directly observable, but it can be operationally defined in terms of self-rating scores, behavioural avoidance of crowded places, or physical anxiety symptoms in social situations.

Before collecting data, it’s important to consider how you will operationalise the variables that you want to measure.

The research methods you use depend on the type of data you need to answer your research question.

  • If you want to measure something or test a hypothesis, use quantitative methods. If you want to explore ideas, thoughts, and meanings, use qualitative methods.
  • If you want to analyse a large amount of readily available data, use secondary data. If you want data specific to your purposes with control over how they are generated, collect primary data.
  • If you want to establish cause-and-effect relationships between variables, use experimental methods. If you want to understand the characteristics of a research subject, use descriptive methods.


The Four Types of Research Paradigms: A Comprehensive Guide

22nd January 2023

In this guide, you’ll learn all about the four research paradigms and how to choose the right one for your research.

Introduction to Research Paradigms

A paradigm is a system of beliefs, ideas, values, or habits that form the basis for a way of thinking about the world. Therefore, a research paradigm is an approach, model, or framework from which to conduct research. The research paradigm helps you to form a research philosophy, which in turn informs your research methodology.

Your research methodology is essentially the “how” of your research – how you design your study to not only accomplish your research’s aims and objectives but also to ensure your results are reliable and valid. Choosing the correct research paradigm is crucial because it provides a logical structure for conducting your research and improves the quality of your work, assuming it’s followed correctly.

Three Pillars: Ontology, Epistemology, and Methodology

Before we jump into the four types of research paradigms, we need to consider the three pillars of a research paradigm.

Ontology addresses the question, “What is reality?” It’s the study of being. This pillar is about finding out what you seek to research. What do you aim to examine?

Epistemology is the study of knowledge. It asks, “How is knowledge gathered and from what sources?”

Methodology involves the system in which you choose to investigate, measure, and analyze your research’s aims and objectives. It answers the “how” questions.

Let’s now take a look at the different research paradigms.

1. Positivist Research Paradigm

The positivist research paradigm assumes that there is one objective reality, and people can know this reality and accurately describe and explain it. Positivists rely on their observations through their senses to gain knowledge of their surroundings.

In this singular objective reality, researchers can compare their claims and ascertain the truth. This means researchers are limited to data collection and interpretations from an objective viewpoint. As a result, positivists usually use quantitative methodologies in their research (e.g., statistics, social surveys, and structured questionnaires).

This research paradigm is mostly used in natural sciences, physical sciences, or whenever large sample sizes are being used.

2. Interpretivist Research Paradigm

Interpretivists believe that different people in society experience and understand reality in different ways – while there may be only “one” reality, everyone interprets it according to their own view. They also believe that all research is influenced and shaped by researchers’ worldviews and theories.

As a result, interpretivists use qualitative methods and techniques to conduct their research. This includes interviews, focus groups, observations of a phenomenon, or collecting documentation on a phenomenon (e.g., newspaper articles, reports, or information from websites).

3. Critical Theory Research Paradigm

The critical theory paradigm asserts that social science can never be 100% objective or value-free. This paradigm is focused on enacting social change through scientific investigation. Critical theorists question knowledge and procedures and acknowledge how power is used (or abused) in the phenomena or systems they’re investigating.


Researchers using this paradigm are more often than not aiming to create a more just, egalitarian society in which individual and collective freedoms are secure. Both quantitative and qualitative methods can be used with this paradigm.

4. Constructivist Research Paradigm

Constructivism asserts that reality is a construct of our minds; therefore, reality is subjective. Constructivists believe that all knowledge comes from our experiences and reflections on those experiences, and they oppose the idea that there is a single methodology for generating knowledge.

This paradigm is mostly associated with qualitative research approaches due to its focus on experiences and subjectivity. The researcher focuses on participants’ experiences as well as their own.

Choosing the Right Research Paradigm for Your Study

Once you have a comprehensive understanding of each paradigm, you’re faced with a big question: which paradigm should you choose? The answer to this will set the course of your research and determine its success, findings, and results.

To start, you need to identify your research problem, research objectives, and hypothesis. This will help you to establish what you want to accomplish or understand from your research and the path you need to take to achieve this.

You can begin this process by asking yourself some questions:

  • What is the nature of your research problem (i.e., quantitative or qualitative)?
  • How can you acquire the knowledge you need and communicate it to others? For example, is this knowledge already available in other forms (e.g., documents) and do you need to gain it by gathering or observing other people’s experiences or by experiencing it personally?
  • What is the nature of the reality that you want to study? Is it objective or subjective?

Depending on the problem and objective, other questions may arise during this process that lead you to a suitable paradigm. Ultimately, you must be able to state, explain, and justify the research paradigm you select for your research and be prepared to include this in your dissertation’s methodology and design section.

Using Two Paradigms

If the nature of your research problem and objectives involves both quantitative and qualitative aspects, then you might consider using two paradigms or a mixed methods approach. In this case, one paradigm is used to frame the qualitative aspects of the study and another the quantitative aspects. This is acceptable, although you will be tasked with explaining your rationale for using both paradigms in your research.

Choosing the right research paradigm for your research can seem like an insurmountable task. It requires you to:

  • Have a comprehensive understanding of the paradigms,
  • Identify your research problem, objectives, and hypothesis, and
  • Be able to state, explain, and justify the paradigm you select in your methodology and design section.


Research Methods Information: Theoretical Models (Using Theory)


Selected Journals on Theory

  • Historical Materialism – an interdisciplinary journal dedicated to exploring and developing the critical and explanatory potential of Marxist theory.
  • International Journal of Social Research Methodology – focuses on current and emerging methodological debates across a wide range of social science disciplines and substantive interests.
  • Journal of Professional Counseling: Practice, Theory & Research – includes practical and unique applications of counseling techniques in schools and clinical settings, as well as significant quantitative and qualitative research.
  • Journal of Public Administration Research and Theory – peer-reviewed coverage of the research and theory of public administration, including reports of empirical work in both quantitative and qualitative areas of research.
  • Reconceptualizing Educational Research Methodology (RERM) – an internationally refereed journal for researchers and practitioners investigating, tracing, and theorizing practices, documentations, and politics in education.

What is Meant by Theory?

A theory is a well-established principle that has been developed to explain some aspect of the natural world.


Defining Theory

Theories are formulated to explain, predict, and understand phenomena and, in many cases, to challenge and extend existing knowledge, within the limits of the critical bounding assumptions.

  • The theoretical framework is the structure that can hold or support a theory of a research study.
  • The theoretical framework introduces and describes the theory which explains why the research problem under study exists.

The Importance of Theory

A theoretical framework consists of concepts, together with their definitions, and existing theory/theories that are used for your particular study. The theoretical framework must demonstrate an understanding of theories and concepts that are relevant to the topic of your research paper and that will relate it to the broader fields of knowledge in the class you are taking.

The theoretical framework is not something that is readily available in the literature. You must review course readings and pertinent research literature for theories and analytic models that are relevant to the research problem you are investigating. The selection of a theory should depend on its appropriateness, ease of application, and explanatory power.

The theoretical framework strengthens the study in the following ways.

  • An explicit statement of theoretical assumptions permits the reader to evaluate them critically.
  • The theoretical framework connects the researcher to existing knowledge. Guided by a relevant theory, you are given a basis for your hypotheses and choice of research methods.
  • Articulating the theoretical assumptions of a research study forces you to address questions of why and how. It permits you to move from simply describing a phenomenon observed to generalizing about various aspects of that phenomenon.
  • Having a theory helps you to identify the limits to those generalizations. A theoretical framework specifies which key variables influence a phenomenon of interest. It alerts you to examine how those key variables might differ and under what circumstances.

By virtue of its applied nature, good theory in the social sciences is of value precisely because it fulfills one primary purpose: to explain the meaning, nature, and challenges of a phenomenon, often experienced but unexplained in the world in which we live, so that we may use that knowledge and understanding to act in more informed and effective ways.

The Conceptual Framework. College of Education. Alabama State University; Drafting an Argument. Writing@CSU. Colorado State University; Trochim, William M.K. Philosophy of Research. Research Methods Knowledge Base. 2006.

Strategies for Developing the Theoretical Framework

I. Developing the Framework

Here are some strategies for developing an effective theoretical framework:

  • Examine your thesis title and research problem. The research problem anchors your entire study and forms the basis from which you construct your theoretical framework.
  • Brainstorm on what you consider to be the key variables in your research. Answer the question: what factors contribute to the presumed effect?
  • Review related literature to find answers to your research question.
  • List the constructs and variables that might be relevant to your study. Group these variables into independent and dependent categories.
  • Review the key social science theories that are introduced to you in your course readings and choose the theory or theories that can best explain the relationships between the key variables in your study [note the Writing Tip on this page].
  • Discuss the assumptions or propositions of this theory and point out their relevance to your research.

A theoretical framework is used to limit the scope of the relevant data by focusing on specific variables and defining the specific viewpoint (framework) that the researcher will take in analyzing and interpreting the data to be gathered, understanding concepts and variables according to the given definitions, and building knowledge by validating or challenging theoretical assumptions.

II. Purpose

Think of theories as the conceptual basis for understanding, analyzing, and designing ways to investigate relationships within social systems. To that end, the following roles served by a theory can help guide the development of your framework.*

  • Means by which new research data can be interpreted and coded for future use,
  • Response to new problems that have no previously identified solutions strategy,
  • Means for identifying and defining research problems,
  • Means for prescribing or evaluating solutions to research problems,
  • Means of identifying which facts among the accumulated knowledge are important and which are not,
  • Means of giving old data new interpretations and new meaning,
  • Means by which to identify important new issues and prescribe the most critical research questions that need to be answered to maximize understanding of the issue,
  • Means of providing members of a professional discipline with a common language and a frame of reference for defining boundaries of their profession, and
  • Means to guide and inform research so that it can, in turn, guide research efforts and improve professional practice.

*Adapted from: Torraco, R. J. “Theory-Building Research Methods.” In Swanson, R. A. and E. F. Holton III, editors. Human Resource Development Handbook: Linking Research and Practice. (San Francisco, CA: Berrett-Koehler, 1997): pp. 114-137; Sutton, Robert I. and Barry M. Staw. “What Theory is Not.” Administrative Science Quarterly 40 (September 1995): 371-384.

Incorporating Theory in Your Structure and Writing Style

The theoretical framework may be rooted in a specific theory, in which case you are expected to test the validity of an existing theory in relation to specific events, issues, or phenomena. Many social science research papers fit into this rubric. For example, Peripheral Realism theory, which categorizes perceived differences between nation-states as those that give orders, those that obey, and those that rebel, could be used as a means for understanding conflicted relationships among countries in Africa.

A test of this theory could be the following: Does Peripheral Realism theory help explain intra-state actions, such as the growing split between southern and northern Sudan that may likely lead to the creation of two nations?

However, you may not always be asked by your professor to test a specific theory in your paper; instead, you may be asked to develop your own framework from which your analysis of the research problem is derived. Given this, it is perhaps easiest to understand the nature and function of a theoretical framework if it is viewed as the answer to two basic questions:

  • What is the research problem/question? [e.g., "How should the individual and the state relate during periods of conflict?"]
  • Why is your approach a feasible solution? [e.g., I could choose to test Instrumentalist or Circumstantialist models developed by Ethnic Conflict Theorists that rely upon socio-economic-political factors to explain individual-state relations, and apply this theoretical model to periods of war between nations.]

The answers to these questions come from a thorough review of the literature and your course readings [summarized and analyzed in the next section of your paper] and the gaps in the research that emerge from the review process. With this in mind, a complete theoretical framework will likely not emerge until after you have completed a thorough review of the literature .

In writing this part of your research paper, keep in mind the following:

  • Clearly describe the framework, concepts, models, or specific theories that underpin your study. This includes noting who the key theorists are in the field who have conducted research on the problem you are investigating and, when necessary, the historical context that supports the formulation of that theory. This latter element is particularly important if the theory is relatively unknown or is borrowed from another discipline.
  • Position your theoretical framework within a broader context of related frameworks, concepts, models, or theories. There will likely be several concepts, theories, or models that can be used to help develop a framework for understanding the research problem. Therefore, note why the framework you've chosen is the appropriate one.
  • The present tense is used when writing about theory.
  • You should make your theoretical assumptions as explicit as possible. Later, your discussion of methodology should be linked back to this theoretical framework.
  • Don’t just take what the theory says as a given! Reality is never accurately represented in such a simplistic way; if you imply that it can be, you fundamentally distort a reader’s ability to understand the findings that emerge. Given this, always note the limitations of the theoretical framework you’ve chosen [i.e., what parts of the research problem require further investigation because the theory does not explain certain phenomena].

The Conceptual Framework. College of Education. Alabama State University; Conceptual Framework: What Do You Think is Going On? College of Engineering. University of Michigan; Drafting an Argument. Writing@CSU. Colorado State University; Lynham, Susan A. “The General Method of Theory-Building Research in Applied Disciplines.” Advances in Developing Human Resources 4 (August 2002): 221-241; Tavallaei, Mehdi and Mansor Abu Talib. “A General Perspective on the Role of Theory in Qualitative Research.” Journal of International Social Research 3 (Spring 2010); Trochim, William M.K. Philosophy of Research. Research Methods Knowledge Base. 2006.

Video on Creating a Theoretical Framework

  • Theoretical Framework – a short introduction to theoretical frameworks and how to approach constructing one, presented by Francois J. Desjardins, Associate Professor at the University of Ontario Institute of Technology. NOTE: Dr. Desjardins speaks a bit quickly at times, but the content of his presentation is very good.

Writing Tip | Borrowing Theoretical Constructs from Elsewhere

A growing and increasingly important trend in the social sciences is to think about and attempt to understand specific research problems from an interdisciplinary perspective. One way to do this is to not rely exclusively on the theories you've read about in a particular class, but to think about how an issue might be informed by theories developed in other disciplines.

For example, if you are a political science student studying the rhetorical strategies used by female incumbents in state legislature campaigns, theories about the use of language could be derived not only from political science but also from linguistics, communication studies, philosophy, psychology, and, in this particular case, feminist studies.

Building theoretical frameworks based on the postulates and hypotheses developed in other disciplinary contexts can be both enlightening and an effective way to be fully engaged in the research topic.

Writing Tip | Don't Undertheorize

Never leave the theory hanging out there in the Introduction never to be mentioned again. Undertheorizing weakens your paper. The theoretical framework you introduce should guide your study throughout the paper.

Be sure to always connect theory to the analysis and to explain in the discussion part of your paper how the theoretical framework you chose fit the research problem, or if appropriate, was inadequate in explaining the phenomenon you were investigating. In that case, don't be afraid to propose your own theory based on your findings.

Writing Tip | Theory vs. Hypothesis

The terms theory and hypothesis are often used interchangeably in everyday use. However, the difference between them in scholarly research is important, particularly when using an experimental design. A theory is a well-established principle that has been developed to explain some aspect of the natural world.

Theories arise from repeated observation and testing and incorporate facts, laws, predictions, and tested hypotheses that are widely accepted [e.g., rational choice theory; grounded theory].

A hypothesis is a specific, testable prediction about what you expect to happen in your study. For example, an experiment designed to look at the relationship between study habits and test anxiety might have a hypothesis that states, "We predict that students with better study habits will suffer less test anxiety." Unless your study is exploratory in nature, your hypothesis should always explain what you expect to happen during the course of your research.

The key distinctions are:

  • A theory predicts events in a broad, general context; a hypothesis makes a specific prediction about a specified set of circumstances.
  • A theory has been extensively tested and is generally accepted among scholars; a hypothesis is a speculative guess that has yet to be tested.

Cherry, Kendra. Introduction to Research Methods: Theory and Hypothesis. About.com Psychology; Gezae, Michael et al. Welcome Presentation on Hypothesis. Slideshare presentation.


Scientific Method

Research Methods: Modeling

LEGO® bricks have been a staple of the toy world since they were first manufactured in Denmark in 1953. The interlocking plastic bricks can be assembled into an endless variety of objects (see Figure 1). Some kids (and even many adults) are interested in building the perfect model – finding the bricks of the right color, shape, and size, and assembling them into a replica of a familiar object in the real world, like a castle, the space shuttle, or London Bridge. Others focus on using the object they build – moving LEGO knights in and out of the castle shown in Figure 1, for example, or enacting a space shuttle mission to Mars. Still others may have no particular end product in mind when they start snapping bricks together and just want to see what they can do with the pieces they have.

[Figure 1: Objects built from LEGO bricks]

On the most basic level, scientists use models in much the same way that people play with LEGO bricks. Scientific models may or may not be physical entities, but scientists build them for the same variety of reasons: to replicate systems in the real world through simplification, to perform an experiment that cannot be done in the real world, or to assemble several known ideas into a coherent whole to build and test hypotheses.

Types of models: Physical, conceptual, mathematical

At the St. Anthony Falls Laboratory at the University of Minnesota, a group of engineers and geologists have built a room-sized physical replica of a river delta to model a real one like the Mississippi River delta in the Gulf of Mexico (Paola et al., 2001). These researchers have successfully incorporated into their model the key processes that control river deltas (like the variability of water flow, the deposition of sediments transported by the river, and the compaction and subsidence of the coastline under the pressure of constant sediment additions) in order to better understand how those processes interact. With their physical model, they can mimic the general setting of the Mississippi River delta and then do things they can’t do in the real world, like take a slice through the resulting sedimentary deposits to analyze the layers within the sediments. Or they can experiment with changing parameters like sea level and sedimentary input to see how those changes affect deposition of sediments within the delta, the same way you might “experiment” with the placement of the knights in your LEGO castle.

[Figure: The St. Anthony Falls Laboratory delta experiment]

Not all models used in scientific research are physical models. Some are conceptual, and involve assembling all of the known components of a system into a coherent whole. This is a little like building an abstract sculpture out of LEGO bricks rather than building a castle. For example, over the past several hundred years, scientists have developed a series of models for the structure of an atom. The earliest known model of the atom compared it to a billiard ball, reflecting what scientists knew at the time – atoms were the smallest pieces of an element that maintained the properties of that element. Despite the fact that this was a purely conceptual model, it could be used to predict some of the behavior that atoms exhibit. However, it did not explain all of the properties of atoms accurately. With the discovery of subatomic particles like the proton and electron, the physicist Ernest Rutherford proposed a “solar system” model of the atom, in which electrons orbited around a nucleus that included protons (see our Atomic Theory I: The Early Days module for more information). While the Rutherford model is useful for understanding basic properties of atoms, it eventually proved insufficient to explain all of the behavior of atoms. The current quantum model of the atom depicts electrons not as pure particles, but as having the properties of both particles and waves, and these electrons are located in specific probability density clouds around the atom’s nucleus.

Both physical and conceptual models continue to be important components of scientific research. In addition, many scientists now build models mathematically through computer programming. These computer-based models serve many of the same purposes as physical models, but are determined entirely by mathematical relationships between variables that are defined numerically. The mathematical relationships are kind of like individual LEGO bricks: They are basic building blocks that can be assembled in many different ways. In this case, the building blocks are fundamental concepts and theories like the mathematical description of turbulent flow in a liquid, the law of conservation of energy, or the laws of thermodynamics, which can be assembled into a wide variety of models for, say, the flow of contaminants released into a groundwater reservoir or for global climate change.


Modeling as a scientific research method

Whether developing a conceptual model like the atomic model, a physical model like a miniature river delta, or a computer model like a global climate model, the first step is to define the system that is to be modeled and the goals for the model. “System” is a generic term that can apply to something very small (like a single atom), something very large (like the Earth’s atmosphere), or something in between, like the distribution of nutrients in a local stream. So defining the system generally involves drawing the boundaries (literally or figuratively) around what you want to model, and then determining the key variables and the relationships between those variables.

Though this initial step may seem straightforward, it can be quite complicated. Inevitably, there are many more variables within a system than can be realistically included in a model, so scientists need to simplify. To do this, they make assumptions about which variables are most important. In building a physical model of a river delta, for example, the scientists made the assumption that biological processes like burrowing clams were not important to the large-scale structure of the delta, even though they are clearly a component of the real system.

Determining where simplification is appropriate takes a detailed understanding of the real system – and in fact, sometimes models are used to help determine exactly which aspects of the system can be simplified. For example, the scientists who built the model of the river delta did not incorporate burrowing clams into their model because they knew from experience that they would not affect the overall layering of sediments within the delta. On the other hand, they were aware that vegetation strongly affects the shape of the river channel (and thus the distribution of sediments), and therefore conducted an experiment to determine the nature of the relationship between vegetation density and river channel shape (Gran & Paola, 2001).

[Figure 3: A water molecule depicted as billiard-ball atoms with hooks]

Once a model is built (either in concept, physical space, or in a computer), it can be tested using a given set of conditions. The results of these tests can then be compared against reality in order to validate the model. In other words, how well does the model do at matching what we see in the real world? In the physical model of delta sediments, the scientists who built the model looked for features like the layering of sand that they have seen in the real world. If the model shows something really different than what the scientists expect, the relationships between variables may need to be redefined or the scientists may have oversimplified the system. Then the model is revised, improved, tested again, and compared to observations again in an ongoing, iterative process. For example, the conceptual “billiard ball” model of the atom used in the early 1800s worked for some aspects of the behavior of gases, but when that hypothesis was tested for chemical reactions, it didn’t do a good job of explaining how they occur – billiard balls do not normally interact with one another. John Dalton envisioned a revision of the model in which he added “hooks” to the billiard ball model to account for the fact that atoms could join together in reactions, as conceptualized in Figure 3.
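To make the test-compare-revise cycle concrete, here is a minimal sketch that iteratively adjusts one parameter of a deliberately trivial model until its output matches an observation. The model, the observed value, and the use of bisection are all assumptions for illustration, not anything from the original module.

```python
def model(growth_rate, years=10, start=100.0):
    """A deliberately trivial 'model': compound growth from a starting value."""
    value = start
    for _ in range(years):
        value *= 1 + growth_rate
    return value

observed = 180.0  # hypothetical real-world measurement the model should reproduce
low, high = 0.0, 0.2  # plausible bounds on the unknown growth rate

for _ in range(40):  # revise the parameter, re-run the model, compare again
    mid = (low + high) / 2
    if model(mid) < observed:
        low = mid
    else:
        high = mid

print(f"Calibrated growth rate: {mid:.4f}")  # about 0.0605
```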


While conceptual and physical models have long been a component of all scientific disciplines, computer-based modeling is a more recent development, and one that is frequently misunderstood. Computer models are based on exactly the same principles as conceptual and physical models, however, and they take advantage of relatively recent advances in computing power to mimic real systems.

The beginning of computer modeling: Numerical weather prediction

In the late 19th century, Vilhelm Bjerknes, a Norwegian mathematician and physicist, became interested in deriving equations that govern the large-scale motion of air in the atmosphere. Importantly, he recognized that circulation was the result not just of thermodynamic properties (like the tendency of hot air to rise), but of hydrodynamic properties as well, which describe the behavior of fluid flow. Through his work, he developed an equation that described the physical processes involved in atmospheric circulation, which he published in 1897. The complexity of the equation reflected the complexity of the atmosphere, and Bjerknes was able to use it to describe why weather fronts develop and move.

Using calculations predictively

Bjerknes had another vision for his mathematical work, however: He wanted to predict the weather. The goal of weather prediction, he realized, is not to know the paths of individual air molecules over time, but to provide the public with “average values over large areas and long periods of time.” Because his equation was based on physical principles, he saw that by entering the present values of atmospheric variables like air pressure and temperature, he could solve it to predict the air pressure and temperature at some time in the future. In 1904, Bjerknes published a short paper describing what he called “the principle of predictive meteorology” (Bjerknes, 1904) (see the Research links for the entire paper). In it, he says:

Based upon the observations that have been made, the initial state of the atmosphere is represented by a number of charts which give the distribution of seven variables from level to level in the atmosphere. With these charts as the starting point, new charts of a similar kind are to be drawn, which represent the new state from hour to hour.

In other words, Bjerknes envisioned drawing a series of weather charts for the future based on using known quantities and physical principles. He proposed that solving the complex equation could be made more manageable by breaking it down into a series of smaller, sequential calculations, where the results of one calculation are used as input for the next. As a simple example, imagine predicting traffic patterns in your neighborhood. You start by drawing a map of your neighborhood showing the location, speed, and direction of every car within a square mile. Using these parameters, you then calculate where all of those cars are one minute later. Then again after a second minute. Your calculations will likely look pretty good after the first minute. After the second, third, and fourth minutes, however, they begin to become less accurate. Other factors you had not included in your calculations begin to exert an influence, like where the person driving the car wants to go, the right- or left-hand turns that they make, delays at traffic lights and stop signs, and how many new drivers have entered the roads.

Trying to include all of this information simultaneously would be mathematically difficult, so, as proposed by Bjerknes, the problem can be solved with sequential calculations. To do this, you would take the first step as described above: Use location, speed, and direction to calculate where all the cars are after one minute. Next, you would use the information on right- and left-hand turn frequency to calculate changes in direction, and then you would use information on traffic light delays and new traffic to calculate changes in speed. After these three steps are done, you would solve your first equation again for the second minute time sequence, using location, speed, and direction to calculate where the cars are after the second minute. Though it would certainly be rather tiresome to do by hand, this series of sequential calculations would provide a manageable way to estimate traffic patterns over time.

Although this method made calculations tedious, Bjerknes imagined “no intractable mathematical difficulties” with predicting the weather. The method he proposed (but never used himself) became known as numerical weather prediction, and it represents one of the first approaches towards numerical modeling of a complex, dynamic system.
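As a hedged sketch of this sequential idea, the toy loop below advances the traffic example above one minute at a time, feeding each step's output into the next; all positions and speeds are invented for illustration.

```python
# Hypothetical cars: position along the road and (constant) speed.
cars = [
    {"position_m": 0.0, "speed_m_per_min": 500.0},
    {"position_m": 200.0, "speed_m_per_min": 350.0},
]

def step(state, minutes=1.0):
    """One time step: new position = old position + speed * elapsed time."""
    return [
        {"position_m": car["position_m"] + car["speed_m_per_min"] * minutes,
         "speed_m_per_min": car["speed_m_per_min"]}
        for car in state
    ]

state = cars
for minute in range(1, 4):
    state = step(state)  # the output of one calculation is the input to the next
    print(minute, [round(car["position_m"]) for car in state])
```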

Advancing weather calculations

Bjerknes’ challenge for numerical weather prediction was taken up in 1922 by the English scientist Lewis Fry Richardson. Richardson related seven differential equations that built on Bjerknes’ atmospheric circulation equation to include additional atmospheric processes. One of Richardson’s great contributions to mathematical modeling was to solve the equations for boxes within a grid; he divided the atmosphere over Germany into 25 squares that corresponded with available weather station data (see Figure 4) and then divided the atmosphere into five layers, creating a three-dimensional grid of 125 boxes. This was the first use of a technique that is now standard in many types of modeling. For each box, he calculated each of nine variables in seven equations for a single time step of three hours. This was not a simple sequential calculation, however, since the values in each box depended on the values in the adjacent boxes, in part because the air in each box does not simply stay there – it moves from box to box.

[Figure 4: Richardson’s grid of squares over Germany]
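A minimal sketch of the grid idea – far simpler than Richardson's seven equations – updates each box from its neighbours at every time step. The 1-D row of boxes, the "temperatures", and the mixing rate are all hypothetical.

```python
# Five boxes in a row; each interior box exchanges with its two neighbours,
# because what is in a box does not simply stay there.
temps = [10.0, 10.0, 30.0, 10.0, 10.0]  # initial value in each box
mixing = 0.25  # fraction exchanged per step (kept small for stability)

for _ in range(3):
    new = temps[:]
    for i in range(1, len(temps) - 1):
        new[i] = temps[i] + mixing * (temps[i - 1] - 2 * temps[i] + temps[i + 1])
    temps = new
    print([round(t, 1) for t in temps])
```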

Richardson’s attempt to make a six-hour forecast took him nearly six weeks of work with pencil and paper and was considered an utter failure, as it resulted in calculated barometric pressures that exceeded any historically measured value (Dalmedico, 2001). Probably influenced by Bjerknes, Richardson attributed the failure to inaccurate input data, whose errors were magnified through successive calculations (see more about error propagation in our Uncertainty, Error, and Confidence module).

[Figure 5: Norwegian stamp bearing Vilhelm Bjerknes’ portrait]

In addition to his concerns about inaccurate input parameters, Richardson realized that weather prediction was limited in large part by the speed at which individuals could calculate by hand. He thus envisioned a “forecast factory,” in which thousands of people would each complete one small part of the necessary calculations for rapid weather forecasting.

First computer for weather prediction

Richardson’s vision became reality in a sense with the birth of the computer, which was able to do calculations far faster and with fewer errors than humans. The computer used for the first one-day weather prediction in 1950, nicknamed ENIAC (Electronic Numerical Integrator and Computer), was 8 feet tall, 3 feet wide, and 100 feet long – a behemoth by modern standards, but it was so much faster than Richardson’s hand calculations that by 1955, meteorologists were using it to make forecasts twice a day (Weart, 2003). Over time, the accuracy of the forecasts increased as better data became available over the entire globe through radar technology and, eventually, satellites.

The process of numerical weather prediction developed by Bjerknes and Richardson laid the foundation not only for modern meteorology, but for computer-based mathematical modeling as we know it today. In fact, after Bjerknes died in 1951, the Norwegian government recognized the importance of his contributions to the science of meteorology by issuing a stamp bearing his portrait in 1962 (Figure 5).

Weather prediction is based on _____________ modeling.

  • mathematical

Modeling in practice: The development of global climate models

The desire to model Earth’s climate on a long-term, global scale grew naturally out of numerical weather prediction. The goal was to use equations to describe atmospheric circulation in order to understand not just tomorrow’s weather, but large-scale patterns in global climate, including dynamic features like the jet stream and major climatic shifts over time like ice ages. Initially, scientists were hindered in the development of valid models by three things: a lack of data from the more inaccessible components of the system like the upper atmosphere, the sheer complexity of a system that involved so many interacting components, and limited computing power. Unexpectedly, World War II helped solve one problem, as the newly developed technology of high-altitude aircraft offered a window into the upper atmosphere (see our Technology module for more information on the development of aircraft). The jet stream, now a familiar feature of the weather broadcast on the news, was in fact first documented by American bombers flying westward to Japan.

As a result, global atmospheric models began to feel more within reach. In the early 1950s, Norman Phillips, a meteorologist at Princeton University, built a mathematical model of the atmosphere based on fundamental thermodynamic equations (Phillips, 1956). He defined 26 variables related through 47 equations, which described things like evaporation from Earth’s surface, the rotation of the Earth, and the change in air pressure with temperature. In the model, each of the 26 variables was calculated in each square of a 16 x 17 grid that represented a piece of the northern hemisphere. The grid represented an extremely simple landscape – it had no continents or oceans, no mountain ranges or topography at all. This was not because Phillips thought it was an accurate representation of reality, but because it simplified the calculations. He started his model with the atmosphere “at rest,” with no predetermined air movement, and with yearly averages of input parameters like air temperature.

Phillips ran the model through 26 simulated day-night cycles by using the same kind of sequential calculations Bjerknes proposed. Within only one “day,” a pattern in atmospheric pressure developed that strongly resembled the typical weather systems of the portion of the northern hemisphere he was modeling (see Figure 6). In other words, despite the simplicity of the model, Phillips was able to reproduce key features of atmospheric circulation, showing that the topography of the Earth was not of primary importance in atmospheric circulation. His work laid the foundation for an entire subdiscipline within climate science: development and refinement of General Circulation Models (GCMs).

[Figure 6: Graph from Phillips’ 1956 paper]

By the 1980s, computing power had increased to the point where modelers could incorporate the distribution of oceans and continents into their models. In 1991, the eruption of Mt. Pinatubo in the Philippines provided a natural experiment: How would the addition of a significant volume of sulfuric acid, carbon dioxide, and volcanic ash affect global climate? In the aftermath of the eruption, descriptive methods (see our Description in Scientific Research module) were used to document its effect on global climate: Worldwide measurements of sulfuric acid and other components were taken, along with the usual air temperature measurements. Scientists could see that the large eruption had affected climate, and they quantified the extent to which it had done so. This provided a perfect test for the GCMs. Given the inputs from the eruption, could they accurately reproduce the effects that descriptive research had shown? Within a few years, scientists had demonstrated that GCMs could indeed reproduce the climatic effects induced by the eruption, and confidence in the abilities of GCMs to provide reasonable scenarios for future climate change grew. The validity of these models has been further substantiated by their ability to simulate past events, like ice ages, and the agreement of many different models on the range of possibilities for warming in the future, one of which is shown in Figure 7.

[Figure 7: Climate model by NOAA]

Limitations and misconceptions of models

The widespread use of modeling has also led to widespread misconceptions about models, particularly with respect to their ability to predict. Some models are widely used for prediction, such as weather and streamflow forecasts, yet we know that weather forecasts are often wrong. Modeling still cannot predict exactly what will happen to the Earth’s climate, but it can help us see the range of possibilities with a given set of changes. For example, many scientists have modeled what might happen to average global temperatures if the concentration of carbon dioxide (CO2) in the atmosphere is doubled from pre-industrial levels (pre-1950); though individual models differ in exact output, they all fall in the range of an increase of 2-6 °C (IPCC, 2007).
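
To see why different models agree on a range rather than a single number, consider a back-of-the-envelope version of the calculation. A widely used simplified expression for CO2 radiative forcing, ΔF = 5.35 ln(C/C0) W/m² (Myhre et al., 1998), gives roughly 3.7 W/m² for a doubling; multiplying by a climate sensitivity parameter, which differs from model to model, spreads the projected warming across a range. The sensitivity values in the sketch below are illustrative choices, not outputs of any particular model.

```python
import math

# Simplified CO2 radiative forcing (Myhre et al., 1998): dF = 5.35 * ln(C/C0) W/m^2
forcing = 5.35 * math.log(2.0)        # doubling CO2 -> about 3.7 W/m^2

# Illustrative climate sensitivity parameters in K per (W/m^2), chosen here
# only to show how differing sensitivities spread the projected warming.
for sensitivity in (0.54, 0.81, 1.62):
    warming = sensitivity * forcing
    print(f"sensitivity {sensitivity:.2f}: warming = {warming:.1f} C")
# The outputs span roughly 2-6 C, mirroring the range reported across models.
```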

All models are also limited by the availability of data from the real system. As the amount of data from a system increases, so does the accuracy of the model. For climate modeling, that is why scientists continue to gather data about climate in the geologic past and monitor things like ocean temperatures with satellites – all those data help define parameters within the model. The same is true of physical and conceptual models, as well illustrated by the evolution of our model of the atom as our knowledge about subatomic particles increased.

__________ can result in a flawed model.

  • Lack of data about a system
  • Too much data about a system

Modeling in modern practice

The various types of modeling play important roles in virtually every scientific discipline, from ecology to analytical chemistry and from population dynamics to geology. Physical models such as the river delta take advantage of cutting-edge technology to integrate multiple large-scale processes. As computer processing speed and power have increased, so has the ability to run models on them. From the room-sized ENIAC in the 1950s to the closet-sized Cray supercomputer in the 1980s to today’s laptop, processing speed has increased over a million-fold, allowing scientists to run models on their own computers rather than booking time on one of only a few supercomputers in the world. Our conceptual models continue to evolve, and one of the more recent theories in theoretical physics digs even deeper into the structure of the atom to propose that what we once thought were the most fundamental particles – quarks – are in fact composed of vibrating filaments, or strings. String theory is a complex conceptual model that may help explain gravitational force in a way that has not been done before. Modeling has also moved out of the realm of science into recreation, and many computer games like SimCity® involve both conceptual modeling (answering the question, “What would it be like to run a city?”) and computer modeling, using the same kinds of equations that are used to model traffic flow patterns in real cities. The accessibility of modeling as a research method allows it to be easily combined with other scientific research methods, and scientists often incorporate modeling into experimental, descriptive, and comparative studies.

Scientific modeling is a research method scientists use to replicate real-world systems – whether it’s a conceptual model of an atom, a physical model of a river delta, or a computer model of global climate. This module describes the principles that scientists use when building models and shows how modeling contributes to the process of science.

Key Concepts

  • Modeling involves developing physical, conceptual, or computer-based representations of systems.
  • Scientists build models to replicate systems in the real world through simplification, to perform an experiment that cannot be done in the real world, or to assemble several known ideas into a coherent whole to build and test hypotheses.
  • Computer modeling is a relatively new scientific research method, but it is based on the same principles as physical and conceptual modeling.

Grad Coach

Saunders’ Research Onion: Explained Simply

Peeling the onion, layer by layer (with examples).

By: David Phair (PhD) and Kerryn Warren (PhD) | January 2021

If you’re learning about research skills and methodologies, you may have heard the term “research onion” – specifically, the research onion developed by Saunders et al. in 2007. But what exactly is this elusive onion? In this post, we’ll break Saunders’ research onion down into bite-sized chunks to make it a little more digestible.

The Research Onion (Saunders, 2007)

Saunders’ (2007) Research Onion – What is it?

At the simplest level, Saunders’ research onion describes the different decisions you’ll need to make when developing a research methodology – whether that’s for your dissertation, thesis or any other formal research project. As you work from the outside of the onion inwards, you’ll face a range of choices that progress from high-level and philosophical to tactical and practical in nature. This also mimics the general structure of the methodology chapter.

While Saunders’ research onion is certainly not perfect, it’s a useful tool for thinking holistically about methodology. At a minimum, it helps you understand what decisions you need to make in terms of your research design and methodology.

The layers of Saunders’ research onion

The onion is made up of 6 layers, which you’ll need to peel back one at a time as you develop your research methodology:

  • Research philosophy
  • Research approach
  • Research strategy
  • Choices
  • Time horizon
  • Techniques & procedures

Onion Layer 1: Research Philosophy

The very first layer of the onion is the research philosophy. But what does that mean? Well, the research philosophy is the foundation of any study as it describes the set of beliefs the research is built upon. Research philosophy can be described from either an ontological or epistemological point of view. “A what?!”, you ask?

In simple terms, ontology is the “what” of knowledge – in other words, what is the nature of reality, and what are we really able to know and understand? For example, does reality exist as a single objective thing, or is it different for each person? Think about the simulated reality in the film The Matrix.

Epistemology, on the other hand, is about “how” we can obtain knowledge and come to understand things – in other words, how can we figure out what reality is, and what are the limits of this knowledge? This is a gross oversimplification, but it’s a useful starting point (we’ll cover ontology and epistemology in another post).

With that fluffy stuff out the way, let’s look at three of the main research philosophies that operate on different ontological and epistemological assumptions:

  • Positivism
  • Interpretivism
  • Pragmatism

These certainly aren’t the only research philosophies, but they are very common and provide a good starting point for understanding the spectrum of philosophies.

The research philosophy is the foundation of any study as it describes the set of beliefs upon which the research is built.

Research Philosophy 1:  Positivism

Positivist research takes the view that knowledge exists outside of what’s being studied. In other words, the object of study can only be examined objectively – it cannot include opinions or personal viewpoints, and the researcher doesn’t interpret, they only observe. Positivism states that there is only one reality and that all meaning is consistent between subjects.

In the positivist’s view, knowledge can only be acquired through empirical research, which is based on measurement and observation. In other words, all knowledge is viewed as a posteriori knowledge – knowledge that is not reliant on human reasoning but instead is gained from research.

For the positivist, knowledge can only be true, false, or meaningless. Basically, if something is not found to be true or false, it no longer holds any ground and is thus dismissed.

Let’s look at an example, based on the question of whether God exists or not. Since positivism takes the stance that knowledge has to be empirically rigorous, the knowledge of whether God exists or not is irrelevant. This topic cannot be proven to be true or false, and thus this knowledge is seen as meaningless.

Kinda harsh, right? Well, that’s the one end of the spectrum – let’s look at the other end.

For the positivist, knowledge can only be true, false, or meaningless.

Research Philosophy 2: Interpretivism

On the other side of the spectrum, interpretivism emphasises the influence that social and cultural factors can have on an individual. This view focuses on people’s thoughts and ideas, in light of the socio-cultural backdrop. With the interpretivist philosophy, the researcher plays an active role in the study, as it’s necessary to draw a holistic view of the participant and their actions, thoughts and meanings.

Let’s look at an example. If you were studying psychology, you may make use of a case study in your research which investigates an individual with a proposed diagnosis of schizophrenia. The interpretivist view would come into play here as social and cultural factors may influence the outcome of this diagnosis.

Through your research, you may find that the individual originates from India, where schizophrenic symptoms like hallucinations are viewed positively, as they are thought to indicate that the person is a spirit medium. This example illustrates an interpretivist approach since you, as a researcher, would make use of the patient’s point of view, as well as your own interpretation when assessing the case study.

The interpretivist view focuses on people’s thoughts and ideas, in light of the  socio-cultural backdrop.

Research Philosophy 3: Pragmatism

Pragmatism highlights the importance of using the best tools possible to investigate phenomena. The main aim of pragmatism is to approach research from a practical point of view, where knowledge is not fixed, but instead is constantly questioned and interpreted. For this reason, pragmatism involves an element of researcher involvement and subjectivity, specifically when drawing conclusions based on participants’ responses and decisions. In other words, pragmatism is not committed to (or limited by) one specific philosophy.

Let’s look at an example in the form of the trolley problem, which is a set of ethical and psychological thought experiments. In these, participants have to decide on either killing one person to save multiple people or allowing multiple people to die to avoid killing one person. 

This experiment can be altered, including details such as the one person or the group of people being family members or loved ones. The fact that the experiment can be altered to suit the researcher’s needs is an example of pragmatism – in other words, the outcome of the person doing the thought experiment is more important than the philosophical ideas behind the experiment.

Pragmatism is about using the best tools possible to investigate phenomena.   It approaches research from a practical point of view, where knowledge is constantly questioned and interpreted.

To recap, research philosophy is the foundation of any research project and reflects the ontological and epistemological assumptions of the researcher. So, when you’re designing your research methodology, the first thing you need to think about is which philosophy you’ll adopt, given the nature of your research.

Onion Layer 2: Research Approach

Let’s peel off another layer and take a look at the research approach. Your research approach is the broader method you’ll use for your research – inductive or deductive. It’s important to clearly identify your research approach as it will inform the decisions you take in terms of data collection and analysis in your study (we’ll get to that layer soon).

Inductive approaches entail generating theories from research, rather than starting a project with a theory as a foundation. Deductive approaches, on the other hand, begin with a theory and aim to build on it (or test it) through research.

Sounds a bit fluffy? Let’s look at two examples:

An  inductive approach  could be used in the study of an otherwise unknown isolated community. There is very little knowledge about this community, and therefore, research would have to be conducted to gain information on the community, thus leading to the formation of theories.

On the other hand, a  deductive approach  would be taken when investigating changes in the physical properties of animals over time, as this would likely be rooted in the theory of evolution. In other words, the starting point is a well-established pre-existing body of research.

Inductive approaches entail generating theories from the research data. Deductive approaches, on the other hand, begin with a theory and aim to build on it (or test it) using research data.

Closely linked to research approaches are qualitative and quantitative research. Simply put, qualitative research focuses on textual, visual or audio-based data, while quantitative research focuses on numerical data. To learn more about qualitative and quantitative research, check out our dedicated post here.

What’s the relevance of qualitative and quantitative data to research approaches? Well, inductive approaches are usually used within qualitative research, while quantitative research tends to reflect a deductive approach, usually informed by positivist philosophy. The reason for using a deductive approach here is that quantitative research typically begins with theory as a foundation, where progress is made through hypothesis testing. In other words, a wider theory is applied to a particular context, event, or observation to see whether these fit in with the theory, as with our example of evolution above.

So, to recap, the two research approaches are  inductive  and  deductive . To decide on the right approach for your study, you need to assess the type of research you aim to conduct. Ask yourself whether your research will build on something that exists, or whether you’ll be investigating something that cannot necessarily be rooted in previous research. The former suggests a deductive approach while the latter suggests an inductive approach.


Onion Layer 3: Research Strategy

So far, we’ve looked at pretty conceptual and intangible aspects of the onion. Now, it’s time to peel another layer off that onion and get a little more practical – introducing research strategy. This layer of the research onion details how, based on the aims of the study, research can be conducted. Note that outside of the onion, these strategies are referred to as research designs.

There are several strategies  you can take, so let’s have a look at some of them.

  • Experimental research
  • Action research
  • Case study research
  • Grounded theory
  • Ethnography
  • Archival research

Strategy 1: Experimental research

Experimental research involves manipulating one variable (the independent variable) to observe a change in another variable (the dependent variable) – in other words, to assess the relationship between variables. The purpose of experimental research is to support, refute or validate a research hypothesis. This research strategy follows the principles of the scientific method and is conducted within a controlled environment or setting (for example, a laboratory).

Experimental research aims to test existing theories rather than create new ones, and as such, is deductive in nature. Experimental research aligns with the positivist research philosophy, as it assumes that knowledge can only be studied objectively and in isolation from external factors such as context or culture.

Let’s look at an example of experimental research. If you had a hypothesis that a certain brand of dog food can raise a dog’s protein levels, you could make use of experimental research to compare the effects of the specific brand to a “regular” diet. In other words, you could test your hypothesis.

In this example, you would have two groups, where one group consists of dogs with no changes to their diet (this is called the control group) and the other group consists of dogs being fed the specific brand that you aim to investigate (this is called the experimental/treatment group). You would then test your hypothesis by comparing the protein levels in both groups.
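
A quick sketch of how that comparison might be analysed in practice is shown below. The protein measurements are invented for illustration, and the independent two-sample t-test is just one common option for comparing two group means (it assumes roughly normal data; scipy is required).

```python
from scipy import stats

# Hypothetical serum protein levels (g/dL) for each group of dogs
control = [5.8, 6.1, 5.9, 6.0, 6.2, 5.7]      # regular diet
treatment = [6.4, 6.6, 6.3, 6.7, 6.5, 6.8]    # test brand

# An independent two-sample t-test compares the mean protein levels.
t_stat, p_value = stats.ttest_ind(treatment, control)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# A small p-value (conventionally < 0.05) would support the hypothesis that
# the brand raises protein levels; a large p-value would fail to support it.
```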

Experimental research involves manipulating the independent variable to observe a change in the dependent variable.

Strategy 2: Action research

Next, we have action research. The simplest way of describing action research is by saying that it involves learning through… wait for it… action. Action research is conducted in practical settings such as a classroom, a hospital, a workspace, etc. – as opposed to controlled environments like a lab. Action research helps to inform researchers of problems or weaknesses related to interactions within the real world. With action research, there’s a strong focus on the participants (the people involved in the issue being studied), which is why it’s sometimes referred to as “participant action research” or PAR.

An example of PAR is a community intervention (for therapy, farming, education, whatever). The researcher comes with an idea and it is implemented with the help of the community (i.e. the participants). The findings are then discussed with the community to see how to better the intervention. The process is repeated until the intervention works just right for the community. In this way, a practical solution is given to a problem and it is generated by the combination of researcher and community (participant) feedback.

This kind of research is generally applied in the social sciences, specifically in professions where individuals aim to improve on themselves and the work that they are doing. Action research is most commonly adopted in qualitative studies and is rarely seen in quantitative studies. This is because, as you can see in the above examples, action research makes use of language and interactions rather than statistics and numbers.

Action research is conducted in practical settings such as a classroom, a hospital, a workspace, etc.   This helps researchers understand problems related to interactions within the real-world.

Strategy 3: Case study research

A case study is a detailed, in-depth study of a single subject – for example, a person, a group or an institution, or an event, phenomenon or issue. In this type of research, the subject is analysed to gain an in-depth understanding of issues in a real-life setting. The objective here is to gain an in-depth understanding within the context of the study – not (necessarily) to generalise the findings.

It is vital that, when conducting case study research, you take the social context and culture into account, which means that this type of research is (more often than not) qualitative in nature and tends to be inductive. Also, since the researcher’s assumptions and understanding play a role in case study research, it is typically informed by an interpretivist philosophy.

For example, a study on political views of a specific group of people needs to take into account the current political situation within a country and factors that could contribute towards participants taking a certain view.

A case study is a detailed study of a single subject to gain an in-depth understanding within the context of the study.

Strategy 4: Grounded theory

Next up, grounded theory. Grounded theory is all about “letting the data speak for itself”. In other words, in grounded theory, you let the data inform the development of a new theory, model or framework. True to the name, the theory you develop is “grounded” in the data. Grounded theory is therefore very useful for research into issues that are completely new or under-researched.

Grounded theory research is typically qualitative (although it can also use quantitative data) and takes an inductive approach. Typically, this form of research involves identifying commonalities between sets of data, and results are then drawn from completed research without the aim of fitting the findings in with a pre-existing theory or framework.

For example, if you were to study the mythology of an unknown culture through artefacts, you’d enter your research without any hypotheses or theories, and rather work from the knowledge you gain from your study to develop these.

Grounded theory is all about "letting the data speak for itself" - i.e. you let the data inform the development of a new theory or model.

Strategy 5: Ethnography

Ethnography involves observing people in their natural environments and drawing meaning from their cultural interactions. The objective with ethnography is to capture the subjective experiences of participants, to see the world through their eyes. Creswell (2013) says it best: “Ethnographers study the meaning of the behaviour, the language, and the interaction among members of the culture-sharing group.”

For example, if you were interested in studying interactions on a mental health discussion board, you could use ethnography to analyse interactions and draw an understanding of the participants’ subjective experiences.

Similarly, if you wanted to explore the behaviour, language, and beliefs of an isolated Amazonian tribe, ethnography could allow you to develop a complex, complete description of the social behaviours of the group by immersing yourself in the community, rather than just observing from the outside.

Given the nature of ethnography, it generally reflects an interpretivist research philosophy and involves an inductive, qualitative research approach. However, there are exceptions to this – for example, quantitative ethnography as proposed by David Williamson Shaffer.

Ethnography involves observing people in their natural environments and drawing meaning from their cultural interactions.

Strategy 6: Archival research

Last but not least is archival research. An archival research strategy draws from materials that already exist, and meaning is then established through a review of this existing data. This method is particularly well-suited to historical research and can make use of materials such as manuscripts and records.

For example, if you were interested in people’s beliefs about so-called supernatural phenomena in the medieval period, you could consult manuscripts and records from the time, and use those as your core data set.

Onion Layer 4: Choices

The next layer of the research onion is simply called “choices” – they could have been a little more specific, right? In any case, this layer is simply about deciding how many data types (qualitative or quantitative) you’ll use in your research. There are three options – mono, mixed, and multi-method.

Let’s take a look at them.

Choosing to use a  mono method  means that you’ll only make use of one data type – either qualitative or quantitative. For example, if you were to conduct a study investigating a community’s opinions on a specific pizza restaurant, you could make use of a qualitative approach only, so that you can analyse participants’ views and opinions of the restaurant.

If you were to make use of both quantitative and qualitative data, you’d be taking a  mixed-methods approach. Keeping with the previous example, you may also want to assess how many people in a community eat specific types of pizza. For this, you could make use of a survey to collect quantitative data and then analyse the results statistically, producing quantitative results in addition to your qualitative ones.

Lastly, there’s multi-method. With a multi-method approach, you’d make use of a wider range of approaches, with more than just one quantitative and one qualitative approach. For example, if you conduct a study looking at archives from a specific culture, you could make use of two qualitative methods (such as thematic analysis and content analysis), and then additionally make use of quantitative methods to analyse numerical data.

There are three options in terms of your method choice - mono-method,  mixed-method, and multi-method.

As with all the layers of the research onion, the right choice here depends on the nature of your research, as well as your research aims and objectives. There’s also the practical consideration of viability – in other words, what kind of data will you be able to access, given your constraints.

Onion Layer 5: Time horizon

What’s that far in the distance? It’s the time horizon. But what exactly is it? Thankfully, this one’s pretty straightforward. The time horizon simply describes how many points in time you plan to collect your data at. Two options exist – the cross-sectional and longitudinal time horizon.

Imagine that you’re wasting time on social media and think, “Ooh! I want to study the language of memes and how this language evolves over time”. For this study, you’d need to collect data over multiple points in time – perhaps over a few weeks, months, or even years. Therefore, you’d make use of a  longitudinal time horizon. This option is highly beneficial when studying changes and progressions over time.

If instead, you wanted to study the language used in memes at a certain point in time (for example, in 2020), you’d make use of a  cross-sectional  time horizon. This is where data is collected at one point in time, so you wouldn’t be gathering data to see how language changes, but rather what language exists at a snapshot point in time. The type of data collected could be qualitative, quantitative or a mix of both, as the focus is on the time of collection, not the data type.


As with all the other choices, the nature of your research and your research aims and objectives are the key determining factors when deciding on the time horizon. You’ll also need to consider practical constraints, such as the amount of time you have available to complete your research (especially in the case of a dissertation or thesis).

Onion Layer 6: Techniques and Procedures

Finally, we reach the centre of the onion – this is where you get down to the real practicalities of your research to make choices regarding specific techniques and procedures.

Specifically, this is where you’ll:

  • Decide on what data you’ll collect and what data collection methods you’ll use (for example, will you use a survey? Or perhaps one-on-one interviews?)
  • Decide how you’ll go about sampling the population (for example, snowball sampling, random sampling, convenience sampling, etc).
  • Determine the type of data analysis you’ll use to answer your research questions (such as content analysis or a statistical analysis like correlation – see the short sketch after this list).
  • Set up the materials you’ll be using for your study (such as writing up questions for a survey or interview)
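
To make the analysis decision less abstract, here is a minimal sketch of one such technique – computing a correlation between two survey variables. The variables and numbers are invented purely for illustration.

```python
import numpy as np

# Hypothetical survey responses from eight participants
hours_studied = np.array([2, 5, 1, 8, 4, 7, 3, 6])
exam_score = np.array([55, 70, 50, 88, 66, 80, 60, 75])

# Pearson's r measures the strength of the linear relationship (-1 to +1).
r = np.corrcoef(hours_studied, exam_score)[0, 1]
print(f"Pearson correlation: r = {r:.2f}")
```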

What’s important to note here is that these techniques and procedures need to align with all the other layers of the research onion – i.e., research philosophy, research approaches, research strategy, choices, and time horizon.

For example, if you’re adopting a deductive, quantitative research approach, it’s unlikely that you’ll use interviews to collect your data, as you’ll want high-volume, numerical data (which surveys are far better suited to). So, you need to ensure that the decisions at each layer of your onion align with the rest, and most importantly, that they align with your research aims and objectives.

In practical terms, you'll need to decide what data to collect, how you'll sample it, how you'll collect it and how you'll analyse it.

Let’s Recap: Research Onion 101

The research onion details the many interrelated choices you’ll need to make when you’re crafting your research methodology. These include:

  • Research philosophy – the set of beliefs your research is based on (positivism, interpretivism, pragmatism)
  • Research approaches – the broader method you’ll use (inductive, deductive, qualitative and quantitative)
  • Research strategies – how you’ll conduct the research (e.g., experimental, action, case study, etc.)
  • Choices – how many methods you’ll use (mono method, mixed-method or multi-method)
  • Time horizons – the number of points in time at which you’ll collect your data (cross-sectional or longitudinal)
  • Techniques and procedures (data collection methods, data analysis techniques, sampling strategies, etc.)



Organizing Your Social Sciences Research Paper

The C.A.R.S. Model

Introduction

The Creating a Research Space [C.A.R.S.] Model was developed by John Swales based upon his analysis of journal articles representing a variety of discipline-based writing practices. His model attempts to explain and describe the organizational pattern of writing the introduction to scholarly research studies. Following the C.A.R.S. Model can be a useful approach because it can help you to: 1) begin the writing process [getting started is often the most difficult task]; 2) understand the way in which an introduction sets the stage for the rest of your paper; and, 3) assess how the introduction fits within the larger scope of your study. The model assumes that writers follow a general organizational pattern in response to two types of challenges [“competitions”] relating to establishing a presence within a particular domain of research: 1) the competition to create a rhetorical space and, 2) the competition to attract readers into that space. The model proposes three actions [Swales calls them “moves”], accompanied by specific steps, that reflect the development of an effective introduction for a research paper. These “moves” and steps can be used as a template for writing the introduction to your own social sciences research papers.

"Introductions." The Writing Lab and The OWL. Purdue University; Coffin, Caroline and Rupert Wegerif. “How to Write a Standard Research Article.” Inspiring Academic Practice at the University of Exeter; Kayfetz, Janet. "Academic Writing Workshop." University of California, Santa Barbara, Fall 2009; Pennington, Ken. "The Introduction Section: Creating a Research Space CARS Model." Language Centre, Helsinki University of Technology, 2005; Swales, John and Christine B. Feak. Academic Writing for Graduate Students: Essential Skills and Tasks. 2nd edition. Ann Arbor, MI: University of Michigan Press, 2004.

Creating a Research Space

Move 1: Establishing a Territory [the situation]

This is generally accomplished in two ways: by demonstrating that a general area of research is important, critical, interesting, problematic, relevant, or otherwise worthy of investigation and by introducing and reviewing key sources of prior research in that area to show where gaps exist or where prior research has been inadequate in addressing the research problem. The steps taken to achieve this would be:

  • Step 1 -- Claiming importance of, and/or  [writing action = describing the research problem and providing evidence to support why the topic is important to study]
  • Step 2 -- Making topic generalizations, and/or  [writing action = providing statements about the current state of knowledge, consensus, practice or description of phenomena]
  • Step 3 -- Reviewing items of previous research  [writing action = synthesize prior research that further supports the need to study the research problem; this is not a literature review but more a reflection of key studies that have touched upon but perhaps not fully addressed the topic]

Move 2: Establishing a Niche [the problem]

This action refers to making a clear and cogent argument that your particular piece of research is important and possesses value. This can be done by indicating a specific gap in previous research, by challenging a broadly accepted assumption, by raising a question, a hypothesis, or need, or by extending previous knowledge in some way. The steps taken to achieve this would be:

  • Step 1a -- Counter-claiming, or  [writing action = introduce an opposing viewpoint or perspective or identify a gap in prior research that you believe has weakened or undermined the prevailing argument]
  • Step 1b -- Indicating a gap, or  [writing action = develop the research problem around a gap or understudied area of the literature]
  • Step 1c -- Question-raising, or  [writing action = similar to gap identification, this involves presenting key questions about the consequences of gaps in prior research that will be addressed by your study. For example, one could state, “Despite prior observations of voter behavior in local elections in urban Detroit, it remains unclear why some single mothers choose to avoid....”]
  • Step 1d -- Continuing a tradition  [writing action = extend prior research to expand upon or clarify a research problem. This is often signaled with logical connecting terminology, such as, “hence,” “therefore,” “consequently,” “thus” or language that indicates a need. For example, one could state, “Consequently, these factors need to be examined in more detail....” or “Evidence suggests an interesting correlation, therefore, it is desirable to survey different respondents....”]

Move 3: Occupying the Niche [the solution]

The final "move" is to announce the means by which your study will contribute new knowledge or new understanding in contrast to prior research on the topic. This is also where you describe the remaining organizational structure of the paper. The steps taken to achieve this would be:

  • Step 1a -- Outlining purposes, or  [writing action = answering the “So What?” question. Explain in clear language the objectives of your study]
  • Step 1b -- Announcing present research [writing action = describe the purpose of your study in terms of what the research is going to do or accomplish. In the social sciences, the “So What?” question still needs to be addressed]
  • Step 2 -- Announcing principal findings  [writing action = present a brief, general summary of key findings, such as, “The findings indicate a need for...,” or “The research suggests four approaches to....”]
  • Step 3 -- Indicating article structure  [writing action = state how the remainder of your paper is organized]

"Introductions." The Writing Lab and The OWL. Purdue University; Atai, Mahmood Reza. “Exploring Subdisciplinary Variations and Generic Structure of Applied Linguistics Research Article Introductions Using CARS Model.” The Journal of Applied Linguistics 2 (Fall 2009): 26-51; Chanel, Dana. "Research Article Introductions in Cultural Studies: A Genre Analysis Explorationn of Rhetorical Structure." The Journal of Teaching English for Specific and Academic Purposes 2 (2014): 1-20; Coffin, Caroline and Rupert Wegerif. “How to Write a Standard Research Article.” Inspiring Academic Practice at the University of Exeter; Kayfetz, Janet. "Academic Writing Workshop." University of California, Santa Barbara, Fall 2009; Pennington, Ken. "The Introduction Section: Creating a Research Space CARS Model." Language Centre, Helsinki University of Technology, 2005; Swales, John and Christine B. Feak. Academic Writing for Graduate Students: Essential Skills and Tasks . 2nd edition. Ann Arbor, MI: University of Michigan Press, 2004; Swales, John M. Genre Analysis: English in Academic and Research Settings . New York: Cambridge University Press, 1990; Chapter 5: Beginning Work. In Writing for Peer Reviewed Journals: Strategies for Getting Published . Pat Thomson and Barbara Kamler. (New York: Routledge, 2013), pp. 93-96.

Writing Tip

Swales showed that establishing a research niche [move 2] is often signaled by specific terminology that expresses a contrasting viewpoint, a critical evaluation of gaps in the literature, or a perceived weakness in prior research. The purpose of using these words is to draw a clear distinction between perceived deficiencies in previous studies and the research you are presenting that is intended to help resolve these deficiencies. Below is a table of common words used by authors.

NOTE: You may prefer not to adopt a negative stance in your writing when placing it within the context of prior research. In such cases, an alternative approach is to utilize a neutral, contrastive statement that expresses a new perspective without giving the appearance of trying to diminish the validity of other people's research. A more neutral contrasting stance can be achieved in the following ways, with A representing the findings of prior research, B representing your research problem, and X representing one or more variables that have been investigated.

  • Prior research has focused primarily on A , rather than on B ...
  • Prior research into A can be beneficial but to rectify X , it is important to examine B ...
  • These studies have placed an emphasis in the areas of A as opposed to describing B ...
  • While prior studies have examined A , it may be preferable to contemplate the impact of B ...
  • After consideration of A , it is important to also distinguish B ...
  • The study of A has been thorough, but changing circumstances related to X support a need for examining [or revisiting] B ...
  • Although research has been devoted to A , less attention has been paid to B ...
  • Earlier research offers insights into the need for A , though consideration of B would be particularly helpful to...

In each of these example statements, what follows the ellipsis is the justification for designing a study that approaches the problem in the way that contrasts with prior research but which does not devalue its ongoing contributions to current knowledge and understanding.

Dretske, Fred I. “Contrastive Statements.” The Philosophical Review 81 (October 1972): 411-437; Kayfetz, Janet. "Academic Writing Workshop." University of California, Santa Barbara, Fall 2009; Pennington, Ken. "The Introduction Section: Creating a Research Space CARS Model." Language Centre, Helsinki University of Technology, 2005; Swales, John M. Genre Analysis: English in Academic and Research Settings. New York: Cambridge University Press, 1990.


Stormboard

Blueprints for Academic Research Projects

Today's post is written by Dr. Ben Ellway, the founder of www.academic-toolkit.com. Ben completed his Ph.D. at The University of Cambridge and created the Research Design Canvas, a multipurpose tool for learning about academic research and designing a research project.

Based on requests from students for examples of completed research canvases, Ben created the Research Model Builder Canvas.

This canvas modifies the original questions in the nine building blocks to enable students to search for key information in a journal article and then reassemble it on the canvas to form a research model — a single-page visual summary of the journal article which captures how the research was designed and conducted. 

Ben’s second book, Building Research Models, explains how to use the Research Model Builder Canvas to become a more confident and competent reader of academic journal articles, while simultaneously building research models to use as blueprints to guide the design of your own project.

Ben has created a template for Stormboard based on this tool and this is his brief guide on how to begin using it.

Starting with a blank page can be daunting

The Research Design Canvas brings together the key building blocks of academic research on a single page and provides targeted questions to help you design your own project. However, starting with a blank page can be a daunting prospect! 

Academic research is complex as it involves multiple components, so designing and conducting your own project can be overwhelming, especially if you lack confidence in making decisions or are confused about how the components of a project fit together. It is much easier to start a complex task and long process such as designing a research project when you have an existing research model or ‘blueprint’ to work from. 

Starting with a ‘blueprint’ — tailored to your topic area — is much easier

Using the Research Model Builder Canvas, you can transform a journal article in your topic into a research model or blueprint — a single-page visualization of how a project was designed and conducted. 

The research model — and equally importantly the process of building it — will improve your understanding of academic research, and will also provide you with a personalized learning resource for your Thesis. You can use the research model as a blueprint to refer to specific decisions and their justification, and how components of research fit together, to help you begin to build your own project. 

Obviously, each project is unique so you’ll be using the blueprint as a guide rather than as a ‘cookie cutter’ solution. Seeing the components of a completed research project together on a single page (which  you  produced from a ten or twenty-page journal article) — is a very powerful learning resource to have on your academic research journey.

Build research models on Stormboard 

If you prefer to work digitally rather than with paper and pen, you can use the Research Model Builder Canvas Template in Stormboard. 

By using the Stormboard template, you’ll be able to identify key content and points from the journal article and then quickly summarize these on digital sticky notes. You can easily edit the sticky notes to rearrange, delete, or expand upon the ideas and points. You can then refer back to the permanent visual research model you created, share it with fellow students, or discuss it with your supervisors.

What are the building blocks of the research model?

The template has nine building blocks. 

The original questions in the building blocks of the research design canvas are modified in the research model builder canvas. They are designed to help you locate the most important points, decisions, and details in a journal article.  


A brief introduction to the purpose of each building block is provided below to help you familiarize yourself with the research model you will build.

Phenomenon / Problem

What does the research focus on? What were the main ‘things’ investigated and discussed in the journal article? Did the research involve a real-world problem?

Literature

What area (or areas) of past literature are identified and introduced? Which sources are especially important?

Observations & Arguments 

What are the most crucial points made by the authors in their analysis of past research? What evidence, issues, and themes are the focus of the literature review? Is a gap in past research identified? 

Research Questions / Hypotheses 

What are the research questions and/or hypotheses? How are they justified? If none are stated, what line of investigation is pursued?  

Theory & Concepts 

Does the research involve a theoretical or conceptual component? If so, what are the key concepts / theory? What role do they play in the research?  

Methodology / Design / Methods  

What methods and data were used? How are the decisions justified? 

Sample / Context 

What sampling method is used? Is the research context important?

Contributions

What contribution(s) do the authors claim that their research makes? Is the value-add more academically or practically-oriented? Are real-world stakeholders and the implications for them mentioned? 

Philosophical Assumptions / Research Paradigm 

These are not usually mentioned or discussed explicitly in journal articles, and this building block can be challenging if you are not familiar with research philosophy or find its seemingly abstract focus confusing. If you do understand these ideas, can you identify any implicit assumptions or a research paradigm in the article?

Compare two research models to appreciate the diversity of research

The easiest way to increase your appreciation of the different types and ways of conducting academic research is to build multiple research models.

Start by building two models. Compare and contrast them. Which decisions and aspects are similar and which are different? What can you learn from each research model and how can this help you when designing your own research and Thesis? 

Building research models will help you to appreciate the diversity in the different types of research conducted in your topic area.

Transforming a ten or twenty-page journal article into a single-page visual summary is a powerful way to learn about how academic research is designed and conducted — and also what a completed research project looks like. 

The Stormboard template makes the process of building research models easy, and the ability to save, edit, and share them ensures that you’ll be able to refer back to these blueprints at various stages throughout your research journey and Thesis writing process. 

When you get confused, become stuck, or feel overwhelmed by the complexity of academic research, you can fall back on the research models you created to guide you and get you back on track. Good luck!

Are you interested in trying the Research Model Builder Canvas? Sign up for a free trial now!



Original research

Evidence-based practice models and frameworks in the healthcare setting: a scoping review

Jarrod Dusin

1 Department of Evidence Based Practice, Children’s Mercy Hospitals and Clinics, Kansas City, Missouri, USA

2 Therapeutic Science, The University of Kansas Medical Center, Kansas City, Kansas, USA

Andrea Melanson

Lisa Mische-Lawson

Associated data

bmjopen-2022-071188supp001.pdf

bmjopen-2022-071188supp002.pdf

No data are available.

Abstract

The aim of this scoping review was to identify and review current evidence-based practice (EBP) models and frameworks. Specifically, how EBP models and frameworks used in healthcare settings align with the original model of (1) asking the question, (2) acquiring the best evidence, (3) appraising the evidence, (4) applying the findings to clinical practice and (5) evaluating the outcomes of change, along with patient values and preferences and clinical skills.

Design

A scoping review.

Included sources and articles

Published articles were identified through searches within electronic databases (MEDLINE, EMBASE, Scopus) from January 1990 to April 2022. The English language EBP models and frameworks included in the review all included the five main steps of EBP. Excluded were models and frameworks focused on one domain or strategy (eg, frameworks focused on applying findings).

Results

Of the 20 097 articles found by our search, 19 models and frameworks met our inclusion criteria. The results showed a diverse collection of models and frameworks. Many models and frameworks were well developed and widely used, with supporting validation and updates. Some models and frameworks provided many tools and contextual instruction, while others provided only general process instruction. The models and frameworks reviewed demonstrated that the user must possess EBP expertise and knowledge for the step of assessing evidence. The models and frameworks varied greatly in the level of instruction to assess the evidence. Only seven models and frameworks integrated patient values and preferences into their processes.

Conclusions

Many EBP models and frameworks currently exist that provide diverse instructions on the best way to use EBP. However, the inclusion of patient values and preferences needs to be better integrated into EBP models and frameworks. Also, the issues of EBP expertise and knowledge to assess evidence must be considered when choosing a model or framework.

STRENGTHS AND LIMITATIONS OF THIS STUDY

  • Currently, no comprehensive review exists of evidence-based practice (EBP) models and frameworks.
  • Well-developed models and frameworks may have been excluded for not including all five steps of the original model for EBP.
  • This review did not measure the quality of the models and frameworks based on validated studies.

Introduction

Evidence-based practice (EBP) grew from evidence-based medicine (EBM) to provide a process to review, translate and implement research with practice to improve patient care, treatment and outcomes. Guyatt 1 coined the term EBM in the early 1990s. Over the last 25 years, the field of EBM has continued to evolve and is now a cornerstone of healthcare and a core competency for all medical professionals. 2 3 At first, the term EBM was used only in medicine. However, the term EBP now applies to the principles of other health professions. This expansion of the concept of EBM increases its complexity. 4 The term EBP is used for this paper because it is universal across professions.

Early in the development of EBP, Sackett 5 created an innovative five-step model. This foundational medical model provided a concise overview of the process of EBP. The five steps are (1) asking the question, (2) acquiring the best evidence, (3) appraising the evidence, (4) applying the findings to clinical practice and (5) evaluating the outcomes of change. Other critical components of Sackett’s model are considering patient values and preferences, and clinical skills, alongside the best available evidence. 5 The influence of this model has led to its integration and adaptation into every field of healthcare. Historically, the foundation of EBP has focused on asking the question, acquiring the literature and appraising the evidence, but has had difficulty integrating evidence into practice. 6 Although the five steps appear simple, each area includes a vast number of ways to review the literature (eg, Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA), Newcastle-Ottawa Scale) and entire fields of study, such as implementation science, a field dedicated to implementing EBP. 7 8 Implementation science can be traced to the 1960s with Everett Rogers’ Diffusion of Innovation Theory and has grown alongside EBP over the last 25 years. 7 9

One way to manage the complexity of EBP in healthcare is by developing EBP models and frameworks that establish strategies to determine resource needs, identify barriers and facilitators, and guide processes. 10 EBP models and frameworks provide insight into the complexity of transforming evidence into clinical practice. 11 They also allow organisations to determine readiness, willingness and potential outcomes for a hospital system. 12 EBP can differ from implementation science, as EBP models include all five of Sackett’s steps of EBP, while the non-process models of implementation science typically focus on the final two steps. 5 10 There are published scoping reviews of implementation science; 13 however, no comprehensive review of EBP models and frameworks currently exists. Although there is overlap among EBP, implementation science and knowledge translation models and frameworks, 10 14 the purpose of the scoping review was to explore how EBP models and frameworks used in healthcare settings align with the original EBP five-step model.

Methods

A scoping review synthesises findings across various study types and provides a broad overview of the selected topic. 15 The Arksey and O’Malley method and Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA-ScR) procedures guided this review (see online supplemental PRISMA-ScR checklist). 15 16 The primary author established the research question and inclusion and exclusion criteria before conducting the review. An a priori protocol was not pre-registered. One research question guided the review: Which EBP models and frameworks align with Sackett’s original model?


Eligibility criteria

To be included in the review, English language published EBP models and frameworks needed to include the five main steps of EBP (asking the question, acquiring the best evidence, appraising the evidence, applying the findings to clinical practice and assessing the outcomes of change) based on Sackett’s model. 5 If the models or frameworks involved identifying problems or measured readiness for change, the criteria of ‘asking the question’ was met. Exclusions included models or frameworks focused on one domain or strategy (eg, frameworks focused on applying findings). Also, non-peer-reviewed abstracts, letters, editorials, opinion articles, and dissertations were excluded.

Search and selection

To identify potential studies, a medical librarian searched the databases from January 1990 to April 2022 in MEDLINE, EMBASE and Scopus in collaboration with the primary author. The search was limited to 1990 because the term EBP was coined in the early 90s. The search strategy employed the following keywords: ‘Evidence-Based Practice’ OR ‘evidence based medicine’ OR ‘evidence-based medicine’ OR ‘evidence based nursing’ OR ‘evidence-based nursing’ OR ‘evidence based practice’ OR ‘evidence-based practice’ OR ‘evidence based medicine’ OR ‘evidence-based medicine’ OR ‘evidence based nursing’ OR ‘evidence-based nursing’ OR ‘evidence based practice’ OR ‘evidence-based practice’ AND ‘Hospitals’ OR ‘Hospital Medicine’ OR ‘Nursing’ OR ‘Advanced Practice Nursing’ OR ‘Academic Medical Centers’ OR ‘healthcare’ OR ‘hospital’ OR ‘healthcare’ OR ‘hospital’ AND ‘Models, Organizational’ OR ‘Models, Nursing’ OR ‘framework’ OR ‘theory’ OR ‘theories’ OR ‘model’ OR ‘framework’ OR ‘theory’ OR ‘theories’ OR ‘model’. Additionally, reference lists in publications included for full-text review were screened to identify eligible models and frameworks (see online supplemental appendix A for searches).

Selection of sources of evidence

Two authors (JD and AM) independently screened titles and abstracts and selected studies for potential inclusion in the study, applying the predefined inclusion and exclusion criteria. Both authors then read the full texts of these articles to assess eligibility for final inclusion. Disagreement between the authors regarding eligibility was resolved by consensus between the three authors (JD, AM and LM-L). During the selection process, many models and frameworks were found more than once. Once a model or framework article was identified, the seminal article was reviewed for inclusion. If models or frameworks had been changed or updated since the publication of their seminal article, the most current iteration published was reviewed for inclusion. Once a model or framework was identified and verified for inclusion, all other articles listing the model or framework were excluded. This scoping review intended to identify models or frameworks aligned with Sackett’s model; therefore, analysing every article that used the included model or framework was unnecessary (see online supplemental appendix B for tracking form).

Data extraction and analysis

Data were collected on the following study characteristics: (1) authors, (2) publication year, (3) model or framework and (4) area(s) of focus in reference to Sackett’s five-step model. After initial selection, models and frameworks were analysed for key features and alignment to the five-step EBP process. A data analysis form was developed to map detailed information (see online supplemental appendix C for full data capture form). Data analysis focused on identifying (1) the general themes of the model or frameworks, and (2) any knowledge gaps. Data extraction and analysis were done by the primary author (JD) and verified by one other author (AM). 15

Patient and public involvement

Patients and/or the public were not involved in the design, or conduct, or reporting, or dissemination plans of this research.

Results

The search identified 6523 potentially relevant references (see figure 1). Following a review of the titles and abstracts, the primary author completed a more detailed screening of 37 full papers. From these, 19 models and frameworks were included. Table 1 summarises the 19 models and frameworks. Of the 19 models and frameworks assessed and mapped, 15 had broad target audiences, including healthcare or public health organisations or health systems. Only five models and frameworks included a target audience of individual clinicians (eg, physicians and nurses). 17–22

Figure 1: Retrieval and selection process.

Table 1: Models and frameworks organised by integration of patient preferences and values. EBP, evidence-based practice.

Asking the question

All 19 models and frameworks included a process for asking questions. Most focused on identifying problems that needed to be addressed on an organisational or hospital level. Five used the PICO (population, intervention, comparator, outcome) format to ask specific questions related to patient care (for example, ‘In population P, does intervention I, compared with comparator C, improve outcome O?’). 19–25

Acquiring the evidence

The models and frameworks gave basic instructions on acquiring literature, such as ‘conduct systematic search’ or ‘acquire resource’. 20 Four recommended sources from previously generated evidence, such as guidelines and systematic reviews. 6 21 22 26 Although most models and frameworks did not provide specifics, others suggested this work be done through EBP mentors/experts. 20 21 25 27 Seven models included qualitative evidence in the use of evidence, 6 19 21 24 27–29 while only four models considered the use of patient preference and values as evidence. 21 22 24 27 Six models recommended internal data be used in acquiring information. 17 20–22 24 27

Assessing the evidence

The models and frameworks varied greatly in the level of instruction provided in assessing the best evidence. All provided a general overview in assessing and grading the evidence. Four recommended this work be done by EBP mentors and experts. 20 25 27 30 Seven models developed specific tools to be used to assess the levels of evidence. 6 17 21 22 24 25 27

Applying the evidence

The application of evidence also varied greatly for the different models and frameworks. Seven models recommended pilot programmes to implement change. 6 21–25 31 Five recommended the use of EBP mentors and experts to assist in the implementation of evidence and quality improvement as a strategy of the models and frameworks. 20 24 25 27 Thirteen models and frameworks discussed patient values and preferences, 6 17–19 21–27 31 32 but only seven incorporated this topic into the model or framework, 21–27 and only five included tools and instructions. 21–25 Twelve of the 20 models discussed using clinical skill, but specifics of how this was incorporated were lacking in the models and frameworks. 6 17–19 21–27 31

Evaluating the outcomes of change

Evaluation varied among the models and frameworks, but most involved using implementation outcome measures to determine the project’s success. Five models and frameworks provided tools and in-depth instruction for evaluation. 21 22 24–26 Monash Partners Learning Health Systems provided detailed instruction on using internal institutional data to determine the success of application. 26 This framework uses internal and external data along with evidence in decision making as a benchmark for successful implementation.

Discussion

EBP models and frameworks provide a process for transforming evidence into clinical practice and allow organisations to determine readiness and willingness for change in a complex hospital system. 12 The large number of models and frameworks complicates the choice of the best tool for healthcare organisations. This review examined many models and frameworks and assessed the characteristics and gaps that can better assist healthcare organisations to determine the right tool for themselves. This review identified 19 EBP models and frameworks that included the five main steps of EBP as described by Sackett. 5 The results showed that the themes of the models and frameworks are as diverse as the models and frameworks themselves. Some are well developed and widely used, with supporting validation and updates. 21 22 24 27 One such model, the Iowa EBP model, has received over 3900 requests for permission to use it and has been updated from its initial development and publication. 24 Other models provided tools and contextual instruction, such as the Johns Hopkins model, which includes a large number of supporting tools for developing PICOs, instructions for grading literature and project implementation. 17 21 22 24 27 By contrast, the ACE Star model and An Evidence Implementation Model for Public Health Systems provide only a high-level overview and general instructions compared with other models and frameworks. 19 29 33

Gaps in the evidence

A consistent finding in research on clinicians’ experience with EBP is the lack of expertise needed to assess the literature. 24 34 35 The models and frameworks reviewed demonstrated that the user must possess the knowledge and related skills for this step in the process. The models and frameworks varied greatly in the level of instruction to assess the evidence. Most provided a general overview in assessing and grading the evidence, though a few recommended that this work be done by EBP mentors and experts. 20 25 27 ARCC, JBI and Johns Hopkins provided robust tools and resources that would require administrative time and financial support. 21 22 27 Some models and frameworks offered vital resources or pointed to other resources for assessing evidence, 24 but most did not. While a few used mentors and experts to assist with assessing the literature, a majority did not address this persistent issue.

Sackett’s five-step model included another important consideration when implementing EBP: patient values and preferences. One criticism of EBP is that it ignores patient values and preferences. 36 Over half of the models and frameworks reported the need to include patient values and preferences, but the tools, instruction or resources for including them were limited. The ARCC model integrates patient preferences and values into the model, but it is up to the EBP mentor to accomplish this task. 37 There are many tools for assessing evidence, but few models and frameworks provide this level of guidance for incorporating patient preferences and values. The inclusion of patient and family values and preferences can be misunderstood, insincere, and even tokenistic, but without it the chances of successfully implementing EBP are reduced. 38 39

Strengths and limitations

Similar to other well-designed scoping reviews, the strengths of this review include a rigorous search conducted by a skilled librarian, literature evaluation by more than one person, and the utilisation of an established methodological framework (PRISMA-ScR). 14 15 Additionally, utilising the EBP five-step model as a point of alignment allows for a more comprehensive breakdown and established reference points for the reviewed models and frameworks. While scoping reviews have been completed on implementation science and knowledge translation models and frameworks, to our knowledge, this is the first scoping review of EBP models and frameworks. 13 14 Limitations of the study include that well-developed models and frameworks may have been excluded for not including all five steps. 40 For example, the Promoting Action on Research Implementation in Health Services (PARIHS) framework is a well-developed and validated implementation framework but did not include all five steps of an EBP model. 40 Also, some models and frameworks have been studied and validated over many years. It was beyond the scope of the review to measure the quality of the models and frameworks based on these other validated studies.

Implications and future research

Healthcare organisations can support EBP by choosing a model or framework that best suits their environment and providing clear guidance for implementing the best evidence. Some organisations may find the best fit with the ARCC and the Clinical Scholars Model because of the emphasis on mentors or the Johns Hopkins model for its tools for grading the level of evidence. 21 25 27 In contrast, other organisations may find the Iowa model useful with its feedback loops throughout its process. 24

Another implication of this study is the opportunity to better define and develop robust tools for patient and family values and preferences within EBP models and frameworks. Patient experiences are complex and require thorough exploration so that they are not overlooked, as is often the case. 39 41 The utilisation of EBP models and frameworks provides an opportunity to explore this area and provide the resources and understanding that are often lacking. 38 Though varying, models such as the Iowa Model, JBI and Johns Hopkins developed tools to incorporate patient and family values and preferences, but a majority of the models and frameworks did not. 21 22 24 An opportunity exists to create broad tools that can incorporate patient and family values and preferences into EBP to a similar extent as many of the models and frameworks used for developing tools for literature assessment and implementation. 21–25

Future research should consider appraising the quality and use of the different EBP models and frameworks to determine success. Additionally, greater clarification on what is considered patient and family values and preferences and how they can be integrated into the different models and frameworks is needed.

Conclusion

This scoping review of 19 models and frameworks shows considerable variation regarding how the EBP models and frameworks integrate the five steps of EBP. Most of the included models and frameworks provided a narrow description of the steps needed to assess and implement EBP, while a few provided robust instruction and tools. The reviewed models and frameworks provided diverse instructions on the best way to use EBP. However, the inclusion of patient values and preferences needs to be better integrated into EBP models. Also, the issues of EBP expertise to assess evidence must be considered when selecting a model or framework.


Acknowledgments

We thank Keri Swaggart for completing the database searches and the Medical Writing Center at Children's Mercy Kansas City for editing this manuscript.

Contributors: All authors have read and approved the final manuscript. JD conceptualised the study design, screened the articles for eligibility, extracted data from included studies and contributed to the writing and revision of the manuscript. LM-L conceptualised the study design, provided critical feedback on the manuscript and revised the manuscript. AM screened the articles for eligibility, extracted data from the studies, provided critical feedback on the manuscript and revised the manuscript. JD is the guarantor of this work.

Funding: The article processing charges related to the publication of this article were supported by The University of Kansas (KU) One University Open Access Author Fund sponsored jointly by the KU Provost, KU Vice Chancellor for Research, and KUMC Vice Chancellor for Research and managed jointly by the Libraries at the Medical Center and KU - Lawrence

Disclaimer: No funding agencies had input into the content of this manuscript.

Competing interests: None declared.

Patient and public involvement: Patients and/or the public were not involved in the design, or conduct, or reporting, or dissemination plans of this research.

Provenance and peer review: Not commissioned; externally peer reviewed.

Supplemental material: This content has been supplied by the author(s). It has not been vetted by BMJ Publishing Group Limited (BMJ) and may not have been peer-reviewed. Any opinions or recommendations discussed are solely those of the author(s) and are not endorsed by BMJ. BMJ disclaims all liability and responsibility arising from any reliance placed on the content. Where the content includes any translated material, BMJ does not warrant the accuracy and reliability of the translations (including but not limited to local regulations, clinical guidelines, terminology, drug names and drug dosages), and is not responsible for any error and/or omissions arising from translation and adaptation or otherwise.

Data availability statement

No data are available.

Ethics statements

Patient consent for publication

Not applicable.


Research Report – Example, Writing Guide and Types


Research Report

Definition:

Research Report is a written document that presents the results of a research project or study, including the research question, methodology, results, and conclusions, in a clear and objective manner.

The purpose of a research report is to communicate the findings of the research to the intended audience, which could be other researchers, stakeholders, or the general public.

Components of Research Report

Components of Research Report are as follows:

Introduction

The introduction sets the stage for the research report and provides a brief overview of the research question or problem being investigated. It should include a clear statement of the purpose of the study and its significance or relevance to the field of research. It may also provide background information or a literature review to help contextualize the research.

Literature Review

The literature review provides a critical analysis and synthesis of the existing research and scholarship relevant to the research question or problem. It should identify the gaps, inconsistencies, and contradictions in the literature and show how the current study addresses these issues. The literature review also establishes the theoretical framework or conceptual model that guides the research.

Methodology

The methodology section describes the research design, methods, and procedures used to collect and analyze data. It should include information on the sample or participants, data collection instruments, data collection procedures, and data analysis techniques. The methodology should be clear and detailed enough to allow other researchers to replicate the study.

Results

The results section presents the findings of the study in a clear and objective manner. It should provide a detailed description of the data and statistics used to answer the research question or test the hypothesis. Tables, graphs, and figures may be included to help visualize the data and illustrate the key findings.

Discussion

The discussion section interprets the results of the study and explains their significance or relevance to the research question or problem. It should also compare the current findings with those of previous studies and identify the implications for future research or practice. The discussion should be based on the results presented in the previous section and should avoid speculation or unfounded conclusions.

Conclusion

The conclusion summarizes the key findings of the study and restates the main argument or thesis presented in the introduction. It should also provide a brief overview of the contributions of the study to the field of research and the implications for practice or policy.

References

The references section lists all the sources cited in the research report, following a specific citation style, such as APA or MLA.

Appendices

The appendices section includes any additional material, such as data tables, figures, or instruments used in the study, that could not be included in the main text due to space limitations.

Types of Research Report

Types of Research Report are as follows:

Thesis

Thesis is a type of research report. A thesis is a long-form research document that presents the findings and conclusions of an original research study conducted by a student as part of a graduate or postgraduate program. It is typically written by a student pursuing a higher degree, such as a Master’s or Doctoral degree, although it can also be written by researchers or scholars in other fields.

Research Paper

Research paper is a type of research report. A research paper is a document that presents the results of a research study or investigation. Research papers can be written in a variety of fields, including science, social science, humanities, and business. They typically follow a standard format that includes an introduction, literature review, methodology, results, discussion, and conclusion sections.

Technical Report

A technical report is a detailed report that provides information about a specific technical or scientific problem or project. Technical reports are often used in engineering, science, and other technical fields to document research and development work.

Progress Report

A progress report provides an update on the progress of a research project or program over a specific period of time. Progress reports are typically used to communicate the status of a project to stakeholders, funders, or project managers.

Feasibility Report

A feasibility report assesses the feasibility of a proposed project or plan, providing an analysis of the potential risks, benefits, and costs associated with the project. Feasibility reports are often used in business, engineering, and other fields to determine the viability of a project before it is undertaken.

Field Report

A field report documents observations and findings from fieldwork, which is research conducted in the natural environment or setting. Field reports are often used in anthropology, ecology, and other social and natural sciences.

Experimental Report

An experimental report documents the results of a scientific experiment, including the hypothesis, methods, results, and conclusions. Experimental reports are often used in biology, chemistry, and other sciences to communicate the results of laboratory experiments.

Case Study Report

A case study report provides an in-depth analysis of a specific case or situation, often used in psychology, social work, and other fields to document and understand complex cases or phenomena.

Literature Review Report

A literature review report synthesizes and summarizes existing research on a specific topic, providing an overview of the current state of knowledge on the subject. Literature review reports are often used in social sciences, education, and other fields to identify gaps in the literature and guide future research.

Research Report Example

Following is a sample research report for students:

Title: The Impact of Social Media on Academic Performance among High School Students

Abstract:

This study aims to investigate the relationship between social media use and academic performance among high school students. The study utilized a quantitative research design, which involved a survey questionnaire administered to a sample of 200 high school students. The findings indicate that there is a negative correlation between social media use and academic performance, suggesting that excessive social media use can lead to poor academic performance among high school students. The results of this study have important implications for educators, parents, and policymakers, as they highlight the need for strategies that can help students balance their social media use and academic responsibilities.

Introduction:

Social media has become an integral part of the lives of high school students. With the widespread use of social media platforms such as Facebook, Twitter, Instagram, and Snapchat, students can connect with friends, share photos and videos, and engage in discussions on a range of topics. While social media offers many benefits, concerns have been raised about its impact on academic performance. Many studies have found a negative correlation between social media use and academic performance among high school students (Kirschner & Karpinski, 2010; Paul, Baker, & Cochran, 2012).

Given the growing importance of social media in the lives of high school students, it is important to investigate its impact on academic performance. This study aims to address this gap by examining the relationship between social media use and academic performance among high school students.

Methodology:

The study utilized a quantitative research design, which involved a survey questionnaire administered to a sample of 200 high school students. The questionnaire was developed based on previous studies and was designed to measure the frequency and duration of social media use, as well as academic performance.

The participants were selected using a convenience sampling technique, and the survey questionnaire was distributed in the classroom during regular school hours. The data collected were analyzed using descriptive statistics and correlation analysis.
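To make the analysis step concrete, the following is a minimal sketch of the kind of descriptive statistics and correlation analysis described above. The data values and column names (daily_social_media_hours, gpa) are fabricated for illustration and are not taken from the study.

```python
import pandas as pd
from scipy import stats

# Hypothetical survey responses (fabricated values, illustration only)
df = pd.DataFrame({
    "daily_social_media_hours": [1.0, 2.5, 3.0, 4.5, 5.0, 2.0, 6.0, 3.5],
    "gpa":                      [3.8, 3.5, 3.2, 2.9, 2.7, 3.6, 2.4, 3.1],
})

# Descriptive statistics for each variable
print(df.describe())

# Pearson correlation between social media use and academic performance;
# a negative r would mirror the relationship reported in the example.
r, p_value = stats.pearsonr(df["daily_social_media_hours"], df["gpa"])
print(f"Pearson r = {r:.2f}, p = {p_value:.4f}")
```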

Results:

The findings indicate that the majority of high school students use social media platforms on a daily basis, with Facebook being the most popular platform. The results also show a negative correlation between social media use and academic performance, suggesting that excessive social media use can lead to poor academic performance among high school students.

Discussion:

The results of this study have important implications for educators, parents, and policymakers. The negative correlation between social media use and academic performance suggests that strategies should be put in place to help students balance their social media use and academic responsibilities. For example, educators could incorporate social media into their teaching strategies to engage students and enhance learning. Parents could limit their children’s social media use and encourage them to prioritize their academic responsibilities. Policymakers could develop guidelines and policies to regulate social media use among high school students.

Conclusion:

In conclusion, this study provides evidence of the negative impact of social media on academic performance among high school students. The findings highlight the need for strategies that can help students balance their social media use and academic responsibilities. Further research is needed to explore the specific mechanisms by which social media use affects academic performance and to develop effective strategies for addressing this issue.

Limitations:

One limitation of this study is the use of convenience sampling, which limits the generalizability of the findings to other populations. Future studies should use random sampling techniques to increase the representativeness of the sample. Another limitation is the use of self-reported measures, which may be subject to social desirability bias. Future studies could use objective measures of social media use and academic performance, such as tracking software and school records.

Implications:

The findings of this study have important implications for educators, parents, and policymakers. Educators could incorporate social media into their teaching strategies to engage students and enhance learning. For example, teachers could use social media platforms to share relevant educational resources and facilitate online discussions. Parents could limit their children’s social media use and encourage them to prioritize their academic responsibilities. They could also engage in open communication with their children to understand their social media use and its impact on their academic performance. Policymakers could develop guidelines and policies to regulate social media use among high school students. For example, schools could implement social media policies that restrict access during class time and encourage responsible use.

References:

  • Kirschner, P. A., & Karpinski, A. C. (2010). Facebook® and academic performance. Computers in Human Behavior, 26(6), 1237-1245.
  • Paul, J. A., Baker, H. M., & Cochran, J. D. (2012). Effect of online social networking on student academic performance. Journal of the Research Center for Educational Technology, 8(1), 1-19.
  • Pantic, I. (2014). Online social networking and mental health. Cyberpsychology, Behavior, and Social Networking, 17(10), 652-657.
  • Rosen, L. D., Carrier, L. M., & Cheever, N. A. (2013). Facebook and texting made me do it: Media-induced task-switching while studying. Computers in Human Behavior, 29(3), 948-958.

Note: The example above is only a sample to guide students. Do not copy and paste it directly as your college or university assignment; do your own research and write your own report.

Applications of Research Report

Research reports have many applications, including:

  • Communicating research findings: The primary application of a research report is to communicate the results of a study to other researchers, stakeholders, or the general public. The report serves as a way to share new knowledge, insights, and discoveries with others in the field.
  • Informing policy and practice: Research reports can inform policy and practice by providing evidence-based recommendations for decision-makers. For example, a research report on the effectiveness of a new drug could inform regulatory agencies in their decision-making process.
  • Supporting further research: Research reports can provide a foundation for further research in a particular area. Other researchers may use the findings and methodology of a report to develop new research questions or to build on existing research.
  • Evaluating programs and interventions: Research reports can be used to evaluate the effectiveness of programs and interventions in achieving their intended outcomes. For example, a research report on a new educational program could provide evidence of its impact on student performance.
  • Demonstrating impact: Research reports can be used to demonstrate the impact of research funding or to evaluate the success of research projects. By presenting the findings and outcomes of a study, research reports can show the value of research to funders and stakeholders.
  • Enhancing professional development: Research reports can be used to enhance professional development by providing a source of information and learning for researchers and practitioners in a particular field. For example, a research report on a new teaching methodology could provide insights and ideas for educators to incorporate into their own practice.

How to write Research Report

Here are some steps you can follow to write a research report:

  • Identify the research question: The first step in writing a research report is to identify your research question. This will help you focus your research and organize your findings.
  • Conduct research: Once you have identified your research question, you will need to conduct research to gather relevant data and information. This can involve conducting experiments, reviewing literature, or analyzing data.
  • Organize your findings: Once you have gathered all of your data, you will need to organize your findings in a way that is clear and understandable. This can involve creating tables, graphs, or charts to illustrate your results.
  • Write the report: Once you have organized your findings, you can begin writing the report. Start with an introduction that provides background information and explains the purpose of your research. Next, provide a detailed description of your research methods and findings. Finally, summarize your results and draw conclusions based on your findings.
  • Proofread and edit: After you have written your report, be sure to proofread and edit it carefully. Check for grammar and spelling errors, and make sure that your report is well-organized and easy to read.
  • Include a reference list: Be sure to include a list of references that you used in your research. This will give credit to your sources and allow readers to further explore the topic if they choose.
  • Format your report: Finally, format your report according to the guidelines provided by your instructor or organization. This may include formatting requirements for headings, margins, fonts, and spacing.

Purpose of Research Report

The purpose of a research report is to communicate the results of a research study to a specific audience, such as peers in the same field, stakeholders, or the general public. The report provides a detailed description of the research methods, findings, and conclusions.

Some common purposes of a research report include:

  • Sharing knowledge: A research report allows researchers to share their findings and knowledge with others in their field. This helps to advance the field and improve the understanding of a particular topic.
  • Identifying trends: A research report can identify trends and patterns in data, which can help guide future research and inform decision-making.
  • Addressing problems: A research report can provide insights into problems or issues and suggest solutions or recommendations for addressing them.
  • Evaluating programs or interventions: A research report can evaluate the effectiveness of programs or interventions, which can inform decision-making about whether to continue, modify, or discontinue them.
  • Meeting regulatory requirements: In some fields, research reports are required to meet regulatory requirements, such as in the case of drug trials or environmental impact studies.

When to Write Research Report

A research report should be written after completing the research study. This includes collecting data, analyzing the results, and drawing conclusions based on the findings. Once the research is complete, the report should be written in a timely manner while the information is still fresh in the researcher’s mind.

In academic settings, research reports are often required as part of coursework or as part of a thesis or dissertation. In this case, the report should be written according to the guidelines provided by the instructor or institution.

In other settings, such as in industry or government, research reports may be required to inform decision-making or to comply with regulatory requirements. In these cases, the report should be written as soon as possible after the research is completed in order to inform decision-making in a timely manner.

Overall, the timing of when to write a research report depends on the purpose of the research, the expectations of the audience, and any regulatory requirements that need to be met. However, it is important to complete the report in a timely manner while the information is still fresh in the researcher’s mind.

Characteristics of Research Report

There are several characteristics of a research report that distinguish it from other types of writing. These characteristics include:

  • Objective: A research report should be written in an objective and unbiased manner. It should present the facts and findings of the research study without any personal opinions or biases.
  • Systematic: A research report should be written in a systematic manner. It should follow a clear and logical structure, and the information should be presented in a way that is easy to understand and follow.
  • Detailed: A research report should be detailed and comprehensive. It should provide a thorough description of the research methods, results, and conclusions.
  • Accurate: A research report should be accurate and based on sound research methods. The findings and conclusions should be supported by data and evidence.
  • Organized: A research report should be well-organized. It should include headings and subheadings to help the reader navigate the report and understand the main points.
  • Clear and concise: A research report should be written in clear and concise language. The information should be presented in a way that is easy to understand, and unnecessary jargon should be avoided.
  • Citations and references: A research report should include citations and references to support the findings and conclusions. This helps to give credit to other researchers and to provide readers with the opportunity to further explore the topic.

Advantages of Research Report

Research reports have several advantages, including:

  • Communicating research findings: Research reports allow researchers to communicate their findings to a wider audience, including other researchers, stakeholders, and the general public. This helps to disseminate knowledge and advance the understanding of a particular topic.
  • Providing evidence for decision-making: Research reports can provide evidence to inform decision-making, such as in the case of policy-making, program planning, or product development. The findings and conclusions can help guide decisions and improve outcomes.
  • Supporting further research: Research reports can provide a foundation for further research on a particular topic. Other researchers can build on the findings and conclusions of the report, which can lead to further discoveries and advancements in the field.
  • Demonstrating expertise: Research reports can demonstrate the expertise of the researchers and their ability to conduct rigorous and high-quality research. This can be important for securing funding, promotions, and other professional opportunities.
  • Meeting regulatory requirements: In some fields, research reports are required to meet regulatory requirements, such as in the case of drug trials or environmental impact studies. Producing a high-quality research report can help ensure compliance with these requirements.

Limitations of Research Report

Despite their advantages, research reports also have some limitations, including:

  • Time-consuming: Conducting research and writing a report can be a time-consuming process, particularly for large-scale studies. This can limit the frequency and speed of producing research reports.
  • Expensive: Conducting research and producing a report can be expensive, particularly for studies that require specialized equipment, personnel, or data. This can limit the scope and feasibility of some research studies.
  • Limited generalizability: Research studies often focus on a specific population or context, which can limit the generalizability of the findings to other populations or contexts.
  • Potential bias: Researchers may have biases or conflicts of interest that can influence the findings and conclusions of the research study. Additionally, participants may also have biases or may not be representative of the larger population, which can limit the validity and reliability of the findings.
  • Accessibility: Research reports may be written in technical or academic language, which can limit their accessibility to a wider audience. Additionally, some research may be behind paywalls or require specialized access, which can limit the ability of others to read and use the findings.


Gravity Models for Global Migration Flows: A Predictive Evaluation

Original Research | Open access | Published: 02 April 2024 | Volume 43, article number 29 (2024)


Juan Caballero Reina, Jesus Crespo Cuaresma (ORCID: orcid.org/0000-0003-3244-6560), Katharina Fenz, Jakob Zellmann, Teodor Yankov & Amr Taha

Abstract

This study introduces a comprehensive econometric framework based on gravity equations and designed to forecast migrant flows between countries. The model’s theoretical underpinnings are validated through empirical data, and we show that the model has better out-of-sample predictive ability than alternative global models. We explore the quantitative effects of various socioeconomic, demographic, and geographic factors on migration and illustrate its use to obtain scenario-driven projections of bilateral migration, assessing the potential contributions of migration to population and GDP dynamics in Germany and Portugal for the period 2021–2025. Our projection results highlight the critical role of immigration in sustaining population levels and economic growth, particularly in the context of ageing populations and decreasing fertility rates across Europe.


Introduction

The world is undergoing unprecedented changes in its age structure, fertility and mortality (Mason, 2022). On the one hand, certain countries around the world, from Japan and South Korea to Italy and much of Eastern Europe, are experiencing population decline (Bricker & Ibbitson, 2019). According to Vollset et al. (2020, p. 1285), 23 countries including Spain, Japan and Thailand are forecast to undergo population declines larger than 50% between 2017 and 2100. Among richer countries, the issue is not simply one of declining populations, but also of ageing. Currently, the working-age population accounts for more than 65% of the world population, outnumbering the older age group (65+) by almost seven times (UNDESA, 2019). However, the ratio of the working-age population to the older population is expected to fall to 5.5 by the year 2030, altering fundamental aspects of society such as labour force participation (Baker et al., 2005). Moreover, a declining and ageing population increases the burden on the capacity of public services (Lubitz, 2003), finances (Bloom, 2015), as well as social and family support networks (Prince, 2015).

In countries facing ageing and declining populations, migration can reduce old-age dependency ratios, and other factors, such as higher labour force participation among women and better-educated individuals, may help curb these demographic impacts (Lee, 2014). To investigate the magnitude of the effect of international mobility and to design evidence-based migration policy, policy makers need accurate estimates of current migration and reliable forecasts of their future change, as well as credible predictions of their effects on economic growth.

In this contribution, we present an approach for producing bilateral international migration flow predictions based on gravity models for short-term projection (5 years ahead). We augment the standard gravity model of migration by including additional social and economic variables known to impact migration and assess the forecasting accuracy of different specifications, making use of out-of-sample predictive validation. We exemplify the usefulness of estimated gravity models by creating migration forecasts and comparing them to ‘zero immigration’ scenarios for population and gross domestic product (GDP) in Germany and Portugal. The choice of Germany and Portugal is justified as they exhibit low fertility rates and increasing inflows of migrants in recent years. In addition, reliable input data is available for both nations and their governments hold a rather stable position towards migration, which makes it easier to create credible projections of expected future developments.

The central underlying assumption of gravity models for international mobility is that migration flows between two countries are proportional to their size, i.e., to their total populations, and inversely proportional to the geodesic distance between them, which acts as a proxy for transportation costs (see, for example, Ramos, 2016). In addition to their intuitive appeal, gravity models can be augmented in a flexible manner with additional potential socioeconomic determinants of migration activity, such as GDP per capita, the relative size of the middle class, the ratio of the working-age population to children and elderly, fertility rates and the size of the existing stock of migrants of a given origin in a destination country. Moreover, gravity models also have a solid theoretical foundation in the random utility maximization (RUM) model. The RUM model is an economic framework that explores why individuals choose to migrate by considering their preferences and perceived benefits and costs of different locations. Such a framework posits that individuals select the migration destination that maximizes their overall satisfaction, leading to the calculation of choice probabilities based on the total utility of their action.
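As a compact statement of this assumption, the canonical gravity specification and its log-linear estimating form can be written as below. This is a generic textbook formulation rather than the authors’ exact specification; the vector X_ij is a placeholder for the additional socioeconomic covariates listed above.

```latex
% Canonical gravity model of bilateral migration: flows M_ij grow with the
% populations P_i, P_j of origin i and destination j and shrink with the
% geodesic distance D_ij. Taking logs yields the usual estimating equation;
% X_ij collects the augmenting covariates (GDP per capita, migrant stocks,
% fertility rates, ...) and \varepsilon_ij is an error term.
\[
  M_{ij} = G \, \frac{P_i^{\alpha} P_j^{\beta}}{D_{ij}^{\gamma}}
  \qquad \Longrightarrow \qquad
  \ln M_{ij} = \ln G + \alpha \ln P_i + \beta \ln P_j
             - \gamma \ln D_{ij} + \delta' X_{ij} + \varepsilon_{ij}
\]
```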

The use of gravity models in the context of migration goes back to Ravenstein (1885), who identifies gravity-like properties of international migration in the context of the United Kingdom, as well as Zipf (1946), who applies a gravity approach to analyse U.S. intercity migration. Another example of a migration analysis based on the gravity approach is given by Karemera et al. (2000), who put forward a gravity model of international migration for North America and identify the population size of origin countries and the income of destination economies as two significant determinants of mobility to the region. Cohen et al. (2008) also ground their approach in a gravity model and propose a generalized linear model based only on geographic and demographic independent variables. Kim and Cohen (2010) analyse the determinants of international migration flows to and from industrialized countries based on panel data and a gravity model specification that uses demographic, geographic and socioeconomic explanatory variables. Another example of panel data used in a gravity model of migration is Mayda (2010), who focuses on the determinants of migration inflows into 14 OECD countries. In particular, this study analyses the effect of income in countries of origin and destination on migration flows.

Traditionally, forecasts of migration flows have been based on relatively simple extrapolation of past data, expert opinion, or existing correlations between migration and economic or demographic data (Disney et al., 2015). While accurate knowledge of actual and projected migration flows is central to planning and implementing policy instruments, migration can be affected by many social, economic, and political drivers, making forecasting exercises difficult. Assessing the predictive performance of different methodological approaches to creating forecasts of bilateral migration flows thus appears particularly important when selecting statistical models for migration flows (Aslany et al., 2021; Disney et al., 2015). Azose and Raftery (2015), who compare the performance of Bayesian probabilistic projections, persistence models, and gravity models of migration based on out-of-sample validation, find that their Bayesian hierarchical model outperforms an approach based on a gravity model as described in Cohen (2012). As opposed to our analysis, Azose and Raftery (2015) focus on country-level net migration instead of bilateral flows, and aim at creating projections over a long time horizon.

Building on Azose and Raftery (2015) and other studies, Sardoschau (2020) collects and visualizes migration predictions developed by several leading experts in the field of migration modelling. Focusing on relatively long-term forecasts of net migration flows, the study compares the performance of gravity models to that of structural and Bayesian specifications. The analysis shows that gravity models perform slightly worse than structural models regarding theoretical foundation, transparency of assumptions, and predictive power, but have lower data requirements. In contrast, when comparing gravity models to Bayesian models, it finds that gravity models have slightly higher data requirements but a more robust theoretical foundation and a similar level of transparency in the underlying assumptions. An important recent critique of gravity models comes from Beyer et al. (2022), who question the explanatory power of gravity models for variation in migration flows over time for pairs of countries. In particular, the analysis in Beyer et al. (2022) concludes that while gravity models describe spatial patterns of international migration very well, they do not capture temporal dynamics better than averages of historical flows.

The aim of this paper is to rigorously evaluate the predictive ability of gravity models of migration for bilateral flows through a forecasting exercise, comparing the predictive ability of gravity specifications with that of averages of the historical flows (in the spirit of Beyer et al., 2022). We assess the demographic, geographical, and socioeconomic factors that appear empirically relevant to explaining and forecasting migration patterns. Our results indicate that the best predictive performance is delivered by econometric models for migration which, in addition to the standard gravity variables, incorporate information about diasporas, demographic factors and labour market outcomes. We exemplify the use of these models to create projections of the future contribution of migration to population and GDP dynamics.

The remainder of this paper is organized as follows. In Sect. 2, we describe our input data, our specification of the gravity model and the other statistical methods that we use for comparison. Section 3 presents the results and Sect. 4 concludes.

Gravity Models of Migration: Specifications and Data

In the framework of gravity models, migration flows between countries are linked to their respective size and the distance between them (as a proxy of mobility costs). Such a relationship implies an underlying data generating process that links migration flows from origin country i to destination country j in period t (\(m_{i,j,t}\)) to the size, measured by total population, of origin and destination countries (\(S_{i,t}\) and \(S_{j,t}\), respectively) and the geodesic distance that separates them (\(d_{i,j}\)),

$$ m_{i,j,t} \;=\; c \, \frac{S_{i,t}^{\beta_1} \, S_{j,t}^{\beta_2}}{d_{i,j}^{\beta_3}} \, \epsilon_{i,j,t}, \qquad \qquad (1) $$

where \(\epsilon _{i,j,t}\) is a stochastic error term and c represents a scaling constant. The theoretical basis of such a specification is motivated by migration decisions based on their potential gains to expected utility (see, for instance, Ortega & Peri, 2013). Additional push and pull factors of origin and destination countries that are assumed to influence migrants’ decisions can be incorporated into Eq. (1). Besides the standard gravity model, we also estimate specifications that include (a) socioeconomic factors such as GDP per capita and the relative size of the middle class, to capture economic incentives that act as pull and push factors for migration, as well as the ratio of the working-age population (15–64) to the total population and unemployment rates as proxies for labour market needs; (b) demographic characteristics such as fertility rates and the share of people with at least secondary education; (c) diaspora variables measuring the existing number of migrants of a given origin in the destination country and the flow of migrants in the last period (i.e., with a five-year lag); and (d) dummy variables indicating whether origin and destination countries share a common border or a common (official) language.

Summarizing these additional origin-specific variables in the vector \({\textbf{Z}}_{i,t}\), the destination-specific variables in \({\textbf{Z}}_{j,t}\) and the bilateral factors in \({\textbf{X}}_{i,j,t}\), and using a linear model in (natural) logs, the specifications we use in our forecasting exercise are nested in the model given by

$$ \log m_{i,j,t} \;=\; \log c + \beta_1 \log S_{i,t} + \beta_2 \log S_{j,t} - \beta_3 \log d_{i,j} + \varvec{\theta }' {\textbf{Z}}_{i,t} + \varvec{\phi }' {\textbf{Z}}_{j,t} + \varvec{\eta }' {\textbf{X}}_{i,j,t} + \mu_{i,j,t}, \qquad \qquad (2) $$

where the vectors \(\varvec{\theta }\), \(\varvec{\phi }\) and \(\varvec{\eta }\) summarize the effect of the variables in \({\textbf{Z}}_{i,t}\), \({\textbf{Z}}_{j,t}\) and \({\textbf{X}}_{i,j,t}\), respectively, and \(\mu _{i,j,t} = \log \epsilon _{i,j,t}\) is assumed to fulfil the assumptions of the standard linear regression model.
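To illustrate how a specification nested in Eq. (2) can be estimated, the following minimal Python sketch fits the log-linear model by OLS with period fixed effects. This is not the authors' replication code (that is linked at the end of the paper); the input file and column names are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical input: one row per origin-destination pair and five-year period.
df = pd.read_csv("bilateral_flows.csv")

# Log-transform the flow and the core gravity covariates (zero flows would
# need separate treatment, e.g. the Poisson models discussed later).
for col in ["flow", "pop_orig", "pop_dest", "gdppc_orig", "gdppc_dest", "distance"]:
    df[f"log_{col}"] = np.log(df[col])

# A specification in the spirit of GM-SMALL: population, GDP per capita,
# distance, plus period fixed effects via C(period).
gm_small = smf.ols(
    "log_flow ~ log_pop_orig + log_pop_dest + log_gdppc_orig"
    " + log_gdppc_dest + log_distance + C(period)",
    data=df,
).fit()
print(gm_small.summary())
```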

To measure the predictive power of the proposed gravity model, we assess the quality of the forecasts produced in a pseudo-out-of-sample predictive validation exercise and benchmark the forecasts from several specifications of the form given by Eq. (2) against those of simpler heuristic models: (i) the naive approach of using migration in the last observation period as the prediction for the following period (random walk model); (ii) using the historical average for each given origin and destination country as the prediction (historical mean model); and (iii) a simple autoregressive model, where the forecasts are obtained from a model that projects the (cross-sectional) flows on their lagged values and bilateral origin–destination fixed effects as explanatory covariates.
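The three heuristic benchmarks are simple enough to sketch in a few lines of Python (continuing the hypothetical data frame from the previous snippet). The within-pair demeaning in (iii) is one common way to absorb bilateral fixed effects, used here as an assumption of the sketch rather than a description of the authors' exact implementation.

```python
import statsmodels.formula.api as smf

df = df.sort_values(["origin", "dest", "period"])
pair = df.groupby(["origin", "dest"])

# (i) Random walk: the last observed flow is the forecast for the next period.
df["rw_forecast"] = pair["log_flow"].shift(1)

# (ii) Historical mean: average of all previously observed flows for the pair.
df["hm_forecast"] = pair["log_flow"].transform(lambda s: s.shift(1).expanding().mean())

# (iii) AR(1) with bilateral fixed effects: demean flow and lag within each
# pair (equivalent to pair dummies), then regress the flow on its lag.
df["lag_log_flow"] = pair["log_flow"].shift(1)
ar_df = df.dropna(subset=["lag_log_flow"]).copy()
for col in ["log_flow", "lag_log_flow"]:
    ar_df[col + "_dm"] = ar_df[col] - ar_df.groupby(["origin", "dest"])[col].transform("mean")
ar1 = smf.ols("log_flow_dm ~ lag_log_flow_dm - 1", data=ar_df).fit()
```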

The bilateral migration flow data required to estimate the gravity specifications are sourced from Abel and Cohen (2019), who provide information for 200 countries in 5-year intervals ranging from 1990 to 2020. Specifically, we use their results based on a closed demographic accounting system and a minimization approach to estimate the missing bilateral migration flows. For the set of independent covariates in the gravity models, we employ data from several sources. Information on national GDP per capita is sourced from the World Economic Outlook of the International Monetary Fund (2022) and data on population from the UNDESA (2019) World Population Prospects (WPP) dataset. Data on migrant stocks are obtained from the United Nations Department of Economic and Social Affairs (UNDESA, 2020). For each five-year interval of migration flow data, we use migrant stock data referring to the beginning of the corresponding period. We also employ a five-year lag of the bilateral migration flows as an additional covariate. Information on the share of people of working age (15–64) in the total population is also sourced from the United Nations’ WPP dataset. The share of persons with post-secondary education and fertility rates are obtained from the Wittgenstein Centre (2018). Data on the share of the middle class (defined as households spending $11–110 per day per person in 2011 purchasing power parity, or PPP) in the general population are obtained from the World Data Lab (2022) World Data Pro. Finally, information on unemployment is sourced from the International Labour Organization (2022). Our final data set contains (bilateral) information on 177 origin and destination countries. Information on some variables is missing for particular country pairs and periods. The final unbalanced panel data set contains 116,460 observations, and we utilize the balanced sub-panel containing 113,700 observations composed of 28,425 country pairs and four five-year periods starting in 2000. Footnote 1
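As a small illustration of this panel construction (again using the hypothetical columns from the earlier sketches), restricting the estimation sample to pairs observed in all four five-year periods can be done with a group filter:

```python
# Keep only country pairs with complete observations in all four
# five-year periods from 2000 onward (the balanced sub-panel).
balanced = (
    df[df["period"] >= 2000]
    .groupby(["origin", "dest"])
    .filter(lambda g: g["period"].nunique() == 4)
)
```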

We consider four different specifications of the form given by Eq. (2) that differ in the number and nature of the regressors included as controls. The variables included in each of the estimated specifications can be found in Table 1. The estimation results for the parameters of the models are presented in Table 2. The first column shows the estimates corresponding to a basic gravity model which only includes population, GDP per capita, distance, and period fixed effects as covariates (GM-SMALL). In the second column, we present the estimated effects for the extended specification which includes the full set of explanatory variables introduced in the section above (GM-LARGE). The estimates presented in the third column correspond to a model that, in addition to these variables, also includes origin and destination country fixed effects (GM-LARGE-FE). In column 4, we consider a model including the interaction of these fixed effects, that is, bilateral origin–destination fixed effects (GM-LARGE-BFE). Note that all time-invariant variables are perfectly collinear with those bilateral origin–destination fixed effects and are therefore excluded from the estimation of the GM-LARGE-BFE model.

The intuitive theoretical relationships implied by the simple gravity model are qualitatively validated in the data. The variables that capture country size have effect estimates which are statistically significant and positive, whereas distance appears negatively related to migration flows. The sign of the effect of these variables is not affected by the inclusion of additional covariates, but the magnitude of the parameter estimates decreases. When controlling for origin- and destination-specific fixed effects, the parameter estimates of population and GDP per capita remain significant, indicating that these variables provide information about country-specific outflows and inflows beyond their mean values. The intuitive direction of the effects predicted by the standard gravity model of migration is thus validated by the cross-sectional variation of migration flows in our dataset. Once the variation across country pairs is controlled for, the estimated parameters obtained by exploiting variation over time are clearly different from those in the models without fixed effects. The parameter estimate of population in the destination country is negative in the model including country-specific fixed effects, suggesting that while countries with large populations tend to experience larger migration flows, population changes tend to correlate negatively with changes in migration flows on average, after controlling for the other factors included in the model. The same interpretation explains the estimated negative effect of GDP per capita in origin countries on migration flows, and similar results are obtained for the model that includes bilateral fixed effects. The significant effect of lagged migration flows and the effects of the migration stock variables suggest an important role of persistence effects of migration and diaspora networks as determinants of future migration flows.

We find additional significant effects for several socioeconomic and demographic covariates included in the specification. The share of persons with secondary or higher education in the origin country has a positive and sizeable effect on migration flows, especially when controlling for fixed effects. Furthermore, the ratio of working-age population to total population in the origin (destination) country positively (negatively) influences the magnitude of migration flows between pairs of economies. However, controlling for country-specific fixed effects reverses the direction of these results, indicating that the effect of this variable in the GM-LARGE specification is mostly driven by cross-country variation, as opposed to variation over time. Similarly, we find that high-fertility origin countries tend to have higher emigration, whereas high fertility rates in destination countries tend to reduce immigration flows. These results still hold when controlling for country-specific fixed effects, but change direction when controlling for bilateral origin–destination fixed effects, indicating that the effects implied by cross-sectional variation and those implied by time variation can be very different.

Out-of-sample Prediction Validation

To assess the predictive power of the gravity models for migration, we estimate our gravity specifications using data for the period 1995 to 2015, use the estimated models to obtain forecasts for 2015–2020, and compare the predictive ability of our models with that of heuristic methods based on random walk specifications, historical averages of migration flows and simple autoregressive models. We compute three measures of prediction accuracy: (i) the root mean square forecast error (RMSE), as a measure of discrepancy between realized and predicted values; (ii) the mean directional accuracy (MDA), which measures the share of correctly predicted changes in migration flows (based on predicted increases/decreases); and (iii) the estimated coefficients of a linear regression model where the realized migration flow values are regressed on an intercept and the predicted values (rational predictions would correspond to an intercept of zero and a slope of unity in this regression).
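These three measures are straightforward to compute. The sketch below shows one way to do so for arrays of realized values `y`, forecasts `yhat`, and previous-period flows `y_prev` (the names are ours, not the paper's):

```python
import numpy as np
import statsmodels.api as sm

def rmse(y, yhat):
    """Root mean square forecast error."""
    return float(np.sqrt(np.mean((np.asarray(y) - np.asarray(yhat)) ** 2)))

def mda(y, yhat, y_prev):
    """Mean directional accuracy: share of correctly predicted changes
    relative to the previous period's flow."""
    actual_dir = np.sign(np.asarray(y) - np.asarray(y_prev))
    predicted_dir = np.sign(np.asarray(yhat) - np.asarray(y_prev))
    return float(np.mean(actual_dir == predicted_dir))

def rationality_regression(y, yhat):
    """Regress realized on predicted values; unbiased ('rational')
    forecasts imply an intercept of zero and a slope of one."""
    X = sm.add_constant(np.asarray(yhat))
    return sm.OLS(np.asarray(y), X).fit().params  # [intercept, slope]
```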

Scatterplots of predicted and realized values in the out-of-sample period for all models are presented in Fig. 1, and the results of the prediction exercise can be found in Table 3. Figure 1 depicts the scatterplot of realized and predicted values together with the 45-degree line (which would imply compatibility with rational forecasts) and the corresponding regression line. Deviations between these lines are informative about biases in the forecasts. We find that the large gravity model without country fixed effects (GM-LARGE) performs best in terms of RMSE. It is closely followed by the model including origin and destination country fixed effects (GM-LARGE-FE) and the historical averages. The relatively poor prediction results of GM-LARGE-FE and GM-LARGE-BFE suggest that models including country and bilateral fixed effects may tend to overfit the existing migration flow data.

The results presented are robust with respect to the estimation method for the migration flow data. In particular, we find that the results remain similar if the migration flow estimates obtained by the pseudo-Bayesian approach in Abel and Cohen (2019) are used. In addition, we also entertained models based on Poisson and negative binomial regression to account for the count nature of migration flows and the excess of zero observations. Footnote 2 The predictive ability of these specifications was significantly worse than that of our log-linearized models. Footnote 3
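For the count-data robustness checks just mentioned, a Poisson pseudo-maximum-likelihood regression on flow levels (which accommodates zeros without taking logs) can be sketched as follows, again with hypothetical column names and continuing the earlier data frame; the negative binomial case is analogous via `sm.families.NegativeBinomial()`.

```python
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Poisson GLM on flow levels; zeros enter the likelihood directly.
poisson_gm = smf.glm(
    "flow ~ log_pop_orig + log_pop_dest + log_gdppc_orig"
    " + log_gdppc_dest + log_distance + C(period)",
    data=df,
    family=sm.families.Poisson(),
).fit()
print(poisson_gm.summary())
```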

Fig. 1: Predicted vs realized values of log-level migration flows. The solid blue line is the regression line from (linearly) regressing the realized on the predicted values; the 45°-line is depicted in red and dashed.

Measuring the Effects of Migration on GDP: An Illustrative Example

The predictive ability of gravity models can be exploited to support evidence-based policies, not only through the use of best-practice migration forecasts, but also by providing the basis for (counterfactual) projections of population and GDP under different scenarios for migration flows. In this section, we present a simple illustrative example of such a scenario-driven projection exercise, in which we offer a first (lower-bound) approximation of the potential contribution of migration to population and GDP dynamics in Germany and Portugal over the period 2021–2025.

For this purpose, we combine migration forecasts from the GM-LARGE model with population projections from the WPP dataset and GDP data from the IMF. We subtract the projected number of persons migrating to Germany and Portugal from 2021 to 2025, as implied by our model forecasts, from the population projections of the scenario with migration (that is, the population projections in the WPP data). This projection exercise is thus aimed at measuring the population that would live in these two countries if no immigration took place over the next few years. We choose this design, rather than a scenario without both emigration and immigration, in order to minimize the uncertainty around future population changes created by migration flows. Eliminating emigration from the scenario would require additional estimates of bilateral migration flows between Portugal and Germany (as origin nations) and all other countries of the world. These estimates would add to the uncertainty of the immigration figures and would make our projections less credible.

Given that our model contains explanatory variables measured at the beginning of the period in which the migration flows take place, we can use the latest available data point to create projections of bilateral migration flows for all country pairs in the period 2015–2025. Figure 2 presents the two scenarios (the benchmark from the WPP projections and the scenario without immigration) for the total population in Germany and Portugal. The benchmark projection shows that the total population of Germany and Portugal is expected to decrease in the coming years, a trend which is further amplified in the scenarios without immigration. Should immigration to Germany and Portugal have come to a halt in 2020, we would see a sharp drop in population numbers in both countries of destination, with the no-immigration scenario implying around half a million fewer inhabitants in Germany by 2025 and around 50,000 fewer in Portugal. In addition to this alternative scenario for population trends, we also analyse how migration projections can be used to provide first approximations of the effect of human mobility on economic growth in receiving countries. To quantify the productivity of migrants, we use GDP growth projections from the IMF and create two scenarios in which we assign each migrant (a) the average labour productivity of their country of origin or (b) that of the recipient economy. Summing up the monetary value added to the economy by each individual migrant (as measured by their assumed labour productivity), we are able to assess how GDP in Germany and Portugal would be affected by migration flows under the different scenario designs. Figure 2 presents this gap for the two scenarios entertained, together with benchmark GDP forecasts from the IMF. Assigning the average productivity of the country of destination to each migrant, Germany would already lose over 42 billion dollars of GDP (in 2011 PPP) in 2025, corresponding to over 1% of its total annual GDP. Assuming that each migrant’s productivity corresponds to that of workers in their country of origin would lead to slightly lower losses for Germany, while for Portugal the two assumptions about migrant productivity would lead to very similar drops in GDP growth. In relative terms, the fall in GDP implied by the projection scenarios without immigration is larger than that in population, so qualitatively the results for GDP translate to GDP per capita figures in terms of the relative ordering of the scenarios presented.
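The scenario arithmetic itself is elementary; the toy sketch below shows the accounting with purely illustrative magnitudes (they are not the paper's estimates):

```python
# No-immigration population scenario: subtract forecast inflows from the
# baseline WPP projection (all numbers hypothetical).
pop_wpp_2025 = 83.0e6            # hypothetical baseline population, 2025
forecast_inflows = 0.5e6         # summed model-forecast immigration, 2021-2025
pop_no_immigration = pop_wpp_2025 - forecast_inflows

# GDP gap under the two productivity assignments.
productivity_dest = 85_000       # hypothetical GDP per worker, destination
productivity_orig = 40_000       # hypothetical average GDP per worker, origins
gdp_loss_dest = forecast_inflows * productivity_dest
gdp_loss_orig = forecast_inflows * productivity_orig
print(pop_no_immigration, gdp_loss_dest, gdp_loss_orig)
```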

Such simple exercises based on conditional projections from the gravity model for migration illustrate how our gravity specifications can contribute to an evidence-based debate on migration policy and on the economic effects of migration. It should be noted that, by abstracting from spillovers related to market innovation activities or entrepreneurship, for instance, our exercise can be thought of as providing a lower bound estimate of the effects of migration on economic growth.

Fig. 2: Projected total population and GDP with and without immigration: Germany and Portugal.

This paper estimates global gravity models aimed at forecasting migration flows between countries and exemplifies their use as a tool to inform economic policy by combining migration projections with current IMF economic growth forecasts to calculate potential GDP losses to the German and Portuguese economies in a scenario without immigration. To assess the validity of this gravity model approach, we perform an out-of-sample prediction exercise and compare several measures of forecasting accuracy across four specifications of the gravity model and three heuristic models. This validation exercise shows that a gravity model including socioeconomic pull and push factors but no country fixed effects performs best on every measure of prediction quality. In addition to projecting expected migration flows, we also provide a simple estimate of how the population and GDP of Germany and Portugal might develop in a no-immigration scenario. For this purpose, we combine our migration forecasts with population projections and GDP forecasts.

The modelling framework put forward in our analysis can be particularly useful for the design of evidence-based migration policy instruments, especially in the context of current discussions in the European political arena. The development of statistical tools to inform policy about issues related to the regulation of asylum requests and the allocation of immigrants across EU economies would require different methods that account for the particular nature of such forced migration flows. On the other hand, our modelling tools could help create a scientific basis to frame the current discussion on the competition for skilled workers in the global market in ageing societies. To deepen the insights gained by our modelling tool, it would be desirable for further research to perform a more in-depth analysis of the productivity of migrants and the corresponding effects migration may have on economic growth via entrepreneurship or investment in human and physical capital. In this respect, data limitations are currently a binding constraint on the advancement of this research agenda.

Our simple illustration of the use of such models to quantify the economic effects of migration serves as a proof-of-concept example that would need to be further refined to account for additional factors. For longer-term forecasts, it would be useful to extend this relatively simple approach and consider a wider range of determinants that might influence the effect that immigrants have on the population of their country of destination. Age structure, fertility and mortality differentials between migrants and the rest of the population, for instance, would need to be investigated in order to create credible population projections under different scenarios for long time horizons. Focusing only on the next four to five years, we abstract from these effects in the projections provided in this contribution. Information on the age, sex and education structure of migrants, as well as their allocation to specific economic sectors, would need to be incorporated to refine the assumptions concerning the productivity of migrants and thus the GDP projections.

All the data and codes required to replicate the analysis can be found at https://github.com/jakobZellmann/Gravity-Models-for-Global-Migration-Flows-A-Predictive-Evaluation .

It can be seen in Fig.  1 that zero bilateral migration flows significantly alter the predictive ability of the models employed.

Results based on different migration flow estimates and Poisson and negative binomial models can be obtained at https://github.com/jakobZellmann/Gravity-Models-for-Global-Migration-Flows-A-Predictive-Evaluation .

References

Abel, G. J., & Cohen, J. E. (2019). Bilateral international migration flow estimates for 200 countries. Scientific Data, 6(1), 1–13. https://doi.org/10.1038/s41597-019-0089-3

Aslany, M., et al. (2021). Systematic review of determinants of migration aspirations. QuantMig Project Deliverable D2.2.

Azose, J. J., & Raftery, A. E. (2015). Bayesian probabilistic projection of international migration. Demography, 52(5), 1627–1650. https://doi.org/10.1007/s13524-015-0415-0

Baker, D., et al. (2005). Asset returns and economic growth. Brookings Papers on Economic Activity.

Beyer, R. M., et al. (2022). Gravity models do not explain, and cannot predict, international migration dynamics. Humanities and Social Sciences Communications. https://doi.org/10.1057/s41599-022-01067-x

Bloom, D. E., et al. (2015). Macroeconomic implications of population ageing and selected policy responses. The Lancet, 385(9968), 649–657.

Bricker, D., & Ibbitson, J. (2019). Empty planet: The shock of global population decline. Hachette.

Cohen, J. E. (2012). Projection of net migration using a gravity model. Proceedings of the XXVII IUSSP International Population Conference.

Cohen, J. E., et al. (2008). International migration beyond gravity: A statistical model for use in population projections. Proceedings of the National Academy of Sciences, 105(40), 15269–15274. https://doi.org/10.1073/pnas.0808185105

Disney, G., et al. (2015). Evaluation of existing migration forecasting methods and models. ESRC Centre for Population Change, University of Southampton.

International Labour Organization. (2022). Unemployment, total (percent of total labor force). https://ilostat.ilo.org/data/

International Monetary Fund. (2022). World Economic Outlook, April 2022: War sets back the global recovery. https://www.imf.org/en/Publications/WEO/Issues/2022/04/19/world-economic-outlook-april-2022

Karemera, D., et al. (2000). A gravity model analysis of international migration to North America. Applied Economics, 32(13), 1745–1755. https://doi.org/10.1080/000368400421093

Kim, K., & Cohen, J. E. (2010). Determinants of international migration flows to and from industrialized countries: A panel data approach beyond gravity. International Migration Review, 44(4), 899–932. https://doi.org/10.1111/j.1747-7379.2010.00830.x

Lee, R., et al. (2014). Is low fertility really a problem? Population aging, dependency, and consumption. Science, 346(6206), 229–234.

Lubitz, J., et al. (2003). Health, life expectancy, and health care spending among the elderly. New England Journal of Medicine, 349(11), 1048–1055.

Mason, A., et al. (2022). Six ways population change will affect the global economy. Population and Development Review, 48(1), 51–73.

Mayda, A. M. (2010). International migration: A panel data analysis of the determinants of bilateral flows. Journal of Population Economics, 23(4), 1249–1274. https://doi.org/10.1007/s00148-009-0251-x

Ortega, F., & Peri, G. (2013). The effect of income and immigration policies on international migration. Migration Studies, 1(1), 47–74. https://doi.org/10.1093/migration/mns004

Prince, M. J., et al. (2015). The burden of disease in older people and implications for health policy and practice. The Lancet, 385(9967), 549–562.

Ramos, R. (2016). Gravity models: A tool for migration analysis. IZA World of Labor. https://doi.org/10.15185/izawol.239

Ravenstein, E. G. (1885). The laws of migration. Royal Statistical Society.

Sardoschau, S. (2020). The future of migration to Germany. Assessing Methods in Migration Forecasting: DeZIM Project Report DPr#1.

UNDESA. (2019). World Population Prospects (2019 revision). https://population.un.org/wpp/

UNDESA. (2020). International Migrant Stock 2020. https://www.un.org/development/desa/pd/content/international-migrant-stock

Vollset, S. E., et al. (2020). Fertility, mortality, migration, and population scenarios for 195 countries and territories from 2017 to 2100: A forecasting analysis for the Global Burden of Disease Study. The Lancet, 396(10258), 1285–1306.

Wittgenstein Centre. (2018). Educational attainment distribution, post-secondary education. http://dataexplorer.wittgensteincentre.org/wcde-v2/

World Data Lab. (2022). World Data Pro, spending and demographic data. https://worlddata.pro/

Zipf, G. K. (1946). The P1P2/D hypothesis: On the intercity movement of persons. American Sociological Review, 11(6), 677–686.

Acknowledgements

The authors would like to thank two anonymous referees, as well as participants in workshops organized by the International Organization for Migration (IOM), for their helpful comments on earlier drafts of this paper. This paper stems from a joint project with the IOM and the government offices of Moldova, Portugal, and Germany. We would like to thank the different IOM counterparts, who contributed expert opinion to the specification of the gravity model. Jesus Crespo Cuaresma and Jakob Zellmann gratefully acknowledge support from the eXplore! initiative (Michael Tojner and B&C Privatstiftung) under the grant “Migration and Economic Growth”.

Open access funding provided by Vienna University of Economics and Business (WU).

Author information

Authors and Affiliations

World Data Lab, Vienna, Austria

Juan Caballero Reina, Jesus Crespo Cuaresma, Katharina Fenz, Jakob Zellmann & Teodor Yankov

University of Cambridge, Cambridge, UK

Juan Caballero Reina

Department of Economics, Vienna University of Economics and Business, Welthandelsplatz 1, 1020, Vienna, Austria

Jesus Crespo Cuaresma, Katharina Fenz & Jakob Zellmann

International Institute for Applied Systems Analysis, Laxenburg, Austria

Jesus Crespo Cuaresma

Wittgenstein Centre for Demography and Global Human Capital, Vienna, Austria

Austrian Institute of Economic Research, Vienna, Austria

University College London (UCL), London, UK

Teodor Yankov

The International Organization for Migration (IOM), Vienna, Austria

Corresponding author

Correspondence to Jesus Crespo Cuaresma .


About this article

Caballero Reina, J., Crespo Cuaresma, J., Fenz, K. et al. Gravity Models for Global Migration Flows: A Predictive Evaluation. Popul Res Policy Rev 43 , 29 (2024). https://doi.org/10.1007/s11113-024-09867-6

Received: 03 March 2023

Accepted: 01 February 2024

Published: 02 April 2024

DOI: https://doi.org/10.1007/s11113-024-09867-6


Keywords:
  • Gravity model
  • Economic growth
  • Forecasting



What Is Qualitative Research? | Methods & Examples

Published on June 19, 2020 by Pritha Bhandari . Revised on June 22, 2023.

Qualitative research involves collecting and analyzing non-numerical data (e.g., text, video, or audio) to understand concepts, opinions, or experiences. It can be used to gather in-depth insights into a problem or generate new ideas for research.

Qualitative research is the opposite of quantitative research , which involves collecting and analyzing numerical data for statistical analysis.

Qualitative research is commonly used in the humanities and social sciences, in subjects such as anthropology, sociology, education, health sciences, history, etc.

Examples of qualitative research questions:

  • How does social media shape body image in teenagers?
  • How do children and adults interpret healthy eating in the UK?
  • What factors influence employee retention in a large organization?
  • How is anxiety experienced around the world?
  • How can teachers integrate social issues into science curriculums?

Table of contents

  • Approaches to qualitative research
  • Qualitative research methods
  • Qualitative data analysis
  • Advantages of qualitative research
  • Disadvantages of qualitative research
  • Other interesting articles
  • Frequently asked questions about qualitative research

Qualitative research is used to understand how people experience the world. While there are many approaches to qualitative research, they tend to be flexible and focus on retaining rich meaning when interpreting data.

Common approaches include grounded theory, ethnography , action research , phenomenological research, and narrative research. They share some similarities, but emphasize different aims and perspectives.

Note that qualitative research is at risk for certain research biases including the Hawthorne effect , observer bias , recall bias , and social desirability bias . While not always totally avoidable, awareness of potential biases as you collect and analyze your data can prevent them from impacting your work too much.

Each of these research approaches involves using one or more data collection methods . These are some of the most common qualitative methods:

  • Observations: recording what you have seen, heard, or encountered in detailed field notes.
  • Interviews:  personally asking people questions in one-on-one conversations.
  • Focus groups: asking questions and generating discussion among a group of people.
  • Surveys : distributing questionnaires with open-ended questions.
  • Secondary research: collecting existing data in the form of texts, images, audio or video recordings, etc.
For example, to research the culture of a company, you might combine several of these methods:

  • You take field notes with observations and reflect on your own experiences of the company culture.
  • You distribute open-ended surveys to employees across all the company’s offices by email to find out if the culture varies across locations.
  • You conduct in-depth interviews with employees in your office to learn about their experiences and perspectives in greater detail.

Qualitative researchers often consider themselves “instruments” in research because all observations, interpretations and analyses are filtered through their own personal lens.

For this reason, when writing up your methodology for qualitative research, it’s important to reflect on your approach and to thoroughly explain the choices you made in collecting and analyzing the data.

Qualitative data can take the form of texts, photos, videos and audio. For example, you might be working with interview transcripts, survey responses, fieldnotes, or recordings from natural settings.

Most types of qualitative data analysis share the same five steps (a toy illustration of tallying codes follows the list):

  • Prepare and organize your data. This may mean transcribing interviews or typing up fieldnotes.
  • Review and explore your data. Examine the data for patterns or repeated ideas that emerge.
  • Develop a data coding system. Based on your initial ideas, establish a set of codes that you can apply to categorize your data.
  • Assign codes to the data. For example, in qualitative survey analysis, this may mean going through each participant’s responses and tagging them with codes in a spreadsheet. As you go through your data, you can create new codes to add to your system if necessary.
  • Identify recurring themes. Link codes together into cohesive, overarching themes.
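If your coded data end up in a spreadsheet, steps 4 and 5 can also be tallied programmatically. Here is a toy sketch (with entirely invented data, not from any real study) of counting codes and grouping them into themes with pandas:

```python
import pandas as pd

# Toy coded responses: one row per code applied to a participant's response.
coded = pd.DataFrame({
    "participant": [1, 1, 2, 3, 3, 4],
    "code": ["flexibility", "workload", "workload",
             "autonomy", "flexibility", "workload"],
})

# Link codes into overarching themes (step 5).
themes = {
    "flexibility": "work-life balance",
    "autonomy": "work-life balance",
    "workload": "job demands",
}
coded["theme"] = coded["code"].map(themes)

# How often each code appears within each theme.
print(coded.groupby("theme")["code"].value_counts())
```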

There are several specific approaches to analyzing qualitative data. Although these methods share similar processes, they emphasize different concepts.

Qualitative research often tries to preserve the voice and perspective of participants and can be adjusted as new research questions arise. Qualitative research is good for:

  • Flexibility

The data collection and analysis process can be adapted as new ideas or patterns emerge. They are not rigidly decided beforehand.

  • Natural settings

Data collection occurs in real-world contexts or in naturalistic ways.

  • Meaningful insights

Detailed descriptions of people’s experiences, feelings and perceptions can be used in designing, testing or improving systems or products.

  • Generation of new ideas

Open-ended responses mean that researchers can uncover novel problems or opportunities that they wouldn’t have thought of otherwise.

Researchers must consider practical and theoretical limitations in analyzing and interpreting their data. Qualitative research suffers from:

  • Unreliability

The real-world setting often makes qualitative research unreliable because of uncontrolled factors that affect the data.

  • Subjectivity

Due to the researcher’s primary role in analyzing and interpreting data, qualitative research cannot be replicated . The researcher decides what is important and what is irrelevant in data analysis, so interpretations of the same data can vary greatly.

  • Limited generalizability

Small samples are often used to gather detailed data about specific contexts. Despite rigorous analysis procedures, it is difficult to draw generalizable conclusions because the data may be biased and unrepresentative of the wider population .

  • Labor-intensive

Although software can be used to manage and record large amounts of text, data analysis often has to be checked or performed manually.

If you want to know more about statistics , methodology , or research bias , make sure to check out some of our other articles with explanations and examples.

  • Chi square goodness of fit test
  • Degrees of freedom
  • Null hypothesis
  • Discourse analysis
  • Control groups
  • Mixed methods research
  • Non-probability sampling
  • Quantitative research
  • Inclusion and exclusion criteria

Research bias

  • Rosenthal effect
  • Implicit bias
  • Cognitive bias
  • Selection bias
  • Negativity bias
  • Status quo bias

Quantitative research deals with numbers and statistics, while qualitative research deals with words and meanings.

Quantitative methods allow you to systematically measure variables and test hypotheses . Qualitative methods allow you to explore concepts and experiences in more detail.

There are five common approaches to qualitative research :

  • Grounded theory involves collecting data in order to develop new theories.
  • Ethnography involves immersing yourself in a group or organization to understand its culture.
  • Narrative research involves interpreting stories to understand how people make sense of their experiences and perceptions.
  • Phenomenological research involves investigating phenomena through people’s lived experiences.
  • Action research links theory and practice in several cycles to drive innovative changes.

Data collection is the systematic process by which observations or measurements are gathered in research. It is used in many different contexts by academics, governments, businesses, and other organizations.

There are various approaches to qualitative data analysis , but they all share five steps in common:

  • Prepare and organize your data.
  • Review and explore your data.
  • Develop a data coding system.
  • Assign codes to the data.
  • Identify recurring themes.

The specifics of each step depend on the focus of the analysis. Some common approaches include textual analysis , thematic analysis , and discourse analysis .


Creating a Corporate Social Responsibility Program with Real Impact

  • Emilio Marti,
  • David Risi,
  • Eva Schlindwein,
  • Andromachi Athanasopoulou

Lessons from multinational companies that adapted their CSR practices based on local feedback and knowledge.

Exploring the critical role of experimentation in Corporate Social Responsibility (CSR), research on four multinational companies reveals a stark difference in CSR effectiveness. Successful companies integrate an experimental approach, constantly adapting their CSR practices based on local feedback and knowledge. This strategy fosters genuine community engagement and responsive initiatives, as seen in a mining company’s impactful HIV/AIDS program. Conversely, companies that rely on standardized, inflexible CSR methods often fail to achieve their goals, demonstrated by a failed partnership due to local corruption in another mining company. The study recommends encouraging broad employee participation in CSR and fostering a culture that values CSR’s long-term business benefits. It also suggests that sustainable investors and ESG rating agencies should focus on assessing companies’ experimental approaches to CSR, going beyond current practices to examine the involvement of diverse employees in both developing and adapting CSR initiatives. Overall, embracing a dynamic, data-driven approach to CSR is essential for meaningful social and environmental impact.

By now, almost all large companies are engaged in corporate social responsibility (CSR): they have CSR policies, employ CSR staff, engage in activities that aim to have a positive impact on the environment and society, and write CSR reports. However, the evolution of CSR has brought forth new challenges. In stark contrast to two decades ago, when the primary concern was the sheer neglect of CSR, the issue now lies in the ineffective execution of these practices. Why do some companies implement CSR in ways that create a positive impact on the environment and society, while others fail to do so? Our research reveals that experimentation is critical for impactful CSR, which has implications both for companies that implement CSR and for companies that externally monitor these CSR activities, such as sustainable investors and ESG rating agencies.

  • EM Emilio Marti is an associate professor at the Rotterdam School of Management, Erasmus University. His research focuses on corporate sustainability with a specific focus on sustainable investing.
  • DR David Risi is a professor at the Bern University of Applied Sciences and a habilitated lecturer at the University of St. Gallen. His research focuses on how companies organize CSR and sustainability.
  • ES Eva Schlindwein is a professor at the Bern University of Applied Sciences and a postdoctoral fellow at the University of Oxford. Her research focuses on how organizations navigate tensions between business and society.
  • AA Andromachi Athanasopoulou is an associate professor at Queen Mary University of London and an associate fellow at the University of Oxford. Her research focuses on how individuals manage their leadership careers and make ethically charged decisions.

Open access | Published: 02 April 2024

Sialyl-Tn serves as a potential therapeutic target for ovarian cancer

  • Linah Al-Alem 1 , 2 ,
  • Jillian M. Prendergast 3 ,
  • Justin Clark 3 ,
  • Bianca Zarrella 1 ,
  • Dominique T. Zarrella 1 ,
  • Sarah J. Hill 5 , 6 , 7 ,
  • Whitfield B. Growdon 1 , 2 , 4 ,
  • Venkatesh Pooladanda 1 , 2 ,
  • David R. Spriggs 8 , 9 ,
  • Daniel Cramer 10 ,
  • Kevin M. Elias 11 ,
  • Rawan I. Nazer 3 ,
  • Steven J. Skates 12 ,
  • Jeff Behrens 3 ,
  • Daniel T. Dransfield 3 &
  • Bo R. Rueda 1 , 2 , 4  

Journal of Ovarian Research volume  17 , Article number:  71 ( 2024 ) Cite this article


Ovarian cancer remains the deadliest of the gynecologic cancers in the United States, and advances in treatment strategies that produce marked increases in overall survival have been limited. It is therefore essential to continue developing and validating new treatment strategies, along with markers to identify patients who would benefit from them. In this report, we sought to further validate applications for a novel humanized anti-Sialyl-Tn antibody-drug conjugate (anti-STn-ADC) in ovarian cancer.

We aimed to further test a humanized anti-STn-ADC in sialyl-Tn (STn)-positive and -negative ovarian cancer cell line, patient-derived organoid (PDO), and patient-derived xenograft (PDX) models. Furthermore, we sought to determine whether serum STn levels would reflect STn positivity in the tumor samples, enabling us to identify patients whom an anti-STn-ADC strategy would best serve. We developed a custom ELISA with high specificity and sensitivity, which was used to assess whether circulating STn levels correlate with stage, progression-free survival, and overall survival, and to evaluate its value in augmenting CA-125 as a diagnostic. Lastly, we assessed whether the serum levels reflected what was observed via immunohistochemical analysis in a subset of tumor samples.

Our in vitro experiments further define the specificity of the anti-STn-ADC. The ovarian cancer PDO and PDX models provide additional support for an anti-STn-ADC-based strategy for targeting ovarian cancer. The custom serum ELISA was informative for the potential triaging of patients with elevated levels of STn. However, it was not sensitive enough to add value to existing CA-125 levels as a diagnostic. While the ELISA identified non-serous ovarian tumors with low CA-125 levels, the sample numbers were too small to provide any confidence that the STn ELISA would meaningfully add to CA-125 for diagnosis.

Conclusions

Our preclinical data support the concept that an anti-STn-ADC may be a viable option for treating patients with elevated STn levels. Moreover, our STn-based ELISA could complement IHC in identifying patients for whom an anti-STn-based strategy might be more effective.

Introduction

Ovarian carcinoma (OvCa) is the leading cause of death among gynecological malignancies in the United States [ 1 ]. It is estimated that in 2022 approximately 19,880 women will be newly diagnosed, and 12,810 women will succumb to the disease [ 2 ]. Despite high initial response rates to current treatment strategies, the 5-year survival rate for women with advanced-stage ovarian cancer remains less than 50%. These poor outcomes are due, in part, to the lack of biomarkers that allow for early diagnosis, coupled with relatively few targetable mutations or genomic alterations present in any quantity across the different ovarian adenocarcinoma histologies. Tumor-associated carbohydrate antigens (TACAs) are aberrant glycosylations of proteins or lipids that are hallmark biomarkers of cancer [ 3 ]. Our recent work, as well as work by others, suggested that one such TACA, Sialyl Thomsen-nouveau (Sialyl-Tn or STn; also known as CD175s (Neu5Acα2,6GalNAc O-Ser/Thr)), may be a target, prognostic biomarker or diagnostic marker of interest in OvCa [ 4 , 5 , 6 ]. STn is of low abundance in healthy human tissues but is highly expressed in many cancers, including breast, OvCa, bladder, cervical, colon, pancreatic, lung, and others [ 6 , 7 , 8 , 9 , 10 , 11 ]. The presence of STn is generally associated with poor clinical outcomes, therapeutic resistance, and immune suppression [ 7 , 9 , 12 , 13 ]. Recurrent OvCa, like many other solid tumors, is thought to result, in part, from residual cancer stem cells (CSCs) that survive cytotoxic chemotherapy and serve as seed cells for tumor resurgence [ 14 , 15 ]. Cancer stem cells have membrane proteins with altered glycosylation, including ovarian CSCs, which often express STn [ 5 , 14 ]. The enzyme responsible for creating STn, ST6 N-Acetylgalactosaminide alpha-2,6-Sialyltransferase 1 (ST6GALNAC1), has been reported to facilitate stemness in OvCa cells via the Akt pathway [ 16 ]. Since STn can be added as an O-linked post-translational modification to a variety of protein backbones, antibodies that specifically target the STn glycan independent of its carrier protein could afford the potential to recognize a more comprehensive array of cancer-specific sialylated proteins [ 4 , 6 , 17 , 18 , 19 ] on CSC and non-CSC populations. Therefore, it is possible that using a glycan-targeted therapy alone or in combination with standard chemotherapy may be a rational therapeutic strategy.

To take advantage of this concept, we initially generated a panel of murine monoclonal anti-STn therapeutic antibodies, including clones S3F and 2G12-2B2, as described in Prendergast et al. 2017 [ 6 ]. These murine anti-STn antibodies were conjugated to monomethyl auristatin E (MMAE) to generate antibody-drug conjugates (ADCs). These ADCs demonstrated in vitro efficacy in STn-expressing cell lines, including CSCs, and tumor growth inhibition in an OvCa cell xenograft tumor model. Exposure to either S3F-CL-MMAE or 2G12-2B2-CL-MMAE reduced OVCAR3-derived xenograft volume in vivo, depleting STn + tumor cells [ 5 ]. Thus, targeting STn in ovarian tumors may be an effective clinical strategy; however, additional in vitro and in vivo studies with the humanized anti-STn-ADC antibodies were needed. Additional knowledge of STn prevalence in both tissue and blood (and their correlation) could also help predict, in a preclinical proof-of-concept assay, which tumors would be more likely to respond to such anti-STn-ADCs. Lastly, given that our anti-STn antibody is more specific than others tested [ 6 ], it was of interest whether it could augment CA-125 in the diagnosis of OvCa.

To this end, the studies herein demonstrated on-target effects of our humanized anti-STn-ADC, which reduced viability in a subset of treatment-resistant high-grade serous ovarian cancer (HGSOC) PDOs and in PDXs originating from patients diagnosed with HGSOC with a range of STn expression. Our findings also support the concept that circulating STn might serve as a potential biomarker, either independently or in combination with immunohistochemistry (IHC), for triaging patients who might respond to an anti-STn-ADC-based strategy. Lastly, our STn ELISA by itself did not outperform CA-125 in the detection of HGSOC. However, it did improve the detection of mucinous and clear cell cancers, albeit with limited sample numbers.

Materials and methods

The OvCa cell lines OVCAR3, OV90, and SKOV3 were obtained from ATCC. OVCAR4 was generously provided by Dr. Panos Konstantinopoulos (Dana-Farber Cancer Institute, Harvard Medical School, Boston, MA). OVCAR3 cells were cultured in RPMI 1640 medium (Cat #11875093, Gibco-BRL, Gaithersburg, MD) supplemented with 10% fetal bovine serum (FBS, Cat #26140079, Thermo Fisher Scientific, Waltham, MA), 1X penicillin-streptomycin (Cat #15070063, Thermo Fisher Scientific), and 0.01 mg/ml bovine insulin (Cat #I0516, Sigma-Aldrich, Natick, MA). OVCAR4 cells were cultured in RPMI 1640 supplemented with 10% FBS and 1X penicillin-streptomycin (Life Technologies, Carlsbad, CA). OV90 cells were cultured in a 1:1 mixture of MCDB 105 medium (Cat #117500, Cell Applications, San Diego, CA) and Medium 199 (Cat #11150059, Gibco-BRL) with 10% FBS. These cell lines have been characterized previously [ 20 , 21 , 22 ]. SKOV3 control cells (SKOV3 CTL) and SKOV3 cells overexpressing the ST6GALNAC1 enzyme, which results in elevated STn (SKOV3-ST6GALNAC1), described in Prendergast et al. [ 6 ], were grown in McCoy’s medium (Cat #16600082, Gibco-BRL) supplemented with 10% FBS and 1X penicillin-streptomycin. All cell lines were maintained at 37 °C in 5% CO2. Established cell lines were subjected to human cell identity verification (STR profiling) at Dana-Farber Cancer Institute ( http://moleculardiagnosticscore.dana-farber.org ). All established cell lines were regularly tested for mycoplasma contamination (Lonza MycoAlert® Mycoplasma Detection Kit, Cat #LT07-418, Walkersville, MD).

To better understand the unknown levels of STn in serum, it was necessary to confidently determine the levels of STn on a known control protein; to this end, we utilized bovine submaxillary mucin (BSM). BSM is a glycoprotein that is rich in STn and can be used to extrapolate unknown serum STn expression. The lot of BSM used, SLBH5656V (Sigma-Aldrich, Cat #M3895), was determined to have 14% sialic acids bound (as provided by the vendor), corresponding to 0.14 µg sialic acid per µg of BSM. Based on the known sialic modifications of BSM, we assumed this to be 0.14 µg STn per 1 µg of BSM.

Antibody generation

The generation of the murine and humanized anti-STn and anti-STn-ADC antibodies used in this assay have been described in the literature by Eavarone et al. [ 4 ] and Prendergast et al. [ 6 ].

MTT cytotoxicity assay

OVCAR3, OVCAR4, OV90, SKOV3 CTL, or SKOV3-ST6GALNAC1 cells were seeded in 96-well plates, incubated overnight in the appropriate complete culture medium, then treated with increasing doses of the human anti-STn-ADC (0, 2.5, 5, 10, and 50 or 100 nM) and incubated for 72 h or 6–7 days. Cell viability was determined by MTT assay (Cat #M6494, Thermo Fisher Scientific), and the percentage was calculated relative to control (vehicle-treated) samples using the formula (OD of sample / mean OD of controls) × 100.
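As a minimal sketch of the viability calculation just described (with made-up optical-density readings, not experimental data):

```python
import numpy as np

od_treated = np.array([0.41, 0.39, 0.44])   # OD of drug-treated wells
od_vehicle = np.array([0.80, 0.78, 0.82])   # OD of vehicle-control wells

# Percent viability relative to the mean of the vehicle controls.
viability_pct = od_treated / od_vehicle.mean() * 100
print(viability_pct.round(1))
```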

Flow cytometry

Flow cytometry was used to assess STn levels in cell lines and primary tumor cells. Following trypsinization or tissue processing and incubation with FcR blocking reagent (Miltenyi Biotec, Bergisch Gladbach, Germany, Mouse, Cat # 13,009,257), cells were stained with a highly specific mouse anti-STn antibody (Siamab Therapeutics, Inc., Newton, MA) either directly conjugated to Alexa Fluor 647 using the Zenon antibody labeling kit (Thermo Fisher Scientific Cat # Z25108) or labeled with Alexa Fluor 488 (Invitrogen, Carlsbad, CA, Cat # A11017). Fixable Live/Dead Violet (Thermo Fisher Scientific Cat # L34955) was used to exclude dead cells from the staining analysis. After washing, cells were fixed in 4% paraformaldehyde for 20 min, washed and reconstituted in PBS, and then analyzed on a Guava flow cytometer (Millipore, Burlington, MA) or an LSRII. Data were analyzed using FlowJo software (version 10.0.8). For analysis of cells derived from xenograft tumors, PDX tumor cells were stained with anti-H-2Kd (BD Biosciences, Franklin Lakes, NJ) to exclude mouse cells at the time of analysis, so that STn expression was determined in human cells only.

In vivo treatment studies

All mouse studies were carried out in compliance with our Institutional Animal Care and Use Committee guidelines at Massachusetts General Hospital. For establishment of PDXs, tumor cells originally derived from patients (who provided informed consent) diagnosed with HGSOC were processed to a homogeneous single-cell suspension, re-suspended in PBS:Matrigel® (Cat # 354,234, Sigma-Aldrich) (1:1), and subcutaneously (s.c.) injected into ~8-week-old female NOD/SCID mice (Jackson Laboratory, Bar Harbor, ME). Once initial tumor xenografts had formed, they were harvested for histological verification, and tumor was cryopreserved for future rederivation. For the present study, cryopreserved tumor samples were thawed, implanted in mice, and allowed to form tumors. Once sufficient tumor burden had formed, the tumors were harvested and mechanically and enzymatically processed to remove mouse cells, and an equal number of tumor cells were resuspended and injected into immunocompromised mice as described above and previously [23, 24, 25].

All animals were monitored regularly for tumor formation, and tumor volume was calculated using the formula (length × width × height)/2 as described [26]. When tumor volumes averaged between 200 and 250 mm³, the mice were randomized into the different arms so as to have a similar average tumor volume in each arm (a minimum of 4 mice/arm). Mice were treated with vehicle control, isotype-ADC, human anti-STn-ADC, carboplatin/paclitaxel (Cat # C2538/Cat # T1912, Sigma-Aldrich), or a combination thereof, depending on the experimental design. Human anti-STn-ADC was administered at 5 mg/kg in sterile saline weekly, as determined by our previous pharmacokinetic studies [4]. The animals treated with isotype-ADC were likewise dosed at 5 mg/kg in sterile saline weekly. Carboplatin and paclitaxel were given weekly at 25 mg/kg and 12 mg/kg, respectively; these doses were based on our prior studies [23, 25]. Tumors were measured every 3 to 4 days with calipers, and mice were weighed twice a week. Mice were sacrificed if they lost more than 15% of their body weight or if tumors grew to exceed the preset limits. At the study's completion, tumor samples from some xenografts were formaldehyde-fixed and paraffin-embedded for STn staining.
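
For illustration, the tumor-volume formula and the balanced randomization step can be sketched as below. The `randomize_by_volume` helper and the caliper measurements are hypothetical; the authors do not describe their exact randomization algorithm, so the round-robin scheme here is only one plausible way to equalize mean volumes across arms.

```python
# Sketch of the tumor-volume formula and a balanced randomization step
# (illustrative; mouse IDs and caliper dimensions are hypothetical).

def tumor_volume_mm3(length: float, width: float, height: float) -> float:
    """(length x width x height) / 2, as described in the text [26]."""
    return length * width * height / 2.0

def randomize_by_volume(volumes: dict[str, float], arms: list[str]) -> dict[str, list[str]]:
    """Sort mice by volume, then deal them round-robin so arm means stay similar."""
    order = sorted(volumes, key=volumes.get, reverse=True)
    assignment = {arm: [] for arm in arms}
    for i, mouse in enumerate(order):
        assignment[arms[i % len(arms)]].append(mouse)
    return assignment

dims = [(9, 8, 6), (10, 8, 5), (8, 8, 7), (9, 9, 6),
        (10, 7, 6), (8, 9, 7), (9, 7, 7), (10, 8, 6)]   # mm, hypothetical
mice = {f"m{i}": tumor_volume_mm3(*d) for i, d in enumerate(dims)}
print(randomize_by_volume(mice, ["vehicle", "isotype-ADC", "anti-STn-ADC", "carbo/pac"]))
```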

Organoid culture and treatment

Organoids were cultured as described previously [27]. All organoids were tested for mycoplasma (MycoAlert® Kit), and all were negative. For sensitivity analysis, organoids were digested with TrypLE (Life Technologies Cat #12,604), plated in 20% Matrigel, treated with either human anti-STn-ADC or isotype-ADC at 0, 0.02, 0.05, 0.1, 0.15, 0.2, or 0.25 nM, and read with CellTiter-Glo (Cat # G7572, Promega, Madison, WI) six days later. On day 1, four wells were read with CellTiter-Glo for growth rate correction. Growth-rate-corrected dose curves and the area over the curve were calculated as described [28].
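
The growth-rate correction of reference [28] can be sketched as follows. The luminescence values are hypothetical placeholders, and this illustrates the published GR metric rather than the authors' analysis code; the trapezoid integration is one simple way to compute an area over the dose curve.

```python
# Sketch of the growth-rate (GR) correction from Hafner et al. [28],
# using the day-1 reads for normalization as described above.
import numpy as np

def gr_value(x_treated: float, x_control: float, x_day1: float) -> float:
    """GR(c) = 2 ** (log2(x_c / x0) / log2(x_ctrl / x0)) - 1."""
    return 2 ** (np.log2(x_treated / x_day1) / np.log2(x_control / x_day1)) - 1

def area_over_curve(doses: np.ndarray, gr: np.ndarray) -> float:
    """Area between GR = 1 (no effect) and the GR dose curve (trapezoid rule)."""
    return float(np.trapz(1 - gr, doses))

doses = np.array([0.02, 0.05, 0.1, 0.15, 0.2, 0.25])      # nM, as in the text
x0, x_ctrl = 1000.0, 8000.0                                # day-1 and day-6 vehicle reads (hypothetical)
x_trt = np.array([7000, 5200, 3100, 2200, 1500, 1200])     # day-6 treated reads (hypothetical)
gr = np.array([gr_value(x, x_ctrl, x0) for x in x_trt])
print(round(area_over_curve(doses, gr), 3))
```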

Patient selection and data

Blood samples were collected from patients with either OvCa or benign masses and processed for serum. Patient consent forms and sample collections were approved by and in compliance with Institutional Review Board guidelines (Protocol #s 07–049, 2016P002742, and 2000P001678). Research coordinators double-coded the samples, and the samples were run blindly. At the end of the study, the code was revealed and the results analyzed as described in the statistical analysis section. Patient characteristics are shown in Supplemental Table 1. Four hundred twenty serum samples were obtained from the VCRB MGH Gyn sample repository. All samples used for diagnostic assessment were collected within five days prior to surgical debulking. Clinical CA-125 values were obtained from the patient's chart; if there were multiple CA-125 values, the value closest to the time of preoperative serum collection was used for comparison. For a subgroup of patients, a representative tumor piece was fixed in paraformaldehyde and used for the immunohistochemistry studies described below. Clinical correlates such as histologic subtype, stage, grade, first-line treatment, number of treatment cycles, and response to treatment were collected (see Supplemental Table 1).

Independent of the diagnostic potential, we initiated a preliminary experiment to assess what impact surgical debulking had on circulating STn levels. For this we obtained a set of matched blood samples from patients diagnosed with HGSOC prior to surgery and again post-surgery and subjected them to our ELISA as described.

Immunohistochemistry (IHC)

A subset of paraffin-embedded ovarian tumor samples from patients with matched blood samples was sectioned at 5 μm thickness to compare tissue STn levels with the patients' corresponding serum STn levels. The matched subset included 41 serous, 10 mucinous, 12 clear cell, and 4 endometrioid histologies. Antigen retrieval was performed using 10 mM sodium citrate solution at 120 °C in a pressure cooker for 15 min. Tissues were incubated with 3% H2O2 (Cat # S25359, Thermo Fisher) for 20 min, then blocked with a 6% serum cocktail (normal horse Cat # S2000, bovine Cat # SP5050, and goat Cat # S1000 serum from Vector Labs, Burlingame, CA) for 20 min. Tissues were then incubated with murine anti-STn primary antibody, MOPC isotype control mouse antibody at 10 µg/mL (MOPC173, Biolegend, San Diego, CA), or no primary antibody, each diluted in the blocking cocktail. After washing with PBST, slides were incubated for 45 min with anti-mouse secondary antibody (Santa Cruz, Dallas, TX). Staining was visualized with 3,3'-diaminobenzidine (DAB, Cat # SK-4100; Vector stains, Burlingame, CA). Slides were counterstained with hematoxylin (Cat # CS402-1D, Thermo Fisher Scientific) and Scott's water, then dehydrated and mounted with coverslips. Membrane STn positivity was scored by a board-certified pathologist (SJH) blinded to the sample code. A score of 0 indicated minimal membrane STn staining in less than 5% of tumor cells, or no STn staining. A score of 1 indicated minimal to moderate STn membrane staining in 5–24% of tumor cells. A score of 2 indicated moderate to strong membrane staining in 25–50% of tumor cells. A score of 3 indicated moderate to strong membrane staining in greater than 50% of tumor cells.
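
The 0–3 scoring scheme maps onto percentage bins, which the toy function below encodes for illustration only; actual scoring was performed by a blinded pathologist, not by software.

```python
# Illustrative encoding of the 0-3 membrane-STn IHC scoring scheme above
# (thresholds taken directly from the text; inputs are hypothetical).

def stn_ihc_score(pct_positive_cells: float) -> int:
    """Score membrane STn staining by % positive tumor cells."""
    if pct_positive_cells < 5:
        return 0      # minimal or no staining, <5% of cells
    elif pct_positive_cells < 25:
        return 1      # minimal-to-moderate staining, 5-24%
    elif pct_positive_cells <= 50:
        return 2      # moderate-to-strong staining, 25-50%
    else:
        return 3      # moderate-to-strong staining, >50%

print(stn_ihc_score(12), stn_ihc_score(60))  # 1 3
```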

Development of an STn biomarker ELISA screening assay

To develop a sandwich ELISA to identify STn expression in patient serum, we utilized commercial and Siamab-generated murine anti-STn antibodies [4, 6] in a research-use-only (RUO) grade assay. A subset of these anti-STn antibodies was chosen for sandwich ELISA testing, and pairing was guided by their light chain CDR sequences. Proof-of-concept studies matched murine and humanized antibody pairs. The pairings included murine S3F, 2G12-2B2, CC49, 5G2-1B3, and B72.3, and humanized Hu3F1 L1H1, Hu2G12-2B2 L0H3, and Hu5G2-1B3 L1H2, tested along with isotype controls in a matrix format. Isotype controls included murine MOPC173 (BioLegend, Cat # 400,264). Plates (Corning, Cat # 9018) were coated with 1, 3, or 5 µg/mL murine antibodies in coating buffer (50 mM sodium carbonate/bicarbonate, pH 9.5) overnight at 4 °C. After washing 3x with phosphate-buffered saline with 0.05% Tween-20 (PBS-T), plates were blocked with blocking buffer (1% ovalbumin (OVA) in PBS) for 1 h at room temperature. Buffer was removed, and 100 µl/well of STn glycoprotein sample (0.000125 mg/mL BSM, Sigma-Aldrich) diluted in blocking buffer was added and incubated for 90 min at 37 °C. Plates were then washed 2x with PBS, and a subset of wells had their sialic acids oxidized by treatment with 2 mM periodate for 20 min at 4 °C. Plates were then washed 3x with PBS-T, and secondary humanized anti-STn antibodies at 3 µg/mL, diluted in blocking buffer, were added to the wells and incubated for 1 h at room temperature. Plates were washed 3x with PBS-T, followed by incubation with 0.08 µg/mL peroxidase-conjugated goat anti-human antibody (Cat #109-035-098, Jackson ImmunoResearch, West Grove, PA) for 1 h. Next, wells were washed 3x with PBS-T and incubated with 100 µL of enzyme substrate (0.5 mg/mL o-phenylenediamine; 0.03% H2O2 in citric/phosphate buffer, pH 5.5). The enzyme reaction was terminated by the addition of an equal volume of 1.6 M sulfuric acid. Optical density (OD) readings of periodate-treated and non-periodate-treated wells were taken at 490 nm. Binding of the antibody pairs was compared by subtracting the OD of periodate-treated wells from that of non-periodate-treated wells to obtain the periodate-sensitive STn binding.
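
The periodate-sensitive signal is a simple well-wise subtraction; the sketch below illustrates it on a hypothetical capture-by-detection OD matrix (all values invented).

```python
# Sketch of how the periodate-sensitive STn signal is derived for each
# capture/detection antibody pair (OD values are hypothetical).
import numpy as np

def periodate_sensitive_signal(od_untreated: np.ndarray, od_periodate: np.ndarray) -> np.ndarray:
    """STn-specific binding = OD(non-periodate wells) - OD(periodate wells)."""
    return od_untreated - od_periodate

# Hypothetical pairing matrix: rows = capture mAbs, cols = detection mAbs
od_no_periodate = np.array([[1.8, 0.9], [1.2, 1.5]])
od_periodate    = np.array([[0.3, 0.4], [0.2, 0.6]])
print(periodate_sensitive_signal(od_no_periodate, od_periodate))
```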

Statistical analysis

All experiments were carried out with at least 3 biological replicates, and the data were analyzed with GraphPad Prism (GraphPad Software, La Jolla, CA). Patient serum biomarkers were analyzed using the R statistical computing environment (4.0.2) [29] and Stan, a platform for statistical modeling [30], via the R package rstan [31]. Bars represent mean ± SEM. One-way ANOVA was conducted to assess for significant differences in tumors treated with human anti-STn-ADC. Statistical significance was defined as p < 0.05.

To assess the complementarity of STn to CA-125 as a biomarker for ovarian cancer, bivariate mixture models [32] were used to estimate the joint distribution of STn and CA-125. The underlying distribution was a bivariate t-distribution to provide robustness to outliers. A mixture model for one serum biomarker (e.g., CA-125) in cases has a fraction of cases (~80%) that overexpress the biomarker compared to control patients, and a complementary fraction (~20%) that does not overexpress the biomarker, that is, whose distribution is the same as that of the control group (benign ovarian disease). A bivariate mixture model for two serum biomarkers in cases has four components: the first where both biomarkers (STn and CA-125) are overexpressed, the second where STn is overexpressed but not CA-125, the third where CA-125 is overexpressed but not STn, and the fourth where neither biomarker is overexpressed and the bivariate distribution is therefore the same as in controls. Each component comprises a fraction of the cases, with the four fractions summing to 100%. The estimated fraction of cases in the second component, where STn is overexpressed but CA-125 is not, quantifies the complementarity of STn to CA-125.
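
To make the four-component structure concrete, the sketch below evaluates the log-likelihood of such a mixture using scipy's bivariate t-distribution. All means, the covariance, the degrees of freedom, and the mixing fractions are placeholders (the fractions echo the all-histology estimates reported later); the actual model was fit in Stan [30, 31], not with this code.

```python
# Conceptual sketch of the four-component bivariate t mixture described
# above, evaluated on log-transformed (STn, CA-125) case data.
import numpy as np
from scipy.stats import multivariate_t

# Component means on the log scale: [STn, CA-125] (placeholder values)
mu_both    = np.array([-0.7, 5.5])            # both markers overexpressed
mu_stn     = np.array([-0.7, 3.0])            # STn only
mu_ca125   = np.array([-2.3, 5.5])            # CA-125 only
mu_control = np.array([-2.3, 3.0])            # neither (same as controls)
cov = np.array([[0.5, 0.1], [0.1, 0.8]])      # shared shape matrix (placeholder)
frac = np.array([0.51, 0.02, 0.25, 0.22])     # mixing fractions, sum to 1

def mixture_loglik(x: np.ndarray) -> float:
    comps = [mu_both, mu_stn, mu_ca125, mu_control]
    dens = sum(f * multivariate_t(loc=m, shape=cov, df=5).pdf(x)
               for f, m in zip(frac, comps))
    return float(np.log(dens).sum())

cases = np.random.default_rng(0).multivariate_normal(mu_both, cov, size=50)
print(mixture_loglik(cases))
```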

Results

Anti-STn-ADC selectively targets cells that express STn in vitro

To determine the on-target effects of the humanized anti-STn-ADC (2G12), we utilized the SKOV3 wild-type line (SKOV3 CTL), chosen because it has little to no detectable STn, and SKOV3 cells stably transfected with ST6GALNAC1 to increase cell-surface STn (SKOV3-ST6GALNAC1) [6] (Supplemental Fig. 1A, B). MTT assays showed no significant effect of the humanized anti-STn-ADC on the proliferation of SKOV3 CTL cells, whereas there was a dose-dependent decrease in MTT activity in SKOV3-ST6GALNAC1 cells (Fig. 1A and B).

Figure 1

Demonstrating on-target effects and response of humanized anti-STn-ADC in pre-clinical models. SKOV3 CTL cells, which express little to no STn, and SKOV3 cells stably transfected with ST6GALNAC1 to express elevated levels of STn (SKOV3-ST6GALNAC1) were plated in 96-well plates and treated the next day with 0, 2.5, 5, 10, 50, or 100 nM of humanized anti-STn-ADC (A, B). Seventy-two hours later, metabolic activity, as an indirect measure of viability, was assessed via MTT assay. The humanized anti-STn-ADC did not affect the SKOV3 CTL cells, whereas a dose-dependent decrease in metabolic activity was evident in the SKOV3 cells with elevated levels of STn. Panels C and D show the response of 2 independent HGSOC PDX models expressing different levels of STn. Following randomization of mice into arms (n = minimum of 4 mice/arm), they were treated weekly with vehicle, isotype-ADC, or anti-STn-ADC at a dose of 5 mg/kg. Collective results for mice harboring tumors with a high percentage of STn-positive tumor cells are shown in panel C, and for mice hosting tumors with a low percentage of STn-positive tumor cells in panel D. Panel E is a schematic of the sequential and combination treatment strategy. Panel F shows the results of the multi-arm experiment: one arm received vehicle on days 0, 7, 14, and 21 (black line); one arm received anti-STn-ADC on days 0, 7, 14, and 21 (red line); one arm received the combination of anti-STn-ADC and carboplatin/paclitaxel (on days 0 and 7), followed by weekly PBS (on days 14 and 21; blue line). One arm, beginning with a larger number of tumor-bearing mice, received carboplatin and paclitaxel (once weekly for 2 weeks, days 0 and 7), after which the mice were randomized across two additional sub-arms: one received PBS on days 14 and 21 (orange line) after cessation of carboplatin/paclitaxel (which results in a resurgence of tumor growth), and the final sub-arm received anti-STn-ADC (on days 14 and 21) following cessation of carboplatin/paclitaxel (green line) to assess whether it could impede the resurgence. At the end of the experiment, all arms had a minimum of 4 mice. Asterisks represent differences (p < 0.05) compared to vehicle controls

Anti-STn-ADC is more effective in tumors with higher levels of STn expression in vivo

One of the therapeutic challenges in OvCa is tumor heterogeneity and the continual resurgence of tumor cells that become resistant, leading to high patient mortality. We postulated that PDXs replicate the heterogeneous nature of OvCa in patients more faithfully than xenografts generated from many of the commercially available, long-term established OvCa cell lines. Therefore, we identified two HGSOC PDXs for comparison, one with a low percentage of STn-positive tumor cells (5.48 ± 0.85 SEM) and one with a higher percentage of STn-positive tumor cells (53 ± 0.1 SEM), as determined by flow cytometry (Supplementary Fig. 2A and B). Humanized anti-STn-ADC (2G12, 5 mg/kg per week, as determined by PK studies [4]) was tested in these two OvCa PDX models. Mice hosting the high-STn PDX tumors (Fig. 1C) had a quick and robust response to treatment (50% decrease in tumor volume in ~4 weeks) compared to mice hosting tumors with lower STn (Fig. 1D). Despite the lower STn expression, there was still a benefit from anti-STn-ADC. Mice were weighed every 3–4 days, and treatment had no significant effect on mouse weights in either PDX model (Supplemental Fig. 2C and D).

Anti-STn-ADC was more effective than carboplatin/paclitaxel treatment

To determine whether anti-STn-ADC (2G12) has comparable effects to carboplatin/paclitaxel treatment, we utilized the high-STn-expressing PDX described above (Fig. 1C), examining anti-STn-ADC alone and in combination or sequential treatments. Animals were randomized and treated when tumors reached an average size of 300 mm³ (Fig. 1E). Animals were initially randomized to four arms: vehicle, anti-STn-ADC alone, carboplatin/paclitaxel in combination with anti-STn-ADC, and carboplatin/paclitaxel alone. At the designated time point (day 14, the third treatment point), the arm receiving carboplatin and paclitaxel in combination with anti-STn-ADC received vehicle for the remaining treatment points, and the mice receiving carboplatin and paclitaxel alone were randomized to receive either vehicle alone or anti-STn-ADC alone (Fig. 1F). The animals treated with anti-STn-ADC alone demonstrated a decrease in tumor size compared to vehicle at the end of week 2. To model the impact of anti-STn treatment on tumor resurgence, animals that were treated with carboplatin and paclitaxel for two weeks were then randomized to treatment with vehicle or anti-STn-ADC. Tumors that were switched to vehicle increased in size with time, as previously shown [25], and those switched to anti-STn-ADC continued to decrease.

Organoid cultures respond to anti-STn-ADC in a dose-dependent manner

To determine whether the human anti-STn-ADC could affect tumor cell viability in PDOs, we tested it on four PDO lines with varied genetic backgrounds and known therapeutic responses. Briefly, PDO 17–39 was derived from a patient diagnosed with recurrent HGSOC; the tumor was reported to have a BRCA1 germline mutation and a MYC amplification. PDO 17–121 was derived from a patient diagnosed with recurrent HGSOC with MYC amplification. PDO 17–116 was derived from a patient who had received neoadjuvant chemotherapy following a diagnosis of HGSOC. PDO 18–47 was derived from a patient diagnosed with HGSOC prior to treatment and is CCNE1-amplified. All PDOs were reported to harbor TP53 mutations, similar to the parent tumors [27].

Three of the lines were known to be treatment-resistant, while one remained sensitive to standard-of-care treatment [27] (Fig. 2 and Supplemental Figs. 3 and 4). The PDOs were treated with increasing concentrations of anti-STn-ADC for 6 days, and viability was determined using CellTiter-Glo (Fig. 2A-E). Interestingly, the line most sensitive to anti-STn-ADC, 17–121, has been determined to be multi-therapy resistant, as shown by the area over the curve (Fig. 2F). To determine whether STn would be detectable in the media, our ELISA was used; however, the levels in organoid culture were below detection. We then assessed STn levels in the HGSOC PDOs by flow cytometry using the mouse anti-STn antibody or IgG control. The organoids ranged from 5 to 23% positivity (Supplemental Fig. 3). In line with expectations, the PDO with the highest level of STn (17–121) responded most favorably to the anti-STn-ADC.

Figure 2

Assessment of anti-STn-ADC effects in a panel of HGSOC PDOs. The PDOs were treated with either vehicle or increasing concentrations of anti-STn-ADC on day 1. Each organoid line was tested in two independent repeats (3 replicates per dose in each repeat) and cultured for 6 days. An untreated series of wells was read on day 1 for comparison with the day-6 treated wells to allow for mathematical growth rate correction (panel A). Panels B, C, D, and E show the area over the growth-rate-corrected dose curve (AOC) for each organoid line, demonstrating its sensitivity to treatment with anti-STn-ADC compared to its vehicle control. Panel F illustrates the AOC GR50 of anti-STn-ADC in each line

Expression of STn in blinded patient samples

Patient serum samples were tested using the custom RUO ELISA to determine whether serum STn levels could distinguish OvCa from non-OvCa patient status (Fig. 3A and B). A total of 420 samples were run at various dilutions, and STn levels were calculated based on the BSM standard curve. Of these, 274 were from patients with OvCa (details in Supplemental Table 1), and 146 were from patients with benign gynecological diagnoses (i.e., endometriosis, endometrioma, follicular cysts, fibroids, etc.). On average, benign samples had significantly lower STn levels than cancerous samples (median 0.1 vs. 0.3 nM STn, p < 0.05). STn levels distinguished cancer from non-cancer with 15.3% sensitivity at 97.1% specificity using a cutoff of 0.2 nM.
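
Operationally, "sensitivity at 97.1% specificity" amounts to taking the corresponding benign-sample quantile as the cutoff and counting cancers above it; a sketch with simulated stand-in data follows (the arrays are hypothetical, not the cohort measurements).

```python
# Sketch of computing sensitivity at a fixed specificity for the serum assay.
import numpy as np

def sensitivity_at_specificity(benign: np.ndarray, cancer: np.ndarray, specificity: float):
    """Pick the cutoff giving the target specificity on benign samples,
    then report the fraction of cancers above that cutoff."""
    cutoff = np.quantile(benign, specificity)      # e.g., the 97.1th percentile
    sensitivity = float((cancer > cutoff).mean())
    return cutoff, sensitivity

rng = np.random.default_rng(1)                     # hypothetical stand-ins for the
benign = rng.lognormal(np.log(0.1), 0.5, size=146) # 146 benign and
cancer = rng.lognormal(np.log(0.3), 1.2, size=274) # 274 cancer measurements
cutoff, sens = sensitivity_at_specificity(benign, cancer, 0.971)
print(f"cutoff = {cutoff:.2f} nM, sensitivity = {sens:.1%}")
```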

Figure 3

Development and testing of a custom ELISA. Panel A provides a schematic of the ELISA. Panel B illustrates STn levels in serum collected from patients diagnosed with OvCa (all subtypes) compared with retrospectively collected serum from patients diagnosed with benign conditions (endometriosis, uterine fibroids, follicular cysts, etc.)

Correlation between STn and patient clinical outcomes

Clinical measures and survival outcomes (OS and PFS) were collected for the patients (median 7 years of follow-up), and correlative analysis was performed using log-rank tests and Cox proportional hazards models. Samples were grouped into clear cell (Fig. 4A), mucinous (Fig. 4B), endometrioid (Fig. 4C), and serous (Fig. 4D) histologies. There was no predictive association between serum STn expression and cancer stage in endometrioid, serous, or mucinous cancers. There was a correlation between stage and STn levels in the clear cell subtype by t-test (p = 0.002), with late-stage (III & IV) cancers having a median STn 90% higher than early-stage (I & II) cancers. Future cohorts may benefit from increased patient numbers in specific stages by histology to improve statistical power.
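
The clear cell stage comparison amounts to a two-sample t-test on log concentrations; a sketch with hypothetical values (not the cohort data) follows.

```python
# Sketch of the stage comparison reported for the clear cell subtype:
# a t-test on log STn concentrations, early (I & II) vs. late (III & IV).
import numpy as np
from scipy.stats import ttest_ind

early = np.log(np.array([0.08, 0.11, 0.09, 0.13, 0.10]))   # nM, stage I/II (hypothetical)
late  = np.log(np.array([0.18, 0.22, 0.16, 0.25, 0.20]))   # nM, stage III/IV (hypothetical)
t, p = ttest_ind(late, early)
fold = np.exp(late.mean() - early.mean())                  # geometric-mean fold difference
print(f"p = {p:.4f}, geometric-mean fold difference ~ {fold:.2f}x")
```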

Figure 4

Box and whisker plots displaying the distribution of serum STn levels (log-scaled concentration) by stage for clear cell OvCa, with a significant difference between early stage (I & II) and late stage (III & IV), p = 0.002 (panel A); mucinous OvCa (panel B); endometrioid OvCa (panel C); and serous OvCa (panel D). Panel E shows a bivariate plot of serum STn versus serum CA-125 coded by histology: serous (red), mucinous (orange), endometrioid (green), clear cell (blue), benign (purple)

To examine the ability of pre-treatment STn levels to predict response to a platinum/paclitaxel-based strategy, we grouped the cohort according to clinical response to therapy and performed correlative analyses by t-test, with log-rank tests and Cox proportional hazards models for survival outcomes (OS and PFS). These tests showed no predictive power of pre-treatment serum STn for the following outcomes. Patient clinical responses to treatment were classified as sensitive (response lasting > 6 months), refractory (no response), or resistant (response lasting < 6 months). There was no statistical difference in pre-treatment STn levels between patients with different responses to chemotherapy (Supplemental Fig. 5A). Kaplan-Meier curves for OS and PFS were generated, and there was no correlation between pre-treatment STn levels and patient PFS or OS (Supplemental Fig. 5B and C).
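
The survival analyses named here are standard; a minimal sketch using the lifelines package on a hypothetical toy data frame (not the authors' R analysis) is shown below.

```python
# Sketch of the survival analyses above: a log-rank test and a Cox model
# with dichotomized pre-treatment STn as the covariate. All data are toy values.
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import logrank_test

df = pd.DataFrame({
    "os_months": [12, 30, 45, 22, 60, 18, 40, 55],
    "event":     [1, 1, 0, 1, 0, 1, 1, 0],    # 1 = death observed
    "stn_high":  [1, 0, 0, 1, 0, 1, 1, 0],    # pre-treatment STn above median
})

# Log-rank test: OS in STn-high vs. STn-low patients
hi, lo = df[df.stn_high == 1], df[df.stn_high == 0]
lr = logrank_test(hi.os_months, lo.os_months,
                  event_observed_A=hi.event, event_observed_B=lo.event)
print(f"log-rank p = {lr.p_value:.3f}")

# Cox proportional hazards model (toy-sized; real analyses used the full cohort)
cph = CoxPHFitter().fit(df, duration_col="os_months", event_col="event")
print(cph.summary[["exp(coef)", "p"]])
```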

Determining if circulating STn can augment CA-125 as a diagnostic

In comparison, patient CA-125 levels showed 79% sensitivity at 97.1% specificity using a cutoff of 35 IU/mL. There were 25 patients (9.1%) with OvCa but low levels of CA-125. Among this group of low-CA-125 patients, only 4 showed higher levels of STn, limiting the increase in sensitivity due to STn, above that provided by CA-125, to about 2%. The combination of serum CA-125 and STn (Fig. 4E) did not improve biomarker prediction of cancer vs. non-cancer. Combining all histologies, the bivariate robust t-distribution mixture model estimated the following fractions of cases for the four components: (1) STn and CA-125 overexpression: 51.1%; (2) STn overexpression, CA-125 same as controls: 1.6%; (3) CA-125 overexpression, STn same as controls: 25.3%; and (4) STn and CA-125 same expression as control patients: 21.1%. For mucinous cases, however, the four components had the following fractions: (1) STn and CA-125 overexpression: 13.8%; (2) STn overexpression, CA-125 same as controls: 9.3%; (3) CA-125 overexpression, STn same as controls: 21.1%; and (4) STn and CA-125 same expression as control patients: 47.7%. Hence, for the mucinous subtype, serum STn was complementary to CA-125 in 9.3% of cases in our cohort. However, a larger number of mucinous ovarian cancer cases is required before these estimates can be considered stable and accurate, and before concluding that preoperative serum STn levels add predictive power to serum CA-125 for distinguishing ovarian cancer from benign pelvic disease.
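
The complementarity arithmetic can be made explicit: among cases below the CA-125 cutoff, count those above the STn cutoff. A sketch with invented paired values follows.

```python
# Sketch of the complementarity calculation described above: among cases
# missed by CA-125 (< 35 IU/mL), what fraction is caught by STn (> 0.2 nM)?
import numpy as np

ca125 = np.array([500, 20, 15, 800, 30, 12, 60, 25])           # IU/mL (hypothetical)
stn   = np.array([0.4, 0.25, 0.05, 0.6, 0.1, 0.3, 0.5, 0.08])  # nM (hypothetical)

ca125_low = ca125 < 35
rescued = ca125_low & (stn > 0.2)
added_sensitivity = rescued.sum() / len(ca125)
print(f"{ca125_low.sum()} CA-125-low cases, {rescued.sum()} rescued by STn "
      f"(+{added_sensitivity:.1%} sensitivity)")
```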

STn levels pre- and post-surgical debulking

Currently, OvCa disease progression monitoring is typically accomplished using CA-125 via clinically accessible blood assays. As expected, CA-125 values normally decline after surgical debulking. The decrease is attributed to the marked reduction in tumor volume, although it is generally appreciated that the correlation is not exact. Prior to this study, the impact of surgical debulking on circulating STn levels was unknown. To begin to address this question, an independent set of matched blood samples from patients diagnosed with HGSOC, drawn prior to surgery and again post-surgery, was assessed with our ELISA as described. We determined that circulating STn levels declined after debulking surgery (Supplemental Fig. 6), as has been reported for CA-125. These results, albeit preliminary, suggest that a future study with a larger cohort and more detailed clinical information on disease burden, including whether debulking was optimal or suboptimal, would be of interest. In addition, assessment of circulating STn pre- and post-surgery at regular intervals, until there is evidence of recurrent disease as determined by elevated CA-125 and/or imaging, could determine whether STn has value as an additional marker of recurrent disease. Whether post-surgical circulating STn levels could inform the onset of tumor resurgence remains to be determined.

Figure 5

Levels of positive STn staining observed in fixed, embedded tumor samples reflected serum STn levels. Representative examples of the IHC scoring scheme are shown in panel A; arrows indicate STn-positive cells, and scale bars represent 100 μm. A linear regression analysis of serum STn levels against IHC score (0, 1, 2, 3) showed a positive correlation with median serum STn (panel B, p < 0.0001)

Immunohistochemical analysis of serum-matched tumor samples

To determine whether the levels of STn observed in tumors reflected serum STn levels, we initially stained a subset of HGSOC samples (stage III, n = 30), which were evaluated and scored (scoring scheme examples are shown in Fig. 5A).

A linear regression analysis of serum STn levels against IHC score (0, 1, 2, 3) for all histologies (n = 52, including serous, clear cell, mucinous, and endometrioid) showed a positive correlation (Fig. 5B, p < 0.0001), with median serum STn almost tripling with each unit increase in IHC score (292% increase). Only two patients (3.8%) had low serum STn (below the median of 0.1 nM) together with high (level 2 or 3) IHC STn expression. STn can be presented on a variety of proteins, and we do not expect all to be shed at the same rate. Of note, we observed heterogeneity in STn-positive staining when evaluating samples collected spatially and temporally apart. This variation highlights that STn expression could be heterogeneous throughout a patient's clinical course, supporting the argument for a complementary analysis via a blood-based assay. Future studies using adequately powered longitudinal cohorts could improve our understanding of these observed STn biomarker changes and of the correlation between tissue and blood STn prevalence.
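
Read this way, the exponentiated slope of log(serum STn) on IHC score gives the fold change per unit score; a sketch with hypothetical points (not the cohort data) follows.

```python
# Sketch of the regression reported above: log serum STn against IHC score,
# with the exponentiated slope read as the fold change per unit IHC increase.
import numpy as np
from scipy.stats import linregress

ihc   = np.array([0, 0, 1, 1, 1, 2, 2, 3, 3, 3])                           # hypothetical scores
serum = np.array([0.05, 0.08, 0.15, 0.2, 0.25, 0.6, 0.8, 1.8, 2.5, 3.0])   # nM, hypothetical

fit = linregress(ihc, np.log(serum))
fold_per_unit = np.exp(fit.slope)
print(f"fold change per IHC unit ~ {fold_per_unit:.2f}x, p = {fit.pvalue:.2e}")
```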

Discussion

The current study demonstrated on-target effects of treatment with a humanized anti-STn-ADC in OvCa cells, HGSOC PDOs, and patient-derived serous ovarian cancer xenograft models. This significantly extends our previous assessment of STn expression in long-term established ovarian cancer cell lines and a xenograft model using the mouse anti-STn-ADC, as well as the more recent characterization of the humanized anti-STn-ADC [4, 5, 6]. In addition to the functional studies, we developed an STn ELISA as a proof-of-concept to assess which patients express elevated STn levels and thus might benefit most from an anti-STn-based strategy. Unlike other studies [12, 33, 34], we did not observe an increase in circulating STn levels correlating with advanced stage. Moreover, there was no correlation between circulating STn level and PFS or OS in our cohort using our ELISA. There was, however, evidence that serum levels reflected tumor levels of STn. The differences between our results and those of others could be attributed, in part, to the different specificities of the antibodies used, as previously described [6, 19]. Lastly, while our ELISA could distinguish between malignant and non-malignant samples with a relatively high degree of confidence, the assessment of serum STn using our ELISA did not outperform or augment CA-125 in the detection of serous ovarian cancer. It did augment CA-125 in detecting mucinous and clear cell cancers in our cohort, although the numbers of samples representing these subtypes were limited, and analysis of a larger cohort is needed for confidence. Further development of this blood-based ELISA immunoassay platform to improve sensitivity and dynamic range could be beneficial for this unique target.

Our data herein suggest the humanized anti-STn-ADC preferentially targets cells that express STn in an in vitro model. This was shown by treating the SKOV3 CTL OvCa cell line, which naturally expresses little to no STn, and a SKOV3 cell line that stably overexpresses ST6GALNAC1, resulting in increased levels of STn [4], with the human anti-STn-ADC. The human anti-STn-ADC had no significant effect on the SKOV3 CTL line but displayed a dose-dependent impact on the SKOV3-ST6GALNAC1 line, as determined by MTT assay. This result agrees with our murine antibody specificity in an in vitro setting using long-term established ovarian cancer cell lines that displayed a range of STn-positive cells [5, 6]. To augment these findings, we assessed the effect of our anti-STn-ADC on four PDO lines, three of which were shown to be resistant to one or more drug treatments and one of which was still platinum-sensitive [27]. Similar to what we observed in the established OvCa lines using the murine anti-STn-ADC, the humanized anti-STn-ADC proved effective in patient-derived OvCa organoids, further supporting our argument. The pattern of response reflected STn levels, with the 17–121 PDO, which showed the highest level of STn, being the most responsive.

Moving forward, it was essential to determine the predictive power of STn expression relative to the response to anti-STn-ADC treatment in in vivo models. Similar to what was observed with the STn-positive and -negative SKOV3 cells, the PDX models indicate that the level of STn impacts the effect of anti-STn-ADC treatment. The PDX derived from the tumor with relatively low levels of STn responded to anti-STn-ADC, although modestly, compared to the PDX hosting the tumor with a higher percentage of STn-positive cells, which responded more robustly. These data suggest that not all tumors would benefit equally from anti-STn-ADC treatment and that some level of screening for STn expression would be required to enrich for potential responders. This finding supports the conclusions of a previous vaccine trial (Theratope), in which it was determined retrospectively that enriching for patients whose tumors displayed higher STn levels would likely improve the chance of success [7]. However, longitudinal STn expression was not studied in this cohort; it may be dynamic and/or change in response to treatment, and this will need to be investigated.

Using a previously described PDX mouse model [25] that mimics, in part, recurrence or resurgence of cancer, we compared a standard carboplatin and paclitaxel regimen against anti-STn-ADC treatment alone and in combination. Interestingly, treatment with single-agent anti-STn-ADC resulted in an improved response over carboplatin and paclitaxel alone in the PDX hosting the tumor with relatively high levels of STn. We appreciate that the PDX model used herein reflects a more acute response rather than a long-term response. Stopping the carboplatin and paclitaxel treatment was meant to mimic tumor resurgence, as shown previously [25]. More specifically, this model provides some insight into whether adding another drug at the point the cytotoxic agent is withdrawn, or attempting a combination strategy at the onset and maintaining the second agent after the cytotoxic agent is stopped, is sufficient to prevent the resurgence prompted by withdrawal of the cytotoxic regimen. We therefore examined two combination strategies with anti-STn-ADC and chemotherapy: the first was concurrent carboplatin/paclitaxel and anti-STn-ADC, and the second was sequential treatment starting with chemotherapy followed by anti-STn-ADC. We postulated that chemotherapy would enrich for STn-positive cells, suggesting that the sequential strategy would be more effective than single-agent treatment. This, however, was not the case in our study, where single-agent anti-STn-ADC and sequential treatment were equally effective. Based on this outcome, we postulate that an anti-STn-ADC might serve as a maintenance strategy; additional studies would bolster or refute this observation.

Historically, other STn-based therapies have been less than effective in clinical trials. The Theratope vaccine was tested in breast cancer patients; however, the results were underwhelming. As mentioned, these lackluster results were likely due, in part, to the lack of patient stratification based on tumor levels of STn. Retrospective ad hoc analyses demonstrated that patients with detectable levels of STn fared better than those without [7]. These results, along with our own observations [4, 5, 6], prompted us to determine whether treatment with our anti-STn-ADC would benefit from patient stratification by STn expression. Consequently, we focused on developing a serum ELISA that could identify patients with circulating STn and on determining whether circulating levels reflected what was observed in the tumor.

Several antibodies have been used to assess circulating STn levels or detect malignancies, with varied outcomes [12, 35, 36, 37]. Some of the variability in previous studies was likely due, in part, to differences in methodology, a lack of antibody specificity, and/or assessment of STn in the cytoplasm, on membrane-bound proteins, or on circulating proteins. We previously demonstrated that some of the historical anti-STn antibodies bound additional glycans and/or may have glycoprotein preferences for antigen recognition [6]. Commercial STn serum quantification assays often utilize the historical CC49 and B72.3 antibody pairs, which are not strictly STn-specific. We postulated that an assay utilizing our unique anti-STn antibodies might improve assay sensitivity and specificity for future biomarker or patient stratification serum ELISA assays. Based on preliminary unpublished data, a subset of anti-STn antibodies was chosen for sandwich ELISA testing, with pairing guided by their light chain CDR sequences. This report describes the initial development and preliminary characterization of the STn ELISA and explores its suitability to predict clinical response in retrospective OvCa patient cohorts.

As described, we analyzed a cohort of blood samples from patients originally diagnosed with benign or malignant gynecologic diseases/conditions. The benign cases were preferentially chosen for their high propensity for a false-positive read if CA-125 were the sole discriminating factor; these benign GYN cases had diagnoses such as endometrioma, endometriosis, cystic follicles, and leiomyomata. The malignant cases primarily comprised serous ovarian cancer but included a limited cohort of ovarian mucinous, clear cell, and endometrioid cancers. One of our objectives was to determine whether the ELISA could detect circulating levels of STn and whether those levels were concordant with STn positivity on FFPE sections from paired samples. For this, we identified a subset of matched samples to compare blood and tumor levels.

We did not expect a 1:1 correlation, given that some proteins or peptides presenting STn may remain on the membrane while others are cleaved and found in circulation. For example, MUC16 can be cleaved, and the cleaved portion, CA-125, is found in the circulation. Other STn-positive proteins can be internalized or remain on the surface. Others have also suggested that STn can be ubiquitously expressed or irregularly distributed across tumors [6, 7, 9], and some have reported that more aggressive areas might display increased levels of STn [7]. If this were the case, FFPE sampling might be an issue; however, this was not the case. In the cases we evaluated, the majority were concordant: if STn was high in the serum, it was also highly expressed on the tumor sections with adequate sampling, and vice versa. This finding suggests that the ELISA or IHC may serve to triage patients appropriately, and combining the two methods, ELISA and IHC, could provide additional confidence.

Unlike what others have reported [12, 33], using our antibody combination in a sandwich ELISA we did not see a clear increase in STn levels concordant with disease stage, except for ovarian clear cell cancer, and that was only in a small cohort. We attribute this difference to the specificity of the antibody combination used in our ELISA, as opposed to the antibodies used by others, many of which we showed were not specific to STn [6].

Given that our STn ELISA was generally able to distinguish between malignant and non-malignant cases, we were curious how it compared to CA-125 and whether it augmented CA-125's ability to detect ovarian cancer. CA-125 is FDA-approved for monitoring ovarian cancer patients' response to treatment and has ~90% specificity and 50–60% sensitivity [38, 39]. However, CA-125 is not approved as a diagnostic measure in ovarian cancer because it is also elevated in benign conditions such as endometriosis, ovarian cysts, and fibroids [40, 41, 42, 43, 44]. Despite the introduction of new biomarkers, such as HE4, and other multi-modal tests (ROMA, ROCA, OVA1, Overa), CA-125 [45] is still often one of the key biomarkers studied.

CA-125 is a mucin-type glycoprotein product of cleaved MUC16. The extracellular domain of CA-125 is rich in N- and O-linked glycans, with 249 potential N-glycosylation and over 3,700 O-glycosylation sites [46]. CA-125 glycan post-translational modifications often differ between normal and malignant conditions, with truncated glycans such as STn expressed in OvCa [47]. The Elecsys CA 125 II clinical assay does not discriminate among glycoforms of CA-125, as neither mAb clone is significantly influenced by N- or O-glycosylation of CA-125 [48]. We postulated that analysis of post-translational modifications such as glycosylation may increase the specificity and sensitivity of existing OvCa biomarkers or even lead to novel independent biomarkers with improved accuracy. Over 70% of human blood proteins are glycosylated, and these modifications often play important roles in adhesion, migration, immune recognition, and other signaling pathways [49]. However, despite the increased specificity of the STn antibody and the number of STn-bearing glycoproteins, we were unable to demonstrate improvement over CA-125 independently, or to augment its ability to detect ovarian cancer, except for the mucinous and clear cell cancers. Despite showing statistical differences, the small sample size warrants caution about STn's value in augmenting CA-125. Together, these findings suggest that an antibody-based approach targeting more than one of the aberrant glycosylations associated with malignancy might be a better way to improve diagnostic potential.

Collectively, we conclude that our preclinical studies, along with those of others, support the concept that targeting STn with a highly specific antibody-drug conjugate, where there is evidence of STn in the circulation and/or the tumor, might be a viable clinical option for the treatment of OvCa. Further studies are needed to validate whether an anti-STn-ADC could be used as a maintenance strategy.

Data availability

The datasets used and analyzed during the current study are available from the corresponding author on reasonable request. Sharing of primary human tumor material is subject to approval by the institutional review board, institutional data and tissue management and sharing committees, and approved material transfer agreements.

Torre LA, Trabert B, DeSantis CE, Miller KD, Samimi G, Runowicz CD, Gaudet MM, Jemal A, Siegel RL. Ovarian cancer statistics, 2018. CA Cancer J Clin. 2018;68(4):284–96.


Siegel RL, Miller KD, Fuchs HE, Jemal A. Cancer statistics, 2022. CA Cancer J Clin. 2022;72(1):7–33.


Zhang Z, Wuhrer M, Holst S. Serum sialylation changes in cancer. Glycoconj J. 2018;35(2):139–60.


Eavarone DA, Al-Alem L, Lugovskoy A, Prendergast JM, Nazer RI, Stein JN, Dransfield DT, Behrens J, Rueda BR. Humanized anti-sialyl-tn antibodies for the treatment of ovarian carcinoma. PLoS ONE. 2018;13(7):e0201314.

Starbuck K, Al-Alem L, Eavarone DA, Hernandez SF, Bellio C, Prendergast JM, Stein J, Dransfield DT, Zarrella B, Growdon WB, et al. Treatment of ovarian cancer by targeting the tumor stem cell-associated carbohydrate antigen, Sialyl-Thomsen-Nouveau. Oncotarget. 2018;9(33):23289–305.

Prendergast JM, Galvao da Silva AP, Eavarone DA, Ghaderi D, Zhang M, Brady D, Wicks J, DeSander J, Behrens J, Rueda BR. Novel anti-sialyl-tn monoclonal antibodies and antibody-drug conjugates demonstrate tumor specificity and anti-tumor activity. MAbs. 2017;9(4):615–27.

Julien S, Videira PA, Delannoy P. Sialyl-tn in cancer: (how) did we miss the target? Biomolecules. 2012;2(4):435–66.

Werther JL, Tatematsu M, Klein R, Kurihara M, Kumagai K, Llorens P, Guidugli Neto J, Bodian C, Pertsemlidis D, Yamachika T, et al. Sialosyl-Tn antigen as a marker of gastric cancer progression: an international study. Int J Cancer. 1996;69(3):193–9.


Munkley J. The role of Sialyl-Tn in Cancer. Int J Mol Sci. 2016;17(3):275.

Ferreira JA, Videira PA, Lima L, Pereira S, Silva M, Carrascal M, Severino PF, Fernandes E, Almeida A, Costa C, et al. Overexpression of tumour-associated carbohydrate antigen sialyl-Tn in advanced bladder tumours. Mol Oncol. 2013;7(3):719–31.

Carvalho S, Abreu CM, Ferreira D, Lima L, Ferreira JA, Santos LL, Ribeiro R, Grenha V, Martinez-Fernandez M, Duenas M, et al. Phenotypic analysis of Urothelial Exfoliated cells in bladder Cancer via Microfluidic immunoassays: Sialyl-Tn as a Novel Biomarker in Liquid biopsies. Front Oncol. 2020;10:1774.

Kobayashi H, Terao T, Kawashima Y. Serum sialyl tn as an independent predictor of poor prognosis in patients with epithelial ovarian cancer. J Clin Oncol. 1992;10(1):95–101.

Carrascal MA, Severino PF, Guadalupe Cabral M, Silva M, Ferreira JA, Calais F, Quinto H, Pen C, Ligeiro D, Santos LL, et al. Sialyl Tn-expressing bladder cancer cells induce a tolerogenic phenotype in innate and adaptive immune cells. Mol Oncol. 2014;8(3):753–65.

Al-Alem LF, Pandya UM, Baker AT, Bellio C, Zarrella BD, Clark J, DiGloria CM, Rueda BR. Ovarian cancer stem cells: what progress have we made? Int J Biochem Cell Biol. 2019;107:92–103.

Foster R, Buckanovich RJ, Rueda BR. Ovarian cancer stem cells: working towards the root of stemness. Cancer Lett. 2013;338(1):147–57.

Wang WY, Cao YX, Zhou X, Wei B, Zhan L, Sun SY. Stimulative role of ST6GALNAC1 in proliferation, migration and invasion of ovarian cancer stem cells via the akt signaling pathway. Cancer Cell Int. 2019;19:86.

Reddish MA, Jackson L, Koganty RR, Qiu D, Hong W, Longenecker BM. Specificities of anti-sialyl-tn and anti-tn monoclonal antibodies generated using novel clustered synthetic glycopeptide epitopes. Glycoconj J. 1997;14(5):549–60.

Murad JP, Kozlowska AK, Lee HJ, Ramamurthy M, Chang WC, Yazaki P, Colcher D, Shively J, Cristea M, Forman SJ, et al. Effective targeting of TAG72(+) peritoneal ovarian tumors via Regional Delivery of CAR-Engineered T cells. Front Immunol. 2018;9:2268.

Loureiro LR, Carrascal MA, Barbas A, Ramalho JS, Novo C, Delannoy P, Videira PA. Challenges in antibody development against tn and Sialyl-Tn antigens. Biomolecules. 2015;5(3):1783–809.

Hamilton TC, Young RC, McKoy WM, Grotzinger KR, Green JA, Chu EW, Whang-Peng J, Rogan AM, Green WR, Ozols RF. Characterization of a human ovarian carcinoma cell line (NIH:OVCAR-3) with androgen and estrogen receptors. Cancer Res. 1983;43(11):5379–89.


Provencher DM, Lounis H, Champoux L, Tetrault M, Manderson EN, Wang JC, Eydoux P, Savoie R, Tonin PN, Mes-Masson AM. Characterization of four novel epithelial ovarian cancer cell lines. In Vitro Cell Dev Biol Anim. 2000;36(6):357–61.


Domcke S, Sinha R, Levine DA, Sander C, Schultz N. Evaluating cell lines as tumour models by comparison of genomic profiles. Nat Commun. 2013;4:2126.

Groeneweg JW, DiGloria CM, Yuan J, Richardson WS, Growdon WB, Sathyanarayanan S, Foster R, Rueda BR. Inhibition of notch signaling in combination with Paclitaxel reduces platinum-resistant ovarian tumor growth. Front Oncol. 2014;4:171.

Curley MD, Therrien VA, Cummings CL, Sergent PA, Koulouris CR, Friel AM, Roberts DJ, Seiden MV, Scadden DT, Rueda BR, et al. CD133 expression defines a tumor initiating cell population in primary human ovarian cancer. Stem Cells. 2009;27(12):2875–83.

McCann CK, Growdon WB, Kulkarni-Datar K, Curley MD, Friel AM, Proctor JL, Sheikh H, Deyneko I, Ferguson JA, Vathipadiekal V, et al. Inhibition of hedgehog signaling antagonizes serous ovarian cancer growth in a primary xenograft model. PLoS ONE. 2011;6(11):e28077.

Groeneweg JW, Hernandez SF, Byron VF, DiGloria CM, Lopez H, Scialabba V, Kim M, Zhang L, Borger DR, Tambouret R, et al. Dual HER2 targeting impedes growth of HER2 gene-amplified uterine serous carcinoma xenografts. Clin Cancer Res. 2014;20(24):6517–28.

Hill SJ, Decker B, Roberts EA, Horowitz NS, Muto MG, Worley MJ Jr., Feltmate CM, Nucci MR, Swisher EM, Nguyen H, et al. Prediction of DNA repair inhibitor response in short-term patient-derived ovarian Cancer Organoids. Cancer Discov. 2018;8(11):1404–21.

Hafner M, Niepel M, Chung M, Sorger PK. Growth rate inhibition metrics correct for confounders in measuring sensitivity to cancer drugs. Nat Methods. 2016;13(6):521–7.

R Core Team. R: A language and environment for statistical computing. 2020.

Stan Development Team. Stan Modeling Language Users Guide and Reference Manual, version 2.28. 2021.

Stan Development Team. RStan: the R interface to Stan. R package version 2.21.2. 2020.

Skates SJ, Horick N, Yu Y, Xu FJ, Berchuck A, Havrilesky LJ, de Bruijn HW, van der Zee AG, Woolas RP, Jacobs IJ, et al. Preoperative sensitivity and specificity for early-stage ovarian cancer when combining cancer antigen CA-125II, CA 15 – 3, CA 72 – 4, and macrophage colony-stimulating factor using mixtures of multivariate normal distributions. J Clin Oncol. 2004;22(20):4059–66.

Kobayashi H, Terao T, Kawashima Y. Clinical evaluation of circulating serum sialyl tn antigen levels in patients with epithelial ovarian cancer. J Clin Oncol. 1991;9(6):983–7.

Kobayashi H, Terao T, Kawashima Y. [Serum sialyl tn antigen as a prognostic marker in patients with epithelial ovarian cancer]. Nihon Sanka Fujinka Gakkai Zasshi. 1992;44(1):14–20.

Doerr RJ, Abdel-Nabi H, Krag D, Mitchell E. Radiolabeled antibody imaging in the management of colorectal cancer. Results of a multicenter clinical study. Ann Surg. 1991;214(2):118–24.

Bohdiewicz PJ. Indium-111 satumomab pendetide: the first FDA-approved monoclonal antibody for tumor imaging. J Nucl Med Technol. 1998;26(3):155–63; quiz 170–1.

Bhatt P, Vhora I, Patil S, Amrutiya J, Bhattacharya C, Misra A, Mashru R. Role of antibodies in diagnosis and treatment of ovarian cancer: basic approach and clinical status. J Control Release. 2016;226:148–67.

Colakovic S, Lukic V, Mitrovic L, Jelic S, Susnjar S, Marinkovic J. Prognostic value of CA125 kinetics and half-life in advanced ovarian cancer. Int J Biol Markers. 2000;15(2):147–52.

van der Burg ME, Lammes FB, van Putten WL, Stoter G. Ovarian cancer: the prognostic value of the serum half-life of CA125 during induction chemotherapy. Gynecol Oncol. 1988;30(3):307–12.

Giudice LC, Jacobs AJ, Bell CE, Lippmann L. Serum levels of CA-125 in patients with endometriosis. Gynecol Oncol. 1986;25(2):256–8.

Fedele L, Vercellini P, Arcaini L, da Dalt MG, Candiani GB. CA 125 in serum, peritoneal fluid, active lesions, and endometrium of patients with endometriosis. Am J Obstet Gynecol. 1988;158(1):166–70.

Jacobs I, Bast RC Jr. The CA 125 tumour-associated antigen: a review of the literature. Hum Reprod. 1989;4(1):1–12.

Moloney MD, Thornton JG, Cooper EH. Serum CA 125 antigen levels and disease severity in patients with endometriosis. Obstet Gynecol. 1989;73(5 Pt 1):767–9.

Burghaus S, Drazic P, Wolfler M, Mechsner S, Zeppernick M, Meinhold-Heerlein I, Mueller MD, Rothmund R, Vigano P, Becker CM, et al. Multicenter evaluation of blood-based biomarkers for the detection of endometriosis and adenomyosis: a prospective non-interventional study. Int J Gynaecol Obstet. 2024;164(1):305–14.

Bayoumy S, Hyytia H, Leivo J, Talha SM, Huhtinen K, Poutanen M, Hynninen J, Perheentupa A, Lamminmaki U, Gidwani K, et al. Glycovariant-based lateral flow immunoassay to detect ovarian cancer-associated serum CA125. Commun Biol. 2020;3(1):460.

Saldova R, Struwe WB, Wynne K, Elia G, Duffy MJ, Rudd PM. Exploring the glycosylation of serum CA125. Int J Mol Sci. 2013;14(8):15636–54.

Fu C, Zhao H, Wang Y, Cai H, Xiao Y, Zeng Y, Chen H. Tumor-associated antigens: tn antigen, sTn antigen, and T antigen. HLA. 2016;88(6):275–86.

Marcos-Silva L, Narimatsu Y, Halim A, Campos D, Yang Z, Tarp MA, Pereira PJ, Mandel U, Bennett EP, Vakhrushev SY, et al. Characterization of binding epitopes of CA125 monoclonal antibodies. J Proteome Res. 2014;13(7):3349–59.

Li Q, Kailemia MJ, Merleev AA, Xu G, Serie D, Danan LM, Haj FG, Maverakis E, Lebrilla CB. Site-specific glycosylation quantitation of 50 serum glycoproteins enhanced by Predictive Glycopeptidomics for Improved Disease Biomarker Discovery. Anal Chem. 2019;91(8):5433–45.


Acknowledgements

The authors are grateful to the MGH gynecologic oncology surgeons and their support staff, and to the MGH gyn pathologists and their support team, for their cooperation in obtaining samples for the repository. We would also like to thank the patients who provided informed consent to allow excess samples to be banked for research purposes with the intent to combat gynecologic diseases.

Research reported in this publication was supported in part by the National Cancer Institute (NCI) of the National Institutes of Health (NIH) Small Business Innovation Research (SBIR) program under grants HHSN261200700063C and HHSN261200900034C (JMP, DJD, JB, and BRR). In addition, BRR, BZ, DZ, VP, and LAL were supported, in part, by the Nile Albright Research Foundation (NARF) and the Vincent Memorial Hospital Foundation (VMHF). SJH was supported by NIH 1DP5OD029637, DOD OCRP Pilot OC180061, and a Tina's Wish Rising Star Award. SJS was supported by NCI Early Detection Research Network grants U01-CA152990 and U2C-CA271871, and the Concord (MA) Detect Ovarian Cancer Early Fund.

Author information

Authors and affiliations

Vincent Center for Reproductive Biology, Department of Obstetrics and Gynecology, Massachusetts General Hospital, Boston, MA, 02114, USA

Linah Al-Alem, Bianca Zarrella, Dominique T. Zarrella, Whitfield B. Growdon, Venkatesh Pooladanda & Bo R. Rueda

Obstetrics, Gynecology and Reproductive Biology, Harvard Medical School, Boston, MA, 02115, USA

Linah Al-Alem, Whitfield B. Growdon, Venkatesh Pooladanda & Bo R. Rueda

Siamab Therapeutics, Inc, Newton, MA, 02458, USA

Jillian M. Prendergast, Justin Clark, Rawan I. Nazer, Jeff Behrens & Daniel T. Dransfield

Division of Gynecologic Oncology, Department of Obstetrics and Gynecology, Massachusetts General Hospital, Boston, MA, 02114, USA

Whitfield B. Growdon & Bo R. Rueda

Department of Medical Oncology, Dana-Farber Cancer Institute, Boston, MA, 02215, USA

Sarah J. Hill

Division of Molecular and Cellular Oncology, Dana-Farber Cancer Institute, Boston, MA, 02215, USA

Department of Medicine, Harvard Medical School, Boston, MA, 02115, USA

Division of Hematology-Oncology, Massachusetts General Hospital, 55 Fruit St, Boston, MA, 02114, USA

David R. Spriggs

Department of Medicine, Massachusetts General Hospital, Boston, MA, 02114, USA

Obstetrics and Gynecology Epidemiology Center, Brigham and Women’s Hospital, Boston, MA, 02115, USA

Daniel Cramer

Division of Gynecologic Oncology, Department of Obstetrics and Gynecology, Brigham and Women’s Hospital, Boston, MA, 02115, USA

Kevin M. Elias

Biostatistics Center, Massachusetts General Hospital, Boston, MA, 02114, USA

Steven J. Skates


Contributions

Conceptualization, LAA, JMP, SJH, SJS, WBG, and BRR; methodology, LAA, JMP, SJH, SJS, JC, BZ, DZ, RIN, and BRR; validation, LAA, SMP, JMP, SJS, and BRR; formal analysis, LAA, JMP, SJH, SJS, WBG, and BRR; investigation; resources, DC, KME, DJD, JB, SJH, and BRR; data curation, LAA, JMP, BZ, DZ, SJH, SJS, VP, and BRR; writing—original draft preparation, LAA, JMP, and BRR; writing—review and editing, KME, JMP, JC, DJD, JB, DZ, BZ, VP, SJH, SJS, and BRR; visualization, LAA, VP, SJH, SJS, and BRR; supervision, BRR; project administration, BRR; funding acquisition, BRR, SJH, and JB. All authors have read and agreed to the published version of the manuscript.

Corresponding author

Correspondence to Bo R. Rueda .

Ethics declarations

Ethics approval and consent to participate

As stated above all primary patient samples used herein were derived either from patients who provided written informed consent or were collected on an institutional IRB approved protocol allowing for use of discarded tissue. All samples were coded and therefore we have no direct access to patient health identifiers as per our IRB protocol.

Consent for publication

Not Applicable.

Competing interests

This research led to the development of products that were previously owned by or licensed to Siamab Therapeutics, Inc. in which JMP, RIN, DTD, JB had a business and/or financial interest. BRR served as a consultant for SIAMAB Therapeutics.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary Material 1

Supplementary Material 2

Supplementary Material 3

Supplementary Material 4

Supplementary Material 5

Supplementary Material 6

Supplementary Material 7

Supplementary Material 8

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.

About this article

Cite this article

Al-Alem, L., Prendergast, J.M., Clark, J. et al. Sialyl-Tn serves as a potential therapeutic target for ovarian cancer. J Ovarian Res 17, 71 (2024). https://doi.org/10.1186/s13048-024-01397-1

Received: 24 August 2023

Accepted: 21 March 2024

Published: 02 April 2024

DOI: https://doi.org/10.1186/s13048-024-01397-1

Keywords

  • Ovarian cancer
  • Companion diagnostic
  • Targeted therapy

