Indian Dermatology Online Journal, 12(1), Jan-Feb 2021

Research Funding—Why, When, and How?

Shekhar Neema

Department of Dermatology, Armed Forces Medical College, Pune, Maharashtra, India

Laxmisha Chandrashekar

Department of Dermatology, Jawaharlal Institute of Postgraduate Medical Education and Research (JIPMER), Dhanvantari Nagar, Puducherry, India

Research funding is defined as a grant obtained for conducting scientific research, generally through a competitive process. Applying for grants and securing research funding is an essential part of conducting research. In this article, we discuss why one should apply for research grants, what the avenues for obtaining them are, and how to go about it in a step-wise manner. We also discuss how to write a research grant proposal and what to do after funding is received.

Introduction

The two most important components of any research project are the idea and its execution. The successful execution of a research project depends not only on the effort of the researcher but also on the infrastructure available to conduct the research. Conducting a research project entails expenses on personnel and materials, and funding is essential to meet these requirements. Many research projects can be conducted without external funding if the researcher or institution already has the necessary infrastructure. However, it is unethical to order tests for research purposes when they do not benefit the patient directly or are not part of the standard of care; research funding is required to meet such expenses and to ensure the smooth execution of the project. Securing funding for a research project is a topic that is rarely discussed during postgraduate training or later in an academic career, especially in medical science. Many good ideas do not materialize into good research projects because of a lack of funding.[ 1 ] Grant writing is a skill that can be learnt only by practising, and we intend to throw light on the major hurdles faced in securing research funding.

Why Do We Need the Funds for Research?

It is possible to publish papers without external funding; observational and experimental research with small sample sizes can be conducted within departmental resources and can result in meaningful outputs such as case reports, case series, observational studies, or small experimental studies. However, when multi-centric studies, randomized controlled trials, or experimental or observational studies with large sample sizes are envisaged, it may not be possible to conduct the study within the resources of the department or institution, and a source of external funding is required.

Basic Requirements for Research Funding

The most important requirement is an interest in the particular subject, thorough knowledge of it, and identification of the gap in existing knowledge. The second requirement is knowing whether your research can be completed with internal resources or requires external funding. The next steps are identifying the funding agencies that provide funds in your subject area, preparing the grant proposal, and submitting it on time.

What Are the Sources of Research Funding? – Details of Funding Agencies

Many local, national, and international funding bodies provide grants for research. However, different funding agencies prioritize different types of research, and this needs to be kept in mind while planning a grant proposal. In addition, funding agencies differ in their timelines for proposal submission and in the limits on available funds. Details about funding bodies are tabulated in Table 1 . These details are indicative rather than comprehensive.

Table 1. Details of funding agencies

Application for the Research Grant

Applying for a research grant is a time-consuming but rewarding task. It not only provides an opportunity for designing a good study but also allows one to understand the administrative aspects of conducting research. In a publication, peer review is done after the paper is submitted, but for a research grant, peer review is done at the proposal stage, which helps the researcher improve the study design even if the grant application is unsuccessful. Funds available for research are generally limited, so each grant proposal is reviewed on its merit by a peer group before it is approved. It is important to be on the lookout for calls for proposals and the deadlines of various grants. Ideally, the draft research proposal should be ready well before the call for proposals, and every step should be meticulously planned to avoid a rush just before the deadline. The steps in applying for a research grant are listed below; every step is essential, but they need not be carried out in this particular order.

  • Idea: The most important aspect of research is the idea. After having the idea in mind, it is important to refine it by going through the literature and finding out what has already been done in the subject and where the gaps in research lie. The FINER framework should be used while framing the research question; FINER stands for feasible, interesting, novel, ethical, and relevant
  • Designing the study: A well-designed study is the first step of a well-executed research project. A flawed study design is difficult to correct once the project is advanced, so the study should be planned well and discussed with co-workers. The help of an expert epidemiologist can be sought while designing the study
  • Collaboration: The facilities available to conduct a study within a single department are often limited. Inter-departmental and inter-institutional collaboration is key to performing good research. The quality of the project improves by having a subject expert on board, and it also makes acceptance of the grant easier. The availability of facilities in the department and institution should be ascertained before planning the project
  • Scientific and ethical committee approval: Most research grants require the project to be approved by the institutional ethics committee (IEC) before it is submitted. IEC meetings usually happen once a quarter; hence, pre-planning the project is essential. Some institutes also hold a scientific committee meeting before the proposal can be submitted for funding. A study that is unscientific is not ethical, so a research proposal must pass the scrutiny of both committees
  • Writing the research grant: Writing a good research grant proposal decides whether research funding can be secured or not, so we discuss this part in detail below.

How to Write a Research Grant Proposal[13,14,15]

The steps in writing a research grant proposal are as follows:

  • Identifying the idea and designing the study. The study design should include details about the type of study, methodology, sampling, blinding, inclusion and exclusion criteria, outcome measurements, and statistical analysis
  • Identifying prospective grants: the timing of the application, the specific requirements of the grant, and the budget available under it
  • Discussing the requirements for consumables and equipment with collaborators (co-investigators)
  • Preparing a budget proposal: the two most important parts of any research proposal are the methodology and the budget proposal; budgeting is discussed separately below
  • Preparing the specific proposal as outlined in the grant document. This should contain details about the study, including a brief review of the literature, why you want to conduct the study, its implications, the budget requirement, and the timeline
  • A timeline or Gantt chart should always accompany the research proposal. This gives an idea of the major milestones of the project and how it will be executed
  • The researcher should also be ready to revise the grant proposal. After going through the initial proposal, committee members may suggest changes in the methodology and budgetary outlay
  • The committee that scrutinizes grant proposals may be composed of varied specialities. Hence, the proposal should be written in language that even a layperson can understand. It is also a good idea to get the proposal peer reviewed before submission.

Budgeting for the Research Grant

Budgeting is as important as the methodology in a grant proposal. The first step is to find out the monetary limit of the grant and the funding requirements of your project. If these do not match, even a good project may be rejected on budgetary grounds. The budget should be prepared with prudence, and only the amount necessary for the conduct of the research should be requested. The administrative cost of conducting the research project should also be included in the proposal; it varies depending on the type of project.

Research funds can generally be used for the following requirements, though they are not limited to these; it is helpful to know the subheads under which budgetary planning is done. Funds are generally allotted in a graded manner as per the projected requirement, and to the institution rather than to the researcher.

  • Purchase of equipment that is not available in the institution (some funding bodies do not allow equipment to be procured out of research funds). Equipment procured out of a research fund is owned by the institute/department
  • Consumables required for the conduct of research (for example, medicines for investigator-initiated trials and laboratory consumables)
  • Hiring of trained personnel (research assistant, data entry operator) for the smooth conduct of research. Remuneration details for trained personnel can be obtained from the Indian Council of Medical Research (ICMR) website and used while planning the budget
  • Stationery: for the printing of forms and similar expenses
  • Travel expenses: if the researcher has to travel to present findings or for some other reason necessary for the conduct of the research, a travel grant can be part of the research grant
  • Publication expenses: some research bodies provide publication expenses, which can help the author make the findings open access and give the research wider visibility
  • Contingency: miscellaneous expenditure during the conduct of research can be included under this head
  • Miscellaneous expenses: these may include expenses toward auditing the fund account and other essential expenses.

Once the research funding is granted, the allotted fund has to be spent as planned under the budget. Transparency, integrity, fairness, and competition are the cornerstones of public procurement and should be kept in mind while spending grant money. The hiring of trained staff on contract is based on similar principles, and details of procurement and hiring can be read on the ICMR website.[ 4 ] During the conduct of the study, many grant guidelines mandate quarterly or half-yearly progress reports covering both expenditure against the budgetary layout and the scientific progress of the project. These reports should be prepared and sent on time.

Completion of a Research Project

Once the research project is completed, a completion report has to be sent to the funding agency. Most funding agencies also require periodic progress reports, and the project should ideally progress as per the Gantt chart. The completion report has two parts. The first part is a scientific report, which is like writing a research paper and should include all the usual subheads (review of literature, materials and methods, results, and conclusions including the implications of the research). The second part is an expense report describing how the money was spent, whether it followed the budgetary layout or deviated from it, and the reasons for any deviation. Any unutilized funds have to be returned to the funding agency. Ideally, the allotted funds should be audited by a professional (chartered accountant), and the audit report along with the original bills of expenditure should be preserved for future use in case of any discrepancy. This is an essential part of any funded project and protects the researcher from accusations of impropriety.

Sharing scientific findings, and thus contributing to scientific advancement, is the ultimate goal of any research project. Publication of the findings is part of any research grant, and many funding agencies place certain requirements on publications and presentations arising from projects completed with their funds. For example, Indian Association of Dermatologists, Venereologists and Leprologists (IADVL) research projects have to be presented at a national conference on completion, and the same is true for most funding agencies. It is imperative that the researcher mentions the source of funding during presentation and publication.

Research funding is an essential part of conducting research. Securing a research grant is a matter of prestige for a researcher and also helps in career advancement.

Financial Support and Sponsorship

Conflicts of Interest

There are no conflicts of interest.

Basics of scientific and technical writing: Grant proposals

  • Career Central
  • Published: 23 April 2021
  • Volume 46, pages 455–457 (2021)

  • Morteza Monavarian


Grant proposals

A grant proposal is a formal document you submit to a funding agency or an investing organization to persuade them to provide the requested support by showing that (1) you have a plan to advance a certain valuable cause and (2) the team is fully capable of reaching the proposed goals. The document may contain a description of the ideas and preliminary results relative to the state of the art, the goals, and the research and budget plans. This article provides an overview of the steps toward preparing grant proposal applications, with a particular focus on proposals for research activities in academia, industry, and research institutes.

Different types of proposals

There are different types of grant proposals depending on the objectives, activity period, and funding source: (1) research proposals, (2) equipment proposals, and (3) industry-related proposals. Research proposals seek funding to support research activities for a certain period of time, while equipment proposals seek funding to purchase a specific piece of equipment. For an equipment proposal to be granted, you need to explain carefully how the purchase could help advance research activities in different directions. Unlike research proposals, which are focused on a specific direction within a certain field of research, equipment proposals can span different directions within different areas of research, as long as the proposed equipment can be used in those areas.

There are also industry-related funding opportunities. For example, the National Science Foundation (NSF) has programs within its Division of Industrial Innovation and Partnerships through which small businesses and industry can access research funding. Examples of such programs include the Small Business Innovation Research and Small Business Technology Transfer programs. These opportunities are separate from funding that comes directly from companies sponsoring your research.

Steps to submit a proposal

Figure  1 shows an overview of a standard process flow for a grant proposal application, from identifying the needs and focus to acceptance and starting the project. As shown, the process of writing grants is not linear, but rather a loop, indicating the need for consistent modifications and development of your ideas, depending on the input you receive from the funding agencies or the results you obtained from previously funded projects.

Figure 1. Diagram of grant proposal preparation.

Before starting, you need to define the ultimate purpose of the research you want to pursue and to convince others that the work is indeed worth pursuing. Think about your proposed research in the context of problems to solve, potential hypotheses, and research design. To start shaping the idea you are pursuing, ask yourself: (1) What knowledge do I gain from finishing this project? (2) What is the significance of the end goal of the project? (3) How would the completion of this project be useful in a broader sense? Having convincing answers to these questions would be extremely helpful in developing a good grant proposal.

After identifying the needs and focus and initially developing the ideas and plans, the next step is to identify a funding agency to which you would like to submit the grant proposal. It is good practice to keep track of programs and corresponding funding opportunity announcements for the different funding agencies relevant to your field of research. Once you have chosen a funding agency and found the deadline for submission, review the submission guidelines for the program carefully. The grant proposal document should be perfectly aligned with the structure and content proposed in the guidelines provided by the agency program to avoid any premature rejection of your application. Some programs only require a few documents, while others require many more. Some agencies may require a concept paper: a short version of the proposal submitted before you are eligible for a full proposal submission.

After securing the agency/program and reviewing the guidelines, the next step is to write the full proposal document, according to the guidelines proposed by the funding agency. Before submission, review your documents multiple times to ensure the sections are well written and are consistent with one another and that they perfectly convey your messages. Some institutes have experts in reviewing proposal documents for potential linguistic and/or technical edits. Submit at least a day before the deadline to ensure that all documents safely go through. Some agencies have strict deadlines, which you do not want to miss, or you may have to wait upwards of a year to submit again. The agency then usually sends your documents to a few expert reviewers for their comments. The review may be graded or have written comments that require attention and response. A response letter has to be prepared and submitted (according to the agency guidelines) by a new deadline imposed by the agency for consideration by the program manager.

After reviewing the full response and revised documents, the agency will contact you with notification of their decision. If your proposal is accepted, the agency will provide details regarding funding and a start date. During the term of the project, agencies normally require a periodic (quarterly or annually) report in either a written or oral form. Different agencies may have rules for any publications or patents that could potentially result during the project term, when the work is complete or the idea is developed as a result of the awarded grant. As shown in the figure, even if the proposal is rejected, upon careful review, revision, and further development or adjustment of the proposal, you may try for another funding opportunity. After finishing a recently funded project, you can further develop an idea and submit another proposal for funding.

Structure of proposals (NSF example)

The structure of proposals differs with funding agencies. Included is an overview of an NSF proposal as a guide.

In addition to the technical volume (narrative) document, containing all the major descriptions of the project, other necessary documents include bio sketches, budget, justifications, management plan, and project summary. Bio sketches contain resumes of all the principal investigators (PIs), including any prior experience, relevant publications, and outreach activities. Budget and justifications are two separate documents relevant to a breakdown of the required budgets for the project, including salaries for the PIs and the team, travel, publication costs, equipment costs, materials and supplies, and any other relevant expenses. The budget document could be an Excel spreadsheet, indicating the exact dollar amounts, while the justification indicates the rationale for each charge. Depending on the agency and program, some expenses are allowed to be included in the budget list (carefully read related guidelines). Other potential requirements for submission may include a description of the project summary, management plans, and the facilities in which the work will be performed.

The technical volume is likely the one you will spend the most time preparing. It consists of several sections. Included is an example of a structure (read the Proposal and Award Policies and Procedures Guide on the NSF website for details). The total technical volume should not exceed 15 pages, excluding the reference section, which will be submitted as a separate document. While there are different review criteria for an NSF proposal, the main two are intellectual merit (encompasses the potential to advance knowledge) and broader impacts (potential to benefit society). Your proposal should reflect that the work will be rich in these two criteria. NSF reviewers typically provide qualitative grades (ranging from poor to excellent) to the proposal and feedback in their review.

Introduction and overview

The first section of the technical volume may start with an introduction/motivation and overview of the proposed work. This section should be no longer than a page, but should give an overview of the background and state of the art in the research area, motivations, objectives of the proposed work (maybe in the context of intellectual merit and broader impacts), and a brief description of the work breakdown (tasks). The last couple of paragraphs of the introduction could summarize the education and outreach plans, as well as the PIs’ experience and expertise. Feel free to highlight any major statements in this section to serve as main takeaways for the reviewers. Also, making an overview figure for this section may help summarize the information.

Background and relationship to the state of the art

The second section gives more details of background and relationship to the state of the art. This section may be a few pages long and contain figures and relevant citations.

Technical methods and preliminary results

This section should describe the technical methods and preliminary results relevant to the proposed research from your prior work. It should contain illustrative figures and plots to back up the proposed work.

Research plan

After discussing the prior art and the technical methods and preliminary results (in previous sections), you should discuss the proposed research and plan. A good standard is to divide your work into two to three thrusts, with each thrust containing two to three tasks. You can also prepare a timetable (also called a Gantt chart) to indicate when the tasks will be completed with respect to the project term, which is usually between three and five years.
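As a purely illustrative sketch (the thrust and task names below are hypothetical and not drawn from any specific proposal), a work breakdown for a three-year project might look like this:

Thrust 1: Material synthesis and optimization
  Task 1.1: Baseline synthesis and characterization (Year 1)
  Task 1.2: Growth and process optimization (Years 1-2)
Thrust 2: Device fabrication and testing
  Task 2.1: Prototype device fabrication (Year 2)
  Task 2.2: Performance characterization and modeling (Years 2-3)
Thrust 3: Integration of education and outreach activities (Years 1-3)

Each task would then appear as a bar in the Gantt chart spanning its planned start and end within the project term.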

Integration of education and research

The last section should describe any plans for integration of education and research, including any K-12 programs or planned outreach activities.

Results from prior support

Finally, describe the results from all of your prior NSF support. For each project, provide a paragraph describing the goal of the project, the outcomes, and any related publications. You can also write this section in the context of intellectual merit and broader impacts.

Things to remember when preparing grant proposals

Find the proper timing for any idea to explore. Sometimes the idea you think is worth pursuing is either too early or too late to explore, depending on the existing body of literature.

Begin early to avoid missing any deadlines. Give the process some time, as it could take a while.

Try to have sufficient preliminary results as seeds for the proposal.

Strike a balance between the number of ideas and the amount of preliminary results you put in the grant proposal. Too many ideas with too few results may make your proposal sound overly ambitious, while too few ideas with too many results may make the proposed work seem already complete, leaving no apparent need for funding.

Try to attend funding agency panels. It will help you understand the review process, grading criteria, and mindsets of program managers. Learn about proposals that are funded.

Locate any related funding agency announcements to know the deadlines in advance.

Be mindful of deadlines. Last day submissions may jeopardize your funding opportunities.

Learn what is customary. One figure per page is ideal for the proposed technical volume. A wordy proposal with not enough figures will be boring and more difficult for the reviewers to follow.

Do not give up! You may need to submit several proposals (to different programs/agencies) to get one awarded.

Be cautious about self-plagiarism! Do not copy and paste texts/figures from your previously supported proposal or papers in your new submissions.

Be ambitious but practical when developing ideas.

Develop a solid research program. It is not all about hunting grants; it is also about how you execute your funded projects. You may have periods (waves) of grant hunting followed by periods of delivering on the funded projects. Any successful prior research can help you gain more funding in the next wave.

Enjoy your research!

Author information

Authors and Affiliations

Materials Department and Solid State Lighting & Energy Electronics Center, University of California, Santa Barbara, USA

Morteza Monavarian

Additional information

This article is the third in a three-part series in MRS Bulletin that will focus on writing papers, patents, and proposals.

About this article

Monavarian, M. Basics of scientific and technical writing: Grant proposals. MRS Bulletin 46, 455–457 (2021). https://doi.org/10.1557/s43577-021-00105-4


  • Open access
  • Published: 21 September 2021

The value of research funding for knowledge creation and dissemination: A study of SNSF Research Grants

  • Rachel Heyard   ORCID: orcid.org/0000-0002-7531-4333 1 &
  • Hanna Hottenrott   ORCID: orcid.org/0000-0002-1584-8106 2 , 3  

Humanities and Social Sciences Communications, volume 8, Article number: 217 (2021)

  • Science, technology and society

This study investigates the effect of competitive project funding on researchers’ publication outputs. Using detailed information on applicants at the Swiss National Science Foundation and their proposal evaluations, we employ a case-control design that accounts for individual heterogeneity of researchers and selection into treatment (e.g. funding). We estimate the impact of the grant award on a set of output indicators measuring the creation of new research results (the number of peer-reviewed articles), its relevance (number of citations and relative citation ratios), as well as its accessibility and dissemination as measured by the publication of preprints and by altmetrics. The results show that the funding program facilitates the publication and dissemination of additional research amounting to about one additional article in each of the three years following the funding. The higher citation metrics and altmetrics by funded researchers suggest that impact goes beyond quantity and that funding fosters dissemination and quality.


Introduction

Scientific research generated at universities and research organizations plays an important role in knowledge-based societies (Fleming et al., 2019 ; Poege et al., 2019 ). The created knowledge drives scientific and technological progress and spills over to the broader economy and society (Hausman, 2021 ; Jaffe, 1989 ; Stephan, 2012 ). The growing importance of science-based industries puts additional emphasis on the question of how scientific knowledge is generated and whether public funding can accelerate knowledge creation and its diffusion. In an effort to promote scientific research, grant competitions as a means of allocating public research funding have become an important policy tool (Froumin and Lisyutkin, 2015 ; Oancea, 2016 ). The goal is to incentivize the generation of ideas and to allocate funding such that it is most likely to deliver scientific progress and eventually economic and social returns Footnote 1 . In light of these developments, it is important to understand whether research grants indeed facilitate additional, relevant research outputs and whether these are accessible to the public.

In particular, individual-level analyses are highly interesting since most grants are awarded to individual researchers or to small teams of researchers. The estimation of the effect that a grant has on research outputs is, however, challenging. The main difficulties are the availability of information on all applicants (not only winners) as well as of detailed information about the individual researchers (demographic information). Moreover, the non-randomness of the award of a grant, through the selection of the most able researchers into the funding program, results in the non-comparability of funded and non-funded researchers. The fact that researchers can receive multiple grants at the same time as well as several consecutive grants further challenges the estimation of effects from funding (Jaffe, 2002 ). Another difficulty stems from finding appropriate measures for research output (Oancea, 2016 ). Publications and citations are easy to count, but likely draw an incomplete picture of research impact, its dissemination and the extent to which funded research contributes to public debates. Moreover, both publication and citation patterns as well as funding requirements are highly field-dependent, which makes output analyses in mixed samples or inter-disciplinary programs difficult.

In this study, we aim to quantify the effect of the Swiss National Science Foundation’s (SNSF) Footnote 2 project funding (PF) grants on the individual researcher in terms of future scientific publications and their dissemination. Our analysis is based on detailed information on both grants and awardees, covering 20,476 research project grants submitted between 2005 and 2019. This study adds to previous work in several dimensions. By focusing on the population of applicants, which constitutes a more homogeneous set of researchers than a comparison of grant winners to non-applicants, and by accounting for individual characteristics of the applicants, our results are less prone to overlooking confounding factors that affect both the likelihood of winning a grant and research outputs. Information on the evaluation scores from the peer review of the grant proposals allows us to compare researchers with similarly rated proposals. In other words, by comparing winning applicants to non-winners and by taking into account the evaluation scores that their applications received, we can estimate the causal effect of the grant on output while recognizing that both research ideas and grant writing effort (and skill) are required for winning a grant. By studying a long time period and accounting for the timing of research grants and outcomes, we can further take into account that there are learning effects from the grant writing itself, even for unsuccessful applicants (Ayoubi et al., 2019 ). To benchmark our results against previous studies, we first investigate the impact of grants on publication outputs. In addition, we consider preprints, which have become an important mode of disseminating research results quickly but have so far received no attention in research on funding effects. Preprints do not undergo peer review (Berg et al., 2016 ; Serghiou and Ioannidis, 2018 ), but help researchers communicate their results to their community and secure priority of discovery.

This study goes beyond previous work that mainly considered citation-weighted publication counts, by measuring impact in a researcher’s field of study by relative citation ratios (RCR) and field citation ratios (FCR). These metrics account for field-specific citation patterns. Additionally, we explicitly explore researchers’ altmetric scores as a measure of attention, research visibility, and accessibility of research outcomes beyond academia. Altmetrics reflect media coverage, citations on Wikipedia and in public policy documents, on research blogs and in bookmarks of reference managers like Mendeley, as well as mentions on social networks such as Twitter. While altmetrics may reflect fashionable or provocative research, they may indicate accessible insights disseminated through the increasingly important online discussion of research and may therefore measure the general outreach of research (Warren et al., 2017 ). Although they are a potentially important measure of dissemination to the wider public and therefore of research impact in the age of digital communication (Bornmann, 2014 ; Konkiel, 2016 ; Lăzăroiu, 2017 ), the effect of funding on altmetrics has not been investigated so far.

Finally, by explicitly investigating outputs over several years after funding, our study contributes new insights on the persistency of effects. Since a large share of project funding typically goes into wages of doctoral and post-doctoral researchers which require training and learning on the job, there may be a considerable time lag between the start of the project and the publication of any research results and an underestimation of output effects when considering only immediate outcomes.

The results from our analysis based on different estimation methods show that grant-winning researchers publish about one additional peer-reviewed publication more per year in the 3 years following funding than comparable but unsuccessful applicants. Moreover, these publications are also influential as measured by the number of citations that they receive later on. SNSF PF seems to promote timely dissemination as indicated by the higher number of published preprints and researchers’ higher altmetrics scores. The funding impact is particularly high for young(er) researchers as well as for researchers at a very late career stage when funding keeps output levels high. These results add new insights to the international study of funding effects which provided partially ambiguous findings as our review in the next section illustrates. In summary, the results presented in the following stress the important role played by project funding for research outcomes and hence for scientific progress. Institutional funding alone does not appear to facilitate successful research to the same extent as targeted grants which complement institutional core funds.

The impact of funding on research outcomes

The impact of competitive research funding on knowledge generation (typically proxied by scientific publications) has been studied in different contexts and at multiple levels: the institutional level, the research group or laboratory, and the level of the individual researcher. At the level of the university, Adams and Griliches ( 1998 ) find a positive elasticity of scientific publications to university funding. Payne ( 2002 ) and Payne and Siow ( 2003 ), using congressional earmarks and appropriation committees as instruments for research funding, present similar results. They show that a $1 million increase in funding yields 10–16 additional scientific articles. Wahls ( 2018 ) analyses the impact of project grants from the National Institutes of Health (NIH) in the United States and finds positive institution-level returns (in terms of publications and citation) to funding which, however, diminish at higher levels of funding.

At the laboratory level, the results are rather inconclusive so far which is likely due to heterogeneity in unobserved lab characteristics and the variety of grants and resources that typically fund lab-level research. An analysis of an Italian biotechnology funding program by Arora et al. ( 1998 ) finds a positive average elasticity of research output to funding, but with a stronger impact on the highest quality research groups. These findings, however, seem to be specific to engineering and biotechnology. Carayol and Matt ( 2004 ) included a broader set of fields and did not find a strong link between competitive research funding and lab-level outputs.

At the level of the individual researcher, Arora and Gambardella ( 2005 ) find that research funding from the United States National Science Foundation (NSF) in the field of Economics has a positive effect on publication outcomes (in terms of publication success in highly ranked journals) for younger researchers. For more advanced principal investigators (PIs between 5 and 15 years since PhD), however, they do not find a significant effect of NSF funding when taking the project evaluation into account. Jacob and Lefgren ( 2011 ) study personal research funding from the NIH and find that grants resulted in about one additional publication over the next 5 years. These results are close to the estimated effect from public grants of about one additional publication in a fixed post-grant window in a sample of Engineering professors in Germany (Hottenrott and Thorwarth, 2011 ). Likewise, a study on Canadian researchers in nanotechnology (Beaudry and Allaoui, 2012 ) documents a significant positive relationship between public grants and the number of subsequently published articles.

More recent studies considered output effects both in terms of quantity and quality or impact. Evaluating the impact of funding by the Chilean National Science and Technology Research Fund on research outputs by the PIs, Benavente et al. ( 2012 ) find a positive impact in terms of a number of publications of about two additional publications, but no impact in terms of citations to these publications. In contrast to this, Tahmooresnejad and Beaudry ( 2019 ) show that there is also an influence of public grants (unlike private sector funding) on the number of citations for nanotechnology researchers in Canada. In addition, Hottenrott and Lawson ( 2017 ) find that grants from public research funders in the United Kingdom contribute to publication numbers (about one additional publication per year) as well as to research impact (measured by citations to these publications) even when grants from other private sector sources are accounted for. Results for a sample of Slovenian researchers analyzed by Mali et al. ( 2017 ), however, suggest that public grants result in ‘excellent publications’ Footnote 3 only if researchers’ funding comes mostly from one source.

Explicitly looking at research novelty Footnote 4 , Wang et al. ( 2018 ) find that projects funded by competitive funds in Japan have on average higher novelty than projects funded through institutional funding. However, this only holds for senior and male researchers. For junior female researchers, competitive project funding has a negative relation to novelty.

In a study on Switzerland-based researchers, Ayoubi et al. ( 2019 ) find, in a sample of 775 grant applications for special collaborative, multi-disciplinary and long-term projects, that participating in the funding competition does indeed foster collaborative research with co-applicants. For grant-winners, they observe a lower average number of citations received per paper compared to non-winners (not controlling for other sources of funding that the non-winners receive). The authors relate this finding to the complexity of such interdisciplinary projects, the cost of collaboration, and the fact that also applicants who do not eventually win this particular type of grant publish more as a result of learning from grant writing or through funding obtained from alternative sources.

By studying grants distributed via the main Swiss research funding agency, we are capturing the vast majority of competitive research grants in the country. The Swiss research funding system is characterized by a relatively strong centralization of research funding distribution with the SNSF accounting by far for the largest share of the external research funding of universities (Jonkers and Zacharewicz, 2016 ; Schmidt, 2008 ) Footnote 5 . To account for major sources outside of Switzerland such as from the European Research Council (ERC), we collected information on Swiss-based researchers who received such funding during our period of analysis.

Empirical model of funding and research outputs

All of the following is based on the assumption that academic researchers strive to make tangible contributions to their fields of research. The motivations for doing so can be diverse and heterogeneous, ranging from career incentives to peer recognition (Franzoni et al., 2011 ). We also assume that producing these outputs requires resources (personnel, materials, equipment) and hence researchers have incentives to apply for grants to fund their research. However, research output, that is the success of a researcher in producing results and the frequency with which this happens, also depends on researcher characteristics, characteristics of the research field and the home institution. Research success is also typically path-dependent, following a success-breeds-success pattern. Thus, we build on the assumption that a researcher who generates an idea for a research project files a grant application to obtain funding to pursue the project. If the application succeeds, the researcher will spend the grant money and may or may not produce research outputs. The uncertainty is inherent to the research process. The funding agency screens funding proposals and commissions expert reviews to assess the funding worthiness of the application. If the submitted proposal receives an evaluation that is sufficiently good in comparison to the other proposals, funding is granted in accordance with the available funding amount. This implies that even in case of a rejected grant proposal the researcher may pursue the project idea, but without these dedicated resources available. In many instances, funding decisions are made at the margin, with some winning projects being only marginally better than non-winning projects (Fang and Casadevall, 2016 ; Graves et al., 2011 ; Neufeld et al., 2013 ). If the funding itself has indeed an effect on research outcomes, we would expect that the funded researcher is more successful in generating outputs both in terms of quantity and quality.

In addition to resource-driven effects, there may also be direct dissemination incentives related to public project funding. On the one hand, funding agencies may encourage or even require the dissemination of any results from the funded project. On the other hand, the researchers may have incentives to publish research outcomes to signal project success to the funding agency and win reputation gains valuable for future proposal assessments.

While estimating the contribution of funding to research outputs measured by different indicators, we have to take into consideration that the estimation of the funding effect requires assumptions about output generation by researchers. The extent to which the output produced can be attributed to the funding itself also depends on the econometric model used (Silberzahn et al., 2018 ). We, therefore, take a quantitative multi-method approach taking up and adding to methods applied in previous related studies. Comparing the results from different estimation methods also allows an assessment of the sensitivity of our conclusions to specific modeling assumptions. In particular, we estimate longitudinal regression models which aim to account for unobserved heterogeneity between researchers. In addition, we use non-parametric matching methods to explicitly model the selectivity in the grant awarding process.

Mixed effects models

We define \(P_{it}\) as the research output of researcher \(i\) in year \(t\) and \(F_{i,t-1}\) as a binary variable indicating whether this same researcher \(i\) had access to SNSF funding in year \(t-1\). Note that this indicator takes the value one for the entire duration of the granted project. The funding information is lagged by one year as an immediate effect of funding on output is unlikely. Note that we will differentiate between funding as PI and as co-PI (only). The general empirical model can then be expressed as
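a generic form, reconstructed here from the variable definitions given below (the exact functional form in the published article may differ):

\[ P_{it} = f\!\left(F_{i,t-1},\, X_{it},\, T_t,\, v_i;\ \phi\right) + \epsilon_{it}, \]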

with \(\phi\) being the vector of parameters. \(X_{it}\) represents a vector of explanatory factors at \(t\), including observed characteristics of the researcher and the average quality of the grant applications as reflected in the average evaluation score. Further, \(T_t\) captures the overall time trend, \(v_i\) is the unobserved individual heterogeneity, and \(\epsilon_{it}\) is the error term.

The specification above describes a production function for discrete outcomes following Blundell et al. ( 1995 ). As a first estimation strategy, count data models will be used to estimate research outputs, as for example, the number of peer-reviewed articles or preprints. Moreover, these models account for unobserved individual characteristics, \(v_i\), which likely predict research outputs besides observable characteristics and are independent of project funding. One way to estimate this unobserved heterogeneity is to use random intercepts for the individuals Footnote 6 , here the researchers, and account for the hierarchical structure of the information (e.g. panel data). Thus, we estimate mixed count models to capture \(v_i\) Footnote 7 . The mixed regression models for count data take the following form:
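One plausible way to write such a mixed count model, shown here as a Poisson-type specification with a log link and researcher-specific random intercepts (an illustrative assumption; the authors' exact specification may differ), is

\[ E\!\left[P_{it} \mid F_{i,t-1}, X_{it}, v_i\right] = \exp\!\left(\beta_0 + \beta_1 F_{i,t-1} + X_{it}'\gamma + T_t + v_i\right), \qquad v_i \sim N(0, \sigma_v^2). \]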

In addition to count-type outputs, we estimate the effect of funding on continuous output variables such as the average number of yearly citations per article or the researcher’s average yearly altmetric score. For these output types we estimate linear regression models based on a comparable model specification with regard to \(F_{i,t-1}\), \(X_{it}\), \(T_t\) and \(v_i\).

Non-parametric treatment effect estimation

In an alternative estimation approach, we apply a non-parametric technique: The average treatment effect of project funding on scientific outcomes is estimated by an econometric matching estimator which addresses the question of “How much would a funded researcher have published (or how much attention in terms of altmetrics or citations would her research have received) if she had not received the grant?”. This implies comparing the actually observed outcomes to the counterfactual ones to derive an estimate for the funding effect. Given that the counterfactual situation is not observable, it has to be estimated.

For doing so, we employ nearest neighbor propensity score matching. That is, we pair each grant recipient with a non-recipient by choosing the nearest ‘twin’ based on the similarity in the estimated probability of receiving a grant and the average score that the submitted applications received. Note that we select the twin researcher from the sample of unsuccessful applicants, so that matching on both the general propensity to win (which includes personal and institutional characteristics) and the proposal’s evaluation score allows us to match on individual as well as proposal (or project idea) characteristics and find the most comparable individuals.

The estimated propensity to win a grant is obtained from a probit estimation on a binary treatment indicator which takes the value of one for each researcher-year combination in which an individual had received project funding. The advantage of propensity score matching compared to exact matching is that it allows combining a larger set of characteristics into a single indicator, avoiding the curse of dimensionality. Nevertheless, introducing exact matching for some key indicators can improve the balancing of the control variables after matching. In particular, we match exactly on the year of the funding round as this allows us to have the same post-treatment time window for treated and control individuals and also captures time trends in outputs which could affect the estimated treatment effect. In addition, we match only within a research field so as not to confound the treatment effect with heterogeneity in resource requirements and discipline differences in output patterns. We follow the matching protocol suggested by Gerfin and Lechner ( 2002 ) and calculate the Mahalanobis distance between a treatment and a control observation as
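the standard Mahalanobis form, reconstructed here with \(Z_i\) denoting the vector of matching arguments for observation \(i\):

\[ d(i, j) = \sqrt{\left(Z_i - Z_j\right)'\, \Omega^{-1}\, \left(Z_i - Z_j\right)}, \]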

where Ω is the empirical covariance matrix of the matching arguments (propensity score and evaluation score). We employ a caliper to avoid bad matches by imposing a threshold on the maximum distance allowed between the treated and the control group. That is, a match for researcher \(i\) is only chosen if \(|Z_j - Z_i| < \epsilon\), where \(\epsilon\) is a pre-specified tolerance. After having paired each researcher with the most similar non-treated one, any remaining differences in observed outcomes can be attributed to the funding effect. The resulting estimate of the treatment effect is unbiased under the conditional independence assumption (Rubin, 1977 ). In other words, in order to overcome the selection problem, participation and potential outcome have to be independent for individuals with the same set of characteristics \(X_{it}\) Footnote 8 . Note that by matching on the evaluation score in addition to the propensity score, our approach is similar to the idea of regression discontinuity design (RDD). The advantage of the selected approach is, however, that it allows us to draw causal conclusions for a more representative set of individuals. While RDD designs have the advantage of high internal consistency, this comes at the price of deriving effect estimates only for researchers around the cut-off (de la Cuesta and Imai, 2016 ). Yet, in our case, this threshold is not constant, but depends on the pool of submitted proposals, and there is considerable variation in the evaluation scores that winning proposals receive. In our application, we also expect heterogeneous impacts across researchers, so that a local effect might be very different from the effect for researchers away from the threshold for selection (Battistin and Rettore, 2008 ).
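To make the matching protocol concrete, the following minimal Python sketch implements the procedure described above: one-nearest-neighbour matching with a caliper, exact within funding-round year and research field, using the Mahalanobis distance on the propensity and evaluation scores, followed by the mean outcome difference over matched pairs. The column names (year, field, funded, pscore, eval_score, n_articles), the synthetic example data, and matching with replacement are illustrative assumptions, not the authors' actual data layout or protocol.

import numpy as np
import pandas as pd

def match_treated_to_controls(df, caliper=0.25):
    # Pair each funded ("treated") researcher-year with its nearest unsuccessful
    # applicant, matching exactly on funding-round year and research field, and
    # using the Mahalanobis distance on (propensity score, evaluation score).
    pairs = []
    for _, cell in df.groupby(["year", "field"]):
        treated = cell[cell["funded"] == 1]
        controls = cell[cell["funded"] == 0]
        if treated.empty or controls.empty:
            continue
        z_t = treated[["pscore", "eval_score"]].to_numpy(float)
        z_c = controls[["pscore", "eval_score"]].to_numpy(float)
        # Omega: empirical covariance matrix of the matching arguments in this cell
        omega_inv = np.linalg.pinv(np.cov(np.vstack([z_t, z_c]).T))
        for i, z in enumerate(z_t):
            diff = z_c - z
            dist = np.sqrt(np.maximum(np.einsum("ij,jk,ik->i", diff, omega_inv, diff), 0.0))
            j = int(np.argmin(dist))
            if dist[j] < caliper:  # caliper: discard matches that are too distant
                pairs.append((treated.index[i], controls.index[j]))
    return pairs

def att(df, pairs, outcome="n_articles"):
    # Average treatment effect on the treated: mean outcome difference
    # between each funded researcher and her matched control.
    diffs = [df.loc[t, outcome] - df.loc[c, outcome] for t, c in pairs]
    return float(np.mean(diffs)) if diffs else float("nan")

if __name__ == "__main__":
    # Synthetic example data, purely for demonstration.
    rng = np.random.default_rng(0)
    n = 300
    df = pd.DataFrame({
        "year": rng.integers(2010, 2014, n),
        "field": rng.choice(["SSH", "STEM", "LS"], n),
        "funded": rng.integers(0, 2, n),
        "pscore": rng.uniform(0, 1, n),
        "eval_score": rng.uniform(1, 6, n),
        "n_articles": rng.poisson(4, n),
    })
    pairs = match_treated_to_controls(df)
    print(f"{len(pairs)} matched pairs, estimated ATT = {att(df, pairs):.2f}")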

Using the matched comparison group, the average effect on the treated can thus be calculated as the mean difference of the matched samples:
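Written out (a reconstruction consistent with the notation defined in the following sentence), the estimator is

\[ \widehat{\mathrm{ATT}} = \frac{1}{n_T}\sum_{i=1}^{n_T}\left(P_i^{T} - P_j^{C}\right), \]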

with \(P_i^{T}\) being the outcome variable in the treated group, \(P_j^{C}\) being the counterfactual for \(i\), and \(n_T\) the sample size (of treated researchers). Footnote 9

Data and descriptive analysis

Data provided by the SNSF were used to retrieve the set of researchers of interest. These researchers have applied to the SNSF funding instruments project funding (PF) or Sinergia Footnote 10 as main applicant (i.e. PI) or co-applicant Footnote 11 (i.e. co-PI). The PF scheme is a bottom-up approach as it funds the costs of research projects on a topic of the applicant’s own choice.

The study period is dynamic and researcher-specific: it starts with the year in which the SNSF observes the researcher for the first time as (co-)PI to PF or as a career funding grantholder (after the postdoctoral level); the year the independent research career starts. However, this study period has its lower bound in 2005. The period ends in 2019 for everyone, and some researchers are observed for a longer period than others. For each researcher, a pre-sample period is defined, including the 5 years before the observation started. Pre-sample information on all outcome variables of interest is needed to account for heterogeneity between the individuals in the way that they enter the study in linear feedback models and for matching on ex-ante performance in the non-parametric estimation approach. Further, only researchers who applied at least once after 2010 to the SNSF are included to ensure a minimum research activity. In a next step, we retrieve a unique Dimensions-identifier (Dim-ID) from the Dimensions database (Digital Science, 2018 ) using a person’s name, research field, age and information about past and current affiliations Footnote 12 . The Dim-ID enables us to collect disambiguated publication information for these researchers to be used in the empirical analysis.

Variables and descriptive statistics

The original data set comprised 11,228 eligible researchers. Of these, 10% (1,143) could not be identified in the Dimensions database. Among the researchers found using their name, the supplementary information from the SNSF database (country, ORCID, institution, etc.) did not match in 1% of the cases, so we could not be sure we had found the correct researcher. For 12% of the researchers found in Dimensions no unique ID could be retrieved. After removing these observations, we observe a total of 8,793 distinct researchers (78% of the eligible researchers Footnote 13 ), and the final data set is composed of 82,249 researcher-year observations. On average, researchers are observed for 9.35 years. The maximum observation length, from 2005 to 2019, is 15 years, and 2,319 researchers are observed over this maximal study period. All the publication data was retrieved in September 2020.

Research funding

The central interest of the study is the effect competitive project funding has on a researcher’s subsequent research outputs. The information on SNSF funding indicates whether a researcher had access to SNSF funding as a PI and/or co-PI in a certain year. We differentiate between PIs and co-PIs to test whether the funding effect differs depending on the role in the project. On average the researchers in our data set are funded by the SNSF for 4.6 years during the observation period, 3.3 of which as PI of a project (see Table 1 ). In total 20,476 distinct project applications (not necessarily funded) are included in the data. On average a PI is involved in a total of 3.7 project applications (as PI or co-PI); in 3.1 submissions as PI and in 2.3 submissions as co-PI. About 66% of all projects in the data have one sole PI applying for funding, 22% have a PI and a co-PI, 8% a PI and two co-PIs, and 4% are submitted by a PI together with three or more co-PIs. Note that the percentage of successful applications in our data set is 48% over the whole study period (the success rate is ~60% for STEM applications, ~44% in SSH, and lowest in the LS at ~40%).

These numbers reflect that, in the Swiss research funding system, project funding does play an important role, but institutional core funding is also relatively generous. The latter accounts for, on average, more than 70% of overall university funding (Reale, 2017 ; Schmidt, 2008 ). This allows researchers to remain in the system without project funding. While institutional funding is overall quite homogeneous across similar research organizations in the country, it differs between institution types. It is therefore important to account for institutional funding in the following analyses as it provides important complementary resources to researchers (Jonkers and Zacharewicz, 2016 ). Moreover, within the different institution types, we also account for the research field and the career stage of researchers as this may also capture individual differences in core budgets. We present sample characteristics in terms of these variables in the subsection “Confounding variables”. Another important aspect to consider when analyzing the effect of research funding is funding from sources other than institutional funding (Hottenrott and Lawson, 2017 ). In all European countries the ERC plays an important role. Hence, we collected data on Swiss-based researchers who received ERC funding and matched them to our sample. Of all the researchers considered in this study, only a small fraction (4.2%) ever received funding from the ERC. Most of these researchers had a PF grant running at the same time (87%). Fig. S3 in the supplementary material shows the count of observations in the different funding groups in more detail Footnote 14 .

Research outputs

Table 1 summarizes the output measures as well as the funding length. The most straightforward research output measure is the number of (peer-reviewed) articles. On average, a researcher in our data publishes 4.9 articles each year. The annual number of articles is higher in STEM (5.7) and the life sciences (LS) (6.5) than in the social sciences and humanities (SSH), where researchers published about 1.5 publications per year on average. See Table S. 1 in the supplementary material for differences in all output variables (as well as funding and researcher information) by field. In some disciplines, such as biomedical research, physics, or economics, preprints of articles are widely used and accepted (Berg et al., 2016; Serghiou and Ioannidis, 2018). As preliminary outputs they are made available early and are thus an interesting additional output, potentially indicating the dissemination and accessibility of research results. The average yearly number of preprints (0.4) is much lower than that of articles, because preprints are a research output that emerged only recently and are more common in STEM fields than in others (see Table S. 1 in the supplementary material). Another output measure is the number of yearly citations per researcher. This is the sum of all citations in a specific year to all of a researcher’s peer-reviewed articles published since the start of the observation period. Citations to articles published before the start of the observation period are not taken into account. On average, a researcher’s work in the study period is cited 132.9 times per year. This variable is, however, substantially skewed, with 6.8% of researchers accounting for 50% of all citations, and it is highly correlated with the overall number of articles that a researcher published. There are also field differences: the average citation numbers in the life sciences (185.2) and the STEM fields (157.7) are both substantially higher than in the SSH (25.6). The average number of citations per (peer-reviewed) article of a researcher is informative about the average relevance of a researcher’s article portfolio. The articles in our sample are cited on average 4.2 times per year.

The altmetric score of each article is retrieved as an attention or accessibility measure of published research. Following the recommendation by Konkiel (2016), we employ a ‘basket of metrics’ rather than single components of the altmetric score. This score is a product of Digital Science and represents a weighted count of the amount of attention that is picked up for a certain research output (footnote 15). Note that the average altmetric score for a researcher at t is the mean of the altmetric scores of all articles published in year t (footnote 16). On average, a researcher in our sample achieves an altmetric score of 13. Similar to citation counts, this variable is heavily skewed. The differences in altmetric scores across disciplines are rather small (see Table S. 1 in the supplementary material).

When using simple output metrics like citation counts, it is important to account for field-specific citation patterns. In order to do so, we collect the relative citation ratio (RCR) and the field citation ratio (FCR). The RCR was developed by the NIH (Hutchins et al., 2016). As described by Surkis and Spore (2018), the RCR evaluates an article’s citation counts normalized to the citations received by NIH-funded publications in the same area of research and year. Calculating the RCR involves dynamically determining the field of an article based on its co-citation network, that is, all articles that have been cited by articles citing the target article. The advantage of the RCR is that it field- and time-normalizes the number of citations that an article received. A paper that is cited exactly as often as one would expect based on the NIH norm receives an RCR of 1, and an RCR larger than one indicates that an article is cited more than expected given its field and year. The RCR is only calculated for articles that are present on PubMed, have at least one citation, and are older than two years. Thus, when analyzing this output metric, we focus on researchers in the life sciences only. The FCR is calculated by dividing the number of citations a paper has received by the average number received by publications published in the same year and in the same fields of research (FoR) category. Obviously, the FCR depends strongly on the definition of the FoR. Dimensions uses FoR categories that are closest to the Australian and New Zealand Standard Research Classification (ANZSRC, 2019). For the calculation of the FCR a paper has to be older than two years. Similar to the RCR, the FCR is normalized to one, and an article with zero citations has an FCR of zero. Like the altmetric score, the RCR and FCR cannot be retrieved time-dependently but are snapshots at the day of retrieval. We will refer to the average FCR/RCR at t as the average of the FCRs/RCRs of the papers published in t. According to Hutchins et al. (2016), articles in high-profile journals have average RCRs of ~3. The key difference between the RCR and the FCR is that the FCR uses a fixed definition of the research field, while for the RCR the field is defined relative to each publication considered. Table S. 1 in the supplementary material shows that the average ratios are comparable across fields.
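
To make the field-normalization concrete, the following minimal sketch computes an FCR-style ratio from a toy table of articles: each article's citations are divided by the average citations of articles published in the same field and year. The table and column names are invented for illustration; this is not the Dimensions API or the exact FCR implementation.

```python
# FCR-style normalization on a toy article table (illustrative only).
import pandas as pd

articles = pd.DataFrame({
    "article_id": [1, 2, 3, 4, 5, 6],
    "field":      ["LS", "LS", "LS", "STEM", "STEM", "SSH"],
    "year":       [2015, 2015, 2015, 2015, 2015, 2015],
    "citations":  [12, 3, 0, 8, 4, 2],
})

# Average citations of all articles published in the same field and year.
field_year_mean = articles.groupby(["field", "year"])["citations"].transform("mean")

# An article cited exactly as often as its field-year average gets a ratio of 1;
# an uncited article gets 0.
articles["fcr"] = articles["citations"] / field_year_mean
print(articles)
```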

Figure 1 shows the evolution of the yearly average number of articles, preprints and the altmetric score per researcher depending on the funding status of the year before (as co-PI and/or PI). The number of articles published each year has been rather constant or only slightly increasing, while the preprint count increased substantially over the past years. Recent papers also have higher altmetric scores than older publications, even though they had less time to attract attention. It is important to note, however, that since we do not account for any researcher characteristics here, the differences between funded and unfunded researchers cannot be interpreted as the result of funding. Yet, the increasing prevalence of preprints and altmetrics suggests that they should be taken into account in funding evaluations.

Figure 1

Shown are the averages of the publication and preprint counts and the altmetric scores for each year of observation. PI stands for principal investigator.

Confounding variables

Table 1 further shows descriptive statistics for the gender of the researchers, their biological age, as well as their field of research and the institution type. These variables capture drivers of researcher outputs and are therefore taken into account in all our analyses. Almost 77% of the researchers are male; about 60% are employed at cantonal universities, 24% at technical universities (ETH Domain), and about 17% at Universities of Applied Sciences (UAS) and Universities of Teacher Education (UTE). The research field and institution type are defined as the area or the type the researcher most often applies to or from. The field of life sciences has the largest proposal share in the data with about 39%. These variables serve as confounders together with the pre-sample information on the outcome variables, since they may explain differences in output and therefore need to be accounted for. Note that 1,615 researchers in our data did not publish any peer-reviewed papers in the five-year pre-sample period. Table S. 1 in the supplementary material shows how the confounding variables vary between the research fields.

The submitted project proposals are graded on a six-point scale: 1 = D, 2 = C, 3 = BC, 4 = B, 5 = AB, 6 = A. We use the information on project evaluation to control for (or match on) average project quality, following the approach of Arora and Gambardella (2005). We construct the evaluation score as a rolling average over the last four years of all the grades a researcher ‘collected’ in submitted proposals as PI and co-PI (if no grade is available over the last four years for a certain researcher, we use her all-time average). We do so because future research is also impacted by the quality of past and co-occurring projects. The funding decision is, however, not exclusively based on those grades. It also has to take the amount of funding available to the specific call into account. Therefore, the ranking of an application among the other competing applications plays an important role, and even highly rated projects may be rejected if the budget constraint is reached. Projects graded A/AB have good chances of being funded, while projects graded D are never funded; see Fig. S. 2 in the supplementary material for the distribution of the grades among rejected and accepted projects.
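
A minimal sketch of such a rolling evaluation score is given below, assuming a hypothetical table of proposal grades per researcher and year (column names are invented; this is not the SNSF data pipeline): grades from the last four years are averaged, with a fall-back to the researcher's all-time average when no recent grade exists.

```python
# Rolling four-year average of proposal grades with an all-time fall-back
# (illustrative sketch on invented data).
import pandas as pd

grades = pd.DataFrame({
    "researcher_id": [7, 7, 7, 7, 9, 9],
    "year":          [2010, 2012, 2016, 2018, 2011, 2017],
    "grade":         [4, 5, 3, 6, 2, 5],   # 1 = D ... 6 = A
})

def evaluation_score(researcher_grades, current_year, window=4):
    """Mean grade over the last `window` years; all-time mean if none exist."""
    recent = researcher_grades[
        (researcher_grades["year"] > current_year - window)
        & (researcher_grades["year"] <= current_year)
    ]
    return recent["grade"].mean() if not recent.empty else researcher_grades["grade"].mean()

# Example: the score entering the 2018 observation of each researcher.
for rid, sub in grades.groupby("researcher_id"):
    print(rid, evaluation_score(sub, current_year=2018))
```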

Note that researchers with missing age were deleted, since age is an important control variable; observations with missing institution type were regrouped as unclassified. Additionally, for the analyses, the funding information is used with a one (or more) year lag, so at least one year of observation is lost per researcher. The final sample used for the analyses consists of 72,738 complete observations from 8,282 unique researchers.

Mixed effects model—longitudinal regression models

Table 2 summarizes the results of both negative binomial mixed models for the count outcomes (yearly numbers of publications and preprints). The incidence rate ratios (IRR) inform us about the multiplicative change of the baseline count depending on funding status. The model for the publication count was fitted on the whole data set, while the model for the preprint count was fitted on data from 2010 onwards, because the number of preprints was generally small before then. SNSF funding seems to have a significant positive effect on research productivity, both for yearly publication counts (1.21 times higher for PIs than without SNSF funding) and for yearly preprint counts (1.30 times higher for PIs compared to researchers without SNSF funding) (footnote 17). An ‘average’ researcher without SNSF funding in t−1 publishes on average 4.64 articles in t. A similar researcher (with all confounding variables kept constant) with SNSF funding as PI in t−1 would publish 5.6 articles in t. PIs on an SNSF project thus publish more. The same is true for male researchers and, for preprints, younger researchers. Researchers from the ETH Domain publish more than those from cantonal universities, and researchers publish more in recent years. Researchers in the LS publish more peer-reviewed articles than those in other research areas. For preprints, we observe a different picture: here, STEM researchers publish more than researchers in the LS.
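
To illustrate how incidence rate ratios arise, the sketch below fits a pooled negative binomial regression of yearly article counts on funding status and exponentiates the coefficients. This is a deliberate simplification: the paper estimates mixed models with researcher random effects in R (lme4), whereas this toy version omits the random intercepts and runs on simulated data with invented column names.

```python
# Pooled negative binomial count model and incidence rate ratios (toy data).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
panel = pd.DataFrame({
    "articles":  rng.poisson(5, n),          # yearly article count
    "funded_pi": rng.integers(0, 2, n),      # lagged funding status as PI
    "age":       rng.integers(30, 70, n),
    "male":      rng.integers(0, 2, n),
})

model = smf.glm("articles ~ funded_pi + age + male",
                data=panel, family=sm.families.NegativeBinomial())
res = model.fit()

# Exponentiated coefficients = incidence rate ratios (multiplicative change
# in the expected yearly count).
print(np.exp(res.params))
```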

Table 3 summarizes the results of the four linear mixed models for the continuous outcomes: the average yearly number of citations per publication, the yearly average altmetric score, the yearly average RCR, and the yearly average FCR. Regarding citation patterns, there is strong evidence that SNSF funding has a positive effect; especially PIs on SNSF projects have their articles cited more frequently (an increase in average yearly citations of 0.33 per article for PIs). Articles by LS researchers are cited the most compared to those from other fields; the same holds for researchers from the ETH Domain and older researchers. For altmetric scores and citation ratios, we employ a logarithmic scale to account for the fact that their distributions are highly skewed; we can then interpret the coefficients as percentage changes. Regarding altmetrics, research funded by the SNSF receives an attention score that is 5.1% higher (by September 2020) compared to other researchers. Researchers in the LS have by far the highest altmetric scores, followed by researchers in the SSH. There is no strong evidence for an effect of the funding on the average yearly RCR. This implies that, in the short run, research outcomes of SNSF-funded researchers are cited as often as a mixed average of articles funded by the NIH or other important research funders worldwide, but also not significantly more than that. Younger researchers and researchers from the ETH Domain have higher RCRs. The results also suggest a positive relation between SNSF funding and a researcher’s FCR.
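
The percentage-change reading of the log-scale models can be illustrated with a linear mixed model containing researcher random intercepts. The sketch below uses simulated data, a single regressor and invented column names; the models in Table 3 include many more confounders.

```python
# Linear mixed model with researcher random intercepts on a log outcome
# (illustrative sketch on simulated data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_res, n_years = 100, 8
obs = pd.DataFrame({
    "researcher_id": np.repeat(np.arange(n_res), n_years),
    "funded_pi":     rng.integers(0, 2, n_res * n_years),
    "altmetric":     rng.gamma(2.0, 6.0, n_res * n_years) + 0.1,
})
obs["log_altmetric"] = np.log(obs["altmetric"])

mixed = smf.mixedlm("log_altmetric ~ funded_pi", obs,
                    groups=obs["researcher_id"]).fit()
print(mixed.summary())
# On the log scale, a coefficient of 0.05 on funded_pi corresponds to roughly
# 5% higher altmetric scores for funded researcher-years.
```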

Non-parametric estimation

While the previous estimation approaches modeled unobserved heterogeneity across individuals, the non-parametric matching approach addresses the selection into the treatment explicitly. It accounts for selection on observable factors which, if not accounted for, may lead to wrongly attributing to funding what is in fact due to the selectivity of the grant-awarding process. We model a researcher’s funding success as a function of researcher characteristics. In particular, this includes their previous research track record (publication experience and citations) and the average of all evaluation scores for submitted proposals (as PI or co-PI) received by the researcher. In addition, we include age, gender, research field, and institution type. We obtain the propensity score to be used in the matching process as described in the section “Non-parametric treatment estimation”.

The results from the probit estimation of the funding outcome (success vs. rejection) are presented in Table 4. The table first shows the model for the full sample, which provides the propensity score for the estimation of treatment effects on articles, on citations to these articles, and on preprints. The second model is for the sub-sample of researchers in the LS used for estimating treatment effects on the RCR. The third model shows the estimation for the full sample, but accounting for the pre-sample FCR, and provides the propensity score for the estimation of the treatment effect on the FCR. The fourth model controls for pre-sample altmetric values and serves the estimation of the treatment effect on future altmetric scores. Consistently across all specifications, the results show that the evaluation score is a key predictor of grant success: the higher the score, the more likely it is that a proposal gets approved. The grant likelihood is higher for male researchers than for female researchers, and higher for older researchers. The latter result can have various reasons, which are outside the scope of this paper and are discussed elsewhere (footnote 18). As expected, past research performance is another strong predictor of grant success, where peer-reviewed articles matter more than preprints. In addition to quantity, past research quality (as measured by citations) increases the probability of a proposal being granted. Interestingly, in more recent years (as shown in model 4), quality rather than quantity appears to predict grant success, as it is the average number of citations to pre-period publications rather than their number that explains funding success.
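
A minimal sketch of this first stage is a probit model of funding success on the evaluation score and a few researcher characteristics, whose predicted probability then serves as the propensity score. Covariates, column names and data below are simulated stand-ins, not the specification of Table 4.

```python
# Probit model of funding success and predicted propensity scores (toy data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 1000
apps = pd.DataFrame({
    "eval_score": rng.uniform(1, 6, n),      # average proposal grade
    "pre_pubs":   rng.poisson(20, n),        # pre-sample publication count
    "age":        rng.integers(30, 70, n),
    "male":       rng.integers(0, 2, n),
})
# In this toy example, higher evaluation scores make funding more likely.
apps["funded"] = (apps["eval_score"] + rng.normal(0, 1, n) > 4).astype(int)

probit = smf.probit("funded ~ eval_score + pre_pubs + age + male", apps).fit()
apps["pscore"] = probit.predict(apps)        # propensity score
print(probit.params)
```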

The comparison of the distributions of the propensity score and the evaluation score before and after matching shows that the nearest neighbor matching procedure was successful in balancing the sample in terms of the grant likelihood and, importantly, also the average scores (see Fig. S. 1 in the supplementary material). This ensures that we are comparing researchers with funding to researchers without funding who have similarly good ideas (the scores are, on average, the same) and are also otherwise comparable in their characteristics predicting a positive application outcome. The balancing of the propensity scores and the evaluation scores in both groups (grant winners and unsuccessful applicants) after each matching is shown in Tables 5 and 6. Note that we draw matches for each grant winner from the control group with replacement, so some observations from researchers in the control group are used several times as ‘twins’. Table S. 5 in the supplementary material shows that across the different matched samples <10% of control researcher-year observations are used only once and about 60% up to 25 times. About 10% of control group researchers are used very frequently, i.e. more than 160 times.

Tables 5 and 6 show the estimated treatment effects after matching, i.e. the test for the magnitude and significance of mean differences across groups. Note that the number of matched pairs differs depending on the sample used and that log values of the output variables were used to account for the skewness of the raw variable distributions in the mean comparison test. The magnitude of the estimated effects is comparable to that of the parametric estimation models. Researchers with a successful grant publish on average 1.2 articles (exp[0.188]) and about one additional preprint (exp[0.053]) more in the following year, and their articles receive 1.7 citations (exp[0.532]) more than articles from the control group. In terms of altmetrics, we also see a significant difference in means, which is 1.15 (exp[0.138]) points higher in the group of grant receivers. Also in terms of the FCR and the RCR there are significant effects on the treatment group. The probability of being among the ‘highly cited researchers’ (as measured by an FCR > 3) is 5.5 percentage points (α TT = 0.055) higher in the group of funded researchers; this means their publications in t + 1 are cited at least three times as much as the average in the field.
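
The mechanics behind these numbers can be sketched as follows: each funded observation is matched, with replacement, to the unfunded observation with the closest propensity score, balance is checked, and the treatment effect is the mean difference in log outcomes, back-transformed with exp[·]. Everything below is simulated and illustrates only the procedure, not the paper's exact estimator or the corrected standard errors.

```python
# Nearest-neighbour matching on the propensity score (with replacement) and a
# mean comparison of log outcomes (toy data).
import numpy as np
import pandas as pd
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(4)
n = 2000
sample = pd.DataFrame({
    "pscore":   rng.uniform(0.1, 0.9, n),
    "articles": rng.poisson(5, n) + 1,
})
# Treatment is more likely at higher propensity scores in this toy example.
sample["funded"] = (rng.uniform(0, 1, n) < sample["pscore"]).astype(int)

treated = sample[sample["funded"] == 1]
controls = sample[sample["funded"] == 0]

# Match each treated observation to its closest control; controls may be
# reused several times ('twins').
nn = NearestNeighbors(n_neighbors=1).fit(controls[["pscore"]])
_, idx = nn.kneighbors(treated[["pscore"]])
matched = controls.iloc[idx.ravel()]

# Balance check: average propensity scores should be similar after matching.
print("pscore, treated vs matched:",
      round(treated["pscore"].mean(), 3), round(matched["pscore"].mean(), 3))

# Mean difference in log outcomes, back-transformed to a multiplicative effect.
att_log = np.log(treated["articles"]).mean() - np.log(matched["articles"]).mean()
print("difference in log articles:", round(att_log, 3),
      "| multiplicative effect:", round(np.exp(att_log), 3))
```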

Persistency of treatment effects

In addition to the effect in the year after funding (t + 1), we are interested in the persistency of the effect in the following years up to t + 3. It is likely that any output effects occur with a considerable time lag after funding is received: the start-up of the research project, including the training of new researchers and the set-up of equipment, may take some time before the actual research starts. In principle, we could of course expect the effect to last longer than three to four years. However, after four years the treatment effect of one project grant may become confounded by one (or several) follow-up grants. Tables 5 and 6 show the results for the different outcome variables also for different time horizons.

The results suggest that the funding has a persistent output effect amounting to about one additional article in each of the three years following the year of funding. The effect on preprints is already significant in the first year, but also persists in later years, suggesting that research results from the project are probably circulated via this channel. In contrast, we find that altmetric scores are significantly higher early on, but not in the medium run. When looking at citation-based measures as indicators for impact and relevance, we see that the number of citations stays significantly higher in the medium run, but the effect size declines somewhat, indicating that researchers publish the most important results soon after funding. This is also reflected in the results for the average number of citations and the probability of being highly cited. For the FCR, the effect is less persistent, as the difference between groups fades after the first year. For the RCR, the difference in means is strongest in the first year after the grant and only significant at the 10% level in t + 3.

Impact heterogeneity over the academic life-cycle and research fields

For most outcomes, we find a significant and persistent difference between funded and unfunded researchers while controlling for other drivers of research outcomes. As shown in earlier studies (Arora and Gambardella, 2005; Jacob and Lefgren, 2011), a grant’s impact may depend on the career stage of a researcher. As a proxy for career stage, we use the biological age of the researchers. Additionally, there might be heterogeneity in the funding effect depending on the research field. We therefore perform interaction tests between (i) age and funding and (ii) research field and funding. More specifically, we employ a categorical variable for age and allow for an interaction term with the funding variable in the mixed models presented in the section “Mixed effects model—longitudinal regression models”. The same procedure is repeated with the research field. The interaction tests indeed suggest that the effect of funding on the article and preprint counts differs by age group (p-value < 0.001 for both outcomes) and by research field (p-value < 0.001 for articles and p-value = 0.0045 for preprints). When we test for the same interaction effects in the continuous outcome models, the results suggest that the funding effect on the average number of citations per article differs by age group (p-value < 0.001) and research field (p-value = 0.0242). For altmetrics and the citation ratios, we see no evidence for major differences across age groups (p-values of 0.328 for the altmetric score, 0.802 for the RCR and 0.873 for the FCR) nor across research fields (p-values of 0.2296 for the altmetric score and 0.5124 for the FCR; footnote 19).
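
The logic of these interaction tests can be sketched as a comparison of nested count models with and without a funding-by-age-group interaction. The example below uses simulated data and a pooled negative binomial model rather than the paper's mixed models; variable names are invented.

```python
# Likelihood-ratio style test of a funding x age-group interaction (toy data).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 3000
dat = pd.DataFrame({
    "funded_pi": rng.integers(0, 2, n),
    "age":       rng.integers(30, 75, n),
})
dat["age_group"] = pd.cut(dat["age"], bins=[0, 44, 54, 65, 120],
                          labels=["<45", "45-54", "55-65", "65+"])
# Simulate a larger funding effect for the youngest group.
rate = 4 + dat["funded_pi"] * np.where(dat["age"] < 45, 1.5, 0.5)
dat["articles"] = rng.poisson(rate)

base = smf.glm("articles ~ funded_pi + age_group", data=dat,
               family=sm.families.NegativeBinomial()).fit()
inter = smf.glm("articles ~ funded_pi * age_group", data=dat,
                family=sm.families.NegativeBinomial()).fit()

# Twice the difference in log-likelihoods compares the nested specifications.
lr_stat = 2 * (inter.llf - base.llf)
print("LR statistic for the interaction terms:", round(lr_stat, 2))
```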

To better understand these differences in the funding effect, we refer to Fig. 2 for the article counts and Fig. 3 for the average number of citations per article. These figures show the predicted article or citation count depending on the funding group (in t−1) and the age group or the research field. For all these subgroups, SNSF funding (as PI) in t−1 has a positive effect on the outcome; however, the size of this effect differs substantially. The youngest age group (<45) seems to benefit considerably from the funding in terms of the predicted difference between treatment and control researchers in article count, but also in citations per article (the confidence intervals of ‘funded as PI’ and ‘no funding’ do not overlap). More senior funded researchers (45–54 and 55–65 years of age) perform similarly to researchers with the same characteristics but no funding. It is noteworthy that for older researchers (65+) the difference between groups is again higher, indicating that funding helps to keep productivity up. We obtain very similar results based on post-estimations with interaction effects in the matched samples from the propensity score matching approach (see Fig. S. 7 in the supplementary material).

Figure 2

To predict the article count, the baseline confounding variables were fixed to Year 2015–19, Male, Evaluation Score AB–A, University, and LS in the age interaction model, and to age below 45 in the field interaction model. We see a significant positive percentage change of 18% for the youngest age group among PIs (<45) and 115% for the most senior researchers (>65) compared to no SNSF funding. Additionally, the effect of funding is largest for STEM researchers (23% more articles as PI compared to unfunded researchers). The effect in the LS and SSH is less prominent, +15% and +12%, respectively.

Figure 3

For the predictions, the baseline confounding variables were fixed to Year 2010–14, Male, Evaluation Score AB–A, University, and LS in the age interaction model, and to age below 45 in the field interaction model. A significant positive percentage change of 10% in the average number of citations can be observed for the youngest age group among PIs (<45) compared to no SNSF funding. The remaining changes in citation numbers are not significant. The effect of funding is largest for SSH researchers (15% more citations per article as PI compared to unfunded researchers). The effect in the LS (+6%) and STEM (+8%) is less prominent. Note, however, that the intervals all overlap.

For all research areas, SNSF funding has a positive effect on article count and number of citations. STEM researchers, however, benefit most, with 23% more articles as funded PI compared to no funding; funded (PI) researchers from the LS publish 15% more articles and SSH researchers 12% more. This could reflect that in STEM and the LS the extent to which research can be successfully conducted is highly funding-dependent, while this is not necessarily the case in the SSH. Regarding the number of citations per article, by contrast, SSH researchers benefit most (14% more citations for SSH, 8% for STEM and 6% for LS). This suggests that funding may support the quality of research, and hence its impact, more in the SSH. Thus, it should be noted that even though SSH researchers publish and are cited less in absolute numbers, we still see a substantial positive effect of SNSF funding on their outcomes. The respective figures for the remaining outcomes can be found in the supplementary material: Fig. S. 5 for the altmetric score, Fig. S. 4 for the preprint count, and Fig. S. 6 for the FCR.

Conclusions

Understanding the role played by competitive research funding is crucial for designing research funding policies that best foster knowledge generation and diffusion. By investigating the impact of project funding on scientific output, its relevance and accessibility, this study contributes to research on the effects of research funding at the level of the individual researcher.

Using detailed information on the population of all project funding applicants at the SNSF during the 2005–2019 period, including personal characteristics and the evaluation scores that their submitted projects received from peers, we estimate the impact of receiving project funding on publication outcomes and their relevance. The strengths of this study lie in the detailed information on both researchers and grant proposals. First, the sample consists of successful as well as unsuccessful applicants; therefore, researchers who also had a research idea to submit are part of the control group. Second, the information on proposal evaluation scores allows us to compare researchers who have submitted project ideas of, on average, comparable quality. The estimated treatment effects therefore take into account that all applicants may benefit from the competition for funding through participation effects (Ayoubi et al., 2019).

Besides these methodological aspects, a key contribution of this study is that, in addition to articles in scientific journals, it is the first to include preprints. Preprints are an increasingly important means of disseminating research results early and without access restrictions (Berg et al., 2016; Serghiou and Ioannidis, 2018). In addition, we investigate relevance and impact in terms of absolute and relative citation measures. In the analysis of citations that published research receives, it is important to account for field-specific citation patterns. We do so by including the RCR and the FCR as additional outcome measures of relative research impact in a researcher’s own field of study. Finally, this is the first study to investigate the link between funding and researchers’ altmetric scores, which mirror the attention paid to research outcomes by the wider public (Bornmann, 2014; Lăzăroiu, 2017; Warren et al., 2017).

The results show a similar pattern across all estimation methods, indicating an effect size of about one additional article in each of the three years following the funding. In addition, we find a similarly sized effect on the number of preprints. The comparison across methods suggests that, when accounting for important observable researcher characteristics (e.g. age, field, gender and experience) as well as proposal quality (as reflected in evaluation scores), parametric regression models and non-parametric models lead to similar conclusions with regard to publication outputs. Importantly, a significant effect on the number of citations to articles could be observed, indicating that funding does not merely translate into more, but only marginally relevant, research. Funded research also appears to reach the general public more than other research, as indicated by higher average altmetric scores in the group of grant winners. In terms of the RCR and FCR, the results indicate that there might be an effect on the funded researchers’ overall visibility in the research community. However, the effects on the RCR are not robust to the estimation method used.

The funding program analyzed in this study is open to all researchers in Switzerland affiliated with institutions eligible to receive SNSF funding. This allows us to study treatment effect heterogeneity over researchers’ life cycles and research fields. The results suggest that funding is particularly important at earlier career stages, where PF facilitates research that would not have been pursued without funding. With regard to treatment effect heterogeneity across fields, we find the largest effect of funding on the article count for STEM researchers and the largest funding effect on citations in the SSH.

While the insights on a positive effect of funding on the number of subsequent scientific articles are in line with previous studies, the effects that we document here are larger than previous results. One reason may be that the SNSF is the main source of research funding in Switzerland; we can therefore identify researchers for the control group who really had no other project grant in the period for which they are considered a control. We also observe co-PIs, who in other studies may, due to a focus on PIs or a lack of information, be assigned to the control group. Both may lead to an under-estimation of funding effects in previous studies. Moreover, by counting all publications of these researchers, we take into account not only articles directly related to the project, but also learning spillovers and synergies beyond the project that improve a researcher’s overall research performance.

Despite all efforts, this study is not without limitations. First, we do not observe industry funding for research projects, which may be important in the engineering sciences (Hottenrott and Lawson, 2017; Hottenrott and Thorwarth, 2011). Moreover, the fact that researchers receive grants repeatedly and may switch between treatment and control group over time makes a simple difference-in-differences analysis difficult. These factors further complicate the assessment of the long-term impact of the research outcomes that we observe. The methods presented here aim to account for the non-randomness of the funding award and the underlying data structure. While we find that the main results are robust to the estimation method used, the reader should keep in mind that time-varying unobserved factors that affect an individual’s publication outcomes, such as family or health status, or involvement in professional services and administrative roles (Fudickar et al., 2016), may not be sufficiently accounted for. Second, we do not have detailed information on the involved research teams and individual responsibilities within the projects and therefore do not investigate the role of team characteristics for any outcome effects. In such an analysis, it would be desirable to study whether and how sole-PI and multiple-PI projects differ and which role different PI profiles play for project success. A more detailed analysis of teams would also be interesting in order to differentiate between group and individual effort. Third, we used preprints and altmetrics as output measures, which is novel compared to previous research on funding effects. Since we cannot compare our results to previous ones, we encourage future research on the effects of funding on early publishing and science communication more directly. It should be kept in mind that altmetrics may measure popularity, efforts at dissemination, and the extent to which authors are embedded in a network, but not the quality of individual research outcomes. Probably more than publications in peer-reviewed journals, preprints and altmetrics may be gamed, for example by repeated sharing of one’s own articles or by ‘salami slicing’ research outcomes into several preprints. Finally, it should be noted that we did not investigate several aspects that might be important in impact evaluation, including the role of the funding amount, the degree of novelty of the produced research, and treatment effect heterogeneity in terms of individual characteristics other than age.

Data availability

An anonymized and aggregated data set can be found on Zenodo ( https://doi.org/10.5281/zenodo.5011201 ). In order to anonymize the data, we provide applicants’ age only as a categorical variable.

Notes

The importance of competitive research funding has increased substantially over the past three decades. The basic idea behind promoting such science policy goes back to New Public Management reforms, which aimed to increase the returns to public science funding through the selective provision of more funding to the most able researchers, groups and universities (the winners in funding competitions), and to create performance incentives at all levels of the university system (Gläser and Serrano-Velarde, 2018; Krücken and Meier, 2006).

The SNSF is Switzerland’s main research funding agency. The SNSF is mandated by the Swiss confederation to allocate research funding to eligible researchers at universities, (technical) colleges and research organizations.

Excellent publications in this study were for instance papers in the upper quarter of journals included in the Science Citation Index (SCI).

Novelty is measured by the extent to which a published paper makes first time ever combinations of referenced journals while taking into account the difficulty of making such combinations.

Charities and private sector grants do play an increasing, but still a relatively minor role in Switzerland (Jonkers and Zacharewicz, 2016 ; Schmidt, 2008 ).

An alternative approach is to employ pre-sample information of the researcher as a proxy for unobservable characteristics, such as a researcher’s ability or writing talent which impact research output in the (later) sample period. We conducted such linear feedback models (LFM) as robustness tests and present them in Supplement S. 2.1 .

We use the lmer package in R and a negative binomial family.

In addition to the closeness on MD, we use elements of exact matching by requiring that selected control researchers belong exactly to the same subject field and are observed in the same year as the researchers in the treatment group. This allows us to account for different publication patterns across disciplines and also for time trends in the funding likelihood and in the outcome variables.

As we perform sampling with replacement to estimate the counterfactual situation, an ordinary t -statistic on mean differences after matching is biased, because it does not take the appearance of repeated observations into account. Therefore, we have to correct the standard errors in order to draw conclusions on statistical inference, following Lechner ( 2001 ).

The Sinergia scheme is closely linked to PF, so that we will not differentiate between them in the following.

If granted, a co-applicant is entitled to parts of the funding.

If Dimensions found more than one ID for a certain name, we used further information on the researcher available to the SNSF to narrow the ID options down. This supplementary information was, if present, the ORCID, the current and previous research institution(s), country and birth year. Only researchers with a unique ID could be used in the following. See Table S 2 in the supplementary material for a comparison of the researchers that were found and not found.

Some characteristics on the researchers without unique ID can be found in Table S 2 in the Supplementary material.

Since only a few cases are identified who hold major international grants but no SNSF funding, we do not differentiate between these groups in the following. Note that the data retrieved from the ERC Funded Projects Database included only grants acquired since 2007.

https://help.altmetric.com/support/solutions/articles/6000233311-how-is-the-altmetric-attention-score-calculated-

Unfortunately, the altmetric score cannot be retrieved as a time-dependent variable from Dimensions, but only as the altmetric state at the time of data retrieval (September 2020). The altmetric score therefore informs us about the cumulative attention an article published at t received up to September 2020.

Note that we also tested the robustness of this result by focusing on PF as treatment and adding the researchers with a funded Sinergia project to the control group, while adjusting with a Sinergia dummy variable. The sizes of the funding-as-PI and co-PI effects and their confidence intervals were comparable.

Severin et al. ( 2020 ), for example, discuss gender biases on the reviewer scores leading to lower grant likelihood for female researchers.

Note that we did not test the interaction for the RCR outcome, as this analysis was done only for the LS field.

Adams JD, Griliches Z (1998) Research productivity in a system of universities. Ann Écon Stat 49–50, 127–162.

ANZSRC (2019) Outcomes paper: Australian and New Zealand Standard Research Classification Review 2019. Ministry of Business, Innovation & Employment.

Arora A, David P, Gambardella A (1998) Reputation and competence in publicly funded science: estimating the effects on research group productivity. Ann Econ Stat 49–50, 163–198.

Arora A, Gambardella A (2005) The impact of NSF support for basic research in economics. Ann Écon Stat 79–80, 91–117.

Ayoubi C, Pezzoni M, Visentin F (2019) The important thing is not to win, it is to take part: what if scientists benefit from participating in research grant competitions? Res Policy 48:84–97

Battistin E, Rettore E (2008) Ineligibles and eligible non-participants as a double comparison group in regression-discontinuity designs. J Econom 142:715–730

Beaudry C, Allaoui S (2012) Impact of public and private research funding on scientific production: the case of nanotechnology. Res Policy 41:1589–1606

Benavente JM, Crespi G, Figal Garone L, Maffioli A (2012) The impact of national research funds: a regression discontinuity approach to the Chilean fondecyt. Res Policy 41:1461–1475

Berg JM, Bhalla N, Bourne PE, Chalfie M, Drubin DG, Fraser JS, Greider CW, Hendricks M, Jones C, Kiley R, King S, Kirschner MW, Krumholz HM, Lehmann R, Leptin M, Pulverer B, Rosenzweig B, Spiro JE, Stebbins M, Strasser C, Swaminathan S, Turner P, Vale RD, VijayRaghavan K, Wolberger C (2016) Preprints for the life sciences. Science 352:899–901

Blundell R, Griffith R, Windmeijer F (1995), Dynamics and correlated responses in longitudinal count data models. In: Seeber GUH, Francis BJ, Hatzinger R, Steckel-Berger G (eds), Statistical modelling. Springer New York, New York, pp. 35–42.

Bornmann L (2014) Do altmetrics point to the broader impact of research? An overview of benefits and disadvantages of altmetrics. J Informetr 8:895–903

Carayol N, Matt M (2004) Does research organization influence academic production? Laboratory-level evidence from a large European university. Res Policy 33:1081–1102

de la Cuesta B, Imai K (2016) Misunderstandings about the regression discontinuity design in the study of close elections. Annu Rev Political Sci 19:375–396

Digital Science (2018) Dimensions [software] available from https://app.dimensions.ai . Accessed Sept 2020, under licence agreement.

Fang F, Casadevall A (2016) Research funding: the case for a modified lottery. mBio 7(2):e00422-16.

Fleming L, Greene H, Li G, Marx M, Yao D (2019) Government-funded research increasingly fuels innovation. Science 364:1139–1141

Franzoni C, Giuseppe S, Stephan P (2011) Changing incentives to publish. Science (New York, NY) 333:702–3

Froumin I, Lisyutkin M (2015) Excellence-driven policies and initiatives in the context of bologna process: rationale, design, implementation and outcomes. In: Curej A, Matei L, Pricopie R, Salmi J, Scott P (eds) The European higher education area. Springer.

Fudickar R, Hottenrott H, Lawson C (2016) What’s the price of academic consulting? effects of public and private sector consulting on academic research. Ind Corp Change 27:699–722

Gerfin M, Lechner M (2002) A microeconometric evaluation of the active labour market policy in Switzerland. Econ J 112:854–893

Gläser J, Serrano-Velarde K (2018) Changing funding arrangements and the production of scientific knowledge: introduction to the special issue. Minerva 56:1–10

Graves N, Barnett AG, Clarke P (2011) Funding grant proposals for scientific research: retrospective analysis of scores by members of grant review panel. BMJ 343:d4797.

Hausman N (2021) University innovation and local economic growth. Rev Econ Stat (forthcoming). https://doi.org/10.1162/rest_a_01027 .

Hottenrott H, Lawson C (2017) Fishing for complementarities: research grants and research productivity. Int J Ind Organ 51:1–38

Hottenrott H, Thorwarth S (2011) Industry funding of university research and scientific productivity. Kyklos 64:534–555

Hutchins BI, Yuan X, Anderson JM, Santangelo GM (2016) Relative citation ratio (RCR): a new metric that uses citation rates to measure influence at the article level. PLoS Biol 14:1–25

Jacob BA, Lefgren L (2011) The impact of research grant funding on scientific productivity. J Public Econ 95:1168–1177

Jaffe AB (1989) Real effects of academic research. Am Econ Rev 79:957–970

Jaffe AB (2002) Building programme evaluation into the design of public research support programmes. Oxf Rev Econ Policy 18:22–34

Jonkers K, Zacharewicz T (2016) Research performance based funding systems: a comparative assessment. Technical Report JRC101043, Publications Office of the European Union.

Konkiel S (2016) Altmetrics: diversifying the understanding of influential scholarship. Palgrave Commun 2:16057

Krücken G, Meier F (2006) Turning the university into an organizational actor. In: Drori GS, Meyer JW, Hwang H (eds) Globalization and organization: world society and organizational change, vol 18. pp. 241–257.

Lechner M (2001) Identification and estimation of causal effects of multiple treatments under the conditional independence assumption. In: Lechner M, Pfeiffer F (eds) Econometric evaluation of labour market policies. Physica-Verlag HD, Heidelberg, pp. 43–58.

Lăzăroiu G (2017) What do altmetrics measure? Maybe the broader impact of research on society. Educ Philos Theory 49:309–311

Mali F, Pustovrh T, Platinovšek R, Kronegger L, Ferligoj A (2017) The effects of funding and co-authorship on research performance in a small scientific community. Sci Public Policy 44:486–496

Neufeld J, Huber N, Wegner A (2013) Peer review-based selection decisions in individual research funding, applicants’ publication strategies and performance: the case of the ERC starting grants. Res Eval 22:237–247

Oancea A (2016) Research governance and the future(s) of research assessment. Palgrave Commun 5:27

Payne A (2002) Do US congressional earmarks increase research output at universities? Sci Public Policy 29:314–330

Payne A, Siow A (2003) Does federal research funding increase university research output? Adv Econ Anal Policy 3:1018–1018

Poege F, Harhoff D, Gaessler F, Baruffaldi S (2019) Science quality and the value of inventions. Sci Adv 5(12):eaay7323.

Reale E (2017) Analysis of national public research funding (PREF)—final report, Technical Report JRC107599, Publications Office of the European Union.

Rubin DB (1977) Assignment to treatment group on the basis of a covariate. J Educ Stat 2:1–26

Schmidt J (2008) Das Hochschulsystem der Schweiz: Aufbau, Steuerung und Finanzierung der schweizerischen Hochschulen. Beitr Hochschulforsch 30:114–147

Serghiou S, Ioannidis JPA (2018) Altmetric scores, citations, and publication of studies posted as preprints. JAMA 319:402–404

Severin A, Martins J, Heyard R, Delavy F, Jorstad A, Egger M (2020) Gender and other potential biases in peer review: cross-sectional analysis of 38–250 external peer review reports. BMJ Open 10:e035058

Silberzahn R, Uhlmann EL, Martin DP, Anselmi P, Aust F, Awtrey E, Bahník Š, Bai F, Bannard C, Bonnier E, Carlsson R, Cheung F, Christensen G, Clay R, Craig MA, Rosa AD, Dam L, Evans MH, Cervantes IF, Fong N, Gamez-Djokic M, Glenz A, Gordon-McKeon S, Heaton TJ, Hederos K, Heene M, Mohr AJH, Högden F, Hui K, Johannesson M, Kalodimos J, Kaszubowski E, Kennedy DM, Lei R, Lindsay TA, Liverani S, Madan CR, Molden D, Molleman E, Morey RD, Mulder LB, Nijstad BR, Pope NG, Pope B, Prenoveau JM, Rink F, Robusto E, Roderique H, Sandberg A, Schlüter E, Schönbrodt FD, Sherman MF, Sommer SA, Sotak K, Spain S, Spörlein C, Stafford T, Stefanutti L, Tauber S, Ullrich J, Vianello M, Wagenmakers E-J, Witkowiak M, Yoon S, Nosek BA (2018) Many analysts, one data set: making transparent how variations in analytic choices affect results. Adv Methods Pract Psychol Sci 1:337–356

Stephan PE (2012) How economics shapes science. Harvard University Press, Cambridge.

Surkis A, Spore S (2018) The relative citation ratio: what is it and why should medical librarians care? J Med Libr Assoc 106:508–513

Tahmooresnejad L, Beaudry C (2019) Citation impact of public and private funding on nanotechnology-related publications. Int J Technol Manag 79:21–59

Wahls WP (2018) High cost of bias: diminishing marginal returns on NIH grant funding to institutions. Preprint at https://doi.org/10.1101/367847 .

Wang J, Lee Y-N, Walsh JP (2018) Funding model and creativity in science: competitive versus block funding and status contingency effects. Res Policy 47:1070–1083

Warren HR, Raison N, Dasgupta P (2017) The rise of altmetrics. JAMA 317:131–132

Acknowledgements

We are grateful to Tobias Phillip for helpful comments on the study design and on previous versions of this manuscript and to Matthias Egger for an additional careful review of the manuscript prior to submission. This work was supported by the SNSF (internal funds).

Open Access funding enabled and organized by Projekt DEAL.

Author information

Authors and affiliations

Swiss National Science Foundation, Berne, Switzerland

Rachel Heyard

TUM School of Management, Munich, Germany

Hanna Hottenrott

Munich Data Science Institute, Garching bei München, Germany

Corresponding author

Correspondence to Hanna Hottenrott .

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary material

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ .

About this article

Cite this article

Heyard, R., Hottenrott, H. The value of research funding for knowledge creation and dissemination: A study of SNSF Research Grants. Humanit Soc Sci Commun 8 , 217 (2021). https://doi.org/10.1057/s41599-021-00891-x

Received : 23 November 2020

Accepted : 31 August 2021

Published : 21 September 2021

DOI : https://doi.org/10.1057/s41599-021-00891-x



Understanding Types of Grants and Funding

The NIDCD funds research through a variety of award mechanisms.

  • Research grants (R series) These grants may be awarded to individuals at universities, medical and other health professional schools, colleges, hospitals, research institutes, for-profit organizations, and government institutions.
  • Research training and career development (NRSA, K series, and more) These include individual fellowships, institutional awards, career development awards, and other opportunities. 
  • Small business grants (SBIR/STTR)
  • Clinical research center grants (P50)
  • Clinical trials funding (R01, U01)
  • Conference grants
  • Administrative supplements
  • Funding for drug development
  • Loan repayment programs (NIH)

Economic Consultants

Definition

Research funding is a term generally covering any funding for scientific research, in the areas of both “hard” science and technology and social science. The term often connotes funding obtained through a competitive process, in which potential research projects are evaluated and only the most promising receive funding. Such processes, which are run by governments, corporations or foundations, allocate scarce funds. Total research funding in most developed countries is between 1.5% and 3% of GDP; Sweden is the only country to exceed 4%.

Funding Sources

Most research funding comes from two major sources: corporations (through research and development departments) and government (primarily carried out through universities and specialised government agencies). Some scientific research is also carried out (or funded) by charitable foundations, especially in relation to developing cures for diseases such as cancer, malaria and AIDS. In the OECD, around two-thirds of research and development in scientific and technical fields is carried out by industry, and 20% and 10% respectively by universities and government, although in poorer countries such as Portugal and Mexico the industry contribution is significantly lower.

Research Funding Sources

The US government spends more than other countries on military R&D, although the proportion has fallen from around 30% in the 1980s to under 20%. Government funding for medical research amounts to approximately 36% in the U.S. The government funding proportion in certain industries is higher, and government dominates research in the social sciences and humanities. Similarly, with some exceptions (e.g. biotechnology), government provides the bulk of the funds for basic scientific research. In commercial research and development, all but the most research-oriented corporations focus more heavily on near-term commercialisation possibilities than on “blue-sky” ideas or technologies (such as nuclear fusion).

Government-Funded Research

Government-funded research can either be carried out by the government itself or through grants to academic and other researchers outside the government. Critics of basic research are concerned that funding research for the sake of knowledge itself does not yield a great return. However, scientific innovations often foreshadow or inspire further ideas unintentionally. For example, NASA’s quest to put a man on the moon spurred it to develop better sound recording and reading technologies. NASA’s research was furthered by the music industry, which used it to develop audio cassettes. Audio cassettes, being smaller and able to store more music, quickly dominated the music industry and increased the availability of music. An additional advantage of government-sponsored research is that the results are publicly shared, whereas with privately funded research the ideas are controlled by a single group. Consequently, government-sponsored research can result in mass collaborative projects that are beyond the scope of isolated private researchers.

Privately Funded Research

Funding of research by private companies is mainly motivated by profit, and companies are much less likely than governments to fund research projects solely for the sake of knowledge. The profit incentive causes researchers to concentrate their energies on projects which are perceived as likely to generate profits. However, the rise of corporate responsibility as an important communication issue for larger corporations has led to experiments in funding basic research by companies such as IBM (high-temperature superconductivity was discovered through IBM-sponsored basic experimental research in 1986), L’Oreal (which created the L’Oreal-UNESCO prize for women scientists and finances internships) and AXA (which launched a Research Fund in 2008 and finances academic institutions such as the French fundamental-mathematics foundation IHES).

An often-quoted case study is the first sequencing of the human genome, which was simultaneously carried out in two competing projects, the United States government-managed Human Genome Project (HGP) and the private, venture-capital-funded Celera Genomics. Celera Genomics used a newer, albeit riskier, technique, which some HGP researchers claimed would not work, although that project eventually adopted some of the same methods. However, it has been argued by some genomics researchers that a simple efficiency comparison for such programs is not apt. Much of the funding provided for the HGP served the development of new technologies rather than the sequencing of the human genome itself. In addition, Celera started much later than the HGP and could take advantage of the experience gained by the HGP, which, as a publicly funded project, made much of its work available as a basis upon which Celera could build. Though Celera’s sequencing strategy allowed the majority of the human genome to be sequenced with much greater efficiency, the strategy used by the HGP allowed a higher percentage of the genome to be sequenced.

Funding Influence on Research

A 2005 study in the journal Nature surveyed 3,247 US researchers who were all publicly funded (by the National Institutes of Health). Of the scientists questioned, 15.5% admitted to altering the design, methodology or results of their studies due to pressure from an external funding source. In a contemporary study published in the New England Journal of Medicine, a similar proportion of the 107 medical research institutions questioned were willing to allow pharmaceutical companies sponsoring research to alter manuscripts according to their interests before they were submitted for publication.

In the eighteenth and nineteenth centuries, as the pace of technological progress increased before and during the industrial revolution, most scientific and technological research was carried out by individual inventors using their own funds. A system of patents was developed to allow inventors a period of time (often twenty years) to commercialise their inventions and recoup a profit, although in practice many found this difficult. The talents of an inventor are not those of a businessman, and there are many examples of inventors (e.g. Charles Goodyear) making rather little money from their work whilst others were able to market it.

In the twentieth century, scientific and technological research became increasingly systematised as corporations developed and discovered that continuous investment in research and development could be a key element of success in a competitive strategy. It remained the case, however, that imitation by competitors – circumventing or simply flouting patents, especially those registered abroad – was often just as successful a strategy for companies focused on innovation in matters of organisation and production technique, or even in marketing. A classic example is that of Wilkinson Sword and Gillette in the disposable razor market, where the former has typically had the technological edge and the latter the commercial one. (Sources: Wikipedia and other encyclopedias.)


Types of research funding (Imperial College London)


Non-commercial

The key aim of research charities is to generate knowledge that benefits the public good. Charities provide an important independent stream of research funding which complements the objectives of the Research Councils and Government departments. There are hundreds of research funding charities covering a wide range of aims. All are regulated by charity law and are required to adhere to certain obligations and restrictions on the use of charitable funds for research, e.g. the requirement to disseminate research findings and a prohibition on funding research for the purpose of commercial or private gain.

Association of Medical Research Charities (AMRC)

The AMRC is a member organisation of the leading UK charities that fund medical and health research. There are currently over 100 members, including the world's largest medical research charity, the Wellcome Trust, all with the common aim of improving human health by funding a wide range of research, including basic, applied and disease-specific work. These charities provide funds in a variety of ways, ranging from small pump-priming grants to substantial funds intended for programmes of research. Medical research charities can only fund research that falls within their charitable objectives, which may focus on a particular disease or condition, a range of diseases, or more widely on improving human health through education and research.

Members are listed on the AMRC website .

UK Research and Innovation

UKRI, primarily through its Research Councils, invests approximately £2.8 billion per annum in research ranging from medical and biological sciences to astronomy, physics, chemistry, engineering, social sciences, economics, and the arts and humanities. The aim, scale and balance of the projects funded reflect the national research priorities agreed in consultation with Government and other stakeholders. UKRI is principally funded through the Science Budget by the Department for Business, Energy and Industrial Strategy (BEIS).

UKRI comprises nine organisations: seven Research Councils, which are organised by discipline; Innovate UK, which supports translation activities (often involving industry partners); and Research England, a funding council that provides underpinning funding for universities:

  • Arts and Humanities Research Council (AHRC)
  • Biotechnology and Biological Sciences Research Council (BBSRC)
  • Economic and Social Research Council (ESRC)
  • Engineering and Physical Sciences Research Council (EPSRC)
  • Medical Research Council (MRC)
  • Natural Environment Research Council (NERC)
  • Science and Technology Facilities Council (STFC)
  • Innovate UK
  • Research England

UKRI also has its own funding streams:

  • Official Development Assistance ( Newton Fund)
  • National Productivity Investment Fund (NPIF)
  • Industrial Strategy Challenge Fund
  • Talent -  Future Leaders Fellowships  (Imperial has an internal process for this scheme)
  • Place -  Strength in Places Fund
  • Strategic Priorities Fund

The Strategic Priorities Fund is currently the newest stream of funding emerging from the UKRI NPIF and is primarily being strategically allocated at this stage; there may be open calls at later stages. Note that UKRI does not group together calls for this stream in the same way as it does for the other streams.

Other UK Government departments

A number of Government departments provide significant funding for a wide variety of research activities.

Department for Environment, Food and Rural Affairs (DEFRA): DEFRA funds programmes on the environment, food and rural affairs.

Defence Science and Technology Laboratory (DSTL) and QinetiQ: DSTL is an agency of the UK Ministry of Defence (MOD); QinetiQ is Britain's largest independent science and technology company. Both supply scientific and technical research and advice to the MOD.

Department for Transport (DfT): The DfT oversees the delivery of a reliable, safe and secure transport system that responds efficiently to the needs of individuals and business whilst safeguarding our environment.

Department of Health (DH): The DH funds significant programmes of research and development in the NHS through the National Institute for Health Research. The national programmes investigate a broad range of healthcare matters, including the provision of funding to support the training and education of future health researchers. In addition, the DH spends about £30 million per annum through ad hoc research budgets (held by Departmental policy branches) and through research undertaken by arm's length bodies, including Public Health England (previously known as the Health Protection Agency).

National Academies, including the Royal Society

There are four English-based National Academies, all of which have the same principal roles:

  • an independent fellowship of world-leading scholars and researchers
  • a funding body that supports new research, nationally and internationally
  • advocacy for their respective research fields
  • a forum for debate and engagement

The four organisations are as follows:

The Royal Society is a leading independent scientific body in the UK and the Commonwealth, promoting excellence in science by supporting scientists from postdoctoral level to senior professorships. They offer grants for a variety of purposes ranging from conference travel to the modernisation of laboratories.

The British Academy is the UK's national body for the humanities and social sciences – the study of peoples, cultures and societies, past, present and future. The British Academy provides a variety of grants and fellowships to support academic research, career development and wider engagement. Funding opportunities cover UK and international research from the postdoctoral level upwards.

The Royal Academy of Engineering is the UK's national academy for engineering and technology, bringing together the most talented and successful engineers to advance and promote excellence in engineering for the benefit of society. The Academy runs a programme of grants and prizes to support and celebrate the pursuit of engineering activities and to enable closer contact between academia and industry.

The Academy of Medical Sciences is the independent body in the UK representing the diversity of medical science, advancing biomedical and health research and its translation into benefits for society. The Academy is committed to supporting the careers of the next generation of biomedical and health researchers through a portfolio of grant schemes, a mentoring programme, career development events and careers policy work.

National Institutes of Health, USA (NIH)

The NIH is the United States' national medical research agency and consists of twenty-seven institutes and centres. It funds grants, cooperative agreements and contracts that support the advancement of fundamental knowledge about the nature and behaviour of living systems, to meet the NIH mission of extending healthy life and reducing the burdens of illness and disability. Comprehensive information on NIH policies and funding opportunities is available on the NIH website.

NIH Application Process Flowchart [pdf]

National Institutes of Health (NIH) - Research Portfolio Online Reporting Tool (RePORTER)

Further information: NIH Financial Conflict of Interest Policy
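The RePORTER tool also exposes a public search API, which can be a convenient way to see what the NIH has already funded in a topic area before drafting a proposal. Below is a minimal Python sketch, assuming the publicly documented RePORTER v2 projects/search endpoint and a JSON payload with criteria, offset and limit fields; the criteria block, the result keys and the helper name search_nih_awards are assumptions that should be verified against the current API documentation.

```python
# Hedged sketch: querying the NIH RePORTER search API for funded projects.
# Endpoint, payload shape and result field names are assumptions based on the
# public v2 API; verify against the current RePORTER documentation before use.
import requests

REPORTER_URL = "https://api.reporter.nih.gov/v2/projects/search"  # assumed endpoint


def search_nih_awards(text: str, fiscal_year: int, limit: int = 10) -> list[dict]:
    """Return up to `limit` award records matching `text` in the given fiscal year."""
    payload = {
        "criteria": {
            "fiscal_years": [fiscal_year],
            "advanced_text_search": {          # assumed criteria block
                "operator": "and",
                "search_field": "all",
                "search_text": text,
            },
        },
        "offset": 0,
        "limit": limit,
    }
    response = requests.post(REPORTER_URL, json=payload, timeout=30)
    response.raise_for_status()
    return response.json().get("results", [])


if __name__ == "__main__":
    for award in search_nih_awards("psoriasis", 2023):
        # Result keys such as "project_num" and "project_title" are assumed.
        print(award.get("project_num"), "-", award.get("project_title"))
```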

European Commission

The European Commission's main mechanism for funding research and innovation in Europe is Horizon Europe, which offers a range of funding opportunities to UK HEIs.

Information on funding opportunities, application processes and details of support is available from the Research Office Europe team: Horizon Europe

The UK Research Office (UKRO) is the UK's leading national information and advice service on European Commission funding for research and higher education, and their mission is to promote effective UK participation in European Commission funded research programmes, higher education programmes, and other related activities. This includes:

  • Supporting sponsors and subscribers through early insight and briefing on developments in European programmes and policies
  • Disseminating timely and targeted information on European Commission funding opportunities
  • Providing high quality advice, guidance and training on applying for and managing European Commission projects

Details on European Commission funding are also available at UK Research Office (UKRO)

Internal schemes

The application and award processes of a number of College internal funding schemes are managed by the Funding Strategy team in the Research Office on behalf of the Vice-Provost (Research). These include, but are not limited to:

  • Research Council supported Impact Acceleration Accounts
  • EPSRC Doctoral Prize Fellowships
  • Imperial College Research Fellowships

These awards go through the research ledger and are managed post award by Faculty Research Services teams.

Industry and private companies (national and multinational)

A wide variety of activities are funded by industry and the private sector.  A good understanding of the market context is critical when entering into negotiation with industry. This includes:

  • Understanding the investigators’ and Imperial’s position within the wider market, i.e. retaining or gaining market share
  • Acquiring sufficient knowledge of competitors
  • Taking advantage of opportunities, e.g. gaps in the market
  • Minimising risks and threats
  • Relating supply with demand, e.g. reacting to funders’ priorities, where appropriate
  • Understanding the funders’ willingness and ability to pay
  • The value of the research to its business
  • Consideration of multiple services to provide a competitive edge

Imperial has faculty-based contract negotiators who liaise with companies on the terms and conditions of funding. The requirement to retain the academic freedom to disseminate knowledge and ownership of background and arising Intellectual Property is central to contract negotiations. Because of this, negotiation can be a lengthy process. Background information to the contracting process and associated policies is available in the Contracts pages.

Go to the homepage

Example sentences research funding

Definition of 'funding' funding.

IPA Pronunciation Guide

Definition of 'research' research

B1+

COBUILD Collocations research funding

Browse alphabetically research funding.

  • research fellowship
  • research finding
  • research firm
  • research funding
  • research grant
  • research highlights
  • research indicates
  • All ENGLISH words that begin with 'R'

Quick word challenge

Quiz Review

Score: 0 / 5

Tile

Wordle Helper

Tile

Scrabble Tools

Research Funding Agreement


What is a research funding agreement?

A research funding agreement is a contract between a sponsor and a researcher under which the sponsor agrees to provide money toward a specific research project. This contract can exist between two or more parties, since it is not uncommon for multiple sponsors to contribute toward the funding of a single research project. The research funding agreement contains information about which research project is being funded, how long the research is expected to last, and what the money can be spent on. It also includes information about the protocols to follow when the agreement must be terminated, extended, or modified.




Research and Development Funding (RDT&E)

Research Development Test & Evaluation (RDT&E) funding is used to pay the operating costs of dedicated activities engaged in the conduct of research, development, and test and evaluation efforts performed by a contractor and/or government organization. It is used to develop equipment, material, or computer application software, and to carry out Development Test and Evaluation (DT&E) and Initial Operational Test and Evaluation (IOT&E).

Research Development Test & Evaluation (RDT&E) Appropriation

There is an RDT&E Appropriation Category for each service.

Research Development Test & Evaluation (RDT&E) Costs

The RDT&E cost may include purchases of end items, weapons, equipment, components, and materials as well as the performance of services in order to develop and test the system. This applies to automated information systems as well as weapon systems.  RDT&E funds can also be used for both investment-type costs (e.g., sophisticated laboratory test equipment) and expense-type costs (e.g., salaries of civilian employees at R&D-dedicated facilities).

RDT&E funds may also be used in the acquisition or construction of industrial facilities costing less than $750,000 for the dedicated use of research and development. All construction at R&D installations and activities costing more than that will be funded in the Military Construction appropriations.
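As a simple worked illustration of that threshold (an illustrative sketch only, not official DoD budgeting guidance; the function name and the strict less-than comparison are assumptions drawn from the paragraph above), the snippet below picks the appropriation expected to fund dedicated R&D facility construction from its estimated cost.

```python
# Illustrative sketch only, not official DoD budgeting guidance: applying the
# $750,000 facility-cost threshold described above to choose an appropriation.
RDTE_FACILITY_THRESHOLD = 750_000  # dollars, per the paragraph above


def facility_appropriation(estimated_cost: float) -> str:
    """Return the appropriation expected to fund dedicated R&D facility construction."""
    if estimated_cost < RDTE_FACILITY_THRESHOLD:
        return "RDT&E"
    return "Military Construction (MILCON)"


# A $500,000 laboratory build falls under RDT&E; a $2,000,000 build does not.
assert facility_appropriation(500_000) == "RDT&E"
assert facility_appropriation(2_000_000) == "Military Construction (MILCON)"
```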

See also: Typical RDT&E Cost; RDT&E Appropriation.

Research Development Test & Evaluation (RDT&E) Funding Guidance

DoD 7000.14-R Volume 2A, "Financial Management Regulation (FMR)", provides general guidance on the formulation and submission of budget requests to the Office of the Secretary of Defense (OSD). It is used for budget review, submission, presentation and justification to Congress. For more on research and development rules, regulations and budget guidance, see the references below.

Research Development Test & Evaluation (RDT&E) Budget

The budget request for RDT&E projects and programs should be developed and presented in accordance with the following principles: [1]

  • Annual estimates of initial financing needed for new major weapon systems and other development programs and projects requiring several years to complete, and which involve contracts spanning more than one year, should be formulated to cover costs expected to be incurred during each fiscal year.
  • Work expected to last longer than 12 months but less than 24 months, with no logical way to divide it, should be presented as a single budgeted effort.
  • 2-year availability of funds authorized for the Research, Development, Test, and Evaluation appropriation provides the necessary flexibility for program execution in those circumstances.
  • Engineering change orders should be funded commensurate with the level of risk in the program.

See also: Budgeting Process; DoD Research and Technology Funding Opportunities.

AcqLinks and References:

  • [1] DoD 7000.14-R Volume 2A “Financial Management Regulation”
  • Website: OSD Comptroller DoD 7000.14R



Environmental sustainability in research

1. Purpose

This policy sets out Cancer Research UK (CRUK)’s position on environmental sustainability in research. As a charity, we’re committed to reducing our direct and indirect emissions (scopes 1, 2 and 3) by 50% by 2030, and reaching net zero by 2050.

At CRUK, we recognise the considerable impact that research and innovation have on the environment, and our role as a funder of research in driving a greener and more efficient research system that reduces the environmental impact of the research we fund.

The requirements described in this policy are intended to complement existing and prospective activities to improve sustainability in research organisations we fund.


2. Scope

This policy applies to Lead or Joint Lead grant applicants to our response-mode funding schemes, to all Group Leaders, Senior Staff Scientists and Facilities at our core-funded CRUK Institutes, and sets requirements for all Host Institutions hosting researchers funded by CRUK.

The policy also describes changes to allowable costs for those in receipt of CRUK funding, as well as broader changes that all researchers and staff engaged in cancer research are encouraged to adopt to be more sustainable.

This policy builds on our previous position statement to now set more specific requirements of those we fund.

3. Definitions 

Core facilities: A centralised capability function accessible by research groups at a CRUK Institute.

CRUK Institute : CRUK Scotland Institute; CRUK Cambridge Institute; CRUK Manchester Institute; Francis Crick Institute, London.

Environmental sustainability in research: The act of reducing the environmental impact of research and innovation activities. This includes efforts to (i) reduce or eradicate greenhouse gas emissions, (ii) avoid the depletion or degradation of natural resources, and (iii) allowing for long-term environmental quality.

Host Institution: The university, institution or other organisation at which some or all of the research funded by CRUK will be carried out.

Junior Group Leader/Senior Group Leader (JGL/SGL): A formal appointment at one of the four CRUK Institutes, with responsibility for a research group.

Lead/Joint Lead applicants: The person(s) leading a CRUK grant application.

Senior Staff Scientist: A formal appointment at the CRUK Scotland Institute with responsibility for both a core facility and a research group.

4. Key Points 

We expect all those involved in research we fund to consider, manage and where possible reduce the environmental impact of their work.

4.1 Requirements for CRUK funding applicants 

To be eligible for CRUK response-mode grant funding, Lead and Joint Lead applicants must each :

  • hold either the Laboratory Efficiency Assessment Framework ( LEAF ) or the My Green Lab Certification at the Silver level for their research group(s) at the time of submission;
  • either attach LEAF or My Green Lab Certification to their funding application or provide a link to a publicly available list of accredited research groups at their Host Institution(s).

This requirement will be enforced for Lead and Joint Lead Applicants submitting proposals to all relevant funding calls closing from 1 January 2026 onwards (see exclusions below) and applies to researchers applying from both UK and non-UK-based institutions. Applicants not able to evidence their accreditation at final grant application submission by this date will be considered ineligible.

4.1.1 Exclusions 

The following individuals are not required to attain laboratory sustainability accreditation to be eligible for response-mode funding.

Lead or Joint Lead applicants:

  • to the following schemes : any Bursary; Clinician- and Advanced Clinician Scientist Fellowships; Clinical Trial Fellowship ; Career Development Fellowship ; Career Establishment Award ; any Primer award;
  • holding one of the following positions/job titles (or equivalent) : PhD or MD/PhD student; postdoctoral research fellows/associates; clinical research fellows; research assistants/technicians; facility/technical specialists;
  • whose start date of their contract is fewer than 18 months from the final CRUK grant application submission deadline;
  • whose research groups solely employ research techniques that are : computational, such as bioinformatics, biostatistics, or data science; desk-based, such as qualitative analysis or policy-focused research.

Given not all Lead and Joint Lead Applicants will have the authority to acquire laboratory sustainability accreditation in their immediate research group/environment, we have excluded researchers holding an early career researcher position. In recognition of the time needed to attain accreditation for those leading their own research groups, we have also excluded Lead- or Joint Lead applicants who have only recently been appointed to their post.

Furthermore, the principles and priority areas covered by LEAF and My Green Lab accreditation are most relevant to ‘wet lab’ research settings, such as biomedical, clinical and chemical laboratories. Sustainability accreditations applicable to alternative research settings are currently in development but until these are properly established, we have therefore excluded applicants performing solely ‘non-laboratory’ research methods and practices from this policy requirement.
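Taken together, the requirement and its exclusions reduce to a small decision rule. The sketch below is a hedged illustration only, not CRUK's own eligibility-checking logic: the scheme and role strings, the month arithmetic and the wet_lab flag are assumptions introduced purely to make the rule in sections 4.1 and 4.1.1 concrete.

```python
# Hedged sketch of the accreditation rule in sections 4.1 and 4.1.1 above.
# Not CRUK's implementation; scheme and role names are illustrative strings only.
from datetime import date

EXCLUDED_SCHEMES = {
    "Bursary", "Clinician Scientist Fellowship", "Advanced Clinician Scientist Fellowship",
    "Clinical Trial Fellowship", "Career Development Fellowship",
    "Career Establishment Award", "Primer Award",
}
EXCLUDED_ROLES = {
    "PhD student", "MD/PhD student", "postdoctoral research fellow",
    "clinical research fellow", "research assistant", "facility specialist",
}


def months_between(start: date, end: date) -> int:
    """Whole months from `start` to `end` (simple year/month arithmetic)."""
    return (end.year - start.year) * 12 + (end.month - start.month)


def accreditation_required(scheme: str, role: str, contract_start: date,
                           submission_deadline: date, wet_lab: bool) -> bool:
    """True if a Lead/Joint Lead applicant must evidence Silver-level LEAF or
    My Green Lab accreditation at final grant application submission."""
    if submission_deadline < date(2026, 1, 1):   # enforced for calls closing from 1 Jan 2026
        return False
    if scheme in EXCLUDED_SCHEMES or role in EXCLUDED_ROLES:
        return False
    if not wet_lab:                              # solely computational or desk-based groups
        return False
    if months_between(contract_start, submission_deadline) < 18:   # recently appointed leads
        return False
    return True


# Example: an established wet-lab group leader applying to a programme award in 2026.
print(accreditation_required("Discovery Programme Award", "Group Leader",
                             date(2021, 6, 1), date(2026, 3, 1), wet_lab=True))  # True
```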

4.2. Requirements for CRUK Institute Group Leaders 

All Junior- and Senior Group Leaders (JGLs and SGLs), Senior Staff Scientists and core facilities based at CRUK Institutes must hold LEAF accreditation at the Silver level by 1 January 2026 .

4.2.1. Exclusions 

Given that achieving laboratory sustainability accreditation requires lead-in time, the following individuals or structures at our CRUK Institutes are not required to attain accreditation:

  • JGLs, SGLs and Senior Staff Scientists within 18 months of their start date at a CRUK Institute;
  • Core facilities established in the last 18 months.

However, JGLs, SGLs, Senior Staff Scientists and core facilities must have attained accreditation once they have progressed to 18 months since their start date or establishment.

4.3. Requirements for CRUK-funded Host Institutions 

To help support CRUK-funded researchers embed environmental sustainability in their own research, we require that their Host Institutions demonstrate a strong organisational commitment.

UK-based Host Institutions of current and prospective CRUK grantholders (irrespective of grant application date to CRUK) must become a signatory to the Concordat for Environmental Sustainability of Research and Innovation Practice no later than 1 January 2026 . This involves them developing an approach for ensuring sustainability is embedded in:

  • Leadership, governance and strategy;
  • Infrastructure, procurement, travel and collaborations/partnerships;
  • Annual carbon emissions reporting (scopes 1, 2 and 3);

If a Host Institution is not yet a Concordat signatory by this date, CRUK retains the right to apply sanctions to the organisation, which may include discontinuing funding activities.

4.4. Expectations for all researchers and Host Institutions undertaking cancer research 

CRUK expects all researchers and Host Institutions we fund to ensure that the research they conduct is environmentally sustainable. Researchers and Host Institutions should:

  • reduce energy and water consumption, switching on equipment only when it is needed and sharing its use with others;
  • reuse equipment, materials and consumables, using organisational schemes to facilitate sharing and avoid single use plastics if reusable alternatives are available and viable;
  • recycle waste products through institutional facilities and limit using general waste;
  • reduce general wastage in research by conducting research in an open and robust manner, specifically by following the requirements set out in, for example, our Open Access, Data Sharing and Management, and Research Integrity policies.

4.5. Costs guidance for CRUK funding  

CRUK funds direct costs of research and encourages decision making around project-related costs that considers environmental sustainability when determining ‘value for money’, not just the upfront cost.

All costs requested must be justified in the grant application; see our Costs guidance for grants for full details. If allowable in our Costs Guidance, Grantholders can use their existing funds to cover these costs and do not need to ask CRUK to do this.

Allowable costs:

  • Consumables and materials : Sustainable versions of materials and consumables, even if they are more expensive to purchase or dispose of;
  • Equipment : Sustainable versions of equipment, even if they are more expensive to purchase, as well as second hand or refurbished equipment, or the maintenance of existing equipment as an alternative to purchasing new equipment;
  • Research data : Repair of hardware, or replacing/updating hardware and software with more energy-efficient versions. Training and other support needed to develop more efficient code/algorithms, and training or services to support data curation/annotation for maximising its future use;
  • Training : Under our Continuing Professional Development (CPD) policy, costs can be used for training on how to be more sustainable as a researcher and how to assess the sustainability of the research;
  • Travel : CRUK expects research teams to undertake only essential travel for research activities and if a virtual/digital alternative for the purpose of the travel is not available. Costs associated with travel for research collaboration purposes directly related to the activity funded on the grant are eligible to be included in a grant application as per our Costs Guidance . Where travel is necessary, the decision on the form of travel should consider options with a lower environmental impact, primarily through lower emissions, even if this comes at a higher price.

4.6. Policy updates 

Given the increasing risks posed by climate change, resource consumption and biodiversity loss, and as the R&I system builds capacity to achieve environmental sustainability in research, CRUK intends to monitor developments and evolve this policy from time to time as appropriate.

5. Support & Advice  

For any queries about this policy please contact: [email protected] .

6. Related documents  

For more information, please see the following linked documents:

  • Position Statement on Environmental Sustainability of Research (2022)
  • Policies that affect your grant
  • Conditions of your Grant
  • Costs guidance for grants
  • Environmental sustainability concordat 

More environmentally sustainable cancer research

This article in Cancer news  takes you through the new requirements in our policy on environmental sustainability in research and how this will affect your research and funding applications.  


Morehead State among universities to receive NSF/KCV funding for academic research 

  • 22 April 2024

Morehead State is recognized for academic excellence and a commitment to faculty and student research. MSU recently received significant funding to further research efforts and opportunities.  

The National Science Foundation (NSF) awarded more than $8.25 million to Kentucky Commercialization Ventures (KCV), an initiative of the Kentucky Science and Technology Corporation (KSTC), for a program addressing inequities in academic research funding. It will support higher education institutions, like Morehead State, that are underrepresented in terms of research infrastructure.

The project, Granting Emerging and Developing Institutions a Competitive EDGE (Equitable and Diverse Grant Ecosystem), or KCV EDGE, will provide holistic support for research grant administration to partner institutions MSU, Kentucky State University (KSU), and Northern Kentucky University (NKU).  

"MSU is excited to be part of this KCV EDGE award. We look forward to working with KCV, NKU, and KSU to enhance our research infrastructure and help to create more opportunities for MSU faculty and staff in their pursuit of external funds," said MSU Director of Research & Sponsored Programs Dr. Shannon Harr. "We are also thrilled to be able to enhance some of our Office of Research & Sponsored Programs procedures utilizing these funds."  

KCV Executive Director Kayla Meisner said the program emerged because some Kentucky institutions lacked equal access to research funding.

"We designed KCV EDGE to dismantle these barriers to give our regional, rural, minority-serving, and community and technical institutions a competitive advantage when pursuing federal research grants," Meisner said. "Through their success, we can transform research infrastructure in the commonwealth and beyond."  

The KCV EDGE program will support MSU, KSU and NKU for five years to help each institution develop sustainable research infrastructure. Partner institutions will receive support in grant proposal writing, administration and compliance, technology commercialization and leadership development. Additionally, over $5 million of the grant will be given as sub-awards to partner institutions to build institutional capacity, such as staffing, to support an expanded research enterprise.  

The KCV team plans to expand the program into a statewide and later national model to increase access to research funding and technology transfer support.  

KSTC President Terry Samuel said this KCV grant funding naturally expands and supports KSTC's efforts to assist Kentucky higher education institutions that need more dedicated technology transfer resources.  

"This unprecedented, shared services model is already making a difference for innovators throughout the state, and with the addition of KCV EDGE, has the power to make a national impact," Samuel said.  

The KCV EDGE program funding will support MSU's future academic research efforts and significantly benefit not just Kentucky higher education but the state's future economy.  

"This award is a testament to the work of Kentucky Commercialization Ventures and the many higher education researchers, who are already creating opportunities for Kentucky to build upon the economic momentum we're experiencing throughout the state," said Governor Andy Beshear. "We know there are great ideas coming from faculty, staff and students on campuses across Kentucky. By helping all of our postsecondary institutions succeed in pursuing additional research funding, KCV EDGE will strengthen Kentucky's competitiveness as a center for research and innovation while creating high-quality jobs in the region."  

For more information about KCV EDGE, visit kycommercializationventures.com.   

To learn more about MSU’s Office of Research & Sponsored Programs , contact Harr at [email protected] or call 606-783-2010 . 




IMAGES

  1. Research Funding

    research funding definition

  2. Diagram of Research Funding process

    research funding definition

  3. CAMPUS FUNDING RESOURCES

    research funding definition

  4. Project Funding : Definition, Types, Features and Benefits

    research funding definition

  5. Research Funding process stock image. Image of pointing

    research funding definition

  6. Home

    research funding definition

VIDEO

  1. Funding research grants

  2. Introducing Mendeley Funding

  3. How to Write a Business plan? 10 Important Steps of Business Plan

  4. How to get Funded Project/Project Grant/ Research Proposal Financial support

  5. Funding, REF & Collaboration: Best Practice

  6. What is research

COMMENTS

  1. Research Funding—Why, When, and How?

    Research funding is defined as a grant obtained for conducting scientific research generally through a competitive process. To apply for grants and securing research funding is an essential part of conducting research. In this article, we will discuss why should one apply for research grants, what are the avenues for getting research grants ...

  2. What is research funding, how does it influence research ...

    Evaluating the effects of some or all academic research funding is difficult because of the many different and overlapping sources, types, and scopes. It is therefore important to identify the key aspects of research funding so that funders and others assessing its value do not overlook them. This article outlines 18 dimensions through which funding varies substantially, as well as three ...

  3. Defense Primer: RDT&E

    Within the S&T program, basic research (6.1) receives special attention, particularly by the nation's universities. DOD spends nearly half of its basic research budget at universities. DOD is a substantial source of federal university R&D funding for disciplines such as aerospace, aeronautical, and astronautical engineering (64%);

  4. Federal Research and Development R&D Funding: FY2023

    The President's FY2023 budget request would increase funding for basic research by $13.9 billion (33%), applied research by $11.7 billion (28%), development by $19.1 billion (27%), and R&D facilities ... Congressional Research Service 2 Definitions Associated with Federal Research and Development Funding

  5. Funding of science

    Research funding is a term generally covering any funding for scientific research, in the areas of natural science, technology, and social science. Different methods can be used to disburse funding, but the term often connotes funding obtained through a competitive process, in which potential research projects are evaluated and only the most ...

  6. (PDF) What is research funding, how does it influence research, and how

    Research funding does not necessarily hav e the goal to fund research because some streams support network formation in the expectation that the network will access other resources to support ...

  7. Federal Research and Development R&D Funding: FY2024

    Definitions Associated with Federal Research and Development Funding Two key sources of definitions associated with federal research and development (R&D) funding are the White House Office of Management and Budget (OMB) and the National Science Foundation (NSF).

  8. Research Funding

    Measuring Research Funding. The term bibliometrics was first used by Alan Pritchard in 1969, with definition—the application of mathematics and statistical methods to books and other media of communication (Pritchard, 1969). Research funding influences the volume of the research activities as well as the productivity of researchers within the ...

  9. Funding for Research: Importance, Types of Funding, and How to Apply

    Funding for research is the catalyst that fuels groundbreaking discoveries. Image by cottonbro on Pexels.com. Embarking on a PhD or research journey is akin to embarking on a quest for knowledge, a quest that often hinges on a crucial ally - funding for research.

  10. Research Funding 101: Finding and Obtaining Grants

    Public vs. private funding. The two main types of research funding are public and private. "Public grant funding comes from federal or state governments," says Dr. Albert. "Money is typically set aside because there is a specific problem or issue that the government wants rigorous research information on.

  11. Basics of scientific and technical writing: Grant proposals

    Grant proposals. A grant proposal is a formal document you submit to a funding agency or an investing organization to persuade them to provide the requested support by showing that (1) you have a plan to advance a certain valuable cause and (2) that the team is fully capable of reaching the proposed goals. The document may contain a description ...

  12. The value of research funding for knowledge creation and ...

    The Swiss research funding system is characterized by a relatively strong ... The key difference between the RCR and the FCR is that the FCR uses fixed definition of the research field, while for ...

  13. Understanding Types of Grants and Funding

    Understanding Types of Grants and Funding. The NIDCD funds research through a variety of award mechanisms. Research grants (R series) These grants may be awarded to individuals at universities, medical and other health professional schools, colleges, hospitals, research institutes, for-profit organizations, and government institutions.

  14. Research Funding

    Definition; Research Funding is a term generally covering any funding for scientific research, in the areas of both "hard" science and technology and social science. The term often connotes funding obtained through a competitive process, in which potential research projects are evaluated and only the most promising receive funding. ...

  15. Research funding and collaboration

    Research funding contests may promote collaboration through several channels. First, preparing and submitting funding proposals requires proposal team members to develop joint plans for their proposed research and to invest resources in the pursuit of a shared idea. ... Our 'rolling window' definition of N t allows us to capture the birth ...

  15. Types of research funding

    UKRI, primarily through its Research Councils, invests approximately £2.8 billion per annum in research ranging from medical and biological sciences to astronomy, physics, chemistry, engineering, social sciences, economics, and the arts and humanities. The aim, scale, and balance of the projects funded reflect the national research priorities agreed in consultation with Government.

  16. Research Funding Agreement: Definition & Sample

    A research funding agreement is a contract between a sponsor and a researcher under which the sponsor agrees to provide money toward a specific research project.

  17. 2 CFR 200.1: Definitions (eCFR)

    The notice of funding opportunity is any paper or electronic issuance that an agency uses to announce a funding opportunity, whether it is called a "program announcement," "notice of funding availability," "broad agency announcement," "research announcement," "solicitation," or some other term.

  18. Research and Development Funding

    Research, Development, Test and Evaluation (RDT&E) funding is used to pay the operating costs of dedicated activities engaged in the conduct of research, development, and test and evaluation efforts performed by a contractor and/or government organization. It is used to develop equipment, material, or computer application software.

  19. Defense Science and Technology Funding

    A number of recommendations have been put forth by various organizations regarding the appropriate level of funding for Defense S&T and DOD basic research, as well as the level of funding for investments in research supporting potentially revolutionary advancements.

  20. Environmental sustainability in research

    The policy also describes changes to allowable costs for those in receipt of CRUK funding, as well as broader changes that all researchers and staff engaged in cancer research are encouraged to adopt to be more sustainable. This policy builds on our previous position statement and now sets more specific requirements of those we fund.
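
Entry 12 above distinguishes the RCR from the FCR by how each defines the research field used to normalize citation counts. As a purely illustrative aid, the sketch below shows the arithmetic that such field-normalized ratios share: an article's citations divided by the mean citations of a benchmark set drawn from its field. The function name, the example numbers, and the benchmark set are hypothetical, and this is not the published RCR or FCR methodology; it only assumes that the two metrics differ chiefly in how the benchmark (field) set is chosen.

    # Minimal illustrative sketch in Python (hypothetical names and numbers; not
    # the published RCR or FCR formulas). A field-normalized citation ratio
    # divides an article's citation count by the average citations of a benchmark
    # set of articles representing "its field"; fixing that set in advance is the
    # FCR-style variant mentioned in entry 12 above.
    from statistics import mean
    from typing import Iterable

    def field_normalized_ratio(article_citations: int,
                               field_benchmark_citations: Iterable[int]) -> float:
        """Citations of one article divided by the mean citations of its field.

        The benchmark may be a fixed subject category or any other comparison
        set; the arithmetic stays the same, only the set changes.
        """
        benchmark = list(field_benchmark_citations)
        if not benchmark:
            raise ValueError("benchmark set must not be empty")
        field_average = mean(benchmark)
        if field_average == 0:
            raise ValueError("field average is zero; the ratio is undefined")
        return article_citations / field_average

    if __name__ == "__main__":
        # Hypothetical example: an article cited 24 times, benchmarked against a
        # fixed set of field articles averaging 8 citations each.
        print(round(field_normalized_ratio(24, [4, 6, 8, 10, 12]), 2))  # 3.0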
