
Why EU Grant Proposals Fail: 7 Mistakes from Real Evaluation Reports

February 23, 2026
11 min read

Horizon Europe funds around 17% of submitted proposals on average. For programmes like the EIC Accelerator the rate drops to 7%. That means the vast majority of applicants receive an Evaluation Summary Report (ESR) with a rejection instead of a grant agreement. The reasons are remarkably consistent. This article draws on official REA guidance, published ESR patterns, and analysis from EU funding consultancies to identify the seven mistakes that appear most frequently in unsuccessful proposals.

Key figures at a glance:
  • 17%: average success rate across Horizon Europe calls (2021-2023)
  • ~7%: EIC Accelerator success rate at the full-application stage
  • 61,700+: eligible proposals submitted to Horizon Europe (2021-2023)
  • 70%: share of high-quality proposals left unfunded because of budget constraints, not quality
Why budget alone does not explain rejections
The oversubscription rate across Horizon Europe calls is 4.7x. Even if every below-threshold proposal were suddenly fixed, the budget could still only fund roughly one in five. However, analysis of ESRs shows that a significant share of rejected proposals fall below threshold on at least one criterion, meaning they would not be funded even in an unconstrained budget scenario. These are the avoidable failures this article addresses.
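The "one in five" claim follows directly from the oversubscription figure. A minimal sketch of that arithmetic, using the illustrative numbers quoted in this article rather than official EC statistics:

```python
# Illustrative arithmetic using the figures quoted above (not official EC data).
oversubscription = 4.7                  # eligible proposals per fundable slot
funding_ceiling = 1 / oversubscription  # share fundable even if every proposal were excellent
print(f"Maximum fundable share: {funding_ceiling:.0%}")  # roughly one in five
```

The point of the calculation: even a flawless applicant pool hits a hard ceiling of about 21%, so the avoidable failures discussed below are about not wasting your attempt on a below-threshold score, not about guaranteeing funding.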

1. Applying to the Wrong Call

Scope mismatch is the fastest route to automatic rejection. Calls in Horizon Europe Pillar II are defined at the level of specific “expected outcomes” written into the Work Programme. Evaluators are instructed to score proposals that do not demonstrably address those outcomes below threshold on Excellence, regardless of how technically strong the underlying project is.

“The proposal does not sufficiently address the expected outcomes listed in the call. The connection between the project objectives and the specific challenge described in the Work Programme remains unclear.”

Typical ESR language, Horizon Europe Pillar II collaborative calls

This mistake is distinct from low quality. The project may be excellent for a different call. The fix requires reading the full Work Programme section, not just the call title, and mapping each expected outcome to a specific section of the proposal.

What this looks like in practice
  • Applying to a climate resilience call with a project primarily about urban logistics efficiency
  • Submitting a TRL 2-3 basic research project to an EIC Accelerator call expecting TRL 6+
  • Reusing a previous proposal with minimal edits for a different topic area

2. Failing the Innovation Bar: Weak State-of-the-Art Analysis

The Excellence criterion requires proposals to demonstrate that the proposed work is “ambitious and goes beyond the state-of-the-art.” This is one of the most common below-threshold findings. Evaluators are domain experts. A superficial literature review, selective citations, or claims of novelty that ignore competing approaches are spotted immediately.

“The applicant does not convincingly position the proposed research relative to recent work in the field. Several relevant competing approaches are not discussed, and the specific contribution beyond existing methods is not sufficiently articulated.”

Typical ESR language, ERC Starting Grant calls

Three specific failures recur in ESR feedback on this point:

  1. Cherry-picked comparisons. Citing only older or weaker competing work while ignoring recent publications or commercially available alternatives.
  2. Generic novelty claims. Statements like “this has never been done before” without evidence. Evaluators expect a structured comparison: what exists, where it falls short, and exactly how your approach differs.
  3. No positioning table. A comparative table showing your approach versus alternatives across key dimensions is a standard expectation in funded proposals. Its absence is noticed.

3. A Vague Impact Section with No Measurable Outcomes

The Impact criterion is weighted equally with Excellence and Implementation in most Horizon Europe calls. Yet it consistently generates the weakest scores. According to the European Research Executive Agency, a typical mistake is “addressing too few impact dimensions” and confusing project results with project impact.

“The impact section reads as a list of results rather than a demonstration of societal and economic change. Expected impacts are described qualitatively without Key Performance Indicators, timelines, or a plausible pathway from project output to real-world change.”

REA common mistakes guidance; Europa Media Trainings ESR analysis

Horizon Europe defines impact across three dimensions: scientific or technological, economic, and societal. A proposal that only covers one or two of these will be scored accordingly. Evaluators also look for:

  • Specific, quantified KPIs with target values and measurement methods
  • A credible exploitation plan with named commercial pathways, not “we will seek licensing deals”
  • An explanation of who benefits and when, including intermediate and end beneficiaries
  • Connection to specific EU policy objectives such as Green Deal targets or Digital Decade benchmarks

Impact is not the same as results. Results are what the project produces: a dataset, a prototype, a publication. Impact is what changes in the world as a result. The pathway between the two must be explicitly argued, not assumed.


4. An Unconvincing or Internally Inconsistent Work Plan

The Implementation criterion covers the work plan, budget, and consortium composition. Evaluators read it to answer one question: can this team actually deliver what they are promising, in the time they are claiming, with the resources they are requesting?

“The work plan does not provide sufficient detail on the methodology for Work Package 3. Task dependencies are not clearly articulated, and the timeline for the validation phase appears optimistic given the regulatory requirements described elsewhere in the proposal.”

Typical ESR language, Horizon Europe collaborative projects

Three internal consistency failures are flagged most often:

  1. Disconnected sections. An objective described in Section 1 (Excellence) that does not appear in the work packages in Section 3 (Implementation), or an impact claim in Section 2 that has no corresponding dissemination activity. Evaluators cross-reference sections.
  2. Unrealistic timelines. Regulatory approval, clinical trials, or user validation studies compressed into implausibly short periods. If you describe a complex regulatory pathway in Section 1, the work plan must reflect the actual timeline.
  3. Missing risk analysis. Horizon Europe proposals are expected to include a risk table covering technical, organisational, and external risks with mitigation measures. Proposals that omit this or treat it as a formality are scored down.

5. A Consortium Built for Convenience, Not Competence

Evaluators assess consortia against two questions: does this team have the expertise to deliver the work, and does the composition make sense for the call objectives? PNO Innovation, one of Europe’s largest grant consultancies, identifies three structural mistakes that appear consistently in rejected consortium-based proposals.

  1. Geographic concentration. Several Horizon Europe call topics explicitly favour multi-country teams. A consortium dominated by organisations from a single country triggers scrutiny on whether the “European added value” criterion is met.
  2. Partners without a clear role. Adding organisations to reach minimum consortium size, without assigning them meaningful tasks, is visible in the work plan. Evaluators comment on partners whose contribution is listed only as “dissemination and stakeholder engagement” with no specific tasks or deliverables.
  3. Missing end-user representation. For applied research calls, the absence of a partner who will actually use the outputs, such as a public authority, a health system, or an industry operator, weakens both the Impact and Implementation scores.

“While individual partner competences are described, the synergies between partners are not sufficiently demonstrated. It is not clear why this specific combination of organisations is the optimal consortium to address the challenge.”

Typical ESR language, Horizon Europe Pillar II Research and Innovation Actions

6. Weak Dissemination and Exploitation Plans

Horizon Europe introduced a stronger requirement for exploitation planning compared to Horizon 2020, but many proposals still treat dissemination and exploitation as a compliance checkbox rather than a substantive section. This directly lowers the Impact score.

The distinction between the three concepts matters to evaluators:

  • Dissemination: sharing results with the scientific community. Common mistake: listed as “we will publish papers” with no venues, audiences, or schedule.
  • Communication: reaching broader public audiences. Common mistake: conflated with dissemination; no specific public engagement activities.
  • Exploitation: turning results into value (commercial, policy, or social). Common mistake: deferred to after the project; no IP ownership plan; no commercial pathway described.

Evaluators flag proposals that address exploitation as “too early to consider” or that list only academic publications as exploitation outputs. For industry-facing calls, the absence of a route-to-market analysis is a near-automatic score reduction on Impact.


7. Poor Readability and Structural Incoherence

Each proposal is reviewed by three to five independent experts who spend roughly two to three hours on it before the consensus meeting. Dense, unstructured text that buries key information penalises even technically strong proposals. This is not a stylistic issue: it directly affects whether evaluators can find and credit the content that would justify a higher score.

Common structural failures cited in ESR feedback:

  1. Not following the template structure. The proposal template in the Funding and Tenders Portal has specific sections for a reason. Merging or reordering sections forces evaluators to search for information, which reduces scores.
  2. Objectives that are not objectives. Objectives must be specific, measurable, and bounded by the project duration. “To advance the understanding of X” is a research direction, not an objective. “To develop and validate a prototype of X achieving performance threshold Y by Month 30” is an objective.
  3. Unexplained acronyms and undefined terminology. Technical jargon without definition is penalised more in multi-disciplinary panels where not every evaluator is a specialist in your exact sub-field.
  4. No figures, tables, or diagrams. A 45-page proposal with no visual representation of the architecture, methodology, or work plan is harder to evaluate than one that uses structured visuals to support the text.

“The proposal would benefit from clearer objectives. As written, it is difficult to identify what specific, measurable outcomes the consortium commits to delivering within the project lifetime.”

Typical ESR language, Horizon Europe collaborative projects

Pre-Submission Checklist: 20 Questions Before You Submit

Use this checklist in the final week before submission. Each item maps to one of the seven failure modes above.

Scope and Call Fit
  • Every expected outcome from the Work Programme is addressed by name in the proposal
  • The TRL of your solution matches the call’s stated TRL range
  • You have read the full topic text, not just the call title and budget
Excellence
  • A comparison table positions your approach against at least three current alternatives
  • Every novelty claim is supported by a reference or by prior project results
  • Objectives are specific, measurable, and bounded by project month numbers
Impact
  • At least five KPIs with numerical targets and measurement methods are listed
  • Scientific, economic, and societal impact dimensions are all addressed
  • The exploitation plan names specific routes to market or policy uptake, not generic intentions
  • Dissemination, communication, and exploitation activities are listed separately with timelines
Implementation
  • Every objective in Section 1 appears as a task or deliverable in the work plan
  • The Gantt chart shows critical path dependencies and go/no-go milestones
  • The risk table covers technical, organisational, and external risks with mitigation measures
  • Each partner’s role, person-months, and unique contribution are clearly described
Readability and Compliance
  • All acronyms are defined at first use
  • The proposal follows the template section order exactly
  • Each section is within its page or word limit
  • An external reviewer unfamiliar with the project has read the Impact section
  • The document was submitted at least 24 hours before the deadline to allow for technical issues
  • Ethics and Open Science sections are completed, not left as placeholders

What to Do When You Receive an ESR

Every applicant who submits an eligible proposal receives an Evaluation Summary Report via the Funding and Tenders Portal, regardless of outcome. The ESR is the most actionable document in the resubmission process.

  1. Read the consensus scores first. A score below 3.0 on any criterion signals a structural problem. A score of 3.5-4.0 often means the idea is sound but the writing is the issue.
  2. Map each comment to a specific proposal section. ESR comments are generally criterion-specific. Annotate your original proposal document with each comment alongside the relevant paragraph.
  3. Distinguish between fixable and unfixable issues. Scope mismatch often means applying to a different call, not rewriting. Weak impact can usually be addressed by adding KPIs and exploitation detail. A fundamentally weak scientific concept is harder to repair.
  4. Check resubmission eligibility. Most Horizon Europe calls allow resubmission. Some EIC calls impose a waiting period. Confirm the rules for your specific call before investing in a rewrite.
  5. Request a redress review if scores appear miscalculated. If you believe an evaluator has made a factual error, the Funding and Tenders Portal has a formal complaints process. This does not re-evaluate quality judgments, only procedural errors.
Rejection is not final
Many funded Horizon Europe projects were successful on their second or third submission. The combination of evaluator feedback and an additional year of project development frequently strengthens both the technical content and the team’s ability to articulate impact. Treat the ESR as a paid consultation.

Frequently Asked Questions

What is the Horizon Europe success rate?

The overall average success rate in Horizon Europe is approximately 17% across all calls (2021-2023). Rates vary significantly by programme: EIC Accelerator sits around 7% at the full-application stage, ERC Starting Grants are near 14-15%, and individual Pillar II cluster destinations can fall below 10% in highly competitive years.

What are the three evaluation criteria for Horizon Europe proposals?

All Horizon Europe collaborative project proposals are scored on three criteria: Excellence (quality of objectives, methodology, and state-of-the-art positioning), Impact (expected benefits, exploitation and dissemination plans, contribution to EU priorities), and Implementation (work plan credibility, consortium competence, resource allocation). Each criterion is scored 0-5, with minimum thresholds that vary by call.
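The threshold mechanics described above can be sketched as a simple check. The values used here (3.0 per criterion, 10.0 overall) are the common defaults for Horizon Europe collaborative calls, but thresholds are call-specific, so treat them as assumptions and confirm the exact figures in your Work Programme:

```python
# Sketch of the Horizon Europe threshold check. The default thresholds
# (3.0 per criterion, 10.0 overall) are assumptions based on common
# collaborative-call settings; always verify them in your call text.
def passes_thresholds(scores, per_criterion=3.0, overall=10.0):
    """scores: dict mapping each criterion to its consensus score on the 0-5 scale."""
    return (all(s >= per_criterion for s in scores.values())
            and sum(scores.values()) >= overall)

# A single below-threshold criterion fails the proposal even with a strong total:
print(passes_thresholds({"Excellence": 4.5, "Impact": 3.5, "Implementation": 4.0}))  # True
print(passes_thresholds({"Excellence": 4.5, "Impact": 2.5, "Implementation": 4.5}))  # False
```

Note that the second proposal has a higher total (11.5 versus 12.0 is close) yet still fails: thresholds apply per criterion, which is why a single weak Impact section can sink an otherwise excellent application.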

Can I resubmit a rejected Horizon Europe proposal?

Yes. Most Horizon Europe calls allow resubmission in subsequent deadlines unless the call text explicitly prohibits it. You should carefully review your Evaluation Summary Report (ESR), available in the Funding and Tenders Portal, and address every evaluator comment before resubmitting. Many funded projects were successful on their second or third attempt.

What is an Evaluation Summary Report (ESR) and how do I get it?

An ESR is the formal feedback document issued after proposal evaluation. It contains individual scores per criterion, overall consensus scores, and written comments from the evaluator panel. You can access it in the Funding and Tenders Portal under your submission regardless of the outcome. ESRs are available to all registered applicants, typically a few weeks after results are announced.

How many proposals does Horizon Europe receive per year?

Demand is very high. In 2021-2023, over 61,700 eligible proposals were submitted across all Horizon Europe calls, of which roughly 10,300 were selected for funding. For individual calls, submission volumes range from a few hundred on niche topics to over 10,000 for popular schemes such as the MSCA Postdoctoral Fellowships in 2024.

What is the most common reason EU grant proposals fail?

According to the European Research Executive Agency and proposal consultants, the most frequently cited weaknesses are: a weak or insufficiently evidenced Impact section, failure to demonstrate novelty beyond the state of the art, and an unconvincing or internally inconsistent work plan. Scope mismatch (applying to the wrong call) is also a leading cause of below-threshold scores.

Do I need a consultant to win an EU grant?

Not necessarily. Success rates for organisations with strong in-house expertise are comparable to those using consultants. However, first-time applicants consistently benefit from external review and coaching, particularly for the Impact section and consortium structuring. The key is understanding what evaluators look for before writing, not just writing a good project description.

How long does the Horizon Europe evaluation process take?

The standard timeline from submission deadline to results is 5-6 months for most Horizon Europe collaborative projects. ERC grants take 6-9 months due to the two-stage evaluation. EIC Accelerator uses a rolling cut-off system with results typically published 3-4 months after each cut-off date.

Find Calls Where Your Proposal Has the Best Chance

Scope mismatch is the most avoidable failure on this list, and it starts with finding the right call. The EU Funding and Tenders Portal lists thousands of open and upcoming calls, but navigating it to find the ones that genuinely match your project takes hours. GrantsFinder analyses your project description and returns the calls most semantically aligned with your work, ranked by relevance. You can use it free before committing to a full application.

If you are already preparing an application, the 10 expert tips guide covers the positive case: what funded proposals do well across each evaluation criterion. For a grounding in how the funding landscape is structured, EU Grants 101 explains programme types, eligibility rules, and the application process from scratch.
