
Scientific evaluation practices

The scientific evaluation of an ERIC is a complex task encompassing many aspects. A credible, independent and high-quality scientific evaluation is essential for the scientific communities that access an ERIC's services, facilities, samples and data to generate high-quality data and robust results, particularly in the context of the debate on the reproducibility of research results. Evaluation should cover the technology, methodology, quality of services, cost model, access procedures, scientific impact of supported projects and socio-economic impact, as well as the organisation itself and, if it is distributed, its national nodes.

The scientific evaluation process and the areas to be evaluated should correspond to the strategic objectives of the research infrastructure (RI). Alignment of stakeholders’ expectations and procedures, as well as guidance on key performance indicators (KPIs) and socio-economic impact (SEI), would be useful.

TIP

Scientific evaluation should cover the technology, methodology, quality of services, cost model, access procedures, scientific impact of supported projects and socio-economic impact, as well as the organisation and, if distributed, its national nodes.

How is scientific evaluation foreseen by ERICs?

ERIC Forum analysed the statutes of the 23 ERICs created to date (March 2022) to check how scientific evaluation is foreseen in them.

  • Half of the ERICs’ statutes include an article with a provision on scientific evaluation, or mention scientific evaluation in the statutes or in an annex. In most cases, the body in charge of the evaluation is named in the statutes.
  • For 10 infrastructures, the evaluation is to be performed by independent international experts or an independent body, appointed by the General Assembly.
  • For 4 infrastructures, the body is an internal scientific monitoring group, which can be complemented by additional experts appointed specifically for the evaluation.
  • For 2 ERICs, the statutes do not mention which body is foreseen for the evaluation.
  • The frequency of the evaluation varies from one to five years (five years being the most common), depending on the infrastructure. For 2 infrastructures, the frequency is not mentioned in the statutes.

Case studies from two ERICs

ERIC Forum has identified and compared the scientific evaluation practices of two ERICs, ECRIN-ERIC and ICOS ERIC, both of which went through an independent external evaluation. Their experience was shared during a joint meeting with the Horizon 2020 Accelerate project on 16 December 2020 and is summarised below. See Deliverable 4.2 for a more detailed comparison of their methodology, criteria, schedule and reporting.

Case ICOS ERIC

The scientific evaluation of ICOS ERIC was foreseen in its statutes, which specified the decision body, the periodicity, the type of evaluation committee, the topics to be evaluated and the reporting.

ICOS ERIC evaluated 15 subcategories against 36 criteria.

CATEGORIES, SUBCATEGORIES AND CRITERIA

MANAGEMENT

General management
  • Criterion 1: Management processes are in place
  • Criterion 2: Documentation is available
  • Criterion 3: Processes are well executed

Operational management
  • Criterion 1: Availability of technical requirements for ICOS instrumentation
  • Criterion 2: Availability of ICOS-approved operational practices for the measurement of variables
  • Criterion 3: Stations are labelled
  • Criterion 4: Data coverage in temporal and spatial dimensions is effective
  • Criterion 5: New technologies are implemented

Data life cycle
  • Criterion 1: Data workflows are well defined and effective
  • Criterion 2: Data is made available in a timely fashion
  • Criterion 3: Data is compliant with FAIR principles
  • Criterion 4: All data and data-related services are available via the Carbon Portal as the single access point/centralised entry gateway

FINANCIAL MANAGEMENT

Core funding
  • Criterion 1: The amount of core funding is in line with operations
  • Criterion 2: Measures to monitor mid-term financial sustainability are implemented
  • Criterion 3: Risk mitigation methods are in use

Project funding
  • Criterion 1: Project funding is actively sought and reported
  • Criterion 2: Project funding is effectively used and its usage is monitored

INTERNAL ENGAGEMENT AND INTEGRATION

Internal engagement
  • Criterion 1: ICOS participants feel that their work is recognised, identify themselves as ICOS partners and are active in branding ICOS
  • Criterion 2: ICOS participants are interested in and participate in common activities, as well as take part in organising them

Internal integration and structure
  • Criterion 1: Internally, ICOS is a well-integrated organisation in which participants feel properly included
  • Criterion 2: The ICOS organisation has the ability to improve its activities and respond in an agile way to new opportunities or challenges
  • Criterion 3: ICOS has potential for an alternative and improved structure

ICOS DATA AND USER EXPECTATIONS

A priori design
  • Criterion 1: ICOS participates, or enables participation, in international efforts to co-design standards for ICOS measurements

Data download
  • Criterion 1: ICOS data is downloaded from the Carbon Portal by users in all ICOS domains
  • Criterion 2: ICOS data is downloaded via other portals

ICOS data usage
  • Criterion 1: ICOS data is used and cited in scientific publications
  • Criterion 2: ICOS data is used across different scientific fields
  • Criterion 3: ICOS data is used in educational tools and education activities

Active data promotion and meeting user/stakeholder expectations
  • Criterion 1: ICOS facilitates scientific initiatives successfully
  • Criterion 2: ICOS Science Conferences successfully enable scientific exchange
  • Criterion 3: Articles are published in online media/general media outlets, and the RI is present on social media

Downstream private sector cooperation for ICOS data usage
  • Criterion 1: ICOS engages in downstream projects with the private sector

INTERNATIONAL COOPERATION

Estimation of the intensity of ICOS international cooperation
  • Criterion 1: Cooperation with the main actors of the European and global GHG information systems
  • Criterion 2: Relevance for the global response to climate change

The individual level of ICOS involvement in international cooperation
  • Criterion 1: Participation in events of regional or global relevance

ICOS international cooperation in the eyes of the stakeholders
  • Criterion 1: Common observational sites with other RIs at country level
  • Criterion 2: Formal agreements (Memoranda of Understanding, MoUs) with other RIs or organisations

The whole process, from the mandate given by the ICOS ERIC General Assembly to the Head Office to coordinate with the external evaluation committee through to the final report, took one year. The ICOS office supported the concept development and prepared the evidence report (documentation and data). The overall workload was high, estimated at 2 full-time equivalents.

Read more about the process in the links below.

Case ECRIN ERIC

The scientific evaluation of ECRIN ERIC was foreseen in its statutes, which specified the decision body, the periodicity and the type of evaluation committee.

The scientific evaluation focused on three main domains: positioning and strategy, governance and management, and activities. These domains were assessed against 14 standards, considering the following three criteria: the quality of services provided to support research and excellence, the impact and relevance for society, and the sustainability and management efficiency. Other aspects, such as research integrity, ethics, capacity building and interaction with other organisations, could also be considered.

DOMAINS AND STANDARDS

Positioning and strategy

  • Standard 1: the ERIC presents its positioning and its operation model in light of its missions in the European landscape of research and innovation.
  • Standard 2: the ERIC has an institutional strategy in relation to its missions and skills in the European landscape of research infrastructures and innovation.
  • Standard 3: the ERIC has a strategy of alliances and partnerships on a local, national and international level.

Governance and management

  • Standard 4: the ERIC defines a functional and geographical organisation for the implementation of its activities in support of its missions and strategy.
  • Standard 5: the governance of the ERIC is based on authorities and decision-making processes consistent with the strategy and chosen modes of action.
  • Standard 6: the ERIC has implemented an overall quality policy which takes into account the monitoring of all activities and results, and the implementation of corrective actions.
  • Standard 7: the ERIC develops a communication policy.
  • Standard 8: the ERIC manages the multi-annual implementation of its strategy by using prospective analysis tools.
  • Standard 9: the ERIC structures its management processes and relies on a suitable set of support and assistance services.
  • Standard 10: Data management.
  • Standard 11: Intellectual property.

Activities

  • Standard 12: Service provision to users.
  • Standard 13: the ERIC demonstrates its ability to monitor, analyse and qualify the results of its various activities.
  • Standard 14: the ERIC controls its development trajectory.

Read more about ECRIN’s process in the links below.

Other examples

Some ERICs are evaluated by their independent advisory board or committee (e.g. Scientific Advisory Board; Scientific and Technical Advisory Group; Scientific, Technical and Ethics Advisory Committee; …), which are consultative bodies composed of independent experts.

EURO-ARGO is one of the ERICs that underwent an evaluation after five years of existence. It prepared an activity report covering the evaluation period, including a set of KPIs, and also developed a five-year plan setting out the objectives for the next period.

Resources and further reading