Scientific Evaluation Sample Clauses

Scientific Evaluation. The proposals that are found administratively eligible will be scientifically evaluated by three Turkish- and three Maltese-nominated external reviewers. Following the scientific evaluation by the external reviewers, TÜBİTAK and MCST will exchange the Project Proposal Evaluation Forms for perusal. The Managing Authorities will make their decisions by considering the evaluations of the external reviewers from both countries. TÜBİTAK and MCST will select the project proposals to be funded during a joint committee meeting. Only the project proposals approved by both TÜBİTAK and MCST will be supported, within the available budget. The intent of this 4th Call is to award five high-ranking projects, ideally including at least one PRIMA-related project.
Scientific Evaluation. Each Managing Authority will follow its own scientific evaluation process as defined in the National Rules & Regulations. Upon completion of these evaluation processes, the Managing Authorities will share the evaluation results with each other. The final decision is taken by the Joint Committee, which has equal representation of experts from both Managing Authorities. The Joint Committee will select for funding projects that have been evaluated as the most successful by both Managing Authorities' evaluation processes, based on the following call criteria:
- Scientific / Technological Excellence
- Methodology
- Project Management
- Importance of the international collaboration
- Impact
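Read operationally, this clause amounts to funding only projects that rank among the top-rated proposals of both Managing Authorities. The following minimal Python sketch illustrates one possible reading; the ranking format, the cutoff, and the combined-rank ordering are illustrative assumptions, not terms of the clause.

```python
# Illustrative sketch only: the clause prescribes no data format or algorithm.
# All names (joint_selection, rank_ma1, rank_ma2, cutoff) are hypothetical.

def joint_selection(rank_ma1, rank_ma2, cutoff):
    """Select projects ranked within `cutoff` by BOTH Managing Authorities,
    mirroring the requirement that funded projects be among the most
    successful under both evaluation processes."""
    top1 = {p for p, rank in rank_ma1.items() if rank <= cutoff}
    top2 = {p for p, rank in rank_ma2.items() if rank <= cutoff}
    shortlist = top1 & top2
    # Order the shortlist by combined rank for the Joint Committee's review.
    return sorted(shortlist, key=lambda p: rank_ma1[p] + rank_ma2[p])

# Example: four proposals ranked independently by each authority.
ma1 = {"P1": 1, "P2": 2, "P3": 3, "P4": 4}
ma2 = {"P1": 2, "P2": 5, "P3": 1, "P4": 3}
print(joint_selection(ma1, ma2, cutoff=3))  # ['P1', 'P3']
```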
Scientific Evaluation. The quality of the reports is determined by the Technical Committee, or experts commissioned by the latter, in accordance with the Xxxxxxxx et al. method, by classifying each report into one of the following categories: (1) reliable without restriction, (2) reliable with restrictions, (3) not reliable, (4) not assignable. The allocation to the four categories must be accompanied by appropriate substantiation in accordance with the requirements described in the chapter "Documentation of reliability categories in data sheets (IUCLID)" of the Xxxxxxxx et al. publication. The quality of the robust summaries and IUCLID data sets is determined by the IP Consortium, or experts commissioned by the latter. If the documents (IUCLID data set and/or robust summary) submitted by a party supplying a report are missing or not in conformity with the state of the art, the IP Consortium, or experts commissioned by the latter, should develop a robust summary and an IUCLID update. Studies for which no standard protocol exists (e.g., exposure studies) must also be documented by an IUCLID data set and a robust summary, and are likewise to be evaluated under the Xxxxxxxx et al. method.
Scientific Evaluation. The Parties will independently carry out the scientific assessment of the received applications. For CNR, the assessment will be performed by an Expert Committee whose appointment is published on the CNR institutional website, xxxxx://xxx.xxx.xx/it/progetti-comuni-ricerca. For the Tunisian side, it will be performed by an Expert Committee designated by the MHESR.
Scientific Evaluation.
1.1. For reports contributed by individual members of the consortium, the supplier provides the consortium with the report itself and any existing and available summaries in the form of an IUCLID data set and a robust summary. The robust summary may also be integrated into the IUCLID data set.
1.2. The quality of the reports is determined by the Industry Technical Panel, or experts commissioned by the Steering Committee, in accordance with the Xxxxxxxx et al. method.
Scientific Evaluation. A member of the Scientific Liaison Panel (SLP) who is an expert on the respective proposal topic is allocated to each proposal. This SLP member accompanies the proposal and is responsible for it throughout the different steps of the evaluation process and, if the proposal is successful, afterwards for the cruise reporting as well. One member of the Advisory Board (AB) participates in the SLP meetings, ensuring the transparency of the evaluation process. The ARICE Evaluation Office maintains a list of expert evaluators to assist in the evaluation of all proposals for funding. The names of the experts assigned to individual proposals are not made public. Evaluators are required to read and sign a Declaration of Confidentiality and Conflict of Interest Form. Proposals meeting the eligibility criteria are evaluated on their individual merit by, as a general rule, three individual evaluators. Evaluators are chosen by mutual agreement of the Scientific Liaison Panel and the Evaluation Office. The experts examine the proposal(s) assigned to them and score and comment on each proposal under each of the Evaluation Criteria (see below) using an individual Proposal Assessment Form.
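As a rough illustration of how the three evaluators' individual forms might be aggregated, the Python sketch below averages per-criterion scores. The criterion labels, the scoring scale, and all function names are assumptions; the clause itself defers the Evaluation Criteria to a separate section.

```python
# Hypothetical sketch of score aggregation; the actual Proposal Assessment
# Form, criteria names, and scale are not specified in the clause.
from statistics import mean

CRITERIA = ["excellence", "impact", "implementation"]  # assumed labels

def consensus_scores(assessment_forms):
    """Average each criterion over the (as a general rule, three)
    evaluators' individual Proposal Assessment Forms."""
    return {c: mean(form[c] for form in assessment_forms) for c in CRITERIA}

forms = [
    {"excellence": 4.5, "impact": 4.0, "implementation": 3.5},
    {"excellence": 4.0, "impact": 4.5, "implementation": 4.0},
    {"excellence": 5.0, "impact": 3.5, "implementation": 4.0},
]
print(consensus_scores(forms))
# {'excellence': 4.5, 'impact': 4.0, 'implementation': 3.833...}
```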
Scientific Evaluation. For reports contributed by individual Members, the supplier provides the Consortium with the report itself and any available summaries in the form of an IUCLID data set and a robust summary. The robust summary may also be integrated into the IUCLID data set. The quality of the reports is determined by the Technical Committee, or experts commissioned by the latter, in accordance with the Xxxxxxxx et al. method, by classifying each report into one of the following categories: (1) reliable without restriction, (2) reliable with restrictions, (3) not reliable, (4) not assignable. The allocation to the four categories must be accompanied by appropriate substantiation in accordance with the requirements described in the chapter "Documentation of reliability categories in data sheets (IUCLID)" of the Xxxxxxxx et al. publication. The quality of the robust summaries and IUCLID data sets is determined by the Technical Committee, or experts commissioned by the latter. If the documents (IUCLID data set and/or robust summary) submitted by a party supplying a report are missing or not in conformity with the state of the art, the Technical Committee, or experts commissioned by the latter, should develop a robust summary and an IUCLID update. Studies for which no standard protocol exists (e.g., exposure studies) must also be documented by an IUCLID data set and a robust summary, and are likewise to be evaluated under the Xxxxxxxx et al. method.
Scientific Evaluation. The Parties will independently carry out the scientific assessment of the received applications. For CNR, the assessment will be performed by an Expert Committee whose appointment is published on the CNR institutional website, xxxxx://xxx.xxx.xx/it/progetti-comuni-ricerca. For the Tunisian side, it will be performed by an Expert Committee designated by the MHESR. The Committees will select the projects worthy of financing according to the following criteria:
1. Project Quality (maximum 5 points): scientific relevance (concept, innovative character with respect to the state of the art, quality of the goals); methodology and activities work plan; impact of project results, dissemination, and use.
2. Quality of the Research Group (maximum 5 points): qualifications and expertise; complementarity in terms of content, methodology and equipment; justification of the necessity for collaboration.
3. Appropriateness and justification of the work plan (maximum 5 points).
4. Added value and a predictably wider impact owing to the bilateral cooperation (maximum 5 points).
The outcome of the applications' scientific assessment will be notified to the Italian and Tunisian responsible parties.
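For illustration, the rubric above can be read as a 20-point scale (four criteria, a maximum of 5 points each). The Python sketch below totals a proposal's criterion scores; the key names, the validation logic, and the assumption of numeric scoring are illustrative, not part of the clause.

```python
# Minimal sketch of the clause's scoring rubric. Only the four criteria and
# their 5-point maxima come from the clause; everything else is assumed.

MAX_POINTS = {
    "project_quality": 5,
    "research_group_quality": 5,
    "work_plan": 5,
    "bilateral_added_value": 5,
}

def total_score(scores):
    """Sum the four criterion scores (overall maximum: 20 points),
    rejecting unknown criteria and out-of-range values."""
    for criterion, value in scores.items():
        if criterion not in MAX_POINTS:
            raise ValueError(f"unknown criterion: {criterion}")
        if not 0 <= value <= MAX_POINTS[criterion]:
            raise ValueError(f"{criterion} must be 0..{MAX_POINTS[criterion]}")
    return sum(scores.values())

print(total_score({
    "project_quality": 4,
    "research_group_quality": 5,
    "work_plan": 3,
    "bilateral_added_value": 4,
}))  # 16
```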
Scientific Evaluation.
- To prioritise the principle of caution. “Before launching a research project, an evaluation of the impacts should be performed beforehand. (…) [A good practice is] the evaluation performed by the scientists of the UN, the IPCC [Intergovernmental Panel on Climate Change]. (…) Unfortunately, all the evaluations I know are a posteriori and not a priori. [Another good practice] is the EU’s elaboration of the REACH legislation on the evaluation of cancer-related illnesses. There are many products with unknown impacts on health and the environment. Thanks to that evaluation, a new regulation will be established for controlling these types of products.” (NSCI)
- To learn from good practices. “To systematize good practices, to combine ex ante and ex post evaluation, to professionalize evaluation…” (MAN) “[A good practice is] the independent evaluation of the CSIC when elaborating its Planning of Activities for the period 2006-2009.” (PBMAN)
- Transparency and decentralisation. “We often do not know who we are evaluating, their curricula, who has chosen them, which criteria they are using… I think transparency and decentralisation are crucial.” (SSCI)
- Internationalisation of evaluation processes. “To externalise and internationalise evaluation processes.” (PBMAN)
- To apply the consequences of evaluation: rewards and punishments. “Measurable indicators should be implemented. (…) And sometimes drastic decisions must be adopted [for example, to fire professors or researchers].” (PVMAN) “For me it is important to have an evaluation that recognises the excellence of scientific work, giving more help to those who are doing it better. (…) What is well done should be rewarded, and what is not well done should not.”
Scientific Evaluation.
1.1. For reports contributed by individual members of the consortium, the supplier provides the consortium with the report itself and any existing and available summaries in the form of an IUCLID data set and a robust summary. The robust summary may also be integrated into the IUCLID data set.
1.2. The quality of the reports is determined by the Technical Committee, or experts commissioned by the latter, in accordance with the Xxxxxxxx et al. method, by classifying each report into one of the following categories: (1) reliable without restriction, (2) reliable with restrictions, (3) not reliable, (4) not assignable.
1.3. The allocation to the four categories must be accompanied by appropriate substantiation in accordance with the requirements described in the chapter "Documentation of reliability categories in data sheets (IUCLID)" of the Xxxxxxxx et al. publication.
1.4. The quality of the robust summaries and IUCLID data sets is determined by the Technical Committee, or experts commissioned by the latter.
1.5. If the documents (IUCLID data set and/or robust summary) submitted by a party supplying a report are missing or not in conformity with the state of the art, the Technical Committee, or experts commissioned by the latter, should develop a robust summary and an IUCLID update.
1.6. Studies for which no standard protocol exists (e.g., exposure studies) must also be documented by an IUCLID data set and a robust summary, and are likewise to be evaluated under the Xxxxxxxx et al. method.
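For readers modelling this workflow in software, the Python sketch below encodes the four reliability categories named in clause 1.2 and the flagging rule of clause 1.5. The data fields and function names are assumptions for illustration; the (redacted) Xxxxxxxx et al. classification logic itself is not reproduced.

```python
# Minimal sketch, not the actual method: Report fields and function names
# are hypothetical, introduced only to illustrate the clause's structure.
from dataclasses import dataclass
from enum import Enum

class Reliability(Enum):
    # The four categories named in clause 1.2.
    RELIABLE_WITHOUT_RESTRICTION = 1
    RELIABLE_WITH_RESTRICTIONS = 2
    NOT_RELIABLE = 3
    NOT_ASSIGNABLE = 4

@dataclass
class Report:
    title: str
    reliability: Reliability
    substantiation: str                      # required by clause 1.3
    has_iuclid_dataset: bool = False
    has_robust_summary: bool = False
    conforms_to_state_of_the_art: bool = True

def needs_committee_update(report: Report) -> bool:
    """Flag reports for which, per clause 1.5, the Technical Committee (or
    commissioned experts) should develop a robust summary and an IUCLID
    update: the submitted documents are missing or not state of the art."""
    missing = not (report.has_iuclid_dataset and report.has_robust_summary)
    return missing or not report.conforms_to_state_of_the_art
```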