Scientific Evaluation Sample Clauses

Scientific Evaluation. The proposals that are found administratively eligible will be scientifically evaluated by three Turkish- and three Maltese-nominated external reviewers. Following the scientific evaluation by the external reviewers, TÜBİTAK and MCST will exchange the Project Proposal Evaluation Forms for perusal. The Managing Authorities will make their decisions by considering the evaluations of the external reviewers from both countries. TÜBİTAK and MCST will select the project proposals to be funded during a joint committee meeting. Only the project proposals approved by both TÜBİTAK and MCST will be supported within the available budget. The intent of this 4th Call is to award five high-ranking projects, ideally including at least one PRIMA-related project.
Scientific Evaluation. Each Managing Authority will follow its own scientific evaluation process as defined in its National Rules & Regulations. Once the scientific evaluation processes are complete, the Managing Authorities will share the evaluation results with each other. The final decision is taken by the Joint Committee, which has equal representation of experts from both Managing Authorities. The Joint Committee will choose the projects to fund from among those evaluated as the most successful by both Managing Authorities’ evaluation processes, based on the following call criteria:
- Scientific / Technological Excellence
- Methodology
- Project Management
- Importance of the international collaboration
- Impact
Scientific Evaluation. The quality of the reports is determined by the Technical Committee, or experts commissioned by the latter, in accordance with the Xxxxxxxx et al.1 method, by classifying each report into one of the following categories: (1) reliable without restriction, (2) reliable with restrictions, (3) not reliable, (4) not assignable. The allocation to one of the four categories must be accompanied by appropriate substantiation in accordance with the requirements described in the chapter "Documentation of reliability categories in data sheets (IUCLID)" of the Xxxxxxxx et al. publication. The quality of the robust summaries and IUCLID data sets is determined by the IP Consortium, or experts commissioned by the latter. If the documents (IUCLID data set and/or robust summary) submitted by a party supplying a report are missing or not in conformity with the state of the art, the IP Consortium, or experts commissioned by the latter, should develop a robust summary and an IUCLID update. Studies for which no standard protocol exists (e.g., exposure studies) must also be documented by an IUCLID data set and a robust summary and are likewise to be evaluated under the Xxxxxxxx et al. method.
Scientific Evaluation. For reports contributed by individual Members, the supplier provides the Consortium with the report itself and any available summaries in the form of an IUCLID data set and a robust summary. The robust summary may also be integrated into the IUCLID data set. The quality of the reports is determined by the Technical Committee, or experts commissioned by the latter, in accordance with the Xxxxxxxx et al.1 method, by classifying each report into one of the following categories: (1) reliable without restriction, (2) reliable with restrictions, (3) not reliable, (4) not assignable. The allocation to one of the four categories must be accompanied by appropriate substantiation in accordance with the requirements described in the chapter "Documentation of reliability categories in data sheets (IUCLID)" of the Xxxxxxxx et al. publication. The quality of the robust summaries and IUCLID data sets is determined by the Technical Committee, or experts commissioned by the latter. If the documents (IUCLID data set and/or robust summary) submitted by a party supplying a report are missing or not in conformity with the state of the art, the Technical Committee, or experts commissioned by the latter, should develop a robust summary and an IUCLID update. Studies for which no standard protocol exists (e.g., exposure studies) must also be documented by an IUCLID data set and a robust summary and are likewise to be evaluated under the Xxxxxxxx et al. method.
Scientific Evaluation. 1.1. For reports contributed by individual members of the consortium, the supplier provides the consortium with the report itself and any existing and available summaries in the form of an IUCLID data set and a robust summary. The robust summary may also be integrated into the IUCLID data set.
Scientific Evaluation. A member of the Scientific Liaison Panel (SLP) with expertise in the respective proposal topic is allocated to each proposal. This SLP member accompanies the proposal and is responsible for it throughout the different steps of the evaluation process and, if the proposal is successful, afterwards for the cruise reporting. One member of the Advisory Board (AB) participates in the SLP meetings, ensuring the transparency of the evaluation process. The ARICE Evaluation Office maintains a list of expert evaluators to assist in the evaluation of all proposals for funding. The names of the experts assigned to individual proposals are not made public. Evaluators are required to read and sign a Declaration of Confidentiality and Conflict of Interest Form. Proposals meeting the eligibility criteria are evaluated on their individual merit by, as a general rule, three individual evaluators. Evaluators are chosen by mutual agreement between the Scientific Liaison Panel and the Evaluation Office. The experts examine the proposal(s) assigned to them and score and comment on each proposal under each of the Evaluation Criteria (see below) using an individual Proposal Assessment Form.
Scientific Evaluation.
- To prioritise the principle of caution. “Before launching a research project, an evaluation of the impacts should be performed beforehand. (…) [A good practice is] the evaluation performed by the scientists of the UN, the IPCC [Intergovernmental Panel on Climate Change]. (…) Unfortunately, all the evaluations I know are a posteriori and not a priori. [Another good practice] is the EU’s elaboration of the RICH legislation on the evaluation of cancer-related illnesses. There are many products with unknown impacts on health and the environment. Thanks to that evaluation, a new regulation will be established for controlling these types of products.” (NSCI)
- To learn from good practices. “To systematise good practices, to combine ex ante and ex post evaluation, to professionalise evaluation…” (MAN) “[A good practice is] the independent evaluation of the CSIC when elaborating its Planning of Activities for the period 2006-2009.” (PBMAN)
- Transparency and decentralisation. “We often do not know who we are evaluating, their curricula, who has chosen them, which criteria they are using… I think transparency and decentralisation are crucial.” (SSCI)
- Internationalisation of evaluation processes. “To externalise and internationalise evaluation processes.” (PBMAN)
- To apply the consequences of evaluation: rewards and punishments. “Measurable indicators should be implemented. (…) And sometimes drastic decisions must be adopted [for example, to fire professors or researchers].” (PVMAN) “For me it is important to have an evaluation that recognises the excellence of scientific work, helping more those who are doing it better. (…) What is well done should be rewarded, and what is not well done should not.”
Scientific Evaluation. The Parties will proceed, independently, to the scientific assessment of the received applications. For CNR, the assessment will be performed by an Expert Committee whose appointment is published on the CNR institutional website, xxxxx://xxx.xxx.xx/it/progetti-comuni-ricerca. For the Tunisian part, it will be performed by an Expert Committee designated by the MHESR. The Committee will select the projects worthy of financing according to the below-mentioned criteria:

Related to Scientific Evaluation

  • JOC EVALUATION If any materials being utilized for a project cannot be found in the RS Means Price Book, this question asks: what is the markup percentage on those materials? When answering this question, please insert the number that represents your proposed markup percentage. Example: if you are proposing a 30 percent markup, please insert the number "30". Remember that this is a ceiling markup. You may apply a lesser markup percentage to the TIPS Member customer when pricing the project, but not a greater percentage. EXAMPLE: You need special materials that are not in the RS Means Unit Price Book for a project. You would buy the materials and mark them up to the TIPS Member customer by the percentage you propose in this question. If the materials cost you, the contractor, $100 and you proposed a 30 percent markup on this question for the material, then you would charge the TIPS Member customer $130 for the materials. No response. TIPS/ESC Region 8 is required by Texas Government Code § 791 to be compensated for its work; thus, failure to agree shall render your response void and it will not be considered. Yes - No: Vendor agrees to remit to TIPS the required administration fee or, if resellers are named, guarantee the fee remittance by or for the reseller named by the vendor?
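  As an illustration only (not part of the clause), the ceiling-markup arithmetic in the JOC evaluation example above can be sketched in a few lines of Python; the function name tips_member_price and the 20 percent variant are hypothetical, while the $100 / 30 percent figures come from the clause's own worked example.

      def tips_member_price(material_cost: float, applied_pct: float, ceiling_pct: float) -> float:
          """Price charged to the TIPS Member customer for materials not in the RS Means book."""
          # The proposed percentage is a ceiling: a lesser markup is allowed, a greater one is not.
          if applied_pct > ceiling_pct:
              raise ValueError("markup may not exceed the proposed ceiling percentage")
          return material_cost * (1 + applied_pct / 100)

      print(tips_member_price(100.0, 30.0, 30.0))  # 130.0 -- the clause's worked example
      print(tips_member_price(100.0, 20.0, 30.0))  # 120.0 -- a permitted lesser markup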

  • Annual Evaluation The Partnership will be evaluated on an annual basis through the use of the Strategic Partnership Annual Evaluation Format as specified in Appendix C of OSHA Instruction CSP 00-00-000, OSHA Strategic Partnership Program for Worker Safety and Health. The Choate Team will be responsible for gathering required participant data to evaluate and track the overall results and success of the Partnership. This data will be shared with OSHA. OSHA will be responsible for writing and submitting the annual evaluation.
