Methodological approach. Econometric estimation of total factor productivity (TFPQ) at the firm level, followed by estimation of the causal effect of import competition on firm productivity using econometric techniques.
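The two-step logic described above can be sketched as follows. This is a minimal illustration on simulated data, not the study's actual estimator: log TFP is recovered as the residual of an OLS Cobb-Douglas production function, and the residual is then regressed on a hypothetical import-penetration variable. All variable names and numbers are illustrative assumptions.

```python
# Illustrative two-step TFP sketch on simulated data (hypothetical variables).
import numpy as np

rng = np.random.default_rng(0)
n = 500                                    # firms
k = rng.normal(4.0, 1.0, n)                # log capital
l = rng.normal(2.0, 0.8, n)                # log labour
tfp_true = rng.normal(0.0, 0.3, n)         # unobserved log productivity
y = 0.3 * k + 0.7 * l + tfp_true           # log output

# Step 1: OLS of log output on log inputs; the residual is estimated log TFP.
X = np.column_stack([np.ones(n), k, l])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
tfp_hat = y - X @ beta

# Step 2: regress estimated TFP on a (hypothetical) import-penetration measure.
import_pen = rng.uniform(0.0, 1.0, n)
Z = np.column_stack([np.ones(n), import_pen])
gamma, *_ = np.linalg.lstsq(Z, tfp_hat, rcond=None)
print(beta[1:], gamma[1])
```

In practice the literature replaces step 1 with control-function estimators to handle the simultaneity of input choices, and step 2 with panel or instrumental-variable designs; the sketch only shows the structure of the two stages.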
Methodological approach. The neuGRID project is focused on setting up “a grid-based e-infrastructure for data archiving/communication and computationally intensive applications in the medical sciences”. The exploitation of the developed infrastructure for the exchange of imaging and clinical data has been assured by a sharp, well-focused dissemination strategy, ensuring effective collaboration both within the project and with communities external to it, and coordinating neuGRID with related projects and activities carried out in Europe and elsewhere. The dissemination strategy has as its main objectives:
• To disseminate project results to the relevant scientific communities;
• To raise awareness at the political and decision-making level of the opportunities offered by neuGRID;
• To spread wide knowledge of the facilities and tools supplied by the infrastructure within the research, scholarly and clinical communities;
• To assess the regulatory needs of the pharma industry for pre-competitive research and clinical trials, including clinical trial registration, agreements to be prepared and signed by potential industry users, IPR management, and regulations for data ownership, exchange and analysis; to define the adaptations or expansions of the present infrastructure needed to host industry pre-competitive research and randomized clinical trials with clinical and imaging/biological surrogates; and to define a set of activities to be carried out to make neuGRID compliant with industry needs;
• To ensure compatibility of neuGRID with related initiatives being carried out in North America, Japan, and Australia;
• To promote the integration into neuGRID of the most popular tools for brain image analysis, enabling international researchers to carry out high-performance grid computing on their own or merged datasets;
• To spread awareness of the infrastructure’s aims and services so that they are exploited in daily research and clinical practice;
• To teach potential users how to use the implemented services through the provided GUI;
• To teach research users how to take advantage of the high-performance computing facilities.
There are several possible channels for disseminating information and results about neuGRID. The selection of modalities varies in relation to the communication targets. As detailed in the Dissemination and training plan, during the first 18 months of the project the dissemination activities have included (and will include): conferences, teleconferences, meetings, workshops, le...
Methodological approach. The structure and crystallinity of the zeolites were determined by X-ray powder diffraction on a Bruker AXS D8 Advance diffractometer equipped with a graphite monochromator and a Våntec-1 position-sensitive detector, using CuKα radiation in Xxxxx–Xxxxxxxx geometry. Nitrogen adsorption/desorption isotherms were measured on a Micromeritics GEMINI II 2370 volumetric Surface Area Analyzer at -196 °C to determine surface area, pore volume and pore size distribution. Before the sorption measurements, all samples were degassed in a Micromeritics FlowPrep 060 instrument under helium at 300 °C (heating rate 10 °C/min) for 4 h. The specific surface area was evaluated by the BET method using adsorption data in the relative pressure range p/p0 = 0.05 to p/p0 = 0.25. The t-plot method was applied to determine the micropore volume (Vmic). The amount adsorbed at relative pressure p/p0 = 0.98 reflects the total adsorption capacity (Vtot). The concentration and type of acid sites were determined by adsorption of acetonitrile as a probe molecule followed by FTIR spectroscopy (Nicolet 6700 FTIR with a DTGS detector) using the self-supported wafer technique. Prior to adsorption of the probe molecule, self-supported wafers of the zeolite samples were activated in situ by overnight evacuation at 450 °C. CD3CN adsorption proceeded at room temperature for 30 min at an equilibrium pressure of 5 Torr, followed by 30 min of degassing at room temperature. For quantitative analysis, the molar absorption coefficients for CD3CN adsorbed on Brønsted acid sites (ν(C≡N)-B at 2297 cm-1, ε(B) = 2.05 ± 0.1 cm μmol-1) and on strong and weak Xxxxx acid sites (ν(C≡N)-L1 at 2325 cm-1 and ν(C≡N)-L2 at 2310 cm-1, ε(L) = 3.6 ± 0.2 cm μmol-1) were used. Integral intensities of the individual bands were used, and spectra were normalized to a wafer thickness of 10 mg cm-2.
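The Beer-Lambert quantification implied above reduces to a one-line calculation: the areal density of adsorbed probe molecules is the integrated band intensity divided by the molar absorption coefficient, and dividing by the wafer density gives the acid-site concentration per mass of zeolite. The band areas below are made-up examples, not measured values.

```python
# Acid-site concentration from an IR band area via Beer-Lambert (sketch).
# area_cm1: integrated band intensity (cm^-1)
# eps:      molar absorption coefficient (cm umol^-1)
# sigma:    wafer density (mg cm^-2); spectra above are normalized to 10
def acid_sites(area_cm1, eps, sigma=10.0):
    """Return acid-site concentration in umol per mg of zeolite."""
    areal_density = area_cm1 / eps      # umol cm^-2
    return areal_density / sigma        # umol mg^-1

# Hypothetical band areas:
c_broensted = acid_sites(4.10, eps=2.05)   # Bronsted band at 2297 cm^-1
c_lewis = acid_sites(3.60, eps=3.6)        # combined Lewis bands
print(c_broensted, c_lewis)
```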
The Iso-Therm thermostat (e-Lab Services, Czech Republic), maintaining the sample temperature with an accuracy of ± 0.01 K, was used for the measurement of carbon dioxide adsorption at temperatures from 273 K to 333 K. After the argon adsorption measurement, CO2 adsorption isotherms were subsequently recorded on the same sample at 273 K, 293 K, 313 K and 333 K. The exact temperature was determined using a platinum resistance thermometer. Zeolites were degassed before each measurement at 473 K (temperature ramp of 1 K min-1) under turbomolecular pump vacuum overnight.
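The text does not state what the multi-temperature isotherms are used for, but a common use of CO2 isotherms recorded at several temperatures is extracting the isosteric heat of adsorption from the Clausius-Clapeyron relation, q_st = -R d(ln p)/d(1/T) at constant loading. A minimal two-point sketch, with hypothetical equilibrium pressures:

```python
# Isosteric heat of adsorption from two isotherm points at the same loading
# (Clausius-Clapeyron, two-temperature approximation). Pressures are made up.
import math

R = 8.314  # J mol^-1 K^-1

def isosteric_heat(T1, p1, T2, p2):
    """q_st (J/mol) from equilibrium pressures p1, p2 at temperatures T1, T2."""
    return -R * (math.log(p2) - math.log(p1)) / (1.0 / T2 - 1.0 / T1)

# Hypothetical pressures (same units cancel) at two of the temperatures above:
q = isosteric_heat(273.0, 10.0, 293.0, 20.0)
print(q / 1000.0, "kJ/mol")
```

With a full set of isotherms, the same derivative is taken from a fit of ln p versus 1/T at each fixed loading rather than from two points.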
Methodological approach. To investigate these questions, this study adopts a methodology that sits within the qualitative tradition. This tradition offers a variety of possible frameworks and techniques for exploring the social world that ‘do not encompass a single universally understood position’ (Caelli et al. 2003, p.8), but are nevertheless held together by a basic allegiance to some foundational philosophical assumptions. The strategy of research here can be described as: • a qualitative study using a combination of generic qualitative methods, supported by Grounded Theory, and an evaluative framework, developed from the CA, which takes an interpretivist approach to investigate capability development in the post-16 setting, with a view to understanding the process of student ‘preparedness.’
Methodological approach. As indicated in the previous sections, the present deliverable refers to three field sites. The methodological approach followed for each site has been practically the same, with some differences induced by their specific characteristics. The first step has been the collection of geological and hydrogeological data on a regional scale to design the site stratigraphy and identify the SA and its groundwater level. Data have been collected from scientific papers, available maps, and national databases. Subsequently, the analysis has focused on the local scale. More detailed information has been acquired from the people/companies responsible for the G-ER exploitation and/or the environmental quality monitoring of each field site, as enabled by the S4CE consortium. Whenever possible, measurements have been performed directly on site. According to the collected data, for each of the three sites the Conceptual Circulation Model (CCM) and the Numerical Circulation Model (NCM) of the Groundwater (GW) have been realized. The level of detail of the three models differs because of differences in the available and reliable datasets. Several simplifications have been made, especially for the Cornwall site, because of the lack of reliable data. More detailed information about model construction and characteristics is reported hereinafter.
Methodological approach. This study uses bibliometric analyses to uncover the research portfolio of the five partner universities. In recent years, the increase in scientific output, along with the aggregation of scholarly information within bibliographic databases, has resulted in the adoption of ‘bibliometrics’ as an effective means of assessing scientific output through statistical examination of quantitative information derived from academic literature. Bibliometric methods can extract and analyse the characteristics of publications, including years, journals, authors, countries, and keywords, to provide insights into development trends or research orientations within a specific subject (xx Xxxxxxxx et al., 2019; Xxxxxxx et al., 2017). The following analyses are based on publication data in Scopus spanning the years 2010 to 2022. Scopus is an abstract and citation database consisting of peer-reviewed scientific content (Xxxx et al., 2020). In its most basic form, Scopus may be considered a scientific search engine that provides relevant documents based on criteria such as keywords, article titles, journal names and author names (Xxxxxxxx et al., 2017, p. 34). When aggregated among research institutions, the data offers a view into their research output. For more information on the methodology and data source, see the appendix. However, as with any data source, there are some caveats and limitations to consider. Scopus data has been shown to be biased toward Natural Sciences, Engineering, and Biomedical Research, while Social Sciences and Humanities (SSH) are underrepresented in the dataset (Xxxxxxx & Paul-Hus, 2016). Furthermore, the dataset prioritizes publications in the English language and has relatively limited coverage of regional literature (Pranckute, 2021). This means that research in languages other than English may not be adequately represented.
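The characteristic-counting at the core of such analyses can be sketched in a few lines. The records below are made-up stand-ins for typical Scopus export fields; they are not project data.

```python
# Minimal bibliometric tabulation: publications per year and keyword counts.
from collections import Counter

# Hypothetical records shaped like a simplified Scopus export.
publications = [
    {"year": 2019, "journal": "Journal A", "keywords": ["design", "media"]},
    {"year": 2020, "journal": "Journal B", "keywords": ["media", "policy"]},
    {"year": 2020, "journal": "Journal A", "keywords": ["design"]},
    {"year": 2022, "journal": "Journal C", "keywords": ["policy", "design"]},
]

per_year = Counter(p["year"] for p in publications)              # trend
top_keywords = Counter(kw for p in publications for kw in p["keywords"])

print(sorted(per_year.items()))        # publications per year
print(top_keywords.most_common(2))     # most frequent keywords
```

Real analyses apply the same counting to tens of thousands of records, after keyword harmonization and author/affiliation disambiguation.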
For the ERUA universities, which are particularly strong in the humanities and social sciences and in research that engages with local communities, this may give a somewhat skewed impression of the research output. The dataset may unintentionally favour Nordic and Western European countries at the expense of Eastern European countries that, in the humanities and social sciences, tend to publish more in national languages (Xxxxxxxxx et al., 2018, p. 483). When interpreting the following results, it is important to consider these limitations and biases. Rather than providing a comprehensive depiction of the research output, this...
Methodological approach. The methodology used is shown diagrammatically in Figure 1 below. [Figure 1: Finalise Scope of Enquiries → Prepare Briefing Papers → Collect 3rd Party Evidence → Take CAFE Views → Final Report.] (Tender from IIASA in ENV.C1/SER/2002/0031, available on IIASA’s web page.) The methodology consisted of the following operations:
a) A kick-off meeting with the Commission at which a detailed work plan was discussed and agreed.
b) A formal contact, via email, with prominent third party stakeholders in order to get their views and advice for the evaluation.
c) A first meeting between the review group and the RAINS team to collect information and evidence for the ‘Enquiry’ i.e. to determine the questions to address, the key reports and scientific papers to examine etc.
d) Preparation of briefing material for each of the Task Areas 2-5 based on the results from the first meeting and views from third party stakeholders and the CAFE secretariat.
e) A second meeting of the whole Review Team and IIASA – the “Enquiry meeting” – at which our Team examined the evidence collected and sought clarification from IIASA.
f) A Draft Final Report that was distributed to IIASA, CAFE and the TFIAM for comment; IVL and AEAT authors then took account of the feedback in drafting the final report.
g) The final Report answering the questions posed by the Commission, identifying unresolved issues for IIASA to pursue in the course of their work, and making recommendations to the Commission on developmental needs. The first review meeting with XXXXX followed the conventional process of presentations and discussions, putting questions to the IIASA team, etc. The questions presented in the Call for Tender were used as the basis for the discussions. The Enquiry meeting examined the evidence collected. Members of the Review Team acted as advocates for the various aspects of the RAINS modelling while the rest of the Review Team, acting together, challenged the robustness of the supporting evidence. This approach was designed to detect strengths and weaknesses in the RAINS methodology, to cut across conventional wisdom, detect blind spots and open new areas of questioning. Following the meeting the Review Team drafted its conclusions; these formed the main component of the Draft Report to the EC and the CAFE secretariat. A number of stakeholders were contacted at the start of the review in order to get their comments on the work plan and their advice on aspects to be included in the review.
Methodological approach. The factsheet has been created according to established EuroStemCell best practice. It contains ‘layers’ of information and a short summary section structured around three key questions: What do we know? What are researchers investigating? What are the challenges? For non-specialists who wish for more in-depth information, there is a further series of ‘tabs’ covering topics such as the small intestine, current treatments for SBS, current research and clinical trials that can be explored. The text has been written by professional science communicators and reviewed by senior scientists in the INTENS project. To augment the factsheet, three videos have been produced featuring three INTENS scientists based at UCL. These can be utilized as stand-alone videos by project members but have also been incorporated into the factsheet.
Methodological approach. 2.1. Robust methodology and general-purpose software for the study of porosity
Methodological approach. For measuring robot exposure at the regional level, we follow the approach of Xxxxxxxx and Xxxxxxxx (2020). For given national changes in robot adoption at the industry level, this approach assigns stronger automation exposure to regions that were historically specialized in industries for which robot adoption has later been more substantial. We combine data on the adoption of industrial robots at the country-industry level, sourced from the International Federation of Robotics, with regional employment data, sourced either from Eurostat or from national sources. For measuring individual exposure to automation, we develop a novel methodology in the paper. In particular, in order to capture individual exposure to automation in a way that is not contaminated by the consequences of automation itself, we do not use information on the current occupation. Instead, we employ a vector of predicted probabilities for each individual to be employed in each occupation. Crucially, these probabilities are estimated based on individual characteristics and on the pre-sample, historical composition of employment at the occupation level in the region of residence. The individual vulnerability to automation is then obtained as the scalar product between this vector of probabilities and a vector of automatability scores of the occupations. In other words, the vulnerability score is a weighted average of the automatability scores of the occupations, where the weights are the individual's probabilities of employment in each occupation. To obtain the individual exposure to automation at the time of a given election, the vulnerability score is further interacted with the pace of robot adoption in the specific country and election year.
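The individual measure just described is a simple dot product followed by an interaction. A minimal numerical sketch, with a hypothetical four-occupation labour market and made-up scores:

```python
# Individual automation exposure = (predicted occupation probabilities .
# occupation automatability scores) x national pace of robot adoption.
# All numbers are illustrative assumptions.
import numpy as np

# Predicted probabilities of working in each of four occupations, estimated
# from individual traits and pre-sample regional employment composition.
p_occ = np.array([0.50, 0.20, 0.20, 0.10])

# Automatability score of each occupation (0 = safe, 1 = fully automatable).
automatability = np.array([0.80, 0.40, 0.10, 0.05])

# Vulnerability: weighted average of automatability, weights = probabilities.
vulnerability = p_occ @ automatability

# Exposure at a given election: interact with the country-level pace of
# robot adoption in that election year (e.g. robots per 1,000 workers).
delta_robots = 1.5
exposure = vulnerability * delta_robots
print(vulnerability, exposure)
```

Because the probabilities are fixed at pre-sample values, cross-individual variation in exposure comes only from predicted (not realized) occupational sorting, which is what shields the measure from being contaminated by automation's own effects.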
Intuitively, for a given national pace of robot adoption, our measure of individual exposure assigns higher scores to individuals who would have been more likely – in the pre-sample historical labour market – to work in occupations whose automatability is higher. The logic of the individual measure is analogous to the one underlying the regional measure in Acemoglu and Xxxxxxxx (2020). In that case, the vulnerability of a region is determined by its historical sectoral composition. In this case, the vulnerability of an individual is determined by the historical distribution of occupations in her labour market, in conjunction with her observable characteristics. We employ historical labour market data from the Eu...