Background and motivation. Global CO2 emissions from the consumption of fossil fuels increased dramatically from 22,188.5 million tons in 1995 to 33,508.4 million tons in 2015, an average annual growth rate of 2.1%.1 Fossil fuel-based energies still provide approximately 86.0% of total global energy needs.1,2 To address this problem, hydrogen, an attractive energy carrier with a high gravimetric energy density (140 MJ/kg, more than twice that of typical solid fuels at 50 MJ/kg), has been recognized as a promising alternative to the fossil fuels used in industry and transportation.3 In addition, hydrogen has significant and versatile applications in traditional industry, such as petroleum refining, ammonia fertilizer production, metal refining, and heating.4 Demand for hydrogen in the United States is projected to grow from 60 million to nearly 700 million metric tons between now and mid-century, even without considering the rapid development of fuel cell electric vehicles.4 The Hydrogen Council has made a comprehensive assessment of the potential future impact of the hydrogen economy. According to its report, hydrogen energy could meet 18% of the world's energy demand, create a $2.5 trillion market, and reduce carbon dioxide emissions by 40–60% in the transportation, industrial, and residential sectors.5 Although hydrogen is a renewable "carbon-zero" fuel, 96% of current hydrogen is produced by the steam reforming of nonrenewable fossil fuels (methane, coal, and oil), a route with high energy consumption and CO2 emissions.6 Moreover, due to the nature of the steam reforming reaction, impurities such as CO or H2S are inevitable in the produced H2. Trace amounts of such impurities can severely poison the platinum (Pt) based catalysts currently used in proton exchange membrane fuel cells (PEMFCs).7,8 Therefore, in combination with renewable energy, electrochemical and photoelectrochemical hydrogen production has attracted considerable interest worldwide as an alternative, environmentally friendly, long-term pathway to produce high-purity H2 on a large scale, as suggested by the Department of Energy (DOE) in the United States (Figure 1).
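As a quick consistency check on the quoted growth rate (a worked calculation using only the two emission figures cited above):

\[
\left(\frac{33{,}508.4}{22{,}188.5}\right)^{1/20} \approx 1.021,
\]

which corresponds to the stated average annual rate of about 2.1% over the 20 years from 1995 to 2015.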
Background and motivation. The use of a hybrid membrane-liquefaction process for post-combustion CO2 capture can potentially be more cost-effective than two-stage membrane processes or standard MEA absorption processes [1]. In the membrane-assisted CO2 liquefaction (MAL) process, the two separation technologies each carry out a partial separation within their favourable regimes of operation. Membrane separation is generally suited for bulk separation with moderate product purity, while the low-temperature liquefaction process is well suited for purifying the CO2 stream, upgrading it from moderate purity to a high-purity product by phase separation, as described in CEMCAP deliverable D11.3 [2]. An advantage of this process is that it requires no process steam, which is normally not available in cement plants. The MAL process has been investigated by process simulations and laboratory experiments to validate its performance. The focus has been on the obtainable carbon capture ratio (CCR), the CO2 product purity, and the main process parameters (e.g. temperature, pressure, and retention time in separation vessels) for synthesized binary membrane permeate-gas compositions. To increase the technology readiness level (TRL) of the MAL capture technology for cement plants to 7–8, a test plant with real flue gas from a cement plant in an operational environment is required. In this work, a test plant design has been proposed and simulated in Aspen HYSYS. The necessary main components, equipment types, and availability of off-the-shelf equipment have been investigated. Suggestions on how the plant can be operated, based on experience from the laboratory experiments, are also provided.
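For reference, the carbon capture ratio is commonly defined as the fraction of the CO2 entering with the flue gas that is recovered in the product stream. This is the standard definition in the capture literature, stated here for clarity rather than quoted from the CEMCAP deliverables:

\[
\mathrm{CCR} = \frac{\dot{m}_{\mathrm{CO_2,\,captured}}}{\dot{m}_{\mathrm{CO_2,\,flue\ gas}}}
\]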
Background and motivation. Cyber-Physical Systems (CPS) are systems that comprise both real-world entities and digital components. Modelling and designing CPSs typically requires a combination of different languages and tools that adopt complementary specification paradigms. For real-world artefacts, physics models in the form of differential equations are the norm. Digital components, such as software controllers, are typically described via control diagrams, state machines, and real-time programs. This diversity of specification and design methods makes CPS challenging to study and analyse. Co-simulation [16] is perhaps the de facto technique for analysing the behaviour of CPS. It requires that models of artefacts are simulated in isolation, while master algorithms control the various simulators and thereby orchestrate the co-simulation as a whole. This, however, raises issues of interoperability between the master algorithm and the simulators. The Functional Mock-up Interface (FMI) Standard [11] has been proposed to alleviate those issues, and has since been successfully used in many industrial applications. The FMI standard prescribes how master algorithms (MA) and simulators communicate. It does so by virtue of a bespoke API that simulators have to implement, and that can be used to implement compliant master algorithms. The API enables master algorithms to exchange data between the components of a co-simulation, called FMUs (Functional Mock-up Units), to perform simulation steps, and to deal suitably with errors in simulators. It also allows for advanced features such as rollback of already performed steps. While (co)simulation is currently the predominant approach to validate CPS models, we here describe a complementary technique based on a formal model of an FMI system. Our technique formalises both the master algorithm and the simulated FMUs, and allows for verification of their properties.
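To make the orchestration concrete, the following is a minimal sketch of a fixed-step master algorithm in Python. The FMU interface here is hypothetical: it mirrors the FMI notions of data exchange, stepping, and state save/restore for rollback, but it is not the real FMI API (which is a C API), and all names are illustrative.

```python
# Minimal sketch of a fixed-step FMI-style master algorithm.
# The FMU class below is a hypothetical stand-in, not the real FMI API.
from typing import Dict, List, Tuple


class FMU:
    """Illustrative stand-in for a Functional Mock-up Unit."""

    def __init__(self, name: str):
        self.name = name
        self.inputs: Dict[str, float] = {}
        self.outputs: Dict[str, float] = {"y": 0.0}
        self._state = 0.0

    def get(self, var: str) -> float:
        # Data exchange: read an output variable (cf. fmi2GetReal).
        return self.outputs[var]

    def set(self, var: str, value: float) -> None:
        # Data exchange: write an input variable (cf. fmi2SetReal).
        self.inputs[var] = value

    def save_state(self) -> float:
        # Snapshot for rollback (cf. fmi2GetFMUstate).
        return self._state

    def restore_state(self, state: float) -> None:
        # Rollback of an already performed step (cf. fmi2SetFMUstate).
        self._state = state

    def do_step(self, t: float, h: float) -> bool:
        # Advance the internal model by h (cf. fmi2DoStep).
        # A trivial integrator stands in for a real simulation model.
        self._state += h * sum(self.inputs.values())
        self.outputs["y"] = self._state
        return True  # False would signal a failed step to the master


def master(fmus: List[FMU],
           connections: List[Tuple[FMU, str, FMU, str]],
           t_end: float, h: float) -> None:
    """Fixed-step master algorithm: exchange data, then step all FMUs."""
    t = 0.0
    while t < t_end:
        # 1. Propagate outputs to connected inputs.
        for src, out_var, dst, in_var in connections:
            dst.set(in_var, src.get(out_var))
        # 2. Save states so a failed step can be rolled back.
        saved = [(f, f.save_state()) for f in fmus]
        # 3. Step every FMU by the communication step size.
        if all(f.do_step(t, h) for f in fmus):
            t += h
        else:
            # Error handling: roll back and retry with a smaller step.
            for f, s in saved:
                f.restore_state(s)
            h /= 2.0


# Usage: two FMUs in a feedback loop, simulated for one second.
a, b = FMU("plant"), FMU("controller")
master([a, b], [(a, "y", b, "u"), (b, "y", a, "u")], t_end=1.0, h=0.1)
```

The separation of concerns in this loop (data exchange, stepping, error handling) is exactly what the FMI API makes interoperable across simulators; real master algorithms differ mainly in how they choose step sizes and handle failed steps.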
Background and motivation. The nature of today's global competitive market has given rise to increased organizational cooperation in the form of strategic alliances, where organizations no longer compete in isolation but as value chains. Globalization and increased market pressures lead organizations to enter into strategic partnerships with the overall goal of achieving a competitive advantage. By aligning resources and capabilities with business partners, mutual benefits can be gained in the form of quality, time, and cost. The realization of such collaborative efforts requires integrated behavior, sharing of information, and appropriate management of business relationships. As a result, the concept of Supply Chain Management (SCM) has flourished over the last decade. The objective of SCM is, in short, to coordinate activities between businesses across traditional organizational boundaries to improve the performance of the supply chain partners and the supply chain as a whole. Another closely related concept that has received increasing attention over the last decade is the role of information technology (IT) in inter-organizational business activities. The use of such inter-organizational information systems (IOS) has become central to business collaboration, and the systems range from simple web portals to extensive integrated electronic networks. Recent and continuing advances in these technological solutions offer new ways to collaborate and compete inter-organizationally. And, given that these technological solutions are becoming so common and easy to procure, organizations that are late in adopting them risk falling behind in the competitive environment of today's markets. There is an intersection between the two concepts of SCM and IOS. As Xxxxxx (2007) notes, IOS are critical in managing operational and strategic activities between organizations, as they can provide supply chain partners with real-time, critical information on demand and supply data. Xxxxxx and Xxxxxxxxxxxxxx (1998) go even further, stating that coordinated business activities, integrated behavior, and sharing of information between organizations require the use of an IOS. Hence, IOS can be viewed as an essential enabler of effective management of the supply chain (i.e. SCM). However, the majority of IOS projects are costly and may even be the largest investment an organization ever undertakes (Xxxxxx, 2005). The importance of ensuring the IOS's success is t...
Background and motivation. Recent history with transient-execution side channels demonstrates the need for a clear specification of the limits of speculative execution, both to support reasoning about software and to guide hardware development.
Background and motivation. Co-simulation techniques are popular in the design of cyber-physical systems (CPS) [18]. Such systems are typically engineered using a variety of languages and tools that adopt complementary paradigms; examples are physics-related models, control laws, and sequential, concurrent, and real-time programs. This diversity makes CPS generally difficult to analyse and study. The Functional Mock-up Interface (FMI) Standard [11] has been proposed to alleviate that problem and has since been successfully used in industry. It addresses the challenge of interoperability, coupling different simulators and their high-level control components via a bespoke FMI API (application programming interface). While (co)simulation is currently the predominant approach to analyse CPS, this report describes a complementary, proof-based technique that uses mathematical reasoning and logic. Simulation is useful in helping engineers to understand modelling implications and spot design issues, but cannot provide universal guarantees of correctness and safety. It is usually impossible to run an exhaustive number of simulations as a way of testing the system. For these reasons, it is often not clear how the evidence provided by simulations is to be qualified, since simulations depend on parameters and algorithms, and are software systems (with possible faults) in their own right. Proof-based techniques, on the other hand, hold the promise of making universal claims about systems. They can potentially abstract from particular simulation scenarios, parametrisations of models, and interaction patterns used for testing. In traditional software engineering, they have been successfully used to validate the correctness of implementations against abstract requirements models [5]. Yet, their application to CPS is fraught with difficulties: the heterogeneous combination of languages used in typical descriptions of CPS raises issues of semantic integration and complexity in reasoning about those models. The aspirational ideal of any verification technique is a compositional approach, and such approaches are still rare for CPS [30].
Background and motivation. The Emory University Cardiovascular Biobank aims to address a variety of research questions in cardiovascular disease. It is a registry of patients with suspected or confirmed coronary artery disease undergoing cardiac catheterization. The final database will store approximately 12,000 patients' records and will contain information from eight sources, including major Emory Healthcare units. Apart from the data collected with a standardized questionnaire, clinical data are collected from up to eight types of reports: Cardiac Catheterization Procedure Report, Echocardiogram Report, History and Physical Report, Discharge Summary, Outpatient Clinic Note, Outpatient Clinic Letter, Coronary Angiogram Report, and Inpatient Report, as well as Discharge Medication lists. Data elements extracted from reports and structured records are integrated to provide comprehensive information for patient identification. Manual extraction of the data is infeasible due to the large number of reports.
Background and motivation. Synoptic reporting [82-84] has become a powerful tool for providing summarized findings through predefined data element templates such as the CAP Cancer Protocols [4]. Meanwhile, standards groups such as IHE are proposing structured reporting standards such as Anatomic Pathology Structured Reports [3] in HL7. While the community is moving towards structured reporting, a vast number of pathology reports exist in legacy systems in unstructured format, and the standardization effort captures only major data elements, leaving useful research information in free text that is difficult to process and search. We explore the adaptive vocabulary feature of IDEAL-X, which employs an initial controlled vocabulary that is continuously refined through online learning during the extraction process. We also provide a built-in query interface to support searching for patients based on extracted data elements.
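To illustrate the adaptive-vocabulary idea, here is a minimal sketch in Python of an extractor whose controlled vocabulary grows as reviewer-confirmed values are fed back in. All class and function names are hypothetical; this is a sketch of the general technique, not IDEAL-X's actual implementation.

```python
# Minimal sketch of adaptive-vocabulary extraction: start from an initial
# controlled vocabulary, and fold reviewer-confirmed values back into it
# so later reports are matched automatically. Names are illustrative.
import re
from typing import Dict, Optional, Set


class AdaptiveExtractor:
    def __init__(self, initial_vocabulary: Dict[str, Set[str]]):
        # Maps a data element (e.g. "grade") to its known phrases.
        self.vocab = {k: set(v) for k, v in initial_vocabulary.items()}

    def extract(self, element: str, report_text: str) -> Optional[str]:
        """Return the first known phrase for `element` found in the text."""
        for phrase in self.vocab.get(element, set()):
            if re.search(re.escape(phrase), report_text, re.IGNORECASE):
                return phrase
        return None  # nothing matched; the reviewer supplies the value

    def learn(self, element: str, confirmed_value: str) -> None:
        """Online learning step: a reviewer-confirmed value joins the vocabulary."""
        self.vocab.setdefault(element, set()).add(confirmed_value)


# Usage: the first report needs a manual correction, later ones do not.
ex = AdaptiveExtractor({"grade": {"well differentiated"}})
assert ex.extract("grade", "Tumor is moderately differentiated.") is None
ex.learn("grade", "moderately differentiated")  # reviewer correction
assert ex.extract("grade", "moderately differentiated carcinoma") is not None
```

The design point this sketch captures is that extraction accuracy improves over the course of a review session, since each manual correction reduces the work needed on subsequent reports.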
Background and motivation. Venous thromboembolism (VTE), including deep vein thrombosis (DVT) and pulmonary embolism (PE), is associated with significant morbidity and mortality [87]. VTE can be diagnosed by several radiographic studies, including lower or upper extremity ultrasonography and computed tomography (CT) of the chest. Federally mandated reporting of VTE, defined by the Agency for Healthcare Research and Quality Patient Safety Indicator 12 (AHRQ PSI-12) [88], is based on administrative and billing data, whose accuracy for detecting VTE has yet to be demonstrated. We use IDEAL-X to evaluate its accuracy in identifying VTE diagnoses directly from radiology reports in electronic medical records.
Background and motivation. Despite the overwhelming cost of fossil fuels, commercial photovoltaic solar cells account for less than 0.1% of the energy consumption in the US. This is partially due to the low conversion efficiency (~15%) and high installation cost (~$7/W) of current solar cell technology, which far exceeds the cost of generating electricity from fossil fuels. In this context, semiconductor nanomaterials have promising applications in solar cell technology, as they offer good photostability and conversion efficiency. However, to date, no significant advances have been achieved, due to size nonuniformity, low yield, or matrix inhomogeneity. Various methods exist for the production of Si nanoparticles, but most produce a wide size distribution. In addition, many methods, e.g. laser ablation, gas pyrolysis, and ion beam deposition, generally produce small quantities of particles, which cannot be readily integrated into subsequent processes and manufacturing scale-up.