Background and motivation Sample Clauses

Background and motivation. Global CO2 emissions from the consumption of fossil fuels increased dramatically from 22,188.5 million tons in 1995 to 33,508.4 million tons in 2015, at an average annual rate of 2.1% [1]. Fossil fuel-based energies still provide approximately 86.0% of total global energy needs [1,2]. To address this problem, hydrogen, an attractive energy carrier with an energy density (140 MJ/kg) more than twice that of typical solid fuels (50 MJ/kg), has been recognized as a promising alternative to the fossil fuels used in industry and transportation [3]. In addition, hydrogen has significant applications across traditional industries, such as petroleum refining, ammonia production, metal refining, and heating [4]. Demand for hydrogen in the United States is projected to grow from 60 million to nearly 700 million metric tons between now and mid-century, even without considering the rapid development of fuel cell electric vehicles [4]. The Hydrogen Council has made a comprehensive assessment of the potential future impact of the hydrogen economy: in its report, hydrogen energy is projected to meet 18% of the world's energy demand, create a $2.5 trillion market, and reduce carbon dioxide emissions by 40–60% in the transportation, industrial, and residential sectors [5]. Although hydrogen is a renewable, "carbon-zero" fuel, 96% of current hydrogen is produced by steam reforming of nonrenewable fossil fuels (methane, coal, and oil), with high energy consumption and CO2 emissions [6]. Moreover, owing to the nature of the steam reforming reaction, impurities such as CO or H2S are inevitable in the produced H2, and trace amounts of such impurities can severely poison the platinum (Pt)-based catalysts currently used in proton exchange membrane fuel cells (PEMFCs) [7,8]. Therefore, electrochemical and photoelectrochemical hydrogen production combined with renewable energy has attracted considerable interest worldwide as an alternative, environmentally friendly, long-term pathway to produce high-purity H2 on a large scale, as suggested by the United States Department of Energy (DOE) (Figure 1).
Background and motivation. 1.1.1 Introduction to complex networks
Background and motivation. The growth in data traffic demand in the communications industry has been explosive in recent years, driven by the sheer number of subscribers requiring full and reliable access to a range of data-hungry applications. High data rates are therefore needed to give such data-centric mobile users access to these applications. With the current infrastructure of cellular networks, providing mobile and broadband connectivity to an increasing pool of users can compromise QoE. At the same time, the prevailing market trend is towards the provision of more services at lower prices, so mobile operators are resorting to fundamental, network-wide infrastructural changes to boost revenue. To this end, new spectrally efficient technologies that support the unprecedented demand for true robustness and high data rates are needed for future deployments of mobile networks. To meet this objective, the International Telecommunications Union - Radio (ITU-R) issued performance specifications for 4G standards [1]. After LTE Release 8, a 3GPP project, failed to satisfy the requirements specified by the ITU, 3GPP released the next evolution of the LTE technology as LTE-A. The major performance shortfalls of LTE [2] were in required data rates, spectral efficiency, and support for variable bandwidth, all of which LTE-A overcame. Evolutionary innovations adopted following the deployment of LTE-A further refined the spectrally efficient techniques that already existed in its predecessor, LTE. Amongst the techniques offering enhanced utilisation of scarce radio resources is Coordinated MultiPoint (CoMP). By targeting cell-edge users suffering interference and low average SINRs, CoMP techniques help increase user QoS significantly. Depending on the mode and level of coordination, exceptional spectral efficiencies are achieved by simultaneous transmission on the same resources: in joint transmission (JT), otherwise undesirable signals are turned into ones that contribute towards enhancing SINR levels, while the coordinated scheduling (CS) mode of CoMP achieves the same goal by cooperatively scheduling transmissions amongst eNBs. Moving towards 5G, ultra-dense cell deployments have been shown to increase capacity in the entire cell beyond what is possible through conven...
Background and motivation. Users' expectations of high-volume, reliable data traffic have grown at an unprecedented rate in recent years, and traffic is projected to double every year in the current decade [15]. This is predominantly due to new and emerging data-hungry personal hand-held devices such as tablets and smartphones [13], narrowing the gap between user demands on mobile and fixed networks. To satisfy such user demands, the International Telecommunication Union (ITU) introduced International Mobile Telecommunications-Advanced (IMT-A), a global standard initiative, in 2007 [16]. IMT-A requires peak DL and UL data rates of 1 Gbps and 500 Mbps, respectively, for low-mobility scenarios. At the time, Long Term Evolution (LTE), corresponding to Releases 8 and 9 of the 3rd Generation Partnership Project (3GPP), supported peak data rates of 300 Mbps and 75 Mbps at a maximum available bandwidth of 20 MHz [17]. Promising to enhance LTE's performance, LTE-Advanced (LTE-A) was issued as an IMT-A technology in 2010, two years after its introduction by 3GPP. Since reachable data rates increase linearly with bandwidth, acquiring more spectrum is a necessity for meeting the ever-growing traffic requirements. LTE-A allows the utilisation of a maximum of 100 MHz system bandwidth. However, due to the unavailability of large fragments of contiguous bandwidth, operators seek alternatives that use spectrum chunks at different carrier frequencies and aggregate them for data transmission. First standardised in 3GPP Release 10 as one of its key features, carrier aggregation (CA) facilitates the aggregation of fragmented and non-contiguous bandwidth, an expensive and scarce commodity [18]–[20]. This thesis only considers CA in the DL.
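As a back-of-the-envelope illustration of this linear scaling (a hedged sketch: the 300 Mbps at 20 MHz baseline and the 100 MHz cap come from the excerpt above, but the particular set of component carriers is a hypothetical example, not a standardised configuration):

```python
# Illustrative sketch: peak DL data rate scaling linearly with aggregated
# bandwidth under carrier aggregation (CA). Baseline figures are from the
# excerpt; the aggregation scenario itself is hypothetical.

BASELINE_RATE_MBPS = 300.0   # LTE Rel-8/9 peak DL rate ...
BASELINE_BW_MHZ = 20.0       # ... at 20 MHz bandwidth

# Hypothetical non-contiguous component carriers (MHz) at different
# carrier frequencies, aggregated into one logical channel.
component_carriers_mhz = [20.0, 15.0, 10.0, 20.0]

aggregate_bw = sum(component_carriers_mhz)
peak_rate = BASELINE_RATE_MBPS * aggregate_bw / BASELINE_BW_MHZ

print(f"Aggregated bandwidth: {aggregate_bw:.0f} MHz (LTE-A cap: 100 MHz)")
print(f"Linearly scaled peak DL rate: {peak_rate:.0f} Mbps")
```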
Background and motivation. Despite the overwhelming cost of fossil fuels, commercial photovoltaic solar cells account for less than 0.1% of energy consumption in the US. This is partially due to the low conversion efficiency (~15%) and high installation cost (~$7/W) of current solar cell technology, which far exceeds the cost of generating electricity from fossil fuels. In this context, semiconductor nanomaterials have promising applications in solar cell technology, as they offer good photostability and conversion efficiency. However, to date, no significant advances have been achieved, owing to size nonuniformity, low yield, or matrix inhomogeneity. Various methods exist for the production of Si nanoparticles, but most produce a wide size distribution. In addition, many methods, e.g. laser ablation, gas pyrolysis, and ion beam deposition, generally produce small quantities of particles that cannot be readily integrated into subsequent processing and manufacturing scale-up.
Background and motivation. The nature of today's global competitive market has given rise to increased organizational cooperation in the form of strategic alliances, where organizations no longer compete in isolation but as value chains. Globalization and increased market pressures lead organizations to enter into strategic partnerships with the overall goal of achieving a competitive advantage. Through aligning resources and capabilities with business partners, mutual benefits can be gained in the form of quality, time, and cost. The realization of such collaborative efforts requires integrated behavior, sharing of information, and appropriate management of business relationships. As a result, the concept of Supply Chain Management (SCM) has flourished over the last decade. In short, the objective of SCM is to coordinate activities between businesses across traditional organizational boundaries to improve the performance of the supply chain partners and the supply chain as a whole. Another closely related concept that has received increasing attention over the last decade is the role of information technology (IT) in inter-organizational business activities. The use of such inter-organizational information systems (IOS) has become central to business collaboration, and the systems range from simple web portals to extensive integrated electronic networks. Recent and continuous advances in these technological solutions offer new ways to collaborate and compete inter-organizationally. Given that these solutions are becoming so common and easy to procure, organizations that are late to adopt them may fall behind in the competitive environment of today's markets. There is an intersection between the two concepts of SCM and IOS. As Xxxxxx (2007) notes, IOS are critical in managing operational and strategic activities between organizations, as they can provide supply chain partners with real-time, critical information on demand and supply data. Xxxxxx and Xxxxxxxxxxxxxx (1998) go even further, stating that coordinated business activities, integrated behavior, and sharing of information between organizations require the use of an IOS. Hence, IOS can be viewed as an essential enabler of effective management of the supply chain (i.e. SCM). However, the majority of IOS projects are costly and may even be the largest investment an organization ever undertakes (Xxxxxx, 2005). The importance of ensuring the IOS's success is t...
Background and motivation. Cyber-Physical Systems (CPS) are systems that comprise both real-world entities and digital components. Modelling and designing CPSs typically requires a combination of different languages and tools that adopt complementary specification paradigms. For real-world artefacts, physics models in the form of differential equations are the norm. Digital components, such as software controllers, are typically described via control diagrams, state machines, and real-time programs. This diversity of specification and design methods makes CPS challenging to study and analyse. Co-simulation [16] is perhaps the de facto technique for analysing the behaviour of CPS. It requires that models of artefacts are simulated in isolation, while master algorithms control the various simulators and thereby orchestrate the co-simulation as a whole. This, however, raises issues of interoperability between the master algorithm and simulators. The Functional Mock-up Interface (FMI) Standard [11] has been proposed to alleviate those issues, and has since been successfully used in many industrial applications. The FMI standard prescribes how master algorithms (MA) and simulators communicate. It does so by virtue of a bespoke API that simulators have to implement, and that can be used to implement compliant master algorithms. The API enables master algorithms to exchange data between the components of a co-simulation, called FMUs (Functional Mock-up Units), perform simulation steps, and suitably deal with errors in simulators. It also allows for advanced features such as roll-back of already performed steps. While (co)simulation is currently the predominant approach to validate CPS models, we here describe a complementary technique based on a formal model of an FMI system. Our technique formalises both the master algorithm and the simulated FMUs, and allows for verification of their properties.
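To make the orchestration role of a master algorithm concrete, here is a minimal sketch of a fixed-step master loop. It is illustrative only: the Fmu class and its set_input/get_output/do_step methods are hypothetical stand-ins that loosely mirror the data-exchange and stepping operations the FMI API prescribes (fmi2Set*/fmi2Get*/fmi2DoStep); it is not the formal model described in the excerpt.

```python
# Minimal sketch of a fixed-step co-simulation master algorithm.
# Fmu is a hypothetical stand-in for a Functional Mock-up Unit; its methods
# loosely mirror the FMI 2.0 co-simulation API (fmi2Set*, fmi2Get*, fmi2DoStep).

class Fmu:
    """Toy FMU with one scalar input, one scalar output, and internal state."""

    def __init__(self, name: str, gain: float, state: float = 0.0):
        self.name, self.gain, self.state, self.u = name, gain, state, 0.0

    def set_input(self, u: float) -> None:          # cf. fmi2SetReal
        self.u = u

    def get_output(self) -> float:                  # cf. fmi2GetReal
        return self.state

    def do_step(self, t: float, h: float) -> bool:  # cf. fmi2DoStep
        self.state += h * self.gain * self.u        # toy internal dynamics
        return True           # True ~ fmi2OK; False would signal an error


def master(fmus, connections, t_end: float, h: float):
    """Fixed-step master: exchange data, then step every FMU until t_end."""
    n_steps = round(t_end / h)      # integer step count avoids float drift
    for k in range(n_steps):
        t = k * h
        # 1. Data exchange: propagate outputs to connected inputs.
        for src, dst in connections:
            dst.set_input(src.get_output())
        # 2. Step: advance every FMU by the communication step size h.
        #    A rollback-capable master would save each FMU's state first
        #    and, on failure, restore it and retry with a smaller h.
        for fmu in fmus:
            if not fmu.do_step(t, h):
                raise RuntimeError(f"{fmu.name} failed at t={t}")
    return {fmu.name: fmu.get_output() for fmu in fmus}


# Two toy FMUs in a feedback loop, stepped at h = 0.1 s for 1 s.
plant = Fmu("plant", gain=1.0, state=1.0)
controller = Fmu("controller", gain=-0.5)
print(master([plant, controller],
             [(plant, controller), (controller, plant)],
             t_end=1.0, h=0.1))
```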
Background and motivation. Gilthead seabream and sea bass are the most important finfish species farmed in the Mediterranean Sea. Until the 1980s, seabream were only fished from wild populations, but successful reproduction and intensive rearing led to a rapid increase in production; by 1993, cage culture already exceeded open-sea fishing. Seabream is the most widely farmed marine fish in the Mediterranean; the fish usually weigh between 400 and 600 g and are sold fresh, whole or eviscerated. The major producing countries within the EU are Greece, Spain, and Italy; France is the fourth-largest producer of seabream juveniles, and Turkey is the second major producer in the Mediterranean. In 2015, seabream accounted for 12% of total EU marine aquaculture production, in terms of both value and volume. Recently, production volumes of seabream fry and juveniles have increased, and cost reductions have been achieved through automation. The finfish aquaculture industry has invested heavily in farming technologies and automation to improve the quality, food safety, and traceability of produced fish. A further increase in production volumes relies on environmentally friendly approaches to aquaculture, which are mandatory for production licences and commercially viable production. The ambition of the seabream aquaculture business case developed here in the Space@Sea project is to increase sustainable food production at large scale in offshore conditions by making use of modular floating islands. An increase of food production at sea fits the EU Blue Growth strategy. Another important aspect is reducing the environmental footprint by using a closed type of aquaculture system, minimising interactions (water intake and discharge) with the marine environment. Closed systems also make it possible to keep out parasites and to control disease outbreaks.
Background and motivation. Within the Nunataryuk project, a vast amount and diversity of data will be produced. The purpose of the DMP is to document how the data generated within the project are handled during and after the project. It describes the basic principles for data management within the project, including standards, the generation of discovery and use metadata, data sharing, preservation, and life-cycle management. This DMP is a living document that will be updated during the project in step with the periodic reports. Nunataryuk follows the principles outlined by the Open Research Data Pilot (OpenAIRE) and the FAIR Guiding Principles for scientific data management and stewardship (Wilkinson et al. 2016).
Background and motivation. Looking back in history, trying to trace the ever-increasing abstraction that we may call mathematics, we can clearly see that there are two fundamental concepts: shapes and numbers. Number theory, and especially the study of Diophantine equations, can be considered a core mathematical study. We recall that a Diophantine equation is a system of polynomial equations over a field K, and its solution set can therefore be thought of as a subset of the affine space K^n. This simple idea gave number theorists a whole new arsenal of techniques for tackling old problems. It also paved the path to new connections in mathematics; probably the best example is Fermat's Last Theorem, which remained open for more than three centuries until Wiles gave his famous proof in 1995. This interplay between number theory and algebraic geometry can be used to find a natural, though unexpected, way to categorise Diophantine equations: the dimension of the zero locus. For example, we can restrict ourselves to one-dimensional varieties, or curves. Examples of curves are x + y + z = 0 in P^2 and y^2 − xz = yw − z^2 = xw − yz = 0 in P^3. These two serve as a fine example of a connection that would not have been possible without the use of algebraic geometry in number theory. A finer categorisation for curves is the genus. Curves are far more studied and understood than higher-dimensional varieties, but even in this case, the only genera that we understand well are 0 and 1. If the genus of a curve over a number field K is higher than 1, we still have some deep results, such as Faltings' Theorem, which asserts that the curve has only finitely many points. Elliptic curves (smooth curves of genus 1 that have a K-rational point) have formed a paradigm for the way we look for results on Diophantine equations. For a number field K, the set of K-rational points on an elliptic curve E defined over K forms a finitely generated abelian group; this is the famous Mordell–Weil Theorem. This thesis is concerned with analogues of the Mordell–Weil theorem in higher dimensions, building on recent advances in the two-dimensional case, due to Siksek [16] and Xxxxxx [2], [3].
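For reference, the Mordell–Weil theorem cited in the excerpt can be written compactly as follows (a standard textbook formulation, not quoted from the thesis):

```latex
% Mordell--Weil: for a number field K and an elliptic curve E defined over K,
% the group of K-rational points E(K) is finitely generated, so
\[
  E(K) \;\cong\; E(K)_{\mathrm{tors}} \oplus \mathbb{Z}^{r},
\]
% where E(K)_{tors} is the finite torsion subgroup and the integer r >= 0
% is the rank of E over K.
```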