State of the art. (a) Comcast and the Township acknowledge that the technology of Cable Systems is an evolving field. Comcast’s Cable System in the Township shall be capable of offering Cable Services that are comparable to other Cable Systems owned and managed by Comcast or its Affiliated Entities in the County of Allegheny in the Commonwealth of Pennsylvania (“Comparable Systems”) pursuant to the terms of this section. The Township may send a written notice to Comcast, not to exceed one request every two (2) years, requesting information on Cable Services offered by such Comparable Systems.
(b) If the identified Cable Services are being offered by Comcast and/or any of its Affiliated Entities to at least forty percent (40%) of the total Subscribers in the Comparable Systems, then the Township may require that Comcast make such Cable Services available in the Township. Should the Township determine that Comcast shall provide such comparable Cable Services, then the Township and Comcast shall enter into good faith discussions to negotiate a schedule for deployment of such Cable Services. The discussions shall take into consideration the benefits from the provision of such Cable Services, the cost of implementing the Cable Services in the Township, the technical and economic feasibility of implementing such improvements, and the impact, if any, on Subscriber rates.
(c) The Township shall not initiate the procedures set forth under the preceding State-of-the-Art provisions or issue any related order if Comcast is subject to effective competition in the Township as expressly ordered in writing by the FCC and Comcast presents such written order to the Township.
State of the art. A. Text normalization
B. Language identification
State of the art. It is not possible to summarize the huge number of studies and papers on probabilistic seismic hazard assessment (PSHA) published worldwide in recent decades, in which different approaches to the determination of maximum magnitude were defined and applied. What this large bibliography does show is that two main strategies have been followed: on one side, the maximum magnitude was determined by those in charge of defining the catalog and/or the seismic source zone model; on the other, it was determined by those in charge of the hazard computation (Figure 2.1).
State of the art. The Technical and Organisational Measures are subject to technical progress and further development. In this respect, it is permissible for the Supplier to implement alternative adequate measures. In so doing, the security level of the defined measures must not be reduced. Substantial changes must be documented.
State of the art. This sub-section includes short descriptions of the most common algorithms exploited for crawling, followed by a discussion of web page classification techniques applied to focused crawling.
A. Crawling
1. Breadth-First (Xxxxxxxxx, 1994) is the simplest algorithm for crawling. It maintains a list of URLs scheduled to be fetched, called the frontier. In Breadth-First, the frontier is implemented as a First-In First-Out (FIFO) queue, so pages are crawled in the order in which the links to them appear in the pages under examination.
1 In the initial version of both the monolingual and bilingual crawlers to be developed by T12 of the project, crawling will target HTML pages only.
2. Best-First (Xxx et al., 1998) is probably the most appropriate algorithm for a focused crawling task. Its basic idea is to select for crawling the best link from the frontier according to an estimation criterion. In its simplest form, a text-to-topic classifier (e.g., Naive Bayes, cosine similarity, SVM, or string matching) assigns a relevance score to each crawled page, and this score is also assigned to each link within the page.
3. PageRank (Brin and Page, 1998) is based on the same idea but exploits the "popularity" of a web page instead of its relevance. The term "popularity" refers to the probability that a random crawler will visit that page at any given time. In other words, a page's popularity score is estimated on the basis of the popularity scores of the pages that point to it. Consequently, the PageRank algorithm is more suitable for indexing web pages than for collecting pages relevant to a specific domain.
4. Fish-search (De Bra and Post, 1994) could be considered as a combination of the Breadth-First and Best-First algorithms. It exploits a binary classifier in order to keep only the links within relevant pages. Then, the pages are crawled in the order in which the links to them appear in the page under examination.
5. Shark-search (Xxxxxxxxx et al., 1998) is an improvement of the fish-search algorithm. The potential score of each link is influenced by the estimated relevance of its anchor text (i.e. the visible, clickable text in a link) and the source web page. Regression is adopted instead of binary classification.
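The practical difference between the first two strategies comes down to the data structure used for the frontier. The following minimal sketch illustrates this; the toy link graph and the classifier relevance scores are hypothetical stand-ins for fetched pages and a real text-to-topic classifier, assumed only for illustration:

```python
from collections import deque
import heapq

# Hypothetical toy link graph standing in for the web: page -> outgoing links.
# A real crawler would fetch each URL and parse links out of the HTML.
LINKS = {
    "seed": ["a", "b"],
    "a": ["c"],
    "b": ["d"],
    "c": [],
    "d": [],
}

# Hypothetical relevance scores a text-to-topic classifier might assign.
RELEVANCE = {"seed": 1.0, "a": 0.2, "b": 0.9, "c": 0.1, "d": 0.8}

def breadth_first(seed, limit=10):
    """Breadth-First: the frontier is a FIFO queue, so pages are crawled
    in the order in which links to them were discovered."""
    frontier = deque([seed])
    seen, order = {seed}, []
    while frontier and len(order) < limit:
        page = frontier.popleft()
        order.append(page)
        for link in LINKS.get(page, []):
            if link not in seen:
                seen.add(link)
                frontier.append(link)
    return order

def best_first(seed, limit=10):
    """Best-First: the frontier is a priority queue keyed on the relevance
    score of the page each link was found in."""
    frontier = [(-RELEVANCE[seed], seed)]  # max-heap via negated scores
    seen, order = {seed}, []
    while frontier and len(order) < limit:
        _, page = heapq.heappop(frontier)
        order.append(page)
        for link in LINKS.get(page, []):
            if link not in seen:
                seen.add(link)
                # Each link inherits the score of the page containing it.
                heapq.heappush(frontier, (-RELEVANCE[page], link))
    return order
```

On this toy graph, Breadth-First visits pages in discovery order, while Best-First reorders the frontier so that links found in the high-scoring page "b" are crawled before those from the low-scoring page "a".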
State of the art. In [87], the authors first recall the security and data privacy risks posed by the multi-party learning models likely to take place in 5G network management (e.g., operators may not share their network operating metadata), as well as the merits of Intel SGX in mitigating these risks. Because of the performance losses inherent to SGX, the authors propose optimizations for customized binary integration of learning algorithms (K-means, CNN, SVM, matrix factorization) and stress the requirement for data obliviousness, which preserves privacy for the training and sample data collected and generated outside SGX. In doing so, the authors map the security and privacy issues holistically, all the way through the complete AI data pipeline. The overhead incurred when running the model inside SGX varies from a more than satisfactory 1% to a more significant 91% depending on the algorithm type (CNN and K-means, respectively). In [88], the authors deliver efficient deep learning on multi-source private data, leveraging Differential Privacy (DP) on commercial TEEs. Their technology, dubbed MYELIN, shows similar performance (or negligible slowdown) when applying DP-protected ML. To do so, their implementation compiles a static library embedding the minimal core routines; the static library is then fully run in the TEE, which removes any costly context switch from TEE mode to normal execution mode. Specialized hardware accelerators (Tensor Processing Units, TPUs) are also viewed as the necessary step for highly demanding (fast) decision making. That remains a grey area, with no existing TEE embodiment for specialized hardware to the best of our knowledge. In addition, leveraging the TEE data sealing capability looks like another path to consider for further improvements. In [89], the authors deliver fast, verifiable and private execution of neural networks in trusted hardware, leveraging a commercial TEE.
SLALOM splits the execution between a Graphics Processing Unit (GPU) and the TEE while delivering security assurance on the correctness of the GPU operations using Freivalds' algorithm. Outsourcing processing from the TEE to the GPU is aimed at boosting performance, in a scheme that can be applied to any faster co-processor. Fully TEE-embedded inference was the bottom line of this research, deemed unsatisfactory on the performance aspect. In [90], the authors recall the need for ever-growing and security-privacy sensitive training data sets which ca...
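The verification primitive underlying this kind of scheme, Freivalds' algorithm, is compact enough to sketch: instead of recomputing an O(n^3) matrix product reported by an untrusted co-processor, the verifier multiplies both sides by a random vector and compares the two O(n^2) results. The sketch below is a generic illustration of the check under that textbook formulation, not SLALOM's actual implementation:

```python
import random

def matmul(A, B):
    """Plain matrix product, standing in for the untrusted GPU computation."""
    n, m, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

def freivalds_check(A, B, C, rounds=10):
    """Probabilistically verify that C == A x B.
    Each round draws a random 0/1 vector r and compares A(Br) with Cr:
    O(n^2) work per round instead of O(n^3) for a full recomputation,
    with error probability at most 2**-rounds when C is wrong."""
    n = len(C[0])
    for _ in range(rounds):
        r = [random.randint(0, 1) for _ in range(n)]
        Br = [sum(B[i][j] * r[j] for j in range(n)) for i in range(len(B))]
        ABr = [sum(A[i][k] * Br[k] for k in range(len(Br))) for i in range(len(A))]
        Cr = [sum(C[i][j] * r[j] for j in range(n)) for i in range(len(C))]
        if ABr != Cr:
            return False  # caught an incorrect product
    return True  # consistent in every round: accept
```

A correct product always passes, while a tampered result is rejected with overwhelming probability after a handful of rounds, which is why the check is cheap enough to run inside a TEE.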
State of the art. We review here the already existing and potential relations between MIR and musicology, digital libraries, education and eHealth, which we identified as particularly relevant for our field of research. Applications in musicology The use of technology in music research has a long history (e.g. see Goebl [19] for a review of measurement techniques in music performance research). Before MIR tools became available, music analysis was often performed with hardware or software created for other purposes, such as audio editors or speech analysis tools. For example, Repp used software to display the time-domain audio signal, and he read the onset times from this display, using audio playback of short segments to resolve uncertainties [27]. This methodology required a large amount of human intervention in order to obtain sufficiently accurate data for the study of performance interpretation, limiting the size and number of studies that could be undertaken. For larger scale and quantitative studies, automatic analysis techniques are necessary. An example application of MIR to music analysis is the beat tracking system BeatRoot [15], which has been used in studies of expressive timing [18, 20, 30]. The SALAMI (Structural Analysis of Large Amounts of Music Information) project is another example of facilitation of large-scale computational musicology through MIR-based tools. A general framework for visualisation and annotation of musical recordings is Sonic Visualiser [8], which has an extensible architecture with analysis algorithms supplied by plug-ins. Such audio analysis systems are becoming part of the standard tools employed by empirical musicologists [9, 10, 22], although there are still limitations on the aspects of the music that can be reliably extracted, with details such as tone duration, articulation and the use of the pedals on the piano being considered beyond the scope of current algorithms [24].
Other software such as GRM Acousmographe, IRCAM Audiosculpt [5], Praat [4] and the MIRtoolbox, which supports the extraction of high-level descriptors suitable for systematic musicology applications, are also commonly used. For analysing musical scores, the Humdrum toolkit [21] has been used extensively. It is based on the UNIX operating system's model of providing a large set of simple tools which can be combined to produce arbitrarily complex operations. Recently, music21 [11] has provided a more contemporary toolkit, based on the Python programming langua...
State of the art. (a) Comcast and the City acknowledge that the technology of Cable Systems is an evolving field. Comcast’s Cable System in the City shall be capable of offering Cable Services that are comparable to other Cable Systems owned and managed by Comcast or its Affiliated Entities in the County of Allegheny in the Commonwealth of Pennsylvania (“Comparable Systems”) pursuant to the terms of this section. The City may send a written notice to Comcast, not to exceed one request every two (2) years, requesting information on Cable Services offered by such Comparable Systems.
(b) If the identified Cable Services are being offered by Comcast and/or its Affiliated Entities to at least forty percent (40%) of the total Subscribers in the Comparable Systems, then the City may require that Comcast make such Cable Services available in the City. Should the City determine that Comcast shall provide comparable Cable Services, then the City and Comcast shall enter into good faith discussions to negotiate a schedule for deployment of such Cable Services. The discussions shall take into consideration the benefits from the provision of such Cable Services, the cost of implementing them in the City, the technical and economic feasibility of implementing such improvements, and the impact, if any, on Subscriber rates.
State of the art. 2.1 Regulatory and legal compliance In a broad definition, compliance is the conformance of human or artificial behaviour with a set of rules, norms, principles, or values. In the Internet of Things (IoT), the notion of compliance has been extended accordingly: if humans must be compliant, so must cyber-physical systems (CPS), autonomous and intelligent systems (AI/S), socio-technical systems (STS), and socio-cognitive technical systems (SCTS). Regulatory and legal compliance should be carefully distinguished. Regulatory compliance refers to the concepts, languages and methodologies developed within the business, commercial and corporate fields to design, control and monitor business processes and activities in advance. Legal compliance refers to the formal developments that can be deemed 'legal' according to the norms, principles, and jurisdictions of regional, national, international, and transnational legal systems. The two certainly converge, but their meanings should be kept separate, as additional requirements must be met for legal compliance to be accorded by official bodies. This is linked to the Compliance by Design (CbD) schemes that have been developed in the corporate business field since the beginning of the century to cope with the constraints set by the Xxxxxxxx-Xxxxx Act (2002), a US Federal law that laid down new requirements for public company boards and accounting firms. There is some confusion in this regard: in the computer science literature, regulatory compliance also denotes "the act and process on ensuring adherence to laws" that involves "discovering, extracting and representing different requirements from laws and regulations that affect a business process" (Xxxxxxx et al., 2015).
In the past twenty years, several formal languages have been developed to carry out these tasks, following a variety of methodologies and techniques described many times in the literature on the subject, within five main fields: (i) deontic logic (temporal deontic logic and computational tree logic), (ii) Petri nets, (iii) graph-based business modelling—BPMN, event-driven process chain (EPC diagrams), unified modelling language (UML)—, (iv) goal-oriented languages, and (v) languages for the semantic web (LegalXML, LegalRuleML). Figure 1 maps these different trends.
Figure 1. Languages for business compliance models. SDL: Standard Deontic Logic, CTL: Computation Tree Logic, BPMN: Business Process Model and Notation, EPC: Event-driven Process Chain, UML: Uni...
State of the art. As Systemic is a composite security solution aggregating several security functions, there is no competing offering or solution for such a compound functional definition, so the state-of-the-art analysis has to be worked out per offered function. Our survey of the integrity and confidentiality state of the art focuses on future trends for 5G software security, namely taking advantage of hardware trusted execution environments. We have assembled an in-depth survey of academic research leveraging Intel's SGX for network security so...