Performance Comparison Sample Clauses

Performance Comparison. We analyze both communication and computation costs for the join, leave, merge, and partition protocols. In doing so, we focus on the number of rounds, messages, serial exponentiations, signature generations, and signature verifications. Note that we use RSA signatures for message authentication, since RSA is particularly efficient in verification. We distinguish between serial and total measures. The serial measure assumes parallelization within each protocol round and represents the greatest cost incurred by any participant in a given round. The total measure is the sum of all participants’ costs in a given round.

We compare the STR protocols with TGDH, which is known to be among the most efficient in both communication and computation. A detailed comparison with other group key agreement protocols, such as GDH.3 [27] and BD (Xxxxxxxxx-Xxxxxxx) [11], can be found in [2]. Table 6.1 summarizes the communication and computation costs of both protocols. The numbers of current group members, merging members, merging groups, and leaving members are denoted by n, m, k, and p, respectively. The height of the key tree constructed by the TGDH protocol is h. The overhead of the TGDH protocol depends on the tree height, the balancedness of the key tree, the location of the joining tree, and the location of the leaving nodes. In our analysis, we assume the worst-case configuration and list the worst-case cost for TGDH. The number of modular exponentiations for a leave event in STR depends on the location of the deepest leaving node. We therefore compute the average cost, i.e., the case when the n/2-th node leaves the group. For all other events and protocols, exact costs are shown.

In the current implementations of TGDH and STR, all group members recompute bkeys that have already been computed by the sponsors. This provides a weak form of key confirmation, since a user who receives a token from another member can check whether his bkey computation is correct. This computation, however, can be removed for better efficiency, and we take this optimization into account when counting the number of exponentiations. The computation cost of STR is clearly fairly high: O(m) for merge and O(n) for subtractive events. However, as mentioned in Section 1, this high cost becomes negligible when STR is used in a high-delay wide-area network. Evidence supporting this claim can be found in [2].

Table 6.1:
                 Communication         Computation
                 Rounds   Messages     Exponentiations   Signatures   Verifications
TGDH   Join      2        3            3h − 3            2            3
       Leave     1        1            3h − 3            1            1
       Merge     ...
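To make the table concrete, the short sketch below (not part of the quoted text) evaluates the TGDH join and leave rows for a given group size n, assuming a worst-case tree height of h = ceil(log2 n); the function name and the height rounding are illustrative assumptions only.

    import math

    def tgdh_event_costs(n):
        """Worst-case TGDH join/leave costs from Table 6.1 above.
        Assumes tree height h = ceil(log2(n)); this rounding is an assumption,
        not taken from the quoted text."""
        h = math.ceil(math.log2(n))
        return {
            "join":  {"rounds": 2, "messages": 3, "exponentiations": 3 * h - 3,
                      "signatures": 2, "verifications": 3},
            "leave": {"rounds": 1, "messages": 1, "exponentiations": 3 * h - 3,
                      "signatures": 1, "verifications": 1},
        }

    if __name__ == "__main__":
        for n in (8, 64, 512):
            print(n, tgdh_event_costs(n))

For example, with n = 512 and h = 9 under this assumption, a join costs 3·9 − 3 = 24 exponentiations.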
Performance Comparison. In this section, to evaluate the performance of the proposed scheme, we compare it with other existing DLP-based password authentication schemes.
Performance Comparison. This section compares the proposed protocol with other existing group key agreement protocols in terms of communication and computation costs. The result is shown in Table I (where n is the total number of participants). The following notation is used in the comparison:
• PM: number of scalar point multiplications.
• Message: total number of messages exchanged during the group key generation process (including unicast and broadcast).
• n: total number of participants.
• Pairings: number of bilinear pairing computations needed in the key agreement process (zero in the case of our proposal).
• h=log3n: the height of the original key tree in the proposed technique.

Table I (Protocol / Rounds or tree height / Messages / PM):
TGECDH [12]          h=log2n     2*(n-1)            n*(n-1)+n*h
Xxxxx et al. [13]    h=log2n     2*(n-1)            n*(n-1)+n*h
GDH.2 [14]           n           n                  n*(n+3)/2-1
GDH.3                n+1         2n-1               5n-6
BD [14]              n           2n                 n*(n+1)
Proposed protocol    h=log3n     Floor[3*(n-1)/2]   5*(n-1)/2+h*n
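A minimal sketch (illustrative only, not from the cited papers) that plugs group sizes into the Message and PM formulas of Table I for the proposed protocol and TGECDH; the helper ilog and the rounding of the tree height to an integer are assumptions.

    def ilog(base, n):
        """Smallest integer h with base**h >= n (integer tree height; an assumption)."""
        h = 0
        while base ** h < n:
            h += 1
        return h

    def proposed_costs(n):
        h = ilog(3, n)                    # h = log3(n), per Table I
        messages = (3 * (n - 1)) // 2     # Floor[3*(n-1)/2]
        pm = 5 * (n - 1) / 2 + h * n      # 5*(n-1)/2 + h*n
        return messages, pm

    def tgecdh_costs(n):
        h = ilog(2, n)                    # h = log2(n), per Table I
        messages = 2 * (n - 1)            # 2*(n-1)
        pm = n * (n - 1) + n * h          # n*(n-1) + n*h
        return messages, pm

    if __name__ == "__main__":
        for n in (9, 27, 81):
            print(n, "proposed:", proposed_costs(n), "TGECDH:", tgecdh_costs(n))

As the formulas indicate, the PM count grows roughly as n·log n for the proposed protocol and as n² for TGECDH.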
Performance Comparison. Since Xx-Xxxxx’x protocol [29], Xxxx et al.’s protocol [30], and Hwang et al.’s protocol [31] are more efficient and more secure than other existing DBAKA protocols [4, 21-22, 24, 28, 31], we compare our scheme only with the three schemes [29-30, 32] in terms of storage cost, computational cost, and communication cost. To measure the message size, we assume that each identity is 32 bits long. The output size of the hash function is 160 bits (e.g., if we use the SHA-1 hash function), and the block size of symmetric encryption/decryption (for example, AES) is 128 bits. The order q of the generator Q in the elliptic curve group G is a 160-bit prime, and p is a 163-bit prime. Such a choice of q and p delivers a level of security comparable to that of 1024-bit ElGamal encryption over a general finite field. Since an element of G is a point on the curve E(Fp), it has two affine coordinates. By using the point compression method, one can bring these two elements of Fp down to a single element of Fp plus one bit, i.e., the y-coordinate of each point in the group can be recovered from its x-coordinate and that extra bit.
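As an illustration of how these parameter sizes translate into message lengths, the sketch below (hypothetical, not one of the compared schemes' actual message formats) tallies the bit length of a message consisting of one identity, one hash digest, and one compressed group element.

    # Bit lengths taken from the clause above; the message composition is hypothetical.
    ID_BITS = 32          # identity
    HASH_BITS = 160       # hash output (e.g., SHA-1)
    POINT_BITS = 163 + 1  # compressed point over a 163-bit field: x-coordinate plus one bit

    def message_bits(n_ids=1, n_hashes=1, n_points=1):
        """Total size in bits of a message built from the components above."""
        return n_ids * ID_BITS + n_hashes * HASH_BITS + n_points * POINT_BITS

    if __name__ == "__main__":
        total = message_bits()
        print(total, "bits =", total / 8, "bytes")  # 356 bits = 44.5 bytes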

Related to Performance Comparison

  • Performance Metrics The “Performance Metrics” for the Performance Period are: (i) the JD Power Residential National Large Segment Survey for investor-owned utilities; (ii) the System Average Interruption Frequency Index (Major Events Excluded) (“XXXXX”); (iii) Arizona Public Service Company’s customer to employee improvement ratio; (iv) the OSHA rate (All Incident Injury Rate); (v) nuclear capacity factor; and (vi) coal capacity factor. (1) With respect to the Performance Metric described in clause (i) of this Subsection 6(a), the JD Power Residential National Large Segment Survey will provide data on an annual basis reflecting the Company’s percentile ranking, relative to other participating companies. (2) With respect to the Performance Metric described in clause (ii) of this Subsection 6(a), the Edison Electric Institute (“EEI”) will provide data on an annual basis regarding the XXXXX result of the participating companies; the Company will calculate its XXXXX result for the year in question and determine its percentile ranking based on the information provided by EEI. (3) With respect to the Performance Metric described in clause (iii) of this Subsection 6(a), SNL, an independent third party data system, will provide data on an annual basis regarding the customer and employee counts; the Company will use its customer and employee counts for the year in question and determine its percentile ranking based on the information provided by SNL. Only those companies whose customers and employees were included in the data provided by SNL in each of the years of the Performance Period will be considered. (4) With respect to the Performance Metric described in clause (iv) of this Subsection 6(a), EEI will provide data on an annual basis regarding the OSHA rate of the participating companies; the Company will calculate its OSHA rate for the year in question and determine its percentile ranking based on the information provided by EEI. (5) With respect to the Performance Metric described in clause (v) of this Subsection 6(a), SNL will provide data on an annual basis regarding the nuclear capacity factors of the participating nuclear plants; the Company will calculate its nuclear capacity factor for the year in question and determine its percentile ranking based on the information provided by SNL. Only those plants that were included in the data provided by SNL in each of the years of the Performance Period will be considered. (6) With respect to the Performance Metric described in clause (vi) of this Subsection 6(a), SNL will provide data on an annual basis regarding the coal capacity factors of the participating coal plants; the Company will calculate its coal capacity factor for the year in question and determine its percentile ranking based on the information provided by SNL. Only those plants that were included in the data provided by SNL in each of the years of the Performance Period will be considered. (7) The Company’s percentile ranking during the Performance Period for each Performance Metric will be the average of the Company’s percentile ranking for each Performance Metric during each of the three years of the Performance Period (each, an “Average Performance Metric”); provided, however, that if the third year of a Performance Metric is not calculable by December 15 of the following year, the Performance Metric shall consist of the three most recent years for which such Performance Metric is calculable. 
The Company’s “Average Performance,” for purposes of determining any Base Grant adjustments pursuant to Subsection 5(b) above will be the average of the Average Performance Metrics. If only quartile, rather than percentile, rankings are available for a particular Performance Metric, the Average Performance Metric for any such Performance Metric shall be expressed as a percentile. For example, if the Performance Metric was in the top quartile for two Performance Periods and in the lowest quartile in the other Performance Period, the average of these quartiles would be 3 (the average of 4, 4, and 1) and the Average Performance Metric would be the 75th percentile (3 /4). The calculations in this Subsection 6(a)(7) will be verified by the Company’s internal auditors. (8) If either EEI or SNL discontinues providing the data specified above, the Committee shall select a data source that, in the Committee’s judgment, will provide data most comparable to the data provided by EEI or SNL, as the case may be. If the JD Power Residential National Large Segment Survey for investor-owned utilities (or a successor JD Power survey) is not available during each of the years of the Performance Period, the Performance Metric associated with the JD Power Residential Survey (Subsection 6(a)(1)) will be disregarded and not included in the Company’s Average Performance for purposes of determining any Base Grant adjustments pursuant to Subsection 5(b).
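Where only quartile rankings are available, the conversion described above can be sketched as follows; the function name and inputs are illustrative and form no part of the agreement.

    def average_performance_metric(quartile_ranks):
        """Average quartile ranks (1 = lowest, 4 = top) and express the result as a
        percentile, as in the worked example above (4, 4, 1 -> average 3 -> 75th percentile)."""
        avg_quartile = sum(quartile_ranks) / len(quartile_ranks)
        return avg_quartile / 4 * 100

    if __name__ == "__main__":
        print(average_performance_metric([4, 4, 1]))  # 75.0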

  • Performance Management 17.1 The Contractor will appoint a suitable Account Manager to liaise with the Authority's Strategic Contract Manager. Any/all changes to the terms and conditions of the Agreement will be agreed in writing between the Authority's Strategic Contract Manager and the Contractor's appointed representative. 17.2 The Contractor will ensure that there will be dedicated resources to enable the smooth running of the Framework Agreement and a clear plan of contacts at various levels within the Contractor's organisation. Framework Public Bodies may look to migrate to this Framework Agreement as and when their current contractual arrangements expire. The Contractor will where necessary assign additional personnel to this Framework Agreement to ensure agreed service levels are maintained and to ensure a consistent level of service is delivered to all Framework Public Bodies. 17.3 In addition to annual meetings with the Authority's Strategic Contract Manager, the Contractor is expected to develop relationships with nominated individuals within each of the Framework Public Bodies to ensure that the level of service provided on a local basis is satisfactory. Where specific problems are identified locally, the Contractor will attempt to resolve such problems with the nominated individual within that organisation. The Authority's Strategic Contract Manager will liaise (or meet as appropriate) regularly with the Framework Public Bodies' Contract Manager, and where common problems are identified, it will be the responsibility of the Contractor to liaise with the Authority's Strategic Contract Manager to agree a satisfactory course of action. Where the Contractor becomes aware of a trend that would have a negative effect on one or more of the Framework Public Bodies, they should immediately notify the Authority's Strategic Contract Manager to discuss corrective action. 17.4 Regular meetings, frequency to be advised by the Framework Public Body, will be held between the Framework Public Bodies' Contract Manager and the Contractor's representative to review the performance of their Call-Off Contract(s) under this Framework Agreement against the agreed service levels as measured through Key Performance Indicators (KPIs). Reports will be provided by the Contractor to the Framework Public Bodies' Contract Manager at least 14 days prior to these meetings. 17.5 Performance review meetings will also be held annually, between the Authority's Strategic Contract Manager and the Contractor's representative to review the performance of the Framework Agreement against the agreed service levels as measured through Key Performance Indicators. A summary of the quarterly reports will be provided by the Contractor at least 14 days prior to these meetings. 17.6 The Authority will gather the outputs from contract management to review under the areas detailed in the table below:
    • Provision of management reports: 90% to be submitted within 10 working days of the month end.
    • Report any incident affecting the delivery of the Service(s) to the Framework Public Body: 100% to be reported in writing to the FPB within 24 hours of the incident being reported by telephone/email.
    • Prompt payment of sub-contractors and/or consortia members (if applicable), maximum of 30 days from receipt of payment from Framework Public Bodies (10 days target): 100% within 30 days.

  • Performance Measurement The Uniform Guidance requires completion of OMB-approved standard information collection forms (the PPR). The form focuses on outcomes, as related to the Federal Award Performance Goals that awarding Federal agencies are required to detail in the Awards.

  • Metrics Institutional Metrics System-Wide Metrics

  • Performance Measure Grantee will adhere to the performance measures requirements documented in

  • Performance Monitoring A. Performance Monitoring of Subrecipient by County, State of California and/or HUD shall consist of requested and/or required written reporting, as well as onsite monitoring by County, State of California or HUD representatives. B. County shall periodically evaluate Subrecipient’s progress in complying with the terms of this Contract. Subrecipient shall cooperate fully during such monitoring. County shall report the findings of each monitoring to Subrecipient. C. County shall monitor the performance of Subrecipient against the goals, outcomes, milestones and performance standards required herein. Substandard performance, as determined by County, will constitute non-compliance with this Contract for which County may immediately terminate the Contract. If action to correct such substandard performance is not taken by Subrecipient within the time period specified by County, payment(s) will be denied in accordance with the provisions contained in this Paragraph 47 of this Contract. D. HUD in accordance with 24 CFR Part 570 Subpart O, 570.902, will annually review the performance of County to determine whether County has carried out its Community Development Block Grant (CDBG) assisted activities in a timely manner and has significantly disbursed CDBG funds and met the mandated “1.5 ratio” threshold. Subrecipient is responsible to ensure timely drawdown of funds.

  • Goals Goals define availability, performance and other objectives of Service provisioning and delivery. Goals do not include remedies and failure to meet any Service Goal does not entitle Customer to a Service credit.

  • Performance Targets Threshold, target and maximum performance levels for each performance measure of the performance period are contained in Appendix B.

  • Performance Measures The System Agency will monitor the Grantee’s performance of the requirements in Attachment A and compliance with the Contract’s terms and conditions.

  • Performance Expectations The Charter School’s performance in relation to the indicators, measures, metrics and targets set forth in the CPF shall provide the basis upon which the SCSC will decide whether to renew the Charter School’s Charter Contract at the end of the charter term. This section shall not preclude the SCSC from considering other relevant factors in making renewal decisions.
