Common use of Time Definitions Clause in Contracts

Time Definitions. Reporting Tools: AMDB and KACE
Exceptions: Any assets within the sample set that are unavailable or inaccessible to Supplier tools and/or Supplier Personnel will be excluded from this calculation.
Supplemental definition of terms: In order to measure this SLA, each month a random sampling of ***% of the actively reporting KACE servers and workstations will be obtained. The criteria to determine if the CIs are compliant will be the following: Does the device exist in the AMDB? If no, then this counts as a failure. If yes, then is the device in ‘deployed’ status in the AMDB? If not in a ‘deployed’ status, then this counts as a failure. If yes and in a Deployed status, then compare the data pulled from KACE to the comparable field of data in the AMDB. If any of the fields for a device does not match, then the device counts as a failure. All others are accurate; this count feeds into the count of accurately reported CIs within the sample set. Each device included in a random sample audit will be tagged with a Last Audit Date. Devices with Last Audit Dates within the preceding *** will be excluded from the random sampling. The results of the audit and supporting data will be provided to Service Management per their processes to include the data in the SLA calculations and reporting requirements. Each device failing the audit will be remediated each month and will have an on-line edit function to record notes as to the remediation. Each ***, prior to the new sampling, the previous *** sampling and remediation will be snapped to an archive. Each ***, Asset Management will review the previous *** audit results with CLGX.
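The monthly sampling step described above can be sketched in code. This is an illustrative sketch only: the redacted sampling percentage and look-back window are stand-ins, and the record fields (`id`, `last_audit`) are hypothetical, not field names from the contract.

```python
import random
from datetime import date, timedelta

def draw_sample(devices, sample_pct, lookback_days, today):
    """Randomly sample the given percentage of actively reporting devices,
    excluding any whose Last Audit Date falls within the preceding
    look-back window (they were audited too recently to be re-sampled)."""
    eligible = [d for d in devices
                if d["last_audit"] is None
                or (today - d["last_audit"]).days > lookback_days]
    # sample size rounds to the nearest whole device, with a floor of 1
    k = max(1, round(len(eligible) * sample_pct / 100))
    return random.sample(eligible, k)
```

The returned sample would then feed the pass/fail audit; the exclusion of recently audited devices is what the Last Audit Date tag enables.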
Low Volume Eligible: No
Threshold parameters: ***%
SLA Metrics and parameters: (field names in OPAS) Supported Company; Random ***% of the CI Sampling. Attributes to be compared for audit consist of the following: KACE ID (to identify matches in the AMDB); Host Name; Status; Domain; IP Address; MAC Address; OS Type; Service Pack; Hard Drive Size; Total RAM
Logical description of the SLA calculation: Report Criteria: Manual Audit will be conducted. Any failure on data based on attribute sampling of ***% of the CIs will be considered an SLA miss. A CI must meet the following requirements in order to pass the audit: Device in AMDB = Yes; AMDB Status = Deployed; AMDB Data = KACE Data
Calculations (will be done manually by Service Management): Audit Accuracy = the count of accurately reported CIs within the sample set / the total count of sample set CIs
8. 2.2.h KM-SM-CSAT Operations Managers
DATES SLA Start Date: 8/1/2013; First Reporting Period: 8/31/2013; First Report Date: 9/6/2013; Reporting Period: *** (every *** in ***); Reporting Frequency: *** (every ***), on or before the *** following the reporting period where the survey is closed
UNDERSTANDING Contract Reference: Category: CORELOGIC-Dell Schedule A-3.1 (Service Level Matrix); CORELOGIC-Dell Supplement A; Key Measurement
Interpreted Intent of SLA: The intent of this Service Level is to measure CoreLogic Operation Managers customer satisfaction
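The pass/fail criteria and the Audit Accuracy calculation above reduce to a short routine. A minimal sketch under assumed record structures: the dictionary layout and the `None`-for-missing convention are illustrative, while the three pass conditions mirror the clause (Device in AMDB = Yes; AMDB Status = Deployed; AMDB Data = KACE Data).

```python
def ci_passes_audit(amdb_record, kace_record, compared_fields):
    """A CI passes the audit only if it exists in the AMDB, its AMDB status
    is 'Deployed', and every compared KACE attribute matches the AMDB."""
    if amdb_record is None:                      # Device in AMDB = No -> failure
        return False
    if amdb_record.get("Status") != "Deployed":  # not in Deployed status -> failure
        return False
    # any mismatched attribute makes the device count as a failure
    return all(amdb_record.get(f) == kace_record.get(f) for f in compared_fields)

def audit_accuracy(pass_flags):
    """Audit Accuracy = accurately reported CIs in the sample set
    / total count of sample set CIs."""
    return sum(pass_flags) / len(pass_flags)
```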

Appears in 1 contract

Samples: Master Services Agreement (Corelogic, Inc.)


Time Definitions. Reporting Tools: OPAS Change Management
Exceptions: - Post implementation of a service pack, if Microsoft through their website releases a bug notification related to that specific Service Pack that has caused unsuccessful implementation of the patch. - Time between Dell's request for CoreLogic's approval and when CoreLogic provides its approval.
Supplemental definition of terms: Apply Patches to all instances of enterprise infrastructure applications, including acquiring, testing, and installing multiple patches (Service Pack). Patch management tasks include: maintaining current knowledge of available patches, deciding what patches are appropriate, ensuring that patches are installed properly, testing systems after installation, and documenting all associated procedures, such as specific configurations required. Definitions of Normal and Critical patches are part of the PPM documentation for Intel and Unix support.
Low Volume Eligible: No
Threshold parameters: ***% of normal patches applied in *** and critical patches applied in ***
SLA Metrics and parameters: (field names in OPAS) Requested for Company; Actual Start Date/Time; Actual End Date/Time; Completed Date/Time; Exclude from SLA Reporting; Product Categorization Tier 1; Product Categorization Tier 2; Product Categorization Tier 3; Operational Categorization Tier 1; Operational Categorization Tier 2; Operational Categorization Tier 3; Excluded from SLA reporting
Logical description of the SLA calculation: Report Criteria: Requested Company = “CORELOGIC”, “RELS”, “FINITI”, “STARS”; Completed Date and Time = within the SLA reporting period; Product Categorization 1 = “HARDWARE”; Product Categorization 2 = “SERVER”; Operational Category Tier 1 = “INSTALL”; Operational Category Tier 2 = “CODE”; Operational Category Tier 3 = “PATCH-MAINTENANCE”; Excluded from SLA Reporting <> “Yes”; Performance Rating = 5; Priority = Medium (target is Normal); Priority = High (target is Critical)
Calculations: Critical Patches SLA = Complete Date/Time - Submit Date/Time (should be equal to ***). Normal Patches SLA = Complete Date/Time - Submit Date/Time (should be equal to ***). Service Level Achievement = (Total number of Enterprise Patches that are successfully installed in accordance with SLA / Total number of Enterprise Patches that are scheduled to be completed during the Measurement Period) * 100%
CONFIDENTIAL MATERIAL APPEARING IN THIS DOCUMENT HAS BEEN OMITTED AND FILED SEPARATELY WITH THE SECURITIES AND EXCHANGE COMMISSION IN ACCORDANCE WITH THE SECURITIES EXCHANGE ACT OF 1934, AS AMENDED, AND RULE 24b-2 PROMULGATED THEREUNDER. OMITTED INFORMATION HAS BEEN REPLACED WITH “***”.
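The Service Level Achievement formula for patches is a straightforward ratio. A minimal sketch, assuming a hypothetical per-patch record with `installed_ok` and `within_sla` flags (the contract's actual fields are the OPAS date/time and categorization fields listed above):

```python
def patch_sla_achievement(patches):
    """Service Level Achievement = (patches successfully installed in
    accordance with the SLA / patches scheduled to be completed during
    the Measurement Period) * 100%."""
    if not patches:
        return 100.0  # assumed convention when nothing was scheduled
    on_time = sum(1 for p in patches if p["installed_ok"] and p["within_sla"])
    return on_time / len(patches) * 100.0
```

Normal and Critical patches would each be evaluated against their own (redacted) time window before setting `within_sla`.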

Appears in 1 contract

Samples: Master Services Agreement (Corelogic, Inc.)

Time Definitions. Reporting Tools: OPAS Change Management
Exceptions: - Post implementation of a service pack, if Microsoft through their website releases a bug notification related to that specific Service Pack that has caused unsuccessful implementation of the patch. - Time between NTT's request for CoreLogic's approval and when CoreLogic provides its approval.
Supplemental definition of terms: Apply Patches to all instances of non-enterprise infrastructure applications, including acquiring, testing, and installing multiple patches (Service Pack). Patch management tasks include: maintaining current knowledge of available patches, deciding what patches are appropriate, ensuring that patches are installed properly, testing systems after installation, and documenting all associated procedures, such as specific configurations required. Definitions of Normal and Critical patches are part of the PPM documentation.
Low Volume Eligible: No
Threshold parameters: ***% of normal patches applied in *** and critical patches applied in ***
SLA Metrics and parameters: (field names in OPAS) Requested for Company; Actual Start Date/Time; Actual End Date/Time; Completed Date/Time; Exclude from SLA Reporting; Product Categorization Tier 1; Product Categorization Tier 2; Product Categorization Tier 3; Operational Categorization Tier 1; Operational Categorization Tier 2; Operational Categorization Tier 3; Excluded from SLA reporting
Logical description of the SLA calculation: Report Criteria: Requested Company = “CORELOGIC”, “RELS”, “FINITI”, “STARS”; Completed Date and Time = within the SLA reporting period; Product Categorization 1 = “HARDWARE”; Product Categorization 2 = “SERVER”; Operational Category Tier 1 = “INSTALL”; Operational Category Tier 2 = “CODE”; Operational Category Tier 3 = “PATCH-MAINTENANCE”; Excluded from SLA Reporting <> “Yes”; Performance Rating = 5; Priority = Medium (target is Normal); Priority = High (target is Critical)
Calculations: Critical Patches SLA = Complete Date/Time - Submit Date/Time (should be equal to ***). Normal Patches SLA = Complete Date/Time - Submit Date/Time (should be equal to ***). Service Level Achievement = (Total number of Enterprise Patches that are successfully installed in accordance with SLA / Total number of Enterprise Patches that are scheduled to be completed during the Measurement Period) * 100%
CONFIDENTIAL MATERIAL APPEARING IN THIS DOCUMENT HAS BEEN OMITTED AND FILED SEPARATELY WITH THE SECURITIES AND EXCHANGE COMMISSION IN ACCORDANCE WITH THE SECURITIES EXCHANGE ACT OF 1934, AS AMENDED, AND RULE 24b-2 PROMULGATED THEREUNDER. OMITTED INFORMATION HAS BEEN REPLACED WITH “***”.

Appears in 1 contract

Samples: Master Services Agreement (Corelogic, Inc.)

Time Definitions. Reporting Tools: OPAS Request Management, OPAS Incident Records
Exceptions: - Any hold in approval processes or Change Request initiated by CoreLogic. - Any termination requests over 40 during a given day will not be measured. - Pending time from a valid Pending Event.
Supplemental definition of terms: Resolution Time means the elapsed time from the record being created to the time the record is completely resolved. Termination Requests will include: Active Directory ID; CRM; Mainframe; Oracle; Remote Access/VPN; UNIX; RightFax; TeamForge; TimeTracker; AS/400-iSeries; Enterprise Business Applications
Low Volume Eligible: No
Threshold parameters: ***% within *** and ***% within ***
SLA Metrics and parameters: (field names in OPAS) Supporting Company; Request Summary Name; Request Approval Date/Time; Request Assigned Date/Time; Request Completed Date/Time; Pending Time; Excluded in SLA Reporting
Logical description of the SLA calculation: Report Criteria: Supporting Company = “CORELOGIC”, “FINITI”, “RELS”, “STARS”; Request Summary = “Terminate A Resource (employee/contractor/vendor)” and Work Order Summary is “Terminate System Access” or “Terminate Mainframe Access” or “Terminate Remote Access VPN” AND all incident records for the period where Summary = “TERMINATION_CLGX:Termination*”; Request Closed Date/Time is within the reporting period; Excluded in SLA Reporting <> Yes
Calculations: Request SLA = latest Work Order Completed Date/Time – earliest Work Order Assigned Date/Time. Two (2) thresholds will be computed: Threshold (1) Service Level Achievement = (Number of successfully completed requests within *** / Total number of requests in the measurement period) * 100%. Threshold (2) Service Level Achievement = (Number of successfully completed requests within *** / Total number of requests in the measurement period) * 100%
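The two-threshold calculation above can be illustrated directly. A hedged sketch: the work-order dictionaries and the example thresholds are assumptions standing in for the redacted time windows, but the arithmetic (latest completion minus earliest assignment, then a within-threshold ratio per threshold) follows the clause.

```python
from datetime import datetime, timedelta

def request_sla(work_orders):
    """Request SLA = latest Work Order Completed Date/Time
    minus the earliest Work Order Assigned Date/Time."""
    return (max(w["completed"] for w in work_orders)
            - min(w["assigned"] for w in work_orders))

def achievement(durations, threshold):
    """(Number of successfully completed requests within the threshold
    / total number of requests in the measurement period) * 100%."""
    return sum(1 for d in durations if d <= threshold) / len(durations) * 100.0
```

Each termination request yields one `request_sla` duration; the same list of durations is then scored twice, once per threshold.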

Appears in 1 contract

Samples: Master Services Agreement (Corelogic, Inc.)

Time Definitions. Reporting Tools: OPAS Service Requests
Exceptions: - Batch new user setups will be excluded from the On-Time Completion percentage calculation. - Remote Location: The eleventh or more new user setup per day per remote location will be excluded from SLA calculation. - Campus Location: The sixteenth or more new user setup per day per campus location will be excluded from SLA calculation. - Time before arrival of assets will be excluded from measurement (i.e., the ticket is submitted after arrival of all necessary assets). - Requests for Non-Standard Hardware that require unique configuration requirements not previously defined. - A Batch will be defined as *** new user set-ups per day for Remote locations and *** new user set-ups per day for Campus locations.
Supplemental definition of terms: Measures the percentage of New Authorized User setup requests that are completed within the required timeframes during the Measurement Period. Request Name: User Provisioning and Work Order = Computer Hardware Implementation
Low Volume Eligible: Yes, provided that if any single request is not resolved within ***, then this Low Volume exception shall not apply
Threshold parameters: ***% of requests completed within ***
SLA Metrics and parameters: (field names in OPAS) Requested by Company; Request Summary; Closed Date/Time; Submitted Date/Time; Approval Date/Time; Pending Time; Status; Excluded from SLA Reporting
Logical description of the SLA calculation: Report Criteria: All Request Records with Request Type = “New User Provisioning”. If Request Summary = “New User Provisioning”, get Work Order Name = “*EUD*” or Work Order Name = “Desk Phone*” or Work Order Name = “Mobile Device*” or (Request Type = “TEMP NUP CLGX-Add or Install End User Device” or “TEMP NUP CLGX-Computer Hardware” or “TEMP NUP CLGX-Desk Phone” or “TEMP NUP CLGX-TEM-Mobile Device”). Closed Date/Time is within the reporting date of the SLA; Excluded from SLA reporting <> “Yes”; Status = “Closed”; CoreLogic authorized users, currently: Requested by Company = “CORELOGIC”, “RELS”, “FINITI”
Calculations: SLA Elapsed Time = Completed Date/Time - Approval Date/Time - Pending Time. Service Level Achievement = Number of End User Device Setups with SLA Elapsed Time <= *** / Total number of End User Device Setup Service Requests for the Measurement Period
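The elapsed-time formula above nets pending time out of the wall-clock interval before comparing against the (redacted) limit. A minimal sketch with assumed record fields; the 24-hour limit in the test is purely illustrative:

```python
from datetime import datetime, timedelta

def sla_elapsed_time(completed, approved, pending):
    """SLA Elapsed Time = Completed Date/Time - Approval Date/Time - Pending Time."""
    return completed - approved - pending

def setup_achievement(requests, limit):
    """Setups whose elapsed time is within the limit
    / total setup Service Requests for the Measurement Period."""
    on_time = sum(
        1 for r in requests
        if sla_elapsed_time(r["completed"], r["approved"], r["pending"]) <= limit)
    return on_time / len(requests)
```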

Appears in 1 contract

Samples: Master Services Agreement (Corelogic, Inc.)

Time Definitions. Reporting Tools: OPAS Service Requests
Exceptions: - Greater than *** in a single request is a batch and will be excluded from the On-Time Completion percentage calculation.
Continual Service Improvement: Yes
Supplemental definition of terms: Measures the percentage of Virtual Service Provisioning Requests (less than ***) that are fulfilled by Supplier within the required timeframe
Low Volume Eligible: Yes, provided that if any single request is not resolved within ***, then this Low Volume exception shall not apply
Threshold parameters: ***% of requests completed within ***
SLA Metrics and parameters: (field names in OPAS) Supporting Company; Request Summary; Work Order Summary; Request Approval Date/Time; Request Assigned Date/Time; Request Closed Date/Time; Pending Time; Excluded in SLA Reporting
Logical description of the SLA calculation: Report Criteria: Supporting Company = “CORELOGIC”, “FINITI”, “RELS”, “STARS”; Request Summary = “Virtual Server Install - Linux - Internal” or “Virtual Server Install - Windows - Internal” and Work Order Summary = “Provision Server” and Work Order Summary = “Service Device/QA”; Request Closed Date/Time is within the reporting period; Excluded in SLA Reporting <> Yes
Calculations: Request SLA = Latest Completed Date for Work Orders – Earliest Assigned Date for Work Orders – Total Pending Time. Service Level Achievement = (Number of successfully completed requests with Request SLA / Total number of requests in the measurement period) * 100%
CONFIDENTIAL MATERIAL APPEARING IN THIS DOCUMENT HAS BEEN OMITTED AND FILED SEPARATELY WITH THE SECURITIES AND EXCHANGE COMMISSION IN ACCORDANCE WITH THE SECURITIES EXCHANGE ACT OF 1934, AS AMENDED, AND RULE 24b-2 PROMULGATED THEREUNDER. OMITTED INFORMATION HAS BEEN REPLACED WITH “***”.
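The virtual-server Request SLA spans all of a request's work orders and subtracts total pending time. A hedged sketch with assumed work-order records; only the arithmetic is taken from the clause:

```python
from datetime import datetime, timedelta

def virtual_request_sla(work_orders, total_pending):
    """Request SLA = Latest Completed Date for Work Orders
    - Earliest Assigned Date for Work Orders - Total Pending Time."""
    return (max(w["completed"] for w in work_orders)
            - min(w["assigned"] for w in work_orders)
            - total_pending)
```

A request counts toward Service Level Achievement when this duration falls within the redacted completion window.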

Appears in 1 contract

Samples: Master Services Agreement (Corelogic, Inc.)

Time Definitions. Reporting Tools: OPAS Service Requests
Exceptions:
- Batch new user setups will be excluded from the On-Time Completion percentage calculation.
- Remote Location: The eleventh or more new user setup per day per remote location will be excluded from SLA calculation.
- Campus Location: The sixteenth or more new user setup per day per campus location will be excluded from SLA calculation.
- Time before arrival of assets will be excluded from this measurement (i.e., ticket is submitted after arrival of all necessary assets).
- Request for Non-Standard Hardware that requires unique configuration requirements not previously defined.
- A Batch will be defined as *** new user set-ups per day for Remote locations and *** new user set-ups per day for Campus Locations.
Continual Service Improvement: Yes
Supplemental definition of terms: Measures the percentage of New Authorized User setup requests that are completed within required timeframes during the Measurement Period. Request Name: User Provisioning; Work Order = Computer Hardware Implementation
Low Volume Eligible: Yes, provided that if any single request is not resolved within ***, then this Low Volume exception shall not apply
Threshold parameters: ***% of requests completed within ***
SLA Metrics and parameters: (field names in OPAS) Requested by Company, Request Summary, Closed Date/Time, Submitted Date/Time, Approval Date/Time, Pending Time, Status, Excluded from SLA Reporting
Logical description of the SLA calculation:
Report Criteria: All Request Records with Request Type = "New User Provisioning". If Request Summary = "New User Provisioning", get Work Order Name = "*EUD*" or Work Order Name = "Desk Phone*" or Work Order Name = "Mobile Device*" or (Request Type = "TEMP NUP CLGX-Add or Install End User Device" or "TEMP NUP CLGX-Computer Hardware" or "TEMP NUP CLGX-Desk Phone" or "TEMP NUP CLGX-TEM-Mobile Device"). Closed Date/Time is within the reporting period. Excluded from SLA reporting <> "Yes". Status = "Closed". Requested by Company = "CORELOGIC", "RELS", "FINITI"
Calculations:
SLA Elapsed Time = Completed Date/Time - Approval Date/Time - Pending Time
Service Level Achievement = Number of End User Device Setups with SLA Elapsed Time <= *** / Total number of End User Device Setup Service Requests for the Period
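As an illustration only (the contract states these calculations are performed in OPAS), the SLA Elapsed Time and Service Level Achievement formulas above can be sketched in Python; the timestamps, the three-day target, and the sample requests below are all hypothetical:

```python
from datetime import datetime, timedelta

def sla_elapsed(completed: datetime, approved: datetime, pending: timedelta) -> timedelta:
    """SLA Elapsed Time = Completed Date/Time - Approval Date/Time - Pending Time."""
    return completed - approved - pending

def service_level_achievement(elapsed_times, target: timedelta) -> float:
    """Percentage of requests whose elapsed time is within the target."""
    met = sum(1 for t in elapsed_times if t <= target)
    return met / len(elapsed_times) * 100.0

# Hypothetical sample: two of three requests fall within a 3-day target.
reqs = [
    sla_elapsed(datetime(2013, 8, 5, 12), datetime(2013, 8, 2, 12), timedelta(hours=4)),
    sla_elapsed(datetime(2013, 8, 9, 9), datetime(2013, 8, 2, 9), timedelta(days=1)),
    sla_elapsed(datetime(2013, 8, 3, 8), datetime(2013, 8, 1, 8), timedelta()),
]
print(round(service_level_achievement(reqs, timedelta(days=3)), 1))  # 66.7
```

Note that Pending Time is subtracted before the target comparison, so time spent in an allowed pending state does not count against the Supplier.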

Appears in 1 contract

Samples: Master Services Agreement (Corelogic, Inc.)

Time Definitions. Reporting Tools: OPAS Incident Management, Change Management, Atrium
Exceptions:
- Single point of failure of hardware, software, or carrier services Root Cause
- All records that indicate not part of this SLA as part of RCA from clause "Will be measured using an Incident RCA based measurement approach", flagged as Excluded from SLA reporting
Supplemental definition of terms: Service availability for RightFax (including SQL server database) server to send and receive faxes.
Low Volume Eligible: No
Threshold parameters: ***%
SLA Metrics and parameters: (field names in OPAS) Supporting Company, Priority, Product Name, Incident ID, Incident Related Change, Incident Assigned Date/Time, Incident Resolved Date/Time, Change ID, Change CI Unavailability, Change CI Unavailability Start Date/Time, Change CI Unavailability End Date/Time
Logical description of the SLA calculation:
Report Criteria: Supporting Company = "CORELOGIC", "STARS", "RELS", "FINITI"; Priority = Critical or High; Service Tier = "RightFax"; Incident Resolved Date/Time is within the Period of Reporting; Incident Related Changes for all incidents during the Period of Reporting
Calculations: Service Level Achievement = (Total number of hours of RightFax UPTIME / Expected Uptime) * 100%
CONFIDENTIAL MATERIAL APPEARING IN THIS DOCUMENT HAS BEEN OMITTED AND FILED SEPARATELY WITH THE SECURITIES AND EXCHANGE COMMISSION IN ACCORDANCE WITH THE SECURITIES EXCHANGE ACT OF 1934, AS AMENDED, AND RULE 24b-2 PROMULGATED THEREUNDER. OMITTED INFORMATION HAS BEEN REPLACED WITH "***".
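For illustration, the availability calculation above reduces to a single ratio of uptime hours to expected uptime; the figures below (716 uptime hours out of an expected 720 for a 30-day month) are hypothetical:

```python
def rightfax_availability(uptime_hours: float, expected_uptime_hours: float) -> float:
    """Service Level Achievement = Total RightFax UPTIME hours / Expected Uptime * 100%."""
    return uptime_hours / expected_uptime_hours * 100.0

# Hypothetical example: 4 hours of unavailability in a 720-hour month.
print(round(rightfax_availability(716.0, 720.0), 2))  # 99.44
```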



Time Definitions. Reporting Tools: OPAS Service Requests
Exceptions:
- Any hold in approval processes or change request initiated by CoreLogic
- Supplier's failure to meet this Service Level in respect of any hardware-related Incidents shall be excused to the extent that such failure is caused by any Supplier third party provider's failure to perform, or delay in performing, any repair or replacement actions required to be performed by such third party provider in connection with the resolution of any such Incident; provided, that (i) Supplier uses commercially reasonable efforts to cause such third party providers to perform within the required time frame and (ii) to the extent documented in Supplier's Root Cause Analysis
- Pending Time based on allowed Pending Event
Supplemental definition of terms: Measures the time taken between processing of approved request(s)
Low Volume Eligible: No
Threshold parameters: ***% of requests completed within ***; ***% completed within ***
SLA Metrics and parameters: (field names in OPAS) Supporting Company, Request Summary Name, Request Approval Date/Time, Request Assigned Date/Time, Request Completed Date/Time, Pending Time, Excluded in SLA Reporting
Logical description of the SLA calculation:
Report Criteria: Supporting Company = "CORELOGIC", "FINITI", "RELS", "STARS"; Request Summary Name = Firewall Request or IP Assignment Request; Closed Date/Time is within the reporting period; Excluded in SLA Reporting <> Yes
Calculations:
Request SLA = Request Completed Date/Time - Request Approval Date/Time - Total Pending Time
***% - Service Level Achievement (within ***) = (Number of successfully completed requests within *** / Total number of requests in the measurement period) * 100%
***% - Service Level Achievement (within ***) = (Number of successfully completed requests within *** / Total number of requests in the measurement period) * 100%
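The two-tier achievement calculation above applies the same formula at two different targets; the sketch below uses hypothetical 2-day and 5-day targets and sample durations as stand-ins for the redacted *** values:

```python
from datetime import timedelta

def achievement_within(request_slas, target: timedelta) -> float:
    """(Number of requests completed within target / total requests) * 100%."""
    met = sum(1 for sla in request_slas if sla <= target)
    return met / len(request_slas) * 100.0

# Hypothetical Request SLA durations for five requests in a measurement period.
slas = [timedelta(days=d) for d in (1, 1, 2, 3, 6)]
print(achievement_within(slas, timedelta(days=2)))  # 60.0  (first tier)
print(achievement_within(slas, timedelta(days=5)))  # 80.0  (second tier)
```

Each tier is compared independently against its own ***% threshold, so a period can pass one tier and miss the other.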


Time Definitions. Reporting Tools: OPAS Service Requests
Exceptions:
- Greater than 10 in a single request is a batch and will be excluded from the On-Time Completion percentage calculation.
- Time before arrival of assets will be excluded from this measurement (i.e., ticket is submitted after arrival of all necessary assets.)
- Customer Approval Time not included in SLA calculation
- Pending Time based on Valid Pending Activity
Supplemental definition of terms: Measures the percentage of Physical Service Provisioning Service Requests that are fulfilled by Supplier within the required timeframe. This Service Level applies to standard configuration x*** servers
Low Volume Eligible: Yes, provided that if any single request is not resolved within ***, then this Low Volume exception shall not apply
Threshold parameters: ***% of requests completed within ***
SLA Metrics and parameters: (field names in OPAS) Supporting Company, Request Summary Name, Request Approval Date/Time, Request Assigned Date/Time, Request Closed Date/Time, Pending Time, Excluded in SLA Reporting
Logical description of the SLA calculation:
Report Criteria: Supporting Company = "CORELOGIC", "FINITI", "RELS", "STARS"; Request Summary = "Physical Server Install - UNIX/Linux - Internal" or "Physical Server Install - Windows - Internal"; Request Closed Date/Time is within the reporting period; Excluded in SLA Reporting <> Yes
Calculations:
Request SLA = Request Completed Date/Time - Request Approval Date/Time - Total Pending Time
Service Level Achievement = (Number of successfully completed requests with Request SLA / Total number of requests in the measurement period) * 100%
37. 2.4.j KM-MR-VIRTUAL SERVER PROVISIONING REQUEST
DATES SLA Start Date: 8/1/2013 First Reporting Period: 8/31/2013 First Report Date: 9/6/2013 Reporting Period: *** (every *** in ***) Reporting Frequency: *** (every ***), on or before the *** following the reporting period
UNDERSTANDING Contract Reference: Category: CORELOGIC-Dell Schedule A-3.1 (Service Level Matrix), CORELOGIC-Dell Supplement A Key Measurement
Interpreted Intent of SLA: The intent of this Service Level is to measure the requests to add server(s) into Cloud Environment: Less than *** VMs - Percentage of server provisioning requests that are successfully completed within the target timeframe


Time Definitions. Reporting Tools: OPAS Service Requests
Exceptions:
- Any Storage provisioning requests which exceed (in sum) more than ***% of allocated storage during the designated Measurement Period will be excluded.
- Locally attached storage
- Third Party Hardware Exception
Continual Service Improvement: Yes
Supplemental definition of terms: Measures the percentage of Tier 1, 2, and 3 storage configuration service requests, less than 100TB of all allocated storage, that are performed within the required timeframe during the Measurement Period. Allocated Storage is equivalent to last month's storage capacity report, which will be used as a monthly baseline for the ***% value.
Low Volume Eligible: Yes, provided that if any single request is not resolved within ***, then this Low Volume exception shall not apply
Threshold parameters: ***% of requests completed within ***
SLA Metrics and parameters: (field names in OPAS) Supporting Company, Request Summary Name, Request Approval Date/Time, Request Assigned Date/Time, Request Closed Date/Time, Pending Time, Excluded in SLA Reporting
Logical description of the SLA calculation:
Report Criteria: Supporting Company = "CORELOGIC", "FINITI", "RELS", "STARS"; Request Summary = "Server Storage Request" (question included at the work order level to indicate the request Total Size in GB); Request Closed Date/Time is within the reporting period; Total Size in GB < 100 TB; Excluded in SLA Reporting <> Yes
Calculations:
Request SLA = Request Completed Date/Time - Request Approval Date/Time - Total Pending Time
Service Level Achievement = (Number of successfully completed requests with Request SLA / Total number of requests in the measurement period) * 100%


Time Definitions. Reporting Tools: OPAS Service Requests, OPAS Incidents
Exceptions:
- Any hold in approval processes or Change Request initiated by CoreLogic
- Any valid Pending events
Supplemental definition of terms: Measures the time taken between processing of approved request to setup the account(s). SecAdmin will be the support group for Dell that handles User Access. New User Provisioning Request will include: Active Directory, Check Writer, CRM, Oracle, Unix, VPN, TeamForge, TimeTrack, Mainframe, RightFax, Enterprise or Business Unit Software (list provided in the NUP form where Dell is required to provision logical access), AS/400 iSeries
Low Volume Eligible: Yes, provided that if any single request is not resolved within 24 business hours, then this Low Volume exception shall not apply
Threshold parameters: ***% of requests completed within ***
SLA Metrics and parameters: (field names in OPAS) Supporting Company, Request Summary Name, Request Approval Date/Time, Work Order Assigned Date/Time, Work Order Completed Date/Time, Pending Time, Excluded in SLA Reporting
Logical description of the SLA calculation:
Report Criteria: Supporting Company = "CORELOGIC", "FINITI", "RELS", "STARS"; (Request Summary = "New User Provisioning" and Work Order Summary = "*SLA*") or Request Summary = "*TEMP NUP CLGX-Oracle Access" or "*TEMP NUP-SecAdmin-Mainframe/zSeries" or "*TEMP NUP UNIX User Addition" or "*TEMP NUP CLGX-Time Tracker" or "*TEMP NUP Digital Certificate (VPN)"; Request Closed Date/Time is within the reporting period; Work Order Complete Date/Time and Work Order Assigned Date/Time; Excluded in SLA Reporting <> Yes; AND all incident records with Summary = "PROVISION_CLGX:New User Activation*"
Calculations:
Request SLA = Latest Work Order Completed Date/Time - Earliest Work Order Assigned Date/Time - Total Pending Time
Service Level Achievement = (Number of successfully completed requests with Request SLA / Total number of requests in the measurement period) * 100%
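Because a single provisioning request spans multiple work orders (Active Directory, Mainframe, etc.), the Request SLA above runs from the earliest work order assignment to the latest work order completion. A minimal sketch with hypothetical work orders and pending time:

```python
from datetime import datetime, timedelta

def nup_request_sla(work_orders, pending: timedelta) -> timedelta:
    """Request SLA = latest Work Order Completed - earliest Work Order Assigned - Pending.

    `work_orders` is a list of (assigned, completed) datetime pairs
    for the work orders belonging to one provisioning request.
    """
    earliest_assigned = min(assigned for assigned, _ in work_orders)
    latest_completed = max(completed for _, completed in work_orders)
    return latest_completed - earliest_assigned - pending

# Hypothetical request with two work orders and 2 hours of valid pending time.
wos = [
    (datetime(2013, 8, 1, 9), datetime(2013, 8, 1, 17)),   # e.g., Active Directory
    (datetime(2013, 8, 1, 10), datetime(2013, 8, 2, 12)),  # e.g., Mainframe
]
print(nup_request_sla(wos, timedelta(hours=2)))  # 1 day, 1:00:00
```

Taking the min/max across work orders means the slowest work order drives the request's SLA result.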

