Data Validation Sample Clauses

Data Validation. Integrating multiple datasets can be fraught with difficulty, including inconsistent fields, missing datasets, and conflicting information. The Provider solution will need rules to ensure referential integrity between datasets:
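One such referential-integrity rule can be sketched as follows. The datasets, field names, and the customer/order relationship are illustrative assumptions, not terms from the clause:

```python
# Hypothetical referential-integrity rule between two integrated datasets:
# every child record must reference an existing parent record.
# All record and field names here are illustrative.

def check_referential_integrity(parents, children, parent_key, foreign_key):
    """Return child records whose foreign key has no matching parent."""
    parent_ids = {p[parent_key] for p in parents}
    return [c for c in children if c[foreign_key] not in parent_ids]

customers = [{"customer_id": 1}, {"customer_id": 2}]
orders = [
    {"order_id": 10, "customer_id": 1},
    {"order_id": 11, "customer_id": 3},  # orphan: customer 3 does not exist
]

orphans = check_referential_integrity(customers, orders,
                                      "customer_id", "customer_id")
print(orphans)  # only the order referencing the missing customer
```

A solution would run one such rule per cross-dataset relationship and reject or quarantine the orphaned records it returns.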
Data Validation. The MCO must require all Texas Health Steps Providers to submit claims for services paid (either on a capitated or fee-for-service basis) on the CMS 1500 claim form and use the HIPAA compliant code set required by HHSC. Encounter Data will be validated by chart review of a random sample of Texas Health Steps eligible enrollees against monthly Encounter Data reported by the MCO. HHSC or its designee will conduct chart reviews to validate that all screens are performed when due and as reported, and that reported data is accurate and timely. Substantial deviation between reported and charted Encounter Data could result in the MCO and/or Network Providers being investigated for potential Fraud, Abuse, or Waste without notice to the MCO or the Provider.
Data Validation. 1.2.3.1. Completion of sampling exercise to validate data delivered over the network compared to data read from the face of the meter while installed in the field. Representative sample to be taken by FPL to meet *** confidence level +/- *** confidence interval that the readings correlate (n=***) (same exercise as 1.1.3).
Data Validation. 1.5.3.1. Validation of sample of meters to occur for which outage indication has been provided over the network. 1.5.3.1.1. No more than *** of investigated meters are to be categorized as “false positives” meaning there is no explainable reason for the outage message to be produced.
Data Validation. The HMO must require all Texas Health Steps Providers to submit claims for services paid (either on a capitated or fee-for-service basis) on the CMS 1500 claim form and use the HIPAA compliant code set required by HHSC. Encounter Data will be validated by chart review of a random sample of Texas Health Steps eligible enrollees against monthly Encounter Data reported by the HMO. HHSC or its designee will conduct chart reviews to validate that all screens are performed when due and as reported, and that reported data is accurate and timely. Substantial deviation between reported and charted Encounter Data could result in the HMO and/or Network Providers being investigated for potential Fraud, Abuse, or Waste without notice to the HMO or the Provider.
Data Validation. 1. The parties agree that Defendants shall retain the services of a Data Validator for purposes of verifying and reporting, on a semi-annual basis, Defendants’ compliance with the exit criteria identified in this Agreement. The Data Validator shall be a third-party contractor of the State of Missouri that has prior experience conducting data validation services for state child welfare agencies. The Plaintiffs agree and understand that the services of a Data Validator will have to be retained in compliance with federal and state law governing procurement of contracts. Defendants will make best efforts to complete this process within four months from the date of this Agreement.
2. The Data Validator shall issue written reports pursuant to the schedule set forth below. The reports shall describe the measurable progress made by Defendants in relation to each of the exit criteria and reportable data elements contained in this Agreement for each six-month reporting period, as well as any issues or challenges encountered or observed by the Data Validator regarding the collection of performance data or its application to the exit criteria and data elements.
3. Sampling, data, and data analysis will be subject to review and approval by the Data Validator. Promptly after the Data Validator is retained, the parties shall work with the Data Validator to determine the appropriate means for measuring and reporting performance on each of the exit criteria and data sharing items, including ensuring that any case reviews conducted for purposes of measuring performance are based on a statistically valid, representative, random sample of Class Members. Data will be provided from the data source identified in Exhibit B. The sample files shall be drawn, without replacement, from Class Members (as opposed to all children in CD custody).
The parties agree that a sample is representative if, given the population size, the case review delivers a measurement with a 5% margin of error at the 90% confidence level. A non-exclusive example of a collection or measurement issue the parties must address and resolve with the Data Validator is excluding or limiting the use of Children or employees who “straddle” relevant time periods. 4. The Data Validator shall prepare and provide to the parties an agreed upon template that provides the format for data elements and reporting at the close of each Reporting Period on: (a) Defendants’ performance on each exit criterion and (b) Defendants’ p...
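The representativeness standard above (a 5% margin of error at the 90% confidence level) implies a minimum sample size, which can be sketched as follows. The finite-population correction and the worst-case proportion p = 0.5 are standard illustrative choices, not terms of the Agreement:

```python
import math

# Illustrative sample-size calculation (not the Data Validator's actual
# method): minimum n for a 5% margin of error at the 90% confidence level.
# z = 1.645 is the two-sided 90% z-score; p = 0.5 is the most conservative
# proportion assumption; the finite-population correction shrinks n for
# small populations.

def sample_size(population, margin=0.05, z=1.645, p=0.5):
    n0 = (z ** 2) * p * (1 - p) / margin ** 2        # infinite-population n
    return math.ceil(n0 / (1 + (n0 - 1) / population))  # finite correction

print(sample_size(500))     # small Class: far fewer cases needed
print(sample_size(10_000))  # large Class: n approaches ~271
```

The correction matters here because Class Member populations are finite and the files are drawn without replacement, as the clause requires.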
Data Validation. Encounter data is currently defined by AHS as a “claim”. All claims for payments are currently submitted to HP and undergo a series of automated edits and audits to ensure accuracy, timeliness, correctness, logic, consistency and completeness. Any claim failing edits will be rejected and must be re-submitted. Claims data will be collected in a format that is consistent with the HIPAA transaction standards in place on the date of service. Claims must represent services provided to Global Commitment to Health Demonstration enrollees only. DVHA will perform validation on a random sample of all claims to ensure that services were actually provided. AHS will have direct access to all information systems.
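The automated edit-and-reject flow described above can be sketched in miniature. The field names, the enrollee check, and the specific edits are assumptions for illustration, not HP's or AHS's actual edit set:

```python
# Hedged sketch of automated claim edits: a claim failing any edit is
# rejected and must be resubmitted. Field names and rules are illustrative.

REQUIRED_FIELDS = {"member_id", "provider_id", "service_date", "procedure_code"}

def edit_claim(claim, enrolled_members):
    """Return a list of edit failures; an empty list means the claim passes."""
    failures = []
    missing = REQUIRED_FIELDS - claim.keys()
    if missing:
        failures.append(f"incomplete: missing {sorted(missing)}")
    if claim.get("member_id") not in enrolled_members:
        failures.append("member is not a Global Commitment enrollee")
    return failures

enrollees = {"M100", "M101"}
good = {"member_id": "M100", "provider_id": "P1",
        "service_date": "2024-01-15", "procedure_code": "99213"}
bad = {"member_id": "M999", "provider_id": "P1"}

print(edit_claim(good, enrollees))  # [] -> claim accepted
print(edit_claim(bad, enrollees))   # two failures -> claim rejected
```

A production edit series would add timeliness, logic, and consistency checks in the same pattern, one function per edit.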
Data Validation. In general, the NAEMS researchers invalidated measurement data if the data values were:
• Unreasonably low or high when compared to normal ranges, if there was supporting evidence that the data value was not correct (e.g., a lagoon temperature sensor producing a reading of less than -40 °C).
• Obtained during system installation, testing, or maintenance, during which uncorrectable errors might be introduced.
• Obtained when a sensor or instrument was proven to be malfunctioning (e.g., unstable).
• Obtained during calibration or a precision check of a sensor or instrument, before the sensor or instrument reached equilibrium after the check.
• Obtained when the data acquisition and control hardware and/or software were not functioning correctly.
Data that the NAEMS researchers deemed invalid were retained in the preprocessed data sets; however, the EPA did not use the flagged data to calculate pollutant emission rates. For averaged data, values were invalidated to avoid the errors that partial-data days (e.g., only a few hours of valid data) would introduce into calculated mean values through biased time weights:
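The flag-and-retain approach described above can be sketched as follows: out-of-range readings are flagged rather than discarded, and only valid readings enter the mean used downstream. The range thresholds are assumptions for the example, not NAEMS values:

```python
# Illustrative flag-and-retain validation: every reading is kept in the
# preprocessed data set, but only valid readings contribute to averages.
# The -40..60 range is an assumed plausibility window, not a NAEMS limit.

def flag_readings(readings, low=-40.0, high=60.0):
    """Attach a validity flag to each reading; nothing is discarded."""
    return [{"value": v, "valid": low < v < high} for v in readings]

def mean_of_valid(flagged):
    vals = [r["value"] for r in flagged if r["valid"]]
    return sum(vals) / len(vals) if vals else None

flagged = flag_readings([21.5, 22.0, -55.0, 23.5])  # -55.0 fails the range check
print(len(flagged))          # all 4 readings retained in the data set
print(mean_of_valid(flagged))  # mean over the 3 valid readings only
```

The same pattern extends naturally to the other invalidation reasons (maintenance windows, calibration periods, instrument faults) by OR-ing additional flags per reading.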
Data Validation. The final step in the conversion process is data validation. The program developers will give much attention to data integrity during the testing phase. The conversion assistant will also spend time testing the integrity of the information. Balances and the output of processes will be tested after the conversion. A visual inspection of different modules will be performed by choosing records on a random basis. Data validation is, however, ultimately the responsibility of the Client.
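The post-conversion balance check and random spot inspection described above can be sketched like this. The record structure, tolerance, and seeded sampling are all illustrative assumptions:

```python
import random

# Hypothetical post-conversion spot check: draw a random sample of record
# keys and compare converted balances against the source system. The
# tolerance and record layout are assumptions for the example.

def spot_check(source, converted, sample_size, tolerance=0.005, seed=42):
    """Return sampled keys whose converted balance disagrees with the source."""
    keys = random.Random(seed).sample(sorted(source), sample_size)
    return [k for k in keys
            if k not in converted or abs(source[k] - converted[k]) > tolerance]

source = {"A-1": 100.00, "A-2": 250.50, "A-3": 75.25}
converted = {"A-1": 100.00, "A-2": 250.50, "A-3": 75.20}  # A-3 drifted

print(spot_check(source, converted, sample_size=3))  # flags the mismatch
```

A fixed seed keeps the sample reproducible so the Client can rerun the same inspection when signing off.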
Data Validation. 1.3.3.1 Completion of dual-metering exercise or alternate method to compare interval data captured at the meter with meter data collected and provided over the network. Method TBD.