Data Quality

ACER’s data quality assurance at a glance

ACER is committed to ensuring high-quality transaction and fundamental data reporting, and will continue to devote specialist supervisory efforts to this endeavour to further advance its market monitoring capabilities.

As part of its data quality framework, ACER assesses and ensures that the data received under REMIT and the Implementing Regulation are complete, accurate and timely. These data quality assessments are performed regularly on different data sets, covering transactions executed either at different organised market places or bilaterally.

The REMIT data quality assurance is based on a two-stage approach.

1. Data collection stage

During the data collection stage, data is inserted into ACER's REMIT Information System (ARIS), where validation rules are applied to the data.

There are two levels of data validation: the first is performed at a technical level, while the second, more in-depth level is managed at the database level, where integrity checks are undertaken across the reported and reference data. Any invalid data is rejected and flagged.

Data validation ensures the quality of the collected data so that it can be stored in ACER's REMIT database.
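To illustrate the two levels, here is a minimal Python sketch of such a validation pipeline. It is not ACER's implementation: the record fields, reference lists and rules below are hypothetical stand-ins for the ARIS schemas and reference data.

```python
from dataclasses import dataclass

# Hypothetical reference data standing in for ARIS reference lists.
KNOWN_PARTICIPANTS = {"A0001234.EU", "B0005678.EU"}
KNOWN_CONTRACTS = {"GAS_TTF_DA", "EL_EPEX_ID"}

@dataclass
class TradeRecord:
    participant_id: str
    contract_id: str
    price: float
    quantity: float

EXPECTED_FIELDS = ("participant_id", "contract_id", "price", "quantity")

def technical_check(raw: dict) -> list[str]:
    """Level 1: schema-style checks on the raw submission."""
    errors = [f"missing field: {f}" for f in EXPECTED_FIELDS if f not in raw]
    errors += [f"unexpected field: {f}" for f in raw if f not in EXPECTED_FIELDS]
    return errors

def integrity_check(record: TradeRecord) -> list[str]:
    """Level 2: integrity checks across reported and reference data."""
    errors = []
    if record.participant_id not in KNOWN_PARTICIPANTS:
        errors.append(f"unknown participant: {record.participant_id}")
    if record.contract_id not in KNOWN_CONTRACTS:
        errors.append(f"unknown contract: {record.contract_id}")
    if record.quantity <= 0:
        errors.append("quantity must be positive")
    return errors

def validate(raw: dict) -> tuple[TradeRecord | None, list[str]]:
    """Apply both levels; invalid data is rejected and flagged."""
    errors = technical_check(raw)
    if errors:
        return None, errors  # rejected at the technical level
    record = TradeRecord(**raw)
    errors = integrity_check(record)
    if errors:
        return None, errors  # rejected at the database level
    return record, []  # accepted for storage
```

A record that fails either level is returned together with its list of flags, mirroring the reject-and-flag behaviour described above.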

2. Data quality assessment stage

The data quality assessment stage applies further methods to confirm the quality of the data and ensures timely follow-up on any issues identified.

Non-compliant data at either stage can lead to enforcement of Article 8 of REMIT. ACER seeks to resolve any data quality issues in cooperation with reporting parties, but will initiate enforcement action if necessary.

The assessment covers the following data quality dimensions:

| Dimension | Description | Example |
| --- | --- | --- |
| Completeness | Have all data sets and items been reported? | The proportion of the stored data against the required 100% completeness. |
| Uniqueness | Is there a single view of the data set? | Every record should be reported only once; if it is reported twice, the system should detect it. |
| Timeliness | Is the data reported within the timeline defined by the Regulation and the Implementing Acts (IAs)? | The time difference between the timestamp at which a record was reported and the time at which the business event occurred. |
| Validity | Does the data comply with the schemas and pass the validation rules? | The extent to which the received data is valid: the number of rejected records compared to the total data set. |
| Accuracy | Does the data reflect the actual business event? | The degree to which a record correctly describes the reported business event (correct price, volume, units, timestamps, identifiers). |
| Consistency | Can the data set be matched across the various Registered Reporting Mechanisms (RRMs)? | No differences when comparing multiple representations of the same or similar business event. |
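As an illustration of how such dimensions can be quantified, the following Python sketch computes completeness, uniqueness and timeliness over a toy record set. The identifiers, timestamps and one-day reporting deadline are assumptions made for the example, not values taken from the Regulation.

```python
from datetime import datetime, timedelta

# Hypothetical records: (record_id, reported_at, event_at).
records = [
    ("T1", datetime(2024, 5, 1, 12, 30), datetime(2024, 5, 1, 12, 29)),
    ("T2", datetime(2024, 5, 2, 9, 0), datetime(2024, 5, 1, 8, 0)),
    ("T2", datetime(2024, 5, 2, 9, 0), datetime(2024, 5, 1, 8, 0)),  # duplicate
]
expected_ids = {"T1", "T2", "T3"}    # records that should have been reported
deadline = timedelta(days=1)         # assumed reporting deadline

# Completeness: proportion of the expected records actually stored.
reported_ids = {rec_id for rec_id, _, _ in records}
completeness = len(reported_ids & expected_ids) / len(expected_ids)

# Uniqueness: share of stored records that are not duplicates.
uniqueness = len(set(records)) / len(records)

# Timeliness: share of records whose reporting timestamp falls within
# the deadline after the business event occurred.
on_time = sum(1 for _, reported, event in records if reported - event <= deadline)
timeliness = on_time / len(records)

print(f"completeness={completeness:.0%} uniqueness={uniqueness:.0%} timeliness={timeliness:.0%}")
```

Running the sketch prints completeness and uniqueness of 67% and timeliness of 33%: one expected record is missing, one record is duplicated, and the duplicated record was reported after the assumed deadline.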

