Organized by Valentine Charles and Antoine Isaac (Europeana Foundation), Amrapali Zaveri (University of Maastricht), Wouter Beek (VU University Amsterdam), Péter Király (GWDG Göttingen)
There is an unprecedented amount of data on the Web today. However, this data is only as useful as its quality allows. Data quality is an important topic for companies and organizations across many (vertical) domains that work with large, heterogeneous linked data, be it LOD or Linked Business Data. Academia is also now starting to address the issue and to define data quality assessment frameworks.
However, there is no foolproof framework for performing this kind of quality assessment, and there is little discussion across these different domains. We, as organizers, are currently working on data quality issues in our own datasets, building tools and methodologies to assess and improve data quality.
This workshop aims to bridge that gap by inviting professional data experts as well as researchers to share their ideas, algorithms and lessons learnt with each other.
The first part of the workshop will be organized around a set of presentations from different domains, given by the organizers or by representatives of companies that have agreed to present at the workshop:
- Cultural heritage (Europeana)
- Open Government (Kadaster, the Dutch land measuring agency)
- Life Science & Health Care (Maastricht University)
- Publishing (Elsevier)
- Telecommunication (Semaku for NXP/Qualcomm)
- Real estate (Geo Phy)
Each presenter will describe the criteria for good data quality in their domain and the framework they are developing to assess and improve their data. Data quality will be examined from the point of view of both data publishers and data consumers. The audience will also be invited to share their experiences in a series of short lightning talks.