Lexicalization is defined by WordNet as "the process of making a word to express a concept". In the context of this project, lexicalizations are surface forms that refer to a given DBpedia Resource. The DBpedia Lexicalizations Dataset stores the relationships between DBpedia Resources and the set of surface forms that we found referring to those resources in Wikipedia.
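The resource-to-surface-form relationship can be pictured as a simple mapping. The sketch below is purely illustrative: the URIs, surface forms, counts, and field layout are hypothetical assumptions, not the dataset's actual schema.

```python
# Hypothetical sketch of lexicalization entries: a DBpedia Resource URI
# mapped to surface forms and how often each form referred to it.
# All values here are invented for illustration.
lexicalizations = {
    "http://dbpedia.org/resource/Berlin": {
        "Berlin": 9500,           # surface form -> occurrence count
        "German capital": 120,
    },
}

def surface_forms(resource_uri):
    """Return the surface forms recorded for a resource, most frequent first."""
    forms = lexicalizations.get(resource_uri, {})
    return sorted(forms, key=lambda s: -forms[s])

print(surface_forms("http://dbpedia.org/resource/Berlin"))
```

In practice such counts also allow the inverse lookup (from an ambiguous surface form to its candidate resources), which is what makes the dataset useful for entity linking.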
The English Wikipedia receives more than a hundred edits per minute. A large part of the knowledge in Wikipedia is not static but frequently updated, e.g., new movies or sports and political events. This makes Wikipedia an extremely rich, crowdsourced information hub for events. We have created a dataset based on DBpedia Live: events are extracted not from the resource descriptions themselves, but from the changes that happen to resource descriptions. The dataset is updated daily and provides a list of headlined events linked to the corresponding updates and resource snapshots.
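The change-based extraction idea can be sketched as a diff between two snapshots of a resource description. This is a minimal illustration under assumed names and data shapes (property/value tuples), not the actual DBpedia Live pipeline.

```python
# Illustrative sketch: derive a candidate "event" from the difference
# between two snapshots of a resource description. The function name,
# record fields, and sample facts are hypothetical.
def extract_event(resource_uri, old_facts, new_facts):
    """Compare two fact sets; return an event record if anything changed."""
    added = new_facts - old_facts
    removed = old_facts - new_facts
    if not added and not removed:
        return None  # description unchanged, no event to report
    return {
        "resource": resource_uri,
        "added": sorted(added),
        "removed": sorted(removed),
    }

# Hypothetical snapshots before and after an update to a resource.
old = {("team", "Club_A")}
new = {("team", "Club_B")}
event = extract_event("http://dbpedia.org/resource/Some_Player", old, new)
print(event)
```

A real pipeline would additionally filter for change patterns that signal noteworthy events (e.g., a newly added death date or team transfer) and attach a headline, but the core signal is this snapshot diff.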