
inDICEs Datasets

An Archive of cultural heritage and participation data


Wikilink Graphs

Antoine Houssard
15/12/2022 10:55  

Dataset name: WikiLinkGraphs: A complete, longitudinal and multi-language dataset of the Wikipedia link networks
Who is an expert on it? FBK
Type of data: Network
Where does it come from? https://dumps.wikimedia.org/
What information does this data contain? Hyperlinks between pages
License type: CC-BY
Approximate size: 51.5 GB
How was it created / collected? Wikipedia dumps
Will you be using standard vocabularies? Yes
How is it managed? Databricks
How often is it updated? Code to reproduce and obtain new data / periodical dump
How can it be shared? Share the code
Related hypothesis URL: (none provided)
Related insights URL: (none provided)
Dataset URL: (none provided)
Code repository URL: https://github.com/WikiLinkGraphs/wikidump
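The dataset encodes Wikipedia's internal link network, extracted from the public XML dumps. As a minimal illustrative sketch (not the project's actual extraction pipeline, which lives in the repository linked above), internal links in wikitext can be pulled out with a regular expression; note that real wikitext has edge cases (templates, nowiki blocks, namespace prefixes) that this simple pattern ignores:

```python
import re

# Internal wikilinks look like [[Target]], [[Target#Section]], or
# [[Target|display text]]; capture only the target page title.
WIKILINK_RE = re.compile(r"\[\[([^\]|#]+)(?:#[^\]|]*)?(?:\|[^\]]*)?\]\]")

def extract_wikilinks(wikitext):
    """Return the target page titles linked from a snippet of wikitext."""
    return [m.group(1).strip() for m in WIKILINK_RE.finditer(wikitext)]

sample = "See [[Graph theory]] and [[Network science|networks]]."
print(extract_wikilinks(sample))  # ['Graph theory', 'Network science']
```

Applied page by page over a dump, pairs of (source page, target page) like these form the directed edge list of the link graph.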


Reference: IN-PROP-2022-12-204
Version number 1 (of 1)

Fingerprint

The text below is a shortened, hashed representation of this content. It is useful for verifying that the content has not been tampered with, since a single modification would produce a completely different value.

Value: 505b518c561e6f80673875aed990c262cf1a5599cdc6ebe3f648b06c91f42af8

Source: {"body":{"en":"<xml><dl class=\"decidim_awesome-custom_fields\" data-generator=\"decidim_awesome\" data-version=\"0.8.3\"><dt name=\"text-1669830915256-0\">Dataset name</dt><dd id=\"text-1669830915256-0\" name=\"text\"><div>WikiLinkGraphs: A complete, longitudinal and multi-language dataset of the Wikipedia link networks</div></dd><dt name=\"text-1669882346357-0\">Who is an expert on it? </dt><dd id=\"text-1669882346357-0\" name=\"text\"><div>FBK</div></dd><dt name=\"text-1669830917520-0\">Type of data</dt><dd id=\"text-1669830917520-0\" name=\"text\"><div>Network</div></dd><dt name=\"text-1669882487557-0\">Where does it come from?</dt><dd id=\"text-1669882487557-0\" name=\"text\"><div>https://dumps.wikimedia.org/</div></dd><dt name=\"text-1669830920266-0\">What information does this data contain?</dt><dd id=\"text-1669830920266-0\" name=\"text\"><div>Hyperlinks betweeen pages</div></dd><dt name=\"text-1669831026651-0\">License type</dt><dd id=\"text-1669831026651-0\" name=\"text\"><div>CC-BY</div></dd><dt name=\"text-1669831023688-0\">Approx size</dt><dd id=\"text-1669831023688-0\" name=\"text\"><div>51.5 Gb</div></dd><dt name=\"text-1669831009851-0\">How was it created / collected</dt><dd id=\"text-1669831009851-0\" name=\"text\"><div>Wikidumps</div></dd><dt name=\"text-1669882356724-0\">Will you be using standard vocabularies? </dt><dd id=\"text-1669882356724-0\" name=\"text\"><div>Yes</div></dd><dt name=\"text-1669831012351-0\">How is it managed</dt><dd id=\"text-1669831012351-0\" name=\"text\"><div>Databricks</div></dd><dt name=\"textarea-1669882387306-0\">How often is it updated? 
</dt><dd id=\"textarea-1669882387306-0\" name=\"textarea\"><div>Code to reproduce and obtain new data / Periodical dump</div></dd><dt name=\"text-1669831020268-0\">How can it be shared</dt><dd id=\"text-1669831020268-0\" name=\"text\"><div>Share the code</div></dd><dt name=\"text-1669831152334-0\">Related hypothesis URL</dt><dd id=\"text-1669831152334-0\" name=\"text\"><div></div></dd><dt name=\"text-1669904790297-0\">Related insights URL</dt><dd id=\"text-1669904790297-0\" name=\"text\"><div></div></dd><dt name=\"text-1669831231631-0\">Dataset URL</dt><dd id=\"text-1669831231631-0\" name=\"text\"><div></div></dd><dt name=\"text-1669903904024-0\">Code repository URL</dt><dd id=\"text-1669903904024-0\" name=\"text\"><div>https://github.com/WikiLinkGraphs/wikidump</div></dd></dl></xml>"},"title":{"en":"Wikilink Graphs"}}

This fingerprint is calculated using the SHA-256 hashing algorithm. To replicate it yourself, you can use an online SHA-256 calculator and copy-paste the source data.
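The same check can be done programmatically with any SHA-256 implementation. A generic sketch in Python follows; note that reproducing the exact fingerprint above requires the platform's exact byte-for-byte serialization of the source JSON, so this is illustrated here with a standard test vector instead:

```python
import hashlib

def fingerprint(source: str) -> str:
    """Return the SHA-256 hex digest of a UTF-8 encoded string."""
    return hashlib.sha256(source.encode("utf-8")).hexdigest()

# Well-known SHA-256 test vector for the string "abc":
print(fingerprint("abc"))
# ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad
```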



This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 870792.
The sole responsibility for the content of this publication lies with the authors. It does not necessarily represent the opinion of the European Union.
Neither the EASME nor the European Commission is responsible for any use that may be made of the information contained therein.