Dataset and scripts released by SoNet @ FBK group

Reference paper: Social networks of Wikipedia. Massa, Paolo (2011). ACM Hypertext 2011: 22nd ACM Conference on Hypertext and Hypermedia.

If you appreciate the fact I released the scripts and the datasets, please cite this paper. Thanks! --Paolo


Network extracted from User Talk pages of Venetian Wikipedia visualized with Gephi.

Abstract:
Wikipedia, the free online encyclopedia anyone can edit, is a live social experiment: millions of individuals volunteer their knowledge and time to collectively create it. It is hence interesting to try to understand how they do it. While most attention has concentrated on article pages, a less known share of activity happens on user talk pages, Wikipedia pages where messages can be left for a specific user. These public conversations can be studied from a Social Network Analysis perspective in order to highlight the structure of the “talk” network. In this paper we focus on this preliminary extraction step by proposing different algorithms. We then empirically validate the differences in the networks they generate on the Venetian Wikipedia against the real network of conversations extracted manually by coding every message left on all user talk pages. The comparisons show that both the algorithms and the manual process contain inaccuracies that are intrinsic to the freedom and unpredictability of Wikipedia growth. Nevertheless, a precise description of the issues involved allows informed decisions to be made and empirical findings to be based on reproducible evidence. Our goal is to lay the foundation for a solid computational sociology of wikis. For this reason we release the scripts encoding our algorithms as open source, along with some datasets extracted from Wikipedia conversations, so that other researchers can replicate and improve on our initial effort.
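The core idea of the extraction step described above can be sketched in a few lines: every message that user Y leaves on the talk page of user X becomes (or strengthens) a directed edge Y → X. This is a minimal illustration, not the released algorithms; the function name and the sample messages are invented for the example.

```python
# Hedged sketch: build a weighted, directed "talk" network from
# (sender, talk_page_owner) pairs. Illustrative only -- the released
# scripts on github.com implement the actual extraction algorithms.
from collections import defaultdict

def build_talk_network(messages):
    """messages: iterable of (sender, talk_page_owner) pairs.
    Returns a dict mapping (sender, owner) -> number of messages."""
    edges = defaultdict(int)
    for sender, owner in messages:
        if sender != owner:  # skip users writing on their own talk page
            edges[(sender, owner)] += 1
    return dict(edges)

# Invented sample: who wrote on whose user talk page
sample = [("Anna", "Berto"), ("Anna", "Berto"),
          ("Carlo", "Anna"), ("Berto", "Berto")]  # last is a self-message
network = build_talk_network(sample)
print(network)  # {('Anna', 'Berto'): 2, ('Carlo', 'Anna'): 1}
```

The edge weights record how many messages flowed along each directed pair, which is the information the graphml files below carry.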


Python scripts, released as open source under the GPL license, on github.com


Venetian Wikipedia (2 networks extracted automatically with the 2 algorithms, plus 1 network resulting from manual coding of User Talk pages)

Networks:

Networks are in GraphML format. Right-click to download the desired file, then open it with your preferred network analysis program. We like Gephi.
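GraphML is plain XML, so besides Gephi the released files can be read with nothing but the Python standard library. A minimal sketch, using an inline GraphML snippet in place of one of the released files (the node names are invented; the namespace and tag names follow the GraphML schema):

```python
# Hedged sketch: read nodes and edges from a GraphML document using only
# the standard library. Real analyses would more conveniently use Gephi
# or a graph library, but the format itself needs no special tooling.
import xml.etree.ElementTree as ET

GRAPHML = """<?xml version="1.0" encoding="UTF-8"?>
<graphml xmlns="http://graphml.graphdrawing.org/xmlns">
  <graph edgedefault="directed">
    <node id="Anna"/>
    <node id="Berto"/>
    <edge source="Anna" target="Berto"/>
  </graph>
</graphml>"""

NS = {"g": "http://graphml.graphdrawing.org/xmlns"}
root = ET.fromstring(GRAPHML)
nodes = [n.get("id") for n in root.findall(".//g:node", NS)]
edges = [(e.get("source"), e.get("target"))
         for e in root.findall(".//g:edge", NS)]
print(nodes)  # ['Anna', 'Berto']
print(edges)  # [('Anna', 'Berto')]
```

To read one of the downloaded files instead, replace `ET.fromstring(GRAPHML)` with `ET.parse("filename.graphml").getroot()`.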

Venetian Wikipedia XML dumps


Large Wikipedias (2 networks extracted automatically with the 2 algorithms)

Stats about the previous 4 networks, as output by running the graph_analysis.py script
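For a sense of the kind of summary statistics such a script can report for a directed talk network, here is a minimal stdlib-only sketch (the edge list is invented; this is not the code of graph_analysis.py):

```python
# Hedged sketch: basic summary statistics for a directed talk network,
# of the kind a graph-analysis script might print. Edge list is invented.
from collections import Counter

edges = [("Anna", "Berto"), ("Carlo", "Anna"), ("Carlo", "Berto")]
nodes = {n for edge in edges for n in edge}

in_deg = Counter(target for _, target in edges)    # messages received
out_deg = Counter(source for source, _ in edges)   # messages sent

n_nodes = len(nodes)
n_edges = len(edges)
max_in = max(in_deg.values())

print("nodes:", n_nodes)        # nodes: 3
print("edges:", n_edges)        # edges: 3
print("max in-degree:", max_in) # max in-degree: 2
```

Richer metrics (clustering, connected components, path lengths) are easier to compute with a dedicated graph library or in Gephi directly.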