Use data from different Semantic MediaWikis and also Communecter's database

(Simon Sarazin) #1

I’m currently starting to put data about “commons” projects on different Semantic MediaWikis:

  • transformap
  • (list of juridical commons, they just started)
  • (list of commons in the mobility sector; I’m working on this MediaWiki and starting to gather data from different commons projects)
  • (lots of data on open coworking/third places in France, not in semantic form for the moment)

We also have more and more data on the platform (based on a centralised database that we call the “opencommondatabaseXX”). For example, here is a starting list of third-place projects in France, which is also partly accessible on the MediaWiki. The same goes for the commons.
How could we connect the data rather than duplicating it between the different MediaWikis using the Semantic extension, and also between those MediaWikis and Communecter? I would rather reuse the Transformap data than create a new dataset in Communecter or in another Semantic MediaWiki.
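One way to reuse data instead of duplicating it could be to query each wiki through Semantic MediaWiki's "ask" API module and import the answers. A minimal sketch, assuming the target wiki has the API enabled; the wiki URL, category, and property names below are placeholders, not real Transformap identifiers:

```python
from urllib.parse import urlencode

def build_ask_url(api_base, conditions, printouts):
    # Build an SMW "ask" request URL, e.g. [[Category:Commons]]|?Has address
    query = conditions + "".join("|?" + p for p in printouts)
    return api_base + "?" + urlencode(
        {"action": "ask", "query": query, "format": "json"}
    )

def parse_ask_results(response):
    # Flatten the decoded JSON answer into {page title: {property: values}}
    results = response.get("query", {}).get("results", {})
    return {page: data.get("printouts", {}) for page, data in results.items()}
```

Fetching the built URL with any HTTP client and passing the decoded JSON to `parse_ask_results` yields plain dictionaries that Communecter or another wiki could import, instead of someone re-entering the data by hand.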

Any directions towards a solution? Things to try? Connectors to develop?

Thanks :slight_smile:

(Jon Richter) #2

Your questions point in the right direction. Unfortunately my head is preoccupied with a few technical considerations around deduplication, which makes it harder for me to answer them. I believe @species has an opinion on how to do this?

What you are basically asking me is: do we have a way to seamlessly integrate linked data from various sources, as DBpedia does, or will we each keep datasets synchronised from the others?

Then you are talking about what @otisyves has been building in the Remix the Commons wiki. See their article about , for example, which links to Wikidata, Wikipedia, DBpedia and P2P Foundation sources. The Remix the Commons wiki is therefore a suitable place for further connections, turning it into a specialised hub within the wealth of data sources available in the network.

I would propose we further dive into

and start adding several such links to our various data sources in one of the Semantic MediaWikis at hand. Thus, we would be building a fairly classical federation.
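Recording such federation links could start as simply as keeping sameAs-style pairs between page URLs, whether maintained by hand or by a bot. A hypothetical sketch; all URLs below are placeholders, not real wiki pages:

```python
# sameAs-style pairs linking the "same" commons across wikis (placeholder URLs)
SAME_AS = [
    ("https://transformap.example/wiki/Place_X",
     "https://wiki.remixthecommons.example/Place_X"),
    ("https://transformap.example/wiki/Place_X",
     "https://www.wikidata.org/entity/Qxxxx"),  # placeholder Wikidata item
]

def linked_to(uri):
    # Return every URI connected to `uri` through the pairs (one hop, either direction)
    out = set()
    for a, b in SAME_AS:
        if a == uri:
            out.add(b)
        elif b == uri:
            out.add(a)
    return out
```

Once each wiki stores such links as ordinary semantic properties, any client can follow them to the other sources rather than copying the records.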

Then we could start building a crawling client and inference engine that is able to combine the data from those sources into one view and display it.

(Jon Richter) #3

@toka Following up on our recent phone call, this is where I described some context about how to strategise mixing the contents of multiple Semantic MediaWikis. Any comments or recommendations are welcome.