2015-12-03_unsceduled_Monkey_mumble-meeting 19:00 - 20:00
original pad here: https://text.allmende.io/p/2015-12-03_unsceduled_Monkey_mumble-meeting
Looking at TransforMap architecture drafts by @almereyda
###hypermedia microservices architecture
Link to the sketch on which the following explanations are based <
(fully coloured, which looks like strikethrough, actually means highlighted)
Jon explains the image:
#left side: linked data registry: tells consumers of the API where to find the data
Rough time guesstimation:
~ 1-2 PM - deployment, testing and documentation <
with testing and documentation 2 person months
- last month of 2015 = CHEST conceptual development
- first quarter of 2016 = SSEDAS
- 2nd quarter of 2016 = CHEST
- We do not care which kind of infrastructure the geodata comes from. We need different strategies to get to the data.
- We encourage the partners to just publish to the web in a machine-readable form and under an open licence, preferably in an open format such as RDF or JSON-LD.
- As long as it has an API and can be read by any program, we can read it, because we develop a toolchain for that.
- The registry is not just a meta-thing: it also caches the geo-data. You can attach geo-data to your metadata.
Is linkedgeodata.org up and running?
Jon saw elf Pavlik using it:
http://linkedgeodata.org/OnlineAccess/RestApi is THE linked data representation of OpenStreetMap (OSM).
- We can directly link from our registry to RDF representations of OSM entities. (Look at OSM less as an infrastructure and more as a vocabulary.) With this registry, we bring every geo-data provider onto one level. We do not make any assumptions about serializations or whatever …
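As a sketch of what such a link could look like: LinkedGeoData exposes OSM entities under stable URIs, so a registry entry only needs the OSM entity type and id. The `triplify/{type}{id}` URI pattern is an assumption based on linkedgeodata.org, and the entry fields and ids are invented for illustration:

```python
# Sketch: derive a LinkedGeoData RDF URI from an OSM entity, so a registry
# entry can point at the linked data representation instead of copying it.
# The triplify URI pattern is an assumption, not confirmed by the meeting.

def linkedgeodata_uri(osm_type, osm_id):
    """Build the LinkedGeoData URI for an OSM node or way."""
    assert osm_type in ("node", "way")
    return f"http://linkedgeodata.org/triplify/{osm_type}{osm_id}"

# A minimal registry entry linking a POI to its OSM-derived RDF resource
# (the id below is just an example value, not a real TransforMap POI).
entry = {
    "@id": "urn:uuid:example-poi",
    "seeAlso": linkedgeodata_uri("node", 240109189),
}
print(entry["seeAlso"])  # → http://linkedgeodata.org/triplify/node240109189
```

This keeps the registry free of assumptions about the provider's serialization: the client follows the link and content-negotiates whatever RDF format LinkedGeoData offers.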
Strength of OSM:
- global scope
- stable API
- Everyone should be free to use their own geo-backend, but should be able to integrate it.
- We are asked not to think about tempo-spatial data at the moment.
The linked data registry links to other services and stores metadata about them. It provides linked data fragments, a query API and hydra-enabled web services. It uses CouchDB as a backend. It is an HTTP API that stores data (like how we currently use the Semantic MediaWiki); it is a directory.
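Since the registry is essentially an HTTP API over CouchDB documents, one registered service could be described by a plain JSON document. The field names below are illustrative assumptions, not the actual scienceai/linked-data-registry schema; only the idea (metadata plus entry point plus media types, so clients know where and how to fetch data) comes from the meeting:

```python
import json

# Hypothetical CouchDB-style document describing one registered web service.
# Field names are invented for illustration.
service_doc = {
    "_id": "service:linkedgeodata",
    "title": "LinkedGeoData REST API",
    "entrypoint": "http://linkedgeodata.org/OnlineAccess/RestApi",
    "mediaTypes": ["application/ld+json", "text/turtle"],
    "capabilities": ["bbox-query"],
}

# CouchDB stores documents as JSON, so the record must round-trip cleanly.
assert json.loads(json.dumps(service_doc)) == service_doc
print(service_doc["_id"])  # → service:linkedgeodata
```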
“Semantic MediaWiki 2.0”: https://github.com/hackers4peace/sporthub - a hub for sport practitioners, connecting to LinkedGeoData (OSM), DBpedia and Wikidata
Which web services do we link at the linked data registry?
(see glossary - “Backends”):
- geo - [bbox] queryability
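To make "[bbox] queryability" concrete, here is a minimal sketch of the query every registered geo-aware backend should be able to answer: give me the POIs inside a bounding box `(min_lon, min_lat, max_lon, max_lat)`. The POIs and coordinates are made-up sample data; a real backend would run this filter server-side:

```python
# Sketch of bbox queryability: filter POIs to those inside a bounding box.
# Sample data is invented; a geo-aware backend would do this in the database.

def in_bbox(poi, bbox):
    """True if the POI's coordinates fall inside (min_lon, min_lat, max_lon, max_lat)."""
    min_lon, min_lat, max_lon, max_lat = bbox
    return min_lon <= poi["lon"] <= max_lon and min_lat <= poi["lat"] <= max_lat

pois = [
    {"name": "Food coop", "lon": 13.40, "lat": 52.52},   # Berlin-ish
    {"name": "Repair café", "lon": 2.35, "lat": 48.86},  # Paris-ish
]
berlin_bbox = (13.0, 52.3, 13.8, 52.7)
print([p["name"] for p in pois if in_bbox(p, berlin_bbox)])  # → ['Food coop']
```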
Glossary of left side
- git = (British slang for "fool") - distributed version control system (GitHub and GitLab offer hosting for it)
- dat < dat-data.com
- IPFS < InterPlanetary File System (very simple way to store BLOBs [see below])
- wiki refers to https://npm.im/wiki (can store versioned content chunks and make them addressable; every chunk and every version is separately linkable)
- linked data registry = in this case, the scienceai Linked Data Registry (https://github.com/scienceai/linked-data-registry)
- Hydra core vocabulary = self-documenting machine-readable API descriptions, http://www.hydra-cg.com/spec/latest/core/ (a proposal from the W3C to bridge linked data formats)
- LDF = Linked Data Fragments = query language, implemented e.g. in the Linked Data Registry https://github.com/scienceai/linked-data-registry
- IPNS = InterPlanetary Naming System
- SoLiD = Social Linked Data
- BLOBs = Binary Large Objects (e.g. images)
- time series data = opening hours, events, everything that has to do with time
- PM = Person Month
- RESTful webservice = https://en.wikipedia.org/wiki/Representational_state_transfer#Applied_to_web_services
- vf = https://valueflo.ws
- AS 2.0 = Activity Streams 2.0
- Social WG < W3C, standardizes AS 2.0 in JSON-LD
- Spatial Data on the Web WG < W3C + OGC joint WG
- WG = Working Group
- ICN = Information Centric Networking, from next-Internet initiatives, https://en.wikipedia.org/wiki/Information-centric_networking - some call it Internet 2.0
- NDN = Named Data Networking - most prominent implementation of ICN, also see bottom of http://worrydream.com/ClimateChange/
- SSB = Secure Scuttlebutt
- mapstraction = frontend library allowing switching between OpenLayers and Leaflet
#right side: anticipation of a decoupled geo-aware service architecture - abstractions of geo-aware backends
- Rough time guesstimation: next 1 1/2 years
backend: database and API
- we need to tackle many different media types, including audio and video, not only text
- In the lower left corner of the image is a registry for media-data: a metadata repository to be the representation of media. It uses the Hydra core and runs GeoCouch, but as we use standardized APIs, that is not important for us. If we have a bounding box request for a POI, e.g., we (secretary of the meeting could not follow here) - not a problem at all.
- dat will just store copies of the datasets that we harvest, and if they have a licence applied we can just republish them from our infrastructure.
How are BLOBs handled in terms of versioning? Everything gets a hash. Virtual hashes can point to the latest representation.
- Concern about the queryability of OSM in terms of time series data: opening hours, events, everything that has to do with time
Our clients will interpret the links from the registry
- we create a new standard
- client allows querying and storing of geospatial objects
##right side of the image (right envelope): anticipation of a decoupled geo-aware service architecture.
- TransforMap would be on the brown layer: something that abstracts geo-aware backends and implements it on the user side in Leaflet Storage.
- The users who deploy the geo-backends don't want to be forced into a specific geo-backend.
#currently there is no standard for geodata APIs
- bring together geo-specialists from different areas,even contribute to the W3C relating group.
Michael: How much of that has to be developed, and how much is already working somewhere?
- only the uMap connection exists already; the rest (the boxes above the brown line) is not standardized, and the orange stuff does not exist yet either
- for the “Dump?” there are reference implementations, which standardize authentication parts with linked data fragments between various clients
- lot of work needed
- Leaflet storage connection to GeoDjango does not exist
- Proposal to … enable something similar to mapstraction for geo-backends
- We can erase all of that here if we want to concentrate on standardization and taxonomy.
Postgres has much more geo-queryability than GeoCouch, as it comes from a different architectural idea. GeoCouch assumes that most work is done in the client. Postgres allows quite complex computing already at the data, which might be the best solution for decoupled services.
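To make the contrast concrete: with PostGIS (the geo extension of Postgres), a radius query like "all POIs within 500 m of a point" runs entirely in the database, whereas a GeoCouch client would typically fetch a bounding box and post-filter by distance itself. The table and column names below are invented for illustration; `ST_DWithin` and the `geography` cast are real PostGIS features:

```python
# Hypothetical PostGIS query: the computation (distance filtering) happens
# at the data, inside the database. Table/column names are invented.
query = """
SELECT name
FROM pois
WHERE ST_DWithin(
    geom::geography,
    ST_MakePoint(%(lon)s, %(lat)s)::geography,
    %(radius_m)s
);
"""

# With GeoCouch one would instead issue a bbox request and then filter the
# candidate POIs by distance on the client side.
print("ST_DWithin" in query)  # → True
```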
We could also have a look at how e.g. WordPress, Drupal and OSM handle their geo-data.
Glossary & right side
- geo-backend = geo-aware backend = geo-database + relating API
- leaflet storage = related to uMap
- GeoDjango = geospatial extension of the Django web framework
- LOV = Linked Open Vocabularies
- LD-R = Linked Data Reactor
- KVM = Karte von Morgen
- enc = encommuns
- WP = WordPress
- r/w = read + write
- ld = Linked Data
- geo = geospatial
- http = HyperText Transfer Protocol
- api = Application Program Interface
- spec = Specification
- Merkle DAG = Merkle Directed Acyclic Graph
- OWL = Web Ontology Language
- SPARQL = SPARQL Protocol and RDF Query Language (W3C query language for RDF)
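Since the Merkle DAG underlies several items in this glossary (git, dat, IPFS), here is a minimal sketch of the structure: a node's identifier is the hash of its own data plus the identifiers of its children, so any change deep in the graph changes every ancestor's identifier. This is a toy model, not the actual git or IPFS object format:

```python
import hashlib
import json

def node_id(data, children):
    """Hash a DAG node: its data plus the (sorted) hashes of its children."""
    payload = json.dumps({"data": data, "children": sorted(children)})
    return hashlib.sha256(payload.encode()).hexdigest()

leaf_a = node_id("chunk A", [])
leaf_b = node_id("chunk B", [])
root = node_id("directory", [leaf_a, leaf_b])

# Changing a leaf changes the root: content addressing gives cheap
# integrity checking and versioning for free.
assert node_id("directory", [node_id("chunk A!", []), leaf_b]) != root
```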
#not yet addressed topics
decoupled semantic web gis architecture
- includes first offline iteration of https://tree.taiga.io/project/transformap/task/206
#206 Assess and document geodatabases alternatives >Link to sketched draft<
registry daemon environment >Link to sketched draft<
OT: Filenames derive from tagging with http://www.tagspaces.org/ linux application.
The SSEDAS questions:
We have things to map from SSEDAS - where do they go?
What would be our procedure to get them from there?
- We try to understand the different scopes and -
####3 questions from SSEDAS
"1. How difficult is it to build an editor that is not a monster, can go on after SSEDAS, and makes sense in TransforMap?
- with OSM ecosystem
- without OSM ecosystem
- we do not yet know which editors are available in the non-OSM environment
- to build the OSM-feeding editor for the 2 different things described in question 2 is difficult
- what about uMap?
- Jon sees the first question as irrelevant, as there is a flourishing ecosystem, and we can pick from the many and develop with already existing partners, e.g. OSM / uMap / weird WordPress app / great Drupal platform :).
- use OSM-tools to feed into OSM
- use non native OSM-tools to feed into OSM
- use non native OSM-tools to feed into a DB to be created
possibly donate to uMap and get a UUID integration from them.
We are all about SPECS!
"2. How difficult is it to link the data, once it is in OSM, to the respective tags that cannot be stored in OSM?
- How easy is it to have it in one place
- Do we need to have the Taxonomy separated from geodata anyway for the overall architecture?
We should all read the Geosemantik article on the German Wikipedia before answering this question! We answer the question after reading the article: https://de.wikipedia.org/wiki/Geosemantik + the article linked in “Maybe TransforMap is a Gazetteer, too?”
UUIDs as standardized foreign key! Or https://github.com/jbenet/multihash from IPFS. We store the UUIDs everywhere for everything in our registry. Every map has a UUID and every POI has a UUID.
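A minimal sketch of the foreign-key idea: the same UUID ties together the pieces of one POI that live in different systems (the OSM entity, the taxonomy entry that cannot be stored in OSM, the registry record). The record shapes and the OSM id are invented for illustration:

```python
import uuid

# One UUID per POI, shared by every system that holds a piece of it.
poi_uuid = str(uuid.uuid4())

# Hypothetical records: geodata reference in the registry, extra tags
# ("OSM+") in a separate taxonomy store. Field names are illustrative.
registry_record = {"uuid": poi_uuid, "osm": "node/240109189"}
taxonomy_record = {"uuid": poi_uuid, "needs": ["food"], "provides": ["bread"]}

# A client joins the two halves of the POI on the shared UUID.
assert registry_record["uuid"] == taxonomy_record["uuid"]
```

With multihash instead of UUIDs, the key would additionally be derived from the content, which makes it self-verifying at the cost of changing on every edit.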
"3. How reliable & difficult would it be to fetch the data alongside the tags in OSM and “OSM+”, to be displayed in one place?
- Is it a benefit or a drag, that we can use, and/or have to use the Overpass API, if we build into OSM
>Josef creates a story about the things that need to be clarified with Alessa by March