Figure out the exact value of the BBR ID change breakpoint; see Template:BBR-länk for estimated values, then loop over HTTP requests to pin down the exact one. Once done, update the template too.
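Since the template already gives an estimated range, the HTTP loop can be a binary search instead of a linear scan. A minimal sketch, where the URI pattern in `bbr_id_resolves` is a hypothetical placeholder that must be adjusted to the real kulturarvsdata scheme:

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def find_breakpoint(lo, hi, exists):
    """Binary-search the lowest ID for which `exists` is False, assuming
    every ID below the breakpoint resolves and every ID above does not."""
    while lo < hi:
        mid = (lo + hi) // 2
        if exists(mid):
            lo = mid + 1
        else:
            hi = mid
    return lo

def bbr_id_resolves(bbr_id):
    # Hypothetical URI pattern -- replace with the actual BBR URI scheme.
    req = Request("http://kulturarvsdata.se/raa/bbr/%d" % bbr_id, method="HEAD")
    try:
        return urlopen(req, timeout=10).getcode() == 200
    except (HTTPError, URLError):
        return False

# e.g. find_breakpoint(estimated_lo, estimated_hi, bbr_id_resolves)
```

Passing the probe function in as an argument keeps the search logic testable without hitting the live API.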
Run the kulturarvsdata-prefer-rdf.py bot.
Check for duplicate statements (there should be none or very few); I have seen a tool for this task over at Tool Labs.
Start indexing the WLM lists on sv.wikipedia.org into a CSV or SQLite file (index only the Wikipedia articles and BBR URIs?).
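If SQLite is chosen over CSV, the index can be as small as one two-column table; a sketch, assuming the rows come from whatever scrapes the WLM lists:

```python
import sqlite3

def build_index(rows, db_path=":memory:"):
    """Index (article, bbr_uri) pairs scraped from the WLM lists.
    The UNIQUE constraint silently drops re-scraped duplicates."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS wlm (article TEXT, bbr_uri TEXT UNIQUE)"
    )
    conn.executemany("INSERT OR IGNORE INTO wlm VALUES (?, ?)", rows)
    conn.commit()
    return conn
```

An on-disk path instead of `:memory:` makes the index reusable for the later conflict-checking and import steps.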
Check this list against existing data in Wikidata. Look for conflicts and for data that exists only in Wikidata (which should not be the case).
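One way to do the comparison is a single SPARQL query over the indexed URIs; this sketch only builds the query string, and it assumes P1260 (Swedish Open Cultural Heritage URI) is the right property to match against:

```python
def conflict_query(bbr_uris):
    """Build a SPARQL query returning Wikidata items whose SOCH URI
    (property P1260, assumed) matches one of the indexed BBR URIs."""
    values = " ".join('"%s"' % u for u in bbr_uris)
    return (
        "SELECT ?item ?uri WHERE { "
        "VALUES ?uri { %s } "
        "?item wdt:P1260 ?uri . }" % values
    )
```

Comparing the query result against the local index then gives both directions: URIs missing from Wikidata, and items that exist only in Wikidata.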
Fix any data that needs fixing.
Add Wikipedia articles for all the WLM BBR items missing one (if GeoNames can be a source for bot-created articles, anything can be a source).
Index a new CSV or SQLite file from the WLM tables.
Import all the missing data to Wikidata.
Start indexing both facility and building IDs (this breaks the API). Use the "BBR ID change breakpoint"; if it is a fuzzy one, create a buffer where all IDs get verified using HTTP requests (the way all of them are currently validated).
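The buffer idea can be sketched as a three-way split: IDs clearly below the breakpoint, IDs clearly above it, and a fuzzy zone that still needs per-ID HTTP verification. The `fuzz` width is an assumption to be tuned against the real data:

```python
def classify_ids(ids, breakpoint, fuzz):
    """Split IDs by the breakpoint; IDs within `fuzz` of it go into a
    buffer for individual HTTP verification."""
    facility, building, buffer = [], [], []
    for i in ids:
        if abs(i - breakpoint) <= fuzz:
            buffer.append(i)       # fuzzy zone: verify via HTTP
        elif i < breakpoint:
            facility.append(i)     # safely below the breakpoint
        else:
            building.append(i)     # safely above the breakpoint
    return facility, building, buffer
```

Only the buffer then needs the slow request-per-ID validation, which keeps the indexing run fast.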
Add all the Wikidata IDs to the WLM lists on sv.wikipedia.org and notify the folks over at Phabricator. Research how to parse and process wikitext tables <-- new to me
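On the wikitext-table question: a dedicated parser library is probably the robust route, but the common `{| ... |}` layout can be split by hand. A minimal sketch that handles `|-` row separators and `||`/`!!` inline cell separators, and ignores anything fancier (templates, nested tables, cell attributes):

```python
def parse_wikitable(text):
    """Minimal wikitext table parser: returns rows as lists of cell
    strings. Covers only the plain {| ... |} layout."""
    rows, current = [], []
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("{|") or line.startswith("|}"):
            continue                      # table open/close markers
        if line.startswith("|-"):
            if current:
                rows.append(current)      # row separator: flush the row
            current = []
        elif line.startswith("|") or line.startswith("!"):
            sep = "||" if line.startswith("|") else "!!"
            cells = line.lstrip("|!").split(sep)
            current.extend(c.strip() for c in cells)
    if current:
        rows.append(current)
    return rows
```

For the real WLM lists, which are template-heavy, something like mwparserfromhell would be safer than this hand-rolled splitter.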
Probably happening late December (required for Kyrkosok/web-client#23).