Earlier this week, the winners of the Sustainable Development Goals Digital Game Changer Awards were revealed. Govdirectory didn’t win, but we are still honored to have been selected as a finalist in the category Peace – Building peaceful and inclusive societies.
It would have been nice to have listed this on the awards slide in our recent talk at Code for All, but that one was pre-recorded earlier in the summer, when we didn’t yet know the good news.
In this talk (slides), we mention some statistics.
One aspect that I love about this project is that it is not static, but growing all the time. Since we recorded the video, we have already grown substantially.
As you can see, the coverage is just over 10% of the countries in the world, and of the countries we have, only one is complete and a handful more have great coverage. If you want to help with the data, head over to the project page on Wikidata. If you want to help improve the website, head over to the repository on GitHub. And, of course, you are also welcome to just explore what is on the website at govdirectory.org.
I was recently a guest on the podcast Between the brackets. The podcast usually covers MediaWiki-related topics, but from time to time it also has Wikimedians as guests. It was a lot of fun, since we talked about almost all the things I am currently involved in. We mostly talked about the Foundation for Public Code, Wikidata, Govdirectory, and Wikimedians for Sustainable Development, but also a bit about AI and Abstract Wikipedia.
A few days ago I appeared as a guest on the podcast Sustain OSS. It was an energizing conversation with Richard Littauer and Justin Dorfman, and we honestly probably could have continued for several hours. We mostly discussed my work at the Foundation for Public Code and the Standard for Public Code we have developed and that I work on, and with, daily. But we also got into talking about Wikipedia, the movement, and some of the bot creation initiatives we have seen.
I really enjoyed the conversation and hope I get the chance to speak with them again sometime. Since they know the field deeply, they asked exactly the right questions, the ones we are also thinking about, which made this more than just a shallow presentation of who we are. I guess the only thing we didn’t really get into much is how we are funded, which is possibly not the most interesting topic for everyone, but as a non-profit working for the common good, it is something we are always eager to explore. That being said, I think the conversation will give anyone who listens to it a better sense of what I do at work and why I am passionate about it.
From a podcast producer’s point of view, I was also impressed by the process they applied. From the first contact, through guiding documents and gathering the information needed for the show notes, to checklists for the recording, it was a great experience as a guest. I guess the only downside was that very few of the tools they used (that I saw) were open source. For remote multitrack recording there aren’t many alternatives, something I am sadly also aware of, but for simple collaborative document editing there are plenty of options. If some of those services were switched, it would be an even more pleasant experience to be a guest in the future!
Two weeks ago, I participated in the mini hackathon in Utrecht. It turned out to be quite productive, and I managed to complete four different tasks. But mostly it was fun to meet Wikimedians again (and they also served some tasty vegan pastries).
More than once, I have written a query to help me understand how a topic is modeled, by exploring which properties have a specific item as a value. There is a gadget that does something very similar to what I had in mind, Easy query. So based on that, I managed to create a user script I called common-properties.js. You can see it in use in this video.
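The kind of query the script revolves around can be sketched in SPARQL. This is a minimal illustration of the pattern, not the exact query the script generates, using Q5 (human) as an example value:

```sparql
# Which properties use a specific item (here Q5, "human") as a value,
# and how often? Intended for the Wikidata Query Service.
SELECT ?property ?propertyLabel (COUNT(?item) AS ?uses) WHERE {
  ?item ?claim wd:Q5 .                      # any statement with Q5 as its value
  ?property wikibase:directClaim ?claim .   # map the predicate back to its property
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
GROUP BY ?property ?propertyLabel
ORDER BY DESC(?uses)
```

Swapping wd:Q5 for any other item gives a quick overview of how that item is used across the model.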
Mapframe code snippet
Similarly, I have wished for the Wikidata Query Service to have Kartographer’s <mapframe> as one of the code examples in the results view. I had even created a Phabricator task for it. With some excellent help from Lucas Werkmeister, it resulted in a patch on Gerrit. And it has already been merged!
With so many interesting people in one place, I couldn’t resist recording a special episode in English for Wikipediapodden. I talked to Siebrand Mazeland, one of the organizers, about this hackathon and hackathon organizing in general.
Just over two weeks ago, I got the chance to give my first OpenRefine talk during Wikimedia Portugal’s celebration of Wikidata’s tenth birthday. Although Zoom gave me some technical troubles at the start, I think the actual talk went pretty well. You can judge for yourself, because the video is published on their YouTube channel (where my struggles luckily have been edited out). It’s a quick walkthrough of the interface and some of the strategies you might use with OpenRefine. In it, I clean a dataset, reconcile it with Wikidata, and upload some of the data.
Earlier this week, I had the honor of being a mentor in the Hack for Earth hackathon. I had signed up to be a mentor for open source, open data, and anything related to Wikimedia. Unfortunately, I didn’t get any questions related to those topics, but I did get the chance to help some of the participants find their way around the challenges and the formalities of the contest. And I did get to record a tiny “Hello!” video that got a small spot in the closing ceremony of the hackathon.
This course covered a number of fundamental aspects, and I recognized some of it from before. The content is good and inspiring, but it is a rather short course, so it is probably best described as an introductory course. For anyone about to start working with digital policies, it is an excellent start. If every member of parliament took the course, it would probably be a big lift for society as a whole, since the principles are overarching and can be applied to issues in most committees.
Wikibase and Semantic MediaWiki for data-driven semantics
Here there was instead a good deal of deeper subject knowledge, although it is of course a narrower field, being about specific software. The content was richer, with long, pedagogical video lectures by experts in the field. For me, having worked with Wikidata for almost ten years, there was not much new, but it was fun to see how even the EU institutions use Wikibase and how extensive their work has been.
Both courses end with a quiz, after which you can download a certificate, which was a nice touch.
The courses were pedagogical and well made, so I will probably look into more of them. If someone presents a certificate, it may be worth looking up the course: even though there are week-long courses and the certificates look impressive, they can represent as little as a few hours of effort.