Fixing npm security issues immediately in MediaWiki projects

LibUp writes a commit message, mostly by analyzing the diff, fixes up some changes, and pushes the commit to Gerrit to pass through CI and be merged. If npm is aware of the CVE ID for the security update, it will be mentioned in the commit message. Each package upgrade is tagged, so if you want to, e.g., look for all commits that bumped MediaWiki Codesniffer to v26, it’s a quick search away.

Saying no to proprietary code in production is hard work: the GPU chapter

Maintaining and improving one of the largest websites in the world using Open Source software requires a continuous commitment. The site is always evolving, so for every new component we want (or need!) to deploy, we need to evaluate the Open Source solutions available.
[Image: SPARQL code]

Computational knowledge: Wikidata, Wikidata Query Service, and women who are mayors!

One of the main aims of Wikidata is to represent knowledge in a way that is computable—that is, amenable to automatic processing. Wikipedia already contains a lot of information; much of it is reasonably easy for a human to understand—though some of the more esoteric bits are decidedly not—but it’s not at all readily crunchable by a computer.
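As a flavour of what “computable” means here, the post’s title points at a classic Wikidata Query Service example: finding women who are mayors. The query below is an illustrative sketch (not taken from the post itself), assuming the standard Wikidata identifiers P6 (head of government), P21 (sex or gender), and Q6581072 (female); it can be run against the public endpoint at query.wikidata.org.

```sparql
# Sketch: places whose head of government is a woman.
SELECT ?mayor ?mayorLabel ?place ?placeLabel WHERE {
  ?place wdt:P6 ?mayor .                # the place's head of government
  ?mayor wdt:P21 wd:Q6581072 .         # who is a woman
  # Fetch human-readable labels in English.
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
LIMIT 100
```

A computer can crunch this directly against Wikidata’s structured statements, whereas extracting the same answer from Wikipedia’s prose would require parsing free-form text.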

Parsoid in PHP, or there and back again

In December 2019, we replaced the original version of Parsoid, written in JavaScript, with a version written in PHP, the primary programming language of MediaWiki. This new version, called Parsoid/PHP, is roughly twice as fast as the original JavaScript version. Parsoid/PHP brings us one step closer to integrating Parsoid and other MediaWiki wikitext-handling code into a single system.

Wikipedia’s JavaScript initialisation on a budget

This week saw the conclusion of a project that I’ve been shepherding on and off since September of last year. The goal was for the initialisation of our asynchronous JavaScript pipeline (at the time, 36 kilobytes in size) to fit within a budget of 28 KB – the size of two 14 KB bursts of Internet packets.