Web Perf Hero: Amir Sarabadani
Over the past six months, Amir (@Ladsgroup) significantly reduced the processing time and cost of saving edits in MediaWiki. Not just once, but several times! We measure this processing time through Backend Save Timing (docs). This metric encompasses time spent on the web server, from process start until the response is complete and flushed to the client.
Amir expanded MediaWiki’s ContentHandler component, adding the ability for content models to opt out of eagerly generating HTML (T285987). On Wikipedia we generate HTML while saving an edit. This is necessary because HTML is central to how wikitext is parsed, and generating it ahead of time speeds up pageviews. On Wikidata, this is not the case. Wikidata entities (example) can be validated and stored without rendering an HTML page. Wikidata is also characterised by having a majority of its edits come from bots, and the site receives far fewer pageviews in proportion to its edits (where Wikipedia has ~1000 pageviews per edit [1], Wikidata has ~10 [2]). This does not even account for Wikidata edits generally being made in sessions of several micro-edits.
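The opt-out can be sketched roughly as follows. This is an illustrative Python sketch, not MediaWiki's actual PHP API: the class and method names (`ContentHandler`, `generate_html_on_edit`, `save_edit`) are assumptions chosen to mirror the idea that a content model declares whether HTML must be rendered eagerly on save.

```python
# Illustrative sketch only; MediaWiki's real ContentHandler is PHP and
# its method names may differ. The point: metadata is always produced,
# but the expensive HTML render becomes skippable per content model.

class ContentHandler:
    def generate_html_on_edit(self) -> bool:
        # Default: render eagerly, so the first pageview is fast.
        return True

class WikitextHandler(ContentHandler):
    pass  # wikitext keeps eager rendering

class EntityHandler(ContentHandler):
    # Wikidata-style entities can be validated and stored without HTML.
    def generate_html_on_edit(self) -> bool:
        return False

def extract_links(content: str) -> list:
    # Toy stand-in for link metadata extraction.
    return [w for w in content.split() if w.startswith("http")]

def render_html(content: str) -> str:
    # Toy stand-in for the expensive parse-to-HTML step.
    return "<p>" + content + "</p>"

def save_edit(handler: ContentHandler, content: str) -> dict:
    output = {"links": extract_links(content)}  # metadata: always computed
    if handler.generate_html_on_edit():
        output["html"] = render_html(content)   # only when the model opts in
    return output
```

With this shape, an entity save produces link metadata but no HTML, while a wikitext save still renders eagerly.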
Amir adopted this new opt-out in the Wikibase extension, which powers Wikidata. This lets Wikidata skip the HTML generation step whenever possible. He also identified and fixed an issue with the SpamBlacklist extension (T288639) that prevented the Wikidata optimisation from working. The spam filter acts on links in the content via Parser metadata, but it was requesting a full ParserOutput object with rendered HTML rather than just the metadata.
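The SpamBlacklist fix can be illustrated in the same spirit. Again a hedged Python sketch, not the extension's real code: `get_parser_output`, its `generate_html` flag, and the `ParserOutput` shape here are assumptions standing in for the idea that a caller needing only link metadata should not force a full HTML render.

```python
# Illustrative sketch only; the real SpamBlacklist extension is PHP.
# A consumer that only needs link metadata should request a
# metadata-only parse, otherwise it forces the expensive render that
# the content-model opt-out was meant to skip.

class ParserOutput:
    def __init__(self, links, html=None):
        self.links = links
        self.html = html  # None when HTML generation was skipped

def get_parser_output(content: str, generate_html: bool) -> ParserOutput:
    links = [w for w in content.split() if w.startswith("http")]
    # The costly step, performed only on request.
    html = "<p>" + content + "</p>" if generate_html else None
    return ParserOutput(links, html)

BLACKLIST = {"http://spam.example"}

def is_spam(content: str) -> bool:
    # The fix: ask for metadata only, since the filter inspects links.
    output = get_parser_output(content, generate_html=False)
    return any(link in BLACKLIST for link in output.links)
```

The design point is that the filter's result depends only on `output.links`, so passing `generate_html=False` changes nothing about correctness while avoiding the render.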


Amir’s work cut latencies by half. The wbeditentity API went from upwards of 1.5s at the 95th percentile to under 0.7s, and the 75th percentile from 0.6-1.0s down to 0.4-0.5s (Grafana).
Internal metrics show where this difference originates. The EditEntity.attemptSave p95 went from 0.5-0.7s down to 0.2-0.3s, and the mean EditEntity.EditFilterHookRunner from 0.2-0.3s to consistently under 0.1s (Grafana).


Web Perf Hero award
The Web Perf Hero award is given to individuals who have gone above and beyond to improve the web performance of Wikimedia projects. The initiative is led by the Performance Team and started mid-2020. It is awarded quarterly and takes the form of a Phabricator badge.
Read about past recipients at Web Perf Hero award on Wikitech.
Footnotes
1. “9 billion pageviews, 5 million edits”, Wikistats: en.wikipedia.org, April 2022. ↩︎
2. “500 million pageviews, 20 million edits”, Wikistats: wikidata.org, April 2022. ↩︎