We’ve recently published research on performance perception that we conducted last year. The micro survey used in that study is still running on multiple Wikipedia language editions and gives us ongoing insight into perceived performance.
In the search for a better user experience metric, we have tried out the upcoming Element Timing for Images API in Chrome.
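Since this is a browser API, here is a minimal sketch of what trying it out can look like: annotate an image with an `elementtiming` attribute and collect the resulting entries through a `PerformanceObserver`. The image URL and the `hero-image` identifier below are hypothetical examples, not our actual instrumentation.

```typescript
// Minimal typing for Element Timing entries, which may not yet be
// present in your TypeScript lib definitions.
interface ElementTimingEntry extends PerformanceEntry {
  identifier: string;
  renderTime: DOMHighResTimeStamp;
  loadTime: DOMHighResTimeStamp;
}

// Mark an image for measurement by giving it an `elementtiming`
// attribute (normally done directly in the HTML markup).
const img = document.createElement('img');
img.src = '/example-hero.jpg'; // hypothetical image
img.setAttribute('elementtiming', 'hero-image');
document.body.appendChild(img);

// Observe 'element' entries; `buffered: true` also delivers entries
// recorded before the observer was registered.
const observer = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    const e = entry as ElementTimingEntry;
    // renderTime: when the image was first painted; loadTime: when it
    // finished loading. Both are milliseconds since navigation start.
    console.log(e.identifier, e.renderTime, e.loadTime);
  }
});
observer.observe({ type: 'element', buffered: true });
```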
Today we’re publishing our first report on the performance experienced by visitors to Wikimedia websites, focused on the Autonomous Systems that visitors connect from.
Here are ways we debug issues in production when users report bugs.
Looking back at our ups and downs.
Should we be skeptical of performance guidelines stating that 100 milliseconds feels instantaneous to everyone?
Performance is important in many ways, some of which matter particularly for the Wikimedia Foundation.
Machine learning is a powerful tool, but it’s easy to use it incorrectly and draw biased conclusions, as we’ll show with this real-world example.
We use both synthetic and RUM testing for Wikipedia. These two ways of testing performance are best friends and help us verify regressions. Today, we’ll look at two regressions where having metrics from both helped us.
One of the Performance Team’s responsibilities at Wikimedia is to keep track of Wikipedia’s performance. Why is performance important for us? In our case the answer is simple: we have so many users that any performance regression really affects people’s lives.