As a data-driven manager, I needed a way to track downloads across our team's numerous NPM packages. Frustrated by the lack of an obvious API, I discovered a hidden gem in the NPM registry documentation: a public endpoint that serves per-package download counts. Using it, I created a Google Sheet with custom functions that pull download stats directly into cells. The sheet lets you track both scoped and non-scoped packages, view the data in a table or column format, and easily create charts to visualize trends. Check out the linked sheet and accompanying code to build your own NPM downloads dashboard!
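To give a flavor of how the sheet works, here is a minimal Apps Script sketch of such a custom function (the `NPM_DOWNLOADS` name is illustrative, not the sheet's actual code), assuming the registry's public download-counts endpoint at api.npmjs.org:

```js
/**
 * Returns the last-month download count for an NPM package.
 * A sketch only; the linked sheet's real functions are richer.
 * Works for scoped (@org/pkg) and non-scoped packages alike.
 *
 * @param {string} packageName e.g. "lodash" or "@angular/core"
 * @return {number} Downloads over the last month.
 * @customfunction
 */
function NPM_DOWNLOADS(packageName) {
  // The registry's download-counts API, documented in the npm/registry repo.
  const url = 'https://api.npmjs.org/downloads/point/last-month/' + packageName;
  const response = UrlFetchApp.fetch(url);
  return JSON.parse(response.getContentText()).downloads;
}
```

Once a function like this is added via Extensions > Apps Script, typing `=NPM_DOWNLOADS("lodash")` into a cell fetches the live count, and a column of package names plus a fill-down gives you the raw data for charting.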
This post explores whether developers are actively fixing the issues that Lighthouse performance audits flag. By analyzing HTTP Archive data and tracking Lighthouse scores over time, we aim to see whether sites' scores are genuinely improving. While directly attributing specific fixes to failed Lighthouse audits is challenging, we can investigate trends and infer the impact Lighthouse has had on web development practices. A key consideration is the evolution of Lighthouse itself: changes such as how CPU performance is calculated may affect the comparability of results over time.
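For illustration only (this is not the post's actual query), a time series like this can be pulled from the public HTTP Archive dataset on BigQuery. The sketch below assumes the legacy `httparchive.lighthouse` tables, one per crawl, whose `report` column stores each page's Lighthouse result as a JSON string in the v3+ format:

```js
// A minimal sketch: median Lighthouse performance score per crawl.
// Assumes the legacy `httparchive.lighthouse.<YYYY_MM_DD>_<client>`
// tables and the Lighthouse v3+ report layout (score from 0 to 1).
const { BigQuery } = require('@google-cloud/bigquery');

async function medianScoresOverTime() {
  const bigquery = new BigQuery();
  const [rows] = await bigquery.query({
    query: `
      SELECT
        _TABLE_SUFFIX AS crawl,
        APPROX_QUANTILES(
          CAST(JSON_EXTRACT_SCALAR(report,
            '$.categories.performance.score') AS FLOAT64),
          100)[OFFSET(50)] AS median_performance
      FROM \`httparchive.lighthouse.*\`
      WHERE ENDS_WITH(_TABLE_SUFFIX, 'mobile')
      GROUP BY crawl
      ORDER BY crawl`,
  });
  for (const { crawl, median_performance } of rows) {
    console.log(crawl, median_performance);
  }
}

medianScoresOverTime().catch(console.error);
```

A flat or rising median across crawls is weak evidence at best, which is exactly why the analysis leans on trends rather than per-site attribution.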
This post explores the concept of web compatibility and whether it can be quantified. It draws a parallel to the Receiver Operating Characteristic (ROC) curve, asking whether a similar metric could represent the trade-off between compatibility and feature availability. The post also examines how browser vendors' incentives influence feature adoption, and proposes leveraging compatibility data sources such as caniuse and web-platform-tests (WPT) to prioritize compatibility improvements. Potential tools, such as a compatibility bot and automated blog updates, are suggested as ways to surface these improvements.
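As one concrete reading of "leveraging compatibility data", the sketch below ranks caniuse features by the share of global users who lack full support. Field names (`usage_perc_y`, `title`) follow the Fyrd/caniuse `data.json` format, and the raw URL assumes that repository's default branch:

```js
// A hypothetical sketch: rank caniuse features by the percentage of
// tracked global users without full ("y") support for each feature.
const DATA_URL = 'https://raw.githubusercontent.com/Fyrd/caniuse/main/data.json';

async function leastCompatibleFeatures(limit = 10) {
  const res = await fetch(DATA_URL); // Node 18+ global fetch
  const { data } = await res.json();
  return Object.entries(data)
    .map(([id, feature]) => ({
      id,
      title: feature.title,
      // usage_perc_y is the % of global users with full support.
      unsupportedPct: +(100 - feature.usage_perc_y).toFixed(2),
    }))
    .sort((a, b) => b.unsupportedPct - a.unsupportedPct)
    .slice(0, limit);
}

leastCompatibleFeatures().then(console.table).catch(console.error);
```

A compatibility bot along the lines the post suggests could run such a query on a schedule and announce when a feature's unsupported share falls below some threshold.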