The observability pipeline (2018) - the case for treating metrics/logs/traces data as “another dataset” rather than something different. For example, sending them first to a central collection point (Kafka or equivalent) and then fanning them out to all the “targets” that need them - logging infra, metrics and alerting, even a data lake so they’re available as business metrics. The main idea is to avoid NxM integration problems with this data and introduce some decoupling, so it’s easier to switch infra providers later down the road.
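The decoupling idea can be sketched in a few lines: producers write telemetry once to a central buffer, and a dispatcher fans events out to every registered sink. This is a toy in-process sketch (a deque standing in for Kafka, lambdas standing in for real logging/metrics backends), not how any particular vendor implements it.

```python
from collections import deque

class Pipeline:
    """Toy stand-in for a central collection point (e.g. a Kafka topic)."""

    def __init__(self):
        self.buffer = deque()  # central buffer all producers write to
        self.sinks = []        # downstream consumers: logging, metrics, data lake...

    def subscribe(self, sink):
        # Adding a new sink is one registration here, not N new
        # producer-side integrations - that's the NxM -> N+M win.
        self.sinks.append(sink)

    def publish(self, event):
        # Producers only know about the pipeline, never about sinks.
        self.buffer.append(event)

    def dispatch(self):
        # Fan each buffered event out to every sink.
        while self.buffer:
            event = self.buffer.popleft()
            for sink in self.sinks:
                sink(event)

# Hypothetical sinks, just collecting into a list for demonstration.
received = []
pipe = Pipeline()
pipe.subscribe(lambda e: received.append(("logs", e)))
pipe.subscribe(lambda e: received.append(("metrics", e)))
pipe.publish({"service": "api", "latency_ms": 42})
pipe.dispatch()
# Each published event reaches every registered sink exactly once.
```

Swapping infra providers then means swapping one sink, with producers untouched.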
Extended validation certificates are dead (2018) - the message is important - don’t waste money on EVs; just use a regular certificate. The rest of this (really long) piece seems like it’s taking the piss out of Comodo Cybersecurity.
Whatever happened to the semantic web (2018) - remember the “Semantic Web”? It used to be a thing. But it never really caught on. What’s interesting to me is how the politics of this played out - there were a million standards before there was adoption or even a real killer use case for the tech.
Pixie - a system for recommending 3+ billion items to 200+ million users in real-time (2018) - an overview of a Pinterest paper about their recommendation engine. These sorts of systems are the backbone of many companies - Netflix, Pinterest, Facebook, etc. - but they’re usually glossed over in ML courses, even though they’re interesting in their own right and very tricky to get right. So this is an interesting discussion of how the Pinterest team does it.
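The core technique in the Pixie paper is random walks over the bipartite item-board graph: starting from a query item, take many short walks and recommend the items visited most often. Here’s a toy sketch of that idea with a made-up four-pin graph; the real system adds biasing, early stopping, and graph pruning that this omits.

```python
import random
from collections import Counter

# Made-up bipartite graph: items ("pins") on one side, boards on the other.
item_to_boards = {
    "pin_a": ["board_1", "board_2"],
    "pin_b": ["board_1"],
    "pin_c": ["board_2", "board_3"],
    "pin_d": ["board_3"],
}
# Invert it to get board -> items edges for the return hop of each walk step.
board_to_items = {}
for item, boards in item_to_boards.items():
    for board in boards:
        board_to_items.setdefault(board, []).append(item)

def recommend(query_item, num_walks=1000, walk_length=4, seed=0):
    """Run many short random walks from query_item; rank items by visit count."""
    rng = random.Random(seed)  # fixed seed so the toy example is reproducible
    visits = Counter()
    for _ in range(num_walks):
        item = query_item
        for _ in range(walk_length):
            board = rng.choice(item_to_boards[item])  # hop item -> board
            item = rng.choice(board_to_items[board])  # hop board -> item
            visits[item] += 1
    visits.pop(query_item, None)  # don't recommend the query item itself
    return [item for item, _ in visits.most_common(3)]

recs = recommend("pin_a")
```

Visit counts concentrate on items co-occurring on the same boards as the query, which is what makes the walk a cheap, real-time similarity measure.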