A few days ago I did something that I never thought I’d do again: I started putting some of our more critical logs in Postgres.
Never, ever do this, unless:
• You’re draining only canonical log lines — a summarized digest of everything that happened during a single API call — rather than a flood of voluminous, low-quality logging data.
• You’re putting them in an ephemeral database, so they can be shed as necessary and won’t interfere with higher-fidelity data in cases like recovery from backup.
• You’re using a partitioned table, which makes dropping old data easy and fast (see the sketch after this list).
And even then, it’s a technique that’s probably going to have trouble scaling to enormous sizes.
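As a rough illustration of the partitioning point, here's what that can look like in Postgres. Table and column names here are hypothetical, not from the original post: one row per API request, range-partitioned by day, so that retention becomes a cheap DROP TABLE instead of an expensive bulk DELETE.

```sql
-- A minimal sketch with hypothetical names: one canonical log line
-- per API request, partitioned by day so retention is a cheap DROP.
CREATE TABLE canonical_api_line (
    request_id  uuid        NOT NULL,
    created_at  timestamptz NOT NULL,
    http_method text        NOT NULL,
    http_path   text        NOT NULL,
    http_status int         NOT NULL,
    duration    interval    NOT NULL
) PARTITION BY RANGE (created_at);

-- One partition per day, created ahead of time (e.g. by a cron job).
CREATE TABLE canonical_api_line_2024_06_01
    PARTITION OF canonical_api_line
    FOR VALUES FROM ('2024-06-01') TO ('2024-06-02');

-- Retention: dropping a whole partition is a near-instant metadata
-- operation, unlike DELETE + VACUUM on one monolithic table.
DROP TABLE canonical_api_line_2024_06_01;
```

Dropping a partition avoids the dead tuples and vacuum pressure that a mass DELETE would leave behind, which is most of what makes this scheme tolerable in a relational database at all.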
Fast as a Service
from brandur.org