Hi HN, I'm working on CodeProt. We recently wrote about how we use static analysis (AST and data-flow) to catch performance killers like Zip Bombs and architectural bottlenecks (e.g., full DB reloads) early in the review process.
We found that performance isn't just about speed—it's about availability. A single unconstrained extraction or a bad architectural pattern can bring down a system just as effectively as a DDoS.
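To give a rough flavor of the AST side, here is a minimal Python sketch (illustrative only, not our actual rule set): the simplest version just flags unconstrained extractall() calls so a reviewer can confirm the decompressed size is bounded before merge.

    # Illustrative AST check: flag any call to extractall() so a reviewer can
    # confirm the archive's decompressed size is bounded. A real data-flow pass
    # would also track whether the sizes from infolist() are actually checked.
    import ast
    import sys

    class UnboundedExtractionVisitor(ast.NodeVisitor):
        def __init__(self):
            self.findings = []

        def visit_Call(self, node):
            if isinstance(node.func, ast.Attribute) and node.func.attr == "extractall":
                self.findings.append(node.lineno)
            self.generic_visit(node)

    if __name__ == "__main__":
        for path in sys.argv[1:]:
            with open(path) as f:
                tree = ast.parse(f.read(), filename=path)
            visitor = UnboundedExtractionVisitor()
            visitor.visit(tree)
            for lineno in visitor.findings:
                print(f"{path}:{lineno}: unbounded extractall() - check archive size first")

Anything a check like this prints is a candidate for human review, not a verdict.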
Curious to hear how others are automating these kinds of architectural checks.
Do you want spammers and scrapers to triumph!?! A zip bomb is a good way for the righteous to let it be known that unclean scrapers should stay away.
It's not like they won't figure it out themselves eventually. Use an inverse slowloris instead. Yes, it's harder to deploy, but it's much more robust.
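Rough idea, assuming "inverse slowloris" means a tarpit that drips the response a byte at a time (minimal Python sketch, single connection only for brevity):

    # Minimal tarpit sketch: send HTTP headers, then drip one byte at a time so
    # a greedy scraper keeps its connection pinned. Handles one connection at a
    # time for brevity; a real deployment would multiplex asynchronously,
    # which is part of why this is harder to deploy than a static zip bomb.
    import socket
    import time

    def tarpit(host="0.0.0.0", port=8080, delay=10.0):
        srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen()
        while True:
            conn, _ = srv.accept()
            try:
                conn.sendall(b"HTTP/1.1 200 OK\r\nContent-Type: text/html\r\n\r\n")
                while True:          # never finish the body
                    conn.sendall(b" ")
                    time.sleep(delay)
            except OSError:
                pass                 # client finally gave up
            finally:
                conn.close()

    if __name__ == "__main__":
        tarpit()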
Yes, they do. Remember, OP is building AI-powered review tools. Their technology won't exist without scrapers.