I think it's better to have the AI write scripts that extract the required data from logs, rather than shoving the entire log content directly into the AI's context.
For example: I had Claude analyze the hourly precipitation forecasts for an entire year across various cities. Claude saved the API results to .csv files, then wrote a (Python?) script to analyze the data and output only the 60-80% expected values. This avoided putting every hourly data point (8700+ hours in a year) into the context.
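A minimal sketch of what such a script might look like, assuming "60-80% expected values" means the 60th-80th percentile band per city (the column names and CSV layout here are my own invention; the original data format isn't shown):

```python
import csv
import io
from statistics import quantiles

# Hypothetical CSV of hourly forecast rows: city, timestamp, precip_mm.
# A real run would read a year's worth of rows (8700+) from a file on disk.
SAMPLE = """city,timestamp,precip_mm
Seoul,2024-01-01T00:00,0.0
Seoul,2024-01-01T01:00,1.2
Seoul,2024-01-01T02:00,0.4
Seoul,2024-01-01T03:00,2.5
Seoul,2024-01-01T04:00,0.1
"""

def percentile_band(rows, lo=60, hi=80):
    """Reduce hourly values per city to a [lo, hi] percentile band.

    The point: the AI only ever sees this tiny summary dict,
    not the thousands of raw hourly rows behind it.
    """
    by_city = {}
    for row in rows:
        by_city.setdefault(row["city"], []).append(float(row["precip_mm"]))
    bands = {}
    for city, values in by_city.items():
        qs = quantiles(values, n=100)  # qs[k-1] is the k-th percentile
        bands[city] = (qs[lo - 1], qs[hi - 1])
    return bands

if __name__ == "__main__":
    rows = csv.DictReader(io.StringIO(SAMPLE))
    print(percentile_band(rows))
```

The output is a few numbers per city instead of a year of hourly rows, which is the whole trick.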
Another example: at first, Claude struggled to export a very long AI chat session to Markdown, so it only returned summaries of the chats. Later, after I installed the context mode MCP[1], Claude was able to extract the entire chat session verbatim, including all tool calls.
1. Sometimes?
2. Described above. I also built a tool that lets the dev/AI filter (browser dev console) logs down to only the logs of interest: https://github.com/Leftium/gg?tab=readme-ov-file#coding-agen...
3. It would be interesting to combine your log compression with the scripting approach I described.
[1]: https://hw.leftium.com/#/item/47193064