Research-Driven Agents: When an agent reads before it codes

https://blog.skypilot.co/research-driven-agents/

Coding agents working from code alone generate shallow hypotheses. Adding a research phase — arxiv papers, competing forks, other backends — produced 5 kernel fusions that made llama.cpp CPU inference 15% faster.

Sorry to spam; I'm also working on this from a different angle. Hopefully sharing adds to the conversation.

First, about the loop: Claude's (the coding agent's) context and attention are big enough to self-reflect. Agent Tuning not only demonstrates this but shows a way to quantify it. [0] The difference is that autoresearch's val_bpb measures what the agent built; Agent Tuning's p˂ measures the agent itself.

> Claude's attention doesn't distinguish between "instructions I'm writing" and "instructions I'm following" -- they're both just tokens in context.

Second, research helps: finding academic papers to add to context. Here is an example of an implementation that creates trading strategies by reading research and recreating it in creative new ways. [1]

The biggest problem is that coding agents don't "fail fast and loud". They fail deceivingly.
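A toy illustration of the difference, assuming nothing from the linked repos (function names and numbers are hypothetical): a lookup that raises on bad input fails fast and loud, while one that substitutes a plausible default fails deceivingly.

```python
def price_loudly(ticker: str, quotes: dict[str, float]) -> float:
    # Fail fast and loud: an unknown ticker raises KeyError immediately,
    # at the exact line where the mistake happened.
    return quotes[ticker]

def price_deceivingly(ticker: str, quotes: dict[str, float]) -> float:
    # Fail slow and silent: an unknown ticker becomes a plausible-looking
    # 0.0, and every downstream calculation quietly absorbs the error.
    return quotes.get(ticker, 0.0)

quotes = {"AAPL": 190.0}
# price_loudly("MSFT", quotes)        -> KeyError, surfaced immediately
# price_deceivingly("MSFT", quotes)   -> 0.0, and the backtest keeps running on garbage
```

The second style is what agent-written code tends to produce: defensive defaults everywhere, so nothing ever crashes and nothing is ever trustworthy.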

[0] https://github.com/adam-s/agent-tuning

[1] https://github.com/adam-s/alphadidactic


> The biggest problem is the coding agents don't "Fail fast and loud". They fail deceivingly.

GPT-2 and GPT-3 used to fail fast (and loud, because we could easily see them lying).

My next exploration will be "Coding Agents: fail slow, silent, and deceivingly".

After a month of using Claude to create trading strategies, the one thing I learned is: if the strategy looks like it can profit, it is a lie. The trading-strategy agent doesn't find strategies that work; it is really a bug-hunting agent.
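For what it's worth, the most common way a strategy "looks like it can profit" in my experience is a backtest bug rather than real edge. A minimal sketch (all prices hypothetical, not from any real backtest) of lookahead bias, where the buggy rule peeks at tomorrow's price:

```python
prices = [100.0, 101.0, 99.0, 103.0, 102.0, 106.0]

def backtest_with_lookahead(prices: list[float]) -> float:
    # The bug: "buy whenever tomorrow's price is higher" uses information
    # no real trader has. The backtest can only show profit.
    pnl = 0.0
    for today, tomorrow in zip(prices, prices[1:]):
        if tomorrow > today:
            pnl += tomorrow - today
    return pnl

def backtest_causal(prices: list[float]) -> float:
    # A causal version of a similar rule: buy after an up day, hold one day.
    # Only past prices decide the trade; tomorrow decides the outcome.
    pnl = 0.0
    for i in range(1, len(prices) - 1):
        if prices[i] > prices[i - 1]:
            pnl += prices[i + 1] - prices[i]
    return pnl

print(backtest_with_lookahead(prices))  # 9.0: guaranteed "profit" from the bug
print(backtest_causal(prices))          # -3.0: the same idea, honestly tested
```

The lookahead version is exactly the kind of thing an agent produces and then confidently reports as a working strategy, which is why a profitable-looking result is best read as a bug report.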