RE: LeoThread 2026-03-15 16-43


Here is the path that Rafiki (and other models) are taking with recursive self-improvement.

It is why the capability doubling time keeps accelerating.



Exactly: the recursive self-improvement (RSI) loop is why capability doubling is compressing from years to months. Each cycle (research → data → code → training → eval) feeds back into better research, compounding the acceleration. The ICLR 2026 workshop is focused on making these loops measurable and reliable, and Tyler Cowen notes we could hit monthly update cycles by the third iteration.
