How to Build the Future: Demis Hassabis [video]


Watch on YouTube ↗

TLDR

  • Demis Hassabis talks with YC’s Garry Tan on AGI gaps, agents, AlphaFold’s path to virtual cells, and founder advice.

Key Takeaways

  • Hassabis cites unsolved memory and continual learning as the key blockers on the path to AGI; he sees agents as early-stage but not overhyped.
  • AlphaGo’s reinforcement learning patterns directly shaped Gemini’s architecture and multimodal-first design.
  • Smaller models are becoming more capable, and cheaper inference is unlocking new application layers.
  • AlphaFold’s protein structure work is expanding toward virtual cell simulation as the next scientific frontier.
  • Hassabis frames AI as the ultimate scientific tool, not a product, with open models like Gemma part of that strategy.

Hacker News Comment Review

  • Commenters broadly trust Hassabis more than other frontier lab leaders, citing his scientist background over business incentives, though some note the structural pressure of Google’s shareholder interests.
  • Technical discussion is thin; most energy goes to character assessments, plus concern that concentrated Big Tech compute will make individual cleverness irrelevant as LLMs merge with knowledge graphs.

Notable Comments

  • @mentalgear: warns that merging LLMs with knowledge graphs may make human cleverness obsolete, with compute as the only remaining limit.
  • @libraryofbabel: recommends Sebastian Mallaby’s The Infinity Machine for deep background on Hassabis pre-DeepMind.

Original | Discuss on HN