Scaling trillions of dollars of compute: will it turn out to be the worst bet in history?

"Scaling compute" is AI industry jargon for spending more and more money on data centres and the chips (typically GPUs made by Nvidia) inside them. So-called hyperscalers such as OpenAI, Alphabet, Amazon, Meta, Microsoft, xAI and Anthropic aim to spend nearly a trillion dollars on compute this year, more than a dozen times the cost of the Manhattan Project. Professor Gary Marcus will argue that this is a bad bet, unlikely to pay off, filled with downsides for society, and one that may well lead to a recession.

Biography

Gary Marcus is a leading voice in artificial intelligence. He is a scientist, best-selling author, and serial entrepreneur (founder of Robust.AI and Geometric.AI, acquired by Uber). He is well known for his challenges to contemporary AI, anticipating many of its current limitations decades in advance, and for his research in human language development and cognitive neuroscience. An Emeritus Professor of Psychology and Neural Science at NYU, he is the author of six books, including The Algebraic Mind, Kluge, The Birth of the Mind, and the New York Times bestseller Guitar Zero. He has contributed frequently to The New Yorker, Wired, and The New York Times. His 2019 book with Ernest Davis, Rebooting AI, is one of Forbes's 7 Must Read Books in AI. His most recent book, Taming Silicon Valley, was on The New Yorker's list of recommended books in 2024. His newsletter, Marcus on AI, has over 100,000 subscribers.

May 18 2026, 11.00 - 12.30

Join us for a fireside chat with Gary Marcus, scientist, best-selling author and entrepreneur, where we will explore the implications of the AI industry's reliance on scaling compute.

Room G07, Informatics Forum, 10 Crichton Street, Edinburgh, EH8 9AB