AI Future: Industry, Academia, Gov't Unite for Scientific Discovery
The Trillion Parameter Consortium’s TPC25 conference recently convened in San Jose, California, bringing together leaders from industry, academia, and government to discuss the future of AI, particularly its application in scientific and technical computing. While much of the recent AI progress has been driven by large private tech firms, the conference highlighted a collaborative effort to harness these advancements for broader scientific discovery, benefiting both the United States and humanity.
A panel discussion on July 30, moderated by Karthik Duraisamy of the University of Michigan, explored how various stakeholders can collaborate to leverage AI for scientific breakthroughs. Panelists included representatives from the Department of Energy (DOE), a quantum computing platform developer, a data management solutions provider, the National Science Foundation (NSF), and Intel Labs.
Hal Finkel, Director of the DOE’s Computational Science Research and Partnerships Division, emphasized the department’s deep and long-standing commitment to AI. “All parts of DOE have a critical interest in AI,” Finkel stated, noting significant investment in the field. He detailed how DOE is exploring AI to accelerate scientific productivity across diverse disciplines, from fusion energy and superconductors to advanced robotics and photonics. Finkel highlighted the DOE’s extensive supercomputing expertise, including exascale systems at its national laboratories, as well as its investment in AI testbeds and emerging technologies such as neuromorphic computing, which promises greater energy efficiency for edge AI applications and embedded experimental systems.
Vishal Shrotriya, a business development executive with Quantinuum, a quantum computing platform developer, envisioned a future in which quantum computers, integrated with AI algorithms, tackle complex computational problems in materials science, physics, and chemistry. Shrotriya suggested that quantum computers could revolutionize molecular science by enabling precise simulations of small molecules and generating new synthetic data. That synthetic data could then be fed back into AI models, creating a powerful feedback loop that accelerates scientific discovery and innovation. In areas like drug development, this would mean moving beyond trial-and-error methods to precise calculations of molecular interactions.
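To make the feedback loop concrete, here is a minimal Python sketch of the pattern Shrotriya described: an expensive simulation step produces data, a surrogate model is trained on that data, and the model proposes the next candidate to simulate. The quantum_simulation function and the polynomial surrogate are illustrative stand-ins chosen for this sketch, not Quantinuum software or a real quantum chemistry code.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantum_simulation(x):
    """Stand-in for a precisely simulated molecular property (illustrative only).

    In the workflow described on the panel, this step would run on quantum
    hardware and produce high-quality synthetic data.
    """
    return np.sin(3 * x) + 0.5 * x**2

def fit_surrogate(X, y, degree=5):
    """Fit a simple polynomial surrogate -- a stand-in for an AI model
    trained on the synthetic data produced by the simulations."""
    return np.polynomial.Polynomial.fit(X, y, degree)

# Seed the loop with a handful of "simulated" data points.
X = rng.uniform(-2, 2, size=8)
y = quantum_simulation(X)

candidates = np.linspace(-2, 2, 401)

for round_ in range(5):
    model = fit_surrogate(X, y)
    # Let the surrogate pick the most promising candidate (lowest predicted value),
    # verify it with the expensive simulation, and feed the result back in.
    best = candidates[np.argmin(model(candidates))]
    X = np.append(X, best)
    y = np.append(y, quantum_simulation(best))
    print(f"round {round_}: candidate {best:+.3f}, simulated value {y[-1]:+.3f}")
```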
Molly Presley, Head of Global Marketing for Hammerspace, underscored the critical role of data in the AI ecosystem. She pointed out that while data is essential, its distribution and accessibility are uneven. Hammerspace aims to bridge the gap between human understanding of data and its physical manifestation, facilitating broader access. Presley stressed the importance of industry standards, particularly for data access and metadata definition. She noted that a recurring theme on her “Data Unchained” podcast is the lack of standardized metadata across domains such as genomics, high-performance computing (HPC), and financial services. Presley suggested that the computing community, such as the one gathered at TPC25, is best positioned to address this challenge and ensure that metadata is standardized and searchable across workflows and locations.
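As a rough illustration of what such a shared baseline might look like, the sketch below defines a domain-neutral metadata record in Python along with a single search function that works across domains and storage locations. The field names and example records are assumptions made for this sketch; they do not represent a Hammerspace product or any existing metadata standard.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class DatasetRecord:
    """A minimal, domain-neutral metadata record (field names are illustrative)."""
    dataset_id: str
    domain: str              # e.g. "genomics", "hpc", "financial-services"
    location: str            # where the data physically lives
    format: str              # file or object format
    created: str             # ISO 8601 timestamp
    keywords: List[str] = field(default_factory=list)
    extra: Dict[str, str] = field(default_factory=dict)  # domain-specific extensions

def search(catalog: List[DatasetRecord], keyword: str) -> List[DatasetRecord]:
    """Search across domains and locations using only the shared fields."""
    kw = keyword.lower()
    return [r for r in catalog if kw in (k.lower() for k in r.keywords)]

catalog = [
    DatasetRecord("g-001", "genomics", "s3://lab-archive/reads", "FASTQ",
                  "2025-07-01T00:00:00Z", ["sequencing", "human"]),
    DatasetRecord("h-042", "hpc", "/lustre/project/climate", "NetCDF",
                  "2025-06-12T00:00:00Z", ["climate", "simulation"]),
]

print([r.dataset_id for r in search(catalog, "climate")])
```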
Katie Antypas, Director of the Office of Advanced Cyberinfrastructure at the National Science Foundation (NSF) and a Lawrence Berkeley National Lab employee, highlighted workforce development as a significant challenge. She emphasized the need for investment from both industry partners and the federal government to nurture the next generation of AI talent. Antypas pointed to the National Artificial Intelligence Research Resource (NAIRR) project as a key initiative in this effort, ensuring that researchers across the country and in all domains have access to critical AI resources and fostering a healthy AI innovation ecosystem beyond the largest technology companies.
Pradeep Dubey, an Intel Senior Fellow and director of the Parallel Computing Lab at Intel Labs, discussed several challenges within the AI stack. At the algorithmic level, he identified a fundamental tension: developing models that are both highly capable and trustworthy. Dubey also addressed the issue of “hallucination” in AI models, suggesting it is not a bug but an inherent feature that contributes to AI’s current capabilities. He further noted the challenge of making AI accessible to non-coders who prefer higher-level programming environments like MATLAB, rather than being confined to low-level GPU programming interfaces.
However, the most pressing issue, a recurring theme at TPC25, was the looming electricity shortage. Dubey warned that the massive energy demands of running large AI factories could overwhelm available resources. He pointed out that a significant portion of energy in large AI systems (30-40%, potentially rising to 70-80%) is consumed by data movement rather than computation, leading to inefficient energy use.
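A quick back-of-envelope calculation shows what those percentages imply at facility scale; the 100 MW figure in the sketch below is an assumed round number for illustration, not one cited by Dubey.

```python
# Back-of-envelope split of AI-system power between data movement and compute,
# using the shares quoted by Dubey. The 100 MW facility size is an assumption.
facility_mw = 100.0

for share in (0.30, 0.40, 0.70, 0.80):
    movement_mw = facility_mw * share
    compute_mw = facility_mw - movement_mw
    print(f"data movement {share:.0%}: "
          f"{movement_mw:.0f} MW moving data vs {compute_mw:.0f} MW computing")
```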
Addressing these challenges, from algorithmic complexities and workforce development to data standardization and energy consumption, is crucial if the computing community is to fully leverage AI’s potential and advance scientific discovery. As DOE’s Hal Finkel concluded, broad, aggregated interest and community-driven efforts that foster collaboration and understanding among all stakeholders (government, national labs, industry, and universities) are essential for this shared AI future.