Invited Talks
To ask a question remotely, visit [Slido](https://sli.do) and enter #neurips2024.
Invited Talk
[ West Exhibition Hall C, B3 ]
Abstract
A common model of AI suggests that there is a single measure of intelligence, often called AGI, and that AI systems are agents who can possess more or less of this intelligence. Cognitive science, in contrast, suggests that there are multiple forms of intelligence, that these intelligences trade off against each other, and that they have a distinctive developmental profile. The adult ability to accomplish goals and maximize utilities is often seen as the quintessential form of intelligence. However, this ability to exploit is in tension with the ability to explore. Children are particularly adept at exploration, though at the cost of competent action and decision-making. Human intelligence also relies heavily on cultural transmission, passing on information from one generation to the next, and children are also particularly adept at such learning.

Thinking about exploration and transmission can change our approach to AI systems. Large language models and similar systems are best understood as cultural technologies, like writing, pictures, and print, that enable information transmission. In contrast, our empirical work suggests that RL systems employing an intrinsic objective of empowerment gain can help capture the exploration we see in children.
Invited Talk
[ West Exhibition Hall C, B3 ]
Abstract
Technological change typically occurs in three phases: basic research, scale-up, and industrial application, each with a different degree of methodological diversity—high, low, and medium, respectively. Historically, breakthroughs such as the steam engine and the Haber-Bosch process exemplify these phases and have had a profound impact on society. A similar pattern can be observed in the development of modern artificial intelligence (AI).
In the scale-up phase of AI, large language models (LLMs) have emerged as the most prominent example. While LLMs can be seen as highly sophisticated knowledge-representation techniques, they have not fundamentally advanced AI itself. This scale-up phase was dominated by the transformer architecture. More recently, other architectures, such as state-space models and recurrent neural networks, have also been scaled up. For example, Long Short-Term Memory (LSTM) networks have been scaled up to xLSTM, which in many cases outperforms transformers.
We are now transitioning into the third phase: industrial AI. In this phase, we are adapting AI methods to real-world applications in robotics, the life and earth sciences, engineering, and large-scale simulations that can be dramatically accelerated by AI. As we continue to develop these industrial AI methods, we expect to see an increase in methodological diversity, …
Invited Talk
[ West Exhibition Hall C, B3 ]
Abstract
Anything is optimal given the right criteria: What are the optimal criteria as we invent the future of AI?

This talk explores this question with a series of stories, including the development of affective computing, inspired in part by how the human brain uses emotion to help signal what matters to a person. One of these types of signals can be measured on the surface of the skin and has contributed to today's AI+wearable technology helping save lives. As artificial emotional intelligence abilities grow, what have we learned about how to build optimal AI to engineer a future for people that is truly better? Hint: it's unlikely to be achieved by scaling up today's models.
Invited Talk
[ West Exhibition Hall C, B3 ]
Abstract
Humans learn through interaction and interact to learn. Automating highly dexterous tasks such as food handling, garment sorting, or assistive dressing relies on advances in mathematical modeling, perception, planning, and control, to name a few. Advances in data-driven approaches, together with the development of better simulation tools, allow these challenges to be addressed through systematic benchmarking of relevant methods. This can provide a better understanding of what theoretical developments need to be made and how practical systems can be implemented and evaluated to provide flexible, scalable, and robust solutions. But are we solving the appropriate scientific problems and taking the necessary steps toward general solutions? This talk will showcase some of the challenges in developing physical interaction capabilities in robots, give an overview of our ongoing work on multimodal representation learning, latent-space planning, learning physically consistent reduced-order dynamics, and visuomotor skill learning, and peek into our recent work on olfaction encoding.
Invited Talk
[ West Exhibition Hall C, B3 ]
Abstract
In a world where rapid innovation fuels our greatest ambitions, systems and AI have found themselves in a dynamic and transformative partnership. The systems community has worked tirelessly in the background, building the foundation that enabled AI’s meteoric rise. But now, AI’s exponential progress threatens to outpace the very systems supporting it. At this critical juncture, we propose a bold “marriage”—one that allows systems and AI to co-evolve in ways that push each beyond its current boundaries.
In this keynote, we will examine the role of systems in accelerating AI advancements, the strains AI’s unprecedented growth places on current infrastructures, and the emerging ways AI can reciprocate by transforming the systems landscape. Through systems thinking and core principles, we will outline the grand challenges that arise from this union, envisioning a future where systems and AI reshape each other, charting a path forward that calls on the AI community to foster a future of symbiotic growth.
Invited Talk
[ West Exhibition Hall C, B3 ]
Abstract
Diffusion models have revolutionized generative modeling. Conceptually, these methods define a transport mechanism from a noise distribution to a data distribution. Recent advancements have extended this framework to define transport maps between arbitrary distributions, significantly expanding the potential for unpaired data translation. However, existing methods often fail to approximate optimal transport maps, which are theoretically known to possess advantageous properties. In this talk, we will show how one can modify current methodologies to compute Schrödinger bridges—an entropy-regularized variant of dynamic optimal transport. We will demonstrate this methodology on a variety of unpaired data translation tasks.
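The entropy-regularized optimal transport mentioned above can be illustrated in its static, discrete form with Sinkhorn iterations, the classical fixed-point scheme for this problem. This is a toy sketch of the underlying idea, not the speaker's method (which concerns dynamic Schrödinger bridges); the distributions and cost matrix here are hypothetical.

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iters=200):
    """Entropy-regularized optimal transport between histograms a and b
    with cost matrix C: returns a plan P whose marginals match a and b."""
    K = np.exp(-C / eps)          # Gibbs kernel from the cost matrix
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)         # rescale columns to match marginal b
        u = a / (K @ v)           # rescale rows to match marginal a
    return u[:, None] * K * v[None, :]

# Hypothetical 1-D example: two uniform point clouds, squared-distance cost.
x = np.linspace(0.0, 1.0, 5)
y = np.linspace(0.2, 1.2, 5)
C = (x[:, None] - y[None, :]) ** 2
a = np.full(5, 0.2)
b = np.full(5, 0.2)
P = sinkhorn(a, b, C)
# Row and column sums of P recover a and b up to numerical tolerance;
# larger eps spreads the plan out (more entropy), smaller eps sharpens it
# toward the unregularized optimal transport map.
```

Dynamic Schrödinger bridges can be viewed as the continuous-time counterpart of this static problem, with the diffusion model's noising process playing the role of the reference measure.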