Research

Our research group explores the intersection of physics, machine learning, and computation by investigating how complex dynamical systems can serve as intelligent, energy-efficient platforms for information processing. We focus on non-equilibrium physical systems - such as active matter, driven materials, and collective particle dynamics - as natural reservoirs for machine learning, particularly in the framework of Reservoir Computing. These systems, which exhibit rich spatio-temporal patterns and emergent behaviors like flocking and self-organization, transform input signals into high-dimensional representations through their intrinsic dynamics. By simulating and analyzing these physical reservoirs, we study how the underlying physics influences predictive performance, aiming to develop novel, low-power computing architectures that operate directly within physical substrates.
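The reservoir-computing framework described above can be illustrated with a minimal echo state network: a fixed random dynamical system transforms an input signal into a high-dimensional state trajectory, and only a linear readout is trained. This is a generic sketch, not our group's specific implementation; the reservoir size, spectral radius, washout length, and ridge coefficient below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (not tuned): reservoir size and washout length.
n_in, n_res, washout = 1, 200, 100

# Fixed random input and recurrent weights; rescale the recurrent matrix
# so its spectral radius is below 1, a common sufficient condition for
# fading memory (the "echo state property").
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0.0, 1.0, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the reservoir with the input series u and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
u = np.sin(t)
X = run_reservoir(u[:-1])[washout:]   # reservoir states (after washout)
y = u[1:][washout:]                   # next-step targets

# Train only the linear readout, via ridge regression.
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ y)
pred = X @ W_out
```

The physical systems we study play the role of the random recurrent network here: their intrinsic dynamics provide the high-dimensional transformation, so only the cheap linear readout needs training.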

A key component of our work is the development of data-driven coarse-graining methods that extract meaningful macroscopic behavior from complex many-body systems. By combining principles from statistical mechanics with modern machine learning techniques, we identify relevant variables, quantify memory effects, and reconstruct causal relationships in systems far from equilibrium. This enables more accurate modeling of collective dynamics while reducing computational complexity - offering new tools for both fundamental science and applied modeling in soft matter, biophysics, and materials science.
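As a toy illustration of data-driven coarse-graining (a deliberately simple stand-in for the methods sketched above), the leading principal component of many-body trajectory data can serve as a candidate macroscopic variable. The synthetic data, the planted slow mode, and all scales below are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic toy data: 500 snapshots of 50 degrees of freedom that
# fluctuate around one slow collective mode plus fast noise.
n_frames, n_dof = 500, 50
slow = np.cumsum(rng.normal(0.0, 0.1, n_frames))   # slow collective drift
mode = rng.normal(0.0, 1.0, n_dof)                 # per-particle participation
data = np.outer(slow, mode) + rng.normal(0.0, 0.05, (n_frames, n_dof))

# Data-driven coarse-graining via PCA: diagonalize the covariance of the
# centered trajectories; the top eigenvector defines a candidate
# macroscopic (order-parameter-like) coordinate.
centered = data - data.mean(axis=0)
cov = centered.T @ centered / (n_frames - 1)
eigvals, eigvecs = np.linalg.eigh(cov)             # eigenvalues ascending
leading = eigvecs[:, -1]                           # top principal direction
macro = centered @ leading                         # coarse-grained variable

# Check that the extracted variable tracks the planted slow mode
# (up to sign and scale).
corr = np.corrcoef(macro, slow - slow.mean())[0, 1]
```

In practice we go beyond such linear projections, for example by quantifying memory effects and causal structure in the reduced dynamics, but the principle is the same: let the data select the relevant macroscopic variables.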

We also investigate the deep connections between physical phase transitions and machine learning dynamics. Through the lens of dynamical systems theory, we examine how concepts such as criticality, order parameters, and symmetry breaking influence the behavior of neural networks and optimization processes. This reveals that the same physical mechanisms governing the emergence of structure in nature also play a role in shaping efficient representations in artificial intelligence.

By treating both hardware and algorithms as evolving dynamical systems, our approach enables abstraction across scales - where computation arises naturally from physical behavior, independent of specific hardware implementations. This foundation supports the development of general-purpose, neuromorphic, and adaptive computing systems that are scalable, robust, and energy-efficient. We also bridge the conceptual gap between seemingly disparate AI systems - machine learning on the one hand, physical reservoir computing and neuromorphic computing on the other - to establish a new foundation for how humans can interact with these systems and how society can integrate them in a holistic, human-centered manner.

Our interdisciplinary research bridges physics-based modeling with machine learning innovation, contributing to a deeper understanding of complex systems while advancing the design of next-generation intelligent devices. We are committed to creating sustainable, physically grounded solutions for computation that leverage the inherent intelligence of matter - offering a promising path toward more efficient, adaptive, and scalable technologies for real-world applications.
