Computational architecture of visual cognition in the mind and brain
The goal of my research program is to understand how our brains transform raw sense data into objects, agents, and events: rich, discrete structures that we can think about, plan with, and manipulate. This program promises, on one hand, to enable more human-like artificial intelligence systems, and on the other, to inform functional, multi-level accounts of mental disorders that impact perception. My lab takes a primarily computational approach, aiming for an integrative account of biological computation that spans the cognitive and neural levels. We build computational theories that synthesize a distinctively broad technical toolkit, including probabilistic programming, planners, information theory, and efficient approximate Bayesian inference (using deep neural networks, sequential importance samplers, approximate Bayesian computation, and their hybrids). We empirically test implemented models in quantitative psychophysical experiments with humans, in both web-based and in-lab settings, using computer graphics and computational fabrication for stimulus delivery. We also test these models against neural data from non-human primate experiments, via experimental collaborators, and in human neuroimaging studies conducted in our own lab. These multidisciplinary studies aim to produce multi-level, reverse-engineering accounts of previously unrecognized principles of biological computation underlying visual cognition and its development. This program intersects and interfaces with all three pillars of WTI: most centrally with Neurocomputation and Machine Intelligence, as well as Neurocognition and Behavior, and increasingly also Neurodevelopment and Plasticity.
Ilker Yildirim received his Bachelor's and Master's degrees in Computer Science from Bogazici University, Istanbul, in 2007 and 2009, and his Ph.D. in Brain & Cognitive Sciences from the University of Rochester in 2014. After a postdoc at MIT, he started his lab at Yale University in 2019 as an assistant professor of psychology and statistics & data science. His partner Meltem is a physician at Yale Health, and their two children, Birol and Nimet, are expert negotiators.
Automatic computation of navigational affordances explains selective processing of geometry in scene perception: behavioral and computational evidence. Cognitive Science Society (2021)
Perception of soft materials relies on physics-based object representations: Behavioral and computational evidence. bioRxiv (2021)
Current Opinion in Neurobiology (2019)