David Sprunger is an Assistant Professor in the Department of Mathematical Sciences at Indiana State University. He earned his Ph.D. in Pure Mathematics from Indiana University in 2017 under the supervision of Larry Moss, after receiving a B.A. in pure mathematics from Princeton University in 2011. Following his doctoral work, Sprunger served as a Project Researcher at the ERATO MMSD project hosted by the National Institute of Informatics in Tokyo, where he focused on extending formal methods and software verification techniques to cyber-physical systems, with applications to automotive control and manufacturing. His contributions there included developing mathematical perspectives on machine learning and neural networks grounded in coalgebra and category theory, as well as quantitative refinements of bisimulation and other coalgebraically defined structures. He subsequently held a Research Fellow position in the Theory Group at the University of Birmingham, working on the EPSRC-funded project 'Nominal String Diagrams' with Dan Ghica, Fabio Zanasi, and Alexandra Silva.
Sprunger's research lies at the intersection of coalgebra, logic, and category theory, with interests in mathematical foundations for machine learning and neural networks, quantitative bisimulations, and formal methods for cyber-physical systems. His key publications include:

- 'The differential calculus of causal functions' (with Bart Jacobs, 2019)
- 'Differentiable Causal Computations via Delayed Trace' (with Shin-ya Katsumata, LICS 2019)
- 'Relational Differential Dynamic Logic' (with Jérémy Dubut, Shin-ya Katsumata, Ichiro Hasuo, Juraj Kolčák, and Akihisa Yamada, 2019)
- 'Quantitative bisimulations using coreflections and open morphisms' (with Jérémy Dubut, Shin-ya Katsumata, and Ichiro Hasuo, 2018)
- 'Fibrational Bisimulations and Quantitative Reasoning' (with Jérémy Dubut, Shin-ya Katsumata, and Ichiro Hasuo, CMCS 2018)
- 'Neural Nets via Forward State Transformation and Backward Loss Transformation' (with Bart Jacobs, 2018)
- 'Precongruences and parametrized coinduction for logics for behavioral equivalence' (with Lawrence Moss, CALCO 2017)
- 'A complete logic for behavioural equivalence in coalgebras of finitary set functors' (CMCS 2016)
- 'Eigenvalues and Transduction of Morphic Sequences' (with Lawrence Moss, William Tune, and Jörg Endrullis, DLT 2014)

Additional works cover linearization of automatic arrays, weave specifications, and variadic sequences (MFPS 2013).
