Friday, May 10, 2024

[DMANET] DIMACS Workshop on Modeling Randomness in Neural Network Training: Mathematical, Statistical, and Numerical Guarantees

***************************************************************
DIMACS Workshop on Modeling Randomness in Neural Network Training: Mathematical, Statistical, and Numerical Guarantees

June 5 - 7, 2024
DIMACS Center, Busch Campus, Rutgers University
Organizers:
Tony Chiang, University of Washington & Pacific Northwest National Lab
Ioana Dumitriu, University of California, San Diego
Anand Sarwate, Rutgers University

*********************************************************************
Workshop Announcement:

Neural networks (NNs) are at the heart of modern machine learning and artificial intelligence (ML/AI) systems. The rapid development of these technologies has led to adoption across a variety of domains, particularly speech processing, computer vision, and natural language processing. At the same time, the theoretical underpinnings of these statistical models are not yet fully understood. The question of how and why neural networks "work" can be approached from a variety of mathematical perspectives.

One of the most promising mathematical tools for the analysis of neural networks is random matrix theory, a field whose relevance and applicability to modeling, understanding, and characterizing a vast array of science and technology problems are growing every day. From principal component analysis and random growth processes to particle interactions and community detection in large networks, random matrices are now used to investigate and explain high-dimensional phenomena like concentration (the so-called "blessing of dimensionality," as opposed to the "curse of dimensionality"). Recent universality results allow the use of more complex, non-Gaussian models, sometimes even permitting limited dependencies. This raises the question: what can random matrix theory tell us about neural networks, modern machine learning, and AI?
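As a concrete illustration of the universality phenomenon mentioned above (our own sketch, not part of the announcement), the following minimal Python example compares the eigenvalue spectra of sample covariance matrices built from Gaussian and Rademacher entries; both concentrate within the same Marchenko-Pastur support:

    import numpy as np

    # Minimal sketch: the spectrum of (1/n) W^T W for an n x p matrix W with
    # i.i.d. unit-variance entries concentrates in the Marchenko-Pastur
    # interval, regardless of the entry distribution (universality).
    n, p = 2000, 1000
    for name, W in [("gaussian", np.random.randn(n, p)),
                    ("rademacher", np.random.choice([-1.0, 1.0], size=(n, p)))]:
        evals = np.linalg.eigvalsh(W.T @ W / n)  # sample covariance spectrum
        ratio = p / n
        lo, hi = (1 - np.sqrt(ratio)) ** 2, (1 + np.sqrt(ratio)) ** 2
        print(name, "spectrum in [%.3f, %.3f]" % (evals.min(), evals.max()),
              "MP edges: [%.3f, %.3f]" % (lo, hi))

Running it shows both spectra landing in essentially the same interval, which is the kind of distribution-free guarantee the announcement alludes to.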

The overarching goal of the workshop is to create bridges between different mathematical and computational communities by bringing together researchers with a diverse set of perspectives on neural networks. Topics of interest include:

* understanding matrix-valued random processes that arise during NN training (see the sketch after this list),
* modeling/measuring uncertainty and designing estimators for training processes,
* applications of these designs within optimization algorithms.
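To make the first topic concrete, here is a small, self-contained Python sketch (an illustration under our own assumptions, not the workshop's methodology): running SGD on a toy one-hidden-layer network turns the hidden weight matrix into a matrix-valued random process, whose singular-value spectrum we record as training proceeds:

    import numpy as np

    # Toy setting (illustrative only): one-hidden-layer tanh network trained
    # with minibatch SGD on synthetic data; W1 evolves as a matrix-valued
    # random process driven by the random minibatches.
    rng = np.random.default_rng(0)
    d_in, d_h, n = 32, 64, 512
    X = rng.standard_normal((n, d_in))
    y = np.tanh(X @ rng.standard_normal(d_in))       # synthetic targets
    W1 = rng.standard_normal((d_in, d_h)) / np.sqrt(d_in)
    w2 = rng.standard_normal(d_h) / np.sqrt(d_h)
    lr = 0.05
    for step in range(501):
        batch = rng.integers(0, n, size=64)          # source of randomness
        Xb, yb = X[batch], y[batch]
        H = np.tanh(Xb @ W1)                         # hidden activations
        err = H @ w2 - yb                            # squared-loss residual
        gw2 = H.T @ err / len(batch)
        gH = np.outer(err, w2) * (1 - H ** 2)        # backprop through tanh
        gW1 = Xb.T @ gH / len(batch)
        W1 -= lr * gW1
        w2 -= lr * gw2
        if step % 100 == 0:
            s = np.linalg.svd(W1, compute_uv=False)  # spectrum of the process
            print(step, "top-3 singular values:", np.round(s[:3], 3))

Tracking how this spectrum drifts and fluctuates across runs is one simple, hands-on version of the uncertainty-modeling questions in the list above.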

*********************************************************************

Call for Participation:

The workshop will feature a poster session, and limited funds may be available to support travel for those whose attendance is contingent on such support. To register for the workshop, or for information on applying for support and submitting a poster, please visit the workshop's webpage.

*********************************************************************
Workshop web site (including registration, lodging and parking):
http://dimacs.rutgers.edu/events/details?eID=2772

