BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//wp-events-plugin.com//7.2.3.1//EN
X-WR-TIMEZONE:Europe/Paris
BEGIN:VEVENT
UID:57@cran.univ-lorraine.fr
DTSTART;TZID=Europe/Paris:20240209T140000
DTEND;TZID=Europe/Paris:20240209T153000
DTSTAMP:20240316T160214Z
URL:https://www.cran.univ-lorraine.fr/events/seminaire-projet-simul-biosis
 -9/
SUMMARY:Seminar: Valentin Leplat
DESCRIPTION:Speaker: Valentin Leplat (SkolTech\, Moscow\, Russia)\nWebpage:
  https://sites.google.com/view/valentinleplat/\nDate: February 9th\,
  2024\, 14h-15h\nTitle: Introduction to Deep Nonnegative Matrix
  Factorization and Stochastic Optimization with Heavy Tails\n\nAbstract:\nPart
  1: Deep Nonnegative Matrix Factorization with β-Divergences\nOur first
  topic revolves around Deep Nonnegative Matrix Factorization (deep
  NMF)\, a novel and promising facet of unsupervised learning. Deep NMF
  has emerged as a potent technique for extracting multi-layered features
  spanning various scales. However\, conventional deep NMF models have
  primarily relied on the least squares error as their evaluation
  metric\, which may not be the most suitable gauge for assessing the
  quality of approximations across diverse datasets. For data types such
  as audio signals and documents\, β-divergences have gained recognition
  as a more fitting alternative. In this seminar\, we present new models
  and algorithms that harness β-divergences to enhance deep NMF\, with an
  emphasis on the notion of identifiability.\n\nPart 2: Heavy-Tailed
  Stochastic Optimization for Deep Neural Networks\nOur second topic
  concerns stochastic optimization\, with a particular focus on recent
  discoveries concerning the nature of stochastic gradient noise in deep
  neural network training. Contrary to the conventional assumption of
  Gaussian noise\, empirical evidence shows that gradient noise often
  exhibits heavy-tailed characteristics. We introduce an efficient
  mechanism for optimizers to handle this noise behavior. Additionally\,
  we showcase an extension of our recently introduced stochastic
  optimizer\, referred to as NAG-GS\, specifically tailored for the
  training of Vision Transformers.\n\nClick here to join the meeting:
  https://teams.microsoft.com/l/meetup-join/19%3aaa79c15ac331466aa8ad98c
 becb29ab2%40thread.tacv2/1707130295999?context=%7b%22Tid%22%3a%22158716
 cf-46b9-48ca-8c49-c7bb67e575f3%22%2c%22Oid%22%3a%22c4a8aea2-7ce5-4ee9-b
 6c5-9fee62ad0257%22%7d
CATEGORIES:BioSiS Department,SiMul Project Seminars
LOCATION:CRAN - FST - 4ème\, Campus Sciences\, Boulevard des Aiguillettes\
 , Vandoeuvre-lès-Nancy\, 54506\, France
X-APPLE-STRUCTURED-LOCATION;VALUE=URI;X-ADDRESS="Campus Sciences, Boulev
 ard des Aiguillettes, Vandoeuvre-lès-Nancy, 54506, France";X-APPLE-RADI
 US=100;X-TITLE="CRAN - FST - 4ème":geo:0,0
END:VEVENT
BEGIN:VTIMEZONE
TZID:Europe/Paris
X-LIC-LOCATION:Europe/Paris
BEGIN:DAYLIGHT
DTSTART:19700329T020000
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
END:DAYLIGHT
BEGIN:STANDARD
DTSTART:19701025T030000
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
END:STANDARD
END:VTIMEZONE
END:VCALENDAR