BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//wp-events-plugin.com//5.981//EN
X-WR-TIMEZONE:Asia/Jerusalem
BEGIN:VEVENT
UID:116@web.iem.technion.ac.il
DTSTART;TZID=Asia/Jerusalem:20211017T143000
DTEND;TZID=Asia/Jerusalem:20211017T153000
DTSTAMP:20211005T053831Z
URL:https://web.iem.technion.ac.il/site/iemevents/learning-discrete-struct
ured-variational-auto-encoder-using-natural-evolution-strategies/
SUMMARY:Learning Discrete Structured Variational Auto-Encoder using Natura
 l Evolution Strategies
DESCRIPTION:By: MSc. Alon Berliner\nAdvisor: Prof. Tamir Hazan\nWhere: Zo
 om\nFrom: Technion\nAbstract:\n\nDiscrete variational auto-encoders (VAE
 s) are able to represent semantic latent spaces in generative learning. I
 n many real-life settings\, the discrete latent space consists of high-di
 mensional structures\, and propagating gradients through the relevant str
 uctures often requires enumerating over exponentially many structures. Re
 cently\, various approaches were devised to propagate approximate gradien
 ts without enumerating over the space of possible structures. In this wor
 k\, we use Natural Evolution Strategies (NES)\, a class of gradient-free
 black-box optimization algorithms\, to learn discrete VAEs. NES algorithm
 s are computationally appealing because they estimate gradients with forw
 ard-pass evaluations only\, so they do not require propagating gradients
 through their discrete structures. We demonstrate empirically that optimi
 zing discrete structured VAEs using NES is as effective as gradient-based
  approximations. Lastly\, we prove that NES converges for non-Lipschitz f
 unctions such as those arising in discrete structured VAEs.\n\nZoom Link\
 n\nhttps://technion.zoom.us/j/3800541616
CATEGORIES:Graduate Student Seminar,Seminars
END:VEVENT
BEGIN:VTIMEZONE
TZID:Asia/Jerusalem
X-LIC-LOCATION:Asia/Jerusalem
BEGIN:DAYLIGHT
DTSTART:20210326T030000
TZOFFSETFROM:+0200
TZOFFSETTO:+0300
TZNAME:IDT
END:DAYLIGHT
BEGIN:STANDARD
DTSTART:20211031T020000
TZOFFSETFROM:+0300
TZOFFSETTO:+0200
TZNAME:IST
END:STANDARD
END:VTIMEZONE
END:VCALENDAR