Dear all,
The CAM seminar starts at 4:45 pm. The speaker is
Guannan Zhang (ORNL)
The title, abstract, and Zoom info are below.
Abner
Title: High-dimensional black-box optimization and its applications in
machine learning
Abstract: The local gradient points in the direction of the steepest slope
of a function in an infinitesimal neighborhood. An optimizer guided by
the local gradient is often trapped in local optima when the loss
landscape is non-convex or multi-modal. To address this issue, we
developed a novel nonlocal gradient for global optimization in the
black-box setting where we only have access to function queries. We
first developed a directional Gaussian smoothing (DGS) approach and
then used DGS to define the nonlocal gradient. The DGS method conducts
1D nonlocal exploration with a large smoothing radius along d
orthogonal directions in R^d, each of which defines a nonlocal
directional derivative as a 1D integral. The d directional derivatives
are then assembled to form our nonlocal gradient, referred to as the
DGS gradient. We used the Gauss-Hermite (GH) quadrature to approximate
the d 1D integrals to obtain an accurate estimator. The DGS gradient
can be used to guide any gradient-based optimization algorithm, such as
gradient descent and Adam. The superior performance of our method is
demonstrated in three sets of examples, including benchmark functions
for global optimization and reinforcement learning.
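For readers curious about the construction, here is a minimal sketch of a DGS-style gradient estimator as described in the abstract: each of d orthogonal directions gets a Gaussian-smoothed directional derivative, written as a 1D integral and approximated by Gauss-Hermite quadrature, using only function queries. The function and parameter names are illustrative, not from the speaker's code.

```python
import numpy as np

def dgs_gradient(f, x, sigma=1.0, n_quad=7, directions=None):
    """Sketch of a directional-Gaussian-smoothing (DGS) gradient estimate.

    For each orthogonal direction xi, the Gaussian-smoothed directional
    derivative is the 1D integral
        D_xi f(x) = sqrt(2)/(sigma*sqrt(pi)) * int exp(-t^2) t f(x + sqrt(2)*sigma*t*xi) dt,
    approximated with Gauss-Hermite (GH) quadrature. Only f-queries are used
    (black-box setting). The d directional derivatives are then assembled
    into a gradient in the original coordinates.
    """
    d = x.size
    if directions is None:
        directions = np.eye(d)          # rows: d orthonormal directions in R^d
    t, w = np.polynomial.hermite.hermgauss(n_quad)   # GH nodes and weights
    grad_dirs = np.empty(d)
    for i, xi in enumerate(directions):
        # function queries along the i-th direction
        vals = np.array([f(x + np.sqrt(2.0) * sigma * tm * xi) for tm in t])
        # GH approximation of the smoothed directional derivative
        grad_dirs[i] = np.sqrt(2.0) / (sigma * np.sqrt(np.pi)) * np.sum(w * t * vals)
    # assemble the nonlocal (DGS) gradient from the directional derivatives
    return directions.T @ grad_dirs

# usage: on a quadratic f(x) = |x|^2 the estimate matches the exact gradient 2x
f = lambda x: np.sum(x**2)
x = np.array([1.0, -2.0])
g = dgs_gradient(f, x, sigma=0.1)
```

The estimate `g` can then be fed to any gradient-based optimizer (gradient descent, Adam); a large smoothing radius `sigma` trades local accuracy for nonlocal exploration of the loss landscape.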
============
https://tennessee.zoom.us/j/91531216657
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-
To leave the list go here:
https://listserv.utk.edu/cgi-bin/wa?SUBED1=MATHTALK&A=1