Abstract: Many standard optimization algorithms require cheap and accurate derivatives of the objective and/or constraint functions. However, when the objective is noisy, computationally expensive, or given by a black-box procedure, derivative information may be inaccurate or impractical to compute. Derivative-Free Optimization (DFO) encompasses a variety of techniques for nonlinear optimization in the absence of derivatives. Such techniques can nonetheless struggle on large-scale problems, for reasons including high linear algebra costs and the strong dimension-dependency of their worst-case complexity bounds. In this talk, I will discuss model-based and direct search DFO algorithms based on iterative searches in randomly drawn subspaces, and show how these methods can be used to improve the scalability of DFO. This is joint work with Coralia Cartis (Oxford) and Clément Royer (Paris Dauphine-PSL).
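To illustrate the core idea of the talk (not the speakers' actual algorithm), the following is a minimal sketch of a direct-search method that, at each iteration, polls along directions drawn from a random low-dimensional subspace rather than in the full space. All names, the subspace dimension p, and the sufficient-decrease constant are illustrative choices, not taken from the work being presented.

```python
import numpy as np

def random_subspace_direct_search(f, x0, p=2, alpha0=1.0, max_iter=200, seed=0):
    """Illustrative sketch: direct search polling +/- directions from a
    random p-dimensional subspace of R^n at each iteration."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    n = x.size
    alpha = alpha0  # step size, expanded on success and shrunk on failure
    fx = f(x)
    for _ in range(max_iter):
        # Draw a fresh random subspace basis: columns of a scaled Gaussian matrix.
        P = rng.standard_normal((n, p)) / np.sqrt(p)
        improved = False
        for j in range(p):
            for s in (+1.0, -1.0):
                trial = x + s * alpha * P[:, j]
                ft = f(trial)
                # Accept only on sufficient decrease (a common DFO safeguard).
                if ft < fx - 1e-4 * alpha**2:
                    x, fx = trial, ft
                    improved = True
                    break
            if improved:
                break
        alpha = 2.0 * alpha if improved else 0.5 * alpha
    return x, fx
```

Each iteration costs only 2p function evaluations regardless of the ambient dimension n, which is the scalability benefit the talk explores for both direct search and model-based variants.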
Talk time in other timezones: AEST 9:00 AM Fri 1 Sep, JST 8:00 AM Fri 1 Sep, CEST 1:00 AM Fri 1 Sep, BST 12:00 AM Fri 1 Sep, UTC 23:00 Thu 31 Aug, EDT 7:00 PM Thu 31 Aug, CDT 6:00 PM Thu 31 Aug, MDT 5:00 PM Thu 31 Aug, MST 4:00 PM Thu 31 Aug, PDT 4:00 PM Thu 31 Aug