High-dimensional optimization and probability [electronic resource] : with a view towards data science / Ashkan Nikeghbali, Panos M. Pardalos, Andrei M. Raigorodskii, Michael Th. Rassias, editors.

This volume presents extensive research devoted to a broad spectrum of mathematics, with emphasis on interdisciplinary aspects of Optimization and Probability. Chapters also emphasize applications to Data Science, a timely field with high impact on modern society. The discussion presents modern...

Bibliographic Details
Online Access: Full Text (via Springer)
Other Authors: Nikeghbali, Ashkan, 1975-, Pardalos, P. M. (Panos M.), 1954-, Raigorodskii, Andrei, Rassias, Michael Th., 1987-
Format: Electronic eBook
Language: English
Published: Cham : Springer, 2022.
Series: Springer optimization and its applications ; v. 191.
Table of Contents:
  • Intro
  • Preface
  • Contents
  • Projection of a Point onto a Convex Set via Charged Balls Method
  • 1 Introduction
  • 2 Charged Balls Method and its Modification
  • 3 Case of the Set with Nonsmooth Boundary
  • 4 Numerical Experiments
  • 5 Conclusion
  • References
  • Towards Optimal Sampling for Learning Sparse Approximations in High Dimensions
  • 1 Introduction
  • 1.1 Main Problem
  • 1.2 Overview
  • 1.3 Additional Contributions
  • 1.4 Related Literature
  • 1.5 Outline
  • 2 Preliminaries
  • 2.1 Notation
  • 2.2 Problem and Key Questions
  • 2.3 Examples
  • 2.4 Multi-Index Sets
  • 3 Sparse Approximation via (Weighted) Least Squares
  • 3.1 Computation of the Least-Squares Approximation
  • 3.2 Accuracy, Stability and Sample Complexity
  • 3.3 Monte Carlo Sampling
  • 3.4 Optimal Sampling
  • 3.5 Practical Optimal Sampling via Discrete Measures
  • 3.6 Numerical Examples
  • 3.7 Proofs of Theorems 2 and 3
  • 4 Sparse Approximation via ℓ1-Minimization
  • 4.1 Formulation
  • 4.2 Accuracy, Stability and Sample Complexity
  • 4.3 Monte Carlo Sampling
  • 4.4 ‘Optimal’ Sampling
  • 4.5 ‘Optimal’ Sampling and Discrete Measures
  • 4.6 Further Discussion and Numerical Examples
  • 4.7 Proof of Theorems 5 and 6
  • 5 A Novel Approach for Sparse Polynomial Approximation on Irregular Domains
  • 5.1 Method
  • 5.2 Orderings
  • 5.3 Numerical Examples
  • 6 Structured Sparse Approximation
  • 6.1 Weighted Sparsity and Weighted ℓ1-Minimization
  • 6.2 Sparsity in Lower Sets
  • 6.3 Sampling and Numerical Experiments
  • 7 Conclusions and Challenges
  • References
  • Recent Theoretical Advances in Non-Convex Optimization
  • 1 Introduction
  • 2 Preliminaries
  • 2.1 Global Optimization Is NP-Hard
  • 2.2 Lower Complexity Bound for Global Optimization
  • 2.3 Examples of Non-Convex Problems
  • 2.3.1 Problems with Hidden Convexity or Analytic Solutions
  • 2.3.2 Problems with Convergence Results
  • 2.3.3 Geometry of Non-Convex Optimization Problems
  • 3 Deterministic First-Order Methods
  • 3.1 Unconstrained Minimization
  • 3.2 Incorporating Simple Constraints
  • 3.3 Incorporating Momentum for Acceleration
  • 4 Stochastic First-Order Methods
  • 4.1 General View on Optimal Deterministic and Stochastic First-Order Methods for Non-Convex Optimization
  • 4.1.1 Deterministic Case
  • 4.1.2 Stochastic Case: Uniformly Bounded Variance
  • 4.1.3 Stochastic Case: Finite-Sum Minimization
  • 4.2 SGD and Its Variants
  • 4.2.1 Assumptions on the Stochastic Gradient
  • 4.2.2 The Choice of the Stepsize
  • 4.2.3 Over-Parameterized Models
  • 4.2.4 Proximal Variants
  • 4.2.5 Momentum-SGD
  • 4.2.6 Random Reshuffling
  • 4.3 Variance-Reduced Methods
  • 4.3.1 Convex and Weakly Convex Sums of Non-Convex Functions
  • 4.4 Adaptive Methods
  • 4.4.1 AdaGrad and Adam
  • 4.4.2 Adaptive SGD
  • 5 First-Order Methods Under Additional Assumptions
  • 5.1 Polyak-Łojasiewicz Condition
  • 5.1.1 Stochastic First-Order Methods Under Polyak-Łojasiewicz Condition
  • 5.2 Star-Convexity and α-Weak-Quasi-Convexity
  • 5.2.1 Stochastic Methods and α-Weak-Quasi-Convexity