Tag | Ind1 | Ind2 | Content
---|---|---|---
000 | | | 03933nam a22005775i 4500
001 | | | 978-3-030-17076-9
003 | | | DE-He213
005 | | | 20220801214744.0
007 | | | cr nn 008mamaa
008 | | | 190612s2020 sz \| s \|\|\|\| 0\|eng d
020 | | | _a9783030170769 _9978-3-030-17076-9
024 | 7 | | _a10.1007/978-3-030-17076-9 _2doi
050 | | 4 | _aTK5101-5105.9
072 | | 7 | _aTJK _2bicssc
072 | | 7 | _aTEC041000 _2bisacsh
072 | | 7 | _aTJK _2thema
082 | 0 | 4 | _a621.382 _223
100 | 1 | | _aShi, Bin. _eauthor. _4aut _4http://id.loc.gov/vocabulary/relators/aut _940251
245 | 1 | 0 | _aMathematical Theories of Machine Learning - Theory and Applications _h[electronic resource] / _cby Bin Shi, S. S. Iyengar.
250 | | | _a1st ed. 2020.
264 | | 1 | _aCham : _bSpringer International Publishing : _bImprint: Springer, _c2020.
300 | | | _aXXI, 133 p. 25 illus., 24 illus. in color. _bonline resource.
336 | | | _atext _btxt _2rdacontent
337 | | | _acomputer _bc _2rdamedia
338 | | | _aonline resource _bcr _2rdacarrier
347 | | | _atext file _bPDF _2rda
505 | 0 | | _aChapter 1. Introduction -- Chapter 2. General Framework of Mathematics -- Chapter 3. Problem Formulation -- Chapter 4. Development of Novel Techniques of CoCoSSC Method -- Chapter 5. Further Discussions of the Proposed Method -- Chapter 6. Related Work on Geometry of Non-Convex Programs -- Chapter 7. Gradient Descent Converges to Minimizers -- Chapter 8. A Conservation Law Method Based on Optimization -- Chapter 9. Improved Sample Complexity in Sparse Subspace Clustering with Noisy and Missing Observations -- Chapter 10. Online Discovery for Stable and Grouping Causalities in Multi-Variate Time Series -- Chapter 11. Conclusion.
520 | | | _aThis book studies mathematical theories of machine learning. The first part of the book explores the optimality and adaptivity of choosing step sizes of gradient descent for escaping strict saddle points in non-convex optimization problems. In the second part, the authors propose algorithms to find local minima in non-convex optimization and to obtain global minima to some degree, based on Newton's second law without friction. In the third part, the authors study the problem of subspace clustering with noisy and missing data, a problem well motivated by practical applications: data subject to stochastic Gaussian noise and/or incomplete data with uniformly missing entries. In the last part, the authors introduce a novel VAR model with Elastic-Net regularization and its equivalent Bayesian model, allowing for both stable sparsity and group selection. Provides a thorough look into the variety of mathematical theories of machine learning. Presented in four parts, allowing readers to easily navigate the complex theories. Includes extensive empirical studies on both synthetic and real application time series data.
650 | | 0 | _aTelecommunication. _910437
650 | | 0 | _aComputational intelligence. _97716
650 | | 0 | _aData mining. _93907
650 | | 0 | _aInformation storage and retrieval systems. _922213
650 | | 0 | _aQuantitative research. _94633
650 | 1 | 4 | _aCommunications Engineering, Networks. _931570
650 | 2 | 4 | _aComputational Intelligence. _97716
650 | 2 | 4 | _aData Mining and Knowledge Discovery. _940252
650 | 2 | 4 | _aInformation Storage and Retrieval. _923927
650 | 2 | 4 | _aData Analysis and Big Data. _940253
700 | 1 | | _aIyengar, S. S. _eauthor. _4aut _4http://id.loc.gov/vocabulary/relators/aut _940254
710 | 2 | | _aSpringerLink (Online service) _940255
773 | 0 | | _tSpringer Nature eBook
776 | 0 | 8 | _iPrinted edition: _z9783030170752
776 | 0 | 8 | _iPrinted edition: _z9783030170776
776 | 0 | 8 | _iPrinted edition: _z9783030170783
856 | 4 | 0 | _uhttps://doi.org/10.1007/978-3-030-17076-9
912 | | | _aZDB-2-ENG
912 | | | _aZDB-2-SXE
942 | | | _cEBK
999 | | | _c76714 _d76714