Numerical optimization with computational errors [electronic resource] /
Record type:
Language material, printed : Monograph/item
Dewey class no.:
519.6
Title/Author:
Numerical optimization with computational errors / by Alexander J. Zaslavski.
Author:
Zaslavski, Alexander J.
Publisher:
Cham : Springer International Publishing, 2016.
Description:
ix, 304 p. : ill., digital ; 24 cm.
Contained By:
Springer eBooks
Subject:
Mathematical optimization.
Subject:
Mathematics.
Subject:
Calculus of Variations and Optimal Control; Optimization.
Subject:
Numerical Analysis.
Subject:
Operations Research, Management Science.
ISBN:
9783319309217
ISBN:
9783319309200
Contents note:
1. Introduction -- 2. Subgradient Projection Algorithm -- 3. The Mirror Descent Algorithm -- 4. Gradient Algorithm with a Smooth Objective Function -- 5. An Extension of the Gradient Algorithm -- 6. Weiszfeld's Method -- 7. The Extragradient Method for Convex Optimization -- 8. A Projected Subgradient Method for Nonsmooth Problems -- 9. Proximal Point Method in Hilbert Spaces -- 10. Proximal Point Methods in Metric Spaces -- 11. Maximal Monotone Operators and the Proximal Point Algorithm -- 12. The Extragradient Method for Solving Variational Inequalities -- 13. A Common Solution of a Family of Variational Inequalities -- 14. Continuous Subgradient Method -- 15. Penalty Methods -- 16. Newton's method -- References -- Index.
Summary note:
This book studies the approximate solutions of optimization problems in the presence of computational errors. A number of results are presented on the convergence behavior of algorithms in a Hilbert space; these algorithms are examined taking computational errors into account. The author shows that the algorithms generate a good approximate solution if the computational errors are bounded from above by a small positive constant. Known computational errors are examined with the aim of determining an approximate solution. Researchers and students interested in optimization theory and its applications will find this book instructive and informative. The monograph contains 16 chapters, including chapters devoted to the subgradient projection algorithm, the mirror descent algorithm, the gradient projection algorithm, Weiszfeld's method, constrained convex minimization problems, the convergence of a proximal point method in a Hilbert space, the continuous subgradient method, penalty methods, and Newton's method.
Electronic resource:
http://dx.doi.org/10.1007/978-3-319-30921-7
Numerical optimization with computational errors [electronic resource] / by Alexander J. Zaslavski. - Cham : Springer International Publishing, 2016. - ix, 304 p. : ill., digital ; 24 cm. - (Springer optimization and its applications, 1931-6828 ; v.108). - (Springer optimization and its applications ; v.52).
ISBN: 9783319309217
Standard No.: 10.1007/978-3-319-30921-7 (doi)
Subjects--Topical Terms: Mathematical optimization.
LC Class. No.: QA402.5
Dewey Class. No.: 519.6
LDR    02836nam a2200349 a 4500
001    447430
003    DE-He213
005    20161012172553.0
006    m d
007    cr nn 008maaau
008    161201s2016 gw s 0 eng d
020    $a 9783319309217 $q (electronic bk.)
020    $a 9783319309200 $q (paper)
024 7  $a 10.1007/978-3-319-30921-7 $2 doi
035    $a 978-3-319-30921-7
040    $a GP $c GP
041 0  $a eng
050 4  $a QA402.5
072 7  $a PBKQ $2 bicssc
072 7  $a PBU $2 bicssc
072 7  $a MAT005000 $2 bisacsh
072 7  $a MAT029020 $2 bisacsh
082 04 $a 519.6 $2 23
090    $a QA402.5 $b .Z38 2016
100 1  $a Zaslavski, Alexander J. $3 590180
245 10 $a Numerical optimization with computational errors $h [electronic resource] / $c by Alexander J. Zaslavski.
260    $a Cham : $b Springer International Publishing : $b Imprint: Springer, $c 2016.
300    $a ix, 304 p. : $b ill., digital ; $c 24 cm.
490 1  $a Springer optimization and its applications, $x 1931-6828 ; $v v.108
505 0  $a 1. Introduction -- 2. Subgradient Projection Algorithm -- 3. The Mirror Descent Algorithm -- 4. Gradient Algorithm with a Smooth Objective Function -- 5. An Extension of the Gradient Algorithm -- 6. Weiszfeld's Method -- 7. The Extragradient Method for Convex Optimization -- 8. A Projected Subgradient Method for Nonsmooth Problems -- 9. Proximal Point Method in Hilbert Spaces -- 10. Proximal Point Methods in Metric Spaces -- 11. Maximal Monotone Operators and the Proximal Point Algorithm -- 12. The Extragradient Method for Solving Variational Inequalities -- 13. A Common Solution of a Family of Variational Inequalities -- 14. Continuous Subgradient Method -- 15. Penalty Methods -- 16. Newton's method -- References -- Index.
520    $a This book studies the approximate solutions of optimization problems in the presence of computational errors. A number of results are presented on the convergence behavior of algorithms in a Hilbert space; these algorithms are examined taking computational errors into account. The author shows that the algorithms generate a good approximate solution if the computational errors are bounded from above by a small positive constant. Known computational errors are examined with the aim of determining an approximate solution. Researchers and students interested in optimization theory and its applications will find this book instructive and informative. The monograph contains 16 chapters, including chapters devoted to the subgradient projection algorithm, the mirror descent algorithm, the gradient projection algorithm, Weiszfeld's method, constrained convex minimization problems, the convergence of a proximal point method in a Hilbert space, the continuous subgradient method, penalty methods, and Newton's method.
650  0 $a Mathematical optimization. $3 176332
650 14 $a Mathematics. $3 172349
650 24 $a Calculus of Variations and Optimal Control; Optimization. $3 464715
650 24 $a Numerical Analysis. $3 465756
650 24 $a Operations Research, Management Science. $3 463666
710 2  $a SpringerLink (Online service) $3 463450
773 0  $t Springer eBooks
830  0 $a Springer optimization and its applications ; $v v.52. $3 464112
856 40 $u http://dx.doi.org/10.1007/978-3-319-30921-7
950    $a Mathematics and Statistics (Springer-11649)