Piunovskiy, A. B.
Examples in Markov Decision Processes [electronic resource].
Record Type:
Bibliography - Language materials, printed : Monograph/item
Dewey Class. No.:
519.233
Title/Author:
Examples in Markov Decision Processes
Author:
Piunovskiy, A. B.
Publisher:
Singapore : World Scientific Publishing Company, 2012.
Description:
1 online resource (308 p.)
Notes:
Description based upon print version of record.
Subject:
Markov processes.
ISBN:
9781848167940 (electronic bk.)
ISBN:
1848167946 (electronic bk.)
Summary:
This invaluable book provides approximately eighty examples illustrating the theory of controlled discrete-time Markov processes. Apart from applications of the theory to real-life problems such as the stock exchange, queues, gambling, and optimal search, the main attention is paid to counter-intuitive, unexpected properties of optimization problems. Such examples illustrate the importance of the conditions imposed in the theorems on Markov Decision Processes. Many of the examples are based upon examples published earlier in journal articles or textbooks, while several others are new. The aim was ...
Electronic Resources:
http://www.worldscientific.com/worldscibooks/10.1142/P809#t=toc
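A contents note in the MARC record below refers to the optimality equation. For reference, in a standard discounted-cost MDP formulation (textbook notation, not quoted from this record or the book):

    \[
      v^{*}(x) \,=\, \inf_{a \in A(x)} \Big[\, c(x,a) + \beta \sum_{y \in X} p(y \mid x,a)\, v^{*}(y) \Big],
      \qquad x \in X,\ \beta \in (0,1),
    \]

where c(x,a) is the one-step cost and p(y|x,a) the transition kernel. The book's counterexamples probe exactly such equations; the note below, for instance, concerns a non-optimal strategy whose value nevertheless solves it.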
Piunovskiy, A. B.
Examples in Markov Decision Processes [electronic resource]. - Singapore : World Scientific Publishing Company, 2012. - 1 online resource (308 p.). - (Imperial College Press Optimization Series ; v.2)
ISBN: 9781848167940 (electronic bk.)
Subjects--Topical Terms: Markov processes.
Index Terms--Genre/Form: Electronic books.
LC Class. No.: QA274.7
Dewey Class. No.: 519.233
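The record lists the same title under both ISBN-13 and ISBN-10. A quick way to verify the check digits (a minimal Python sketch of the standard ISBN arithmetic; the helper names are illustrative):

    def isbn13_ok(isbn: str) -> bool:
        # ISBN-13: digits weighted 1,3,1,3,... must sum to 0 mod 10.
        digits = [int(c) for c in isbn]
        return sum(d * w for d, w in zip(digits, [1, 3] * 7)) % 10 == 0

    def isbn10_ok(isbn: str) -> bool:
        # ISBN-10: digits weighted 10..1 must sum to 0 mod 11 ("X" = 10).
        digits = [10 if c == "X" else int(c) for c in isbn]
        return sum(d * w for d, w in zip(digits, range(10, 0, -1))) % 11 == 0

    # Both ISBNs recorded in this record's 020 fields check out:
    assert isbn13_ok("9781848167940")
    assert isbn10_ok("1848167946")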
LDR    01454cam a2200241Mu 4500
001    399999
006    m o d
007    cr cnu---unuuu
008    140123s2012 si o 000 0 eng d
020    $a 9781848167940 (electronic bk.)
020    $a 1848167946 (electronic bk.)
035    $a ocn830162389
040    $a EBLCP $b eng $c EBLCP $d OCLCO $d YDXCP $d DEBSZ
049    $a FISA
050  4 $a QA274.7
082 04 $a 519.233
100 1  $a Piunovskiy, A. B. $3 556639
245 10 $a Examples in Markov Decision Processes $h [electronic resource].
260    $a Singapore : $b World Scientific Publishing Company, $c 2012.
300    $a 1 online resource (308 p.)
490 1  $a Imperial College Press Optimization Series ; $v v.2
500    $a Description based upon print version of record.
500    $a 3.2.3 A non-optimal strategy for which v_x solves the optimality equation
520    $a This invaluable book provides approximately eighty examples illustrating the theory of controlled discrete-time Markov processes. Apart from applications of the theory to real-life problems such as the stock exchange, queues, gambling, and optimal search, the main attention is paid to counter-intuitive, unexpected properties of optimization problems. Such examples illustrate the importance of the conditions imposed in the theorems on Markov Decision Processes. Many of the examples are based upon examples published earlier in journal articles or textbooks, while several others are new. The aim was ...
650  0 $a Markov processes. $3 369778
655  0 $a Electronic books. $2 local $3 336502
830  0 $a Imperial College Press optimization series. $3 556640
856 40 $u http://www.worldscientific.com/worldscibooks/10.1142/P809#t=toc
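The MARC view above uses a flat "tag indicators $code value" text layout. A minimal Python sketch that splits such display lines into tag, indicators, and subfield pairs (a hypothetical helper for this page's layout only; real MARC 21 data is exchanged as ISO 2709 or MARCXML, for which a library such as pymarc would be the usual choice):

    import re

    # A few display lines copied from the record above.
    MARC_TEXT = """
    020    $a 9781848167940 (electronic bk.)
    100 1  $a Piunovskiy, A. B. $3 556639
    245 10 $a Examples in Markov Decision Processes $h [electronic resource].
    856 40 $u http://www.worldscientific.com/worldscibooks/10.1142/P809#t=toc
    """

    def parse_line(line):
        # "245 10 $a Title $h [medium]" -> tag, indicators, [(code, value), ...]
        tag, rest = line[:3], line[3:].strip()
        parts = re.split(r"\$([a-z0-9]) ?", rest)
        indicators = parts[0].strip()
        pairs = list(zip(parts[1::2], (v.strip() for v in parts[2::2])))
        return tag, indicators, pairs

    for raw in MARC_TEXT.splitlines():
        line = raw.strip()
        if line:
            print(parse_line(line))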