The University of Vermont and State Agricultural College.
Lexical mechanics: partitions, mixtures, and context.
Record type:
Bibliographic record - Language material, printed : Monograph/item
Title/Author:
Lexical mechanics: partitions, mixtures, and context.
Author:
Williams, Jake Ryland.
Physical description:
111 p.
Notes:
Source: Dissertation Abstracts International, Volume: 76-08(E), Section: B.
Contained By:
Dissertation Abstracts International 76-08B(E).
Subject:
Applied mathematics.
Subject:
Psychobiology.
Subject:
Linguistics.
ISBN:
9781321666779
Abstract/Summary:
Highly structured for efficient communication, natural languages are complex systems. Unlike in their computational cousins, functions and meanings in natural languages are relative, frequently prescribed to symbols through unexpected social processes. Despite grammar and definition, the presence of metaphor can leave unwitting language users "in the dark," so to speak. This is not problematic, but rather an important operational feature of languages, since the lifting of meaning onto higher-order structures allows individuals to compress descriptions of regularly-conveyed information. This compressed terminology, often only appropriate when taken locally (in context), is beneficial in an enormous world of novel experience. However, what is natural for a human to process can be tremendously difficult for a computer.
Electronic resource:
http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3688273
MARC record:
LDR    04193nam a2200325 4500
001    440968
005    20160422125039.5
008    160525s2015 ||||||||||||||||| ||eng d
020    $a 9781321666779
035    $a (MiAaPQ)AAI3688273
035    $a AAI3688273
040    $a MiAaPQ $c MiAaPQ
100 1  $a Williams, Jake Ryland. $3 630000
245 10 $a Lexical mechanics: partitions, mixtures, and context.
300    $a 111 p.
500    $a Source: Dissertation Abstracts International, Volume: 76-08(E), Section: B.
500    $a Advisers: Peter S. Dodds; Christopher M. Danforth.
502    $a Thesis (Ph.D.)--The University of Vermont and State Agricultural College, 2015.
520    $a Highly structured for efficient communication, natural languages are complex systems. Unlike in their computational cousins, functions and meanings in natural languages are relative, frequently prescribed to symbols through unexpected social processes. Despite grammar and definition, the presence of metaphor can leave unwitting language users "in the dark," so to speak. This is not problematic, but rather an important operational feature of languages, since the lifting of meaning onto higher-order structures allows individuals to compress descriptions of regularly-conveyed information. This compressed terminology, often only appropriate when taken locally (in context), is beneficial in an enormous world of novel experience. However, what is natural for a human to process can be tremendously difficult for a computer.
520    $a When a sequence of words (a phrase) is to be taken as a unit, suppose the choice of words in the phrase is subordinate to the choice of the phrase, i.e., there exists an inter-word dependence owed to membership within a common phrase. This word selection process is not one of independent selection, and so is capable of generating word-frequency distributions that are not accessible via independent selection processes. We have shown in Ch. 2 through analysis of thousands of English texts that empirical word-frequency distributions possess these word-dependence anomalies, while phrase-frequency distributions do not. In doing so, this study has also led to the development of a novel, general, and mathematical framework for the generation of frequency data for phrases, opening up the field of mass-preserving mesoscopic lexical analyses.
520    $a A common oversight in many studies of the generation and interpretation of language is the assumption that separate discourses are independent. However, even when separate texts are each produced by means of independent word selection, it is possible for their composite distribution of words to exhibit dependence. Succinctly, different texts may use a common word or phrase for different meanings, and so exhibit disproportionate usages when juxtaposed. To support this theory, we have shown in Ch. 3 that the act of combining distinct texts to form large 'corpora' results in word-dependence irregularities. This not only settles a 15-year discussion, challenging the current major theory, but also highlights an important practice necessary for successful computational analysis---the retention of meaningful separations in language.
520    $a We must also consider how language speakers and listeners navigate such a combinatorially vast space for meaning. Dictionaries (or, the collective editorial communities behind them) are smart. They know all about the lexical objects they define, but we ask about the latent information they hold, or should hold, about related, undefined objects. Based solely on the text as data, in Ch. 4 we build on our result in Ch. 2 and develop a model of context defined by the structural similarities of phrases. We then apply this model to define measures of meaning in a corpus-guided experiment, computationally detecting entries missing from a massive, collaborative online dictionary known as the Wiktionary.
590    $a School code: 0243.
650  4 $a Applied mathematics. $3 630002
650  4 $a Psychobiology. $3 212273
650  4 $a Linguistics. $3 174558
690    $a 0364
690    $a 0349
690    $a 0290
710 2  $a The University of Vermont and State Agricultural College. $b Mathematical Sciences. $3 630001
773 0  $t Dissertation Abstracts International $g 76-08B(E).
790    $a 0243
791    $a Ph.D.
792    $a 2015
793    $a English
856 40 $u http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3688273
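The MARC view above is a structured record: each field carries a three-character tag, optional one-character indicators, and subfields keyed by single letters ($a, $b, $u, ...). As a minimal sketch in plain Python (no MARC library assumed; the selection of fields and the helper name are illustrative only), such a record can be held as tuples and re-rendered in the same tag/indicator/subfield layout:

# Minimal sketch: hold a few of the fields above as (tag, indicators, subfields)
# tuples and re-render them in the tag / indicator / subfield layout used by
# the catalogue display. Plain Python only; the field selection is illustrative.

FIELDS = [
    ("020", "  ", [("a", "9781321666779")]),
    ("100", "1 ", [("a", "Williams, Jake Ryland.")]),
    ("245", "10", [("a", "Lexical mechanics: partitions, mixtures, and context.")]),
    ("650", " 4", [("a", "Applied mathematics.")]),
    ("710", "2 ", [("a", "The University of Vermont and State Agricultural College."),
                   ("b", "Mathematical Sciences.")]),
    ("856", "40", [("u", "http://pqdd.sinica.edu.tw/twdaoapp/servlet/advanced?query=3688273")]),
]

def format_field(tag, indicators, subfields):
    """Render one field as 'TAG IND $a value $b value ...'."""
    body = " ".join(f"${code} {value}" for code, value in subfields)
    return f"{tag} {indicators} {body}"

for tag, indicators, subfields in FIELDS:
    print(format_field(tag, indicators, subfields))

A real application would normally use an established MARC library rather than hand-rolled tuples; the point here is only the tag/indicator/subfield shape of the record shown above.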
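The second abstract (520) field contrasts independent word selection with selection of whole phrases, where words co-occur because they belong to the same chosen phrase. As a rough, invented illustration of that contrast (not the dissertation's actual model or data; the vocabulary, phrases, and sample sizes below are made up), the following sketch draws tokens both ways and compares the resulting word counts:

# Rough illustration only: word counts from independent word draws vs. draws
# of whole phrases. All data here are invented for the example.
import random
from collections import Counter

random.seed(0)

WORDS = ["new", "york", "city", "state", "of", "mind"]
PHRASES = [("new", "york", "city"), ("state", "of", "mind"), ("new",), ("of",)]

def independent_counts(n_tokens):
    """Each token is drawn on its own, uniformly at random."""
    return Counter(random.choice(WORDS) for _ in range(n_tokens))

def phrase_counts(n_phrases):
    """A whole phrase is drawn at once; all of its words are emitted together."""
    counts = Counter()
    for _ in range(n_phrases):
        counts.update(random.choice(PHRASES))
    return counts

print("independent draws:", independent_counts(1200))
print("phrase draws:     ", phrase_counts(400))

Under the phrase draws, "york" and "city" always appear exactly as often as each other, because they only ever enter through the same phrase; a tie of that kind is the sort of inter-word dependence the abstract says independent selection cannot produce.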