Book, English, 273 pages, format (W × H): 198 mm × 266 mm, weight: 833 g
Series: Cognitive Technologies
ISBN: 978-981-19-5606-5
Publisher: Springer Nature Singapore
The computational linguists and deep learning researchers who developed word vectors have relied primarily on the ever-increasing availability of large corpora and of computers with highly parallel GPU and TPU compute engines, and their focus is on endowing computers with natural language capabilities for practical applications such as machine translation or question answering. Cognitive linguists investigate natural language from the perspective of human cognition, the relation between language and thought, and questions about conceptual universals, relying primarily on in-depth investigation of language in use.
Although both schools have ‘linguistics’ in their name, there has so far been very limited communication between them, as their historical origins, data collection methods, and conceptual apparatuses are quite different. Vector semantics bridges the gap by presenting a formal theory, cast in terms of linear polytopes, that generalizes both word vectors and conceptual structures: each dictionary definition is treated as an equation, and the entire lexicon as a set of equations mutually constraining all meanings.
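The "definitions as equations" idea can be made concrete with a small sketch. The sketch below is not taken from the book (whose formal theory is stated in terms of linear polytopes); it only illustrates, under the simplifying assumption that a word's vector equals the mean of the vectors of its defining words, how a whole lexicon becomes one linear system solved jointly. The toy lexicon, the primitive words, and the 3-dimensional anchor vectors are all invented for illustration.

# Toy sketch (not the book's algorithm): each dictionary definition becomes a
# linear equation v(word) = mean of v(defining words); the lexicon is solved
# as one system so all meanings constrain each other simultaneously.
import numpy as np

# Invented toy lexicon: each defined word maps to its list of defining words.
lexicon = {
    "puppy":  ["young", "dog"],
    "kitten": ["young", "cat"],
    "dog":    ["animal"],
    "cat":    ["animal"],
}
# Primitive words get fixed anchor vectors; defined words are the unknowns.
primitives = {
    "young":  np.array([1.0, 0.0, 0.0]),
    "animal": np.array([0.0, 1.0, 0.0]),
}

words = list(lexicon)                       # unknowns, in a fixed order
idx = {w: i for i, w in enumerate(words)}
n, d = len(words), 3

# Build A x = b, one equation per definition:
#   v(w) - mean of unknown defining words = mean of primitive defining words
A = np.zeros((n, n))
b = np.zeros((n, d))
for w, definition in lexicon.items():
    i = idx[w]
    A[i, i] = 1.0
    for u in definition:
        if u in idx:                        # defining word is itself defined
            A[i, idx[u]] -= 1.0 / len(definition)
        else:                               # defining word is a primitive
            b[i] += primitives[u] / len(definition)

vectors = np.linalg.solve(A, b)             # mutually constrained solution
for w in words:
    print(w, np.round(vectors[idx[w]], 3))

In this toy system, puppy and kitten end up with the same vector [0.5, 0.5, 0.0], because dog and cat are defined identically here; the point is only that the lexicon is solved as a single system of mutually constraining equations rather than word by word.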
Target audience
Research
Authors/Editors
Subject areas
- Mathematics | Computer Science | Artificial Intelligence | Knowledge-Based Systems, Expert Systems
- Mathematics | Computer Science | Applied Computer Science | Computer Applications in the Humanities and Social Sciences
- Mathematics | Computer Science | Artificial Intelligence | Speech Recognition, Language Processing
- Mathematics | Computer Science | Artificial Intelligence | Machine Learning
- Humanities | Linguistics | Computational Linguistics, Corpus Linguistics
Further information & material
Contents
Preface  vii
1 Foundations of non-compositionality
1.1 Background
1.2 Lexicographic principles
1.3 The syntax of definitions
1.4 The geometry of definitions
1.5 The algebra of definitions
2 From morphology to syntax  23
2.1 Lexical categories and subcategories  23
2.2 Bound morphemes  25
2.3 Relations  30
2.4 Linking  39
2.5 Naive grammar  46
3 Time and space  53
3.1 Space  54
3.2 Time  59
3.3 Indexicals, coercion  62
3.4 Measure  65
4 Negation  69
4.1 Negation in the lexicon  71
4.2 Quantifiers  73
4.3 Negation in compositional constructions  74
4.4 Double negation  77
4.5 Compositional quantifiers  78
4.6 Disjunction  80
4.7 Scope ambiguities  81
4.8 Conclusions  82
5 Valuations  83
5.1 Introduction  83
5.2 The likeliness scale  84
5.3 Naive inference (likeliness update)  86
5.4 Learning  89
5.5 Conclusions  91
6 Modality  93
6.1 The deontic world  93
6.2 Epistemic and autoepistemic logic  93
6.3 Defaults  93
7 Adjectives, gradience, implicature  95
7.1 Adjectives  95
7.2 Gradience  96
7.3 Implicature  96
7.4 The elementary pieces  97
7.5 The mechanism  100
7.6 Memory  103
7.7 Conclusions  104
8 Trainability and real-world knowledge  107
8.1 Proper names  107
8.2 Trainability  109
9 Dynamic embeddings  111
9.1 The internals of dynamic embeddings  111
9.2 Attention and the representation space  111
10 Unaffiliated material  113
4lang  115
References  117
External index  129




