Description:
Automating Linguistics offers an in-depth study of the history of the mathematisation and automation of the sciences of language. In the wake of the first mathematisation of the 1930s, two waves followed: machine translation in the 1950s and the development of computational linguistics and natural language processing in the 1960s. These waves proved pivotal in light of the work on large computerised corpora in the 1990s and the unprecedented technological development of computers and software.

Early machine translation was devised as a war technology originating in the sciences of war, amidst an amalgam of mathematics, physics, logic, the neurosciences, acoustics, and emerging sciences such as cybernetics and information theory. It was intended to provide mass translations for strategic purposes during the Cold War. Linguistics, in turn, did not belong to the sciences of war and played only a minor role in the pioneering machine translation projects.

Comparing the two trends, the present book reveals how the sciences of language gradually integrated the technologies of computing and software, resulting in a second wave of mathematisation of the study of language, which may be called mathematisation-automation. This integration took various shapes depending on cultural and linguistic traditions (the USA, the former USSR, Great Britain and France). By contrast, the work with large corpora in the 1990s, though enabled by the unprecedented development of computing and software, was primarily a continuation of traditional approaches in the language sciences, such as the study of spoken and written texts, lexicography, and statistical studies of vocabulary.
From the Back Cover
This unique volume will appeal to academic and professional researchers, historians, translators, students, and others in the field of linguistics.
About the Author
Jacqueline Léon is a senior researcher emeritus at the Centre National de la Recherche Scientifique (CNRS) in France. After several years of work on natural language processing for discourse analysis, her research turned to conversation analysis and the history of dialogue theories. Since 1992, she has been working at the Laboratoire d’Histoire des Théories Linguistiques (CNRS, Université de Paris) on the history and epistemology of contemporary language sciences.