A comprehensive, theory-based approach to the treatment of text meaning in natural language processing applications.

In Ontological Semantics, Sergei Nirenburg and Victor Raskin introduce a comprehensive approach to the treatment of text meaning by computer. Arguing that the ability to use meaning is crucial to the success of natural language processing (NLP) applications, they depart from the ad hoc approach to meaning taken by much of the NLP community and propose theory-based semantic methods. Ontological semantics, an integrated complex of theories, methodologies, descriptions, and implementations, attempts to systematize ideas about both semantic description as representation and the manipulation of meaning by computer programs. It is built on already coordinated "microtheories" covering such diverse areas as specific language phenomena, processing heuristics, and implementation system architecture, rather than on isolated components requiring future integration. Ontological semantics is constantly evolving, driven by the need to make meaning-manipulation tasks such as text analysis and text generation work. Nirenburg and Raskin have therefore developed a set of heterogeneous methods, each suited to a particular task and coordinated at the level of knowledge acquisition and runtime system architecture in implementations, a methodology that also allows for a variable level of automation in all of its processes.

Nirenburg and Raskin first discuss ontological semantics in relation to other fields, including cognitive science and the AI paradigm, the philosophy of science, linguistic semantics and the philosophy of language, computational lexical semantics, and studies in formal ontology. They then describe the content of ontological semantics, discussing text-meaning representation, static knowledge sources (including the ontology, the fact repository, and the lexicon), the processes involved in text analysis, and the acquisition of static knowledge.