Natural Language Processing – The Big Picture


Introduction

Since I’m personally hungry for big pictures of everything around me, since natural language processing (NLP) is highly important for computers to understand humankind, since many AI-related careers depend on it, and since it is used extensively for commercial purposes, for all those reasons I will give a big picture of it (yes, I mean NLP).

What’s Natural Language Processing?

The simple definition is obvious: making computers understand or generate human text in a certain language. The complete definition, however, is: a set of computational techniques for analyzing and representing naturally occurring texts (at one or more levels) for the purpose of achieving human-like language processing for a range of applications.

NLP can be done for any language, of any mode or genre, and for oral or written texts. It works over multiple types or levels of language processing, starting from the level of understanding a single word up to the level of understanding the big picture of a complete book.

(Figure: A brain with language samples. Image courtesy of MIT OCW.)

Related Sciences?

Linguistics: focuses on formal, structural models of language and the discovery of language universals – in fact the field of NLP was originally referred to as Computational Linguistics.

Computer Science: is concerned with developing internal representations of data and efficient processing of these structures.

Cognitive Psychology: looks at language usage as a window into human cognitive processes, and has the goal of modeling the use of language in a psychologically plausible way.

Language Processing vs. Language Generation

NLP may focus on language processing or language generation. The former refers to the analysis of language for the purpose of producing a meaningful representation, while the latter refers to the production of language from a representation. The task of language processing is equivalent to the role of the reader/listener, while the task of language generation is that of the writer/speaker. While much of the theory and technology is shared by these two divisions, natural language generation also requires a planning capability. That is, the generation system requires a plan or model of the goal of the interaction in order to decide what the system should generate at each point in an interaction.
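To make the distinction concrete, here is a minimal sketch in Python. The function names and the tiny weather representation are invented for illustration; parse_weather_report plays the reader role (text to representation) and generate_weather_report plays the writer role (representation back to text).

    import re

    # Language processing: text -> structured representation (the "reader" role).
    def parse_weather_report(text):
        match = re.search(r"(\d+)\s*degrees in (\w+)", text)
        if not match:
            return None
        return {"temperature": int(match.group(1)), "city": match.group(2)}

    # Language generation: representation -> text (the "writer" role).
    # A real generator would also need a plan of the interaction's goal.
    def generate_weather_report(rep):
        return f"It is {rep['temperature']} degrees in {rep['city']} today."

    rep = parse_weather_report("Sensors report 23 degrees in Cairo.")
    print(rep)                            # {'temperature': 23, 'city': 'Cairo'}
    print(generate_weather_report(rep))   # It is 23 degrees in Cairo today.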

What are its sub-problems?

NLP is performed by solving a number of sub-problems, where each sub-problem constitutes one of the levels mentioned earlier. Note that only a portion of those levels may be applied, not necessarily all of them; for example, some applications require only the first three levels. Also, the levels can be applied in a different order, independent of their granularity.
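As a rough sketch of the idea that an application composes only some of the levels, the snippet below chains three hypothetical level functions; the function names and their stand-in behavior are placeholders for illustration, not a standard API.

    # Hypothetical level functions; each consumes the previous level's output.
    def morphological_analysis(text):
        return text.lower().split()          # stand-in for real tokenization/analysis

    def lexical_analysis(tokens):
        return [(tok, "NOUN" if tok.endswith("tion") else "OTHER") for tok in tokens]

    def syntactic_analysis(tagged_tokens):
        return {"tagged_tokens": tagged_tokens}   # stand-in for a parse tree

    # An application that needs only the first three levels composes just those.
    def pipeline(text, levels):
        result = text
        for level in levels:
            result = level(result)
        return result

    print(pipeline("Preregistration information",
                   [morphological_analysis, lexical_analysis, syntactic_analysis]))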

Level 1 – Phonology: This level is applied only if the text originates from speech. It deals with the interpretation of speech sounds within and across words; the sound of a word or sentence can give a strong hint about its meaning.
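For instance, one common way to reach the phonological level programmatically is to look up a word’s phonemes in the CMU Pronouncing Dictionary, which NLTK ships as a corpus. This sketch assumes nltk is installed and the cmudict corpus has been downloaded.

    # One-time setup: pip install nltk, then nltk.download("cmudict").
    from nltk.corpus import cmudict

    pronunciations = cmudict.dict()

    # Each entry is a list of phoneme sequences; the stress digits (0/1/2)
    # distinguish, for example, the noun "REcord" from the verb "reCORD",
    # which is exactly the kind of meaning hint this level provides.
    print(pronunciations["record"])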

Level 2 – Morphology: Deals with understanding distinct words according to their morphemes (the smallest units of meaning). For example, the word “preregistration” can be morphologically analyzed into three separate morphemes: the prefix “pre”, the root “registra”, and the suffix “tion”.
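A minimal sketch of that kind of analysis, using a hand-written (and deliberately tiny) prefix and suffix list instead of a real morphological analyzer:

    # Toy morphological analyzer: strips one known prefix and one known suffix.
    PREFIXES = ("pre", "un", "re")
    SUFFIXES = ("tion", "ment", "ness", "ing")

    def morphemes(word):
        prefix = next((p for p in PREFIXES if word.startswith(p)), "")
        rest = word[len(prefix):]
        suffix = next((s for s in SUFFIXES if rest.endswith(s)), "")
        root = rest[:len(rest) - len(suffix)] if suffix else rest
        return [m for m in (prefix, root, suffix) if m]

    print(morphemes("preregistration"))   # ['pre', 'registra', 'tion']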

Level 3 – Lexical: Deals with understanding everything about distinct words: their part of speech, their meanings, and their relations to other words.
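Part-of-speech tagging is a typical lexical-level task. The sketch below uses NLTK’s default tokenizer and tagger; it assumes nltk is installed and its tokenizer and tagger models have been fetched with nltk.download().

    import nltk

    # One-time setup: use nltk.download() to fetch the tokenizer and tagger models.
    sentence = "The councilors refused the demonstrators a permit."
    tokens = nltk.word_tokenize(sentence)
    tagged = nltk.pos_tag(tokens)   # word-level information: part of speech per token

    print(tagged)
    # e.g. [('The', 'DT'), ('councilors', 'NNS'), ('refused', 'VBD'), ...]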

Level 4 – Syntactic: Deals with analyzing the words of a sentence so as to uncover the grammatical structure of the sentence.
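A minimal sketch of syntactic analysis using a toy context-free grammar with NLTK’s chart parser; the grammar covers only this one sentence and is purely illustrative.

    import nltk

    # A deliberately tiny grammar, just enough to parse one example sentence.
    grammar = nltk.CFG.fromstring("""
    S  -> NP VP
    NP -> Det N
    VP -> V NP
    Det -> 'the'
    N  -> 'dog' | 'cat'
    V  -> 'chased'
    """)

    parser = nltk.ChartParser(grammar)
    tokens = "the dog chased the cat".split()

    for tree in parser.parse(tokens):
        tree.pretty_print()   # the grammatical structure uncovered at this level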

Level 5 – Semantic: Determines the possible meanings of a sentence by focusing on the interactions among word-level meanings in the sentence. Some people may think this is the level that determines meaning, but in fact all the levels contribute to it.
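Word sense disambiguation is one concrete semantic-level task. Below is a sketch using the simplified Lesk algorithm that ships with NLTK (assumes the wordnet corpus and tokenizer models have been downloaded); note that Lesk picks a sense purely by gloss overlap, so its choice will not always be the intuitive one.

    # One-time setup: use nltk.download() to fetch "wordnet" and the tokenizer models.
    from nltk.wsd import lesk
    from nltk.tokenize import word_tokenize

    sentence = "I went to the bank to deposit money."
    sense = lesk(word_tokenize(sentence), "bank", pos="n")

    # The chosen WordNet synset depends on the overlap between the sentence and
    # each candidate sense's gloss, i.e. on interactions among word meanings.
    print(sense, "-", sense.definition())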

Level 6 – Discourse: Focuses on the properties of the text as a whole that convey meaning by making connections between component sentences.
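One very small slice of discourse-level processing is spotting the connectives that tie sentences together. The sketch below labels each sentence with a toy relation to its predecessor, based on a hand-written connective list; both the list and the relation names are invented for illustration.

    # Toy discourse-relation labeler: looks at the connective that opens a sentence.
    CONNECTIVES = {
        "however": "CONTRAST",
        "therefore": "RESULT",
        "because": "CAUSE",
        "meanwhile": "TEMPORAL",
    }

    def relation_to_previous(sentence):
        first_word = sentence.strip().split()[0].lower().strip(",")
        return CONNECTIVES.get(first_word, "ELABORATION")

    text = ["The council met on Monday.",
            "However, no permits were issued.",
            "Therefore, the demonstration was cancelled."]

    for cur in text[1:]:
        print(repr(cur), "->", relation_to_previous(cur), "(relative to the previous sentence)")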

Level 7 – Pragmatic: Explains how extra meaning is read into texts without actually being encoded in them. This requires a great deal of world knowledge, including an understanding of intentions, plans, and goals. Consider the following two sentences:

  • The city councilors refused the demonstrators a permit because they feared violence.
  • The city councilors refused the demonstrators a permit because they advocated revolution.

The meaning of “they” differs between the two sentences. To figure out the difference, world knowledge stored in knowledge bases and inference modules must be utilized.
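A real system would need an actual knowledge base plus an inference engine; the toy sketch below hard-codes just enough “world knowledge” to resolve the pronoun in the two sentences above, purely to illustrate the idea.

    # Toy world knowledge: which group is plausibly associated with which behaviour.
    WORLD_KNOWLEDGE = {
        "feared violence": "the city councilors",
        "advocated revolution": "the demonstrators",
    }

    def resolve_they(sentence):
        for clue, referent in WORLD_KNOWLEDGE.items():
            if clue in sentence:
                return referent
        return "unknown"

    s1 = "The city councilors refused the demonstrators a permit because they feared violence."
    s2 = "The city councilors refused the demonstrators a permit because they advocated revolution."
    print("'they' in sentence 1 ->", resolve_they(s1))   # the city councilors
    print("'they' in sentence 2 ->", resolve_they(s2))   # the demonstrators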

What are the approaches for performing NLP?

Natural language processing approaches fall roughly into four categories: symbolic, statistical, connectionist, and hybrid. Symbolic and statistical approaches have coexisted since the early days of the field, while connectionist NLP work first appeared in the 1960s.

Symbolic Approach: Symbolic approaches perform deep analysis of linguistic phenomena and are based on explicit representation of facts about language through well-understood knowledge representation schemes and associated algorithms. The primary source of evidence in symbolic systems comes from human-developed rules.
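A minimal illustration of the symbolic style, where the evidence really is a handful of human-written rules, here just a few regular-expression patterns for classifying questions (the categories and patterns are invented for the example):

    import re

    # Human-developed rules are the primary source of evidence in a symbolic system.
    RULES = [
        (re.compile(r"^(who|whom)\b", re.I), "PERSON_QUESTION"),
        (re.compile(r"^where\b", re.I), "LOCATION_QUESTION"),
        (re.compile(r"^when\b", re.I), "TIME_QUESTION"),
    ]

    def classify(question):
        for pattern, label in RULES:
            if pattern.search(question):
                return label
        return "OTHER"

    print(classify("Who wrote this encyclopedia entry?"))    # PERSON_QUESTION
    print(classify("Where is the city council located?"))    # LOCATION_QUESTION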

Statistical Approach: Statistical approaches employ various mathematical techniques and often use large text input to develop approximate generalized models of linguistic phenomena based on actual examples of these phenomena provided by the text input without adding significant linguistic or world knowledge. In contrast to symbolic approaches, statistical approaches use observable data as the primary source of evidence.
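A minimal statistical sketch: a bigram model whose only knowledge is counts gathered from example text. The tiny corpus here is invented, and a real model would need far more data plus smoothing.

    from collections import Counter, defaultdict

    # Tiny illustrative corpus; a real model would be trained on large text input.
    corpus = [
        "the council refused the permit",
        "the council feared violence",
        "the demonstrators wanted the permit",
    ]

    bigram_counts = defaultdict(Counter)
    for sentence in corpus:
        tokens = ["<s>"] + sentence.split() + ["</s>"]
        for prev, cur in zip(tokens, tokens[1:]):
            bigram_counts[prev][cur] += 1

    def next_word_probability(prev, cur):
        total = sum(bigram_counts[prev].values())
        return bigram_counts[prev][cur] / total if total else 0.0

    # The model generalizes only from observed data, not from linguistic rules.
    print(next_word_probability("the", "council"))   # 0.4 (2 of 5 bigrams after "the")
    print(next_word_probability("the", "permit"))    # 0.4 (2 of 5 bigrams after "the")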

Connectionist Approach: Similar to the statistical approaches, connectionist approaches also develop generalized models from examples of linguistic phenomena. What separates connectionism from other statistical methods is that connectionist models combine statistical learning with various theories of representation – thus the connectionist representations allow transformation, inference, and manipulation of logic formulae. In addition, linguistic models are harder to observe in connectionist systems because connectionist architectures are less constrained than statistical ones.
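A sketch of the connectionist flavour: a tiny feed-forward network (one hidden layer, plain gradient descent in NumPy) that learns a toy sentiment task from bag-of-words vectors. The data, vocabulary, and network sizes are all invented for illustration.

    import numpy as np

    # Toy bag-of-words data; vocabulary columns: ["good", "great", "bad", "awful"]
    X = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0],
                  [1, 1, 0, 0],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1],
                  [0, 0, 1, 1]], dtype=float)
    y = np.array([[1], [1], [1], [0], [0], [0]], dtype=float)   # 1 = positive

    rng = np.random.default_rng(0)
    W1 = rng.normal(scale=0.5, size=(4, 5))   # input -> hidden weights
    W2 = rng.normal(scale=0.5, size=(5, 1))   # hidden -> output weights

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Plain gradient descent on a squared-error loss. The "linguistic model"
    # ends up distributed across the learned weights, which is why it is harder
    # to inspect than a symbolic rule set.
    for _ in range(2000):
        hidden = sigmoid(X @ W1)
        out = sigmoid(hidden @ W2)
        grad_out = (out - y) * out * (1 - out)
        grad_hidden = (grad_out @ W2.T) * hidden * (1 - hidden)
        W2 -= 0.5 * (hidden.T @ grad_out)
        W1 -= 0.5 * (X.T @ grad_hidden)

    test = np.array([[1, 0, 0, 1]], dtype=float)   # contains "good" and "awful"
    print(sigmoid(sigmoid(test @ W1) @ W2))        # estimated probability of "positive"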

NLP Applications

Information Retrieval – Information Extraction – Question-Answering – Summarization – Machine Translation – Dialogue Systems

References

Liddy, E. D. Natural Language Processing. In Encyclopedia of Library and Information Science, 2nd Ed. Marcel Dekker, Inc.
