
XI International Student Scientific Conference "Student Scientific Forum - 2019"

ON INFORMATION THEORY

Синева А.А. 1, Бабаян С.А. 1, Юшкевич Н.С. 1
1ВУНЦ ВВС "ВВА" им. проф. Н.Е. Жуковского и Ю.А. Гагарина (г. Воронеж)

In the age of information technology, the integration of scientific disciplines is producing a new quality of ideas, methods, interaction and tools in many areas of human activity, including the military domain.

The achievements of modern science and technology have made it possible to develop and deploy new means of communication, and information theory arose alongside the design of devices for transmitting and processing information.

This theory is chiefly concerned with systems designed to transmit and process information. Specialists in the field study the quantitative aspects and the capacity of various systems to transmit, store and otherwise process information. Its basic notions are: information source, message, transmitter, receiver, signal, noise, channel, entropy, control, code, model and language.

Keywords: information theory, communication, transformation, signal, language, message.

This paper presents a general overview of the origin and content of information theory.

Modern science and engineering are impossible without the close integration of different scientific branches. Improvements in information-age technologies have changed the quality of communication and interaction among individuals and groups. As a result, the nature of the fog and friction of war is being radically altered. Information-age militaries will be able to generate synergy because they will be better integrated in a number of dimensions.

We are going to dwell upon the notion of information theory as a whole, because nowadays its importance cannot be overestimated.

The interactions among the constituent parts of a control system, and the ensuing transmission of physical (or other) effects from one part to another, can be advantageously regarded as a flow of signals through the system. Thus it comes about that some of the basic notions and principles of cybernetics also play important parts in information theory. Just how closely the two theories are related depends on how broadly the somewhat elastic term "information theory" is interpreted.

The theory, which has been termed cybernetics, formulates the requirements which control systems, intended for different purposes, should satisfy. It provides methods for the theoretical design of various kinds of control systems, and it furnishes criteria for judging the performances of control systems in actual operations. The content of the theory consists essentially of portions of various familiar mathematical and physical subjects (e.g. dynamics, theory of differential equations and probability theory), this material being unified and reformulated for application to the novel problems associated with control systems [1,2].

One of the most prominent features of 20th-century technology is the development and exploitation of new communication media. Concurrent with the growth of devices for transmitting and processing information, a unifying theory was developed and became the subject of intensive research.

This theory, known as communication theory, or, in its broader applications, information theory, is concerned with the discovery of mathematical laws governing systems designed to communicate or manipulate information. It sets up quantitative measures of information and of the capacity of various systems to transmit, store and otherwise process information.

Some of the problems treated relate to finding the best methods of utilizing various available communication systems, the best methods of separating signals from noise, and the problem of setting upper bounds on what it is possible to do with a given channel. While the central results are chiefly of interest to communication engineers, some of the concepts have been adopted and found useful in such fields as psychology and linguistics.
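
As an illustration of such an upper bound, the Shannon-Hartley theorem limits the rate of a channel of bandwidth B and signal-to-noise ratio S/N to C = B log2(1 + S/N). The minimal Python sketch below evaluates this bound; the bandwidth and SNR figures are invented purely for the example.

import math

def capacity(bandwidth_hz, snr):
    # Shannon-Hartley upper bound: C = B * log2(1 + S/N), in bits per second.
    return bandwidth_hz * math.log2(1 + snr)

# A hypothetical telephone-grade channel: 3 kHz of bandwidth at an SNR of 1000 (30 dB)
print(capacity(3_000, 1_000))  # about 29,900 bits per second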

Information is interpreted in its broader sense to include the messages occurring in any of the standard communication mediums such as telegraphy, radio or television, the signal involved in computers, servomechanism systems and other data-processing devices, and even the signals appearing in the nerve networks of animals and human beings. The signal or messages need not be meaningful in any ordinary sense. This theory, then, is quite different from classical communication engineering theory, which is communicated.

Information theory is based largely on the work of C.E. Shannon [3], whose papers published in 1949 form the principal classical reference on the subject. Fundamental to this work is the application of probability theory to communication concepts, and especially the results based on the concept of entropy as a measure of information.
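
To make the entropy measure concrete, the sketch below (Python, with symbol probabilities invented for the example) computes H = -sum(p * log2(p)), the average number of bits per symbol of a discrete source; a skewed source carries less information per symbol than a uniform one.

import math

def entropy(probabilities):
    # Shannon entropy: H = -sum(p * log2(p)), in bits per symbol.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits
print(entropy([0.25] * 4))                 # 2.0 bits, the maximum for four symbols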

Considered abstractly, information is stored or communicated in terms of a representation (such as written symbols, sounds, electrical wave forms, etc.), and the operation of a communication system requires transformations from one representation to another (and usually back again).

A transformation is information-lossless if it is reversible: i.e., if an inverse transformation exists that exactly restores the original representation. Otherwise there is a loss of information in the transformation. The types of transformation of most interest in information theory are called codes.
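
A minimal Python sketch of the distinction (the two-bit codebook is invented for the example): the fixed-length code below has an exact inverse and is therefore information-lossless, whereas a transformation such as lowercasing is not, since distinct originals collapse into one representation.

code = {'a': '00', 'b': '01', 'c': '10', 'd': '11'}
inverse = {v: k for k, v in code.items()}

def encode(message):
    # Transform one representation (letters) into another (bits).
    return ''.join(code[ch] for ch in message)

def decode(bits):
    # Fixed-length codewords make the inverse transformation unambiguous.
    return ''.join(inverse[bits[i:i + 2]] for i in range(0, len(bits), 2))

assert decode(encode('abcd')) == 'abcd'  # reversible: no information is lost
# By contrast, 'AB'.lower() == 'ab'.lower(): no inverse exists, information is lost.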

Thus a theoretical model of a communication system is assumed to consist of a series arrangement of subsystems called, respectively, message source, transmitter (or encoder), channel, receiver (or decoder) and message destination.
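
The series arrangement can be sketched directly. In the Python below, all names and the 8-bit text encoding are choices made for the illustration, not part of the theory; the five subsystems are simply chained, and with noise set above zero, errors appear at the destination.

import random

def source():                         # message source
    return 'hello'

def encoder(message):                 # transmitter: text -> bit string
    return ''.join(f'{ord(ch):08b}' for ch in message)

def channel(bits, noise=0.0):         # channel: each bit may be flipped
    return ''.join(b if random.random() >= noise else '10'[int(b)] for b in bits)

def decoder(bits):                    # receiver: bit string -> text
    return ''.join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))

print(decoder(channel(encoder(source()))))  # 'hello' over a noiseless channel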

Information theory owes its origin to the discovery that the relative expenditure of transmission resources, expressed in terms of the relative frequencies of the ultimate linguistic units such as phonemes or letters, is independent of the content of the message. This enables one to measure the "cost" of transmitting any message, that is, of messages in general. If we speak of that measure as one of information, it refers primarily to the cost of transmitting information.

Apart from this, the implied stability of the relative frequencies of linguistic symbols enables us to make relevant guesses at missing parts of a message, and thus to obtain "information" in a formal sense, that is, information about the linguistic expression of the content and about the efficiency of the linguistic code used.
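
A small Python sketch of this point (the two sample sentences are arbitrary): an entropy estimate built only from the relative frequencies of letters is blind to content, so rearranging the same symbols leaves the "cost" per symbol unchanged.

from collections import Counter
import math

def bits_per_letter(text):
    # Entropy estimated from relative symbol frequencies alone.
    counts = Counter(text)
    n = len(text)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

print(bits_per_letter('the cost of sending a message'))
print(bits_per_letter('a message of the sending cost'))  # same symbols, same estimate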

Information theory was devised and first used for deciphering secret codes (Shannon, Weaver [3]). A basic principle of secret codes is that they should not possess the statistical characteristics of the language.
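
The point can be demonstrated with the weakest possible code. In the Python sketch below (a Caesar shift, chosen only as an illustration), enciphering merely relabels the letters, so the frequency profile of the language survives intact and the code falls to frequency analysis; a sound secret code must destroy exactly this profile.

from collections import Counter

def caesar(text, shift):
    # Shift each lowercase letter by a fixed amount; leave other characters alone.
    return ''.join(chr((ord(c) - 97 + shift) % 26 + 97) if c.islower() else c
                   for c in text)

plain = 'attack at dawn'
cipher = caesar(plain, 3)               # 'dwwdfn dw gdzq'
# The sorted frequency profile is unchanged by the relabelling:
print(sorted(Counter(plain).values()) == sorted(Counter(cipher).values()))  # True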

Language, apart from being a medium or "channel" for expressing thought, or, as we say, for communication, is governed by an intrinsically linguistic principle of duality, such that if a statement is made it provides the occasion for another statement, and so on. Language is thus self-generating, every statement being potentially productive of another. Language not only expresses thought, it also creates thought. This aspect of language was emphasized in an inimitable way by the Viennese writer K. Kraus. In this sense, we may say that it is duality which makes language tick.

Considering that the law of duality is a fundamental law of language, it is clear that duality must permeate all linguistic work, though to differing extents. The more a linguistic expression is meant to serve the purpose of communication, the less duality will be found in it, and vice versa. This is true whether duality is consciously recognized as a factor of language or applied without such express realization.

References

1. Wiener, Norbert. Cybernetics. New York, 1948.

2. Wiener, Norbert. The Human Use of Human Beings: Cybernetics and Society. Boston, 1954.

3. Shannon, Claude E., Weaver, Warren. The Mathematical Theory of Communication. Urbana, 1949.
