Yahoo Search · Web Search

Search Results

  1. 1 day ago · The concept of information entropy was introduced by Claude Shannon in his 1948 paper "A Mathematical Theory of Communication", and is also referred to as Shannon entropy. Shannon's theory defines a data communication system composed of three elements: a source of data, a communication channel, and a receiver.

  2. 4 days ago · Information theory is the mathematical study of the quantification, storage, and communication of information. The field was originally established by the works of Harry Nyquist and Ralph Hartley in the 1920s, and Claude Shannon in the 1940s.

  3. 5 days ago · Information theory, a mathematical representation of the conditions and parameters affecting the transmission and processing of information. Most closely associated with the work of the American electrical engineer Claude Shannon in the mid-20th century, information theory is chiefly of interest to …

    • George Markowsky

  4. 4 days ago · Formulated in 1948 by Claude Shannon and Warren Weaver, this model outlines the sender, message, channel, receiver, and potential noise as steps in a linear process of communication. It emphasises the importance of a clear message and of minimizing noise or interference to ensure successful transmission.

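Result 4's linear pipeline (sender → channel with noise → receiver) can be made concrete with a minimal sketch; the function name transmit, the bit-flip noise model, and the noise_rate parameter below are illustrative assumptions, not anything from the cited page:

    import random

    def transmit(message: str, noise_rate: float = 0.05, seed: int = 0) -> str:
        """Toy Shannon-Weaver pipeline: the sender's message is encoded to bits,
        the channel flips each bit with probability noise_rate (noise), and the
        receiver decodes whatever arrives."""
        rng = random.Random(seed)
        bits = [int(b) for byte in message.encode() for b in format(byte, "08b")]
        noisy = [bit ^ (rng.random() < noise_rate) for bit in bits]
        chunks = ("".join(map(str, noisy[i:i + 8])) for i in range(0, len(noisy), 8))
        return bytes(int(c, 2) for c in chunks).decode(errors="replace")

    # Even modest channel noise corrupts the received message, which is why
    # the model stresses minimizing interference.
    print(transmit("a clear message"))
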
  5. 1 day ago · Researchers like John McCarthy, Marvin Minsky, Nathaniel Rochester, and Claude Shannon laid the groundwork for AI research, coining the term "artificial intelligence." 1960s-1970s - The Rise of Expert Systems: Early AI research focused on creating expert systems that could simulate human expertise in specific domains.

  6. 5 days ago · History. The use of logic gates in digital electronics began in the mid-20th century, when the engineer Claude Shannon applied a concept created by the British mathematician George Boole, using discretized signals to solve problems in the telephony of that era.

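Result 6 describes Shannon applying Boole's two-valued algebra to switching circuits; a minimal sketch of that idea treats gates as Boolean functions on 0/1 signals (the half-adder at the end is a standard textbook composition, not something from the cited page):

    # Logic gates as Boolean functions over discretized (0/1) signals.
    def AND(a: int, b: int) -> int: return a & b
    def OR(a: int, b: int) -> int: return a | b
    def NOT(a: int) -> int: return 1 - a
    def XOR(a: int, b: int) -> int: return a ^ b

    # Gates compose into arithmetic: a half-adder yields a one-bit sum and a carry.
    for a in (0, 1):
        for b in (0, 1):
            print(f"{a} + {b} -> sum={XOR(a, b)}, carry={AND(a, b)}")
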
  7. 1 day ago · In information theory, entropy is a measure of the amount of surprise or uncertainty in an information source. It was introduced by Claude Shannon in 1948 and is defined mathematically as: \[ H(X) = -\sum_{i} P(x_i) \log_2 P(x_i) \]

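The definition in result 7 translates directly into code; a minimal sketch computing H(X) in bits from a probability distribution (the name shannon_entropy is ours, and the coin examples are illustrative):

    import math

    def shannon_entropy(probabilities):
        """H(X) = -sum_i P(x_i) * log2(P(x_i)), in bits. Zero-probability
        outcomes contribute nothing, by the convention 0 * log 0 = 0."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit, maximal uncertainty
    print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.47 bits, less surprise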