Dr Jerzy Rutkowski - page 2


Introduction

  • Politechnika Śląska
  • Information and Coding Theory
Downloads: 0
Views: 980

INTRODUCTION The term "information" is so broad in our civilization that it is impossible to expect a definition of a universal measure of information. In many domains, especially in (tele)communication, the source of information, normally discrete data, and the transmission channel are described by their ...

Kraft inequality

  • Politechnika Śląska
  • Information and Coding Theory
Downloads: 0
Views: 574

Kraft inequality If the integers l1, ..., lN satisfy D^(-l1) + D^(-l2) + ... + D^(-lN) <= 1, then a prefix condition code over an alphabet of size D exists with these integers as codeword lengths. A convenient graphical representation of a code is a tree built of branches connecting nodes, with sequences assigned to nodes, codewords (sequences satisfying...
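
A minimal sketch, not taken from the notes, of checking the Kraft condition numerically; the function names and the example lengths are illustrative assumptions.

# Check the Kraft inequality for a set of proposed codeword lengths
# over an alphabet of size D (illustrative sketch).

def kraft_sum(lengths, D=2):
    """Return the sum of D**(-l) over all codeword lengths l."""
    return sum(D ** (-l) for l in lengths)

def prefix_code_exists(lengths, D=2):
    """A prefix code with these lengths exists iff the Kraft sum is <= 1."""
    return kraft_sum(lengths, D) <= 1

lengths = [1, 2, 3, 3]               # example lengths, chosen arbitrarily
print(kraft_sum(lengths))             # 1.0
print(prefix_code_exists(lengths))    # True

For these example lengths the sum equals exactly 1, so a binary prefix code with lengths 1, 2, 3, 3 exists (e.g. 0, 10, 110, 111).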

Markov source

  • Politechnika Śląska
  • Information and Coding Theory
Downloads: 0
Views: 679

Markov source A discrete memoryless source generates successive messages (symbols) randomly, independently of the source history. If such a source's probabilistic description depends on the time origin, the source is called non-stationary. A discrete source is...
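
A small sketch, with assumed transition probabilities, that contrasts a Markov source with a memoryless one: here the distribution of the next symbol depends only on the current symbol.

import random

# Two-state Markov source: the next symbol's distribution depends only on
# the current symbol (transition probabilities assumed for illustration).
TRANSITIONS = {
    "A": {"A": 0.9, "B": 0.1},
    "B": {"A": 0.5, "B": 0.5},
}

def markov_sequence(n, start="A", seed=0):
    rng = random.Random(seed)
    state, out = start, []
    for _ in range(n):
        out.append(state)
        r, acc = rng.random(), 0.0
        for nxt, p in TRANSITIONS[state].items():
            acc += p
            if r < acc:
                state = nxt
                break
    return "".join(out)

print(markov_sequence(20))   # e.g. a run-heavy string of A's and B's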

Measurement strategy

  • Politechnika Śląska
  • Information and Coding Theory
Downloads: 14
Views: 931

Measurement strategy From all possible combinations of source data involved in the measurement/test, select the combination that gives the maximum mutual information and, at the same time, the minimum information loss. For the optimum combination, all results should be equally probable (see P1). Repeat thi...
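
An illustrative sketch of the selection rule described above: among candidate measurements, pick the one whose outcomes are closest to equiprobable, i.e. whose result entropy is largest. The partitions, probabilities and function names are assumptions made up for the example.

import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def best_measurement(state_probs, candidate_partitions):
    """Each partition groups source-state indices by measurement outcome;
    return the partition whose outcome distribution has maximum entropy."""
    def result_entropy(partition):
        return entropy([sum(state_probs[i] for i in group) for group in partition])
    return max(candidate_partitions, key=result_entropy)

state_probs = [0.25, 0.25, 0.25, 0.25]   # four equally likely source states
candidates = [
    [[0], [1, 2, 3]],    # unbalanced yes/no test: about 0.81 bit
    [[0, 1], [2, 3]],    # balanced yes/no test:   1.00 bit
]
print(best_measurement(state_probs, candidates))   # -> [[0, 1], [2, 3]]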

Measurement test program development - problem statement

  • Politechnika Śląska
  • Information and Coding Theory
Downloads: 0
Views: 441

Measurement test program development - problem statement A measuring device gives a certain number of possible readings, and it is assumed that the measurement/test is performed with no misinformation. The measurement/test has to be repeated k times, to fully compensate the source entropy by the acquired mutual informat...
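
A short sketch of the repetition-count arithmetic, under the assumption that each error-free measurement acquires at most I bits of mutual information, so compensating a source entropy of H bits requires at least k = ceil(H / I) repetitions; the numbers below are illustrative.

import math

def repetitions_needed(source_entropy_bits, info_per_test_bits):
    """Smallest k with k * I >= H (assumed interpretation of the rule above)."""
    return math.ceil(source_entropy_bits / info_per_test_bits)

H = math.log2(12)    # e.g. 12 equally likely source states
I = math.log2(3)     # e.g. a device with 3 equally likely readings
print(repetitions_needed(H, I))   # -> 3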

Mutual information of a Discrete Memoryless Channel

  • Politechnika Śląska
  • Information and Coding Theory
Downloads: 7
Views: 455

Mutual information of a Discrete Memoryless Channel (DMC) For the given set of input messages X = {x1, ..., xM} and set of output messages Y = {y1, ..., yN}, a Discrete Memoryless Channel (DMC) is defined by its probabilistic model (Fig. 3.1), the MN transition (conditional) probabiliti...
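
A sketch, not from the notes, that computes I(X;Y) of a DMC from an assumed input distribution and transition matrix; the binary symmetric channel with crossover probability 0.1 is an arbitrary example.

import math

def mutual_information(p_x, p_y_given_x):
    """I(X;Y) in bits for input distribution p_x and transition matrix p(y|x)."""
    p_y = [sum(p_x[i] * p_y_given_x[i][j] for i in range(len(p_x)))
           for j in range(len(p_y_given_x[0]))]
    info = 0.0
    for i, px in enumerate(p_x):
        for j, pyx in enumerate(p_y_given_x[i]):
            if px > 0 and pyx > 0:
                info += px * pyx * math.log2(pyx / p_y[j])
    return info

p_x = [0.5, 0.5]                 # uniform input distribution
P = [[0.9, 0.1],                 # binary symmetric channel, crossover 0.1
     [0.1, 0.9]]
print(mutual_information(p_x, P))   # about 0.531 bit

For the uniform input this evaluates to 1 - H(0.1), roughly 0.531 bit per channel use.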

Optimum encoding

  • Politechnika Śląska
  • Information and Coding Theory
Downloads: 7
Views: 735

Optimum encoding (Huffman code) For any given source of elements (messages), an optimum binary code exists in which the two least likely codewords have the same length and differ only in the last bit, one ending in 0 and the other ending in 1. Shorter codewords encode more likely source elements, i...
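
A compact sketch of binary Huffman construction for an assumed message distribution; it only illustrates the property quoted above (the two least likely messages receive equal-length codewords differing in the last bit) and is not claimed to reproduce the notes' own algorithm.

import heapq

def huffman_code(probabilities):
    """probabilities: dict message -> probability; returns dict message -> codeword."""
    heap = [(p, i, {m: ""}) for i, (m, p) in enumerate(probabilities.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)   # least likely subtree -> prefix '0'
        p1, _, c1 = heapq.heappop(heap)   # second least likely  -> prefix '1'
        merged = {m: "0" + w for m, w in c0.items()}
        merged.update({m: "1" + w for m, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, counter, merged))
        counter += 1
    return heap[0][2]

print(huffman_code({"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.10}))
# -> {'a': '0', 'b': '10', 'c': '111', 'd': '110'}: the two least likely
#    messages get length-3 codewords differing only in the last bit.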

Parity check matrix

  • Politechnika Śląska
  • Information and Coding Theory
Downloads: 28
Views: 840

Parity check matrix An (n, k) linear code can be uniquely defined by a system of m = n - k linear equations that express the parity check bits in terms of the information bits. It is normally assumed that the weight of each codeword is even; such a code is called an even-parity block code. The parity-check matrix...
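
An illustrative sketch of using a parity-check matrix for syndrome computation; the (7, 4) Hamming code below is a standard example chosen for the demonstration, not necessarily the code discussed in the notes.

# Parity-check matrix H of the (7,4) Hamming code; arithmetic is modulo 2.
H = [
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def syndrome(received):
    """Return H * r^T over GF(2); an all-zero syndrome means no error detected."""
    return [sum(h * r for h, r in zip(row, received)) % 2 for row in H]

codeword = [1, 0, 1, 1, 0, 1, 0]    # a valid codeword of this code
print(syndrome(codeword))            # [0, 0, 0]
received = [1, 0, 1, 1, 0, 0, 0]     # same word with one bit flipped
print(syndrome(received))            # nonzero syndrome flags the error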

Reed-Muller codes

  • Politechnika Śląska
  • Information and Coding Theory
Downloads: 0
Views: 553

Reed-Muller codes Reed-Muller codes form an important class of cyclic multiple-error-correcting codes. For any integers p and r, with 0 <= r <= p, there exists a binary RM code of the r-th order, denoted RM(r, p), with the following parameters: block length n = 2^p, length of information word k = C(p,0) + C(p,1) + ... + C(p,r), minimum dista...
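
A one-function sketch that evaluates the RM(r, p) parameters quoted above; the function and variable names are illustrative.

from math import comb

def rm_parameters(r, p):
    n = 2 ** p                                   # block length
    k = sum(comb(p, i) for i in range(r + 1))    # information word length
    d = 2 ** (p - r)                             # minimum distance
    return n, k, d

print(rm_parameters(1, 3))   # RM(1, 3): (8, 4, 4), the extended Hamming code
print(rm_parameters(2, 4))   # RM(2, 4): (16, 11, 4)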

Shortened linear code

  • Politechnika Śląska
  • Information and Coding Theory
Downloads: 0
Views: 616

Shortened linear code If the number of information bits l required to encode M messages, i.e. the smallest integer l that satisfies the inequality 2^l >= M, is less than the given code's information part length k, then an (n, k) linear code can be shortened to an (n - j, k - j) code, where j = k - l. This can be done by omitting j information bits. They c...
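
A small sketch of the shortening arithmetic under the stated rule (smallest l with 2^l >= M, then omit j = k - l information bits); the (15, 11) code in the example is an arbitrary choice.

import math

def shorten(n, k, M):
    """Return the (n - j, k - j) parameters of the shortened code."""
    l = math.ceil(math.log2(M))    # smallest l with 2**l >= M
    j = k - l                      # information bits to omit
    if j < 0:
        raise ValueError("k is too small to encode M messages")
    return n - j, k - j

print(shorten(15, 11, 100))   # 2**7 >= 100, so j = 4 -> an (11, 7) code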