• Let a source send messages at an information rate of R bits/second. One can intuitively reason that, for a given communication system, as the information rate increases, the number of errors per second will also increase.

    Shannon's Theorem:
    A given communication system has a maximum rate of information transfer C, known as the channel capacity.

    If the information rate R is less than C, then arbitrarily small error probabilities can be achieved by using intelligent coding techniques.

    To get lower error probabilities, the encoder has to work on longer blocks of signal data. This entails longer delays and higher computational requirements.
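The trade-off above can be sketched with the simplest possible coding technique: a repetition code over a binary symmetric channel. (This is an illustrative example, not the near-capacity codes Shannon's theorem guarantees; the function names and the 10% flip probability are assumptions made for the demo.) As the block length n grows, the decoded error rate falls, at the cost of sending more symbols per message bit:

```python
import random

def transmit_bit(bit, flip_prob):
    """Send one bit through a binary symmetric channel
    that flips it with probability flip_prob."""
    return bit ^ (random.random() < flip_prob)

def send_with_repetition(bit, n, flip_prob):
    """Encode a bit as n copies, decode by majority vote."""
    votes = sum(transmit_bit(bit, flip_prob) for _ in range(n))
    return 1 if votes > n / 2 else 0

def error_rate(n, flip_prob=0.1, trials=20000):
    """Estimate the probability of decoding the wrong bit."""
    errors = sum(send_with_repetition(0, n, flip_prob) != 0
                 for _ in range(trials))
    return errors / trials

random.seed(0)
for n in (1, 3, 5, 9):
    print(f"block length {n}: error rate ~ {error_rate(n):.4f}")
```

With a 10% channel flip probability, the error rate drops from about 0.1 at n = 1 toward roughly 0.001 at n = 9, showing how longer blocks buy lower error probability while increasing delay and computation.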

    If R <= C, then transmission may be accomplished without error even in the presence of noise.
    The converse also holds: if R > C, error-free transmission is not possible.

    Shannon's Equation:

    The maximum data rate of a noisy channel whose bandwidth is B Hz and whose signal-to-noise ratio is S/N is given by

    C = B log2 (1 + S/N)

    Where,
    B = bandwidth in Hz
    S = average signal power in watts
    N = random noise power in watts
