# Principles of Communication (Part 2)
**FREE**

## This course includes:

- Unlimited Duration
- Badge on Completion
- Certificate of Completion

### Description

Principles of Communication (Part 2). Instructor: Prof. Aditya K. Jagannatham, Department of Electrical Engineering, IIT Kanpur.

This course is a sequel to Principles of Communication (Part I) and covers fundamental concepts of communication systems, with a particular focus on modern digital communication. Beginning with the basic theory of digital communication systems (pulse shaping, modulation, and optimal detection), the course covers several important digital modulation techniques, such as Binary Phase Shift Keying (BPSK), Frequency Shift Keying (FSK), Quadrature Amplitude Modulation (QAM), and M-ary Phase Shift Keying (M-PSK). Other fundamental concepts, such as information theory, channel capacity, entropy coding, and error control coding, are dealt with in the later parts of the course. (from **nptel.ac.in**)
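As a taste of the kind of analysis covered in the course, the theoretical bit error rate of BPSK over an AWGN channel is Q(√(2·Eb/N0)), where Q is the Gaussian tail probability. A minimal sketch (the function names and sample SNR values here are illustrative, not part of the course materials):

```python
import math

def q_function(x: float) -> float:
    """Gaussian tail probability Q(x) = P(N(0,1) > x), via the complementary error function."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def bpsk_ber(eb_n0_db: float) -> float:
    """Theoretical BPSK bit error rate over an AWGN channel: Q(sqrt(2 * Eb/N0))."""
    eb_n0 = 10.0 ** (eb_n0_db / 10.0)  # convert dB to a linear ratio
    return q_function(math.sqrt(2.0 * eb_n0))

for snr_db in (0, 4, 8):
    print(f"Eb/N0 = {snr_db} dB -> BER = {bpsk_ber(snr_db):.3e}")
```

As expected, the error rate falls sharply as Eb/N0 increases, which is the trade-off the matched-filter and optimal-detection lectures make precise.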

### Course Curriculum

- Lecture 01 – Introduction to Digital Communication Systems
- Lecture 02 – Spectrum of Transmitted Digital Communication Signal, Wide Sense Stationarity
- Lecture 03 – Spectrum of Transmitted Digital Communication Signal, Autocorrelation Function
- Lecture 04 – Spectrum of Transmitted Digital Communication Signal, Relation to Energy Spectral Density, Introduction to AWGN Channel
- Lecture 05 – Additive White Gaussian Noise (AWGN) Properties, Gaussian Noise and White Noise
- Lecture 06 – Structure of Digital Communication Receiver, Receiver Filter and SNR
- Lecture 07 – Digital Communication Receiver, Noise Properties and Output Noise Power
- Lecture 08 – Digital Communication Receiver, Optimal SNR and Matched Filter
- Lecture 09 – Probability of Error in Digital Communication, Probability Density Functions of Output
- Lecture 10 – Probability of Error in Digital Communication, Optimal Decision Rule and …
- Lecture 11 – Introduction to Binary Phase Shift Keying (BPSK) Modulation
- Lecture 12 – Introduction to Amplitude Shift Keying (ASK) Modulation
- Lecture 13 – Optimal Decision Rule for Amplitude Shift Keying (ASK)
- Lecture 14 – Introduction to Signal Space Concept and Orthonormal Basis Signals
- Lecture 15 – Introduction to Frequency Shift Keying (FSK)
- Lecture 16 – Optimal Decision Rule for Frequency Shift Keying (FSK)
- Lecture 17 – Introduction to Quadrature Phase Shift Keying (QPSK)
- Lecture 18 – Waveforms of Quadrature Phase Shift Keying (QPSK)
- Lecture 19 – Matched Filtering, Bit Error Rate and Symbol Error Rate for QPSK
- Lecture 20 – Introduction to M-ary PAM (Pulse Amplitude Modulation)
- Lecture 21 – M-ary PAM: Optimal Decision Rule and Probability of Error
- Lecture 22 – Introduction to M-ary QAM (Quadrature Amplitude Modulation)
- Lecture 23 – M-ary QAM: Optimal Decision Rule, Probability of Error, Constellation Diagram
- Lecture 24 – Introduction to M-ary PSK, Transmitted Waveform and Constellation Diagram
- Lecture 25 – M-ary PSK: Optimal Decision Rule, Nearest Neighbor Criterion and …
- Lecture 26 – Introduction to Information Theory, Relevance of Information Theory and …
- Lecture 27 – Definition of Entropy, Average Information/Uncertainty of Source, Properties …
- Lecture 28 – Entropy Example: Binary Source, Maximum and Minimum Entropy of Binary Source
- Lecture 29 – Maximum Entropy of Source with M-ary Alphabet, Concave/Convex Functions
- Lecture 30 – Joint Entropy, Definition of Joint Entropy of Two Sources
- Lecture 31 – Properties of Joint Entropy, Relation between Joint Entropy and Marginal Entropies
- Lecture 32 – Conditional Entropy, Example and Properties of Conditional Entropy
- Lecture 33 – Mutual Information, Diagrammatic Representation, Properties of Mutual Information
- Lecture 34 – Examples of Mutual Information
- Lecture 35 – Channel Capacity, Implications of Channel Capacity
- Lecture 36 – Differential Entropy, Example for Uniform Probability Density Function
- Lecture 37 – Differential Entropy of Gaussian Source and Insights
- Lecture 38 – Joint Conditional/Differential Entropies, Mutual Information
- Lecture 39 – Capacity of Gaussian Channel
- Lecture 40 – Capacity of Gaussian Channel: Practical Implications, Maximum Rate in Bits/sec
- Lecture 41 – Introduction to Source Coding and Data Compression, Variable Length Codes
- Lecture 42 – Uniquely Decodable Codes, Prefix-free Code, Instantaneous Code
- Lecture 43 – Binary Tree Representation of Code, Example and Kraft Inequality
- Lecture 44 – Lower Bound on Average Code Length, Kullback-Leibler Divergence
- Lecture 45 – Optimal Code Length, Constrained Optimization and Morse Code Example
- Lecture 46 – Approaching Lower Bound on Average Code Length, Block Coding
- Lecture 47 – Huffman Code, Algorithm, Example and Average Code Length
- Lecture 48 – Introduction to Channel Coding, Rate of Code, Repetition Code, Hamming Distance
- Lecture 49 – Introduction to Convolutional Codes, Binary Field Arithmetic and Linear Codes
- Lecture 50 – Example of Convolutional Code Output, Convolution Operation for Code Generation
- Lecture 51 – Matrix Representation of Convolutional Codes, Generator Matrix
- Lecture 52 – State Diagram Representation of Convolutional Code, State Transitions
- Lecture 53 – Trellis Representation of Convolutional Code, Valid Code Words
- Lecture 54 – Decoding of the Convolutional Code, Minimum Hamming Distance, Maximum Likelihood Codeword Estimate
- Lecture 55 – Principle of Decoding of Convolutional Code
- Lecture 56 – Viterbi Decoder for Maximum Likelihood Decoding of Convolutional Code using Trellis Representation
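The information-theory portion of the curriculum (Lectures 27–28) centers on the entropy of a source; for a binary source with P(1) = p, the entropy is H(p) = −p·log₂p − (1−p)·log₂(1−p), which is maximized at p = 0.5. A short illustrative sketch (not taken from the course materials):

```python
import math

def binary_entropy(p: float) -> float:
    """Entropy H(p) in bits of a binary source with P(1) = p."""
    if p in (0.0, 1.0):
        return 0.0  # a deterministic source carries no information
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

for p in (0.1, 0.5, 0.9):
    print(f"H({p}) = {binary_entropy(p):.4f} bits")
```

The symmetry H(p) = H(1−p) and the unique maximum of exactly 1 bit at p = 0.5 are the "maximum and minimum entropy of binary source" results highlighted in Lecture 28.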

### About the instructor

Prof. Aditya K. Jagannatham, Department of Electrical Engineering, IIT Kanpur.