Digital Communication 5

Electronic Engineering MCQ Question Papers: ENTC, IT Interview Placement

Subject: Digital Communication 5

Part 5: List of questions and answers on Digital Communication

 

Q1. In the DPSK technique, the bits are encoded using

a) AMI

b) Differential code

c) Unipolar RZ format

d) Manchester format

 

Q2. Synchronization of signals is done using

a) Pilot clock

b) Extracting timing information from the received signal

c) Transmitter and receiver connected to a master timing source

d) All of the above

 

Q3. In coherent detection of signals,

a) A local carrier is generated

b) A carrier with the same frequency and phase as the transmitted carrier is generated

c) The carrier is in synchronization with the modulated carrier

d) All of the above

 

Q4. Impulse noise is caused by

a) Switching transients

b) Lightning strikes

c) Power line load switching

d) All of the above

 

Q5. The probability density function defines

a) Amplitudes of random noise

b) Density of signal

c) Probability of error

d) All of the above

 

Q6. Timing jitter is

a) Change in amplitude

b) Change in frequency

c) Deviation in the location of the pulses

d) All of the above 

 

Q7. ISI may be removed by using

a) Differential coding

b) Manchester coding

c) Polar NRZ

d) None of the above

 

Q8. Overhead bits are

a) Framing and synchronizing bits

b) Data due to noise

c) Encoded bits

d) None of the above

 

Q9. The expected information contained in a message is called

a) Entropy

b) Efficiency

c) Coded signal

d) None of the above

 

Q10. The information I contained in a message with probability of occurrence P is given by (k is a constant)

a) I = k log₂(1/P)

b) I = k log₂P

c) I = k log₂(1/2P)

d) I = k log₂(1/P²)

 

Q11. A memoryless source refers to

a) No previous information

b) No message storage

c) The emitted message is independent of previous messages

d) None of the above

 

Q12. Entropy is

a) Average information per message

b) Information in a signal

c) Amplitude of signal

d) All of the above 

 

Q13. The relation between entropy and mutual information is

a) I(X;Y) = H(X) – H(X/Y)

b) I(X;Y) = H(X/Y) – H(Y/X)

c) I(X;Y) = H(X) – H(Y)

d) I(X;Y) = H(Y) – H(X)

 

Q14. The mutual information

a) Is symmetric

b) Is always non-negative

c) Both a and b are correct

d) None of the above

 

Q15. Information rate is defined as

a) Information per unit time

b) Average number of bits of information per second

c) rH

d) All of the above

 

Q16. The information rate R, for a given average information H = 2.0 bits/message of an analog signal band-limited to B Hz, is

a) 8 B bits/sec

b) 4 B bits/sec

c) 2 B bits/sec

d) 16 B bits/sec

 

Q17. The code rate r, for k information bits out of n total bits, is defined as

a) r = k/n

b) k = n/r

c) r = k * n

d) n = r * k

 

Q18. The technique that may be used to increase average information per bit is

a) Shannon-Fano algorithm

b) ASK

c) FSK

d) Digital modulation techniques 

 

Q19. For a binary symmetric channel, the random bits are given as

a) Logic 1 given by probability P and logic 0 by (1-P)

b) Logic 1 given by probability 1-P and logic 0 by P

c) Logic 1 given by probability P² and logic 0 by 1-P

d) Logic 1 given by probability P and logic 0 by (1-P)²

 

Q20. The channel capacity according to Shannon’s equation is

a) Maximum error-free communication

b) Defined for optimum system

c) Information transmitted

d) All of the above 

 

Part 5: List of questions and answers on Digital Communication

 

Q1. Answer: b
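
Differential coding encodes each bit relative to the previously transmitted bit, which is what lets a DPSK receiver detect data without an absolute phase reference. A minimal sketch in Python; the convention that a 1 repeats the previous level and a 0 inverts it, and the initial reference bit of 1, are assumptions for illustration:

    def differential_encode(bits, ref=1):
        # Each output bit is derived from the previous output bit:
        # repeat it for an input 1, invert it for an input 0.
        out = [ref]  # assumed initial reference bit
        for b in bits:
            out.append(out[-1] if b == 1 else 1 - out[-1])
        return out[1:]

    print(differential_encode([1, 0, 1, 1, 0]))  # [1, 0, 0, 0, 1]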

 

Q2. Answer: d

 

Q3. Answer: d

 

Q4. Answer: d

 

Q5. Answer: a

 

Q6. Answer: c

 

Q7. Answer: a

 

Q8. Answer: a

 

Q9. Answer: a

 

Q10. Answer: a
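
A quick check of answer (a) with k = 1, so that I is measured in bits: a message with probability of occurrence P = 1/8 carries I = log₂(1/P) = log₂8 = 3 bits.

    import math

    P = 1 / 8             # assumed example probability
    I = math.log2(1 / P)  # I = k*log2(1/P) with k = 1
    print(I)              # 3.0 bits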

 

Q11. Answer: c

 

Q12. Answer: a
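
Entropy H is the average information per message: each message contributes its own information log₂(1/P), weighted by its probability of occurrence. A minimal sketch for an assumed four-message source:

    import math

    def entropy(probs):
        # H = sum(P * log2(1/P)) over all messages, in bits/message
        return sum(p * math.log2(1 / p) for p in probs if p > 0)

    print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits/message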

 

Q13. Answer: a
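
A numerical illustration of I(X;Y) = H(X) – H(X/Y) for a binary symmetric channel with equiprobable inputs; the crossover probability p = 0.1 is an assumed example figure, and by symmetry H(X/Y) equals the binary entropy of p in this case:

    import math

    def Hb(p):
        # binary entropy function, in bits
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    p = 0.1                 # assumed crossover probability
    HX = 1.0                # equiprobable binary input: H(X) = 1 bit
    HX_given_Y = Hb(p)      # H(X/Y) for a BSC with uniform input
    print(HX - HX_given_Y)  # I(X;Y) ≈ 0.531 bits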

 

Q14. Answer: c

 

Q15. Answer: d

 

Q16. Answer: b
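
Worked reasoning for answer (b): a signal band-limited to B Hz must be sampled at the Nyquist rate of r = 2B messages per second, so the information rate is R = rH = 2B × 2.0 = 4B bits/sec.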

 

Q17. Answer: a
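
For example, the (7,4) Hamming code carries k = 4 information bits in every n = 7 transmitted bits, giving a code rate r = k/n = 4/7 ≈ 0.57.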

 

Q18. Answer: a
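
The Shannon-Fano algorithm sorts messages by falling probability and repeatedly splits the list where the two halves' total probabilities are closest, prefixing 0 to one half and 1 to the other, so frequent messages get short codewords and the average information per transmitted bit rises. A minimal sketch, with an assumed example source:

    def shannon_fano(probabilities):
        items = sorted(probabilities.items(), key=lambda kv: kv[1], reverse=True)
        codes = {sym: "" for sym, _ in items}

        def split(group):
            if len(group) <= 1:
                return
            total = sum(p for _, p in group)
            running, best_i, best_diff = 0.0, 0, float("inf")
            # find the split point that balances the two halves' probabilities
            for i, (_, p) in enumerate(group[:-1]):
                running += p
                diff = abs(running - (total - running))
                if diff < best_diff:
                    best_diff, best_i = diff, i
            for sym, _ in group[: best_i + 1]:
                codes[sym] += "0"
            for sym, _ in group[best_i + 1:]:
                codes[sym] += "1"
            split(group[: best_i + 1])
            split(group[best_i + 1:])

        split(items)
        return codes

    # assumed example source; output: {'A': '0', 'B': '10', 'C': '110', 'D': '111'}
    print(shannon_fano({"A": 0.5, "B": 0.25, "C": 0.125, "D": 0.125}))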

 

Q19. Answer: a
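
A one-line simulation of answer (a), drawing random bits where logic 1 occurs with probability P and logic 0 with probability (1-P); P = 0.7 is an assumed example value:

    import random

    P = 0.7  # assumed probability of logic 1
    bits = [1 if random.random() < P else 0 for _ in range(10)]
    print(bits)  # e.g. [1, 1, 0, 1, 1, 1, 0, 1, 0, 1]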

 

Q20. Answer: d
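
Shannon's equation C = B log₂(1 + S/N) gives the maximum rate at which information can be transmitted error-free over an optimum system. A quick evaluation with assumed figures:

    import math

    B = 3000    # assumed bandwidth, Hz
    snr = 1000  # assumed linear signal-to-noise ratio (30 dB)
    C = B * math.log2(1 + snr)
    print(round(C))  # 29902 bits/sec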