CDMA Handoff between Transmitters
I was curious how the detection quality of CDMA signals at a receiver would change as the receiver moved away from one transmitter toward another (i.e., same frequency, same transmit power, different pseudo-random noise (PRN) codes). Intuitively, when the receiver is halfway between the two transmitters, the received power levels should be equal. Would the signals then interfere with each other and cause both to drop out, or is CDMA robust enough to handle the interference and let both signals be detected? I built a Python model to analyze the situation. It turns out both signals can be detected almost 100% of the time when the receiver is halfway between the transmitters. The receiver can even detect both signals more than 80% of the time when the signal-to-interference ratio (SIR) is -6 dB. Pretty impressive.
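The core of such a model is direct-sequence spreading and correlation despreading: each transmitter multiplies its data bits by its own PRN chip sequence, the receiver sums both waveforms (plus noise), and despreading with one code suppresses the other signal by roughly the spreading factor. Below is a minimal sketch of that idea, not the author's actual model; the spreading factor (127 chips/bit), noise level, and bit count are assumed values I chose for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 127      # chips per bit (spreading factor); assumed value
N_BITS = 200 # bits per trial; assumed value

# Two transmitters: same frequency, independent random PRN codes.
code1 = rng.choice([-1.0, 1.0], size=N)
code2 = rng.choice([-1.0, 1.0], size=N)

def simulate(sir_db, noise_std=0.5):
    """Return the fraction of bits correctly detected for each signal.

    sir_db: power of signal 1 relative to signal 2.
    0 dB models the receiver halfway between the transmitters.
    """
    a1 = 1.0
    a2 = 10 ** (-sir_db / 20)  # interferer amplitude from the SIR
    bits1 = rng.choice([-1.0, 1.0], size=N_BITS)
    bits2 = rng.choice([-1.0, 1.0], size=N_BITS)
    # Spread each bit over N chips, then sum both signals plus noise.
    tx1 = a1 * np.kron(bits1, code1)
    tx2 = a2 * np.kron(bits2, code2)
    rx = tx1 + tx2 + rng.normal(0.0, noise_std, size=N * N_BITS)
    chips = rx.reshape(N_BITS, N)
    # Despread: correlate each bit interval against each PRN code.
    det1 = np.sign(chips @ code1)
    det2 = np.sign(chips @ code2)
    return float(np.mean(det1 == bits1)), float(np.mean(det2 == bits2))

p1, p2 = simulate(0.0)   # equal received power (halfway point)
print(f"SIR 0 dB:  signal 1 = {p1:.2%}, signal 2 = {p2:.2%}")
q1, q2 = simulate(-6.0)  # signal 1 is 6 dB weaker than signal 2
print(f"SIR -6 dB: signal 1 = {q1:.2%}, signal 2 = {q2:.2%}")
```

The key mechanism is in the despreading step: correlating N chips against the matched code yields a peak of size N for the desired bit, while the mismatched code contributes only a cross-correlation term of order sqrt(N), which is why both signals survive at 0 dB. The exact detection rates depend on the spreading factor and noise level chosen, so this sketch will not reproduce the post's figures exactly.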