In this paper, we study the bit-error-rate (BER) performance of a novel signal detection method in which the signal is modulated by polarization rotation. The method combines two oppositely sensed circularly polarized signals to generate a linearly polarized wave whose spatial tilt angle is determined by the phase difference between the two circularly polarized signals. The transmitted linearly polarized wave passes through a flat-fading channel, where it can be further tilted through an angle θch before being detected by a crossed dipole at the receiver. We also develop an algorithm to estimate the tilt-angle distortion introduced by the channel. Simulation results show the BER performance of the proposed scheme for the two cases in which the channel-induced tilt-angle distortion is (i) accounted for and (ii) not accounted for at the receiver.
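The mapping from phase difference to tilt angle described above can be sketched with Jones vectors: summing right- and left-hand circularly polarized components with phases ±Δφ/2 yields a linearly polarized field tilted by Δφ/2 from the x-axis. The function name and normalization below are illustrative, not taken from the paper:

```python
import numpy as np

def tilt_angle(dphi):
    """Combine RHCP and LHCP unit phasors whose phase difference is dphi
    and return the tilt angle of the resulting linear polarization.

    Jones vectors (illustrative convention):
        RHCP = (1, -1j)/sqrt(2),  LHCP = (1, +1j)/sqrt(2)
    """
    rhcp = np.array([1.0, -1.0j]) / np.sqrt(2)
    lhcp = np.array([1.0, +1.0j]) / np.sqrt(2)
    # Apply equal-and-opposite phases so the combined field is purely real,
    # i.e. a linearly polarized wave.
    e = rhcp * np.exp(1j * dphi / 2) + lhcp * np.exp(-1j * dphi / 2)
    # Tilt of the linear polarization measured from the x-axis: here
    # Ex ∝ cos(dphi/2) and Ey ∝ sin(dphi/2), so the tilt equals dphi/2.
    return np.arctan2(e[1].real, e[0].real)
```

For example, a phase difference of π/2 between the two circular components tilts the linear polarization by π/4, consistent with the half-angle relationship θ = Δφ/2.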