Abstract
This paper is devoted to the study of the performance of the linear minimum mean-square error (LMMSE) receiver for (receive) correlated multiple-input multiple-output (MIMO) systems. From random matrix theory, it is well known that the signal-to-noise ratio (SNR) at the output of this receiver behaves asymptotically like a Gaussian random variable as the numbers of receive and transmit antennas converge to +∞ at the same rate. However, this approximation is inaccurate for estimating certain performance metrics, such as the bit-error rate (BER) and the outage probability, especially for small system dimensions. For this reason, Li et al. convincingly proposed to assume that the SNR follows a generalized gamma distribution whose parameters are tuned by computing the first three asymptotic moments of the SNR. In this paper, this technique is generalized to (receive) correlated channels, and closed-form expressions for the first three asymptotic moments of the SNR are provided. To obtain these results, a random matrix theory technique adapted to matrices with Gaussian elements is used. This technique is believed to be simple, efficient, and of broad interest in wireless communications. Simulations are provided and show that the proposed technique generally yields good accuracy, even for small system dimensions.
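To make the moment-matching idea concrete, the sketch below fits a generalized gamma law to three given SNR moments and then evaluates an outage probability and an uncoded BPSK BER from the fitted law. It is only an illustration of the general approach, not the paper's derivation: the moment values `m1, m2, m3` are placeholders (in the paper they follow from closed-form asymptotic expressions involving the receive correlation matrix), the particular generalized gamma parametrization is an assumption, and the BPSK error formula is a standard example of a metric one might compute.

```python
import numpy as np
from scipy.optimize import fsolve
from scipy.special import gammaln, erfc
from scipy.stats import gengamma
from scipy.integrate import quad

# Placeholder values for the first three asymptotic moments of the output SNR;
# in the paper these come from closed-form expressions depending on the receive
# correlation matrix, the per-antenna SNR and the system dimensions.
m1, m2, m3 = 4.0, 20.0, 120.0

def gg_moment(k, scale, d, p):
    # k-th raw moment of the generalized gamma density
    #   f(x) = p * x**(d-1) * exp(-(x/scale)**p) / (scale**d * Gamma(d/p)),
    # namely E[X**k] = scale**k * Gamma((d+k)/p) / Gamma(d/p).
    return scale**k * np.exp(gammaln((d + k) / p) - gammaln(d / p))

def moment_equations(log_params):
    # Work with log-parameters so the solver keeps scale, d, p positive.
    scale, d, p = np.exp(log_params)
    return [gg_moment(1, scale, d, p) - m1,
            gg_moment(2, scale, d, p) - m2,
            gg_moment(3, scale, d, p) - m3]

# Match the three distribution parameters to the three asymptotic moments.
scale, d, p = np.exp(fsolve(moment_equations, x0=np.zeros(3)))

# scipy's gengamma(a, c) corresponds to a = d/p and c = p in the density above.
snr_law = gengamma(a=d / p, c=p, scale=scale)

# Outage probability: P(SNR < gamma_th) is the fitted CDF at the threshold.
gamma_th = 2.0  # illustrative threshold
p_out = snr_law.cdf(gamma_th)

# Uncoded BPSK BER: average Q(sqrt(2*SNR)) = 0.5*erfc(sqrt(SNR)) over the fitted law.
ber, _ = quad(lambda x: 0.5 * erfc(np.sqrt(x)) * snr_law.pdf(x), 0.0, np.inf)

print(f"fitted (scale, d, p) = ({scale:.3f}, {d:.3f}, {p:.3f})")
print(f"outage probability   = {p_out:.3e}")
print(f"BPSK BER             = {ber:.3e}")
```

With the placeholder moments above, the fit reduces to an ordinary gamma distribution (p = 1, d = 4, scale = 1), which is a convenient sanity check for the solver.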
Original language | English (US) |
---|---|
Pages (from-to) | 4386-4397 |
Number of pages | 12 |
Journal | IEEE Transactions on Information Theory |
Volume | 55 |
Issue number | 10 |
DOIs | |
State | Published - 2009 |
Externally published | Yes |
Keywords
- Bit-error rate (BER)
- Correlated channels
- Gamma approximation
- Large random matrices
- Minimum mean-square error (MMSE)
- Multiple-input multiple-output (MIMO)
- Outage probability
- Signal-to-noise ratio (SNR)
ASJC Scopus subject areas
- Information Systems
- Computer Science Applications
- Library and Information Sciences