Notations

A_X:
alphabet associated with variable X
A:
transformation matrix
A(D):
weight enumerator function (WEF)
Ad:
number of codewords with weight d
A(W, Z):
input-redundancy weight enumerator function (IRWEF)
Aw,z:
number of codewords with input weight w and redundancy weight z (total weight w + z)
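To make the weight enumerator notation concrete, here is a minimal sketch (not from the book; the generator matrix shown is one common systematic form of the (7,4) Hamming code, and the function names are illustrative) that enumerates all codewords and tallies Ad, the coefficients of A(D):

```python
import itertools

# Generator matrix G of the (7,4) Hamming code (one common systematic form)
G = [
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 0, 1, 1],
    [0, 0, 1, 0, 1, 1, 1],
    [0, 0, 0, 1, 1, 0, 1],
]

def encode(u):
    # c = u G over GF(2)
    return tuple(sum(u[i] * G[i][j] for i in range(4)) % 2 for j in range(7))

# Ad: number of codewords of Hamming weight d
Ad = {}
for u in itertools.product([0, 1], repeat=4):
    d = sum(encode(u))
    Ad[d] = Ad.get(d, 0) + 1

print(Ad)  # {0: 1, 3: 7, 4: 7, 7: 1}, i.e. A(D) = 1 + 7D^3 + 7D^4 + D^7
```

The smallest nonzero weight in the table, here d = 3, is the minimum distance dmin of the code.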
B:
bandwidth
B:
inverse transformation matrix
c:
codeword
c(p):
polynomial associated with a codeword
C:
capacity in Sh/dimension
C′:
capacity in Sh/s
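The two capacity notations are linked by the standard Shannon expressions for the AWGN channel (assumed here, not stated on this page): C = ½ log2(1 + SNR) in Sh/dimension and C′ = B log2(1 + SNR) in Sh/s. A small sketch with illustrative function names:

```python
from math import log2

def capacity_per_dimension(snr):
    # C = 0.5 * log2(1 + SNR), in Sh/dimension (real AWGN channel)
    return 0.5 * log2(1 + snr)

def capacity_awgn(B, snr):
    # C' = B * log2(1 + SNR), in Sh/s, for an ideal band-limited AWGN channel
    return B * log2(1 + snr)

print(capacity_per_dimension(1023))   # 5.0 Sh/dimension
print(capacity_awgn(3000, 1023))      # 30000.0 Sh/s
```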
D:
dummy variable associated with weight, delay, or distortion, depending on context
D(R):
distortion rate function
DB:
bit rate
DN:
average distortion per dimension
DS:
symbol rate
d:
Hamming distance or Hamming weight
dmin:
minimum distance
e:
error-correction capability
e:
error vector
E[x]:
expectation of the random variable x
Eb:
energy per bit
Es:
energy per symbol
ed:
error-detection capability
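Both capabilities follow from the minimum distance via the standard coding-theory relations e = ⌊(dmin − 1)/2⌋ and ed = dmin − 1 (assumed here, not stated on this page). A minimal sketch with illustrative function names:

```python
def correction_capability(dmin):
    # e: maximum number of errors guaranteed correctable
    return (dmin - 1) // 2

def detection_capability(dmin):
    # ed: maximum number of errors guaranteed detectable
    return dmin - 1

print(correction_capability(3), detection_capability(3))  # 1 2 (Hamming code)
print(correction_capability(7), detection_capability(7))  # 3 6
```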
GF(q):
Galois field with q elements
g(p):
generator polynomial
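For a cyclic code, every codeword polynomial is a multiple of g(p) over GF(2). A minimal sketch (not from the book; polynomials are encoded as integer bit masks, bit i holding the coefficient of p^i, and the function name is illustrative):

```python
def poly_mul_gf2(a, b):
    # Multiply two polynomials over GF(2): shift-and-XOR, no carries
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        b >>= 1
    return r

g = 0b1011  # g(p) = p^3 + p + 1, generator polynomial of the (7,4) cyclic Hamming code
u = 0b1101  # u(p) = p^3 + p^2 + 1, an information word
c = poly_mul_gf2(u, g)
print(bin(c))  # 0b1111111: c(p) = u(p) g(p) is the all-ones codeword
```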
G:
prototype filter
G:
generator matrix
γxx(f):
power spectral density of the random process x
H:
parity check matrix
H(X):
entropy of X
HD(X):
differential entropy of X
I(X; Y):
average mutual information between variables X and Y
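The information measures above can be computed directly from the probability distributions. A minimal sketch (not from the book; function names are illustrative, and the joint distribution shown corresponds to a binary symmetric channel with crossover probability 0.25 and a uniform input):

```python
from math import log2

def entropy(p):
    # H(X) = -sum_x p(x) * log2 p(x), in Sh
    return -sum(pi * log2(pi) for pi in p if pi > 0)

def mutual_information(joint):
    # I(X; Y) = sum_{x,y} p(x,y) * log2( p(x,y) / (p(x) p(y)) )
    px = [sum(row) for row in joint]            # marginal of X (rows)
    py = [sum(col) for col in zip(*joint)]      # marginal of Y (columns)
    return sum(
        pxy * log2(pxy / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0
    )

print(entropy([0.5, 0.5]))  # 1.0 Sh: a fair coin carries one shannon
joint = [[0.375, 0.125], [0.125, 0.375]]  # BSC, crossover 0.25, uniform input
print(mutual_information(joint))
```

For this channel, I(X; Y) = H(Y) − H(Y|X) = 1 − H(0.25), which the sketch reproduces numerically.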
k:
number of bits per information word (convolutional code)
K:
number of symbols per information word (block code)
ni:
noise sample at time i, or length of the i-th message
N:
noise power or number of symbols per codeword
n:
number of bits per codeword (convolutional ...
