A change in the color subcarrier signal that shifts its timing out of phase, so that it occurs at a different instant from the original signal. Since color information is encoded in a video signal as the relation between the color subcarrier phase and the color burst phase, a deviation in the color subcarrier phase results in a change in the image's hue.
In signal processing, phase noise is the frequency-domain representation of random fluctuations in the phase of a waveform, corresponding to time-domain deviations from perfect periodicity ("jitter"). Generally speaking, radio-frequency engineers speak of the phase noise of an oscillator, whereas digital-system engineers work with the jitter of a clock.
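To make the phase-noise/jitter correspondence concrete, here is a minimal sketch (assuming NumPy; the function name and the flat example curve are illustrative, not from the source) of the standard conversion from a single-sideband phase-noise curve ℒ(f) in dBc/Hz to RMS jitter in seconds: the curve is converted to linear power, integrated over the offset band, doubled to account for both sidebands, and the resulting phase variance in rad² is scaled by the carrier's angular frequency.

```python
import numpy as np

def rms_jitter_from_phase_noise(f_offsets, L_dBc_per_Hz, f_carrier):
    """Convert a single-sideband phase-noise curve L(f) to RMS jitter.

    f_offsets     -- offset frequencies from the carrier, in Hz
    L_dBc_per_Hz  -- phase noise at those offsets, in dBc/Hz
    f_carrier     -- carrier (oscillator) frequency, in Hz

    Uses sigma_t = sqrt(2 * integral of 10^(L/10) df) / (2*pi*f_carrier),
    where the factor 2 counts both sidebands.
    """
    L_lin = 10.0 ** (np.asarray(L_dBc_per_Hz, dtype=float) / 10.0)
    phase_var = 2.0 * np.trapz(L_lin, f_offsets)      # phase variance, rad^2
    return np.sqrt(phase_var) / (2.0 * np.pi * f_carrier)

# Example: a flat -100 dBc/Hz floor from 1 kHz to 1 MHz offset
# on a 100 MHz carrier gives roughly 20 ps of RMS jitter.
f = np.linspace(1e3, 1e6, 1000)
L = np.full_like(f, -100.0)
jitter = rms_jitter_from_phase_noise(f, L, 100e6)
```

This is why the two communities can talk past each other: the RF engineer quotes ℒ(f) at a handful of offsets, while the digital engineer cares only about the single integrated jitter number.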
Historically there have been two conflicting yet widely used definitions of phase noise. Some authors define phase noise as the spectral density of the signal's phase only, while others use the phase spectrum (the counterpart of the amplitude spectrum; see the related concepts under spectral density) obtained from spectral estimation of the signal itself. Both definitions yield the same result at offset frequencies well removed from the carrier. At close-in offsets, however, the two definitions differ.
The IEEE defines phase noise as ℒ(f) = Sφ(f)/2 where the "phase instability" Sφ(f) is the one-sided spectral density of a signal's phase deviation. Although Sφ(f) is a one-sided function, it represents "the double-sideband spectral density of phase fluctuation". The symbol ℒ is called a (capital or uppercase) script L.
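The IEEE definition can be applied numerically: recover the instantaneous phase of an oscillator signal, strip the linear carrier term to leave the phase deviation φ(t), estimate its one-sided spectral density Sφ(f), and halve it to obtain ℒ(f). A minimal sketch, assuming SciPy is available (the function name, Hilbert-transform phase recovery, and Welch estimator are implementation choices, not prescribed by the IEEE definition):

```python
import numpy as np
from scipy.signal import hilbert, welch

def phase_noise_lf(signal, fs, nperseg=4096):
    """Estimate L(f) = S_phi(f) / 2 from a real oscillator waveform.

    signal -- sampled real-valued oscillator output
    fs     -- sample rate in Hz
    Returns (f, L) with L in rad^2/Hz (linear, not dBc/Hz).
    """
    # Instantaneous phase via the analytic signal
    phase = np.unwrap(np.angle(hilbert(signal)))
    # Remove the linear carrier term 2*pi*f0*t to leave phi(t)
    t = np.arange(len(signal)) / fs
    phi = phase - np.polyval(np.polyfit(t, phase, 1), t)
    # One-sided spectral density of the phase deviation
    f, s_phi = welch(phi, fs=fs, nperseg=nperseg)
    return f, s_phi / 2.0   # IEEE: L(f) = S_phi(f)/2

# Example: 10 kHz tone at 1 MHz sampling with small white phase noise
fs = 1e6
t = np.arange(200_000) / fs
rng = np.random.default_rng(0)
sig = np.cos(2 * np.pi * 10e3 * t + 0.01 * rng.standard_normal(t.size))
f, L = phase_noise_lf(sig, fs)
```

In practice ℒ(f) is then reported in dBc/Hz as 10·log₁₀ of this linear value.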