If an AC signal is applied to an ideal A/D converter, the only noise present in the digitized output is quantization error. For an ideal converter, the maximum error for any given input is $\pm\tfrac{1}{2}$ LSb. If a linear ramp signal is applied to the converter input and the output error is plotted for all analog inputs, the result is a sawtooth waveform with a peak-to-peak amplitude of 1 LSb, as shown in the figure below:
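The sawtooth error waveform can be reproduced numerically. The sketch below quantizes a full-scale ramp with an ideal mid-tread quantizer; the resolution (8 bits) and full-scale range (1 V) are illustrative assumptions, not values from the text.

```python
import numpy as np

# Illustrative parameters (assumed, not from the text): 8-bit ideal ADC, 1 V full scale.
N_BITS = 8
FULL_SCALE = 1.0
LSB = FULL_SCALE / 2**N_BITS

# Linear ramp covering the full input range.
ramp = np.linspace(0.0, FULL_SCALE - LSB, 10_000)

# Ideal mid-tread quantizer: round each sample to the nearest code.
codes = np.round(ramp / LSB)
reconstructed = codes * LSB

# The quantization error is a sawtooth bounded by +/- 1/2 LSb.
error = ramp - reconstructed
print(error.min() / LSB, error.max() / LSB)  # close to -0.5 and +0.5
```

Plotting `error` against `ramp` yields the sawtooth described above, with a peak-to-peak swing of 1 LSb.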

The root-mean-square (RMS) amplitude of the error output can be approximated by the equation below.

(1)
\begin{equation} ERROR_{RMS} = \frac{1}{\sqrt{12}} \cdot 1\,\mathrm{LSb} \end{equation}
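This RMS value follows from treating the error as uniformly distributed over $\pm\tfrac{1}{2}$ LSb, and can be checked with a quick Monte Carlo estimate (the sample count and seed below are arbitrary choices):

```python
import numpy as np

# Quantization error modeled as uniform over [-1/2, +1/2) LSb.
e = np.random.default_rng(0).uniform(-0.5, 0.5, 1_000_000)

# Sample RMS should match 1/sqrt(12) ~= 0.2887 LSb.
rms = np.sqrt(np.mean(e**2))
print(rms, 1.0 / np.sqrt(12))
```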

The maximum theoretical signal-to-noise ratio (SNR) of an ADC follows from the RMS quantization error determined above. If a full-scale (FS) sine wave is applied to the input of the ADC, the maximum theoretical SNR is given by the equation below, where N is the resolution of the ADC in bits. This formula assumes that the noise is measured over the entire usable bandwidth of the ADC (0 to fs/2, where fs is the sampling frequency). In the oversampled case, where the signal bandwidth is less than the Nyquist bandwidth, the theoretical SNR within the signal band increases by 3 dB each time fs is doubled.

(2)
\begin{equation} SNR = 6.02 \cdot N + 1.76\,\mathrm{dB} \end{equation}
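The constants 6.02 and 1.76 come from the ratio of the RMS of a full-scale sine wave, $(2^N/2)\,\mathrm{LSb}/\sqrt{2}$, to the RMS quantization noise, $\mathrm{LSb}/\sqrt{12}$. The sketch below (function name is my own) computes this ratio directly and compares it with the approximation:

```python
import math

def max_theoretical_snr_db(n_bits: int) -> float:
    """SNR limit for a full-scale sine into an ideal N-bit ADC."""
    # Signal RMS in LSb: amplitude 2^N/2 divided by sqrt(2).
    signal_rms = (2**n_bits / 2) / math.sqrt(2)
    # Quantization noise RMS in LSb: 1/sqrt(12).
    noise_rms = 1.0 / math.sqrt(12)
    return 20.0 * math.log10(signal_rms / noise_rms)

for n in (8, 12, 16):
    print(n, round(max_theoretical_snr_db(n), 2), round(6.02 * n + 1.76, 2))
```

The exact expression and the 6.02N + 1.76 dB approximation agree to within a few hundredths of a dB for common resolutions.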