Abstract
A method for operating a time-interleaved analog-to-digital converter for converting an analog input to a digital output, wherein the time-interleaved analog-to-digital converter comprises an array of M sub ADCs (ADC1, ADC2, ..., ADCM), where M is an even integer, and each row of the array comprises one of the M sub ADCs. The method comprises the step of, for every sampling instant n, where n is an integer in a sequence of integers, converting the analog input by means of the sub ADC in row k(n) of the array, wherein 1 ≤ k(n) ≤ M. A value between 1 and M is assigned to k(n) for the first sampling instant, and k(n+1) is selected such that a) k(n+1) > M/2 if k(n) ≤ M/2, otherwise k(n+1) ≤ M/2; b) M/2−1 ≤ |k(n+1)−k(n)| ≤ M/2+1; and c) k(n+1) = k(m+1) if and only if n−m is an integer multiple of M. A time-interleaved analog-to-digital converter operating in accordance with the method is also disclosed.
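The row-selection rule above can be sketched in code. The following is a minimal illustration (not the patent's implementation; the function names `valid` and `find_sequence` are mine) that brute-force searches for a period-M row sequence satisfying the three stated conditions: a) successive samples use opposite halves of the array, b) the row distance between successive samples lies between M/2−1 and M/2+1, and c) the sequence repeats with period M, which holds by construction when one full period is a permutation of rows 1..M.

```python
from itertools import permutations


def valid(seq, M):
    """Check conditions a) and b) over one full period of length M,
    including the wrap-around from the last sample back to the first."""
    for i in range(M):
        k, k_next = seq[i], seq[(i + 1) % M]
        # a) k(n+1) > M/2 if k(n) <= M/2, otherwise k(n+1) <= M/2
        if (k <= M // 2) == (k_next <= M // 2):
            return False
        # b) M/2 - 1 <= |k(n+1) - k(n)| <= M/2 + 1
        if not (M // 2 - 1 <= abs(k_next - k) <= M // 2 + 1):
            return False
    # c) holds by construction: seq is a permutation of 1..M repeated
    # with period M, so k(n+1) = k(m+1) exactly when M divides n - m.
    return True


def find_sequence(M):
    """Exhaustive search for a valid period; practical only for small
    even M (the first row is fixed at 1 without loss of generality)."""
    for rest in permutations(range(2, M + 1)):
        seq = (1,) + rest
        if valid(seq, M):
            return list(seq)
    return None
```

For M = 4 this yields the period [1, 3, 2, 4]: each step hops between the lower half {1, 2} and the upper half {3, 4}, and every consecutive row distance (including the wrap from 4 back to 1) lies in the allowed range 1 to 3.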