Abstract
A buffer arrangement on an integrated circuit is disclosed which translates the voltage levels of logic signals without distortion. Signal distortion arises because the low-to-high propagation delay of an individual buffer differs from its high-to-low propagation delay. The arrangement includes two buffers (33, 36) having unknown distorting characteristics, arranged in series-inverting pairs (31, 33; 35, 36). The first buffer (33) generates a predistorted signal from the signal to be translated: each transition is delayed by either the high-to-low or the low-to-high propagation delay of that buffer. The predistorted signal is then inverted (35) and redistorted by the second buffer (36), so that each transition at the output is further delayed by the propagation delay opposite in type to the one that same transition incurred in the first buffer. Every transition in the output signal therefore has a total delay equal to the sum of the high-to-low propagation delay of one buffer and the low-to-high propagation delay of the other. When both buffers are of the same type on the same integrated circuit, their distortion characteristics are equivalent; every transition in the output signal is uniformly delayed, and the output signal is faithful in shape to the input signal.
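The delay-cancellation principle can be illustrated with a minimal behavioral sketch. The model below is not from the patent: the `translate` function, the zero-delay inverter, and the delay values `T_PLH`/`T_PHL` are assumptions chosen only to show that each edge accumulates one high-to-low delay plus one low-to-high delay, so matched buffers delay every transition by the same total.

```python
# Behavioral sketch (illustrative only). An edge is (time, level):
# level 1 = rising edge arriving at the stage input, 0 = falling edge.

def buffer_edge_delay(edge, t_plh, t_phl):
    """Delay one edge through a non-inverting buffer with asymmetric delays."""
    time, level = edge
    # A rising edge sees the low-to-high delay; a falling edge the high-to-low delay.
    return (time + (t_plh if level == 1 else t_phl), level)

def inverter(edge):
    """Idealized zero-delay inverter (assumption): flips the logic level."""
    time, level = edge
    return (time, 1 - level)

# Matched buffers on the same die: identical (assumed) delays, in nanoseconds.
T_PLH, T_PHL = 3.0, 1.0

def translate(edge):
    # First inverting pair (31, 33): invert, then buffer -> predistorted signal.
    e = buffer_edge_delay(inverter(edge), T_PLH, T_PHL)
    # Second inverting pair (35, 36): invert again, then buffer -> redistorted.
    # The edge polarity has flipped, so this stage applies the opposite delay type.
    return buffer_edge_delay(inverter(e), T_PLH, T_PHL)

rising = (10.0, 1)   # rising edge at t = 10 ns
falling = (25.0, 0)  # falling edge at t = 25 ns

# Both edges are delayed by T_PHL + T_PLH = 4.0 ns in total,
# so the output waveform keeps the input's shape.
print(translate(rising))   # -> (14.0, 1)
print(translate(falling))  # -> (29.0, 0)
```

Because each inversion swaps which delay the next buffer applies, the asymmetry of one buffer is cancelled by the equal-but-opposite asymmetry of the other, which is why the scheme requires no knowledge of the actual delay values.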