Abstract
An image synthesizing system improves the perspective transformation of images by imparting a sense of depth to spatial changes in the surface data of an object. A texture data storage unit stores texture data at positions represented by texture coordinates. An image supply unit outputs texture coordinates, brightness data, attribute data and other data corresponding to the vertex coordinates and vertices of polygons. A processor unit determines rendering data for each dot by subjecting the texture coordinates and brightness data to perspective transformation, linear interpolation and inverse perspective transformation. The resulting rendering data is then mapped to representing coordinates, determined by a main processor, in a field buffer unit. Thereafter, the rendering data is transformed into RGB data by a palette/mixer circuit using color data read out from the texture data storage unit by the texture coordinates, attribute data and brightness data. Thus, image data can be formed.
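The sequence of perspective transformation, linear interpolation and inverse perspective transformation described above is the standard basis of perspective-correct texture mapping: a texture coordinate u and the depth z are transformed to u/z and 1/z, those quantities are interpolated linearly across the polygon, and the texture coordinate is recovered by the inverse transformation. A minimal sketch of this idea for a single coordinate along one interpolation span follows; the function name and parameters are illustrative, not taken from the patent.

```python
def perspective_correct_u(u0, z0, u1, z1, t):
    """Interpolate texture coordinate u at fraction t (0..1) between two
    span endpoints, correcting for perspective.

    Perspective transformation: map each endpoint to (u/z, 1/z).
    Linear interpolation: blend those perspective-space values by t.
    Inverse perspective transformation: divide back to recover u.
    """
    uz = (1.0 - t) * (u0 / z0) + t * (u1 / z1)      # interpolate u/z
    inv_z = (1.0 - t) * (1.0 / z0) + t * (1.0 / z1)  # interpolate 1/z
    return uz / inv_z                                # recover u


# At the screen-space midpoint of a span whose far endpoint (z=3) is
# three times as deep as the near one (z=1), the recovered coordinate
# is 0.25 rather than the naive 0.5: texture near the far end is
# compressed, producing the sense of depth.
print(perspective_correct_u(0.0, 1.0, 1.0, 3.0, 0.5))  # → 0.25
```

Interpolating u directly in screen space would give 0.5 at the midpoint and produce the familiar texture "swimming" artifact; interpolating in the transformed (1/z) space is what gives the spatial change its far-and-near sense.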