If 4D doesn't exist the current generation of video cards might last 40 years.
Monday, April 28, 2003
(This is a continuation of the previous story "If 3d didn't exist the Playstation could have lasted 40 years")
Are computers susceptible to the kind of platform standardization that already exists in movies, music, and the rest of the consumer-goods world? Will there be an outlet for additional power once the 3D graphics realm is adequately conquered? Much as we have been able to do for 10 years now with word processing, can we finally buy one last gaming machine to last a lifetime?
Hold on to your wallet a little while longer there. Computer games and consoles are converging, but they remain different beasts with different gamer tastes, subject to different market forces. Graphics cards can and do improve incrementally in the computer realm without breaking backwards compatibility, which contrasts sharply with the console realm, where anything less than a breathtaking leap forward is met with dismal sales. Free from the resolution limitations of standard TVs, a video card can be incrementally better by providing similar viewing quality at higher resolution, by meeting a higher screen-refresh rate, or by scaling the quality of the gaming environment (a trick not viable in the one-design-fits-all model of consoles). Furthermore, the OpenGL / DirectX model of abstraction provides a convenient middle layer, allowing any card compatible with your version of OpenGL to play the same games whether the standard has been out for 1 year or 50... ensuring compatibility in a way that the consumer-device model does not.
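To make that middle layer concrete, here is a minimal sketch of how a game might probe the driver at startup. It assumes an OpenGL context has already been created (say, through GLUT or SDL), and the 1.2 version check is just an illustrative threshold, not any particular game's requirement:

    /* Minimal sketch: a game probing the OpenGL driver at startup.
       Any card whose driver reports a compatible version can run
       the game, no matter when the silicon itself shipped. */
    #include <stdio.h>
    #include <GL/gl.h>

    int renderer_is_supported(void)
    {
        const char *version  = (const char *)glGetString(GL_VERSION);
        const char *renderer = (const char *)glGetString(GL_RENDERER);
        int major = 0, minor = 0;

        if (version == NULL)
            return 0;  /* no current GL context */

        printf("Driver reports GL %s on %s\n", version, renderer);

        /* The version string begins "major.minor"; a game written
           against GL 1.2 runs on any driver reporting 1.2 or later. */
        sscanf(version, "%d.%d", &major, &minor);
        return (major > 1) || (major == 1 && minor >= 2);
    }

The card's age never enters into it; only the version of the standard its driver speaks does.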
At some point OpenGL and DirectX will "top out," providing all of the functions required for gorgeous graphics given the limitations of current display devices. Has that happened yet? Without realtime raytracing, that is a probable no. Will it happen at some point? History would indicate yes.
When it does happen, and assuming Moore's Law chugs along beyond the 2016 barrier, gaming companies will have to decide how far back along the power curve they are willing to support, rather than how far back along the technical curve. Since supporting such a range under a standardized API would be easy to build, it would be unwise and unnecessary to shut those consumers out of the market. Why refuse a person access to your game simply because they need the rendering distance set to 1 mile instead of 10, or because they see the lowest-detail models in all game encounters?
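A hedged sketch of what that power-curve scaling could look like: the quality tiers below, and the idea of choosing one from a quick startup benchmark score, are my own illustration rather than any real engine's API.

    /* Hypothetical sketch: pick quality settings from a benchmark
       score instead of refusing to run on older hardware at all. */
    #include <stdio.h>

    struct quality_tier {
        const char *name;
        double      draw_distance_miles;
        int         model_detail;      /* 0 = lowest, 3 = highest */
        double      min_fps_required;  /* score needed for this tier */
    };

    static const struct quality_tier tiers[] = {
        { "ultra",  10.0, 3, 200.0 },
        { "high",    5.0, 2, 100.0 },
        { "medium",  2.0, 1,  50.0 },
        { "low",     1.0, 0,   0.0 },  /* everyone gets in */
    };

    const struct quality_tier *pick_tier(double benchmark_fps)
    {
        size_t i;
        for (i = 0; i < sizeof(tiers) / sizeof(tiers[0]); i++)
            if (benchmark_fps >= tiers[i].min_fps_required)
                return &tiers[i];
        return &tiers[3];
    }

    int main(void)
    {
        /* Pretend a mid-range card scored 60 fps on a test scene. */
        const struct quality_tier *t = pick_tier(60.0);
        printf("Selected '%s': %.0f-mile draw distance, LOD %d\n",
               t->name, t->draw_distance_miles, t->model_detail);
        return 0;
    }

The game is the same on every tier; only the fidelity knob moves, so nobody's wallet gets vetoed at the door.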
Despite what I have said above, I do not believe perfect backwards compatibility will be available for many, many years. For one, unlike 2D graphics, the 3D models we have today are sadly lacking in depth, texture, lighting, etc. Glass and other transparencies, reflections, the natural diffusion of light, light bounced onto other surfaces, surface refraction, and similar realistic touches are all missing from today's realtime 3D. The raytracing required to achieve such effects is so computationally expensive that it remains a theoretical milestone in the future of gaming, ensuring that OpenGL 2.0 will not likely be the last iteration of the graphical standard. Furthermore, while the images possible on the current generation of cards are stunning, the physical model behind the image is greatly lacking. Arbitrary deformation as demonstrated in Worms 3D is barely achievable with the current generation of hardware; even basic physical concepts such as moments of impact, rebound, and stretch are very expensive, and fluid dynamics are just beyond modern computers despite their necessity for flight simulation. The interactions of societies of simulated people, the stress models on user-created bridges, and accurate collision detection between basketball player and rim are all important physics problems that cannot be accurately modeled in realtime by the current generation of hardware. Quite frankly, I would not be surprised to see physics or collision-detection acceleration chips cropping up in the near future. Finally, if Moore's Law is to continue beyond 2016, radical changes in the architecture of modern computers will be necessary. Quantum computing would not be simply a jump forward in power above a standard x86 architecture, but a tremendous re-examination of how to control a processor and how to optimize for a different low-level physical process. A new OS and new APIs would be inevitable, along with yearly updates to the optimization routines.
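Some back-of-envelope arithmetic shows why realtime raytracing is so far off. Every constant below is an assumption chosen for illustration, not a measurement:

    /* Rough cost estimate for realtime raytracing; all figures
       are illustrative assumptions, not benchmarks. */
    #include <stdio.h>

    int main(void)
    {
        double width   = 1024.0, height = 768.0; /* display resolution */
        double fps     = 60.0;                   /* realtime target */
        double bounces = 4.0;   /* reflection, refraction, shadow rays */
        double flops_per_ray = 500.0;  /* intersection + shading, rough */

        double rays  = width * height * bounces * fps;
        double flops = rays * flops_per_ray;

        printf("%.2e rays/sec, roughly %.2e FLOPS\n", rays, flops);
        /* ~1.9e8 rays/sec and ~9e10 FLOPS: orders of magnitude
           beyond a current CPU, which manages a few billion. */
        return 0;
    }

Even with generous assumptions, the required throughput dwarfs what today's processors deliver, which is exactly why these effects stay in the offline-rendering world for now.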
At some point the 3D revolution will top out at a process that is "good enough." We are already at a point where the yearly upgrade cycle has cooled into a 5-year process, and cards released today might last up through 2010 or beyond. While the end of this cycle is not here yet, it is important to remember that no consumer device in history has remained on the same rapid-fire upgrade treadmill for very long. Hopefully that will soon stretch to 10 or 20 years per card, rather than the previous 1. Only then will the slick images on the packaging be less important than the contents of the box.
- Chris 1:05 PM