Unlimited Graphics Power

Started by zappaDPJ, Apr 10, 2010, 22:10:41

Rik

A plate works for me, Steve. :)
Rik
--------------------

This post reflects my own views, opinions and experience, not those of IDNet.

Steve

I was thinking more along the lines of something beefy. In fact, a homemade burger and a poached egg.
Steve
------------
This post reflects my own views, opinions and experience, not those of IDNet.

Rik

Ah. We had DR burgers in a bun for lunch - they really are very good.
Rik
--------------------

This post reflects my own views, opinions and experience, not those of IDNet.

Steve

Perhaps we should add a postfix to the recent posts - "if you get the picture" :whistle:
Steve
------------
This post reflects my own views, opinions and experience, not those of IDNet.

Rik

Rik
--------------------

This post reflects my own views, opinions and experience, not those of IDNet.

Technical Ben

#30
It's all current and old technology, but it DOES use stupid amounts of processing power, both CPU and GPU. Not that it couldn't be done, but there are currently too many limits on this type of engine. One being that animating anything three-dimensional is an immense task, mainly because you have to move ALL of those "atoms". Do you want to do that by hand? :eek4: But for scenery and ground it's quite a good technique. Physics is also possible, but then you're having to calculate thousands of points again.
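To put rough numbers on the "stupid amounts of processing power" point, here's a minimal back-of-the-envelope sketch in C++. The grid sizes, the one-byte-per-voxel storage and the 10^9 voxel-updates-per-second rate are all illustrative assumptions, not measurements of any real engine:

[code]
#include <cstdint>
#include <cstdio>

// Illustrative only: a dense voxel grid at N^3 resolution with one byte of
// material ID per voxel. Shows how quickly "moving all those atoms" every
// frame blows up in memory and work as the resolution rises.
int main() {
    const int resolutions[] = {256, 512, 1024, 2048};
    for (int n : resolutions) {
        const std::uint64_t voxels = static_cast<std::uint64_t>(n) * n * n;
        const double mebibytes = voxels / (1024.0 * 1024.0);   // 1 byte per voxel
        // Assumed update rate of 1e9 voxels/second, purely for illustration.
        const double secondsPerPass = static_cast<double>(voxels) / 1e9;
        std::printf("%5d^3 grid: %13llu voxels, %8.0f MiB, %5.2f s per full update\n",
                    n, static_cast<unsigned long long>(voxels),
                    mebibytes, secondsPerPass);
    }
    return 0;
}
[/code]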
The Wiki knows all.
And to see the real power of Voxels, check out Zbrush.

These are all done with real "atoms" or voxels. So it's billions of polys if converted.

I used to have a signature, then it all changed to chip and pin.

esh

Real or not, trying to get the world's programmers to shift to a radically different API will be... challenging.
CompuServe 28.8k/33.6k 1994-1998, BT 56k 1998-2001, NTL Cable 512k 2001-2004, 2x F2S 1M 2004-2008, IDNet 8M 2008 - LLU 11M 2011

Technical Ben

Not always. Well, I am using CryEngine 3 as an example, but that's compiled with their own compiler/programming suite, so I guess it doesn't count. It is programmed from the ground up, though, so it offers really great performance.
I used to have a signature, then it all changed to chip and pin.

esh

Well, I was discussing the low-level API that interfaces with the graphics card more than the game engine. I am reminded of the days of the Unreal 1 engine, where Epic/DE had to write and provide back-end renderers for D3D, OpenGL, S3 MeTaL and the old-fashioned software renderer. It was predictably a nightmare.
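For anyone curious what "write and provide back-end renderers" tends to look like, here's a generic sketch of the usual pattern: an abstract device interface with one subclass per API, so the rest of the engine never cares which back end is running. This is not Unreal's actual renderer interface, just the common shape of the idea:

[code]
#include <cstdio>
#include <memory>

struct Mesh { /* engine-side geometry, details omitted */ };

// Abstract device the engine talks to; each graphics API gets a subclass.
class RenderDevice {
public:
    virtual ~RenderDevice() = default;
    virtual void beginFrame() = 0;
    virtual void drawMesh(const Mesh& mesh) = 0;
    virtual void endFrame() = 0;   // present / swap buffers
};

class D3DDevice : public RenderDevice {
public:
    void beginFrame() override { std::puts("D3D: begin frame"); }
    void drawMesh(const Mesh&) override { std::puts("D3D: draw"); }
    void endFrame() override { std::puts("D3D: present"); }
};

class SoftwareDevice : public RenderDevice {
public:
    void beginFrame() override { std::puts("SW: clear framebuffer"); }
    void drawMesh(const Mesh&) override { std::puts("SW: rasterise"); }
    void endFrame() override { std::puts("SW: blit to screen"); }
};

int main() {
    // Game code only ever sees RenderDevice, so each extra back end
    // (OpenGL, MeTaL, ...) is another subclass rather than a rewrite.
    std::unique_ptr<RenderDevice> device = std::make_unique<D3DDevice>();
    Mesh level;
    device->beginFrame();
    device->drawMesh(level);
    device->endFrame();
    return 0;
}
[/code]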

The only two engines I've used for any length of time were the (old) Torque engine and Unreal 3 more recently, though I find U3 doesn't handle absolutely massive areas very well, even if they are quite sparsely detailed.
CompuServe 28.8k/33.6k 1994-1998, BT 56k 1998-2001, NTL Cable 512k 2001-2004, 2x F2S 1M 2004-2008, IDNet 8M 2008 - LLU 11M 2011

vitriol

The best Unreal Tournament ever ran for me was when I had a 3dfx Voodoo 3 3000 graphics card using the Glide API. It was just magic. Although using OpenGL (with high-resolution textures), lots of anti-aliasing and anisotropic filtering looked mint too (Radeon 8500, 9600pro, 9800pro and Nvidia 8800GTX).

zappaDPJ

The Unreal engine was the bane of my life for many years. Anyone familiar with BSP mesh holes will know why  :rant2: It spawned some fun games though  :)
zap
--------------------

This post reflects my own views, opinions and experience, not those of IDNet.

esh

Crikey, how could I forget the Glide renderer? That was all U1 ran on originally. In fact, when I moved up to a GeForce DDR card, I ran U1 and UT99 on Glide using a Glide->D3D wrapper and it was faster than using the D3D renderer.
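The wrapper trick is conceptually simple: ship a library that exports the same entry points the game already imports, then translate each call into whatever the host API understands. A toy sketch of that idea is below; the function names are made-up placeholders, not the real Glide exports:

[code]
#include <cstdio>

// Stand-ins for whatever the wrapper actually targets (D3D, OpenGL, ...).
namespace host {
    void drawTriangles(int count) { std::printf("host API: draw %d tris\n", count); }
    void present()                { std::puts("host API: present back buffer"); }
}

extern "C" {
    // The game calls these thinking it is talking to 3dfx Glide; the
    // wrapper quietly forwards the work to the host API.
    // (Names are illustrative, not the actual Glide entry points.)
    void wrapperDrawTriangle() { host::drawTriangles(1); }
    void wrapperBufferSwap()   { host::present(); }
}

int main() {
    // Roughly what a frame of the game's render loop becomes via the shim.
    wrapperDrawTriangle();
    wrapperBufferSwap();
    return 0;
}
[/code]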

Times have changed. Fortunately. But I do miss my Voodoo2.
CompuServe 28.8k/33.6k 1994-1998, BT 56k 1998-2001, NTL Cable 512k 2001-2004, 2x F2S 1M 2004-2008, IDNet 8M 2008 - LLU 11M 2011