Unlimited Graphics Power

Started by zappaDPJ, Apr 10, 2010, 22:10:41

zappaDPJ

If you play computer games you are probably aware of the limitations imposed by polygon-driven graphics. Even though graphics processing power takes a leap of roughly 20% every year, allowing objects to be covered with smaller polygons and more detail, the result is still very poor geometry.

Early this month I became aware of a company claiming to be able to deliver unlimited graphics power without polygons using current graphics hardware. For the technically minded, they claim to be using a three-dimensional point cloud search algorithm. I thought at the time it was an April fool and I'm still not entirely convinced it isn't. It also appears that one of the two big players in the graphics market has dismissed it.

You can judge for yourself whether you think it's for real, but if it is, it'll offer the computer graphics market a 1000%+ hike in graphics processing power using current technology.

http://unlimiteddetailtechnology.com/

http://www.youtube.com/watch?v=THaam5mwIR8&NR=1
http://www.youtube.com/watch?v=l3Sw3dnu8q8&feature=related

zap
--------------------

This post reflects my own views, opinions and experience, not those of IDNet.

Steve

Well, it's sort of plausible, but it would not be good news for graphics card manufacturers.
Steve
------------
This post reflects my own views, opinions and experience, not those of IDNet.

D-Dan

Who are they trying to kid? Even employing the method that they claim (which is highly questionable in itself for several reasons), the most powerful computer and graphics chip in the world couldn't do what they claim.

Remember, the graphics chip would still have to render every pixel, so even if the theory were possible, unlimited is a word that simply cannot apply.
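D-Dan's per-pixel point can be put in rough numbers. A back-of-envelope sketch in Python (the resolution, frame rate and per-lookup cycle count are all my own illustrative assumptions, not figures from the company or any hardware vendor):

```python
# Even "just one point-cloud lookup per pixel" adds up quickly.
width, height, fps = 1920, 1080, 60
pixels_per_second = width * height * fps          # 124,416,000

# Assume each pixel costs ~200 CPU cycles for a hypothetical
# cache-friendly tree search; this figure is a pure guess.
cycles_per_lookup = 200
ghz_equivalent = pixels_per_second * cycles_per_lookup / 1e9

print(pixels_per_second)   # 124416000
print(ghz_equivalent)      # 24.8832 "GHz" of search work per second
```

Even with generous assumptions, the per-pixel time budget is tiny, so the whole claim hinges on how cheap a single search can be made.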

Steve
Have I lost my way?



This post doesn't necessarily represent even my own opinions, let alone anyone else's

zappaDPJ

But that's surely the point: the graphics chip only has to render every pixel, which is something a 10-year-old PC with the most basic graphics card can do with ease. My understanding is that the graphics processor does almost nothing at all; it effectively just pokes values into the screen map.

What they seem to be implying is that the CPU is doing all the work by pulling only the necessary data from a file using search criteria. That part I can almost believe. What I find suspect about the whole thing are the two things that always create a bottleneck in any graphics rendering system, namely perspective and hidden-surface removal.
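For what it's worth, a minimal toy sketch of what such a per-pixel search might look like (my own construction, not the company's actual algorithm): for each screen pixel, cast a ray and keep the nearest point lying close to that ray. Picking the nearest point along the ray also performs hidden-surface removal for free, since nearer points win.

```python
import math

def render(points, width, height, fov=90.0):
    """Toy per-pixel point search: camera at the origin looking down +z.

    points: list of (x, y, z, colour). For each pixel we cast a ray and
    pick the nearest point within a small angular tolerance of the ray.
    Nearest-along-ray automatically removes hidden surfaces.
    """
    half = math.tan(math.radians(fov) / 2)
    image = [[None] * width for _ in range(height)]
    for py in range(height):
        for px in range(width):
            # Ray direction through this pixel (normalised).
            rx = (2 * (px + 0.5) / width - 1) * half
            ry = (1 - 2 * (py + 0.5) / height) * half
            norm = math.sqrt(rx * rx + ry * ry + 1)
            dx, dy, dz = rx / norm, ry / norm, 1 / norm
            best_dist, best_col = float("inf"), None
            for (x, y, z, col) in points:
                t = x * dx + y * dy + z * dz          # distance along ray
                if t <= 0 or t >= best_dist:
                    continue                          # behind camera, or farther
                # Perpendicular distance from the point to the ray.
                ox, oy, oz = x - t * dx, y - t * dy, z - t * dz
                if math.sqrt(ox * ox + oy * oy + oz * oz) < 0.05 * t:
                    best_dist, best_col = t, col
            image[py][px] = best_col
    return image

# Two points on the same ray: the nearer (red) one should win.
img = render([(0, 0, 5, "red"), (0, 0, 10, "blue")], 3, 3)
print(img[1][1])  # red
```

This brute force version visits every point for every pixel; any serious implementation would need a spatial index (an octree or similar) so each pixel's search touches only a tiny fraction of the data.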
zap
--------------------

This post reflects my own views, opinions and experience, not those of IDNet.

Rik

Poor use of capitals and layout errors make me suspicious of this.
Rik
--------------------

This post reflects my own views, opinions and experience, not those of IDNet.

Gary

What about the physics related to unlimited polygons, and interaction? That would surely take huge resources.

Damned, if you do damned if you don't

Gary

Quote from: Rik on Apr 11, 2010, 10:03:30
Poor use of capitals and layout errors make me suspicious of this.
I agree, Rik. This sentence is a good example: "We are in the process of Negotiating to get the Commercial version of Unlimited Details SDK built"
Damned, if you do damned if you don't

Rik

That was the one that really caught my eye, Gary.
Rik
--------------------

This post reflects my own views, opinions and experience, not those of IDNet.

Gary

Quote from: Rik on Apr 11, 2010, 10:19:19
That was the one that really caught my eye, Gary.
If I were presenting a radical new technology, I'd sure as hell get my sentence structure sorted out. Also, as has been pointed out, there are no animations in the YouTube videos, which is a bit suspicious.
Damned, if you do damned if you don't

Rik

It doesn't 'feel' right, does it?
Rik
--------------------

This post reflects my own views, opinions and experience, not those of IDNet.

Gary

Quote from: Rik on Apr 11, 2010, 10:27:03
It doesn't 'feel' right, does it.
No it doesn't, Rik. It feels like a trick or a piece of personal sensationalism, because if it were real they would be snapped up very quickly by MS, Sony, etc. They also said they may set up as a software house if negotiations don't go the right way (something like that), which to me does not make sense if the technology were genuine.
Damned, if you do damned if you don't

zappaDPJ

It doesn't feel right at all, which is why I've been trawling the net trying to find out whether or not it's a wind-up.

There is some limited animation in one of the videos but it's not easy to spot. I don't see too much of a problem with some of the additional elements that make up a computer game; collision detection, for instance, should be easy on a per-pixel basis.

If this is possible at all, the real issue would be at the opposite end of the equation compared to polygon-based graphics. Current methods and technology allow you to extrapolate quite a lot from very little: the process starts off in a minimal fashion, but as it gets closer to the final output in the graphics card it becomes a massive drain on resources. This routine appears to put all the emphasis at the start of the process, with massive amounts of data being searched, moving towards minimal effort to display. You would need a huge amount of RAM, or you would have a bottleneck caused by disk-churning virtual memory.
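The RAM worry is easy to quantify. A rough sketch (the scene size, point spacing and per-point byte cost are all my own assumptions, just to show the order of magnitude):

```python
# How much storage would an unpruned point cloud need?
# All figures below are illustrative assumptions.
scene_m = (100, 100, 20)      # a modest outdoor level, in metres
points_per_m = 1000           # 1 mm spacing, i.e. "unlimited detail"
bytes_per_point = 7           # e.g. packed position plus 3-byte colour

nx, ny, nz = (d * points_per_m for d in scene_m)
total_points = nx * ny * nz
total_bytes = total_points * bytes_per_point

print(total_points)           # 200000000000000 points
print(total_bytes / 2**40)    # ~1273 TiB before any compression
```

The raw numbers simply don't fit in RAM, so any real implementation would have to lean very hard on instancing and sparse, compressed spatial structures rather than storing every point explicitly.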

I'll be keeping an eye on this one as it has intrigued me.
zap
--------------------

This post reflects my own views, opinions and experience, not those of IDNet.

Rik

Let us know, Zap, would you?
Rik
--------------------

This post reflects my own views, opinions and experience, not those of IDNet.

zappaDPJ

I certainly will, either way. The most disappointing thing would be if it just disappeared without trace, which is probably the most likely scenario.
zap
--------------------

This post reflects my own views, opinions and experience, not those of IDNet.

Steve

Polygons to polygons and chips to chips.
Steve
------------
This post reflects my own views, opinions and experience, not those of IDNet.

Rik

Did you have to mention food, Steve? ;D
Rik
--------------------

This post reflects my own views, opinions and experience, not those of IDNet.

Steve

I am going to have some for tea.
Steve
------------
This post reflects my own views, opinions and experience, not those of IDNet.

Rik

So am I, as it happens. It was meant to be roast turkey, but Sue's got problems with her mum's health, so she's been phoning doctors this afternoon (her mum lives on the Isle of Man and there's no family there anymore) and doesn't much feel like cooking. I did suggest a takeaway but got a straight no due to salt levels. Damn, I so nearly had my hands on a pepperoni passion there.  ;D
Rik
--------------------

This post reflects my own views, opinions and experience, not those of IDNet.

Steve

I am still having the roast turkey ;D
Steve
------------
This post reflects my own views, opinions and experience, not those of IDNet.

Rik

With chips? You must tell me how you manage to get that one past your wife. ;D
Rik
--------------------

This post reflects my own views, opinions and experience, not those of IDNet.

Steve

We've both had a busy afternoon and something quick and easy was required :whistle:
Steve
------------
This post reflects my own views, opinions and experience, not those of IDNet.

Rik

;D

That's why it's egg and chips here - a meal I am inordinately fond of...
Rik
--------------------

This post reflects my own views, opinions and experience, not those of IDNet.

Steve

I'd need something to sit the egg on ;D
Steve
------------
This post reflects my own views, opinions and experience, not those of IDNet.

Rik

A plate works for me, Steve. :)
Rik
--------------------

This post reflects my own views, opinions and experience, not those of IDNet.

Steve

I was thinking more along the lines of something beef. In fact homemade burger and a poached egg.
Steve
------------
This post reflects my own views, opinions and experience, not those of IDNet.

Rik

Ah. We had DR burgers in a bun for lunch - they really are very good.
Rik
--------------------

This post reflects my own views, opinions and experience, not those of IDNet.

Steve

Perhaps we should add a postfix to the recent posts -"if you get the picture" :whistle:
Steve
------------
This post reflects my own views, opinions and experience, not those of IDNet.

Technical Ben

It's all current and old technology, but it DOES use stupid amounts of processing power, both CPU and GPU. It's not that it couldn't be done, but that there are currently too many limits to this type of engine. One is that animating anything three-dimensional is an immense task, mainly because you have to move ALL those "atoms". Do you want to do that by hand?  :eek4: But for scenery and ground, it's quite a good technique. Physics is also possible, but you're having to calculate thousands of points again.
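The animation objection is really about scaling: a rigid motion has to touch every stored point, so the cost grows with point count rather than with vertex count. A toy illustration of the gap (my own example, plain Python):

```python
import math

def rotate_y(points, angle):
    """Rigidly rotate a point set about the y axis: O(n) in point count."""
    c, s = math.cos(angle), math.sin(angle)
    return [(c * x + s * z, y, -s * x + c * z) for (x, y, z) in points]

# A polygon-mesh cube animates by moving 8 vertices; a solid "atom" cube
# at 100^3 resolution moves a million points for the same motion.
mesh_cube = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
atom_cube = [(x / 100, y / 100, z / 100)
             for x in range(100) for y in range(100) for z in range(100)]

print(len(rotate_y(mesh_cube, 0.1)))   # 8 transforms per frame
print(len(rotate_y(atom_cube, 0.1)))   # 1000000 transforms per frame
```

A polygon engine only skins the vertices and lets the GPU fill in the triangles between them; a point-based engine has no triangles to interpolate across, which is why animation is the hard case.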
The Wiki knows all.
And to see the real power of voxels, check out ZBrush.

These are all done with real "atoms", or voxels, so it's billions of polys if converted.

I used to have a signature, then it all changed to chip and pin.

esh

Real or not, trying to get the world's programmers to shift to a radically different API will be... challenging.
CompuServe 28.8k/33.6k 1994-1998, BT 56k 1998-2001, NTL Cable 512k 2001-2004, 2x F2S 1M 2004-2008, IDNet 8M 2008 - LLU 11M 2011

Technical Ben

Not always. Well, I am using the CryEngine 3 engine as an example, but that's compiled by their own compiler/programming suite, so I guess it does not count. It is programmed from the ground up, though, so offers really great performance.
I used to have a signature, then it all changed to chip and pin.

esh

Well, I was discussing the low-level API that interfaces with the graphics card more than the game engine. I'm reminded of the days of the Unreal 1 engine, where Epic/DE had to write and provide back-end renderers for D3D, OpenGL, S3 MeTaL and the old-fashioned software renderer. It was predictably a nightmare.

The only two engines I've used for any length of time were the (old) Torque engine and Unreal 3 more recently, though I find U3 doesn't handle absolutely massive areas very well, even if they are quite sparsely detailed.
CompuServe 28.8k/33.6k 1994-1998, BT 56k 1998-2001, NTL Cable 512k 2001-2004, 2x F2S 1M 2004-2008, IDNet 8M 2008 - LLU 11M 2011

vitriol

The best Unreal Tournament ever ran for me was when I had a 3dfx Voodoo 3 3000 graphics card using the Glide API. It was just magic. Although using OpenGL (with high-resolution textures), lots of anti-aliasing and anisotropic filtering looked mint too (Radeon 8500, 9600 Pro, 9800 Pro and Nvidia 8800GTX).

zappaDPJ

The Unreal engine was the bane of my life for many years. Anyone familiar with BSP mesh holes will know why  :rant2: It spawned some fun games though  :)
zap
--------------------

This post reflects my own views, opinions and experience, not those of IDNet.

esh

Crikey, how could I forget the Glide renderer? That was all U1 ran on originally. In fact, when I moved up to a GeForce DDR card, I ran U1 and UT99 on Glide using a Glide->D3D wrapper and it was faster than using the D3D renderer.

Times have changed. Fortunately. But I do miss my Voodoo2.
CompuServe 28.8k/33.6k 1994-1998, BT 56k 1998-2001, NTL Cable 512k 2001-2004, 2x F2S 1M 2004-2008, IDNet 8M 2008 - LLU 11M 2011