GPU AI

Discussion about everything. New games, 3d math, development tips...
ACE247
Posts: 704
Joined: Tue Mar 16, 2010 12:31 am

GPU AI

Post by ACE247 »

Just found this.
http://developer.nvidia.com/object/gpu-ai.html
It seems to be quite promising given the scalability of the technique.
Be sure to check the videos and be amazed at how fast it can simulate 1000 AI units! 8)
Viz_Fuerte
Posts: 91
Joined: Sun Oct 19, 2008 5:29 pm
Location: Valencia (Spain)
Contact:

Post by Viz_Fuerte »

It looks interesting, but they are using the GPU for almost everything. Use the GPU ONLY FOR GRAPHICS!!!

Sorry to say this, but everything should stick to its own job.
For physics -> physics card.
For graphics -> graphics card.
For AI -> AIPU? (Artificial Intelligence Processing Unit)
:lol:
Insomniacp
Posts: 288
Joined: Wed Apr 16, 2008 1:45 am
Contact:

Post by Insomniacp »

It could be useful for server-side AI, though. Just throw in a graphics card and you can handle a lot more AI, which is very useful in a large-scale server-based game, since the server doesn't need the GPU for anything else.
ACE247
Posts: 704
Joined: Tue Mar 16, 2010 12:31 am

Post by ACE247 »

I agree, the CPU is still quite well suited to handling AI for now.
But obviously marketing people will try to add selling points to everything. :D
And that server side GPU AI might not be a bad idea at all. :wink:
DeM0nFiRe
Posts: 117
Joined: Thu Oct 16, 2008 11:59 pm

Post by DeM0nFiRe »

You know, in truth, the reason that the GPU is good at graphics is because it's good at throughput processing. In other words, it's good at processing as much information as possible in a short amount of time. Because of that, the GPU is actually a good place for physics or AI, because you have to run your routines for each object, each character etc. Taking those off the CPU and putting them on the GPU can make them faster, but you have to be careful you don't take away the GPU's time that it needs for actually doing graphics.
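To illustrate the throughput point: AI of this kind is "the same routine applied to every agent," which is exactly the workload GPUs excel at. Here is a minimal sketch in plain Python (standing in for actual GPU code; the agent model and names are purely illustrative). On a GPU, the per-agent update would map to one thread per agent instead of a loop:

```python
import math

def update_agent(agent, target):
    """Steer one agent a small step toward a target point.

    This is the per-agent routine; on a GPU each agent would run
    this in its own thread, so 1000 agents cost roughly the same
    wall-clock time as one.
    """
    dx = target[0] - agent["x"]
    dy = target[1] - agent["y"]
    dist = math.hypot(dx, dy)
    if dist > 1e-6:
        agent["x"] += agent["speed"] * dx / dist
        agent["y"] += agent["speed"] * dy / dist
    return agent

# 1000 agents, all running the identical routine -- embarrassingly parallel.
agents = [{"x": float(i % 50), "y": float(i // 50), "speed": 0.5}
          for i in range(1000)]
target = (25.0, 10.0)
agents = [update_agent(a, target) for a in agents]
```

The key property is that no agent's update depends on another agent's result within the step, so the whole batch can be dispatched at once, which is where the GPU's advantage over a serial CPU loop comes from.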
ACE247
Posts: 704
Joined: Tue Mar 16, 2010 12:31 am

Post by ACE247 »

Hey, why not then rename GPU from Graphics Processing Unit to General Processing Unit? :!: :lol: :lol:
DeM0nFiRe
Posts: 117
Joined: Thu Oct 16, 2008 11:59 pm

Post by DeM0nFiRe »

Actually, don't be surprised if within the next decade or so we see something like a general processing unit, multi-processing unit, or some such thing. The dedicated physics card didn't work out too well, but a single card that handles physics, AI, and other processes that repeat every frame could be worth more. It will be interesting to see what happens.