Just found this.
http://developer.nvidia.com/object/gpu-ai.html
It seems quite promising given the scalability of the technique.
Be sure to check the videos and be amazed at how fast it can simulate 1,000 AI units!
You know, in truth, the reason the GPU is good at graphics is that it's good at throughput processing: in other words, processing as much data as possible in a short amount of time. Because of that, the GPU is actually a good fit for physics or AI too, since you have to run the same routines for each object, each character, and so on. Taking those off the CPU and putting them on the GPU can make them faster, but you have to be careful not to eat into the time the GPU needs for actually doing graphics.
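To make the "same routine per object" point concrete, here is a minimal sketch (my own illustration, not code from the NVIDIA demo) of a per-agent steering update. The names `seek_step` and `step_all` are hypothetical. The key property is that each agent's update reads only its own state, so the loop is embarrassingly parallel and would map to one GPU thread per agent in a single kernel launch:

```python
import math

def seek_step(pos, vel, target, max_speed=1.0, dt=0.1):
    # Per-agent "seek" steering: accelerate toward the target, clamp speed.
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    dist = math.hypot(dx, dy) or 1e-9  # avoid divide-by-zero at the target
    vx = vel[0] + (dx / dist) * dt
    vy = vel[1] + (dy / dist) * dt
    speed = math.hypot(vx, vy)
    if speed > max_speed:
        vx, vy = vx / speed * max_speed, vy / speed * max_speed
    return (pos[0] + vx * dt, pos[1] + vy * dt), (vx, vy)

def step_all(agents, target):
    # No agent depends on another agent's result, so on a GPU this loop
    # becomes one thread per agent (the throughput-processing case above).
    return [seek_step(p, v, target) for p, v in agents]

# 1,000 independent agents strung out along the x-axis, all seeking the origin.
agents = [((float(i), 0.0), (0.0, 0.0)) for i in range(1000)]
agents = step_all(agents, target=(0.0, 0.0))
```

On a CPU this runs the 1,000 updates one after another; the whole point of the GPU version is that they all run at once.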
Actually, don't be surprised if within the next decade or so we see something like a general processing unit, or multi-processing unit, or some such thing. The dedicated physics card didn't work out too well, but a card that handles physics, AI, and other processes that repeat every frame could be worth a lot more. It will be interesting to see what happens.