Tuesday, August 19, 2008

PhysX gets integrated into leading engine

Nvidia gains further support

EMERGENT HAS announced plans to further partner with Nvidia on the company's Gamebryo development platform.

In an announcement that can be seen as a sizeable win for Nvidia, Emergent will integrate PhysX technology into all upcoming versions of the 'industry-leading' Gamebryo.

The next release of Emergent's Gamebryo is scheduled for this autumn and will ship with the Nvidia PhysX engine integrated directly into the platform.

Gamebryo has been optimised for development on the PlayStation 3, Xbox 360, Wii and PC.

It was most recently selected as the development platform for the console titles Civilization Revolution by Firaxis and Splatterhouse by BottleRocket.

Gamebryo is also being used by EA Mythic for its upcoming game, Warhammer Online: Age of Reckoning, as well as by Larian Studios for Divinity 2: Ego Draconis.

Emergent has stated that, to date, Gamebryo has been used in more than 200 shipped game titles, ranging from massively multiplayer online games and high-end retail titles across multiple genres to casual games.

It makes sense for Gamebryo to use PhysX as an underpinning technology: PhysX works across all major gaming platforms, including the above consoles and the PC, and can be accelerated by the CPU, by any CUDA-capable general-purpose parallel computing processor and, obviously, by Nvidia's own Geforce GPUs.

By Dean Pullen at
http://www.theinquirer.net/

Saturday, August 16, 2008

Nvidia drivers for OpenGL 3.0

NVIDIA has released a new set of beta drivers for developers with support for the OpenGL 3.0 API and GLSL 1.30 shading language.

Just two days after the Khronos Group officially released the OpenGL 3.0 specifications, NVIDIA has deployed its first round of beta drivers (version 177.89) with support for the new API. By default, the new features are disabled and must be activated using NVIDIA’s NVemulate utility. In order to activate OpenGL 3.0 and GLSL 1.30 functionality, you must be using a GeForce 8 series or higher or one of several Quadro FX cards. Cards from both desktop and notebook lines are supported.

The drivers are available for both 32-bit and 64-bit versions of Windows XP and Windows Vista and will integrate into the standard ForceWare driver releases following the SIGGRAPH 2008 conference as part of NVIDIA’s Big Bang II.

How RAM works on a dual-GPU card

Current dual-GPU cards work by load balancing between two GPU cores on a single board. Current designs don't have shared memory, so each GPU core draws the VRAM it needs from its own private pool in order to render and load-balance two scenes together seamlessly. Shared memory wouldn't actually give you any more RAM, but rather better management of how the GPU cores could use the available on-board RAM.

With a shared-memory dual-GPU card you could essentially do more with the same amount of on-board RAM. For example, on a card with 1GB of shared VRAM, one GPU core could be using 256MB for one application while the other core was using 768MB for another. With a 2x512MB design, even though the card has 1GB of total RAM on board, you're restricted to 512MB per core, so that same situation wouldn't be possible. Shared memory is more economical and flexible, but more complex to design, which is likely the leading reason it isn't yet used in dual-GPU cards. I hope that helps answer your question.
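A toy model makes the difference concrete (this is a hypothetical Python sketch of the two memory layouts, not real driver code): a single shared 1GB pool can satisfy a 256MB and a 768MB request at the same time, while a split 2x512MB layout rejects the 768MB request even though the board carries 1GB in total.

```python
# Toy model of dual-GPU VRAM layouts (illustrative only).

class SharedPool:
    """One pool of VRAM visible to both GPU cores."""
    def __init__(self, total_mb):
        self.free_mb = total_mb

    def alloc(self, core, mb):
        # Either core may draw from the same pool.
        if mb <= self.free_mb:
            self.free_mb -= mb
            return True
        return False

class SplitPool:
    """Each GPU core owns a fixed private slice of VRAM."""
    def __init__(self, per_core_mb, cores=2):
        self.free_mb = [per_core_mb] * cores

    def alloc(self, core, mb):
        # A core can only spend from its own slice, even if the other is idle.
        if mb <= self.free_mb[core]:
            self.free_mb[core] -= mb
            return True
        return False

shared = SharedPool(1024)   # 1GB shared between two cores
split = SplitPool(512)      # 2 x 512MB: also 1GB total on board

# 256MB on core 0 plus 768MB on core 1 fits in the shared design...
print(shared.alloc(0, 256), shared.alloc(1, 768))  # True True
# ...but the 768MB request overflows a 512MB private slice.
print(split.alloc(0, 256), split.alloc(1, 768))    # True False
```

The design trade-off the text describes shows up directly: the shared pool needs an arbiter between two cores (more complex hardware), while the split pool is just two independent memory controllers.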

Thursday, August 14, 2008

GPU RAM

What effect does GPU RAM really have for gamers and CUDA programmers alike?
Do current video cards carry too much VRAM or too little? Game websites have generally shown anything over 512MB of VRAM to be largely unused, and over 1GB to actually be detrimental in certain cases, but have these sites really tested the right applications? That is what I ask you.

Most benchmark sites I've seen generally test video games alone; I haven't seen any attempt to dig into how VRAM might benefit other graphics-card-related applications. The games typically tested are first-person shooters such as Crysis, and while they're demanding video games, they seem to lack a lot of the texture diversity found in MMORPGs, which have literally thousands of distinct textures thanks to the vast number of different armours and weapons available in the game to give it that fantasy atmosphere we MMORPG fans enjoy so much. On top of that, there are usually more players on screen, pushing the texture variety further, and the game maps themselves can be a lot larger. So with all this texture usage in MMORPGs, why aren't games like Age of Conan or Everquest 2 used to test the impact on VRAM? Also, as an MMO player I've run multiple accounts at the same time across a variety of MMO games, which stresses VRAM that much further; why do benchmark sites neglect that gaming scenario as well, and what about 3D Studio Max, Maya and Photoshop?