I know a lot of people say the GPU has little to do with Cosmic Break, but most GPUs have separate performance settings; a lot of graphics cards ship with multiple clock profiles.
My GPU has:

A 2D performance setting with very low clocks (think 1999-2000 era speeds).

A low 3D performance setting, which is what most DX9-and-lower apps run at. By default it brings the shader clock down to 810 MHz and the core clock to 405 MHz; I can overclock those to an 810 MHz core and a 1620 MHz shader clock. The memory clock is also brought down to 324 MHz and can only go up to 388 MHz. So basically I can't reach my full clock speeds in DX9 and lower apps, which could be why people don't see much of a performance gain in CB from a graphics card: most modern GPUs limit themselves in weaker games.

In its normal (full-performance) state my card runs a 2100 MHz memory clock, 1100 MHz core clock, and 2100 MHz shader clock, but only DirectX 10 and 11 programs get to use all that power; lower-end games simply can't access it.
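To put numbers on how much headroom the low 3D state leaves on the table, here's a quick sketch (plain Python, just arithmetic on the clocks listed above; the dict names are mine):

```python
# Clocks my card reports (MHz): full-performance state vs. the low 3D state
# that DX9-and-lower games like Cosmic Break are locked to.
full = {"core": 1100, "shader": 2100, "memory": 2100}
low_3d = {"core": 405, "shader": 810, "memory": 324}
low_3d_oc = {"core": 810, "shader": 1620, "memory": 388}  # low state after overclocking

def pct(state, baseline):
    """Percent of the full-speed clock available in a given state."""
    return {k: round(100 * state[k] / baseline[k], 1) for k in baseline}

print("low 3D default:   ", pct(low_3d, full))     # roughly 37-39% core/shader, 15% memory
print("low 3D overclocked:", pct(low_3d_oc, full))  # roughly 74-77% core/shader, 18% memory
```

Even overclocked, the low 3D state never gets near the full memory clock, which is worth keeping in mind when reading the test results below.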
Anyways, what I am going to do is underclock my low 3D performance settings and then overclock them to see if it affects my FPS in Cosmic Break.
First test: core clock at 200 MHz, shader clock at 200 MHz, and memory clock at 200 MHz. This represents the performance of a DDR-era GPU (examples: ATI Radeon RV200, GeForce 2 Ti, GeForce 3 Ti); I'm underclocking somewhat compared to a real DDR card because of the obvious advantage a new card with new technologies has. Results: unstable, the game crashes at max graphics; FPS jumps between 15 and 40 during 30 vs 30 gameplay; in 15 vs 15 the average is 20-30 FPS. Longer load times, and lag spikes are more likely.
2nd test: 380 MHz core clock and 340 MHz memory clock, representing the highest-end DDR card at a DX9 level (example: ATI Radeon 9800 Pro). Results: unstable, the game crashes at max graphics; FPS jumps between 15 and 40 during 30 vs 30 gameplay; in 15 vs 15 the average is 25-30 FPS with the game maxed out. Longer load times, and lag spikes are a lot more likely.
3rd test: 400 MHz core clock and 400 MHz memory clock, representing the entry-level DDR2 GPU spectrum (example: an NVIDIA 7600).
Stable, but I had trouble running FRAPS in 30 vs 30, and lag spikes are more likely to happen.
4th test: 500 MHz core clock and 600 MHz memory clock, representing entry-level DDR3 cards and the highest-level DDR2 graphics cards. Everything runs fine; I can run FRAPS.
5th test: 810 MHz core clock and 1620 MHz shader clock, representing entry-level DirectX 10 GPUs. Everything runs fine; I can run FRAPS.
I will post my average FPS and FRAPS readings for each setting.
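For anyone who wants to reproduce the averaging: FRAPS can dump per-second FPS readings to a log, and assuming a simple one-number-per-line format (check your actual log, FRAPS' CSV files include headers), the average is just:

```python
def average_fps(lines):
    """Average per-second FPS readings, skipping blank lines."""
    samples = [float(x) for x in lines if x.strip()]
    return sum(samples) / len(samples)

# Made-up four-second log just to show the shape of the data.
log = ["28", "31", "25", "30"]
print(round(average_fps(log), 1))  # prints 28.5
```

Note that an average hides the 15-40 FPS swings seen in the first two tests, so I'll report the observed range alongside it.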
Bonus: integrated GPU test coming up =D