Jun. 2nd, 2007

smoketetsuo: (Kat Ranger at Computer)
"Also, the 360 work we did resulted in an engine that also runs well on low-end and mid-range PCs. This is very important for games today; the high-end PC gaming market alone is not big enough to support next-generation games with budgets in the $10-20M range. You need to run on ordinary mass-market PCs as well. In reading PC gaming websites, one might get the impression that everyone owns a dual-core PC with a pair of $600 GPUs in SLI configuration, but the reality is very different."


Most people have assumed, and I have to admit I was thinking this way too, that you'd probably need pretty hardcore hardware in order to get UT3 going. It's interesting that they are able to get it running on a wider variety of hardware because of the 360. Perhaps it's because they also had to optimize it for the relatively small amount of RAM the 360 has compared to a PC. Not to mention the multi-core work they are doing... which brings me to:

"The Gears of War experience on Xbox 360 taught us to optimize for multi-core, and to improve the low-level performance of the key engine systems. This has carried over very well to PC. The division of UE3's rendering and gameplay into separate threads, implemented originally for 360, has brought even more significant gains on PC where there is a more heavyweight hardware abstraction layer in DirectX, hence more CPU time spent in rendering relative to gameplay."


It'll probably run at a decent rate on single-core systems with reduced detail, but we'll probably want to run it on at least a dual-core system.
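
To give a rough idea of what that gameplay/rendering split looks like, here's a minimal C++ sketch of the general pattern (my own illustration, not UE3's actual code): a game thread simulates each frame and hands a snapshot of the results to a render thread over a queue, so the two workloads can run on separate cores.

// Minimal sketch of a game-thread / render-thread split (illustrative only,
// not UE3 code). The game thread simulates each frame and hands a snapshot
// to the render thread through a queue, so simulation and rendering can run
// on different cores.
#include <atomic>
#include <chrono>
#include <condition_variable>
#include <iostream>
#include <mutex>
#include <queue>
#include <thread>

// Hypothetical per-frame snapshot produced by gameplay, consumed by rendering.
struct FrameState {
    int frameNumber;
    float playerX, playerY;
};

std::queue<FrameState> frameQueue;   // hand-off from game thread to render thread
std::mutex queueMutex;
std::condition_variable frameReady;
std::atomic<bool> running{true};

void gameThread() {
    float x = 0.0f, y = 0.0f;
    for (int frame = 0; frame < 10; ++frame) {
        // Simulate gameplay for one frame: input, physics, AI, etc.
        x += 1.0f;
        y += 0.5f;
        {
            std::lock_guard<std::mutex> lock(queueMutex);
            frameQueue.push({frame, x, y});
        }
        frameReady.notify_one();  // wake the render thread
        std::this_thread::sleep_for(std::chrono::milliseconds(16));  // ~60 Hz tick
    }
    running = false;
    frameReady.notify_one();
}

void renderThread() {
    while (true) {
        std::unique_lock<std::mutex> lock(queueMutex);
        frameReady.wait(lock, [] { return !frameQueue.empty() || !running; });
        if (frameQueue.empty() && !running) break;
        FrameState state = frameQueue.front();
        frameQueue.pop();
        lock.unlock();
        // "Render" the frame. In a real engine this is where the expensive
        // driver/DirectX work happens, which is why moving it off the game
        // thread pays off so much on PC.
        std::cout << "Rendering frame " << state.frameNumber
                  << " at (" << state.playerX << ", " << state.playerY << ")\n";
    }
}

int main() {
    std::thread game(gameThread);
    std::thread render(renderThread);
    game.join();
    render.join();
    return 0;
}

The real engine's hand-off is obviously far more sophisticated, but the basic idea of decoupling the two workloads onto separate threads is what the quote is describing.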

"More than 80% of PCs sold today are still single-core, and have very low-end DirectX9 graphics capabilities. Unreal Engine 3 supports those configurations well."


You won't need a hardcore graphics card to run UT3, but a better card will of course give you a better experience. On the low-end cards he mentions, you'd probably have to run it at 640x480 on low detail.

I was recently having a discussion with some of the people at Inside Mac Games who think you'd need at least a G5 to get it running on a PowerPC Mac, but considering what I've quoted above, I don't think that's absolutely necessary, especially if you have a dual-processor system. I think a G4 above 1.4GHz should be able to run the game on low detail settings, and a dual dual-core G5 (the Quad) should be able to run it at at least medium if not high detail, depending on the graphics card.

I still doubt it'd run well on anything clocked at 1GHz, which is what one person was asking about; he wanted to know if it'd run on his 1GHz G4.

Then of course there is the Mac Pro... a Mac Pro should be able to run it at max detail, especially if Apple upgrades the graphics card.
smoketetsuo: (Inquisitive Kitty)


Better yet:



Speaking of that, here's an article talking about what's wrong with GFW Live right now.

Here's one that will never happen in a million years:

smoketetsuo: (Default)
I found this on Digg; it's supposed to be Apple's prediction of computing in 2010, from 1988. The guy on the computer screen reminds me of Bill Nye the Science Guy.