Now to answer some questions that have been asked here that can get a quick answer... in case Tim Sweeney isn't asked them... because I happen to know at least most of the answer off the top of my head. I'm just a hobbyist/modder... but some of these questions are at my level.
1) "Will UnrealEngine 3 support 3D in the future"
Unreal Engine 3 already supports nVidia's 3D Vision. Unreal Tournament 3 is listed with excellent compatibility... and Batman: Arkham Asylum is one of nVidia's top-3 supported titles.
(nVidia's list below)
http://www.nvidia.com/obj..3D_Vision_3D_Games.html
2) "Also, can an older game engine be made to look sharper/better with only a hardware update? Or does it also require an engine update?"
Depends on what you mean. SSAO (Screen Space Ambient Occlusion) was included in a driver update by nVidia for existing games, even some older ones. Now will dropping in a Fermi card make Unreal Tournament 2004 look better than the latest UT3 demo for the UDK? ... no.
So it can be made to look better... but you're still stuck with what was originally in it... unless the tweak applies to the rendering as a whole (like higher levels of anti-aliasing/anisotropic filtering... SSAO... etc.)
3) "The unreal engine has always been fantastic, are there any new versions in the works or can 3.0 be further enhanced?"
Both. Tim Sweeney discussed Unreal Engine 4 planning *years* ago... before Gears of War 1 was even released (though those plans will likely change, since Microsoft and Sony made this console generation an extra-long one)... but planning an engine and actually writing it are two different things.
Also -- Unreal Engine 3 is undergoing changes all the time... some of which are public at
http://udn.epicgames.com/ -- They just released video of Unreal Engine 3 running on an iPhone.
4) "Will we NEED Fermi cards to play games in the future?"
At some point, every videocard generation becomes the "minimum requirement" for some game. How long that takes is the question, of course. Usually 4-6 years after launch.
5) "For a little over 5 years, the clock speeds of stock CPUs has remained static. It seems that focus for processors and operating systems is almost entirely focused on parallelism. I've heard game developers say that there are certain tasks that just can't be threaded, suggesting that clock speed is still a major factor. Do you feel the "Megahertz Race" needs to be ignited again?"
Not really true... the megahertz race happened because Intel wanted a big number to sell to customers... you had 3.2GHz Celerons performing at less than half the speed of 1.3GHz Athlon Thunderbirds.
GHz is like the conveyor belt speed in a factory. There are more factors than belt speed -- how many belts (cores) you have makes a difference... how many products you can make on each belt in the idle time (HyperThreading)... how well your shipping/receiving (on-chip cache) keeps your materials stocked so you won't be sitting around waiting for something to do... how many tools (instruction sets) you have available to you... how LONG the conveyor belt is (pipeline depth)... etc. etc. etc.
Most of the advancement wasn't in clock speed -- it was in all those other aspects... not just how many conveyor belts you have. As an example -- the Pentium 4's pipeline ran around 20 stages (31 in Prescott)... Core 2's is around 14.
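On the "some tasks just can't be threaded" point -- that's Amdahl's law, and it's why the serial part of a frame still cares about per-core speed. A quick sketch (the 80% parallel figure is made up for illustration, not from any real game):

```python
# Amdahl's law: overall speedup is capped by the fraction of work
# that cannot be spread across cores.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Speedup over one core when only parallel_fraction of the work threads."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# A hypothetical game task that is 80% parallelizable:
for cores in (1, 2, 4, 8, 16):
    print(cores, "cores ->", round(amdahl_speedup(0.8, cores), 2), "x")
```

With 80% parallel work, even infinite cores top out at 5x -- the remaining 20% runs at whatever your single-core speed is, which is why clock speed (and per-clock improvements) still matter.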
6) "What do you predict the future of sound to be, now that Vista/Win 7 have (mostly) killed direct sound? I really enjoy games that can take advantage of EAX, but there are so few of them."
OpenAL.
7) "Will DirectX11 games still be able to play at DirectX9 settings? (will there still be scaling for those without the latest hardware?)"
Depends on who made that DirectX 11 game...
8) "with the unreal engine SDK free version, have any games come of it that will be published? can you give any hints as to what future things the SDK will bring?"
That's all up to whoever made those UDK games. I'm personally working on one amongst all my other hobbies... I know The Ball is planning a release... Renegade X will switch to UDK very soon... etc. etc.
9) "The Batman Arkham Asylum Anti-Aliasing issue between Nvidia and AMD. What's your take on it? Any plans to integrate AA support into UE3 or UE4 natively to avoid such issues in the future?"
It's already in Unreal Engine 3's later builds... just off by default. I turned it on while playing around with the UDK, trying to find out what I could do with it. (These screenshots were NOT done using driver hacks... this was simply editing the utgame.ini files.)
http://img.photobucket.co...hopojijo/DX10UT_02.jpg
http://img.photobucket.co...hopojijo/DX10UT_01.jpg
The problem isn't with Unreal Engine 3 per se... it's with combining anti-aliasing, DirectX 9, and high-bit-depth (64-bit, in UE3's case) HDR. (NOT bloom... HDR... two different things, though bloom is a side-effect of very high light intensities... which you can have when you calculate red/green/blue intensities to more than 256 shades.) Unreal Engine 3 sticks with DirectX 9 unless you REALLY REALLY force it to use DirectX 10... and only then can you turn on the AA implementation. It's also quite slow.
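For reference, forcing this looked roughly like the following ini fragment. This is reconstructed from memory, so treat the exact setting names and section as assumptions -- verify them against your own build's engine ini before editing:

```ini
; Engine ini for your UE3/UDK build -- names may vary between builds
[SystemSettings]
MaxMultiSamples=4   ; MSAA sample count (only honored by the DX10 path)
```

You still need to force the DirectX 10 renderer separately (depending on the build, via an ini flag or a launch switch) before the AA setting does anything.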
Note: if you want to see where HDR SHOULD have been used -- play Doom 3 and Battlefield 2. Walk outdoors in Doom 3... or enter a building in Battlefield 2. Those NON-HDR engines were tuned for indoor or outdoor (respectively) lighting profiles. HDR keeps more information about the light... so it can adjust what 255/255/255 R/G/B means on a frame-by-frame (or whatever) basis... rather than being tuned to some fixed light intensity before being pressed to the DVD.
And that's where bloom comes in... if your exposure is set for one level -- say you're in a dark cave -- but a REALLY bright couple of pixels (say, the cave opening) completely jump out of your profile? They become "uber-white" (not an actual technical term) and... well... bloom!
10) "When do you believe that fully destructible environments will become reality? This includes being able to destroy the ground as well as break individual objects into smaller pieces in a realistic fashion."
You can basically do that now... those are all "gameplay" issues, plus a bit of lateral thinking in how you build your levels. I mean, you can do more and more as you get more power... but it's basically possible now. The real problem is making it *fun*.
11) "Can i use EVBOT to increase the performance of the latest Unreal Engine?"
It'll speed up anything that uses your GPU... provided you don't kill your card while overclocking it.
12) "Would i see any benefit from running the latest Unreal Engine on a Dual CPU motherboard compared to a single CPU board using the same hardware (eg. same graphics card set-up, same memory etc)?"
Doubling your CPUs would be like going from single-core to dual-core... or dual-core to quad-core... or quad-core to octo-core... it depends on WHICH CPUs are being used, though.
13) "Concerning "Unreal Lightmass", will fps take a major hit, or will some of the calculations be offloaded to the cpu?"
Lightmass is actually PREcalculated lighting. The extra lighting detail is a bit heavier when you run it... true... but the actual lighting is computed LONG BEFORE the game is even pressed to DVD. This works because very few lights in a game actually move or change in any way.
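The precalculation idea works roughly like this -- a toy sketch, nothing like Lightmass's actual math (the grid size, light positions, and falloff are all invented):

```python
# Toy lightmap bake: at build time, sum each static light's contribution
# into a texture; at runtime the game just looks values up -- no per-frame
# light math for anything that never moves.

def bake_lightmap(width, height, lights):
    """lights: list of (x, y, power) tuples for static lights."""
    lightmap = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            for lx, ly, power in lights:
                dist_sq = (x - lx) ** 2 + (y - ly) ** 2
                lightmap[y][x] += power / (1.0 + dist_sq)  # simple falloff
    return lightmap

# "Build time": done once, the result ships on the disc.
baked = bake_lightmap(8, 8, [(2, 2, 10.0), (6, 5, 4.0)])

# "Runtime": lighting a point is just an array lookup.
print(baked[2][2])  # brightest texel, directly under the first light
```

The expensive triple loop runs once on the developer's machine; the shipped game only ever does the cheap lookup, which is why static lighting costs so little at runtime.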
You can play with it at
http://www.udk.com/
------------------
Okay, so that's just about all I can answer personally... just in case it's not brought up to Tim Sweeney for any form of official answer.
post edited by Phopojijo - 2010/01/28 11:19:39