Unreal creator Tim Sweeney: "PCs are good for anything, just not games"

Wally

Retired Admin
Joined
19 Jan 2006
Messages
25,798
Reactions
4,312
Since I've been noticing a certain lethargy around here... let me stir you up a bit with Sweeney's interview.

Excerpt:

TG Daily: You have to admit, the margin is obviously there.
Sweeney: Agreed. But it is very important not to leave the masses behind. This is unfortunate, because PCs are more popular than ever. Everyone has a PC. Even those who did not have a PC in the past are now able to afford one, and they use it for Facebook, MySpace, pirating music or whatever. Yesterday's PCs were for people that were working and later playing games. Even if those games are lower-end ones, there will always be a market for casual games and online games like World of Warcraft. World of Warcraft has DirectX 7-class graphics and can run on any computer. But at the end of the day, consoles have definitely left PC games behind.

TG Daily: In other words: Too big?

Sweeney: Yes, that is a huge difference. If we go back 10 years, the difference between the high end and the lowest end may have been a factor of 10. We could have scaled games between those two. For example, with the first version of Unreal, a resolution of 320x200 was good for software rendering and we were able to scale that up to 1024x768, if you had the GPU power. There is no way we can scale a game down by a factor of 100; we would just have to design two completely different games, one for the low end and one for the high end.
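
[For reference, the resolution scaling he mentions works out to roughly an order of magnitude in pixel count:

    320 x 200  =  64,000 pixels  (software rendering)
    1024 x 768 = 786,432 pixels  (hardware rendering)
    786,432 / 64,000 ≈ 12.3x

which is the "factor of 10" that one renderer could bridge, versus the factor of 100 he says separates today's low end from the high end.]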

That is actually happening on PCs: you have really low-end games with minimal hardware requirements, like MapleStory. That is a $100 million-a-year business. Kids are addicted to those games; they pay real money to buy [virtual] items within the game.

TG Daily: Broken down, that means today’s mainstream PCs aren’t suitable for gaming?

Sweeney: Exactly. PCs are good for anything, just not games.

TG Daily: Can that scenario change?

Sweeney: Yes, actually it might. CPU makers are learning more and more how to take advantage of GPU-like architectures. Internally, they accept larger data and they have wider vector units; CPUs went from a single-threaded product to multiple cores. And who knows, we might find a way to bring software rendering back into fashion.

Then every PC, even the lowest-performing ones, will have excellent CPUs. If we could get software rendering going again, that might be just the solution we all need. Intel's integrated graphics just don't work. I don't think they will ever work.
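
[As a rough illustration of what wider vector units buy a software renderer, here is a minimal C++ sketch; the framebuffer layout and the placeholder shading are invented for the example. The point is the loop shape: independent per-pixel iterations are exactly what a compiler can auto-vectorize across SSE/AVX lanes, several pixels per instruction.

    #include <vector>

    // Shade one row of an RGBA framebuffer. The "shading" here is just a
    // toy gradient; the point is that every iteration is independent, so
    // the compiler can auto-vectorize the loop across SIMD lanes.
    void shade_row(unsigned char* row, int width, int y) {
        for (int x = 0; x < width; ++x) {
            row[4 * x + 0] = static_cast<unsigned char>(x);   // R
            row[4 * x + 1] = static_cast<unsigned char>(y);   // G
            row[4 * x + 2] = 128;                             // B
            row[4 * x + 3] = 255;                             // A
        }
    }

    int main() {
        const int width = 1024, height = 768;
        std::vector<unsigned char> framebuffer(4 * width * height);
        for (int y = 0; y < height; ++y)
            shade_row(&framebuffer[4 * width * y], width, y);
    }

Nothing here touches a GPU; it is all plain CPU code.]
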
The whole interview is worth reading; besides the topic of this thread, it also covers:

* DirectX 10 and the future of graphics APIs (here)

Excerpt:

TG Daily: In the first part of our interview you implied that software rendering might be coming back. Daniel Pohl, who rewrote Quake 3 and Quake 4 using ray-tracing [and is now working as a research scientist at Intel], recently showed ray-tracing on a Sony UMPC, an ultraportable device equipped with a single-core processor. True, the resolution was much lower than on PCs of today, but it looked impressive. What are your thoughts on ray-tracing? How will 3D develop in the coming months and years?
Sweeney: Ray-tracing is a cool direction for future rendering techniques. There is also the REYES scheme of dividing the scene into micro-polygons, and there are voxels. There are around five to ten different techniques and they are all very interesting for the next generation of rendering.

Rendering can be done on the CPU. As soon as we have enough CPU cores and better vector support, these schemes might become more practical for games. And as GPUs become more general, you will have the possibility of writing a rendering engine that runs directly on the GPU and bypasses DirectX as well as the graphics pipeline. For example, you can write a renderer in CUDA and run it on Nvidia hardware, bypassing all of their rasterization and everything else.

All a software renderer really does is take some scene data as input - the positions of objects, texture maps and things like that - while the output is just a rectangular grid of pixels. You can use different techniques to generate this grid. You don't have to use the GPU rasterizer to achieve this goal.
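
[A minimal sketch of that description in C++; the scene format and all names are invented for illustration. Input: scene data (points and a camera distance). Output: nothing but a rectangular grid of pixels - no graphics API, no GPU rasterizer.

    #include <cstdint>
    #include <vector>

    struct Vec3 { float x, y, z; };

    // Toy software renderer: perspective-project 3D points onto a W x H
    // grid of pixels and plot each visible point as a white dot.
    std::vector<std::uint32_t> render(const std::vector<Vec3>& points,
                                      int W, int H, float d) {
        std::vector<std::uint32_t> pixels(W * H, 0xFF000000u); // opaque black
        for (const Vec3& p : points) {
            if (p.z <= 0.0f) continue;                         // behind camera
            int sx = static_cast<int>(W / 2 + d * p.x / p.z);
            int sy = static_cast<int>(H / 2 - d * p.y / p.z);
            if (sx >= 0 && sx < W && sy >= 0 && sy < H)
                pixels[sy * W + sx] = 0xFFFFFFFFu;             // white dot
        }
        return pixels;  // just a rectangular grid of pixels, no API involved
    }

A real engine would rasterize textured triangles instead of plotting points, but the shape of the function - scene in, pixel grid out - is the same.]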

TG Daily: What kind of advantage can be gained from avoiding the API? Most developers just utilize DirectX or OpenGL and that's about it. How does the Unreal Engine differ from the conventional approach?

Sweeney: There are significant advantages in doing it yourself, avoiding all the graphics API calls and overhead. With a direct approach, we can use techniques that require a wider framebuffer, things that DirectX just doesn't support. At Epic, we're using the GPU for general computation with pixel shaders. There is a lot we can do there, just by bypassing the graphics pipeline completely.

TG Daily: What is the role of DirectX these days? DirectX 10 and the Vista-everything model promised things like more effects and a more direct hardware approach, claiming that lots of new built-in technologies would enable a console-like experience. DirectX 10.0 has been on the market for some time and the arrival of DirectX 10.1 is just ahead. What went right, and what went wrong?

Sweeney: I don't think anything unusual happened there. DirectX 10 is a fine API. When Vista first shipped, DirectX 10 applications tended to be slower than DirectX 9, but that was to be expected. That was simply the case because the hardware guys were given many years and hundreds of man-years to optimize their DirectX 9 drivers. With DirectX 10, they had to start from scratch. In the past weeks and months, we have seen DX10 drivers catching up to DX9 in terms of performance and they're starting to surpass them.

I think that the roadmap was sound, but DirectX 10 was just a small incremental improvement over DX9. The big news items with DirectX 9 were pixel and vertex shaders: you could write arbitrary code. DX10 just takes that to a new level, offering geometry shaders and numerous features and modes. It doesn't change graphics in the way DX9 did - that was a giant step ahead of DirectX 7 and DirectX 8.

TG Daily: Since you are a member of Microsoft's advisory board for DirectX, you probably have a good idea what we will see next in DirectX. What can we expect and do you see a potential for a segmentation of APIs - all over again?

Sweeney: I think Microsoft is doing the right thing for the graphics API. There are many developers who always want to program through the API - either through DirectX these days or a software renderer in the past. That will always be the right solution for them. It makes it easier to get stuff rendered on-screen. If you know your resource allocation, you'll be just fine. But realistically, I think that DirectX 10 is the last DirectX graphics API that is truly relevant to developers. In the future, developers will tend to write their own renderers that will use both the CPU and the GPU - using a graphics processor programming language rather than DirectX. I think we're going to get there pretty quickly.

I expect that by the time of the release of the next generation of consoles, around 2012, when Microsoft comes out with the successor of the Xbox 360 and Sony comes out with the successor of the PlayStation 3, games will be running 100% on software-based pipelines. Yes, some developers will still use DirectX, but at some point, DirectX just becomes a software library on top of ... you know.

TG Daily: Hardware?

Sweeney: GPU hardware. And you can implement DirectX entirely in software, on the CPU. DirectX software rendering has always been there.

Microsoft writes the reference rasterizer, which is a factor of 100 slower than what you really need. But it is there, and it shows that you can run an entire graphics pipeline in software. I think we're only a few years away from that approach being faster than the conventional API approach - and we will be able to ship games that way. Just think of Pixomatic's software rendering.
* Unreal Engine 4.0 (here)

Excerpt:

TG Daily: Let's talk about your game visions for the future and the next Unreal Engine. Where is Epic going with Unreal Engine 3.5 and 4.0?
Sweeney: The Unreal Engine is really tied to the console cycle. We will continue to improve Unreal Engine 3 and add significant new features through the end of this console cycle, so it is normal to expect that we will add new stuff in 2011 and 2012. We're shipping Gears of War now; we're just showing the next bunch of major tech upgrades, such as soft-body physics, destructible environments and crowds. There is a long life ahead for Unreal Engine 3. Version 4 will exclusively target the next console generation: Microsoft's successor for the Xbox 360, Sony's successor for the PlayStation 3 - and if Nintendo ships a machine with similar hardware specs, then that as well. PCs will follow after that.

Also, we continuously work on transitions where we go through large portions of the engine: we completely throw out some parts and build large subsystems from the ground up, while reusing things that are still valid.

TG Daily: Like ...?

Sweeney: The Internet bandwidth. In five years, bandwidth isn't going to be more than 5-6 times higher than it is today, so the network code we have in the engine now will stay the same. Our tools are still valid too, but we will rewrite large sections of the engine around them as the new hardware develops.

TG Daily: What part of the engine will need a completely new development?

Sweeney: Our biggest challenge will be scaling to lots and lots of cores. UE3 uses functional subdivision: we have the rendering thread that handles all in-game rendering, the gameplay thread that handles all gameplay and AI, and some helper threads for physics. We scale very well from dual-core to quad-core; you can actually see a significant performance increase when you run UT3 on a quad-core compared to a dual-core system.

Down the road, we will have tens of processing cores to deal with, and we need much, much finer-grained task parallelism in order to avoid being burdened by single-threaded code. That, of course, requires us to rewrite very large portions of the engine. We are replacing our scripting system with something completely new, a highly threadable system. We're also replacing the rendering engine with something that can scale to much smaller rendering tasks, run in and out of order across threads. There is a lot of work to do.
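
[A minimal sketch of that contrast in C++; the tile-sized tasks and all names are invented for illustration. Instead of one hand-made "rendering thread", the frame is split into many small independent tasks that however many cores you have can pull from, so throughput scales with the core count rather than with the number of subsystem threads.

    #include <atomic>
    #include <thread>
    #include <vector>

    // Hypothetical unit of work: shade one small tile of the frame.
    void shade_tile(int tile) { /* ... render this tile ... */ }

    int main() {
        const int num_tiles = 256;           // many small independent tasks
        std::atomic<int> next{0};
        auto worker = [&] {                  // each core pulls tasks until done
            for (int t; (t = next.fetch_add(1)) < num_tiles; )
                shade_tile(t);
        };
        unsigned cores = std::thread::hardware_concurrency();
        if (cores == 0) cores = 4;           // fallback if count is unknown
        std::vector<std::thread> pool;
        for (unsigned i = 0; i < cores; ++i) pool.emplace_back(worker);
        for (auto& th : pool) th.join();
    }

On a dual-core machine two workers drain the 256 tiles; on a 32-core machine, 32 do - with no per-subsystem threads to rebalance.]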

TG Daily: Have you already started working on Unreal Engine 4.0?

Sweeney: We have a small Research & Development effort dedicated to the Unreal Engine 4. Basically, it is just me, but that team will be ramping up to three to four engineers by the end of this year - and even more one year after that. In some way, we resemble a hardware company with our generational development of technology. We are going to have a team developing Unreal Engine 3 for years to come and a team ramping up on Unreal Engine 4. And then, as the next-gen transition begins, we will be moving everybody to that. We actually are doing parallel development for multiple generations concurrently.

TG Daily: Stepping back, what do you see as the most significant technology trends these days?

Sweeney: When it comes to the PC, Intel will implement lots of extensions into the CPU and Nvidia will integrate many extensions into the GPU by the time next-gen consoles begin to surface. We are going to see some CPU cores that will deal with gameplay logic, some GPU stuff that will run general computing... and two different compilers, one for the GPU and one for the CPU. The result will be a reduction of our dependence on bloated middleware that slows things down and shields the real functionality of the devices.

It would be great to be able to write code for one massively multi-core device that does both general and graphics computation in the system. One programming language, one set of tools, one development environment - just one paradigm for the whole thing: large-scale multi-core computing. If you extrapolate Moore's Law from the number of cores that Microsoft put in the Xbox 360, it is clear that around 2010 - at the beginning of the next decade - you can put tens of CPU cores on one processor chip, and you will have a perfectly usable uniform computing environment. That time will be interesting for graphics as well.
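
[That extrapolation is easy to reproduce as back-of-the-envelope arithmetic - assuming the textbook doubling period of roughly two years, and starting from the Xbox 360's three-core Xenon CPU of 2005:

    2005:  3 cores
    2007: ~6 cores
    2009: ~12 cores
    2011: ~24 cores

which lands at "tens of CPU cores on one processor chip" right around the turn of the decade, as he says.]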

At that time, we will have a physics engine that runs on a general computing device, and we will have a software renderer that will be able to do far more than you can do in DirectX, as a result of having general computation functionality. I think that will really change the world. That can happen as soon as the next console transition begins, and it brings a lot of economic benefits, especially if you look at the world of consoles or the world of handhelds. You have one non-commodity computing chip, hooked up directly to memory. We have an opportunity to economize the system and provide entirely new levels of computing performance and capabilities.
 
I won't comment at all; I'll just point you to dear Maddog's signature.

Obviously Sweeney (what a name.....) has no idea what he's talking about.

PCs are good for nothing, except games.
 