Features and Ideas Discussion
Swappable graphics between OpenGL/DirectX
 
This thread is older than 90 days and has been locked due to inactivity.


 

Aurora Arashi
Amarr
Ministry of War
Posted - 2010.03.06 05:49:00 - [1]
 

Hi there,
Disclaimer: Not a programmer. Linux User.
One of the biggest performance killers for EVE on Mac and Linux is quite possibly its implementation in DirectX. This means all graphics calls must be interpreted by a compatibility layer that translates the DirectX into OpenGL. It is possible that if EVE's graphics engine offered an option between the two (or even switched to OpenGL altogether), it would become much easier, and faster, to run EVE on alternative operating systems.

Examples:
Warcraft 3 (Blizzard)
World of Warcraft (Blizzard)

Both of these games run on DirectX on Windows, and on OpenGL in their native Mac clients. The Windows client retains an OpenGL option, however, which, when used in WINE or other compatibility layers, drastically improves both compatibility and performance.

Is it possible that this could also be the case for EVE? Would the work (which I understand would be considerable) be worth it?

(I might have posted this just a little while ago; still learning the boards, sorry.)

Regards,
Aurora

Callista Sincera
Amarr
Hedion University
Posted - 2010.03.06 08:43:00 - [2]
 

Given that EVE's initial release was a pretty big roller coaster ride for CCP, I'd say they probably didn't have the time to design an interfacing layer (between their engine and DirectX). Maybe they added one with Trinity, but I kind of doubt it.
That would mean they are probably using DirectX directly (no pun intended) within their engine and changing it now would be ... something that's not gonna happen within the next couple of years I guess...
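Just to illustrate what I mean by an interfacing layer: the usual pattern is an abstract renderer that the rest of the engine talks to, with one backend per API. Everything below is a made-up C++ sketch (IRenderer, D3D9Renderer and friends are invented names), obviously not CCP's actual code.

// Hypothetical sketch of a renderer abstraction layer. All names are invented
// for illustration and do not reflect CCP's real engine.
class IRenderer {
public:
    virtual ~IRenderer() {}
    virtual void BeginFrame() = 0;
    virtual void DrawShip(int modelId) = 0;   // placeholder for the real draw calls
    virtual void EndFrame() = 0;
};

// One backend per API; only these classes ever touch DirectX or OpenGL.
class D3D9Renderer : public IRenderer {
public:
    void BeginFrame()            { /* BeginScene(), Clear(), ... */ }
    void DrawShip(int /*model*/) { /* SetStreamSource(), DrawIndexedPrimitive(), ... */ }
    void EndFrame()              { /* EndScene(), Present() */ }
};

class GLRenderer : public IRenderer {
public:
    void BeginFrame()            { /* glClear(), ... */ }
    void DrawShip(int /*model*/) { /* glBindBuffer(), glDrawElements(), ... */ }
    void EndFrame()              { /* swap buffers via the windowing system */ }
};

// The rest of the engine asks for "a renderer" once and never names the API again.
IRenderer* CreateRenderer(bool preferOpenGL) {
    if (preferOpenGL)
        return new GLRenderer();
    return new D3D9Renderer();
}

If the DirectX calls are instead sprinkled through the whole codebase, there's no single seam like CreateRenderer() to swap out, which is why I doubt a second renderer is coming any time soon.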

Aurora Arashi
Amarr
Ministry of War
Posted - 2010.03.06 16:35:00 - [3]
 

Quote:
Given that EVE's initial release was a pretty big roller coaster ride for CCP, I'd say they probably didn't have the time to design an interfacing layer (between their engine and DirectX). Maybe they added one with Trinity, but I kind of doubt it.
That would mean they are probably using DirectX directly (no pun intended) within their engine and changing it now would be ... something that's not gonna happen within the next couple of years I guess...


That is very possible. However, as a company that encourages EVE's use on Mac (they distribute a Cider version) and on Linux (they used to distribute a version, and still have boards here), they most likely have a good number of users on those operating systems. So perhaps it may be worth it?

masternerdguy
Gallente
Meerkat Maner
Posted - 2010.03.06 16:46:00 - [4]
 

Originally by: Aurora Arashi
Quote:
Given that EVE's initial release was a pretty big roller coaster ride for CCP, I'd say they probably didn't have the time to design an interfacing layer (between their engine and DirectX). Maybe they added one with Trinity, but I kind of doubt it.
That would mean they are probably using DirectX directly (no pun intended) within their engine and changing it now would be ... something that's not gonna happen within the next couple of years I guess...


That is very possible. However, as a company that encourages EVE's use on Mac (they distribute a Cider version) and on Linux (they used to distribute a version, and still have boards here), they most likely have a good number of users on those operating systems. So perhaps it may be worth it?


EVE runs so well in WINE that you don't have to worry; I used to run it in WINE.

http://appdb.winehq.org/objectManager.php?sClass=version&iId=18563

Aurora Arashi
Amarr
Ministry of War
Posted - 2010.03.06 22:16:00 - [5]
 

Originally by: masternerdguy

EVE runs so well in WINE that you don't have to worry; I used to run it in WINE.

http://appdb.winehq.org/objectManager.php?sClass=version&iId=18563


Well, it's sort of a given that it runs well in Wine, since Cider is essentially a self-contained Wine instance, as is the Cedega wrapper CCP previously used for their Linux version. However, it can take a lot of resources to translate all of that DirectX into OpenGL, and the process isn't perfect either. It would be much faster, and less buggy, to be able to feed OpenGL directly through Wine. That is why I'm suggesting this.

Super Whopper
I can Has Cheeseburger
Posted - 2010.03.07 03:27:00 - [6]
 

Originally by: Aurora Arashi
Mac clients


Problem spotted.

Next time buy a proper PC, not a PC that can only run a highly restricted, mostly backwards and insecure OS (Apple doesn't give a damn about fixing vulnerabilities; see the last hacker championship). Otherwise nobody has ever cared or will ever care.

Callista Sincera
Amarr
Hedion University
Posted - 2010.03.07 07:39:00 - [7]
 

Originally by: Super Whopper
Next time buy a proper PC, not a PC that can only run a highly restricted, mostly backwards and insecure OS (Apple doesn't give a damn about fixing vulnerabilities; see the last hacker championship). Otherwise nobody has ever cared or will ever care.


Blah blah blah... I'm not a Mac user, but you're just spilling pointless propaganda. If you want to talk about security issues, just look at Microsoft's ruthless marketing strategies in the early days of Internet Explorer. They literally abandoned security for "more features than Netscape" with the sole intention of kicking Netscape out of the market, and in the process started the web's very own dark ages.

Not that I'd assume Apple would have behaved any better, given the chance. I'm just saying neither one gives a crap about its customers.

Cayth
Posted - 2010.03.09 18:28:00 - [8]
 

Right, so... to get a lot of stuff out of the way: I boot both Windows and *nix, though usually for different applications and reasons, and I might note that OS X is also a POSIX operating system, so if you're running an x86 one, it'll do anything a *nix system can, albeit usually with a lot more work.
Now, on to explanations.
DirectX and OpenGL both started as layers between applications and the video cards, way back when most video cards had tons of odd quirks in how they might interpret various supposedly standard method calls. OpenGL was supposed to provide a standard call interface, but it had no teeth, and cards would frequently not quite follow the standard. Thus, developers had to build in workarounds for nearly every card they wanted to support, and back then there were quite a lot of manufacturers, not just the effective duopoly we have now.
Microsoft started DirectX as a compatibility layer: basically, MS took care of all the workarounds for different cards, and developers could simply call the methods and actually expect them to work. There was an inherent performance hit, since the devs wouldn't have direct control over which cards they supported (though MS could be counted on to support all the newest and most common ones); the point is, the devs couldn't choose to drop a particularly annoying card to save some cycles checking for it. But since they could then concentrate on actually making their game instead of worrying about video card compatibility, it usually worked itself out.
Anyway, after a while of this, nearly every game was running on DirectX, because it was hard to do anything else and still keep up with the available hardware. MS then went to the card manufacturers and used the fact that it essentially controlled the list of which cards could run current games to force better standardization of method implementation. So, in the end, an 800-pound software gorilla did what the older "open" standard couldn't: actually force standardization.
As a side effect, this means cards' OpenGL implementations now tend to behave well too, since the underlying architecture is standardised, and suddenly 3D-accelerated gaming on OpenGL systems is possible again.
Right, enough with the history.
OpenGL and DirectX are, at the moment, essentially the same thing: a standard set of methods, pipelines, and shader languages that allow the program to tell the card what pretty polygons to toss all over your screen. However, actually rendering the images still requires quite a bit of work on the part of the game, and the calls and pipeline setup are going to vary greatly between OpenGL and DirectX 9. Also, while the textures and models can mostly be shared directly, the shaders will be wholly incompatible, so two entirely different sets of shaders would need to be written. That's not even getting into the differences between Shader Models 2, 3 and 4, all of which are supported by both systems, but with cards supporting differing levels depending on how recent they are. And the DirectX 10 pipeline is a whole different paradigm again, as it's moved to a fully programmable pipeline, which means you'd have to write a whole new rendering engine around it... again (although the fully programmable pipeline should be a lot easier to work with than the fixed pipelines used by DX9 and earlier, and by OpenGL).
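To make the "wholly incompatible shaders" point concrete, here's a toy example (made up for this post, not taken from EVE): the same trivial single-texture tint effect written once in HLSL for D3D9 and once in GLSL for OpenGL, embedded as C++ string constants the way engines often ship shader source.

// Toy example only: the same effect has to be written twice, once per API.

// HLSL pixel shader (Direct3D 9, built with fxc or D3DXCompileShader):
const char* kTintShaderHLSL =
    "sampler2D diffuseMap : register(s0);          \n"
    "float4 tintColor;                             \n"
    "float4 main(float2 uv : TEXCOORD0) : COLOR0   \n"
    "{                                             \n"
    "    return tex2D(diffuseMap, uv) * tintColor; \n"
    "}                                             \n";

// GLSL fragment shader (OpenGL 2.x era, built with glCompileShader):
const char* kTintShaderGLSL =
    "uniform sampler2D diffuseMap;                            \n"
    "uniform vec4 tintColor;                                  \n"
    "varying vec2 uv;                                         \n"
    "void main()                                              \n"
    "{                                                        \n"
    "    gl_FragColor = texture2D(diffuseMap, uv) * tintColor;\n"
    "}                                                        \n";

Same effect, two languages, two compilers, two sets of quirks to debug, and that's before you get into how differently the two APIs bind constants, samplers and render state.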
So, yah. It takes a lot of work for a game to implement both a DX9 and an OpenGL renderer. And frankly, there's not much reason to, unless you're Blizzard and can hire literal armies of programmers to do it. WINE (which stands for Wine Is Not an Emulator, by the way; they're very particular about that) works great as long as the game plays nice with it (avoids certain DX calls that are hard or counterproductive to translate back and forth, etc.), and EVE's devs have always been fairly good at making an effort to do so.
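And for the curious, here's roughly the kind of busywork a layer like WINE has to do for even the most trivial call. This is a made-up C++ fragment, not actual WINE source, and the kClear* flags are just stand-ins for the real D3DCLEAR_* constants.

// Illustration only: translating a trivial Direct3D 9 clear into OpenGL.
#include <GL/gl.h>

enum { kClearTarget = 1, kClearZBuffer = 2, kClearStencil = 4 };

// Roughly mirrors the shape of IDirect3DDevice9::Clear(flags, ARGB colour, z, stencil).
void TranslateClear(unsigned flags, unsigned argbColour, float z, unsigned stencil)
{
    GLbitfield mask = 0;
    if (flags & kClearTarget) {
        // D3D packs the clear colour as ARGB bytes; GL wants separate floats in [0,1].
        float a = ((argbColour >> 24) & 0xFF) / 255.0f;
        float r = ((argbColour >> 16) & 0xFF) / 255.0f;
        float g = ((argbColour >> 8)  & 0xFF) / 255.0f;
        float b = ( argbColour        & 0xFF) / 255.0f;
        glClearColor(r, g, b, a);
        mask |= GL_COLOR_BUFFER_BIT;
    }
    if (flags & kClearZBuffer) {
        glClearDepth(z);                  // both APIs use a [0,1] clear depth
        mask |= GL_DEPTH_BUFFER_BIT;
    }
    if (flags & kClearStencil) {
        glClearStencil((GLint)stencil);
        mask |= GL_STENCIL_BUFFER_BIT;
    }
    glClear(mask);
}

Multiply that by every draw call, state change and shader upload in a frame, plus the bookkeeping needed to keep the two APIs' ideas of the current state in sync, and you can see where the overhead (and the occasional translation bug) comes from. A native OpenGL path skips that step entirely, which is what the OP is asking for.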


 



 

