ID:2176950
 
BYOND Version:511
Operating System:Windows 10 Pro 64-bit
Web Browser:Chrome 55.0.2883.59
Applies to:Dream Seeker
Status: Open

Issue hasn't been assigned a status value.
Descriptive Problem Summary:
Some laptops come with both a low-power integrated GPU and a high-power discrete graphics card, along with drivers that decide which card renders graphics.

The issue is that the display buffer lives only on the integrated card: only it has access to the display ports, and it handles ferrying data and rendering commands to the high-power graphics card as it chooses. (Oh, and the integrated card will almost always be an Intel card, triggering the blacklist.)

When DS attempts to force rendering to the discrete card, the output goes nowhere, because the Intel card has to be involved in the stack or it has no idea what to do with what the discrete card is rendering. This leads to a plain white map window.
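For illustration, here's roughly what I imagine that de-preferencing amounts to on the Direct3D 9 side (a guess on my part, not DS's actual code):

#include <d3d9.h>

// Sketch only: prefer the first non-Intel adapter if one exists,
// otherwise fall back to the default adapter.
// Common vendor IDs: 0x8086 = Intel, 0x10DE = NVIDIA, 0x1002 = AMD.
UINT ChooseAdapter(IDirect3D9 *d3d)
{
    UINT count = d3d->GetAdapterCount();
    for (UINT i = 0; i < count; ++i) {
        D3DADAPTER_IDENTIFIER9 id;
        if (FAILED(d3d->GetAdapterIdentifier(i, 0, &id)))
            continue;
        if (id.VendorId != 0x8086) // anything that isn't Intel wins
            return i;
    }
    return D3DADAPTER_DEFAULT;
}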

I don't have such a laptop, but after working with a few people who do, I have been able to figure this out.

Solutions:
1. Somehow detect when this is the case and disable the Intel blacklist and/or just use the default GPU.

2. Add an option to disable the Intel blacklist and just use the default GPU.

Sidenote: You can export some variables to force rendering to the discrete card (but you would still have to send the DirectX commands to the default card):
extern "C" {
_declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
}
extern "C"
{
__declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}


I should note that I'm a bit out of my area of expertise here; this is what I've managed to gather from googling around, mainly based on these documents (as well as what people on Stack Overflow and HP forums have said):

http://developer.download.nvidia.com/devzone/devcenter/gamegraphics/files/OptimusRenderingPolicies.pdf

http://www.nvidia.co.uk/object/LO_optimus_whitepapers.html

So bits of the detail may be incorrect.

Workarounds

Disable the NVIDIA card in Device Manager so DS stops attempting to render to it.
Not sure if it's the same thing MrStonedOne is talking about, but I have a laptop that uses an integrated Intel GPU and auto-switches to a GTX 960M when the laptop sees that it needs it.

I can always help if it is needed at all.
That explains a lot. Of course this isn't a blacklist as such, simply a preference. DS will choose the NVidia card above the Intel card for its display adapter, if it sees more than one is available.

That PDF is stunningly useless on the subject, though. I can't figure out why they bothered to write it since it says nothing of value. Aside from exporting that variable--which frankly I have no interest in doing, as that's a brittle solution at best--I'm not sure how I'm supposed to set this up in an intelligent auto-detected way.
Lummox JR wrote:
Aside from exporting that variable--which frankly I have no interest in doing, as that's a brittle solution at best

I took a look, and it seems almost all of my Steam games export one or both of those variables.

+1 on this.

Both UnrealEngine and CryEngine use this exact method.
Please fix. Many visual effects require hardware mode.
Well, I suppose there's no downside to exporting those vars.
My suggestion:

Remove the Intel depreferencing, export those vars, then give the user some way in preferences to override this and select which card to render on (or just add a checkbox for de-preferencing Intel cards) for the rare case where they can't control the default.

The default should be good in more cases than not, and at some level the user should be responsible for fixing it when the default isn't right. Multiple display adapters is already a non-standard setup. Having such a setup AND having the primary monitor hooked up to the slower card is user error, and this Intel depreferencing seems like it's creating more headaches than it solves.
How about just adding the option to choose which GPU to process with? If people have no problem as-is, leave it be; if they have issues, they can switch it themselves to see if that works.
I'd like a suggestion as to how to set up any kind of preferencing in a dirt-simple way. That is, no GUI involved. Any ideas?
Client side command that you can enter? Assuming you're talking about a toggle.
I don't think this can be handled with a toggle. There could be any number of display adapters available. How do I allow for a choice between them?

I'm thinking at least some kind of config setting in cfg/seeker.txt is the way to go, since that's something a user can edit. Maybe it can be "auto" (default) for the current behavior, or "any" to pick the first adapter in the list, but what if the user wants a specific adapter? Or is that last question something better left to profiles on the system itself?
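Something like this, as a very rough sketch (the option name and the parsing are just placeholders, nothing is decided):

#include <string>
#include <cstdlib>

// Hypothetical reading of a display-adapter line in cfg/seeker.txt:
//   "auto" -> current behavior (prefer a non-Intel adapter)
//   "any"  -> just take the first adapter in the list
//   "<n>"  -> explicit adapter index
enum AdapterMode { ADAPTER_AUTO, ADAPTER_ANY, ADAPTER_INDEX };

struct AdapterChoice {
    AdapterMode mode;
    unsigned index; // only meaningful when mode == ADAPTER_INDEX
};

AdapterChoice ParseAdapterSetting(const std::string &value)
{
    AdapterChoice choice = { ADAPTER_AUTO, 0 };
    if (value == "any")
        choice.mode = ADAPTER_ANY;
    else if (!value.empty() && value != "auto") {
        choice.mode = ADAPTER_INDEX;
        choice.index = (unsigned)std::strtoul(value.c_str(), NULL, 10);
    }
    return choice;
}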
Can't you just add it under the preferences screen? Small drop-down list. Shouldn't take any GUI.

It should be accessible to anyone, not hidden out of sight. It would make helping people with issues easier; besides that, they're a lot quicker to find that screen than a config file that even I have trouble locating at times.
In response to Laser50
Laser50 wrote:
Can't you just add it under the preferences screen? Small drop-down list. Shouldn't take any GUI.

Dude, a drop-down list is a GUI element.
If you have the GPUs in an indexed list, you could just have a .cfg option for the GPU as an index, with special values like 0 or -1 for normal behavior and normal behavior without Intel depreferencing.

Then the user can at least try trial and error.

It's not perfect, but it's easy to code.
In response to Lummox JR
Launch arguments? Though, not many people particularly enjoy them..
Launch arguments suck, honestly, because Windows may or may not use them for pinned items or recent apps in the Start menu, and other programs may or may not use them (like launching DS from DD using the join button).

Configs are just there.

Plus I think in BYOND the two are the same.
In response to MrStonedOne
MrStonedOne wrote:
If you have the GPUs in an indexed list, you could just have a .cfg option for the GPU as an index, with special values like 0 or -1 for normal behavior and normal behavior without Intel depreferencing.

Then the user can at least try trial and error.

It's not perfect, but it's easy to code.

To expand on this, you could also make it a wildcard: just do a findtext() and pick the first matching one, but you would still have to make it possible to select by index in case they have two graphics cards with the same name.
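Roughly, something like this, assuming the match is done against the adapter's Description string on the DirectX side (that part is a guess):

#include <d3d9.h>
#include <string>
#include <algorithm>
#include <cctype>

// Sketch: pick the first adapter whose description contains the configured
// substring (case-insensitive); fall back to a numeric index if nothing matches.
static std::string Lowercase(std::string s)
{
    std::transform(s.begin(), s.end(), s.begin(),
                   [](unsigned char c) { return (char)std::tolower(c); });
    return s;
}

UINT FindAdapterByName(IDirect3D9 *d3d, const std::string &pattern, UINT fallbackIndex)
{
    std::string want = Lowercase(pattern);
    UINT count = d3d->GetAdapterCount();
    for (UINT i = 0; i < count; ++i) {
        D3DADAPTER_IDENTIFIER9 id;
        if (FAILED(d3d->GetAdapterIdentifier(i, 0, &id)))
            continue;
        if (Lowercase(id.Description).find(want) != std::string::npos)
            return i;
    }
    return fallbackIndex < count ? fallbackIndex : D3DADAPTER_DEFAULT;
}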
In response to Lummox JR
Lummox JR wrote:
Laser50 wrote:
Can't you just add it under the preferences screen? Small drop-down list. Shouldn't take any GUI.

Dude, a drop-down list is a GUI element.

Gee, a drop-down list doesn't seem that hard though; what's the reason for not using the method that's most easily reachable for everyone?
It's because front-end code scares Lummox.