Very(!) slow ramp down from high to low clock speeds leading to a significantly increased power consumption
[quote=""]PowerMizer under linux is extremely slow. It takes about 35 (!!!) seconds to reduce clocks even if there is no GPU load. I can't post img currenly, so here is gif: http://i.imgur.com/dh6bPon.gifv This is from 375 driver. Under windows it's working perfectly good: http://i.imgur.com/BGFTAZi.gifv I was taking a look @ power meter and it takes about 3 seconds to lower power consumption. So under windows it's perfectly possible. I've also checked system power consumption using blender 2.78. Under windows I get ~30W back (system) immediately after I stop moving viewport around, I even don't have to wait 3 secs. Under linux: no way, have to wait 36 seconds. IMO this seems so badly done, that under linux power management works only if you do true desktop idle. I simply don't understand why it can't be fixed.[/quote] Maybe this is why it won't be fixed: https://m.youtube.com/watch?v=_36yNWw_07g I know for a fact that my next GPU is going to be an AMD card. AMD embraces open source a tiny bit more than Nvidia.
Eneen said:PowerMizer under Linux is extremely slow. It takes about 35 (!!!) seconds to reduce clocks even if there is no GPU load.
I can't post an image currently, so here is a GIF:

http://i.imgur.com/dh6bPon.gifv

This is from the 375 driver.

Under Windows it works perfectly well:

http://i.imgur.com/BGFTAZi.gifv

I was watching a power meter, and it takes about 3 seconds to lower power consumption. So under Windows it's perfectly possible.

I've also checked system power consumption using Blender 2.78. Under Windows I get ~30 W back (system-wide) immediately after I stop moving the viewport around; I don't even have to wait 3 seconds. Under Linux: no way, I have to wait 36 seconds. IMO this is so badly done that under Linux power management only works when the desktop is truly idle. I simply don't understand why it can't be fixed.


Maybe this is why it won't be fixed: https://m.youtube.com/watch?v=_36yNWw_07g

I know for a fact that my next GPU is going to be an AMD card. AMD embraces open source a tiny bit more than Nvidia.

#31
Posted 08/21/2017 04:31 AM   
[quote="Road_hazard"]I know for a fact that my next GPU is going to be an AMD card. AMD embraces open source a tiny bit more than Nvidia.[/quote] [b]Please remove your message.[/b] Anyways, I've reported it to moderators - hopefully they will erase it. Express your thoughts about NVIDIA somewhere else. Phoronix, Reddit, Linux.com, etc. This is a [b]support forum[/b], not "I hate-NVIDIA-because-it's-trendy forum". Aaron and other NVIDIA guys here are normal people (engineers, support stuff) who don't make any decisions in regard to NVIDIA products, modus operandi, etc. etc. etc. The least they want to see is hatred towards their employer.
Road_hazard said:I know for a fact that my next GPU is going to be an AMD card. AMD embraces open source a tiny bit more than Nvidia.


Please remove your message. Anyway, I've reported it to the moderators - hopefully they will erase it. Express your thoughts about NVIDIA somewhere else: Phoronix, Reddit, Linux.com, etc.

This is a support forum, not an "I hate NVIDIA because it's trendy" forum.

Aaron and the other NVIDIA guys here are normal people (engineers, support staff) who don't make any decisions regarding NVIDIA products, modus operandi, etc.

The last thing they want to see is hatred towards their employer.

Artem S. Tashkinov
Linux and Open Source advocate

#32
Posted 08/21/2017 08:39 AM   
[quote="Eneen"]It takes about [b]3 seconds to lower power consumption[/b]. So under windows it's perfectly possible. I've also checked system power consumption using blender 2.78. Under windows I get ~30W back (system) immediately after I stop moving viewport around, I even don't have to wait 3 secs. Under linux: no way, have to wait 36 seconds. IMO this seems so badly done, that under linux power management works only if you do true desktop idle. I simply don't understand why it can't be fixed.[/quote] Even in drivers before 381.xx it took a lot more than 3 seconds to lower the GPU frequencies (and change the P state) under Linux. I advocate for introducing a kernel module variable to configure this behaviour. [url=https://devtalk.nvidia.com/member/1882081/]@Aaron[/url], this is a nasty bug. Please bring it up!
Eneen said:It takes about 3 seconds to lower power consumption. So under Windows it's perfectly possible.

I've also checked system power consumption using Blender 2.78. Under Windows I get ~30 W back (system-wide) immediately after I stop moving the viewport around; I don't even have to wait 3 seconds. Under Linux: no way, I have to wait 36 seconds. IMO this is so badly done that under Linux power management only works when the desktop is truly idle. I simply don't understand why it can't be fixed.

Even in drivers before 381.xx it took a lot more than 3 seconds to lower the GPU frequencies (and change the P-state) under Linux. I advocate introducing a kernel module parameter to configure this behaviour, as sketched below.
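Something along these lines in /etc/modprobe.d/ is what I have in mind - note that the parameter name below is purely hypothetical, no such knob exists in the current driver:

    # /etc/modprobe.d/nvidia-powermizer.conf
    # Hypothetical knob: how many seconds the driver may hold high clocks
    # after the last GPU activity before stepping the P-state back down.
    options nvidia NVreg_PowerMizerRampDownDelay=3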

@Aaron, this is a nasty bug. Please bring it up!

Artem S. Tashkinov
Linux and Open Source advocate

#33
Posted 08/21/2017 08:46 AM   
@birdie: This seems to be more complicated: when the driver detects more peak/constant load, it seems to apply a longer cool-down under Windows too. But in Blender it doesn't matter: moving the viewport: 42 W (system), not moving the viewport: 30 W (system) - instantly.
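If anyone wants to watch this without a power meter, the ramp-down is also visible from the command line; a minimal sketch using nvidia-smi (assuming a driver new enough to report these query fields):

    # Poll the P-state, clocks and board power once per second; after the
    # load stops, note how long it takes to fall back to the idle P-state.
    nvidia-smi --query-gpu=timestamp,pstate,clocks.gr,clocks.mem,power.draw --format=csv -l 1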
@Road_hazard: As @birdie stated, this is a support forum. If you want to help, try to find other affected people.

#34
Posted 08/21/2017 10:25 AM   
[quote=""][quote="Road_hazard"]I know for a fact that my next GPU is going to be an AMD card. AMD embraces open source a tiny bit more than Nvidia.[/quote] [b]Please remove your message.[/b] Anyways, I've reported it to moderators - hopefully they will erase it. Express your thoughts about NVIDIA somewhere else. Phoronix, Reddit, Linux.com, etc.[/quote] Apparently, posting here begging for Nvidia engineers to fix your bug is about as useful as me complaining about it on Phronix, Reddit, etc. Like talking to a brick wall, no? Sorry, didn't mean to trigger you, "I don't like what you're saying, I'm gonna cry to mommy and daddy and get you to shut up!" The mods on here are big boys and girls, if they don't like what I type, they can erase it without you crying about it. Cheer up, maybe Chelsea will run in 2020 and you can vote for her and take back the white house. It's obvious that the engineers could care less about fixing a power consumption issue in Linux. You're wasting your breathe here and on every support forum you mentioned above. This [s]intentional[/s] bug has survived through how many Linux driver updates? The engineers have had more than one chance to correct it but they could care less. They're in bed with Microsoft and their marching orders are to do as little as they can for Linux. Go ahead and finish up your anti-orange Jesus poster and wake up to this reality and see the world for what it is. I'll reserve you an AMD card so you can get on with your life and stop wasting your time on developer forums/support forums/soros forums, complaining about a problem that NOBODY cares about except the 5 or so people crying about it on this forum.
birdie said:
Road_hazard said:I know for a fact that my next GPU is going to be an AMD card. AMD embraces open source a tiny bit more than Nvidia.


Please remove your message. Anyway, I've reported it to the moderators - hopefully they will erase it. Express your thoughts about NVIDIA somewhere else: Phoronix, Reddit, Linux.com, etc.


Apparently, posting here begging for Nvidia engineers to fix your bug is about as useful as me complaining about it on Phoronix, Reddit, etc. Like talking to a brick wall, no? Sorry, didn't mean to trigger you: "I don't like what you're saying, I'm gonna cry to mommy and daddy and get you to shut up!"

The mods on here are big boys and girls; if they don't like what I type, they can erase it without you crying about it. Cheer up - maybe Chelsea will run in 2020 and you can vote for her and take back the White House.

It's obvious that the engineers couldn't care less about fixing a power consumption issue in Linux. You're wasting your breath here and on every support forum you mentioned above. This intentional bug has survived how many Linux driver updates? The engineers have had more than one chance to correct it, but they couldn't care less. They're in bed with Microsoft, and their marching orders are to do as little as they can for Linux.

Go ahead and finish up your anti-orange-Jesus poster, wake up to reality, and see the world for what it is. I'll reserve you an AMD card so you can get on with your life and stop wasting your time on developer forums/support forums/Soros forums, complaining about a problem that NOBODY cares about except the five or so people crying about it on this forum.

#35
Posted 08/21/2017 01:35 PM   
I won't even dignify Road_hazard's comment with a rational argument; that shit doesn't work on trolls who have nothing better to do than waste their life and, more importantly, our time on shit like that.

Time for NVIDIA to implement user blocking on this forum so we can let them cry in the darkness to their heart's desire.

#36
Posted 08/21/2017 01:41 PM   
[quote=""]@birdie: This seems to be more complicated, when driver detects more peak/constant load it seems to have longer cool-down under windows. But under blender doesn't matter. Moving viewport: 42W (system), not moving viewport: 30W (system) - instantly. @Road_hazard: As @birdie stated this is support forum. If you want to help try to find other affected people.[/quote] I'm pretty sure that everyone that cares about this bug is already here and has sounded off on it. There an ETA on a fix? Was it even acknowledged as a bug that NEEDS fixed, or expected behavior? The last reply from Nvidia, TWO MONTHS ago, said that without risking graphic corruption, there doesn't seem to be anything they can do to fix it. Time to move on everyone, this is a lost cause.
Eneen said:@birdie: This seems to be more complicated: when the driver detects more peak/constant load, it seems to apply a longer cool-down under Windows too. But in Blender it doesn't matter: moving the viewport: 42 W (system), not moving the viewport: 30 W (system) - instantly.
@Road_hazard: As @birdie stated, this is a support forum. If you want to help, try to find other affected people.



I'm pretty sure that everyone who cares about this bug is already here and has sounded off on it. Is there an ETA on a fix? Was it even acknowledged as a bug that NEEDS to be fixed, or is it expected behavior? The last reply from Nvidia, TWO MONTHS ago, said that without risking graphics corruption there doesn't seem to be anything they can do to fix it.

Time to move on, everyone; this is a lost cause.

#37
Posted 08/21/2017 01:42 PM   
[quote="Road_hazard"]Apparently, posting here begging for Nvidia engineers to fix your bug is about as useful as me complaining about it on Phronix, Reddit, etc. Like talking to a brick wall, no? Sorry, didn't mean to trigger you, "I don't like what you're saying, I'm gonna cry to mommy and daddy and get you to shut up!"[/quote] What part of "This is a support forum and only a [b]few NVIDIA engineers[/b] frequent it" don't you understand? NVIDIA top management, CIO, CEO, CFO, etc. - [b]none of them ever come here[/b]. Could you stop bitching here and talk to Jensen Huang directly? Or better yet, abandon NVIDIA products altogether. You have just one way of fixing your problems in Linux with binary NVIDIA drivers: talk about your issues, find other affected users and post as much relevant information as possible. Bitching will not help. It will make Aaron and other NVIDIA developers avoid these threads like a plague - they are here to work with you, not to hear that their employer doesn't love open source as much as AMD does. This is f*cking irrelevant! AMD has become so open source friendly because they are the underdog and they need something to gain prominence. They cannot win by performance metrics, but they gain publicity by being open source friendly. I've updated the original post to make the information in it relevant and actual. [url=https://devtalk.nvidia.com/member/1882081/]Aaron[/url] and [url=https://devtalk.nvidia.com/member/1827029/]Sandipt[/url], please file a bug report, investigate it and solve it. High clock speeds shouldn't be retained for more than five consequent seconds in my opinion. Ten are the absolute maximum.
Road_hazard said:Apparently, posting here begging for Nvidia engineers to fix your bug is about as useful as me complaining about it on Phoronix, Reddit, etc. Like talking to a brick wall, no? Sorry, didn't mean to trigger you: "I don't like what you're saying, I'm gonna cry to mommy and daddy and get you to shut up!"


What part of "this is a support forum and only a few NVIDIA engineers frequent it" don't you understand? NVIDIA's top management - CIO, CEO, CFO, etc. - none of them ever come here.

Could you stop bitching here and talk to Jensen Huang directly?

Or better yet, abandon NVIDIA products altogether.

You have just one way of getting your problems with the binary NVIDIA drivers fixed in Linux: talk about your issues, find other affected users, and post as much relevant information as possible.

Bitching will not help. It will make Aaron and the other NVIDIA developers avoid these threads like the plague - they are here to work with you, not to hear that their employer doesn't love open source as much as AMD does. That is f*cking irrelevant!

AMD has become so open-source friendly because they are the underdog and need something to gain prominence. They cannot win on performance metrics, but they gain publicity by being open-source friendly.

I've updated the original post to keep the information in it relevant and up to date.

Aaron and Sandipt, please file a bug report, investigate it and solve it.

High clock speeds shouldn't be retained for more than five consecutive seconds, in my opinion. Ten is the absolute maximum.

Artem S. Tashkinov
Linux and Open Source advocate

#38
Posted 08/21/2017 07:21 PM   
Aaron, I want this bug to be fixed.

Artem S. Tashkinov
Linux and Open Source advocate

#39
Posted 09/06/2017 12:57 PM   
New 384 drivers take even longer to downclock :/

#40
Posted 09/07/2017 07:22 PM   
I will bump this thread indefinitely.

This is a blocker, for God's sake.

Artem S. Tashkinov
Linux and Open Source advocate

#41
Posted 09/18/2017 03:39 PM   
Obviously driver version 384.90 is affected as well.
kernelOfTruth said:New 384 drivers take even longer to downclock :/

I edited the original post to reflect this further regression.

Artem S. Tashkinov
Linux and Open Source advocate

#42
Posted 09/24/2017 05:32 PM   
[quote=""]Obviously drivers version 384.90 are affected as well. [quote="kernelOfTruth"]New 384 drivers take even longer to downclock :/[/quote] I edited the original post to reflect this further regression.[/quote] nvidia really could add an "optimal" PowerMizer setting like on Windows drivers, for maximum power savings for e.g. mostly 2D desktop operation but I doubt it's realizable under Linux without APIs, lacking feedback from the desktop/GUI/X-server, etc.
birdie said:Obviously driver version 384.90 is affected as well.
kernelOfTruth said:New 384 drivers take even longer to downclock :/

I edited the original post to reflect this further regression.


NVIDIA really could add an "optimal" PowerMizer setting like in the Windows drivers, for maximum power savings during e.g. mostly-2D desktop operation,

but I doubt it's realizable under Linux without the necessary APIs, given the lack of feedback from the desktop/GUI/X server, etc.
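For what it's worth, the Linux driver already exposes a PowerMizer mode attribute via nvidia-settings, but it only toggles between adaptive and maximum-performance behaviour - there is no Windows-style "optimal" mode, and neither setting gives the quick ramp-down seen on Windows. A sketch, assuming GPU index 0:

    # Query the current PowerMizer mode of the first GPU
    nvidia-settings -q "[gpu:0]/GpuPowerMizerMode"

    # 0 = Adaptive (default), 1 = Prefer Maximum Performance
    nvidia-settings -a "[gpu:0]/GpuPowerMizerMode=0"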

#43
Posted 09/24/2017 06:25 PM   
[quote=""][quote=""]Obviously drivers version 384.90 are affected as well. [quote="kernelOfTruth"]New 384 drivers take even longer to downclock :/[/quote] I edited the original post to reflect this further regression.[/quote] nvidia really could add an "optimal" PowerMizer setting like on Windows drivers, for maximum power savings for e.g. mostly 2D desktop operation but I doubt it's realizable under Linux without APIs, lacking feedback from the desktop/GUI/X-server, etc. [/quote] I think it's not related to desktop, on blender under windows I get my clocks down in less then second. Under linux, you know. Today I noticed that I have 30sec cooldown under windows too. Clocks have been high for 30 secs just after moving cube in blender. Not sure what I did (maybe background drivers update?) and it was gone after setting power to adaptive and back to optimal and restarting PC. Not sure how long I was working with this. This is so annoying. I'm on desktop, but I can't imagine having laptop with this bug. That's one of those things that stops me from switching to Linux completely and I have to say I've done lots of work to adopt my 3D workflow and new GNOME Linux desktop is so cool. Fortunately good thing is, that Nvidia driver has same speed under Linux in blender. EDIT: just checked and I've got same drivers from 18.07.2017 under Windows.
said:
birdie said:Obviously driver version 384.90 is affected as well.
kernelOfTruth said:New 384 drivers take even longer to downclock :/

I edited the original post to reflect this further regression.


NVIDIA really could add an "optimal" PowerMizer setting like in the Windows drivers, for maximum power savings during e.g. mostly-2D desktop operation,

but I doubt it's realizable under Linux without the necessary APIs, given the lack of feedback from the desktop/GUI/X server, etc.


I think it's not related to the desktop: in Blender under Windows I get my clocks down in less than a second. Under Linux, well, you know.
Today I noticed that I had the 30-second cooldown under Windows too: clocks stayed high for 30 seconds right after moving a cube in Blender. I'm not sure what caused it (maybe a background driver update?), and it was gone after setting power to adaptive and back to optimal and restarting the PC. I'm not sure how long I had been working like this.
This is so annoying. I'm on a desktop, but I can't imagine having a laptop with this bug. It's one of those things that stops me from switching to Linux completely, and I have to say I've done a lot of work to adapt my 3D workflow, and the new GNOME Linux desktop is so cool.
Fortunately, the good news is that the NVIDIA driver is just as fast under Linux in Blender.

EDIT: I just checked, and I've got the same drivers, dated 18.07.2017, under Windows.

#44
Posted 09/25/2017 10:12 AM   
Bump ;-)

Artem S. Tashkinov
Linux and Open Source advocate

#45
Posted 01/01/2018 11:23 AM   