Video card not heating up (1050 Ti) (Red Dead Redemption 2)
1050 Ti 4 GB. I play Red Dead Redemption 2 at medium settings - 37-40 fps, and the video card temperature is 62C.
I set high settings, the fps drops to 27-30, but the temperature only goes up to 64C ...
The card has a small cooling upgrade. But why does the temperature barely rise while the FPS tanks?
What do you recommend doing?
ps For comparison: in The Witcher 3 the temperature was 82C before the upgrade and 72C after, and when I dropped the graphics settings to low, it fell to 50C. In RDR2 it practically doesn't change at all ...
semen1900
Because the 1050 Ti, like the 1650, is a "cold card"; if it was heating up to 80 degrees, there were problems with the cooling.
X_ray_83
On the Nvidia forums I read that the normal upper operating temperature for the 1050 is 84-85C, and 92C is critical.
I never saw more than 82C before upgrading the heatsink and fans, and no more than 72C after, and even then only with The Witcher on high.
ClevoGame
... so I want to find out why, when the settings are raised to high or ultra, the fps tanks but the temperature does not rise ...
Overclock the card?
semen1900
... so I want to find out why, when the settings are raised to high or ultra, the fps tanks but the temperature does not rise ...
You'd better explain where you got the idea that the temperature should rise in your case. I'm just curious how you came to that conclusion.
In addition to what was said above about GPU temperatures: a PC does not consist of the video card alone, and if any other component can't keep up (CPU, RAM, HDD, etc.), then the remaining components aren't fully loaded and don't run at the peak of their capabilities.
PS By the way, no single monitoring tool gives 100% of the picture, but it's usually enough to say roughly where the problem is.
Space Marine wrote:
You'd better explain where you got the idea that the temperature should rise in your case. I'm just curious how you came to that conclusion.
So for you temperature limits are a joke or something?! =) Of course, 80 degrees is too much for a 1050 Ti and it shouldn't run like that. But broadly speaking, performance does depend on temperature! And the guy wrote that he used to see the same 80 degrees; he may not know the card shouldn't heat past 75, or even 70, so he figured something was wrong. Downvoting him was completely undeserved!
semen1900
That's not true, don't make things up; even here it's clearly no more than 70 degrees! If you're seeing temperatures like that, you have ventilation problems in your case.
Dos9ra
In console ports like RDR2 there is always a heavy load on the CPU, so what he describes is not surprising.
Space Marine wrote:
that the temperature should rise in your case? I'm just curious how you came to that conclusion.
I'm wondering why it doesn't rise in RDR2.
When the settings are raised to "high", the load on the GPU grows, but the temperature doesn't change ...
Why, then, does it change in The Witcher 3: lower at minimum settings, higher at maximum.
semen1900
Because there is nothing to load in RDR2; the game's graphics are weak, and it gets by purely on art design!
You can see it in Saint Denis: that "fog" used to cut the draw distance for consoles was carried straight over to the PC.
You greatly overestimate RDR2's graphics. Compare it with, say, Assassin's Creed Valhalla and it's night and day: Valhalla doesn't blur distant objects, because it was tested and optimized for PC.
X_ray_83 wrote:
Because there is nothing to load in RDR2; the game's graphics are weak, and it gets by purely on art design!
You greatly overestimate RDR2's graphics
In the next thread they also downvoted me and wrote there was nothing to load in The Witcher 3. But here in RDR2 there's the sky and the water, superb shadows, and so on.
So if there's nothing to load in RDR2, why is it 23 fps at ultra, at a temperature of 62-64C? ...
... and I'm not overestimating it: people's hair and the tails/manes of horses are actually better in The Witcher 3, and that's from 2015)) But nature in RDR2 is rendered very colorfully and realistically. Rockstar surprises me a little here, because in GTA their graphics were always cartoonish junk.
semen1900
"then why then 23 fps on ultras"
Because this is a console port and this happens, and the problematic how many patches have already been released!? Can't you remind me !?
"The hair of people and the tails / manes of mares are even better in The Witcher3"
Because the priority development was on the PC, it seems obvious))) It's like now Stalker 2 graphics on the PC, the priority is on the Xbox crumbs.
Meanwhile, the 2017 Xbox One X handles The Witcher 2 from 2012.
X_ray_83 wrote:
"why then 23 fps on ultras"
Because this is a console port and this happens, and how many patches have already been problematic? Can't you remind me !?
I don’t remember, I’m not a gamer, my main hobby is motocross.
games like this - drive in the evening on the weekend.
... so it turns out that 3080 vidyahi is not needed for RDR2, but 1050tÑ– would be quite enough, if the developers would have strained at least a little.
and overclocking 1050 will do nothing in this case? (well, except for the risk of burning the vidyahu)
semen1900 wrote:
And will overclocking the 1050 do anything in this case? (well, besides the risk of burning out the card)
With monitoring there is basically no such risk. A gain of 10% at best.
semen1900 wrote:
the schoolkid computer whizzes here downvote everyone,
I'm not surprised you got downvoted, though, and not because the "schoolkid computer whizzes" downvote everyone, but because you asked an abstract question in the technical subforum. No useful information about settings or configuration, no screenshots, no videos, and so on.
If you were really curious, you could turn on the monitoring overlay in MSI Afterburner and compare the readings; you might well have found at least an approximate answer yourself. Load on a video card varies: some settings eat up memory without heavily loading the GPU, while others lean on the GPU's raw performance and heat the card up.
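To illustrate the kind of comparison suggested above: on NVIDIA cards, the same readings Afterburner overlays (temperature, GPU load) can also be pulled from the `nvidia-smi` command-line tool that ships with the driver. This is only a sketch, not something from the thread; the `--query-gpu` flags are real `nvidia-smi` options, while the function names here are made up for the example.

```python
import subprocess

def parse_gpu_sample(csv_line: str) -> dict:
    """Parse one 'temperature, utilization' CSV line as printed by nvidia-smi."""
    temp_s, util_s = csv_line.split(",")
    return {
        "temp_c": int(temp_s.strip()),
        "gpu_util_pct": int(util_s.strip().rstrip(" %")),
    }

def read_gpu_sample() -> dict:
    """Query the GPU once; requires the NVIDIA driver's nvidia-smi tool on PATH."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=temperature.gpu,utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return parse_gpu_sample(out.strip())
```

Logging a sample every second while switching between the medium and ultra presets would show whether the extra settings actually raise GPU load (and heat), or just cost fps somewhere else.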
semen1900 wrote:
In the next thread they also downvoted me and wrote there was nothing to load in The Witcher 3.
I'd point out that they wrote the Witcher's technologies are outdated, not that the Witcher doesn't load the card.
semen1900
The processor and video card as a pair can't handle RDR 2. Ultra-High settings - 24.4 FPS.
Spoiler: https://pc-builds.com/cyri/Intel_Xeon_E5-1650/NVIDIA_GeForce_GTX_1050_Ti/0jx0VZ/?GameList=(3i)
There is no bottleneck between the processor and the video card.
Spoiler: https://pc-builds.com/calculator/Xeon_E5-1650/GeForce_GTX_1050_Ti/0jx0VZlu/16/100/
A Xeon E5-1650 + 3080 in RDR 2 - 28.9 FPS. The Xeon is the limit.
Spoiler: https://pc-builds.com/cyri/Intel_Xeon_E5-1650/NVIDIA_GeForce_RTX_3080/0jx174/?GameList=(3i)
Dos9ra
So for you temperature limits are a joke or what?!
What temperature limits are we even talking about at 70 degrees on a video card? Get a grip.
semen1900
I'm wondering why it doesn't rise in RDR2.
When the settings are raised to "high", the load on the GPU grows, but the temperature doesn't change ...
Why, then, does it change in The Witcher 3: lower at minimum settings, higher at maximum.
Because with your video card and processor, whether at minimum or maximum settings, the bottleneck in RDR2 is always your video card: it sits at 100% load there. So the temperature doesn't have to change much.
The CPU shown isn't exactly yours, but it's a close analogue in performance.
As for why you get such a temperature range in The Witcher 3, even though there too your 1050 should be at 100% load at low and at ultra alike - ask your jury-rigged cooling system.
Again, not your exact card, but close to it in performance.
Although, if your 1050 manages to heat above 70 degrees even with the upgraded cooling, I no longer know where truth ends and fantasy begins.
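Space Marine's argument boils down to a simple rule of thumb: if the GPU is pinned near 100% at every preset, the game is GPU-bound and the temperature has little reason to move; if GPU load sits well below 100% while fps is low, something else (usually the CPU) is the limit. A minimal sketch of that rule, with an illustrative 90% threshold and made-up names, not anything from the thread:

```python
def classify_bottleneck(gpu_util_samples, gpu_bound_threshold=90.0):
    """Guess the limiting component from logged GPU-utilization samples (%).

    If the GPU sits near 100% the whole time, raising graphics settings mostly
    trades fps for quality while load (and thus temperature) stays flat --
    which matches what the thread describes happening in RDR2.
    """
    if not gpu_util_samples:
        raise ValueError("need at least one sample")
    avg = sum(gpu_util_samples) / len(gpu_util_samples)
    return "gpu-bound" if avg >= gpu_bound_threshold else "cpu-or-other-bound"
```

For example, samples of `[99, 100, 98]` classify as `"gpu-bound"`, while a run averaging 60% GPU load would come back `"cpu-or-other-bound"`.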
IngwardIn wrote:
With monitoring there is basically no such risk. A gain of 10% at best.
I have Afterburner installed; that's what I used to watch FPS and temperatures.
Well ... 10% is not nothing anymore: if it's around 27 fps at high, then +10% is already 30 fps, and at 30 you can actually play.
Marsj wrote:
The processor and video card as a pair can't handle RDR 2. Ultra-High settings - 24.4 FPS.
yes, it was somewhere around 23-25 fps ... well, for an old budget card that's not too bad either.
Space Marine wrote:
As for why you get such a temperature range in The Witcher 3, even though there too your 1050 should be at 100% load at low and at ultra alike - ask your jury-rigged cooling system.
that's exactly what I'm wondering about
ps why do you write that 75 is already a high temperature for the 1050?
Tom's Hardware wrote:
83C is the max temp before throttling, and the 90s is when the card will shut off.
... if 83 is the maximum operating temperature, then 80 is still within operating range.
semen1900
ps why do you write that 75 is already a high temperature for the 1050?
Because it's a 75-watt card.
Even a single-fan Palit stub like this one keeps temperatures around 55-62 degrees under full load.
Spoiler
So a reasonable question arises: how did your card manage to heat up to 82 degrees on the stock cooler, and to 70+ with the upgraded cooling?
83, 90 and the other cosmic temperatures quoted by the manufacturer are the chip's critical operating temperatures, which in real use on this video card you could only hit by removing the cooling system from the chip entirely, or at least stopping the card's fan.
Did you actually buy the card in a store, or did you get it from some hapless miner / from China?
Space Marine wrote:
83, 90 and the other cosmic temperatures quoted by the manufacturer are the chip's critical temperatures, which
Overclocked the core by +175 and the memory by +550; the temperature rose by only 2 degrees in RDR2, but at ultra it was already 27-28 fps instead of 24. Dialed back a couple of settings and got 31 fps.