hardcore gold 11.03.22 11:26 pm

Nvidia is dumping the 700 series (The Witcher 3: Wild Hunt)

It's a shame: the 760 and 960 are supposed to have the same raw power, yet the 960 pulls ahead and even overtakes the 770 ((
44 Comments
Behemoth cat 11.03.22

Nickbee
Of course! And I'm saying the same thing; the example of Nvidia and the rest of the "hardware" business makes everything perfectly clear. :)

TImk0 11.03.22

So which is better, Radeon or Nvidia?

Miser-80 11.03.22

stalker8509
Well, on comparing two 290s against a Titan, I agree, but one 290 versus a 780 I agree with only partially. At launch the 290 was far cheaper than the 780 Ti: 22,000 versus 36,000 rubles (reference versions). It did lose a little, since the 780's clocks are higher and it also has somewhat more shader processors, but in fps that gave a difference of 8-10 frames at most (and the 290 never dropped below 45 fps in any game), while the 780 Ti cost 14,000 more. And for the future the 290X was the better buy, since it supports DX12, unlike the DX11.1 in the 780 Ti. But these are all details; on the whole, yes, I agree. The AMD folks may be lazy, but they don't cheat buyers as brazenly as Nvidia does.
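To put that value argument in numbers, here is a rough price-per-frame sketch using the figures quoted above. The fps values are an assumption built from the stated 45 fps floor and the 8-10 frame gap (9 taken as a midpoint); the prices are the launch prices in rubles from the comment.

```python
# Rough price-per-frame comparison using the figures quoted above.
# ASSUMPTION: the 290 runs at its quoted 45 fps floor and the 780 Ti
# leads by the quoted 8-10 frames (9 taken as a midpoint).
cards = {
    "R9 290": (22_000, 45),          # launch price in rubles, fps
    "GTX 780 Ti": (36_000, 45 + 9),
}

for name, (price_rub, fps) in cards.items():
    print(f"{name}: {price_rub / fps:.0f} rubles per frame")
# R9 290: 489 rubles per frame
# GTX 780 Ti: 667 rubles per frame
```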

Nickbee 11.03.22

http://www.playground.ru/blogs/witcher_3_wild_hunt/the_witcher_3_-_zagovor_nvidia_i_razrabotchikov_sobiraem_fakti-144985/

JONDROGON 11.03.22

The 960 is simply more powerful than the 760: it handles Nvidia's technologies faster and better, and overall Maxwell has more potential. As for getting a 980 - I think yes, it will beat any Kepler card in any game))

nominal 11.03.22

stalker8509
Bigger doesn't always mean better: a '76 Cadillac has a 7-liter engine and a 2015 BMW has two liters, so which one will be faster?

TImk0 11.03.22

stalker8509
Thanks, that's all laid out very nicely))
Now the real question) Which video card should I buy that's both power-efficient and relatively cheap?
I have an i5-3570 CPU and an AMD Radeon HD 6670 video card.
What's worth buying now? Does it even make sense to replace this card?

nominal 11.03.22

stalker8509
I'm not saying the 980 is 50-100% faster. As in any card lineup, each step up is about 10-20% faster, just as the 980 Ti is 10-20% faster than the regular 980. If you compare it to cars, this year's model will be 10-20% faster than last year's of the same line. Quite the conspiracy. :))

TImk0 11.03.22

stalker8509
Yes, I'd like it cheaper, of course - say, 10-15 thousand rubles - but I understand that for that money you can't buy anything much better than my old card, and I don't see the point of getting something only slightly better than what I have.
So if there are better options, I'll save up a bit)) It's not urgent.
Maybe it makes sense to wait for reviews of the Radeon 300 series;
I don't know which is better))

Miser-80 11.03.22

Adolom
"960 is more powerful than just 760" - Adolom, everything is simple - tests and tests again, indicate that it is more powerful only where N-vidia firewood was finished normally - in other games, poorly optimized or resource-dependent, I'm silent about all sorts of 3d -editors, 960 will not cope with higher numbers of iron. And the difference between architectures - I do not argue, it is important, but only when it is really something fundamentally new, otherwise, in fact, the fundamental difference between technologies manifests itself only after 2-3 generations of video cameras. For example, in this situation I can give a personal example: A long time ago, I bought a top-end 8800GT at that time with 1GB of video and 320 bit depth. So I had to change it only with the release of the 500th generation, and even then only because, having seen the performance of the dual-core 590th at that time, I could not refuse myself. In fact, most of the games of that time went on for me, if not at maximum speeds, then at high ones for sure. There were exceptions - the second Crisis, Metro and Stalker ZP (and even then, the last one - only because of poor optimization of the game itself). But a friend decided then to wait and later bought the 680th, which, at a cost almost like my 590th of that time, lost to it in everything (which is quite logical), with the release of the 700th, he, spitting that he was led to n-vidia advertising , again changed the video to 780 (not Ti), and I, in turn, after waiting a bit, replaced the old 590th with R9 290x. We compared the results with it and it became obvious that the 780 was losing to r9 290x, overtaking the old 590 by only 10-12 fps. Both of us then came to the conclusion that you can’t argue against the “iron” numbers. Yes, you can be smart with firewood, as they do now in n-vidia, but as soon as such a toy comes across,

Miser-80 11.03.22

nominal
nominal, where did you get the idea that the 980 will beat the 780? In the new, polished games - yes, of course, but not in the old ones. Stalker8509, as I understand it, compared not the 980 against the 980 Ti, but simply the 980 against the 780. So let me surprise you: run Metro 2033, Arma 3, or any other game without per-title optimization (that is, where Nvidia can't pre-tune the drivers) - say, the same Stalker: Call of Pripyat with some graphics mod - and see for yourself what happens. In such cases the 780 wins by 15-20 fps. And that is a clear demonstration that nobody has repealed raw hardware; it's measured in numbers, not in drivers. Where I don't agree with Stalker8509 is on clock speeds, because when overclocking and comparing in MSI Afterburner the difference only shows up at high frequencies. In the other cases it's simply bus width and processor count that matter.

Behemoth cat 11.03.22

stalker8509
I completely agree, comrades! In our store it now costs 18,500, even though at the time of the price hikes it was 26,800, and the same 980 was pushing 40,000 (!!!). For 37,000 you could simply grab two 290s and Nvidia could quietly go suck it... Now, because the AMD folks got lazy and didn't refresh their hardware to please the market, Nvidia is simply skimming the cream and using every method available, which is why buying their products right now isn't a great deal.

JONDROGON 11.03.22

Miser-80
Yes, there's nothing to be done. I've been on a 660 for a bit over 2 years now. Then the 960 came out: it asks for a 400-watt PSU and I have 500, and the card itself is fine for my wallet. So the PSU doesn't need replacing, and it's about 50% better than the 660. All in all, I'm thinking of swapping, and I'm sure I won't regret it - it'll be enough for a couple of years, then it'll stop keeping up again, though DX12 may yet bring us something, who knows))

JONDROGON 11.03.22

stalker8509
Yes, I'm pretty much set on buying a 960; well, I'll take a look at this one too, I hadn't even heard of it. Overall I don't see much reason to take it: DX12 is undoubtedly the future of games, but it could take ages to actually arrive, so I think buying the 960 is the better move. And yes, in Futuremark this Mars 760 scores 8,680 points, while the 960 scores 10,500))
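For what it's worth, here is the size of the gap those quoted Futuremark scores imply (a trivial sketch; the two scores are the ones given above):

```python
# Relative gap implied by the Futuremark scores quoted above.
mars_760_score = 8_680
gtx_960_score = 10_500

gap = (gtx_960_score - mars_760_score) / mars_760_score
print(f"GTX 960 scores {gap:.0%} higher")  # GTX 960 scores 21% higher
```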

Behemoth cat 11.03.22

Seconded - synthetics are evil! And as for the Asus Mars, Stalker8509, you definitely won't find one on sale: at launch only about 500 units reached us, and I suspect it was the same in other stores. And yes, that card was a beast, and two of them in SLI were outright monstrous - it came to 40,000 for the cards plus 7,000 for a PSU to feed them, and the result tore apart a Titan, two 780s, and two R9s... If you manage to find one, consider yourself very lucky...

Ev_Gen 11.03.22

Especially for stalker8509: I tried two GTX 960 cards, the first from Gigabyte with 2 GB, the second from Palit with 4 GB. The difference is especially noticeable in games that use more than 2 GB of video memory. For example, I played Dying Light on all-maximum settings, first on the 2 GB card, and noticed the game consumed the entire 2 GB; fps sat around 50, but in heavy scenes - spitting poisonous zombies and the like - microfreezes dropped it to 30 fps, i.e. at those moments the card's video memory was clearly insufficient for the game. Then I put in the 4 GB card and everything became clear: the game used about 3.5 GB of video memory, the fps rose to 60, and the microfreezes disappeared completely in every scene. P.S. The Witcher 3 on all-high settings with hair (HairWorks) off averages 45-55 fps, about 5 fps lower in cities. The 128-bit bus shows no downside in any game. Before that I had NVIDIA GTX 9600, 550 Ti, 460, 560, 660, and 670 cards. Before buying the GTX 960 I tried an AMD Radeon HD 7950 with its 384-bit bus, and I reached one conclusion: people, never buy video cards from AMD! They're complete rubbish!
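A minimal sketch of how one could watch for the VRAM starvation Ev_Gen describes, assuming a single Nvidia GPU and the stock nvidia-smi tool on PATH (the query fields are standard nvidia-smi options; run it in a second window while playing):

```python
import subprocess
import time

# Poll nvidia-smi once per second and print used vs. total video memory.
# Running this alongside a game makes VRAM-starvation moments (like the
# microfreezes described above) visible in the log.
while True:
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    used_mib, total_mib = (int(v) for v in out.split(", "))
    print(f"VRAM: {used_mib} / {total_mib} MiB ({used_mib / total_mib:.0%})")
    time.sleep(1)
```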

Miser-80 11.03.22

Oh-ho! Finally, a sequel! More than that - YouTube has entered the fray - I'll definitely toss in my "five cents" below :) I'd already assumed the topic was forgotten.

Ev_Gen, your words - "The Witcher 3 on all-high settings with hair turned off averages 45-55 fps, 5 fps lower in cities. The 128-bit bus shows no downside in any game. Before that I had NVIDIA GTX 9600, 550 Ti, 460, 560, 660, 670 cards" - fully confirm that you can be sold any slag, because you are the dream of the marketers at both Nvidia and AMD :) I had a good laugh: do you even realize that the entire list of cards you cited CANNOT, in principle, do anything in The Witcher 3, because they're technologically obsolete? You have to compare new cards, and that's exactly where I disagree with Stalker8509, since he brings up bus width IN GENERAL, while the card's generation shouldn't be forgotten. You have to compare cards of two generations - current and previous - because of the technologies involved, DirectX above all. And by the way, Ev_Gen, if you were already comparing things, here's some info for you to chew on - specifically on The Witcher 3. Advice for the future: when the AMD Fury comes out, be sure to look at its comparisons with the Titan X.

And as for both of you, Stalker8509 included - in my opinion you've drifted a bit off topic: we weren't arguing AMD vs Nvidia here, but the fact that Nvidia is lying, and lying big. Stalker8509, don't let these schoolkids drag you down. They've been so brainwashed by now that you won't set them straight - nobody will. There's SO MUCH information everywhere for them, and they can't even be bothered to read it carefully...)))

Kirill1998 11.03.22

Yes, Nvidia is fine.

Ev_Gen 11.03.22

stalker8509, you read my post carelessly. I didn't write that the 960 would beat its AMD counterpart. Firstly, I wrote that the GTX 960 with 2 and with 4 gigs of memory differ significantly in performance (you claimed otherwise), but only in games that use more than 2 gigs of video memory. In the same Witcher, which peaks at about 1.5 GB of video memory, there will be no difference. Secondly, the "antediluvian" Radeon HD 7950 is simply a renamed R9 280 with the same Tahiti chip, and it costs the same as the GTX 960.
Especially for Miser-80: I went through all the tests long ago, and my purchase was made precisely on the basis of analyzing them. The video cards I listed will, of course, not handle The Witcher; I brought them up as a positive example of long experience with NVIDIA, against the not-very-pleasant impression left by the Radeon I tested.
P.S. Honestly, the card's manufacturer doesn't matter to me. What matters is that in modern, freshly released games the card works properly and delivers good FPS - which, unfortunately, Radeon still can't do.

Miser-80 11.03.22

Ev_Gen
Nothing of the sort, my dear! The R9 280 is the HD 7970... and further:
"it delivered good fps, which, unfortunately, Radeon cannot yet do" - it could have delivered it, if Nvidia hadn't gotten clever with its drivers, and then this whole topic wouldn't even exist...