
TinyPodMartiny 02.02.20 08:39 pm

Compatibility of PCI Express 2.0 and PCI Express 3.0

For the new year I'm going to buy a new Nvidia GeForce GTX 670, because my old GeForce 250 GTS can no longer handle many games. But there is a problem: I have a Gigabyte GA-MA770T-UD3 motherboard, and it only supports PCI Express 2.0, while the graphics card is PCI Express 3.0. The question is: will the video card work if I connect it to my motherboard? I searched on Google and people say it's all fine, but I want to make sure here. Maybe someone has run into this problem?
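A minimal back-of-the-envelope sketch of why a 3.0 card still works in a 2.0 slot and what the bandwidth difference amounts to. It assumes the commonly quoted per-lane rates (~500 MB/s for 2.0, ~985 MB/s for 3.0 after encoding overhead); the function name and figures are illustrative, not taken from any post in this thread:

# Sketch: effective PCI-E x16 throughput when a PCI-E 3.0 card sits in a
# PCI-E 2.0 slot. The link negotiates down to the lower generation; the
# per-lane figures below are the commonly cited approximations.

PCIE_LANE_MBPS = {
    "1.0": 250,  # 2.5 GT/s with 8b/10b encoding
    "2.0": 500,  # 5.0 GT/s with 8b/10b encoding
    "3.0": 985,  # 8.0 GT/s with 128b/130b encoding
}

def link_bandwidth_mbps(card_gen: str, slot_gen: str, lanes: int = 16) -> int:
    """One-direction bandwidth of the negotiated link (lower generation wins)."""
    negotiated = min(card_gen, slot_gen, key=lambda g: PCIE_LANE_MBPS[g])
    return PCIE_LANE_MBPS[negotiated] * lanes

print(link_bandwidth_mbps("3.0", "2.0"))  # ~8000 MB/s: a 3.0 card in a 2.0 x16 slot
print(link_bandwidth_mbps("3.0", "3.0"))  # ~15760 MB/s: the same card in a 3.0 slot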
191 Comments
smh 02.02.20

Did I understand correctly that you can safely buy a PCI-E 3.0 video card if the motherboard supports a maximum of 2.0?
I've had an Asus Maximus Formula (the first one) for 8 years, with an Intel Core E8500 CPU.
I've wanted to upgrade the computer for a long time, but every time I ask myself: what's the point, if everything already runs fast?
I only want to play occasionally (very occasionally).
So I'm thinking of getting a GTX 560 Ti (the most powerful card I found that is native PCI-E 2.0), or getting something more powerful with headroom, a GTX 660 for example, or even adding some money and buying a GTX 950.

Do you think any of these cards will lose much performance because my motherboard only supports PCI-E 2.0?

Fioletovyy zhiraf 02.02.20

smh
Here's an article that's just for you: https://www.overclockers.ru/lab/77399/testirovanie-starshih-videokart-nvidia-i-amd-v-interfejsah-pci-express-x16-x8-ijul-2016.html

smh 02.02.20

Fioletovyy zhiraf
Well, I wouldn't say that it's for me.
The test uses a modern motherboard and CPU.
How does that relate to my situation?

Belomorkanal 02.02.20

smh
With graphics cards and processors this weak, there will be no difference between 2.0 and 3.0 at all. Take the GTX 950, the cheapest of the cards that draw additional power directly from the PSU; I wouldn't recommend taking one that's powered only from the bus.

smh 02.02.20

Thank you!
I've decided to go with the latter option after all, either a 950 or a 960, with future-proofing in mind. Then maybe when I upgrade the rest of the computer, the video card won't need replacing.

GeRR_Praetorian 02.02.20

smh
If you're talking about headroom for the future (even if you only play once in a while), wouldn't it be simpler to get a 1060?

GeRR_Praetorian 02.02.20

smh
By the way, don't forget to change the CPU so it can keep up with that video card. Mid-range i5s aren't so expensive now.

EvgenyDMT 02.02.20

PCI-E 2.0 and 3.0 are not compatible!!! If you stick a PCI-E 2.0 card into a PCI-E 3.0 slot, it will burn to a crisp!!

smh 02.02.20

GeRR_Praetorian
No, it's not simpler... I'm not the gamer I was years ago, so there's no way I'm giving 300+ for a card...
150-200 is still all right...

GeRR_Praetorian wrote:
By the way, don't forget to change the CPU so it can keep up with that video card. Mid-range i5s aren't so expensive now.

And that's where it all gets sad... there are no new processors for my socket 775. But replacing the whole system, which cost a lot of money 8 years ago and still isn't slow at anything, makes no sense at all...

GeRR_Praetorian 02.02.20

smh
Cool!!! How you managed to kick this gaming habit, I don't understand.... What a pity! I'm almost thirty, and I spend so much money on the computer every year it's hard to imagine! Not long ago, at the end of December, I spent 73,000 RUB on a CPU, memory, cooler and motherboard, and now, when the GTX 1080 Ti comes out, I'll have to buy those too. I reckon each card will cost around 75 thousand, so that's 150!!!!!!!!!!! And that's only two of them! But I think I'll sell my current cards so it's easier to cover the difference. I can't understand what 'doesn't slow down' means in your understanding. I checked on my own computer recently and ran games at 4K on ultra settings. God, to say it was slow is to say nothing)))) The slideshow was just brutal))) I looked online, and even two GTX 1080s only just cope! Maybe your computer doesn't slow down in games from a decade ago, that I can still believe, but modern ones, hmmm, nah. For example, try launching the new Lara Croft or Deus Ex, which came out recently, at a high resolution with all settings maxed out))))))
I figure I'll put in a huge amount once so I don't have to give Huang any more money for about three years, and after that maybe skip upgrades altogether. I'm already tired of changing components every year. If it weren't for the dollar, it would be easier. And with every year there are fewer and fewer games I want to play; it's probably age. Out of 293 games I only want to play two))))) And even then only 2-3 hours a day. For example, yesterday I finished Mass Effect 3 for the first time!!!!)))) And now I don't know what to do.... Even though the holidays are here! I guess I'll get the Shogun 2 add-on or buy the new Deus Ex. We'll see how the performance and plot are!

SonyK_2 02.02.20

smh wrote:
Did I understand correctly that you can safely buy a PCI-E 3.0 video card if the motherboard supports a maximum of 2.0?
Yes.
On my motherboard with PCI-E 2.0 (see profile) I'm now using my second video card with 3.0 (first a GTX 650, now a 960), and it works without problems. The main thing to remember is that a 2.0 slot can't deliver more than 75 watts. To avoid complications, it's better to take a card with an additional power connector.
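A short sketch of the 75-watt point. The slot budget and connector figures are the standard nominal values; the card wattages are rough, assumed numbers added purely for illustration:

# Sketch: can a card run from the PCI-E slot alone?
# An x16 slot is specified for about 75 W; anything beyond that has to come
# from auxiliary connectors (a 6-pin adds ~75 W, an 8-pin ~150 W).

SLOT_LIMIT_W = 75

def needs_aux_power(board_power_w: float) -> bool:
    """True if the card's board power exceeds what the slot alone supplies."""
    return board_power_w > SLOT_LIMIT_W

# Rough, assumed board-power figures:
for card, watts in {"GTX 750 Ti": 60, "GTX 950": 90, "GTX 960": 120}.items():
    verdict = "needs an aux connector" if needs_aux_power(watts) else "slot power is enough"
    print(f"{card} (~{watts} W): {verdict}")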

smh 02.02.20

GeRR_Praetorian wrote:
I can't understand what 'doesn't slow down' means in your understanding.
You didn't read carefully; you should read between the lines... no slowdowns means no slowdowns. I've barely played games for the last 8 years anyway... Nothing is slow, except for games (which I don't play). Now I've decided to relive my youth and play the new WoW (that's what I'm choosing the video card for). That will be enough, and I won't play again for another 5 years, for sure ;)))

Fioletovyy zhiraf 02.02.20

GeRR_Praetorian
If you're playing at 4K, a GTX 1070 or 1080 is enough. But you don't need to crank every setting to the maximum, because at that resolution anti-aliasing is no longer really needed. By turning it off or setting it to minimum, you can greatly increase performance.

In general, here's a useful video I'd suggest

GeRR_Praetorian 02.02.20

We've tested the GTX 1070 and 1080. And the truth is that one GTX 1080, not to mention the 1070, is far from enough for 4K!!!! There's a user here, -SK.art- (in the sections 'who has what computer', 'choosing a graphics card' and somewhere else); he has two 1070s and even he doesn't play at this resolution, he plays at 3.5K. Here is his video:

In principle you can set a 4K resolution but lower the quality settings. But that's nonsense!!!! Buying a card for +100500 RUB and then turning things down. Isn't it simpler to put everything on ultra/high but at a lower resolution (e.g. your monitor's native resolution, or something higher via DSR, just not 4K)?
As for anti-aliasing: since this is my favourite topic, I'll say that at 4K without anti-aliasing the jaggies are terrible!!!! The video I cited shows that even at this resolution SK.art uses MSAA 4x anti-aliasing!!!! Personally, I always set a resolution of 2880 by 1620 and SMAA or MSAA 4x anti-aliasing. As for 4K, FXAA or SMAA is ideal there, or even better MSAA 2x. They smooth out the jaggies perfectly at that resolution!

Jocker-777 02.02.20

Hello) I've run into a problem. I have an old motherboard, an Asus P5G41T-LX3. I wanted to upgrade it. I got 8 GB of RAM and a Xeon 5440... and now I'm left without a video card. Recommend me a more or less decent card that isn't too expensive. And is it worth buying one used?

Ingener_Mexanik019 02.02.20

Good day! Tell me, if my motherboard has a PCI-E 2.0 x16 slot, will there be a performance loss if I install a GTX-series card that is PCI-E 3.0?

Listoman 02.02.20

Ingener_Mexanik019
You'll lose 1-2%)

Ingener_Mexanik019 02.02.20

ChRoN_tm
If you want a nicer picture, take a Radeon; if you want more performance, get a GeForce (personal experience).

Ingener_Mexanik019 02.02.20

Listoman
I've just run into a problem: I have an i5 760 CPU on socket 1156, and its motherboard is only PCI-E 2.0. The CPU itself makes me happy; the 8 MB of L3 cache is the bomb! With Turbo Boost it safely reaches 3.3 GHz. But I'm afraid to swap my Radeon 6850 for a GTX-series card because of the difference in PCI-E versions. I took a power supply with some headroom, around 550 W. Should I be afraid that the connector is going to burn out?
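On the 550 W worry, a rough budget suggests plenty of headroom. Every wattage below is an assumed, ballpark figure for illustration, not a measurement of this particular system:

# Sketch: ballpark system draw against a 550 W power supply.
# All component figures are assumed, typical values.

PSU_W = 550
draw_w = {
    "i5-760 CPU (TDP)": 95,
    "mid-range GTX-class card": 120,
    "motherboard, RAM, drives": 60,
    "fans and margin": 40,
}

total = sum(draw_w.values())
print(f"estimated draw: {total} W of {PSU_W} W ({PSU_W - total} W headroom)")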

Anatoliy Burmistrov 02.02.20

Ingener_Mexanik019
The difference between 2.0 and 3.0 is the data transfer speed.