144 posts in this topic

Ain't posted anything for a while and intend to add more photos and info later. Will change how I usually do a log and show the result below.
 
Firstly I purchased:

  • EKWB R4BE black Monoblock
  • 32GB Kingston black DDR3 2400MHz RAM
  • 1/2 ID 3/4 OD green UV tubing
  • Mayhems green UV Coolant premixed (loop with 2x480mm rads used 1.6 litres)
  • Monsoon green 45, 90-degree fittings
  • 4x green 140mm case fans

All the previous Bitspower fittings will probably be reused for the 2600K/3570K CPU rebuilds (would have done the GPU too but it's really not worth it unless I can get a block for £60 tops). Reused the excellent huge 20g tube of MX4 TIM for the block, and this time applied it pea-sized rather than spreading it. You move the block about side to side a little to work it in - trust me, a pea-sized blob in the middle is better than anything else.
 
So yeah, a lot of green, but the main aim was to improve the aesthetics, get a neater loop, and lower temps in the summer; currently I have these quiet fans at 40% at 23C ambient. The biggest issue with redoing a loop is the time needed to drain it and the concern about coolant that will remain within the blocks/rads. Always play it safe and have different-sized storage containers handy - even the small ones have saved me some grief - plus about 6 plugs for sealing holes. Some tubes may not come off at all due to how much I over-tightened the fittings originally (any fitting can have this problem, even though tubing will normally pull off a tightened fitting), so you'll have to grab your hose cutter and give it a quick snip.
 
All the old 5/8 tubing is being replaced and will go in the bin. 3/4 tubing may seem OTT but it's a breeze to work with, rigid, and fits on better than 5/8 tubing. The new fittings, in how well they are designed and finished, are something else - a new standard for others to follow. I've tried EK and they flaked, plus the finish had too many blemishes that EK state aren't covered under warranty. Bitspower have a great finish, but the washer should be thicker, they're very expensive, and it's not that easy to get a tight, secure fitting without resorting to a spanner that will damage the outer thread.

 

Besides that, the only aspect of this build I'd want to improve is the power cables, to improve the layout; otherwise it's very clean and tidy considering the amount of cables required. The LED on the 780 Ti isn't that bright and you don't notice it unless you look at it - must have been the angle I took the image at and the smaller aperture (about f/32 at ISO 200) on the 70D to capture the detail and fan rotation. Only finished a quick 24-hour leak test at 06:30 today; the remaining bubbles will disappear over the next 5 days.
 
Image is 4.7MB, 20MP
6YcKlcRm.jpg


I like the green choice :)

 

With so many fans how are the noise levels?


Thanks Skr, green loop in person simply looks amazing and stands out very well :) Can't go to Red team now after all this work unless AMD donates towards new loop :P

 

I can change the fan speed on the fly but do it manually (software or fan controller) and like it that way; plus there's no more room for another fan controller anyway. The four 140mm case fans are on normal RPM, which is about 500RPM, and I don't hear them in the 900D - the case fans range from 300-1200RPM. I can turn off all fans if needed and get pure silence, or fine-tune them to super low RPM. In all there are 20 fans, but because there are so many you never need to run them beyond 40%. Maybe on a really hot day, but even then, tops, I set the 16 120mm rad fans to 70% - 100% is only for benchmarking, and then I use the AC. If it's really hot, by the time I have to use the AC it doesn't matter that much about the fans. The amount of heat the rads can dissipate is nice, but I wish it would throw it directly outside lol.


Good idea and thanks for the suggestion, but space is at a premium - the 900D isn't that big with all the components I've crammed in. This is because there's nowhere for wire slack, and extensions are far from ideal due to the voltage drop over the longer run; those probably aren't the ideal 24 AWG quality either. I could get my existing PSU cables (proprietary PSU connectors) custom braided and it would be amazing; if I had the time I could do it myself, but it's so easy to get the heatshrink messed up or out of alignment. Most cheat by using template boards, but those are better suited to doing hundreds. A quick search turns up a 2011 guide here.

 

Oh fuck! Somehow - probably because of the previous faulty memory that I had to RMA for a full refund from OCUK - one of my R4BE memory slots (D1, the first one) isn't working  :angry:  It's been faulty since this rebuild, and I obviously take super good care of all sensitive components. The good news is that the PC somehow still works, but only in triple channel. I'm thinking of a new board, cos Asus RMA is super slow (every RMA takes at least 4 weeks) and I don't want a month-plus of downtime or to have to fit the stock coolers back on. It would set me back £360 for a new one - should I do it?


Good idea and thanks for the suggestion, but space is at a premium - the 900D isn't that big with all the components I've crammed in. This is because there's nowhere for wire slack, and extensions are far from ideal due to the voltage drop over the longer run; those probably aren't the ideal 24 AWG quality either. I could get my existing PSU cables (proprietary PSU connectors) custom braided and it would be amazing; if I had the time I could do it myself, but it's so easy to get the heatshrink messed up or out of alignment. Most cheat by using template boards, but those are better suited to doing hundreds. A quick search turns up a 2011 guide here.

 

Oh fuck! Somehow - probably because of the previous faulty memory that I had to RMA for a full refund from OCUK - one of my R4BE memory slots (D1, the first one) isn't working  :angry:  It's been faulty since this rebuild, and I obviously take super good care of all sensitive components. The good news is that the PC somehow still works, but only in triple channel. I'm thinking of a new board, cos Asus RMA is super slow (every RMA takes at least 4 weeks) and I don't want a month-plus of downtime or to have to fit the stock coolers back on. It would set me back £360 for a new one - should I do it?

 

Does that mean the memory you returned was actually working? I would just RMA it; there's a new platform coming out later this month, so prices should go lower.

Buying a new mobo for your socket now would probably mean paying extra just to have it online for a couple of weeks. Is it worth it? Well, you could stop folding for a month, just to be on zero balance ;)


Great pic Smith.

After reading what happened with your board, I guess I'll have to buy mine locally, but I will only have ASRock and I do hate that brand, even if Asus and ASRock were originally one company. The RMA thing sure is another source of stress and puts your project build on pause.

Anyway, I hope that besides the board everything is working/going fine. Keep up the hard work.

Regards.


ASUS ROG SWIFT PG278Q

 

Thought I'd post this here since this single purchase will lead to changes elsewhere in my build... like a new 980 Ti or two. Bought the Asus 27" Swift with GSync and have been testing it since yesterday. From the rest of my build you'll understand my reasons for wanting to push my gaming to a nice 2560x1440, which it is very clearly capable of, without constantly resorting to DSR.

Got the ICM profile installed from TFT Central; the panel is using Brightness: 22, Contrast: 45 and Colour Temp: Manual: 100/100/100. Other than that I have Turbo on 144Hz and enabled 144Hz in Nvidia Control Panel, as it defaults to 60Hz. Nvidia Control Panel Desktop Colour Settings are default too...

hfX0WFn.png

Desktop Colour Settings left at Default, have no reason to change and happy with these

 

GSync was enabled by default, as was VSync for all other games; that's fine, as in-game VSync needs to be disabled anyway. Other than changing GSync from fullscreen to fullscreen+windowed modes, that was pretty much it. Once again, the final check once you load into a game is to disable any VSync if listed; otherwise, for higher-FPS online games like CS:GO, just enable ULMB via Nvidia Control Panel instead (this disables GSync, which caps fps at 144, the max refresh rate of this GSync panel).

13gWMyf.png

Defaults to Enable G-Sync, switched this to cover both fullscreen and windowed mode

 

chrzyU3.png

Again, defaults to GSync. Something important to cover here: although the global setting is GSync, I would strongly advise not using GSync for desktop use, to avoid PERMANENT vertical lines.

 

GSync is also used for your non-3D activities like the desktop, websites and movies, and eventually, after about 3 months, you get horrible vertical lines that make text unreadable. You may forget that you spend a lot of time outside of games. Right now, sitting at my Swift, I see a uniform shade of grey to the sides of the forum and only the matte coating stands out. I want to keep it that way.

Enabling ULMB instead is best - yes, it's a pain to keep switching in Nvidia Control Panel - as this preserves your panel and automatically disables GSync. You can't enable per-game profiles for GSync; it requires manual enabling to work. When I double-checked the Program Settings, the GSync option was missing, as it had disabled GSync completely; ticked the box again and globally it was running once more.

This faff puts me in the mindset of plasmas/LCDs/some LEDs that required careful use of the panel with static images. Never had that issue, but then I do take care once I'm made aware of issues. None of these issues appear in the manual on the disc; good job we have people over on OCN that covered most of the potential issues.

If this raised any questions about GSync or this panel in particular, feel free to ask and I will answer them all honestly and without any bias. I didn't cover every aspect in order to keep this short, but I will cover anything within reason about normal use. That obviously excludes me opening up the Swift, as I would never do this - my 3-year onsite warranty would be void, as would my 28-day dead-pixel warranty from OCUK. I've tried very hard to spot any dead pixels and as yet there are none.


How is the gaming with the monitor ? :))


How is the gaming with the monitor ? :))

Very smooth with GSync, very smooth with ULMB, and even videos are nice to watch on this. I never like to talk PR nonsense, and if I can cover anything from a personal point of view I will. The main benefit of the Swift is the instant reaction with the keyboard and especially the mouse; whether on the main menu of a game or in game, I have yet to pick up on mouse lag. Coming from already playing on a 144Hz monitor and feeling the lag there, even at a 91fps average, says it all about how well specced this panel is. A quick test against my Dell IPS that's hooked up 24/7 showed how very laggy that is when going for full 10-bit RGB.

You can't simply add GSync or ULMB to any old panel and expect to get away with anything less than impressive results from seasoned gamers. I'm not even missing the biggest feature I wanted carried over from my BenQ: their adjustable Black Equalizer, which keeps the normal contrast and brightness levels but raises brightness in the shadows so you can very easily see enemies. As WhyCry said recently, it was like cheating - exactly what I said in the shoutbox a couple of years back when I got that monitor. Even with the brightness/contrast settings I mentioned originally, this monitor is still very bright and I can still easily see into the shadows. Need to test some more games a bit longer that show up more detail in the shadows, like Fallout: New Vegas and BioShock Infinite, which should still have my SweetFX settings.

Not bothered about the obvious sharpness lost to watching YT stretched up to 1440p; for some videos I don't mind this, and even 720p isn't bad to watch. In future I'll have to give any YT or Prime stuff watchable in 2160p a go - my ISP is probably going to automatically bin my connection with all this data, which was already happening with 1080p lol.

And something you won't pay much attention to, and over time I may take for granted... how fast the panel turns on and resumes from standby or sleep mode. By comparison, my BenQ 144Hz required the scaler to check which ports were connected and then, after 5 seconds, turn on the screen; one DisplayPort 1.2 input is all this Swift has to focus on, with no scaler to slow things down. As soon as I move my mouse the screen turns on - talking hundredths of a second at a best guess. Starting up the monitor used to take 25% of the time it took to start up my PC, and now the monitor is nicely way ahead, meaning I'll make more use of sleep mode!

EDIT: Also, I got a free game code for Assassin's Creed Unity in the box, valid until December 2015 - erm, okaaay -_- Holding off for another day in case of any issues, then I'll redeem it and also register my panel for the warranty.


Quick update on using GSync. There's a known bug that causes the Light In Motion LED (selectable on/off via the OSD) on the base stand to remain red. It stays red (indicating GSync on) despite setting GSync off via Nvidia Control Panel and enabling ULMB, which never activates visually within the Swift OSD menu. Yes, I went through and checked all the options; it's likely down to buggy drivers not telling the module to switch modes, and it affects ULMB mode too. Verified via the Swift OSD that the mode fails to activate and stays in Normal mode when GSync isn't active.

Asus Swift LIM colour modes:

  • White - monitor on
  • Amber - standby
  • Red - GSync
  • Yellow - ULMB (Ultra Low Motion Blur)
  • Green - 3D Vision 2.0

This is a well-known bug that has reappeared throughout Nvidia's drivers. It's fixed in the latest hotfix, 353.49, for those reporting back on the Nvidia forum and elsewhere. I shall clean out the existing, non-working GSync hotfix 353.12 and do a clean install to see for myself if the red LIM changes colour for the first time (for me).

That should fix the two small issues I noticed: 1) the red light remaining active, 2) the monitor not picking up ULMB mode for general use and multiplayer. If the latest hotfix doesn't work (in which case I'll let Nvidia know via the feedback forms), I'll work back through the drivers until I find the most recent one that does. Will do this later this afternoon; if I don't reply on this point you can assume everything went to plan and the latest hotfix driver worked. I remain very positive about this monitor so far.

And yes, I did run the GSync Pendulum Demo - a nice, simple demonstration of how smooth gaming can be compared to no sync whatsoever and laggy/normal VSync. Found the smoothest range to be 41-143fps; below 41 it wasn't synced (it should go lower, but perhaps a driver update will fix this), and above 143 there were ever-so-slight differences in frame smoothness. During single-player gaming the Swift always capped the fps at 143 and remained very smooth. To really test this monitor I've been playing a heavily modded Fallout: New Vegas with changes to nearly every aspect of the game. FONV did feel laggy around 52fps (talking to an NPC in the wasteland) when it dipped suddenly from a very smooth 106fps - not a fault I'd place on the monitor, but on the game being an unoptimised piece of work; it still felt more responsive than on my previous 1ms 144Hz non-GSync panel! FONV itself without mods, just like Fallout 3, is very buggy and unforgiving, featuring a lot of micro-stuttering thanks to the ported engine. Experienced similar slowdowns/random crashes with the buggy vanilla Stalker: Clear Sky, which is otherwise quite dull to play. This is why FONV will remain my worst-case test; nothing comes close to this experience from the 17 games tested so far that stay within playable limits of my 780 Ti Classified, which ran very well and most of all looked great at 1440p!


More on the Swift

Found that you can enable Ultra Low Motion Blur (built-in backlight strobing for smoother motion) at up to 120Hz for desktop and games on previous drivers as well as the current 353.49 hotfix. The issue I previously experienced turned out to be the panel being clocked to 144Hz. The correlation between the GPU's output refresh rate and the panel's refresh rate isn't easy to explain other than in a table I took time to gather data for:

U.L.M.B. MODE - Swift refresh rate vs GPU refresh rate

Swift \ GPU    60Hz   85Hz   100Hz  120Hz  144Hz
60Hz           NO     NO     NO     NO     NO
85Hz           -      YES    -      -      -
100Hz          -      -      YES    -      -
120Hz          NO     YES    YES    YES    YES
144Hz          NO     NO     NO     NO     NO

Straight away you notice some Swift refresh rates kick in automatically and align with the identical GPU refresh rate. Select 85Hz within Nvidia Control Panel and click Apply, and the panel will sync to 85Hz; same with 100Hz. You can't enable these frequencies manually for ULMB at any other time, only when those refresh rates are selected. Not sure why some of the other refresh rates are unavailable - perhaps a limitation of the first GSync module?
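For anyone who wants to sanity-check a GPU/panel pairing without squinting at the table, here's a hypothetical sketch that just encodes my test results as a lookup (the names are my own, not anything from Nvidia):

```python
# ULMB results from my table: for each GPU output refresh rate (Hz),
# the Swift refresh rates at which ULMB actually engaged in my testing.
ULMB_SYNCS = {
    60: set(),        # never engaged with the GPU output at 60Hz
    85: {85, 120},
    100: {100, 120},
    120: {120},
    144: {120},       # the panel itself at 144Hz never engages ULMB
}

def ulmb_engages(gpu_hz: int, swift_hz: int) -> bool:
    """True if ULMB kicked in for this GPU/panel refresh pairing in my tests."""
    return swift_hz in ULMB_SYNCS.get(gpu_hz, set())
```

Nothing clever, but it makes the pattern obvious: the panel at 120Hz accepts almost anything, while 60Hz and a 144Hz panel clock never strobe.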

 

ULMB + SWIFT @120Hz

This goes back to the roots of ULMB, where you enabled 3D Vision to trick a non-3D Vision title into letting you enable ULMB for smoother motion, but that always came at the expense of lower fps and could increase input lag from your keyboard and mouse (i.e. on my BenQ XL2420T). The option built into the Swift takes no such hit in performance and creates no extra input lag - I've tested both. For those wanting one quick setting for everything, I'd set it to 120Hz for switching between GSync and ULMB. GSync has the obvious benefit of capping max fps without resorting to third-party utilities for single-player titles, which reduces power and heat from your beefy GPU(s), and any hitching from the engine that GSync doesn't fix is minimised by avoiding extremes of low-to-high framerates. However, without GSync... unless you can guarantee minimums above 120fps at all times, you will notice the difference between GSync being on or off. With GSync on you can get away with a decent card for a while longer!
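That rule of thumb boils down to one comparison, sketched here as a tiny hypothetical helper (the function name and default are mine, not an Nvidia API):

```python
def pick_mode(min_fps: float, ulmb_hz: int = 120) -> str:
    """Sketch of the rule above: ULMB only pays off when the frame rate
    never dips below the strobing refresh rate; otherwise GSync wins."""
    return "ULMB" if min_fps >= ulmb_hz else "GSync"
```

So a game holding 150fps minimums suits ULMB, while anything dipping into the 90s is better left on GSync.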

 

A little more on some of the kinks within the drivers

Don't set per-application profiles for ULMB. From testing I've found that exiting games - with ULMB set globally - disables ULMB on the monitor and annoyingly knocks it back to Normal mode, which is neither ULMB nor GSync (I'll humour you by calling this safe mode). I always double-check the monitor's screen information to see if ULMB is actually on. Another indicator (more so at lower frequencies) that ULMB is enabled: the screen brightness decreases, though this also depends on the panel's ULMB Pulse Width setting, where higher is brighter (0-100).

Outstanding issue remaining: the Swift's Light In Motion staying red instead of yellow (see the table in the previous post about colours). I'd like to be able to see it change colour for the first time.


The way Nvidia have changed the G-Sync running lights has annoyed a fair few people and as much as Nvidia say it is designed this way, I feel it is more of a bug and they should sort it out. The only thing I do is set 120Hz for desktop and when I am benching, I switch G-Sync off (light goes white) via the NCP and off I go and then switch G-Sync back on and it goes red again for when I am gaming.


The way Nvidia have changed the G-Sync running lights has annoyed a fair few people and as much as Nvidia say it is designed this way, I feel it is more of a bug and they should sort it out. The only thing I do is set 120Hz for desktop and when I am benching, I switch G-Sync off (light goes white) via the NCP and off I go and then switch G-Sync back on and it goes red again for when I am gaming.

Thanks for the reply from a fellow Swift owner :)

Yep, I was reading about that; it goes back some time and should be fixed. I just want it to work, and it never has - even with both the GPU and the Swift at 120Hz.

Been using ULMB mode (set via Nvidia Control Panel > Manage 3D Settings > Global Settings) and the L.I.M. remains red; set it to Fixed Refresh and that doesn't change it either. 60Hz doesn't work, and I'm glad it doesn't, cos that's laggy even for general use. As it works for you, what drivers are you using?

The reason I use ULMB mode is to protect the screen from the vertical lines that can appear after 2-3 months; those who didn't switch modes couldn't read text, hence an RMA. Plus the screen dims, making it a better alternative for reading and quicker than changing brightness/contrast every time, despite how fast the OSD is.


I am on 353.38, bud. I thought it worked fine with earlier drivers, but for a couple of months (maybe a little more) it was basically just red or white being shown, even though ULMB was working?

 

Quite a few I know use ULMB for competitive gaming, so I will jump onto TS this evening and see what they say.


I am on 353.38, bud. I thought it worked fine with earlier drivers, but for a couple of months (maybe a little more) it was basically just red or white being shown, even though ULMB was working?

Quite a few I know use ULMB for competitive gaming, so I will jump onto TS this evening and see what they say.

Thanks :) Well, that's odd if the 353.38 hotfix works for GSync yet the others didn't - I always do a full clear-out and clean install, leaving little chance for issues. I didn't use the 353.38 hotfix; I more extensively tested/used the 353.12 and 353.49 hotfixes.

To avoid this becoming annoying for others, I shall put up with/ignore this issue. When I find a driver that does work I'll leave a little post; no updates means the issue remains. To keep this simple for those without GSync: the modes do work, but the LED indicator (purely visual) on the base does not synchronise with the setting as it should.

Ignore if you already did this, may be helpful for others....

You can verify ULMB mode is working, and see if the LIM changes colour, by going to Global Settings > Monitor Technology: ULMB (this deactivates GSync and Normal mode). Previously covered this and have screenshots on this page.

Using your Swift OSD menu, go to Image > ULMB and enable it. Then check the Swift OSD menu again to verify the mode by navigating to System Setup > Information. The mode will show ULMB - yet despite this the LIM remains red rather than yellow.

m7QBEdo.jpg
Select to enable ULMB and leave Pulse Width at 100 for optimal picture, increase brightness if required too.

EXroic7.jpg
Verify the mode to cross-reference with Light In Motion, as most Swift owners will know it remains on red regardless.

 


MSI GTX 980 TI Gaming 6GB Graphics Card

pL3krmqm.jpg

Imgur was being a pain when trying to get this uploaded... until just now, after a good round of CS:GO with these two new MSI 980 Ti Gaming cards. Redid my loop from the monoblock to the 480 rad, which required draining the entire loop; my drain port is at the back of my case on the lower rad port. Have to say, using Monsoon fittings was a great choice and I would highly recommend them. Then I cut the new 3/4 tubing to size, and that, on the left of the bottom photo, is the only change to my loop. Without a GPU in the same loop, CPU temps are down by 5C. I'd been trying 4.7GHz but had some stability issues - likely the IMC in the 4930K isn't up to handling 32GB of 2400MHz memory, not many would lol! So you either drop down to 16GB or slow the memory down to 1866/2133MHz; I may try that when going for benchmarks another time and blocking these cards.

IMNg2kR.jpg
This was taken on a 70D using a tripod and remote switch; don't worry, the image has been optimised for forum use. The photos I quickly took before this were mistakenly taken on a lower JPEG setting; despite that, I'll post those over the weekend. As they say, better than nothing!

No question these are amazing for any game, and they should be. Feels a lot smoother than my previous card, the 780 Ti Classified, when going for higher fps in multiplayer. In CS:GO I'm talking about a 300fps minimum to avoid being shot before you get the chance (it looks like warping, from the early BF4 days) and hitting 600fps when checking out the stats. Feels so silly - there I was running around like an idiot killing without the worry of hitching, thinking it was my slow hand-eye coordination on the 780 Ti, but nope!

Haven't touched GSync on the Swift yet while using these cards. You hit such a high fps that it isn't needed for a lot of titles, apart from the likes of the Metro/Crysis/Assassin's Creed series. There are a lot of titles that are capped at 30/60fps or lack SLI support, and that is where GSync will be useful. It has the benefit of sometimes removing these caps without modifying the config, but note that if the engine has been coded badly such that the physics is tied to the fps, you'll get the unwanted and unplayable effect of everything running at double speed or faster.

Found stable 24/7 clocks (CORE: 1228>1328MHz, boosting to 1420MHz / VRAM: 1753>1928MHz) and I'm holding back simply due to the heat GM200 produces. Benchmarking produces temps of 90C, even though the top card has plenty of spacing - 2-slot, if not for the small Phoebus above the second 980 Ti. Will remedy this as soon as the blocks for the Gaming are released, and I'd like to throw in my old MSI 660 Ti Power Edition for dedicated PhysX. May end up putting the Classified back in because it's already blocked and fits the alignment of the Gaming loop thanks to the larger depth of 40cm iirc.

How good are two 980 Tis, never mind these custom cards? Take a quick look at these scores (click for larger image). Most 980 Tis are capable of such scores, but how lucky are you to find such a chip? Anything boosting to at least 1330MHz is good enough for 1080/1440 out of the box or with a slight overclock.

OLa5689l.jpg
Heaven 4.0: my 780 Ti Classified with a 1450MHz core only scored about 1880 versus 4524.

.a56R3JZh.jpg
Shadow of Mordor: Ultra preset at 1440p and the average is 150fps - for me that is incredible!

Sorry there's no video showing usage or gameplay - I may be inspired to create the odd video again with a mix of playing my favourites and benchmarking. Will have to check out how to set up a dedicated capture rig on my 3570K or 2600K and see how that works out; my Swift is DisplayPort-only, nearly all capture cards require HDMI, and the Swift simply does not work with adapters. Over the years I've produced otherwise excellent quality videos (image quality, not entertaining stuff lol), but I'd need to purchase Vegas Pro and that's as expensive as a reference 980 Ti. If it's fun and worth the time I'll check it out :)

Those were my thoughts on using Maxwell 2.0 (GM200 A1) and upgrading from Kepler (GK110 B1). Kepler wasn't bad - it was great for its time - and likely I'll be saying the same about Maxwell in another 2 years, with the crazy 8K resolutions that may be possible on another Titan using Pascal, saying goodbye to ever using AA and hello to sharper textures.


You could 'hide' the red paint on these cards by adding more green LEDs in your case. 

Judging from Heaven, you gained more than twice the performance, and that's still without the 780 Ti as a PhysX card. I wonder how a 3rd card in the same loop would affect the temperature of the liquid.

Did you test power consumption? How much higher is it vs one 780 ti? 

Edited by WhyCry

I'm embracing the red for now, otherwise yes, a good suggestion to add green LEDs. Trouble is wire management, and you're talking £20-£25 for good-quality strips; in the same breath, I know you can buy just the LED strips cheap from the bay, but I'd prefer to buy something of higher quality and safer, with adjustable voltage to control brightness. When I get my blocks in a couple of months it would still look nice with LEDs and solve the temps during gaming/benchmarking. It is a certainty that I WILL block these GPUs :D

Heaven 4.0 relies heavily on tessellation iirc and doesn't require PhysX at all, but loves a wide bus. Some titles you can happily play with a single 980 Ti easily, but others do require two. Already ran the benchmarks for PhysX titles such as Metro 2033 Redux maxed out at 2560x1440: Average framerate: 70.50 / Max framerate: 246.37 / Min framerate: 18.16. Settings used: Resolution: 2560x1440; Quality: Very High; SSAA: On; Texture filtering: AF 16X; Motion Blur: Normal; Tessellation: Very High; VSync: Off; Advanced PhysX: On.

Power usage versus the 780 Ti Classified (custom BIOS, blocked): clocked to 1280/1830MHz it pulled about 530W, and pushing it hard for benchmarking fun at 1440/1900-ish hits 630W. Meanwhile the custom 2x 980 Ti SLI OC (GPU 1328MHz, boost 1420MHz / mem 1928MHz) pulls 305W more. Temperature limit is set to 91C, Power Limit at 109%, and core volts at a +40mV offset (1.2V actual); otherwise, when gaming, the only change is to limit temperature to 81C and use my AC to keep room temps down.

MSI GTX 980 TI (2-WAY SLI) SYSTEM POWER CONSUMPTION
  • IDLE: 280W - At desktop with core parking off, 4930K@4.5GHz. No power-saving modes active (disabled in BIOS as usual) and high-performance Windows mode. Boost 2.0 doing its job nicely.
  • 100% LOAD: 835W - Peak usage during the Shadow of Mordor benchmark, monitored with a plug-in power meter as a guide. The issue isn't volts but the limited power limit.

EVGA GTX 980 TI CLASSIFIED SYSTEM POWER CONSUMPTION
  • IDLE: 240W - Custom BIOS, fully unlocked, with waterblock. At desktop with core parking off, 4930K@4.5GHz. No power-saving modes active (disabled in BIOS as usual) and high-performance Windows mode.
  • 100% LOAD: 630W - Peak usage while running 3DMark 2013, clocked to 1440/1900MHz with 900W of potential power available to the card, core volts at 1.30V and PCIe lane clocked to 416KHz.
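As a quick sanity check of the deltas quoted above (figures copied straight from the table and the previous paragraph - variable names are mine):

```python
# System load power figures from my table
sli_load = 835          # W, peak for the 2x 980 Ti SLI (Shadow of Mordor)
classified_24_7 = 530   # W, 780 Ti Classified at 24/7 clocks (1280/1830MHz)
classified_bench = 630  # W, 780 Ti pushed to 1440/1900-ish in 3DMark 2013

print(sli_load - classified_24_7)   # 305W more, as quoted above
print(sli_load - classified_bench)  # 205W more vs the benching overclock
```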

Regardless of these temps, I do miss the efficiency of watercooled cards and the silence. It may be that, for a single loop, two 980 Tis are the limit for two quad rads, despite the general rule of one 240mm per high-power component. The knock-on effect is higher CPU temperatures, and the question of whether to use two loops instead, especially with a third card for PhysX - or use the 660 Ti and keep its fan, because it never bothered me before. Then, if SLI + a PhysX GPU has no real use, I can quickly uninstall it.

Edited by SMiThaYe
changed chart to clearly state 2-way SLI, thanks WC!

It's actually very interesting that TWO cards in SLI consume almost as much as one 780 Ti. As for full-load consumption, 304W more is quite a change, and my PSU would not handle it - but then again, these are overclocked.

You should change your chart to state it's SLI, since it looks like single 980ti vs single 780ti comparison :)

Did you limit the width to width: 500px; on purpose?


It's actually very interesting that TWO cards in SLI consume almost as much as one 780 Ti. As for full-load consumption, 304W more is quite a change, and my PSU would not handle it - but then again, these are overclocked.

You should change your chart to state it's SLI, since it looks like single 980ti vs single 780ti comparison :)

Did you limit the width to width: 500px; on purpose?

Yeah that's what I thought, but we are talking about an unlocked card free of Greenlight. We already knew how efficient the Maxwell architecture is in perf/watt, which makes it all worthwhile. I remember reading some time ago that HardOCP ran a 295X2 and a 290X for 3-way CF and pulled 1050W at full load, but that was with a regular 3770K @4.8GHz and not a 3930K that could pull another 100W, while two 295X2s pulled close to 1400W! Part of the reason for going with GK104 680s back then. I had an 850W PSU, and 1200W units from Corsair and maybe Seasonic (iirc) were the best you could get in terms of quality.
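The PSU sizing worry above boils down to simple addition against capacity. A throwaway Python sketch; all the component wattages here are illustrative guesses for a 2x GPU + overclocked CPU system, not measured figures:

```python
# Sum estimated component draws and compare against PSU capacity.
# Component wattages below are illustrative guesses, not measurements.
def psu_headroom(psu_watts, component_watts):
    """Return (spare watts, percent of PSU capacity used)."""
    total = sum(component_watts)
    return psu_watts - total, 100.0 * total / psu_watts

spare, load_pct = psu_headroom(1200, [300, 300, 200, 100])  # GPU, GPU, CPU, rest
print(f"{spare} W spare, {load_pct:.0f}% loaded")
```

Running a quality PSU at well under 100% load also keeps it in its best efficiency band, which is part of why the 1200W units made sense for multi-GPU builds.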

Thanks WhyCry, and I've updated the table to state SLI :) Changed the width to 800px, as otherwise mobile devices will start cutting it off beyond that. I like to keep the forums fairly easy for mobile use and fast to load for those with slower connections. Prefer to be kept on my toes (not all the time though, cos a man has to rest sometime) :D

And I will try to fix the photos I quickly took. Somehow my camera ignored my previous settings and saved them at a much lower quality than normal, and this repeated a couple more times; apologies, I'll sort that out later tonight. It won't be anything in-depth at all, saving those for the blocking.

 


If my rig ever pulls more than 800W I will throw it out of the window ;)

Well, you can just change your table to a percentage width -> width:80% or whatever fits better. It will automatically scale down for mobile phones.

Also, you can upload the pics to the forums; there is really no need to put them on imgur ;)

 


BENCHMARKS

Posting benchmarks for others to see in my free time. May as well get started now that I have my 24/7 clocks, which should be achievable for many (1454MHz boost and 1901MHz memory). Will add my own 60-second gameplay benchmarks at a later date, as some games require reaching a certain checkpoint to re-test.

You all know the setup in-depth, so I shall index the benchmarks here and update accordingly to avoid spamming - check back every week to see new entries :)

All benchmarks are run with maximum in-game settings and High Quality in the Nvidia Control Panel. Quite happy to take requests covering most configs. Any performance issues I shall note, and for your convenience the performance numbers are typed before the image along with the date tested; I've added a spoiler to make browsing through titles easier. There is no need to add my own graphs: you see exactly the results obtained, and graphs make no sense across different configs.
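For anyone curious how Min/Average/Max FPS figures like the ones below come out of a capture, here's a minimal Python sketch of the usual maths; the sample frame times are invented for illustration, but tools like FRAPS report the equivalent:

```python
# Derive Min/Average/Max FPS from per-frame render times in milliseconds.
# The average is time-weighted (total frames / total seconds), which is
# how most capture tools report it, rather than a mean of per-frame FPS.
def fps_stats(frame_times_ms):
    fps = [1000.0 / t for t in frame_times_ms]
    avg = len(frame_times_ms) * 1000.0 / sum(frame_times_ms)
    return min(fps), avg, max(fps)

lo, avg, hi = fps_stats([16.7, 20.0, 12.5, 25.0])  # made-up sample capture
print(f"Min: {lo:.2f}, Average: {avg:.2f}, Max: {hi:.2f}")
```

The time-weighted average is why a few long frames drag the average down more than a naive mean of FPS values would suggest.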


PC config to ensure no bottlenecks; all energy-saving modes in OS/BIOS are OFF

  • 4930K @4.6GHz
  • 32GB 2400MHz DDR3
  • Asus Rampage IV B.E. X79 with latest BIOS
  • 2x 840 PRO 512GB (OS), 2x 840 EVO 1TB (GAMES/BENCHMARKS)
  • Nvidia driver 353.49 hotfix
  • Windows 8.1 Pro with latest updates

 

Work in progress....

Batman: Arkham Knight 980 TI SLI OC @1440

Average: 53, Max: 70, Min 37
Date tested: 2nd August 2015
Issues: Performance was smooth but FPS was lower than it should be, thanks to existing issues in the game. Nvidia drivers utilise AFR1 instead of AFR2, and forcing the latter caused performance to dip to 22fps! Backing out of the game and re-testing with AFR1 produced the 'normal' results you see above, the best that can be expected. SLI utilisation was GPU1: 83% / GPU2: 49% (fluctuated a lot, down to 50/50) with clocks boosting to 1443MHz. Performance of a single card is going to be interesting. Note that this game was installed on my system drive, two 840 PRO 512GBs in RAID-0, to rule out storage speed causing issues as these hit 1.2GB/s reads.

BzOQuURm.jpg

Metro 2033 - 980 TI OC + 980 TI dedicated PhysX (only 10% usage but nice boost to fps) @1440

Average: 46.50, Max: 127.15, Min 1.81
Date tested: 29th July 2015

cLHpGeMm.png

Shadow of Mordor - 980 TI OC @1440

Average: 98.34, Max: 132, Min 61.76
Date tested: 29th July 2015

vfLHXddm.png

Shadow of Mordor - 980 TI OC @2160

Average: 54.78, Max: 70.98, Min 44.24
Date tested: 29th July 2015

3ZTj4eBm.png

Synthetics: AIDA64 GPGPU Benchmark versus a 4930K - 980 Ti SLI OC

Date tested: 1st August 2015

4nK2gU6m.png

Synthetics: 3DMark11 Performance Benchmark - 980 Ti SLI OC (+87mV, 109% PL, +120 Core, +500 Mem)

Overall Score: P28,526, Graphics Score: 46,963, Physics Score: 13,716, Combined Score: 12,272
Date tested: 28th August 2015

9T3aSANm.png

Feel free to comment, suggestions are welcome.

Edited by SMiThaYe
