EVGA

Temperature considerations for stacked 1080ti

Author
9krausec
New Member
  • Total Posts : 16
  • Reward points : 0
  • Joined: 2017/09/28 10:16:21
  • Status: offline
  • Ribbons : 0
2017/09/28 10:26:43 (permalink)
Hello! I am not a miner, but I am looking to build a GPU based render farm out of EVGA 1080ti cards (once stock is replenished). I'm curious about temperatures of the 1080ti line under full load.
 
Since I am planning to stick 4 1080ti cards in each 4u server box, heat is one of my primary considerations. From what I understand, the aftermarket, non-FE 1080ti cards generate more heat output. Is this correct?

To me that doesn't make much sense, as the same fundamental guts should be in each version of the 1080ti. The FTW3 series sounds like it has the best cooling of all the air-cooled cards, as it has a triple-fan setup and an "over engineered" radiator unit.
 
So I'm thinking that if I stack 4 of them vertically, space them 1 1/2 inches apart, push enough air through the case AND keep the case in an AC-cooled server room, I'd be just as well off using an FE version as an SC or FTW3 version. Is this a correct assessment? The goal is to have adequate cooling so the cards do not throttle themselves back due to temps.
 
Anyone want to weigh in on stacking 4 1080ti FTW3 cards vertically with 1 1/2 inch gaps between cards, air being forced through a 4U case and having it in an AC server room? Bad idea? Would I be best off getting the FE or single blower model?
 
Hybrid cards are out of the question, as I don't think I would have enough room in a server chassis to put the radiators anywhere (unless I cut slots in the top of the case for the radiator and fans to protrude).
 
Thanks,
Clayton
#1

32 Replies

    bcavnaugh
    The Crunchinator
    • Total Posts : 38977
    • Reward points : 0
    • Joined: 2012/09/18 17:31:18
    • Location: USA Affiliate E5L3CTGE12 Associate 9E88QK5L7811G3H
    • Status: offline
    • Ribbons : 282
    Re: Temperature considerations for stacked 1080ti 2017/09/28 11:52:59 (permalink)
    Welcome to the Forum 9krausec
    This is for Mining, and a Server Case will overheat, so FTW Cards are not the best choice here.
    I could see using only a Blower Style Cooler in a Server Case or 4U Server Case.
    You should just go with the FE Version so that you DO have a back plate.
    You may get more support if you post under the Cryptocurrency area of the Forums.
    post edited by bcavnaugh - 2017/09/28 11:58:47

    Associate Code: 9E88QK5L7811G3H


     
    #2
    9krausec
    New Member
    • Total Posts : 16
    • Reward points : 0
    • Joined: 2017/09/28 10:16:21
    • Status: offline
    • Ribbons : 0
    Re: Temperature considerations for stacked 1080ti 2017/09/28 12:02:11 (permalink)
    Thank you for the confirmation. I originally was going to go with the blower cards, but wanted to confirm that the other versions wouldn't be optimal. Picking a card out is half the battle. Finding 10 GTX 1080ti cards is the other half... :D
     
    Thank you again.
    #3
    demon09
    FTW Member
    • Total Posts : 1334
    • Reward points : 0
    • Joined: 2016/09/16 21:18:42
    • Status: offline
    • Ribbons : 0
    Re: Temperature considerations for stacked 1080ti 2017/09/28 12:19:00 (permalink)
    May be worth undervolting them; my FE ran stable at 1899-1911 MHz at 0.977V and dropped the temps, but doing it for 10 cards may be another story. I'd also throw my vote to blower cards, as that many 1080tis with open-air coolers in a 4U case would turn the inside of the case quite hot. Good luck on your hunt for 10 FE cards; I hope stock starts coming back up again for you.
    post edited by demon09 - 2017/09/28 12:21:18
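    demon09's undervolting numbers can be sanity-checked with the usual f·V² rule of thumb for dynamic power. This is a rough sketch, not measured data: the 1.062 V "stock" voltage is an assumed figure for illustration, and real cards won't scale exactly this way.

    ```python
    # Rough estimate of the power savings demon09 describes from undervolting.
    # Dynamic power scales roughly with frequency * voltage^2. The stock-voltage
    # figure below (~1.062 V) is an assumption for illustration, not a spec.

    def estimated_power(tdp_watts, v_stock, f_stock, v_new, f_new):
        """Scale a card's TDP by the f * V^2 rule of thumb."""
        return tdp_watts * (f_new / f_stock) * (v_new / v_stock) ** 2

    # 1080 Ti: 250 W TDP; same ~1911 MHz clock, dropped from an assumed 1.062 V
    stock = 250.0
    undervolted = estimated_power(stock, 1.062, 1911, 0.977, 1911)
    print(f"~{undervolted:.0f} W per card, ~{(stock - undervolted) * 4:.0f} W saved across 4 cards")
    ```

    Even this crude scaling suggests why the temps dropped: a modest voltage cut shaves a double-digit percentage off per-card power, which adds up quickly across a stacked chassis.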
    #4
    9krausec
    New Member
    • Total Posts : 16
    • Reward points : 0
    • Joined: 2017/09/28 10:16:21
    • Status: offline
    • Ribbons : 0
    Re: Temperature considerations for stacked 1080ti 2017/09/28 12:33:04 (permalink)
    Thanks for the reply! I hope stock starts coming up too.
     
    Right now I'm doing testing on my home workstation with pcie riser cables... On my home machine I have 1 GTX 1080 FTW2 and dual GTX 1080ti FTW3 cards (not in SLI as these cards are used for rendering). Stacked in the motherboard directly, the heat was a bit much.
     
    I bought a 304 Fractal Node case, gutted it, added some support brackets and want to try to mount the 2 1080ti FTW3 cards in there with pcie riser cables by thermaltake (600mm TT cables).

    After monitoring the heat output from the dual cards in the box, I was going to spring for a third 1080ti FTW3, but it sounds like I won't get past two due to heat reasons.
     
    I'm rolling the dice with the TT riser cables, as it sounds like getting them to work is hit or miss (fingers crossed it doesn't fry my workstation in any way).
     
    I'll need to utilize the same PCIe riser cables in the farm server build, as I don't want the cards right up against one another like they would be if directly plugged into the MB. Blower or open, I assume that would be an issue for heat dispersion if they were so close.
     
    TT cables came in today, so tonight I'll try my hand at getting things set up. Like I said, fingers crossed I don't blow anything. :D At least I can blow up my own workstation before I blow up the company's servers that I'm building from the ground up... I like my job and would rather not get in deep for blowing 25k worth of hardware...
     
    Too bad Redshift, the render engine we use, bottlenecks when using 1x to 16x risers. The minimum I hear you can go is 8x to 16x riser cables... Might as well just shoot for 16x to 16x to avoid any potential bottlenecks.
     
    Thank you again for the kind replies.
    #5
    9krausec
    New Member
    • Total Posts : 16
    • Reward points : 0
    • Joined: 2017/09/28 10:16:21
    • Status: offline
    • Ribbons : 0
    Re: Temperature considerations for stacked 1080ti 2017/09/29 10:00:33 (permalink)
    Found a vendor that will hold 10 cards for us when they get a new shipment in. A lot of these vendors seem to allow only 1 per person, as demand is so high and supply is falling short. The only reason they are allowing this is because we'd be ordering through a business entity, not just a single person (and we already order a lot through this vendor, or so our IT head told me).

    On a side note, I was able to mod my home workstation last night. This is a rendering rig with 1x 1080 FTW2 and 2x 1080ti FTW3 in it. We're moving to GPU rendering over CPU rendering, so the two 1080ti cards are just for rendering; the 1080 doesn't render, but runs my 4 monitors and the UI of the 3D applications.
     
    The two 1080ti cards and the 1080 had nominal temps in my case, but I didn't like them so close to each other over long rendering periods so I moved them out of my case (first mod I've ever done after building 3 machines before. Pretty exciting). So I cut some holes, butchered a Fractal Node 304 case and extended the 2 1080ti cards out with 600mm TT pcie riser cables.
     
    Everything registered last night and I'm going to stress test tonight! If temps stay nominal under load (I have my doubts per our discussion above), I'm going to purchase a 3rd 1080ti FTW3 card (or just a blower one to be safe) and stick it in the node case too.
     
    Attached some crappy photos for you folks. I'll update on stability and temps over the weekend at some point.
     



    #6
    9krausec
    New Member
    • Total Posts : 16
    • Reward points : 0
    • Joined: 2017/09/28 10:16:21
    • Status: offline
    • Ribbons : 0
    Re: Temperature considerations for stacked 1080ti 2017/09/29 10:07:00 (permalink)
    Also... I found a good photo of a build that someone else did with the same chassis I'm going to be using for this. Going to have 4 GPUs as opposed to 6 and have a full 16x to 16x pcie riser (like the one I used on my personal workstation, but shorter) instead of the 1x to 16x risers miners use.
     
    You can get a good idea of the space I'm working with! Let me know if that changes anyone's thoughts on the discussion above of heat (probably not, but just to show a clearer image of what I'm after).
     
    Cheers everyone! Going to post this up in the mining part of this forum too to see what those folks think.
     

    #7
    bcavnaugh
    The Crunchinator
    • Total Posts : 38977
    • Reward points : 0
    • Joined: 2012/09/18 17:31:18
    • Location: USA Affiliate E5L3CTGE12 Associate 9E88QK5L7811G3H
    • Status: offline
    • Ribbons : 282
    Re: Temperature considerations for stacked 1080ti 2017/09/29 10:07:22 (permalink)
    Links would help others out that are mining.
    Post them without the http:// or https://
    Once you have 11 Posts you can then post links.
     
    No need to create a new Thread under Cryptocurrency ask to have this Thread moved.
    post edited by bcavnaugh - 2017/09/29 10:12:25



     
    #8
    9krausec
    New Member
    • Total Posts : 16
    • Reward points : 0
    • Joined: 2017/09/28 10:16:21
    • Status: offline
    • Ribbons : 0
    Re: Temperature considerations for stacked 1080ti 2017/09/29 10:21:41 (permalink)
    How do I ask this thread to be moved?
     
    Edit: What's the procedure on something like that? I've never had to move a thread on any other forum so I'm a bit clueless. Thanks. 
    #9
    Chris21010
    FTW Member
    • Total Posts : 1587
    • Reward points : 0
    • Joined: 2006/05/03 07:26:39
    • Status: offline
    • Ribbons : 9
    Re: Temperature considerations for stacked 1080ti 2017/09/29 10:58:42 (permalink)
    as long as you have adequate air flow through the case and have them spaced at least a slot apart then you do not need to have blower style cards. the FTW3 has a MASSIVE heatsink and these things will never thermal throttle, even without A/C, unless you are choking the case airflow. the FE versions on the other hand will require their fans @ 100%, and if your room doesn't have A/C they will reach the thermal limits of the cards; their heatsink is just a tad too small imho. if you are going with a case like that last pic i honestly do not see a problem with having 4x 1080ti FTW3s running full tilt, again as long as they are separated and have good airflow.


    #10
    9krausec
    New Member
    • Total Posts : 16
    • Reward points : 0
    • Joined: 2017/09/28 10:16:21
    • Status: offline
    • Ribbons : 0
    Re: Temperature considerations for stacked 1080ti 2017/09/29 11:19:55 (permalink)
    Thanks Chris for the reply. Now I'm hearing two different thoughts on this (which is fine; the more angles of thought, the better). I'll get this thread moved to Cryptocurrency Mining, but for now, below is a copy and paste of what I was going to put up as a new thread in the Cryptocurrency part of the forum. It goes into a bit more detail on the hardware and what I'm trying to accomplish.
     
    Now I'm actually thinking: why not stick 6 1080ti cards per unit? Blower style, if temps can be managed. Anyways, copy and paste below. Thanks everyone-

    --------------------------------


    Hey all! Right off the bat I'll say I'm not a cryptocurrency miner, but I am a 3D Artist tasked with building a GPU based render farm so there should be some parallels between rendering and cryptocurrency mining I'd like to think. This is a longer thread, but I'd appreciate anyone here willing to proof my build.
     
    For the possible render nerds here, I'll be building this for Redshift, which requires at least an 8x to 16x pcie riser to not bottleneck. I'm wondering if I could get some thoughts on the build from you cryptocurrency guys, to see if I've overlooked things and to ask about your thoughts on temps.
     
    4x EVGA 1080ti cards. Standard blower cards for heat considerations. Connected to MB by Thermaltake 16x to 16x pcie riser ribbon cable. Under 600mm in length, but I don't know yet how short I can go until I get the case and measure some things out.
     
    IF any of you think I can safely fit 6x EVGA 1080ti blower cards in this build, then I will set up the mounts to hold 6 cards and expand over time. Heat is the biggest consideration here. I don't want these cards throttling themselves under load because of temps.
     
    The idea is to build 5 of these units, but I'm budgeted for $25k this year so I'll be purchasing additional cards each year until I max these units out.
     
    I'm going to be building the units in a Rosewill RSV-L4000 server chassis-
     
     
    Planning to use an ASUS X99-E WS motherboard with 10Gb network support for future proofing-
     
     
    64 gigs of Ballistix Sport LT memory. 8gb sticks
     
    PSU right now is a toss-up between the EVGA SuperNOVA 1600W and the Rosewill Hercules 1600W. I've only ever purchased EVGA products for my home builds, but the Hercules has some good reviews and is 50 dollars cheaper. If any of you want to weigh in on which one would be better suited to supply power to this system 24/7, let me know. I'd rather spend 50 dollars more for a unit that will hold up better, but who's to say the Rosewill won't?
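    For the 1600W PSU question, a back-of-envelope load check helps frame the choice. This is a hedged sketch: the 250 W per-card TDP matches figures elsewhere in the thread, but the ~150 W platform allowance and the 80% sustained-load ceiling are rule-of-thumb assumptions, not measurements.

    ```python
    # Back-of-envelope PSU sizing for the 4-6 card build. All figures are
    # assumptions for illustration: 250 W TDP per 1080 Ti, ~150 W for
    # CPU/board/RAM/fans, and an 80% sustained-load ceiling for 24/7 duty.

    def system_draw(num_gpus, gpu_tdp=250, platform_watts=150):
        """Estimated total DC draw in watts."""
        return num_gpus * gpu_tdp + platform_watts

    def psu_loading(num_gpus, psu_watts=1600):
        """Fraction of PSU capacity used at full load."""
        return system_draw(num_gpus) / psu_watts

    for n in (4, 5, 6):
        load = psu_loading(n)
        flag = "ok" if load <= 0.80 else "over 80% - consider dual PSUs"
        print(f"{n} GPUs: ~{system_draw(n)} W, {load:.0%} of a 1600 W PSU ({flag})")
    ```

    Under these assumptions, 4 cards sit comfortably on one 1600 W unit, while 5-6 cards push past the 80% ceiling, which lines up with the dual-PSU advice that comes up later in the thread.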

    CPU will be an Intel Xeon E5-1650 v4. There might be some light CPU rendering on it, but its primary function is to support the GPU rendering operations and make sure all the PCIe lanes are open.
     
    I'll be using a Noctua NH-C14S 140mm to cool the CPU. Low profile.

    Other than the above, I'll be putting 3 high-powered 120mm Delta fans in the case, and it'll be in an AC-controlled server room.

    Here is a photo of a completed build done in the same chassis I'll be using for you guys to get a feel for the space I have to work in -



     
    If you made it to the bottom, I'd like to thank you for reading over this long thread. My nutz are sort of on the line for this build as it's for my workplace. I've built multiple workstations before, but I've never built a server farm. I have a dremel, a tap set and a standard tool set too.
     
    Edit: couldn't post the links as links so here are text for links.
     
    Chassis-
    MB-
     
     
    RAM-
     
    Heat Sink-
     
     
    Hopefully the above links show. If they don't I must not have reached 11 posts and cannot post links at this time. Sorry.



     
    post edited by 9krausec - 2017/09/29 11:26:11
    #11
    Chris21010
    FTW Member
    • Total Posts : 1587
    • Reward points : 0
    • Joined: 2006/05/03 07:26:39
    • Status: offline
    • Ribbons : 9
    Re: Temperature considerations for stacked 1080ti 2017/09/29 11:50:27 (permalink)
    your thread is already here in the crypto section, that is how i saw it ;) now i am not a miner in the strictest sense either, as i run F@H. This also requires pcie x8, so i understand that limitation very well. this will limit you to a max of 5 cards for intel, but you could get up to 7 on ryzen threadripper. i personally have ~14 towers with two 1080ti GPU's each, with the full heatsink, not blower. they are in a non-A/C environment so the ambient air temps fluctuate a lot, and even during 90F days the temps are in the 70C range for the upper card and 60C on the lower one; fan speed is also automatic and not locked to 100%. but i have a LOT of airflow through the cases to help compensate, ~75 CFM each.
     
    the only place i would personally put a blower style GPU over a full heatsink is when you have very limited space and/or poor airflow. if those are not of concern i would pass on the blower style, as those will very likely hit thermal limits due to their vastly smaller heatsink, which i personally think is too small for a 250W TDP 1080ti.
    post edited by Chris21010 - 2017/09/29 11:53:48
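    Chris's per-card temps (70C upper, 60C lower) come from watching the GPUs under load. A minimal sketch of that kind of check, assuming the output format of `nvidia-smi --query-gpu=index,temperature.gpu --format=csv,noheader`; the sample string and the ~84 C throttle point are illustrative assumptions, not captured output:

    ```python
    # Sketch for watching per-card temps across a stack. Parses output shaped
    # like nvidia-smi --query-gpu=index,temperature.gpu --format=csv,noheader
    # The sample string below is made up for illustration.

    def parse_temps(csv_text):
        """Return {gpu_index: temp_C} from nvidia-smi-style CSV output."""
        temps = {}
        for line in csv_text.strip().splitlines():
            idx, temp = line.split(",")
            temps[int(idx)] = int(temp)
        return temps

    def too_hot(temps, limit_c=84):
        """Indices of cards at/above the assumed ~84 C boost step-down point."""
        return [i for i, t in temps.items() if t >= limit_c]

    sample = "0, 71\n1, 78\n2, 85\n3, 69"
    temps = parse_temps(sample)
    print(too_hot(temps))  # prints [2]
    ```

    In a stacked chassis the interesting signal is usually the spread between the top and bottom cards, so logging all indices together (rather than a single card) is what catches the "middle card runs hot" pattern.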


    #12
    9krausec
    New Member
    • Total Posts : 16
    • Reward points : 0
    • Joined: 2017/09/28 10:16:21
    • Status: offline
    • Ribbons : 0
    Re: Temperature considerations for stacked 1080ti 2017/09/29 12:17:16 (permalink)
    Thank you for the follow-up Chris! You hit the nail on the head with my initial questioning of blower style vs non-blower style which was-
     
    If all the units share the same base GPU processor and one has a larger heat sink and more fans, why wouldn't that be the optimal choice of card over an encased blower-style card?
     
    I figured it had to do with the hot air from the heatsink being pushed out the side and directly onto the card it's placed next to. As long as the fans on the heat sinks are exhaust fans blowing air out, then a high air speed through the case should dissipate the heat through the rear.
     
    So a few questions for you good sir-
     
    Why 14 towers with only 2 cards per unit? Why not consolidate more cards per unit? Although I'm not doubting your knowledge on this topic, I am worried that 4-6 (or, as you now say, 5 cards on Intel) jammed in a 4U server chassis will have far more considerations on temps than 2 cards in a tower (which I assume is a full tower?).
     
    If you are also constrained to 8x to 16x for your farm application, would you mind sharing with me what risers you have found to be the "best" performing for 8x to 16x? Right now I'm planning on going full 16x to 16x with Thermaltake TT pcie riser cables, but they are costly and from the sounds of it, hit or miss (I think I lucked out with the two 600mm versions I purchased for my home workstation).
     
    I have decided to stick with Intel for the CPU. Although a Ryzen intrigues me, there are too many unknowns about it so far for me. The Xeons I have picked out I feel comfortable with having dealt with my fair share of Xeons before (they run fairly cool and I've never had any problems with them or their supported socket types before).
     
    Big thanks for pointing out that an Intel CPU can support a max of 5 cards! I had no clue about that little tidbit.
     
    What do you think of the other components in the build? How about the question about the two options for a PSU unit?
     
    Thank you for your time, and a big thank you to everyone else who has replied to this thread! It's great to be able to double-check this stuff before spending the company's money on this build and getting burnt for overlooking something.
     
    -Clayton
     
    EDIT: The riser cables I need will have to be long enough to reach the video cards in the 6 GPU setup I posted in the image above. I don't know what that distance will be until I get the case, motherboard, and a single GPU to measure the exact distance after mounting them. Sorry for being too vague in that arena. I know the distance the riser needs to span plays a big role in what riser to get.
    post edited by 9krausec - 2017/09/29 12:21:14
    #13
    Chris21010
    FTW Member
    • Total Posts : 1587
    • Reward points : 0
    • Joined: 2006/05/03 07:26:39
    • Status: offline
    • Ribbons : 9
    Re: Temperature considerations for stacked 1080ti 2017/09/29 12:50:51 (permalink)
    for my application its more cost effective to buy $400 towers to hold two cards than spend ~$3000 to hold 4. it also allows for better temps as you are not cramming as many cards in a single box and also allows me to avoid risers. but for your application of rendering i do not know how effective splitting the load over your network to other machines will be. so for your application a single tower with as many GPU's as you can fit, i think, would be ideal. 
     
    as for risers if you only have a single card you can get cheap unshielded ones but if you are using more than two you must make sure that you get shielded risers. something like this: https://www.newegg.com/Pr...x?Item=N82E16812183021
     
    as to the max of 5 GPU's, i said this assuming a single intel CPU, as it has a max of 40 PCIE lanes. but if you are using xeons and dual CPU's then that limit goes up to 80 PCIE lanes, or 10 cards, if you pick the right CPU's.
     
    Dual PSUs will be required if you do anything over 5x 1080ti's. i personally have a rig with 5x 1080ti's on a single 1600W PSU and it's pulling ~1400W from the wall. i was also forced into putting this rig on 240V power, because at 120V the amperage was too high, making the power supply/plug overheat and shut down. but even that is pushing the limits of a single PSU.
    post edited by Chris21010 - 2017/09/29 12:56:39
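    The lane and wall-power arithmetic in this post can be written out explicitly. A sketch assuming x8 per card (the Redshift minimum mentioned earlier) and the ~1400 W wall figure from this post:

    ```python
    # Lane-budget and wall-amperage arithmetic behind Chris21010's numbers.
    # Assumptions: each GPU gets at least x8, single Broadwell-EP Xeon = 40
    # PCIe lanes, dual sockets = 80, and ~1400 W measured at the wall.

    def max_gpus(cpu_lanes, lanes_per_gpu=8):
        """How many cards fit in the CPU's lane budget at x8 each."""
        return cpu_lanes // lanes_per_gpu

    def wall_amps(watts, volts):
        """Current drawn from the wall at a given line voltage."""
        return watts / volts

    print(max_gpus(40))                    # single Xeon -> 5 cards
    print(max_gpus(80))                    # dual Xeons  -> 10 cards
    print(round(wall_amps(1400, 120), 1))  # ~11.7 A on 120 V
    print(round(wall_amps(1400, 240), 1))  # ~5.8 A on 240 V
    ```

    The amperage halving is the whole story behind the 240 V move: a 15 A household circuit is only rated for ~12 A continuous, so ~11.7 A from one rig leaves essentially no margin on 120 V.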


    #14
    9krausec
    New Member
    • Total Posts : 16
    • Reward points : 0
    • Joined: 2017/09/28 10:16:21
    • Status: offline
    • Ribbons : 0
    Re: Temperature considerations for stacked 1080ti 2017/09/29 13:09:27 (permalink)
    That makes total sense. Besides the additional cost of buying more hardware to spread things out, I have a 25U server room restriction to adhere to as well. So cramming as much as I safely can into each 4U box is an important aspect to utilize all the limited space I'm being given.
     
    I've purchased and installed 2 of those 600mm thermaltake pcie cables for my home workstation and they are working great so far (stress testing them tonight). I'll most likely end up buying the same thing for the farm, but hopefully can get away with a shorter length. PCIe lanes per CPU make total sense (a-duh on my part).
     
    I'm definitely getting a better idea for this build from the discussion here. Shielded PCIe risers are going to be necessary, BUT what are your thoughts on purchasing a 90 degree PCIe elbow? Depending on how much overhead space I have in the server, I'm concerned about potential headroom issues with the PCIe riser cable. I don't want to kink them, so I was looking at an elbow to turn them to a 90 degree angle. If I can get away without them, so much the better, BUT would having the elbow be an issue? Does the elbow need to be shielded?
     
    Other than that last question, the next steps will be as follows-
     
    1. Order all listed hardware except for the GPUs and risers for one test machine.
    2. Install everything into the case and figure out any custom GPU brackets I need to make, if necessary. I'll have a couple of 1080ti cards around to work out spatial stuff. Measure the length from the PCIe slots to one of the mounted 1080ti cards.
    3. Order PCIe risers and a single GTX 1080ti FTW3 card. With those in hand I'll have 3 1080ti FTW3 cards for testing heat (not five, but at least I can sandwich the cards to make sure that temps are nominal under load).
    4. If that all checks out, order enough GTX 1080ti FTW3 cards to be able to fill the test pilot machine up with the max cards. Run heavy stress testing for 1-3 weeks.
    5. After that all checks out I'll order everything else to build 5 of these units.
     
    Does that sound pretty logical, and like the best way to mitigate the risk of building these units? Thanks again.
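    Step 4's 1-3 week stress test will produce a pile of temperature readings over time. A small helper for boiling a logged run down to per-GPU maximums; the sample readings below are made up for illustration:

    ```python
    # Summarize a logged stress run: a run is a list of samples, each sample
    # a {gpu_index: temp_C} dict (e.g. one sample per minute). The readings
    # here are fabricated example data, not real measurements.

    def summarize(run):
        """Return the per-GPU maximum temperature seen across the run."""
        maxes = {}
        for sample in run:
            for gpu, t in sample.items():
                maxes[gpu] = max(maxes.get(gpu, t), t)
        return maxes

    run = [{0: 70, 1: 72}, {0: 83, 1: 86}, {0: 79, 1: 84}]
    print(summarize(run))  # prints {0: 83, 1: 86}
    ```

    Comparing these maximums card-by-card over a multi-week run is what shows whether the sandwiched middle cards creep toward throttling territory while the outer ones stay cool.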
     
     
    #15
    Chris21010
    FTW Member
    • Total Posts : 1587
    • Reward points : 0
    • Joined: 2006/05/03 07:26:39
    • Status: offline
    • Ribbons : 9
    Re: Temperature considerations for stacked 1080ti 2017/09/29 15:11:58 (permalink)
    yea, i could see getting a 90 deg elbow like this but i honestly have no idea how you would run the x16 riser cable around to make it from the board to the GPU's when they are laid out like in the mining rig pic above. they can easily get away with it because they can just plug a usb into the side and not have to worry about a bulky cable... honestly you may be forced to stay with a longer cable just so you can make the turns required to connect them.
     
    that or you can go here and build out a 4U 8x GPU system with little to no hassle: http://www.thinkmate.com/...superserver-4028gr-trt
    post edited by Chris21010 - 2017/09/29 15:15:55


    #16
    9krausec
    New Member
    • Total Posts : 16
    • Reward points : 0
    • Joined: 2017/09/28 10:16:21
    • Status: offline
    • Ribbons : 0
    Re: Temperature considerations for stacked 1080ti 2017/09/29 15:33:46 (permalink)
    All good points and I'm sort of blown out of the water by the thinkmate suggestion. That thing looks perfect. I'll take some time to research it over the weekend... It's all about cost per GPU at this point. Also expandability is a definite factor. Spend ~25k on 2x supermicro servers with 8 GPUs per unit OR spend 25k on 5 servers with 2 GPUs per unit, but every year budget for 3k to expand each unit to 5 GPUs.
     
    The latter option doesn't mean anything if it cannot be done as you've pointed out with the riser cables. Too much to think about. :) Just trying to stretch the funds as far as possible, but I will look into the thinkmate suggestion because as I've said, that unit looks perfect.
     
    Thank you!
    #17
    Chris21010
    FTW Member
    • Total Posts : 1587
    • Reward points : 0
    • Joined: 2006/05/03 07:26:39
    • Status: offline
    • Ribbons : 9
    Re: Temperature considerations for stacked 1080ti 2017/09/29 15:38:30 (permalink)
    yea depending on your location thinkmate might not be the best place to buy but it was a place to show buildout options on a 4028gr, the system i thought fit your application the best.


    #18
    9krausec
    New Member
    • Total Posts : 16
    • Reward points : 0
    • Joined: 2017/09/28 10:16:21
    • Status: offline
    • Ribbons : 0
    Re: Temperature considerations for stacked 1080ti 2017/09/29 15:56:13 (permalink)
    Yeah, it does look like it would fit my application the best. As far as riser cable management, what do you think of this setup? I know these cables really shouldn't kink at all (or at least I'm not looking to kink my cables for the obvious delicacy of the product). This would be the way I'd go about wiring the PCIe cable. Thoughts on plausibility? I still think building custom units would help stretch the dollar and allow for best future expansion of video cards added to the system, but if this just isn't possible without a custom MB like your thinkmate OR has a high chance of failure, I will just end up going the thinkmate route as it's my only option I suppose.
     
    I just have 24U of space in a server room and 25k budget. Trying to figure out options to stretch it. Thanks for sticking with me through our talks.
     



    #19
    Chris21010
    FTW Member
    • Total Posts : 1587
    • Reward points : 0
    • Joined: 2006/05/03 07:26:39
    • Status: offline
    • Ribbons : 9
    Re: Temperature considerations for stacked 1080ti 2017/09/29 16:10:07 (permalink)
    that works, the only downside is that i think all the 90 deg risers would point to the opposite side from the cable when you have the cards upside down. so maybe keep all the pcie cables on the bottom... and with all the cable bulk you'll get when folding the cables for that 90 deg turn upward, that 6th card slot will be eaten away, i believe.


    #20
    9krausec
    New Member
    • Total Posts : 16
    • Reward points : 0
    • Joined: 2017/09/28 10:16:21
    • Status: offline
    • Ribbons : 0
    Re: Temperature considerations for stacked 1080ti 2017/09/29 16:51:19 (permalink)
    Oh whoops, 6 cards in that shot; only 5 will be there because of the single Xeon. My bad. Well, I think I'm going to sleep on it and mull it over this weekend. A question that comes to mind is how much bend these cables can take. For the 90 degree elbow, I'm thinking as long as I have enough headroom I may not even need it. Depends on how low I can mount the cards and how much I can bend the PCIe riser cable safely. Now I'm thinking about flipping them so power is on top too... Going to look to see if anyone makes 90 degree bends for dual 8-pin connectors. If they do, I might just want to keep them on the bottom and bring the cards as low as possible.
     
    Well, if you think that bending the cable like the image I made might just work, I think I'll buy a riser cable and test it out by putting a gradual to sharp bend in it. Any chance a broken PCIe cable would fry a video card or a motherboard? Wouldn't want to kill my system to test it so I might see about using an older MB I can get my hands on. Older video card too.
     
    I just really like the idea of 5 4U cases with a max capacity of 5 cards in each which would allow me to expand to maximum over a few years. Also the gaps between the cards would allow them to manage heat better, unlike the blower cards in that prebuilt case (which would work fine, but no gaps as they are plugged directly into the MB - sure it wouldn't throttle too much though considering the air flow). Hmm.... The gears are turning. haha. Thanks.
    #21
    Chris21010
    FTW Member
    • Total Posts : 1587
    • Reward points : 0
    • Joined: 2006/05/03 07:26:39
    • Status: offline
    • Ribbons : 9
    Re: Temperature considerations for stacked 1080ti 2017/09/29 17:03:33 (permalink)
    the closest thing to a 90 deg power connector is the EVGA PowerLink. as for the things that could happen with a damaged riser.... no idea, sry.


    #22
    9krausec
    New Member
    • Total Posts : 16
    • Reward points : 0
    • Joined: 2017/09/28 10:16:21
    • Status: offline
    • Ribbons : 0
    Re: Temperature considerations for stacked 1080ti 2017/09/29 17:08:34 (permalink)
    That's a cool little elbow. It would block the gaps between cards, but thanks for posting it up. Not that I can think of any specific use for it, but I do have access to a large 3D printer... I have no clue how well those plastic bits would hold up to heat inside of a server case stacked with GPU cards, but just saying: I can print just about anything and know SolidWorks well enough to design simple stuff.
     
    Well, I'll post any updates to this thread about my discoveries. Probably just going to start testing the bend-ability of the PCIe riser cable soon. Better to bust a $100 cable than screw up a $5,000 unit and be out substantially more.
     
    Who knows, maybe after the completion of this thread I'll post it up on a blog or something to help other people in this situation. :) Thank you for all the tips and tricks Chris. Didn't think the PCIe riser cable would be the hardest mountain to climb with this build. Makes total sense though.
     
    Cheers.
    #23
    QuintLeo
    SSC Member
    • Total Posts : 946
    • Reward points : 0
    • Joined: 2016/04/16 23:05:09
    • Status: offline
    • Ribbons : 3
    Re: Temperature considerations for stacked 1080ti 2017/09/29 18:50:25 (permalink)
    (quoting 9krausec's original post)
 With the case you are planning on (it looks like that Rosewill "mining" 4U case?), you should have enough thermal management and airflow to handle 5 1080ti cards; 6 might be marginal, and forget 6 if you use cards like the Aorus that are "3 slots" wide.
     
 I am pretty sure rendering wants a LOT of PCI-E bandwidth, like Folding does but NOT like mining does - USB-type risers only carry *1* PCI-E lane to the GPU and are probably going to be a serious bandwidth limitation for your usage, with a real performance impact.
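The bandwidth gap QuintLeo describes is easy to put numbers on. A rough sketch (PCIe 3.0 raw rate and 128b/130b encoding only; real links lose a bit more to packet overhead):

```python
# Rough PCIe 3.0 throughput per link width (back-of-envelope; ignores
# protocol overhead beyond the 128b/130b line encoding).
GT_PER_S = 8.0            # PCIe 3.0 raw rate, gigatransfers/s per lane
ENCODING = 128.0 / 130.0  # 128b/130b encoding efficiency

def pcie3_gbps(lanes):
    """Approximate usable bandwidth in GB/s for a PCIe 3.0 link."""
    return GT_PER_S * ENCODING * lanes / 8.0  # bits -> bytes

x1 = pcie3_gbps(1)    # typical USB-style mining riser
x16 = pcie3_gbps(16)  # full-length slot
print(f"x1: {x1:.2f} GB/s, x16: {x16:.2f} GB/s, ratio: {x16/x1:.0f}x")
```

A x1 riser gives roughly 1 GB/s versus about 16 GB/s for a full x16 slot, which is why risers that are fine for mining can hurt rendering workloads that stream textures and geometry to the card.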
     
 Aftermarket 1080 ti cards, from what I have seen, tend to have the same TDP as the FE cards, but have MUCH BETTER cooling solutions as a general rule.
 There MIGHT be "higher TDP" aftermarket card(s) I am not aware of, but my Aorus has the same 250 watt TDP as the FE cards, so I'm starting to doubt it.
     
 If you decide to go with blower-type cards (probably a good idea in a rack-mount case), I would recommend the ASUS 1080ti Turbo model, as it is the ONLY 1080ti card on the market that doesn't block part of the airflow exit with a huge DVI connector, and it will provide a lot better cooling than ANY other blower-type card as a result.
 There are some fan-type cards with horizontal fins that blow a lot of the hot air out the back and MIGHT cool as well due to more overall airflow through the heatsink assembly, but I doubt they would end up cooling much, if any, better than that specific ASUS in your usage.
     
    1.5 inches between 1080 ti cards is going to be VERY iffy for cooling.
    I'd aim for at LEAST 2 and preferably 2.5-3 inches.
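For ballpark case airflow, a common electronics-cooling rule of thumb for air near sea level is CFM ≈ 3.16 × watts / ΔT(°F), where ΔT is the allowed exhaust-minus-intake temperature rise. A quick sketch for four 250 W cards (the 20 °F rise is an assumption, not a spec):

```python
# Back-of-envelope airflow estimate for the 4-card 4U box, using the
# sea-level rule of thumb: CFM ~= 3.16 * watts / delta_T_F.
def required_cfm(watts, delta_t_f):
    return 3.16 * watts / delta_t_f

heat_w = 4 * 250  # four 250 W TDP 1080ti cards
print(f"~{required_cfm(heat_w, 20):.0f} CFM for a 20 F exhaust rise")
```

That is on the order of 150+ CFM through the chassis before counting CPU and PSU heat, which is why server fans, not quiet case fans, are the norm in these builds.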
     
 Also, the case you appear to intend to use has REVERSE AIRFLOW compared to normal rack-mount cases, due to how the GPUs are mounted.
 Keep that in mind if your server room is a hot-aisle/cold-aisle setup.
     
     
 On points you didn't mention - BIOSTAR and a few other folks make fairly inexpensive motherboards with four PCI-E x16 slots (though the slots tend not to all RUN at x16; one or two will, and the rest are x8 or x4).
 Powered ribbon risers would be a bit more of a pain to install, but they would probably offer better performance than powered USB risers.
 Folding the cable on such risers isn't all that big a deal - the one and only riser rig I have has two 90-degree folds in it, but that seems to have little if ANY effect on the card in that slot.
 Just avoid any more SHARP folds than you absolutely need.
     
 Run the cable UNDER the cards, not OVER them - there should be plenty of space down there - and it'll probably work better with the risers' orientation anyway.
 Most "mount many GPU" type cases leave more space below the cards than above them, for cable management and airflow reasons.
     

    Now that vorsholk has stopped his abuse, I'm returning to folding.
     I no longer MOO due to abuses by certain "whales" in the Gridcoin community - so I now work the Distributed.net project directly again.
     
    #24
    Chris21010
    FTW Member
    • Total Posts : 1587
    • Reward points : 0
    • Joined: 2006/05/03 07:26:39
    • Status: offline
    • Ribbons : 9
    Re: Temperature considerations for stacked 1080ti 2017/09/29 18:56:35 (permalink)
    QuintLeo
 If you decide to go with blower-type cards (probably a good idea in a rack-mount case), I would recommend the ASUS 1080ti Turbo model, as it is the ONLY 1080ti card on the market that doesn't block part of the airflow exit with a huge DVI connector, and it will provide a lot better cooling than ANY other blower-type card as a result.



All FE-style cards, by any brand, are also single-slot-exhaust cards without a DVI connector.


    #25
    Ranmacanada
    SSC Member
    • Total Posts : 992
    • Reward points : 0
    • Joined: 2011/09/22 10:44:47
    • Status: offline
    • Ribbons : 3
    Re: Temperature considerations for stacked 1080ti 2017/09/29 20:38:51 (permalink)
With buying this many cards, I wonder if it would be best to wait until the release of the 1070ti on the 26th to see if card prices actually come down. A rendering unit like this is expensive, and I know downtime is even more expensive, but saving money is always good. Also, as Chris mentioned in regards to the power issues: you're going to need a dedicated 240 volt line for this, and probably a 2000 watt power supply, as a 1600 watt is not going to cut it. http://www.super-flower.com.tw/products_detail.php?class=2&sn=16&ID=119&lang=  Super Flower makes a 2000 watt PSU, but I think it is only for the EU market.
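For rough PSU sizing, a quick sketch helps show why 1600 W is borderline for this build (the CPU and "rest of system" figures below are assumptions, not measured values):

```python
# Rough PSU sizing for a 4x 1080ti render node.
gpu_w = 4 * 250   # four 250 W TDP cards
cpu_w = 140       # assumed CPU draw under load
rest_w = 100      # assumed fans, drives, board, RAM

load_w = gpu_w + cpu_w + rest_w

# Keep the PSU at roughly 80% load for efficiency and headroom.
headroom = 1.25
recommended_w = load_w * headroom
print(f"load ~{load_w} W -> recommend >= {recommended_w:.0f} W PSU")
```

With around 1240 W of sustained load, a 1600 W unit runs close to its comfort zone, which is where a 2000 W supply (or two smaller supplies) starts to make sense.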

     

    ASUS TUF GAMING X570-PLUS (WI-FI)
    AMD Ryzen 2700
    Fold for the CURE!
    EVGA 1080 FTW
    EVGA 1080Ti Hybrid

    #26
    Chris21010
    FTW Member
    • Total Posts : 1587
    • Reward points : 0
    • Joined: 2006/05/03 07:26:39
    • Status: offline
    • Ribbons : 9
    Re: Temperature considerations for stacked 1080ti 2017/09/29 21:24:06 (permalink)
Yeah, I do not see a power supply over 1600W getting released in the US market, because at only 1400W the MASSIVE 10 AWG plug that comes with my 1600W power supply gets hot enough that you could not hold it indefinitely without feeling like you were getting burned... the amperage is just too high @ 120V and not safe for constant loads.
 
The server side is a different story, though, as those units are designed for 208V 3-phase power or some other higher-voltage system to begin with.
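Putting numbers on that: wall-side current for the same DC load drops substantially at 208/240 V. A sketch assuming roughly 92% PSU efficiency (an assumption, not a measured figure):

```python
# Wall-side current draw for a given DC load, assuming ~92% PSU
# efficiency at this load point.
def wall_amps(dc_watts, volts, efficiency=0.92):
    return dc_watts / efficiency / volts

# 1400 W DC load on common supply voltages.
for v in (120, 208, 240):
    print(f"{v} V: {wall_amps(1400, v):.1f} A")
```

At 120 V a 1400 W load pulls around 12-13 A at the wall, near the practical limit of a standard outlet, while the same load at 208 V draws a bit over 7 A.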
    post edited by Chris21010 - 2017/09/29 21:26:47


    #27
    QuintLeo
    SSC Member
    • Total Posts : 946
    • Reward points : 0
    • Joined: 2016/04/16 23:05:09
    • Status: offline
    • Ribbons : 3
    Re: Temperature considerations for stacked 1080ti 2017/10/06 09:51:43 (permalink)
    Chris21010
    QuintLeo
 If you decide to go with blower-type cards (probably a good idea in a rack-mount case), I would recommend the ASUS 1080ti Turbo model, as it is the ONLY 1080ti card on the market that doesn't block part of the airflow exit with a huge DVI connector, and it will provide a lot better cooling than ANY other blower-type card as a result.



All FE-style cards, by any brand, are also single-slot-exhaust cards without a DVI connector.




     I'd forgotten about the FE cards, as their cooling solution isn't all that good.
     
 It is not practical to build an ATX-type power supply over 1600 watts for the US market, due to the widespread standard use of 15 amp NEMA 5-15R duplex outlets on 117 VAC (nominal) circuits.
 Even a 1600 running at full capacity is PUSHING a little past the limits of such a circuit in areas where the voltage is lower than the nominal "117 VAC" - voltages in some areas are commonly closer to 110 VAC.
 
 Server supplies get bigger because 220 volt distribution is common in data centers (a legacy of "standard design decisions" made back in the mainframe era, when a single computer often ate many kilowatts).
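The circuit-limit math works out like this (a sketch: the 90% PSU efficiency is an assumption, while the 80% factor is the standard NEC derating for continuous loads on a branch circuit):

```python
# NEMA 5-15 circuit limits vs. a 1600 W PSU at full load.
breaker_a = 15
continuous_limit_a = breaker_a * 0.8  # NEC continuous-load derating -> 12 A

def wall_watts_limit(volts):
    """Max continuous wall power the circuit should carry."""
    return continuous_limit_a * volts

def psu_wall_amps(dc_watts, volts, eff=0.90):
    """Wall current drawn by a PSU delivering dc_watts, assumed eff."""
    return dc_watts / eff / volts

print(f"continuous limit at 110 V: {wall_watts_limit(110):.0f} W")
print(f"1600 W PSU at full load, 110 V: {psu_wall_amps(1600, 110):.1f} A")
```

At low line voltage the circuit supports roughly 1300 W continuous, while a fully loaded 1600 W supply would pull about 16 A - well past the 12 A continuous rating, which is exactly the problem QuintLeo describes.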
     

    Now that vorsholk has stopped his abuse, I'm returning to folding.
     I no longer MOO due to abuses by certain "whales" in the Gridcoin community - so I now work the Distributed.net project directly again.
     
    #28
    idris
    New Member
    • Total Posts : 2
    • Reward points : 0
    • Joined: 2018/01/09 16:32:06
    • Status: offline
    • Ribbons : 0
    Re: Temperature considerations for stacked 1080ti 2018/01/09 17:04:44 (permalink)
    9krausec
Hello! I am not a miner, but I am looking to build a GPU-based render farm out of EVGA 1080ti cards (once stock is replenished). I'm curious about temperatures of the 1080ti line under full load.
 
Since I am planning to put 4 1080ti cards in each 4U server box, heat is one of my primary considerations. From what I understand, the aftermarket, non-FE 1080ti cards generate more heat output. Is this correct?

To me that doesn't make much sense, as the same fundamental guts should be used in each version of the 1080ti. The FTW3 series sounds like it has the best cooling of all the air-cooled cards, as it has a triple-fan setup and an "over engineered" heatsink.
 
So I'm thinking that if I stack 4 of them vertically, space them 1 1/2 inches apart, push enough air through the case, AND keep the case in an AC-cooled server room, I would be just as well off using an FE version as an SC or FTW3 version. Is this a correct assessment? The goal is to have adequate cooling so the cards do not throttle themselves back due to temps.
 
Anyone want to weigh in on stacking 4 1080ti FTW3 cards vertically with 1 1/2 inch gaps between cards, air being forced through a 4U case, and having it in an AC server room? Bad idea? Would I be best off getting the FE or single-blower model?
 
Hybrid cards are out of the question, as I don't think I would have enough room in a server chassis to put the radiators anywhere (unless I cut slots in the top of the case for the radiator and fans to protrude).
     
    Thanks,
    Clayton


     
    Hi Clayton,
     
I'm building a small workstation for Redshift rendering for a University course, with 2x 1080Ti cards.
Would you recommend the blower or hybrid cards? Or even the Kingpins?
     
    I'm looking into an i7 8700K with 32GB
    and either this case:

    or

     
    Any recommendations?
    Thanks
    Idris
    #29
    Chris21010
    FTW Member
    • Total Posts : 1587
    • Reward points : 0
    • Joined: 2006/05/03 07:26:39
    • Status: offline
    • Ribbons : 9
    Re: Temperature considerations for stacked 1080ti 2018/01/09 20:22:04 (permalink)
You need more posts in order to have links. But with 2x 1080tis, the key things to look into are keeping at least a single slot between the cards, making sure the cards have 2+ fans, and getting plenty of airflow through the case. You can also go with one hybrid and put it on top; this too will help a lot with temps, as the top GPU is always the hotter one.
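For keeping an eye on which card runs hotter, `nvidia-smi --query-gpu=index,temperature.gpu --format=csv,noheader` reports per-GPU temperatures. A minimal parser sketch (the sample string below is illustrative, not real telemetry):

```python
# Parse nvidia-smi CSV output of the form "index, temperature".
# Sample stands in for real `nvidia-smi --query-gpu=index,temperature.gpu
# --format=csv,noheader` output.
sample = "0, 78\n1, 66\n"

def parse_temps(csv_text):
    """Return {gpu_index: temp_c} from nvidia-smi CSV lines."""
    temps = {}
    for line in csv_text.strip().splitlines():
        idx, temp = (field.strip() for field in line.split(","))
        temps[int(idx)] = int(temp)
    return temps

temps = parse_temps(sample)
hottest = max(temps, key=temps.get)
print(f"GPU {hottest} is hottest at {temps[hottest]} C")
```

Logging this during a long render makes it easy to confirm whether the top card is in fact the one throttling.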


    #30