Dabadger84
CLASSIFIED Member
- Total Posts : 3426
- Reward points : 0
- Joined: 2018/05/11 23:49:52
- Location: de_Overpass, USA
- Status: offline
- Ribbons : 10
Re: New EVGA.com Notification Checkout Process
2021/02/08 11:30:00
(permalink)
rjbarker Considering a 9900K OC'd (all cores) to 5.2GHz w/ 1.4V (vcore) can peak between 300W - 400W, I would say your "well north of 200W" is being quite conservative (of course this is all dependent on vcore applied). As always with PSUs, keeping your load to around 60% of rated will keep you in the "high efficiency" ballpark... vs running the PSU at say 85%. OC'ing both GPU and CPU adds a whole lot to the equation, a lot more than folks realize... on the other hand, how often do you run both your CPU and GPU under "max load" simultaneously?
Meanwhile my lowly 9900K @ 5GHz slurps down 170-200W under load and can't even go to 5.1GHz regardless of voltage, cuz I either can't keep it cool or it's just a garbage-bin-tier die. The 200W figure is more for the "typical OC" folks, i.e. those that'll only push it to try & get the "All Core" speed to the stock max boost - I expect it to be more like 250-300W for that for most, and anything north of that is going to get stupid hot/high wattage, very fast. I'm buying the Z590 Dark as much for the looks as I am for its abilities, which is a rarity for me; I don't think I've ever purchased hardware before based as much on how it looks as on how it performs (assuming it lives up to the Z390 & Z490 Dark's legacies). The whole "when does loading both happen besides benchmarks" thing is always a good point, but then the "when it does, you need this much power" counter-point always holds up. That's why I bought a 1200W "I will not have issues until this PSU gets old" solution myself.
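(As a rough illustration of the PSU-sizing math being tossed around here - combined peak draw vs. the ~60% "high efficiency" target - here's a minimal back-of-the-envelope sketch in Python. Every wattage, clock, and voltage figure in it is a placeholder assumption, not a measurement, and the f·V² scaling is only a crude approximation of CPU dynamic power.)

```python
# Back-of-the-envelope PSU headroom check (every number here is an assumption).

def cpu_power_scaled(base_watts, base_ghz, base_vcore, new_ghz, new_vcore):
    """Crude dynamic-power scaling: P roughly proportional to f * V^2."""
    return base_watts * (new_ghz / base_ghz) * (new_vcore / base_vcore) ** 2

# Example: a 9900K drawing ~200 W at 5.0 GHz / 1.30 V, pushed to 5.2 GHz / 1.40 V.
cpu_peak = cpu_power_scaled(200, 5.0, 1.30, 5.2, 1.40)   # ~241 W

gpu_peak = 450        # assumed peak for a heavily overclocked 3090-class card
rest_of_system = 75   # drives, fans, pump, RAM, board - rough guess

total = cpu_peak + gpu_peak + rest_of_system
for psu_watts in (850, 1000, 1200):
    load_pct = 100 * total / psu_watts
    print(f"{psu_watts} W PSU -> {total:.0f} W simultaneous peak = {load_pct:.0f}% load")
# A 1200 W unit keeps this worst case near the ~60% "high efficiency" zone,
# while an 850 W unit would be sitting around 90% at simultaneous peak.
```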
mech9t5 So two hours in and no one has posted e-mail drops. It doesn't look like a good start for this week lol
And it's already 11:30 Pacific almost, not a good sign indeed.
BovineGamer Dabadger84 gets a lot of flak, but you will not find many 3090 owners like him outright saying you are fine with a 3080. He is the reason I have not yet pulled the trigger on a 3090
I do quite frankly stand by that point: anyone seriously wanting a 3090 for any other reason than "because I can" or "I want the best no matter the cost" is bonkers imo. The performance difference for 90% of use cases is never going to justify the increase in cost. Even in my "semi-unique" use case of wanting to drive 5120 x 1440 (7.4M pixels, almost equal to 2160p/"4K") at acceptable frame rates in the newest titles, it's still an eeeeeeeeeeh moment. I did it because I'm living this year in a very "yolo" mode after the horrible year I, and I imagine a good lot of people, had in 2020 - and going specifically for the Kingpin was a "I haven't had one of those since they were called Classified cards" bonus for me personally. It in particular is at least something you can always fiddle with when bored, at least for a while until you figure out the limits of your particular card - regardless of voltage, more because of cooling. The 3090 is an amazing GPU - but we have to keep in mind that we're likely going to be getting 3090-like performance, for $1k or so, in probably 6 months or so, so spending $1600+ on one now, unless you REALLY need it, or REALLY need that 24GB frame buffer of vRAM... yeah, no. Don't do et. DON'T DO THAT SKIP!
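(Quick sanity check on that pixel-count comparison - trivial arithmetic, nothing assumed beyond the two resolutions named above.)

```python
# Pixel counts: 32:9 super-ultrawide vs. standard 4K/UHD.
ultrawide = 5120 * 1440   # 7,372,800 pixels (~7.4M)
uhd = 3840 * 2160         # 8,294,400 pixels (~8.3M)
print(f"5120x1440 has {100 * ultrawide / uhd:.0f}% of the pixels of 3840x2160")
# -> ~89%, so per-frame GPU load really is close to (a bit under) "4K".
```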
ModRigs: https://www.modsrigs.com/detail.aspx?BuildID=42891 Specs: 5950X @ 4.7GHz 1.3V - Asus Crosshair VIII Hero - eVGA 1200W P2 - 4x8GB G.Skill Trident Z Royal Silver @ 3800 CL14 - Gigabyte RTX 4090 Gaming OC w/ Core: 2850MHz @ 1000mV, Mem: +1500MHz - Samsung Odyssey G9 49" Super-Ultrawide 240Hz Monitor
|
static.quai
SSC Member
- Total Posts : 682
- Reward points : 0
- Joined: 2021/01/12 13:10:55
- Status: offline
- Ribbons : 0
Re: New EVGA.com Notification Checkout Process
2021/02/08 11:40:40
(permalink)
Dabadger84
I do quite frankly stand by that point: anyone seriously wanting a 3090 for any other reason than "because I can" or "I want the best no matter the cost" is bonkers imo. The performance difference for 90% of use cases is never going to justify the increase in cost. Even in my "semi-unique" use case of wanting to drive 5120 x 1440 (7.4M pixels, almost equal to 2160p/"4K") at acceptable frame rates in the newest titles, it's still an eeeeeeeeeeh moment. I did it because I'm living this year in a very "yolo" mode after the horrible year I, and I imagine a good lot of people, had in 2020 - and going specifically for the Kingpin was a "I haven't had one of those since they were called Classified cards" bonus for me personally. It in particular is at least something you can always fiddle with when bored, at least for a while until you figure out the limits of your particular card regardless of voltage, more of because of cooling. The 3090 is an amazing GPU - but we have to keep in mind that we're likely going to be getting 3090-like performance, for $1k or so, in probably 6 months or so, so spending $1600+ on one now, unless you REALLY need it, or REALLY need that 24GB frame buffer of vRAM... yeah, no. Don't do et. DON'T DO THAT SKIP!
This is incorrect. I'm not a gamer. I'm a data scientist. I need that extra ram on the card. I'm currently rocking a 1080TI and a 1070 for machine learning tasks. The 1080TI has more RAM than the 3080. I work with very large datasets. I need more than 10 gigs.
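(For readers weighing the same 10 GB vs. 11 GB vs. 24 GB question: a minimal sketch, assuming PyTorch and a made-up 14 GiB working-set estimate, of the kind of check that decides whether a given card's frame buffer is enough. The `fits_in_vram` helper and the 14 GiB figure are illustrative, not from the post above.)

```python
# Minimal sketch: does the active CUDA card have room for an estimated working set?
# Requires PyTorch with CUDA; the 14 GiB figure below is a made-up example.
import torch

def fits_in_vram(required_gib, device=0):
    """Compare an estimated working set (model + batch + activations) to the card."""
    props = torch.cuda.get_device_properties(device)
    total_gib = props.total_memory / 1024**3
    print(f"{props.name}: {total_gib:.1f} GiB total")
    return required_gib <= total_gib

if torch.cuda.is_available():
    # e.g. a large-dataset training job estimated to need ~14 GiB on-card
    if fits_in_vram(14.0):
        print("fits on this card")
    else:
        print("needs a bigger frame buffer (e.g. a 24 GB card)")
```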
|
dexmex88
New Member
- Total Posts : 22
- Reward points : 0
- Joined: 2019/01/29 17:39:50
- Status: offline
- Ribbons : 0
Re: New EVGA.com Notification Checkout Process
2021/02/08 11:43:58
(permalink)
static.quai
This is incorrect. I'm not a gamer. I'm a data scientist. I need that extra ram on the card. I'm currently rocking a 1080TI and a 1070 for machine learning tasks. The 1080TI has more RAM than the 3080. I work with very large datasets. I need more than 10 gigs.
What exactly was incorrect?
|
Lebon14
Superclocked Member
- Total Posts : 120
- Reward points : 0
- Joined: 2020/12/11 20:13:10
- Location: Canada
- Status: offline
- Ribbons : 0
Re: New EVGA.com Notification Checkout Process
2021/02/08 11:47:05
(permalink)
STATUS OF QUEUES: EUROPE - updated by BovineGamer | NORTH AMERICA - updated by enewt
In queue for, in order I joined them:
10G-P5-3897-KR ~ 12/11/2020 ~ 8:17:23 PM PT ~ YES! (Jan. 20, 2022 - didn't buy)
08G-P5-3755-KR ~ 4/27/2021 ~ 4:14:34 PM PT ~ No (Removed)
08G-P5-3767-KR ~ 4/22/2021 ~ 3:28:25 PM PT ~ No (Removed)
08G-P5-3797-KL ~ 6/10/2021 ~ 6:48:07 AM PT ~ YES! (Sept. 21, 2021 - BOUGHT!)
|
static.quai
SSC Member
- Total Posts : 682
- Reward points : 0
- Joined: 2021/01/12 13:10:55
- Status: offline
- Ribbons : 0
Re: New EVGA.com Notification Checkout Process
2021/02/08 11:50:58
(permalink)
dexmex88
What exactly was incorrect?
That people don't need a 3090. Maybe gamers don't.
|
dexmex88
New Member
- Total Posts : 22
- Reward points : 0
- Joined: 2019/01/29 17:39:50
- Status: offline
- Ribbons : 0
Re: New EVGA.com Notification Checkout Process
2021/02/08 11:53:44
(permalink)
static.quai
That people don't need a 3090. Maybe gamers don't.
But that's what Dabadger84 wrote so I'm confused on the incorrect part.
|
static.quai
SSC Member
- Total Posts : 682
- Reward points : 0
- Joined: 2021/01/12 13:10:55
- Status: offline
- Ribbons : 0
Re: New EVGA.com Notification Checkout Process
2021/02/08 12:00:33
(permalink)
dexmex88
But that's what Dabadger84 wrote so I'm confused on the incorrect part.
I think he just said people are bonkers for buying a 3090. But I'm not perfect, maybe I misread it. I don't think he said anything about the 20GB of RAM that comes with that card.
|
dexmex88
New Member
- Total Posts : 22
- Reward points : 0
- Joined: 2019/01/29 17:39:50
- Status: offline
- Ribbons : 0
Re: New EVGA.com Notification Checkout Process
2021/02/08 12:03:08
(permalink)
static.quai
I think he just said people are bonkers for buying a 3090. But I'm not perfect, maybe I misread it. I don't think he said anything about the 20GB of RAM that comes with that card.
No big deal, was just curious. The last comment in the paragraph was "or REALLY need that 24GB frame buffer of vRAM."
|
static.quai
SSC Member
- Total Posts : 682
- Reward points : 0
- Joined: 2021/01/12 13:10:55
- Status: offline
- Ribbons : 0
Re: New EVGA.com Notification Checkout Process
2021/02/08 12:08:43
(permalink)
dexmex88
No big deal, was just curious. The last comment in the paragraph was "or REALLY need that 24GB frame buffer of vRAM."
Fair point, and a good callout. You are correct he called that out as a condition.
|
Cavell219
New Member
- Total Posts : 29
- Reward points : 0
- Joined: 2018/01/07 06:31:54
- Status: offline
- Ribbons : 0
Re: New EVGA.com Notification Checkout Process
2021/02/08 12:09:34
(permalink)
Dabadger84
Cavell219 Most miners will actually user server PSU's with a breakout board over high power super expensive PSUs.. Saves alot of money
I only mentioned it because I recently, finally, dragged out a LEPA G 1600W PSU I've had sitting on the shelf for 2+ years (since I got the eVGA 1200W Platinum PSU) and sold it on to get that heavy-arse box off me shelves, and the guy that bought it explicitly stated he was going to be using it in a multi-GPU setup for "various applications", so to be "sure to include all the GPU cables". Guy paid the BIN price instead of even trying to bid, so I take from that he either A: really wanted a high-wattage PSU and, with the shortage, didn't wanna wait, or B: money is no object cuz it's for a mining rig. I'll never know, since it's going from me to NYC then to Kuwait. And the "To Kuwait" part ain't my problem cuz the listing was restricted to U.S. only; once it was signed for in NYC my job was done in terms of ensuring intact delivery. But she was packaged to withstand a tank hit, airpack all around, & most of the outside of the outer box was supported with tape to semi-waterproof the box, at least from snow, I hope.
Yeah, I sell on eBay a lot. The GSP shipments are supposed to be inspected at the shipping centers before sending, so once it leaves there it is no longer your responsibility. It's nice, because without that I would never ship internationally - too much of a headache.
|
Dabadger84
CLASSIFIED Member
- Total Posts : 3426
- Reward points : 0
- Joined: 2018/05/11 23:49:52
- Location: de_Overpass, USA
- Status: offline
- Ribbons : 10
Re: New EVGA.com Notification Checkout Process
2021/02/08 12:31:16
(permalink)
Cavell219 Yeah, I sell on eBay a lot. The GSP shipments are supposed to be inspected at the shipping centers before sending, so once it leaves there it is no longer your responsibility. It's nice, because without that I would never ship internationally - too much of a headache.
It's even better than that in my case. He purchased with a US address as the ship-to and that's where I shipped it; he's having someone else ship it from there to him - so it's out of my hands. But like I said, I did make sure it was very well packaged & that everything was included. I pride myself on my 100% feedback on there across over 100 transactions involving mostly computer parts, don't wanna lose that over an international shipping mishap that is actually my fault :-D
I haven't actually sold stuff on eBay in a while before recently, when I started back up dumping that 2070 Super XC Hybrid I bought for the Step Up queue, then decided to heck with waiting on that - things have changed a little bit with the whole people-paying-tax-then-them-deducting-it-from-the-payout jazz. I'm assuming that was states going "HEY! HEY! SALES TAX BOOOIIIIIIII!", which I think was juuuuust starting to be a thing right after I last sold on eBay, which was 2 R9 295x2s during the bitcoin craze of initial mining madness. Still find it insane those things resold for $1000 & $1100 respectively after I got them for $999 (new) & $750 (used) right after their price dropped from the original $1500 MSRP. Ah, the days when multi-GPU was not only a thing but actually worked pretty darn well in a fair bit of titles.
Edit: Actually I take that back, I resold my Titan Xps on there after running them in SLI for a while - got tired of the blower-style noise & heat output on that setup and went to the single eVGA 1080 Ti FTW3 Hybrid. But that was over 2 years back as well.
post edited by Dabadger84 - 2021/02/08 12:43:11
ModRigs: https://www.modsrigs.com/detail.aspx?BuildID=42891 Specs: 5950X @ 4.7GHz 1.3V - Asus Crosshair VIII Hero - eVGA 1200W P2 - 4x8GB G.Skill Trident Z Royal Silver @ 3800 CL14 - Gigabyte RTX 4090 Gaming OC w/ Core: 2850MHz @ 1000mV, Mem: +1500MHz - Samsung Odyssey G9 49" Super-Ultrawide 240Hz Monitor
|
https.shaf
New Member
- Total Posts : 22
- Reward points : 0
- Joined: 2019/09/29 14:50:22
- Status: offline
- Ribbons : 0
Re: New EVGA.com Notification Checkout Process
2021/02/08 12:34:11
(permalink)
Any movement on the 3897 queue?
|
mech9t5
FTW Member
- Total Posts : 1413
- Reward points : 0
- Joined: 2007/06/13 16:18:55
- Status: offline
- Ribbons : 2
Re: New EVGA.com Notification Checkout Process
2021/02/08 12:36:16
(permalink)
https.shaf Any movement on the 3897 queue?
Unfortunately, no drops for any of the cards have been reported here so far.
Associate Code: P7JUX093GU7RID0
|
jlybp
New Member
- Total Posts : 24
- Reward points : 0
- Joined: 2020/08/19 18:00:14
- Location: NYC
- Status: offline
- Ribbons : 0
Re: New EVGA.com Notification Checkout Process
2021/02/08 12:38:33
(permalink)
no drop today
|
Vladiemoose
New Member
- Total Posts : 15
- Reward points : 0
- Joined: 2020/09/26 08:10:39
- Status: offline
- Ribbons : 0
Re: New EVGA.com Notification Checkout Process
2021/02/08 12:38:36
(permalink)
I'm a couple hours away from a 3080. They should just close out the notification queue system after me because 10/6 is going to take YEARS to finish LOL
|
Dabadger84
CLASSIFIED Member
- Total Posts : 3426
- Reward points : 0
- Joined: 2018/05/11 23:49:52
- Location: de_Overpass, USA
- Status: offline
- Ribbons : 10
Re: New EVGA.com Notification Checkout Process
2021/02/08 12:40:46
(permalink)
ModRigs: https://www.modsrigs.com/detail.aspx?BuildID=42891 Specs: 5950X @ 4.7GHz 1.3V - Asus Crosshair VIII Hero - eVGA 1200W P2 - 4x8GB G.Skill Trident Z Royal Silver @ 3800 CL14 - Gigabyte RTX 4090 Gaming OC w/ Core: 2850MHz @ 1000mV, Mem: +1500MHz - Samsung Odyssey G9 49" Super-Ultrawide 240Hz Monitor
|
mech9t5
FTW Member
- Total Posts : 1413
- Reward points : 0
- Joined: 2007/06/13 16:18:55
- Status: offline
- Ribbons : 2
Re: New EVGA.com Notification Checkout Process
2021/02/08 12:51:28
(permalink)
Associate Code: P7JUX093GU7RID0
|
nomoss
FTW Member
- Total Posts : 1559
- Reward points : 0
- Joined: 2009/04/04 19:45:27
- Status: offline
- Ribbons : 7
Re: New EVGA.com Notification Checkout Process
2021/02/08 12:52:11
(permalink)
|
mech9t5
FTW Member
- Total Posts : 1413
- Reward points : 0
- Joined: 2007/06/13 16:18:55
- Status: offline
- Ribbons : 2
Re: New EVGA.com Notification Checkout Process
2021/02/08 12:54:59
(permalink)
nomoss My wife's keyboard. It's clicky and fast and syncs with all her RGB. You should like it.
That's good to hear.
Associate Code: P7JUX093GU7RID0
|
Alynoser
New Member
- Total Posts : 88
- Reward points : 0
- Joined: 2017/03/10 23:53:09
- Status: offline
- Ribbons : 0
Re: New EVGA.com Notification Checkout Process
2021/02/08 12:56:51
(permalink)
https://www.tomshardware.com/news/alternate-europe-rtx3000-availability
Bad news from Tom's Hardware.
|
static.quai
SSC Member
- Total Posts : 682
- Reward points : 0
- Joined: 2021/01/12 13:10:55
- Status: offline
- Ribbons : 0
Re: New EVGA.com Notification Checkout Process
2021/02/08 12:57:12
(permalink)
Meh. I don't know how to share an image
post edited by static.quai - 2021/02/08 12:59:56
Attached Image(s)
|
philipma1957
SSC Member
- Total Posts : 593
- Reward points : 0
- Joined: 2016/12/23 05:42:13
- Status: offline
- Ribbons : 0
Re: New EVGA.com Notification Checkout Process
2021/02/08 12:58:35
(permalink)
Entered the Newegg Shuffle for a Ryzen 5900X; no GPUs on the list.
|
Dabadger84
CLASSIFIED Member
- Total Posts : 3426
- Reward points : 0
- Joined: 2018/05/11 23:49:52
- Location: de_Overpass, USA
- Status: offline
- Ribbons : 10
Re: New EVGA.com Notification Checkout Process
2021/02/08 13:12:15
(permalink)
philipma1957 Entered the Newegg Shuffle for a Ryzen 5900X; no GPUs on the list.
Did you get it? Good luck!
mech9t5 I decided to buy this after viewing/reading a lot of reviews. I usually read the worst user reviews to see if the bad things people say about it are consistent or not. So we will see. I'm going to try it this evening. And yes, the price is up there for "high end" (lol) keyboards.
I watched & read a lot of keyboard reviews and narrowed it down to the SteelSeries Apex tier, the K95 or the K100... or a Das Keyboard. I think if I end up not liking the Corsair K100 enough to keep it, I'm just gonna go to the opposite end of the high-end spectrum (in terms of flashy etc.) and get the Das Keyboard - hopefully the K100 will not disappoint. I haven't had any switches other than Cherry MX Browns in ages, I think my last 4 keyboards or so have all been them, and that's spanning back probably 6+ years. I almost made the mistake of getting a Vulcan AIMO; reviews quickly steered me away from that thing though - the text on the keys "wearing off" after a few days/weeks of use, yikes on quality control lol
ModRigs: https://www.modsrigs.com/detail.aspx?BuildID=42891 Specs: 5950X @ 4.7GHz 1.3V - Asus Crosshair VIII Hero - eVGA 1200W P2 - 4x8GB G.Skill Trident Z Royal Silver @ 3800 CL14 - Gigabyte RTX 4090 Gaming OC w/ Core: 2850MHz @ 1000mV, Mem: +1500MHz - Samsung Odyssey G9 49" Super-Ultrawide 240Hz Monitor
|
Dabadger84
CLASSIFIED Member
- Total Posts : 3426
- Reward points : 0
- Joined: 2018/05/11 23:49:52
- Location: de_Overpass, USA
- Status: offline
- Ribbons : 10
Re: New EVGA.com Notification Checkout Process
2021/02/08 13:29:52
(permalink)
drewski989 It is hard to say "please use my associate's code" and not feel like you are inherently self-promoting. I have basically stuck with the phrase "please remember to apply an associate's code on checkout, there are a bunch out there available, I want to see people take advantage of the program and not turn down free money. That being said, here is mine if you'd like to use it." Makes me feel a little less greasy and not like I am pushing people to use my code.
Just hit me, as an aside to this discussion, if anyone feels so inclined (and everyone should, keep reading): don't forget that you can also input a "Rewards Program Code" during REGISTRATION of any eVGA product (does NOT have to be purchased on eVGA.com, just has to be registered within 30 days of purchase), then once you upload your invoice on the "My Products" page, the person whose code you used gets 2 eVGA Bucks. Almost everyone qualifies to have an eVGA Rewards Code (think it has easier requirements than Elite Status), so it's literally a free way to "spread the love" around during this whole process. Don't think I've mentioned that one more than once or twice in the whole thread previously - yet 12 different people have used mine O_o which is awesome. And for those, it actually tells you the person's name, so you can thank them if you so choose to. I have never seen InfusionOfFear before in the names on this here forum to the best of my knowledge, but dude used my rewards code for an AIO & a PSU - free 4 eVGA Bucks, YEEEEEEEEEET. Yet another simple step that can help someone out a wee bit; those add up eventually too. Find someone in the thread that you know hasn't gotten their card yet & has a code listed for Rewards, use theirs, then they get 2 eVGA Bucks to put towards their card when their queue spot gets called - win win :-D
ModRigs: https://www.modsrigs.com/detail.aspx?BuildID=42891 Specs: 5950X @ 4.7GHz 1.3V - Asus Crosshair VIII Hero - eVGA 1200W P2 - 4x8GB G.Skill Trident Z Royal Silver @ 3800 CL14 - Gigabyte RTX 4090 Gaming OC w/ Core: 2850MHz @ 1000mV, Mem: +1500MHz - Samsung Odyssey G9 49" Super-Ultrawide 240Hz Monitor
|
philipma1957
SSC Member
- Total Posts : 593
- Reward points : 0
- Joined: 2016/12/23 05:42:13
- Status: offline
- Ribbons : 0
Re: New EVGA.com Notification Checkout Process
2021/02/08 13:42:28
(permalink)
Dabadger84
Did you get it? Good luck!
I have a Corsair K63 wired model; it is pretty good. The wireless K63 can drop sync and it is a P.I.T.A. to set back up. I will find out about the Ryzen CPU in an hour or 2. If I get it I will try to talk my friend into that over the Intel 11th-gen i7.
As for getting a card, unfortunately GPU mining is better today than yesterday: 3080, 3060 Ti, 3070, 3090, in that order of best per dollar, with current daily earnings rising yet again from 11.22 cents per MH/s to 12.45 cents per MH/s. So the demand from miners is higher today than yesterday. The worst part of miners is not that they take a card away from a gamer, it is that they may take 10 or 20 or 30 or 40 cards away. So they really can hurt the availability of cards. The second problem is they will overpay a scalper. For a miner with cheap power, a $1000 3060 Ti or 3070 is fine - hurting gamers more.
My business of building 2-card gamer/miner rigs is dead in the water as I can't get cards. I built 2-3 rigs a month in Oct/Nov/Dec, 8 all told, and sold them to friends and prior customers. I never do huge volume, maybe 25 rigs in a year. So far this year: 1 rig.
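(To put that "cents per MH/s" figure in concrete terms, here's a quick gross-revenue sketch. The per-card hashrates are ballpark Ethereum numbers assumed for illustration - actual figures vary with tuning - and electricity cost is ignored entirely.)

```python
# Rough daily gross-revenue math behind the "cents per MH/s" figure in the post
# above. Hashrates are ballpark Ethereum numbers (assumptions); power cost ignored.
rate_per_mh = 0.1245   # dollars per MH/s per day

cards = {              # approximate ETH hashrates in MH/s - illustrative only
    "3060 Ti": 60,
    "3070": 60,
    "3080": 97,
    "3090": 110,
}

for name, mh in cards.items():
    daily = mh * rate_per_mh
    print(f"{name}: ~{mh} MH/s -> ${daily:.2f}/day, ${daily * 30:.0f}/month gross")
```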
|
mizzer
iCX Member
- Total Posts : 309
- Reward points : 0
- Joined: 2010/02/06 12:52:24
- Status: offline
- Ribbons : 1
Re: New EVGA.com Notification Checkout Process
2021/02/08 13:52:51
(permalink)
Dabadger84
Nice choice. It was on my short list. The minimalist approach on the APEX Pro ultimately won out for me. Still, the K100 was in my top 3.
Gigabyte Aorus Master x570 * AMD 5900X * 32GB G.SKILL Trident 3600 * Arctic Liquid Freezer II 420 * GeForce 4090 FE * Corsair RM1000x PSU * 1TB Samsung 970 EVO NVMe * 2x 2TB Samsung 870 QVO SSDs * LG OLED48CXPUB * Fractal Design Meshify 2 XL
|
TheRealMikeVan
SSC Member
- Total Posts : 818
- Reward points : 0
- Joined: 2020/10/30 04:49:58
- Location: OH
- Status: offline
- Ribbons : 0
Re: New EVGA.com Notification Checkout Process
2021/02/08 13:57:01
(permalink)
static.quai
Fair point, and a good callout. You are correct he called that out as a condition.
To also be fair, they did say that in like 90% of cases so it looks like you just happen to be in the 10% where it would actually be needed.
EVGA RTX 3080 Ti FTW3 Ultra Intel i7-9700k Corsair HX1200i Gigabyte Z390m Gaming 32GB 3200MHz DDR4 G. Skill Ripjaws V Series Lian Li O11 Dynamic XL White
|
Dabadger84
CLASSIFIED Member
- Total Posts : 3426
- Reward points : 0
- Joined: 2018/05/11 23:49:52
- Location: de_Overpass, USA
- Status: offline
- Ribbons : 10
Re: New EVGA.com Notification Checkout Process
2021/02/08 13:59:55
(permalink)
I don't think I'll ever buy a wireless keyboard even if they prove it's "as fast or faster" than wired... A wireless headset is one thing, I like being able to walk around/do stuff maybe while a test or benchmark or Folding @ Home is running, & listen to music with the privacy of a headset, but a wireless keyboard or mouse? Nah mate, nevar! lol
I looked into mining a little bit, and from what I can tell, with the fact that I'd only be able to mine during the day (solar power, so it's only "free" during the day), and the amount of heat the card puts out meaning I'd actually have to run AC during the day in the room I'm in despite it being winter (and I sleep in here lol), further increasing the power draw, it's just not worth the $110/month or however much it would be, hassle-wise. I'd rather keep putting that computing power into Folding @ Home when I can; it's less stressful on the GPU overall (much lower load temps on the vRAM - I was seeing 80-84C on my vRAM junction temp mining Ethereum) and it goes to a good cause.
Mining has been cannibalizing the GPU gaming market for a long while now, it's just especially bad & 'worse' for us gamers right now because of the supply shortage, combined with the increase in the "value" of mining resulting in further demand on an already too-thin supply... Combine those already crapoli factoids with the Lunar New Year stoppage and the second round of "tariff exemptions ending" resulting in prices probably going up more in April/May, and this year is going to be a major bummer for PC gamers trying to upgrade or build new systems - and I don't think the 3080 Ti launch, assuming it comes in July or September, will go any better, as it will result in refreshes (probably Super variants) for at least the 3070 as well I'd wager, which will probably be a "gold mine" for miners, again, assuming crypto doesn't crash between now and then, which it likely won't.
Times like these I'm glad I didn't actually take up system building as a "full time" side hobby in 2019 like I almost did - I'd be in a similar situation to you, only worse, because the "gaming" clientele out here is fewer & mostly centralized to the 3 "big" cities in the state, the closest of which is ~50 miles away from me. But I'd still take a computer repair/builder type job in a heartbeat. Maybe once the Global Bassard is extinguished I'll hop into the field more formally.
ModRigs: https://www.modsrigs.com/detail.aspx?BuildID=42891 Specs: 5950X @ 4.7GHz 1.3V - Asus Crosshair VIII Hero - eVGA 1200W P2 - 4x8GB G.Skill Trident Z Royal Silver @ 3800 CL14 - Gigabyte RTX 4090 Gaming OC w/ Core: 2850MHz @ 1000mV, Mem: +1500MHz - Samsung Odyssey G9 49" Super-Ultrawide 240Hz Monitor
|
jehoffman1
Superclocked Member
- Total Posts : 128
- Reward points : 0
- Joined: 2020/12/11 12:01:42
- Status: offline
- Ribbons : 0
Re: New EVGA.com Notification Checkout Process
2021/02/08 14:18:45
(permalink)
"Spring Festival begins on Friday, but I would fully expect companies observing the holiday to start to close down by Wednesday...the knock-on effect of the closure to hit the queue a few weeks down the road." Offices may close, but production will continue though the holiday. No-one leaves expensive production equipment idle when there is high demand for finished product. They just pay workers overtime and give them compensatory time off later. The only reason why EVGA would shut down production during the holiday is if they don't have the parts with which to manufacture products. There is no way in hell that Samsung and TSMC are going to be stopping production of chips in their multi-billion dollar fabs when demand is through the roof.
|
GrapeCollie
New Member
- Total Posts : 38
- Reward points : 0
- Joined: 2018/02/12 06:42:56
- Status: offline
- Ribbons : 0
Re: New EVGA.com Notification Checkout Process
2021/02/08 14:27:11
(permalink)
jehoffman1 "Spring Festival begins on Friday, but I would fully expect companies observing the holiday to start to close down by Wednesday...the knock-on effect of the closure to hit the queue a few weeks down the road." Offices may close, but production will continue though the holiday. No-one leaves expensive production equipment idle when there is high demand for finished product. They just pay workers overtime and give them compensatory time off later. The only reason why EVGA would shut down production during the holiday is if they don't have the parts with which to manufacture products. There is no way in hell that Samsung and TSMC are going to be stopping production of chips in their multi-billion dollar fabs when demand is through the roof.
Factories and ports might shut down... they increase production to 150 percent for a few weeks before the 3-week shutdown.
12G-P5-3657-KR 2/25/2021 9:21:08 AM PT YES
08G-P5-3665-KR 12/8/2020 4:21:56 PM PT No
08G-P5-3667-KR 12/2/2020 7:45:03 AM PT No
10G-P5-3897-KR 11/10/2020 10:38:02 PM PT No
10G-P5-3895-KR 11/10/2020 10:35:37 PM PT No
10G-P5-3885-KR 11/10/2020 10:28:58 PM PT No
10G-P5-3883-KR 11/10/2020 10:26:21 PM PT No
08G-P5-3755-KR 11/8/2020 9:21:12 PM PT YES
|