Radeon RX 7900 Series Launched


    #21
    Re: Radeon RX 7900 Series Launched

    Originally posted by shovenose
    What are others' plans this go-around?
    No plans, as usual.

    I just get cheap junk video cards when they are extra cheap (i.e. scrap value).
    I bought a Radeon HD5830 (a ROP-castrated HD5870) several years back for about $15 total (with shipping) to my door. I used it for about a year, maybe a year and a half, doing some light gaming (30-50% GPU load max) and kept the temperatures fairly low (well, as low as an HD5830 will stay, LOL.) It just died last month "out of the blue" after I forgot to turn on MSI Afterburner to keep the fan speed cranked up. Oh well, no big loss. The cooler alone from that card is worth the $15 I paid for it.

    Funny thing is, just the week before my HD5830 died, I purchased another card off of eBay - this time, an HD7570 branded by HP/Pegatron. It was a salvaged card from a recycled PC and looked dirty AF. I figured it'd be a quick restoration job. I wanted it to see how much power it uses and how hot it runs with its stock Pegatron cooler, as I want to eventually get another HD7570 for an SFF machine. Anyways, long story short, that Pegatron HD7570 turned out to be a little more than a quick restoration project - it was missing close to 20 random SMDs on the back. But an hour or two later, after repairing some ripped tracks and putting back those missing SMDs, it came back to life. And it runs surprisingly cool with the stock Pegatron cooler. Go figure!

    Anyways, just the next day after I got it working and tested is when my HD5830 failed. So all of that effort wasn't in vain - I got my "gaming PC" back up and running in no time... and for only $13. Sure, the HD7570 may not be as powerful as the HD5830... but in Fortnite (the only "modern" game I care to play nowadays), it doesn't make a difference, since that game leans mostly on the CPU and not the GPU. Otherwise, I'd have swapped in something better.

    My current best (working) video card is a HIS Radeon HD7950 3 GB... and it's sitting in a drawer. I used it for a bit, but I haven't played many games lately that would require that kind of power (quite literally.) And Fortnite doesn't benefit from its extra capability, at least not with the rest of the system it's paired with. In my case, the bottleneck is the CPU (Xeon E5649)... or rather, Fortnite/Epic being poorly coded and not utilizing all of the cores properly (only ~30% total CPU utilization.) Meh, whatever. It's not like I have much time for gaming anymore either (once or twice a week, tops.)

    Originally posted by Topcat View Post
    1080's can be had all day long for a couple hundred now. I'm still running my 24gb M6000....and unless something happens to it, zero plans of replacing it.
    $100-200 in my area, depending on how often you check and how fast you act. I saw one guy sell two on Craigslist for $100. I thought about buying one, but the 1000 series and newer don't have analog output, and I only game on CRTs, so there's no benefit for me. Also, looking at how often these newer cards fail, judging by the Motherboard/GPU subforum on here, I'm not too keen on spending much money on them. I bought my HD7950 for $40 right before the onset of the great GPU depression in early 2020. At some point, I could have easily sold that card for $100, but I just didn't bother. Now they can probably be had for less. I can also get 1060's for $100 all day long... but I won't, for the same reasons listed above.

    Originally posted by Dan81 View Post
    Honestly, I would NOT go past 1000 series for nVidia and at best the R9 lineage. Everything else newer just sucks.
    R9's are terrible for reliability.
    Then again, so is the entire HD7k series, including my HD7950. The HD7570 is an exception, though, being a 40 nm "Turks" core - the same exact silicon as the HD6570. So it's really an HD6k-series card, and much more reliable. HD5k is also not bad, but those run a little hotter.

    Originally posted by ChaosLegionnaire View Post
    well... im picking the fifth option: i dont upgrade my gpu anymore. modern gpus stink and just suck! radeon 9800 pro 4tw! gf 6800 ultra 4tw! 9800gtx, gts 250 4tw! radeon 3870 agp 4tw!
    retro's the best.

    Unfortunately, all of the above also fail quite frequently... but mostly when used with the stock or otherwise inadequate coolers. Unlike modern video cards, which literally "burn through" their silicon (no really, they do - think about how little silicon there is in a single transistor at sub-10 nm nodes compared to the old stuff), the old cards can last quite a while when cooled properly.

    I'm personally more of a fan of older mid-range cards like the Radeon 9600 (for Win98 and early XP games) along with HD4650/4670 for late XP era games.

    Originally posted by ChaosLegionnaire View Post
    imagine just how much those 10k bitcoins were worth when crypto was at its peak! just do the math yourself!
    That's why I refuse to get into or even try crypto "mining" - the "value" is only there while there's hype behind it. Of course, it will probably never completely go away now, since some third-world countries have such unstable banking systems that crypto really is the only viable alternative. Well, that, and also the people who like to use it for illegal transactions / money transfers / money laundering... though it seems the authorities are starting to catch on to that pretty quickly now.
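
    For what it's worth, the math that quote invites is simple enough. A quick sketch (Python; the peak price is approximate and from memory, so treat it as ballpark):

    ```python
    # The math the quote above invites: 10,000 BTC valued at the
    # (roughly) November 2021 all-time high. The peak price here is
    # approximate, from memory - not an exact figure.

    btc_held = 10_000
    peak_usd_per_btc = 69_000  # approximate ATH, late 2021

    print(f"${btc_held * peak_usd_per_btc:,} at the peak")  # ~$690,000,000
    ```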

    Originally posted by Hondaman View Post
    I don't understand why we even need so much power -- I just surf the net, check e-mail and do basic budgeting for my household.
    Same, and I don't even need a discrete GPU on the machine I do this on. It has a...
    wait for it...
    ...
    Intel "Extreme Graphics" 2 onboard... i.e. the good ol' i865 chipset with built-in graphics on a P4 system. LOL!

    FWIW, the HD6770 you have will still play some modern e-sports titles decently well (Rocket League, Fortnite, CS:GO, and some older COD titles.)
    Last edited by momaka; 01-01-2023, 12:57 AM.



      #22
      Re: Radeon RX 7900 Series Launched

      Oh, by the way, I saw this a while back on eBay. I don't remember if I posted it here already (please excuse me if I did), as I've been gone for a while... but here it is anyways:

      [Attached image: RTX 4090 gag picture]
      Clearly a joke, but I thought it was quite funny.

      And man, I've really been gone a while, haven't I? It took me a few good moments to remember how to upload images on here again.



        #23
        Re: Radeon RX 7900 Series Launched

        Originally posted by momaka View Post
        R9's are terrible for reliability.
        Then again, so are all of the HD7k series too, including my HD7950.
        I have a 7870, a standard R9 280 (aka 7950) and a R9 280x (aka 7970GHz).

        If there's anything I've learnt, it's that as long as I don't pair these with an AMD FX furnace, they do incredibly well on most platforms I've tested them on - X58 + i7 920, H55 + i3 540, P55 (MSI) + i5 750, P55 + i7 860, and Z68 + i7 2600K. All did well with all three GPUs, running Win10 LTSC IoT on a 320GB Seagate 7200.6 laptop drive.

        The only machine I've seen them literally choke on is an FX-4100 on an 880GM-UD2H. That chip is a literal waste of silicon, and I hate myself for not being smart enough to avoid it and go for a Phenom instead.
        Main rig:
        Gigabyte B75M-D3H
        Core i5-3470 3.60GHz
        Gigabyte Geforce GTX650 1GB GDDR5
        16GB DDR3-1600
        Samsung SH-224AB DVD-RW
        FSP Bluestorm II 500W (recapped)
        120GB ADATA + 2x Seagate Barracuda ES.2 ST31000340NS 1TB
        Delux MG760 case



          #24
          Re: Radeon RX 7900 Series Launched

          4090s are nice, but I got my 3080 Founders Edition at MSRP over a year ago and it works great for my needs. I thought about a 7900 card, but hearing about the problems with those units, I will probably wait for the next gen before I do anything - and at that point I will have to replace my whole AM4 system and go to AM5.
          My Computer: AMD Ryzen 9 3900X, Asrock X370 Killer SLI/AC, 32GB G.SKILL TRIDENT Z RGB DDR4 3200, 500GB WD Black NVME and 2TB Toshiba HD,Geforce RTX 3080 FOUNDERS Edition, In-Win 303 White, EVGA SuperNova 750 G3, Windows 10 Pro



            #25
            Re: Radeon RX 7900 Series Launched

            I'm still on Intel UHD Graphics 630. It does fine for what I'm doing. My laptop has an Nvidia T1000 (not a Terminator), but I don't need the extra performance compared to Intel graphics.



              #26
              Re: Radeon RX 7900 Series Launched

              Just throwing in my $0.02.
              IMO, the physical size of GPUs this gen is getting entirely out of hand. 4 to 5 slots for a single GPU? Insanity!
              Don't buy those $10 PSU "specials". They fail, and they have taken whole computers with them.

              My computer doubles as a space heater.

              Permanently Retired Systems:
              RIP Advantech UNO-3072LA (2008-2021) - Decommissioned and taken out of service permanently due to lack of software support for it. Not very likely to ever be recommissioned again.
              Asus Q550LF (Old main laptop, 2014-2022) - Decommissioned and stripped due to a myriad of problems, the main battery bloating being the final nail in the coffin.


              Kooky and Kool Systems
              - 1996 Power Macintosh 7200/120 + PC Compatibility Card - Under Restoration
              - 1993 Gateway 2000 80486DX/50 - Fully Operational/WIP
              - 2004 Athlon 64 Retro Gaming System - Indefinitely Parked
              - Main Workstation - Fully operational!




                #27
                Re: Radeon RX 7900 Series Launched

                Originally posted by TechGeek View Post
                Just throwing in my $0.02.
                IMO, the physical size of GPUs this gen is getting entirely out of hand. 4 to 5 slots for a single GPU? Insanity!
                You're not alone in thinking this, and I agree with you 100%.

                The reason video cards keep getting bigger and bigger is the ever-increasing TDP. These top-end GPUs can draw (and dissipate as heat)... what is it now, upwards of 300 Watts? And all of that from 400-600 mm^2 of die area. That's probably more heat output per unit area than my 75 Watt soldering station. Just think about that for a second! Sooner rather than later, that gag picture I posted above about the RTX 4090 may even become reality.
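
                To put that heat flux into actual numbers, here's a quick back-of-the-envelope sketch (Python; the 300 W and 400-600 mm^2 figures are the rough numbers above, and 450 W is the RTX 4090's rated board power - all ballpark values, not measurements):

                ```python
                # Back-of-the-envelope heat flux for a top-end GPU die.
                # Figures are the rough numbers quoted above, not measurements.

                def heat_flux(watts: float, area_mm2: float) -> float:
                    """Heat flux in W/mm^2 for a given power and die area."""
                    return watts / area_mm2

                for watts in (300, 450):          # 450 W = RTX 4090 rated board power
                    for area_mm2 in (400, 600):   # die area range quoted above
                        flux = heat_flux(watts, area_mm2)
                        print(f"{watts} W over {area_mm2} mm^2 -> "
                              f"{flux:.2f} W/mm^2 ({flux * 100:.0f} W/cm^2)")
                ```

                Even the low end works out to 0.5 W/mm^2, all of which has to get through the die surface and into the cooler.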

                And the way things keep going, I'm surprised manufacturers haven't ditched the whole concept of the video card as an "add-on" card. IMO, they should have switched designs a long time ago to something where you can mount a CPU-style tower cooler directly onto the GPU die... though with 300+ Watts of power, I think even many high-end CPU "tower" coolers would have a hard time getting rid of that much heat.

                Maybe the better solution would be to make it impossible to "mine" cryptocurrency on GPUs. And also for game development studios to get their act together and stop pumping out silly games that have no "substance" and only flashy graphics (Cyberpuke and Crysis come to mind here?) I don't even want to go down this rabbit hole, but I think gaming has seen very little advancement in the last 10 years in terms of gameplay. It's mostly the same games getting recycled over and over each year, with slight improvements in the visuals (but at the expense of requiring more processing / GPU power.)

                On the other side, it should also be noted that many research facilities use modified versions of these modern GPUs for large-scale data processing and AI, so perhaps we will never see an end to this madness. Still, the gaming industry and the crypto craze are the ones adding the most fuel to the fire... and in this case, that's not even a pun, since many places' electric power does come (at least in part) from fossil fuel sources.
                Last edited by momaka; 01-09-2023, 06:21 PM.



                  #28
                  Re: Radeon RX 7900 Series Launched

                  Originally posted by momaka View Post
                  And the way things keep going, I'm surprised manufacturers haven't ditched the whole concept of the video card as an "add-on" card. IMO, they should have switched designs a long time ago to something where you can mount a CPU-style tower cooler directly onto the GPU die... though with 300+ Watts of power, I think even many high-end CPU "tower" coolers would have a hard time getting rid of that much heat.
                  Regarding ditching the add-in card form factor: why not just go to MXM form-factor GPUs? You plug the GPU into an MXM socket on the motherboard and mount whatever cooler you want on top, pretty much like a CPU. It still attaches to the PCIe bus, but you're not losing 5 slots' worth of expansion.
                  Just my (additional) $0.02.
                  Don't buy those $10 PSU "specials". They fail, and they have taken whole computers with them.

                  My computer doubles as a space heater.

                  Permanently Retired Systems:
                  RIP Advantech UNO-3072LA (2008-2021) - Decommissioned and taken out of service permanently due to lack of software support for it. Not very likely to ever be recommissioned again.
                  Asus Q550LF (Old main laptop, 2014-2022) - Decommissioned and stripped due to a myriad of problems, the main battery bloating being the final nail in the coffin.


                  Kooky and Kool Systems
                  - 1996 Power Macintosh 7200/120 + PC Compatibility Card - Under Restoration
                  - 1993 Gateway 2000 80486DX/50 - Fully Operational/WIP
                  - 2004 Athlon 64 Retro Gaming System - Indefinitely Parked
                  - Main Workstation - Fully operational!




                    #29
                    Re: Radeon RX 7900 Series Launched

                    Originally posted by TechGeek View Post
                    Regarding ditching the add-in card form factor: why not just go to MXM form-factor GPUs? You plug the GPU into an MXM socket on the motherboard and mount whatever cooler you want on top, pretty much like a CPU. It still attaches to the PCIe bus, but you're not losing 5 slots' worth of expansion.
                    Just my (additional) $0.02.
                    Why not? Many reasons.

                    First off, it's nearly impossible (if not outright impossible) for MXM GPUs to have the same TDP as their desktop counterparts. High-TDP GPU chips require high-power VRMs. Doing that in the small space offered by an MXM card requires really expensive, top-shelf parts to get good efficiency; otherwise, the VRM could overheat. In contrast, a regular PCI-E card has more space, so cheaper parts can be used. And in the case of really high output power, like RTX 4090 desktop chips, it just might be impossible to make a VRM that fits on an MXM module at all. So the limited space of an MXM card will certainly either a) drive up the cost of the card, or b) limit the output power... or in many cases, c) both.
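
                    To see how quickly the VRM losses add up, here's a minimal sketch (Python; the efficiency figures are illustrative assumptions, not measurements of any particular card):

                    ```python
                    # Heat the VRM itself must shed at a given GPU power draw and
                    # conversion efficiency. Efficiencies are assumed for illustration.

                    def vrm_loss(gpu_watts: float, efficiency: float) -> float:
                        """Power dissipated in the VRM, in watts."""
                        return gpu_watts / efficiency - gpu_watts

                    for eff in (0.95, 0.90, 0.85):
                        print(f"300 W GPU at {eff:.0%} VRM efficiency -> "
                              f"{vrm_loss(300, eff):.1f} W lost in the VRM alone")
                    ```

                    At 85% efficiency, the VRM alone is dissipating ~53 W - and on an MXM card, there's next to no room to heatsink it.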

                    Next... and let's imagine for a moment that the VRM issues above don't exist... the question is, how are you going to get that kind of power to an MXM card? Add 6/8-pin PCI-E connectors? If so, then it's no longer the MXM format - at least not something you can stick back into a laptop. Also, the MXM edge connector surely won't be able to handle the power required by anything above a 100-120W TDP GPU, if even that much (though the same is already true of the PCI-E slot, which is limited to 75 Watts - hence the auxiliary power connectors on desktop cards.)
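
                    The raw current involved makes the point on its own (a minimal sketch, assuming everything comes in over a single 12 V rail; actual MXM pin current ratings aren't quoted here and would need checking against the spec):

                    ```python
                    # Current a connector must carry at a given power level,
                    # assuming a single 12 V supply rail (an assumption, not MXM spec).

                    RAIL_VOLTAGE = 12.0  # volts, assumed

                    for watts in (75, 120, 300, 450):
                        amps = watts / RAIL_VOLTAGE
                        print(f"{watts:>3} W at {RAIL_VOLTAGE:.0f} V -> "
                              f"{amps:>4.1f} A through the connector")
                    ```

                    At 300+ W you're into the 25-40 A range, which is exactly why desktop cards sprout multiple 8-pin (and now 12VHPWR) connectors.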

                    And lastly... let's not forget that laptop GPUs aren't always exactly the same as their full-blown desktop counterparts. Either they have parts of the core cut off, or they run at a lower frequency to use less power. In the cases where they are the same, the mobile GPU essentially has to be "cream of the crop" silicon - i.e. the chips that can run at the highest possible clocks with the lowest possible voltages. So what happens to all of the GPU chips of the same model that can still run at the same frequency as the "top" silicon, but with much higher TDP (due to needing a higher voltage)? They either have to be nerfed down in performance to meet the same TDP spec, or not used at all. The latter would be a major loss for any production run, so clearly that's undesirable. And the former is a missed opportunity to sell the same silicon as a better-performing card.
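
                    A rough way to see the binning math: dynamic CMOS power scales roughly as P ~ C * V^2 * f, so a die that needs more voltage for the same clock pays a quadratic power penalty. Here's a sketch with made-up bin voltages (hypothetical numbers, purely for illustration):

                    ```python
                    # Why a "lesser" die at the same clock burns more power:
                    # dynamic CMOS power scales roughly as P ~ C * V^2 * f.
                    # Voltages/clocks below are hypothetical bins, not chip specs.

                    def relative_power(v: float, f: float, v_ref: float, f_ref: float) -> float:
                        """Dynamic power relative to a reference operating point."""
                        return (v / v_ref) ** 2 * (f / f_ref)

                    # A "golden" die holds 1.8 GHz at 0.90 V; a leakier die needs
                    # 1.00 V for the same 1.8 GHz:
                    print(f"{relative_power(1.00, 1.8, 0.90, 1.8):.2f}x the power "
                          f"for the same performance")  # ~1.23x
                    ```

                    So the "worse" chips either blow the mobile TDP budget or get clocked down - which is exactly the nerfing described above.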

                    To see what I mean, take for example a regular RTX 2060 and compare it to a mobile RTX 2060.

                    regular RTX 2060:
                    https://www.techpowerup.com/gpu-spec...rtx-2060.c3310

                    mobile RTX 2060:
                    https://www.techpowerup.com/gpu-spec...0-mobile.c3348

                    You can see above that while both GPUs use the same exact silicon die, the mobile version runs at a much lower base frequency (960 MHz) than the desktop version (1365 MHz). The same goes for the boost frequencies. As a result, the TDP of the mobile card is also lower. But that's a loss in performance - a whopping 30% in some cases.
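
                    That ~30% falls straight out of those base clocks (a minimal check; real-world performance also depends on boost behavior and power limits):

                    ```python
                    # Clock deficit of the mobile RTX 2060 vs. the desktop part,
                    # using the base clocks from the TechPowerUp pages linked above.

                    desktop_mhz = 1365
                    mobile_mhz = 960

                    deficit = (desktop_mhz - mobile_mhz) / desktop_mhz
                    print(f"Mobile base clock is {deficit:.0%} below desktop")  # ~30%
                    ```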

                    So going to an MXM form factor doesn't bring any advantages over what we already have with PCI-E.
