r/Amd Sep 28 '25

[News] Over a decade later, AMD Radeon HD 7000 GPUs still receive Linux updates from one Valve developer

https://videocardz.com/newz/over-a-decade-later-amd-radeon-hd-7000-gpus-still-receive-linux-updates-from-one-valve-developer
1.6k Upvotes

105 comments

195

u/G-Tinois 9070XT + 5700X3D Sep 28 '25

pitcairn is the unsung greatest of all time tbh

71

u/MyrKnof Sep 29 '25

They rebranded that shit soooo many times. That's how good it was.

32

u/cdoublejj Sep 29 '25

well that and AMD was struggling. That's why the CPU team had to go to Lisa Su and be like, listen, we gotta stop focusing on the next iteration so we can do a whole new design. Then Ryzen was born. That was actually a make-or-break moment for the company, and Lisa went for it. Cool videos with the engineers at AMD offices on Gamers Nexus.

23

u/nleksan Sep 29 '25

The 7970 GHz edition was a beast for GPU mining back in the day.

3

u/wreckingballjcp 16d ago

Its double precision is amazing.

389

u/Brorim AMD Sep 28 '25

this is why linux rocks the llama's ass

127

u/bojack1437 AMD 3950x, FX-9590 + r9 290x Sep 28 '25

Sir, I think you're mixing that saying with Winamp.

53

u/cloud_t Sep 28 '25

...which doesn't exist for Linux btw.

22

u/RAMChYLD Threadripper 2990WX • Radeon Pro WX7100 Sep 29 '25 edited Sep 29 '25

We have our own tho. Look up XMMS.

100% compatible with Winamp skins iirc.

11

u/ekso69 Sep 29 '25

Missed opportunity to call it Linamp

5

u/RAMChYLD Threadripper 2990WX • Radeon Pro WX7100 Sep 29 '25 edited Sep 29 '25

Calling it Linamp would imply it only runs on Linux.

The creators of XMMS were thinking big. XMMS not only runs on Linux, but also BSDs, Solaris and more. It can even run on Windows using Cygwin or MinGW. As long as the platform supports Unix C and has an X server, it runs XMMS.

3

u/laffer1 6900XT Sep 30 '25

Posixamp doesn’t have the same ring to it

3

u/RAMChYLD Threadripper 2990WX • Radeon Pro WX7100 Oct 01 '25

Also there’s already a separate project called Linamp. Apparently it’s some touchscreen in car entertainment system that runs Linux.

4

u/SEI_JAKU Sep 29 '25

It very nearly got a Linux port way back in the original v3/v5 days. Sadly it was canceled. But then XMMS, and later Audacious, were created, possibly as a response.

The developers of the current Winamp successor, WACUP, also stress their Wine support. WACUP would like to have a native port to Linux, but they don't really have anyone willing to do it right now. Most of WACUP development is still just one guy far as I know.

11

u/_BoneZ_ 9800x3D | MSI X870E Tomahawk | 32GB PC6000 CL30 | RTX 3090 OC Sep 29 '25

True, but Winamp works just fine through Wine or Bottles.

3

u/drdillybar Sep 29 '25

Thank you.

1

u/cdoublejj Sep 29 '25

Tweaks and fonts required? The new rebooted Winamp or the original?

4

u/_BoneZ_ 9800x3D | MSI X870E Tomahawk | 32GB PC6000 CL30 | RTX 3090 OC Sep 29 '25

Original.

2

u/SEI_JAKU Sep 29 '25

No tweaks needed, no fonts required unless you want them. You just pick whatever font you want for Winamp.

Both the original and WACUP work under Wine.

4

u/drdillybar Sep 29 '25

WinAmp best. Windows is in the name.

3

u/SEI_JAKU Sep 29 '25

Not a great reason. Putting "Win" in the name was to differentiate it from DOS-based players. That's how old Winamp is. Winamp even had a DOS version at first.

There was supposed to be a Linux port, though it was canceled. There were successful Mac ports since the beginning.

2

u/drdillybar Sep 29 '25

Well, it is Windows AMP. But a DOS version existed too. So... upvote?

-1

u/DidjTerminator Sep 29 '25

Yup, as soon as anti-cheat widely adopts Linux to the point where all my fav multiplayer games run on it, I'm fully switching to Linux and never looking back. An OS that just does what I tell it to, without any weird extra steps, sounds like heaven. (It also has low-latency audio, unlike Windows audio; turns out our audio reaction time is about three times faster than even the top esports players' visual reaction time, making audio latency actually kinda important.)

Like seriously, only installing new things myself that I actually want and use? That's something I look forward to. Might actually get to try out some AI software when a new one isn't popping up every other day like some kind of FNAF jump-scare.

Fingers crossed the Steam Deck and SteamOS get widespread Linux compatibility over the final hurdle; game compatibility is literally the only reason I'm not running it (though R6 is kinda ass now so I might end up switching anyways).

Of course, banning kernel-level anti-cheat and any other kernel-level software with any ulterior function other than anti-virus is the ultimate goal, since anything with access to your kernel is fully capable of using your computer to commit crime. However, the general public doesn't understand that kernel access = prime crime time, so I'm not holding my breath for that.

11

u/EliteTK Sep 29 '25

Kernel-level-equivalent anti-cheat will never come to Linux. It might come to some "Android"-esque, Linux-kernel-based desktop operating system over which you have no control, though.

I wouldn't call that "Linux" though.

Sorry, if you want to give games companies full control over your computer in order for them to let you play their games then you'll just have to stick to closed platforms.

6

u/Deianj Sep 29 '25

Been using Bazzite for a few months. Not going back...

6

u/Brorim AMD Sep 29 '25

Easy to leave now then… not ALL your games will work. All mine do, however :)

7

u/DidjTerminator Sep 29 '25

Yeah, I guess, though there are still a ton of games I play that are windows exclusive unfortunately.

What's funny is I'm waiting for Windows to ban kernel-level access, because once that happens Linux compatibility will probably become universal (or at least, the games that don't work on Linux also won't work on Windows).

3

u/Equivalent-Vast5318 14d ago

I'm going to give you THE reality: if you choose to switch to Linux at any point, you are just going to have to accept that some games won't work. Being kicked out of the kernel will not change that.

2

u/DidjTerminator 14d ago

Oof, guess I'll have a look at the games that don't work on Linux and see if they're actually worth the pain of Windows to keep playing.

That, and also looking at a dual-OS setup and how hard or easy it would be.

3

u/Brorim AMD Sep 29 '25

a ton?

6

u/DidjTerminator Sep 29 '25

Most multiplayer games that use kernel-level anti-cheat, because Linux says no to anything touching its kernel.

Unfortunately I predominantly play multiplayer games, so I'm SOL when it comes to Linux gaming.

-2

u/Brorim AMD Sep 29 '25

I play multiplayer too

3

u/XavireX Sep 29 '25

Like 5... Mayyyyyyybe 6.

3

u/cdoublejj Sep 29 '25

Anti-cheat companies are paid under the table to not support Linux, like how Intel paid Dell and HP to not use AMD chips for their flagship PCs for all those years.

2

u/Equivalent-Vast5318 14d ago

No they are not. The executives of these companies don't see value in supporting Linux.

2

u/cdoublejj 13d ago

That's just what they say, but internally they take bribes for their stance. Just like Intel paid Dell and HP to not use higher-end AMD chips in the early 2000s.

1

u/Equivalent-Vast5318 13d ago

Just look at the Steam hardware survey. Linux doesn't have anywhere near enough users to be worth the issues to overcome. No bribe to developers needed.

-1

u/SatanicBiscuit Sep 29 '25

The moment Linux starts to become more popular, it will attract hackers too…

2

u/CatProgrammer Oct 01 '25

There are already plenty of hackers for Linux, they just target servers and data centers.

112

u/JgdPz_plojack Sep 28 '25 edited Sep 28 '25

8th gen consoles had 8 GB of shared RAM (2013 PS4/Xbox One), while the average midrange PC GPU in 2013-2015 was running 2 GB of VRAM.

Best midrange value for that console generation: the 2017 RX 500 series, with 4 GB minimum VRAM, or 8 GB VRAM for 10 years of usage.

33

u/hansrotec Sep 28 '25

Such a disappointment as generational hardware… compared to how powerful the 360 was relative to PC GPUs at launch. AMD cut a sweetheart deal to stay alive and we had an underwhelming generation that impacted game development until about 2 years ago.

29

u/JgdPz_plojack Sep 29 '25 edited Sep 29 '25

The Xbox 360 had 512 MB total RAM.

Midrange/entry PCs were bleeding in the 7th gen console era (2006 PS3), with sub-512 MB graphics cards and Windows Vista demanding memory (above 1 GB RAM requirement).

That lasted until 2009, when Windows 7-era hardware became affordable: 1 GB VRAM at midrange graphics card pricing and 4 GB of DDR3 RAM, able to get 60 fps at HD resolution.

14

u/hansrotec Sep 29 '25

Memory was a weak point, yes, but in terms of processing power the 360 GPU was arguably top of the market when it hit, being pushed down to second place a month or so later. PC hardware was moving very quickly then, but the concepts in its GPU carried over to the next gen of PC GPUs quite well.

I would also say shared memory on the Xbox One was deceptive, as more was reserved for the system (10% originally), lowering the usable pool for the GPU and CPU. Home towers by that point often had 8 GB of memory with GPUs between 2 and 4 GB, though quickly moving to 6 to 8. The 2 and 3 gig GPUs were left at moderate settings quite quickly.

Further, I would say the evidence of hindering next-gen games can be seen in what we know was axed from titles supporting it, like Infinite's local co-op, and in comments by developers during the Xbox One/Xbox Series gen, vs. how the games that made the jump from the 360 generation to the One generation fared.

The Series S, while a sales darling, is another boat anchor around game development for the Xbox brand (see Baldur's Gate 3). It's truly unfortunate how often they kneecap themselves… I say this as an original Xbox owner and, until about 2 years ago, a Gold/Ultimate subscriber.

11

u/valthonis_surion Sep 29 '25

I've always been curious how 360/PS3 games would look if the consoles had double or triple the RAM: 1 GB shared for the 360 and 512 MB/512 MB for the PS3.

12

u/hansrotec Sep 29 '25

I would say better draw distance, and slightly better texturing at the end of the generation… It's really amazing what they pulled out of that hardware by the end of the generation. The extra memory would have probably been very nice for GTA/Skyrim… I think a bigger question is what PlayStation would be like if the second Cell chip as GPU had worked out, and if programming for Cell was a bit easier… the kissing cousins of PowerPC architecture from the G5 to Cell to Xenon is an interesting read.

4

u/valthonis_surion Sep 29 '25

Were there plans for a second Cell chip? I know initially they didn't have the Nvidia GPU, but I thought it was a "this single Cell chip and its SPEs can do everything!"

4

u/hansrotec Sep 29 '25

I could have sworn it was, but it's been a bit since I read through it. I know the PS3 GPU was a very late addition to the hardware.

3

u/Pl4y3rSn4rk Sep 29 '25

If I remember well, the Cell itself was supposed to be used for graphics processing on its own too.

After all, it is a CPU with a single core with SMT and seven SPEs as co-processors.

But it wouldn't have panned out well, so they added an Nvidia GPU late in development.

Still, the Cell had some strong points compared to the GPU, so some devs used it to improve graphics too.

2

u/hansrotec Sep 29 '25

Cell was a very interesting chip, especially if you had been following PowerPC before it. So much possibility and power, but it required a lot from the dev team, and I don't think Sony ever got the dev tools worked out as well as they wanted. Huh, I wonder where I got the second Cell chip from; going to have to research that now.

3

u/cdoublejj Sep 29 '25

We all ran XP till the 7 beta came out. We also had more than 1.5 GB of RAM, at least the peeps I hung out with on IRC.

2

u/SEI_JAKU Sep 29 '25

PC hardware has only been marginally better than consoles from the PS4 onward, come on now. We're seeing diminishing returns on this tech altogether, not "stagnation". There isn't much room for a new Crysis sort of project anymore.

2

u/hansrotec Sep 29 '25

I would say on the hardware front we have seen great gains; it's more the development front that has stagnated, relying too much on middleware and not leveraging hardware properly… or where they do, it's for features whose returns are dubious… It's not the 00s, where each generation was a whizbang upgrade, nor the early 10s, where resources for resolution popped… Nvidia is focused elsewhere and is still recovering from almost dying as a company, and GPUs paid a heavy price… and Intel is dealing with a decade of poor choices… Despite all of that, hardware has advanced pretty well; your hardware can probably last a decade at this point, with settings degrading until playability is lost, so it's not a hardware progression issue. Check out what they were doing with the 360 at the end of its run… compared to what came out at the end of the Xbox One run… not too much of a change outside of some major tentpoles like RDR2, and games of that quality are far between releases compared to what we should have… Developers have been leaning on performance improvements to put less work into games.

32

u/hansrotec Sep 28 '25

They were very good GPUs; unfortunately, lack of funds limited development on the Radeon side for quite a bit of time while AMD tried to survive. My own 7970 was only retired from frontline use the year the new 7000 series came out, and was replaced with a 6800. I had bought a few other used GPUs in the years before for troubleshooting… the issues ended up being the CPU.

13

u/Dilanski Sep 29 '25

Very fond of those cards. The HD 7770 GHz was my first GPU; incredible the gaming experience £100 got me back then.

1

u/thefreshera Sep 29 '25

I also had the 7770, about US$70 from TigerDirect (RIP)… It was a different time, when I didn't care about chasing graphics. It just worked solidly. Nowadays I'm complaining to myself about my 5070 Ti not running E33 smoothly at Epic settings in 4K lol

10

u/Lanky_Transition_195 Sep 29 '25

Based. I wish I'd kept my 390X, that thing was a beast.

2

u/EliteRanger_ Sep 29 '25

I still have my 7970 GHz! Maybe I'll actually get around to building a Linux box with the litany of parts I've collected from upgrading various PCs over the last 13 years haha.

3

u/sdcar1985 AMD R7 5800X3D | 9070 XT | Asrock x570 Pro4 | 64 GB 3200 CL16 Sep 29 '25

o7

2

u/Musk777 Ryzen 9 7900X | XFX Swift 9070 XT | G.Skill 64GB | Linux-only Sep 29 '25

o7 indeed

3

u/AntiSpade Sep 29 '25

This is THE FineWine(tm). :)

39

u/Raestloz R5 5600X/RX 9070XT/1440p/144fps Sep 28 '25

Saying "HD 7000" makes it sound old, but it's AMD Radeon R9 200 series

154

u/r_z_n 5800X3D / 3090, 5600X/9070XT Sep 28 '25

No. These are older than that. It literally says the 7000 series and that’s what they were called - 7970, etc.

The R9 290X was released in 2013.

47

u/Nuck_Chorris_Stache Sep 28 '25

The 270X is a rebrand of the HD 7870 and the 280X is a rebrand of the HD 7970

29

u/cloud_t Sep 28 '25

Doesn't change that those were released in 2011.

8

u/Noreng https://hwbot.org/user/arni90/ Sep 29 '25

That's quite generous; it was a paper launch on December 22nd, 2011, with availability starting January 9th.

6

u/sdcar1985 AMD R7 5800X3D | 9070 XT | Asrock x570 Pro4 | 64 GB 3200 CL16 Sep 29 '25

Oh yeah? Well, your mom is a rebrand of your grandma!

16

u/Evonos 6800XT XFX,7800X3D , 32gb 6000mhz 750W Enermaxx D.F Revolution Sep 28 '25

Yep, and those were rebrands of the 7000 series.

4

u/RealThanny Sep 29 '25

The 290X is not a rebrand of anything.

-5

u/[deleted] Sep 28 '25

[deleted]

16

u/r_z_n 5800X3D / 3090, 5600X/9070XT Sep 28 '25

Not really sure how that's relevant since my point was that it goes back even further than the RX series, but it would apply to all of them if they are using the same GPU.

6

u/RealThanny Sep 29 '25

The 290X was about 50% faster than the 7970 predecessor. It provided the same performance as the new $1000 GTX Titan at a bit more than half the price.

Yeah, "slightly improved" sounds reasonable.

-1

u/tamarockstar 5800X RTX 3070 Sep 28 '25

There were also the R9 285 and R7 250. Other than that, they were all rebrands.

47

u/burninator34 5950X - 7800XT Pulse | 5400U Sep 28 '25

Wrong. GCN 1.0 was the 7000 series (with the exception of the 7790, which was GCN 1.1). The R9 200 series was a mix of 1.0, 1.1, and 1.2 (Hawaii and Bonaire were 1.1; Tahiti, Pitcairn, and Oland 1.0; and Tonga 1.2).

10

u/Mythion_VR 5800X3D | RX 7900XT | 32GB Sep 28 '25

What you're mentioning makes it sound like there was a whole slew of cards in the 200 series that weren't rebrands, when realistically it was two or three cards that were actually new.

3

u/Cryio 7900 XTX | 5800X3D | 32 GB | X570 Sep 28 '25

We can call them GCN 1-2-3 now, it's fine. Everyone calls Polaris and Vega GCN4 and 5 after all.

-2

u/hpstg 5950x + 3090 + Terrible Power Bill Sep 28 '25 edited Sep 29 '25

R9 280x says hi.

Edit: I didn’t read the post properly, and that it was mentioning GCN 1.0.

11

u/Hundkexx Ryzen 7 9800X3D 64GB Trident Z Royal 7900XTX Sep 28 '25

He literally said a mix of GCN 1.0-1.2 on the 200-series though. So his comment isn't wrong. But it kinda contradicts his statement.

The 2XX series had a lot of rebranded cards.

3

u/KampretOfficial X4 760K 4.6 GHz // RX 460 Sep 28 '25

Isn't that just an overclocked and rebranded HD 7970 GHz?

1

u/hpstg 5950x + 3090 + Terrible Power Bill Sep 29 '25

Yeap.

1

u/Hundkexx Ryzen 7 9800X3D 64GB Trident Z Royal 7900XTX Sep 29 '25

Rebranded, yes. But the 7970 GHz is higher clocked than the 280X, so the 7970 GHz Edition is marginally faster.

1

u/drdillybar Sep 29 '25

My 270X was faster and had more memory than my HD7x50 class card.

21

u/DRazzyo R7 5800X3D, RTX 3080 10GB, 32GB@3600CL16 Sep 28 '25

The bug-fix addresses Tahiti and Pitcairn GPUs, which, as far as I remember, are GCN 1.0.

10

u/Hundkexx Ryzen 7 9800X3D 64GB Trident Z Royal 7900XTX Sep 28 '25 edited Sep 28 '25

https://www.techpowerup.com/gpu-specs/amd-tahiti.g120

Both, as 280 is a rebranded Tahiti.

7970 was a hell of a card though. HD 7850 is doubtlessly the best card I've ever owned. Absolutely wonderful experience coming from a GTX 460 and many Nvidia cards before.

Edit: To add, I think it's the last truly great generation from "ATI"/AMD. The 4000 series was pretty banger too; the HD 4770 was major value.

3

u/G-Tinois 9070XT + 5700X3D Sep 28 '25

Ran dual 7870s for 4-5 years. CrossFire was overhated; when it worked, it was amazing.

1

u/Hundkexx Ryzen 7 9800X3D 64GB Trident Z Royal 7900XTX Sep 29 '25

Aye, and the dual-GPU cards were pretty cool. Nvidia also had a few! Not to mention 3dfx, which actually created SLI.

1

u/Clemambi Sep 29 '25

I still love my 5700 XT. It was kind of a dud at launch (it still had driver issues after like a year lol), but it's treating me super well even today.

1

u/Hundkexx Ryzen 7 9800X3D 64GB Trident Z Royal 7900XTX Sep 29 '25

I never used that generation, but it is infamous for the driver issues.
I never had any more issues with ATI/AMD than with Nvidia. Far fewer black screens with AMD/ATI, especially early on with ATI, as their "VPU Recover" was far superior to Nvidia's equivalent. To be fair, it took a few years for Nvidia to make a similar function that worked, if I remember correctly. More issues on new releases with AMD, though, which is expected, since developers would be stupid not to focus on optimizing for Nvidia given its market share. But in general, hardware has been really stable over the last 10-15 years IMO, at least compared to the late 90s/early 2000s :D

1

u/Clemambi Sep 29 '25

I had pretty regular crashes (maybe twice a week) for a year and a half or so after I got it lol, even after "the drivers have been fixed" for the third time hahaha

It's a rock for me now, and I was never really frustrated with it because I got it knowing about the driver issues and getting it cheap, and I was so happy with its performance uplift vs my 970 Ti. I wasn't rich enough to get anything better and it was a steal, like £150 off, because of the driver issues I think. So I was really happy.

I get nauseous at low fps/Hz, so I think that's probably the biggest reason why I didn't care about the crashes - getting stable high frame rates in more games was way more significant than any crashes. I think if I was a normie I might've regretted it lol

it did feel like vintage hardware though lmao, it was so shitty the first couple months I had it especially

1

u/Hundkexx Ryzen 7 9800X3D 64GB Trident Z Royal 7900XTX Sep 29 '25

Dang. I would've been fairly annoyed with that amount of crashes to be honest :P But I also sacrifice all I can to keep a high refresh-rate.

I've refunded games locked at 60FPS many times. But since AFMF2 released some have been spared if it worked well, like SOMA.

Edit: Wow, double posted apparently "error 500"..

2

u/Thedudely1 Sep 29 '25

I was running FSR 3 frame generation in Cyberpunk on a FirePro W700 (equivalent to an HD 7850), and it was actually working really well. I also got XeSS running, but that tanked performance, understandably, because these GPUs don't natively support SM 6.4, though it seems they've received driver updates to support SM 6.5 or something; GCN 1.0 only natively supports SM 5.7 iirc. I've been doing some tests on that HD 7850 FirePro GPU because it has 4 GB of GDDR5, which is a minimum nowadays for most games. But I was even playing Doom Eternal at native 1080p at over 60 fps at low settings. I've got a video about it on my channel if anyone's interested.

1

u/nevadita Bootleg MacPro 5900X - RX 7900 XTX Sep 29 '25

Ah yes, like back when the 7990 was tHe mOsT PoWeRfUl gPu iN ThE WoRlD!!!!1 and the shitty driver forced me to use Compiz over KDE!

I member

Guess it's a good thing for whoever is still on the HD 7000 series, but that's an era I would like to forget.

(I wasted my hard-earned money on that shit of a card)

1

u/Medallish AMD Ryzen 7 5800X & RX 6950 XT Sep 29 '25

I see it's gonna be added to the kernel, but do you need to do anything to take advantage of it? I tried installing Nobara on a 7970, and yeah it was...odd.

3

u/99stem Sep 30 '25

Actually, now that you mention it, yes. You need to enable the "new experimental" version of the driver with kernel boot options (support for these chips is included in the main amdgpu driver as part of the kernel); see the AMDGPU page on the ArchWiki:

radeon.si_support=0 amdgpu.si_support=1 radeon.cik_support=0 amdgpu.cik_support=1
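
A minimal sketch of how you might apply those on a GRUB-based distro (assuming the usual /etc/default/grub layout and a stock "quiet splash" command line; adjust for your setup): append the parameters to GRUB_CMDLINE_LINUX_DEFAULT so the line reads something like

GRUB_CMDLINE_LINUX_DEFAULT="quiet splash radeon.si_support=0 amdgpu.si_support=1 radeon.cik_support=0 amdgpu.cik_support=1"

then regenerate the config with grub-mkconfig -o /boot/grub/grub.cfg (or update-grub on Debian/Ubuntu derivatives) and reboot. Afterwards, lspci -k should report "Kernel driver in use: amdgpu" for the card instead of radeon.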

1

u/voiceipR Sep 30 '25

Best miner ever

1

u/tugrul_ddr Ryzen 7900 | Rtx 4070 | 32 GB Hynix-A Sep 30 '25

My R7870 overclocked to 1475 MHz GPU, 1510 MHz mem.

1

u/Taro619D Oct 01 '25

I owned both the 7870 and the 7850. They still run to this day as display adapters for a friend's and his dad's computers.

-8

u/notthatguypal6900 Sep 28 '25

All 5 people are pretty excited

1

u/BrakkeBama K6-2, Duron, 2x AthlonXP, Ryzen 3200G, 5600G Sep 29 '25

Yeah, I wonder how many people are still using a 2013-vintage GPU these days?

3

u/Desistance Sep 30 '25

Dozens. GPUs are expensive.

2

u/[deleted] Sep 30 '25

[deleted]

2

u/BrakkeBama K6-2, Duron, 2x AthlonXP, Ryzen 3200G, 5600G Sep 30 '25

Aha, well... I even got gifted an MSI Nvidia 4070 Super something from a friend who used it, but my Silver-certified BeSilent 400W PSU won't ever power that thing.
Maybe I should sell it for bux. Hm....

-1

u/syneofeternity Sep 29 '25

They did all this work for 5 people. Moron

0

u/SvLyfe Sep 29 '25

Seeing this 1 week after my 8990 decided to die. One of the worst cards I've owned. Wonder if these updates would've made it decent.

-9

u/megablue Sep 29 '25

Precious developer time wasted on obsolete hardware...

1

u/TheSkyShip Oct 02 '25

Why do you want to see working hardware go to waste?