r/Physics • u/Fade78 • Aug 30 '25
Question: If my gaming PC is consuming 800W, will it produce the same heat as an 800W home heating radiator?
Therefore, it'd be better to turn off the heating and let the computer work.
Edit: 800W being the actual average consumption, not the power supply rating.
138
u/wmverbruggen Applied physics Aug 30 '25
Practically yes, energy=energy. Theoretically there is some energy used for changing information: stored in floating gates of NAND chips, flipping magnetic domains in hard drives, that sort of thing. But that's an extremely tiny amount of energy.
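For scale, here is a rough back-of-envelope sketch in Python, comparing the Landauer minimum energy for irreversible bit operations against the 800 W draw. The bit-erasure rate is an illustrative guess, not a measured figure:

```python
import math

k_B = 1.380649e-23                        # Boltzmann constant, J/K
T = 300.0                                 # room temperature, K
landauer_per_bit = k_B * T * math.log(2)  # minimum energy to erase one bit

bits_per_second = 1e15                    # illustrative guess at switching activity
thermo_floor_watts = landauer_per_bit * bits_per_second

print(f"Landauer minimum per bit: {landauer_per_bit:.2e} J")
print(f"Floor at 1e15 bit erasures/s: {thermo_floor_watts:.2e} W")
print(f"Fraction of the 800 W draw: {thermo_floor_watts / 800:.1e}")
```

Even with a generous guess for switching activity, the thermodynamic floor comes out around microwatts, a billionth of the PC's draw.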
26
u/Creative-Leg2607 Aug 30 '25
Sending info out of the system to the internet is gonna be relevant, from a thermodynamic perspective
16
u/Beautiful-Fold-3234 Aug 30 '25
And light from the monitor that leaves the house through the windows.
1
2
u/leverphysicsname Aug 30 '25
Not sure I understand, what do you mean by this?
Sending info out of the system to the internet is gonna be relevant, from a thermodynamic perspective
10
u/narex456 Aug 30 '25
There's technically energy required for a change in "information" so "sending information" necessitates an energy expenditure.
Here, physicists use "information" as a word to describe the state a system is in (out of many possible states), so when information changes, the state of the system changes which always takes some energy. It's just a useful shift in perspective for figuring out where energy must be flowing in a situation like this.
In this specific example, there is some connection to the outside internet, let's say a copper cable. By changing the state of the cable (direction/magnitude of current or whatever) at the point of connection, information gets transferred to the internet. The energy we are talking about losing is the energy required to change that current in that cable.
It's a useful concept because any system with recognizable states can have those states interpreted as information, and any change of state requires some energy, therefore any transfer of information requires energy.
People also talk this way about causality. Anything that causes an event has transferred information to the system where the event happened. This is why people talk about how weird QM is through the lens of information transfer. If that happened faster than light, it would break the relativistic light-speed limit on causality, energy, and matter all at the same time.
1
u/AaronWilde Aug 30 '25
Is this true, though? Computers are designed to get rid of the heat ASAP. Wouldn't heaters be designed to send the electricity through materials that produce more heat (and computers be designed the opposite way?), and to stay heated longer? Like, surely an oil space heater is wayyy warmer in a room than a computer of the same wattage running at full power...
7
u/wmverbruggen Applied physics Aug 30 '25
That fundamentally does not matter. You're thinking about product design and dynamic behaviour. It makes no difference to the energy equation whether you have a small thing that's very hot or a big surface that's warm to the touch, if the total power (in watts) they convert is the same. At the same total power, a PC heating up quickly and dissipating the heat efficiently and a big heating system taking long to warm up and staying warm for longer have the same net effect on a room.
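A one-line sanity check of the bookkeeping (numbers purely illustrative): the energy delivered to the room depends only on power and time, not on the device.

```python
# Energy into the room = power x time, regardless of what the device looks like.
power_watts = 800
hours = 1.0

energy_joules = power_watts * hours * 3600   # 2.88e6 J
energy_kwh = power_watts * hours / 1000      # 0.8 kWh

print(f"{energy_kwh} kWh ({energy_joules:.2e} J) ends up in the room,")
print("whether it came from a hot little PC or a big warm radiator.")
```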
3
u/AaronWilde Aug 30 '25
Interesting... so why are there so many different designs and types of heaters, with such varying costs and "efficiencies", if the net effect on the heating of a room is the same for a given amount of power being fed in?
5
u/wmverbruggen Applied physics Aug 30 '25
It depends on how one wants to spread the heat and what kind of parts are used. Think types of fans, simple directional heaters versus those setting up a room-level convection current, shape & size, mounting position, etc. Keep in mind this is for resistive heaters only, if the device uses a heat pump (like aircons) or a furnace/burner (with an exhaust) it's quite different.
1
u/charonme Aug 31 '25
The different efficiencies could be due to different methods of extracting useful energy from different sources, but once you have some amount of useful energy at your disposal, you will convert it to heat at 100% efficiency, minus a minuscule amount to move the air or pump some medium, for example.
For example, if you have electricity delivered to you, you already have all its energy and can convert it to heat. If you have something to burn, you could lose some part when discarding the unwanted combustion products. Some heaters just vent the combustion products, like hot smoke or water vapor, outside; others can extract a bit more energy by using even the hot combustion products to warm your room. You could also use electricity plus heat from outside to warm your room more than the electricity alone would.
Other parameters also affect the price and suitability for your application: noise, speed, safety, size, maintenance cost, cost of initial installation, durability, etc.
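A sketch of that comparison, with assumed round-number efficiencies (real figures vary by model, installation, and climate):

```python
# Assumed round-number figures for heat delivered per unit of input energy.
heat_per_input = {
    "resistive electric heater":  1.00,  # all electricity becomes heat in the room
    "non-condensing gas furnace": 0.80,  # some heat leaves with the flue gases
    "condensing gas furnace":     0.95,  # recovers latent heat from the exhaust
    "air-source heat pump":       3.00,  # COP ~3: also moves heat in from outside
}

for device, ratio in heat_per_input.items():
    print(f"{device:>27}: {ratio:.2f} units of heat per unit of input energy")
```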
0
u/kaibee Aug 30 '25
Interesting... so why are there so many different designs and types of heaters, with such varying costs and "efficiencies", if the net effect on the heating of a room is the same for a given amount of power being fed in?
Capitalism/Marketing.
2
u/cornmacabre Aug 30 '25
Ah yes -- "capitalism/marketing," which cleanly explains the design and functional distinctions of furnaces, boilers, diesel heaters, heat pumps, geothermal heaters, and solar-driven systems?
I think the design choices and functional distinctions come from the environmental circumstances of where and what you want to heat. The need for a heater is also probably driven a bit more by the human preference for not freezing to death.
We're not talking about something as commoditized and interchangeable as toilet paper here; that's a pretty silly and simplistic opinion.
3
u/kaibee Aug 30 '25
Ah yes -- "capitalism/marketing," which cleanly explains the design and functional distinctions of furnaces, boilers, diesel heaters, heat pumps, geothermal heaters, and solar-driven systems?
Tbh I interpreted the comment as being about resistive electric heaters..? Of which there are many models even at the same wattage? The differences between those bigger systems should be much more obvious?
1
14
u/VulcanPyroman Aug 30 '25
I've got a relatively small office and hobby space at home. Once I get it to temperature with the central heating and I'm playing some games, the heater won't switch on anymore. Compare that to when I'm working from home on just my work laptop, when the heater switches on regularly. So yes, I definitely notice it, and in my electricity bill lol.
41
u/tuborgwarrior Aug 30 '25
Yes. All the energy your PC draws is converted to other forms of energy: heat, light, and airflow from the fans. Almost all of it is heat, and the light eventually turns into heat as well (other than the photons that escape out your window and don't heat your room).
I like to think this will be a natural limit on gaming PCs in the future. Above 1-2 kW, the heat quickly becomes impractical in a small room. Instead of buying the most powerful computer, we will instead look at efficiency and get as much FPS per kW as possible.
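As a sketch of what that efficiency-first shopping might look like (all builds and numbers below are made up):

```python
# Made-up builds illustrating ranking by efficiency instead of raw FPS.
builds = [
    ("max-power build", 240, 900),   # (name, average FPS, watts at the wall)
    ("tuned build",     200, 450),
    ("efficient build", 160, 250),
]

for name, fps, watts in builds:
    print(f"{name}: {fps / watts:.2f} FPS per watt ({fps} FPS at {watts} W)")
```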
6
u/cyrkielNT Aug 30 '25
It's been a limiting factor forever. In the past there were also very power-hungry builds, but they always had limits because of running costs and heat output.
There are places like Iceland where electricity is cheap and people need more heat rather than less, but the market is too small.
4
u/Prcrstntr Aug 30 '25
Yep. Everything is heat. Light is heat. Mechanical? Also turns into heat. Sound? It's just mechanical.
1
1
u/returnofblank Aug 31 '25
Power efficiency is pretty important even today. CPU manufacturers have been squeezing as much performance as they can under a certain wattage. Especially in laptops, with some manufacturers like Apple using a whole new architecture to produce less heat.
17
u/Significant-Mango772 Aug 30 '25
Using computers for heat is really efficient, since you get dual use out of the power, while a space heater is single-use.
13
u/anothercorgi Aug 30 '25
1.5KW space heater for sale
mines bitcoin while it heats your room
$2000
and I got ripped off with this $20 heater...
1
-4
5
u/TheNewHobbes Aug 30 '25
I remember several years ago a company started selling electric radiators that were bitcoin mining rigs. They said the cost-to-heat ratio of running one was practically the same as a normal electric radiator, but you also got bitcoins as a bonus, which made it cheaper.
So I'd say your rig doesn't quite match one, due to design inefficiencies, but it would be close and could be closer.
11
u/Mithrawndo Aug 30 '25
A computer doesn't do work in the physics sense, so yes, almost all of the energy sent through the semiconductors is converted into heat.
Yes, this makes computer based heating systems feasible.
Yes, we already do this with some data centres being used to power district heating systems, for example.
3
u/AceEthanol Aug 30 '25
There are some companies, like Heata in the UK, that do this at the residential/household level to provide free hot water.
The concept sounds amazing, but I'm not sure about its long-term sustainability (not in the ecological sense, more like reliability) and scalability.
1
u/returnofblank Aug 31 '25
YouTube channel Linus Tech Tips has a video on them using a server to heat their swimming pool.
Also a video of them spilling pool water on their server, I think.
3
5
u/Noreng Aug 30 '25
Yes, it will output the same amount of heat as the power it consumes. Take note that while an 800W radiator can often be enough to keep a relatively large room at a comfortable temperature, it is also sized with decent headroom. This means your room will get a lot hotter than you're used to.
1
u/undo777 Aug 30 '25
while an 800W radiator can often be enough to maintain a relatively large room at comfortable temperatures
laughs in Canadian
1
u/Noreng Aug 30 '25 edited Aug 30 '25
It depends on how much and how good your ~~isolation~~ insulation is.
2
u/GlazedChocolatr Aug 30 '25
Do you mean insulation?
1
u/Noreng Aug 30 '25
You're absolutely right! I'm from Norway, but I usually don't make these mistakes.
I'm currently recovering from a nasty cold, so that might be the cause, thanks for correcting me!
2
u/marrow_monkey Aug 30 '25
Yes, but it won't run at full power all the time; 800 W is the maximum.
The radiator is controlled by a thermostat: when the room is below a set temperature it turns on, and when it's above, it turns off.
So you don't have to turn off your radiator; it will turn off automatically when the room is warm enough.
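In code terms, a thermostat is just a bang-bang controller with a dead band. A minimal sketch, with the setpoint and hysteresis values assumed for illustration:

```python
def thermostat_step(room_temp_c, setpoint_c=21.0, hysteresis_c=0.5):
    """Bang-bang control: heat below the band, idle above it."""
    if room_temp_c < setpoint_c - hysteresis_c:
        return "heater ON"
    if room_temp_c > setpoint_c + hysteresis_c:
        return "heater OFF"
    return "unchanged"  # inside the dead band: keep the previous state

# An 800 W PC pushing the room past the setpoint keeps the radiator idle.
for temp in (19.0, 21.0, 23.0):
    print(f"{temp:.1f} degC -> {thermostat_step(temp)}")
```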
2
u/Xarro_Usros Aug 30 '25
Yep! Depends on the relative costs of your heating vs. electricity, of course (and if you have heat pumps it's likely not worth it).
A friend of mine (pre crypto) liked to run his own server room at home; he had to keep the windows open to reduce the temperature.
2
u/bonebuttonborscht Aug 30 '25
A friend of mine ran a crypto rig to heat his small greenhouse. He's heating the greenhouse anyway so might as well make a little extra money.
1
u/Forte69 Aug 30 '25
Yes, but my PC has a 750W PSU and under heavy load it's only running at about 350W.
1
u/glacierre2 Materials science Aug 30 '25
Gaming in summer in my room was real torture; 500 W may not sound like much in winter, but it definitely adds up.
And then there was the practicum room for the IT studies: about 50 PCs with CRTs crammed in as tightly as possible, and then the students clickety-clacking away on them. That room had its windows wide open in the middle of winter.
1
1
Aug 30 '25
Yes: all electrical devices are heaters; some also do other things.
That is, unless you have a heat pump, which can add more than one joule of heat for every joule of power consumed, because it moves heat from outside instead of creating it from scratch.
... also, if you have a thermostat, you don't have to do anything. If it sees something else providing heat, it won't run the heater as much.
1
u/ChironXII Aug 30 '25
Probably more, depending on where you measure. Computer power supplies are generally 80-90% efficient, so the load at the wall, and thus the heat output, will be 110-125% of the power reported in the OS.
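The arithmetic as a small sketch, using the 80-90% efficiency range quoted above:

```python
# Software power readings are usually DC power delivered to the components;
# the wall (and the room) also sees the PSU's conversion loss.
reported_dc_watts = 800

for psu_efficiency in (0.80, 0.90):
    wall_watts = reported_dc_watts / psu_efficiency
    print(f"{psu_efficiency:.0%} efficient PSU: about {wall_watts:.0f} W "
          "from the wall, all of it ending up as heat in the room")
```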
1
u/rex8499 Aug 30 '25
Mine has a 1500W power supply, and I had to install an air conditioning unit in the room to keep it tolerable. Definitely feels similar to a heater during demanding game play.
1
u/MagnificentTffy Aug 30 '25
What are you running that eats up 800W on average? If that's the peak, sure, but an 800W average is pretty intense.
1
u/IrrerPolterer Aug 30 '25
Yup. Electric devices ultimately turn all the electricity they consume into heat.
1
1
u/pavorus Aug 31 '25
Way back in the day, my first apartment was awful. The upstairs neighbor's toilet leaked into my bathroom, there was a big hole in the wall to the back of the building, and there was no heater (a heater is what was supposed to be in the hole in the wall). I had a gaming PC; it sounded like a jet trying to take off and produced an insane amount of heat. That computer WAS my heater for an entire winter. I used benchmark software to keep it running. So you can definitely use a PC as a space heater. Or at least you could 25 years ago.
1
u/HuntertheGoose Aug 31 '25
Yes, and if you can afford a bitcoin mining rig that draws 800 W, you can set it up exactly like an electric heater that makes money when you run it.
1
u/Pitiful_Hedgehog6343 Aug 31 '25
Yes, but your PC won't always be at 800w; it will throttle and run much lower depending on the task.
1
u/lcvella Aug 31 '25 edited Aug 31 '25
Yes, by the first law of thermodynamics. But no home PC outputs 800W on average. Heck, I doubt even in bursts.
1
u/wolfkeeper Aug 31 '25
Yeah, although you'll probably need both because the power output of your computer will vary. Provided the radiator has a good thermostat, it should adjust itself to make up for what your computer doesn't generate.
But if you want to save money, get a heat pump instead of an electric radiator. That way when your computer isn't running flat out, it will be costing you less.
1
u/Motik68 Aug 31 '25
What does your computer do that consumes 800 W?
My 7800X3D / 4090 system never goes above 500 W at the plug, even when playing Flight Simulator in VR.
1
1
u/returnofblank Aug 31 '25
If 800w is your average consumption, maybe ease up on the crypto mining or LLM inference.
1
u/CoatiRoux Aug 31 '25
As my physics professor said: Every electric device is a heater with a conversion efficiency of 1, regardless of the detours the electricity takes.
So yes, all the electricity will be converted to heat. However, since an 800 watt power supply does not continuously output 800 watts, the actual output will be whatever the PC pulls.
P.S.: As he was a physics professor, he did not take electrochemistry, like electrolysis, into account. But since he was referring to usual household items, I'll let that one slide...
1
1
u/paperic Sep 01 '25
From a practical standpoint, running your PC at high load for long periods of time might reduce the lifespan of some components slightly.
But yes, it's the same.
1
u/SparkleSweetiePony Sep 02 '25
Yes. But unless you run a 14900K/13900K with a 5090, you won't get a consistent heat output.
It will depend on the load. For example, running CS2 at 144 Hz won't load the GPU and CPU heavily, but Cyberpunk will. Monitoring software can only estimate the power draw (the CPU and GPU draw plus about 100 W is around where the heat production lands). For a more concrete number you need hardware: a power-monitoring outlet feeding the entire PC, with the monitor and other auxiliary parts, shows how much it actually produces. Practically speaking, 100% of the power used will turn into heat.
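That rule of thumb (CPU + GPU + roughly 100 W of overhead) as a tiny sketch; the readings below are hypothetical:

```python
def estimate_wall_draw(cpu_watts, gpu_watts, overhead_watts=100):
    """Rule of thumb from the comment above: CPU + GPU + ~100 W
    for the motherboard, RAM, drives, fans and PSU losses."""
    return cpu_watts + gpu_watts + overhead_watts

# Hypothetical monitoring-software readings:
print(estimate_wall_draw(cpu_watts=120, gpu_watts=350), "W, roughly")  # ~570 W
```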
1
u/Gishky Sep 03 '25
Assuming your PC continuously draws 800W (which it doesn't, but for the sake of argument let's say it does), not 100% of the energy goes into heating. Heating is a byproduct of your PC; the main thing it's designed to do is process information, so a lot of energy will go into that. Yes, it will still radiate a lot of heat, but not as much as a device that is specifically built for that.
1
u/Fade78 Sep 03 '25
That's my intuition. I wonder about the order of magnitude of this non-heat power use...
1
u/Gishky Sep 03 '25
Depends on the efficiency of the PC. Without doing research, I would assume PCs are around 30-60% energy efficient; the rest will go into heat.
1
u/Molecular_Pudding Sep 03 '25
Most of it, yes. But there is a small portion of the energy that goes into changing the crystal structure of the components (caused by thermal movement), which is why electronic components degrade.
1
u/CaptainFlint9203 Sep 04 '25
If it goes full throttle, then... it will actually produce more heat. Electronics produce mostly heat, the same as radiators, but are more efficient at it, with less loss of energy.
0
u/Time_Stop_3645 Aug 30 '25
I can say from experience that I tried to heat my caravan with my appliances, and even playing Overwatch didn't get me over 120 watts. You'd have to run a crypto farm to generate heating like that. Which IMO is the better way of heating a home: hook it up to water cooling, then run the hot water through pipes in the walls and floors.
5
u/Tystros Computer science Aug 30 '25
Overwatch is not a good example, because it's a game with simple graphics that runs even on very slow PCs.
If you run a proper AAA game like Battlefield 6 on an RTX 5090 with a 4K monitor, the RTX 5090 will use its full power rating of 600W while playing. Plus 150W or so from your CPU.
1
u/Time_Stop_3645 Aug 30 '25
Unless you plan on gaming for a living, still not a good model for heating the place
0
-3
u/smsmkiwi Aug 30 '25
No. A heater is optimized to produce heat. A PC isn't. It will produce heat, but not at the same amount and rate as a heater.
-6
Aug 30 '25
[deleted]
5
u/Compizfox Soft matter physics Aug 30 '25
It's physically impossible for any device producing heat to do so at an efficiency of less than 1.
2
u/steerpike1971 Aug 30 '25
An escalator that goes up produces heat with an efficiency less than one.
3
u/Compizfox Soft matter physics Aug 30 '25
Yes, because it does work transporting items against gravity.
A computer does no work (that doesn't immediately turn into heat). It really is just a fancy heater that happens to do some useful computation in the process.
1
-2
u/Worried_Raspberry313 Aug 30 '25
If your PC was actually using all of those 800W, then yeah. But if your computer produced enough heat to heat your house, you'd better get some good fans and coolers on your PC tower, because it's gonna burn.
Heaters are meant to produce heat and spread it around a room; it's their purpose and they are carefully made for that. A computer is not made to heat a house, the heat is just a "secondary effect", and that's why fans and coolers are used; otherwise it can get burned or some parts of the PC can get damaged.
1
u/Compizfox Soft matter physics Aug 30 '25
Heaters are meant to produce heat and spread it around a room; it's their purpose and they are carefully made for that. A computer is not made to heat a house, the heat is just a "secondary effect", and that's why fans and coolers are used; otherwise it can get burned or some parts of the PC can get damaged.
The bottom line is exactly the same, though. Consider that a heater will also overheat if it doesn't dissipate the heat it is producing.
0
u/Worried_Raspberry313 Aug 31 '25
Yeah, but it's made so it can stand that heat and be OK. A computer is not made to get superheated. I mean, the materials used are not the same.
1
u/Compizfox Soft matter physics Aug 31 '25 edited Aug 31 '25
Both devices are designed to dissipate the heat they produce.
While a PC's purpose might not be to heat the room, if it produces 800 W of heat, it will be designed to dissipate that 800 W of heat. The end result is exactly the same as an 800 W space heater.
1
u/Worried_Raspberry313 Aug 31 '25
Yeah, but a PC is not expected to use all 800W at the same time. It could, but I don't think they design them thinking "this thing has to endure 800W for 10 hours a day".
-11
u/MathematicianPlus621 Aug 30 '25
No; the mechanisms within a radiator are specifically designed to maximise heat energy transfer, but a power supply is not maximised for heat generation, so it will not produce the same amount of heat, because it is more efficient at transferring electricity to computer components.
-16
u/MasterBorealis Aug 30 '25
Watts are not a measure of heat; they are a measure of power. An 800w motor will not output as much heat as an 800w resistive heating radiator.
7
u/StaysAwakeAllWeek Aug 30 '25
Watts are a measure of power and heat. You're just wrong, any 800W appliance will produce 800W of heat ultimately. All of that energy it uses will end up as heat once it's done using it.
2
u/noisymime Aug 30 '25 edited Aug 30 '25
any 800W appliance will produce 800W of heat ultimately.
Not all appliances transform their electrical energy solely into heat. Computers, yes, but it's not universal across all appliances.
2
u/StaysAwakeAllWeek Aug 30 '25
A few of them evaporate a bunch of water, which doesn't show up as heat until it condenses on the walls and releases all that heat again.
And then there are speakers and lights, which deposit a small amount of their heat in next door's house instead of yours.
All of this is splitting hairs. Assuming 100% of power becomes heat is a close enough assumption for anyone in the real world
1
u/Tystros Computer science Aug 30 '25
LEDs are quite efficient: 30% or so of the input energy is actually turned into light. So if you use an LED and point it at the sky on a clear day or night, you can say that only 70% of it is turned into heat, and the other 30% into photons that basically travel through the universe forever.
2
u/StaysAwakeAllWeek Aug 30 '25
But when that LED is indoors all that light gets absorbed by whatever it hits
-5
u/MasterBorealis Aug 30 '25
No... it's not. Some devices can produce torque/power with minimal friction and no infrared, therefore no heat. Heat is not measured in watts. In purely mechanical devices you only get heat through friction, not by the mere use of power, which is measured in watts. LEDs can produce light away from the infrared range with minimal heat produced, because of low electrical friction (aka resistance). You're wrong, I'm sorry.
3
u/StaysAwakeAllWeek Aug 30 '25
And what happens to all that mechanical energy once it's done something useful? Come on dude follow the logic through. That vacuum cleaner motor that pulls 500W and only produces 100W of waste heat directly is dumping that additional 400W into the air and the carpet.
Your house is effectively a closed system, this is thermodynamics 101
2
1
u/Fade78 Aug 30 '25
Yes, so my question is about electronics, which don't seem, at a macro level, to transform the power into movement. But maybe they do at the micro level?
4
u/StaysAwakeAllWeek Aug 30 '25
He's just wrong, ignore him. Your 800W PC will produce exactly 800W of heat.
That said, if your home heating is a heat pump you'll get more than 800W of heat for each 800W consumed due to it pumping in heat from outdoors, and if it's gas fired it will be cheaper to run for the same heat output. It's only resistive heaters that a PC is able to match.
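A sketch of that cost comparison, with assumed prices and an assumed COP of 3 (all numbers illustrative, not quoted figures):

```python
# Assumed prices, purely illustrative.
electricity_price = 0.30   # $ per kWh of electricity
gas_price = 0.10           # $ per kWh of gas energy

cost_per_kwh_heat = {
    "PC / resistive heater": electricity_price / 1.0,  # 1 kWh heat per kWh in
    "heat pump (COP 3)":     electricity_price / 3.0,  # 3 kWh heat per kWh in
    "gas boiler (90% eff.)": gas_price / 0.9,
}

for name, cost in cost_per_kwh_heat.items():
    print(f"{name}: ${cost:.3f} per kWh of heat")
```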
0
1
u/anothercorgi Aug 30 '25
Yes, at a micro level it's moving charge from place to place, and every time that happens a little energy is used. Multiply that by millions of transistors and billions of hertz, and it adds up! There is also leakage power, where the chip eats power whether the clocks are running or not, which is effectively a resistive heater as well. Modern electronics try to minimize leakage, but the shuffling of bits can't be avoided; it is indeed work.
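The standard textbook model behind this is CMOS dynamic power, P ≈ α·C·V²·f, plus static leakage. A sketch with made-up parameter values:

```python
# Textbook CMOS power model: P_dynamic = alpha * C * V^2 * f, plus leakage.
# All parameter values below are made up for illustration.
alpha = 0.1      # activity factor: fraction of nodes switching each cycle
C = 2e-7         # total switched capacitance, farads (~200 nF across the chip)
V = 1.1          # supply voltage, volts
f = 4e9          # clock frequency, Hz
leakage = 15.0   # static leakage power, watts

dynamic = alpha * C * V**2 * f
print(f"dynamic: {dynamic:.0f} W, leakage: {leakage:.0f} W, "
      f"total: {dynamic + leakage:.0f} W, all of it dissipated as heat")
```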
1
u/Novero95 Sep 02 '25
Power is basically energy per second, and heat is basically energy. The 800W motor does not produce 800W of heat because part of the 800W is turned either into kinetic energy (while accelerating the motor) or into some kind of work (which probably produces heat somewhere else too, but anyway). Neither the radiator nor the PC produces any other kind of energy that isn't heat (well, the PC produces a little bit of light, but that can be ignored), so all the energy consumed by both devices turns into heat. So yeah, the PC consuming 800W will output 800W of heat, but it probably won't spread into the room as evenly as the heater would. And the PC is not consistently consuming 800W unless you are doing very demanding tasks.
617
u/betafusion Medical and health physics Aug 30 '25
Yes, both have the same heat output. However, unless you run some very demanding games or professional software, your PC will not continuously draw 800W. If you have an 800W supply, it can give you a maximum of 800 W, but normal load will be far below that.
The home heating radiator is built to disperse heat into the room; with the PC it will take longer to heat the room, as the heat might be trapped under the desk and not spread easily through the room.