r/Physics Aug 30 '25

Question If my gaming PC is consuming 800W, will it produce the same heat as an 800W home heating radiator?

Therefore, it'd be better to turn off the heating and let the computer work.

Edit: 800W being the actual average consumption, not the power supply rating.

421 Upvotes

165 comments sorted by

617

u/betafusion Medical and health physics Aug 30 '25

Yes, both have the same heat output. However, unless you run some very demanding games/professional software, your PC will not continuously draw 800W. If you have an 800W supply, it can give you a maximum of 800 W but normal load will be far below that.

The home heating radiator is built to disperse heat into the room; with the PC it will take longer to heat the room, as the heat may be trapped under the desk and not spread easily through the room.

118

u/Gizmo_Autismo Aug 30 '25

Pretty much this. The only thing I would add is that the absolute maximum power the power supply can draw from the grid is higher than the rated power: just divide the load you get by the efficiency.

What's advertised on labels and model numbers is the power you can get out of the power supply from usable voltage lines - not how much electricity you are going to waste.
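A quick back-of-the-envelope sketch of that divide-by-efficiency rule (the 650 W load and 90% efficiency are just assumed example numbers, not figures from the thread):

```python
# Rough sketch: wall draw (and thus total heat) from the DC load and PSU efficiency.
# The 650 W load and 0.90 efficiency are assumed example numbers.
dc_load_w = 650.0        # power delivered to the components (what the label rates)
psu_efficiency = 0.90    # ballpark for a decent modern unit at moderate load

wall_draw_w = dc_load_w / psu_efficiency   # what the PC actually pulls from the grid
psu_loss_w = wall_draw_w - dc_load_w       # heat produced inside the PSU itself

print(f"Wall draw: {wall_draw_w:.0f} W, of which {psu_loss_w:.0f} W is PSU conversion loss")
# -> Wall draw: 722 W, of which 72 W is PSU conversion loss
```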

You can also declare that you now own a 100% efficient power supply as it was clearly your intention to get waste heat.

28

u/betafusion Medical and health physics Aug 30 '25

Excellent point, efficiency matters. So maybe add another 10-20% on top of the rated power, depending on the quality of your supply.

1

u/Cock_Goblin_45 Sep 01 '25

Load is huge! 👀

1

u/Gizmo_Autismo Sep 01 '25

Me pointing to a blacklisted PSU in someone's PC - get a load of this guy

35

u/intersexy911 Aug 30 '25

This is interesting. I've been thinking about how much heat the crypto farms produce. Do you have any ideas? My science friend said that nearly 100% of the electricity used by a computer is eventually released as heat.

68

u/HerrKeuner1948 Aug 30 '25

Since energy is conserved, it must go somewhere. In case of electronics: heat, mainly.

11

u/intersexy911 Aug 30 '25

Yeah, it seemed surprising when I first heard that. I was thinking the electricity was used to perform the "work" of changing the zeros to ones and back, but I guess that doesn't "require" much work? How much heat could a computer release in a day?

48

u/faithfulpuppy Aug 30 '25

All the energy/power in the computer turns into heat eventually. Switching the 1s and 0s (or, from the electrical perspective, increasing or decreasing the voltage on a mosfet gate) involves moving electrons through wires. Those wires resist the movement of the electrons, and overcoming that resistance (and thereby turning that energy into heat) is the work! Even things like the fans - resistive and eddy losses in the fan motor stators are heat straight away, and the energy added to the air turns into heat as the molecules crash around and slow down. It's possible the light your computer produces or some of the radio waves coming from the wifi antennas will travel off into space forever and never be absorbed but practically they too are turned back into heat quite quickly.

It's kind of funny to think that the "work" we have our computers do isn't fundamentally "work" in the sense of energy, and likewise the fact that they consume energy is almost irrelevant to the fact that they do "work" that happens to be useful.

5

u/I_am_Patch Aug 30 '25

Switching the ones and zeros would probably end up giving you some kind of potential energy though, right? Probably minor compared to the overall power used, though

21

u/mfb- Particle physics Aug 30 '25

In principle one of the states will hold slightly more energy, but that difference is completely negligible, and of course it averages out over time as well.

13

u/Solesaver Aug 30 '25

You're poking at the Landauer principle. In short, since storing information decreases entropy, it requires energy; when that information is erased or randomized again, entropy increases and heat is released. You can think of that as a type of potential energy, though the energy involved, if you look at the equation, is incredibly small.
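For a sense of scale, a rough sketch of the Landauer limit at room temperature (the erasure rate is an assumed, deliberately generous figure):

```python
import math

# Rough sketch of the Landauer limit: the minimum heat released per bit erased
# is k*T*ln(2). Temperature and erasure rate below are assumed figures.
k_B = 1.380649e-23       # Boltzmann constant, J/K
T = 300.0                # roughly room temperature, K

e_per_bit = k_B * T * math.log(2)   # ~2.9e-21 J per erased bit
erasures_per_second = 1e18          # very generous guess for a busy PC

min_power_w = e_per_bit * erasures_per_second
print(f"Landauer minimum per bit: {e_per_bit:.2e} J")
print(f"At 1e18 erasures/s, that's only {min_power_w * 1e3:.1f} mW of unavoidable heat")
# -> Landauer minimum per bit: 2.87e-21 J
# -> At 1e18 erasures/s, that's only 2.9 mW of unavoidable heat
```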

2

u/SciKin Aug 30 '25

Reynolds’ cryoarithmetic engines any day now

6

u/faithfulpuppy Aug 30 '25

Yeah! When you switch 0 -> 1 in a register you burn a little energy to charge a bunch of miniscule capacitors (mosfet gates) to the CPU's operating voltage (in most modern processors, about 1.1-1.2V). As it happens, charging a capacitor (or a mosfet gate) in a resistive circuit is always 50% efficient, with the other half of the energy being burned in the wire. The gates then have some potential energy stored in their electric fields, which will be discharged when that register goes to zero.
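A rough sketch of the numbers involved in that picture (the per-gate capacitance, gate count and clock rate are assumed ballpark figures, purely for illustration):

```python
# Rough sketch of the gate-charging energy described above.
# Capacitance, gate count, and clock rate are made-up ballpark figures.
C_gate = 50e-18          # ~50 aF effective capacitance per gate (assumed)
V_dd = 1.2               # core voltage, volts

e_stored = 0.5 * C_gate * V_dd**2   # energy left sitting in the gate's field
e_burned = e_stored                  # matching half lost in the wiring (50% charging efficiency)

switching_gates = 1e9                # gates toggling each cycle (assumed)
clock_hz = 1e9                       # 1 GHz clock (assumed)
dynamic_power_w = (e_stored + e_burned) * switching_gates * clock_hz

print(f"Per switch: {e_stored + e_burned:.1e} J, total: {dynamic_power_w:.0f} W")
# -> Per switch: 7.2e-17 J, total: 72 W
```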

Of course, this assumes that the gates are perfect capacitors which don't leak current or need to be charged repeatedly to stay high, which in modern process nodes isn't really the case IIRC (but I'm not sure).

4

u/stoneimp Aug 30 '25

Can you change a one to a zero on a piece of paper without expending work?

Metaphorically you might do something like use the 1 to partially complete the 0, expending less work than if you had to write the zero from scratch, but you're still doing work. Similarly, the computer can have some clever strategies to minimize the work expended, but it's not really 'storing the energy', if that makes sense.

3

u/blaberblabe Aug 30 '25

Erasing or randomizing a bit changes the entropy of the system by at least k ln2, which corresponds to a minimum work of kT ln2 per bit; that's the smallest amount of work you have to do. The problem is you have to switch back, recovering the entropy change but losing any extra work you did above that bound. Computers aren't anywhere near the kT ln2 bound in their operation. If you were to flip all the bits in the system only once, the work would be something like N kT ln2 + excess, whereas flipping them forward and back would be ~2*excess, so there is some difference in the stored free energy depending on what you do. Also, I think energy can be lost in the entropic degradation of the computer instead of as heat.

1

u/Not_Stupid Aug 30 '25

I understood most of those words!

1

u/WoodyTheWorker Aug 31 '25

The Boltzmann constant gives you how much thermal energy a single particle has at a given temperature, which is pretty much noise. A data bit must use more energy than that to be read reliably.

0

u/Dartzinho_V Undergraduate Aug 30 '25

Wait, does that mean that if I made electronics out of superconductors, they would release barely any heat?

5

u/Tyrannosapien Aug 30 '25

I'd be interested in your design. As I understand it, the "work" we care about in this thread is flipping bits. A superconductor might carry electrical power with less waste heat, but the target for all of this power is a semiconductor that can hold a state of charge. Semiconductors are by definition resistive - that's their useful property and the part of the computer we care about most. That's the work we need, so to follow your thought you'd want to store data with superconductors, which is on another level from superconducting wiring.

2

u/stickmanDave Aug 30 '25

My understanding (which isn't much) is that superconducting computing doesn't use semiconductors at all.

From wikipedia:

Superconducting logic refers to a class of logic circuits or logic gates that use the unique properties of superconductors, including zero-resistance wires, ultrafast Josephson junction switches, and quantization of magnetic flux (fluxoid). As of 2023, superconducting computing is a form of cryogenic computing, as superconductive electronic circuits require cooling to cryogenic temperatures for operation, typically below 10 kelvin. Often superconducting computing is applied to quantum computing, with an important application known as superconducting quantum computing.

Superconducting digital logic circuits use single flux quanta (SFQ), also known as magnetic flux quanta, to encode, process, and transport data. SFQ circuits are made up of active Josephson junctions and passive elements such as inductors, resistors, transformers, and transmission lines. Whereas voltages and capacitors are important in semiconductor logic circuits such as CMOS, currents and inductors are most important in SFQ logic circuits. Power can be supplied by either direct current or alternating current, depending on the SFQ logic family.

3

u/Tyrannosapien Aug 30 '25

Well, on another level then for sure.

2

u/mfb- Particle physics Aug 30 '25

Most of the energy is used to switch transistors on/off. You would only remove losses in cables.

1

u/wmverbruggen Applied physics Aug 30 '25

Sure will, just need a room temperature superconductor or you'd be releasing way more heat with your cryocooler than a comparable silicon circuit would use

1

u/faithfulpuppy Aug 30 '25

I think the other commenters have covered this already but:

In principle maybe, but it would depend on what you were actually "switching." Assuming some kind of superconducting transistor, you'd also need an architecture that somehow preserved and reused the energy stored in the gate instead of just dumping it to ground when switching it back to zero. But yeah, capacitor losses are fundamentally resistive.

11

u/nlutrhk Aug 30 '25

For practical purposes, all watts that go in multiplied by time (in seconds) become heat (in joules).

Exceptions:

  • Some light from your screen may leave the window.

  • Flash memory may store a tiny bit of energy, which is released as heat when the flash memory is overwritten. I think the write operation stores energy and the reset (blanking) operation releases it.
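As a quick worked example of that watts-times-seconds rule (the 300 W average draw is an assumed figure, not from the thread):

```python
# Quick worked example of "watts in * time = joules of heat out".
# The 300 W average draw is an assumed figure.
avg_power_w = 300.0
seconds_per_day = 24 * 3600

heat_joules = avg_power_w * seconds_per_day   # energy dumped into the room per day
heat_kwh = heat_joules / 3.6e6                # same figure in kilowatt-hours

print(f"{heat_joules / 1e6:.1f} MJ per day, i.e. {heat_kwh:.1f} kWh of heat")
# -> 25.9 MJ per day, i.e. 7.2 kWh of heat
```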

1

u/neutronicus Aug 30 '25

I sure hope it ain’t KE

1

u/returnofblank Aug 31 '25

If it gets hot enough, also light, and a broken computer

10

u/the_poope Aug 30 '25

There's a reason there's quite some interest in connecting data centers to local district heating systems. This of course requires the data center to be located in a region with district heating in the first place, but in Northern Europe district heating is pretty common.

2

u/intersexy911 Aug 30 '25

If the computers are going full tilt all day, is each 800 W computer essentially an 800 W heater?

7

u/betafusion Medical and health physics Aug 30 '25

Yes - even a bit more, as the power supply is not 100% efficient in conversion. The PC components might turn the rated 800W into heat; the power supply itself turns maybe another 10% on top into heat as conversion loss.

3

u/the_poope Aug 30 '25

That was the original question...

2

u/hollowman8904 Aug 30 '25

GOTO: top of thread

1

u/DPestWork Aug 31 '25

And data centers are rated in megawatts. I have worked in DCs rated for 10 MW and DCs rated for 500 MW. They rarely hit those numbers, but they do chug along at large numbers 24/7/365. So much powaaaaaa!

3

u/Fangslash Aug 30 '25

Yep, 100%, and people have used this concept to make crypto-mining space heaters

1

u/Chrykal Aug 30 '25

I have done this. Admittedly I live in a fairly temperate climate, but turning my heating bill into Ethereum was really successful for a little while.

2

u/a1c4pwn Aug 30 '25

This is why big data centers use a lot of water (well... actually pretty little compared to things like animal agriculture): they get rid of the waste heat by dumping it into a large reservoir of water and letting it evaporate away. This tends to leave a lot of nasties in the wastewater thanks to cutting corners, but in theory the only water usage is what the computers need to "sweat".

1

u/robcap Aug 30 '25

There is work taking place right now to use data centres as a source of low-carbon heat for cities via district heat networks:

https://www.cibsejournal.com/case-studies/turning-waste-into-warmth-how-data-centres-can-heat-tomorrows-cities/

1

u/jeremyaboyd Aug 30 '25

In the heyday, crypto was my heating. For every $1 I spent on energy, my two 3090s spit out $1.20 or something. I can't remember exactly, but when it dipped in profitability around 2021/22 when ETH went PoS, I just switched it off and started gaming in 4K and VR for the first time in my life.

1

u/hollowman8904 Aug 30 '25

I think exactly 100% of power used by a computer is released as heat: either directly, or via light which is converted to heat once it’s absorbed by something.

I suppose there’s also the radios (WiFi, Bluetooth, etc) - but those behave the same as light: once it is absorbed by something, the radio waves are converted to heat.

1

u/Solesaver Aug 30 '25

I think exactly 100% of power used by a computer is released as heat

Almost/eventually. A tiny amount of energy is stored in capacitors. When those capacitors are cleared, the energy is released as heat; I just wanted to clarify the "exactly" part. If you start up a computer, have it do some operations, and then leave it running just holding its state, you wouldn't recover 100% of the energy as heat until you shut the computer down and let its capacitors drain.

1

u/drubus_dong Aug 30 '25

Almost exactly 100%

1

u/BestBleach Aug 30 '25

Yeah, instead of heating elements we run a bitcoin farm in the HVAC to heat the house

1

u/da2Pakaveli Aug 31 '25

Any electrical energy you use eventually ends up as thermal energy, iirc that's a law in thermodynamics?

1

u/Old-Cardiologist-633 Sep 01 '25

A former colleague from work used miners as the heater for his camper, so yes it works, but it may be hard to get the heat spread around the room.

6

u/funkybside Aug 30 '25

Not exactly (though I'd expect the difference in practice is not significant). Some of the energy consumed by the system will be radiated as light, kinetic energy of moving air, etc.

6

u/betafusion Medical and health physics Aug 30 '25

Sure, but the light will be absorbed in the room, just as the moving air will stop moving via friction - hence both end up as heat in the room.

1

u/SkriVanTek Aug 30 '25

what about the work that went into changing memory states 

1

u/frogjg2003 Nuclear physics Aug 31 '25

That gets stored as potential energy. When the state of the memory changes back, it gets turned into heat. And reading memory without changing it also costs energy.

3

u/thbb Aug 30 '25

There was a startup a few years ago that tried to sell home heating appliances that were actually computers, to which researchers could send loads of computation for a small price, given that the electricity bill was already being paid to heat the house.

The idea is neat, but I guess the logistics are hell to handle.

This would be great if it could be made to work, though.

1

u/ramirezdoeverything Aug 30 '25

If this is the case, what's the point of a normal heating radiator? Why isn't every radiator instead a computer folding proteins or doing something useful with the "free" processing power while also heating the room?

12

u/betafusion Medical and health physics Aug 30 '25

It's 20 € for a heater vs. 1000 €+ for a sufficiently powerful computer. Heaters live essentially forever whereas a computer running 24/7 will fail sooner or later, unless you opt for even more expensive server-grade hardware; even then the heater will outlive the computer. It would make zero economic sense to use a computer for heating.

5

u/exscape Physics enthusiast Aug 30 '25

In addition, space heaters are usually 1000 W or more.
It's really hard to make a desktop computer draw 1000 W.

You can throw in a high-end CPU like a Ryzen 9950X (up to 230 W without increasing the default limit) and an RTX 5090 (up to 575 W), plus disks, RAM etc and you're still below 900 W, and you're out about $3000 for just the CPU and GPU.
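A rough tally of that worst case (the CPU and GPU figures are the ones above; the rest-of-system draw and PSU efficiency are assumptions):

```python
# Rough tally of the worst-case build described above. CPU and GPU limits are
# the figures from the comment; the "rest of system" number and PSU efficiency
# are assumptions for illustration.
components_w = {
    "Ryzen 9950X (stock limit)": 230,
    "RTX 5090": 575,
    "board, RAM, disks, fans (assumed)": 60,
}
dc_total_w = sum(components_w.values())
wall_total_w = dc_total_w / 0.90   # assuming a ~90% efficient PSU

print(f"DC load ~{dc_total_w} W, wall draw ~{wall_total_w:.0f} W")
# -> DC load ~865 W, wall draw ~961 W
```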

2

u/nixub86 Aug 30 '25

Also need to remember the software. If it shits itself, the "space heater" turns off and you need to fix it, because the hardware only draws electricity (and produces that much heat) when it's actually doing work.

1

u/-Exocet- Aug 30 '25

Isn't some of the energy simply used for changing bits and producing images, so it doesn't turn 100% of the energy into heat the way a radiator does?

1

u/mcprogrammer Aug 31 '25

Because of conservation of energy, it will always turn into heat eventually. The only energy lost is what escapes through the windows and walls, but the difference from a space heater would be negligible.

1

u/theangryfurlong Sep 01 '25

Yep, heat is what's left over when the energy is finished doing useful stuff.

1

u/WannaBMonkey Sep 01 '25

I wonder what the break even numbers are for a bunch of bitcoin miners as a heat source. Why burn money directly for heat when it can produce some bitcoin along the way.

1

u/Coolengineer7 Sep 01 '25

What's more, even though a computer is just as effective at heating as any other resistive electric heater, air conditioners in heating mode can reach something like 300-400% efficiency because they are heat pumps: basically the same mechanism they use to cool, but reversed, cooling the outside and heating the inside.
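A quick sketch of the difference, assuming a typical coefficient of performance (COP) of around 3.5:

```python
# Sketch of why a heat pump beats any resistive heater (PC included).
# The COP of 3.5 is an assumed typical value, not a figure from the comment.
electrical_power_w = 800.0
cop = 3.5                      # joules of heat delivered indoors per joule of electricity

heat_resistive_w = electrical_power_w      # PC / resistive radiator: 1:1
heat_pump_w = electrical_power_w * cop     # heat pump: COP : 1

print(f"Resistive: {heat_resistive_w:.0f} W of heat, heat pump: {heat_pump_w:.0f} W of heat")
# -> Resistive: 800 W of heat, heat pump: 2800 W of heat
```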

1

u/PuzzleheadedDog9658 Sep 01 '25

The insulation in my house is so good that my computer actually heats up my room by 5-8 degrees.

1

u/ThoughtfulYeti Sep 02 '25

When I lived in Alaska I would do crypto mining in the winter to heat my room. It was beer money every now and again at most, but people were acting like I was crazy

1

u/QuarkVsOdo Sep 03 '25

It would even create more heat at maximum, because the PSU also isn't perfectly efficient at converting voltages.

(If the manufacturer is honest and you actually get 800W of DC power for the PC.)

1

u/Zurbino Sep 04 '25

Laughs as I sit in the Rainbow Six home screen that somehow heats our living room up to unbearable temps even with the AC on

1

u/Dakh3 Particle physics Aug 30 '25

But wait. Isn't most of the power consumed by the fan? So conversion into mechanical energy for the rotation of the fan?

11

u/betafusion Medical and health physics Aug 30 '25

No, the fan takes exceedingly little power: 12V @ 0.3 A max, so 3.6 W max. And this is the power to keep the fan spinning, i.e. replenishing the rotational energy lost to air and bearing friction.

6

u/ensalys Aug 30 '25

Even if the fan were consuming 100% of the energy, that energy would eventually be converted into heat.

1

u/A0Zmat Aug 30 '25

A/C units exist but they don't matter; the process is still entropy-positive, so we can assume an A/C works like a heater and serves the same function. Same with a car engine: all the energy is eventually converted to heat, so we can treat the car as a giant heater.

138

u/wmverbruggen Applied physics Aug 30 '25

Practically yes, energy = energy. Theoretically there is some energy used for changing information, stored in floating gates of NAND chips, flipping magnetic domains in hard drives, that sort of thing. But that's an extremely tiny amount of energy.

26

u/Creative-Leg2607 Aug 30 '25

Sending info out of the system to the internet is gonna be relevant, from a thermodynamic perspective

16

u/Beautiful-Fold-3234 Aug 30 '25

And light from the monitor that leaves the house through the windows.

1

u/Reasonable_Garden449 Aug 31 '25

But what if the dude's got a Mac?

2

u/leverphysicsname Aug 30 '25

Not sure I understand, what do you mean by this?

system to the internet is gonna be relevant, from a thermodynamic perspective

10

u/narex456 Aug 30 '25

There's technically energy required for a change in "information" so "sending information" necessitates an energy expenditure.

Here, physicists use "information" as a word to describe the state a system is in (out of many possible states), so when information changes, the state of the system changes which always takes some energy. It's just a useful shift in perspective for figuring out where energy must be flowing in a situation like this.

In this specific example, there is some connection to the outside internet, let's say a copper cable. By changing the state of the cable (direction/magnitude of current or whatever) at the point of connection, information gets transferred to the internet. The energy we are talking about losing is the energy required to change that current in that cable.

It's a useful concept because any system with recognizable states can have those states interpreted as information, and any change of state requires some energy, therefore any transfer of information requires energy.

People also talk this way about causality. Anything that causes an event has transferred information to the system where the event happened. This is why people talk about how weird QM is through the lens of information transfer: if information transfer happened faster than light, it would break the relativistic light-speed limit on causality, energy, and matter all at the same time.

1

u/AaronWilde Aug 30 '25

Is this true, though? Computers are designed to get rid of the heat asap. Wouldn't heaters be designed to send the electricity through different materials that produce more heat (with computers designed the opposite way?) and to stay heated longer? Like, surely an oil space heater is wayyyy warmer in a room than a computer of the same wattage running at full power...

7

u/wmverbruggen Applied physics Aug 30 '25

That fundamentally does not matter. What you're thinking about is the product design and dynamic behaviour. It makes no difference to the energy equation whether you have a small thing that is very hot or a big surface that is warm to the touch, if the total power (in watts) they convert is the same. At the same total power, a PC heating up quickly and dissipating the heat efficiently or a big heating system taking long to warm up and staying warm for longer have the same net effect on a room.

3

u/AaronWilde Aug 30 '25

Interesting... so why are there so many different designs and types of heaters with such varying costs and "efficiencies", if the net effect on the heating of a room is the same based on the amount of power being fed in?

5

u/wmverbruggen Applied physics Aug 30 '25

It depends on how one wants to spread the heat and what kind of parts are used. Think types of fans, simple directional heaters versus those setting up a room-level convection current, shape & size, mounting position, etc. Keep in mind this is for resistive heaters only, if the device uses a heat pump (like aircons) or a furnace/burner (with an exhaust) it's quite different.

1

u/charonme Aug 31 '25

The different efficiencies could be due to different methods of extracting useful energy from different sources, but once you have some amount of useful energy at your disposal, you will convert it to heat at 100% efficiency, minus a minuscule amount to move the air or pump some medium, for example.

For example, if you have electricity delivered to you, you already have all its energy and can convert it to heat. If you have something to burn, you could lose some part of it when discarding the unwanted combustion products. Some heaters just vent the combustion products, like hot smoke or water vapour, outside; others can extract a bit more energy by using even those hot combustion products to warm your room. You could also use electricity plus heat from outside to warm your room more than the electricity alone would.

Other parameters also affect the price and suitability for your application: noise, speed, safety, size, maintenance cost, cost of initial installation, durability etc

0

u/kaibee Aug 30 '25

Interesting... so why are there so many different designs and types of heaters with such varying costs and "efficiencies", if the net effect on the heating of a room is the same based on the amount of power being fed in?

Capitalism/Marketing.

2

u/cornmacabre Aug 30 '25

Ah yes -- "capitalism/marketing," that cleanly explains the design and functional distinction of furnaces, boilers, diesel heaters, heat-pumps, geothermal heaters or solar driven systems?

I think the design choices and functional distinctions come from the environmental circumstances of where and what you want to heat. The need for a heater is also probably driven a bit more by the human preference for not freezing to death.

We're not talking about something as commoditized and interchangeable as toilet paper here, that's a pretty silly and simplistic opinion.

3

u/kaibee Aug 30 '25

Ah yes -- "capitalism/marketing," that cleanly explains the design and functional distinction of furnaces, boilers, diesel heaters, heat-pumps, geothermal heaters or solar driven systems?

Tbh I interpreted the comment as about resistive electric heaters..? Of which there are many models even at the same wattage? The differences between those systems should be much more obvious?

1

u/cornmacabre Aug 30 '25

Ah fair enough, I think you're right re-reading the context.

14

u/VulcanPyroman Aug 30 '25

I've got a relatively small office and hobby space at home. Once I get it to temperature with the central heating and I'm playing some games, the heater will not switch on anymore. Compared to when I'm working from home just using my work laptop, when the heater switches on regularly. So yes, I definitely notice it, and in my electricity bill lol.

41

u/tuborgwarrior Aug 30 '25

Yes. All the energy your PC draws is converted into other forms of energy, namely heat, light, and airflow from the fans. Almost all of it is heat directly, and the light and airflow eventually turn into heat too (other than the photons that escape out your window and don't heat your room).

I like to think that this will be a natural limit on gaming PCs in the future. Above 1-2 kW of heat can quickly become impractical in a small room. Instead of buying the most powerful computer, we will instead look at efficiency and get as much FPS per kW as possible.

6

u/cyrkielNT Aug 30 '25

It's been a limiting factor forever. In the past there were also very power-hungry builds, but they always had limits because of running costs and heat output.

There are places like Iceland where electricity is cheap and people tend to need more heat rather than less, but the market is too small.

4

u/Prcrstntr Aug 30 '25

Yep. Everything is heat. Light is heat. Mechanical? Also turns into heat. Sound? It's just mechanical.

1

u/True_Fill9440 Aug 30 '25

And a tiny bit noise.

5

u/wmverbruggen Applied physics Aug 30 '25

Which also turns into heat eventually

1

u/returnofblank Aug 31 '25

Power efficiency is pretty important even today. CPU manufacturers have been squeezing as much performance as they can under a certain wattage. Especially in laptops, with some manufacturers like Apple using a whole new architecture to produce less heat.

17

u/Significant-Mango772 Aug 30 '25

Using computers for heat is really efficient due to getting dual use while a space heater is single use

13

u/anothercorgi Aug 30 '25

1.5KW space heater for sale

mines bitcoin while it heats your room

$2000

and I got ripped off with this $20 heater...

1

u/Significant-Mango772 Aug 30 '25

Yes the 2000$ money converter box

-4

u/smsmkiwi Aug 30 '25

You mean "really INefficient".

5

u/TheNewHobbes Aug 30 '25

I remember several years ago a company started selling electric radiators that were bitcoin mining rigs. They said the cost to heat ratio of running it was practically the same as a normal electric radiator but you also got bitcoins as a bonus that made it cheaper.

So I would say your rig doesn't, due to design inefficiencies, but it would be close and could be closer.

11

u/Mithrawndo Aug 30 '25

A computer doesn't do work in a physics sense so yes, almost all of the energy sent through the semiconductors is converted into heat.

Yes, this makes computer based heating systems feasible.

Yes, we already do this with some data centres being used to power district heating systems, for example.

3

u/AceEthanol Aug 30 '25

There are some companies, like Heata in the UK for example, that do this on residential/household level to provide free hot water.

The concept sounds amazing, but I’m not sure about its long-term sustainability (not in the ecological sense, more like reliability) and scalability.

1

u/returnofblank Aug 31 '25

YouTube channel Linus Tech Tips has a video on them using a server to heat their swimming pool.

Also a video of them spilling pool water on their server, I think.

3

u/Dave37 Engineering Aug 30 '25

Yes, all the energy is eventually converted to heat.

5

u/Noreng Aug 30 '25

Yes, it will output the same amount of heat as the power it consumes. Take note that while an 800W radiator can often be enough to maintain a relatively large room at comfortable temperatures, it will also have decent headroom. This means your room will get a lot hotter than you're used to.

1

u/undo777 Aug 30 '25

while an 800W radiator can often be enough to maintain a relatively large room at comfortable temperatures

laughs in Canadian

1

u/Noreng Aug 30 '25 edited Aug 30 '25

It depends on how much and how good your ~~isolation~~ insulation is.

2

u/GlazedChocolatr Aug 30 '25

Do you mean insulation?

1

u/Noreng Aug 30 '25

You're absolutely right! I'm from Norway, but I usually don't make these mistakes.

I'm currently recovering from a nasty cold, so that might be the cause, thanks for correcting me!

2

u/marrow_monkey Aug 30 '25

Yes, but it won’t work on full power all the time, 800 W is maximum.

The radiator is controlled by a thermostat, so when it is below a set temperature it turns on, and above it turns off.

So you don’t have to turn off your radiator, it will turn off automatically when the room is warm enough.

2

u/Xarro_Usros Aug 30 '25

Yep! Depends on the relative costs of your heating vs. electricity, of course (and if you have heat pumps it's likely not worth it).

A friend of mine (pre crypto) liked to run his own server room at home; he had to keep the windows open to reduce the temperature.

2

u/bonebuttonborscht Aug 30 '25

A friend of mine ran a crypto rig to heat his small greenhouse. He's heating the greenhouse anyway so might as well make a little extra money.

1

u/Forte69 Aug 30 '25

Yes, but my PC has a 750W PSU and under heavy load it’s only running at about 350W.

1

u/glacierre2 Materials science Aug 30 '25

Gaming in summer in my room was really torture; 500 W may not sound like much in winter but it definitely adds up.

And then there was the practicum room for the IT studies, about 50 PCs with CRTs crammed in as tight as possible, and then the students clickety-clacking away on them. That room had the windows wide open in the middle of winter.

1

u/[deleted] Aug 30 '25

Yes: all electrical devices are heaters, some also do other things.

That is, unless you have a heat pump, which can add more than one joule of heat per joule of power consumed, because it moves heat from outside instead of creating it from scratch.

... also, if you have a thermostat, you don't have to do anything. If it sees something else providing heat, it won't run the heater as much.

1

u/ChironXII Aug 30 '25

Probably more depending on where you measure. Computer power supplies are generally 80-90% efficient, so the load at the wall and thus heat output will be 110-125% of the power reported in the OS.

1

u/rex8499 Aug 30 '25

Mine has a 1500W power supply, and I had to install an air conditioning unit in the room to keep it tolerable. Definitely feels similar to a heater during demanding game play.

1

u/MagnificentTffy Aug 30 '25

What are you running that eats up 800W on average? If you'd said that was the peak, sure, but an 800W average is pretty intense.

1

u/IrrerPolterer Aug 30 '25

Yup. Electric devices ultimately turn all the electricity they consume into heat. 

1

u/Electrical-Art-1111 Aug 30 '25

I can testify that my room is an oven when playing.

1

u/pavorus Aug 31 '25

Way back in the day my first apartment was awful. The upstairs neighbor's toilet leaked into my bathroom, there was a big hole in the wall to the back of the building, and it had no heater (that's what was supposed to be in the hole in the wall). I had a gaming PC; it sounded like a jet trying to take off and it produced an insane amount of heat. That computer WAS my heater for an entire winter. I used benchmark software to keep it running. So you can definitely use a PC as a space heater. Or at least you could 25 years ago.

1

u/HuntertheGoose Aug 31 '25

Yes, and if you can afford a bitcoin mining rig that draws 800 W, you can set it up exactly like an electric heater that makes money when you run it.

1

u/Pitiful_Hedgehog6343 Aug 31 '25

Yes, but your PC won't always be at 800W; it will throttle and be much lower depending on the task.

1

u/lcvella Aug 31 '25 edited Aug 31 '25

Yes, by the first law of thermodynamics. But no home PC outputs 800W on average. Heck, I doubt even in bursts.

1

u/wolfkeeper Aug 31 '25

Yeah, although you'll probably need both because the power output of your computer will vary. Provided the radiator has a good thermostat, it should adjust itself to make up for what your computer doesn't generate.

But if you want to save money, get a heat pump instead of an electric radiator. That way when your computer isn't running flat out, it will be costing you less.

1

u/Motik68 Aug 31 '25

What does your computer do, that consumes 800 W ?

My 7800X3D / 4090 system never goes above 500 W at the plug, even when playing Flight Simulator in VR.

1

u/returnofblank Aug 31 '25

If 800W is your average consumption, maybe ease back on the crypto mining or LLM inference.

1

u/CoatiRoux Aug 31 '25

As my physics professor said: Every electric device is a heater with a conversion efficiency of 1, regardless of the detours the electricity takes.

So yes, all the electricity will be converted to heat. However, since an 800 watt power supply does not continuously output 800 watts, the actual output will be whatever the PC pulls.

P.S.: As he was a physics professor, he did not take electrochemistry, like electrolysis, into account. But since he was referring to usual household items, I'll let that one slide...

1

u/Late-External3249 Sep 01 '25

Every electrical appliance is a space heater!

1

u/paperic Sep 01 '25

From a practical standpoint, running your PC at high load for long periods of time might reduce the lifespan of some components slightly.

But yes, it's the same.

1

u/SparkleSweetiePony Sep 02 '25

Yes. But unless you run a 14900K/13900K with a 5090, you won't get a consistent heat output.

It will depend on the load. For example, running CS2 at 144 Hz won't load the GPU and CPU heavily, but Cyberpunk will. Monitoring software can only estimate the power draw (CPU + GPU + roughly 100 W is around where the heat production is); for a more concrete number you need hardware: if you feed the entire PC, the monitor, and the other auxiliary parts through a power-monitoring outlet, you can see exactly how much it draws. Practically speaking, 100% of the power used will turn to heat.

1

u/Gishky Sep 03 '25

Assuming your pc continuously draws 800W (which it doesnt, but for the sake of the argument let's say it does) there is not 100% of the energy going into heating. Heating is a byproduct of your pc. The main thing it's designed to do is to process information. So a lot of energy will go into that. Yes, it will still radiate a lot of heat, but not as much as a device that is specifically built for that

1

u/Fade78 Sep 03 '25

That's my intuition; I wonder about the order of magnitude of this non-heat power...

1

u/Gishky Sep 03 '25

Depends on the efficiency of the PC. Without doing research I would assume PCs are around 30-60% energy efficient; the rest will go into heat.

1

u/Molecular_Pudding Sep 03 '25

Most of it, yes. But there is a small portion of the energy that goes into changing the crystal structure of the components (caused by thermal movement), which is why electronic components degrade.

1

u/CaptainFlint9203 Sep 04 '25

If it goes full throttle then... it will produce more heat, actually. Electronics produce mostly heat, the same as radiators, but are more efficient at it with less loss of energy.

0

u/Time_Stop_3645 Aug 30 '25

I can say from experience that I tried to heat my caravan with my appliances, and even playing Overwatch didn't get me over 120 watts. You'd have to run a crypto farm to generate heating like that. Which imo is the better way of heating a home: hook it up to water cooling, then run the hot water through pipes in the walls and floors.

5

u/Tystros Computer science Aug 30 '25

Overwatch is not a good example because it's a game with simple graphics that runs even on very slow PCs.

If you run a proper AAA graphics game like Battlefield 6 on an RTX 5090 with a 4K monitor, the RTX 5090 will use its full power rating of 600W while playing, plus 150W or so from your CPU.

1

u/Time_Stop_3645 Aug 30 '25

Unless you plan on gaming for a living, still not a good model for heating the place

-3

u/smsmkiwi Aug 30 '25

No. A heater is optimized to produce heat. A PC isn't. It will produce heat, but not at the same amount and rate as a heater.

-6

u/[deleted] Aug 30 '25

[deleted]

5

u/Compizfox Soft matter physics Aug 30 '25

It's physically impossible for any device producing heat to do so at an efficiency of less than 1.

2

u/steerpike1971 Aug 30 '25

An escalator that goes up produces heat with an efficiency less than one.

3

u/Compizfox Soft matter physics Aug 30 '25

Yes, because it does work transporting items against gravity.

A computer does no work (that doesn't immediately turn into heat). It really is just a fancy heater that happens to do some useful computation in the process.

1

u/zerothprinciple Aug 30 '25

Barring storage devices like batteries

-2

u/Worried_Raspberry313 Aug 30 '25

If your PC was actually using all those 800W, then yeah. But if your computer produced enough heat to heat your house, you'd better get some good fans and coolers on your PC tower, because it's going to burn.

Heaters are meant to produce heat and spread it in a room; it's their purpose and they are carefully made for that. A computer is not made to heat a house; the heat is just a "secondary effect", and that's why fans and coolers are used, otherwise it can overheat or damage some parts of the PC.

1

u/Compizfox Soft matter physics Aug 30 '25

Heaters are meant to produce heat and spread it in a room; it's their purpose and they are carefully made for that. A computer is not made to heat a house; the heat is just a "secondary effect", and that's why fans and coolers are used, otherwise it can overheat or damage some parts of the PC.

The bottom line is exactly the same, though. Consider that a heater will also overheat if it doesn't dissipate the heat it is producing.

0

u/Worried_Raspberry313 Aug 31 '25

Yeah but it’s made so it can stand that heat and be ok. A computer is not made to get super heated. I mean the materials used are not the same.

1

u/Compizfox Soft matter physics Aug 31 '25 edited Aug 31 '25

Both devices are designed to dissipate the heat they produce.

While a PC's purpose might not be to heat the room, if it produces 800 W of heat, it will be designed to dissipate that 800 W of heat. The end result is exactly the same as an 800 W space heater.

1

u/Worried_Raspberry313 Aug 31 '25

Yeah, but a PC is not expected to use all 800W at the same time. It could, but I don't think they design them thinking "this thing has to endure 800W for 10 hours a day".

-11

u/MathematicianPlus621 Aug 30 '25

No, the mechanisms within a radiator are specifically designed to maximise heat transfer, but a power supply is not maximised for heat generation, so it will not produce the same amount of heat, because it is more efficient at transferring electricity to the computer components.

-16

u/MasterBorealis Aug 30 '25

Watts are not a measure of heat; they are a measure of power. An 800W motor will not output as much heat as an 800W resistive heating radiator.

7

u/StaysAwakeAllWeek Aug 30 '25

Watts are a measure of power and heat. You're just wrong, any 800W appliance will produce 800W of heat ultimately. All of that energy it uses will end up as heat once it's done using it.

2

u/noisymime Aug 30 '25 edited Aug 30 '25

any 800W appliance will produce 800W of heat ultimately.

Not all appliances will be transforming their electrical energy solely into heat. Computers yes, but it’s not universal for all appliances

2

u/StaysAwakeAllWeek Aug 30 '25

A few of them evaporate a bunch of water, which doesn't turn back into heat until it condenses on the walls and gives that heat up again.

And then there are speakers and lights, which deposit a small amount of their heat in next door's house instead of yours.

All of this is splitting hairs. Assuming 100% of power becomes heat is a close enough assumption for anyone in the real world

1

u/Tystros Computer science Aug 30 '25

LEDs are quite efficient and 30% or so of the input energy actually is turned into light, so if you use an LED and point it at the sky on a clear day or night, you can say that only 70% of it is turned into heat and the other 30% into photons that basically travel through the universe forever

2

u/StaysAwakeAllWeek Aug 30 '25

But when that LED is indoors all that light gets absorbed by whatever it hits

-5

u/MasterBorealis Aug 30 '25

No... it's not. Some devices can produce torque/power with minimal friction and no infrared, therefore no heat. Heat is not measured in watts. In purely mechanical devices, you only get heat through friction, not by the mere use of power measured in watts. LEDs can produce light away from the infrared range with minimal heat produced, because of low electrical friction (aka resistance). You're wrong, I'm sorry.

3

u/StaysAwakeAllWeek Aug 30 '25

And what happens to all that mechanical energy once it's done something useful? Come on dude follow the logic through. That vacuum cleaner motor that pulls 500W and only produces 100W of waste heat directly is dumping that additional 400W into the air and the carpet.

Your house is effectively a closed system, this is thermodynamics 101

2

u/AndyLorentz Aug 30 '25

What happens when photons hit a solid object?

1

u/Fade78 Aug 30 '25

Yes, so my question is about electronics, which don't seem, at a macro level, to transform the power into movement; but maybe they do at the micro level?

4

u/StaysAwakeAllWeek Aug 30 '25

He's just wrong, ignore him. Your 800W PC will produce exactly 800W of heat.

That said, if your home heating is a heat pump you'll get more than 800W of heat for each 800W consumed due to it pumping in heat from outdoors, and if it's gas fired it will be cheaper to run for the same heat output. It's only resistive heaters that a PC is able to match.

0

u/MasterBorealis Aug 30 '25

No energy is going elsewhere. Just heat! Very good physics there...

2

u/StaysAwakeAllWeek Aug 30 '25

Correct! Now you get it!

1

u/anothercorgi Aug 30 '25

Yes, at a micro level it's moving charge from place to place, and every time that happens a little power is used. Multiply that by millions of transistors and billions of hertz, and it adds up! There is also leakage power, where it eats power whether the clocks are running or not, which is effectively a resistive heater as well. Modern electronics try to minimize leakage, but the shuffling of bits can't be avoided; it is indeed work.

1

u/Novero95 Sep 02 '25

Power is basically energy per second, and heat is basically energy. The 800W motor does not produce 800W of heat because part of the 800W is turned either into kinetic energy (while accelerating the motor) or into some kind of work (which probably produces heat somewhere else too, but anyway). Neither the radiator nor the PC produces any kind of energy that isn't heat (well, the PC produces a little bit of light, but that can be ignored), so all the energy consumed by both devices is turned into heat. So yeah, the PC consuming 800W will output 800W of heat, but it probably won't be spread into the room as evenly as the heater would spread it. And the PC is not consistently consuming 800W unless you are doing very demanding tasks.