code for melting computer

A place to discuss the implementation and style of computer programs.

Moderators: phlip, Moderators General, Prelates

Posts: 509
Joined: Tue Apr 24, 2012 1:10 am UTC

code for melting computer

Postby >-) » Fri Sep 29, 2017 1:36 am UTC

What sort of code could I write in a low level language like C in order to induce maximum energy consumption (and hence melt my computer)?

I remember graphs from my computer architecture class showing that most of the energy expended during computation comes from moving bits around, and that the actual operations in the ALU don't consume that much energy. Based on this, I was thinking that writing some loops to copy random bits between a few arrays which fit in the L3 cache would heat my computer up the most.

I think the L3 cache would work better than copying to DRAM because I've never seen or heard of DRAM needing cooling the way the CPU does, so I figured DRAM doesn't consume much energy.

My question: would this actually result in more energy consumption than, say, running your average computer stress test? I know the small-FFT tests in Prime95 are designed to stress the FPU and the cache, so I wonder how this strategy would compare to that.

Also, does the same logic apply to testing out my GPU? Would it be better there to stress the main memory or the scratchpad memory? I know GPUs don't have much cache to speak of.

Posts: 1021
Joined: Fri Apr 13, 2007 6:54 pm UTC

Re: code for melting computer

Postby hotaru » Fri Sep 29, 2017 4:57 pm UTC

unless the hardware has a serious flaw, the worst you'll be able to do is waste electricity and maybe shut down the system. the CPU and GPU are more likely to throttle and stay within safe temperature ranges, but if you have especially poor cooling (for example, a laptop packed full of dust), you might be able to get something hot enough that it shuts off to prevent damage.

Code: Select all

factorial = product . enumFromTo 1
isPrime n =
  factorial (n - 1) `mod` n == n - 1

Posts: 46
Joined: Thu Jul 17, 2014 8:19 pm UTC
Location: Somewhere in your past light cone

Re: code for melting computer

Postby mosgi » Fri Sep 29, 2017 7:24 pm UTC

Ooh, I know this one!

Modern computers have very good thermal throttling - i.e., when they get hot, they slow way the fuck down to avoid overheating. Old CPUs (10+ years old by now) didn't always have thermal sensors, so you could get them to burn up by running even fairly ordinary programs without a heatsink - here's an example, from back when Intel had thermal sensors but AMD didn't.

These days you can't really cause that to happen, but I remember seeing a StackOverflow post on how to get your CPU to use as much power as possible. The answer is here - this will burn a ton of power in a very localized area of your processor (the vector unit in each core), so expect it to start throttling pretty quickly if you try running that code.
(they pronouns please)
