There are always on and off threads regarding CPU temperature and computer overheating due to BOINC. Here is a curious fact: Seti@Home is the hottest. My notebook CPUs run up to 75C during Seti runs. The lowest I've seen at 100% CPU utilization on both CPUs was another project, which peaked at 60C.
Quote from: michael37 on 15 Sep 2007, 01:11:07 pm
There are always on and off threads regarding CPU temperature and computer overheating due to BOINC. Here is a curious fact: Seti@Home is the hottest. My notebook CPUs run up to 75C during Seti runs. The lowest I've seen at 100% CPU utilization on both CPUs was another project, which peaked at 60C.

I've often wondered if it would be possible to code a sequence which has no pipeline stalls, keeps all execution units fully occupied, uses cache fully, and uses main memory transfers at peak. In a sense that's the ultimate goal of optimization; nothing wasted. I don't think we're anywhere near that yet, but I take higher CPU temperatures as a good sign in general. OTOH, a bad algorithm can use a lot of CPU without being optimal, so I don't think temperature is a sufficient indicator of effective optimization.

Joe
Quote from: Josef W. Segur on 15 Sep 2007, 03:36:54 pm
...OTOH, a bad algorithm can use a lot of CPU without being optimal, so I don't think temperature is a sufficient indicator of effective optimization.

Joe

Along the lines of your comment, Joe, I remember Francois commenting once (and Alex long before him) about the importance of minimizing L1 thrashing. Francois rather threw it in everyone's face a year ago that the then-current code thrashed heavily, and said that would be a primary area he'd focus on and clean up.

To your point, this would contribute to higher CPU temps, right? ...but it would certainly not be optimized performance.

How much validity do you think there is to his L1 thrashing comment today? Is it really that much of an issue/area of opportunity?