
It works!


Jason G:

--- Quote from: Yellow_Horror on 24 Feb 2009, 06:36:30 am ---What may be the cause?

--- End quote ---

Not sure, but I've seen some kind of discussion about DLLs, so maybe that's it.  I'm not sure which ones I'm running anymore, but since I updated to v9 recently, I may hit the same issue.  Will see.

Jason

Yellow_Horror:

--- Quote from: Jason G on 24 Feb 2009, 05:32:18 am ---When tasks marked with an older revision run out, it no longer deletes the old app if it is marked with a different plan_class.  That is good, because for my own use it enabled me to manually give my CPU app a new application name in app_info, then modify client_state to allocate chosen tasks to the CPU to work on simultaneously with the CUDA ones.  Works very well for me, but not for the beginner or faint of heart!  :)

--- End quote ---
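For readers wondering what that trick looks like in practice, a minimal app_info.xml along these lines would illustrate the idea. This is a sketch only; the file names, version numbers, and the renamed CPU app name are assumptions, not the exact values Jason used:

--- Code: ---
<app_info>
    <!-- CUDA app, tagged with the cuda plan_class -->
    <app>
        <name>setiathome_enhanced</name>
    </app>
    <file_info>
        <name>setiathome_6.08_windows_intelx86__cuda.exe</name>
        <executable/>
    </file_info>
    <app_version>
        <app_name>setiathome_enhanced</app_name>
        <version_num>608</version_num>
        <plan_class>cuda</plan_class>
        <file_ref>
            <file_name>setiathome_6.08_windows_intelx86__cuda.exe</file_name>
            <main_program/>
        </file_ref>
    </app_version>

    <!-- CPU app under a second, renamed application name;
         no plan_class tag means the default (CPU) plan class -->
    <app>
        <name>setiathome_enhanced_cpu</name>
    </app>
    <file_info>
        <name>MB_6.08_CPU.exe</name>
        <executable/>
    </file_info>
    <app_version>
        <app_name>setiathome_enhanced_cpu</app_name>
        <version_num>608</version_num>
        <file_ref>
            <file_name>MB_6.08_CPU.exe</file_name>
            <main_program/>
        </file_ref>
    </app_version>
</app_info>
--- End code ---

Tasks chosen for the CPU would then need their workunit entries in client_state.xml pointed at the renamed app name by hand, which is the manual, error-prone step Jason warns is not for the faint of heart.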

It isn't an option for me: while I like the idea of sharing my free CPU/GPU cycles with SETI science, I really hate the idea of constantly doing the same with my own free time. I already spent some time playing with CUDA versions and options because it was an interesting innovation, but I'm starting to get tired of it. The SETI CUDA application was not beta-tested enough from the beginning (maybe nVidia pushed the public release out before the New Year for advertising reasons?), it is still very annoying to the user when running, and the BOINC options for managing CUDA work are still (after 2 months!) deficient even in the development versions... Where is the "Run and Forget" spirit of good old distributed computing projects like the original SETI@home?

Jason G:

--- Quote from: Yellow_Horror on 24 Feb 2009, 08:18:24 am ---It isn't an option for me: while I like the idea of sharing my free CPU/GPU cycles with SETI science, I really hate the idea of constantly doing the same with my own free time. I already spent some time playing with CUDA versions and options because it was an interesting innovation, but I'm starting to get tired of it. The SETI CUDA application was not beta-tested enough from the beginning (maybe nVidia pushed the public release out before the New Year for advertising reasons?), it is still very annoying to the user when running, and the BOINC options for managing CUDA work are still (after 2 months!) deficient even in the development versions... Where is the "Run and Forget" spirit of good old distributed computing projects like the original SETI@home?

--- End quote ---

Oh, all agreed! Having to mess with workarounds, settings, and special configs is annoying and error-prone.  Hopefully the gradual movement in the BOINC code will fix the scheduling issues so that anonymous platform works 'out of the box'.  There seem to be downsides to every particular workaround approach at the moment, which I hope will be resolved.  We shall see.

Jason

Jason G:
Any ideas what's stopping this from asking for CUDA work?


--- Quote ---2/24/2009 11:55:25 PM      [wfd] ------- start work fetch state -------
2/24/2009 11:55:25 PM      [wfd] CPU: shortfall 0.00 nidle 0.00 est. delay 376335.23 RS fetchable 100.00 runnable 100.00
2/24/2009 11:55:25 PM   SETI@home   [wfd] CPU: runshare 1.00 debt 0.00 backoff dt 0.00 int 0.00
2/24/2009 11:55:25 PM      [wfd] CUDA: shortfall 371520.00 nidle 1.00 est. delay 0.00 RS fetchable 0.00 runnable 0.00
2/24/2009 11:55:25 PM   SETI@home   [wfd] CUDA: runshare 0.00 debt 0.00 backoff dt 81174.41 int 86400.00
2/24/2009 11:55:25 PM   SETI@home   [wfd] overall_debt 0
2/24/2009 11:55:25 PM      [wfd] ------- end work fetch state -------
2/24/2009 11:55:25 PM      No project chosen for work fetch

--- End quote ---

Both 6.6.9 and 6.6.10 seem to be doing this for me, and I don't quite know what it means.
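For anyone staring at similar logs, the telling fields are "backoff dt" (seconds of backoff remaining) and "int" (the current backoff interval). Here is a throwaway parser, a quick sketch rather than anything from the BOINC source, with the field names taken from the excerpt above, that pulls those values out of a saved message log:

--- Code: ---
#!/usr/bin/env python
# Sketch: scan a BOINC message log on stdin for [wfd] per-resource
# lines and report any work-fetch backoff still in effect.
import re
import sys

# Matches e.g. "... [wfd] CUDA: runshare 0.00 debt 0.00 backoff dt 81174.41 int 86400.00"
LINE = re.compile(
    r"\[wfd\]\s+(?P<res>\w+):.*?backoff dt (?P<dt>[\d.]+) int (?P<iv>[\d.]+)"
)

for line in sys.stdin:
    m = LINE.search(line)
    if not m:
        continue
    dt = float(m.group("dt"))
    if dt > 0:
        # Convert seconds to hours for readability
        print("%s: backed off for another %.1f hours (interval %.1f h)"
              % (m.group("res"), dt / 3600.0, float(m.group("iv")) / 3600.0))
--- End code ---

Fed the snippet above, it flags only the CUDA line: roughly 22.5 hours of backoff remaining out of a 24-hour interval, which would explain why the client is not asking for CUDA work.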

Yellow_Horror:

--- Quote from: Jason G on 24 Feb 2009, 05:32:18 am ---it no longer deletes the old app if it is marked with a different plan_class

--- End quote ---

Are there any clues on how to define the plan_class for the CPU version of the SETI MB app?

P.S. I've switched to the V7 CUDA app to see if the CUDA freezes still persist (to narrow the trouble down to either the new BOINC or the new app).
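For reference, a plain CPU app_version in app_info.xml normally just omits the plan_class element (or leaves it empty); only non-default builds such as CUDA carry one. A minimal sketch, with placeholder file and version names rather than a real setup:

--- Code: ---
<!-- Minimal CPU app_version: no <plan_class> tag means the default
     (CPU) plan class. Names below are placeholders. -->
<app_version>
    <app_name>setiathome_enhanced</app_name>
    <version_num>603</version_num>
    <file_ref>
        <file_name>MB_6.03_CPU.exe</file_name>
        <main_program/>
    </file_ref>
</app_version>
--- End code ---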
