SETI@home analyses astronomical data for signs of life
In fact, you might be able to do a lot better than double it. Some people have reported a tenfold improvement from one simple tweak: CUDA!
Put simply, CUDA is a way of getting your graphics card to help out with computational tasks. And it turns out that in the hands of CUDA-enabled software the peculiar architecture of a graphics processor absolutely rocks at hardcore number-crunching like SETI@home.
What do I need?
- A compatible Nvidia graphics card: generally speaking, a GeForce 8 series (8xxx) or better.
- The latest drivers for that card.
- The latest BOINC software.
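Before installing anything, it's worth checking the first two requirements from a terminal. This is just a rough sketch: lspci and nvidia-smi may not be available on every system, and nvidia-smi only appears once the proprietary driver is installed.

```shell
# Rough pre-flight check for CUDA crunching (a sketch, not exhaustive).
check_gpu() {
    # Look for an NVIDIA card on the PCI bus, if lspci is available.
    if command -v lspci >/dev/null 2>&1 && lspci | grep -qi nvidia; then
        echo "NVIDIA GPU found:"
        lspci | grep -i nvidia
    else
        echo "No NVIDIA GPU detected (or lspci unavailable)."
    fi

    # The proprietary driver ships nvidia-smi; if it runs, the driver is loaded.
    if command -v nvidia-smi >/dev/null 2>&1; then
        nvidia-smi
    else
        echo "nvidia-smi not found: install the latest NVIDIA driver first."
    fi
}

check_gpu
```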
How do I get started?
Simply update to the latest version of BOINC. If everything is OK, you should see a message about CUDA and your GPU being detected as a coprocessor when BOINC runs its CPU benchmarks (which it does automatically after installing).
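If you're not sure whether the card was picked up, the startup messages are the place to look. A small sketch of what to grep for; note the log excerpt below is a made-up example of the kind of lines to expect, not verbatim BOINC output:

```shell
# Sketch: grepping BOINC's startup messages for GPU detection.
# The sample log is illustrative only, not real BOINC output.
sample_log() {
cat <<'EOF'
Starting BOINC client version 6.6.36 for x86_64-pc-linux-gnu
Processor: 2 GenuineIntel Intel(R) Core(TM)2 Duo
Coprocessor: GeForce 8800 GT (1), 512MB
CUDA devices found
EOF
}

# On a real install, read the actual message log instead, e.g.:
#   boinccmd --get_messages 0
sample_log | grep -iE 'cuda|coprocessor'
```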
There are some problems. BOINC's site claims in one place that 6.4.5 is OK for CUDA, but mine just wouldn't work. Elsewhere on the site they say you need 6.6.36 or better, yet the version in the Ubuntu repos is only 6.4.5. You can get the latest version here: just uninstall any existing copy of BOINC, extract the folder somewhere on your system, and set the run_client file within it to run at startup (System > Preferences > Startup Applications).
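Whether your repo copy is new enough comes down to a version comparison, which is easy to get wrong by eye (6.4.5 vs 6.6.36 compares field by field, not as plain strings). A quick sketch using GNU sort's version ordering; the 6.6.36 minimum is the figure from BOINC's site, and the installed version here is just the repo example from above:

```shell
# Sketch: check whether an installed BOINC version meets the 6.6.36
# minimum that BOINC's site recommends for CUDA.
min="6.6.36"
installed="6.4.5"   # e.g. the version currently in the Ubuntu repos

# sort -V does a proper version-aware comparison; the smaller of the
# two version numbers sorts first.
oldest=$(printf '%s\n%s\n' "$min" "$installed" | sort -V | head -n1)

if [ "$oldest" = "$min" ]; then
    echo "$installed is new enough for CUDA"
else
    echo "$installed is too old; grab 6.6.36 or later"
fi
```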
What sort of performance will I get?
BOINC say between 2 and 10 times the speed. But a picture is worth a thousand words:
Can you tell what day I switched to CUDA?
Is it just for SETI@home?
No, you can also sign up to GPUgrid to do biological simulations, and users with ATI graphics cards will soon be able to crunch for Milkyway@home and Collatz Conjecture, all still using the BOINC distributed computing platform.