Volunteer grid computing projects

Most people have heard of SETI@home, the volunteer distributed grid computing project in which computer owners let software run on their machines when they are idle (especially at night) to help search through electromagnetic data from space for communications from extra-terrestrials. But this is only one of many such projects; over a dozen are described in “Volunteer Computer Grids: Beyond SETI@home” by Michael Muchmore, many of them devoted to health applications.
Why do people donate their computer cycles? At first glance, why not? These projects, most of which run BOINC (Berkeley Open Infrastructure for Network Computing), are careful to use only CPU cycles not in demand by the computer owner’s software, so the donated cycles are free, right? Well, sort of: it takes time to download and install the software, there is some risk of infecting one’s machine with a virus, many users may perceive some risk that the CPU demands will infringe on their own use, and so on. Most users will believe there is some amount of cost.
With certain projects, volunteers may get some pleasure or entertainment value out of participating: for example, the search for large Mersenne primes is exciting to those who enjoy number theory; searching for alien intelligence probably provides a thrill to many.
I suspect a related motivation is sufficient for most volunteers: the projects generally have a socially valuable goal, so people can feel like they are helping make the world a better place, at a rather small cost to themselves. For example, there are projects to screen cancer drugs, search for medications for tuberous sclerosis, and help calibrate the Large Hadron Collider (for physics research). As Muchmore writes, “a couple of the projects—Ubero and Gómez—will pay you a pittance for your processing time. But wouldn’t you feel better curing cancer or AIDS?”
These projects appear to attract a lot of volunteerism. Muchmore reports estimates of participation that range from one to over five million computers at any given moment. According to the BOINC project, volunteers are generating about 400 teraflops of processing power, far more than the roughly 280 teraflops that the largest operational supercomputer can provide.

But that’s just the tip of the iceberg…

CNET captures some anecdotes about the rise of splogs (spam blogs) in “Blogosphere suffers spam explosion“. They’re right, of course, though the following was not the most impressive summary:

While technology and legislation may have made spam in e-mail manageable, there is still some way to go when it comes to keeping it out of blogs.

Two common types of splog are comments or trackbacks that point to a commercial site (often for medications or porn), and comments (or fake blogs) filled with links to raise the PageRank (Google’s measure of a site’s link popularity) of target sites.
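These two patterns are concrete enough that a first-pass filter can target them directly. Here is a minimal sketch of such a heuristic; the keyword list, the link threshold, and the function name are my own illustrative assumptions, not a description of any real anti-splog tool:

```python
import re

# Illustrative heuristic filter for the two splog patterns described above:
# link-stuffed comments (PageRank farming) and comments pushing commercial
# keywords. Both lists/thresholds are assumptions chosen for the example.
SPAM_KEYWORDS = {"porn", "viagra", "casino", "pills"}
MAX_LINKS = 3

def looks_like_splog(comment: str) -> bool:
    """Flag a comment that is link-stuffed or matches commercial keywords."""
    links = re.findall(r"https?://\S+", comment)
    if len(links) > MAX_LINKS:
        return True  # more links than any normal comment needs
    words = set(re.findall(r"[a-z]+", comment.lower()))
    return bool(words & SPAM_KEYWORDS)

print(looks_like_splog("Great post, thanks!"))                  # False
print(looks_like_splog("cheap pills at http://spam.example"))   # True
```

A filter this crude is easy to evade (misspellings, URL shorteners), which is part of why screens like CAPTCHAs, discussed below, became attractive.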

Been splogged

Just a quick personal note: this is the least publicized blog on the planet (and no one seems to care enough about it to leave comments!), but I’ve been splogged nonetheless. It happened a few weeks ago, in the midst of the last week of class, so I tucked this away for a better day (the site seems to be gone now, so I’m including the full posting, URL and all):
Sent: Thursday, April 20, 2006 11:29 PM
To: jmm@umich.edu
Subject: [ICD stuff] New TrackBack Ping to Entry 2992 (Principal-agent problem in action)
A new TrackBack ping has been sent to your weblog, on the entry 2992 (Principal-agent problem in action).
IP Address:
Title: pregnant movies
pregnant porn pregnant fuck pregnant milk gallery

CAPTCHAs (2): Technical screens vulnerable to motivated humans

A particularly interesting approach to breaking purely technical screens, like CAPTCHAs, is to provide humans with incentives to end-run the screen. The CAPTCHA is a test that is easy for humans to pass, but costly or impossible for machines to pass. The goal is to keep out polluters who rely on cheap CPU cycles to proliferate their pollution. But polluters can be smart, and in this case the smart move may be “if you can’t beat ’em, join ’em”.

Continue reading CAPTCHAs (2): Technical screens vulnerable to motivated humans

CAPTCHAs (1): Technical screens are vulnerable to technical progress

One of the most wildly successful technical screening mechanisms for blocking pollution in recent years is the CAPTCHA (Completely Automated Public Turing test to tell Computers and Humans Apart). The idea is ingenious, and respects basic incentive-centered design principles necessary for a screen to be successful. However, it suffers from a common flaw: purely technical screens often are not very durable because technology advances. I think it may be important to include human-behavior incentive features in screening mechanisms.
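The challenge-and-verify structure of a CAPTCHA can be sketched in a few lines. This is only a skeleton under my own simplifying assumptions: a real CAPTCHA renders the secret as a distorted image precisely so that OCR software cannot read it, a step merely stubbed out here, and all names are illustrative:

```python
import random
import string

def make_challenge(length: int = 6) -> str:
    """Generate the secret text the user must read back.

    In a real CAPTCHA this string would be rendered as a warped, noisy
    image; that distortion step is what makes the test hard for machines.
    """
    return "".join(random.choice(string.ascii_uppercase) for _ in range(length))

def check_answer(secret: str, answer: str) -> bool:
    """Case-insensitive check of the user's response against the secret."""
    return answer.strip().upper() == secret

secret = make_challenge()
print(check_answer(secret, secret.lower()))  # True: the "human" read it
print(check_answer(secret, "??????"))        # False: a bot's blind guess
```

Note that the screen’s strength lives entirely in the rendering step, not in this verification logic; as the posts below argue, that is exactly where both technical progress and motivated humans attack it.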

Continue reading CAPTCHAs (1): Technical screens are vulnerable to technical progress