This article in the New York Times, while driving home a sad and painful situation we academics all share (not being able to provide our students with all the attention they want), also illustrates the problem faced by any incentive system to discourage unwanted email: where is the boundary?
The article discusses the increase we all have experienced in email from students, some of it inappropriate or unreasonably demanding (not all!). Clearly, some of this mail we would rather not receive. But what filter, incentive system, or other mail management mechanism can tell which mail from our students we don't want to receive? The possibilities for Type II errors — blocking student mail we actually do want — are a bit scary.
Payola probe turns from labels to radio
As I mentioned a couple of entries ago, payola (“pay to play”) schemes are still ongoing in the radio and music distribution industry. Sony and Warner settled with NY State for millions; now Attorney General Eliot Spitzer says he has proof that some of the largest radio groups have taken payments from top executives in the recording industry.
Here is a tongue-in-cheek list of flaws in proposed anti-spam technologies / protocols / policies. It was written as a humorous commentary on the immediate flaming most proposals receive, but it is a pretty insightful and potentially useful checklist to run against any serious proposal (from craphound.com):
Continue reading Lots of anti-spam ideas are crap
Dave Crocker of Brandenburg Consulting wrote a message today to Dave Farber’s “Interesting People” mail list in which he made the following observation:
We must continue with efforts to detect and deal with Bad Actors, but there is a separate path that is at least as valuable: We need methods for distinguishing Good Actors. Folks who are deemed “safe”. In effect, we need a Trust Overlay for Internet mail, to permit differential handling of mail from these good actors. In general terms, a trust overlay requires reliable and accurate identification of the actor and a means of assessing their goodness.
In other words, authentication and reputation.
Crocker is talking about screening for “good actors”: some test that distinguishes trusted senders from the rest (this is not necessarily equivalent to identifying “bad actors” because there may be a vast middle that is neither good nor bad). Screening mechanisms are one of the two categories of fundamental mechanisms for dealing with hidden information problems, the hidden information in this case being the sender’s private knowledge of whether she is a good or a bad type.
Continue reading Screening for Good (email) Actors
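To make the screening idea concrete, here is a minimal sketch in Python. Everything in it — the complaint-rate signal, the thresholds, the sender names — is an illustrative assumption of mine, not part of Crocker's proposal. The point it tries to capture is the one above: screening for good actors is not the mirror image of screening for bad actors, because most senders may land in a middle bucket that is neither.

```python
# Toy sketch of a "trust overlay" screening rule: classify senders by an
# observed reputation signal (here, complaint rate) rather than trying to
# catch bad actors directly. All thresholds and data are illustrative.

from dataclasses import dataclass

@dataclass
class SenderRecord:
    sender_id: str        # assumes reliable authentication (e.g., a verified domain)
    messages_sent: int
    complaints: int       # messages that recipients flagged as unwanted

def screen(record: SenderRecord,
           good_max_rate: float = 0.01,
           bad_min_rate: float = 0.20) -> str:
    """Classify a sender as 'good', 'bad', or 'unknown'.

    'good' requires both a low complaint rate and enough history to
    trust the estimate; everyone else who is not clearly bad falls
    into the (possibly vast) middle.
    """
    if record.messages_sent == 0:
        return "unknown"          # no history, no reputation
    rate = record.complaints / record.messages_sent
    if rate <= good_max_rate and record.messages_sent >= 100:
        return "good"             # long, clean history: deliver normally
    if rate >= bad_min_rate:
        return "bad"              # reject or quarantine
    return "unknown"              # apply ordinary filtering

print(screen(SenderRecord("newsletter.example", 5000, 10)))   # → good
print(screen(SenderRecord("newcomer.example", 3, 0)))         # → unknown
print(screen(SenderRecord("blaster.example", 1000, 400)))     # → bad
```

Note how the "unknown" bucket does most of the work: a brand-new sender with a spotless record still isn't "good," because the reputation estimate isn't yet informative — which is exactly why authentication plus accumulated reputation, not a one-shot test, is what a trust overlay needs.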
IgoUgo: Travel Reviews, Vacation Pictures, Travel Deals
IgoUgo is (apparently) a popular travel review site (I found it mentioned in the NYT article linked in my preceding entry).
They offer to pay incentives to people who post reviews: "Go Points" that they will redeem for gift cards, frequent flier miles (natch!), etc., from Amazon, iTunes, and others. Here's a screen shot (in case they change it or take it away):
Hotel Reviews Online: In Bed With Hope, Half-Truths and Hype – New York Times
The NYT discusses an increasing problem with informal review and recommendation sites: insincere or misleading postings. Here, they talk about hotels that either post fake (positive) reviews about themselves, or that offer inducements (discounts, etc.) to customers to post positive reviews, or that bribe web sites and blogs to remove negative reviews.
Continue reading Incentives to misrepresent
We’ve been expecting this for years. Looks like a serious large-scale experiment in email charging is beginning. But first some background…
Continue reading Incentives for large email senders: Yahoo and AOL start charging
ICD is the science of designing systems or institutions that align participants’ (individual) incentives with overall system (social) goals. Incentive-centered design is fundamental for modern information systems because performance of distributed and collaborative systems depends critically on the strategic choices users make when interacting with the system and with each other, yet mismatch between individual interests and system goals is pervasive. Careful attention to individual incentives can lead to vast improvements in systems and institutions. This approach necessarily builds equally on the social sciences that address motivated human behavior, cognition and group processes, and on the engineering sciences that address computation and communications system design. We take a broad view of individual motivations for strategic behavior, drawing on economic, psychological, and sociological theories, and combine these with the design and engineering sciences of artificial intelligence, software, operations research and networking.
We apply ICD to
- user-contributed content
- reputation systems
- public goods provision
- recommender systems
- online auction design
- prediction markets
- matching systems
- social computing
ICD in its various forms is gaining interest from many overlapping research communities. Nevertheless, as a coherent field ICD is still quite young, and its potential as a multidisciplinary foundation for research on information system problems has not yet been fully realized.
For the past several years, I’ve been one of the leaders of a group of faculty and students at UM (and beyond) developing “incentive-centered design” (ICD) as a core intellectual field for information science.
I’m going to experiment with keeping a blog to express thoughts I have about ICD, gather links to relevant stories in the websphere, point to research projects, etc. I doubt that I am going to try to attract a lot of readers, or to create a lot of content. Low volume, and I hope high quality, or at least things that will be useful to a small coterie of fellow travelers.
Whoami: Jeff MacKie-Mason (or Jeff Mason in my non-professional persona), jmm.