Web 2.0 was supposed to enable users to participate and turn the web into a place where users are at the center. But after a couple of years this may turn out to be a dream, as more and more robots come online and start to demand their place on the web: a place to help their masters, who need to host backlinks, content or some weird project like the Graffiti Network research project at Brown University.
Looking at the logfiles of one Web 2.0 application based on MediaWiki gives plenty to think about. The MediaWiki installation is protected by some extensions to prevent spam, which is why I didn’t notice these issues earlier. Those extensions don’t stop abusers from trying to post their content (between 17 and 21 KB per post) to the Main_Page. And with about 10,000 posting attempts from 3,600 machines every day of the week, it gives an indication of how big the issue has become.
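Numbers like these are easy to pull out of the webserver logfiles with a small script. The sketch below is only an illustration, not the script I actually used: it assumes Apache's "combined" log format and counts POST attempts against Main_Page per client IP; the real installation may log in a different format.

```python
import re
from collections import Counter

# Assumed Apache "combined" log format; adjust the pattern if the
# server logs differently.
LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d+) (?P<size>\S+)'
)

def count_spam_attempts(lines, target="Main_Page"):
    """Count POST attempts against the target page, grouped by client IP."""
    per_ip = Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue  # skip lines that do not match the assumed format
        if m.group("method") == "POST" and target in m.group("path"):
            per_ip[m.group("ip")] += 1
    return per_ip

# Toy sample standing in for a real logfile
sample = [
    '10.0.0.1 - - [01/Jan/2008:12:00:00 +0100] "POST /index.php?title=Main_Page&action=submit HTTP/1.1" 403 512',
    '10.0.0.2 - - [01/Jan/2008:12:00:05 +0100] "GET /index.php?title=Main_Page HTTP/1.1" 200 17234',
    '10.0.0.1 - - [01/Jan/2008:12:00:09 +0100] "POST /index.php?title=Main_Page&action=submit HTTP/1.1" 403 512',
]
attempts = count_spam_attempts(sample)
print(len(attempts), "distinct machines,", sum(attempts.values()), "attempts")
```

Run over a full day of logs, the same two numbers (distinct IPs and total attempts) give the "10,000 postings by 3,600 machines" figures above.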
For now they are unable to post, but the network appears to be big and growing, and some day they will find a way to work around the current antispam extensions. So the time has come to start collecting data about those abuse attempts and match it with spam on weblogs and spammy e-mails. Hopefully there is a relation, but I fear the moment that Web 2.0 becomes really too open for the world to handle.