
In run-of-the-mill post-apocalyptic science fiction fare, it was once commonplace to predict that the turn of the millennium, and the years after it, would become a battlefield between machine and man. Robots would overrun their organic masters, and a general sense of doom and gloom would hang over the survivors attempting to rebuild their once-lofty civilization.

The future hasn’t quite shaken out to be the militaristic, gray-and-blue-filtered narrative that we were promised, but it has offered an interesting look at the challenges of reconciling artificial intelligence with human interests. Nowhere is this more evident than in search engine marketing, and it’s a problem that has become paramount for the engineers of the search engines themselves.

Attacking the Automatic

To get a clearer understanding of the topic, you need only look at Google’s three most recent algorithm updates and what each intended to correct or add to the user’s search experience:

  • Panda, and its Panda 4.1 refresh, focused strongly on reducing the effectiveness of keyword stuffing and on improving user security. Affiliate marketers were hit especially hard by the 4.1 release; Google’s goal is to cut out the middleman and his profits, along with anything else that stands between users and what they want.
  • Pigeon, the most recent of the three, focused on local search results and helped bring reviews and user-generated content into the picture as well. As part of an increasing focus on social media dynamics, Pigeon has done both good and harm in making the experience more “human.”
  • “Phantom,” an unnamed update that arrived just before Penguin 2.0, was as mysterious as its nickname implies. Released in May of 2013, it seemed to target sister sites in particular, or those that cross-link to one another consistently.