The official PotBS site has some information for players about sharing game data. What does that mean? Check it out:
At the end of my previous dev log, I alluded to exporting game state via an internal server that Wes wrote called the "Crawler Server." The original intended use of the Crawler Server was to let us write internal web tools to inspect game data, like a character or a landmark, instead of having to be logged in to a cluster and use GM commands in the chat window. However, we soon realized we could use this functionality to export data to the public as well. Specifically, I was interested in exporting the port state (unrest levels, port resources, etc.) and server victory state (the scoreboard displayed in the server victory dialog in game).
You might wonder why we can't just write web tools that read the game database directly. The primary reason is that even though the game data is stored in an MS SQL database that a web app could theoretically query, all of the data is in encoded binary blobs, rather than one column per field (think of an Excel spreadsheet with one player's character information per row and character stats like 'level' and 'xp' in each column). We use blobs because it's faster to move game data in and out of the DB in one large binary chunk. We also have game code that intelligently upgrades the binary blobs as fields are added and removed. For example, we may add a new character stat and thus need to upgrade existing characters in the DB, who were created before this new stat ever existed. There is also the issue of maintaining the integrity of the game data. Web tool development is usually faster and looser, and thus more prone to corrupting data by accident. It would be a bummer if we corrupted character data due to a bug in one of our web tools.
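To make the blob-upgrade idea concrete, here is a minimal sketch in Python. The field names, layout, and version numbers are entirely hypothetical; the real PotBS blob format is not public. The point is just that the blob carries a version, and a reader can fill in defaults for stats that did not exist when the object was saved:

```python
import struct

# Hypothetical versioned character blob (little-endian):
#   version 1: [version:u32][level:u32][xp:u32]
#   version 2: adds a new stat, [bounty:u32], at the end
CURRENT_VERSION = 2

def parse_character_blob(blob: bytes) -> dict:
    """Decode a blob of any known version into a plain dict of stats."""
    version, level, xp = struct.unpack_from("<III", blob, 0)
    char = {"level": level, "xp": xp}
    if version >= 2:
        (char["bounty"],) = struct.unpack_from("<I", blob, 12)
    else:
        # Character predates the 'bounty' stat; supply a default.
        char["bounty"] = 0
    return char

def upgrade_blob(blob: bytes) -> bytes:
    """Re-encode an old blob at the current version, defaults filled in."""
    char = parse_character_blob(blob)
    return struct.pack("<IIII", CURRENT_VERSION,
                       char["level"], char["xp"], char["bounty"])

# A version-1 character round-trips through the upgrade with the new
# stat defaulted, without touching any other field.
old = struct.pack("<III", 1, 17, 42000)
upgraded = upgrade_blob(old)
```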
Enter the "Crawler Server." Its job is to walk through the game database and save its data to the crawler database in an expanded, one-field-per-column spreadsheet style that a web tool can easily read. It does this by "crawling" through the game data on a regular interval (once every few minutes) looking for any game objects that have changed. Each blob has a last-updated timestamp, and the crawler records a last-crawled time; comparing the two tells us whether an object needs to be crawled again. If a blob has changed, we parse the blob and write out the individual fields to the crawler DB that all of our web tools can access.
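The crawl pass described above can be sketched as follows. This is a simplified illustration, not the actual Crawler Server code: the table names, schema, and `parse_blob` decoder are all invented for the example, and SQLite stands in for MS SQL:

```python
import sqlite3
import time

def parse_blob(blob: bytes) -> dict:
    # Placeholder decoder; the real server unpacks a binary blob into
    # individual fields. Here the "blob" is just b"level,xp".
    level, xp = blob.split(b",")
    return {"level": int(level), "xp": int(xp)}

def crawl_once(game_db: sqlite3.Connection, crawler_db: sqlite3.Connection):
    """One crawl pass: expand any game object whose blob changed
    since we last crawled it into per-field columns."""
    now = time.time()
    for obj_id, blob, last_updated in game_db.execute(
            "SELECT id, blob, last_updated FROM game_objects"):
        row = crawler_db.execute(
            "SELECT last_crawled FROM crawl_state WHERE id = ?",
            (obj_id,)).fetchone()
        if row and row[0] >= last_updated:
            continue  # blob unchanged since our last crawl; skip it
        fields = parse_blob(blob)
        # Write the expanded, one-column-per-field row for web tools.
        crawler_db.execute(
            "INSERT OR REPLACE INTO characters (id, level, xp) VALUES (?, ?, ?)",
            (obj_id, fields["level"], fields["xp"]))
        crawler_db.execute(
            "INSERT OR REPLACE INTO crawl_state (id, last_crawled) VALUES (?, ?)",
            (obj_id, now))
    crawler_db.commit()
```

Because the crawler only ever reads the game database and writes to its own, a bug in a web tool (or in the crawler itself) can at worst produce stale or wrong copies, never corrupt live character data.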
There is a lot more to the article, so stop by the PotBS site.