GhostPages

(Stephen Judkins)

What (summary)

All "bot pages"--pages that have not been edited by a human--are served up dynamically based on bot-crawled metadata and a template. Changing the template will affect all bot pages near-instantaneously. When a bot page is edited, it is materialized and becomes a standard wiki page.

Why this is important

  • We can't hand-edit the huge number of bot-generated domain pages out there; serving them all from one template lets us improve every one of them at once.
  • Solves numerous problems related to categories, related domains, and other template-driven page features.

DoneDone

  • Detect whether a given page has only bot edits.
  • When serving a page, if it has only bot edits, serve a ghost page instead.
  • When a user edits a ghost page, materialize it and save it as a run-of-the-mill wiki page (see the materialization sketch after this list).
  • Build up the database of bot metadata from either the existing bot database or Michael's new web service.
    • Determine the best way to store bot metadata (one possible layout is sketched after this list).
  • Build a parser cache that performs well given the long-tail traffic characteristics of our site (a caching sketch follows this list).
    • Pending further analysis of site traffic.
    • DB persistence?
  • Use WWW::Mechanize integration tests to ensure high code quality (a Python equivalent is sketched after this list).
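
A minimal sketch of materialization, reusing the Revision class and is_ghost helper from the earlier example. Whether the last ghost rendering gets seeded into the page history before the human edit is an assumption; the notes above don't specify:

    def materialize_on_edit(revisions: list[Revision], new_text: str,
                            author: str, ghost_html=None) -> None:
        if is_ghost(revisions) and ghost_html is not None:
            # Assumption: seed the history with the last ghost rendering
            # so the first human edit produces a readable diff.
            revisions.append(Revision(author="GhostTemplate", text=ghost_html))
        # From here on this is a run-of-the-mill wiki page: it now has a
        # human revision, so is_ghost() is False and it is served from storage.
        revisions.append(Revision(author=author, text=new_text))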
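
One possible storage layout for the bot metadata, sketched with SQLite and a JSON blob per domain. The table and column names are invented for illustration; the task above ("determine the best way to store bot metadata") is still open:

    import json
    import sqlite3

    def open_metadata_db(path="bot_metadata.db"):
        conn = sqlite3.connect(path)
        conn.execute(
            "CREATE TABLE IF NOT EXISTS bot_metadata ("
            " domain TEXT PRIMARY KEY,"
            " fields TEXT NOT NULL)"  # JSON blob of crawled key/value pairs
        )
        return conn

    def put_metadata(conn, domain, fields):
        conn.execute("INSERT OR REPLACE INTO bot_metadata VALUES (?, ?)",
                     (domain, json.dumps(fields)))
        conn.commit()

    def get_metadata(conn, domain):
        row = conn.execute("SELECT fields FROM bot_metadata WHERE domain = ?",
                           (domain,)).fetchone()
        return json.loads(row[0]) if row else {}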
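
For the parser cache, a sketch of the shape such a cache might take under long-tail traffic: a small in-memory LRU for the head of the distribution, with everything written through to a persistent store (a plain dict below, standing in for the "DB persistence?" option). The class and the capacity are assumptions pending the traffic analysis noted above:

    import collections

    class ParserCache:
        def __init__(self, capacity=1000):
            self.memory = collections.OrderedDict()  # hot pages only
            self.capacity = capacity
            self.persistent = {}  # stand-in for a DB-backed table

        def get(self, title):
            if title in self.memory:
                self.memory.move_to_end(title)  # refresh LRU position
                return self.memory[title]
            return self.persistent.get(title)  # long-tail pages go to the DB

        def put(self, title, html):
            self.persistent[title] = html  # write-through persistence
            self.memory[title] = html
            self.memory.move_to_end(title)
            if len(self.memory) > self.capacity:
                self.memory.popitem(last=False)  # evict least recently used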
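
The integration tests named above use WWW::Mechanize (Perl). For consistency with the other sketches here, the same end-to-end checks in Python with requests; the base URL, the page titles, and the "ghost-page" marker string are placeholders, not real test fixtures:

    import requests

    BASE = "http://localhost/index.php"  # placeholder wiki URL

    def test_ghost_page_is_rendered_from_template():
        resp = requests.get(BASE, params={"title": "Example.com"})
        assert resp.status_code == 200
        assert "ghost-page" in resp.text  # hypothetical template marker

    def test_materialized_page_is_served_from_storage():
        resp = requests.get(BASE, params={"title": "GhostPages"})
        assert resp.status_code == 200
        assert "ghost-page" not in resp.text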

