CompostUs



[Diagrams: CompostUsPath.png, CompostUsRequestPath.png]

What (summary)

Have our Rails framework, Compost, handle basic page views and proxy all complex wiki requests back to Mediawiki.
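A minimal sketch of that request path, assuming a Rails 2-era app; the controller name, backend host, and port are illustrative, not the actual Compost code:

  require 'net/http'

  # Sketch only: relay a request to the Mediawiki backend and echo its response.
  class WikiProxyController < ApplicationController
    MEDIAWIKI_HOST = 'mediawiki'  # assumed internal backend host
    MEDIAWIKI_PORT = 8888         # port mentioned in the punchlist below

    def proxy
      response = Net::HTTP.start(MEDIAWIKI_HOST, MEDIAWIKI_PORT) do |http|
        if request.post?
          http.post(request.request_uri, request.raw_post,
                    'Content-Type' => request.content_type.to_s)
        else
          http.get(request.request_uri)
        end
      end
      render :text => response.body, :status => response.code.to_i
    end
  end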

Why this is important

We are sick and tired of being hobbled by our primary PHP environment. Having every request handled first by Rails gives us a great deal of flexibility in writing new extensions, while keeping Mediawiki around for a continuously shrinking subset of requests makes migration to our new platform feasible.

Punchlist

Deploy

  • DNS for compostus.aboutus.org
  • apache configuration for compostus.aboutus.org
  • cap compostus:deploy from the compostus branch prompts for a TAGNAME and updates /www/aboutus/compostus on squal1, squal2, squal3, and squal-bot1 (see the Capistrano sketch after this list)
  • Adjust apache configuration to serve static content directly
  • populate dev stages with sample data from site
    • just using live data
  • Move master over to www
    • Tweak the apache conf to serve up the mediawiki front end from www
    • Adjust cap deploy for the www branch to deploy our current mediawiki front end
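A rough Capistrano 2 sketch of the tagged deploy task; only the branch name, the TAGNAME prompt, the deploy path, and the target hosts come from the list above, the rest is assumed:

  role :app, 'squal1', 'squal2', 'squal3', 'squal-bot1'

  set :deploy_to, '/www/aboutus/compostus'
  set :branch,    'compostus'

  namespace :compostus do
    desc 'Deploy the compostus branch at a prompted tag'
    task :deploy do
      # Ask which tag to ship; assumes :branch can point at any git ref.
      set :branch, Capistrano::CLI.ui.ask('TAGNAME: ')
      top.deploy.default
    end
  end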

Functionality

  • section editing
  • redirect when loading /
  • au_web_services needs to be added to config/routes.rb (see the routes sketch after this list)
  • signup
  • OpenID
    • explore existing OpenID setup
      • db schema: table user_openid, columns uoi_openid and uoi_user (see the ActiveRecord sketch after this list)
      • OpenID accounts connect to traditional Mediawiki logins through user_id; the email password reminder can be used to enable a traditional login
      • libraries required
      • organization provided by plugin
    • find a Ruby implementation
    • send error list to Janrain
  • Colons are percent-escaped in URL strings, e.g. User%3ABrandon+CS+Sanders
  • Spaces are + rather than _ in URL strings, e.g. User%3ABrandon+CS+Sanders (see the encoding sketch after this list)
  • redirects
    • create redirect test cases
    • When a user asks for User:MyPage and that page has a redirect loop, we show the loop User:MyPage >> User:MyOtherPage >> User:MyPage (so that we can edit any page in the loop; see the resolver sketch after this list)
    • Same for a long chain
      • Long chain of redirects ... Redirect 1 --> Redirect 2 --> Redirect 3 --> Redirect 4 --> Redirect 5 --> Redirect 6 --> NonRedirect
      • Redirect loop ... Redirect I --> Redirect II --> Redirect III --> Redirect I
    • find redirect code in mediawiki
    • find code for what redirects here
  • search should short-circuit to existing pages
  • parser functions that create full-URL links point them at mediawiki:8888 rather than at the correct server
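For the au_web_services routing item above, the config/routes.rb addition might look roughly like this (the URL prefix and action mapping are assumptions):

  ActionController::Routing::Routes.draw do |map|
    # Assumed mapping; the real au_web_services routes may differ.
    map.connect 'au_web_services/:action/:id',
                :controller => 'au_web_services'
  end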
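A hedged ActiveRecord mapping of the OpenID schema noted above; the model and association names are made up, only the table and column names come from the notes:

  class UserOpenid < ActiveRecord::Base
    set_table_name  'user_openid'
    set_primary_key 'uoi_openid'  # assumption: the OpenID URL keys the row

    # uoi_user points at the Mediawiki user row, linking OpenID and
    # traditional logins through user_id as noted above.
    belongs_to :user, :foreign_key => 'uoi_user'
  end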
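The observed encoding (colons percent-escaped, spaces as +) can be reproduced in Ruby with CGI.escape; a sketch of a helper, not necessarily how Mediawiki builds the string:

  require 'cgi'

  # "User:Brandon CS Sanders" => "User%3ABrandon+CS+Sanders"
  def wiki_title_to_param(title)
    CGI.escape(title)  # escapes ':' to %3A and encodes ' ' as '+'
  end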
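A sketch of redirect-chain resolution with loop detection for the redirect cases above; Page#redirect_target is a hypothetical accessor returning the target page or nil:

  # Follow redirects until a non-redirect is reached or a page repeats.
  def resolve_redirects(page)
    chain = [page]
    while (target = chain.last.redirect_target)
      if chain.include?(target)
        chain << target        # repeat the page so the full cycle is visible
        return [chain, :loop]  # e.g. User:MyPage >> User:MyOtherPage >> User:MyPage
      end
      chain << target
    end
    [chain, :ok]               # chain ends at the non-redirect page
  end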

Skin

  • when tabbing out a link, the wiki functions never show up (the ones that magically appear when you hover over a section)
  • styling for signup views
  • Navigation and edit tools all work
    • Missing methods
  • Finish the skin
  • Section editing doesn't work
  • Need to finish up the authentication screens
    • will probably need assistance (Vinh!)

Claims

  • Create a claim retrieval function in compost

Mediawiki API

  • Pass in text, get back parsed html (for claims etc)
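A sketch of one way to get this, assuming api.php with action=parse is available on the backend; the host, port, and response handling are assumptions:

  require 'net/http'
  require 'uri'

  # Post raw wikitext to Mediawiki's api.php and return its response body,
  # which contains the parsed HTML (XML extraction left out of the sketch).
  def parse_wikitext(text)
    Net::HTTP.post_form(URI.parse('http://mediawiki:8888/api.php'),
                        'action' => 'parse',
                        'text'   => text,
                        'format' => 'xml').body
  end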

Bugs

Moved to CompostUsBugs -Stephen Judkins

DoneDone

Steps to get to DoneDone

  • CompostUs/Skin ... Port AboutUs skin to Rails. (Vinh, Stephen)
  • Build shell Mediawiki skin that serializes relevant output and relays it back to Rails. (Stephen)
  • Build well-tested Rails controller that proxies back all GET and POST requests to Mediawiki.
  • CompostUs/Auth Build authentication framework that uses Mediawiki's user DB but relies on Rails sessions.
    • Integrate Ruby OpenID extensions with existing Mediawiki OpenID DB schema.
    • Disable Mediawiki log in pages.
  • Build parser for Mediawiki titles (i.e. "Template:Bob" gets translated to {:title => "Bob", :namespace => 10})
    • This is already done: in the claims branch, see Page.extract_namespace_and_title(title) in page.rb (sketched after this list).
  • Build casespace resolver for page titles
    • This is also done: in the claims branch, see Page.find_by_casespace(title) in page.rb.
  • CompostUs/Caching Build a database-backed parser cache that persistently keeps Mediawiki parser output, minimizing usage of PHP
    • Mediawiki is invoked only if no output exists in the parser cache (see the sketch after this list).
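A guess at the shape of the existing Page.extract_namespace_and_title; this is a sketch, not the claims-branch code, and the namespace map is truncated:

  class Page
    # Partial map of Mediawiki namespace ids; Template is 10.
    NAMESPACES = { 'Talk' => 1, 'User' => 2, 'Template' => 10 }

    # "Template:Bob" => {:namespace => 10, :title => "Bob"}
    def self.extract_namespace_and_title(full_title)
      prefix, rest = full_title.split(':', 2)
      if rest && NAMESPACES[prefix]
        { :namespace => NAMESPACES[prefix], :title => rest }
      else
        { :namespace => 0, :title => full_title }  # main namespace
      end
    end
  end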
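The cache-first rule in the last step might reduce to something like this; ParserCache and render_via_mediawiki are hypothetical names:

  # Only invoke Mediawiki (PHP) when the parser cache has no output.
  def rendered_html_for(page)
    cached = ParserCache.find_by_page_id(page.id)
    return cached.html if cached                 # cache hit: PHP never runs

    html = render_via_mediawiki(page)            # cache miss: call Mediawiki
    ParserCache.create(:page_id => page.id, :html => html)
    html
  end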

Suggestions

  • Suggestion to utilize categories: Can categories be turned into a text cloud, where a tag appears big if it has more pages in it and small if it has fewer? This would discourage folks from using empty categories and/or similar-sounding categories and shift them toward proper categories. --Sa'ad talk|email|chat 04:54, 11 March 2008 (PDT)
This is very interesting, Sa'ad. I like this idea; many of the most popular sites use it for tags now, even for the most active, most linked-to, or currently featured pages. But how will it fit into a wiki without flawing the design too much? --Nick Burrus [ Talk - Contribs ] 13:22, 12 March 2008 (PDT)
Nick, it will remain a wiki because one could still enter categories manually and remove them just like regular text. But once you hit save, an algorithm would check each keyword's popularity and increase or decrease its size (they'd still be called categories, and you could click one and browse it like you normally do). This visually structured way of displaying categories should encourage people to use popular categories (which would stand out more), giving a quick overview of a website's main areas of operation. --Sa'ad talk|email|chat 21:26, 12 March 2008 (PDT)
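An illustrative sketch of the sizing algorithm Sa'ad describes, linearly scaling font size by page count (all names and the pixel range are made up):

  # Map each category name to a font size proportional to its page count.
  def cloud_sizes(category_counts, min_px = 10, max_px = 32)
    lo, hi = category_counts.values.min, category_counts.values.max
    category_counts.inject({}) do |sizes, (name, count)|
      share = hi == lo ? 0.5 : (count - lo).to_f / (hi - lo)
      sizes[name] = (min_px + share * (max_px - min_px)).round
      sizes
    end
  end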

