Commits on Apr 20, 2011
  1. @thedatachef
  2. @thedatachef
  3. @thedatachef
  4. @thedatachef: Version bump to 0.0.1
Commits on Apr 19, 2011
  1. @thedatachef
Commits on Apr 18, 2011
  1. @thedatachef
  2. @thedatachef
Commits on Mar 23, 2011
  1. @thedatachef
Commits on Mar 17, 2011
  1. @thedatachef
  2. @thedatachef
Commits on Mar 9, 2011
  1. @thedatachef
Commits on Mar 5, 2011
  1. @thedatachef
Commits on Mar 3, 2011
  1. @thedatachef
Commits on Feb 27, 2011
  1. @thedatachef: handle for a filesystem file object should be an attr_accessor so, when all else fails, one can have access to the file handle
  2. @thedatachef: updated readme
  3. @thedatachef
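The attr_accessor change described in the first commit above might look like the following sketch. The class and method names here are illustrative, not taken from the repo; the point is simply that the raw handle is exposed alongside the wrapper's own API.

```ruby
# Expose the underlying file handle on a filesystem file object via
# attr_accessor, so callers can drop down to the raw IO object when
# the wrapper's API falls short. LocalFile is a hypothetical name.
class LocalFile
  attr_accessor :handle   # escape hatch: the raw IO object is reachable

  def initialize(path, mode = 'r')
    @handle = File.open(path, mode)
  end

  def read
    @handle.read
  end

  def close
    @handle.close
  end
end
```

With the accessor in place, a caller can reach past the wrapper (`f.handle.seek(0)`, `f.handle.fileno`, and so on) when all else fails, which is exactly the rationale the commit message gives.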
Commits on Feb 26, 2011
  1. @thedatachef
  2. @thedatachef
Commits on Feb 25, 2011
  1. @thedatachef
  2. @thedatachef: s3 filesystem in process of being implemented, uses right_aws gem, somewhat kludgy but so far works
Commits on Feb 24, 2011
  1. @thedatachef: copy from local task added
  2. @thedatachef: copy from local task added
Commits on Feb 23, 2011
  1. @thedatachef
Commits on Feb 21, 2011
  1. @thedatachef: removed hadoop merge file
  2. @thedatachef: got rid of idempotency at the script level; not all scripts have filesystem output, and idempotency should be left to the workflow designers themselves. Updated example to reflect this, stripped out pig_classpath and pig_options, but added the ability to set env for any script via the "env" method (way more flexible)
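The "env" method mentioned in the last commit above could be sketched as below. This is a minimal guess at the shape of the API, not the repo's actual implementation: a script accumulates arbitrary environment variables that are prefixed onto its command line at run time, which is what makes it more flexible than hardcoded pig_classpath/pig_options settings.

```ruby
# Hypothetical sketch: any script can set environment variables via an
# "env" method instead of script-type-specific options.
class Script
  def initialize(path)
    @path = path
    @env  = {}
  end

  # Merge additional environment variables for this script's run.
  def env(vars = {})
    @env.merge!(vars)
    @env
  end

  # Build the command line, prefixing the accumulated environment.
  def cmd
    assignments = @env.map { |k, v| "#{k}=#{v}" }.join(' ')
    [assignments, @path].reject(&:empty?).join(' ')
  end
end
```

Under this design a pig script would get its classpath via `script.env('PIG_CLASSPATH' => '/path/to/conf')`, and the same mechanism serves any other script type unchanged.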
Commits on Feb 18, 2011
  1. @thedatachef: local mode should actually work now (at least for pig scripts and r scripts), still broken for wukong scripts since --run=local is broken
Commits on Feb 17, 2011
  1. @thedatachef
Commits on Feb 16, 2011
  1. @thedatachef
  2. @thedatachef: simple script for fetching hadoop logs in the "correct" way; not integrated into workflow yet, still need to work out a sane way to get the job id for a currently running job
  3. @thedatachef
  4. @thedatachef: minor (but really annoying) bug in hadoop filesystem; needs to add a "/" after the path to hadoop conf if it isn't already there
  5. @thedatachef: notes on logging; thinking it might be helpful to have a http filesystem for doing a directory listing...? workflows need a method for getting/setting logdir
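The trailing-slash bug described in commit 4 above comes down to a one-line normalization: make sure the hadoop conf path ends in "/" before file names are appended to it. The helper name here is hypothetical.

```ruby
# Normalize a directory path so string concatenation with a file name
# can't silently produce "conffile" instead of "conf/file".
def with_trailing_slash(path)
  path.end_with?('/') ? path : path + '/'
end
```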
Commits on Feb 15, 2011
  1. @thedatachef: filesystem should be autoloaded when called; when open is called on the hadoop filesystem with a block, it MUST close itself after
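The block-open contract from the commit above is the standard Ruby idiom: when `open` is given a block, the handle must be closed afterwards even if the block raises, which `ensure` guarantees. This sketch uses a local-disk stand-in rather than the repo's actual hadoop filesystem class.

```ruby
# Illustrative filesystem whose open honors the "block form MUST close
# itself after" contract via begin/ensure.
class SimpleFileSystem
  def open(path, mode = 'r')
    handle = File.open(path, mode)
    return handle unless block_given?  # no block: caller owns the handle
    begin
      yield handle
    ensure
      handle.close                     # runs even if the block raises
    end
  end
end
```

The non-block form still hands the open handle back to the caller, so both usage styles stay available.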
Commits on Feb 12, 2011
  1. @thedatachef: updated readme
  2. @thedatachef: made example work with new abstraction, changed script to use new abstraction, added methods for merging hdfs files and copying to local without concatenating them