An HTTP interface to serve files/directories from HDFS. This is a temporary project; I suggest looking at HttpFs for a more robust solution. A couple of reasons why we're using this instead:
- Read-Only Access - It doesn't support any other operations
- File Merging - If you give it a path to a directory, it merges all files within that directory and writes the result to the response
This code is built with the following assumptions. You may get mixed results if you deviate from these versions.
- Hadoop 0.20.2+
To make a jar you can do:
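The build command itself didn't survive in this copy of the README; for a standard Maven project (which the `target` directory mentioned below suggests this is), the usual invocation would be:

```shell
# Build the project and package the jar (standard Maven lifecycle goal;
# assumes Maven is installed and you are in the project root).
mvn package
```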
The jar file is then located under the `target` directory.
## Running an instance
In order to run Tabaqui on another machine, you will probably want to use the dist assembly like so:
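The exact assembly command is missing here; assuming the project configures the Maven assembly plugin in its `pom.xml` (the goal name below is the conventional one, not confirmed by this README), it would look something like:

```shell
# Build the distribution zip via the Maven assembly plugin.
# "assembly:single" is the plugin's standard goal; the project's own
# descriptor determines the zip's contents.
mvn package assembly:single
```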
The zip file, now under the `target` directory, should be deployed to `TABAQUI_HOME` on the remote server.
To run Tabaqui you can use `bin/tabaqui`. Here is an example of using the regular tabaqui script:
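The original example invocation was lost from this copy, and no flags are documented here, so the only sketch that can be given without guessing options is the bare script:

```shell
# Launch Tabaqui in the foreground from the deployment directory.
# Any configuration flags the script accepts are not documented in
# this README, so none are shown.
cd $TABAQUI_HOME
bin/tabaqui
```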
## REST Request Format
- The path is considered to be an HDFS path. All of its file contents will be read and returned in a single response as `text/plain`.
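As a concrete illustration of the request format, a GET against an HDFS directory path might look like this (the hostname, port, and HDFS path are assumptions for the example, not taken from this README):

```shell
# Fetch the merged contents of an HDFS directory as text/plain.
# "localhost:8080" and the path are hypothetical examples.
curl http://localhost:8080/user/logs/2011-01-01/
```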
Here's the list of HTTP response codes that Tabaqui could send back:
- 200 OK - A file/directory was found and written to the response.
All aspects of this software are distributed under the Apache Software License 2.0. See the LICENSE file for the full license text.
- Xavier Stevens (@xstevens)