Hadoop File System (HDFS) Cloud Connector

The Apache™ Hadoop® project develops open-source software for reliable, scalable, distributed computing.

The Apache Hadoop software library is a framework that allows for the distributed processing of large data sets across clusters of computers using simple programming models. It is designed to scale up from single servers to thousands of machines, each offering local computation and storage. Rather than rely on hardware to deliver high availability, the library itself is designed to detect and handle failures at the application layer, delivering a highly available service on top of a cluster of computers, each of which may be prone to failure.

Installation and Usage

For information about installation and usage, see our documentation at http://mulesoft.github.com/hdfs-connector.
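
The connector's operations ultimately map onto the standard Hadoop FileSystem client API. As a rough sketch of the kind of HDFS call the connector wraps (this is the plain Hadoop client, not the connector's own configuration; the NameNode URI and file path are placeholders you would replace with your own):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import java.net.URI;
import java.nio.charset.StandardCharsets;

public class HdfsWriteSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Placeholder NameNode address; point this at your own cluster.
        FileSystem fs = FileSystem.get(URI.create("hdfs://localhost:8020"), conf);

        // Write a small file to HDFS, overwriting it if it already exists.
        Path path = new Path("/tmp/hello.txt");
        try (FSDataOutputStream out = fs.create(path, true)) {
            out.write("Hello from HDFS".getBytes(StandardCharsets.UTF_8));
        }

        fs.close();
    }
}
```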

Reporting Issues

We use GitHub Issues to track issues with this connector. You can report new issues at https://github.com/mulesoft/hdfs-connector/issues.
