Big Data & Cloud Data Sync Tool

Summary

The cloud data sync tool is useful during a migration process for moving partitioned and non-partitioned HDFS data to cloud buckets.

Features and Limitations

The cloud-datasync tool currently supports incremental copying of partitioned data and bulk copying of non-partitioned data from any local Linux/Mac machine to GCP or AWS cloud storage. The following new features are planned for the tool:

  • Use the ‘distcp’ command to copy data efficiently from a local HDFS cluster to any cloud service
  • Automatically generate an Azkaban job flow and schedule the data copy jobs on Azkaban servers. The Azkaban Python package can be used to implement this feature.
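The planned distcp integration could look roughly like the sketch below. The paths, bucket name, and date partition are placeholders (not taken from this repository), and the wrapper only prints the command it would run rather than executing it, so it works as a dry run.

```shell
#!/bin/sh
# Sketch of a distcp wrapper for one date partition of a dataset.
# All paths and the bucket name are illustrative placeholders.

build_distcp_cmd() {
  src_root="$1"      # e.g. hdfs:///data/events
  dest_root="$2"     # e.g. gs://my-bucket/events (or s3a://my-bucket/events)
  partition="$3"     # e.g. 2020-01-01
  # -update makes distcp copy only files that differ from the target,
  # which is what gives the copy its incremental behaviour.
  echo "hadoop distcp -update ${src_root}/${partition} ${dest_root}/${partition}"
}

# Dry run: print the command instead of executing it.
build_distcp_cmd "hdfs:///data/events" "gs://my-bucket/events" "2020-01-01"
```

Running distcp for real would additionally require the GCS or S3A connector on the Hadoop classpath, which is outside the scope of this sketch.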

If you would like to suggest a new feature or report a problem, feel free to open an issue in the GitHub repository.

Directory Layout

.
├── LICENSE
├── README.md
├── doc
│   └── blob
│       └── cloud-data-sync.png
├── modules                                             --> module folder
│   ├── authentication.sh
│   ├── conf                                            --> configurations for data sync tool
│   │   └── aws-conf.properties                         --> sample access and secret keys for AWS 
│   ├── datePatternValidation.sh
│   ├── dateUtilsLinux.sh                               --> date utils for linux machine 
│   ├── dateUtilsMac.sh                                 --> date utils for mac machine 
│   ├── processBulk.sh                                  --> to upload bulk data
│   └── processPartitioned.sh                           --> to upload incrementally the partitioned data
├── runDataSync.sh                                      --> main script that uses the module
├── sample_AWS_command.sh                               --> sample commands to copy the data to AWS S3 bucket 
├── sample_GCP_command.sh                               --> sample commands to copy the data to GCP storage 
└── test                                                --> test scripts for each module 
    ├── authenticationTest.sh
    ├── datePatternValidationTest.sh
    ├── dateUtilsLinuxTest.sh
    └── dateUtilsMacTest.sh
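To illustrate the partition-by-partition copy that processPartitioned.sh is described as performing, the following hedged sketch loops over date partitions and emits one AWS CLI copy command per partition. The dataset path, bucket, and partition names are assumed placeholders, not the tool's actual interface, and the commands are echoed rather than executed.

```shell
#!/bin/sh
# Illustrative incremental copy loop: one command per date partition.
# Paths and bucket are placeholders; commands are printed, not run.

copy_partitions() {
  local_root="$1"    # e.g. /data/events
  bucket_root="$2"   # e.g. s3://my-bucket/events
  shift 2
  for partition in "$@"; do
    # Copying one partition at a time means a rerun can skip
    # partitions that were already uploaded.
    echo "aws s3 cp --recursive ${local_root}/${partition} ${bucket_root}/${partition}"
  done
}

# Dry run over two date partitions:
copy_partitions /data/events s3://my-bucket/events 2020-01-01 2020-01-02
```

For GCP, the same loop could emit `gsutil -m cp -r` commands instead; the sample_AWS_command.sh and sample_GCP_command.sh scripts in the repository show the tool's actual commands.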

License

MIT © Renien
