
first commit

0 parents commit de857d3c745a8cb306ef230df2dda0aa1ccaf41e @garethlatwork garethlatwork committed Oct 9, 2011
Showing with 990 additions and 0 deletions.
  1. +15 −0 Gemfile
  2. +1 −0 Procfile
  3. +17 −0 README.rdoc
  4. +2 −0 config.ru
  5. +148 −0 config.yml
  6. +473 −0 file_mover.rb
  7. +23 −0 file_mover_job.rb
  8. +76 −0 file_mover_job_launcher.rb
  9. +84 −0 quickbase-s3.rb
  10. +10 −0 run_file_mover.rb
  11. +27 −0 run_file_mover_background_job.rb
  12. +114 −0 s3client.rb
15 Gemfile
@@ -0,0 +1,15 @@
+source "http://rubygems.org"
+gem "thin"
+gem "sinatra"
+gem "quickbase_client"
+gem "rack-flash"
+gem "httpclient"
+gem "gmail_sender"
+gem "simple_worker"
+gem "activesupport"
+gem "i18n"
+gem "right_aws"
+gem "rack-ssl-enforcer"
+gem "tzinfo"
+
+
1 Procfile
@@ -0,0 +1 @@
+web: bundle exec ruby quickbase-s3.rb -p $PORT
17 README.rdoc
@@ -0,0 +1,17 @@
+= quickbase-s3
+
+Copy or move QuickBase File Attachments to Amazon S3.
+
+An instance of the app is running at https://quickbase-s3.heroku.com .
+
+= License
+
+The MIT License
+
+Copyright (c) 2011
+
+Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
2 config.ru
@@ -0,0 +1,2 @@
+require_relative "quickbase-s3"
+run Sinatra::Application
148 config.yml
@@ -0,0 +1,148 @@
+
+# copy_files: Set to false to move (not copy) files from QuickBase to Amazon S3. Default is true.
+copy_files: true
+
+# username: The QuickBase user name to use for downloading files from QuickBase and updating the related records. REQUIRED.
+username: username@email.com
+
+# password: The QuickBase password to use for downloading files from QuickBase and updating the related records. REQUIRED.
+password: password
+
+# realm: If your QuickBase applications are on (e.g.) https://my_company.quickbase.com, change www to my_company. Default is www.
+realm: www
+
+#apptoken: A QuickBase Application Token for more secure access to QuickBase
+#apptoken: sdfsdfsdfsadsdfaadasdfafd
+
+# quickbase_table_?: Use up to 100 QuickBase tables or applications to control which tables to process. REQUIRED.
+# Important: Only the tables with a report matching the 'file_list_report_name' entry below will be included.
+# If quickbase_table_1 is ALL TABLES, all accessible tables will be processed.
+# If the entry appears to be the ID of an Application, all of the Application's child tables will be processed.
+# If the entry appears to be the ID of a Table, the Table will be processed.
+# If the entry is the name of an Application, all of the Application's child tables will be processed.
+quickbase_table_1: ALL TABLES
+
+# file_list_report_name: The name of the QuickBase report to use in every table, to control which File Attachment records and fields should be processed.
+# Default is z-Files->S3.
+file_list_report_name: z-Files->S3
+
+# aws_access_key: Your Amazon Web Services access key. REQUIRED. No default value.
+aws_access_key: 0XXMMP8XMMMM2
+
+# aws_secret_key: Your Amazon Web Services secret key. REQUIRED. No default value.
+aws_secret_key: ErdDadfasdf0EQReXXXXZZZZbEYGbTHQohVSP
+
+# bucket_name: The name of the top-level Amazon S3 folder where uploaded files will be stored. Default is quickbase_files.
+bucket_name: quickbase_files
+
+# download_url: The base URL for downloading files from S3. Default is https://quickbase-s3.heroku.com/download_file?key=.
+download_url: https://quickbase-s3.heroku.com/download_file?key=
+
+# s3_url_field_name_addition: The text to append to your QuickBase File Attachment field names; used to create the related S3 download URL field.
+# Default is ' - s3 url'.
+s3_url_field_name_addition: ' - s3 url'
+
+# s3_file_field_name_addition: The text to append to your QuickBase File Attachment field names; used to create the related S3 file name field.
+# Default is ' - s3 file'.
+s3_file_field_name_addition: ' - s3 file'
+
+#s3_key_format: The format of the key for files uploaded to Amazon S3. Each '/' creates a 'sub-folder' in Amazon S3.
+s3_key_format: "%{realm} table: %{table_name} (id: %{table_id})/record: %{record_id}, field: %{field_name}, (id: %{field_id}), time: %{time}/%{file_name}"
+
+# tmp_dir_name: The name of the local (temporary) folder to use while copying files from QuickBase to Amazon S3. Default is ./tmp.
+tmp_dir_name: ./tmp
+
+# remove_temp_files: Set to true to remove the local (temporary) folder used while copying files from QuickBase to Amazon S3. Default is false.
+remove_temp_files: false
+
+# no_log_file: Set to true if you do NOT want to create a local log file. Default is false.
+no_log_file: false
+
+# logfile: The name of the local file to use for logging errors and information. Default is file_mover.log, in the same location as this file.
+logfile: file_mover.log
+
+# logfile_age: How often to back up the local log file. Must be one of daily, weekly or monthly. Default is daily.
+logfile_age: daily
+
+# log_errors_to_quickbase: Set to true to add a QuickBase record for each error log entry. Default is false.
+log_errors_to_quickbase: false
+
+# log_info_to_quickbase: Set to true to add a QuickBase record for each informational (not error) log entry. Default is false.
+log_info_to_quickbase: false
+
+# quickbase_log_entry_table_id: The QuickBase table ID to use for logging information to QuickBase. Not required.
+quickbase_log_entry_table_id: bhi7xxxzz
+
+# quickbase_log_entry_field_name: The name of the QuickBase field to use for each log entry written to QuickBase. Default is Log Entry.
+quickbase_log_entry_field_name: Log Entry
+
+# upload_log_file_to_quickbase: Set to true to upload the log file to QuickBase after this utility has finished running. Default is false.
+upload_log_file_to_quickbase: false
+
+# quickbase_log_file_table_id: The ID of the QuickBase table to use for storing log files.
+# Required if upload_log_file_to_quickbase is true.
+quickbase_log_file_table_id: bhi7xxxzz
+
+# quickbase_log_file_field_name: The name of the QuickBase File Attachment field used to store log files. Default is Log File.
+quickbase_log_file_field_name: Log File
+
+# email_errors_via_gmail: Set to true to email each error log separately, via gmail. Default is false.
+email_errors_via_gmail: false
+
+# log_info_via_gmail: Set to true to email each informational (not error) log separately, via gmail. Default is false.
+log_info_via_gmail: false
+
+# email_log_file: Set to true to email the log file contents after this utility has finished running. Default is false.
+email_log_file: false
+
+# gmail_username: The gmail email address to use for emailing log entries.
+# Required if any one of email_log_file, email_errors_via_gmail, or log_info_via_gmail is true.
+gmail_username: username@gmail.com
+
+# gmail_password: The gmail password for the above email address.
+# Required if any one of email_log_file, email_errors_via_gmail, or log_info_via_gmail is true.
+gmail_password: password
+
+# gmail_subject: The subject line to use when logging information via gmail. Default is QuickBase to S3 File Mover Log.
+gmail_subject: QuickBase to S3 File Mover Log
+
+# email_recipient: The email address to which email log entries should be sent (via gmail).
+# Required if any one of email_log_file, email_errors_via_gmail, or log_info_via_gmail is true.
+email_recipient: recipient@email.com
+
+# debug: Set this to true to print log messages to the screen. Default is false.
+debug: false
+
+# debug_quickbase: Set this to true to print QuickBase API information to the screen. Default is false.
+debug_quickbase: false
+
+# debug_simpleworker: Set this to true to run SimpleWorker locally instead of in the cloud. Default is false.
+debug_simpleworker: false
+
+# sw_access_key: Your SimpleWorker access key (for running this service in the cloud: https://www.simpleworker.com/).
+# Required in order to run the File Mover as a background job.
+sw_access_key: 7cc9e5ccc15881292cxxxxx2481e7c981e7c
+
+# sw_secret_key: Your SimpleWorker secret key (for running this service in the cloud: https://www.simpleworker.com/).
+# Required in order to run the File Mover as a background job.
+sw_secret_key: 8c8afcccc9af4bbbb4465xdse5d5a
+
+#start_time: when to start the background job on SimpleWorker.
+start_time: ASAP
+
+# timezone: your time zone
+#timezone:
+
+#frequency: how frequently to run the background job on SimpleWorker.
+frequency: "N/A - Don't Repeat"
+
+# use_environmment_variables: Set this to true to allow environment variables to override the
+# configuration settings in this file. Default is false.
+use_environmment_variables: false
+
+# site_owner_gmail_address: used for information requests from the web site
+site_owner_gmail_address: username@gmail.com
+
+# site_owner_gmail_password: The gmail password for the above email address.
+site_owner_gmail_password: password
+
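The `%{...}` placeholders in `s3_key_format` are expanded with Ruby's `String#%` operator against a hash of symbol keys. A minimal sketch of how that works, using a shortened version of the default format and made-up values (the real values come from QuickBase at run time):

```ruby
# Sketch of s3_key_format expansion; the values here are hypothetical.
format = "%{realm} table: %{table_name} (id: %{table_id})/record: %{record_id}/%{file_name}"
key = format % {
  realm: "www",
  table_name: "Invoices",
  table_id: "bddnn3uz9",
  record_id: "42",
  file_name: "receipt.pdf"
}
# Each '/' in the resulting key acts as a "sub-folder" separator in S3.
puts key
```

Missing keys raise a `KeyError`, so any custom `s3_key_format` should only use the placeholder names listed in the comment above.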
473 file_mover.rb
@@ -0,0 +1,473 @@
+
+require 'gmail_sender'
+require 'quickbase_client'
+require 'logger'
+require 'right_aws'
+require 'i18n'
+require 'cgi'
+require 'fileutils'
+require_relative 's3client'
+
+REPORT_NAME = "z-Files->S3"
+LOG_FILE = "file_mover.log"
+LOGFILE_AGE = "daily"
+TMP_DIR = "./tmp"
+S3_URL = " - s3 url"
+S3_FILE = " - s3 file"
+DOWNLOAD_URL = "https://quickbase-s3.heroku.com/download_file?key="
+REALM = "www"
+RFID = "3"
+GMAIL_SUBJECT = "QuickBase to S3 File Mover Log"
+QB_LOG_FILE_TABLE = "S3 File Mover Log Files"
+QB_LOG_FILE_FIELD = "Log File"
+QB_LOG_ENTRY_TABLE = "S3 File Mover Log Entries"
+QB_LOG_ENTRY_FIELD = "Log Entry"
+S3_KEY_FORMAT = "%{realm} table: %{table_name} (id: %{table_id})/record: %{record_id}, field: %{field_name}, (id: %{field_id}), time: %{time}/%{file_name}"
+
+class FileMover
+
+ def initialize(configuration)
+ @config = configuration.dup
+ @qbc = QuickBase::Client.init(@config)
+ @qbc.cacheSchemas=true
+ @qbc.apptoken = @config["apptoken"]
+ @qbc.printRequestsAndResponses = debug_quickbase?
+ @qbc.getUserInfo
+ if @qbc.requestSucceeded
+ @s3c = S3Client.new(@config)
+ @s3c.create_bucket
+ @quickbase_log_fields_added = false
+ else
+ log "Connecting to QuickBase: #{@qbc.lastError}", true
+ end
+ end
+
+ def method_missing(sym)
+ m = sym.to_s
+ if m.end_with?("?")
+ m[-1,1]=""
+ ret = config?(m)
+ else
+ ret = @config[m]
+ end
+ ret
+ end
+
+ def copy_or_move_files_to_s3
+ log "Starting copy or move."
+
+ report_name = get_report_name
+ log "Using Report '#{ report_name}' to find files in QuickBase."
+
+ do_copy_files = true
+ do_copy_files = copy_files? if @config["copy_files"]
+ if do_copy_files
+ log "Files will be copied (not moved) from QuickBase to Amazon S3."
+ else
+ log "Files will be moved (not copied) from QuickBase to Amazon S3."
+ end
+
+ realm = @config["realm"] || REALM
+ log "Your QuickBase tables are expected to be under https://#{realm}.quickbase.com."
+
+ download_url = @config["download_url"] || DOWNLOAD_URL
+ log "File download URLs will start with '#{download_url}'."
+
+ s3_url_field_name = s3_url_field_name_addition || S3_URL
+ s3_file_field_name = s3_file_field_name_addition || S3_FILE
+ log "The S3 URL and File field names in your QuickBase tables will end with '#{s3_url_field_name}', '#{s3_file_field_name}'."
+
+ total_files_copied = 0
+ total_files_moved = 0
+
+ get_tables.each{|dbid|
+
+ table_name = @qbc.getTableName(dbid)
+ log "Processing table '#{table_name}' (#{dbid})."
+
+ fields, query, slist = get_report_params(dbid,report_name,table_name)
+ create_s3_fields(dbid,fields,s3_url_field_name,s3_file_field_name)
+ tmp_dir = create_tmp_dir
+
+ files_copied = 0
+ files_moved = 0
+
+ @qbc.iterateRecords(dbid,fields.keys,query,nil,nil,fields.values,slist){|record|
+
+ record_id = nil
+ record.each{|field_name,field_value| record_id = field_value if fields[field_name] == RFID }
+
+ if record_id
+ if record.count > 1
+ record.each{|field_name,field_value|
+ unless fields[field_name] == RFID
+
+ ret, file_contents = @qbc.downloadFile(dbid,record_id,fields[field_name])
+ if ret and file_contents and file_contents.length > 0
+
+ writable_file_name = local_file_name(field_value)
+ File.open("#{tmp_dir}/#{writable_file_name}","wb"){|f|f.write(file_contents);f.flush;f.close}
+ log "File '#{field_value}' downloaded successfully from QuickBase record '#{record_id}', field '#{field_name}' to temporary folder '#{tmp_dir}'."
+ log "QuickBase file download URL is: #{@qbc.downLoadFileURL} "
+
+ s3key = format_s3_key(realm,table_name,dbid,record_id,field_name,fields,field_value)
+ ret = @s3c.upload_file("#{tmp_dir}/#{writable_file_name}",s3key)
+ if ret.is_a?(Hash) and ret["etag"]
+ log "Successfully uploaded file to Amazon S3 key '#{s3key}', from QuickBase record '#{record_id}', field '#{field_name}'."
+ files_copied += 1
+ file_url = "#{download_url}#{s3key}"
+ log "S3 download URL is: #{file_url} "
+ files_moved += update_quickbase_record(dbid,record_id,field_name,field_value,s3_url_field_name,s3_file_field_name,file_url,do_copy_files,fields,files_moved).to_i
+ else
+ log "Uploading file to Amazon S3 from QuickBase record '#{record_id}', field '#{fields[field_name]}': #{ret}.", true
+ log "(File not removed from QuickBase)" unless do_copy_files
+ end
+ else
+ log "Downloading file from QuickBase record '#{record_id}', field '#{fields[field_name]}': #{@qbc.lastError}", true
+ end
+ end
+ }
+ else
+ log "Table '#{table_name}' (#{dbid}) will not be processed: report '#{report_name}' does not contain any File Attachment fields."
+ break
+ end
+ else
+ log "Table '#{table_name}' (#{dbid}) will not be processed: unable to determine the record ID for records."
+ break
+ end
+ }
+ if do_copy_files
+ log "#{files_copied} files copied for table '#{table_name}' (#{dbid})."
+ total_files_copied += files_copied
+ else
+ log "#{files_moved} files moved for table '#{table_name}' (#{dbid})."
+ total_files_moved += files_moved
+ end
+ }
+ remove_temp_files if remove_temp_files?
+ total_files_processed = "(#{total_files_copied} files copied, #{total_files_moved} files moved)"
+ do_copy_files ? log("Copy complete #{total_files_processed}.") : log("Move complete #{total_files_processed}.")
+ email_log_file if email_log_file?
+ upload_log_file_to_quickbase if upload_log_file_to_quickbase?
+ rescue StandardError => exception
+ log exception, true
+ end
+
+ # Returns 1 if the File Attachment was removed (i.e. the file was moved), else 0,
+ # so callers can keep an accurate count of moved files.
+ def update_quickbase_record(dbid,record_id,field_name,field_value,s3_url_field_name,s3_file_field_name,download_url,do_copy_files,fields,files_moved=0)
+   files_removed = 0
+   @qbc.editRecord(dbid, record_id,{"#{field_name}#{s3_url_field_name}" => download_url, "#{field_name}#{s3_file_field_name}" => field_value.dup})
+   if @qbc.requestSucceeded
+     unless do_copy_files
+       @qbc.removeFileAttachment(dbid,record_id,field_name)
+       if @qbc.requestSucceeded
+         log "File Attachment removed from QuickBase record '#{record_id}', field '#{fields[field_name]}'."
+         files_removed = 1
+       else
+         log "Removing File Attachment from QuickBase record '#{record_id}', field '#{fields[field_name]}': #{@qbc.lastError}", true
+       end
+     end
+   else
+     log "Editing QuickBase record '#{record_id}', field '#{fields[field_name]}': #{@qbc.lastError}", true
+   end
+   files_removed
+ end
+
+ def get_report_params(dbid,report_name,table_name)
+ log "Retrieving information for the '#{report_name}' report from table '#{table_name}' (#{dbid})."
+ fields = {}
+ clist = @qbc.getColumnListForReport(nil,report_name)
+ slist = @qbc.getSortListForReport(nil,report_name)
+ query = @qbc.getCriteriaForReport(nil,report_name)
+ field_ids = []
+ field_ids = clist.split(/\./) if clist
+ field_ids << RFID unless field_ids.include?(RFID)
+ field_ids.each{|field_id|
+ field_name = @qbc.lookupFieldNameFromID(field_id)
+ field_type = @qbc.lookupFieldTypeByName(field_name)
+ if field_type == "file" or field_id == RFID
+ fields[field_name] = field_id
+ end
+ }
+ return fields, query, slist
+ end
+
+ def create_s3_fields(dbid,fields,s3_url_field_name,s3_file_field_name)
+ current_field_names = @qbc.getFieldNames(dbid)
+ fields.each{|field_name,field_value|
+ next if field_value == RFID
+ url_field_name = "#{field_name}#{s3_url_field_name}"
+ unless current_field_names.include?(url_field_name)
+ @qbc.addField(dbid,url_field_name,"url")
+ error = @qbc.requestSucceeded ? "" : ": #{@qbc.lastError}"
+ log "Adding URL field '#{url_field_name}' to table '#{dbid}'#{error}.", (@qbc.requestSucceeded == false)
+ end
+ file_field_name = "#{field_name}#{s3_file_field_name}"
+ unless current_field_names.include?(file_field_name)
+ @qbc.addField(dbid,file_field_name,"text")
+ error = @qbc.requestSucceeded ? "" : ": #{@qbc.lastError}"
+ log "Adding Text field '#{file_field_name}' to table '#{dbid}'#{error}.", (@qbc.requestSucceeded == false)
+ end
+ }
+ end
+
+ def get_tables
+ log "Getting list of QuickBase tables to process..."
+ tables = get_table_ids
+ filter_tables(tables)
+ log "#{tables.count} tables will be processed."
+ tables
+ end
+
+ def get_table_ids
+ tables = []
+ (1..100).each{|i|
+ table_entry = "quickbase_table_#{i}"
+ if @config[table_entry]
+ config_value = @config[table_entry]
+ if i == 1 and config_value and config_value == "ALL TABLES"
+ grantedDBs = @qbc.grantedDBs
+ tables = grantedDBs.map{|db|db.dbinfo.dbid} if grantedDBs
+ break
+ elsif QuickBase::Misc.isDbidString?(config_value)
+ @qbc.getSchema(config_value)
+ if @qbc.requestSucceeded
+ child_tables = @qbc.getTableIDs(config_value)
+ if child_tables
+ log "Adding tables from the Application with the id '#{config_value}' (from the \"#{table_entry}\" configuration entry)."
+ tables += child_tables
+ else
+ log "Adding Table '#{config_value}' (from the \"#{table_entry}\" configuration entry)."
+ tables << config_value
+ end
+ else
+ log "'#{config_value}' is not an accessible QuickBase Application or Table ID (from the \"#{table_entry}\" configuration entry)."
+ end
+ else
+ app_dbid = @qbc.findDBByName(config_value)
+ if app_dbid
+ log "Adding tables from the '#{config_value}' (#{app_dbid}) Application (from the \"#{table_entry}\" configuration entry)."
+ tables += @qbc.getTableIDs(app_dbid)
+ else
+ log "'#{config_value}' is not an accessible QuickBase Application (from the \"#{table_entry}\" configuration entry)."
+ end
+ end
+ end
+ }
+ tables
+ end
+
+ def filter_tables(tables)
+ report_name = get_report_name
+ tables.reject!{|table_dbid|
+ report_names = @qbc.getReportNames(table_dbid)
+ exclude = (report_names.include?(report_name) == false)
+ if exclude
+ log "Excluding table '#{table_dbid}' because it does not have a '#{report_name}' report."
+ end
+ exclude
+ }
+ end
+
+ def create_tmp_dir
+ tmp_dir = tmp_dir_name || TMP_DIR
+ unless File.directory?(tmp_dir)
+ Dir.mkdir(tmp_dir)
+ log "Local '#{tmp_dir}' folder created for files downloaded from QuickBase."
+ end
+ tmp_dir
+ end
+
+ def config?(option)
+ @config[option] and (@config[option] == true or @config[option] == "true")
+ end
+
+ def remove_temp_files
+ tmp_dir = tmp_dir_name || TMP_DIR
+ FileUtils.rm_rf(tmp_dir) if File.directory?(tmp_dir)
+ log "Local '#{tmp_dir}' temporary files folder removed."
+ end
+
+ def local_file_name(name)
+ file_name = name.dup.strip
+ file_name.gsub!(/\W/,"_")
+ file_name
+ end
+
+ def get_report_name
+ file_list_report_name || REPORT_NAME
+ end
+
+ def format_s3_key(realm,table_name,dbid,record_id,field_name,fields,field_value)
+ format_string = s3_key_format || S3_KEY_FORMAT
+ format_hash = {
+ realm: realm.gsub(/\W/,"_"),
+ table_name: table_name.gsub(/\W/,"_"),
+ table_id: dbid,
+ record_id: record_id,
+ field_name: field_name.gsub(/\W/,"_"),
+ field_id: fields[field_name],
+ time: Time.now,
+ file_name: field_value
+ }
+ format_string % format_hash
+ end
+
+ def log(msg,error=false)
+ error ? log_error(msg) : log_info(msg)
+ end
+
+ def log_error(error)
+ puts "ERROR: #{error}" if debug?
+ log_error_to_file(error) unless no_log_file?
+ email_msg(error,true) if email_errors_via_gmail?
+ log_to_quickbase(error,true) if log_errors_to_quickbase?
+ end
+
+ def log_info(info)
+ puts "INFO: #{info}" if debug?
+ log_info_to_file(info) unless no_log_file?
+ email_msg(info) if log_info_via_gmail?
+ log_to_quickbase(info) if log_info_to_quickbase?
+ end
+
+ def log_to_quickbase(msg,error=false)
+ add_quickbase_log_fields unless @quickbase_log_fields_added
+ log_entry = error ? "Error: #{msg}" : msg
+ get_qb_logger.addRecord(quickbase_log_entry_table_id,{quickbase_log_entry_field_name || QB_LOG_ENTRY_FIELD => log_entry})
+ rescue StandardError => exception
+ puts_exception(exception,msg)
+ end
+
+ def email_msg(msg,error=false)
+ @gmail_sender ||= GmailSender.new(gmail_username, gmail_password)
+ @gmail_subject ||= gmail_subject || GMAIL_SUBJECT
+ subject = error ? "Error: #{@gmail_subject}" : @gmail_subject
+ @gmail_sender.send({ :to => email_recipient || gmail_username, :subject => subject, :content => msg.to_s })
+ rescue StandardError => exception
+ puts_exception(exception,msg)
+ end
+
+ def email_log_file
+ if @logger
+ @logger.close
+ logfile = logfile() || LOG_FILE
+ if File.exist?(logfile)
+ logfile_contents = IO.read(logfile)
+ if logfile_contents and logfile_contents.length > 0
+ @gmail_sender ||= GmailSender.new(gmail_username, gmail_password)
+ @gmail_subject ||= gmail_subject || GMAIL_SUBJECT
+ @gmail_sender.send({ :to => email_recipient || gmail_username, :subject => @gmail_subject, :content => logfile_contents })
+ end
+ end
+ end
+ rescue StandardError => exception
+ puts_exception(exception)
+ end
+
+ def upload_log_file_to_quickbase
+ if @logger
+ @logger.close
+ logfile = logfile() || LOG_FILE
+ if File.exist?(logfile)
+ add_quickbase_log_fields unless @quickbase_log_fields_added
+ get_qb_logger.uploadFile(quickbase_log_file_table_id,logfile,quickbase_log_file_field_name || QB_LOG_FILE_FIELD)
+ end
+ end
+ rescue StandardError => exception
+ puts_exception(exception)
+ end
+
+ def add_quickbase_log_fields
+ qbl = get_qb_logger
+ if upload_log_file_to_quickbase?
+ dbid = quickbase_log_file_table_id
+ field = quickbase_log_file_field_name || QB_LOG_FILE_FIELD
+ fid = nil
+ if dbid
+ qbl.getSchema(dbid)
+ if qbl.requestSucceeded
+ fid = qbl.lookupFieldIDByName(field,dbid)
+ else
+ dbid = nil
+ end
+ end
+ if dbid
+ qbl.addField(dbid,field,"file") unless fid
+ else
+ dbid, appdbid = qbl.createDatabase(QB_LOG_FILE_TABLE,QB_LOG_FILE_TABLE)
+ if qbl.requestSucceeded
+ qbl.addField(dbid,field,"file")
+ else
+ puts "Error: #{qbl.lastError}"
+ end
+ end
+ @config["quickbase_log_file_table_id"] = dbid
+ @config["quickbase_log_file_field_name"] = field
+ end
+ if log_errors_to_quickbase? or log_info_to_quickbase?
+ dbid = quickbase_log_entry_table_id
+ field = quickbase_log_entry_field_name || QB_LOG_ENTRY_FIELD
+ fid = nil
+ if dbid
+ qbl.getSchema(dbid)
+ if qbl.requestSucceeded
+ fid = qbl.lookupFieldIDByName(field,dbid)
+ else
+ dbid = nil
+ end
+ end
+ if dbid
+ qbl.addField(dbid,field,"text") unless fid
+ else
+ dbid, appdbid = qbl.createDatabase(QB_LOG_ENTRY_TABLE,QB_LOG_ENTRY_TABLE)
+ if qbl.requestSucceeded
+ qbl.addField(dbid,field,"text")
+ else
+ puts "Error: #{qbl.lastError}"
+ end
+ end
+ @config["quickbase_log_entry_table_id"] = dbid
+ @config["quickbase_log_entry_field_name"] = field
+ end
+ @quickbase_log_fields_added = true
+ end
+
+ def log_error_to_file(error)
+ get_file_logger.error(error)
+ rescue StandardError => exception
+ puts_exception(exception,error)
+ end
+
+ def log_info_to_file(info)
+ get_file_logger.info(info)
+ rescue StandardError => exception
+ puts_exception(exception,info)
+ end
+
+ class MyLogFormatter < Logger::Formatter
+ def call(severity, time, progname, msg)
+ "%s, [%s#%d] %5s -- %s: %s\r\n" % [severity[0..0], format_datetime(time), $$, severity, progname, msg2str(msg)]
+ end
+ end
+
+ def get_file_logger
+ unless @logger
+ @logger = Logger.new(logfile || LOG_FILE, logfile_age || LOGFILE_AGE)
+ @logger.formatter = MyLogFormatter.new
+ end
+ @logger
+ end
+
+ def get_qb_logger
+ @qb_logger_client ||= QuickBase::Client.init(@config)
+ @qb_logger_client.cacheSchemas=true
+ @qb_logger_client.printRequestsAndResponses = debug_quickbase?
+ @qb_logger_client
+ end
+
+ def puts_exception(e,msg=nil)
+ puts msg if msg
+ puts e
+ puts e.backtrace
+ end
+
+end
+
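FileMover resolves configuration lookups through `method_missing`: a plain call such as `logfile` returns the raw config value, while a `?`-suffixed call such as `debug?` tests the value for `true`/`"true"`. A stand-alone sketch of that pattern (class and key names here are illustrative, not the full FileMover):

```ruby
# Stripped-down sketch of FileMover's method_missing-based config access.
class ConfigReader
  def initialize(config)
    @config = config
  end

  # "logfile" returns the raw value; "debug?" coerces to a boolean.
  def method_missing(sym)
    m = sym.to_s
    if m.end_with?("?")
      v = @config[m.chomp("?")]
      v == true || v == "true"
    else
      @config[m]
    end
  end
end

reader = ConfigReader.new("logfile" => "file_mover.log", "debug" => "true")
reader.logfile  # => "file_mover.log"
reader.debug?   # => true
reader.verbose? # => false (missing keys read as false)
```

This is why config.yml booleans can be written either as YAML booleans or as the string `"true"`.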
23 file_mover_job.rb
@@ -0,0 +1,23 @@
+
+require 'simple_worker'
+require_relative 'file_mover'
+
+class FileMoverJob < SimpleWorker::Base
+
+ attr_accessor :params
+
+ merge_gem "quickbase_client"
+ merge_gem "right_http_connection"
+ merge_gem "right_aws"
+ merge_gem "i18n"
+ merge_gem "gmail_sender"
+ merge "file_mover"
+ merge "s3client"
+
+ def run
+ fm = FileMover.new(params)
+ fm.copy_or_move_files_to_s3
+ end
+
+end
+
76 file_mover_job_launcher.rb
@@ -0,0 +1,76 @@
+
+require_relative 'file_mover_job'
+require 'tzinfo'
+
+class FileMoverJobLauncher
+
+ def initialize(params,move_files)
+ @params = params.dup
+ @move_files = move_files
+ end
+
+ def config_simple_worker
+ SimpleWorker.configure {|swconfig|
+ swconfig.access_key = @params["sw_access_key"]
+ swconfig.secret_key = @params["sw_secret_key"]
+ }
+ end
+
+ def launch_file_mover_job
+ @params["copy_files"] = @move_files ? "false" : "true"
+ config_simple_worker
+ fmj = FileMoverJob.new
+ fmj.params = @params.dup
+ sw_params = get_sw_params(@params)
+ if @params["debug_simpleworker"] == "true"
+ fmj.run_local
+ elsif sw_params
+ fmj.schedule(sw_params)
+ else
+ fmj.queue
+ end
+ end
+
+ def get_sw_params(params)
+ sw_params = {}
+ if params["start_time"] and params["start_time"] != "ASAP"
+ hour, am_pm = params["start_time"].split(/ /)
+ hour = hour.to_i
+ hour = 0 if hour == 12
+ hour += 12 if am_pm == "PM"
+ if params["timezone"]
+ tz = TZInfo::Timezone.get(params["timezone"])
+ current_time = tz.now
+ current_hour = current_time.hour
+ if hour > current_hour
+ sw_params[:start_at] = current_time + ((hour - current_hour)*60*60)
+ elsif hour < current_hour
+ sw_params[:start_at] = current_time + (((hour+24) - current_hour)*60*60)
+ else
+ sw_params[:start_at] = current_time
+ end
+ end
+ end
+ if params["frequency"] and params["frequency"] != "N/A - Don't Repeat"
+ if params["frequency"] == "Hour"
+ sw_params[:run_every] = (60*60)
+ elsif params["frequency"] == "Day"
+ sw_params[:run_every] = (60*60*24)
+ elsif params["frequency"] == "Week"
+ sw_params[:run_every] = (60*60*24*7)
+ else
+ number, units = params["frequency"].split(/ /)
+ number = number.to_i
+ if units == "minutes"
+ sw_params[:run_every] = (number*60)
+ elsif units == "hours"
+ sw_params[:run_every] = (number*60*60)
+ elsif units == "days"
+ sw_params[:run_every] = (number*60*60*24)
+ end
+ end
+ end
+ sw_params = nil if sw_params.empty?
+ sw_params
+ end
+
+end
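`get_sw_params` converts the human-readable `frequency` setting into a seconds interval for SimpleWorker's `:run_every` option. A reduced, stand-alone sketch of just that conversion (the helper name is made up for illustration):

```ruby
# Reduced sketch of the frequency-to-seconds conversion in get_sw_params.
# Accepts the named presets or "<number> minutes|hours|days" strings.
def frequency_to_seconds(frequency)
  case frequency
  when "Hour" then 60 * 60
  when "Day"  then 60 * 60 * 24
  when "Week" then 60 * 60 * 24 * 7
  else
    number, units = frequency.split(/ /)
    number = number.to_i
    case units
    when "minutes" then number * 60
    when "hours"   then number * 60 * 60
    when "days"    then number * 60 * 60 * 24
    end
  end
end

frequency_to_seconds("Day")        # => 86400
frequency_to_seconds("30 minutes") # => 1800
```

An unrecognized string falls through to `nil`, which mirrors how `get_sw_params` ends up with no `:run_every` entry and the job runs once.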
84 quickbase-s3.rb
@@ -0,0 +1,84 @@
+
+require 'sinatra'
+require 'rack/ssl-enforcer'
+require 'rack/flash'
+require 'yaml'
+require_relative 'file_mover_job_launcher'
+require_relative 's3client'
+
+ENV['RACK_ENV'] ||= 'development'
+
+configure do
+ if settings.environment == :production
+ use Rack::SslEnforcer
+ end
+end
+
+not_found do
+ "<h1>Invalid request</h1>"
+end
+
+get '/' do
+ "<h1>Invalid request</h1>"
+end
+
+get '/download_file' do
+ download_file(params)
+end
+
+post '/download_file' do
+ download_file(params)
+end
+
+post '/copy_files_to_s3' do
+ launch_file_mover_job(params,false)
+end
+
+post '/move_files_to_s3' do
+ launch_file_mover_job(params,true)
+end
+
+private
+
+def download_file(params)
+ ret = "<h2>Error downloading file from S3.</h2>"
+ config = get_config(params)
+ s3c = S3Client.new(config)
+ response = s3c.download_file(config["key"])
+ unless response.is_a?(RightAws::AwsError)
+ key_parts = config["key"].split(/\//)
+ filename = key_parts[-1]
+ attachment(filename)
+ content_type(Rack::Mime::MIME_TYPES[File.extname(filename)] || "text/html" )
+ ret = response[:object]
+ end
+ ret
+ rescue StandardError => exception
+ msg = "Error downloading file: #{exception}"
+ puts msg
+ puts exception.backtrace
+ msg
+end
+
+def launch_file_mover_job(params,move_files)
+ config = get_config(params)
+ fmjl = FileMoverJobLauncher.new(config,move_files)
+ fmjl.launch_file_mover_job
+ "Background job launched successfully."
+ rescue StandardError => exception
+ msg = "Error launching background job: #{exception}"
+ puts msg
+ puts exception.backtrace
+ msg
+end
+
+def get_config(params)
+ params.keys.each{|key| params[key.to_s] = params[key]}
+ params.reject!{|k,v|(v and v.length == 0)}
+ config = YAML.load_file("config.yml")
+ if config["use_environmment_variables"] and config["use_environmment_variables"] == "true"
+ config.each{|k,v| config[k] = ENV[k.to_s] if ENV[k.to_s]}
+ end
+ config.merge!(params)
+ config
+end
10 run_file_mover.rb
@@ -0,0 +1,10 @@
+
+require_relative 'file_mover'
+require 'yaml'
+
+config = YAML.load_file(ARGV[0] || "config.yml")
+if config["use_environmment_variables"] and config["use_environmment_variables"] == "true"
+ config.each{|k,v| config[k] = ENV[k.to_s] if ENV[k.to_s]}
+end
+fm = FileMover.new(config)
+fm.copy_or_move_files_to_s3
27 run_file_mover_background_job.rb
@@ -0,0 +1,27 @@
+
+require_relative 'file_mover_job_launcher'
+require 'yaml'
+
+move_files = false
+config_file = "config.yml"
+
+if ARGV[0]
+ ARGV.each{|arg|
+ if arg == "-move"
+ move_files = true
+ else
+ config_file = arg
+ end
+ }
+end
+
+config = YAML.load_file(config_file)
+if config["use_environmment_variables"] and config["use_environmment_variables"] == "true"
+ config.each{|k,v| config[k] = ENV[k.to_s] if ENV[k.to_s]}
+end
+
+fmjl = FileMoverJobLauncher.new(config,move_files)
+fmjl.launch_file_mover_job
+
+puts "File Mover background job launched successfully."
+
114 s3client.rb
@@ -0,0 +1,114 @@
+
+require 'right_aws'
+require 'yaml'
+
+# Monkey-patch Net::HTTP so right_aws requests use SSL without certificate
+# verification; this avoids CA bundle errors on Heroku at the cost of
+# man-in-the-middle protection.
+class Net::HTTP
+ alias_method :old_initialize, :initialize
+ def initialize(*args)
+ old_initialize(*args)
+ @verify_mode = OpenSSL::SSL::VERIFY_NONE
+ @use_ssl=true
+ end
+end
+
+BUCKET_NAME = "quickbase_files"
+
+class S3Client
+
+ attr_reader :bucket_name
+
+ def initialize(params=nil)
+ if params
+ @config = params.dup
+ elsif File.exist?("config.yml")
+ @config = YAML.load_file("config.yml")
+ end
+ @client = RightAws::S3Interface.new(@config["aws_access_key"],@config["aws_secret_key"])
+ @bucket_name = @config["bucket_name"] || BUCKET_NAME
+ end
+
+ def bucket_exists?
+ exists = false
+ begin
+ exists = @client.list_bucket(@bucket_name.dup)
+ rescue
+ end
+ exists
+ end
+
+ def create_bucket
+ @client.create_bucket(@bucket_name.dup)
+ bucket_exists?
+ end
+
+ def list_buckets
+ puts "link:\n #{@client.list_bucket_link(@bucket_name.dup)}\n\n"
+ @client.incrementally_list_bucket(@bucket_name.dup) { |c| p c }
+ end
+
+ def delete_folder(key)
+ response = nil
+ raise "key is missing" unless key
+ begin
+ response = @client.delete_folder(@bucket_name, key)
+ rescue StandardError => error
+ response = error
+ end
+ response
+ end
+
+ def key_exists?(bucket_instance,string)
+ exists = false
+ begin
+ key = RightAws::S3::Key.create(bucket_instance,string)
+ exists = key.exists?
+ rescue StandardError => error
+ puts error
+ end
+ exists
+ end
+
+ def keys(prefix=nil)
+ response = nil
+ begin
+ response = @client.list_bucket(@bucket_name.dup, { 'prefix' => prefix } )
+ rescue StandardError => error
+ puts error
+ end
+ response
+ end
+
+ def delete_key(key)
+ response = nil
+ raise "key is missing" unless key
+ begin
+ response = @client.delete(@bucket_name.dup, key)
+ rescue StandardError => error
+ response = error
+ end
+ response
+ end
+
+ def upload_file(file_name,key=nil)
+ response = nil
+ begin
+ key ||= file_name
+ response = @client.store_object({:bucket => @bucket_name.dup, :key => key, :data => File.open(file_name,"rb") })
+ rescue StandardError => error
+ response = error
+ end
+ response
+ end
+
+ def download_file(file_name,key=nil)
+ response = nil
+ begin
+ key ||= file_name
+ response = @client.retrieve_object({:bucket => @bucket_name.dup, :key => key })
+ rescue StandardError => error
+ response = error
+ end
+ response
+ end
+
+end
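right_aws's `store_object` accepts an optional `:md5` value that it checks the upload against, so if integrity verification is wanted the digest must be computed per file rather than hardcoded. A sketch of computing it with the standard library (the helper name is illustrative):

```ruby
require 'digest/md5'

# Compute the MD5 hex digest of a file's contents, suitable for
# right_aws store_object's optional :md5 verification parameter.
def file_md5(file_name)
  Digest::MD5.file(file_name).hexdigest
end

# String payloads work the same way:
Digest::MD5.hexdigest("hello") # => "5d41402abc4b2a76b9719d911017c592"
```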
