(PDB-221) Add facts to import/export #875

Merged
merged 2 commits into puppetlabs:master

3 participants

@senior

This commit imports/exports facts, similar to how we currently import/export catalogs and reports. Anonymization doesn't currently work for facts; that will be added separately.
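For reference, the export writes each node's facts to puppetdb-bak/facts/<node>.json inside the tarball, in the same shape the `replace facts` command consumes. A sketch of the data that facts->tar (in the diff below) serializes for one node; the node name and fact are illustrative, borrowed from the acceptance tests:

    ;; Illustrative: the map facts->tar hands to json/generate-pretty-string;
    ;; the result becomes the contents of puppetdb-bak/facts/myhost.localdomain.json
    {"name"   "myhost.localdomain"
     "values" {"foo" "bar"}}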

@senior (PDB-221) Add facts to import/export
This commit imports/exports facts similar to how we currently import/export
catalogs and reports. Anonymize doesn't currently work for facts, which is
going to be added separately.
4e7ad25
@kbarber (Puppet Labs member)

@senior this all seems pretty good.

Through manual testing, though, I've picked up one gotcha with our current solution.

(cli.export/get-active-node-names) will only return nodes that have recent catalogs. However, some users only use PuppetDB to store facts. In my case, I had thrown a whole bunch of facts at the system, but none of those nodes showed up in the export.

We never saw this for reports, because most people who have report storage also have catalog storage.

I've tried removing the filter, but then it looks like we just don't handle the resulting 404s gracefully. The end solution might be getting three lists, one for each data object, or skipping over missing data (and just swallowing the 404s?). Either way, I think this needs solving. The filter in question is sketched below.
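A sketch of the current behaviour, paraphrased from cli/export.clj (assuming the same requires as that namespace, e.g. clj-http.client as client and the project's json namespace):

    ;; Paraphrased from the current export code: only nodes with a
    ;; catalog_timestamp survive the filter, so facts-only nodes never
    ;; appear in the export at all.
    (defn get-active-node-names
      [host port]
      (let [{:keys [status body]} (client/get
                                   (format "http://%s:%s/v3/nodes" host port)
                                   {:accept :json})]
        (when (= status 200)
          (map :name
               (filter #(not (nil? (:catalog_timestamp %)))
                       (json/parse-string body true))))))

The second commit below takes the per-data-type route: it returns the full node maps and guards each of facts/reports/catalog on its own timestamp, which avoids the 404s without silently swallowing real errors.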

@kbarber commented on an outdated diff
src/com/puppetlabs/puppetdb/cli/export.clj
((55 lines not shown))
- {:node node
- :reports (reports-for-node host port node)})
+ []
+ {:msg (str "Exporting PuppetDB metadata")
+ :file-suffix [export-metadata-file-name]
+ :contents (json/generate-pretty-string
+ {:timestamp (now)
+ :command-versions
+ ;; This is not ideal that we are hard-coding the command version here, but
+ ;; in our current architecture I don't believe there is any way to introspect
+ ;; on which version of the `replace catalog` matches up with the current
+ ;; version of the `catalog` endpoint... or even to query what the latest
+ ;; version of a command is. We should improve that.
+ {:replace-catalog catalog-version
+ :store-report 2
+ :facts 1}})})
@kbarber (Puppet Labs member) added a note:

:replace-facts

@senior Added support for exporting facts/reports when
no catalog has been persisted for the given node
4ed5aca
@kbarber merged commit f0fc29d into puppetlabs:master

1 check passed

default: The Travis CI build passed
Commits on Feb 28, 2014
  1. @senior

    (PDB-221) Add facts to import/export

    senior committed
    This commit imports/exports facts similar to how we currently import/export
    catalogs and reports. Anonymize doesn't currently work for facts, which is
    going to be added separately.
Commits on Mar 4, 2014
  1. @senior

    Added support for exporting facts/reports when
    no catalog has been persisted for the given node

    senior committed
35 acceptance/helper.rb
@@ -595,6 +595,7 @@ def compare_export_data(export_file1, export_file2, opts={})
:catalogs => true,
:metadata => true,
:reports => true,
+ :facts => true
}.merge(opts)
# NOTE: I'm putting this tmpdir inside of cwd because I expect for that to
@@ -615,9 +616,14 @@ def compare_export_data(export_file1, export_file2, opts={})
relative_path = f.sub(/^#{export_dir1}\//, "")
export1_files.add(relative_path)
expected_path = File.join(export_dir2, relative_path)
- assert(File.exists?(expected_path), "Export file '#{export_file2}' is missing entry '#{relative_path}'")
+
+ if(relative_path !~ /^puppetdb-bak\/facts.*/ || opts[:facts])
+ assert(File.exists?(expected_path), "Export file '#{export_file2}' is missing entry '#{relative_path}'")
+ end
+
puts "Comparing file '#{relative_path}'"
next if File.directory?(f)
+
export_entry_type = get_export_entry_type(relative_path)
case export_entry_type
when :catalog
@@ -626,6 +632,8 @@ def compare_export_data(export_file1, export_file2, opts={})
compare_metadata(f, expected_path) if opts[:metadata]
when :report
compare_report(f, expected_path) if opts[:reports]
+ when :facts
+ compare_facts(f, expected_path) if opts[:facts]
when :unknown
fail("Unrecognized file found in archive: '#{relative_path}'")
end
@@ -633,6 +641,10 @@ def compare_export_data(export_file1, export_file2, opts={})
export2_files = Set.new(
Dir.glob("#{export_dir2}/**/*").map { |f| f.sub(/^#{Regexp.escape(export_dir2)}\//, "") })
+
+ export1_files.delete_if{ |path| !opts[:facts] && /^puppetdb-bak\/facts.*/.match(path)}
+ export2_files.delete_if{ |path| !opts[:facts] && /^puppetdb-bak\/facts.*/.match(path)}
+
diff = export2_files - export1_files
assert(diff.empty?, "Export file '#{export_file2}' contains extra file entries: '#{diff.to_a.join("', '")}'")
@@ -648,11 +660,25 @@ def get_export_entry_type(path)
:catalog
when /^puppetdb-bak\/reports\/.*\.json$/
:report
+ when /^puppetdb-bak\/facts\/.*\.json$/
+ :facts
else
:unknown
end
end
+ def compare_facts(facts1_path, facts2_path)
+ f1 = JSON.parse(File.read(facts1_path))
+ f2 = JSON.parse(File.read(facts2_path))
+
+ diff = hash_diff(f1, f2)
+
+ if (diff)
+ diff = JSON.pretty_generate(diff)
+ end
+
+ assert(diff == nil, "Facts '#{facts1_path}' and '#{facts2_path}' don't match! Diff:\n#{diff}")
+ end
def compare_catalog(cat1_path, cat2_path)
cat1 = munge_catalog_for_comparison(cat1_path)
@@ -1011,7 +1037,8 @@ def create_remote_site_pp(host, manifest)
remote_path
end
- def run_agents_with_new_site_pp(host, manifest)
+ def run_agents_with_new_site_pp(host, manifest, env_vars = {})
+
manifest_path = create_remote_site_pp(host, manifest)
with_puppet_running_on host, {
'master' => {
@@ -1020,7 +1047,9 @@ def run_agents_with_new_site_pp(host, manifest)
'autosign' => 'true',
'manifest' => manifest_path
}} do
- run_agent_on agents, "--test --server #{host}", :acceptable_exit_codes => [0,2]
# only some of the opts work on puppet_agent; :acceptable_exit_codes does not
+ agents.each{ |agent| on agent, puppet_agent("--test --server #{host}", { 'ENV' => env_vars }), :acceptable_exit_codes => [0,2] }
+
end
end
4 acceptance/tests/anonymize/anonymize_profile.rb
@@ -47,11 +47,11 @@
if type == "none"
step "verify original export data matches new export data" do
- compare_export_data(export_file1, export_file2)
+ compare_export_data(export_file1, export_file2, :facts => false)
end
else
step "verify anonymized data matches new export data" do
- compare_export_data(anon_file, export_file2)
+ compare_export_data(anon_file, export_file2, :facts => false)
end
end
end
4 acceptance/tests/db_garbage_collection/node_ttl.rb
@@ -15,6 +15,10 @@ def restart_to_gc(database)
test_name "validate that nodes are deactivated and deleted based on ttl settings" do
+ step "clear puppetdb database so that we can import into a clean db" do
+ clear_and_restart_puppetdb(database)
+ end
+
with_puppet_running_on master, {
'master' => {
'autosign' => 'true'
15 acceptance/tests/import_export/import_export.rb
@@ -13,9 +13,16 @@
}
MANIFEST
- run_agents_with_new_site_pp(master, manifest)
+ run_agents_with_new_site_pp(master, manifest, {"facter_foo" => "bar"})
end
+ step "verify foo fact present" do
+ result = on master, "puppet facts find #{master.node_name} --terminus puppetdb"
+ facts = JSON.parse(result.stdout.strip)
+ assert_equal('bar', facts['values']['foo'], "Failed to retrieve facts for '#{master.node_name}' via inventory service!")
+ end
+
+
export_file1 = "./puppetdb-export1.tar.gz"
export_file2 = "./puppetdb-export2.tar.gz"
@@ -33,6 +40,12 @@
sleep_until_queue_empty(database)
end
+ step "verify facts were exported/imported correctly" do
+ result = on master, "puppet facts find #{master.node_name} --terminus puppetdb"
+ facts = JSON.parse(result.stdout.strip)
+ assert_equal('bar', facts['values']['foo'], "Failed to retrieve facts for '#{master.node_name}' via inventory service!")
+ end
+
step "export data from puppetdb again" do
on database, "#{sbin_loc}/puppetdb export --outfile #{export_file2}"
scp_from(database, export_file2, ".")
88 acceptance/tests/import_export/import_export_facts_only.rb
@@ -0,0 +1,88 @@
+test_name "export and import tools" do
+ sbin_loc = puppetdb_sbin_dir(database)
+
+ step "clear puppetdb database so that we can control exactly what we will eventually be exporting" do
+ clear_and_restart_puppetdb(database)
+ end
+
+ def run_agents_without_persisting_catalogs(host, manifest, env_vars = {})
+
+ manifest_path = create_remote_site_pp(host, manifest)
+ with_puppet_running_on host, {
+ 'master' => {
+ 'storeconfigs' => 'false',
+ 'autosign' => 'true',
+ 'manifest' => manifest_path
+ }} do
+ # only some of the opts work on puppet_agent; :acceptable_exit_codes does not
+ agents.each do |agent|
+ on agent,
+ puppet_agent("--test --server #{host}", { 'ENV' => env_vars }),
+ :acceptable_exit_codes => [0,2]
+ end
+
+ end
+ end
+
+
+ step "setup a test manifest for the master and perform agent runs" do
+ manifest = <<-MANIFEST
+ node default {
+ @@notify { "exported_resource": }
+ notify { "non_exported_resource": }
+ }
+ MANIFEST
+
+ run_agents_without_persisting_catalogs(master, manifest, {"facter_foo" => "bar"})
+ end
+
+ step "verify foo fact present" do
+ result = on master, "puppet facts find #{master.node_name} --terminus puppetdb"
+ facts = JSON.parse(result.stdout.strip)
+ assert_equal('bar', facts['values']['foo'], "Failed to retrieve facts for '#{master.node_name}' via inventory service!")
+ end
+
+ step "Verify that the number of active nodes is what we expect" do
+ result = on database, %Q|curl -G http://localhost:8080/v3/nodes|
+ result_node_statuses = JSON.parse(result.stdout)
+ assert_equal(agents.length, result_node_statuses.length, "Expected #{agents.length} active node(s)")
+
+ node = result_node_statuses.first
+ assert(node["catalog_timestamp"].nil?, "Should not have a catalog timestamp")
+ assert(node["facts_timestamp"], "Should have a facts timestamp")
+ end
+
+ export_file1 = "./puppetdb-export1.tar.gz"
+ export_file2 = "./puppetdb-export2.tar.gz"
+
+ step "export data from puppetdb" do
+ on database, "#{sbin_loc}/puppetdb export --outfile #{export_file1}"
+ scp_from(database, export_file1, ".")
+ end
+
+ step "clear puppetdb database so that we can import into a clean db" do
+ clear_and_restart_puppetdb(database)
+ end
+
+ step "import data into puppetdb" do
+ on database, "#{sbin_loc}/puppetdb import --infile #{export_file1}"
+ sleep_until_queue_empty(database)
+ end
+
+ step "verify facts were exported/imported correctly" do
+ result = on master, "puppet facts find #{master.node_name} --terminus puppetdb"
+ facts = JSON.parse(result.stdout.strip)
+ assert_equal('bar', facts['values']['foo'], "Failed to retrieve facts for '#{master.node_name}' via inventory service!")
+ end
+
+ step "Verify that the number of active nodes is what we expect" do
+ result = on database, %Q|curl -G http://localhost:8080/v3/nodes|
+ result_node_statuses = JSON.parse(result.stdout)
+ assert_equal(agents.length, result_node_statuses.length, "Expected #{agents.length} active node(s)")
+
+ node = result_node_statuses.first
+ assert(node["catalog_timestamp"].nil?, "Should not have a catalog timestamp")
+ assert(node["facts_timestamp"], "Should have a facts timestamp")
+ end
+
+end
202 src/com/puppetlabs/puppetdb/cli/export.clj
@@ -18,13 +18,32 @@
[clojure.java.io :as io]
[clj-http.client :as client]
[com.puppetlabs.archive :as archive]
- [slingshot.slingshot :refer [try+]]))
+ [slingshot.slingshot :refer [try+]]
+ [com.puppetlabs.puppetdb.schema :as pls]
+ [schema.core :as s]
+ [clojure.string :as str]))
+
+;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
+;;; Internal Schemas
+
+(def tar-item {:msg String
+ :file-suffix [String]
+ :contents String})
+
+(def node-map {:catalog_timestamp (s/maybe String)
+ :facts_timestamp (s/maybe String)
+ :report_timestamp (s/maybe String)
+ :name String
+ :deactivated (s/maybe String)})
(def cli-description "Export all PuppetDB catalog data to a backup file")
(def export-metadata-file-name "export-metadata.json")
(def export-root-dir "puppetdb-bak")
+;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
+;;; Catalog Exporting
+
(defn catalog-for-node
"Given a node name, retrieve the catalog for the node."
[host port node]
@@ -39,6 +58,46 @@
{ :accept :json})]
(when (= status 200) body)))
+(pls/defn-validated catalog->tar :- tar-item
+ "Create a tar-item map for the `catalog`"
+ [node :- String
+ catalog-json-str :- String]
+ {:msg (format "Writing catalog for node '%s'" node)
+ :file-suffix ["catalogs" (format "%s.json" node)]
+ :contents catalog-json-str})
+
+;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
+;;; Fact Exporting
+
+(pls/defn-validated facts-for-node
+ :- {String s/Any}
+ "Given a node name, retrieve the catalog for the node."
+ [host :- String
+ port :- s/Int
+ node :- String]
+ (let [{:keys [status body]} (client/get
+ (format
+ "http://%s:%s/v3/nodes/%s/facts"
+ host port node)
+ {:accept :json})]
+ (when (= status 200)
+ (reduce (fn [acc {:strs [name value]}]
+ (assoc acc name value))
+ {} (json/parse-string body)))))
+
+(pls/defn-validated facts->tar :- tar-item
+ "Creates a tar-item map for the collection of facts"
+ [node :- String
+ facts :- {String s/Any}]
+ {:msg (format "Writing facts for node '%s'" node)
+ :file-suffix ["facts" (format "%s.json" node)]
+ :contents (json/generate-pretty-string
+ {"name" node
+ "values" facts})})
+
+;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
+;;; Report Exporting
+
(defn events-for-report-hash
"Given a report hash, returns all events as a vector of maps."
[host port report-hash]
@@ -78,7 +137,40 @@
#(merge % {:resource-events (events-for-report-hash host port (get % :hash))})
(json/parse-string body true))))))
-(defn get-active-node-names
+(pls/defn-validated report->tar :- [tar-item]
+ "Create a tar-item map for the `report`"
+ [node :- String
+ reports :- [{:configuration-version s/Any
+ :start-time s/Any
+ s/Any s/Any}]]
+ (mapv (fn [{:keys [configuration-version start-time] :as report}]
+ {:msg (format "Writing report '%s-%s' for node '%s'" start-time configuration-version node)
+ :file-suffix ["reports" (format "%s-%s-%s.json" node start-time configuration-version)]
+ :contents (json/generate-pretty-string (dissoc report :hash))})
+ reports))
+
+;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
+;;; Node Exporting
+
+(pls/defn-validated get-node-data
+ :- {:node String
+ :facts [tar-item]
+ :reports [tar-item]
+ :catalog [tar-item]}
+ "Returns tar-item maps for the reports, facts and catalog of the given
+ node, ready to be written to the filesystem"
+ [host :- String
+ port :- s/Int
+ {:keys [name] :as node-data} :- node-map]
+ {:node name
+ :facts (when-not (str/blank? (:facts_timestamp node-data))
+ [(facts->tar name (facts-for-node host port name))])
+ :reports (when-not (str/blank? (:report_timestamp node-data))
+ (report->tar name (reports-for-node host port name)))
+ :catalog (when-not (str/blank? (:catalog_timestamp node-data))
+ [(catalog->tar name (catalog-for-node host port name))])})
+
+(defn get-nodes
"Get a list of the names of all active nodes."
[host port]
{:pre [(string? host)
@@ -88,54 +180,27 @@
(format "http://%s:%s/v3/nodes" host port)
{:accept :json})]
(if (= status 200)
- (map :name
- (filter #(not (nil? (:catalog_timestamp %)))
- (json/parse-string body true))))))
+ (json/parse-string body true))))
+
+;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
+;;; Metadata Exporting
-(def export-metadata
+(pls/defn-validated export-metadata :- tar-item
"Metadata about this export; used during import to ensure version compatibility."
- {:timestamp (now)
- :command-versions
- ;; This is not ideal that we are hard-coding the command version here, but
- ;; in our current architecture I don't believe there is any way to introspect
- ;; on which version of the `replace catalog` matches up with the current
- ;; version of the `catalog` endpoint... or even to query what the latest
- ;; version of a command is. We should improve that.
- {:replace-catalog catalog-version
- :store-report 2}})
-
-(defn get-catalog-for-node
- "Utility function for retrieving catalog data from the PuppetDB web service.
- Returns a map containing the node name and the corresponding catalog; this
- allows us to run this function against multiple nodes in parallel, and still
- be able to identify which node we've retrieved the data for when it returns."
- [host port node]
- {:pre [(string? host)
- (integer? port)
- (string? node)]
- :post [(map? %)
- (contains? % :node)
- (contains? % :catalog)]}
- {:node node
- :catalog (catalog-for-node host port node)})
-
-(defn get-reports-for-node
- "Utility function for retrieving report data from the PuppetDB web service.
- Returns a map containing the node name and all the reports related to the
- node; this allows us to run this function against multiple nodes in parallel,
- and still be able to identify which node we've retrieved the data for when
- it returns."
- [host port node]
- {:pre [(string? host)
- (integer? port)
- (string? node)]
- :post [(map? %)
- (contains? % :node)
- (string? (get % :node))
- (contains? % :reports)
- (seq? (get % :reports))]}
- {:node node
- :reports (reports-for-node host port node)})
+ []
+ {:msg (str "Exporting PuppetDB metadata")
+ :file-suffix [export-metadata-file-name]
+ :contents (json/generate-pretty-string
+ {:timestamp (now)
+ :command-versions
+ ;; This is not ideal that we are hard-coding the command version here, but
+ ;; in our current architecture I don't believe there is any way to introspect
+ ;; on which version of the `replace catalog` matches up with the current
+ ;; version of the `catalog` endpoint... or even to query what the latest
+ ;; version of a command is. We should improve that.
+ {:replace-catalog catalog-version
+ :store-report 2
+ :replace-facts 1}})})
(defn- validate-cli!
[args]
@@ -151,32 +216,25 @@
:puppetlabs.kitchensink.core/cli-error (System/exit 1)
:puppetlabs.kitchensink.core/cli-help (System/exit 0))))))
+(pls/defn-validated add-entry
+ :- nil
+ "Writes the given `tar-item` to `tar-writer` using
+ export-root-dir as the base directory for its contents"
+ [tar-writer
+ {:keys [file-suffix contents]} :- tar-item]
+ (archive/add-entry tar-writer "UTF-8"
+ (.getPath (apply io/file export-root-dir file-suffix))
+ contents))
+
(defn -main
[& args]
(let [[{:keys [outfile host port]} _] (validate-cli! args)
- nodes (get-active-node-names host port)
- get-catalog-fn (partial get-catalog-for-node host port)
- get-reports-fn (partial get-reports-for-node host port)]
-;; TODO: do we need to deal with SSL or can we assume this only works over a plaintext port?
+ nodes (get-nodes host port)]
+ ;; TODO: do we need to deal with SSL or can we assume this only works over a plaintext port?
(with-open [tar-writer (archive/tarball-writer outfile)]
- (archive/add-entry tar-writer "UTF-8"
- (.getPath (io/file export-root-dir export-metadata-file-name))
- (json/generate-string export-metadata {:pretty true}))
-
- ;; Write out catalogs
- (doseq [node nodes]
- (println (format "Writing catalog for node '%s'" node))
- (archive/add-entry tar-writer "UTF-8"
- (.getPath (io/file export-root-dir "catalogs" (format "%s.json" node)))
- (:catalog (get-catalog-fn node))))
-
- ;; Write out reports
+ (add-entry tar-writer (export-metadata))
(doseq [node nodes
- report (:reports (get-reports-fn node))]
- (let [confversion (get report :configuration-version)
- starttime (get report :start-time)
- reportstr (json/generate-string (dissoc report :hash) {:pretty true})]
- (println (format "Writing report '%s-%s' for node '%s'" starttime confversion node))
- (archive/add-entry tar-writer "UTF-8"
- (.getPath (io/file export-root-dir "reports" (format "%s-%s-%s.json" node starttime confversion)))
- reportstr))))))
+ :let [node-data (get-node-data host port node)]]
+ (doseq [{:keys [msg] :as tar-item} (mapcat node-data [:catalog :reports :facts])]
+ (println msg)
+ (add-entry tar-writer tar-item))))))
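To make the new per-node flow concrete, here is an illustrative call to get-node-data for a facts-only node (host, port and node map invented); the slots without data come back nil, and -main's mapcat simply skips them:

    ;; Illustrative: a node with only a facts_timestamp yields a facts
    ;; tar-item; the :reports and :catalog slots stay nil, so we never
    ;; request data the node doesn't have (and hence never hit a 404).
    (get-node-data "localhost" 8080
                   {:name "myhost.localdomain"
                    :facts_timestamp "2014-02-28T12:00:00.000Z"
                    :catalog_timestamp nil
                    :report_timestamp nil
                    :deactivated nil})
    ;; => {:node "myhost.localdomain"
    ;;     :facts [{:msg "Writing facts for node 'myhost.localdomain'"
    ;;              :file-suffix ["facts" "myhost.localdomain.json"]
    ;;              :contents "..."}]
    ;;     :reports nil
    ;;     :catalog nil}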
24 src/com/puppetlabs/puppetdb/cli/import.clj
@@ -70,6 +70,23 @@
(when-not (= pl-http/status-ok (:status result))
(log/error result))))
+(defn submit-facts
+ "Send the given wire-format `facts` (associated with `host`) to a
+ command-processing endpoint located at `puppetdb-host`:`puppetdb-port`."
+ [puppetdb-host puppetdb-port fact-payload]
+ {:pre [(string? puppetdb-host)
+ (integer? puppetdb-port)
+ (string? fact-payload)]}
+ (let [result (command/submit-command-via-http!
+ puppetdb-host puppetdb-port
+ (command-names :replace-facts)
+ 1
+ fact-payload)]
+ (when-not (= pl-http/status-ok (:status result))
+ (log/error result))))
+
(defn process-tar-entry
"Determine the type of an entry from the exported archive, and process it
accordingly."
@@ -81,7 +98,8 @@
(map? metadata)]}
(let [path (.getName tar-entry)
catalog-pattern (str "^" (.getPath (io/file export-root-dir "catalogs" ".*\\.json")) "$")
- report-pattern (str "^" (.getPath (io/file export-root-dir "reports" ".*\\.json")) "$")]
+ report-pattern (str "^" (.getPath (io/file export-root-dir "reports" ".*\\.json")) "$")
+ facts-pattern (str "^" (.getPath (io/file export-root-dir "facts" ".*\\.json")) "$")]
(when (re-find (re-pattern catalog-pattern) path)
(println (format "Importing catalog from archive entry '%s'" path))
;; NOTE: these submissions are async and we have no guarantee that they
@@ -96,6 +114,10 @@
(println (format "Importing report from archive entry '%s'" path))
(submit-report host port
(get-in metadata [:command-versions :store-report])
+ (archive/read-entry-content tar-reader)))
+ (when (re-find (re-pattern facts-pattern) path)
+ (println (format "Importing facts from archive entry '%s'" path))
+ (submit-facts host port
(archive/read-entry-content tar-reader)))))
(defn- validate-cli!
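Net effect: each matching archive entry is read back and replayed as a version-1 `replace facts` command. A minimal sketch of the equivalent call (host, port and payload invented; assumes the project's json/generate-string wrapper):

    ;; Illustrative: what process-tar-entry effectively does when it hits
    ;; an entry named puppetdb-bak/facts/myhost.localdomain.json
    (submit-facts "localhost" 8080
                  (json/generate-string {"name" "myhost.localdomain"
                                         "values" {"foo" "bar"}}))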
27 test/com/puppetlabs/puppetdb/test/cli/export.clj
@@ -2,10 +2,13 @@
(:require [com.puppetlabs.puppetdb.query.catalogs :as c]
[com.puppetlabs.puppetdb.query.reports :as r]
[com.puppetlabs.puppetdb.query.events :as e]
- [cheshire.core :as json]
+ [com.puppetlabs.cheshire :as json]
[com.puppetlabs.puppetdb.testutils.catalogs :as testcat]
[com.puppetlabs.puppetdb.testutils.reports :as testrep]
- [com.puppetlabs.puppetdb.cli.export :as export])
+ [com.puppetlabs.puppetdb.cli.export :as export]
+ [com.puppetlabs.puppetdb.command :as command]
+ [com.puppetlabs.puppetdb.command.constants :refer [command-names]]
+ [com.puppetlabs.puppetdb.testutils.repl :as turepl])
(:use [clojure.java.io :only [resource]]
clojure.test
[com.puppetlabs.puppetdb.fixtures]
@@ -20,10 +23,10 @@
original-catalog (json/parse-string original-catalog-str)]
(testcat/replace-catalog original-catalog-str)
- ;; This is explicitly set to v3, as per the current CLI tooling
- (let [exported-catalog (c/catalog-for-node :v3 "myhost.localdomain")]
- (is (= (testcat/munge-catalog-for-comparison original-catalog)
- (testcat/munge-catalog-for-comparison exported-catalog)))))))
+ ;; This is explicitly set to v3, as per the current CLI tooling
+ (let [exported-catalog (c/catalog-for-node :v3 "myhost.localdomain")]
+ (is (= (testcat/munge-catalog-for-comparison original-catalog)
+ (testcat/munge-catalog-for-comparison exported-catalog)))))))
(testing "Exporting a JSON report"
(testing "the exported JSON should match the original import JSON"
@@ -37,6 +40,12 @@
(testrep/munge-report-for-comparison exported-report)))))))
(testing "Export metadata"
- (is (= {:replace-catalog catalog-version
- :store-report 2}
- (:command-versions export/export-metadata)))))
+ (let [{:keys [msg file-suffix contents]} (export/export-metadata)
+ metadata (json/parse-string contents true)]
+ (is (= {:replace-catalog catalog-version
+ :store-report 2
+ :replace-facts 1}
+ (:command-versions metadata)))
+ (is (= ["export-metadata.json"] file-suffix))
+ (is (= "Exporting PuppetDB metadata" msg)))))
+