diff --git a/doc/schema-migration.md b/doc/schema-migration.md new file mode 100644 index 000000000..7c00b5e42 --- /dev/null +++ b/doc/schema-migration.md @@ -0,0 +1,85 @@ +# Schema Migration + +Schema migration with Datahike is the evolution of your current schema into a future schema. + +## Why use the schema migration tool? +You could use the `transact`-fn of the api-ns to apply your schema, but with our +`norm`-ns you can define your migrations centrally and they will be applied once and only +once to your database. + +This helps when your production database has limited accessibility from your developer +machines and you want to apply the migrations from a server next to your production code. +If you are setting up your database from scratch, e.g. for development purposes, you can +rely on your schema being up to date with your production environment, because you keep +your original schema along with your migrations in a central repository. + +## How to migrate a database schema +Changes to your schema should always add new definitions and never alter existing +definitions. If you want to change existing data to a new format, you will have to create a +new schema and transact your existing data again, transformed. A +good intro to this topic [can be found here](https://docs.datomic.com/cloud/schema/schema-change.html). + +## Transaction-functions +Your transaction functions need to be on your classpath to be called, and they need to take +one argument, the connection to your database. Each function needs to return a vector of +transactions so that they can be applied during migration. + +Please be aware that transaction-functions create transactions that need to be +held in memory. Very large migrations might exceed your memory. + +## Norms? +Like [conformity for Datomic](https://github.com/avescodes/conformity) we use the term +norm for our tool. 
You can use it to declare expectations about the state of your database +and enforce those idempotently without repeatedly transacting schema. These expectations +can be the form of your schema, data in a certain format, or pre-transacted data for e.g. +a development database. + +## Migration folder +Preferably create a folder in your project resources called `migrations`. You can, however, +use any folder you like, even outside your resources. If you don't want to package the +migrations into a jar, you can simply pass a path as a string to the migration functions. +You store your migration-files in this migration-folder. Be aware that reading the +migrations from your chosen migration-folder includes all of its subfolders. Don't store +any files other than your migrations in your migration-folder! + +## How to migrate + +1. Create a folder of your choice, for now let's call it `migrations`. In this folder, +create a new file with an edn-extension like `001-my-first-norm.edn`. Preferably you name the +file beginning with a number. Please be aware that the name of your file will be the id of +your migration. Taking into account that you might create more migrations in the future, +you should left-pad the names with zeros to keep a proper sorting. Keep in mind that your +migrations are transacted one after another, sorted by your chosen ids. Spaces will be +replaced with dashes to compose the id. + +2. Write the transactions themselves into your newly created file. The content of the file needs +to be an edn-map with one or both of the keys `:tx-data` and `:tx-fn`. `:tx-data` is just +transaction data in the form of a vector; `:tx-fn` is a function that you can run during the +execution, for example to migrate data from one attribute to another. This function needs to +be fully qualified and callable from the classpath. It will be evaluated during the migration and +needs to return transactions. 
These transactions will be transacted together with `:tx-data` +in one transaction. + +Example of a migration: +```clojure +{:tx-data [{:db/doc "Place of occupation" + :db/ident :character/place-of-occupation + :db/valueType :db.type/string + :db/cardinality :db.cardinality/one}] + :tx-fn my-transactions.my-project/my-first-tx-fn} +``` + +3. When you are sufficiently confident that your migrations will work, you usually want to store +them in some kind of version control system. To avoid conflicts with colleagues, we +implemented a safety net. Run the function `update-checksums!` from the `datahike.norm.norm` +namespace to create or update a `checksums.edn` file inside your migrations-folder. This file +contains the names and checksums of your migration-files. In case a colleague +checked in a migration that you have not been aware of, your VCS should raise a merge +conflict on the `checksums.edn` file. + +4. To apply your migrations, you most likely want to package them into a jar together +with datahike and a piece of code that actually runs your migrations, and run it on a server. +You should check the correctness of the checksums with `datahike.norm.norm/verify-checksums` +and finally run the `datahike.norm.norm/ensure-norms!` function to apply your migrations. For +each migration that is applied, a `:tx/norm` attribute will be stored with the +id of your migration so that it will not be applied twice. 
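+
+Putting the steps above together, a minimal end-to-end sketch might look like the
+following. The namespace `my-transactions.my-project`, the body of `my-first-tx-fn`
+and the `cfg` connection setup are hypothetical illustrations; the `datahike.norm.norm`
+functions are the ones provided by this tool.
+
+```clojure
+;; Hypothetical namespace holding the :tx-fn referenced from the migration file.
+(ns my-transactions.my-project
+  (:require [clojure.java.io :as io]
+            [clojure.string :as string]
+            [datahike.api :as d]
+            [datahike.norm.norm :as norm]))
+
+(defn my-first-tx-fn
+  "Takes the connection and returns a vector of transactions,
+  here lower-casing all existing :character/place-of-occupation values."
+  [conn]
+  (vec (for [[e v] (d/q '[:find ?e ?v
+                          :where [?e :character/place-of-occupation ?v]]
+                        (d/db conn))]
+         [:db/add e :character/place-of-occupation (string/lower-case v)])))
+
+(comment
+  ;; On your developer machine: create or refresh the checksums file.
+  (norm/update-checksums! "resources/migrations")
+
+  ;; On the server, next to your production code:
+  (let [conn (d/connect cfg)] ;; cfg is your Datahike store configuration
+    (norm/verify-checksums (io/resource "migrations"))
+    (norm/ensure-norms! conn)))
+```
+
+`ensure-norms!` defaults to the `migrations` resource, so the one-argument call shown
+here reads the same folder that was verified beforehand.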
diff --git a/src/datahike/norm/norm.clj b/src/datahike/norm/norm.clj new file mode 100644 index 000000000..2992d4390 --- /dev/null +++ b/src/datahike/norm/norm.clj @@ -0,0 +1,245 @@ +(ns datahike.norm.norm + (:require + [clojure.java.io :as io] + [clojure.string :as string] + [clojure.pprint :as pp] + [clojure.data :as data] + [clojure.edn :as edn] + [clojure.spec.alpha :as s] + [taoensso.timbre :as log] + [hasch.core :as h] + [hasch.platform :as hp] + [datahike.api :as d] + [datahike.tools :as dt]) + (:import + [java.io File] + [java.util.jar JarFile JarEntry] + [java.net URL])) + +(def checksums-file "checksums.edn") + +(defn- attribute-installed? [conn attr] + (some? (d/entity @conn [:db/ident attr]))) + +(defn- ensure-norm-attribute! [conn] + (if-not (attribute-installed? conn :tx/norm) + (:db-after (d/transact conn {:tx-data [{:db/ident :tx/norm + :db/valueType :db.type/keyword + :db/cardinality :db.cardinality/one}]})) + @conn)) + +(defn- norm-installed? [db norm] + (->> {:query '[:find (count ?t) . + :in $ ?tn + :where + [_ :tx/norm ?tn ?t]] + :args [db norm]} + d/q + some?)) + +(defn- get-jar [resource] + (-> (.getPath resource) + (string/split #"!" 2) + first + (subs 5) + JarFile.)) + +(defmulti ^:private retrieve-file-list + (fn [file-or-resource] (type file-or-resource))) + +(defmethod ^:private retrieve-file-list File [file] + (if (.exists file) + (let [migration-files (file-seq file) + xf (comp + (filter #(.isFile %)) + (filter #(string/ends-with? (.getPath %) ".edn")))] + (into [] xf migration-files)) + (dt/raise (format "Norms folder %s does not exist." (str file)) {:folder file}))) + +(defmethod ^:private retrieve-file-list URL [resource] + (if resource + (let [abs-path (.getPath resource) + last-path-segment (-> abs-path (string/split #"/") peek)] + (if (string/starts-with? abs-path "file:") + (->> (get-jar resource) + .entries + enumeration-seq + (filter #(and (string/starts-with? 
(.getName %) last-path-segment) + (not (.isDirectory %)) + (string/ends-with? % ".edn")))) + (->> (file-seq (io/file abs-path)) + (filter #(not (.isDirectory %)))))) + (dt/raise "Resource does not exist." {:resource (str resource)}))) + +(defmethod ^:private retrieve-file-list :default [arg] + (dt/raise "Can only read a File or a URL (resource)" {:arg arg :type (type arg)})) + +(defn- filter-file-list [file-list] + (filter #(and (string/ends-with? % ".edn") + (not (string/ends-with? (.getName %) checksums-file))) + file-list)) + +(defn filename->keyword [filename] + (-> filename + (string/replace #" " "-") + (keyword))) + +(defmulti ^:private read-edn-file + (fn [file-or-entry _file-or-resource] (type file-or-entry))) + +(defmethod ^:private read-edn-file File [f _file] + (when (not (.exists f)) + (dt/raise "Failed reading file because it does not exist" {:filename (str f)})) + [(-> (slurp f) + edn/read-string) + {:name (.getName f) + :norm (filename->keyword (.getName f))}]) + +(defmethod ^:private read-edn-file JarEntry [entry resource] + (when (nil? resource) + (dt/raise "Failed reading resource because it does not exist" {:resource (str resource)})) + (let [file-name (-> (.getName entry) + (string/split #"/") + peek)] + [(-> (get-jar resource) + (.getInputStream entry) + slurp + edn/read-string) + {:name file-name + :norm (filename->keyword file-name)}])) + +(defmethod ^:private read-edn-file :default [t _] + (dt/raise "Can not handle argument" {:type (type t) :arg t})) + +(defn- read-norm-files [norm-list file-or-resource] + (->> norm-list + (map (fn [f] + (let [[content metadata] (read-edn-file f file-or-resource)] + (merge content metadata)))) + (sort-by :norm))) + +(defn- compute-checksums [norm-files] + (->> norm-files + (reduce (fn [m {:keys [norm] :as content}] + (assoc m + norm + (-> (select-keys content [:tx-data :tx-fn]) + h/edn-hash + hp/hash->str))) + {}))) + +(s/def ::tx-data vector?) +(s/def ::tx-fn symbol?) 
+(s/def ::norm-map (s/keys :opt-un [::tx-data ::tx-fn])) +(defn- validate-norm [norm] + (if (s/valid? ::norm-map norm) + (log/debug "Norm validated" {:norm-map norm}) + (let [res (s/explain-data ::norm-map norm)] + (dt/raise "Invalid norm" {:validation-error res})))) + +(defn- neutral-fn [_] []) + +(defn- transact-norms [conn norm-list] + (let [db (ensure-norm-attribute! conn)] + (log/info "Checking migrations ...") + (doseq [{:keys [norm tx-data tx-fn] + :as norm-map + :or {tx-data [] + tx-fn 'datahike.norm.norm/neutral-fn}} + norm-list] + (log/info "Checking migration" norm) + (validate-norm norm-map) + (when-not (norm-installed? db norm) + (log/info "Running migration") + (->> (d/transact conn {:tx-data (vec (concat [{:tx/norm norm}] + tx-data + ((var-get (requiring-resolve tx-fn)) conn)))}) + (log/info "Done")))))) + +(defn- diff-checksums [checksums edn-content] + (let [diff (data/diff checksums edn-content)] + (when-not (every? nil? (butlast diff)) + (dt/raise "Deviation of the checksums found. Migration aborted." {:diff diff})))) + +(defmulti verify-checksums + (fn [file-or-resource] (type file-or-resource))) + +(defmethod verify-checksums File [file] + (let [norm-list (-> (retrieve-file-list file) + filter-file-list + (read-norm-files file)) + edn-content (-> (io/file (io/file file) checksums-file) + (read-edn-file file) + first)] + (diff-checksums (compute-checksums norm-list) + edn-content))) + +(defmethod verify-checksums URL [resource] + (let [file-list (retrieve-file-list resource) + norm-list (-> (filter-file-list file-list) + (read-norm-files resource)) + edn-content (-> (->> file-list + (filter #(-> (.getName %) (string/ends-with? 
checksums-file))) + first) + (read-edn-file resource) + first)] + (diff-checksums (compute-checksums norm-list) + edn-content))) + +(defmulti ^:private ensure-norms + (fn [_conn file-or-resource] (type file-or-resource))) + +(defmethod ^:private ensure-norms File [conn file] + (let [norm-list (-> (retrieve-file-list file) + filter-file-list + (read-norm-files file))] + (transact-norms conn norm-list))) + +(defmethod ^:private ensure-norms URL [conn resource] + (let [file-list (retrieve-file-list resource) + norm-list (-> (filter-file-list file-list) + (read-norm-files resource))] + (transact-norms conn norm-list))) + +(defn ensure-norms! + "Takes Datahike-connection and optional a java.io.File object + or java.net.URL to specify the location of your norms. + Defaults to the resource `migrations`. + Returns nil when successful and throws exception when not. + + Ensures your norms are present on your Datahike database. + All the edn-files in this folder and its subfolders are + considered migration-files aka norms and will be transacted + ordered by their names into your database. All norms that + are successfully transacted will have an attribute that + marks them as migrated and they will not be applied twice." + ([conn] + (ensure-norms! conn (io/resource "migrations"))) + ([conn file-or-resource] + (ensure-norms conn file-or-resource))) + +(defn update-checksums! + "Optionally takes a folder as string. Defaults to the + folder `resources/migrations`. + Returns nil when successful and throws exception when not. + + All the edn-files in the folder and its subfolders are + considered migration-files aka norms. For each of + these norms a checksum will be computed and written to + the file `checksums.edn`. Each time this fn is run, + the `checksums.edn` will be overwritten with the current + values. + This prevents inadvertent migrations of your database + when used in conjunction with a VCS. 
A merge-conflict + should be raised when trying to merge a checksums.edn + with stale data." + ([] + (update-checksums! "resources/migrations")) + ([^String norms-folder] + (let [file (io/file norms-folder)] + (-> (retrieve-file-list file) + filter-file-list + (read-norm-files file) + compute-checksums + (#(spit (io/file norms-folder checksums-file) + (with-out-str (pp/pprint %)))))))) diff --git a/test/datahike/norm/norm_test.clj b/test/datahike/norm/norm_test.clj new file mode 100644 index 000000000..474e9fae3 --- /dev/null +++ b/test/datahike/norm/norm_test.clj @@ -0,0 +1,120 @@ +(ns datahike.norm.norm-test + (:require [clojure.test :refer [deftest is]] + [clojure.string :as string] + [clojure.java.io :as io] + [datahike.api :as d] + [datahike.norm.norm :as sut :refer [verify-checksums]] + [datahike.test.utils :as tu])) + +(def ensure-norms #'sut/ensure-norms) + +(deftest simple-test + (let [conn (tu/setup-db {} true) + _ (verify-checksums (io/file "test/datahike/norm/resources/simple-test")) + _ (ensure-norms conn (io/file "test/datahike/norm/resources/simple-test")) + schema (d/schema (d/db conn))] + (is (= #:db{:valueType :db.type/string, :cardinality :db.cardinality/one, :doc "Place of occupation", :ident :character/place-of-occupation} + (-> (schema :character/place-of-occupation) + (dissoc :db/id)))) + (is (= #:db{:valueType :db.type/string, :cardinality :db.cardinality/one, :doc "Simpsons character name", :ident :character/name, :db/unique :db.unique/identity} + (-> (schema :character/name) + (dissoc :db/id)))) + (is (= #:db{:ident :tx/norm, :valueType :db.type/keyword, :cardinality :db.cardinality/one} + (-> (schema :tx/norm) + (dissoc :db/id)))))) + +(defn tx-fn-test-fn [conn] + (-> (for [[eid value] (d/q '[:find ?e ?v + :where + [?e :character/place-of-occupation ?v]] + (d/db conn))] + [:db/add eid + :character/place-of-occupation (string/lower-case value)]) + vec)) + +(deftest tx-fn-test + (let [conn (tu/setup-db {} true) + _ (verify-checksums 
(io/file "test/datahike/norm/resources/tx-fn-test/first")) + _ (ensure-norms conn (io/file "test/datahike/norm/resources/tx-fn-test/first")) + _ (d/transact conn {:tx-data [{:character/place-of-occupation "SPRINGFIELD ELEMENTARY SCHOOL"} + {:character/place-of-occupation "SPRINGFIELD NUCLEAR POWER PLANT"}]}) + _ (verify-checksums (io/file "test/datahike/norm/resources/tx-fn-test/second")) + _ (ensure-norms conn (io/file "test/datahike/norm/resources/tx-fn-test/second"))] + (is (= #{["springfield elementary school"] ["springfield nuclear power plant"]} + (d/q '[:find ?v + :where + [_ :character/place-of-occupation ?v]] + (d/db conn)))))) + +(defn tx-data-and-tx-fn-test-fn [conn] + (-> (for [[eid] + (d/q '[:find ?e + :where + [?e :character/name] + (or-join [?e] + [?e :character/name "Homer Simpson"] + [?e :character/name "Marge Simpson"])] + (d/db conn))] + {:db/id eid + :character/children [[:character/name "Bart Simpson"] + [:character/name "Lisa Simpson"] + [:character/name "Maggie Simpson"]]}) + vec)) + +(deftest tx-data-and-tx-fn-test + (let [conn (tu/setup-db {} true) + _ (verify-checksums (io/file "test/datahike/norm/resources/tx-data-and-tx-fn-test/first")) + _ (ensure-norms conn (io/file "test/datahike/norm/resources/tx-data-and-tx-fn-test/first")) + _ (d/transact conn {:tx-data [{:character/name "Homer Simpson"} + {:character/name "Marge Simpson"} + {:character/name "Bart Simpson"} + {:character/name "Lisa Simpson"} + {:character/name "Maggie Simpson"}]}) + _ (verify-checksums (io/file "test/datahike/norm/resources/tx-data-and-tx-fn-test/second")) + _ (ensure-norms conn (io/file "test/datahike/norm/resources/tx-data-and-tx-fn-test/second")) + margehomer (d/q '[:find [?e ...] 
+ :where + [?e :character/name] + (or-join [?e] + [?e :character/name "Homer Simpson"] + [?e :character/name "Marge Simpson"])] + (d/db conn))] + (is (= [#:character{:name "Homer Simpson", + :children + [#:character{:name "Bart Simpson"} + #:character{:name "Lisa Simpson"} + #:character{:name "Maggie Simpson"}]} + #:character{:name "Marge Simpson", + :children + [#:character{:name "Bart Simpson"} + #:character{:name "Lisa Simpson"} + #:character{:name "Maggie Simpson"}]}] + (d/pull-many (d/db conn) '[:character/name {:character/children [:character/name]}] margehomer))))) + +(defn naming-and-sorting-test-fn [conn] + (-> (for [[eid] (d/q '[:find ?e + :where + [?e :character/name] + (or-join [?e] + [?e :character/name "Bart Simpson"] + [?e :character/name "Lisa Simpson"])] + (d/db conn))] + {:db/id eid + :character/occupation :student}) + vec)) + +(deftest naming-and-sorting-test + (let [conn (tu/setup-db {} true) + _ (verify-checksums (io/file "test/datahike/norm/resources/naming-and-sorting-test")) + _ (sut/ensure-norms! conn (io/file "test/datahike/norm/resources/naming-and-sorting-test")) + lisabart (d/q '[:find [?e ...] 
+ :where + [?e :character/occupation :student]] + (d/db conn))] + (is (= [{:db/id 10, + :character/name "Bart Simpson", + :character/occupation :student} + {:db/id 11, + :character/name "Lisa Simpson", + :character/occupation :student}] + (d/pull-many (d/db conn) '[*] lisabart))))) diff --git a/test/datahike/norm/resources/naming-and-sorting-test/001-a1-example.edn b/test/datahike/norm/resources/naming-and-sorting-test/001-a1-example.edn new file mode 100644 index 000000000..53b124a52 --- /dev/null +++ b/test/datahike/norm/resources/naming-and-sorting-test/001-a1-example.edn @@ -0,0 +1,5 @@ +{:tx-data [{:db/doc "Place of occupation" + :db/ident :character/place-of-occupation + :db/valueType :db.type/string + :db/cardinality :db.cardinality/one}] + :tx-fn datahike.norm.norm/neutral-fn} diff --git a/test/datahike/norm/resources/naming-and-sorting-test/002-a2-example.edn b/test/datahike/norm/resources/naming-and-sorting-test/002-a2-example.edn new file mode 100644 index 000000000..3c5597a60 --- /dev/null +++ b/test/datahike/norm/resources/naming-and-sorting-test/002-a2-example.edn @@ -0,0 +1,5 @@ +{:tx-data [{:db/doc "Simpsons character name" + :db/ident :character/name + :db/unique :db.unique/identity + :db/valueType :db.type/string + :db/cardinality :db.cardinality/one}]} diff --git a/test/datahike/norm/resources/naming-and-sorting-test/003-tx-fn-test.edn b/test/datahike/norm/resources/naming-and-sorting-test/003-tx-fn-test.edn new file mode 100644 index 000000000..5f99e0470 --- /dev/null +++ b/test/datahike/norm/resources/naming-and-sorting-test/003-tx-fn-test.edn @@ -0,0 +1 @@ +{:tx-fn datahike.norm.norm-test/tx-fn-test-fn} diff --git a/test/datahike/norm/resources/naming-and-sorting-test/004-tx-data-and-tx-fn-test.edn b/test/datahike/norm/resources/naming-and-sorting-test/004-tx-data-and-tx-fn-test.edn new file mode 100644 index 000000000..bd9f4d98e --- /dev/null +++ b/test/datahike/norm/resources/naming-and-sorting-test/004-tx-data-and-tx-fn-test.edn @@ -0,0 
+1,5 @@ +{:tx-data [{:db/doc "Simpsons children reference" + :db/ident :character/children + :db/valueType :db.type/ref + :db/cardinality :db.cardinality/many}] + :tx-fn datahike.norm.norm-test/tx-data-and-tx-fn-test-fn} diff --git a/test/datahike/norm/resources/naming-and-sorting-test/01-transact-basic-characters.edn b/test/datahike/norm/resources/naming-and-sorting-test/01-transact-basic-characters.edn new file mode 100644 index 000000000..365c66926 --- /dev/null +++ b/test/datahike/norm/resources/naming-and-sorting-test/01-transact-basic-characters.edn @@ -0,0 +1,2 @@ +{:tx-data [{:character/name "Bart Simpson"} + {:character/name "Lisa Simpson"}]} diff --git a/test/datahike/norm/resources/naming-and-sorting-test/02 add occupation.edn b/test/datahike/norm/resources/naming-and-sorting-test/02 add occupation.edn new file mode 100644 index 000000000..342768b85 --- /dev/null +++ b/test/datahike/norm/resources/naming-and-sorting-test/02 add occupation.edn @@ -0,0 +1,5 @@ +{:tx-data [{:db/doc "Occupation" + :db/ident :character/occupation + :db/valueType :db.type/keyword + :db/cardinality :db.cardinality/one}] + :tx-fn datahike.norm.norm-test/naming-and-sorting-test-fn} diff --git a/test/datahike/norm/resources/naming-and-sorting-test/checksums.edn b/test/datahike/norm/resources/naming-and-sorting-test/checksums.edn new file mode 100644 index 000000000..9d9b35061 --- /dev/null +++ b/test/datahike/norm/resources/naming-and-sorting-test/checksums.edn @@ -0,0 +1,12 @@ +{:001-a1-example.edn + "b28aa2247b363cb556b4abba3593774b6c960d5958458bcf3ed46d86e355d563ea77c66b1f4a21cb35f232227a8d4747dd7aec519bab036c706670eb5cf4a05c", + :002-a2-example.edn + "0a143795833f67d022ec26283fc57dd951fb0aac3e830336f56606d893352c274cbc72682097e60372c5724af619451fac05a0c8dad45f8f5358d734981dfcdc", + :003-tx-fn-test.edn + "221bca617e681443437ddeccee0aaafc9c63aa72f9c84fd84458d9f8ad0be6aa6e3095a716aa2b255374bd8992c0638ec4b038b5f862745cee859022ddb0880d", + :004-tx-data-and-tx-fn-test.edn + 
"d26413ec992aaf5c7f0703855212906df0311b27be170cc9012242f47a36c72d52e662977dc9f66fd39c11979d09880323ce7e69a3de10fbd760b59bb985b3b4", + :01-transact-basic-characters.edn + "7e826c77fdabbca47a697dab41510418d11a82ece9735b61c61c551d1d4a336f1becc15527b920bc677171e5b9335acf8c4ab35136fa0b66f33e49cfb90f9ff6", + :02-add-occupation.edn + "9d11ec4f9de67561da29a337cee59d68dac01fba3edac0de6241a7671a5a26f976a1302c59b2b0b8743104a859c6e019810fc9742d30829883aabd7e28ec5e3b"} diff --git a/test/datahike/norm/resources/simple-test/001-a1-example.edn b/test/datahike/norm/resources/simple-test/001-a1-example.edn new file mode 100644 index 000000000..53b124a52 --- /dev/null +++ b/test/datahike/norm/resources/simple-test/001-a1-example.edn @@ -0,0 +1,5 @@ +{:tx-data [{:db/doc "Place of occupation" + :db/ident :character/place-of-occupation + :db/valueType :db.type/string + :db/cardinality :db.cardinality/one}] + :tx-fn datahike.norm.norm/neutral-fn} diff --git a/test/datahike/norm/resources/simple-test/002-a2-example.edn b/test/datahike/norm/resources/simple-test/002-a2-example.edn new file mode 100644 index 000000000..3c5597a60 --- /dev/null +++ b/test/datahike/norm/resources/simple-test/002-a2-example.edn @@ -0,0 +1,5 @@ +{:tx-data [{:db/doc "Simpsons character name" + :db/ident :character/name + :db/unique :db.unique/identity + :db/valueType :db.type/string + :db/cardinality :db.cardinality/one}]} diff --git a/test/datahike/norm/resources/simple-test/checksums.edn b/test/datahike/norm/resources/simple-test/checksums.edn new file mode 100644 index 000000000..e7a067eda --- /dev/null +++ b/test/datahike/norm/resources/simple-test/checksums.edn @@ -0,0 +1,4 @@ +{:001-a1-example.edn + "b28aa2247b363cb556b4abba3593774b6c960d5958458bcf3ed46d86e355d563ea77c66b1f4a21cb35f232227a8d4747dd7aec519bab036c706670eb5cf4a05c", + :002-a2-example.edn + "0a143795833f67d022ec26283fc57dd951fb0aac3e830336f56606d893352c274cbc72682097e60372c5724af619451fac05a0c8dad45f8f5358d734981dfcdc"} diff --git 
a/test/datahike/norm/resources/tx-data-and-tx-fn-test/first/001-a1-example.edn b/test/datahike/norm/resources/tx-data-and-tx-fn-test/first/001-a1-example.edn new file mode 100644 index 000000000..53b124a52 --- /dev/null +++ b/test/datahike/norm/resources/tx-data-and-tx-fn-test/first/001-a1-example.edn @@ -0,0 +1,5 @@ +{:tx-data [{:db/doc "Place of occupation" + :db/ident :character/place-of-occupation + :db/valueType :db.type/string + :db/cardinality :db.cardinality/one}] + :tx-fn datahike.norm.norm/neutral-fn} diff --git a/test/datahike/norm/resources/tx-data-and-tx-fn-test/first/002-a2-example.edn b/test/datahike/norm/resources/tx-data-and-tx-fn-test/first/002-a2-example.edn new file mode 100644 index 000000000..3c5597a60 --- /dev/null +++ b/test/datahike/norm/resources/tx-data-and-tx-fn-test/first/002-a2-example.edn @@ -0,0 +1,5 @@ +{:tx-data [{:db/doc "Simpsons character name" + :db/ident :character/name + :db/unique :db.unique/identity + :db/valueType :db.type/string + :db/cardinality :db.cardinality/one}]} diff --git a/test/datahike/norm/resources/tx-data-and-tx-fn-test/first/003-tx-fn-test.edn b/test/datahike/norm/resources/tx-data-and-tx-fn-test/first/003-tx-fn-test.edn new file mode 100644 index 000000000..5f99e0470 --- /dev/null +++ b/test/datahike/norm/resources/tx-data-and-tx-fn-test/first/003-tx-fn-test.edn @@ -0,0 +1 @@ +{:tx-fn datahike.norm.norm-test/tx-fn-test-fn} diff --git a/test/datahike/norm/resources/tx-data-and-tx-fn-test/first/checksums.edn b/test/datahike/norm/resources/tx-data-and-tx-fn-test/first/checksums.edn new file mode 100644 index 000000000..ec7ef1644 --- /dev/null +++ b/test/datahike/norm/resources/tx-data-and-tx-fn-test/first/checksums.edn @@ -0,0 +1,6 @@ +{:001-a1-example.edn + "b28aa2247b363cb556b4abba3593774b6c960d5958458bcf3ed46d86e355d563ea77c66b1f4a21cb35f232227a8d4747dd7aec519bab036c706670eb5cf4a05c", + :002-a2-example.edn + 
"0a143795833f67d022ec26283fc57dd951fb0aac3e830336f56606d893352c274cbc72682097e60372c5724af619451fac05a0c8dad45f8f5358d734981dfcdc", + :003-tx-fn-test.edn + "221bca617e681443437ddeccee0aaafc9c63aa72f9c84fd84458d9f8ad0be6aa6e3095a716aa2b255374bd8992c0638ec4b038b5f862745cee859022ddb0880d"} diff --git a/test/datahike/norm/resources/tx-data-and-tx-fn-test/second/004-tx-data-and-tx-fn-test.edn b/test/datahike/norm/resources/tx-data-and-tx-fn-test/second/004-tx-data-and-tx-fn-test.edn new file mode 100644 index 000000000..bd9f4d98e --- /dev/null +++ b/test/datahike/norm/resources/tx-data-and-tx-fn-test/second/004-tx-data-and-tx-fn-test.edn @@ -0,0 +1,5 @@ +{:tx-data [{:db/doc "Simpsons children reference" + :db/ident :character/children + :db/valueType :db.type/ref + :db/cardinality :db.cardinality/many}] + :tx-fn datahike.norm.norm-test/tx-data-and-tx-fn-test-fn} diff --git a/test/datahike/norm/resources/tx-data-and-tx-fn-test/second/checksums.edn b/test/datahike/norm/resources/tx-data-and-tx-fn-test/second/checksums.edn new file mode 100644 index 000000000..c7edc0ef2 --- /dev/null +++ b/test/datahike/norm/resources/tx-data-and-tx-fn-test/second/checksums.edn @@ -0,0 +1,2 @@ +{:004-tx-data-and-tx-fn-test.edn + "d26413ec992aaf5c7f0703855212906df0311b27be170cc9012242f47a36c72d52e662977dc9f66fd39c11979d09880323ce7e69a3de10fbd760b59bb985b3b4"} diff --git a/test/datahike/norm/resources/tx-fn-test/first/001-a1-example.edn b/test/datahike/norm/resources/tx-fn-test/first/001-a1-example.edn new file mode 100644 index 000000000..53b124a52 --- /dev/null +++ b/test/datahike/norm/resources/tx-fn-test/first/001-a1-example.edn @@ -0,0 +1,5 @@ +{:tx-data [{:db/doc "Place of occupation" + :db/ident :character/place-of-occupation + :db/valueType :db.type/string + :db/cardinality :db.cardinality/one}] + :tx-fn datahike.norm.norm/neutral-fn} diff --git a/test/datahike/norm/resources/tx-fn-test/first/002-a2-example.edn b/test/datahike/norm/resources/tx-fn-test/first/002-a2-example.edn new 
file mode 100644 index 000000000..3c5597a60 --- /dev/null +++ b/test/datahike/norm/resources/tx-fn-test/first/002-a2-example.edn @@ -0,0 +1,5 @@ +{:tx-data [{:db/doc "Simpsons character name" + :db/ident :character/name + :db/unique :db.unique/identity + :db/valueType :db.type/string + :db/cardinality :db.cardinality/one}]} diff --git a/test/datahike/norm/resources/tx-fn-test/first/checksums.edn b/test/datahike/norm/resources/tx-fn-test/first/checksums.edn new file mode 100644 index 000000000..e7a067eda --- /dev/null +++ b/test/datahike/norm/resources/tx-fn-test/first/checksums.edn @@ -0,0 +1,4 @@ +{:001-a1-example.edn + "b28aa2247b363cb556b4abba3593774b6c960d5958458bcf3ed46d86e355d563ea77c66b1f4a21cb35f232227a8d4747dd7aec519bab036c706670eb5cf4a05c", + :002-a2-example.edn + "0a143795833f67d022ec26283fc57dd951fb0aac3e830336f56606d893352c274cbc72682097e60372c5724af619451fac05a0c8dad45f8f5358d734981dfcdc"} diff --git a/test/datahike/norm/resources/tx-fn-test/second/003-tx-fn-test.edn b/test/datahike/norm/resources/tx-fn-test/second/003-tx-fn-test.edn new file mode 100644 index 000000000..5f99e0470 --- /dev/null +++ b/test/datahike/norm/resources/tx-fn-test/second/003-tx-fn-test.edn @@ -0,0 +1 @@ +{:tx-fn datahike.norm.norm-test/tx-fn-test-fn} diff --git a/test/datahike/norm/resources/tx-fn-test/second/checksums.edn b/test/datahike/norm/resources/tx-fn-test/second/checksums.edn new file mode 100644 index 000000000..1ca9a4da1 --- /dev/null +++ b/test/datahike/norm/resources/tx-fn-test/second/checksums.edn @@ -0,0 +1,2 @@ +{:003-tx-fn-test.edn + "221bca617e681443437ddeccee0aaafc9c63aa72f9c84fd84458d9f8ad0be6aa6e3095a716aa2b255374bd8992c0638ec4b038b5f862745cee859022ddb0880d"} diff --git a/tests.edn b/tests.edn index 9af0d90b0..c2e1bd1e9 100644 --- a/tests.edn +++ b/tests.edn @@ -11,6 +11,8 @@ #_{:id :cljs :type :kaocha.type/cljs :ns-patterns ["datahike.test."]} + {:id :norm + :test-paths ["test/datahike/norm"]} {:id :integration :test-paths 
["test/datahike/integration_test"]}] ;; More verbose than the default reporter, and with prettier errors