datahike's Introduction

replikativ

Project homepage

Roadmap (suggestions)

0.3.0

  • Investigate JS-side integration of http://y-js.org/
  • Investigate integration with similar systems, e.g. IPFS pubsub
  • Split middleware from replicated datatype implementations
  • Improve network IO library kabel (Android support) [DONE]
  • Move hashing into fetch middleware to simplify parallelization. [DONE]
  • Experimental automatic Gossip protocol
  • Experimental Snapshot Isolation
  • Build reasonably small support libraries to partition application data for efficient client-side consumption with Datomic and DataScript. Look into datsync etc.
  • Add a monitoring interface as a cljs library with basic web views for applications to communicate their syncing state to the user in a uniform way. [DONE]
  • Introduce clojure.spec to stage/... API.

0.4.0

  • Authentication with signed public-private key signatures
  • Model some level of consistency between CRDTs, probably Snapshot Isolation, to compose CRDTs. (NMSI, Antidote, research)
  • Implement more useful CRDTs (counter, vector clock, ...) from the tech review and other papers, and ship them by default.

0.5.0

  • Use P2P block distribution similar to BitTorrent for immutable values
  • Support WebRTC for value distribution similar to BitTorrent
  • Java bindings

Long-term (1.0.0)

  • Encryption of transactions, with the CRDT key encrypted by user keys; public-key schema; explore public/private key solutions. Maybe metadata signing can work (slowly) on a DHT?
  • Distribute bandwidth between CRDTs.
  • Negotiate middlewares with versioning.
  • Implement diverse prototypes, from real-time to "big-data".

Contributors

  • Konrad Kuehne
  • Christian Weilbach

Support

If you would like to get some commercial support for replikativ, feel free to contact us at lambdaforge.

License

Copyright © 2013-2018 Christian Weilbach, Konrad Kühne

Distributed under the Eclipse Public License, the same as Clojure.

datahike's People

Contributors

abrooks, alekcz, bamarco, benfleis, boxed, brandonbloom, claj, dijonkitchen, dthume, frankiesardo, green-coder, grischoun, izirku, jsmassa, kordano, lynaghk, mattsenior, montyxcantsin, mrebbinghaus, ncalexan, purrgrammer, runejuhl, sundbry, thegeez, timokramer, tonsky, vlaaad, wambat, whilo, yflim


datahike's Issues

How to specify a schema

I would like to use a schema the same way datascript offers here:

(let [schema {:aka {:db/cardinality :db.cardinality/many}}]
  (d/create-conn schema))

How would I go about that?

I am basically porting my application from DataScript to datahike so that my app has persistent data, but I have certain schema requirements such as cardinality-many, refs, etc.
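
For reference, other issues further down in this document create the database with a DataScript-style schema map; a minimal sketch, assuming the create-database-with-schema entry point behaves as shown there:

(require '[datahike.api :as d])

;; sketch: DataScript-style schema map supplied at database creation
(def uri "datahike:mem:///schemadb")
(def schema {:aka {:db/cardinality :db.cardinality/many}})
(d/create-database-with-schema uri schema)
(def conn (d/connect uri))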

Schema Migration

I saw in #4 that schema migration is not currently supported. Are there plans to support this in the future? What's the current workaround for schema migration? Load the database with the old schema, dump it to a file, delete the database, create a database with the new schema and import the data dump?
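
For what it's worth, a sketch of exactly that workaround, assuming the datahike.migrate export/import helpers referenced in other issues here are available (uri and new-schema are placeholders, and whether export-db takes a db value or a connection is an assumption):

(require '[datahike.api :as d]
         '[datahike.migrate :refer [export-db import-db]])

;; dump the old database, recreate it with the new schema, re-import
(export-db @(d/connect uri) "/tmp/dump.edn") ; assumed: export-db takes a db value
(d/delete-database uri)
(d/create-database-with-schema uri new-schema)
(import-db (d/connect uri) "/tmp/dump.edn")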

NullPointerException for simple query

Hey, given the file /tmp/example.edn with this content:

#datahike/Datom [1 :user/name "max" 536870913 true]
#datahike/Datom [9 :comment/upvotes 1 536870914 true]

then this code:

  (let [schema { :user/name {:db/type        :db.type/string
                             :db/unique      :db.unique/identity}

                :comment/upvotes {:db/valueType   :db.type/ref
                                   :db/cardinality :db.cardinality/many
                                   :db/isComponent true}}
        dburl "datahike:file:///tmp/example.db"
        _     (datahike.api/delete-database dburl)
        _ (datahike.api/create-database-with-schema dburl schema)
        conn (datahike.api/connect dburl)]

    (datahike.migrate/import-db conn "/tmp/example.edn")

    (d/q '[:find ?cid
           :in $ ?uname
           :where
           [?uid :user/name ?uname]
           [?cid :comment/upvotes ?uid]]
         "max"))

throws a big NPE:

java.lang.NullPointerException
	at clojure.lang.Numbers.ops(Numbers.java:1018)
	at clojure.lang.Numbers.equiv(Numbers.java:217)
	at clojure.lang.Numbers.equiv(Numbers.java:213)
	at clojure.lang.Numbers.equiv(Numbers.java:3933)
	at datahike.db$eval17132$fn__17133.invoke(db.cljc:326)
	at hitchhiker.tree.core$eval13502$fn__13503$G__13493__13510.invoke(core.cljc:114)
	at hitchhiker.tree.messaging$eval14581$lookup_fwd_iter__14587$fn__14589.invoke(messaging.cljc:264)
	at clojure.core$drop_while$step__5661.invoke(core.clj:2966)
	at clojure.core$drop_while$fn__5664.invoke(core.clj:2969)
	at clojure.lang.LazySeq.sval(LazySeq.java:40)
	at clojure.lang.LazySeq.seq(LazySeq.java:49)
	at clojure.lang.RT.seq(RT.java:528)
	at clojure.lang.SeqIterator.hasNext(SeqIterator.java:38)
	at clojure.lang.TransformerIterator.step(TransformerIterator.java:74)
	at clojure.lang.TransformerIterator.hasNext(TransformerIterator.java:97)
	at clojure.lang.RT.chunkIteratorSeq(RT.java:510)
	at clojure.core$sequence.invokeStatic(core.clj:2654)
	at clojure.core$sequence.invoke(core.clj:2639)
	at datahike.db$slice.invokeStatic(db.cljc:491)
	at datahike.db$slice.invoke(db.cljc:439)
	at datahike.db$slice.invokeStatic(db.cljc:441)
	at datahike.db$slice.invoke(db.cljc:439)
	at datahike.db.DB._search(db.cljc:503)
	at datahike.query$lookup_pattern_db.invokeStatic(query.cljc:363)
	at datahike.query$lookup_pattern_db.invoke(query.cljc:360)
	at datahike.query$lookup_pattern.invokeStatic(query.cljc:395)
	at datahike.query$lookup_pattern.invoke(query.cljc:392)
	at datahike.query$_resolve_clause.invokeStatic(query.cljc:628)
	at datahike.query$_resolve_clause.invoke(query.cljc:616)
	at datahike.query$resolve_clause.invokeStatic(query.cljc:642)
	at datahike.query$resolve_clause.invoke(query.cljc:634)
	at clojure.lang.PersistentVector.reduce(PersistentVector.java:341)
	at clojure.core$reduce.invokeStatic(core.clj:6747)
	at clojure.core$reduce.invoke(core.clj:6730)
	at datahike.query$_q.invokeStatic(query.cljc:645)
	at datahike.query$_q.invoke(query.cljc:644)
	at datahike.query$q.invokeStatic(query.cljc:769)
	at datahike.query$q.doInvoke(query.cljc:754)
	at clojure.lang.RestFn.invoke(RestFn.java:439)
	at maxapp.comment.upvote$eval68731.invokeStatic(form-init3536000562099756439.clj:15)
	at maxapp.comment.upvote$eval68731.invoke(form-init3536000562099756439.clj:1)
	at clojure.lang.Compiler.eval(Compiler.java:7062)
	at clojure.lang.Compiler.eval(Compiler.java:7025)
	at clojure.core$eval.invokeStatic(core.clj:3206)
	at clojure.core$eval.invoke(core.clj:3202)
	at clojure.main$repl$read_eval_print__8572$fn__8575.invoke(main.clj:243)
	at clojure.main$repl$read_eval_print__8572.invoke(main.clj:243)
	at clojure.main$repl$fn__8581.invoke(main.clj:261)
	at clojure.main$repl.invokeStatic(main.clj:261)
	at clojure.main$repl.doInvoke(main.clj:177)
	at clojure.lang.RestFn.applyTo(RestFn.java:137)
	at clojure.core$apply.invokeStatic(core.clj:657)
	at clojure.core$apply.invoke(core.clj:652)
	at refactor_nrepl.ns.slam.hound.regrow$wrap_clojure_repl$fn__59911.doInvoke(regrow.clj:18)
	at clojure.lang.RestFn.invoke(RestFn.java:1523)
	at clojure.tools.nrepl.middleware.interruptible_eval$evaluate$fn__47777.invoke(interruptible_eval.clj:87)
	at clojure.lang.AFn.applyToHelper(AFn.java:152)
	at clojure.lang.AFn.applyTo(AFn.java:144)
	at clojure.core$apply.invokeStatic(core.clj:657)
	at clojure.core$with_bindings_STAR_.invokeStatic(core.clj:1965)
	at clojure.core$with_bindings_STAR_.doInvoke(core.clj:1965)
	at clojure.lang.RestFn.invoke(RestFn.java:425)
	at clojure.tools.nrepl.middleware.interruptible_eval$evaluate.invokeStatic(interruptible_eval.clj:85)
	at clojure.tools.nrepl.middleware.interruptible_eval$evaluate.invoke(interruptible_eval.clj:55)
	at clojure.tools.nrepl.middleware.interruptible_eval$interruptible_eval$fn__47822$fn__47825.invoke(interruptible_eval.clj:222)
	at clojure.tools.nrepl.middleware.interruptible_eval$run_next$fn__47817.invoke(interruptible_eval.clj:190)
	at clojure.lang.AFn.run(AFn.java:22)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)

Am I doing something wrong? This looks pretty straightforward to me.

Remove unnecessary dependencies

Currently datahike automatically pulls in ClojureScript, timbre, Guava and a few other things. We should make these dependencies optional by going through our stack. This will also make it easier to run datahike on GraalVM. @borkdude
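
As a stopgap, consumers can exclude heavy transitive dependencies themselves; a hypothetical Leiningen sketch (the excluded coordinates are illustrative, and excluding them may break features that depend on them):

;; project.clj :dependencies (illustrative only)
[io.replikativ/datahike "0.1.3"
 :exclusions [org.clojure/clojurescript com.taoensso/timbre]]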

File-based storage medium fails in Windows 10

Hi,

I try to execute (def conn (connect uri)) after calling (create-database uri) with (def uri "datahike:file:///U:/path/to/my/database") and get the exception below. It seems to create a database which cannot be read afterwards. My wild guess is that the parser mixes file-path and URI separators on Windows, which differ from Unix-based systems, as indicated by the Illegal char <:> at index 2 in the error description.

I use Windows 10, Oracle JDK 10.0.1+10, Clojure 1.9.0, datahike 0.1.2

The created database files: database.zip

The LevelDB storage medium seems to work.

Exception

CompilerException clojure.lang.ExceptionInfo: Could not read key. {:type :read-error, :key :db, :exception #error {
 :cause "Illegal char <:> at index 2: /U:/path/to/my/database/data/0594e3b6-9635-5c99-8142-412accf3023b"
 :via
 [{:type java.nio.file.InvalidPathException
   :message "Illegal char <:> at index 2: /U:/path/to/my/database/data/0594e3b6-9635-5c99-8142-412accf3023b"
   :at [sun.nio.fs.WindowsPathParser normalize nil -1]}]
 :trace
 [[sun.nio.fs.WindowsPathParser normalize nil -1]
  [sun.nio.fs.WindowsPathParser parse nil -1]
  [sun.nio.fs.WindowsPathParser parse nil -1]
  [sun.nio.fs.WindowsPath parse nil -1]
  [sun.nio.fs.WindowsFileSystem getPath nil -1]
  [konserve.filestore$read_edn invokeStatic "filestore.clj" 169]
  [konserve.filestore$read_edn invoke "filestore.clj" 163]
  [konserve.filestore.FileSystemStore _get_in "filestore.clj" 341]
  [konserve.core$get_in$fn__15601$state_machine__8749__auto____15606$fn__15609 invoke "core.cljc" 71]
  [konserve.core$get_in$fn__15601$state_machine__8749__auto____15606 invoke "core.cljc" 68]
  [clojure.core.async.impl.ioc_macros$run_state_machine invokeStatic "ioc_macros.clj" 973]
  [clojure.core.async.impl.ioc_macros$run_state_machine invoke "ioc_macros.clj" 972]
  [clojure.core.async.impl.ioc_macros$run_state_machine_wrapped invokeStatic "ioc_macros.clj" 977]
  [clojure.core.async.impl.ioc_macros$run_state_machine_wrapped invoke "ioc_macros.clj" 975]
  [konserve.core$get_in$fn__15601 invoke "core.cljc" 68]
  [clojure.lang.AFn run "AFn.java" 22]
  [java.util.concurrent.ThreadPoolExecutor runWorker nil -1]
  [java.util.concurrent.ThreadPoolExecutor$Worker run nil -1]
  [java.lang.Thread run nil -1]]}}, compiling:(form-init9469297056318643012.clj:12:11) 

hash changes for the same transaction

The same transaction on a db value results in different hash values:

(require '[datahike.core :as d])

(def db (d/db-with (d/empty-db) [{:db/id 1 :name "Konrad"}]))
(hash (d/db-with db [[:db.fn/retractEntity 1]]))
;; => 32706373
(hash (d/db-with db [[:db.fn/retractEntity 1]]))
;; => 13589158

Reevaluate custom datom storage layouts

If we do not use custom types for the indices and datoms in serialized form, then we should remove the temporary code. The index should also be made pluggable.

Fix schema handling

Cardinality settings on entities are currently not respected, i.e. updated unique attributes sometimes remain visible, as in the example from #11.

datahike api does not support lazy data structures

The Datahike API fails to support lazy sequences (clojure.lang.LazySeq). Seeing how pervasive lazy sequences are in Clojure, I would expect this to work, realizing the sequences where required. I have observed this behaviour in both q and transact.

For q it's as easy as adding this defmethod to api.cljc:

(defmethod d/q clojure.lang.LazySeq
  [query & inputs]
  (apply d/q (vec query) inputs))

I have not looked into what it would take to do something similar for transact.
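
For transact, a wrapper that realizes lazy sequences up front might be enough in the meantime; a sketch (transact-eager is a hypothetical helper, not part of the API):

(defn transact-eager
  "Hypothetical helper: realize lazy tx data, both the top-level
  sequence and any lazy clauses, before handing it to datahike."
  [conn tx-data]
  ;; mapv realizes a lazy top-level seq; vec realizes lazy clauses
  (d/transact conn (mapv #(if (seq? %) (vec %) %) tx-data)))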

Using idents as query parameters doesn't unify them

While using idents I noticed that keyword literals are unified just fine in queries. However, when passing ident keywords as arguments they are not unified. A minimal example looks like this:

(require '[datahike.api :as d])

(def uri "datahike:mem:memdb")

(def schema [{:db/ident :ident
              :db/valueType :db.type/ref
              :db/cardinality :db.cardinality/one}
             {:db/ident :an/ident}])

(d/create-database uri :initial-tx schema)

(def conn (d/connect uri))

@(d/transact! conn [[:db/add 1 :ident :an/ident]])
@(d/transact! conn [[:db/add 2 :ident :an/ident]])

(d/q '[:find ?e
       :where [?e :ident :an/ident]]
     (d/db conn))
;;=> #{[2] [1]}

(d/q '[:find ?e
       :in $ ?ident
       :where [?e :ident ?ident]]
     (d/db conn)
     :an/ident)
;;=> #{}

The above queries are equivalent, so I'd expect to get the same result from both.

migrate namespace missing?

I'm so excited about this project, thank you.

Where has the datahike.migrate namespace gone? Is there a new way to export and import?

(require '[datahike.migrate :refer [export-db import-db]]) no longer works.

datahike on Android?

Could anyone tell me if datahike can be used in an Android app on React Native?

If not, any plan to support Android?

Updating raw vector and map values fails

I have a schema with two attributes of :db.cardinality/one. In :my/vec I'm going to store a vector and in :my/map a hash-map (not a good design obviously, but it's useful for quick prototyping). The first transaction, which adds new data, works as expected; however, another one which is supposed to change the stored values fails.

Splitting them into separate transactions shows that updating the vector actually adds another value to the db (as if the schema were being ignored), and updating the map silently fails. It works just the same for a db without a schema (:my/vec is appended, :my/map fails).

I've tested the same code with DataScript and it works as expected - both values were changed.
See example below:

(def schema {:my/id {:db/cardinality :db.cardinality/one
                     :db/unique :db.unique/identity}
             :my/vec {:db/cardinality :db.cardinality/one}
             :my/map {:db/cardinality :db.cardinality/one}})
(def uri "datahike:mem:///dbtest")
(d/create-database-with-schema uri schema)
(def conn (d/connect uri))

@(d/transact conn [{:my/id "x"
                    :my/vec ["a" "b" "c"]
                    :my/map {:a "a" :b "b" :c "c"}}])
(d/seek-datoms @conn :eavt) ; all fine, three datoms there

@(d/transact conn [[:db/add [:my/id "x"] :my/vec ["x" "y" "z"]]
                   [:db/add [:my/id "x"] :my/map {:x "x" :y "y" :z "z"}]])
(d/seek-datoms @conn :eavt) ; problem: nothing changed in db, also no exception or warning

;let's try it separately
@(d/transact conn [[:db/add [:my/id "x"] :my/vec ["x" "y" "z"]]])
(d/seek-datoms @conn :eavt) ; problem: new value was added instead of replacing the old one
(d/pull @conn '[*] [:my/id "x"]) ; returns old value for :my/vec
(d/q '[:find ?e ?v
       :where [?e :my/vec ?v]]
  @conn) ; shows both values for the same entity

@(d/transact conn [[:db/add [:my/id "x"] :my/map {:x "x" :y "y" :z "z"}]])
(d/seek-datoms @conn :eavt) ; problem: value was not added
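
Until the cardinality handling is fixed, explicitly retracting the stale value before adding the new one may serve as a stopgap; a sketch, assuming :db/retract behaves as in DataScript/Datomic:

;; workaround sketch: pull the old value, retract it, then add the new one
(let [old-vec (:my/vec (d/pull @conn '[:my/vec] [:my/id "x"]))]
  @(d/transact conn [[:db/retract [:my/id "x"] :my/vec old-vec]
                     [:db/add     [:my/id "x"] :my/vec ["x" "y" "z"]]]))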

Question: Will datahike allow arbitrary "forking" of db?

Datomic, through its transaction process, ensures there's one linear history for the db (although you can create temporary speculative copies). With Datascript, the DB is like any other Clojure data structure and you can make arbitrary changes off of any version of the DB, and all are equally valid. The connection as an atom is just a convenience if you want to use it in a linear fashion.

Which model is datahike going to follow?
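
For context, a minimal sketch of the DataScript-style forking the question refers to, using the datahike.core entry points that appear in other issues here:

(require '[datahike.core :as d])

;; db values are immutable, so any version can be extended speculatively
(def db     (d/db-with (d/empty-db) [{:db/id 1 :name "A"}]))
(def fork-1 (d/db-with db [{:db/id 2 :name "B"}]))
(def fork-2 (d/db-with db [{:db/id 2 :name "C"}]))
;; db, fork-1 and fork-2 are three independent, equally valid values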

Support durable transactor functions

Currently transactor functions work by directly referencing vars in the current execution context. This will not work reliably once we get replicated. We should draft a durable deployment scheme for these functions. A design issue is to keep cljs support working in a single runtime without eval.

LevelDB vs file?

When would I use leveldb vs a direct file? What are the implications of each?

misuse of transient

Hi,

I was checking out whether I can find low-hanging fruit/obvious performance optimizations to apply by just reading the code before pulling out a profiler, and I stumbled upon this:

(defn into-via-doseq [to from]
  (let [res (transient [])]
    (doseq [x from]  ;; checking chunked iter
      (conj! res x))
    (persistent! res)))

It's a bit weird for 2 reasons:

  • to is never used, I guess the intent was to use it as a source to transient
  • while this might work because of an implementation detail of vectors, this is a misuse of transient as stated by the doc. It probably doesn't matter much since it's in the tests, but you can simply replace that with (into [] ..) which is both faster (from what I tested) and correct.

Transients support a parallel set of 'changing' operations, with similar names followed by ! - assoc!, conj! etc. These do the same things as their persistent counterparts except the return values are themselves transient. Note in particular that transients are not designed to be bashed in-place. You must capture and use the return value in the next call. In this way, they support the same code structure as the functional persistent code they replace. As the example will show, this will allow you to easily enhance the performance of a piece of code without structural change.
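
Concretely, the suggested replacement (which finally uses to, at the cost of no longer exercising doseq's chunked-iteration path):

(defn into-via-doseq [to from]
  ;; `into` uses transients correctly under the hood and respects `to`
  (into to from))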

On a side note, I found this by grepping for uses of doseq that could potentially be replaced with run!, since run! hits the reducible path of the source when possible/makes sense.
I have a few other things like this that could be easy gains if you are interested in a PR. I can also submit a PR to fix this one.

lein repl crashes with OpenJDK 10 on Ubuntu 18.04

Process crashes with java.lang.ClassNotFoundException: javax.xml.bind.DatatypeConverter, compiling:(cljs/closure.clj:1:1) with

java --version
openjdk 10.0.2 2018-07-17
OpenJDK Runtime Environment (build 10.0.2+13-Ubuntu-1ubuntu0.18.04.3)
OpenJDK 64-Bit Server VM (build 10.0.2+13-Ubuntu-1ubuntu0.18.04.3, mixed mode)
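
The java.xml.bind module stopped being resolved by default in JDK 9 and was removed entirely in JDK 11, which is the usual cause of this ClassNotFoundException when compiling ClojureScript. A sketch of the common workaround (version illustrative):

;; project.clj: add the API back as an explicit dependency
:dependencies [... [javax.xml.bind/jaxb-api "2.3.1"]]
;; or, on JDK 9/10 only, re-enable the module instead
:jvm-opts ["--add-modules" "java.xml.bind"]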

Performance issues

Hi,
first of all, thanks for datahike. I really think this software has its place in the Clojure ecosystem.
So I started using it, and I created a small (110k eids) file database.
The query engine exhibits some strange performance behaviour. I'll use my db for the examples since I didn't try to reproduce it on another database.

(time (ds/q '[:find ?p ?v
                   :where
                   [2 ?p ?v]] @db ))
"Elapsed time: 5.704363 msecs"
#{[:similarity/name "w2v"] [:w2v/iterations 10] [:w2v/window-size 5] [:w2v/layer-size 20] [:w2v/seed 42] [:w2v/stop-words []] [:w2v/min-word-frequency 5]}
;; using reference directly is really fast

(time (ds/q '[:find ?sim :where
                    [?id :trajectory/dataset "huffpost"]
                    [?id :trajectory/query "moon"]
                    [?id :trajectory/sim ?sim]] @db))
"Elapsed time: 9.537365 msecs"
#{[2]}
;; Getting reference is really fast

(time (ds/q '[:find ?sim ?p ?v 
                   :where
                  [?id :trajectory/dataset "huffpost"]
                  [?id :trajectory/query "moon"]
                  [?id :trajectory/sim ?sim]
                  [?sim ?p ?v]] @db))
"Elapsed time: 3908.029339 msecs"
#{[2 :w2v/iterations 10] [2 :w2v/min-word-frequency 5] [2 :similarity/name "w2v"] [2 :w2v/stop-words []] [2 :w2v/layer-size 20] [2 :w2v/seed 42] [2 :w2v/window-size 5]}
;; Doing the union is really slow 

(time (ds/q '[:find ?sim ?p ?v :where
                                     [?id :trajectory/sim ?sim]
                                     [?sim ?p ?v]] @db))
"Elapsed time: 4246.928569 msecs" 
#{[25490 :w2v/min-word-frequency 5] [84456 :w2v/seed 42] [41442 :w2v/window-size 5] [98356 :w2v/layer-size 20]  ...}
;; not doing the union gives similar time (somewhat slower)

Any idea what may cause that tremendous union overhead?

entity/touch entity/get not resolving db/ident

I have a DB with a bunch of idents, and when I get an entity out of the DB and use (get) or (touch) on an ident ref, it will come back as a set of :db/id.

Is the expected behaviour for get or touch in this case to give me back the translated value of the ident (in this case keywords)?

Also, possibly related: when I use (walk/postwalk identity entity) it throws an error. This is something that works in Datomic.

Anyway, if idents are supposed to resolve the way I described, I can try to add that functionality; otherwise working with idents seems to be a big PITA.

Storing java.math.BigDecimal silently "disconnects" from storage

Example:

(def uri (str "datahike:file://" (.getAbsolutePath (java.io.File. "testdb"))))
(d/create-database uri)
(def conn (d/connect uri))

@(d/transact conn [{:my/prop1 42.5}])
(d/seek-datoms @conn :eavt) ; all fine, our single datom is there

@(d/transact conn [{:my/prop2 42.5M}])
(d/seek-datoms @conn :eavt) ; still fine, we have two datoms now
(d/q '[:find [(pull ?e [*]) ...]
       :where [?e :my/prop2]]
  @conn)
(type (:my/prop2 (first *1))) ; -> java.math.BigDecimal

@(d/transact conn [{:my/prop3 "abc"}])
(d/seek-datoms @conn :eavt) ; now we have three datoms

(d/release conn)

(def conn (d/connect uri))
(d/seek-datoms @conn :eavt) ; only our first datom is there

(d/release conn)
(d/delete-database uri)

It doesn't matter whether the db was created with or without a schema. Until disconnecting, all transacted data is available, but after reconnecting everything from the BigDecimal insert onwards is gone.

I would expect inserting an unsupported data type to throw an exception.

Adding data to the DB is very very slow

I am adding 1000 small objects to the DB at once (maybe 30 datoms) and it's taking 72 seconds.

I'm using transact!

Is this expected? Is it possible that this is an area for easy optimization? Is this possibly an issue with the hitchhiker tree?

Cannot open a relative filepath

Hey! Thanks for this amazing library.

In my application I'd like to open a database at a relative file path. I've tried many permutations of paths but all seem to throw errors in one form or another. See below for my attempts and their results. Is this possible?

(d/create-database "datahike:file://./data/db")
1. Unhandled java.io.IOException
   Permission denied

(Presumably trying to write to /data instead of ./data)

(d/create-database "datahike:file:data/db")
;; or
(d/create-database "datahike:file:./data/db")

Both throw:

1. Unhandled java.lang.NullPointerException
   (No message)

             filestore.clj:  395  konserve.filestore/check-and-create-folder
             filestore.clj:  392  konserve.filestore/check-and-create-folder
             filestore.clj:  439  konserve.filestore/new-fs-store
             filestore.clj:  432  konserve.filestore/new-fs-store
               RestFn.java:  410  clojure.lang.RestFn/invoke
                store.cljc:   69  datahike.store$eval51949$fn__51951/invoke
              MultiFn.java:  229  clojure.lang.MultiFn/invoke
            connector.cljc:  147  datahike.connector$eval52150$fn__52157/invoke
            connector.cljc:   73  datahike.connector$eval52088$fn__52089$G__52075__52096/invoke
                  AFn.java:  156  clojure.lang.AFn/applyToHelper
                  AFn.java:  144  clojure.lang.AFn/applyTo
                  core.clj:  667  clojure.core/apply
                  core.clj:  660  clojure.core/apply
            connector.cljc:   85  datahike.connector$eval52140$fn__52143/doInvoke
               RestFn.java:  423  clojure.lang.RestFn/invoke
            connector.cljc:   73  datahike.connector$eval52088$fn__52089$G__52075__52096/invoke
            connector.cljc:  183  datahike.connector$create_database/invokeStatic
            connector.cljc:  182  datahike.connector$create_database/doInvoke
               RestFn.java:  410  clojure.lang.RestFn/invoke
                      REPL:  101  tabrasa.store/eval52344
                      REPL:  101  tabrasa.store/eval52344
             Compiler.java: 7177  clojure.lang.Compiler/eval
             Compiler.java: 7132  clojure.lang.Compiler/eval
                  core.clj: 3214  clojure.core/eval
                  core.clj: 3210  clojure.core/eval
                  main.clj:  437  clojure.main/repl/read-eval-print/fn
                  main.clj:  437  clojure.main/repl/read-eval-print
                  main.clj:  458  clojure.main/repl/fn
                  main.clj:  458  clojure.main/repl
                  main.clj:  368  clojure.main/repl
               RestFn.java:  137  clojure.lang.RestFn/applyTo
                  core.clj:  665  clojure.core/apply
                  core.clj:  660  clojure.core/apply
                regrow.clj:   18  refactor-nrepl.ns.slam.hound.regrow/wrap-clojure-repl/fn
               RestFn.java: 1523  clojure.lang.RestFn/invoke
    interruptible_eval.clj:   79  nrepl.middleware.interruptible-eval/evaluate
    interruptible_eval.clj:   55  nrepl.middleware.interruptible-eval/evaluate
    interruptible_eval.clj:  142  nrepl.middleware.interruptible-eval/interruptible-eval/fn/fn
                  AFn.java:   22  clojure.lang.AFn/run
               session.clj:  171  nrepl.middleware.session/session-exec/main-loop/fn
               session.clj:  170  nrepl.middleware.session/session-exec/main-loop
                  AFn.java:   22  clojure.lang.AFn/run
               Thread.java:  835  java.lang.Thread/run

(d/create-database "datahike:file://data/db")
1. Unhandled java.io.IOException
   No such file or directory

       UnixFileSystem.java:   -2  java.io.UnixFileSystem/createFileExclusively
                 File.java: 1024  java.io.File/createNewFile
             filestore.clj:  398  konserve.filestore/check-and-create-folder
             filestore.clj:  392  konserve.filestore/check-and-create-folder
             filestore.clj:  439  konserve.filestore/new-fs-store
             filestore.clj:  432  konserve.filestore/new-fs-store
               RestFn.java:  410  clojure.lang.RestFn/invoke
                store.cljc:   69  datahike.store$eval51949$fn__51951/invoke
              MultiFn.java:  229  clojure.lang.MultiFn/invoke
            connector.cljc:  147  datahike.connector$eval52150$fn__52157/invoke
            connector.cljc:   73  datahike.connector$eval52088$fn__52089$G__52075__52096/invoke
                  AFn.java:  156  clojure.lang.AFn/applyToHelper
                  AFn.java:  144  clojure.lang.AFn/applyTo
                  core.clj:  667  clojure.core/apply
                  core.clj:  660  clojure.core/apply
            connector.cljc:   85  datahike.connector$eval52140$fn__52143/doInvoke
               RestFn.java:  423  clojure.lang.RestFn/invoke
            connector.cljc:   73  datahike.connector$eval52088$fn__52089$G__52075__52096/invoke
            connector.cljc:  183  datahike.connector$create_database/invokeStatic
            connector.cljc:  182  datahike.connector$create_database/doInvoke
               RestFn.java:  410  clojure.lang.RestFn/invoke
                      REPL:  101  tabrasa.store/eval52350
                      REPL:  101  tabrasa.store/eval52350
             Compiler.java: 7177  clojure.lang.Compiler/eval
             Compiler.java: 7132  clojure.lang.Compiler/eval
                  core.clj: 3214  clojure.core/eval
                  core.clj: 3210  clojure.core/eval
                  main.clj:  437  clojure.main/repl/read-eval-print/fn
                  main.clj:  437  clojure.main/repl/read-eval-print
                  main.clj:  458  clojure.main/repl/fn
                  main.clj:  458  clojure.main/repl
                  main.clj:  368  clojure.main/repl
               RestFn.java:  137  clojure.lang.RestFn/applyTo
                  core.clj:  665  clojure.core/apply
                  core.clj:  660  clojure.core/apply
                regrow.clj:   18  refactor-nrepl.ns.slam.hound.regrow/wrap-clojure-repl/fn
               RestFn.java: 1523  clojure.lang.RestFn/invoke
    interruptible_eval.clj:   79  nrepl.middleware.interruptible-eval/evaluate
    interruptible_eval.clj:   55  nrepl.middleware.interruptible-eval/evaluate
    interruptible_eval.clj:  142  nrepl.middleware.interruptible-eval/interruptible-eval/fn/fn
                  AFn.java:   22  clojure.lang.AFn/run
               session.clj:  171  nrepl.middleware.session/session-exec/main-loop/fn
               session.clj:  170  nrepl.middleware.session/session-exec/main-loop
                  AFn.java:   22  clojure.lang.AFn/run
               Thread.java:  835  java.lang.Thread/run
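
A workaround that sidesteps the relative-path handling entirely is to build an absolute path up front, as the BigDecimal example elsewhere in these issues does:

(def uri (str "datahike:file://"
              (.getAbsolutePath (java.io.File. "data/db"))))
(d/create-database uri)
(def conn (d/connect uri))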

tools.reader error on repl start

On starting a lein repl, it crashes. Top of stacktrace:

#error {
 :cause reader-error does not exist
 :via
 [{:type clojure.lang.Compiler$CompilerException
   :message Syntax error compiling at (clojure/tools/reader/edn.clj:1:1).
   :data #:clojure.error{:phase :compile-syntax-check, :line 1, :column 1, :source clojure/tools/reader/edn.clj}
   :at [clojure.lang.Compiler load Compiler.java 7647]}
  {:type java.lang.IllegalAccessError
   :message reader-error does not exist
   :at [clojure.core$refer invokeStatic core.clj 4249]}]
 :trace

[...]

I found out that a very old version of org.clojure/tools.reader is used. Relevant part of lein deps :tree:

[io.replikativ/datahike "0.1.3"]
   [io.replikativ/hitchhiker-tree "0.1.4"]
     [com.taoensso/carmine "2.12.2"]
       [com.taoensso/encore "2.32.0"]
         [com.taoensso/truss "1.0.0"]
         [org.clojure/tools.reader "0.10.0"]
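
A sketch of the usual workaround: exclude the stale transitive version and pin a newer one explicitly (version illustrative):

;; project.clj, illustrative versions
:dependencies [[io.replikativ/datahike "0.1.3"
                :exclusions [org.clojure/tools.reader]]
               [org.clojure/tools.reader "1.3.2"]]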

transact vs transact!

Hi there 👋

Amazing project, thank you for sharing.

Out of curiosity, why transact instead of transact!? Both DataScript and Datomic use the bang ! convention; I'm wondering why Datahike doesn't do the same.

Konserve crash with a throwaway database

Trying to transact into a throwaway database:

(defn m-exec []
  (let [transaction-temporary-db
        (datahike.core/create-conn schema)
        _ @(datahike.api/transact transaction-temporary-db transfer-transaction)
...

Results in an exception:

Exception in thread "async-dispatch-9" java.lang.NullPointerException
	at clojure.core$deref_future.invokeStatic(core.clj:2292)
	at clojure.core$deref.invokeStatic(core.clj:2312)
	at clojure.core$deref.invoke(core.clj:2298)
	at konserve.core$get_lock.invokeStatic(core.cljc:29)
	at konserve.core$get_lock.invoke(core.cljc:28)
	at konserve.core$assoc_in$fn__15615$state_machine__8622__auto____15622$fn__15625.invoke(core.cljc:87)
	at konserve.core$assoc_in$fn__15615$state_machine__8622__auto____15622.invoke(core.cljc:87)
	at clojure.core.async.impl.ioc_macros$run_state_machine.invokeStatic(ioc_macros.clj:973)
	at clojure.core.async.impl.ioc_macros$run_state_machine.invoke(ioc_macros.clj:972)
	at clojure.core.async.impl.ioc_macros$run_state_machine_wrapped.invokeStatic(ioc_macros.clj:977)
	at clojure.core.async.impl.ioc_macros$run_state_machine_wrapped.invoke(ioc_macros.clj:975)
	at konserve.core$assoc_in$fn__15615.invoke(core.cljc:87)
	at clojure.lang.AFn.run(AFn.java:22)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:748)
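
The snippet mixes datahike.core (plain in-memory connections) with datahike.api (store-backed connections), which may be what leaves konserve without a backing store. A sketch of a consistent throwaway setup going through the api and a mem store, following the pattern used in other issues here:

(require '[datahike.api :as d])

;; throwaway in-memory database created via the api,
;; so api/transact has a real store behind it
(def uri "datahike:mem:///throwaway")
(d/create-database-with-schema uri schema) ; schema as in the original snippet
(def conn (d/connect uri))
@(d/transact conn transfer-transaction)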

{:db/ident keyword} should be able to be transacted

The official Datomic documents give me the impression that I should be allowed to do something like

(dh.api/transact conn [{:db/ident :a/keyword}])

however, when I attempt to do this I get an error:

   Incomplete schema transaction attributes, expected :db/ident,
   :db/valueType, :db/cardinality
   {:error :transact/schema,
    :entity #:db{:ident :real-estate-equity/medical}}
                   db.cljc: 1301  datahike.db$transact_tx_data/invokeStatic
                   db.cljc: 1234  datahike.db$transact_tx_data/invoke
                 core.cljc:  231  datahike.core$with/invokeStatic
                 core.cljc:  224  datahike.core$with/invoke
                 core.cljc:  438  datahike.core$_transact_BANG_$fn__24426/invoke

reference docs
https://docs.datomic.com/cloud/schema/schema-modeling.html
https://docs.datomic.com/on-prem/identity.html
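
Until bare idents are accepted, the error message itself suggests a stopgap: transact a complete attribute definition. A sketch (the :db/valueType and :db/cardinality below are illustrative filler; the intent of the original entity is a plain enum ident):

(dh.api/transact conn [{:db/ident       :a/keyword
                        ;; illustrative filler to satisfy the schema check
                        :db/valueType   :db.type/keyword
                        :db/cardinality :db.cardinality/one}])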

abstract index structures

  • the db should not differentiate between the hitchhiker tree, the b+-set, or FoundationDB as the index

The following should be done:

  • a protocol for the indices should be created defining basic index functions: slice, rslice, diff, ... (see the sketch after this list)

  • a record should be created for the hitchhiker tree

  • a record should be created for the b+-tree

  • a record should be created for the FoundationDB index
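
A minimal sketch of what such a protocol could look like (all names are illustrative, mirroring the operations listed above):

;; illustrative protocol; each backend record would implement it
(defprotocol PIndex
  (-slice  [index from to] "Ascending range of datoms between from and to.")
  (-rslice [index from to] "Descending range of datoms between from and to.")
  (-diff   [index other]   "Datoms present in this index but not in other."))

(defrecord HitchhikerTreeIndex [tree]
  PIndex
  (-slice  [_ from to] (throw (UnsupportedOperationException. "sketch")))
  (-rslice [_ from to] (throw (UnsupportedOperationException. "sketch")))
  (-diff   [_ other]   (throw (UnsupportedOperationException. "sketch"))))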
