Eva: An open source entity-attribute-value database in Clojure

Eva is a distributed database-system implementing an entity-attribute-value data-model
that is time-aware, accumulative, and atomically consistent. Its API is
by and large compatible with Datomic's. This software should be considered alpha
for the purposes of quality and stability. Check out the FAQ for more info.

Getting Started

If you're brand new to Eva, we suggest reading through this whole readme to familiarize yourself with Eva as a whole. Afterwards, be sure to check out the Eva tutorial series, which breaks down and goes over virtually everything you will want to know.

Required Tools

  1. Java Development Kit (JDK) v8
  2. Leiningen Build Tool

Example: Hello World

First we kick off the REPL with:

lein repl

Next we create a connection (conn) to an in-memory Eva database. We also need to define the fact (datom) we want to add to Eva. Lastly we use the transact call to add the fact into the system.

(def conn (eva/connect {:local true}))
(def datom [:db/add (eva/tempid :db.part/user)
            :db/doc "hello world"])
(deref (eva/transact conn [datom]))

Note: deref can be used interchangeably with the @ symbol.

Now we can run a query to get this fact out of Eva. We do not use conn to make a query; rather, we obtain an immutable database value like so:
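Assuming the Datomic-compatible API, the database value is obtained from the connection like this (a minimal sketch):

```clojure
;; db is an immutable snapshot of the database; queries run against it,
;; not against conn.
(def db (eva/db conn))
```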

Next we execute a query that returns all entity ids in the system matching the doc string "hello world".

(eva/q '[:find ?e :where [?e :db/doc "hello world"]] db)

If we want to return the full representation of these entities, we can do that by adding pull to our query.

(eva/q '[:find (pull ?e [*]) :where [?e :db/doc "hello world"]] db)

Project Structure

  1. project.clj contains the project build configuration
  2. core/* primary releasable codebase for the Eva Transactor and Peer-library
    1. core/src clojure source files
    2. core/java-src java source files
    3. core/test test source files
    4. core/resources non-source files
  3. dev/* codebase used during development, but not released
    1. dev/src clojure source files
    2. dev/java-src java source files
    3. dev/test test source files
    4. dev/resources non-source files
    5. dev/test-resources non-source files used to support integration testing

Development Tasks

Running the Test Suite

lein test

Eva exposes a number of configuration properties that can be set
using Java system properties. Some specific configuration properties can
also be set using environment variables.

The eva.config namespace contains
descriptions and default values for the config vars.
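For illustration only (the property name below is hypothetical; consult eva.config for the real vars and their defaults), a configuration property can be supplied as a Java system property:

```clojure
;; Equivalent to launching the JVM with -Deva.example.property=value.
;; Must be set before the relevant Eva component reads its configuration.
(System/setProperty "eva.example.property" "value")
```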

Entity-Attribute-Value (EAV)

EAV data-entities consist of:

  1. a distinct entity-id
  2. 1-or-more attribute-value pairs associated with a single entity-id

EAV data can be represented in the following (equivalent) forms:

  1. as an object or map:
  2. as a list of EAV tuples:
      [12345, :attribute1, "value1"],
      [12345, :attribute2, "value2"]
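The object/map form in item 1 might look like this (a sketch; the entity-id and attribute names match the tuples above):

```clojure
{:db/id      12345
 :attribute1 "value1"
 :attribute2 "value2"}
```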


To make the EAV data-model time-aware, we extend the EAV tuple
into an EAVT tuple containing the transaction-id (T) that introduced the tuple:

;;  E      A            V         T
   [12345, :attribute1, "value1", 500],
   [12345, :attribute2, "value2", 500]


To make the EAVT data-model accumulative, we extend the EAVT tuple
with a final flag that indicates whether the EAV record was added or removed
at the transaction-id (T).

;;  E      A            V         T    added?
   [12345, :attribute1, "value1", 500, true],
   [12345, :attribute2, "value2", 500, true]

Under this model, common data operations (create, update, delete) are represented like this:

  • Create: a single tuple with added? == true
[[12345, :attribute1, "create entity 12345 with field :attribute1 at transaction 500", 500, true]]
  • Delete: a single tuple with added? == false
[[12345, :attribute1, "create entity 12345 with field :attribute1 at transaction 500", 501, false]]
  • Update: a pair of deletion and creation tuples
 ;; At transaction 502
 ;;   invalidate the old entry for :attribute2
      [12345, :attribute2, "old-value", 502, false]
 ;;   add a new entry for :attribute2
      [12345, :attribute2, "new-value", 502, true]

The full history of the database is the cumulative record of these tuples.

Atomic Consistency

Data-updates are submitted as transactions that are processed atomically.
This means that when you submit a transaction, either all of the changes in
the transaction are applied, or none of the changes are applied.
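As a sketch of this guarantee (the second attribute below is deliberately uninstalled and hypothetical), a transaction containing any invalid command is rejected as a whole:

```clojure
;; :no.such/attribute is not installed in the schema, so the entire
;; transaction fails; the valid :db/doc assertion is NOT applied either.
(try
  @(eva/transact conn
     [[:db/add (eva/tempid :db.part/user) :db/doc "never added"]
      [:db/add (eva/tempid :db.part/user) :no.such/attribute "boom"]])
  (catch Exception e
    (println "transaction rejected; no changes were applied")))
```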


Transactions are submitted as a list of data-modification commands.

The simplest data-modification commands (:db/add, :db/retract) correspond
to the accumulative tuples described above:

  [:db/retract 12345 :attribute2 "old-value"]
  [:db/add 12345 :attribute2 "new-value"]

When this transaction is committed it will produce the following tuples in the database history
(where <tx> is the next transaction-number):

  [12345, :attribute2, "old-value", <tx>, false]
  [12345, :attribute2, "new-value", <tx>, true]

Using the Object/Map form in transactions

In addition to the command-form, you can also create/update records using the
object/map form of an entity:
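The map form can be sketched as follows; it carries the same entity-id and attributes as the equivalent command-form shown next:

```clojure
@(eva/transact conn
   [{:db/id      12345
     :attribute1 "value1"
     :attribute2 "value2"}])
```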


This form is equivalent to the command-form:

  [:db/add 12345 :attribute1 "value1"]
  [:db/add 12345 :attribute2 "value2"]


Because all stored data reduces to EAVT tuples,
schemas are defined per Attribute, rather than per Entity.

Schema definitions are simply Entities that have special schema-attributes.

Defining the schema for :attribute1:
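Reconstructed from the key-value descriptions that follow (a sketch, assuming the Datomic-compatible schema attributes), the definition presumably looks like:

```clojure
@(eva/transact conn
   [{:db/id                 #db/id [:db.part/db]
     :db/ident              :attribute1
     :db/doc                "Schema definition for attribute1"
     :db/valueType          :db.type/string
     :db/cardinality        :db.cardinality/one
     :db.install/_attribute :db.part/db}])
```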


Taking each key-value pair of the example in turn:

  • :db/id #db/id[:db.part/db]: declares a new entity-id in the :db.part/db id-partition
  • :db/ident :attribute1: declares that :attribute1 is an alias for the entity-id
  • :db/doc "Schema definition for attribute1": a human-readable string documenting the purpose of :attribute1
  • :db/valueType :db.type/string: declares that only string values are allowed for :attribute1
  • :db/cardinality :db.cardinality/one: declares that an entity may have no more than one :attribute1.
    This means that for a given entity-id, there will only ever be one current tuple of [<entity-id> :attribute1 <value>].
    Adding a new tuple with this attribute will cause any existing tuple to be removed.
  • :db.install/_attribute :db.part/db: declares that :attribute1 is registered with the database as
    an installed attribute

The included docker compose can be used to spin up a fully integrated Eva environment. This includes:

To spin up said environment run the following commands:

make gen-docker-no-tests # to build Eva with the latest changes
make run-docker

To shut down the environment use the following command:

To open a repl container that can talk to the environment use:

And run the following to initially set up the repl environment:

(require '[eva.catalog.client.alpha.client :as catalog-client])
(def config (catalog-client/request-flat-config "http://eva-catalog:3000" "workiva" "eva-test-1" "test-db-1"))
(def conn (eva/connect config))

Finally, test that everything is working with an empty transaction:

(deref (eva/transact conn []))

A result similar to this should be expected:
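Assuming the Datomic-compatible transaction-result map, the output should resemble (values elided):

```clojure
{:db-before ...   ; database value before the transaction
 :db-after  ...   ; database value after the transaction
 :tx-data   [...] ; datoms produced (just the tx entity, for an empty transaction)
 :tempids   {}}   ; no tempids to resolve
```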


Is this project or Workiva in any way affiliated with Cognitect?

No. Eva is its own project we built from the ground up. The API and high-level
architecture are largely compatible with Datomic, but the database, up to
some EPL code, was entirely built in-house. We have a list of the most
notable API differences here.

Should I use Eva instead of Datomic?

If you are looking for an easy system to move to production quickly, almost
certainly not. Eva is far less mature and has seen
far less battle-testing. Datomic Cloud is an amazing (and supported) product
that is far easier to stand up and run with confidence. Eva is provided as-is.

What are the key differences between Eva and Datomic?

There are a handful of small API differences in the Peer, while the Clients
are quite distinct. For example, a Connection in Eva is constructed using a
baroque configuration map, not a string. From an operational standpoint,
Datomic is far more turn-key. There are also potentially some low-level
architectural differences between the systems that will cause them to
exhibit different run-time characteristics. For example, our indexes are
backed by a persistent B^ε-tree, whereas Datomic's indexes seem to exhibit
properties more like a persistent log-structured merge-tree. For a more detailed
list check here.

Why did Workiva build Eva?

Workiva's business model requires fine-grained and scalable
multi-tenancy with a degree of control and flexibility that satisfies our
own evolving and unique compliance requirements. Additionally, development
on Eva began before many powerful features of Datomic were released, including
Datomic Cloud and Datomic Client.

Why is Workiva open sourcing Eva?

The project by and large is quite feature complete and we believe it is generally
technically sound. Workiva has decided to discontinue closed development on Eva,
but sees a great deal of potential value in opening the code base to the OSS
community. It is our hope that the community finds value in the project.

What will Workiva's ongoing contributions to Eva be?

Eva will likely continue to be maintained and matured in 10% time and by
previous contributors on their personal time.

Maintainers and Contributors

Active Maintainers

Previous Contributors

Listed, in transaction log style, in order of addition to the project:
