Event Sourcing in Postgres
How to build an Event Sourcing system using one of the best data-store tools available to humankind: Postgres!
Big companies benefit from products like RabbitMQ, Kafka, or Elasticsearch, and have the budget to pay for the resources they require, typically clusters of virtual machines.
Everyone else would benefit from the same patterns, but doesn’t operate at a scale that makes that investment cost-effective.
👉 Money is always a scarce resource 👈
Running a cluster of 3 servers just to store a few million messages in a RabbitMQ queue or Kafka topic is simply too expensive.
A relational database such as PostgreSQL is still among the cheapest options to store a large amount of data in a persistent and reliable way.
In this series of articles, we’re going to explore a data model that lets us store and consume events reliably, allowing concurrent producers and consumers to work together without running into race conditions.
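To make the idea concrete, here is a minimal sketch of what such a model could look like: a single append-only table that producers insert into and consumers read from by offset, using SKIP LOCKED so concurrent workers don’t block one another. The table and column names here are purely illustrative; the actual schema is developed throughout the series.

```sql
-- Illustrative sketch only: an append-only events table
-- (names are hypothetical; the series' schema may differ).
CREATE TABLE IF NOT EXISTS events (
  id         BIGSERIAL PRIMARY KEY,               -- monotonic offset for consumers
  topic      TEXT        NOT NULL,                -- logical stream the event belongs to
  payload    JSONB       NOT NULL,                -- the event body
  created_at TIMESTAMPTZ NOT NULL DEFAULT now()
);

-- Producers simply append:
INSERT INTO events (topic, payload)
VALUES ('user-signup', '{"userId": 123}');

-- Consumers read past their last known offset; SKIP LOCKED lets
-- concurrent workers grab different rows instead of blocking each other.
SELECT id, payload
FROM events
WHERE topic = 'user-signup'
  AND id > 0           -- 0 = last offset this consumer has processed
ORDER BY id
LIMIT 10
FOR UPDATE SKIP LOCKED;
```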
👉 The entire source code, complete with tests and performance analysis is available at: https://github.com/marcopeg/postgres-event-sourcing.