Dealing with 10M events #1123
Replies: 7 comments
-
hi @leandronsp
I think storing a large number of domain events will not be a problem. The write operation is just an append-only insert into the table: a single row (or, as of RES 1.1.1, still 2 rows in 2 tables). The reads are more important. Postgres (or any serious database) will handle 10M (or 100M) records without a problem. The only issues might occur if you try to read too many domain events at once. But in normal business-as-usual work you typically should not read more than several, or at most dozens, of them at a time. If you have operations that need to read hundreds or thousands of domain events to perform some business logic (i.e. to handle a web request), then you should rethink your application design: probably your aggregates are too big (been there, done that). BTW in one of our projects we are close to (or maybe already over) 500M stored domain events. Without performance problems.
What is the database you would like to use?
We have information about several projects using RES with a lot more than 10M of domain events stored.
Have you considered migrating to RES 2.x? The release notes for RES 2.0 might be interesting for you https://github.com/RailsEventStore/rails_event_store/releases/tag/v2.0.0
-
Hi @mpraglowski ,
You're right, I forgot to mention that we do not use the domain events for read operations. By design we use them as write models only, and have never needed to read from them (maybe in the future for analytics/insights purposes).
For the second database, I was thinking of using Postgres as well: just a side database with only two tables, which would store the RES events. Instead of having a custom repository, I thought about patching the RES model, as follows:
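The original snippet was not preserved in this thread; a minimal sketch of what such a patch might look like on RES 1.x, assuming a `database.yml` entry named `events` for the side database (the connection name and initializer path are assumptions, not the poster's actual code):

```ruby
# config/initializers/res_second_database.rb (hypothetical sketch)
#
# Point the two RES 1.x ActiveRecord models at a separate database,
# so event writes and reads bypass the primary application database.
require "rails_event_store_active_record"

RailsEventStoreActiveRecord::Event.establish_connection(:events)
RailsEventStoreActiveRecord::EventInStream.establish_connection(:events)
```

Note that this kind of monkey-patched connection means event appends happen outside the primary database's transactions, which is exactly the trade-off raised later in this thread.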
But it's not yet decided to go with this approach; I'm not fond of patching stuff. My only reason to have the second database is to prepare for a future where we need to run some BI operations, archive old events (nonsense, maybe?), etc., all kinds of read operations that could harm the main workload. On the other hand, it's terrific to know that, as you mentioned, you stored 500M events without problems. That's great to see how scalable append-only operations in Postgres are!
Indeed, we are planning to migrate soon to RES 1.3, then RES 2.x. Looks good! I appreciate your useful insights!
-
One note: keeping things in one database simplifies the problem of transactions.
-
Right @andrzejkrzywda, that's an important note. I'll consider your insights, guys, and maybe it's not time to move RES to another store, but to upgrade it to 2.x instead 😅 Many thanks!
-
It seems you do not use Event Sourcing then. Anyway... RES is capable of reading all 10M events; it will not do this at once, as we always read in batches (see EventRepositoryReader for details).
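For illustration, the batched-read idea can be sketched in plain Ruby. This is a simplification, not the actual EventRepositoryReader code (in RES itself you would use something like `client.read.in_batches(100)`); the point is just that memory use stays bounded even with millions of stored events:

```ruby
# Simplified sketch of batched reading (not the real RES internals):
# fetch events page by page instead of loading everything at once.
def read_all_in_batches(store, batch_size: 100)
  return enum_for(:read_all_in_batches, store, batch_size: batch_size) unless block_given?

  offset = 0
  loop do
    batch = store[offset, batch_size]   # one "page" of events
    break if batch.nil? || batch.empty?
    yield batch
    offset += batch_size
  end
end

events  = (1..250).to_a  # stand-in for 250 stored events
batches = read_all_in_batches(events, batch_size: 100).to_a
# batches.map(&:size) => [100, 100, 50]
```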
Ahh, so you want to have RES data in a separate database, right?
So maybe a database replica could be a solution here? Any transactional operation (OLTP systems) reads from and writes to the primary database, the replica is handled by the database engine (they are quite good at this), and all reports & analytics (OLAP systems) read from the replica database (no possibility of writing there). This should be enough to avoid performance issues in OLTP systems while running heavy read operations from OLAP systems.
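For reference, on Rails 6+ a read replica can be declared directly in `database.yml` (a hypothetical config fragment; the host names and the `primary_replica` label are assumptions, and on the poster's Rails 5 setup the replica would instead be configured at the infrastructure level):

```yaml
# config/database.yml (Rails 6+ multiple-databases sketch)
production:
  primary:
    adapter: postgresql
    host: db-primary.internal
    database: app_production
  primary_replica:
    adapter: postgresql
    host: db-replica.internal
    database: app_production
    replica: true   # Rails will not write through this connection
```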
Great! Let us know how it works for you :)
-
@mpraglowski I think after having this insightful discussion, I will keep it in one database only. I no longer have solid reasons to have the second database atm. But in case we decide to go with the second database, maybe we'll split it after the migration to Rails 6, which is planned for this year. Thanks!
-
Finally there's partitioning, if it turns out that recent data is accessed more frequently than past data:
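A hedged sketch of what range partitioning could look like on PostgreSQL 10+. The table and column names follow the default RES schema (`event_store_events`, `created_at`), but treat this as an illustration only: declarative partitioning requires the partition key to be part of the primary key, so a real migration of an existing table needs considerably more care (and downtime planning):

```sql
-- Hypothetical sketch: partition events by month of creation,
-- so queries on recent data touch only the newest partition.
CREATE TABLE event_store_events_partitioned (
  LIKE event_store_events INCLUDING DEFAULTS
) PARTITION BY RANGE (created_at);

CREATE TABLE event_store_events_2021_01
  PARTITION OF event_store_events_partitioned
  FOR VALUES FROM ('2021-01-01') TO ('2021-02-01');
```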
-
Hi everyone, thanks for this awesome gem.
We've been using RES for more than a year (on Rails 5.2), and we have reached 10M events in Postgres. We have a single database for our other models as well as for RES events.
At the moment that's not a problem in terms of performance or stability, but I'm considering writing RES events to another database instead.
According to the documentation I could create a custom repository, but I'm not sure this is good for performance reasons, because of the connection pool and other potential DB connectivity issues on Rails 5.
I wonder if anyone else has already faced such numbers and which approaches they decided to follow.
Versions:
RES 1.1.1
Rails 5.2
Ruby 2.5
Thanks