A guide to how data operations work in the Game Development sector
A data management and optimization system involves several operations, each with its own concerns. In this article, we will look at how some of the main operations are performed within the new data persistence model adopted by Fabamaq.
In Part 1 of this series, we explained the importance of storing information in the casino gaming sector and introduced a new tool developed at Fabamaq to help with data storage and management. Read the first article: The Importance of Optimization in Data Management in the Gaming sector.
How to perform the Saving Data operation
We’ll use the Saving Data operation as an example; the Delete Data operation can be extrapolated from it, since the only difference is the method name: Commit becomes Drop.
This extrapolation works because both saving and deleting are treated as a single atomic commit, with the same data layout, as mentioned in the previous article.
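As a rough illustration of this shared layout, here is a minimal C++ sketch of a single record type that covers both operations. The names here (CommitRecord, Op) are hypothetical stand-ins for illustration only, not Fabamaq’s actual types.

```cpp
#include <string>

// Hypothetical sketch: one record type covers both operations,
// so a delete is just a commit whose operation flag says Drop.
struct CommitRecord {
    enum class Op { Commit, Drop };

    Op op = Op::Commit;    // Commit = save, Drop = delete
    std::string key;       // the key being written or removed
    std::string value;     // payload; left empty for Drop operations
};
```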
It all starts with a request to the backend asking to save a key-value pair. After the necessary validations, the DataStore object forwards the Commit call to the respective CommitEngine’s Serializer version. All of this happens on the core thread.
Meanwhile, a sync thread waits for updates on the CommitsQueue; a new entry triggers commit processing, which calls the Serializer’s serialization method followed by the Save method. This parallelization allows new commits to be pushed even while the more computationally intensive data serialization is running.
Once serialization is done, saving is delegated to each PersistenceDevice, which in turn has its own worker thread for that purpose.
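In broad strokes, such a producer/consumer split could look like the sketch below. It reuses the hypothetical CommitRecord from the earlier sketch; the queue, locking, and loop shown here are assumptions for illustration, not the actual Fabamaq implementation.

```cpp
#include <condition_variable>
#include <mutex>
#include <queue>

// Hypothetical CommitsQueue: the core thread pushes records,
// while the sync thread pops them and runs serialize + save.
class CommitsQueue {
public:
    // Called from the core thread: enqueue a record and wake the sync thread.
    void Push(CommitRecord record) {
        {
            std::lock_guard<std::mutex> lock(mutex_);
            queue_.push(std::move(record));
        }
        cv_.notify_one();
    }

    // Called from the sync thread: block until a record is available.
    CommitRecord Pop() {
        std::unique_lock<std::mutex> lock(mutex_);
        cv_.wait(lock, [this] { return !queue_.empty(); });
        CommitRecord record = std::move(queue_.front());
        queue_.pop();
        return record;
    }

private:
    std::queue<CommitRecord> queue_;
    std::mutex mutex_;
    std::condition_variable cv_;
};

// Sync-thread loop (sketch): serialization and saving happen here,
// so the core thread stays free to keep pushing new commits.
void SyncThreadLoop(CommitsQueue& commits) {
    for (;;) {
        CommitRecord record = commits.Pop();
        // Serialize(record);  // Serializer's serialization method
        // Save(record);       // then delegate to each PersistenceDevice
    }
}
```

The sync thread itself would be launched once at startup, for example with std::thread(SyncThreadLoop, std::ref(queue)).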
Implementing the Getting Data operation
To retrieve data saved under a certain key, a request is made to the backend with that key and the data store where the value lives. The backend then calls the CherryPick method on the CommitEngine’s Serializer, whose implementation depends on the Serializer version.
In the currently implemented versions, there are two distinct approaches (sketched in code after this list):
- Approach 1 - Directly access the local database representation stored in the DataStore JSON object. This is fast, but more memory-expensive;
- Approach 2 - Use the memory map stored locally to retrieve the value for the respective key from the default persistence device.
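Here is a minimal sketch of both approaches, assuming an in-memory key-value cache for version 1 and an offset/size map plus a device read for version 2. The names (MapEntry, CherryPickV1, CherryPickV2) and the file-based device are hypothetical illustrations, not the real CherryPick implementation.

```cpp
#include <cstdint>
#include <fstream>
#include <string>
#include <unordered_map>

// Approach 1 (sketch): the whole database lives in memory, so a lookup
// is a simple map access. Fast, but every value occupies RAM.
std::string CherryPickV1(
    const std::unordered_map<std::string, std::string>& localDatabase,
    const std::string& key) {
    return localDatabase.at(key);
}

// Approach 2 (sketch): only an offset/size entry per key is kept in
// memory; the value itself is read back from the persistence device.
struct MapEntry {
    std::uint64_t offset;  // where the data block starts on the device
    std::uint32_t size;    // size of the data block in bytes
};

std::string CherryPickV2(
    const std::unordered_map<std::string, MapEntry>& memoryMap,
    std::ifstream& device,  // default persistence device, opened in binary mode
    const std::string& key) {
    const MapEntry& entry = memoryMap.at(key);
    std::string value(entry.size, '\0');
    device.seekg(static_cast<std::streamoff>(entry.offset));
    device.read(value.data(), entry.size);
    return value;
}
```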
Possible Strategies for Memory Optimization
The choices made within the system have different impacts on request response time and on how efficiently resources are used. Let’s explore some strategies that can potentially be used for memory optimization.
Different Serializers
As stated above, two serializer versions are implemented. The first relies on storing a full local representation of the whole database; while this is very convenient for retrieving data, it consumes more RAM.
To tackle this issue, a second serializer version was created. In this version, only a memory-map entry is kept for each key, holding the address of the respective data and the size of the data block.
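As a rough, hypothetical back-of-envelope comparison using the MapEntry sketched above: an 8-byte offset plus a 4-byte size pads to 16 bytes per entry, so one million keys cost on the order of 16 MB of map entries (ignoring map and key-string overhead), regardless of how large the values are. A full in-memory representation also pays for every value, so a million 1 KB values alone would occupy roughly 1 GB of RAM.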
Squashing Commits
Since each data operation is treated as a single atomic commit, updating or deleting data for a given key leaves useless old commits behind: only the most recent commit holds the current data for that key.
We defined a memory threshold that triggers a squash mechanism, keeping only the most recent data for each key.
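As a hypothetical illustration of the idea (not the actual mechanism), a squash over a commit log could look like this: walk the log once, remember only the latest commit per key, and drop keys whose latest operation is a Drop. This reuses the CommitRecord sketch from earlier.

```cpp
#include <string>
#include <unordered_map>
#include <vector>

// Hypothetical squash sketch: collapse a commit log so that only the
// most recent commit per key survives.
std::vector<CommitRecord> Squash(const std::vector<CommitRecord>& log) {
    // Later entries overwrite earlier ones for the same key.
    std::unordered_map<std::string, CommitRecord> latest;
    for (const CommitRecord& record : log) {
        latest[record.key] = record;
    }

    std::vector<CommitRecord> squashed;
    squashed.reserve(latest.size());
    for (const auto& [key, record] : latest) {
        // A key whose newest operation is a Drop holds no current data,
        // so it can be omitted from the squashed log entirely.
        if (record.op == CommitRecord::Op::Commit) {
            squashed.push_back(record);
        }
    }
    return squashed;
}
```

Note that this sketch does not preserve the relative order of commits across keys, which matters less once only one commit per key remains.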
For the second serializer version the mechanism is slightly different given its nature: it is a memory defragmentation mechanism rather than a commit squash. The principle, however, is the same.
Data Compression
On top of all these steps, we also support data compression, via the ZSTD library.
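For reference, here is a minimal, self-contained sketch of round-tripping a buffer through zstd’s one-shot C API (ZSTD_compress / ZSTD_decompress). How compression is actually wired into the persistence pipeline is not shown here.

```cpp
#include <zstd.h>

#include <stdexcept>
#include <string>

// Compress a buffer with zstd's one-shot API.
std::string CompressZstd(const std::string& input, int level = 3) {
    std::string output(ZSTD_compressBound(input.size()), '\0');
    const size_t written = ZSTD_compress(
        output.data(), output.size(), input.data(), input.size(), level);
    if (ZSTD_isError(written)) {
        throw std::runtime_error(ZSTD_getErrorName(written));
    }
    output.resize(written);
    return output;
}

// Decompress, using the content size stored in the zstd frame header.
std::string DecompressZstd(const std::string& compressed) {
    const unsigned long long contentSize =
        ZSTD_getFrameContentSize(compressed.data(), compressed.size());
    if (contentSize == ZSTD_CONTENTSIZE_ERROR ||
        contentSize == ZSTD_CONTENTSIZE_UNKNOWN) {
        throw std::runtime_error("invalid or unknown zstd frame size");
    }
    std::string output(contentSize, '\0');
    const size_t written = ZSTD_decompress(
        output.data(), output.size(), compressed.data(), compressed.size());
    if (ZSTD_isError(written)) {
        throw std::runtime_error(ZSTD_getErrorName(written));
    }
    output.resize(written);
    return output;
}
```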
Don’t forget this…
There is no denying the significance of data in the casino gaming business. As Tim Berners-Lee once said, “Data is a precious thing and will last longer than the systems themselves”. So we should give careful consideration to how we manage each piece of data.
I hope this content gave you a clear insight into the importance of information in the casino gaming world, as well as into our Game Developers’ constant drive to boost productivity by optimizing our data operations.
Article written by Diogo Pereira, Game Developer at Fabamaq