Database size reduced from 24 GB to 19 GB after extract and reload of the database


Recently I performed a Max Item Number change for a few entities. To do this, we extracted the entire database, made the change, and then reloaded the database.

There is no loss of data and no other issue. The only thing I observed is that after the reload, the database size was reduced from 24 GB to 19 GB.

Is there any cached memory that gets cleared when we reload the entire database?

Is there anything to worry about regarding the integrity of the database?


Answers

  • Scott Bloxsome

    Hi Raj,

    When you talk about database size, is this the in-RAM size or the on-disk size? A reduction in on-disk size would suggest a significant loss of data, since much of the database's physical size on disk is related to cube size. However, it could be that reloading your cubes optimised the RAM usage, as elements of cached memory would have been removed.

  • Domenico Panetta

    Hi Raj,

    One option (not the only one) is that during this operation you cleared the related cubes before the reload and, in doing so, also cleared the Sparse structures related to those cubes. This happens automatically when you clear all the cubes that use the same Sparse structure.

    If this happened for several Sparse structures that had huge numbers of items (say, hundreds of thousands or millions), it could explain this size change.

  • Leone Scaburri
    edited December 2023

    Hi @Raj Gangani,

    All records with null values will be cleaned from your data model when reloading all cubes. Those records are created when using a Clear Cube step with "use current selection". You can find more info in our "Clear Cube Insight" article.