All,
Are we aware of any limitations when pushing data from an MDB cube to an RDBMS using a dataflow?
Scenario Overview:
We have an MDB cube composed of four entities (three general entities and one time entity):
| Entity   | Max Item | Item Count |
|----------|----------|------------|
| Entity 1 | 100,000  | 30,141     |
| Entity 2 | 1,000    | 41         |
| Entity 3 | 100      | 8          |
| Month    | Infinity | 204        |
The goal is to push data from the MDB cube to an RDBMS table whenever updates occur in MDB. To achieve this, we’ve created a corresponding RDB cube with the same dimensionality and use a dataflow to transfer the data from MDB to RDB.
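For context, the target table on the RDB side has one column per dimension of the cube plus a value column, one row per populated intersection. Below is a minimal sketch of that shape (table and column names are illustrative placeholders rather than our actual schema, and SQLite stands in for the real RDBMS):

```python
import sqlite3

# Illustrative target-table shape only: one column per dimension of the MDB
# cube plus a value column, so each row holds one populated intersection.
# The real target is an RDBMS table written through the RDB cube/dataflow.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE cube_export (
        entity1 TEXT NOT NULL,  -- Entity 1 member code (~30k members)
        entity2 TEXT NOT NULL,  -- Entity 2 member code (41 members)
        entity3 TEXT NOT NULL,  -- Entity 3 member code (8 members)
        month   TEXT NOT NULL,  -- time member (204 months)
        value   REAL,
        PRIMARY KEY (entity1, entity2, entity3, month)
    )
""")
conn.commit()
```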
Issue Encountered:
We recently encountered a problem where the dataflow failed to write data to the RDB. Initially, we suspected that special characters in Entity 1 member codes might be the cause. To investigate, I cleared and reloaded Entity 1 in batches to isolate any problematic members.
Here’s what I found:
When Entity 1 contains up to 18,275 members, the dataflow works as expected, even with only 27 intersections populated. However, adding the 18,276th member causes the dataflow to fail, regardless of how much actual data exists. Interestingly, if the cube contains only Entity 1 (30k members) and no other entities, the dataflow works fine.
This suggests the issue is related not to the entity members themselves, but to the total number of potential intersections in the cube.
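For what it's worth, here is the back-of-the-envelope arithmetic behind that suspicion, using the member counts from the table above (purely to illustrate the hypothesis; I don't know what the actual internal threshold is):

```python
# Potential intersections = product of the member counts of all dimensions.
ENTITY2, ENTITY3, MONTHS = 41, 8, 204

for entity1_count in (18_275, 18_276, 30_141):
    total = entity1_count * ENTITY2 * ENTITY3 * MONTHS
    print(f"Entity 1 = {entity1_count:,}: {total:,} potential intersections")

# Entity 1 = 18,275 -> 1,222,816,800 potential intersections (dataflow works)
# Entity 1 = 18,276 -> 1,222,883,712 potential intersections (dataflow fails)
# Entity 1 = 30,141 -> 2,016,794,592 potential intersections (full entity; fails with all four dimensions)
```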
Question:
Has anyone encountered a similar limitation or behaviour when pushing data from MDB to RDB via dataflows? Are there known constraints related to dimensionality or the number of potential intersections?
Any insights would be greatly appreciated.