Hi All, I have a question which I've asked some others about offline and haven't found an answer to, so I thought I'd try the community.
I have a 'Stock' cube (by day) and a 'Movement' cube (also by day) - the dimensions of each are identical, and sparsity is enabled (64-bit).
To calculate today's (closing) stock, the dataflow was set up as:
Stock (previous period) + Movement = Stock
This procedure needs to be run for multiple (15) days at a time to allow for backdated stock adjustments following weekly audits.
The issue I have is that the procedure is only updating the stock correctly for one day. So if I have 15 days I wish to update, the procedure needs to be cycled through 15 times, and each time it is run it correctly calculates only the next day's stock.
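My guess at what is happening (an assumption on my part, as I don't know the engine internals): the dataflow appears to evaluate every target day from the Stock values as they stood before the run started, rather than working through the days sequentially. A minimal Python sketch of that difference (not Board syntax, and the numbers are made up):

```python
# Assumed rule: stock[d] should equal stock[d-1] + movement[d] for each day d.

movement = [0, 5, 3, 2, 4]      # hypothetical daily movements
stock    = [10, 15, 0, 0, 0]    # stock known up to day 1; days 2-4 not yet calculated

# "Single pass" behaviour: every target day is computed from the stock values
# as they were BEFORE the run, so only the first uncalculated day comes out right.
single_pass = [stock[d - 1] + movement[d] for d in range(1, len(stock))]
print([stock[0]] + single_pass)   # [10, 15, 18, 2, 4]  <- only day 2 is now correct

# Day-by-day (looped) behaviour: each day sees the value just written for the previous day.
looped = stock[:1]
for d in range(1, len(movement)):
    looped.append(looped[d - 1] + movement[d])
print(looped)                     # [10, 15, 18, 20, 24] <- all days correct in one run
```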
Below is what I'm getting:

Stock has been correctly calculated up to 17/3/2018, but the Stock value showing for 18/3/2018 is the sum of the 'movement' for 17/3 and 18/3, without the Stock for 17/3.
If I run the procedure again, I get the following:

Stock has now been calculated for 18/3, and the 19/3 stock has changed accordingly.
With similarly structured dataflows previously (e.g. generating the Balance Sheet), the same approach has calculated all selected periods in one run, although those cubes have been monthly, and also smaller and dense.
I'm wondering why this method doesn't work in this instance. Is it a limitation caused by the day dimension, the number of dimensions/size of the cube (the cube has approx. 10000T cells per year), or is it to do with the sparsity that is in place (density factor is <0.01)?
Things we've tried unsuccessfully:
- offset rather than 'previous period'
- turning off 'High Performance' (this may work, but the calculation time goes from 2 seconds to >1 hour, so it is not a suitable approach)
Calculation and screen performance appear to be fine at present, so our current workaround plan is to use a loop to calculate each day (rather than restructuring the cube, as we're not sure whether that is what is causing the issue) - roughly as sketched below.
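For reference, the looped workaround would look something like this. This is a hypothetical Python-style sketch, where roll_forward_one_day stands in for the existing single-day dataflow and the dates are just examples:

```python
from datetime import date, timedelta

def roll_forward_one_day(day):
    """Placeholder for the existing dataflow: Stock(day) = Stock(day - 1) + Movement(day)."""
    ...

start = date(2018, 3, 17)          # first day to recalculate (example date)
for offset in range(15):           # 15-day window to cover backdated adjustments
    roll_forward_one_day(start + timedelta(days=offset))
```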
Thanks
Brendan