Procedure execution time
Hello everyone, I'd like to know if there is a way to reduce the execution time of a procedure?
Thank you for your help!
Answers
-
Hi Fethi Zerara,
a little more input would be greatly appreciated ;-)
- What procedure are you talking about?
- What does the procedure do or what is it supposed to be doing?
- How much time does it need?
- How much time do you think it should need?
- Why do you think the procedure is working slowly?
Maybe you could attach an example--because, as always, it depends...
Kind Regards,
Helmut
-
Hello Helmut Heimann
I'm taking annual data from a cube with 7 sparse dimensions into another cube that has the same structure but by month, so I'm dividing the values by 12. I think it should take about 10 seconds, but it's taking more than 2 minutes!
I have no idea why it's running so slowly!
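For context, the transformation being described here — spreading an annual value evenly across 12 months — can be sketched in a few lines of Python (the cube keys and member names below are invented for illustration; this is not Board code, just the arithmetic of the dataflow):

```python
# Toy stand-in for a sparse annual cube: only non-empty cells are stored,
# keyed by a tuple of dimension members (names invented for this example).
annual = {
    ("Entity1", "Sales"): 1200.0,
    ("Entity2", "Sales"): 3600.0,
}

# Spread each annual value evenly over 12 months: the monthly cube has the
# same structure plus a month dimension, each month holding annual / 12.
monthly = {
    key + (month,): value / 12.0
    for key, value in annual.items()
    for month in range(1, 13)
}

print(monthly[("Entity1", "Sales", 1)])  # 100.0
print(len(monthly))                      # 24 cells (2 annual cells x 12 months)
```

With so few populated cells, the division itself is trivial — which is why a multi-minute runtime points at how the dataflow is executed rather than at the amount of data.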
-
Hi Fethi Zerara,
what does the log say about the dataflow mode being used? I assume the flow is running in cell-based mode, since the calculations have to be done at cell level.
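A rough sketch of why the execution mode matters for a cube with 7 sparse dimensions (the sizes below are invented and this only illustrates the general principle, not Board's internals): a cell-based pass has to consider the full cross-product of the dimensions, while a sparsity-aware bulk pass only visits populated cells.

```python
from math import prod

# Invented dimension sizes for a 7-dimension cube (illustration only).
dimension_sizes = [50, 20, 10, 10, 5, 4, 12]
nonempty_cells = 2_000  # a sparse cube populates only a tiny fraction

# A cell-by-cell pass over the full cross-product touches every combination:
dense_space = prod(dimension_sizes)
print(dense_space)  # 24_000_000 potential cells

# A sparsity-aware (bulk) pass only visits cells that actually hold data,
# which here is vastly fewer cell visits:
print(dense_space // nonempty_cells)  # 12_000x fewer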
Kind Regards,
Helmut
-
Hi again!
I get "procedure failed", and I don't know why! -_-
-
Hi Fethi,
what did you do prior to executing the procedure?
Could you provide screenshots of your procedure and the definition of the dataflow, as well as a screenshot of both cubes' structures--that would really help.
KR
Helmut
-
Hi again,
So here I'm taking the annual value of a cube and dividing it by 12 to get an average monthly value, and the procedure is taking a lot of time despite the small amount of data I have!
Here are some pictures of what's happening:
-
Hi Fethi,
I assume your dataflow in step 3 has block "a" as target and block "b" as source of the calculation (an assumption, because you didn't attach a screenshot of that step). So my first idea would be to change that sequence, i.e. b (target) = a (source) / 12.
Try that and tell me how it worked.
Another thing: you might want to have a look at Dataflow execution methods: high performance in depth and at Best Practices for Performance Tuning on big data models to get more information on your topic.
KR
Helmut