
Execution time in database logs

Question asked by ecausse on Jan 30, 2018
Latest reply on Feb 16, 2018 by ecausse

Hi everyone,

We noticed a few months back, while analyzing the performance of our data loading processes, that the information available in the logs was not always accurate.

Yesterday I noticed a strange behavior: in the DB log, the "Elapsed" time is half of what it should be.

| Action Code | Date | Time | UserName | DbName | Operation-Title | D.Flow | Mode | Target | Elapsed | File | RecordNr | Validated | Rejected | RAM | Status | ErrCode |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| FR | 20180129 | 19:15 | eca | FAST_001 | Standard Prices | | | | 00h18m43s | FAST_-9541 | 938043 | 938043 | 0 | [0/0]Mb | | |
| FR | 20180129 | 19:58 | eca | FAST_001 | Standard Prices - Detailed Costs | | | | 00h21m13s | FAST_-9540 | 823410 | 823410 | 0 | [0/0]Mb | | |
| FR | 20180129 | 20:06 | eca | FAST_001 | Standard Prices - Calculation Dates | | | | 00h03m27s | FAST_-9533 | 397962 | 397344 | 618 | [0/0]Mb | | |


The procedure was launched around 18:40, so I know the first data reader took around 35 minutes to load, which is confirmed by the data reader screen.


This data reader uses an SAP connector. When I look in the connector log, I see that each "extractor" was launched twice. I suspect this is due to the "replace" option, which needs to scan the whole cube to determine which time entities are used (which means loading 1M lines just to find out that only 2018 is present, instead of simply reading the time field!).
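For what it's worth, the numbers are consistent with the double-launch hypothesis: doubling the logged elapsed time roughly matches the wall-clock duration between the procedure start and the log timestamp. A quick sanity check (the ~18:40 start time is approximate, taken from my notes above):

```python
from datetime import datetime, timedelta

# Values taken from the DB log above
start = datetime(2018, 1, 29, 18, 40)                # procedure launched ~18:40
logged_at = datetime(2018, 1, 29, 19, 15)            # first data reader log entry
logged_elapsed = timedelta(minutes=18, seconds=43)   # "00h18m43s" from the log

wall_clock = logged_at - start     # observed duration of the first data reader
doubled = 2 * logged_elapsed       # expected elapsed if the extractor ran twice

print(wall_clock, doubled)  # 0:35:00 0:37:26
```

So the logged 18m43s looks like the time for a single extractor run, while the data reader actually ran it twice.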


So, a few questions:

- Has anybody run into the same issue?

- How can we obtain the correct timings in the DB logs?

- Is the "replace" option a good choice in this case, or should I first clear part of the cube and then load in normal mode? Do you have any suggestions for making sure the procedure clears exactly what is needed (no more and no less) before loading?


Thanks !