Integration with R - Load data doesn't work properly

Hello,
I encountered the following problem. After execution of the R statements, the DataReader doesn't work properly and loads only a few items.
I'm sure that the R dataframe has more than 100 items, because I wrote them to a file.
After the data is loaded, the DataReader shows the correct number of rows loaded, but in the cube I only see 3 items. Is there some parameter configuration to set?

Thanks in advance

Carmine

Answers

  • Hi Carmine,

    The mismatch between the number of rows read and the items loaded may be due to duplicate rows, or to rows containing empty values for the cube data. If the load is in replace time slice mode, this can appear as just 3 items from 100 rows even though there are 100 rows in the source. I'm also assuming here that all 100 rows were validated and none were rejected. Let me know if this helps.
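
    The checks Scott describes (duplicates and empty cube values) can be run on the R side before the DataReader picks the dataframe up. A minimal sketch, assuming a hypothetical dataframe `df` with made-up column names:

    ```r
    # Sketch: sanity-check a dataframe before Board's DataReader reads it.
    # 'df' and its columns ('entity', 'value') are hypothetical names.
    df <- data.frame(
      entity = c("A", "B", "B", "C"),
      value  = c(10, 20, 20, NA)
    )

    # Rows with empty (NA) cube values are a common reason items are skipped
    sum(!complete.cases(df))

    # Exact duplicate rows collapse to a single item in replace time slice mode
    sum(duplicated(df))

    # Distinct, fully valued rows that should actually survive the load
    nrow(unique(df[complete.cases(df), ]))
    ```

    If the last count is far below the row count of the source, the cube item count will be too, regardless of DataReader settings.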

    Scott

  • Carmine_Caruso

    The dataframe contains all distinct and valued rows

    As I wrote, I exported the dataframe to a text file and checked its contents. The text file, loaded via dataload, loads all the rows properly.

    Regards

    Carmine
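
    For reference, the flat-file workaround Carmine describes can be sketched in R. The file path, dataframe, and column names below are hypothetical; the Board-side text DataReader configuration still has to match the chosen separator:

    ```r
    # Sketch of the workaround: export the dataframe to a flat file that a
    # standard text-file DataReader can then load. All names are made up.
    df <- data.frame(entity = c("A", "B", "C"), value = c(10, 20, 30))

    out_path <- file.path(tempdir(), "board_export.txt")

    # Tab-separated, no row names, no quoting: a plain layout that text
    # readers can parse with a simple field-separator setting
    write.table(df, out_path, sep = "\t", row.names = FALSE, quote = FALSE)

    # Quick check that every row made it to disk (header line + data rows)
    length(readLines(out_path))
    ```

    Writing the file out also gives a fixed artifact to diff against the cube contents when rows go missing.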

  • Hi Carmine,

    Glad you found a workaround. This is one for the platform team to respond on regarding the direct integration method, in case the flat file isn't scalable for your use — otherwise, happy days!

  • Hi Carmine,

    What version of Board are you using in this setup?

    12.5.x (not sure about the exact patch numbers) in some cases briefly showed the described behavior in the DataReader built for the R step.

    Versions before 12.5.x, and from 12.6 (the latest) onwards, read all rows as expected.

    Hope this helps.

    Best regards,

    Filip

    ————————————-
    Filip Rankovic
    Associate Consultant
    Board Deutschland GmbH

  • Carmine_Caruso

    Hi Filip,

    My version is 12.5. I'll try installing the latest one and then check whether the problem is solved.

    Thanks

    Carmine

  • Hi Carmine,

    Did the upgrade of the version solve the issue in the end?

    Kind regards,

    Andries