"Timeout expired" error when running a procedure comprising of "Bulk insert to SQL Table" step.

Venkatesh Dixit (Active Partner)
edited February 2023 in Platform

Hi all,

 

I am trying to carry out a Bulk insert to SQL Table step, but I keep getting a "Timeout expired" error message.

 

Here are some details about my setup:

 

I am running this procedure on a Board cloud server that also has a SQL database connection set up. I'm using Microsoft SQL Server. The SQL log shows the error below:

 

ERROR [HYT00] [DataDirect][ODBC Hybrid driver]Timeout expired.

 

And "Action failed to execute: Bulk insert operation failed. Please refer to the SQL Log for the details." in Procedure logs in 10 iterations out of total 15 iterations when this procedure was ran.

 

I have already tried the following troubleshooting steps:

 

On the "Data pipeline" portal I changed the following options:

  • Login Timeout changed from 600 to 100000
  • Query Timeout changed from 30 to -1
  • Enable Bulk Load was turned on

 

Despite these efforts, I am still getting the "Timeout expired" error. Can anyone suggest additional troubleshooting steps I can try, or offer any insight into what may be causing this issue?
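
For reference, here is roughly how I could try to reproduce one batch of the insert outside Board, to see whether the timeout comes from the database side at all. This is only a minimal sketch in Python with pyodbc; the driver name, connection string, table and column names are placeholders rather than my real setup, and the Board cloud connection itself goes through the DataDirect ODBC Hybrid driver instead:

    import pyodbc

    # Placeholder connection details for the target Microsoft SQL Server.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=my-sql-host;DATABASE=MyDb;UID=my_user;PWD=my_password",
        timeout=120,                       # login timeout, in seconds
    )
    conn.timeout = 0                       # statement/query timeout; 0 = no limit

    # One arbitrary test batch of a couple of thousand rows.
    rows = [(i, f"value_{i}") for i in range(2000)]

    cursor = conn.cursor()
    cursor.fast_executemany = True         # send the batch as a single parameter array
    cursor.executemany("INSERT INTO dbo.TestTable (Id, Val) VALUES (?, ?)", rows)
    conn.commit()
    print("test batch inserted without hitting a timeout")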

 

Thanks in advance for your help.

Answers

  • Thomas Kreis

    Hello Venkatesh,

first of all, you may want to have a look at the procedure step itself.

What layout are you trying to extract with this procedure step, and roughly how many lines does it contain? As a troubleshooting step, you could try extracting a layout with only one line, for instance. Do you still experience the issue in that case?

    Best regards

    Thomas

  • Venkatesh Dixit

    Hey Thomas,

    Thanks for your response. I went through the procedure step once again.

To give you more info on it: the layout comprises 4 cubes for NSV, MRP, Cost and Qty, with Date, PCC, Store-FY-Cluster and Plan on the axes. The chunk size I used for this procedure was 2100, and I have tried lower chunk size values too. There are 4,317,755 lines in total.

    I tried the troubleshooting step that you suggested. It worked fine when run for just 1 line, without any error.

However, I intend to send the data of the whole cube, which is when I get the error mentioned in the question.
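
    Just to put that volume in perspective, a quick back-of-the-envelope sketch in Python (2100 is the chunk size I used; the lower values are only hypothetical examples):

        # How many bulk-insert batches the full export implies at a given chunk size.
        total_lines = 4_317_755

        for chunk_size in (2100, 1000, 500):
            batches = -(-total_lines // chunk_size)   # ceiling division
            print(f"chunk size {chunk_size}: {batches} batches")

        # At chunk size 2100 that is 2057 batches, so a per-statement timeout only
        # has to hit one of those ~2000 round trips for the whole step to fail.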

    Regards,

    Venkatesh

  • Thomas Kreis

    Hello Venkatesh,

that is quite a lot. To be honest, I do not have any experience with that many lines. Nonetheless, for the bulk insert it is important that you export a layout that actually works. So does your export layout work in a regular Dataview on a screen?

Additionally, for further troubleshooting I suggest the following:

    • Are there any time functions or other functions in your export layout? If so, please consider preparing export cubes that are populated by a dataflow using that function.
    • Are all of your export cubes on the lowest granularity given by the axes (rows) you wish to export? If not, please consider preparing this granularity in export cubes as well and using them in your export layout.
    • Have you also tried an export with the "Export Dataset" step using your export layout? It is similar to the bulk insert step but simply creates a CSV file instead (a small check on that file is sketched after this list). If that step does not work either, the problem may be the layout itself.
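
    If you go the "Export Dataset" route, a small check on the resulting file could look like the sketch below (Python; the file name is an assumption, and the real export may use a different delimiter or have no header row):

        import csv

        # Count the data rows in the CSV written by the "Export Dataset" step,
        # to confirm the layout extracts completely before looking at the SQL side.
        with open("export_dataset.csv", newline="", encoding="utf-8") as f:
            reader = csv.reader(f)
            header = next(reader)              # assumes a header row is present
            data_rows = sum(1 for _ in reader)

        print(f"columns: {len(header)}, data rows: {data_rows}")
        # If the export is complete, this should be close to the full 4,317,755 lines.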

    Hope this helps

    Best regards

    Thomas

  • Venkatesh Dixit

    Hi Thomas,

The export layout didn't load in a regular Dataview on a screen, probably because it is very heavy.

    Regarding troubleshooting steps you suggested:

    • There are no functions in the export layout.
    • Yes, the cubes used are on the lowest granularity given by the axes (rows).
    • I used the same layout in an "Export Dataset" step and ran the procedure multiple times; it worked fine.

One important thing to mention: I initially ran this procedure on our local Board instance multiple times and it worked fine without any error. I'm getting this error only when I run it on the cloud instance of Board.

    Thanks

    Venkatesh

  • Bettina Clausen (Employee)
    edited March 2023

    Hi Venkatesh,

what Query Timeout are you using in the data source connection (not the one in the Data Pipeline)? Have you already tried increasing this parameter?

    Kind regards,

    Bettina

  • Venkatesh Dixit

    Hello Bettina,

Currently the Query Timeout is set to 60 in the data source connection. It's the same in our local Board instances. I haven't tried increasing it yet.

    Thanks for the response,

    Venkatesh

  • Venkatesh Dixit

    Hello Bettina,

As suggested, I tried increasing the parameter (Query Timeout in Data sources) to 3600 and then 7200, but I don't see any difference. I ran the procedure 8 times, and it gave me the "Timeout expired" error 6 times.

    Regards,

    Venkatesh


  • Rajith Perera (Active Partner)

    Hi @Venkatesh Dixit,

I am facing a similar issue with the Bulk insert. Were you able to solve yours?

    Please let me know.

    Kind regards,

    Rajith

  • Andrea Duò

    Hello @Venkatesh Dixit,

I recommend opening a support request on our portal, support.board.com, since there are a few technical details we need to check in order to perform an accurate analysis of this issue.

    Thanks!

    Andrea

  • Bart Scott (Active Partner)

    There are multiple timeouts to check when pushing via Bulk insert:

    1. The Data Pipeline settings, as you mentioned
    2. The Data Source settings within Board Web (Query Timeout)
    3. The source database server settings (in SQL Server it is the Remote Query Timeout)

People often miss #3 because only someone with a DBA/SA role can control that setting.
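
    For #3, one way to read the current value is sketched below (Python with pyodbc; the connection string is a placeholder, and actually changing the setting with sp_configure and RECONFIGURE would typically still be a DBA task):

        import pyodbc

        # Placeholder connection string; point it at the SQL Server in question.
        conn = pyodbc.connect(
            "DRIVER={ODBC Driver 17 for SQL Server};"
            "SERVER=my-sql-host;DATABASE=master;UID=my_user;PWD=my_password"
        )

        # "remote query timeout (s)" is the instance-level setting; 0 means no timeout.
        row = conn.execute(
            "SELECT value_in_use FROM sys.configurations "
            "WHERE name = 'remote query timeout (s)'"
        ).fetchone()
        print(f"remote query timeout: {row.value_in_use} seconds")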

    Bart Scott

    CFO Solutions LLC