Getting started with BOARD Cloud Data Pipeline

Document created by arocchietti (Employee) on Oct 26, 2017. Last modified by arocchietti on Nov 12, 2017.



BOARD Cloud Data Pipeline is a lightweight software service designed to allow BOARD applications to access data from data sources in the cloud or on-premises.
The solution consists of two components: the Data Pipeline Cloud Service and the On-Premises Connector.

The Data Pipeline Cloud Service is installed in the cloud and is the principal component. It exposes a web portal where the user can define connections and rules.
The On-Premises Connector (from now on OPC) is the component that is installed remotely, typically to access databases that reside behind a firewall. It provides the services to securely move data from on-premises data stores residing behind a firewall to the cloud. This module connects securely to the Data Pipeline server over HTTPS, without requiring firewall configuration changes.


Getting Started

First step: Change user password

Open the Data Pipeline web site in a browser, and enter the user name and password provided in the cloud Admin site:

You must change the password at the first login:


For security reasons, the password must meet the following policy:

  • be at least eight characters long
  • contain a combination of upper- and lower-case letters
  • contain numbers
  • contain punctuation marks or other special characters
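As a quick sanity check, the policy above can be sketched as a small validation function. This is illustrative only; the server-side validation is authoritative and may apply additional rules.

```python
import re

def meets_password_policy(password: str) -> bool:
    """Illustrative check mirroring the documented password policy."""
    return (
        len(password) >= 8                                    # at least eight characters
        and re.search(r"[a-z]", password) is not None         # lower-case letter
        and re.search(r"[A-Z]", password) is not None         # upper-case letter
        and re.search(r"[0-9]", password) is not None         # number
        and re.search(r"[^A-Za-z0-9]", password) is not None  # punctuation/special character
    )

print(meets_password_policy("Secr3t!pwd"))  # True
print(meets_password_policy("weakpass"))    # False
```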


Second step: prepare the server for the On-Premises Connector


The OPC can be installed on a physical server or a virtual machine.

The minimum hardware requirements are the following:

  • CPU: 4 cores or more
  • RAM: 8 GB or more
  • Disk: 40 GB or more

The On-Premises Connector supports all editions of the following Windows 64-bit operating systems:

  • Windows 10
  • Windows Server 2008
  • Windows Server 2012

The server must have internet access.

The following TCP ports must be open for outbound traffic: 443, 11280, 40501.
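Before installing the OPC, you can verify that outbound connectivity on these ports is allowed. A minimal sketch in Python; the host name in the usage comment is a placeholder, so substitute your actual Data Pipeline endpoint:

```python
import socket

# Outbound TCP ports required by the On-Premises Connector (see above).
REQUIRED_PORTS = [443, 11280, 40501]

def can_connect(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if an outbound TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Usage (replace the placeholder with your Data Pipeline endpoint):
#   for port in REQUIRED_PORTS:
#       print(port, can_connect("your-datapipeline-host.example.com", port))
```

A failed check usually points at a firewall or proxy rule blocking the port rather than a problem on the server itself.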


Any previous version of the On-Premises Connector must be uninstalled.


Third step: install the On-Premises Connector


These are the steps necessary to install the OPC:

  1. Download the OPC installer to a temporary directory on the Windows machine on which you want to install it.
  2. Unzip the installer.
  3. Double-click the installation file PROGRESS_DATADIRECT_HDP_OPCONNECTOR_4.1_WIN_64.exe, or right-click it and select “Run as Administrator”.
  4. The Introduction window appears. Click Next to continue.

  5. In the Where Would You Like to Install? field, specify the installation directory in one of the following ways:
  • Accept the default installation directory, and then click Next.
  • Enter the path of the installation directory, including the drive letter of the Windows machine. Then, click Next.
  • Click the Choose (...) button to browse to and select an installation directory. Then, click Next.


6. Select the type of installation:

  • If your installation does not need modification, select Standard Installation and continue.
  • If you need to customize the installation, select Advanced Installation, and then select one or more of the following options:
  • If you need to enable support for Microsoft Dynamics CRM, select Microsoft Dynamics CRM.
  • The On-Premises Connector must communicate with the Hybrid Data Pipeline service over the Internet. If your network environment requires a proxy to access the public Internet, select Proxy Connection and continue.
  • If you need to enable support for MySQL Community Edition, select MySQL Community Edition.



If you are using a MySQL Community Edition data store, provide the name and location of the MySQL Community Edition driver jar file. Then, click Next and proceed.




If you are using a Microsoft Dynamics CRM data store, provide your user name and password for Microsoft Dynamics CRM. If required for your environment, select the check box and type the path to the Kerberos configuration files that you want to use.


If you are using a proxy, provide your proxy connection information and the type of proxy authentication you want to use. (You can change this information later using the Hybrid Data Pipeline On-Premises Connector Configuration Tool.)

Type the proxy connection information:

  • Hostname specifies the host name and, optionally, the domain of the proxy server. The value can be a host name, a fully qualified domain name, or an IPv4 or IPv6 address.
  • Port Number specifies the port number on which the proxy server is listening.
  • User Name specifies the user name needed to connect to the proxy server if you are using HTTP Basic or NTLM authentication. If NTLM Proxy Authentication is selected, the user name must be in the form Domain\User.
  • Password specifies the password needed to connect to the proxy server if you are using HTTP Basic or NTLM authentication.

From the Proxy Authentication drop-down list, select the type of proxy authentication needed in your environment:

  • Select No Proxy Authentication if your proxy server does not require authentication.
  • Select HTTP Proxy Authentication if the proxy server requires that all requests be authenticated using the HTTP Basic authentication protocol.
  • Select NTLM Proxy Authentication if the proxy server requires that all requests be authenticated using the NTLM authentication protocol.



7. Provide the user ID and password for your Hybrid Data Pipeline account. If desired, you can change the default connector label. Click Next.

The installer validates your Hybrid Data Pipeline account credentials.


8. Review the Pre-Installation Summary window. To install the connector, click Install.



9. Click Done to exit the installer.




10. After the installer closes, the On-Premises Connector Configuration Tool opens and verifies access to the Hybrid Data Pipeline service.



Create a data source definition on the Data Pipeline web site


Once you have completed the OPC installation, you are ready to define the first data source. The information needed by Data Pipeline to connect to the desired data is stored in a Data Source definition.

The steps in this topic apply generally, but the available options differ by data store type. When you are logged in to your Data Pipeline account, follow these steps to create a Data Source definition:

  1. If the Data Sources view is not already open, in the left navigation panel, click Data Sources to open it. 
  2. Click +NEW DATA SOURCE. The Data Stores page opens.

  3. From the list of Data Stores, click the logo for the type of data store to which you want to connect. The Create Data Source page opens.
  4. On the General tab, enter a data source name and fill in the required fields. If applicable, enter the login credentials for the data store. If you do not enter the login credentials at this point, a dialog will prompt you to do so when you try to connect to the data store.
  5. To persist the login credentials for the data store, you must enter the credentials in the data source definition page.
  6. If the data source is on-premises, pick the Connector ID defined previously.

  7. Then, do one of the following:

  • Click SAVE to create the definition without testing the connection.
  • Click TEST CONNECTION to establish a connection with the data store and verify the data source definition. Then, click SAVE.


Note: If you click TEST CONNECTION without specifying the login credentials, a pop-up dialog appears. Enter the login credentials for the data source and click OK. The new Data Source is added to the list on the Data Sources page.




Create the ODBC connection on the cloud environments


When you have created all the data sources, open a ticket on our support system (registration with your company code is required) or send an email requesting the creation of the ODBC connections on the cloud environments.


You can use the following template:

Dear Cloud support,
Please allow the new ODBC connection that I have created to be accessible in the BOARD client.

CUSTOMER: <customer name>

  • Production server: <yes or not applicable>
  • Sandbox 1 server : <yes or not applicable>
  • Sandbox 2 server: <yes or not applicable>




  • <Your_datasource_name>
  • <Your_datasource_name>