CSV inbound data with multiple columns per product


I have a CSV file, which contains inventory levels and stock movements by location.

The file is structured so that each line contains detail by location, with multiple product headings in the columns and the volumes in the rows.

File example

How do I go about mapping this to the various entities?

I have set up product, movement and location entities, but as you can see above, the data contains multiple product entries per line.


Answers

  • Hi,

    There is a way to solve your issue, which works if:

    • you don't have too many product columns (because you will need to map them one by one)
    • these products remain the same in every file you want to load into your database using the same datareader (for the same reason as above)

     

    The help explains how to use the required function; see the topic called Loading data from multi-column text files

     

    Does that help?

  • Kelly Goldsworthy

    Thanks for the reply.

    Unfortunately it does not work, as the file is comma-separated and each value under the products has a different width, meaning I can’t define a specific starting point for the data.

    The “Across” functionality does not seem to apply to comma-delimited CSV files.

     

    Any other suggestions? Or am I missing something?

  • Previous Member
    edited March 2020

    I am sorry, you're right. The "across" tab disappears when I check the "CSV" box...


     

    I don't have any idea then :(

    I would try to get another CSV file from the source, one where the product dimension is not spread across columns...

  • Samuele Marchetto

    Hi, I solved a similar issue in two different ways:
    - using an Excel macro that converts the table to a flat file, where the first five columns are repeated for each article.
    - using third-party ETL software that performs this transformation
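
    For anyone who prefers to skip Excel, the same reshaping can be sketched in a few lines of Python. The column names below are made up to match the examples in this thread; adapt them to your actual file:

```python
import csv
import io

# Hypothetical column layout: key columns first, then one column per product.
KEY_COLUMNS = ["Movement_c", "Location_c", "Date"]

def flatten(wide_rows):
    """Turn wide rows (one column per product) into long rows
    (one row per product) -- the same reshaping the Excel macro does."""
    long_rows = []
    for row in wide_rows:
        keys = {k: row[k] for k in KEY_COLUMNS}
        for col, value in row.items():
            if col in KEY_COLUMNS:
                continue
            long_rows.append({**keys, "Product": col, "StockValue": value})
    return long_rows

# Tiny in-memory example instead of a real file
wide_csv = io.StringIO(
    "Movement_c,Location_c,Date,Product 1,Product 2\n"
    "IN,WH01,2020-03-01,10,5\n"
)
rows = flatten(csv.DictReader(wide_csv))
# rows now holds one record per product per original line
```

    Writing the long rows back out with csv.DictWriter gives a flat file that a standard ASCII datareader can consume.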

     

    Bye

  • Kelly Goldsworthy

    Thanks – I will give the Excel macro a try first. Should be easy enough to replicate.

    I do have a long term fix for the data, but just need a work around for building the model.

  • Previous Member
    edited March 2020

    Alternative solution, which requires a small number of products to be practicable: use an ODBC datareader (not an ASCII one) to load your CSV file.

    The advantage is that you can write SQL-like queries on your data.

    I guess your target structure for your ODBC datareader in BOARD is the following:

    • entity: Movement code
    • entity: Location code
    • entity: Date
    • entity: Product
    • cube: YourCube

     

    In this case, the SELECT statement (source side) should look like this:

    SELECT Movement_c, Location_c, Date, Product_c, Stock_Value FROM YourCSVfile.csv

    In your case, the product is not a single column, but you can load the product columns one by one using the UNION ALL operator.

    If I keep the headers of your example (4 products), the SELECT statement should look like this:

    SELECT Movement_c, Location_c, Date, 'Product_1' AS Product, "Product 1" AS StockValue FROM YourCSVfile.csv
    UNION ALL
    SELECT Movement_c, Location_c, Date, 'Product_2' AS Product, "Product 2" AS StockValue FROM YourCSVfile.csv
    UNION ALL
    SELECT Movement_c, Location_c, Date, 'Product_3' AS Product, "Product 3" AS StockValue FROM YourCSVfile.csv
    UNION ALL
    SELECT Movement_c, Location_c, Date, 'Product_4' AS Product, "Product 4" AS StockValue FROM YourCSVfile.csv

    In this example, the product codes for the product entity are hardcoded within the SQL statement and will be Product_1, Product_2, etc. For each of these products, the value from the appropriate column of your CSV file will be loaded into your cube.

     

    An important requirement for this approach: how do you create an ODBC datareader on a CSV file? You need to create an ODBC connector (as for any other source system using an ODBC datareader), using the ODBC driver called "Microsoft Access Text Driver (*.txt, *.csv)".


    To install it on your BOARD server, you may use this link to download the file AccessDatabaseEngine_X64.exe

    This should install the ODBC driver needed on the server

     

    Configuring this takes a bit of work: it produces a mapping file called schema.ini in the folder where the CSV file is stored on the BOARD server. But once it is done for all variations of the records of your CSV file, you can reuse it as often as you need, unless the column structure changes.
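
    For reference, a schema.ini entry for this kind of file might look like the following. The file and column names are illustrative, taken from the examples earlier in this thread; check the Microsoft Access Text Driver documentation for the exact options:

```ini
[YourCSVfile.csv]
Format=CSVDelimited
ColNameHeader=True
Col1=Movement_c Text
Col2=Location_c Text
Col3=Date Text
Col4="Product 1" Long
Col5="Product 2" Long
```

    Each section is keyed by file name, so one schema.ini can describe several CSV files in the same folder.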

  • Brendan Broughton

    Hi Kelly Goldsworthy, I believe you would need to save it as a space-delimited *.prn file for the 'Across' function to work. The main issue with doing that is making sure your column widths are set correctly beforehand, and that they are consistent for each update/version of the file, as they affect the space delimiting and consequently the Across datareader

  • Hi,

    We have the same situation: we receive Excel sheets with the months in the columns, and it was a lot of work to convert them into a format that can be imported into Board.

     

    So I created a folder on the server into which users can drop their Excel sheets. I wrote a PowerShell script that monitors the folder; as soon as a new file is dropped in, it executes a SQL stored procedure to import the data into a SQL table.

     

    Once the data is in SQL it is easy to import into Board.
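
    The watch-and-import loop can be sketched like this (shown in Python rather than PowerShell, purely to illustrate the idea; the folder path and the handler that would call the stored procedure are placeholders):

```python
import time
from pathlib import Path

def find_new_files(folder, seen):
    """Return workbook files in `folder` that have not been processed yet."""
    current = {p.name for p in Path(folder).glob("*.xlsx")}
    return sorted(current - seen)

def watch(folder, handle, poll_seconds=30, max_polls=None):
    """Poll `folder`; call `handle(path)` once for each new file.
    In the real setup, `handle` would kick off the SQL import
    (e.g. execute the stored procedure against the staging table)."""
    seen = set()
    polls = 0
    while max_polls is None or polls < max_polls:
        for name in find_new_files(folder, seen):
            handle(Path(folder) / name)
            seen.add(name)
        polls += 1
        if max_polls is None or polls < max_polls:
            time.sleep(poll_seconds)
```

    A polling loop is the simplest approach; the PowerShell original could equally use a FileSystemWatcher-style event subscription to react immediately.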

     

    The solution consists of two components: a SQL stored procedure and a PowerShell script.

     

    If you want I can post the scripts.