
Write and Read Data Lake data with SmartConnect and Popdock

Published: May 29, 2024
Written by Ethan Sorenson

This article is for anyone needing to write data to a data lake to power a Popdock list.

Use Cases

Here are a couple of use cases I have heard for using SmartConnect to write data to a data lake.

  • Multi-Data Source
    • Combine data from multiple sources quickly in Popdock. I had a customer that received pricing updates for their entire catalog from a vendor every day on a third-party FTP server, but they only wanted to update their web shop with the prices that had changed. The resolution was to write all of the external FTP data into a data lake so Popdock could run a Comparison list between the new prices and the web shop prices.
  • Optimized reporting for slow APIs
    • If Popdock is being used to retrieve data from a slow API, scheduled exports can be run from SmartConnect to create historical data in the data lake for improved performance.
  • Reduce Data Storage Costs
    • Cloud data storage is expensive, but Azure Blob storage is cheap. Exporting data from other cloud applications into Azure allows historical data to be removed from those applications, lowering storage costs.

SmartConnect Integration Configuration

We will create a SmartConnect integration that writes to Azure Blob storage using an SFTP connector.

In this example, we will export historical invoices from Business Central into an FTP folder. This will be an export of deltas, meaning we won't export the full data source on every run, only incremental data based on a change trigger or date filter (sketched below).
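For context on what a delta extraction looks like, it is typically a query filtered on a last-modified timestamp. The sketch below is a rough illustration, not SmartConnect's internals; it assumes the standard Business Central API v2.0 salesInvoices endpoint and a hypothetical last_run checkpoint. SmartConnect builds the equivalent restriction for you when you configure a change trigger or date filter on the data source.

```python
import requests
from datetime import datetime, timezone

# Hypothetical values -- substitute your own tenant, environment, company, and token.
BASE = "https://api.businesscentral.dynamics.com/v2.0/{tenant}/{environment}/api/v2.0"
COMPANY_ID = "<company-guid>"
TOKEN = "<oauth-access-token>"

# Checkpoint from the previous run; only rows modified after it are exported.
last_run = datetime(2024, 5, 28, tzinfo=timezone.utc)

url = f"{BASE}/companies({COMPANY_ID})/salesInvoices"
params = {
    # OData filter on the last-modified timestamp returns just the deltas.
    "$filter": f"lastModifiedDateTime gt {last_run.isoformat().replace('+00:00', 'Z')}"
}
resp = requests.get(url, params=params, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()
invoices = resp.json()["value"]
print(f"{len(invoices)} invoices changed since {last_run:%Y-%m-%d}")
```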

  1. Add a new Azure SFTP connector if one doesn’t already exist.
  2. Log into portal.azure.com > Storage Accounts > Containers.
  3. Open the container you are using for this process.
  4. Click Add Directory.
  5. Give the directory a name (I used invoices).
  6. Create a Data Source that will extract data from your source system.
  7. Create an integration process using your data source.
  8. On the Target tab, select the following:
    1. FTP Destination – Azure SFTP connector from step 1
    2. FTP Path – Select the directory created in step 5
    3. File Name – Text File
    4. Column Separator – Csv Delimited
    5. Include Headings – True
    6. Append date/time – True
    7. Enclose column values in quotes (“”) – True

If you are sending the full data set on each run, also check the box for Overwrite file.
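For the curious, the connector is effectively writing a delimited file over Azure Blob Storage's SFTP endpoint. As a rough sketch of the equivalent upload (not SmartConnect's actual code), assuming an SFTP-enabled storage account named mystorageaccount, a container named datalake, a local user named smartconnect, and a local file invoices_delta.csv:

```python
import paramiko
from datetime import datetime, timezone

# Assumed names -- adjust for your storage account, container, and local SFTP user.
HOST = "mystorageaccount.blob.core.windows.net"
USERNAME = "mystorageaccount.datalake.smartconnect"  # format varies with your local-user setup
PASSWORD = "<sftp-user-password>"

transport = paramiko.Transport((HOST, 22))
transport.connect(username=USERNAME, password=PASSWORD)
sftp = paramiko.SFTPClient.from_transport(transport)

# Mirror the target settings above: CSV with headings, date/time appended to the name.
stamp = datetime.now(timezone.utc).strftime("%Y%m%d_%H%M%S")
sftp.put("invoices_delta.csv", f"invoices/invoices_{stamp}.csv")

sftp.close()
transport.close()
```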

  9. Switch to the Integration tab.
  10. Select Export to file from Target Lines.
  11. Switch to the Target Integration subtab.
  12. Select Column Actions > Copy Source Columns.
  13. Save and Run the integration.
  14. After the integration has run successfully, you can proceed to the Popdock configuration.
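If you want to confirm the export landed before moving on, a quick listing with the azure-storage-blob SDK works. This sketch assumes the container and directory names used above and a storage-account connection string:

```python
from azure.storage.blob import BlobServiceClient

CONN_STR = "<storage-account-connection-string>"  # assumed placeholder
service = BlobServiceClient.from_connection_string(CONN_STR)
container = service.get_container_client("datalake")  # container from step 3

# List every file SmartConnect has written under the invoices/ directory.
for blob in container.list_blobs(name_starts_with="invoices/"):
    print(blob.name, blob.size, blob.last_modified)
```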

Popdock List Configuration

We will read the data from the data lake using a Popdock Azure Data Lake connector.

  1. Log into Popdock.
  2. Add a new Azure Data Lake connector if one doesn’t already exist.
  3. Open your Azure Data Lake connector.
  4. Navigate to Lists > Click here to add a folder to Azure Data Lake.

If you are providing the full data set on each integration run, select a file from Azure Data Lake instead.

  5. Provide a List name and point it to the directory created in step 5 of the SmartConnect configuration.
  6. Click Add.
  7. Click Edit on the list and change any details required, such as display name and format.
  8. Add some default fields.
  9. Click Preview, and you will see your data imported from SmartConnect on the data lake.
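The folder-versus-file distinction matters for delta exports: pointing the list at the folder lets Popdock treat every file SmartConnect drops there as one combined data set. Conceptually, that is the same as stacking the CSVs yourself, as in this sketch (same assumed names as above):

```python
import io

import pandas as pd
from azure.storage.blob import BlobServiceClient

CONN_STR = "<storage-account-connection-string>"  # assumed placeholder
container = BlobServiceClient.from_connection_string(CONN_STR).get_container_client("datalake")

# Read each delta file in the invoices/ folder and stack them into one data set --
# roughly what a folder-based Popdock list presents.
frames = [
    pd.read_csv(io.BytesIO(container.download_blob(blob.name).readall()))
    for blob in container.list_blobs(name_starts_with="invoices/")
]
combined = pd.concat(frames, ignore_index=True)
print(combined.head())
```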
