This article is for anyone needing to write data to a data lake to power a Popdock list.
Use Cases
Here are a few use cases I have heard for using SmartConnect to write data to a Data Lake.
- Multi-Data Source
- Combine data from multiple sources quickly in Popdock. I had a customer who received pricing updates for their entire catalog from a vendor every day on a third-party FTP server. They only wanted to update their web shop with prices that changed. The solution was to write all of the external FTP data into a Data Lake so Popdock could run a Comparison list between the new prices and the web shop prices (a conceptual sketch of that comparison follows this list).
- Optimized reporting for slow APIs
- If Popdock is being used to retrieve data from a slow API, scheduled exports can be run from SmartConnect to build up historical data in the data lake for improved performance.
- Reduce Data Storage Costs
- Storage in many cloud applications is expensive, while Azure Blob storage is comparatively cheap. Exporting data from those applications into Azure allows historical records to be removed from the source applications, lowering storage costs.
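The comparison itself happens inside Popdock, so no code is needed; purely to illustrate the logic of that changed-price comparison, here is a minimal Python sketch (the file and column names are hypothetical):

```python
import csv

# Hypothetical files: the vendor's daily price feed and an export of the
# current web shop prices. Column names are made up for illustration.
def load_prices(path):
    with open(path, newline="") as f:
        return {row["ItemNo"]: row["Price"] for row in csv.DictReader(f)}

vendor_prices = load_prices("vendor_prices.csv")
webshop_prices = load_prices("webshop_prices.csv")

# Only items whose price differs from the web shop need to be updated.
changed = {
    item: price
    for item, price in vendor_prices.items()
    if webshop_prices.get(item) != price
}
print(changed)
```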
SmartConnect Integration Configuration
We will create a SmartConnect integration to write to Azure Blob storage using an SFTP Connector.
In this example, we will export historical invoices from Business Central into an FTP folder. This will be an export of deltas, meaning we won’t export the full data source on every run, only the incremental data identified by a change trigger or date filter.
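SmartConnect handles the change trigger or date restriction inside the data source itself, so nothing has to be coded; conceptually, the delta selection behaves like this small Python sketch (field names are hypothetical):

```python
from datetime import datetime, timedelta

# Hypothetical: keep only invoices modified since the last scheduled run.
last_run = datetime.now() - timedelta(days=1)

invoices = [
    {"InvoiceNo": "INV-1001", "ModifiedAt": datetime(2023, 1, 15, 8, 30)},
    {"InvoiceNo": "INV-1002", "ModifiedAt": datetime.now()},
]

# Only the rows changed since the previous run are exported.
deltas = [inv for inv in invoices if inv["ModifiedAt"] > last_run]
print(deltas)
```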
- Add a new Azure SFTP connector if one doesn’t already exist.
- Log into portal.azure.com > Storage Accounts > Containers.
- Open the container you are using for this process.
- Click Add Directory.
- Give the directory a name (I used invoices). If you would rather script this setup, see the sketch below.

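The portal steps above can also be scripted. Assuming the storage account has a hierarchical namespace enabled (which Data Lake containers use), a minimal sketch with the azure-storage-file-datalake Python SDK might look like this; the account URL and container name are placeholders for your own:

```python
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

ACCOUNT_URL = "https://mystorageacct.dfs.core.windows.net"  # placeholder account
CONTAINER = "datalake"                                      # placeholder container

service = DataLakeServiceClient(account_url=ACCOUNT_URL,
                                credential=DefaultAzureCredential())
file_system = service.get_file_system_client(CONTAINER)

# Create the directory the SmartConnect export will write into.
file_system.create_directory("invoices")
```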
- Create a Data Source that will extract data from your source system.
- Create an integration process using your data source.
- On the Target tab, select the following (an example of the resulting file layout follows this list).
- FTP Destination – Azure SFTP Connector from step 1
- FTP Path – Select the directory created in step 5
- File Name – Text File
- Column Separator – Csv Delimited
- Include Headings – True
- Append date/time – True
- Enclose column values in quotes (“”) – True

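Those settings produce a comma-delimited text file with a header row, every value enclosed in quotes, and the run date/time appended to the file name. Purely to illustrate the shape of the file that lands in the data lake, here is a Python equivalent (the columns are hypothetical, not the actual Business Central invoice fields):

```python
import csv
from datetime import datetime

# Hypothetical invoice rows standing in for the Business Central data source.
rows = [
    {"InvoiceNo": "INV-1001", "CustomerNo": "C0001", "Amount": "125.00"},
    {"InvoiceNo": "INV-1002", "CustomerNo": "C0002", "Amount": "310.50"},
]

# Append date/time - True: the run time is stamped onto the file name.
file_name = f"invoices_{datetime.now():%Y%m%d_%H%M%S}.csv"

with open(file_name, "w", newline="") as f:
    # Include Headings - True, Csv Delimited, and values enclosed in quotes.
    writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()),
                            quoting=csv.QUOTE_ALL)
    writer.writeheader()
    writer.writerows(rows)
```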
- Switch to the Integration tab.
- Select Export to file from Target Lines.
- Switch to the Target Integration subtab.
- Select Column Actions > Copy Source Columns.

- Save and Run the integration.
- After the integration has run successfully, you can proceed to the Popdock configuration. If you want to verify the export landed first, see the sketch below.
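If you want to confirm the export landed outside of SmartConnect, a quick check with the same SDK might look like this (account URL and container name are placeholders again):

```python
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

ACCOUNT_URL = "https://mystorageacct.dfs.core.windows.net"  # placeholder account
CONTAINER = "datalake"                                      # placeholder container

service = DataLakeServiceClient(account_url=ACCOUNT_URL,
                                credential=DefaultAzureCredential())
file_system = service.get_file_system_client(CONTAINER)

# List the files SmartConnect wrote to the invoices directory.
for path in file_system.get_paths(path="invoices"):
    print(path.name, path.last_modified)
```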
Popdock List Configuration
We will read data from the Data Lake using a Popdock Data Lake Connector.
- Log into Popdock.
- Add a new Azure Data Lake connector if one doesn’t already exist.
- Open your Azure Data Lake connector.
- Navigate to Lists > Click here to add a folder to Azure Data Lake.

- Provide a list name and point it to the folder created in step 5.
- Click Add.

- Click Edit on the list and change any details required, such as the display name and format.
- Add some default fields.
- Click Preview, and you will see the data SmartConnect exported to the data lake.
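If the preview doesn’t match what you expect, you can pull one of the exported files straight from the data lake and compare it. A minimal sketch, again with placeholder account, container, and file names:

```python
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

ACCOUNT_URL = "https://mystorageacct.dfs.core.windows.net"  # placeholder account
CONTAINER = "datalake"                                      # placeholder container

service = DataLakeServiceClient(account_url=ACCOUNT_URL,
                                credential=DefaultAzureCredential())
file_client = service.get_file_system_client(CONTAINER).get_file_client(
    "invoices/invoices_20230115_083000.csv"  # placeholder file name
)

# Download the CSV SmartConnect wrote and show the first few lines.
content = file_client.download_file().readall().decode("utf-8")
print("\n".join(content.splitlines()[:5]))
```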