Popdock Data Lake Upload Tool Installation and Usage Guide for Dynamics NAV

Published: Jan 18, 2024

Overview

The Data Lake Upload Tool is designed to help customers move their on-premises data into the cloud. It can be used to perform regular data backups or migrate legacy data to cloud storage for faster access through Popdock. Once data has been transferred to the data lake, it is stored, managed, and accessed from this central repository.

This installation guide will support eOne Partners and services consultants through setting up and using the Data Lake Upload Tool to migrate customer data from Dynamics NAV. This guide includes instructions on preparing for the installation, installing the tool, and using the tool to port data from Dynamics NAV to an Azure data lake or an Amazon S3 storage account.

In addition to running the Popdock Data Lake Upload Tool, we recommend storing a backup of your database (it can be kept in the same data lake) in case you need it in the future.

Prepare for the installation

To ensure a successful installation and data migration with the tool, you will want to:
• Complete pre-installation tasks
• Meet minimum hardware/software guidelines
• Follow network recommendations
• Create an account credentials list

Pre-installation tasks

The Data Lake Upload Tool has two pre-installation tasks that should be completed before installing it: an existing Azure Data Lake or Amazon S3 storage account, and an Azure Data Lake or Amazon S3 connector configured in Popdock that connects to it.

The tool needs a running data lake with a storage container or bucket to store your list information. You can use this article to set up an Azure Data Lake or this article to set up Amazon S3.
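
If you prefer to script the storage setup rather than use the portals, the sketch below shows one possible way to create an Azure storage container and an S3 bucket. It assumes the azure-storage-blob and boto3 Python packages; the account name, shared key, access keys, region, and the container/bucket name popdock-lists are placeholders, not values the tool requires.

# Sketch: create a container in an existing Azure storage account and an S3 bucket.
# All names, keys, and the region below are placeholders.
from azure.storage.blob import BlobServiceClient
import boto3

# Azure Data Lake / Blob storage: create a container for Popdock lists
azure_client = BlobServiceClient(
    account_url="https://<storage-account>.blob.core.windows.net",
    credential="<shared-key>",
)
azure_client.create_container("popdock-lists")

# Amazon S3: create a bucket in a specific region
s3 = boto3.client(
    "s3",
    region_name="us-east-2",
    aws_access_key_id="<access-key>",
    aws_secret_access_key="<secret-key>",
)
s3.create_bucket(
    Bucket="popdock-lists",
    CreateBucketConfiguration={"LocationConstraint": "us-east-2"},
)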

Once the data lake containers or buckets have been created, you will add an Azure Data Lake Connector or an Amazon S3 Connector to your Popdock configuration. After either your Azure Data Lake or Amazon S3 Connector is created, you will have the necessary pre-installation tasks completed.

Hardware/Software Guidelines

The Data Lake Upload Tool requires that the computer running the tool is connected to the same network as the SQL database server, has enough processing power, storage space, and memory for the tool’s data transfer processes, and meets the minimum supported operating system version guidelines.

Minimum hardware requirements:

CPU: 1 GHz processor
Memory: 8 GB
Disk space (free): 100 GB recommended

Minimum software requirements:

Client OS: Windows 8 or higher
Server: Windows Server 2012 or higher

Minimum database requirements:

SQL Server: SQL Server 2012 or higher

For technical support with earlier versions of SQL Server, contact support@eonesolutions.com.

Network Recommendations

The internet connection should provide at least 5 Mbps upload to support the tool’s file transfer processes. The network and internet connection play a key role in the data transfer experience: low bandwidth and poor network connections can lead to slower transfer speeds and connection timeouts. While 5 Mbps is the minimum guideline, a faster internet connection does not always guarantee faster data transfer speeds.
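
As a rough sense of scale, the sketch below estimates how long an upload takes at a given bandwidth. The 50 GB database size and 5 Mbps rate are illustrative assumptions only, and real throughput is usually lower than the raw line speed.

# Rough upload-time estimate; the numbers here are placeholders, not measurements.
database_gb = 50     # assumed size of the data to upload
upload_mbps = 5      # assumed sustained upload bandwidth (megabits per second)

total_megabits = database_gb * 1000 * 8      # GB -> megabits (decimal units)
seconds = total_megabits / upload_mbps
print(f"~{seconds / 3600:.1f} hours at {upload_mbps} Mbps")   # ~22.2 hours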

Account Credentials List

The installation process requires login information for multiple accounts to provide access to resources and systems involved in the upload process.


Use the list below to populate the necessary account information and have it available during the Data Lake Upload Tool installation.


Popdock Credentials – This is login information for the Popdock account.

System: Popdock | Description | Enter your information
Username | The Popdock admin account username |
Password | The Popdock account password |
Account | The login account name |

Data Lake Connector Credentials – This information can be gathered from either the Azure Data Lake Connector or the Azure Console if using an Azure Data Lake. If using Amazon S3 storage, this information can be gathered from the Amazon S3 Connector.


For Azure Data Lake:

System: Azure Data Lake | Description | Enter your information
Data Lake Connector | The name of your Popdock Data Lake Connector |
Storage Account | The name of the storage account in the Azure console or your Popdock Data Lake Connector configuration |
Table Container | The storage account container, in the Azure console, where all tables/views will be copied |
List Container | The storage account container, in the Azure console, where all lists will be copied |
Shared Key | The shared key for the storage account in the Azure console |

For Amazon S3:

System: Amazon S3 | Description | Enter your information
Data Lake Connector | The name of your Popdock Data Lake Connector |
AWS Region | The name of the region your AWS S3 bucket is stored in |
Bucket | The S3 bucket where all lists will be copied |
Access key | The access key for the S3 account in AWS |
Secret key | The secret key for the S3 account in AWS |

Database Credentials – This information is from your database server.

System: Dynamics NAV | Description | Enter your information
Server | The hostname of the SQL server |
Port | The port being used by the SQL server |
System Database | The name of the SQL server system database |
Username | A SQL Server user with read access to the system database |
Password | The password for the SQL Server user |

Run the setup wizard installation

The Data Lake Upload Tool Setup Wizard is an installer that places the files necessary to run the upload tool on the computer.

Before running the setup wizard installation, make sure you have completed the required tasks under prepare for the installation.

To install the Data Lake Upload Tool:

1. On the welcome page, select Next to continue.


2. In Select Installation Folder, a default installation folder for the Data Lake Upload Tool files is selected.

You can change the installation folder, where the files that run the tool are copied, by selecting Browse and choosing another folder.

3. Select the default installation folder location (or the folder you chose), then select Next.


4. The Setup Wizard will install the software required by the upload tool.


5. If Windows Desktop Runtime 6.0.13 is not installed on the computer that will run the Data Lake Upload Tool, it will be installed when you select Install (the sketch after these steps shows a quick way to check).


6. You will know the .NET Runtime installation is complete when you see this window. Select Close.


7. The setup will resume the remaining steps in the installation automatically.


8. When you see this window, the setup wizard has successfully completed the installation. Select Finish.
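
If you would like to confirm ahead of time whether the runtime from step 5 is already present, the short sketch below lists the installed .NET runtimes using the dotnet command-line tool. Treat this as a convenience check only; a missing dotnet command simply means no .NET runtime is installed yet and the setup wizard will handle it.

# Sketch: check for the Windows Desktop Runtime with the dotnet CLI.
# If the dotnet command is missing entirely, no .NET runtime is installed yet.
import shutil
import subprocess

if shutil.which("dotnet") is None:
    print("No .NET runtime found; the setup wizard will install it.")
else:
    runtimes = subprocess.run(
        ["dotnet", "--list-runtimes"], capture_output=True, text=True
    ).stdout
    if "Microsoft.WindowsDesktop.App 6.0" in runtimes:
        print("Windows Desktop Runtime 6.0.x is already installed.")
    else:
        print("Windows Desktop Runtime 6.0 not found; the setup wizard will install it.")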


Copying Lists from Dynamics NAV

To use the tool, make sure you have completed the tasks under prepare for the installation and run the setup wizard installation.

Once the setup wizard has completed successfully, the files necessary to run the Data Lake Upload Tool are installed on the computer and the tool is ready to use. You will need the account credentials list you gathered earlier to use the Data Lake Upload Tool.

If you do not have the list, go back to the prepare for the installation section and complete the list.

In this section, you will use the tool to import your legacy data from Dynamics NAV by migrating the tables as lists.

Go to the Start Menu and select Popdock – Data Lake Upload Tool.


Welcome Page

On the welcome page, you will have the option to copy from different systems. The only option available for Dynamics NAV is copying lists. This will create lists for all your NAV tables.
Proceed with making a copy of all the tables as lists from your server.

1. Select the dropdown arrow on the right to view the available options.


2. Select Dynamics NAV – Copy lists, then select Continue.


Log in to your Popdock account

At the Log in to Popdock screen, you will provide the Popdock username, password, and account information for your account, which will be validated by the tool. The Popdock login entered should be the admin of the account.


1. At the log in screen, enter the Username and Password for the Popdock account admin, and then select Log in to validate the credentials.


2. If the login information cannot be confirmed as entered, you will see the error message below. Check your Popdock username and password, then re-enter the information and select Log in.

The installation will not move to the next page until your Popdock credentials are successfully verified.


3. If you have more than one account, an Account field will appear on a new page. Select the dropdown arrow below Account to view the list of available accounts.


4. If the default account populated in the Account dropdown is the Account you want, select Continue.


5. If the default account populated is not the account you want to continue with, select the new Account from the account dropdown menu and then select Switch account to move forward.


6. Once the login information and account are successfully validated, the installation process will automatically move to the next configuration page.


Connect to your storage account

At the Data Lake screen, you will provide connection information for your Azure data lake or Amazon S3 storage account and its respective connector in Popdock. The installer will verify the information entered then proceed to the next step in the installation.  

You will need the account credentials list for this section. 

For Azure Data Lake (for Amazon S3 storage, skip to the For Amazon S3 section below):

1. Here you will select a data lake connector, enter the storage account, enter the container, and paste the shared key. 


2. Select the dropdown arrow under Data Lake connector to display the list of Azure data lake connectors configured in your Popdock account.


3. Select a Popdock Data Lake connector.


4. Enter the name of your Storage account.


5. Enter the name of your Table container.


6. Copy and paste the contents of your Shared key.


7. After entering the Azure Data Lake information, select Connect to validate.


8. If the storage account, container, or shared key information is entered incorrectly, the installation will not move to the Database page and the error message below will be displayed.


9. Check the Azure data lake information on your credentials list, re-enter it, and select Connect to validate (a way to test these values outside the tool is sketched after these steps).

10. Once the connection to your Azure Data Lake is successfully validated, the installation process will automatically move to the next page.
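
If the Connect step keeps failing and you want to confirm the storage account, container, and shared key independently of the tool, a minimal sketch using the azure-storage-blob Python package is shown below. The account, container, and key values are placeholders from your credentials list; a failure here usually means one of them has a typo.

# Sketch: verify an Azure storage account, container, and shared key outside the tool.
# All values below are placeholders.
from azure.storage.blob import BlobServiceClient

account = "<storage-account>"
container = "<table-container>"
shared_key = "<shared-key>"

service = BlobServiceClient(
    account_url=f"https://{account}.blob.core.windows.net",
    credential=shared_key,
)
props = service.get_container_client(container).get_container_properties()
print(f"Connected: container '{container}' last modified {props.last_modified}")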

For Amazon S3:

1. Here you will select a data lake connector, enter the AWS region, enter the bucket, and paste both the access key and the secret key. 


2. Select the dropdown arrow under Data Lake connector to display the list of Data lake connectors configured in your Popdock account.


3. Select a Popdock Amazon S3 Data Lake connector.


4. Enter the AWS region of your storage bucket (example format: us-east-2).


5. Enter the name of your AWS S3 Bucket.


6. Paste the contents of your Access key.


7. Paste the contents of your Secret key.


8. Select whether you want to Use gzip. If selected, each list or table is converted to a gzip file, which provides performance gains when accessing the data through Popdock (the sketch after these steps shows the same compression done outside the tool).


9. After entering your Amazon S3 information, select Connect to validate.


10. Once the connection to your Amazon S3 storage is successfully validated, the installation process will automatically move to the next page.
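
As an illustration of what the Use gzip option does with each list, the sketch below compresses a local export with gzip and uploads it to an S3 bucket using boto3. The file name, bucket, region, and keys are placeholders, and treating the export as a CSV file is an assumption here, not a description of the tool's internal format.

# Sketch: gzip-compress a local export and upload it to an S3 bucket.
# The file, bucket, region, and keys are placeholders; the CSV format is an assumption.
import gzip
import shutil
import boto3

source = "customers.csv"            # hypothetical exported list
compressed = source + ".gz"

# Compress the file with gzip (smaller uploads, faster reads from Popdock)
with open(source, "rb") as f_in, gzip.open(compressed, "wb") as f_out:
    shutil.copyfileobj(f_in, f_out)

# Upload the compressed file to the S3 bucket used by the Data Lake connector
s3 = boto3.client(
    "s3",
    region_name="us-east-2",
    aws_access_key_id="<access-key>",
    aws_secret_access_key="<secret-key>",
)
s3.upload_file(compressed, "popdock-lists", compressed)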


Connect to your database

At the Database screen, you will provide connection information for the Dynamics NAV database. The installer will use the server and login information to test the connection to your database server.

You will need the account credentials list for this section.


1. Under Server, enter the hostname of your database server.


2. Enter the Port being used by your SQL server to provide database services. The default port is 1433, but before using it, confirm the TCP port your SQL Server instance is actually listening on (the sketch after these steps shows one way to test the connection and confirm the port).


3. Enter the name of the System Database.


4. Enter the SQL Server user credentials, under Username and Password.


5. Once all the information for the database server is entered, select Connect to validate.


6. If any of the database server connection information is not entered correctly, an error message will appear, and the installer will not move to the next page.


7. For additional information on common error messages, go to the Troubleshoot the tool section.


8. If a successful connection is made, the lists will begin loading and the installer will move to the next page.
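
If the Connect step keeps failing, it can help to test the same server, port, database, and credentials outside the tool. The sketch below is a minimal check using the pyodbc package; the ODBC driver name and all connection values are placeholders, and the port query requires the VIEW SERVER STATE permission.

# Sketch: test the SQL Server connection details outside the upload tool.
# Driver, server, database, and credentials are placeholders from your credentials list.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=<hostname>,1433;"          # host,port - use the port your instance listens on
    "DATABASE=<system-database>;"
    "UID=<sql-username>;PWD=<sql-password>"
)
cursor = conn.cursor()
# Confirm the TCP port the current connection is using (needs VIEW SERVER STATE)
cursor.execute(
    "SELECT local_tcp_port FROM sys.dm_exec_connections WHERE session_id = @@SPID"
)
print("Connected; listening port:", cursor.fetchone()[0])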


Select Companies

The Select Companies screen displays the available company databases to choose from. You will select the companies here, and on the next screen, you will choose the lists to copy.

1. Choose the companies by selecting the box to the left of each company name.


2. After selecting the boxes next to the companies you want to copy, select Next: Select lists.


Select Lists

On the Select Lists page, you will choose the lists you want to copy from the selected companies. The lists are copies of every NAV table.


1. Choose the lists you would like to upload by selecting the box to the left of each list.


2. After choosing your lists, select Next: Run.


Run

The Data Lake Upload Tool will begin the upload process. The upload tracking information below is available on the Run page:

Cancel the upload

Select Cancel to stop the upload and make changes to the settings in the upload tool (e.g., Data Lake, Database).


Track upload progress

1. The Progress section shows the overall status of the upload process.

The statuses below are displayed in the Progress section:
• Running is the total number of lists to upload.
• Pending is the remaining number to upload.
• Completed is the number that uploaded successfully.
• Failed is the number that failed to upload due to an error.
• Zero rows is the number that were not uploaded because they contain zero rows of data.


2. The Current process section displays the ongoing tasks the upload tool is performing and their status.


3. The Report screen will appear at the end of the upload.

You can view the statuses below in the Progress section:
• Success means the upload was successful.
• Failed means the upload failed due to an error.
• Skipped means the upload was skipped due to zero rows of data.
• Cancelled means the upload was cancelled.


4. Use Run again to execute the tool without having to re-enter login credentials.


5. Select View logs to access more detailed information about the upload or to investigate any copy failures.


6. The View logs button will open the folder containing the log files. You can open the log files in a text editor to view detailed information on failures. Use the log files to fix the upload errors, or send them to support@eonesolutions.com for additional support.


7. After fixing the errors identified in the log files, you can select the Re-run selected failures button to retry copying the uploads with a failed status.

If you have a disruption in network access that stops the upload, you can use the Re-run selected failures button to re-run the process.

Troubleshoot the tool

Database “Login Failed” Error

If you get the “login failed for user” error, check that your username and password:
• Are typed correctly
• Are SQL Server login credentials
• Belong to a login that has access to the system database

For further assistance with connecting to your database server, contact support@eonesolutions.com.


Database “Server not found” Error

The “Server is not found or was not accessible” error will appear if the computer running the Data Lake Upload Tool cannot connect to the database server.

In that case, you should:
• Make sure the server name is spelled correctly.
• Confirm that the server port entered matches the port the SQL server is listening on (the sketch after this list shows a quick reachability check).
• Check that the SQL server is up and running.
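
If those items all look correct, a plain network reachability test from the computer running the tool can help narrow the problem down. The sketch below uses only the Python standard library; the host name and port are placeholders from your credentials list.

# Sketch: check that the SQL Server host and port are reachable from this machine.
# Replace the placeholders with the values from your credentials list.
import socket

host, port = "<sql-server-hostname>", 1433

try:
    with socket.create_connection((host, port), timeout=5):
        print(f"{host}:{port} is reachable.")
except OSError as exc:
    print(f"Cannot reach {host}:{port} - check the name, port, firewall, and that SQL Server is running ({exc}).")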

For further assistance with connecting to your database server, contact support@eonesolutions.com.




Questions on the Data Lake Upload Tool? Email support at support@eonesolutions.com
