Converting All GP Data from One Environment to Another
Hi:
A client upgrading from GP 2016 to GP 18.2 wants their data migration conducted unconventionally.
They have GP 18.2 installed in a test environment with a data import tool, and they want all fourteen years of their transaction and master data imported into this new environment. Notice that I said "imported" rather than simply upgraded as part of the GP Utilities process.
First, would some sort of SmartConnect direct table-to-table import make the most sense?
Secondly, their main company database is 280 GB, and the list below shows the number of records in the major SQL tables, not counting third-party tables.
So, would the amount of time required to convert this data be more like weeks or even months rather than a weekend?
Thanks!
John
Table ID   Table Name                              # Records
GL10000    GL Transaction Work                             16
GL20000    Year to Date Transaction Open            1,121,821
GL30000    Account Transaction History             10,886,276
PM00200    PM Vendor Master                             7,761
PM10000    PM Transaction Work                              6
PM20000    PM Transaction Open                            182
PM30200    PM Paid Transaction History              1,029,050
PM30300    PM Apply to History                        896,341
RM00101    RM Customer Master                          35,346
RM10301    RM Sales Work                                    0
RM10201    RM Cash Receipts Work                          171
RM20101    RM Open File                               924,317
RM20201    RM Apply Open                              515,878
RM30101    RM History File                          1,076,141
IV00101    Item Master                                  1,514
IV10000    Inventory Transaction Work                       1
IV30200    Inventory Transaction History               27,495
IV30300    Inventory Transaction Amounts History    9,481,291
CO00101    Document Attachment Master                  48,395
SOP30200   Sales Transaction History               22,229,024
POP30300   Purchase Receipt History                    14,266
FA00100    Fixed Assets Master                         60,657
FA00200    Fixed Assets Book Master                    60,656
FA00902    Asset Financial Detail Master            4,537,350
CN00100    Collections – Notes                        650,494
Answers
I'd like to say that SmartConnect is the best solution for this scenario, but that isn't going to be the case.
While SmartConnect can easily do a "direct to table" import by choosing the 2016 database as the source and the 18.2 database as the destination, it performs the table inserts one row at a time. So any of these tables with a million-plus rows is going to take a very long time to process.
While I'm not an expert on platform-to-platform migrations like the one they are attempting, I don't understand why they don't just move the database and upgrade in place.
I could see wanting to essentially "start over" and therefore not wanting to bring the database along, but that doesn't seem to be the case here, because your list of table sizes includes history tables, which implies to me that they want that data too.
And if you are going to bring along all that history, you may as well just take the database. The upgrade itself will certainly take a while too, but I can't imagine it would take more time than trying to move the data over yourself.
But if they really want to go this route, I would probably use a tool like DTS, which is just going to shove data from point A to point B, which is what you suggest they want to do.
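To make the row-by-row concern concrete, here is a rough sketch of the difference between per-row inserts and a batched copy. It uses SQLite in-memory tables as a stand-in (the table and column names are borrowed from GP for flavor only; this is not SmartConnect's actual API, and real per-row tools also pay network round trips and business-logic costs on top of this):

```python
import sqlite3
import time

DDL = "CREATE TABLE GL30000 (JRNENTRY INTEGER, ACTINDX INTEGER, DEBITAMT REAL)"

# A "source" database standing in for the GP 2016 company DB.
src = sqlite3.connect(":memory:")
src.execute(DDL)
src.executemany("INSERT INTO GL30000 VALUES (?, ?, ?)",
                [(i, i % 500, i * 0.01) for i in range(100_000)])
src.commit()
rows = src.execute("SELECT * FROM GL30000").fetchall()

def copy_row_by_row(rows, dst):
    # What a per-row integration tool effectively does: one INSERT per record.
    t0 = time.perf_counter()
    for row in rows:
        dst.execute("INSERT INTO GL30000 VALUES (?, ?, ?)", row)
    dst.commit()
    return time.perf_counter() - t0

def copy_batched(rows, dst):
    # Closer to a set-based copy: the whole row set in one batched call.
    t0 = time.perf_counter()
    dst.executemany("INSERT INTO GL30000 VALUES (?, ?, ?)", rows)
    dst.commit()
    return time.perf_counter() - t0

dst1 = sqlite3.connect(":memory:")
dst1.execute(DDL)
slow = copy_row_by_row(rows, dst1)

dst2 = sqlite3.connect(":memory:")
dst2.execute(DDL)
fast = copy_batched(rows, dst2)

print(f"row-by-row: {slow:.3f}s, batched: {fast:.3f}s")
```

Scale that gap up to the 22-million-row SOP30200 table, add per-row validation and network latency, and the "weeks rather than a weekend" worry looks justified.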
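If they do go the tool route, the operation the tool ultimately needs to perform is a cross-database, set-based copy. A minimal illustration of that shape, again using SQLite (ATTACH here stands in for a linked or same-instance SQL Server database; RM00101's real schema has far more columns than shown):

```python
import sqlite3
import os
import tempfile

# File-backed "source" database standing in for the GP 2016 company DB.
src_path = os.path.join(tempfile.mkdtemp(), "gp2016.db")
src = sqlite3.connect(src_path)
src.execute("CREATE TABLE RM00101 (CUSTNMBR TEXT, CUSTNAME TEXT)")
src.executemany("INSERT INTO RM00101 VALUES (?, ?)",
                [(f"C{i:05d}", f"Customer {i}") for i in range(1000)])
src.commit()
src.close()

# "Target" database; after ATTACH, one set-based INSERT ... SELECT
# moves the entire table in a single statement.
dst = sqlite3.connect(":memory:")
dst.execute("CREATE TABLE RM00101 (CUSTNMBR TEXT, CUSTNAME TEXT)")
dst.execute("ATTACH DATABASE ? AS gp2016", (src_path,))
dst.execute("INSERT INTO RM00101 SELECT CUSTNMBR, CUSTNAME FROM gp2016.RM00101")
dst.commit()

copied = dst.execute("SELECT COUNT(*) FROM RM00101").fetchone()[0]
print(copied)
```

This is the kind of bulk, server-side push a DTS/ETL package would generate, and it is why "just take the database and upgrade in place" is usually faster still: it skips the copy entirely.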