UtterAccess.com
Flexible Excel Import Speed Issue, Access 2016

chilladsi
Jul 29 2019, 11:08 AM
Post#1



Posts: 21
Joined: 10-May 18



I have created an import routine that allows me to import Excel files of various formats into some back-end SQL tables. Basically I have a table with a record per file, and a sub-table for storing the information about the columns, where I can describe the structure of the file with column references and column headings and then map these onto a SQL stored procedure that does the insert. The issue is that this runs a row at a time: it opens the Excel file in VBA and reads it a row at a time, passing each row as a range to a sub procedure that extracts the required values from the row and builds the insert call.
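
To illustrate, the outer loop looks roughly like this (all names here are simplified placeholders, the mapping lookup is omitted, and InsertRowFromRange is just a stand-in for the sub procedure described above):

Sub ImportWorkbook(strPath As String)
    ' Minimal sketch of the row-by-row pattern described above.
    Dim xl As Object, wb As Object, ws As Object
    Dim lngRow As Long
    Set xl = CreateObject("Excel.Application")
    Set wb = xl.Workbooks.Open(strPath, , True)   ' open read-only
    Set ws = wb.Worksheets(1)
    ' Assumes data starts at row 1 with a header row
    For lngRow = 2 To ws.UsedRange.Rows.Count
        ' One round trip to SQL Server per data row - the bottleneck
        InsertRowFromRange ws.Rows(lngRow)
    Next lngRow
    wb.Close False
    xl.Quit
End Sub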

We need the flexibility because the supplier of these files can change the format without much warning, so we need to be able to alter the process without much reprogramming. We can also pick and choose which columns we want, and skip any we aren't interested in.

In initial trials it worked pretty well, but now that they have gone into full production we have some files with 400k rows, which is taking far too long with the current approach.

I could create an SSIS package to import the data directly into SQL much faster, but that will break and need to be changed and recompiled every time they change the file format.

Anyone have any ideas as to a flexible, yet performant solution?

Many Thanks.
This post has been edited by chilladsi: Jul 29 2019, 11:10 AM
 
theDBguy
Jul 29 2019, 12:05 PM
Post#2


UA Moderator
Posts: 76,821
Joined: 19-June 07
From: SunnySandyEggo


Hi. Just curious, does the SP insert the data from the original Excel files or from a temp table in Access? Is the "slowness" on the Access or the server side?

--------------------
Just my 2 cents... "And if I claim to be a wise man, it surely means that I don't know" - Kansas
Access Website | Access Blog | Email
 
MadPiet
Jul 29 2019, 12:20 PM
Post#3



Posts: 3,364
Joined: 27-February 09



How flexible do you need it to be?
Could you determine the structure on the fly? I'd hate to go through Access, because doing it in SSIS would be a lot faster...

What if you did the simple transform stuff in Access, output the data to Excel, and then used a bulk insert from Excel into SQL Server?
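
Something along these lines, say (everything here is invented: dbo.ImportStaging is a made-up staging table, and since BULK INSERT wants a flat file the server can reach, you would export delimited text rather than a real workbook):

Sub BulkLoadCsv(cn As ADODB.Connection, strCsvPath As String)
    ' Sketch only: assumes dbo.ImportStaging already exists and the
    ' SQL Server service account can read the file at strCsvPath.
    Dim strSql As String
    strSql = "BULK INSERT dbo.ImportStaging " & _
             "FROM '" & strCsvPath & "' " & _
             "WITH (FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');"
    cn.Execute strSql, , adCmdText
End Sub

One set-based load like that should leave 400k per-row EXEC calls far behind.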
This post has been edited by MadPiet: Jul 29 2019, 12:22 PM
 
WildBird
Jul 29 2019, 08:31 PM
Post#4


UtterAccess VIP
Posts: 3,671
Joined: 19-August 03
From: Auckland, Little Australia


It's the row-by-row that is hurting, obviously. You need to build a system that works out the structure of the file first, then imports it as a whole file. I would look at dynamic SQL strings for building tables etc., import like that into staging tables, and then deal with those.
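
For example, something along these lines (the recordset fields ColumnName and SqlType are made-up names; the idea is just to generate the DDL from your column-mapping table):

Function BuildStagingDDL(rsCols As ADODB.Recordset) As String
    ' Sketch: rsCols is assumed to hold one record per mapped column.
    Dim strDDL As String
    strDDL = "CREATE TABLE dbo.ImportStaging ("
    Do While Not rsCols.EOF
        strDDL = strDDL & "[" & rsCols!ColumnName & "] " & rsCols!SqlType & ", "
        rsCols.MoveNext
    Loop
    BuildStagingDDL = Left$(strDDL, Len(strDDL) - 2) & ");"
End Function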

--------------------
Beer, nature's brain defragging tool.
 
chilladsi
Jul 30 2019, 02:42 AM
Post#5



Posts: 21
Joined: 10-May 18



Thanks all for the replies - I will try to answer all the questions.

The process reads each Excel row and passes the row as a range to a sub routine - there it builds the parameters for a SQL stored procedure and calls the SP via an ADO Command object (adCmdText) to do the insert straight into the SQL table. Outside of the loop I examine my tables where I describe the schema, to build out the required parameters and data placeholders into a string, and then on a row-by-row basis I create a copy of that string with the placeholders replaced by the data values. Then I execute the string via the ADO Command object, as it's in the format "EXEC spName @Var1 = val1, @Var2 = val2", for example.
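
In rough outline, the per-row call is something like this (names are illustrative, and the quoting is simplified):

Sub InsertRowViaSP(cn As ADODB.Connection, strTemplate As String, rng As Object)
    ' strTemplate is prebuilt once, outside the loop, e.g.
    '   "EXEC dbo.spInsertRow @Var1 = {0}, @Var2 = {1}"
    Dim strSql As String
    Dim cmd As ADODB.Command
    strSql = Replace(strTemplate, "{0}", "'" & rng.Cells(1, 1).Value & "'")
    strSql = Replace(strSql, "{1}", "'" & rng.Cells(1, 2).Value & "'")
    Set cmd = New ADODB.Command
    Set cmd.ActiveConnection = cn
    cmd.CommandType = adCmdText
    cmd.CommandText = strSql
    cmd.Execute   ' one round trip per row
End Sub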

It does need to be flexible as we know the supplier of the files is going to be making incremental additions to their files over the coming weeks / months.

It would be ideal, as stated, to be able to determine the structure dynamically but then do the import as a bulk-style operation. Should I be looking at building out some sort of temp table dynamically, based on what is in the file, and then transferring the data to my production tables based on what I know to be the more permanent data I require? Does that sound feasible / better?
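
In other words, something like this as the final hop (table and column names invented, and TRY_CONVERT assumes SQL Server 2012 or later):

Sub TransferStagingToProduction(cn As ADODB.Connection)
    ' Sketch: after bulk-loading the staging table, move only the
    ' columns we actually need into production in one set-based
    ' statement instead of 400k single-row inserts.
    cn.Execute "INSERT INTO dbo.ProductionTable (CustomerRef, Amount) " & _
               "SELECT CustomerRef, TRY_CONVERT(money, Amount) " & _
               "FROM dbo.ImportStaging;", , adCmdText
End Sub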