
UtterAccess Forums _ Access Automation _ Import Excel Worksheet Using Vba - Errors When Range Is Too Big

Posted by: scoobster Oct 7 2019, 03:43 PM


I'm importing data from Excel (.xlsx format) to a new table using the following statement.

DoCmd.TransferSpreadsheet TransferType:=acImport, _
    SpreadsheetType:=acSpreadsheetTypeExcel12Xml, _
    TableName:="importExcelNow", _
    FileName:=Forms!frmComparison!txtWorkbook1, _
    HasFieldNames:=True, _
    Range:=Forms!frmComparison!cboWorkbookSheets1.Column(1) & "!A1:XFD1048576"

I have to use 'Range' because the workbook could have more than one sheet (e.g. 'Original', 'Migrated', etc.).

The import works fine until I increase the range. If I set the range to the maximum possible columns and rows, I get run-time error 3011: the Access database engine could not find the object 'Original$A1:XFD1048576'. If I change the range to 'A1:Z100', the import runs successfully.

The range of data in the Excel sheet will vary so I don't really want to guess a maximum column and row.

Anything I can do to make this work?


Posted by: June7 Oct 7 2019, 04:07 PM

Just reference the sheet name with the $ character. All columns with data should be imported. If the code is behind frmComparison, you can use the Me qualifier.

DoCmd.TransferSpreadsheet acImport, acSpreadsheetTypeExcel12Xml, "importExcelNow", Me.txtWorkbook1, True, Me.cboWorkbookSheets1.Column(1) & "$"

Posted by: ADezii Oct 7 2019, 05:18 PM

To be sure, for each selected Worksheet, you could reference the UsedRange Property of that Worksheet:

'Assuming wks has previously been defined
Range:=Forms!frmComparison!cboWorkbookSheets1.Column(1) & "!" & wks.UsedRange.Address
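For completeness, a minimal sketch of how wks could be obtained via late-bound Excel automation (the automation code and variable names here are my assumptions, not from the original post; UsedRange.Address returns dollar-sign notation, so the $ characters are stripped before handing the range to TransferSpreadsheet):

```vba
' Sketch: open the workbook via automation just long enough to read
' UsedRange, close Excel, then run the import with the exact range.
Dim xlApp As Object          ' Excel.Application (late bound)
Dim wkb As Object            ' Excel.Workbook
Dim wks As Object            ' Excel.Worksheet
Dim strRange As String

Set xlApp = CreateObject("Excel.Application")
Set wkb = xlApp.Workbooks.Open(Forms!frmComparison!txtWorkbook1, ReadOnly:=True)
Set wks = wkb.Worksheets(Forms!frmComparison!cboWorkbookSheets1.Column(1))

' UsedRange.Address gives e.g. "$A$1:$H$79000"; strip the $ signs
strRange = Replace(wks.UsedRange.Address, "$", "")

wkb.Close SaveChanges:=False
xlApp.Quit
Set wks = Nothing: Set wkb = Nothing: Set xlApp = Nothing

DoCmd.TransferSpreadsheet acImport, acSpreadsheetTypeExcel12Xml, _
    "importExcelNow", Forms!frmComparison!txtWorkbook1, True, _
    Forms!frmComparison!cboWorkbookSheets1.Column(1) & "!" & strRange
```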

Posted by: June7 Oct 7 2019, 06:25 PM

I expect that would only work if using Excel automation. Doesn't look like OP is doing that. I don't think TransferSpreadsheet is used with Excel automation.

Posted by: isladogs Oct 7 2019, 10:54 PM

Try making the Excel worksheet a linked table in Access and then import the required data using a make table or append query.
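A minimal sketch of that approach; the linked table name "lnkExcel" is a placeholder of mine, not from the post:

```vba
' Link the worksheet instead of importing it (acLink), then pull the
' rows into a real local table with a make-table query.
Dim strSheet As String
strSheet = Forms!frmComparison!cboWorkbookSheets1.Column(1)

' Creates a linked table "lnkExcel" pointing at the chosen sheet
DoCmd.TransferSpreadsheet acLink, acSpreadsheetTypeExcel12Xml, _
    "lnkExcel", Forms!frmComparison!txtWorkbook1, True, strSheet & "$"

' Make-table query: copies everything into local table "importExcelNow"
CurrentDb.Execute "SELECT * INTO importExcelNow FROM lnkExcel", dbFailOnError

' Optionally drop the link once the data is local
DoCmd.DeleteObject acTable, "lnkExcel"
```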

Posted by: scoobster Oct 8 2019, 03:11 AM

this is top advice!

I linked the table and used a 'create table' query: importing 79k rows from Excel took 8 seconds. The link-then-'convert to local' and TransferSpreadsheet processes each took between 130 and 150 seconds to complete.

I'll use the link / create table and then set the index on the newly created table.
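Setting the index afterwards can be done with a DDL statement; a sketch, assuming the unique value column is called ID (a placeholder name):

```vba
' Add a unique index on the identified key column of the new table
CurrentDb.Execute _
    "CREATE UNIQUE INDEX idxKey ON importExcelNow (ID)", dbFailOnError
```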

Many thanks.

Posted by: ADezii Oct 8 2019, 06:44 AM

I expect that would only work if using Excel automation.

Definitely true; it was just a side note on determining the appropriate Range. You could actually determine the exact Range to import for a sheet via automation and then apply it to the TransferSpreadsheet() method, but that is a little rogue, even for me.

Posted by: isladogs Oct 8 2019, 07:09 AM

Glad my suggestion worked for you.
If you are going to repeat the import of data from Excel, don't use a make table query each time, as it will cause unnecessary bloat and can lead to instability.
Instead, synchronise your data using a combination of append, update and possibly delete queries.
For more information on possible approaches, see the second part of this article
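A sketch of the append-then-update pattern, assuming a permanent local table tblData keyed on ID, a linked sheet lnkExcel, and a field SomeField (all three names are placeholders of mine):

```vba
' Append rows that are new (no matching ID in the local table)
CurrentDb.Execute _
    "INSERT INTO tblData SELECT L.* FROM lnkExcel AS L " & _
    "LEFT JOIN tblData AS T ON L.ID = T.ID WHERE T.ID IS NULL", dbFailOnError

' Update rows that already exist with the latest values from the sheet
CurrentDb.Execute _
    "UPDATE tblData AS T INNER JOIN lnkExcel AS L ON T.ID = L.ID " & _
    "SET T.SomeField = L.SomeField", dbFailOnError
```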

Posted by: scoobster Oct 8 2019, 10:50 AM

that's an excellent article. Thanks.

The issue I have is that the Excel workbook will be different in terms of structure and data each time, so I have to create the table from the linked workbook, set the index on the already-identified unique value column, and then set up queries to run. That's the next challenge: how to set up the existing queries to use the newly created table(s).

I'll clean up the tables and compact the DB to reduce bloating. Thanks for the heads-up on that.

Posted by: isladogs Oct 8 2019, 11:20 AM

Hmm...sounds like 'fun'

If the PK field of your created table is used as the PK or FK in other tables, you could have issues as the new records with the same ID won't necessarily match up.
Also, if you need to do this very frequently, you may need to decompile your app occasionally to prevent the instability / corruption I mentioned.

Posted by: dmhzx Oct 9 2019, 10:01 AM

A couple of things to watch for when using TransferSpreadsheet.

If the data in a cell is more than 255 characters, it will be silently truncated.

If you haven't set the data type by column, the Access database engine (via its Excel driver) will guess each column's data type from the first few rows and just ignore anything that doesn't fit. That can be great fun if your column should be a date but the first few rows don't have dates.

(if your system date is not in MM/dd/yyyy format, things can be even more 'enjoyable')

In fact, any data that doesn't match the determined type will silently fail to import.

In some cases you might get an import error table created.

I'm not sure how good or how important the data in Excel is, but I have often opted for a much longer import time in order to get good data (for example, the number of times Excel has a date of November 31st).