UtterAccess.com
> Using MS Access 2013 To Move A Large Input .txt File With Fixed-Length Fields To A SQL Server Table, Access 2013
 
   
mlcktmguy
post Jan 15 2018, 12:25 AM
Post#1



Posts: 12
Joined: 5-February 17



I am developing an MS Access 2013 application with a SQL Server backend for a client. As part of the project I have to write conversion logic to transform their legacy data into the new format, resident in SQL Server tables.

I wrote and tested my conversion routines with a subset of the legacy data provided by the client. Everything worked fine.
The client is supplying the data as a .txt file. Within the .txt file the fields are fixed length. I created an Import Spec for each data set being converted, and then imported each file using this routine:
CODE
Public Sub importThisOne(passedImportFIle As String, _
                         passedImportTable As String, _
                         passedImportSpec As String, _
                Optional passedClearFileBeforeImporting As Boolean = True)
'
Dim deleteString As String

If passedClearFileBeforeImporting = True Then
    deleteString = "DELETE * FROM " & passedImportTable
    DoCmd.SetWarnings False
    DoCmd.RunSQL deleteString
    DoCmd.SetWarnings True
End If
'
DoCmd.TransferText acImportFixed, _
                   passedImportSpec, _
                   passedImportTable, _
                   passedImportFIle, _
                   False
'
End Sub

passedImportFIle – the complete path, including the file name, of the .txt file to be imported
passedImportTable – the linked SQL Server table the imported data will go into
passedImportSpec – the name of the Import Spec for the table being imported
passedClearFileBeforeImporting – Boolean flag; if True, the SQL table is cleared of data before the import
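
A call would look something like this (the path, table, and spec names here are hypothetical):
CODE
importThisOne "C:\ImportFiles\CostFile.txt", "SQCS_CostFile_Import", "CostFileImportSpec", True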

Everything worked perfectly with the subset of the data provided by the client for initial testing. The client just provided a complete set of legacy data to test against my import, and one very large input file is causing issues.

When the logic executes the above DoCmd.TransferText with the large file as input, I get this error:
Runtime error 3183: The query cannot be completed. Either the size of the query result is larger than the maximum size of a database (2 GB), or there is not enough temporary storage space on the disk to store the query result.

My .accdb is 1 GB and my disk is nowhere near full.

It appears that I will have to change my technique for moving the large input .txt files to their related SQL tables. I think .txt files can be linked to an application, but I've never done that.

What method would you use to get this large, fixed-length .txt input file into the SQL table?

Is there some way I can connect directly to the large .txt file?
 
JonSmith
post Jan 15 2018, 05:18 AM
Post#2



Posts: 3,979
Joined: 19-October 10



So, from my experience, when importing data like this Access tries to process every single record in the import before committing the data. This runs smoothly until memory hits some sort of limit, after which each record added takes exponentially longer, as if Access were somehow reprocessing every record in memory each time it adds a new one. Don't ask me why; it's just what I observed during testing.
I can hazard a guess at why it processes all the data first: during an import, Access displays a dialog telling you how many records you are about to commit and whether there are any conversion errors, so you can choose to roll back and not commit. To do that it has to hold every record in memory.
To get around this I wrote some code to import from a CSV, looping down line by line and inserting one row at a time. I wrapped the inserts in a transaction to make them go more smoothly, and found a golden number of records per batch to keep it fast (if you commit the transaction after a higher number of records, you'll see the slowdown I mentioned).
I'll try to find my code.
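In the meantime, the shape of it is roughly this (a minimal, untested sketch; the destination table and first field come from this thread, and the batch size of 500 is just a starting point to tune):
CODE
' Untested sketch: read the text file line by line and commit in small batches.
Public Sub importLineByLine(passedImportFile As String)

    Dim db As DAO.Database
    Dim rs As DAO.Recordset
    Dim strLine As String
    Dim lngCount As Long
    Dim intFile As Integer

    Set db = CurrentDb
    Set rs = db.OpenRecordset("SQCS_CostFile_Import", dbOpenDynaset, dbSeeChanges)

    intFile = FreeFile
    Open passedImportFile For Input As #intFile

    BeginTrans
    Do While Not EOF(intFile)
        Line Input #intFile, strLine
        rs.AddNew
        rs![CostName] = Mid$(strLine, 1, 30)    ' ... parse the remaining fields the same way
        rs.Update
        lngCount = lngCount + 1
        If lngCount Mod 500 = 0 Then            ' commit in small batches to avoid the slowdown
            CommitTrans
            DoEvents
            BeginTrans
        End If
    Loop
    CommitTrans                                 ' commit the final partial batch

    Close #intFile
    rs.Close
    Set rs = Nothing

End Sub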

JS
 
nvogel
post Jan 15 2018, 05:24 AM
Post#3



Posts: 891
Joined: 26-January 14
From: London, UK


If the target is SQL Server then why not import the text file using SQL Server Integration Services, or the bulk insert feature, or the OPENROWSET feature? SQL Server arguably makes a better tool than Access for this kind of job. There is an import/export wizard in SQL Server Management Studio that can create a package to do the basic import - if you have more sophisticated requirements you can edit the package or create your own.
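
If you want to stay inside Access while letting the server do the work, a BULK INSERT can even be fired from VBA through a pass-through query. An untested sketch (the server, database, staging table, and path are placeholders; note the file path must be visible to the SQL Server service, not just to the PC running Access):
CODE
' Untested sketch: run BULK INSERT on the server from Access, via a DAO
' pass-through query. Server, database, table, and path are placeholders.
Public Sub bulkLoadStagingTable()

    Dim qdf As DAO.QueryDef

    Set qdf = CurrentDb.CreateQueryDef("")
    qdf.Connect = "ODBC;DRIVER={SQL Server};SERVER=MyServer;DATABASE=MyDb;Trusted_Connection=Yes;"
    qdf.ReturnsRecords = False
    ' Load each fixed-length line, whole, into a one-column staging table.
    qdf.SQL = "BULK INSERT dbo.MyStagingTable " & _
              "FROM 'C:\ImportFiles\BigFile.txt' " & _
              "WITH (ROWTERMINATOR = '\n', TABLOCK);"
    qdf.Execute dbFailOnError

End Sub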
 
JonSmith
post Jan 15 2018, 05:34 AM
Post#4



Posts: 3,979
Joined: 19-October 10



That sounds like a better suggestion; mine was focused on an MS Access back end.
I won't bother finding my code, as I think you should follow that route first.
 
mlcktmguy
post Jan 15 2018, 03:01 PM
Post#5



Posts: 12
Joined: 5-February 17



Jon,
Please keep looking for your code. I took the bulk load suggestion and was able to load the data directly into a SQL Server table. However, this file contains legacy data that needs to be converted into the new DB format. It is not a simple one-record-to-one-record conversion: I am restructuring this input legacy data into a couple of different record types.

Even though I got the data to load into SQL, this is still an MS Access issue. Once I bulk loaded the data into SQL, I have an Access VBA routine that reads each record in the SQL table, creating output records in other SQL tables. The logic works, but I still get the same 3183 error after about 30,000 records are read and written.

I think your idea about transactions may be the answer to the situation I am now faced with, which is processing all of the records using an Access front end.

If you can find your code I would really appreciate it.

This is my current routine, which reads the bulk-loaded SQL table and parses each record into individual fields.
CODE
Public Sub readBigLumpOfData()

Dim wkCostName As String
Dim wkMuniCode As String
Dim wkLotBlock As String
Dim wkTaxType As String
Dim wkMuniCode02 As String
Dim wkControlNumber As String
Dim wkTaxType02 As String
Dim wkCostDetail01_100 As String
Dim wkGRBNumnber As String
Dim wkGDNumber As String
Dim wkRemarks As String

Dim db As DAO.Database
Dim recCount As Long
Dim totRecs As Long
Dim wkStartTime As Date
Dim wkCurrTime As Date

'On Error Resume Next
wkStartTime = Now

clearSQLTable_Sproc "SQCS_CostFile_Import"
' output file
Set db = getCurrentDbC
Dim rsOut As DAO.Recordset
Set rsOut = db.OpenRecordset("SQCS_CostFile_Import", dbOpenDynaset, dbSeeChanges)
'
Dim rsin2 As DAO.Recordset
Set rsin2 = db.OpenRecordset("SQCS_EntireRecAs1Field_Import1", dbOpenSnapshot, dbReadOnly)   ' opened read-only
'
If Not rsin2.EOF Then
        '
        While Not rsin2.EOF
            '
             recCount = recCount + 1
             totRecs = totRecs + 1
             '
             If recCount = 10000 Then        ' progress report every 10,000 records
                 recCount = 0
                 wkCurrTime = Now
                 Debug.Print totRecs & ", " & RunTime(wkStartTime, wkCurrTime)
                 DoEvents
             End If
            
            wkCostName = Mid(rsin2!EntireRec, 1, 30)
            wkMuniCode = Mid(rsin2!EntireRec, 31, 4)
            wkLotBlock = Mid(rsin2!EntireRec, 35, 17)
            wkTaxType = Mid(rsin2!EntireRec, 52, 1)
            wkMuniCode02 = Mid(rsin2!EntireRec, 53, 4)
            wkControlNumber = Mid(rsin2!EntireRec, 57, 7)
            wkTaxType02 = Mid(rsin2!EntireRec, 64, 1)
            wkCostDetail01_100 = Mid(rsin2!EntireRec, 65, 14000)
            wkGRBNumnber = Mid(rsin2!EntireRec, 14065, 30)
            wkGDNumber = Mid(rsin2!EntireRec, 14095, 30)
            wkRemarks = Mid(rsin2!EntireRec, 14125, 76)
            
            
            
            With rsOut
               .AddNew
                     ![CostName] = wkCostName
                     ![MuniCode] = wkMuniCode
                     ![LotBlock] = wkLotBlock
                     ![TaxType] = wkTaxType
                     ![MuniCode02] = wkMuniCode02
                     ![ControlNumber] = wkControlNumber
                     ![TaxType02] = wkTaxType02
                     ![CostDetail01_100] = wkCostDetail01_100
                     ![GRBNumnber] = wkGRBNumnber
                     ![GDNumber] = wkGDNumber
                     ![Remarks] = wkRemarks
               .Update
            End With
    
            rsin2.MoveNext
        Wend
End If

rsOut.Close
Set rsOut = Nothing
'
rsin2.Close
Set rsin2 = Nothing

Debug.Print "AllRecs: "; totRecs & ", " & RunTime(wkStartTime, wkCurrTime)
MsgBox "Complete"

End Sub
 
mlcktmguy
post Jan 15 2018, 03:14 PM
Post#6



Posts: 12
Joined: 5-February 17



Jon,
I tried revising my code without your example, but I still end up with the same error.
This is the revised code: basically, I added a BeginTrans at the beginning of the data read and paired a CommitTrans/BeginTrans at every 5,000th record processed.
Hopefully your idea will work better.
CODE
Public Sub readBigLumpOfData()

Dim wkCostName As String
Dim wkMuniCode As String
Dim wkLotBlock As String
Dim wkTaxType As String
Dim wkMuniCode02 As String
Dim wkControlNumber As String
Dim wkTaxType02 As String
Dim wkCostDetail01_100 As String
Dim wkGRBNumnber As String
Dim wkGDNumber As String
Dim wkRemarks As String

Dim db As DAO.Database
Dim recCount As Long
Dim totRecs As Long
Dim wkStartTime As Date
Dim wkCurrTime As Date

'On Error Resume Next
wkStartTime = Now

clearSQLTable_Sproc "SQCS_CostFile_Import"
' output file
Set db = getCurrentDbC
Dim rsOut As DAO.Recordset
Set rsOut = db.OpenRecordset("SQCS_CostFile_Import", dbOpenDynaset, dbSeeChanges)
'
Dim rsin2 As DAO.Recordset
Set rsin2 = db.OpenRecordset("SQCS_EntireRecAs1Field_Import1", dbOpenSnapshot, dbReadOnly)   ' opened read-only
'
BeginTrans    ' start the first batch

If Not rsin2.EOF Then
        '
        While Not rsin2.EOF
            '
             recCount = recCount + 1
             totRecs = totRecs + 1
             '
             If recCount = 5000 Then         ' commit in batches of 5,000
                 recCount = 0
                 wkCurrTime = Now
                 Debug.Print totRecs & ", " & RunTime(wkStartTime, wkCurrTime)
                 DoEvents
                 CommitTrans    ' commit this batch
                 BeginTrans     ' and start the next one
             End If
            
            wkCostName = Mid(rsin2!EntireRec, 1, 30)
            wkMuniCode = Mid(rsin2!EntireRec, 31, 4)
            wkLotBlock = Mid(rsin2!EntireRec, 35, 17)
            wkTaxType = Mid(rsin2!EntireRec, 52, 1)
            wkMuniCode02 = Mid(rsin2!EntireRec, 53, 4)
            wkControlNumber = Mid(rsin2!EntireRec, 57, 7)
            wkTaxType02 = Mid(rsin2!EntireRec, 64, 1)
            wkCostDetail01_100 = Mid(rsin2!EntireRec, 65, 14000)
            wkGRBNumnber = Mid(rsin2!EntireRec, 14065, 30)
            wkGDNumber = Mid(rsin2!EntireRec, 14095, 30)
            wkRemarks = Mid(rsin2!EntireRec, 14125, 76)
            
            
            
            With rsOut
               .AddNew
                     ![CostName] = wkCostName
                     ![MuniCode] = wkMuniCode
                     ![LotBlock] = wkLotBlock
                     ![TaxType] = wkTaxType
                     ![MuniCode02] = wkMuniCode02
                     ![ControlNumber] = wkControlNumber
                     ![TaxType02] = wkTaxType02
                     ![CostDetail01_100] = wkCostDetail01_100
                     ![GRBNumnber] = wkGRBNumnber
                     ![GDNumber] = wkGDNumber
                     ![Remarks] = wkRemarks
               .Update
            End With
    
            rsin2.MoveNext
        Wend
End If

CommitTrans    ' commit the final partial batch

rsOut.Close
Set rsOut = Nothing
'
rsin2.Close
Set rsin2 = Nothing

Debug.Print "AllRecs: "; totRecs & ", " & RunTime(wkStartTime, wkCurrTime)
MsgBox "Complete"

End Sub
 
datAdrenaline
post Jan 16 2018, 04:05 PM
Post#7


UtterAccess Editor
Posts: 17,964
Joined: 4-December 03
From: Northern Virginia, USA


You can do this through linked table objects ... create a linked table object for your fixed-length text file and one for your destination, then create a simple APPEND query object ... I do it all the time. If you want to get really fancy, you don't even need the linked table objects -- but you can still deal with the data as a set instead of row by row. A rough sketch of the linked-table route is below.
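
A minimal, untested sketch of that route (the spec name, link name, and path are placeholders, and the set-based append assumes the linked file's columns line up with the destination table):
CODE
' Untested sketch: link the fixed-length file, then append from it as a set.
Public Sub linkAndAppend()

    ' acLinkFixed creates a linked table over the text file, using a saved import spec.
    DoCmd.TransferText acLinkFixed, "CostFileImportSpec", "lnkCostFile", _
                       "C:\ImportFiles\BigFile.txt", False

    ' One set-based append to the linked SQL Server table, instead of a row-by-row loop.
    CurrentDb.Execute "INSERT INTO SQCS_CostFile_Import SELECT * FROM lnkCostFile", dbFailOnError

End Sub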



 
JonSmith
post Jan 16 2018, 04:12 PM
Post#8



Posts: 3,979
Joined: 19-October 10



I found my code earlier but didn't get a chance to edit it and post. I'd documented that 5,000 records were taking around 6 minutes, and doing it in chunks reduced that to less than 2. I used a progress bar to find my slowdown number, since you can see it drop to a crawl. I think I committed every 500 records, but I need to double check.

In the meantime, try using smaller transactions and a progress bar to monitor the slowdown.
Brent, don't linked tables have the same slowdown issues with large numbers of records?
 
Jeff B.
post Jan 16 2018, 04:24 PM
Post#9


UtterAccess VIP
Posts: 10,166
Joined: 30-April 10
From: Pacific NorthWet


Another possible reason for a slowdown could be if the SQL Server table(s) into which the data is going have indexes on one or more of the target columns... one common mitigation is sketched below.
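
An untested sketch of that mitigation (the index and table names are placeholders; only disable nonclustered indexes, since disabling a clustered index takes the whole table offline):
CODE
' Untested sketch: disable a nonclustered index before the big load and
' rebuild it afterwards, via a pass-through query. Names are placeholders.
Public Sub runOnServer(ByVal strSQL As String)

    Dim qdf As DAO.QueryDef

    Set qdf = CurrentDb.CreateQueryDef("")
    qdf.Connect = "ODBC;DRIVER={SQL Server};SERVER=MyServer;DATABASE=MyDb;Trusted_Connection=Yes;"
    qdf.ReturnsRecords = False
    qdf.SQL = strSQL
    qdf.Execute dbFailOnError

End Sub

' Before the load:
'   runOnServer "ALTER INDEX IX_CostName ON dbo.SQCS_CostFile_Import DISABLE;"
' After the load:
'   runOnServer "ALTER INDEX IX_CostName ON dbo.SQCS_CostFile_Import REBUILD;"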
 
datAdrenaline
post Jan 16 2018, 05:13 PM
Post#10


UtterAccess Editor
Posts: 17,964
Joined: 4-December 03
From: Northern Virginia, USA


So ... since you are trying to use code, and you asked "Is there some way I can connect directly to the large .txt file?" ... try this out.

First go to the location of your text file, then ADD a file to that folder named Schema.ini (Microsoft's documentation describes the Schema.ini format in detail).

Then add the following spec ...

[TheTextFileYouAreUsing.txt]
Format=FixedLength
ColNameHeader=False
CharacterSet=ANSI
Col1=CostName Text Width 30
Col2=MuniCode Text Width 4
Col3=LotBlock Text Width 17
Col4=TaxType Text Width 1
Col5=MuniCode02 Text Width 4
Col6=ControlNumber Text Width 7
Col7=TaxType02 Text Width 1
Col8=CostDetail01_100 Memo Width 14000
Col9=GRBNumnber Text Width 30
Col10=GDNumber Text Width 30
Col11=Remarks Text Width 76

Now, save that Schema.ini file. Remember it MUST be in the same folder as the .txt file you are trying to use as a data source.


Then ... your code would would be something like this:

CODE
Public Sub readBigLumpOfData()
    
    Dim db As DAO.Database
    Set db = getCurrentDbC
    
    Dim strSource As String
    Dim strSQL As String
    Dim dtStartTime As Date
    Dim lngRecordsAffected As Long
    
    dtStartTime = Now

    strSource = "[Text;Database=C:\folderPathThatHasTheTXTFile].[TheTextFileYouAreUsing#txt]" 'Yes that # was intentional

    strSQL = "INSERT INTO SQCS_CostFile_Import (" & _
                "[CostName]" & _
                ", [MuniCode]" & _
                ", [LotBlock]" & _
                ", [TaxType]" & _
                ", [MuniCode02]" & _
                ", [ControlNumber]"  & _
                ", [TaxType02]" & _
                ", [CostDetail01_100]" & _
                ", [GRBNumnber]" & _
                ", [GDNumber]" & _
                ", [Remarks]" & _
            " SELECT " & _
                "[CostName]" & _
                ", [MuniCode]" & _
                ", [LotBlock]" & _
                ", [TaxType]" & _
                ", [MuniCode02]" & _
                ", [ControlNumber]"  & _
                ", [TaxType02]" & _
                ", [CostDetail01_100]" & _
                ", [GRBNumnber]" & _
                ", [GDNumber]" & _
                ", [Remarks]" & _
            " FROM " & strSource

    With db
        .Execute strSQL, dbFailOnError
        lngRecordsAffected = .RecordsAffected
    End With

    Debug.Print "AllRecs: " & lngRecordsAffected & ", " & RunTime(dtStartTime, Now)
    MsgBox "Complete"

End Sub


(untested!)
 
datAdrenaline
post Jan 16 2018, 05:28 PM
Post#11


UtterAccess Editor
Posts: 17,964
Joined: 4-December 03
From: Northern Virginia, USA


The slowdown is likely due to the MaxLocksPerFile setting. The default has varied by version, but 9,500 rings a bell. When I ran into that limitation I either set the value prior to my loop, or paused and did a DoEvents after every 4-5k records (like you did), or set the value higher in the registry. One advantage of setting it in code is that when you distribute the code, you know what the value will be.

So ... to temporarily change your MaxLocksPerFile setting:

DBEngine.SetOption dbMaxLocksPerFile, 50000

Or ...

Increase the limit permanently in the Registry:

Matching bitness of Office and Windows (i.e. 32-bit/32-bit or 64-bit/64-bit):
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Office\xx.x\Access Connectivity Engine\Engines\ACE

32-bit Office on a 64-bit OS:
HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Office\xx.x\Access Connectivity Engine\Engines\ACE


Reference ...
https://support.microsoft.com/en-us/kb/815281
 
MadPiet
post Jan 16 2018, 07:18 PM
Post#12



Posts: 2,595
Joined: 27-February 09



I agree with nvogel...
Maybe this will help? Don't do this with Access - it just slows things down. Use BCP or SSIS.
https://docs.microsoft.com/en-us/SQL/relati...data-SQL-server
 
datAdrenaline
post Jan 17 2018, 01:35 PM
Post#13


UtterAccess Editor
Posts: 17,964
Joined: 4-December 03
From: Northern Virginia, USA


An issue I have run into when using BULK INSERT, BCP, or SSMS is that the SQL Server did not have visibility of the source folder of the .txt file. SSIS is a good choice too, though there is some configuration involved in order to execute the package; that said, SSIS is pretty fast and very nice for tasks like this.

But ... if this is short term, and the Schema.ini method works quickly and gets the job done, then --- it may be worth keeping!