UtterAccess.com
> Importing Csv With More Than 255 Columns, Access 2016    
 
   
GroverParkGeorge
post Nov 9 2019, 03:05 PM
Post#21


UA Admin
Posts: 36,043
Joined: 20-June 02
From: Newcastle, WA


Good, thanks.

--------------------
My Real Name Is George. Grover Park Consulting is where I do business.
How to Ask a Good Question
Beginning SQL Server
 
FrankRuperto
post Nov 9 2019, 05:24 PM
Post#22



Posts: 253
Joined: 21-September 14
From: Tampa Bay, Florida, USA


One million rows in Excel may be fine for jsurpless' current needs, but future requirements could run into its limits. It happened to me: I started out using Excel to store voter registration records for one county, then a user provided me with registrations for the entire state (13M+ records).

--------------------
Currently supporting pawnbrokers that use my store management system, developed with Access 2010 on Windows 7. Experienced with Informix and Oracle DBs.
 
jsurpless
post Nov 11 2019, 08:29 AM
Post#23



Posts: 485
Joined: 21-December 03



QUOTE
How long does the .Readline method take? Did you try Line Input?

.Readline takes about 54 seconds to process the CSV, which has 7,200 rows and 1,100 columns.

What I am doing is reading each line into an array. On the first line, I identify the column index of each field that I am interested in. For each subsequent line, I iterate through the fields I want and load the data from the appropriate array element index.
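(The header-index approach described above can be sketched like this. This is Python for illustration only; the actual code in this thread is VBA, and the column names and sample data here are invented.)

```python
import csv
import io

# A tiny stand-in for the real 1,100-column export
# (column names are made up for illustration).
sample = (
    "SerialNumber,Firmware,Temp1,Temp2,Status\n"
    "A001,1.2,70,71,OK\n"
    "A002,1.3,68,69,OK\n"
)

wanted = ["SerialNumber", "Status"]  # the fields the import actually cares about

reader = csv.reader(io.StringIO(sample))
header = next(reader)

# On the first line, find the column index of each field of interest.
idx = {name: header.index(name) for name in wanted}

# For every data row, pull just those elements out of the row array.
records = [{name: row[idx[name]] for name in wanted} for row in reader]

print(records)
```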

QUOTE
OK. I see 4 possible approaches to import your data.

1. modify your CSV after creation by selecting the required columns in Excel.... as previously suggested
2. extract just those columns when the CSV is created from the other 'database'
3. Try linking to your CSV instead of importing it. Then run a query to extract those columns (that part may not work due to the 255-field limit)
EDIT: Just tested that. Linking also fails as soon as the file has more than 255 columns
4. What type of file is the other database? Can Access link direct to that file? If so, no need for a CSV file at all

1/2. I was thinking a similar thing; I might end up going down that route.
4. I don't have access to the other database. It is a SQL database, but its structure is quite involved... for instance, creating the CSV I mentioned earlier takes approximately 3 hrs. As such, I'm not sure that I want to go looking into it for the data that I need. It's easier to use the export process I currently use.
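(For options 1/2, the trimming could also be a quick pre-processing pass over the exported file, producing a narrow CSV that stays under Access's 255-field limit and imports normally. A minimal sketch in Python, not the thread's VBA; the function name, file names, and field names are hypothetical.)

```python
import csv

def trim_csv(src_path, dst_path, wanted):
    """Copy only the wanted columns from a wide CSV into a new, narrow CSV."""
    with open(src_path, newline="") as src, open(dst_path, "w", newline="") as dst:
        reader = csv.reader(src)
        writer = csv.writer(dst)
        header = next(reader)
        # Locate each wanted column once, from the header row.
        idx = [header.index(name) for name in wanted]
        writer.writerow(wanted)
        for row in reader:
            writer.writerow([row[i] for i in idx])

# Hypothetical usage:
# trim_csv("full_export.csv", "narrow.csv", ["SerialNumber", "Status"])
```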

QUOTE
Just to be clear - .Readline, which is the import of the export, is where it takes 30 seconds. (For how many records?) So the problem is in the import. It works just fine. It's just too slow.

That is correct

Thanks for the help!
 
FrankRuperto
post Nov 11 2019, 09:14 AM
Post#24



Posts: 253
Joined: 21-September 14
From: Tampa Bay, Florida, USA


Whoever created the CSV could include only the fields that you need. You are already filtering down to just the data you need with the array subscripting you mentioned, which I feel is a whole lot more complicated and more work, so why not include only the fields you need in the CSV to begin with? It's less work that way, and your import time will be significantly reduced! When the data source is a DB server like SQL Server, I typically create views of just the data I need to consume, to optimize performance. So by the same token, ask for just the data you need, because 1,100 columns looks like a denormalized flat file with redundant data.
This post has been edited by FrankRuperto: Nov 11 2019, 09:46 AM

--------------------
Currently supporting pawnbrokers that use my store management system, developed with Access 2010 on Windows 7. Experienced with Informix and Oracle DBs.
 
isladogs
post Nov 11 2019, 10:11 AM
Post#25


UtterAccess VIP
Posts: 1,808
Joined: 4-June 18
From: Somerset, UK


Well, as it's over 1,024 columns, that also rules out using SQL Server as an intermediary step.
So if the exported CSV files can't be modified to contain only what you need, you are once again back to my original suggestion:
modify the file in Excel and then import the modified file into Access.

--------------------
Colin (Mendip Data Systems)
Website, email
 
jsurpless
post Nov 11 2019, 10:17 AM
Post#26



Posts: 485
Joined: 21-December 03



QUOTE
Whoever created the CSV could include only the fields that you need. You are already filtering down to just the data you need with the array subscripting you mentioned, which I feel is a whole lot more complicated and more work, so why not include only the fields you need in the CSV to begin with? It's less work that way, and your import time will be significantly reduced! When the data source is a DB server like SQL Server, I typically create views of just the data I need to consume, to optimize performance. So by the same token, ask for just the data you need, because 1,100 columns looks like a denormalized flat file with redundant data.

Actually, there's no redundant data in those 1,100 columns, although there is definitely data that my users won't necessarily care about.

To date, my approach for the CSV files created by the user has been "just give me everything and you can decide what to import" since the export takes so long.

I'll most likely go with the route of reducing the number of columns in the CSV export.

Thanks for all the input and suggestions!
 