Archiving Data, Access 2010
wornout
post Dec 3 2017, 09:18 PM
Post#1



Posts: 972
Joined: 17-November 13
From: Orewa New Zealand


I have 10 tables and I am wondering if it is possible to loop through all the tables, find the row with the matching ID, and return it to another table. I tried to do this through queries, but none of the data is the same. For example, I have a client table with the clients' information, then I have 3 different tax tables, and the table list goes on, but they all have a Client ID in them.
So if I could say: look for the Client ID in each table, take that row, and move it to another table.
RJD
post Dec 3 2017, 11:18 PM
Post#2


UtterAccess VIP
Posts: 7,848
Joined: 25-October 10
From: Gulf South USA


Hi: If I understand what you are asking, I wouldn't "loop" through the 10 tables, but would address each one separately and sequentially, a sort of "brute force" approach.

As I read your post, you have 10 tables and I assume 10 more tables in which to put the "found" records. Is this correct? And I assume you have a form on which you enter or choose a client ID. And I assume the client ID is present in all records you are trying to "move" from the primary 10 tables to the 10 backup or archive tables.

I would simply set up 10 append queries and 10 delete queries, with criteria of the Client ID from the form. Then I would use a sub to run the 10 append queries, copying the records from their primary tables to the backup tables. As a check I would then ensure that the number of records with the chosen client ID is the same for each table set, giving an indication (message or, preferably, report) of what has been done. Then, if all checks out, I would run the delete queries, removing the client records from the 10 primary tables.
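If it helps, here is a minimal VBA sketch of that approach. The query names (qappTbl1 ... qappTbl10, qdelTbl1 ... qdelTbl10) and table names are placeholders I made up for illustration, not objects from your database, so substitute your own:

Public Sub ArchiveClient(lngClientID As Long)
    Dim i As Integer

    DoCmd.SetWarnings False                      ' suppress the action-query confirmation prompts

    ' 1. Run the ten append queries (each one filters on the form's Client ID)
    For i = 1 To 10
        DoCmd.OpenQuery "qappTbl" & i
    Next i

    ' 2. Check that source and archive row counts match before deleting
    '    (shown for one table pair; repeat or loop for the others)
    If DCount("*", "tblTax1", "ClientID = " & lngClientID) <> _
       DCount("*", "tblTax1_Archive", "ClientID = " & lngClientID) Then
        DoCmd.SetWarnings True
        MsgBox "Row counts do not match for tblTax1 - delete step skipped.", vbExclamation
        Exit Sub
    End If

    ' 3. Run the ten delete queries to remove the copied rows from the live tables
    For i = 1 To 10
        DoCmd.OpenQuery "qdelTbl" & i
    Next i

    DoCmd.SetWarnings True
End Sub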

As a precaution, I would always back up the db before executing the query series.

This is my "belt and suspenders" approach. No "logic" to deal with, other than selecting/limiting the client ID, just some deterministic code.

Is this what you are trying to do? Or do we need more explanation on this?

HTH
Joe


--------------------
"Each problem that I solved became a rule, which served afterwards to solve other problems."
"You just keep pushing. You just keep pushing. I made every mistake that could be made. But I just kept pushing."

Rene Descartes 1596-1650 (Mathematician and Philosopher)
jleach
post Dec 4 2017, 04:39 AM
Post#3


UtterAccess Editor
Posts: 9,815
Joined: 7-December 09
From: Staten Island, NY, USA


Just out of curiosity, why are you looking to archive the data? Most times you're better off keeping it all in the same table. If you want to primarily work with "more recent" data in the rest of the application, you can query it accordingly, but there's typically no real advantage to archiving data: in fact, it tends to complicate matters in the long run.
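For instance, rather than moving rows out, the rest of the application can simply be built on a query that limits to recent data. A minimal sketch, assuming a hypothetical date field named InvoiceDate on one of the tax tables (all names here are illustrative only):

Public Sub ShowRecentRows()
    Dim rs As DAO.Recordset
    Dim strSQL As String

    ' Only rows from the last two years; older rows stay in the table
    ' but are screened out by the criteria.
    strSQL = "SELECT * FROM tblTax1 " & _
             "WHERE InvoiceDate >= DateAdd('yyyy', -2, Date());"

    Set rs = CurrentDb.OpenRecordset(strSQL)
    If Not rs.EOF Then rs.MoveLast               ' populate the recordset so RecordCount is accurate
    Debug.Print "Recent rows: " & rs.RecordCount
    rs.Close
End Sub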

GroverParkGeorge
post Dec 4 2017, 08:11 AM
Post#4


UA Admin
Posts: 31,244
Joined: 20-June 02
From: Newcastle, WA


I agree with both approaches here. Joe's strategy is appropriate for the actual "archiving" bit, but I wonder whether it's necessary at all.

If you do, indeed, need to REMOVE unneeded records, and simply want an archive copy of the original tables, why not create a backup of the entire accdb as of that date, including the existing records to be archived? Then you could delete the ones you want to remove from the tables in the production db. That leaves everything intact in the backup, i.e. the "archive", as of that date, so you can go back and look at the data anytime you need to do so.
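A minimal sketch of that date-stamped backup, assuming a split database whose back-end sits at the (hypothetical) path below and is not open by anyone else at the time; adjust both paths to your own setup:

Public Sub BackupBackEnd()
    Dim strSource As String
    Dim strTarget As String

    strSource = "C:\Data\Clients_be.accdb"                       ' placeholder back-end path
    strTarget = "C:\Data\Archive\Clients_be_" & _
                Format(Date, "yyyy-mm-dd") & ".accdb"            ' date-stamped archive copy

    FileCopy strSource, strTarget
    MsgBox "Archive copy written to " & strTarget, vbInformation
End Sub

Once the dated copy exists, the delete queries can be run against the production tables knowing the old rows are still available in that file.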

Unless you are running into really serious storage limitations and performance problems, though, removing records is generally not that effective.

Are we talking about hundreds of thousands of records to be archived? Tens of thousands? Hundreds? Dozens? How often do you want to create the archive?
RJD
post Dec 4 2017, 10:19 AM
Post#5


UtterAccess VIP
Posts: 7,848
Joined: 25-October 10
From: Gulf South USA


Hi again: While I gave you one possible solution to the question you asked, I personally side with Jack and George that there are usually better ways to handle this.

For example, I recently built a Parishioner Database for a local church. They wanted to keep all current and previous parishioners in the database, with information, for those no longer active, about when and why they ceased to be members. Since some of the reasons and dates (mostly for long-ago parishioners) are not known, I incorporated an Inactive Yes/No field in the primary records, and I use that to screen out inactive parishioners for various reports. All parishioners, current and past, are in the same tables.
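To illustrate (the report and field names here are placeholders, not the church database's actual objects), the reports simply open with a filter on that flag, so nothing ever has to be moved to another table:

Public Sub OpenActiveClientsReport()
    ' Screen out records flagged Inactive; the rows themselves stay in the table.
    DoCmd.OpenReport "rptActiveClients", acViewPreview, , "Inactive = False"
End Sub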

Just an alternative to removing records, although in your particular case you may well have a good reason to remove them.

Just some thoughts ...

HTH
Joe

--------------------
"Each problem that I solved became a rule, which served afterwards to solve other problems."
"You just keep pushing. You just keep pushing. I made every mistake that could be made. But I just kept pushing."

Rene Descartes 1596-1650 (Mathematician and Philosopher)
wornout
post Dec 4 2017, 01:19 PM
Post#6



Posts: 972
Joined: 17-November 13
From: Orewa New Zealand


Great, thanks. The answers were what I was looking for, as I was really unsure. I don't have a lot of records, but the ones I do have hold lots of different information. I had actually built an Excel database to begin with and then realized I should have done it in Access, so I converted it. I like the idea of an active/inactive field. I was a bit concerned about it being a bit slow, but maybe not; I was looking towards the future (hoping I would have heaps of clients, ha ha).
Thanks for the answers. I have taken on board what was said.
GroverParkGeorge
post Dec 4 2017, 02:04 PM
Post#7


UA Admin
Posts: 31,244
Joined: 20-June 02
From: Newcastle, WA


Good luck with the rest of the project.

RJD
post Dec 4 2017, 03:26 PM
Post#8


UtterAccess VIP
Posts: 7,848
Joined: 25-October 10
From: Gulf South USA


We were all happy to assist. And as George says, good luck with your project.

Regards,
Joe

--------------------
"Each problem that I solved became a rule, which served afterwards to solve other problems."
"You just keep pushing. You just keep pushing. I made every mistake that could be made. But I just kept pushing."

Rene Descartes 1596-1650 (Mathematician and Philosopher)
jleach
post Dec 4 2017, 03:33 PM
Post#9


UtterAccess Editor
Posts: 9,815
Joined: 7-December 09
From: Staten Island, NY, USA


>> I was a bit concerned about it being a bit slow <<

I used to worry about that too, until I realized that you can have millions of records and it'll churn 'em up fine (provided you're at least somewhat careful about normalization and indexing, anyway).
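For what it's worth, the indexing part is usually just one statement per lookup field. A sketch with placeholder table and field names (your tables may already have this index if the field is part of a relationship):

Public Sub AddClientIDIndex()
    ' A non-unique index on the foreign-key field keeps joins and
    ' criteria on ClientID fast even when the table grows very large.
    CurrentDb.Execute "CREATE INDEX idxClientID ON tblTax1 (ClientID);", dbFailOnError
End Sub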

Rows are cheap. And besides, if you do wind up with that many clients, you'll probably have someone else handling your database needs by then!
