Full Version: why is math so difficult?
UtterAccess Forums > Microsoft® Access > Access Forms
OK, I've got a subtotal field adding up a bunch of fields and including tax.
Now I have a total field. What if I want this field to display a total and also add it to a table? (DON'T tell me that I shouldn't store totals in the table. I want to, and need to, for recording or viewing purposes on paper.)
I have a subtotal field adding up 3 other fields and giving the tax. Now I can make it show the grand total + tax, but I want that grand total to go into the tbltotal field on the table. Get my drift? LOL
Clear as mud, but essentially I want to add up these fields, figure the tax on the total, and display the total in txttotal as well as put that total in the table.
Can you please explain this? I know there are a lot of people here who won't help someone do things the wrong way.
Storing calculated totals in a table is kind of a catch-22. Certainly you can do it. Doing so, however, has a 100% probability that the stored result will become inaccurate at some point in time, because tables are not like spreadsheets and are never recalculated. So then the question becomes: why store something that you know will ultimately convey invalid information? If you want to view the result on paper, you do the calculation in an unbound text box just as you would on a form.
Add an unbound text box to the form and set its control source to "=[name of subtotal textbox] + [name of tax text box]".
Sometimes calculated fields must be stored
.... including tax ...
seems like an invoice
Let's say that today there is an invoice totalling 10 €.
Next year the same items cost more or less, whatever, and the same invoice would add up to 11 or 9, doesn't matter. That's one reason.
Anyway, try this:
Function totalencomenda(encomenda As Long) As Long
    Dim rs As DAO.Recordset
    Set rs = CurrentDb.OpenRecordset("SELECT * FROM sortimento WHERE [encomendakey] = " & encomenda)
    While Not rs.EOF
        ' Nz() turns Null quantities into 0 so the addition does not fail
        totalencomenda = totalencomenda _
            + Nz(rs.Fields("qt1"), 0) + Nz(rs.Fields("qt2"), 0) _
            + Nz(rs.Fields("qt3"), 0) + Nz(rs.Fields("qt4"), 0) _
            + Nz(rs.Fields("qt5"), 0) + Nz(rs.Fields("qt6"), 0) _
            + Nz(rs.Fields("qt7"), 0) + Nz(rs.Fields("qt8"), 0) _
            + Nz(rs.Fields("qt9"), 0) + Nz(rs.Fields("qt10"), 0)
        rs.MoveNext          ' without this the loop never advances
    Wend
    rs.Close
    Set rs = Nothing
End Function
qt1 to qt10 are fields on the grid related to quantity
encomenda is the autonumber ( as the invoice number )
sortimento is the grid line on the invoice details, therefore, encomendakey is the invoice number
This stores the total in totalencomenda.
Hope this helps.
Sorry, Pedro, but this is not the case. You store the FACTORS of the calculation (units, prices, tax rate) but not the RESULTS of the calculation.

Totally agree, but let's say that the articles have a master file where the price is stored.
When invoicing, you store the actual price of the article. You don't link, because the price can change over time and previous invoices must stay as they were issued. This is the classical situation demonstrating the autolookup query to fill a form grid: just pick the article and the price is pulled from the table, along with other relevant data that will be stored in a redundant, not fully normalized way, but it must be so.
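The "store the actual price at invoice time" idea above can be sketched as follows. This is a minimal illustration in Python with invented table and field names, not anyone's actual schema:

```python
# Sketch of "copy the price at invoice time" (the autolookup idea above).
# Names and values here are invented for illustration.
articles = {"A1": {"price": 10.0}}      # master file holding the current price
invoice_lines = []

def add_line(invoice_no, article_id, qty):
    # Copy the CURRENT master price into the line; the stored line keeps
    # this value even if the master price changes later.
    price = articles[article_id]["price"]
    invoice_lines.append({"invoice": invoice_no, "article": article_id,
                          "qty": qty, "price": price})

add_line(1, "A1", 2)
articles["A1"]["price"] = 11.0          # the price rises next year
add_line(2, "A1", 2)
```

Because the line copied the price rather than linking to it, the old invoice still shows the price that was in force when it was issued.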

So, why not store the invoice total? It may help performance.

this is a situation where an otherwise calculated or linked field value can/must be stored.

of course, there could be another table : article/price/invoice to keep things neat but ........

Hope to have made myself understood.

Edited by: cimbo on Fri Nov 26 12:40:39 EST 2004.
what glenn is rightly saying (and forgive me glenn for putting words into your mouth) is that you should not store the value you invoiced... but you should store the quantity, price per unit and rate of tax.
Actually just reading glenn's post that's what he says.
This is so that if something changes (possibly not likely) the results can be recalculated.
OK, so let me reiterate.
I've got forms and reports, for example.
I create all the fields to be added (which I have), then a field to figure the tax (total * .055 in my instance), and then an unbound field adding the subtotal + tax.
I've got that on the form, no problem. So if I'm not able to EASILY store the grand total in the table, I can show it on the report.
I am able to do calculations on reports the same way I do them on forms, right? (total1+total2+total3) etc. etc. etc.
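The form math described here (subtotal of three fields, 5.5% tax, grand total) can be sketched like this. A minimal Python illustration; the field names and amounts are made up:

```python
from decimal import Decimal, ROUND_HALF_UP

# Sketch of the form logic above: subtotal of three fields, 5.5% tax,
# then the grand total. Values are illustrative only.
TAX_RATE = Decimal("0.055")

def grand_total(f1, f2, f3):
    subtotal = Decimal(f1) + Decimal(f2) + Decimal(f3)
    # Round the tax to cents, as a currency field would
    tax = (subtotal * TAX_RATE).quantize(Decimal("0.01"), ROUND_HALF_UP)
    return subtotal + tax

print(grand_total("10.00", "20.00", "30.00"))  # 63.30
```

On the Access side the same arithmetic lives in unbound controls; the point is that the grand total is derived on demand, not stored.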
Pedro, one would have to run a well-designed benchmark test to establish the actual effect on performance of storing or not storing a calculated result. What is easy to establish is that storing a calculated result increases the size of every record by the size of the result's data type. With a Double that is, I believe, an extra 8 bytes per record; not much on a per-record basis, but with thousands of records the total wasted space becomes significant. Without a benchmark it is difficult to give a definitive answer, because any performance enhancement from not having to recalculate the result must be weighed against the added overhead of retrieving it. Considering that data retrieval involves somewhat slower devices than the memory where the calculation takes place, my guess is that any performance gain from storing the result would be trivial to very small, if there is one at all.
All of this, of course, is not to mention that one must edit both the factor fields and the result field in order to effect corrections to the data. Unless great care is taken to ensure that happens, there is a significant risk of the result field falling 'out of sync' with the factor fields.
One case where storing a calculated result is appropriate is periodic data archiving. In that case, one would store the total of each column in a balance-forward record and then transfer data that precedes the cutoff date to an archive table.
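The balance-forward archiving idea can be sketched as below. A toy Python illustration; the record layout and dates are hypothetical:

```python
from datetime import date

# Toy balance-forward archiving sketch; record layout is invented.
def archive_before(live, cutoff):
    old = [r for r in live if r["date"] < cutoff]
    keep = [r for r in live if r["date"] >= cutoff]
    # One balance-forward record stores the total of the archived rows,
    # so later reports never have to recalculate them.
    balance = {"date": cutoff,
               "amount": sum(r["amount"] for r in old),
               "balance_forward": True}
    return [balance] + keep, old

live, archived = archive_before(
    [{"date": date(2004, 1, 15), "amount": 100.0},
     {"date": date(2004, 6, 1), "amount": 50.0},
     {"date": date(2004, 11, 20), "amount": 25.0}],
    cutoff=date(2004, 7, 1))
```

The grand total over the live table is unchanged; only the detail rows before the cutoff have moved to the archive.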
That sounds about right, Jon. You would most likely store the applicable tax rate in your table. The tax calculation would be in an unbound text box control on your form or report as are your subtotal and grand total.
I am making a deliberate distinction here between fields which are objects belonging to a table and controls which are objects belonging to forms or reports. We calculate in a control, we display data in controls, we store data in fields.
G’day Glenn.
Well, I'm going to have to be careful here, because the great weight of experience agrees with your remarks.
I do have one result of timing such a de-normalized approach.
About a year ago I was asked to write a database that required the pricing of individual items in an order entry (invoice) form. The requirement stated that any individual customer could have individual prices for any item they purchased.
These prices may shift up or down at the whims of the company selling the goods.
Sort of like a discount based on item and purchaser... something I have been told QuickBooks can’t do.
(Please don’t get me wrong, QuickBooks is a good program, but if it can’t do it then perhaps in the next version…)
So it means keeping a record of last price of every item sold to every customer.
The order entry works as follows:
Select an item.
Using customer ID look to see if they have a last purchase price for that item.
If so use it, but allow for the price to be manually entered. (Changed)
If manually entered then create a new record for that item for that customer. (Time Stamp)
If the item has not been sold to that customer before then use the retail price list and save as new entry and any standard discount that customer may receive at the time. (Time Stamp)
(The standard discount for that customer can vary over time.)
Apply any tax that may or may not be applicable to that item for that customer.
(The tax for that item for that customer could vary over time.)
Total calculated price for that item * quantity and display the result for that item.
Sum the Totals for items and display as invoice total.
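The lookup logic in the steps above can be sketched roughly as follows. All names here are invented, since the real source code is not available:

```python
# Rough sketch of the last-price lookup described above; table and
# field names are made up, not from the actual application.
retail = {"widget": 10.0}   # retail price list
last_price = {}             # (customer_id, item) -> last price charged

def price_for(customer_id, item, manual_price=None):
    key = (customer_id, item)
    if manual_price is not None:
        # Manual override creates a new record (time-stamped in the real app)
        last_price[key] = manual_price
        return manual_price
    if key in last_price:
        return last_price[key]          # customer-specific last purchase price
    last_price[key] = retail[item]      # first sale: fall back to the retail list
    return last_price[key]
```

Tax and the customer's standard discount would be layered on top of this in the same per-customer, per-item fashion.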
I hope I got the explanation correct…it has been awhile.
This works just fine, at the invoice level, because everything that can be calculated is calculated every time the invoice is viewed. Normalization is complete as far as I know.
Then they wanted an aging summary form and that’s when the Ritz hit the fan.
(For those that don't know, an aging summary form shows outstanding debt distributed over some time frame. Typically, columns on a continuous form spread the debt as Current, C + 30, C + 60, C + 90 days… and a form footer shows the totals of the debt distribution from all customers.)
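The bucketing itself is simple; the cost Chris describes comes from recomputing every invoice total first. A minimal Python sketch of the buckets, with invented invoice records:

```python
from datetime import date

# Minimal aging-bucket sketch (Current / 30 / 60 / 90+ days). The invoice
# records are invented; the fast version of the real form read stored
# invoice totals instead of recalculating them.
def aging_buckets(invoices, today):
    buckets = {"current": 0.0, "30": 0.0, "60": 0.0, "90+": 0.0}
    for inv in invoices:
        age = (today - inv["date"]).days
        if age < 30:
            buckets["current"] += inv["total"]
        elif age < 60:
            buckets["30"] += inv["total"]
        elif age < 90:
            buckets["60"] += inv["total"]
        else:
            buckets["90+"] += inv["total"]
    return buckets

totals = aging_buckets(
    [{"date": date(2004, 11, 20), "total": 100.0},
     {"date": date(2004, 10, 1), "total": 50.0}],
    today=date(2004, 11, 26))
```

Whether `inv["total"]` is a stored value or a fresh calculation is exactly the normalization trade-off being debated.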
The form took around ten seconds to display but the totals on the footer of the form were not available for around 70 seconds.
This might be normalized but the delay is not what the customer wants.
Solution…de-normalize the invoice table.
When something, in this case an invoice, is normalized and calculated then the results should be correct…and they were.
(To the cent and with no exceptions… after all, that is the point of normalization in the first place.)
So if the results of normalization are always correct then there can be no error in saving the calculated results.
Now that we have a saved absolutely correct value that stems from normalization we can use it.
The aging summary form does not need to redo all the calculations but simply look at the total on the invoices.
Result was a drop in time from 70 seconds, to display the form footer results, to less than 1 second overall.
In other words, that which was normalized was a dog to open…that which was de-normalized was what the customer paid for and it was accurate.
But I will make some stress on this point…
One should not, and in fact can not, de-normalize until it is normalized.
It must be done correctly from the very start.
(The source code was purchased by my customer and is therefore not generally available.)
Hi Chris,
Were the calculations to which you are referring, Chris, done directly on the form? I'm wondering what performance difference there might have been if the calculations were performed in a query and the form bound to the query. Certainly 70 seconds is an incredible delay, and I can understand how that would be unacceptable to your customer.
You make an excellent point when you agree that sometimes de-normalization is necessary, but stress that the decision to de-normalize should not be taken lightly or with less than full awareness that it is being done, and why.
G’day Glenn.
From memory some of the calculations were done in queries, the rest were done in VBA.
It was much too complex for me to do the entire calculations in queries.
Also I must add… bailing out to VBA does incur a time penalty, but in tests, not as much as one might think.
In any case, 70:1 proves the worth of looking elsewhere, even if the result is technically undesirable. The gain is not due to doing it in a query or in VBA; it is due to not doing it at all… the fastest code is the code that is not executed. That point may seem a trifle, but re-calculating a certainty is a waste of time.
The source code is the customer's and cannot be published. I can no longer read it because it was written in A2K and I now have only A97; in any case it would not compile backwards to A97 because it used conditional formatting.