On Mon, 2013-10-07 at 14:04 +0200, Caveat wrote:
> Thanks Randall, we got some great discussions going here. I think the
> closing paragraph sums it up quite well...
>
> "
> So err on the side of caution and normalize your data until you have
> the experience needed to know when to break the rules. If you always
> normalize your data you will be correct 99.999991 percent of the time.
> That last 0.000009 percent of the time you may never get to....
> "
>
> Of course, as I later said, my pennies example was a prime candidate for
> normalisation! And, I think, the case Willy is dealing with would have
> been normalised (in an ideal world) from:
>
> | ID  | Gewicht |
>   345   100
>   345   100
>   345   50
>   347   200
>   347   200
>   347   200
>   347   200
>
> to:
>
> | ID  | Gewicht | Count |
>   345   100       2
>   345   50        1
>   347   200       4
>
> with a Primary Key composed of ID and Gewicht (weight).
>
> But, Willy is working with a legacy system, and we don't know for sure
> why someone decided to make (for example) 4 identical 347-200 rows,
> instead of one 347-200 row with a count of 4. Perhaps it's a very busy
> system and the designer thought having many people able to create and
> delete arbitrary numbers of ID-weight rows would make it easier to
> manage the counts/weights in a system with many concurrent users,
> rather than having people queue up to modify the count on a normalised
> 347-200 row? Who knows!
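For concreteness, the normalised form Caveat describes could look
something like this in SQL (just a sketch: the table and column names
are made up, and I'm assuming PostgreSQL-flavoured syntax for the
expansion query at the end):

    -- Sketch only: names are illustrative, syntax assumes PostgreSQL.
    CREATE TABLE weigh_buffer (
        id         INTEGER NOT NULL,           -- originating ID
        gewicht    INTEGER NOT NULL,           -- weight
        item_count INTEGER NOT NULL DEFAULT 1
                   CHECK (item_count > 0),     -- identical items
        PRIMARY KEY (id, gewicht)              -- composite primary key
    );

    -- Folding the legacy duplicated rows into that form:
    INSERT INTO weigh_buffer (id, gewicht, item_count)
    SELECT id, gewicht, COUNT(*)
    FROM legacy_buffer
    GROUP BY id, gewicht;

    -- A record-by-record presentation (one row per physical item)
    -- could still be produced on demand from the normalised table:
    SELECT b.id, b.gewicht
    FROM weigh_buffer AS b
    CROSS JOIN LATERAL generate_series(1, b.item_count);

That last query would even cover the way our managers want to see the
data, as I explain below.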
The database is set up normalised, with relations to other tables
preventing duplicate data, except for this one exception. The table
mentioned is the only one set up in this manner and without a primary
key. Records are constantly coming and going, and the table never grows
beyond about 150 records; it is one of the smaller tables in the
database.

When deciding what to make of one or more of these items, the manager
wants to see every individual item belonging to an ID in a table (so
record by record, not like 347 200 2). Next they select one (by
entering a selected weight and clicking a button) and transfer it to
production with added info on what to make of it, consumer price and
more (such as an added unique ID to serve as primary key, and a field
referring to the ID the record originated from). So each item
eventually ends up in a new table with added fields and a primary key.

So the table is really just a kind of temporary placeholder, and there
are some user requirements on how it is presented. Now we could argue
that it could be stored like 347 200 2 and then shown to the user as
347 200 / 347 200, but that never happened historically. Since the
VB clients need to work with the same database during the migration
period, and I don't want to spend time on them while I have my hands
full building the Gambas clients, it will have to do. Once the
VB clients are out of the picture, I might implement it differently.

Hope that clears it up a bit. Maybe it is one of the 0.000009 percent
of cases where an exception is made, even though it could be
implemented normalised?

Slightly OT... maybe, but how OT are databases when building Gambas
clients for them? But a lovely discussion... for sure ;)

--
Kind regards,

Willy (aka gbWilly)

http://gambasshowcase.org/
http://howtogambas.org
http://gambos.org