sql - Saving 400 counters into table(s)?
Every 15 minutes we read 250 XML files. Each XML file is an element; each element is composed of 5 sub-elements, and each sub-element has 400 counters. All of those counters will be used in formulas and aggregations. What is the most effective way to store this data in tables, in this case T-SQL tables? The data can look like the following (this is one XML file, so there are 249 more):
[Element 1]
  [Element 1-1]
    [Counter 1]: 54
    [Counter 2]: 12
    [Counter 3]: 6
    ...
    [Counter 400]: 9
  [Element 1-2]
    [Counter 1]: 43
    [Counter 2]: 65
    [Counter 3]: 98
    ...
    [Counter 400]: 12
  [Element 1-3]
    [Counter 1]: 43
    [Counter 2]: 23
    [Counter 3]: 64
    ...
    [Counter 400]: 1
  [Element 1-4]
    [Counter 1]: 4
    [Counter 2]: 2
    [Counter 3]: 8
    ...
    [Counter 400]: 12
  [Element 1-5]
    [Counter 1]: 43
    [Counter 2]: 98
    [Counter 3]: 2
    ...
    [Counter 400]: 12
The maximum number of columns in a SQL Server table is 1,024 (see the documentation on maximum capacity specifications), so you cannot simply add 2,000 columns to one table. That basically leaves two options: a wide table with one row per sub-element and one column per counter, or a narrow (EAV-style) table with one row per counter value.

In general, I would lean toward one row per sub-element. This is especially true if the counters are largely the same across sub-elements. If the columns are generally different from one sub-element to another, then I would think about an EAV model or a hybrid model.

Whether you need separate tables for elements and sub-elements depends on how the results will be used. For a complete data model, you may want to include them. If you are doing "just" numerical analysis on the measures in the loaded data and are not using the data for other purposes (e.g., lookups or reporting), then those entities may not be necessary.
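As a rough illustration of the first option, here is a minimal T-SQL sketch of a wide table with one row per sub-element per load; all table and column names are hypothetical, and only the first few of the 400 counter columns are shown.

    -- Wide layout: one row per sub-element per 15-minute load,
    -- one column per counter (only the first three counters shown).
    CREATE TABLE dbo.SubElementCounters (
        LoadTime     datetime2(0) NOT NULL,
        ElementId    int          NOT NULL,  -- which of the 250 XML files
        SubElementId tinyint      NOT NULL,  -- 1 to 5 within the element
        Counter001   int          NULL,
        Counter002   int          NULL,
        Counter003   int          NULL,
        -- Counter004 through Counter400 declared the same way
        CONSTRAINT PK_SubElementCounters
            PRIMARY KEY (LoadTime, ElementId, SubElementId)
    );

With this layout each 15-minute load adds 250 x 5 = 1,250 rows, and the 1,024-column limit is not a problem because a row only needs the 400 counter columns plus the key columns.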
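For comparison, here is a minimal sketch of the narrow (EAV-style) alternative with one row per counter value, again with hypothetical names, followed by an example of the kind of aggregation the counters might feed.

    -- Narrow (EAV-style) layout: one row per counter value.
    CREATE TABLE dbo.CounterValues (
        LoadTime     datetime2(0) NOT NULL,
        ElementId    int          NOT NULL,
        SubElementId tinyint      NOT NULL,
        CounterId    smallint     NOT NULL,  -- 1 to 400
        CounterValue int          NOT NULL,
        CONSTRAINT PK_CounterValues
            PRIMARY KEY (LoadTime, ElementId, SubElementId, CounterId)
    );

    -- Example aggregation: average of counter 1 per element over the last hour.
    SELECT ElementId,
           AVG(CAST(CounterValue AS float)) AS AvgCounter1
    FROM dbo.CounterValues
    WHERE CounterId = 1
      AND LoadTime >= DATEADD(HOUR, -1, SYSUTCDATETIME())
    GROUP BY ElementId;

The trade-off is volume: this layout stores 250 x 5 x 400 = 500,000 rows per 15-minute load instead of 1,250, but it copes naturally with counters that differ between sub-elements.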