Hello group. I have a very large C program with several hundred structures. Is there a tool that could detect bad structure member alignment?
Thanks & Regards Prasun
I think not.
Which editor are you using to write your C program? I'm using the Visual Studio 2008 IDE. There I have to select the block I want to format (by pressing SHIFT+ARROW KEY or using the mouse), and then press CTRL+F to format it.
I don't know for sure, but Lint comes to mind when dealing with static code analysis. I don't know whether it can handle inefficient member alignment, though. There is also a code analyzer in VS2010 if you have the Ultimate edition.
You might end up in trouble trying to optimize for space like this if the order of the members is used in any way in the program. Some C (and C++) programmers tend to take shortcuts once in a while.
Just a word of warning.
Your memory usage will obviously grow when working with data, since all this data is most likely read into memory, so nothing strange there in my opinion. However, you are loading 50,000 items into a list, and that is bad practice for several reasons:
1) Performance: it slows down the application because it has to load 50,000 items at once.
2) Usability: a user will not be happy scrolling through 50,000 items. Give them filtering and paging.
3) Memory: 50,000 items can consume a lot of memory.
So my suggestion: limit the amount of data by implementing paging (somewhere between 20 and 50 items per page) and filtering.
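As a rough sketch of the paging idea (the function and parameter names here are made up, and a real application would fetch pages from its data source rather than from a vector that is already fully loaded):

```cpp
#include <cstddef>
#include <vector>

// Return one page of at most page_size items, counting pages from 0.
template <typename T>
std::vector<T> get_page(const std::vector<T>& all,
                        std::size_t page, std::size_t page_size) {
    std::vector<T> out;
    std::size_t begin = page * page_size;
    for (std::size_t i = begin; i < all.size() && i < begin + page_size; ++i)
        out.push_back(all[i]);   // copy just this page's items
    return out;
}
```

With 125 items and a page size of 50, pages 0 and 1 hold 50 items each, page 2 holds the remaining 25, and page 3 is empty.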
Your argument is valid in principle, but you shouldn't base it on the assumption that this list is actually being displayed to a user - there is no mention of such a thing. In fact, there may be no UI involved at all.
The only issue we know of so far is memory consumption. Paging is a solution, but whether a page should hold 20 or 20000 items depends on the actual problem.
If you are using your own implementation of a list then we obviously cannot say what it does. I will therefore assume that you are using an existing implementation, such as std::list (you didn't say!).
Every time you insert an item into a list, a new list node will be allocated. This node - depending on the actual implementation - will contain at least one or, more likely, two pointers that are required to maintain the links to the remainder of the list, plus the actual data. So, even if your list only holds short values, each node might in fact take 10 bytes (2*sizeof(pointer_type) + 2, assuming 4-byte pointers); allocating 50 thousand items will thus require half a million bytes of memory.
If your list items are large, requiring 1000 bytes each to store, then each node will be 1000+2*sizeof(pointer_type) in size, and the whole list will require about 50 MB of storage.
If your list maintains large objects and you've decided to store them elsewhere, you might want to store only pointers to these items in your list, but even then each list node will hold three pointers: two to maintain the list, and one to point to the actual data. On a 32-bit system that would be 12 bytes per node, or 600 thousand bytes for a list of 50 thousand items.
In short, you're storing information in memory, a lot of it - why are you surprised this takes up memory?
P.S.: according to your question, the list takes up more memory than you are comfortable with. This raises some questions:
1. How much memory do you have available (i.e. what does the target system provide)?
2. How large is each individual item?
3. Are these items stored redundantly, i.e. are they being held in several lists at once, or in other sorts of containers?
4. Where do you get the data from (files, media streams, internet, other applications)?
5. Would it be possible to store just one or a few items, process them and store away the results?
6. Have you considered compressing the data - either each node individually, or by storing only the differences to previous items?
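As an illustration of point 6, here is a minimal delta-encoding sketch. The function names are invented for this example, and real data would typically go through a proper compression library; storing differences mainly pays off when the deltas are small enough to fit a narrower integer type:

```cpp
#include <cstdint>
#include <vector>

// Store the first value as-is, then only the difference to the previous one.
std::vector<int32_t> delta_encode(const std::vector<int32_t>& v) {
    std::vector<int32_t> out;
    int32_t prev = 0;
    for (int32_t x : v) { out.push_back(x - prev); prev = x; }
    return out;
}

// Undo the encoding by accumulating the differences.
std::vector<int32_t> delta_decode(const std::vector<int32_t>& d) {
    std::vector<int32_t> out;
    int32_t acc = 0;
    for (int32_t x : d) { acc += x; out.push_back(acc); }
    return out;
}
```

For slowly changing values such as timestamps or sensor readings, the deltas are often small enough to store in 8 or 16 bits instead of 32, which is where the memory saving comes from.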