|
You can't split a class into multiple files in C++ to be compiled together into one executable? I'm pretty sure you can, but I only ever dabbled in C++ back in the day.
Or are you saying that you want to have different parts of a class compiled into separate executables (DLLs)?
|
|
|
|
|
No, it would be nice to be able to declare a class twice in two different files and have it compiled into the same binary.
C++ does not let you do that, at least not to my knowledge, unless they added it after C++17.
For example, I have a draw class that handles all the drawing operations in my graphics library.
It would be nice to segregate the different drawing primitives into different files, but the only ways to do that are to delegate and forward, or to use multiple C++ implementation files for a single class -- and either way you still wind up with all the method declarations for all the drawing primitives in the same header.
You can hack around it using the preprocessor, by #include-ing class fragments, but that's hacky.
Edit: You can use inheritance to approximate it, but that still runs you into visibility issues.
To err is human. Fortune favors the monsters.
|
|
|
|
|
Hmm, odd, I was sure I had done that back in the 80s or 90s. I must be mistaken.
PIEBALD goes spelunking and finds some old C++ code...
Ah, you are correct (of course), I was mistaken for the most part.
I see by my code that the class has to be defined in one place, but that the implementations of the members can be separated out into other files -- and I'm using #include to combine the code together.
When C# 1 was first released, a class had to be fully defined and implemented in one file -- which was horrible -- but C# 2 added partial classes (and interfaces), with which a class definition and implementation can be spread across multiple files.
I do see that something similar could probably be accomplished with C++ (and maybe C# 1) by using the C-preprocessor, but that wouldn't be as clean.
|
|
|
|
|
Yes, you can. The class definition has to stay in one file, but the implementation can go in any number of files.
Mircea
|
|
|
|
|
But with C# (2 and newer), the definition can also be spread across files; not just the implementation.
C# combines the definition and implementation together (other than abstract members).
|
|
|
|
|
I know, but my monster is C/C++
Mircea
|
|
|
|
|
If it ain't broke, don't break it.
|
|
|
|
|
You certainly can declare a class in a single .h file and have the implementation spread over multiple .cpp files. That has been possible since ARM C++.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
Sure, but you wind up with an .h file with all the declarations for the class in it, regardless.
To err is human. Fortune favors the monsters.
|
|
|
|
|
No idea of the line count, but it was almost certainly assembler code: maybe 1/2MB or thereabouts?
The assembler we were using only supported single files: no includes, no relocatable blocks, no linker. For a 32KB ROM, 1/2MB of source works out to only about 16 chars per line, so it's probably about right - maybe a little conservative.
Since the ROM was full, and the most common instructions were one byte long, maybe 28K lines or so?
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
Well, it was a long time ago but back in pre-history when I was writing COBOL (no longer on cards thank God) the main program in an overnight suite I supported was (I think) 21,000 lines of code (well, code + blank space). We did everything we could to avoid printing the thing (360 pages), so the listing was frequently annotated by hand. Of course that didn't really help when the actual code had typos that still allowed it to compile.
As overnight on-call support, I'd frequently get bleeped (by pager) and have to hook up the 600-baud teletype to my landline and get the relevant bits of memory dump. (Most issues were S0C7 faults - data exceptions.) Yes, for some reason data was never properly validated; actual code bugs were rare, but get a non-numeric character into a numeric column in the data and the whole shooting match failed.
|
|
|
|
|
DerekT-P wrote: "the listing was frequently annotated by hand"
Wasn't that common practice in the 1970s-80s?
In my student days, I was an intern at a company making 16-bit minis and 32-bit superminis, running their own OS (in those days, 'Unix' was hardly known at all outside universities). I managed to isolate a bug in the OS and went to the responsible guy. For quite a while, he flipped back and forth in his huge OS source printout, before exclaiming "There!", digging out his ballpoint pen and writing the code fix into the listing.
What makes me remember it better than I would otherwise: this OS was written in a language about midway between assembler and K&R C. He didn't write his fix in that language. He didn't even write the assembler instructions. He wrote down the numeric instruction codes, in octal format.
I guess that this qualifies for being an 'oldtimer'
|
|
|
|
|
Guess the fix might have been just directly manipulating the binary executable. I've had to do that a few times. Other than that, the places I worked were usually quite good at keeping up-to-date hard copy listings. But then this was before VDUs.
We also, of course, had the card decks, and they were all carefully maintained and filed. We even had a card printer that would read a punched card and type the contents along the top. Useful, because although most of the punching was done by the girls (and only girls) in the punchroom - whose machines printed the text on the cards - we also had access to a punch machine for doing the occasional edit, and that one didn't print. So looking through a deck you'd come to a revision but not know what the code was. (Well, you could just hold it up to the light and read it - it became second nature after a while!)
|
|
|
|
|
Fixes to that OS were distributed as binary patches to the executable! A few areas were set aside in the executable for fix code that wouldn't fit in the original place (requiring more instructions than the old code), so a jump could be inserted at the buggy location to the code in that spare area, and then a jump back.
You wouldn't leave such tasks to interns
|
|
|
|
|
Your and Derek’s stories remind me of a life lesson from the same era. As a young uni student, I was fighting with an assembly language program and the assembler was giving all kinds of strange error messages. Showing the problem to the TA I was working with, he didn’t figure it out either and sent me to the author of the assembler, a guy just a few years older than me. He looked at the problem and went: “yes, I see what’s going on, come back in the afternoon for a solution”. In the afternoon, he gave me a single punched card to add to the deck of punched cards that was the assembler program - in those days you loaded the assembler from punch cards - and that indeed solved my strange error. For me the whole thing was almost like magic.
Some time later I figured out that the error had to do with a symbol evaluating to different values during different assembler passes, and that what the guy did was make a binary patch that fixed the problem.
Why do I say it was a life lesson? When I saw that guy quickly figure out a problem that had both me and the TA stumped, and give such an elegant solution (a single punched card), the young, cocky me realized there are people much smarter than me in the world, and some humility is always in order.
Mircea
|
|
|
|
|
Was it an HP?
HP seemed to love Octal.
12-bit words work well with octal.
|
|
|
|
|
The company was called Norsk Data (originally "Norsk Data-Elektronikk", "Norwegian Computer Electronics"). The machines were named Nord-1, Nord-5 (the 32-bit one), then Nord-10 and Nord-50. At the time of the story I told, they were the ND-100 and ND-500. Later came an ND-5000, but I don't think there ever was an ND-1000.
The fun thing about octal for the ND-100 is that it really didn't fit the instruction format at all: most instructions were built from 4 fields of 4 bits each. In the same period, the MC68K was about to enter the market: with 8 registers and 8 addressing modes - always in the low bits - only the uppermost opcode field was 4 bits wide. Yet binary MC68K instructions were always presented in hexadecimal format (unless, of course, the bit pattern itself was shown - which often was the case).
|
|
|
|
|
The first program I wrote was in machine code using hex notation. I then had to key it in by hand using the buttons on the front panel of the processor.
|
|
|
|
|
I have to deal with a program that has several files, one of which is 88K lines and about 2.5MB. To add to the misery, there are thousands of global variables. We have rewritten most of the applications based on this, but not all of them, and it's an ongoing thing.
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
|
|
|
|
|
|
I was told two extreme cases from the software for the ITT System 12 phone switch:
The largest 'struct' definition (the language used was CHILL, not C, so the terms are different) ran to 8300 lines. Printed at 72 lines per page, this single type definition would fill a 115-page book.
The linker for this system maintained a symbol export table for each module. Early linker versions used a signed 16-bit integer to index this table, so it was limited to 32768 exported symbols. This limit was exceeded - System 12 had modules exporting more than 32 Ki symbols. I can understand importing that many symbols, but exporting them from a single module!?!
The maintainer of this linker, a university classmate of mine (he was the one who told me about that struct, too), said that they made a quick fix, changing the index type to an unsigned integer, to allow for 65536 exported symbols. But if anyone can break a 32 Ki limit, they can break a 64 Ki limit, too. So in the next major revision, the index type was changed to 32 bits. Hopefully, no one will export more than 4 billion symbols from a single module
|
|
|
|
|
Original Pascal had no module concept. All code had to go in a single file.
Open source is not as new as the Linux people would make us believe! The Pascal P4 compiler was always freely available. I picked it up as a university freshman and studied it on my own alongside working on the '101 Introduction to Programming' hand-in exercises. I'll say it gave me a head start in programming ...
The compiler source was between 30,000 and 35,000 lines. For quite a few years afterwards, I was really bothered by the emerging common practice in C of creating a separate source file for every single function: how can you find anything at all in the source when you have to open hundreds of files to search them? Tools for searching across an entire directory tree weren't very developed then - not until the C practice had become more widespread.
Today we have the tools. We also have FOSS. Even today I am appalled when I have to handle a zillion files, each containing 70-100 lines of open source license/copyleft blurb, followed by a five line function. In every single one of a zillion files!
|
|
|
|
|
Thank you all for replying to my question. I understood that:
a) I'm a wimp for complaining about size.
b) Almost everyone thinks "my (source code) is bigger than yours" - honi soit qui mal y pense
Mircea
|
|
|
|
|
Around 10 years back, I joined my first company, where I was introduced to a codebase in Java/XML, a kind of Android application. I saw files with 12K lines. It was a nightmare for me even to navigate the code, let alone follow up on a bug we were supposed to fix!
As the years progressed, we got down to fewer lines per file, saw IDE limits on the number of lines in a file (max 1000), saw coding architectures like MVX (MVP, MVVM, MVI, etc.) asking us to break code up into smaller, more testable modules, and saw testing tooling and plenty of other advice asking us to test the smallest unit.
Overall, the experience was very rewarding: at each step we learnt from the mistakes we had made previously, made amends, then made some more mistakes and fixed those in the months that followed.
|
|
|
|
|
Sri Krishna wrote: "at each step we learnt the mistake we did previously, made amends and then made some more mistakes and then fixed them in the coming months."
Life experience is exactly the same. Nobody knows everything from the beginning.
M.D.V.
If something has a solution... Why do we have to worry about?. If it has no solution... For what reason do we have to worry about?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
|
|
|
|
|