|
Yes, I mean malloc and the like (in fact, seeing malloc in your code is what got me wondering...).
Thank you.
I am curious to know if anyone uses a custom allocator.
"In testa che avete, Signor di Ceprano?"
-- Rigoletto
|
|
|
|
|
I tend to use malloc because many platforms have custom allocators to allocate things like external memory.
For example, on the ESP32 you can allocate from extended RAM (PSRAM) by using ps_malloc instead of malloc.
By accepting a void *(*allocator)(size_t) function pointer as a constructor argument, for example, I can use custom memory. With new and delete, or even calloc, that becomes problematic, as there are no direct counterparts.
That's primarily why I use malloc in so much of my C++ rather than the C++ish alternatives.
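For what it's worth, here's a minimal sketch of the pattern (the class and member names are invented for illustration; on the ESP32, ps_malloc is the PSRAM allocator, and everywhere else the defaults just fall back to the normal heap):

#include <cstdlib>
#include <cstddef>

// A buffer class that takes its allocator (and matching deallocator)
// as plain function pointers.
class pixel_buffer {
public:
    using allocator_t   = void* (*)(std::size_t);
    using deallocator_t = void  (*)(void*);

    pixel_buffer(std::size_t size,
                 allocator_t alloc = std::malloc,
                 deallocator_t dealloc = std::free)
        : m_data(static_cast<unsigned char*>(alloc(size))),
          m_size(size),
          m_dealloc(dealloc) {
        // The memory is deliberately left uninitialized; the caller is
        // expected to overwrite it (e.g. fill it with a colour) anyway.
    }
    ~pixel_buffer() { if (m_data) m_dealloc(m_data); }

    // Copy/move omitted to keep the sketch short.
    pixel_buffer(const pixel_buffer&) = delete;
    pixel_buffer& operator=(const pixel_buffer&) = delete;

    unsigned char* data() { return m_data; }
    std::size_t size() const { return m_size; }

private:
    unsigned char* m_data;
    std::size_t m_size;
    deallocator_t m_dealloc;
};

// On the ESP32 you could route the pixel data into PSRAM:
//   pixel_buffer fb(320 * 240 * 2, ps_malloc, free);
// Everywhere else the defaults just use the normal heap:
//   pixel_buffer fb(320 * 240 * 2);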
It's kind of an aside to what you were asking, but may be of some significance to you.
To err is human. Fortune favors the monsters.
|
|
|
|
|
honey the codewitch wrote: With new and delete, or even calloc, that becomes problematic, as there are no direct counterparts. I don't get this, since you might, for instance, use your own allocator in new.
"In testa che avete, Signor di Ceprano?"
-- Rigoletto
|
|
|
|
|
I should have been more clear.
Sure, if you do that it works, but with new there are also other issues. It's not always possible or ideal to use new, for example when you need to allocate an uninitialized array of bytes. It's not a good idea to pre-initialize a gob of memory you're going to overwrite entirely anyway, such as bitmap data. No sense in clearing it if you're just going to fill it with purple as your next step. This gets important on little CPUs.
Also, it's actually easier in practice to tie an allocator to a function pointer and keep things consistent. operator new can be overloaded inside classes with different behavior. I haven't really thought through the ramifications of that for a graphics library on IoT, where different memory has different characteristics, but it smells to me as though it would create confusing bugs: allocating from PSRAM in one place when the caller expected the usual SRAM allocation.
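To illustrate the kind of thing I mean (this is only a hypothetical sketch, not code from any real library; special_alloc is a placeholder for something like ps_malloc):

#include <cstdlib>
#include <new>

// Placeholder for a "special memory" allocator; on the ESP32 this would be
// ps_malloc (PSRAM). Defined here only so the sketch is self-contained.
static void* special_alloc(std::size_t n) { return std::malloc(n); }

struct frame_buffer {
    // Class-specific operator new: every `new frame_buffer` is now routed
    // to the special allocator, whether or not the caller realizes it.
    static void* operator new(std::size_t n) {
        if (void* p = special_alloc(n)) return p;
        throw std::bad_alloc();
    }
    static void operator delete(void* p) noexcept { std::free(p); }

    unsigned char pixels[320 * 240];
};

int main() {
    // Reads like an ordinary allocation, but it actually goes through the
    // class's own operator new, which is exactly the kind of hidden behavior
    // that can bite when PSRAM and SRAM have different speed/access rules.
    frame_buffer* fb = new frame_buffer;
    delete fb;
}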
To err is human. Fortune favors the monsters.
|
|
|
|
|
CPallini wrote: I am curious to know if anyone uses a custom allocator.
Back in the day, for my 'limited' systems (64k to 1 meg):
# I wrote my own heap for use in C, replacing what was there. The replacement managed different block sizes.
# I wrote my own C++ allocators. That was to manage strings (my own string class) more efficiently; a rough sketch of the idea is below.
I also wrote my own virtual memory manager for a limited-use role, to swap code in and out for printing a graph (if I recall correctly).
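To give a rough idea of what #2 looked like conceptually (this is just a minimal modern-C++ sketch with invented names, not the original code): a bump allocator over a fixed arena, which only makes sense when the strings it backs are discarded or reset as a batch.

#include <cstddef>
#include <new>
#include <string>

// Fixed arena in static storage; "bump" allocation only, never freed piecemeal.
alignas(std::max_align_t) static unsigned char g_arena[64 * 1024];
static std::size_t g_arena_used = 0;

template <typename T>
struct arena_allocator {
    using value_type = T;

    arena_allocator() = default;
    template <typename U> arena_allocator(const arena_allocator<U>&) {}

    T* allocate(std::size_t n) {
        std::size_t bytes = n * sizeof(T);
        // Round the current offset up to T's alignment.
        std::size_t aligned = (g_arena_used + alignof(T) - 1) & ~(alignof(T) - 1);
        if (aligned + bytes > sizeof(g_arena)) throw std::bad_alloc();
        T* p = reinterpret_cast<T*>(g_arena + aligned);
        g_arena_used = aligned + bytes;
        return p;
    }
    void deallocate(T*, std::size_t) noexcept { /* bump allocator: no-op */ }
};

template <typename T, typename U>
bool operator==(const arena_allocator<T>&, const arena_allocator<U>&) { return true; }
template <typename T, typename U>
bool operator!=(const arena_allocator<T>&, const arena_allocator<U>&) { return false; }

// A string type whose storage comes from the arena instead of the global heap.
using arena_string =
    std::basic_string<char, std::char_traits<char>, arena_allocator<char>>;

int main() {
    arena_string s = "allocated from the fixed arena, not the global heap";
    (void)s;
}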
|
|
|
|
|
Wow.
You should write an article (or possibly a series of articles) on the subject.
I need to feel in full control, but I'm a little scared of writing my own.
"In testa che avete, Signor di Ceprano?"
-- Rigoletto
|
|
|
|
|
I always avoid it: my realtime stuff is expected to run 24/365, so any heap fragmentation is a major problem. I allocate RAM at init, or preferably at compile time, and it doesn't change from there, except for the stack obviously.
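Something like this, in spirit (just a sketch; the struct and sizes are made up):

#include <cstddef>

// The "no heap" pattern: everything is sized up front and lives in static
// storage, so there is nothing to fragment.
struct Sample { unsigned long timestamp; int value; };

static Sample g_sample_ring[1024];     // fixed ring buffer, reserved at link time
static std::size_t g_sample_head = 0;

void push_sample(const Sample& s) {
    g_sample_ring[g_sample_head] = s;
    g_sample_head = (g_sample_head + 1) % 1024;   // overwrite oldest, never allocate
}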
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
That's what I've done so far. However, systems are getting more complex...
Thank you.
"In testa che avete, Signor di Ceprano?"
-- Rigoletto
|
|
|
|
|
I've found that you can create robust systems that use malloc just fine as long as you're careful.
I mean, I'm not saying you shouldn't avoid it, but used judiciously you can get away with it without having to worry that your code will crash a week in.
Basically, you allocate small amounts and throw them away quickly.
Then if you're really concerned you can profile to make sure you aren't leaking and have enough headroom, but in practice I don't even find I need to do that most of the time.
Edit: If you use common libraries like LVGL to provide a user interface for example, those will often malloc in the background.
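As a rough sketch of what I mean by allocating small and freeing quickly (the function, the 0x7E framing bytes and the driver hand-off are invented for illustration):

#include <cstdlib>
#include <cstring>

// The allocation is scoped and short-lived, so the heap is back where it
// started before the function returns and long-running code never
// accumulates fragments.
void send_packet(const char* payload, std::size_t len) {
    char* frame = static_cast<char*>(std::malloc(len + 2));
    if (!frame) return;                    // handle allocation failure explicitly

    frame[0] = 0x7E;                       // start flag
    std::memcpy(frame + 1, payload, len);
    frame[len + 1] = 0x7E;                 // end flag

    // ... hand the frame to the driver here ...

    std::free(frame);                      // freed before leaving the scope
}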
To err is human. Fortune favors the monsters.
|
|
|
|
|
If you have a memory controller, or if your microcontroller has one built in, then yes, because to use DRAM something has to be in charge of the refresh cycle(s).
That is if I understand your question properly.
|
|
|
|
|
I meant malloc/free (or new/delete). I'm dealing with SRAM-only microcontrollers.
Thank you.
"In testa che avete, Signor di Ceprano?"
-- Rigoletto
|
|
|
|
|
I just had a "High Sea" in a restaurant in the fishers town of Urk, a variation on "High Tea" which will probably raise eyebrows with English CodeProject members.
It was deliceous, with all kinds of fish accompanied by chips, I also had a glass of Sauvignon Blanc to flush it all away (my favourite however still remains Muscadet)
|
|
|
|
|
How high did the waves break over the table?
«The mind is not a vessel to be filled but a fire to be kindled» Plutarch
|
|
|
|
|
We ended up at the sea bottom (name of the restaurant: "De Zeebodem")
|
|
|
|
|
RickZeeland wrote: which will probably raise eyebrows with English CodeProject members. With the Dutch as well.
Can you not afford real meat?
Bastard Programmer from Hell
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
Well, they also had a "High meat" for the same (very reasonable) price, but as the dish was for two people we had to choose. Next time we will go for the meat, I think.
|
|
|
|
|
High meat.
I'm thinking a pig that's stoned like a garnaal (shrimp).
Bastard Programmer from Hell
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
RickZeeland wrote: accompanied by chips
So long as it was proper chips and not fries, or what Americans call "chips" that are actually crisps, then you're fine.
(Damnit, now you're making me hungry! )
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
|
|
|
|
Actually it was a kind of chips I had not seen before; they looked a bit like potato slices that were folded and then fried.
Maybe they had had an accident in the kitchen.
|
|
|
|
|
(I've probably ranted about this before.) I have some time to learn new things.
Not the act of learning new technologies, that's always fun.
But I hate trying to find good and up to date (modern) documentation and tutorials, or tutorials that go from uber simple things to WTF did you just show ... there are missing steps that should probably be obvious, but no.
I also hate installing tools (yes, I'm talking to you, SQL Server Express) that do not install.
Or downloading code that does not compile, or that uses deprecated or obsolete frameworks.
Also, I have no clue what the new technologies are, what is "cool" or what is in demand right now; I know it depends on what I want to do and what the company wants me to do.
I'm many years deep in technical debt and I have a lot of catching up to do.
Thank you for attending my anti-TED talk.
CI/CD = Continuous Impediment/Continuous Despair
|
|
|
|
|
I totally agree.
The thing I find most annoying is that when you go looking for example code and you already have a basic understanding, you find the SAME simple example code everywhere. No one seems to want to tackle a slightly more advanced subject.
|
|
|
|
|
There's a thin line between love and hate (famous Dutch saying)
|
|
|
|
|
I have been running as hard as I can to "Keep Up" since the days when GW-Basic was Microsoft's only product. I have grown very tired of all aspects of it as well.
really old guy
|
|
|
|
|
Maximilien wrote: But I hate trying to find good and up to date (modern) documentation and tutorials, or tutorials that go from uber simple things to WTF did you just show ... there are missing steps that should probably be obvious, but no.
Yes, indeed! Quality of documentation went down the drain some time ago and it doesn't seem to have any uptrend. Maybe the CEOs fired all the technical writers in anticipation of ChatGPT doing the work for free. Open-source projects never had any technical writers and real programmers never write documentation. Blah!
Mircea
|
|
|
|
|
Even worse, there is a teacher over here at a local college who teaches that it is nonsense to write documentation, as it will soon be outdated!
Sadly we get new developers from that school who think that this is ok ...
|
|
|
|