|
I believe it is; as far as I know, they use a rather scaled-down micro GC that (from what I'm told) does a better job than the full GC in the .NET Framework.
Of course, this is only what I'm told.
|
|
|
|
|
I have to be very careful in practice about where my RAM is allocated. For example, I have SRAM (fast) and PSRAM (over an 80 MHz bus). PSRAM isn't DMA capable, meaning that blitting bitmaps to the screen from it ties up the CPU. So I need to use SRAM to do it asynchronously.
Then you have ISR code, which must be loaded into SRAM, and I doubt you can even write ISR code with .NET.
No offense, but what exactly are you going to do with a garbage collected managed code framework on something like an ESP32 that couldn't be done on the same platform several orders of magnitude more efficiently without it?
Because nobody has been able to satisfactorily answer that question for me, I have dismissed it as a viable IoT development framework. For now. As time goes on, I'll reconsider my position as the technology warrants it.
MicroPython I'm uneasy about, but at least there's a better argument for it, and it's battle tested at this point. As much as I don't like it, it has proven capable enough.
As for me, for now I'll stick with C++, partly because of all the C++ ecosystem I can leverage in the IoT world, but also because every cycle I don't spend garbage collecting or interpreting script is one more cycle of battery life.
Real programmers use butterflies
|
|
|
|
|
I hear you, and yes I follow your concerns.
What I do know about the nano framework, however, is that for me, right now, the biggest drawback is the lack of drivers for common hardware. For example, I wanted to make use of an HD44780 LCD display, but my displays were all 4-bit with an I2C backpack on them, rather than the full 8-bit ones the drivers were designed for.
For a large chunk of the hardware I'm using I've had to write my own shims, but I have to say that aside from that I've had no other issues in any of the work I've been doing.
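In case it helps anyone hitting the same display, here's roughly what that shim boils down to - a minimal sketch (mine, not an official nanoFramework driver), assuming the common PCF8574 backpack wiring of P0 = RS, P2 = EN, P3 = backlight, P4-P7 = data nibble; check your backpack's wiring before trusting it:

// Minimal HD44780-over-PCF8574 4-bit shim (assumed pin mapping above).
using System.Device.I2c;
using System.Threading;

class Lcd4BitShim
{
    const byte Rs = 0x01, En = 0x04, Backlight = 0x08;
    readonly I2cDevice _dev;

    public Lcd4BitShim(int busId, int address) =>
        _dev = I2cDevice.Create(new I2cConnectionSettings(busId, address));

    // Send one byte as two 4-bit transfers, strobing EN for each nibble.
    public void Send(byte value, bool isData)
    {
        WriteNibble((byte)(value & 0xF0), isData);        // high nibble first
        WriteNibble((byte)((value << 4) & 0xF0), isData); // then low nibble
    }

    void WriteNibble(byte nibble, bool isData)
    {
        byte b = (byte)(nibble | Backlight | (isData ? Rs : (byte)0));
        _dev.WriteByte((byte)(b | En)); // data is latched on the falling edge of EN
        _dev.WriteByte(b);
        Thread.Sleep(1);                // crude settle delay
    }
}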
The project is also getting funding directly from Microsoft to help progress it, so that counts for something, I think.
To be perfectly honest, I would rather use the Meadow platform, but right now they only run on their own hardware. Meadow is more mature, not going to deny that, but it's not available to me for what I'm working on right now.
As for Nano, as I say, I can't say I've had any issues, but then again maybe I'm not working on the same type of projects you are. Nano works largely OK for me, and my increases in productivity come from the fact that I'm using the same language and the same dev environment on both the device and the device-client sides of the equation.
|
|
|
|
|
For doing things like TrueType rendering and partial double buffering over SPI (not I2C, which is dog slow) you really benefit from DMA, and from being able to niggle the hardware some. I mean, my unoptimized drivers get about 20fps, and TFT_eSPI's SPI drivers for the same configuration get 30fps. SPI on an ESP32 for 320x240@16bpp tops out at 31fps - just fast enough that the user hopefully doesn't notice you're redrawing too much.
However, I routinely deal with 800x480 displays using an RA8875 controller that's not even as fast as most other SPI displays. I can top it out at only 20 MHz rather than, say, 27. All the extra pixels compound the problem.
For me to render TrueType fonts, JPEGs, and such reasonably, I really need the speed that DMA buys me when it's available.
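For anyone who wants to sanity-check those numbers, the back-of-envelope math (my figures, assuming the usual 40 MHz SPI clock to the panel) goes like this:

320 x 240 pixels x 16 bpp = 1,228,800 bits per full-frame push
40,000,000 bits/s ÷ 1,228,800 bits/frame ≈ 32.5 frames/s theoretical, so ~31fps once transaction overhead is paid

At 800x480 over the RA8875's 20 MHz link, the same sum gives 20,000,000 ÷ 6,144,000 ≈ 3.3 full-frame pushes per second - which is exactly why partial updates and DMA matter so much at that size.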
Real programmers use butterflies
|
|
|
|
|
Ah, well you see, that's the difference. A huge amount of what I do communicates with a faster device on the client end that does the display (automotive electronics).
All I realistically need to be concerned with at the small-device end is: does the device read its inputs fast enough, and respond to the instructions sent to it fast enough?
In most cases, anything that's time critical is on a GPIO all of its own, and anything that's computed does so based on input to its UART. The devices I build are typically sat down in the gunk and crap of the engine block, miles away from the user sat behind the dashboard, and very often each device handles at most only a handful of I/O pins. The most stressful thing I put any of my devices through is upping the serial baud rate.
The display is usually sat up on the dash, and has something like an ARM7 with a decent clock speed and a good amount of memory on it.
For those cases where the engine end really needs to perform, I'll typically deploy something like a PIC micro and write its firmware directly in PIC ASM, but where I can get away with it I do like using the ESP32 and the Nano framework, as the "head end" that has the display on it is very often written using .NET/C# on a device that can handle it.
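To give a flavour of how dumb the engine end can afford to be, here's a minimal sketch (mine; the port name and pin number are hypothetical) of that read-the-UART, poke-a-GPIO loop, using the System.IO.Ports and System.Device.Gpio APIs the Nano framework exposes:

// One short command per line keeps the wire protocol trivial to parse.
using System.Device.Gpio;
using System.IO.Ports;

class EngineBayNode
{
    static void Main()
    {
        var gpio = new GpioController();
        gpio.OpenPin(16, PinMode.Output);          // e.g. a relay driver pin

        var uart = new SerialPort("COM2", 115200); // the head end is on the other side
        uart.Open();

        while (true)
        {
            string cmd = uart.ReadLine().Trim();   // block until an instruction arrives
            if (cmd == "ON")  gpio.Write(16, PinValue.High);
            if (cmd == "OFF") gpio.Write(16, PinValue.Low);
        }
    }
}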
|
|
|
|
|
That makes sense. It hadn't occurred to me to even use an ESP32 in an application like that. It's a bit heavier than what I'd have gone with, but then with the ESP32 you get .NET, and I suppose battery isn't an issue.
Real programmers use butterflies
|
|
|
|
|
Nope, no battery problems, as the power is taken directly from the automotive electrical system, and I have a LOT of amps to play with.
The biggest problem I have is trying to make things wireless.
When you have something effectively encased inside solid, sealed steel assemblies, you're pretty much guaranteed to need a wiring harness, and where there is a wiring harness there is capacity for leaks.
Oil pressure/temperature monitoring is one such case, which makes maintenance a complete PITA.
I would dearly love to make more use of things like WiFi/Bluetooth etc., but there's often just far too much metal in the way.
One thing we did experiment with, however, was using the entire metal casing as the antenna, but that unfortunately didn't work too well.
What I work on is all about taking software design concepts such as microservices and applying them to hardware; we try to make each "unit" as stand-alone as possible.
|
|
|
|
|
I noticed that when I feel bloated, everything looks bloated.
«The mind is not a vessel to be filled but a fire to be kindled» Plutarch
|
|
|
|
|
BillWoodruff wrote: when I feel bloated, everything looks bloated
Kinda like everybody's pretty when I'm drunk?
|
|
|
|
|
As a student, I didn't have much money. I was living with a girl then, and we had to cut down on everything - she insisted that we couldn't even spend any money on beer. Jeeez ... being a student with no beer?? But I was obedient.
Then one day she came home having bought makeup for $98.50. I got really mad: You won't allow me even a single beer, and then you go out and spend almost a hundred dollars on makeup! (This was long ago, and $100 had a lot more buying power than it does today.)
Of course she started crying: The makeup was so that I would look pretty to you ...
I gave a deep sigh: But ... that's what I had the beer for!
I never saw her again.
|
|
|
|
|
|
The best thing for bloating is a good fart or two. I wonder if there is a way to make software fart.
|
|
|
|
|
Slacker007 wrote: I wonder if there is a way to make software fart.
Easy peasy. Just lay off everybody who works on the test teams. Stink will follow.
|
|
|
|
|
You've clearly never read any of my "Debugging Messages" when I'm troubleshooting, then...
And yes, I have in the past accidentally left one or two in, only to get a puzzled email from a client asking why my software is putting a message box on the screen telling him that it "Just Farted".
|
|
|
|
|
I keep working on the challenge of making my code not flatulent.
«The mind is not a vessel to be filled but a fire to be kindled» Plutarch
|
|
|
|
|
"Computer Scientists" decided to make things more "pure"....
|
|
|
|
|
'Keep it Simple' refers to the users.
Because users are getting less and less educated (a friendly way of saying 'dumber'), software needs to be 'smarter' (a friendly way of saying 'bloated').
|
|
|
|
|
In regards to the language, you don't have to use all its features just because they're there. As an embedded systems engineer (retired now), I used a subset of the C/C++ language in my projects.
|
|
|
|
|
0x01AA wrote: Once upon a time C# was such a beautiful, simple, logically structured language
It still is!
|
|
|
|
|
Or have the '?' at the end of the conditional; then what falls under it is the answer:
if (x is not null)?
{
do_something();
}
We first ask if x is not null; then the answer would be to do_something(). This does make the language look more conversational. (Yes, this is going off the deep end.)
|
|
|
|
|
I have been using C# since 2000 and am impressed with how contemplative the language teams at Microsoft have been in evolving the languages to address the computer science issues of the day. As you mentioned, they are now taking on the issue of nullability and providing the capabilities to identify and address the challenges. I have a background in mathematics and SQL Server, so nullability has always been something I have paid attention to. But many software programmers don't even think in those terms - take, for example, a Boolean where the value can be true, false, or indeterminate (null). The goal, of course, is to have more resilient code. The addition of null checking would seem like an easy thing to do, until you realize that the entire .NET library needs to be checked and enabled to participate.
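To make the three-valued point concrete, here's a small sketch (my example, not from the post) of what the compiler gives you once nullability checking is switched on:

// Assumes <Nullable>enable</Nullable> in the project (or the directive below).
#nullable enable
using System;

class NullabilityDemo
{
    static void Main()
    {
        bool? flag = null;                  // three-valued: true, false, or indeterminate
        Console.WriteLine(flag == true);    // False - null is not true
        Console.WriteLine(flag ?? false);   // coalesce the indeterminate case away

        string? maybe = Environment.GetEnvironmentVariable("HOME");
        // Console.WriteLine(maybe.Length);    // compiler warning: 'maybe' may be null here
        Console.WriteLine(maybe?.Length ?? 0); // null-safe access instead
    }
}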
|
|
|
|
|
Hmmm... that seems more like an Easter egg than a feature... I mean, in terms of fitting in with C#'s regular syntax, it really doesn't... "if (a != 3 || x is not null || b != null)" ... and so on...
...what happens if you say "if (x is not null || b != null)"? Is that the same as "if (x is not (null || b != null))"? Doesn't work. Can you say "if (x is not 4)"?
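For what it's worth, here's a quick scratch-project sketch (my addition) of how the compiler actually treats those cases:

using System;

class PatternDemo
{
    static void Main()
    {
        object x = null;
        string b = null;

        // 'is' binds tighter than '||', so this parses as
        // (x is not null) || (b != null) - no grouping surprise.
        Console.WriteLine(x is not null || b != null); // False

        // A negated constant pattern is legal:
        int n = 5;
        Console.WriteLine(n is not 4); // True

        // "x is not (null || b != null)" is not valid C# at all -
        // a pattern can't contain an arbitrary boolean expression.
    }
}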
|
|
|
|
|
Good call-out.
This is a way to future-proof against someone adding a messed-up operator later.
The ironic thing is that if someone wrote incorrect overrides for == or != that neglected null, the likely outcome would be a NullReferenceException from the operator itself.
Legacy code, before someone introduces a bad operator !=:
if (obj != null) obj.f();
Avoids the future introduction of a bad operator:
if (obj is not null) obj.f();
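To see it break, here's a small self-contained sketch (my example) with a deliberately bad overload - the "is not null" check compiles to a plain reference test and never touches the operator:

using System;

class Widget
{
    public int Id;

    // Buggy overloads: they neglect the null case entirely.
    public static bool operator ==(Widget a, Widget b) => a.Id == b.Id;
    public static bool operator !=(Widget a, Widget b) => a.Id != b.Id;

    public override bool Equals(object o) => o is Widget w && w.Id == Id;
    public override int GetHashCode() => Id;
}

class Demo
{
    static void Main()
    {
        Widget w = null;

        Console.WriteLine(w is not null); // False - safe, the operator never runs

        try { Console.WriteLine(w != null); } // calls the overload with a = null...
        catch (NullReferenceException) { Console.WriteLine("operator != blew up"); }
    }
}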
|
|
|
|
|
Super Lloyd wrote: With the latest C# iteration, instead of x != null, one can write x is not null.
Something else to insist people should not use.
Super Lloyd wrote: But then I tried to override the == and != operators and then... I understood!
Operator overloading was something that C++ programmers learned to avoid like the plague before C# existed.
|
|
|
|
|
Been using that all over the place. Technically, it does about the same as != null, but I think its increased readability is a boon. A comparison may look like any other value comparison at a quick glance, but an is not null screams "Yo dawg, this is a very special case here, potentially used for high-level control flow/error handling".
|
|
|
|
|