|
EDIT 2: I DID IT. THIS ROCKS
I'm not entirely sure how they do it. The code is a labyrinth - it has over 100 contributors.
Alpha blending without hardware acceleration is slow because you have to read the source pixel by pixel, blend each pixel with the new color, and then write each blended pixel back to the destination.
I was doing this naively, but I want to approach LVGL speed. I need to rise to the challenge.
I'm now planning to read pixels from the display source line by line in batches, then scan each line, re-blending only when the source color changes as I fill a rectangle.
Filling rectangles is a primitive that almost all drawing operations (including draw-line) use, so it basically speeds up everything.
To complicate things, I can't guarantee that memory will be available to copy the source line into a buffer, so if it isn't I need to fall back to a pixel-by-pixel read.
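For anyone curious, here's a minimal sketch of the run-caching idea. It assumes an RGB565 framebuffer, and `blend565`/`alpha_fill_line` are hypothetical names I made up for illustration, not any library's actual API:

```cpp
#include <cstdint>
#include <cstddef>

// Blend an 8-bit-per-channel color over one RGB565 pixel (alpha 0-255).
static uint16_t blend565(uint16_t dst, uint8_t r, uint8_t g, uint8_t b, uint8_t a) {
    // Expand destination channels to 8 bits, blend, repack to 5-6-5.
    uint8_t dr = ((dst >> 11) & 0x1F) << 3;
    uint8_t dg = ((dst >> 5) & 0x3F) << 2;
    uint8_t db = (dst & 0x1F) << 3;
    uint8_t br = (uint8_t)((r * a + dr * (255 - a)) / 255);
    uint8_t bg = (uint8_t)((g * a + dg * (255 - a)) / 255);
    uint8_t bb = (uint8_t)((b * a + db * (255 - a)) / 255);
    return (uint16_t)(((br >> 3) << 11) | ((bg >> 2) << 5) | (bb >> 3));
}

// Alpha-fill one scanline that was read back from the display,
// re-blending only when the underlying pixel value changes.
void alpha_fill_line(uint16_t* line, size_t count,
                     uint8_t r, uint8_t g, uint8_t b, uint8_t a) {
    uint16_t lastSrc = 0, lastOut = 0;
    bool haveLast = false;
    for (size_t i = 0; i < count; ++i) {
        if (!haveLast || line[i] != lastSrc) {  // only blend on a color change
            lastSrc = line[i];
            lastOut = blend565(lastSrc, r, g, b, a);
            haveLast = true;
        }
        line[i] = lastOut;
    }
}
```

On lines with long runs of a single color (very common in UI work) this does one blend per run instead of one per pixel.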
Here goes nothing.
Wish me luck.
Edit: I sped it up some, but it's much faster when the draw destination supports direct reads into RAM. However, the display device I'm using reads pixels back in 18-bit format rather than 16-bit, and worse, each pixel is padded to 24 bits. That means no matter what I do, I have to convert each pixel back to 16-bit anyway. I have some ideas for speeding it up a little even in that case, but nowhere near that LVGL demo. It makes me wonder if they aren't doing some sort of faux alpha blending in that demo rather than the real thing. That wouldn't surprise me at all. I have an idea of how to do that anyway.
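To illustrate the conversion step: many SPI display controllers return each read-back pixel as three bytes, with the 6 significant bits of each channel in the high bits of its byte. Assuming that layout (and with made-up function names), converting back to RGB565 looks roughly like this:

```cpp
#include <cstdint>
#include <cstddef>

// Convert one 18-bit-in-24 read-back pixel to RGB565.
// Each input byte carries its channel's 6 significant bits in bits 7..2.
uint16_t rgb666_to_565(uint8_t rByte, uint8_t gByte, uint8_t bByte) {
    uint16_t r5 = rByte >> 3;  // keep the top 5 bits for red
    uint16_t g6 = gByte >> 2;  // keep all 6 significant bits for green
    uint16_t b5 = bByte >> 3;  // keep the top 5 bits for blue
    return (uint16_t)((r5 << 11) | (g6 << 5) | b5);
}

// Convert a whole read-back scanline (3 bytes per pixel) to RGB565.
void convert_line(const uint8_t* in, uint16_t* out, size_t count) {
    for (size_t i = 0; i < count; ++i)
        out[i] = rgb666_to_565(in[i * 3], in[i * 3 + 1], in[i * 3 + 2]);
}
```

The shifts are cheap, but it's still an unavoidable per-pixel pass on every read, which is the cost being complained about above.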
Real programmers use butterflies
modified 20-Feb-22 10:44am.
|
|
|
|
|
I believe you can do it...
Unless the original code was very good... but with 100 contributors there is a lot of room for messy, useless, redundant, slow code!
|
|
|
|
|
I did it, man. I'm not sure how it stacks up against LVGL because I still think their demo probably cheats, but I sped it up by orders of magnitude in most situations (as long as there's enough memory to make a temporary bitmap to blend to).
Real programmers use butterflies
|
|
|
|
|
Niiiiiice!
|
|
|
|
|
I have a Wio Terminal. It's a little $40 IoT widget that has 192kB of SRAM in it.
It *also* ostensibly has 4MB of PSRAM.
However, there is no documentation on using this extra PSRAM.
The last reply on their support forum was 2 years ago.
There are no samples that use this PSRAM, either from them or from third parties, anywhere on GitHub or elsewhere that I can find.
What's the point of spending the money to have 4MB of RAM in your device if you're not going to take half an hour and at least produce a sample that uses it? Why spend the money?
It just floors me that people think they can release products without documentation, put up a few YouTube videos, and call it done.
Real programmers use butterflies
|
|
|
|
|
honey the codewitch wrote: What's the point of spending the money to have 4MB of RAM in your device
Marketing.
honey the codewitch wrote: if you're not going to take half an hour and at least produce a sample that uses it?
Idiot management.
|
|
|
|
|
You're not wrong! Grrr!
Real programmers use butterflies
|
|
|
|
|
David O'Neil wrote: Idiot management
Or because they couldn't. Many years ago I bought a small board (I don't remember the seller's name) for the RAM, and it wasn't even connected: there were no PCB traces going to it. It was just glued onto some free space left over from their previous board revision.
Easy money for them. Idiots, those of us who bought it.
|
|
|
|
|
Option 1: The device becomes popular and people will start hacking away. Sooner or later someone will figure out how to access the PSRAM, and you just saved the time it takes to write samples (which is more than half an hour; it could easily be several hours).
Option 2: The device turns out not being very popular, and you just saved the investment in making examples.
Of course, one could argue that documentation is required (well, helpful) for a device to become successful, but why make things complicated when the other options are easier for management to understand in a PowerPoint?
|
|
|
|
|
Well, the thing is a couple years old at least, so I believe I'm stuck with option 2.
Real programmers use butterflies
|
|
|
|
|
In the "supermini" days, I was working for a small company making a VAX competitor. The company didn't have the development resources to design different models. But the market demanded a "range" - entry-level alternatives, top-range alternatives. So the question came up: How to differentiate, when the core machine is identical in all the alternatives?
This was in the pre-RISC days. CISC CPUs were microcoded, and this machine loaded its microcode from disk as part of the boot process. So one proposal that was seriously considered was to make an entry-level model by inserting wait cycles into the microcode, with the hardware 100% identical to the higher models. It didn't end up that way, though. Cache memory was extremely expensive: removing the cache saved about 40,000 Euro in component costs while roughly halving the CPU speed, so that alternative was chosen.
For the top-range model, the machine was delivered in a twin cabinet, with lots of space for I/O cards (this was essential for lots of customers) and possibly small internal disks. The CPU was identical in speed and functionality to the mid-range model. I taught a course in programming these machines, and one of the participants got furious when I told her that the top-range model was no faster than the mid-range model: she threatened to sue the company for fraud; they had spent the extra money on the top model to get the fastest CPU available, and it turned out to be a waste!
Another "one size fits all"-solution employed by this company: The machine had a hidden disk, not visible to the customer, containing the full suite of proprietary software. When a customer bought some software, it was distributed on a 360K floppy containing the license key, which was a decrypt key for copying the software from the hidden disk to the ordinary working disk. (This obviously was before the internet, so the alternative would have been to ship the software on 42 floppies.)
So, my guess is that the extra RAM may be there for some other use of the same design; the manufacturer maintains a single design, a single production line. Maybe the other use is a completely different product. Maybe there was a planned product never making it to the market, that would be using this RAM.
Hardware sometimes is like software: I am certain that at least 50%, but most likely 80-90%, of the Microsoft Office code has never been executed on my PC and never will. But MS won't make a special MSO edition for me, with only the functions I use. You have a piece of hardware with components your Wio Terminal does not use. Fair enough - maybe someone else uses it. Reusable hardware design - reusable software design; that is two sides of the same coin.
|
|
|
|
|
trønderen wrote: my guess is that the extra RAM may be there for some other use
Honestly, I doubt it, if only because they advertise it prominently. I think it's far more likely that they just suck at documenting their products. I've run into a lot of IoT boards like that. Some I've even had to throw away.
Real programmers use butterflies
|
|
|
|
|
You haven't requested/obtained the information from the manufacturer on how to use this RAM. So you think the RAM shouldn't be there.
I fully recognize your opinion that the RAM shouldn't be there. That doesn't imply that I agree with you.
|
|
|
|
|
I have indeed, and I've exhausted every available contact avenue I have had with them.
I don't think it shouldn't be there. I think it should have been documented.
Real programmers use butterflies
|
|
|
|
|
My theory:
The memory is faulty or otherwise unusable. Sticking it on a board and sending it out to customers is cheaper than disposing of it according to environmental regulations.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
My experience is that the hardware gets developed before the software / firmware. The "next" version is then based on what you learned the first time around. Repeat.
"Before entering on an understanding, I have meditated for a long time, and have foreseen what might happen. It is not genius which reveals to me suddenly, secretly, what I have to say or to do in a circumstance unexpected by other people; it is reflection, it is meditation." - Napoleon I
|
|
|
|
|
Gerry Schmitz wrote: My experience is that the hardware gets developed before the software / firmware
You are probably right, but that isn't necessarily a good thing. I've seen a couple of cases where software guys were brought into the middle of hardware development, with an unquestionably positive effect on the hardware design.
Unfortunately, the situation as you describe it is much more common.
|
|
|
|
|
There will never be a "next version" of this device.
Real programmers use butterflies
|
|
|
|
|
This is not sarcasm, for once. I am doing a bunch of contacts. The ability to parse web page source text in an Access memo box, add the appropriate items to the DB, and keep track of the contact information in the DB (and, even though I haven't gotten there yet, being able in Outlook to open Access and add the send and reply dates back into the DB from there, which should be doable but I haven't tackled yet) is a MASSIVE help. Everyone seems to sh*t on Office and VBA, but when used correctly it is TREMENDOUSLY POWERFUL! A small business could be run with it pretty easily if you coded it up nicely, and I haven't come across anything in VBA that forces you not to code it nicely.
People complain about the cost of Office, but with Access, the power is amazing when you are aware of it. I still want to complain about the cost, but there is no way in hell I could single-handedly code a suite like that in less than twenty years; probably many many more! It is cheap for what you can get out of it. So thank you! (But don't take this as an excuse to raise prices - you've distributed the cost among enough of us - quit being greedy; profits don't have to increase each year. Just making a profit when so many people are struggling to live should be good enough.)
But quit worrying about stupid icons so much, and eliminate that totally utterly stupid idea of forcing an MS account just to install Windows. Your corporate head honchos are bending over backwards to make a dystopian future where everyone is looked at as a number with a revenue stream associated with it. Quit it! Focus on improving the user experience! Like some of the flakiness of Word when pictures are moved around! And combining the Control Panel and the new Settings program! Based on how well you did with VBA and Office overall, and my praise thereof, YOU CAN DO IT!!!
Best wishes for a continuously improving future!!! And Thanks Again!
|
|
|
|
|
Access is the hidden gem in Office. It's a fully relational database, including all the data definitions, macros, forms, queries, VBA code, etc. being stored in tables. It has a powerful but relatively simple to use UI, and is perfect for workgroups on a high speed LAN.
|
|
|
|
|
I created a project tracking management system in it for a multi-million dollar company a few years ago. It should have been a full SQL system of some sort, but they were clueless (running off Excel spreadsheets till then), and I didn't have the time to learn anything else because I was doing full-time engineering project management for them. I set up a system that tracked projects by part number or project, tracked change notice requests down to the last person they were given to (with a full prior track log), allowed assigning tasks to individual people, kept a status, and I forget what else. I was truly amazed at what I could do without any real issues, and I could easily see adding to the system to keep track of purchasing info for individual parts, tracking parts across the production floor, etc. Access was totally capable of all of it. It's not good at individual permissions and security, or at going above 3 or 4GB, but everything else? No issues, although it may get a little slow as the DB grows.
Anyway, one thing I've always asked myself is why people don't use Access as a reporting mechanism more often. I've heard everyone hate endlessly on Crystal Reports. I did not see anything I needed that I couldn't get Access to report, even if it involved setting up temp work tables to get things done. Since Access can connect to SQL DBs, why isn't it used as an easy reporting mechanism? The reports may not look flashy, but you could send some types of reports to Word templates and get pretty damn flashy that way. Is there something I'm just unaware of, not having been involved in DB development work other than experiences like the above?
|
|
|
|
|
I think many are clueless about what Access is. It is mostly a mentality thing.
Many former colleagues of mine through the years (programmers, DB administrators, etc) think that Access is not a true DB, only a toy for people who just want to play with something that looks like a DB.
I always reply to them that Legos are also "toys", and yet many great professional-looking things that are not used for playing (like cars, bridges, etc.) can be built with them.
|
|
|
|
|
For smaller projects Access would be great.
For large solutions there is a restriction, though: Microsoft Access has a limit of 255 columns per table.
|
|
|
|
|
From my experience, I can see absolutely no reason to have that many columns. I got to look at BPCS, which the company used, and it was an absolute nightmare of a design; I replicated some of it with well-designed relational tables. I doubt I ever used many more than ten columns per table, because there was absolutely no reason to do so if you normalize the data. But maybe I'm missing something.
|
|
|
|
|
In a properly designed 3rd normal form database, you are correct. You'd never get to 255 columns.
I can only think of one situation where it would be useful to have more than 255 columns, and that is denormalizing a dataset for ease of reporting: for example, a clinical-trials reporting database.
|
|
|
|
|