|
No, I added it to mess with you ... sorry about that, I'll get rid of it later on.
I'll go along with whatever the two of you decide: pkfox, musefan, me, or A.N.Other.
Decide between yourselves and let me know, please.
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
OriginalGriff wrote: No, I added it to mess with you ... sorry about that,
All good fun
OriginalGriff wrote: decide between yourselves and let me know, please.
It seems pkfox has claimed it. Also, I can say for certain I will not be posting tomorrow.
|
|
|
|
|
@petepjksolutionscom
@musefan
Seems like that's a vague consensus - pkfox it's all yours tomorrow!
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
See my reply to musefan!
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
|
|
In the past, tanks went "Broom-broom-broom"
In the future, they will go "Roomba-roomba-roomba"?
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
If outsourcing your army was cheap, they'd do it.
AAAS (army as a service)?
Actually... this might already have been done, under the name of proxy wars.
|
|
|
|
|
The French have been doing this for a couple of centuries now - the French Foreign Legion is basically an outsourced army.
|
|
|
|
|
These things need to be banned and anyone who deploys them executed for treason against the human race. It's bad enough that we have stand off weapons where the person doesn't see the carnage that results from their actions, but having fully autonomous weapon systems is a nightmare scenario.
|
|
|
|
|
I gave up on the article in the first paragraph as it accused Musk of being a thinker!
Never underestimate the power of human stupidity -
RAH
I'm old. I know stuff - JSOP
|
|
|
|
|
Something I run into a lot with IoT is that things which seem simple become complicated quickly because they have to run in a constrained environment.
Take running a JPEG slideshow off an SD card on a 320kB system with a 320x240 display.
Wire up the SD card and an ST7789 or ILI9341 and Bob's your uncle, one might think.
Except for one small wrinkle: a camera produces images in the megapixel range, while that display is only 0.0768 megapixels (320 × 240 = 76,800 pixels).
That means you have to do some sort of resizing on most images. Well, it's all well and good to do bicubic or bilinear resampling of an image if you can access the entire uncompressed image data in a frame buffer, but again, a 320kB system. Good luck.
Images must be progressively loaded and then blitted more or less directly to the display as they load - since the display hardware has its own 320x240 display memory on the chip. How that happens depends on the underlying image format. For BMP files, progressive loading means going from bottom to top, scanline by scanline. For JPEGs it means getting 8x8 squares of the image at a time, left to right, top to bottom, etc.
The issue here is that when you're resampling, you need to run the filter over *overlapping* regions of the image in order to get a proper result, meaning you can't just resize those 8x8 chunks to 6x6, for example, and call it good. That would create artifacts every 6 output pixels where a chunk's edges weren't blended properly with the next chunk.
So suddenly you need something like a 12x12 intermediate buffer so you can resample progressively, which makes a simple algorithm suddenly messy.
And this is just an example.
All this for a slideshow.
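To make the overlap problem concrete, here's a minimal sketch (my own illustration, not the slideshow code) that maps each bicubic output pixel back to its 4-tap source window and flags the windows that straddle an 8-pixel MCU boundary - exactly the pixels you can't produce from any single 8x8 chunk:

```python
import math

def bicubic_taps(x_out, scale):
    """Source pixels a 4-tap bicubic filter reads for output pixel x_out."""
    x_src = (x_out + 0.5) * scale - 0.5   # centre-aligned coordinate mapping
    x0 = math.floor(x_src)
    return list(range(x0 - 1, x0 + 3))    # taps x0-1 .. x0+2

scale = 8 / 6  # shrink each 8-pixel MCU column to 6 output pixels

# Which output pixels need taps from two different MCUs?
# (Start at 1 to ignore left-edge clamping at x_out = 0.)
straddlers = [x for x in range(1, 12)
              if len({t // 8 for t in bicubic_taps(x, scale)}) > 1]
```

With an 8-to-6 shrink, output pixels 5, 6 and 11 all need taps from two different MCUs, which is why filtered resampling can't be done one chunk at a time without some buffer spanning the chunk edges.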
Real programmers use butterflies
|
|
|
|
|
I see....
Nothing that would stop a code witch extraordinaire though, is it?!
|
|
|
|
|
*cracks knuckles*
*grabs wand*
Nothing an eye of newt won't solve.
Real programmers use butterflies
|
|
|
|
|
The post title deserves an upvote all by itself. And a fun problem too.
Here's a thought. If, and only if, you have fast random-access reads to your SD card, you could shift your 6x6 convolution window one pixel horizontally at a time, reading a new vertical column of 6 input pixels for each output pixel. For the next output row you would read 5/6ths of the same pixels again, so each pixel would be read 6 times.
But TBH I would first try simply discarding pixels and subjectively examining the results. With such a low-res display I would assume the display has a lower dynamic range as well, which would help mask the artifacts from an unmathematical downsampling.
Or a hybrid, reading 2x2, and discarding.
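For what it's worth, the discard approach is only a few lines. A toy sketch (grayscale ints, names mine, no real display code) that keeps 6 of each chunk's 8 rows and columns:

```python
# Toy nearest-neighbour downscale of one 8x8 chunk to 6x6 by simply
# discarding rows and columns.
def discard_resize(chunk):
    keep = [round(i * 8 / 6) for i in range(6)]  # surviving indices: [0, 1, 3, 4, 5, 7]
    return [[chunk[r][c] for c in keep] for r in keep]

chunk = [[r * 8 + c for c in range(8)] for r in range(8)]  # fake 8x8 pixel block
small = discard_resize(chunk)
```

The 2x2 hybrid would be the same idea with each kept pixel replaced by the average of its 2x2 neighbourhood, trading a little smoothing for the worst of the aliasing.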
"If we don't change direction, we'll end up where we're going"
|
|
|
|
|
I can't shift one pixel horizontally at a time because JPEGs are compressed in 8x8 chunks left to right, top to bottom.
Now, I could resize those, but like I said, I'd get artifacts; I need to overlap. It's easy enough to do horizontally, but vertically is a problem because I need to store 2 × image_width × 2 bytes worth of pixels to do bicubic sampling vertically. That's a huge problem RAM-wise, and it complicates the algorithm significantly.
My other option, and I'm not 100% sure about this, is to do two passes over the image, and do the "in betweens" vertically on the second pass. I'm not even sure this will work as so far I only have a vague sketch of the concept in my head, but if possible it will probably be the route I go.
Real programmers use butterflies
|
|
|
|
|
If you can read those 8x8-chunks with random access, would you really need to keep a whole row of chunks in memory?
If I've understood your challenge correctly, you would only need to keep two-by-two of those 8x8 chunks in memory at any time. But with the downside that each chunk (except edge chunks) would have to be read, I think, 2x8 times. And of course I have no idea how expensive those reads are...
"If we don't change direction, we'll end up where we're going"
|
|
|
|
|
They're compressed. There is no random access possibility in JPEGs unless I'm mistaken.
Adding, I don't really care about load times when it comes to resizing.
Real programmers use butterflies
|
|
|
|
|
What if you create an index of all chunks in an initial indexing pass?
chunk[x, y] => addr
"If we don't change direction, we'll end up where we're going"
|
|
|
|
|
There's an idea. The only thing I'm not sure about is the Huffman table. If it builds its compression state progressively, I won't be able to seek exactly - I'll have to decompress up to the requested point. I have a feeling I may need to do that anyway.
One thing I was thinking of doing is opening the file twice and scanning through the two handles in tandem, one ahead of the other by one row of 8x8s.
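The state problem can be modelled in a few lines. This is a toy model (mine, not real JPEG code): a fake stream of delta-coded DC values stands in for the bitstream, and the indexing pass records the carried decoder state at each checkpoint so a later seek can resume mid-stream instead of decoding from the start:

```python
# Toy model of the "index then seek" idea: JPEG MCU decoding is
# sequential (DC coefficients are delta-coded, so predictor state
# carries forward), so an index must record decoder state, not just
# file offsets. A list of DC deltas stands in for the bitstream.
def decode_from(deltas, start, state):
    """Sequentially 'decode' DC values from start, given the carried state."""
    out, dc = [], state
    for d in deltas[start:]:
        dc += d
        out.append(dc)
    return out

deltas = [3, -1, 4, 1, -5, 9]

# Indexing pass: checkpoint (position -> DC predictor) every 2 "MCUs".
checkpoints, dc = {}, 0
for i, d in enumerate(deltas):
    if i % 2 == 0:
        checkpoints[i] = dc
    dc += d

# Seeking: resume at MCU 4 and get the same result as a full decode.
assert decode_from(deltas, 4, checkpoints[4]) == decode_from(deltas, 0, 0)[4:]
```

A real index would also have to record the bit offset into the Huffman stream at each checkpoint (or use restart markers, which reset the predictors and realign the stream for you when the encoder emitted them).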
Real programmers use butterflies
|
|
|
|
|
Hi, I use WinDirStat to evaluate the amount of space in each folder on my local drives, but WinDirStat doesn't work with NAS devices.
Can you suggest a utility that lets you view the full folder tree along with how much space is taken up by each folder/file?
The difficult we do right away...
...the impossible takes slightly longer.
|
|
|
|
|
I'm using version 1.1.2 and it works if I select it as a network folder.
EDIT: it also works if the network drive is mapped.
Mircea
|
|
|
|
|
Thanks for your suggestion.
I tried it, but I can't map a folder that's contained in a NAS. It's not a Windows share.
The difficult we do right away...
...the impossible takes slightly longer.
|
|
|
|
|
Have you tried the option to open a folder and set folder name to something like \\nas\share\folder ?
Seems to be working for me.
Mircea
|
|
|
|
|
You're right. It works when I directly enter the folder path into the folder box. But it doesn't work if I browse for the folder using the drive/folder browser.
Thanks!
The difficult we do right away...
...the impossible takes slightly longer.
|
|
|
|