|
|
|
|
|
|
Loved watching his shows.
Quote: Mr. Lear’s entertainment career spanned the late 1940s to the 21st century, and he also found prominence in later life as a liberal political activist. But his legend was sealed in the 1970s, when he created a handful of shows that transformed the television medium into a fractious national town meeting and showcased the American family in all its hopes and dysfunctions.
As the aircraft designer said, "Simplicate and add lightness".
PartsBin an Electronics Part Organizer - Release Version 1.3.0 JaxCoder.com
Latest Article: SimpleWizardUpdate
|
|
|
|
|
He was a genius, for sure, even though he was a Liberal twit. I happen to have on my shelf every episode of All in the Family, and I'm planning a binge watch next year!
Will Rogers never met me.
|
|
|
|
|
Roger Wright wrote: even though he was a Liberal twit
Seems to be common in the industry.
Yeah, All in the Family and Sanford were two of my favorites.
As the aircraft designer said, "Simplicate and add lightness".
PartsBin an Electronics Part Organizer - Release Version 1.3.0 JaxCoder.com
Latest Article: SimpleWizardUpdate
|
|
|
|
|
I blame the earthquakes in SoCal; they seem to rattle people's brains a bit.
Will Rogers never met me.
|
|
|
|
|
Agreed, I'd add Good Times and The Jeffersons to that list.
"the debugger doesn't tell me anything because this code compiles just fine" - random QA comment
"Facebook is where you tell lies to your friends. Twitter is where you tell the truth to strangers." - chriselst
"I don't drink any more... then again, I don't drink any less." - Mike Mullikins uncle
|
|
|
|
|
I finally finished the book, The Code Book: The Secrets Behind Codebreaking.
It was published back in 2002; I read the first chapter back then but stopped.
The thing is, because the book covers so much of the history of cryptography/cryptology, it is still amazingly current.
I've read quite a few books on the subject now, and this one really covers the entire history of making and cracking codes.
It's actually even better than I thought it was going to be.
Have you read it? It's absolutely fantastic.
|
|
|
|
|
It looks very interesting. Cryptography has long been an interest of mine. The second program I ever wrote was handling a simple substitution cipher in BASIC on an HP 3000 using a Teletype. That was in 1974.
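For the curious, a monoalphabetic substitution cipher is only a few lines in a modern language. Here's a minimal Python sketch; the shuffled key alphabet is just an example, nothing to do with that 1974 program:
```python
import string

# Example key: a permutation of the alphabet (choose or generate your own).
PLAIN = string.ascii_uppercase
KEY   = "QWERTYUIOPASDFGHJKLZXCVBNM"  # illustrative key only

ENCODE = str.maketrans(PLAIN, KEY)
DECODE = str.maketrans(KEY, PLAIN)

def encrypt(text: str) -> str:
    """Substitute each letter via the key; leave other characters alone."""
    return text.upper().translate(ENCODE)

def decrypt(text: str) -> str:
    return text.upper().translate(DECODE)

if __name__ == "__main__":
    ct = encrypt("ATTACK AT DAWN")
    print(ct)            # QZZQEA QZ RQVF
    print(decrypt(ct))   # ATTACK AT DAWN
```
Of course, the weakness that makes these ciphers a fun first program is the same one the book covers: letter frequencies survive the substitution, so they fall to frequency analysis.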
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
|
|
|
|
|
A couple of threads earlier I asked the question: should one defrag an SSD or not? I got different answers, so I tried an experiment.
I regularly use Macrium Reflect to create images of my C: drive, which is an NVMe M.2 SSD. Reflect is extremely fast and will create an image as fast as the C: drive can feed it data.
So I created a system image and noted the speed at which Reflect was writing it to the target. It reached a maximum speed of 6.7 GB/s. Then I ran "defrag C:" from a command prompt and got a report that the C: drive was 20% fragmented before it was successfully defragged.
Then I ran Reflect again and this time it reached a maximum speed of 7.8 GB/s!
It seems to me that the speed at which an SSD can read large volumes of data is affected by fragmentation.
Note: I ran the trim command on the same drive yesterday and it seems this did not remedy the fragmentation.
Thanks to all those who expressed an opinion on SSD fragmentation, but I will be running defrag from time to time. If that shortens the life of the SSD, well, they are cheap and easy to replace!
Note: Windows reported as follows after defragging the C: drive:
Pre-Optimization Report:
Volume Information:
Volume size = 930.65 GB
Free space = 868.64 GB
Total fragmented space = 20%
Largest free space size = 863.72 GB
Note: File fragments larger than 64MB are not included in the fragmentation statistics.
The operation completed successfully.
Post Defragmentation Report:
Volume Information:
Volume size = 930.65 GB
Free space = 868.64 GB
Total fragmented space = 0%
Largest free space size = 863.75 GB
Note: File fragments larger than 64MB are not included in the fragmentation statistics.
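For anyone who wants to sanity-check this without Reflect, a crude alternative is to time a big sequential read before and after defragging. A rough Python sketch; the file path and chunk size are only placeholders, and you need a file much larger than RAM (or a flushed cache) for the number to mean anything:
```python
import time

# Hypothetical large file on the drive under test; pick something bigger than RAM
# so the OS file cache doesn't turn this into a memory benchmark.
PATH = r"C:\temp\big_test_file.bin"
CHUNK = 8 * 1024 * 1024  # read in 8 MiB chunks

def read_throughput(path: str) -> float:
    """Sequentially read the whole file and return throughput in GB/s."""
    total = 0
    start = time.perf_counter()
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK)
            if not chunk:
                break
            total += len(chunk)
    elapsed = time.perf_counter() - start
    return total / elapsed / 1e9

if __name__ == "__main__":
    print(f"{read_throughput(PATH):.2f} GB/s")
```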
Ok, I have had my coffee, so you can all come out now!
|
|
|
|
|
The only reason we used to use the defrag tool was to watch the animation of the blocks being moved around.
CI/CD = Continuous Impediment/Continuous Despair
|
|
|
|
|
Did you run the tests more than once before and after the defrag?
|
|
|
|
|
A valid question! In response, I just now ran the "after" test again and got the same result. The "before" test I ran many times over the weeks and never got the speed that I am getting now.
Also: I did a clean install on the machine 3 days ago, and this may explain the 20% fragmentation.
Ok, I have had my coffee, so you can all come out now!
|
|
|
|
|
Cp-Coder wrote: It seems to me that the speed at which an SSD can read large volumes of data is affected by fragmentation. The people who pretend to be experts and claim that, because an SSD has no mechanical parts, fragmentation is no longer an issue clearly do not understand that the file system software can also be a bottleneck. But the fact is, nobody verifies anything; they just repeat the same crap.
That being said, it's much less of an issue these days. Back in the day, if you had a fragmented filesystem... you would know. SSDs are substantially quicker than mechanical drives; in theory you're limited by the electronics rather than by moving parts. It's still a tradeoff between defragging and a shorter lifespan for the drive, though. They're much cheaper now and last a long time, but the tradeoff is still worth knowing about.
Just a tip to keep the FS from fragmenting: if you have a bunch of files that you move around a lot, you can always dump them into a lightly compressed zip file. It'll spare your real file system. Granted, it's probably better suited to tiny text files that aren't source controlled, so maybe it's not practical.
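If anyone wants to try it, here's a rough Python sketch of what I mean; the folder and archive names are made up, and the compression level is kept low on purpose so repacking stays cheap:
```python
import zipfile
from pathlib import Path

# Hypothetical folder of small, frequently shuffled files.
SRC = Path("scratch_notes")
ARCHIVE = Path("scratch_notes.zip")

def pack(src: Path, archive: Path) -> None:
    """Store the whole folder in one lightly compressed zip, so many small files
    become a single file as far as the real file system is concerned."""
    with zipfile.ZipFile(archive, "w",
                         compression=zipfile.ZIP_DEFLATED,
                         compresslevel=1) as zf:
        for path in src.rglob("*"):
            if path.is_file():
                zf.write(path, path.relative_to(src))

if __name__ == "__main__":
    pack(SRC, ARCHIVE)
    print(f"Packed into {ARCHIVE}")
```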
Jeremy Falcon
|
|
|
|
|
So I have this Ferrari Daytona SP3... and I was wondering if I could use jet fuel instead of gasoline. I asked online and most people suggested that jet fuel was probably a bad idea since it would likely shorten the car's life. I decided to test it at the track and discovered that using gasoline got me from 0-100mph in about 5.8s and topped out at about 210mph. However, if I used jet fuel it got me from 0-100mph in 4.7s and topped out at 242mph.
Awesome! I'm sticking with jet fuel!! Never mind that I live in the city and my average car trip is less than 3 miles (round trip).
Bottom line: Not sure that using Macrium Reflect is the best judge of your real world system performance. Just saying...
|
|
|
|
|
Since I use Macrium Reflect almost on a daily basis, it is a valid metric for me. I will continue doing what is best for me, and you can do whatever works for you!
Ok, I have had my coffee, so you can all come out now!
|
|
|
|
|
Cp-Coder wrote: I will continue doing what is best for me, and you can do whatever works for you! Of course... BTW - I was joking with my "analogy"; hence the "joke" icon of my post.
Out of curiosity, what role / function does this PC perform that necessitates such heavy use of Reflect?
|
|
|
|
|
I keep my data on a separate drive which I back up separately, so the C: drive only has Windows and the applications. So my Macrium images take less than a minute to create. Since it hardly takes any time, I take an image first thing every morning, and I can restore my machine to a previous state if I pick up anything nasty or unwelcome.
Ok, I have had my coffee, so you can all come out now!
|
|
|
|
|
Cp-Coder wrote: So my Macrium images take less than a minute to create
So if it slowed down by, say, 10%, would that be too slow?
|
|
|
|
|
Ultimately that's all that matters, isn't it?
If you have a measurable difference in performance, stick with your current method.
|
|
|
|
|
fgs1963 wrote: Awesome! I'm sticking with jet fuel!!
Next test is with the lawn mower?
|
|
|
|
|
I would imagine there's some overhead in processing a ton of file pointers to determine where the next chunk of a badly fragmented file is, as opposed to having a file stored in one continuous chain. Would that account for the difference? I have no idea.
Still, I don't know about Macrium's internals, but in theory, if a backup program worked by copying entire disks/partitions rather than reading the file system, then it wouldn't matter how fragmented (or not) the disk is, and the software wouldn't even need to understand which file system is being used.
Of course that means backing up a 1TB drive that's only 10% full will back up 1TB and not 100GB. I have a 2-disk USB enclosure that's like this. There's a button on the front that, if held when powering up, will blindly clone one drive to the other, regardless of file system (assuming the target is the same or larger capacity). And if the source drive has tons of fragmentation, the individual cloned files will be as badly fragmented.
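Conceptually, that kind of dumb clone is just a chunked raw copy of one device to another, with no file system knowledge involved. A Python sketch of the idea; the device paths are made-up Unix-style examples, and you'd need the right privileges and a destination at least as large as the source:
```python
# Conceptual sketch only: a file-system-agnostic clone is a raw byte copy of one
# block device onto another. Device paths here are illustrative; run with care.
SRC = "/dev/sdb"   # hypothetical source drive
DST = "/dev/sdc"   # hypothetical destination drive
CHUNK = 4 * 1024 * 1024  # copy in 4 MiB chunks

def raw_clone(src: str, dst: str) -> None:
    # "r+b" writes to the existing device without attempting to truncate it.
    with open(src, "rb") as s, open(dst, "r+b") as d:
        while True:
            block = s.read(CHUNK)
            if not block:
                break
            d.write(block)

if __name__ == "__main__":
    raw_clone(SRC, DST)
```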
|
|
|
|
|
No. I have set up Macrium so that it only includes actual files in the image, so my images reflect the size of the used parts of the disk, not the entire disk. This works very well, and I have restored my C: from such images dozens of times. Macrium also includes all partitions on the system drive by default. It is really a fantastic utility for restoring your machine in case of some disaster.
Ok, I have had my coffee, so you can all come out now!
|
|
|
|
|
Hello all,
INTRODUCTION
Given the new accounting laws in our country, every company will need accounting software that complies with those laws.
Even freelancers (like me) will have to adopt software like that.
I don't trust the cloud, plus I don't want to pay a monthly fee just to be able to access my accounting data.
I currently own a NAS, which is more than enough for my storage needs but is not capable of running the accounting programs I would be able to use in my country.
Most of the accounting programs I could use require Windows and an SSD to run.
Getting a server would mean:
* Getting a server, some SSD and HDD disks.
* Getting a UPS.
* Getting a small rack.
* Getting a Windows server license.
* Using our current NAS as a backup target for that server, and continuing to make extra NAS backups with external USB HDDs.
QUESTION
What server / option would you recommend for this kind of job?
Would it be better to get a tower server or a rack server?
As soon as we have children, the server, NAS, UPS... will have to be placed inside a rack anyway.
It would be nice to have a mix of SSDs and normal HDDs: SSD for the OS and the accounting program, and HDD to store everything else.
+/- 8TB of data space available would be nice.
+/- 32GB RAM available would be nice.
Would it be better to install the accounting program inside a virtual machine, just to make it easier to move it from one server to another in the future (if needed)?
Do you agree that it's better to get a server than a normal workstation for all this?
And as a bonus... what would you use that server for, apart from everything mentioned above? Any additional hints/ideas?
Thank you all!
|
|
|
|
|
Are there hardware requirements for the software?
Are you the sole user of the data, or do your clients need access to it?
Whatever you do, make sure your backups work; plan regular tests of them.
I would use the server for a single purpose.
CI/CD = Continuous Impediment/Continuous Despair
|
|
|
|
|
Maximilien wrote: I would use the server for a single purpose That's a good idea. If he feels the need to overpay and get a beefy computer, then he can at least use something like VMware Server to split it up.
Jeremy Falcon
|
|
|
|