|
One small correction - browsers don't accept certificates that are valid for longer than 398 days (about 13 months).
|
|
|
|
|
I'm using letsencrypt/certbot with Apache 2.4.x on Ubuntu 20.04 LTS. 3 sites, 1 cert to cover them all.
Initial signup Aug 2021, answered the script's questions and away it went.
Had to do some minor tweaks to the site configs it wrote for the :443 sites, mainly for some of my custom logging.
Since then, about every 2 months, it (I think the certbot snap) pops up, installs a new cert and does a graceful restart of apache httpd.
So undramatic I don't even notice. To write this I had to peek into /etc/letsencrypt/archive/ to find out when it did the renewals.
Obviously a case of ymmv (or apache vs nginx setup?)
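For anyone who wants to do the same check, here's a minimal Python sketch of that "peek into the archive" step. The /etc/letsencrypt/archive path is certbot's default; the layout (one subdirectory per cert, files named cert1.pem, cert2.pem, ...) is assumed:

```python
from datetime import datetime
from pathlib import Path

def renewal_history(archive_dir):
    """List cert files under a certbot archive dir with their mtimes,
    newest first. Each renewal drops a new certN.pem in here."""
    entries = []
    for path in Path(archive_dir).rglob("cert*.pem"):
        mtime = datetime.fromtimestamp(path.stat().st_mtime)
        entries.append((mtime, str(path)))
    return sorted(entries, reverse=True)

if __name__ == "__main__":
    for when, path in renewal_history("/etc/letsencrypt/archive"):
        print(when.strftime("%Y-%m-%d"), path)
```

Run it with sudo (the archive dir is root-only) and the dates line up with the ~60-day renewal cadence.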
Cheers,
Peter
Software rusts. Simon Stephenson, ca 1994. So does this signature. me, 2012
|
|
|
|
|
honey the codewitch wrote: Certs are a hassle I'd rather not have to deal with every 90 days.
I've got one 'bought' SSL cert (2 yrs) and two letsencrypt ssl certs. The letsencrypt certs on my windows servers are good for 90 days and are managed automatically by an app/service called certifytheweb. It was a bit tricky getting it working the first time, but since then I haven't had to worry about them for over 2 years now.
I'm running a mail server on one of those and recently (2 weeks ago) finally figured out how to export the public/private keys that are required for hMailServer. Now I've just got to learn enough powershell to automate the process!
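If it helps, that export step can be scripted without PowerShell, too. Here's a sketch that shells out to the openssl CLI to split a .pfx into the separate PEM key and cert files a mail server like hMailServer wants. The file names and password handling are hypothetical, and it assumes openssl is on PATH:

```python
import subprocess

def split_pfx(pfx_path, password, key_out, cert_out):
    """Extract the private key and certificate from a PKCS#12 (.pfx)
    bundle into separate PEM files using the openssl CLI."""
    # Private key, written unencrypted (-nodes) so the server can load it
    subprocess.run(
        ["openssl", "pkcs12", "-in", pfx_path, "-nocerts", "-nodes",
         "-passin", f"pass:{password}", "-out", key_out],
        check=True)
    # Leaf certificate only (-clcerts), no key material
    subprocess.run(
        ["openssl", "pkcs12", "-in", pfx_path, "-clcerts", "-nokeys",
         "-passin", f"pass:{password}", "-out", cert_out],
        check=True)

if __name__ == "__main__":
    split_pfx("mail.pfx", "changeit", "mail.key.pem", "mail.cert.pem")
```

Lock down the output files' permissions afterwards - the extracted key is unencrypted.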
"Go forth into the source" - Neal Morse
"Hope is contagious"
|
|
|
|
|
I did the Let's Encrypt thing for a season and found it fiddly. I prefer having to renew once a year, so I went and got a real wildcard cert, as they are pretty cheap today.
|
|
|
|
|
I tried other stuff to have IIS/Windows auto-renew (wildcard) - but not CertifyTheWeb - I will try it out, and if it works I am going to owe you a beer.
If not, I will just keep on drinking by myself
|
|
|
|
|
IIRC, the trick in IIS was app pool permissions on the .well-known/acme-challenge folder. Good luck!
"Go forth into the source" - Neal Morse
"Hope is contagious"
|
|
|
|
|
Why are certs required? Who's the sheriff?
"A little time, a little trouble, your better day"
Badfinger
|
|
|
|
|
your browser will default to https these days. sites pretty much have to support SSL.
To err is human. Fortune favors the monsters.
|
|
|
|
|
So certs are the badge of a secure website and the right to claim "https".
That relationship is not obvious. Thanx.
"A little time, a little trouble, your better day"
Badfinger
|
|
|
|
|
A bit more than that. The https protocol is not just a "label", it's an actual protocol, and the handshaking involves the sharing of the certificate with the requester. So the cert is an integral part of the SSL protocol. No cert, HTTPS doesn't even begin to work.
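You can see the "no cert, HTTPS doesn't even begin to work" part directly. A small stdlib-only sketch (names are mine, not any library's): a TLS server that never loads a certificate can't even complete a handshake, because it has nothing to present:

```python
import socket
import ssl
import threading

def handshake_without_cert():
    """Attempt a TLS handshake against a server that never loaded a
    certificate; returns the client-side error, or None on success."""
    server_ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    # Note: no server_ctx.load_cert_chain(...) call anywhere

    listener = socket.create_server(("127.0.0.1", 0))
    port = listener.getsockname()[1]

    def serve_once():
        conn, _ = listener.accept()
        try:
            server_ctx.wrap_socket(conn, server_side=True)
        except ssl.SSLError:
            pass  # expected: no certificate, no usable cipher suite
        finally:
            conn.close()
            listener.close()

    threading.Thread(target=serve_once, daemon=True).start()

    client_ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    client_ctx.check_hostname = False
    client_ctx.verify_mode = ssl.CERT_NONE
    try:
        with socket.create_connection(("127.0.0.1", port)) as raw:
            with client_ctx.wrap_socket(raw):
                return None  # handshake somehow succeeded
    except ssl.SSLError as err:
        return err

if __name__ == "__main__":
    print(handshake_without_cert())
```

The TCP connection opens fine; it's the TLS handshake itself that fails on both ends.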
|
|
|
|
|
Not sure which browser you're using but Edge, Chrome (unless it was in the update this week), and Firefox don't default to SSL. They do check for a certificate first and then warn you if you're going to an https URL and there's no certificate.
|
|
|
|
|
the heck it doesn't. It wants to do it unless i explicitly type http:// in the address bar. I always have to fiddle with that when i'm calling web stuff off an esp32 which doesn't do ssl
To err is human. Fortune favors the monsters.
|
|
|
|
|
You need to go into Options and uncheck the "Screw up randomly" box.
Or use sudo scrwuprnd off
|
|
|
|
|
honey the codewitch wrote: Maybe some of you know why waving a dead chicken over linux never works, but I don't.
Windows is a proprietary O/S, so waving proprietary dead chickens over it works. Linux is an open-source O/S; you need to open-source your dead chickens.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
...also, which dead chicken you use is dependent on your distro. When in doubt, you may have to try all 500+ of them... but try them quickly. The longer you wait, the more seem to hatch!
|
|
|
|
|
Waving dead chickens over windows works but with Linux you must use a dead penguin.
|
|
|
|
|
I have 3000+ domains on IIS behind multiple HAProxy and NGINX instances.
One Windows VM is responsible for the creation and renewal of all certs on all proxies using custom C#.
(Keeps the date of the last renewal, renews, saves on the proxy via SFTP, reloads the proxy via SSH)
A certificate is renewed every 60 days; if it fails i get warned and have 30 days to solve the problem.
It never fails on LetsEncrypt's side, it is always because the domain's DNS is wrong or something like that.
Commercial/Administrative people add or remove clients (domains) at will and i never have to handle any of that.
I absolutely love LetsEncrypt. No way i would renew 3000 domains manually.
Tip: Don't stop NGinx, just reload it. (i assume certbot will not complain)
If something fails on the renew, the site is still up with the old cert.
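The bookkeeping described above - renew at 60 days, leaving a 30-day window to fix failures before the 90-day cert actually expires - can be sketched like this (function names are made up, not from the poster's C# tool):

```python
from datetime import date, timedelta

CERT_LIFETIME = 90   # days: Let's Encrypt certificate validity
RENEW_AFTER = 60     # days: renew well before expiry

def needs_renewal(issued: date, today: date) -> bool:
    """True once a cert is RENEW_AFTER days old."""
    return (today - issued).days >= RENEW_AFTER

def days_left_to_fix(issued: date, today: date) -> int:
    """If renewal keeps failing, days until the old cert actually expires."""
    return (issued + timedelta(days=CERT_LIFETIME) - today).days

if __name__ == "__main__":
    issued = date(2024, 1, 1)
    today = date(2024, 3, 1)  # exactly 60 days later
    print(needs_renewal(issued, today), days_left_to_fix(issued, today))
```

The point of the early trigger is exactly that safety margin: a renewal failure is a warning, not an outage.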
|
|
|
|
|
certbot won't run if something is bound to the http ports (in standalone mode, anyway - it spins up its own listener for the challenge)
To err is human. Fortune favors the monsters.
|
|
|
|
|
Ahh. OK.
I have only one certbot, on a Raspberry Pi at home, but it is running as a service/daemon.
I do not remember what i did, but it keeps renewing the cert by itself.
Nothing gets added or removed from that Pi, so it's not a good comparison.
But i find it really strange that the webserver has to stop to renew the cert.
Renewing many sites takes a lot of time, and there's no way that downtime is acceptable.
Don't know what it is, but something is up.
|
|
|
|
|
Yeah probably ugly, although most larger sites are load balanced so in theory it should be possible to update a node at a time without downtime for a site like that.
But I share your confusion as to why the site needs to be stopped.
To err is human. Fortune favors the monsters.
|
|
|
|
|
I've had better luck with rubber chickens.
|
|
|
|
|
honey the codewitch wrote: why the heck do we need to encrypt all web traffic these days? Because Google decreed that it should be so.
If you think 'goto' is evil, try writing an Assembly program without JMP.
|
|
|
|
|
To answer that question of WHY SSL:
Because we need some privacy in what we are doing.
Before SSL, every man in the middle knew every search you made, your FTP passwords and your email passwords (no, ENCODING is not encryption, LOL).
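That "encoding is not encryption" point is easy to demonstrate: pre-HTTPS Basic auth sent credentials base64-encoded, which anyone on the wire could reverse with no key at all (credentials here are made up):

```python
import base64

# What a pre-HTTPS Basic auth header carried on the wire
# (hypothetical credentials):
header_value = base64.b64encode(b"alice:hunter2").decode()
print(header_value)

# Any man in the middle reverses it with zero keys:
user, password = base64.b64decode(header_value).split(b":", 1)
print(user, password)  # b'alice' b'hunter2'
```

Encoding is a reversible representation; encryption requires a secret. TLS gives you the latter.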
So, now only GOOGLE (or your browser) can sell your URL hits if they are not tracked elsewhere (usually by google, fb, etc).
This is a step in the right direction. I use apache, and the process (as mentioned elsewhere) is pretty clean. My chief tech automated it years ago; I never noticed it. It just works. Thankfully. (Of course my published site is very touchy: you don't get a 404 error, you get firewall BLOCKED for 72hrs. Got tired of robo attacks, lol. Oh, outside of the US, it could be a 30-day ban! 99% of my web traffic was simply attack bots checking for phpmysql, etc.)
Spend the time to make sure you have the configuration right, and easy to update; it's clearly worth it.
But we need SSL. EVERYTHING over the internet should use strong encryption. The fact that we SUCK at it... Is kinda on us... We spend very little time playing with it, and just want it to work.
|
|
|
|
|
Yeah, all of this.
Although once traffic is "on the inside" I think people do tend to keep it SSL and this is probably a little bit bad/irrelevant/overkill. Encryption/decryption doesn't come for free. Let the API gateways/load balancers handle it.
My mouth stood agape at a line in Microsoft docs recently for a specific kind of containerization on Azure where they say applications don't have to and should not implement SSL. I have to think their thinking is much like the sentiment above.
However, it IS maybe a notably different animal to be able to sniff your own traffic.
|
|
|
|
|
Agreed, but remember that for DECADES we had poor practices at protecting things (storing passwords in clear text in DBs associated with the users, shoving them into COOKIES (OMG) as opposed to some GUID), and kept thinking we could ADD security later.
You know, like we can ADD performance later... (Every project I've seen with that attitude suffered massive performance issues. You DESIGN for performance, you implement with care. If speed is important, then it's part of the CONTRACT and TRACING.)
If security is remotely important, it's got to be part of the contract.
And in today's world, let's ASSUME that a LACK of security is a NON-STARTER.
The tools are getting easier/better. But people still not understanding which is the PRIVATE KEY and which is the PUBLIC KEY is getting old. (Of course, calling them both key files, and sometimes .key or no extension, doesn't help, but the .pub should be pretty obvious.)
--> We've come a VERY LONG way since the 1940s (Pre-Fortran). C, C++ (Objects), (Frameworks), and more!
I have hope!
|
|
|
|