|
Nope, not just you.
Dev systems need more frequent backups than "ordinary" systems, because it's too easy to lose stuff through a tiny mistake. And dev systems tend to change more than "normal user" data does as well.
Not telling you they stopped is ... incompetence.
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
OriginalGriff wrote: Not telling you they stopped is ... incompetence. Yep! That was the thing that really wound me up!
|
|
|
|
|
my shop...
The entire team works remotely. We have no formal office space. We use Microsoft Teams to communicate and meet.
-- We develop locally on our dev laptops, running synchronized local databases using DbUp (see below).
-- Visual Studio 2019, etc.
-- We use DevOps and Git. We use DevOps continuous integration with build and release pipelines.
-- We use AWS for DEV, QA, UAT, and PROD (applications, sites, SQL Server dbs, etc.).
-- Everything that is deployed to an environment (DEV, QA, UAT, PROD) is in source control (Git), including all configuration files (app, web, appsettings.json, etc.).
We never back up anything except our databases in DEV, QA, and PROD. All code is in source control, obviously.
We have 14 developers.
We have 2 engineers who manage AWS and all build and release pipelines.
We have 4+ QA testers.
We have 5+ Business Analysts.
We have one Release Manager who is also our Scrum Master, if you want to call it that.
Home - DbUp[^]
modified 22-Sep-21 4:35am.
|
|
|
|
|
Exactly. The source is in source control and can be deployed at any time, including all the configuration for that specific environment.
|
|
|
|
|
One of the problems here is that everything has to be backed up and that leads to long delays in getting databases and such spun up.
I used to be able to have local databases as sandboxes, but they won't allow that anymore.
|
|
|
|
|
Only if the devs are full local administrators on their machines, so they can back them up.
|
|
|
|
|
All the important code is in source control and several folders get synced via OneDrive. It would still be a pain to set up my dev environment again, but most things are backed up.
|
|
|
|
|
I've been squeezed for disk space by control freaks and supervising sadists. So, it's not just you. The subtext was: hardware is more valuable than a system programmer's weekends off.
It was only in wine that he laid down no limit for himself, but he did not allow himself to be confused by it.
― Confucian Analects: Rules of Confucius about his food
|
|
|
|
|
Actually, that makes sense.
The two categories of important data on a dev system are
a) developed product
b) development environment
a) is already backed up in source control. b) has to be easy to set up in the first place (i.e. for onboarding), so you need a single installer or a script which sets everything up. That script, of course, has to be backed up, but the actual dev machine can then be set up from scratch on any empty system.
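Such a from-scratch setup script is usually little more than a list of idempotent "install if missing" steps. A minimal sketch, assuming nothing from the post itself (the `need` helper and the tool names are purely illustrative):

```shell
#!/bin/sh
# Sketch of an idempotent dev-machine setup script.
# Re-running it is safe: tools already present are skipped.
set -eu

need() {
  # Report whether a tool is installed; a real script would install it.
  if command -v "$1" >/dev/null 2>&1; then
    echo "ok: $1 already installed"
  else
    echo "todo: install $1 via your package manager"
  fi
}

need git
need dotnet
```

Keeping a script like this in the same repo as the product means rebuilding a lost machine is one clone plus one run.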
|
|
|
|
|
c) Various servers needed for the CD/CI pipeline (Build agents, deployment targets, ...).
Either you need a backup of these, or (my preference) also script the setup of these and keep that in source control. That said, scripting our LDAP test server turned out to be rather frustrating, so that is just a "step by step" setup guide in our Wiki that can be worked through in half an hour to an hour.
|
|
|
|
|
A how-to guide is the next best thing and if nothing else, may help someone new to get an overview of the delivery pipeline.
Thanks for reminding me, by the way, I'll have to make sure to back up my CD pipeline as well*!
*When I finally get to set it up anyway. A huge "advantage" of maintaining a legacy pile of outsourced-a-dozen-times bog is finally, after all those years, having the management backup to set it up anew, clean & with all the bells & whistles, including CD.
|
|
|
|
|
Member 9167057 wrote: a) is already backed up in source control. Yes, but we have over 50 applications that would need to be redeployed. Doable, but a bit of a pain.
Member 9167057 wrote: b) has to be easy to set up in the first place Sadly not! Not only would this require effort from several teams outside of Development, we would have to put in Change Requests and go through Change Management to get it done. I doubt we would get an empty system ready for redeployment in less than 3 days.
And, as the goalposts have been changed without our knowledge (i.e. we always used to have back-ups, and now we don't), we haven't exactly prepared for this.
|
|
|
|
|
a) are you talking about applications you produce or applications you use? If you produce them, they should be in source control and redeployment is as easy as a git clone. If it's applications you use, that falls under item b).
b) you need outside departments to set up your dev environment? Holy f***ing hell, that's awful! At our place, devs can get (local) admin rights which are enough to set up stuff locally, including local test data (which sits in a Git repo anyway).
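The "as easy as git clone" claim scales to the 50-application case mentioned earlier with a loop over a repo list. A sketch, assuming a plain-text list file with one clone URL per line (the `clone_all` name and `repos.txt` are made up for illustration):

```shell
#!/bin/sh
# Sketch: rebuild a machine's checkouts from a list of repo URLs.
set -eu

clone_all() {
  list=$1   # text file, one clone URL per line
  dest=$2   # directory to clone everything into
  mkdir -p "$dest"
  while IFS= read -r url; do
    name=$(basename "$url" .git)
    # Skip repos that are already present so the script is re-runnable.
    [ -d "$dest/$name" ] || git clone "$url" "$dest/$name"
  done < "$list"
}

# Example: clone_all repos.txt "$HOME/src"
```

With the list itself kept in source control, "redeploy 50 applications" becomes one command rather than 50 manual steps.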
|
|
|
|
|
Yikes
I have dealt with the following only:
1) IT owns the server. It is a pain getting changes (change requests, the whole nine yards), but as they own it they also make sure security updates are installed and the system is backed up.
2) DevOps owns the server. IT does not care how or why it is changed, because they are not responsible for running it (they might own the infrastructure it runs on and keep that running, there might be some antivirus check, it might be on a separate vnet where IT does not get blamed for problems, etc.) - but in general IT is hands off.
Both kinds work, with drawbacks, but I prefer 2. It seems you got the worst parts of both of these - let's at least hope that somewhere out there a lucky schmuck ended up with the good parts of both - though it is probably some inexperienced guy who does not even know how lucky he is. Oh well, reality will catch up sooner or later.
|
|
|
|
|
The a**hole who said that should be terminated with extreme prejudice. Urinating and defecating on his grave is left as an exercise for the student.
Software Zen: delete this;
|
|
|
|
|
This is tough. Honestly, all the source code / configuration on a dev machine should be readily available in the source control system.
We check in our component libraries, everything. Even a dump of the Registry Settings for our Primary IDE, just so we can install a base Windows Box, the IDE, mount the source directories, and load the registry settings. We are done.
Anything on a machine CAN & WILL be lost, destroyed, fried, etc.; therefore it must be backed up, or it must be in source control, with a TESTED/COMPLETE rebuild process.
BTW, testing your backup solution or your rebuild process is ALSO YOUR responsibility, since you are the one who needs it to work.
I always ask my clients 2 questions:
1) Do you have backups?
2) Have you ever tested them?
If the answer to #2 is NO. Then the answer to #1 is NO!
And I've seen people using MIRRORED hot-swappable HDs: they would pull one drive every Friday and replace it with an initialized blank one. It rebuilt over the weekend. GREAT... except they never tested it. Turns out the mirroring was specific to that controller. The HD they pulled was unreadable in the 3 computers they tried to actually read it from. One was close, but it had the wrong firmware.
That was ONE system in a SMALL company. It gets worse with scale, not better.
But at least you know!
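The two questions above can be turned into a routine check: restore the backup into a scratch directory and diff it against the live data. A sketch (the `verify_backup` name is invented, and the plain `cp` "restore" is a stand-in for whatever restore tool is actually in use):

```shell
#!/bin/sh
# Sketch: a backup only counts once a restore has been verified.
set -eu

verify_backup() {
  live=$1    # directory holding the current data
  backup=$2  # directory holding the backup copy
  scratch=$(mktemp -d)
  # "Restore" is a plain copy here; substitute your real restore tool.
  cp -R "$backup"/. "$scratch"/
  # Compare the restored tree against the live tree, byte for byte.
  if diff -r "$live" "$scratch" >/dev/null; then
    echo "backup verified"
  else
    echo "backup does NOT match live data" >&2
    return 1
  fi
}
```

Run on a schedule, this would have caught the controller-specific mirroring in the anecdote above long before the drives were needed.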
|
|
|
|
|
My dev system isn't backed up at all. I am responsible myself for storing all vital data either on a network drive or in Subversion. In my home office, with its slow internet connection, where should the backup data be stored anyway? If my computer should ever go up in smoke (which has never happened in my 24 years as a software developer), I never lose more than 1 week of work. 24 years of regular backups would mean much more effort than 1 week of work, so it's not worth it. My opinion!
|
|
|
|
|
Intel Core i9-12900K crushes AMD’s best in new leaked benchmarks | TechRadar[^]
Quote: One of the best things about the CPU war between Intel and AMD is how it’s encouraging both companies to really up their games, and a new benchmark leak for Intel’s upcoming Core i9-12900K 12th generation Alder Lake processor suggests Team Blue could be on to a winner.
The benchmarks, which were posted on Twitter and reported by Tom's Hardware, apparently show the Intel Core i9-12900K, paired with an Nvidia RTX 3080 graphics card, quite comprehensively beating the AMD Ryzen 9 5950X, AMD’s flagship consumer processor that will be the i9’s chief competitor, which is paired with an AMD Radeon RX 5700XT GPU.
Get me coffee and no one gets hurt!
modified 22-Sep-21 8:28am.
|
|
|
|
|
But can it run Creation Kit without me having to go make a sandwich every time it loads?
Real programmers use butterflies
|
|
|
|
|
Cp-Coder wrote: Quote:
Source?
"If we don't change direction, we'll end up where we're going"
|
|
|
|
|
Oops! Source is now added. Thanks!
Get me coffee and no one gets hurt!
|
|
|
|
|
It's always been an affordability issue for me. I can usually get significantly more performance from AMD for the same price. When I'm spending my own money, that's a huge deal. And since I was blessed with motion sickness from video games, I don't care about the fancy graphics stuff!
Hogan
|
|
|
|
|
Curious as to how it compares to the M1?
cheers
Chris Maunder
|
|
|
|
|
I am not the right person to ask. I know nothing about Apple components.
Get me coffee and no one gets hurt!
|
|
|
|
|
Pendulum effect.
One month it's Intel, the next it's AMD.
No biggie here.
I assume hardware YouTubers will start the "AMD is dead, use Intel instead" cycle.
CI/CD = Continuous Impediment/Continuous Despair
|
|
|
|
|