

One day we are going to lose an unbelievable amount of information, but do you know what is simple to archive and bring back? A simple form from 2005.
I have done worse: I used to do a cascading merge every day to move stuff from dev branches to staging to production. Then I did a merge in the opposite direction for a small selection of branches so they could get their updates from staging. Feature branches were rebased as needed.
An undocumented feature flag in a plug-in that drastically changes the behavior in any deployment mode.
Don’t. Unless you are confident you are not adding hot garbage to the code base.
The guy who managed git at work before me didn't quite have the knowledge I do and was not using LFS. In one of the main repos, a 200 MB binary was pushed 80+ times, and it's not the only file this happened to. Even if you do a shallow clone, you eventually need to deepen the history anyway. It's a nightmare.
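A minimal sketch of how you might spot that kind of damage: it shells out to git plumbing that actually exists (git rev-list --objects --all and git cat-file --batch-check) and lists the paths with the largest blobs in history. The repo path and the 100 MB cutoff are placeholders, not anything from the original comment.

```python
# Sketch: find the largest blobs buried in a repo's history, the kind of scan
# that would have caught a 200 MB binary pushed 80+ times without LFS.
# Assumes `git` is on PATH; REPO and THRESHOLD are arbitrary placeholders.
import subprocess
from collections import defaultdict

REPO = "."                      # placeholder: path to the repository
THRESHOLD = 100 * 1024 * 1024   # flag blobs over 100 MB

def run(args, **kwargs):
    return subprocess.run(["git", "-C", REPO] + args, check=True,
                          capture_output=True, text=True, **kwargs)

# Map every object reachable from any ref to the path it was stored under.
sha_to_path = {}
for line in run(["rev-list", "--objects", "--all"]).stdout.splitlines():
    sha, _, path = line.partition(" ")
    if path:
        sha_to_path[sha] = path

# Ask git for the type and size of each object in one batch.
batch = run(["cat-file", "--batch-check=%(objectname) %(objecttype) %(objectsize)"],
            input="\n".join(sha_to_path))

hits = defaultdict(list)        # path -> sizes of each oversized version
for line in batch.stdout.splitlines():
    sha, otype, size = line.split()
    if otype == "blob" and int(size) >= THRESHOLD:
        hits[sha_to_path[sha]].append(int(size))

for path, sizes in sorted(hits.items(), key=lambda kv: -sum(kv[1])):
    print(f"{path}: {len(sizes)} large version(s), {sum(sizes) / 2**20:.0f} MiB total")
```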
That is kind of funny: sure, it parses human speech, but when you use the method meant for communicating letters and numbers very clearly, it breaks.
In game dev, a binary file conflict means someone is going to have to do their work a second time.