Everyone eventually hits a wall when building software: some dependency is unmet. A library or framework isn't installed, or it's the wrong version. You go to get that dependency and it has three more that you don't have, so you chase those down as well, and either you eventually find them all or you run out of patience and say fuckit, building that Tetris game from scratch really isn't worth your time. Lots of different solutions to this problem have evolved, and package management has become somewhat elegant, but the problem is still out there. Security of packaged code is a tricky business, and you are always relying on someone's vetting to make sure those dependencies aren't about to unleash a trojan that encrypts your hard drive and then sells the key back to you.
Why am I talking about this?
Well, in a previous post I talked about how microservices are a pattern that has been around for a long time, and how breaking software into small reusable pieces is what made it so powerful. The catch is that decomposition brings this dependency hell problem along with it. Software developers like myself face this problem a lot, especially when we have rules and regulations to adhere to that restrict where we can actually get software from. We run into roadblocks that make us abandon or delay a lot of proof-of-concept efforts because we simply can't get everything we need, and the process to get those things cleared is itself a challenge.
Is there any hope?
Apple took a different approach (or perhaps it was carryover from NeXT): instead of putting the burden of matching dependencies on users, you package your main software and its dependencies together into one installable unit. You drag it into a folder on your hard drive to install it, and you drag it to the trash to uninstall it. In fact, it is so easy that users hardly realize an application isn't a single file but is actually a folder tree with everything in it. Yes, there is some duplication of files everywhere, but the fact that you don't have dependency conflicts is like a breath of fresh air.
So why doesn't everyone do it this way?
Some of us remember the times when things like disk space and memory were really constrained resources. My first computer had only 64K of memory (that seems incredibly limited to me, but I'm still a young guy, and I'm sure there are folks who had computing devices where 64K would have been a luxury) and no hard drive. You powered it up, inserted a big 5.25" floppy, and loaded software off of that. Those floppies barely held a megabyte of data even when you used both sides. With such a small footprint, you either squeezed as much as you could onto that one disk or you put the burden of swapping disks on your users (even flipping the disk over was kind of annoying). Those drives were also incredibly slow, so even if you easily fit on one disk, you still wanted the code compact so your users weren't sitting around waiting for data to be read off the drive. Thus the art of software was born with efficiency as the first commandment, and redundancy was one of the first things that would never fly.
Old habits die hard, and even now, none of us really wants redundancy. Redundancy is bloat, and when things get bloated they tend to get slower. With computer speeds doubling almost every year, it started to feel like bloat might not be such a problem, but while our compact code got really fast, people started writing ever more complicated routines that brought things back to earth. CPU speeds also hit a ceiling, so keeping things efficient continued to be a priority. It isn't as critical as it once was - performance testing now sits lower in the testing chain and is more of a tuning step - but most of us are familiar with the complaint that some part of our application isn't performing. Seasoned developers are always thinking about performance, even if the first few passes on some logic don't really focus on it. If we keep things loosely coupled, we can always come back and tune the pieces one at a time.
So what do we do?
We keep doing the continuous integration and continuous deployment thing. Make sure the dependencies all fit together. We regularly pull the source off the shelf, build it, test it, and make sure it runs. The more we can automate this, the less pain our users will feel. We won't always be able to prevent the issue from cropping up, but the sooner a developer knows that some dependency is missing or broken (maybe that specific version of the library got deprecated and removed, or maybe there is now a version incompatibility), the sooner we can address it.
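That automated check doesn't have to be elaborate. Here's a minimal sketch in Python - the `REQUIRED` table and its pinning scheme are my own illustration, not any particular build tool's format - of a pre-test step that verifies each dependency is installed at the version the build expects, so a missing or mismatched library fails fast instead of surfacing later as a mysterious runtime error:

```python
# Minimal dependency smoke check: run it before the real test suite so a
# missing or wrong-version dependency is reported up front.
from importlib import metadata

# Hypothetical pins for illustration; None means "any version is fine".
REQUIRED = {
    "pip": None,
}

def check_dependencies(required):
    """Return a list of problem strings; an empty list means all is well."""
    problems = []
    for name, wanted in required.items():
        try:
            installed = metadata.version(name)
        except metadata.PackageNotFoundError:
            problems.append(f"{name}: not installed")
            continue
        if wanted is not None and installed != wanted:
            problems.append(f"{name}: have {installed}, want {wanted}")
    return problems

if __name__ == "__main__":
    for issue in check_dependencies(REQUIRED):
        print(issue)
```

Wire something like this into the front of the CI pipeline and the "which dependency broke?" question gets answered in the first few seconds of the build rather than after an hour of debugging.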
Making testing a priority is paramount to keeping your development processes going - not just for keeping dependency hell at bay, but for defects and security issues as well. Your testing shouldn't cover just the happy path; it needs to address the negative paths too. Every time a bug is found, you should write a test for it so that you catch it as early as possible next time. Some of us refer to this as shifting defect resolution to the left - as close to the developers as possible. Fixing issues that crop up in production costs your business time and resources that are orders of magnitude greater than what it takes to fix them in the development environment. Bottom line: you can't ignore devops - it is the thing that will save your business more than anything and keep you out of places like dependency hell.
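To make that write-a-test-for-every-bug habit concrete, here's a sketch with an entirely hypothetical helper (a `parse_port` function that, say, crashed in production on empty input). Once the fix is in, a regression test pins it down so the defect can't quietly come back:

```python
# Hypothetical helper used only to illustrate the regression-test habit.
def parse_port(value, default=8080):
    """Parse a port number from a string, falling back to a default."""
    if not value or not value.strip():
        return default  # the original (invented) bug: this case raised ValueError
    return int(value)

def test_parse_port_empty_string_regression():
    # Captures the production defect so it fails loudly if it ever returns.
    assert parse_port("") == 8080
    assert parse_port("   ") == 8080
    # And a happy-path check alongside the negative ones.
    assert parse_port("9000") == 9000
```

The test is tiny, but it now runs on every build, which is exactly the point: the defect's resolution has moved all the way to the left, into the developer's own edit-build-test loop.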