We are pattern-matching machines. We love patterns, and we'll find and match them even when they're not there. From raindrops on a sidewalk to the stock market to assassinations of US Presidents, we project patterns onto statistically knowable events and then make predictions. If you take the long view you'll do OK, but Hari Seldon's psychohistory? Maybe we'll get there one day, but today is not that day. The same applies to large, interdependent systems. We know they're going to have problems, and we might even be able to predict how many will happen over a sufficiently long period, but we can't accurately predict when the next one will occur.
A Black Swan is a seemingly random event that, in retrospect, we can see was inevitable. Just like a black swan, we don't know exactly when one of our systems is going to have a problem, but we know it will. Knowing that it will happen, we can prepare. We can think about a taxonomy of problems and put additional systems and processes in place to minimize their impact. Here's one such taxonomy and what we can learn from it.