We've all done it: we've labelled some code (often code we've inherited) as "legacy". But it's still used in production systems, so is it really legacy? And what makes it legacy? Should we shy away from this unwarranted labelling of perfectly functioning code, where the labelling is a pure convenience that allows us to push through new stuff and keep upper management nice and happy?
### Summary of answers
Looking through the answers, I see four general themes. Here is the breakdown as I see it:
- Any code that has been delivered: 6
- Dead systems: 2.5
- No unit tests: 2
- Developers are not around: 1.5