When I was younger, the most befuddling part of programming languages was the ability
to create errors. My first reaction to the
throw operator in Java
was, “Well, that’s stupid; why would you ever want to
cause an error?” Errors were my enemy—something I
sought to avoid—so the ability to cause an error seemed like a useless and
dangerous aspect of the language. I thought it was dumb to include such a capability in the
first place. Now, with a great deal of experience under my belt, I’m a big
fan of throwing my own errors.
An error occurs in programming when something unexpected happens. Maybe an incorrect value was passed into a function, or a mathematical operation had an invalid operand. Programming languages define a base set of rules that, when deviated from, result in errors so that the developer can fix the code. Debugging would be nearly impossible if errors weren’t thrown and reported back to you. If everything failed silently, it would take you a long time to notice that there was an issue in the first place, let alone isolate and fix it. Errors are the friends of developers, not enemies.
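To make that concrete, here is a minimal sketch in Java (the names and numbers are hypothetical, chosen only for illustration): a function that receives an incorrect value throws an error immediately rather than failing silently, so the mistake is reported back to the developer at the point where the rule was broken.

```java
// Hypothetical example: reject an incorrect value instead of failing silently.
public class Discount {

    // Returns the price after applying a percentage discount.
    // The percentage is expected to be between 0 and 100.
    public static double applyDiscount(double price, double percent) {
        if (percent < 0 || percent > 100) {
            // Deviating from the rule produces an error the developer can see and fix.
            throw new IllegalArgumentException(
                "percent must be between 0 and 100, got: " + percent);
        }
        return price - (price * percent / 100);
    }

    public static void main(String[] args) {
        System.out.println(applyDiscount(50.0, 10));   // prints 45.0
        System.out.println(applyDiscount(50.0, 250));  // throws IllegalArgumentException
    }
}
```

The second call fails loudly with a message naming the bad value, which is far easier to isolate than a quietly wrong discount propagating through the rest of the program.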
The problem ...