The notion that you must define something before you can use it is deeply ingrained in the brains of most science folk! If you program in C or Fortran you will know what problems you can encounter if you use undefined things. This creates a culture in which many people start their work with a whole bunch of assignments to variables. This is unfortunate for at least two reasons:
Leaving things undefined can produce many interesting results. For example, with an undefined function, f (say), you can obtain a wealth of general results, such as:
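As an illustrative sketch (the particular inputs here are my own choices, not prescribed ones), Mathematica happily does calculus on a function about which it knows nothing:

```mathematica
(* f carries no definition; Mathematica states the general rules *)
D[f[x]^2, x]                 (* 2 f[x] f'[x] -- the chain rule in the abstract *)
Integrate[f[x] f'[x], x]     (* f[x]^2/2 *)
Series[f[x], {x, 0, 2}]      (* f[0] + f'[0] x + (f''[0]/2) x^2 + O[x]^3 *)
```

Each output is an identity valid for any sufficiently differentiable f, which is precisely the point: no definition, no loss of generality.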
These results are, of course, true for an arbitrary, repeatedly differentiable function f.
Undefined functions and variables also help you explore the behaviour of many Mathematica operations. For example:
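For instance (these particular inputs are illustrative choices of mine), undefined symbols make the purely structural behaviour of an operation visible, because there are no values to collapse the result:

```mathematica
(* With a, b, g, h, x, y, z all undefined, structure is laid bare *)
Expand[(a + b)^3]            (* a^3 + 3 a^2 b + 3 a b^2 + b^3 *)
Thread[g[{x, y, z}]]         (* {g[x], g[y], g[z]} *)
Outer[h, {x, y}, {a, b}]     (* {{h[x, a], h[x, b]}, {h[y, a], h[y, b]}} *)
```

Had these symbols held numeric values, the outputs would be opaque numbers and you would learn nothing about what Thread or Outer actually do.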
Often it is better to work with expressions with undefined variables and functions and use /. to replace them with numeric values only when required. What is the point of going to the trouble to discover the notation for Planck's constant, only to define it away:
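A sketch of the contrast, using a hypothetical photon-energy calculation of my own (the symbol names h, c and lambda are illustrative, not special to Mathematica):

```mathematica
(* The anti-pattern: defining the constant away at the outset *)
h = 6.62607015*^-34;         (* Planck's constant in J s -- now h is just a number *)

(* Better: keep everything symbolic ... *)
Clear[h];
energy = h c / lambda;       (* an exact, readable formula *)

(* ... and substitute numeric values only when a number is required *)
energy /. {h -> 6.62607015*^-34, c -> 2.998*^8, lambda -> 500.*^-9}
```

With the /. approach the formula survives intact for further manipulation, and the substitution is a deliberate, local act rather than a global commitment.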
Any output from Mathematica involving this constant will be reduced to a meaningless number. Think of the way you would probably perform a hand calculation.
This brings me to the second big problem with an over-use of definitions. Variable definitions have a habit of hanging around while you use Mathematica on an unrelated calculation. This can be extremely dangerous (the type of danger obviously depends on your area of application!). Suppose you have inadvertently set x=10 as part of some calculation and you then move on to something else. The chances are high that you will use the variable x again, but it will be immediately replaced by its 'value': 10.
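A sketch of how this goes wrong in practice (the specific inputs are my own illustration):

```mathematica
x = 10;                      (* set inadvertently during an earlier calculation *)

(* ... much later, in an unrelated calculation ... *)
Solve[x^2 == 4, x]           (* fails: Solve complains that 10 is not a valid variable,
                                because x was replaced by its value before Solve saw it *)

Clear[x]                     (* the cure: remove the stale definition *)
Solve[x^2 == 4, x]           (* {{x -> -2}, {x -> 2}} *)
```

Clear["Global`*"] wipes all your own definitions at once, which is a reasonable hygiene step between unrelated calculations.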