In many programs a #define serves the same purpose as a constant. For example:

    #define FIELD_WIDTH 10
    const int fieldWidth = 10;
I commonly see the first form preferred over the second, relying on the preprocessor to handle what is basically an application decision. Is there a reason for this tradition?
There is a very solid reason for this: const in C does not mean something is constant; it just means a variable is read-only.

In places where the compiler requires a true constant expression (such as the size of a non-VLA array), using a const variable such as fieldWidth is simply not possible.