SW2: define or const?

You probably know that to declare a value which doesn’t change during the execution of a program you can use #define or const.

The questions are: what are the differences between them? Should I use #define or const?

Let’s see how things should work in theory.

Arduino boards have a small amount of RAM: the UNO has just 2KB. For complex programs it can be necessary to optimize RAM usage.

#define

#define is actually a preprocessor directive which defines a macro: before the code is compiled, the preprocessor takes every occurrence in the code of the name we wrote and replaces it with its value.

Let’s take the code used in the Breadboard article and change the first line of code. The result will be something like the following.
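(I don’t have the exact Breadboard sketch at hand, so this is a minimal blink sketch along those lines; pin 9 and the one-second delays are assumptions.)

```cpp
#define LED 9  // the preprocessor will replace every LED below with 9

void setup() {
  pinMode(LED, OUTPUT);   // configure the LED pin as an output
}

void loop() {
  digitalWrite(LED, HIGH);  // LED on
  delay(1000);
  digitalWrite(LED, LOW);   // LED off
  delay(1000);
}
```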

Once the preprocessor has run, just before the actual compilation, the code effectively looks like this (continuing the assumed sketch above):
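```cpp
void setup() {
  pinMode(9, OUTPUT);
}

void loop() {
  digitalWrite(9, HIGH);
  delay(1000);
  digitalWrite(9, LOW);
  delay(1000);
}
```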

All the occurrences of the word LED have been replaced with the number 9.

Because the value is now passed to the functions as a literal, no RAM is used: there is no need to store the value 9 anywhere and retrieve it when we need it.

const

By using const we declare a regular variable, with the only difference that we promise “this will not change”. Because it is still a variable, in theory its value has to be stored in RAM, and every time the program finds that variable’s name in the code it has to go to RAM to fetch the value.
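Here is the same sketch rewritten with const (again my own minimal version, not the article’s original; int is used for familiarity, though a byte would be enough for a pin number):

```cpp
const int LED = 9;  // a read-only variable: the compiler rejects any attempt to modify it

void setup() {
  pinMode(LED, OUTPUT);
}

void loop() {
  digitalWrite(LED, HIGH);
  delay(1000);
  digitalWrite(LED, LOW);
  delay(1000);
}
```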

From theory to practice

That was the theory, but what really happens?

It turns out that compilers are smart enough to know that a constant is not really a variable: it will never change, so its final value is already known at compile time. Because of that, compilers treat const variables exactly as they treat #define. The value is substituted for every occurrence of the variable’s name, and no RAM is used for it at runtime.
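You can check this yourself: compile both versions of the sketch and compare the memory summary the Arduino IDE prints at the end of the build. If the compiler does its job, both should report the same figure, something like the following (the exact numbers depend on your sketch and board):

```
Global variables use 9 bytes (0%) of dynamic memory, leaving 2039 bytes for local variables. Maximum is 2048 bytes.
```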

So? What do we do? Do we use #define or const?

From Arduino’s point of view there is definitely no difference; it would say “Do as you want, to me it is the same”.

From the programmer’s point of view, however, there is a difference. Maybe you noticed that the const declaration needs a ; (semicolon) at the end of the line, while #define doesn’t.

Why does this make a difference? Because if by mistake we put a semicolon at the end of a #define declaration, the preprocessor will say nothing: to it the semicolon is just part of the replacement text. The complaints only show up later, on the lines where the macro is used, far away from the actual mistake.

How is that possible?

Take another look at the first code in this article and suppose you put a ; at the end of the first line, so that you have #define LED 9; . The macro will replace every occurrence of the word LED with 9; (semicolon included), which breaks any expression LED appears in.
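To make it concrete, here is what the preprocessor would produce from the setup() of the assumed sketch above:

```cpp
#define LED 9;  // note the stray semicolon

void setup() {
  pinMode(LED, OUTPUT);  // expands to pinMode(9;, OUTPUT); -- a syntax error
}
```

The compiler rejects the expanded pinMode(9;, OUTPUT); line, but the error message points at setup(), not at the #define, which is what makes this kind of mistake annoying to track down.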

In conclusion, if you are wondering which one to use, I’d suggest const, but that’s just a suggestion.