Why is the input voltage shown in the sample diagram smaller than the maximum allowed voltage?
Think of it this way.
You have 5V in, and you are going to get 1.8V out. What happens to the other 3.2V? That 3.2V is ‘lost’ in the regulator.
So assume that your regulator is supplying its rated 1 Amp of output… in a linear regulator, that same 1A of load current flows through the regulator itself. 1A times 3.2V equals 3.2 Watts, all of which is dissipated as heat.
In the example they use a smaller input voltage because the voltage drop across the regulator is correspondingly smaller. Suppose you have an input voltage of 3V and an output of 1.8V. The regulator then only ‘drops’ 1.2V, which multiplied by 1A comes to 1.2 Watts… much less heat than with the higher input voltage.
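To make the arithmetic concrete, here is a minimal Python sketch of that calculation for both cases (the function name is just for illustration):

```python
def linear_dissipation(v_in, v_out, i_out):
    """Heat in a linear regulator: the full load current flows
    through the pass element, so P = (Vin - Vout) * Iout."""
    return (v_in - v_out) * i_out

# 5V in, 1.8V out, 1A load
print(f"{linear_dissipation(5.0, 1.8, 1.0):.1f} W")  # 3.2 W
# 3V in, 1.8V out, 1A load
print(f"{linear_dissipation(3.0, 1.8, 1.0):.1f} W")  # 1.2 W
```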
So the greater the difference between your input and output voltage, the hotter the regulator will run.
At low current the heat may be negligible, but at rated current and a high input voltage there will be enough of it to require a substantial heat sink to carry away all the heat generated inside the regulator.
At high currents a ‘switching regulator’ is advised, since it wastes far less power as heat. The design, while not too complicated, does require more parts.
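For a rough comparison, a switching converter’s heat depends on its efficiency rather than directly on the input-to-output voltage difference. A sketch, assuming a typical 90% efficiency (that figure is an assumption, not from any particular data sheet):

```python
def switcher_dissipation(v_out, i_out, efficiency=0.90):
    """Heat in a switching converter: P_loss = P_out * (1/eta - 1).
    The 90% efficiency is an assumed typical value, not a datasheet number."""
    p_out = v_out * i_out
    return p_out * (1.0 / efficiency - 1.0)

# 1.8V at 1A: a linear regulator fed from 5V wastes 3.2 W,
# while a ~90%-efficient switcher wastes only about 0.2 W.
print(f"{switcher_dissipation(1.8, 1.0):.2f} W")  # 0.20 W
```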
The data sheet has charts showing how hot the regulator will get for a given input voltage and current, along with a maximum rated temperature at which you can operate it. If your design would push the regulator past that limit, you need extra cooling capability in the form of a heat sink. Other factors will also influence the regulator's temperature, such as air circulation and other hot components nearby, so you should give yourself a decent margin of safety.
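As an illustration of what those charts boil down to: the junction temperature is roughly the ambient temperature plus the dissipation times the package’s thermal resistance. The θJA and temperature-limit values below are hypothetical placeholders; take the real numbers from your regulator’s data sheet:

```python
def junction_temp(p_watts, t_ambient_c, theta_ja_c_per_w):
    """Estimate junction temperature: Tj = Ta + P * theta_JA."""
    return t_ambient_c + p_watts * theta_ja_c_per_w

# Hypothetical values: 3.2 W dissipation, 25 C ambient, and
# 50 C/W for a bare TO-220 package with no heat sink.
T_MAX = 125.0  # assumed maximum junction temperature, check your data sheet
tj = junction_temp(3.2, 25.0, 50.0)
print(f"Tj = {tj:.0f} C, heat sink needed: {tj > T_MAX}")
# Tj = 185 C, heat sink needed: True
```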