File and script loading: no caching by default?

I'm working on a Laravel/Lumen-based REST service.
For some of our “sensitive” system variables, we reference a “server-config.php” file containing various const key/value pairs configured on the target server for this purpose. Its location is controlled via an environment variable, and it is loaded in a dependent PHP source file through an include_once statement.
We find this particularly useful since it isn't hardwired into the server and can be edited without needing to restart the host server.
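To make that concrete, here's a minimal sketch of the pattern. The env variable name SERVER_CONFIG_PATH and the constant foobar are placeholders, not our actual names:

```php
<?php
// server-config.php, kept on the target server outside the codebase
// (its location comes from an environment variable):
const foobar = true;
```

```php
<?php
// In a dependent PHP source file:
include_once getenv('SERVER_CONFIG_PATH') . '/server-config.php';

if (foobar) {
    // ... behavior gated on the config value
}
```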
But here is my question. I'm somewhat surprised it works this way; I would have thought this information would get cached so that subsequent calls wouldn't re-read the const definitions?
To clarify: I can have a const foobar = true in this file, and the dependency that uses it can read and use it just fine. So a call that includes the PHP module is made, and the functionality works as expected.
Now if I go into this configuration file, comment out that constant, and rerun the same call, it inevitably crashes because the constant is no longer available.
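For example, with the constant commented out, the very next request fails along these lines (my reproduction, assuming PHP 8; the names are the same placeholders as above):

```php
<?php
include_once getenv('SERVER_CONFIG_PATH') . '/server-config.php';

// With `const foobar = true;` commented out in server-config.php,
// this reference throws on PHP 8: Error: Undefined constant "foobar".
// (PHP 7.x only raises a warning and treats the bare word as a string.)
if (foobar) {
    // never reached once the constant is gone
}
```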
This implies that the PHP files get reloaded on each call? I'm guessing this won't be the case in an optimized deployment, where caching strategies can and should be applied?
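To illustrate what I mean by “caching strategies”, I assume this is the territory of OPcache settings like the following (illustrative values, not taken from our deployment):

```ini
; php.ini
opcache.enable=1
; 0 = never re-check file timestamps: edits to server-config.php would
; not be seen until OPcache is reset or the PHP process is reloaded
opcache.validate_timestamps=0
; if validate_timestamps=1, re-check each file at most every N seconds
opcache.revalidate_freq=60
```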