I have spent quite a bit of time researching this, so I will let somebody better qualified say it:
But none of this matters since by convention pH values in brewing are reported as the pH of a room temperature sample. This arises from the laboratory practice of cooling pH samples before testing. While pH meters can correct for temperature and their probes may be able to withstand higher sample temperatures, testing only cooled samples extends the life of the probe. This common practice also means that reported pH optima and pH ranges are for room temperature samples even though the actual reaction happens at higher temperatures.
The only bit of this I would clarify is the claim that "pH meters can correct for temperature". There is a common misconception about what pH meters with Automatic Temperature Compensation (ATC) actually do.
There are two things that variation in temperature causes:
1) The pH of the sample being measured will itself vary. This is the oft-quoted ~0.3 pH difference between a warm and a cool sample.
2) The electrical response of the meter/probe itself will vary (the electrode's Nernst slope changes with temperature).
pH meters with ATC provide correction for the second of these things only; they do not, and cannot, correct for (1). The sketch below illustrates the difference.
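To make that concrete, here is a minimal Python sketch (my own illustration, not anything out of a real meter's firmware) of the slope correction ATC performs. The electrode's voltage per pH unit follows the Nernst slope, 2.303·RT/F, which grows with temperature; ATC rescales the reading using the slope at the probe's actual temperature, but it has no way of knowing that the sample's own chemistry has shifted:

```python
# Illustration only: how ATC rescales the electrode slope with temperature.
# Assumes an ideal glass electrode with its isopotential point at pH 7.
R = 8.314      # gas constant, J/(mol*K)
F = 96485.0    # Faraday constant, C/mol

def nernst_slope_mv(temp_c):
    """Theoretical electrode slope in mV per pH unit at temp_c."""
    return 2.303 * R * (temp_c + 273.15) / F * 1000.0

def atc_reading(e_mv, probe_temp_c):
    """What an ATC meter reports: the measured voltage divided by
    the slope at the probe's measured temperature."""
    return 7.0 - e_mv / nernst_slope_mv(probe_temp_c)

print(round(nernst_slope_mv(20), 1))   # ~58.2 mV/pH at room temperature
print(round(nernst_slope_mv(65), 1))   # ~67.1 mV/pH at mash temperature

# A sample whose true pH is 5.4 *at 65 C* produces this electrode voltage:
e = (7.0 - 5.4) * nernst_slope_mv(65)
print(atc_reading(e, 65))              # 5.4 -- ATC gets the electrical part right

# What ATC cannot do: 5.4 at 65 C is not the same liquid's pH at 20 C.
# That chemical shift (the ~0.3 pH in point 1) is invisible to the meter.
```

So a meter with ATC reading a hot mash sample reports the pH of the wort at mash temperature, correctly; it does not translate that into the room-temperature figure that brewing convention reports.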
At the end of the day it's easy to get hung up on pH and what the measurement temperature should be. All I would say is that as long as an individual's day-to-day brew process and measurement regime are the same across all brews, then whatever you measure will be meaningful in terms of identifying brew-to-brew variances.
Once your mash is underway, there is very little you can do to effect a change in the pH. By the time you've taken your sample (maybe 10 minutes into the mash), cooled that sample to 20 degrees, measured the pH, and decided what "other" additions you would like to make to the mash to try to adjust the pH, it's actually too late to make any meaningful difference; a lot of the conversion will already be complete. Far better that you use the reading to plan adjustments for the next brew, rather than trying to adjust this brew "on the fly".
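If it helps, here's a trivial sketch of that "fix it next brew" habit: log the cooled-sample reading alongside the change you'd make, and apply it next time. The field names and the example entry are entirely my own invention:

```python
# A minimal brew-log entry: capture what you measured and what you would
# change, so the correction is applied to the NEXT brew, not this one.
brew_log = []

def record_mash_ph(batch, ph_at_20c, minutes_into_mash, planned_change):
    """Log the cooled-sample pH reading and the adjustment planned
    for the next brew of this recipe."""
    brew_log.append({
        "batch": batch,
        "ph_at_20c": ph_at_20c,                  # always the cooled-sample figure
        "minutes_into_mash": minutes_into_mash,
        "planned_change": planned_change,
    })

# Hypothetical entry -- the numbers are made up for illustration:
record_mash_ph("pale_ale_012", 5.62, 10,
               "small lactic acid addition next brew; aiming for ~5.4")
```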