Environmental conditions are not supposed to affect nuclear reactions in general or radioactive decay rates in particular. This is one reason why cold fusion got the cold shoulder from most physicists. Now for the "however" that is the hallmark of Science Frontiers:
"Thirty years ago, Otto Reifenschweiler was searching for a compound which could protect Geiger-Mueller tubes from damage when they are first ionised. He found the compound, which became a money-spinner for Philips, in a mixture of titanium and radioactive tritium. He also discovered that as the mixture was heated, its radioactivity declined sharply. No process known to physics could account for such a baffling phenomenon: radioactivity should be unaffected by heat. Nevertheless, as the temperature increased from 115°C to 160°C, the emission of beta particles fell by 28%."
Reifenschweiler and his colleague, H. Casimir, put this discovery on the back burner and concentrated on the Geiger-Mueller tubes. The recent furor over cold fusion impelled them to resurrect the work and publish it in the January 3 issue of Physics Letters A. Is there a new phenomenon here? Is it relevant to cold fusion? It may be pertinent that some common fusion reactions also employ tritium.
(Bown, William; "Ancient Experiment Turns Heat Up on Cold Fusion," New Scientist, p. 16, January 8, 1994.)
Comment. How many other potential anomalies simmer neglected on back burners while their discoverers focus on more acceptable and profitable pursuits?