Although there is no reason known to physics why radiation should itself radiate through its interaction with the vacuum of space, there are some grounds for speculation. Bear in mind that we are dealing with energy losses much less than the Planck constant times the speed of light per metre of travel. If the universe were eternally stable, then energy would have to be conserved. It can be shown that the power output of the entire observable universe is around 10⁴⁷ W, taking the average stellar power output, the number of stars per galaxy and the total number of galaxies. The power of the microwave background radiation arriving at, and passing backwards and forwards through, a sphere with a radius around the size of the observable universe (radius 1.26×10²⁶ m), calculated using

P = σ A T⁴

is around the same value.
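As a rough sanity check on those two figures, here is a short Python sketch of the order-of-magnitude arithmetic. The stellar-side inputs (average stellar luminosity, stars per galaxy, number of galaxies) are illustrative round numbers, not figures taken from the text; the background-radiation side uses the Stefan-Boltzmann constant, the 2.725 K background temperature and the 1.26×10²⁶ m radius quoted above.

import math

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
T_CMB = 2.725      # microwave background temperature, K
R = 1.26e26        # radius of the observable universe used above, m

# Black-body power passing through a sphere of that radius at the background temperature
area = 4 * math.pi * R**2
p_cmb = SIGMA * area * T_CMB**4

# Stellar side: illustrative round numbers only (assumptions, not taken from the text)
avg_star_luminosity = 1e26   # W, of order the average over the whole stellar population
stars_per_galaxy = 1e11
n_galaxies = 1e11
p_stars = avg_star_luminosity * stars_per_galaxy * n_galaxies

print(f"Background-radiation side: {p_cmb:.1e} W")    # roughly 6e47 W
print(f"Stellar side:              {p_stars:.1e} W")   # roughly 1e48 W

Both values land within an order of magnitude of 10⁴⁷ W, which is all the comparison above requires.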
One way of looking at it would be to say that, given enough distance of travel, all electromagnetic radiation will eventually be converted to background radiation.
Another way is to say that the temperature of the background radiation is the temperature at which the radiation in the universe would radiate if it could do so. The energy that produces the background radiation would be the energy lost in producing the redshift. Since this would apply to all radiation travelling throughout the universe in all directions, it would be no surprise to find that the radiation was isotropic, apart from a slight variation reflecting the lumpiness of the existing universe.
Now we have our alternative explanation of Olbers' Paradox (if there is a star in every line of sight, why is the night sky dark?). With less than 10⁻⁵ W of background radiation and starlight combined passing through any one square metre of intergalactic space, just how bright do you expect it to be?
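As a minimal check on that figure, the same Stefan-Boltzmann constant gives the one-sided flux from a black body at the background temperature:

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
T_CMB = 2.725      # microwave background temperature, K

# One-sided black-body flux at the background temperature: sigma * T^4
cmb_flux = SIGMA * T_CMB**4
print(f"Background flux through one square metre: {cmb_flux:.1e} W")   # roughly 3e-6 W

Even doubled for radiation crossing in both directions, and with starlight of a similar order added, that stays below the 10⁻⁵ W figure quoted above.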
In a big bang universe we would, by coincidence, just happen to be living at the time when this applies.