Carbon dioxide scatters infra-red radiation over a narrow range of energies. Infra-red photons that would otherwise carry energy away from the planet are scattered back into the lower troposphere, and the retained energy should cause an increase in temperature.
A photon with a wavenumber between about 550 and 700 per cm which has been absorbed by a carbon dioxide molecule is almost instantaneously re-emitted in a different direction with a slight loss of energy. The carbon dioxide molecule gains a very small amount of energy in the process, insufficient to raise its temperature significantly. Any observable heating therefore arises from infra-red radiation being scattered back towards the surface rather than escaping into space.
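The band limits quoted above can be translated into more familiar units with E = hcν. A quick sketch (the physical constants and the 667 per cm band centre of the CO2 bending mode are standard values, not taken from the text):

```python
# Convert the 550-700 per cm band into wavelengths and photon energies,
# using E = h*c*nu with nu expressed in m^-1.
H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
EV = 1.602176634e-19 # joules per electron-volt

def band_properties(wavenumber_cm):
    nu_m = wavenumber_cm * 100.0          # per cm -> per m
    wavelength_um = 1e4 / wavenumber_cm   # micrometres
    energy_ev = H * C * nu_m / EV
    return wavelength_um, energy_ev

for wn in (550, 667, 700):   # 667 per cm is the centre of the CO2 bending band
    lam, ev = band_properties(wn)
    print(f"{wn} per cm -> {lam:.1f} um, {ev:.3f} eV")
```

The band thus corresponds to wavelengths of roughly 14 to 18 micrometres, and photon energies well under a tenth of an electron-volt, consistent with the very small energy transfer described above.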
At about 5 km above sea level, we lose as much radiant heat to space as is required to keep the planet in thermal balance. If we look back at Earth from space, we can get a good idea of the impact of carbon dioxide:
The blue line shows the spectrum of a black body radiating at 280 K, which is close to what would be seen if there were no absorbers in the atmosphere. The red line shows the actual spectrum. The difference between the two is the energy scattered by various infra-red-active species. Water vapour has an effect across the entire spectrum - it is the major greenhouse gas. Carbon dioxide adds to the scattering between about 550 and 700 per cm, and ozone (O3) plays a part between 950 and 1100 per cm.
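The blue curve is simply Planck's law. A minimal sketch that locates the peak of a 280 K black-body spectrum on a wavenumber axis (the grid and step size are arbitrary choices for illustration); note that the peak falls close to the carbon dioxide band:

```python
import math

def planck_wavenumber(wn_cm, temp_k):
    """Spectral radiance of a black body at temperature temp_k,
    as a function of wavenumber in per cm (units W m^-2 sr^-1 per m^-1)."""
    h, c, k = 6.62607015e-34, 2.99792458e8, 1.380649e-23
    nu = wn_cm * 100.0   # per cm -> per m
    return 2 * h * c**2 * nu**3 / math.expm1(h * c * nu / (k * temp_k))

T = 280.0
# Find the peak of the 280 K curve on a coarse wavenumber grid
grid = range(100, 2001, 10)
peak = max(grid, key=lambda w: planck_wavenumber(w, T))
print(f"Peak of the {T:.0f} K curve: ~{peak} per cm")
```

The peak lands near 550 per cm, right at the edge of the 550-700 per cm carbon dioxide band, which is why that band matters for the outgoing spectrum.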
One of the features of the carbon dioxide absorption is that the effect grows only logarithmically with concentration: each doubling adds the same increment, so doubling the concentration will not double the effect. At present there is ~400 ppm in the atmosphere. We are unlikely to see a much different world at 800 ppm. It will be greener - plants grow better on a richer diet - and it may be slightly warmer and slightly wetter, but otherwise it would look very like our present world.
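The logarithmic dependence can be made concrete using the widely quoted approximation dF = 5.35 ln(C/C0) W per square metre (Myhre et al., 1998). This formula is not from the text above; it is used here only to illustrate that each doubling adds the same increment:

```python
import math

def co2_forcing(c_new_ppm, c_ref_ppm):
    """Change in radiative forcing, W/m^2, from the logarithmic
    approximation dF = 5.35 * ln(C/C0). Illustrative only."""
    return 5.35 * math.log(c_new_ppm / c_ref_ppm)

# Each doubling contributes the same increment:
print(co2_forcing(800, 400))   # ~3.7 W/m^2
print(co2_forcing(400, 200))   # ~3.7 W/m^2, the same again
print(co2_forcing(800, 200))   # ~7.4 W/m^2 (two doublings)
```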
Moreover, because the relationship is logarithmic, each added part per million had a larger effect at the lower concentrations of the past than it has today. If there are to be any observable effects, we should have seen them already. Have we?
There are "official" historical global temperature records. A recent version, from the Hadley Climate Research Unit, is:
The vertical axis gives what is known as the "temperature anomaly", the change from the average temperature over the period 1950-1980. Recall that carbon dioxide only became significant after 1950, so we can look at this figure with that fact in mind:
* from 1870 to 1910, temperatures dropped; carbon dioxide was not yet significant
* from 1910 to 1950, temperatures rose; carbon dioxide was still not significant
* from 1950 to 1975, temperatures dropped while carbon dioxide increased
* from 1975 to 2000, both temperature and carbon dioxide increased
* from 2000 to 2015, temperatures were flat while carbon dioxide increased strongly
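The "temperature anomaly" on the vertical axis is nothing more than each value minus the mean over the reference period. A sketch with invented numbers (the years and temperatures below are illustrative, not real data):

```python
def anomaly(series, baseline, years):
    """Each value minus the mean over the baseline period, which is how
    anomaly series such as the Hadley record are constructed."""
    base = [t for y, t in zip(years, series) if baseline[0] <= y <= baseline[1]]
    ref = sum(base) / len(base)
    return [round(t - ref, 2) for t in series]

years = [1948, 1949, 1950, 1951, 1952]   # illustrative data only
temps = [14.1, 14.0, 14.2, 14.3, 14.1]
print(anomaly(temps, (1950, 1980), years))
```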
Does carbon dioxide drive temperature changes? Looking at this evidence, one would have to say that, if there is any relationship, it must be a very weak one. In one study I undertook, I found that there was a 95% chance that, over a period of a century, the temperature would change naturally by as much as +/-2 degrees C. During the 20th century, it changed by about 0.8 degrees C. The conclusion? If carbon dioxide in the atmosphere does indeed cause global warming, then the signal has yet to emerge from the natural noise.
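The point about natural noise can be illustrated with a toy random walk. The step size below is an assumed parameter chosen purely for illustration; this is not a reproduction of the study mentioned above:

```python
import random
import statistics

random.seed(42)

def century_change(step_sd, n_years=100):
    """Net temperature change over a century if each year adds an
    independent random step (a pure random-walk toy model; the step
    size is an assumed parameter, not a measured one)."""
    return sum(random.gauss(0.0, step_sd) for _ in range(n_years))

changes = [century_change(step_sd=0.1) for _ in range(10_000)]
sd = statistics.pstdev(changes)
print(f"Std. dev. of century change: {sd:.2f} degrees C")  # expect ~0.1*sqrt(100) = 1.0
print(f"95% range: +/-{1.96 * sd:.2f} degrees C")
```

Even with a modest 0.1 degrees C random step per year, unforced drift over a century spans roughly +/-2 degrees C at the 95% level, which shows how a 0.8 degrees C change could sit inside the noise.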
One of the problems with the "official" temperature records such as the Hadley series shown above is that they have been the subject of “adjustments”. While some adjustment of the raw data is obviously needed, such as that for the altitude of the measuring site, the pattern of adjustments has been such as to cool the past and warm the present, making global warming seem more serious than the raw data warrants.
It may seem unreasonable to refer to the official data as “adjusted”. Its basis is what is known as the Global Historical Climatology Network, or GHCN - but adjusted it has been. For example, it is possible to compare the raw data for Cape Town, 1880-2011, with the adjustments made to the data in developing Version 3 of the GHCN series:
The Goddard Institute for Space Studies maintains one of the temperature series built on the GHCN. The Institute was approached for the metadata underlying the adjustments. It provided a single line of data, giving the station’s geographical co-ordinates and height above mean sea-level, and a short string of meaningless characters including the word “COOL”. The basis for these adjustments is therefore unknown, and the fact that about 40 successive years of data were “adjusted” by exactly 1.10 degrees C strongly suggests fingers rather than algorithms were involved.
One can only conclude that there has been so much tampering with the "official" records of global warming that they have no credibility at all. That is not to say that the Earth has not warmed over the last couple of centuries. Glaciers have retreated, snow-lines risen. There has been warming, but we do not know by how much.
Interestingly, the observed temperatures are not unique. For instance, the melting of ice on Alpine passes in Europe has revealed paths that were in regular use a thousand years and more ago. They were then covered by ice which has only melted recently. The detritus cast away beside the paths by those ancient travellers is providing a rich vein of archaeological material.
So the world was at least as warm a millennium ago as it is today. It has warmed over the past few hundred years, but the warming is primarily natural in origin, and has nothing to do with human activities. We do not even have a firm idea as to whether there is any impact of human activities at all, and certainly cannot say whether any of the observed warming has an anthropogenic origin. The physics says we should have some effect, but we cannot yet distinguish it from the natural variation.
Those who seek to accuse us of carbon crime have therefore developed another tool - the general circulation model. This is a computer representation of the atmosphere, which calculates the conditions inside each 5 km x 5 km x 1 km slice and links it to each adjacent slice (if you have a big enough computer - otherwise your slices have to be bigger).
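The linked-slice structure can be sketched in one dimension with simple diffusion between neighbouring cells. Real general circulation models solve the full fluid dynamics; this toy only shows how each cell is coupled to its neighbours:

```python
# A minimal sketch of the "linked slices" idea: each cell exchanges heat
# with its neighbours by simple diffusion, with insulated ends.
def step(grid, alpha=0.1):
    n = len(grid)
    new = grid[:]
    for i in range(n):
        left = grid[i - 1] if i > 0 else grid[i]
        right = grid[i + 1] if i < n - 1 else grid[i]
        new[i] = grid[i] + alpha * (left + right - 2 * grid[i])
    return new

cells = [0.0, 0.0, 10.0, 0.0, 0.0]   # a 1-D row of cells; one warm cell
for _ in range(50):
    cells = step(cells)
print([round(c, 2) for c in cells])  # heat spreads out toward a uniform 2.0
```

Total heat is conserved while the initial spike smooths out; a full model does the same kind of neighbour-to-neighbour bookkeeping, but for momentum, moisture and radiation as well as heat, in three dimensions.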
The modellers typically start their calculations some years back, for which there is a known climate, and try to see whether they can predict the (known) climate from that starting point up to today. There are many adjustable parameters in the models, and by twiddling enough of these digital knobs, they can "tune" the model to history.
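Tuning to history is, at bottom, parameter fitting. A one-knob caricature with invented data (the "history" and the model form are made up for illustration):

```python
# Choose a free parameter so a toy model best matches a known record.
# A good fit to the past says nothing, by itself, about skill in the future.
history = [0.00, 0.05, 0.02, 0.10, 0.15, 0.12, 0.20]   # made-up anomalies

def model(trend, n):
    """A one-parameter model: a straight trend line."""
    return [trend * i for i in range(n)]

def sse(trend):
    """Sum of squared errors between model and history."""
    return sum((m - h) ** 2 for m, h in zip(model(trend, len(history)), history))

# A crude grid search over the single tuning knob
best = min((t / 1000 for t in range(0, 101)), key=sse)
print(f"best-fit trend: {best:.3f} per step, SSE = {sse(best):.4f}")
```

Real models have dozens of such knobs (cloud parameters, aerosol effects, mixing rates), which is what makes a close match to history so easy to achieve and so weak as evidence.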
Once the model seems to be able to reproduce historical data well enough, it is let rip on the future. There is a hope that, while the models may not be perfect, if different people run different tunings at different times, a reasonable range of predictions will emerge, from which some idea of the future may be gained.
Unfortunately those hopes have been dashed too often. The El Nino phenomenon is well understood and has a significant impact on the global climate, yet none of the models can cope with it. Similarly, the models cannot do hurricanes/typhoons - the 5 km x 5 km scale is just too coarse. They cannot do local climates: a test of two areas only 5 km apart, one of which receives at least 2 000 mm of rain annually while the other averages just on 250 mm, failed badly. There were good wind and temperature data, and the local topography was known. The problem was modelled with a very fine grid, but there were not enough tuning knobs to be able to match history.
Even the basic physics used in these models fails. That physics predicts that, between the two Tropics, the upper atmosphere should warm faster than the surface. We regularly fly weather balloons carrying thermometers into this region. There are three separate data sets, and they agree that there is no sign of extra warming:
The average of the three sets is given by the black squares. The altitude is given in terms of pressure: 100 000 Pa at ground level and 20 000 Pa at about 9 km above the surface. There are 22 different models, and their average is shown by the black line. At ground level, measurement shows warming of 0.1 degrees C per decade, but the models predict 0.2 degrees C per decade. At 9 km, measurement still shows close to 0.1 degrees C per decade, but the models show an average of 0.4 degrees C per decade, with extreme values as high as 0.6 degrees C. Models that are wrong by a factor of 4 or more cannot be considered scientific. They should not even be accepted for publication - they are wrong.
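The factor-of-four claim is simple arithmetic on the rates just quoted:

```python
# Compare the warming rates quoted in the text, in degrees C per decade.
observed = {"surface": 0.1, "9 km": 0.1}   # balloon measurements
modelled = {"surface": 0.2, "9 km": 0.4}   # model averages quoted above

for level in observed:
    ratio = modelled[level] / observed[level]
    print(f"{level}: model/observation = {ratio:.1f}x")
```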
The hypothesis that we can predict future climate on the basis of models that are already known to fail is false. International agreements to control future temperature rises to X degrees C above pre-industrial global averages have more to do with the clothing of emperors than reality.
So the third step in our understanding of the climate boondoggle can only conclude that yes, the world is warming, but by how much and why, we really haven't a clue.