The uncertainty involved in air quality modelling data, which is based on calculated estimates, can give the public a less accurate impression of air pollution levels, according to an air quality expert.
Speaking on behalf of the Department for Environment, Food and Rural Affairs (Defra) yesterday (March 13), Brian Stacey, technical lead for air quality monitoring at consultancy Ricardo-AEA, suggested that modelling data “can cause problems with public understanding”.
Mr Stacey’s comments came during his presentation ‘EU and UK Air Quality Policy and Practice’ at the Air Quality and Emissions (AQE) Show 2013 in Telford, Shropshire. The two-day conference featured a number of speakers from across industry and government, as well as workshops and an exhibition for air monitoring firms.
Whereas measured air quality is based on actual readings of pollution at monitoring sites, modelling uses various mathematical and scientific methods to calculate concentrations of pollutants.
Modelling is used, for instance, to estimate pollution levels from specific sources or in areas that have no monitoring sites, but the uncertainty in its output can be as high as 25%, according to Mr Stacey. He said the uncertainty in measured air quality data was closer to 5 or 10%.
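To see what those percentages mean in practice, a simple illustrative sketch (not any official Defra or Ricardo-AEA method) shows how a ±25% modelling uncertainty widens the range of plausible concentrations compared with the ±5–10% Mr Stacey quotes for measured data. The 40 µg/m³ example value is the EU annual mean limit for nitrogen dioxide; the function name is purely illustrative.

```python
def plausible_range(concentration, uncertainty_fraction):
    """Return the (low, high) bounds implied by a relative uncertainty."""
    delta = concentration * uncertainty_fraction
    return concentration - delta, concentration + delta

no2 = 40.0  # µg/m3, the EU annual mean limit value for nitrogen dioxide

print(plausible_range(no2, 0.25))  # modelled, +/-25%: (30.0, 50.0)
print(plausible_range(no2, 0.10))  # measured, +/-10%: (36.0, 44.0)
```

In other words, a modelled figure of 40 µg/m³ could plausibly sit anywhere between 30 and 50 µg/m³, which is the gap between comfortably meeting the limit and clearly breaching it.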
Mr Stacey said: “A lot of the problem is how the public perceives the monitoring data. There can be up to a 25% uncertainty limit with modelling, which can cause problems with public understanding.”
The UK supplements its measured data with modelling to get a picture of air pollution levels across the country, but other countries such as Germany and France use only measured data (see airqualitynews.com story).
He continued: “I think the system we use in this country is very helpful and the data we collect is very good and it feeds into the process well. But the challenge is always how do we present that to the public? If you use the modelling to say nitrogen dioxide is at a certain concentration somewhere, the public will read that and take it as an absolute, whereas there is always going to be a greater level of uncertainty with modelling. So more work needs to be done on how we communicate that to the public.”
However, Professor Frank Kelly of King’s College London, who also spoke at the show, was less convinced that the uncertainties in modelling data presented a problem for public communication.
Commenting later, he said: “I wouldn’t worry about that because the concentrations could actually be higher than the modelling data suggests. It isn’t something we can necessarily do anything about at the moment, so I think you have to accept it.”
In his presentation, Mr Stacey also suggested that particulate matter would become a more prominent issue for air quality in the coming decade, and that he thought particles would “push policy over the next 10 years”.
However, Mr Stacey – who contributes to a number of European air quality working groups and to the European Commission’s committee for the National Air Quality Reference Laboratories (AQUILA) – said that he found decisions on air quality policy both in the UK and at EU level to be a “long and slow process”.
“It is very frustrating sitting in a committee meeting and saying ‘well we are not going to make any decisions today’,” he said.
He went on to suggest that more work was needed to understand the direct health impacts of certain air pollutants.
“The problem is that we don’t know the direct effects of personal exposure at particular levels. We measure average concentration levels across a whole year, but it is hard to know the direct impact of, say, 20 micrograms of nitrogen dioxide on someone,” he said.
The Air Quality and Emissions Show (AQE) 2013 continues today (March 14). More information is available on the event website.