IOC response to the UC Davis report – International Standards Body or Lobby Group?
admin | August 9, 2010

Three days after the UC Davis report on the quality of extra virgin olive oils sampled from Californian supermarkets was released, the International Olive Oil Council issued a lengthy statement.
Here is my interpretation and commentary on the statement. The statement is reproduced verbatim in blue, and my comments are in green. It's long, but the points made in the statement, while complex, deserve to be discussed.
STATEMENT ISSUED BY THE INTERNATIONAL OLIVE COUNCIL ON THE REPORT PRODUCED BY THE UC DAVIS OLIVE CENTRE
A report issued by the UC Davis and Wagga Wagga laboratories concerning anomalies allegedly detected in olive oils imported into the United States has been brought to our notice by various associations and other information sources. Speaking as the Executive Secretariat of the International Olive Council (IOC) we wish to begin by saying that the IOC is recognised to be the leading authority on all aspects of olive oil and table olives. The United States takes part in the work of the Organisation through the IOC quality control scheme and the meetings of the IOC chemists, and is kept permanently informed on these activities.
One of the most significant regulatory tasks of the IOC laid down in article 1(2) of the General Objectives of the International Agreement on Olive Oil and Table Olives, 2005 is to develop the definitions and analytical characteristics of all the grades of olive oils and olive-pomace oils included in the trade standards adopted by Members for compulsory application in international trade. The standards drawn up by the IOC are trade standards. They are adopted by consensus of the Members, which pledge to incorporate them into their legislation. According to article 22(1) of the International Agreement, headed Undertakings by Members, the Members of the International Olive Council undertake to apply the designations prescribed in Annexes B and C of the Agreement in their international trade and to encourage their application in their internal trade. In addition, article 22(2) states that the Council of Members will determine quality criteria standards applicable to the international trade of the Members.
None of this (Eurospeak) is relevant to the UC Davis report. The authors appear merely to be asserting some legal jurisdiction, or at least some natural influence, over matters involving olive oil sold in the US. What these opening paragraphs do not make clear is that the IOC is a body funded by the EU, and its role and jurisdiction are defined by EU regulations. Its jurisdiction extends only to nations that are signatories to the international agreement on olive oil (which is also a European creation). The IOC has no legal standing to determine the 'appropriate' testing methods and standards outside its member nations. The "Council of Members" can determine "the quality criteria standards applicable to the international trade of the members" all they like.
But I think they have forgotten something. It’s not about them – nor should it be. Every country should be able to determine what standards should be applied to protect their own consumers.
IOC standards are revised in the light of scientific advances that help to make testing methods more accurate, or of technological and commercial developments. Their aims are to enhance and control quality, as well as to ensure transparency on the international market for olive oils, olive-pomace oils and table olives, and to promote their consumption. Every year the IOC assembles groups of expert chemists and sensory analysts from a range of countries, including the United States (USDA, AOCS), Australia and Germany, who study testing methods and revise them when necessary to determine the quality and control the purity of olive oils and olive-pomace oils. Methods are constantly being improved to adapt them to industry needs and technological developments.
It took nearly a decade for the IOC to do the right thing by consumers and decrease the free fatty acidity allowed in extra virgin olive oil from 1% to 0.8%! So what chance does a potentially powerful new method, say one for detecting the amount of old oil in a blend, have of being approved in a timely fashion? Remember that some EU producers stockpile oil when prices are low, only to release it years later when prices improve. This practice, while legal, dupes consumers, as they are not getting a fresh product. It's not surprising, then, that some consuming countries have lost patience and have proposed new tests in an attempt to protect their consumers from the dubious practices of others.
Ever since it first started to be involved in the standardisation of olive products, the IOC has cultivated a solid cooperative relationship with a number of international organisations, including the Codex Alimentarius Commission (CAC), the World Customs Organisation (WCO), the International Organisation for Standardisation (ISO) and the European Union. Its aim in doing so is to define the minimum compositional, quality and purity requirements of olive oils and to harmonise the methods of analysis in use. The Codex Alimentarius Commission is responsible for the joint FAO–WHO programme for the development of food standards with an eye to consumer health protection and fair trading. The World Trade Organisation (WTO) takes into account the standards and recommendations of the CAC in the application of the WTO Agreements on Sanitary and Phytosanitary Measures (SPS Agreement) and Technical Barriers to Trade (TBT Agreement). The CAC standards for olive oils and table olives are currently under revision to bring them into line with the IOC trade standards, and all the producing countries that participate in IOC work are aware of the enormous efforts made to bring about such harmonisation.
While it is true that the Codex standards are under revision, the response doesn’t mention that there has been substantial resistance against bringing the Codex standards in line with some aspects of the IOC standards as they are seen by some as just cynical attempts to set up technical trade barriers.
Beware!! The term “harmonisation” sounds warm and fuzzy, but it really means that ‘it would be much easier if everyone just did it our way’.
The IOC runs a quality recognition scheme for physico–chemical and sensory testing laboratories (tasting panels) aimed at increasing mutual confidence between testing facilities and heightening the confidence of the olive business sector in labs. Recognition is awarded on a yearly basis to labs and panels that fulfil the requirements stipulated by the IOC and which satisfactorily pass the proficiency check tests it holds every year. In 2009/10 a total of 40 tasting panels obtained IOC recognition; 55 panels are currently taking part in the two check tests arranged for 2010/11. In the case of physicochemical testing laboratories, 48 laboratories obtained IOC recognition in 2009/10 and 62 are currently participating in the 2010/11 ring test. The list of recognised laboratories and panels is posted on the IOC website and updated every year. The test certificates issued by recognised panels can carry legal weight in disputes. Each country is responsible for official product control.
As the only international olive oil organisation, they are in the best position to do this. However this paragraph does not add anything to their criticism of the UC Davis report.
The IOC also sponsors a voluntary, self-regulatory scheme currently in place on a number of markets where exporters, importers and distributors of olive oil and olive-pomace oil are required to adhere to IOC standards in order to help achieve orderly market development and fair trading. In the case of North America (USA and Canada), this quality control scheme has been operating since 1991 under an agreement signed by the IOC with associations to undertake product quality control at recognised laboratories using updated methods of analysis and taking into account the designations and quality criteria specified in the IOC standards. Some 200 samples of imported oils sold in the United States are chemically tested every year by the IOC under the quality control scheme and the labelling is also checked to ensure that the product contents tallies with the labelling declarations. According to IOC findings, anomalies are detected in less than 10% of the imported oils analysed (the association concerned is notified the nature of the irregularities with a view to taking action).
The IOC signs agreements with bodies that represent the interests of the importers of EU oil into the major consuming nations, including the US. These agreements bestow on the importers the responsibility of testing and policing the quality and genuineness of the products they import. Hmmm.
I wonder: if 200 samples of olive oil are chemically tested by the IOC every year as claimed, where are the results? This statement attempts to discredit one set of data that has been made widely available, yet the results of their own testing remain private.
Whether the IOC likes it or not, the UC Davis report is the only one which we can draw conclusions from as it is the only report that has been released for public scrutiny. In my opinion, until the IOC and its importer organisations publish their results (and methods), then criticising the published data of others is a bit below the belt.
I also presume that the 10% of oils which fail their own testing are, or were, available for consumer sale. If so, is it sufficient simply to advise the importer organisation so that "action can be taken"? What action? Were the oils removed from the supermarket shelves, given that they were falsely represented to the consumer as EVOO? Were the appropriate local food and consumer authorities advised of possible breaches of local laws? If olive oil consumers were genuinely high on the IOC's list of priorities, then they should have been.
To begin with, the UC Davis study reports results for only 52 samples of 19 brands, which is not statistically significant, and in some cases it does not provide customary details such as the date of collection, best before date, pack type, labelling information, etc. Also, when anomalies are detected in testing of this type, a second check test is usually carried out for confirmation purposes by another recognised laboratory; this has not been done in the UC Davis study.
I'm a university-trained statistician and I have worked professionally as a statistician, so I'm reasonably qualified to comment on this. The claim that the UC Davis study "lacks statistical significance" is ludicrous. What easier way to discredit something than to say it isn't statistically significant? It's a pretty lazy tactic, really.
I'll explain. In studies involving sampling, the concept of statistical significance is only relevant when a researcher aims to draw conclusions about an overall population from a sample taken from that population. In the UC Davis report the researchers simply commented on the oils they tested; they didn't draw conclusions beyond the set of sample oils they tested.
In any case, statisticians know that the precision of any inference made from a sample doesn't depend on the percentage of the population sampled, but on the actual number in the sample.
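A back-of-the-envelope sketch of this point (my own illustration, not taken from either report): the width of a confidence interval for a failure rate is driven by the number of samples alone, so 52 bottles already pin the rate down to a usable range no matter how many bottles sit on supermarket shelves.

```python
import math

def wilson_interval(failures, n, z=1.96):
    """95% Wilson score interval for a binomial proportion.

    The half-width depends only on the sample size n (and the
    observed proportion), never on the size of the population
    the sample was drawn from.
    """
    p = failures / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Example: 9 failures out of 52 samples (roughly 17%)
lo, hi = wilson_interval(9, 52)
print(f"95% interval for the failure rate: {lo:.0%} to {hi:.0%}")
```

Even with only 52 samples the interval comfortably excludes zero, so "too few samples to mean anything" doesn't hold up; more samples would narrow the interval, but the population size is irrelevant.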
The IOC reports that their sampling gives a 10% failure rate. The UCD report gives 17% (if you only include oils that failed the tests they approve). But there is no mention of how and from where the IOC took their samples. If they sampled both supermarket and higher priced specialty oils then you would expect a lower proportion of EVOO ‘failures’ than if you just sampled from supermarket stock.
But hey, let's take a deep breath here... 10%, 17%... put your hands up if you think that both are excessive. I do! If 10% of the milk I bought were off, or 10% of the bread I bought was stale, I'd have good reason to complain. Why not EVOO??
Moreover, it is important to point out that the methods used in the study (DGF and Australian standards) are not official chemical methods cited in international olive oil-specific food or trade standards; they have however been adopted by the International Organisation for Standardisation (ISO). They were in fact presented to the IOC but were rejected after scrutiny because they were not reliable. This lack of reliability is the reason for their failure to be included in either the IOC trade standard or EU regulations, or consequently in the Codex Alimentarius standard. Before a method of analysis is approved by the IOC it is necessary to conduct numerous ring-tests at laboratories to validate the reproducibility and repeatability of the method and to make sure that it does not give false "positive" or "negative" results which can have detrimental repercussions.
My first impression of the UCD report was that it was imprudent of the researchers to include the results of non-IOC approved tests (regardless of how good the tests are) as it could give vested interests the opportunity to create doubt and confusion in the minds of consumers and the media.
While the IOC has criticised these alternative tests, it provides no information on how the reliability of the tests it hasn't approved was assessed. We apparently just have to take their word for it that the tests weren't up to scratch. As a professional scientist I know that it is very bad form to publicly discredit something without providing sufficient supporting information. Their results should have been published in a peer-reviewed scientific journal, so that their methods and conclusions could be scrutinised by the scientific community.
But this is all a smokescreen anyway. Quite a few of the EU supermarket oils failed tests for EVOO that are approved by the IOC: the UV232 absorbance test for old oil, for a start. By my count, 9 of the 52 failed this test (with another slap bang on the limit). This test IS recognised by the IOC and has been widely used for decades. So even if you use only IOC-recognised chemical tests as the criteria for failing EVOO status, 17% of the EU oils sampled by the UC Davis researchers still failed.
The study also points to a correlation between UV absorbance and sensory analysis in some of the samples; however, it would be necessary to ascertain the method used for testing and to corroborate the results obtained.
It should be noted that the UC Davis study follows in the wake of recent reports of mislabelling of imported product in other countries based on similar work by the same team.
At the very least, the source of claims such as this should be provided. If you accuse someone of something, the reader has the right to know the detail. What 'recent reports'? Where, when and how?
To conclude, owing to the lack of data and the methodology applied, this study does not provide solid evidence that the oils tested do not meet US and international standards.
As I read it, there is plenty of data, both from methods recognised by the IOC and from methods not recognised by it, to show that there is a problem with the quality of the EU oils sold in Californian supermarkets. Compared with the locally produced oils, they looked very, very ordinary.
We are clearly keen for this matter to be clarified and would be willing to cooperate in carrying out stringent, in-depth testing involving the participation of a large number of laboratories. We believe that the harmonisation and fulfilment of official standards, and the application of official methods, is of key importance in achieving greater transparency and fairer trading. It is important for the authorities of importing countries to harmonise and comply with the official established standards and it is their responsibility to guarantee product quality and protect consumers.
The current IOC systems (all, no doubt, based on stringent, in-depth testing involving the participation of a large number of laboratories) have produced the current state of play regarding the quality of US supermarket oils. Agreeing to 'harmonise and comply' will just perpetuate the current situation. Consumers in the US, Australia, Northern Europe and the UK will remain as confused as ever... and Madrid could make decisions for all of us. Consumers of extra virgin olive oil deserve better than this.
Thank you for the section by section assessment from a process and reporting point of view.
Bravo! And as always, spot on!
well said…
"If 10% of the milk I bought was off or 10% of the bread I purchased was stale I'd have good reason to complain"
At the end of the day if 1% of the EVOO myself or my family purchase is not what the label says – I want action and NOW!!
What a limp response from the IOC…
THE most thorough and cogent response yet! Thank you for such thoughtfulness in deciphering this IOC mumbo-jumbo.
Our olive oil company has from the beginning stated helpful information on the label, including harvest time, flavor intensity and, importantly, free acidity. This year we also include a polyphenol value (not inflated with tocopherol), as determined by a COOC-recognized lab. We enter competitions for the purpose of complementing lab analyses with American and international taste panel analyses.
We are confident that attentive consumers will begin to demand a more complete understanding of the value of the foods they purchase and eat.
Darro Griecom. Berkeley Olive Grove 1913