At the Alternatives and Animal Use in the Life Sciences conference held last year in Seattle, scientists discussed how Big Data is revolutionizing toxicology and how risk assessments could be made more reliable even without animal experiments. The results have now been published in a highlight report.
As a rule, toxicological studies are to be performed according to the 3R Principle (replacement, reduction, and refinement). In other words, animal experiments should be avoided where possible or kept to a minimum, and the animals involved should be affected as little as possible.
The authors write that access to toxicological data suitable for a sound risk assessment remains difficult. To extract the required information from the data, scientists need high-quality baseline data, and the technology solutions that capture, store, and present data should be standardized. Furthermore, the authors urge the scientific community to share its data more widely: risks can be assessed reliably only when all existing data is actually accessible.
The authors also argue for better training of toxicologists in the use of Big Data. They state that future studies of toxicity must focus less on the toxicological endpoint and more on the mode of action.
As part of registration, companies must also provide toxicological and ecotoxicological data. The scope of the required data depends on the quantity of the chemical to be registered. Annexes III, VII, VIII, IX, and X of the REACH Regulation specify the particular requirements. We are pleased to provide support if you have any questions. Please contact us at email@example.com.