New Approach Methodologies (NAMs) in Regulatory Risk Assessment Workshop Report 2020 – Exploring Dose Response

Panel Discussion Sessions Outputs - 2020 Workshop Report

Last updated: 28 February 2024

New Approach Methodologies & Special Scenarios

  • Cost comparison with traditional methodologies: NAM approaches to risk assessment may seem relatively inexpensive on a per-assay basis, but because a number of approaches may need to be used as part of a tiered toxicity testing framework to give confidence in the results, costs and time can escalate.
  • For higher level exposures, greater uncertainty factors or more conservatism may be needed in the risk assessment as applied through a rigorous uncertainty assessment.
  • Different tools and standards could be brought into the tiered approach and uncertainty assessment utilised for both the estimation of systemic exposure and ingredient bioactivity.
  • Bespoke investigations can be designed to explore effects of chemicals as they are progressed through the tiers of a NAMs approach.
  • High throughput transcriptomic (HTTr) data could be used to establish points of departure (PODs) based on a No Observed Transcriptional Effect Level (NOTEL), which may be more conservative than no-observed-adverse-effect levels (NOAELs) derived from animal studies.
  • Internal dose: dosimetry and in vitro kinetics are imperative to define/predict the concentration of chemical that reached the cell, rather than what was added to the well in an in vitro assay. This is important so that effect concentrations can be translated more reliably and more accurate predictions made.
  • It needs to be established how robust the strategy is for computational methods, since models are only as good as the data going in; if the data are not available, a model cannot be produced.
  • Not all biological effects and complex stress responses are picked up with computational methods. Therefore, ‘missing information’ needs to be covered using biological assays and adverse outcome pathways; this could be achieved by transcriptomics.
  •  ‘Big data’ approaches need to be linked to human clinical data, biobanks and biomonitoring data, including the analysis of biofluids to tissues and organs.
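The use of greater uncertainty factors for higher-level exposures mentioned above can be illustrated with a minimal sketch. All numbers here (the POD and the individual factors) are hypothetical, not values from the workshop:

```python
# Minimal sketch: deriving a health-based guidance value from a POD
# by dividing by uncertainty factors (all numbers hypothetical).

pod_mg_per_kg = 10.0  # hypothetical point of departure (e.g., a NOTEL or NOAEL)

# Conventional default uncertainty factors; higher-exposure scenarios
# may warrant additional conservatism, as noted in the discussion.
uf_interspecies = 10.0   # animal-to-human extrapolation
uf_intraspecies = 10.0   # human variability
uf_extra = 3.0           # e.g., additional conservatism for database gaps

composite_uf = uf_interspecies * uf_intraspecies * uf_extra
guidance_value = pod_mg_per_kg / composite_uf

print(f"Composite UF: {composite_uf:g}")
print(f"Guidance value: {guidance_value:.3f} mg/kg bw/day")
```

A rigorous uncertainty assessment would justify each factor rather than simply multiplying defaults, but the arithmetic of the composite factor is as shown.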

Approach that is fit for purpose: Validation of methodologies

  • Alternative testing and exposure strategies for nanomaterials were discussed, outlining various Horizon 2020 projects such as:
  1. GRACIOUS- Grouping, Read-across, Characterisation and classification framework for regulatory risk assessment of manufactured nanomaterials and Safer design of nano.
  2. PATROLS – Physiologically Anchored Tools for Realistic nanOmateriaL hazard aSsessment is establishing a battery of innovative, next-generation safety testing tools to more accurately predict the adverse effects caused by long-term (chronic), low-dose engineered nanomaterial (ENM) exposure in humans and environmental systems, to support regulatory risk decision making.
  3. Risk Management of Biomaterials (BIORIMA) – To adapt and validate current test methods and/or develop new test methods to detect adverse effects of nanobiomaterials (NBMs) (in vitro and in vivo), as well as contribute to integrated testing strategies to support QSAR and PBPK/PD. This work supports the standardisation of NBMs and methods for their eventual use in advanced therapy medicinal products (ATMPs) and medical devices (MDs). This includes benchmarking reference materials.
  • Challenges for the future: the move to complex 3D cell models and microfluidic systems raises the question of how to ascertain internal dose; agreement is needed on definitions and measurement of dose (mass, surface area); other key areas include the stability of NBMs in solution, corona assessment, assessment of complex third-generation NBMs within possible matrices (which will challenge traditional approaches), and understanding the implications of endotoxin contamination in production lines. All of these need to be worked through to support the safe development of the technology.
  • Credibility of physiologically-based kinetic (PBK) models can be visualised using a matrix that characterises the degree of confidence in the components of the model, i.e., its biological plausibility, how well it simulates known data and its overall reliability considering uncertainty and sensitivity.
  • To ensure credibility of the input parameters for any model, consideration should be given to their origin.
  •  Model reporting needs to adequately justify and document both the model structure and the parameters used, to ensure reproducibility and confidence in the model.
  • To advance the acceptance of PBK modelling, in the context of supporting chemical safety assessment, it is essential that there is an ongoing dialogue between model developers and (regulatory) users. Further uptake of PBK models is being facilitated by development of additional guidance documents, generation of case studies and improved resources for the generation of input parameters and models.

Physiologically-based pharmacokinetic (PBPK) Modelling

  • For model reproducibility, there is generally insufficient information in the peer-reviewed literature to allow reproduction for the same chemical, let alone other chemicals. One benefit of the available PBPK software models is that users can put their own distributions into models; however, they should still have access to appropriate expertise. In the discussions, it was raised that dealing with contaminants is different to dealing with pharmaceuticals: model credibility depends on the intended purpose, and this must be taken into account in the risk assessment process.
  • PBPK models are versatile but also need to be reliable. It was stated that it would be difficult to validate a model per se because it is dependent on how the model will be used. However, there have been on-going efforts to make reporting of models more consistent. Guidance is under development at OECD and Tan et al. (2020) published a reporting template.
  • There is now much more information available on parameters. However, for contaminants it is not possible to get an understanding of unknown unknowns. It was stated that when sampling a population, parameters must be sampled with their covariance structure to obtain correlated samples.
  • At what point, if at all, should the FSA consider consumer-facing transparency regarding NAMs when used in risk assessment? This is precisely why these methods are not being rushed into risk assessment; the risk assessment will be clear and transparent about methods and uncertainties.
  • Are the available microdosing data relevant, given that the dose is below saturation kinetics, and how can we ensure the system is not overly saturated or exposure significantly underestimated? Polyethylene terephthalate (PET) was used as a good example for microdosing because there is not a dose that will cause anomalies.
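The point above about correlated (covariate) population sampling can be sketched as follows. A multivariate distribution with a specified covariance keeps physiologically linked parameters consistent within each sampled individual, unlike independent sampling of each parameter. The parameter names, means and covariance below are illustrative assumptions, not data from the workshop:

```python
import numpy as np

# Sketch of correlated sampling for PBPK population inputs.
# Means and covariance are hypothetical illustrations.
rng = np.random.default_rng(42)

mean = np.array([70.0, 90.0])        # body weight (kg), liver blood flow (L/h)
cov = np.array([[100.0, 45.0],       # off-diagonal covariance: heavier
                [45.0,   36.0]])     # individuals tend to have higher flow

samples = rng.multivariate_normal(mean, cov, size=10_000)
bw, q_liver = samples[:, 0], samples[:, 1]

# The correlation implied by the covariance (45 / (10 * 6) = 0.75)
# is preserved across the sampled virtual population.
r = np.corrcoef(bw, q_liver)[0, 1]
print(f"Sampled correlation: {r:.2f}")
```

Sampling each parameter independently instead would produce physiologically implausible individuals (e.g., very low body weight with very high organ blood flow), which is the issue the discussion flags.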

Future Methodologies and Micro-physiological Environment

  • Neural Networks are a class of Machine Learning Algorithms that can provide both binary and quantitative predictions.
  • Structural alerts, random forests and neural networks have been used to try to predict binary activity at human molecular initiating events (MIEs).
  • A combination of these models (e.g., structural alerts, random forests and neural networks) and understanding of their workings is key to highest performance and model use in toxicology decision making.
  • Dose response relationships and risk assessment procedures ideally require quantitative information, but qualitative risk assessments can be carried out too.
  • Quantitative predictions help push this methodology closer to use in risk assessment, rather than just hazard identification.
  • The power of the machine learning algorithm is that it works in a similar way across the board. Models don’t have to be built in a bespoke way every time, but it was stated that applicability is bespoke. The applicability domain is acceptable, but perhaps there should be degrees of certainty in different areas of chemical space. The initial cases and training data (used for validation) also need to be considered.
  • Bayesian probability offers the opportunity to update the probability of a hypothesis as more evidence or information becomes available. It can be used to examine conditional probabilities and assess predictive accuracy, relating the actual probability to the measured test probability. Alternative ways of doing dose-response modelling are required to correct for errors. Data are not necessarily information; interpretation is required to achieve that transition.
  • Discussion on the questions: When should new schemes be adopted? How many failures are you prepared to have? When are there enough in silico predictions that a physical experiment does not have to be performed? It was debated whether in silico and in vitro methods are actually cheaper than in vivo studies. There is increasing confidence in computational approaches, but they may need additional approaches, and Weight of Evidence (WoE) would still be used initially, which increases the cost.
  • The Safety & Environmental Assurance Centre (SEAC) coumarin case study is a good example of building models. Increased confidence in the tiered/NAMs/PBPK approach are likely to predominantly come through case studies.
  • When considering the biotransformation of bioactive compounds in food it needs to be accepted that the gut, including its microflora, should be considered as well as the liver. There are >2000 species of microflora in the gut. Some are essential, some not, and they represent a huge metabolic capability. It was discussed that the microbiome changes with environment, diet, age, sex, pharmaceutical use etc., how the information from gut microflora should/would be used in PBPK modelling might prove somewhat challenging. It should also be remembered that the gut microflora-derived metabolites across various cultures/countries will vary.
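The Bayesian updating described above can be illustrated with a minimal sketch: the probability that a chemical is truly active is revised after each positive test result via Bayes' theorem. The prior, sensitivity and specificity values below are hypothetical:

```python
# Bayes' rule: update the probability that a chemical is truly active
# given a positive result from an in silico/in vitro test.
# All input probabilities are hypothetical illustrations.

def posterior_active(prior, sensitivity, specificity):
    """P(active | positive test) via Bayes' theorem."""
    p_positive = sensitivity * prior + (1 - specificity) * (1 - prior)
    return sensitivity * prior / p_positive

prior = 0.10          # prior probability the chemical is active
sens, spec = 0.85, 0.90

p1 = posterior_active(prior, sens, spec)   # after one positive result
p2 = posterior_active(p1, sens, spec)      # after a second, independent positive
print(f"After one positive: {p1:.2f}; after two: {p2:.2f}")
```

This is the sense in which the measured test probability can be related to the actual probability of activity: each result shifts the posterior, which becomes the prior for the next line of evidence.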

Case studies

Tropane Alkaloids Contaminants (Natural)

Tropane alkaloids (TAs) are plant toxins that are naturally produced in several families including Brassicaceae, Solanaceae (e.g., mandrake, henbane, deadly nightshade, Jimson weed) and Erythroxylaceae (including coca). TAs can occur in cereal-based foods through the contamination of cereals with seeds from deadly nightshade and henbane. Although more than 500 different TAs have been identified in various plants, respective data on toxicity and occurrence in food and feed are limited (EFSA, 2013). The COT has reviewed TAs and in 2017, the FSA commissioned a survey on the monitoring of TAs in food.

Attendees were asked to consider the following:

  • A number of other TAs of unknown potency were present at higher concentrations than (-)-hyoscyamine and (-)-scopolamine, with some of these reported at detectable levels in up to 26% of cereal-based samples. Syndicate groups were asked to consider this group of compounds and explore ways of ascertaining the potency of similar molecules in the group, given that data are available for only a limited number of TAs.
  • As it is thought that the effects of a combination of TAs would be different from those of exposure to a single TA, groups are asked to explore possible methods of quantifying this difference.
Discussion output points:
  • With regards to potency, it would be prudent to first look into the known potencies of TAs. If it is assumed that all TAs are equipotent, then this would be the most conservative approach. However, the potency of most TAs is unknown but if there were standards used for their analysis, could potency be determined from these? The relationship between potency and antimuscarinic effects should then be explored. If this is not possible then an assumption could be made that their potency is equal to that of hyoscyamine and/or scopolamine. If they are equipotent then an assessment needs to be made as to the level of risk. It is important to note that if an assumption is being made on potency, then it cannot be ruled out that the potency of the TAs mentioned is more than that of hyoscyamine and/or scopolamine. For quantification, relative potency could be used taking advantage of data on effects on muscarinic receptors.
  • There is potential exposure to various TAs from eating cereal-based products. Therefore, the risk assessment would have to consider different combinations. When looking at the effects in combination, it is important to consider all of the TAs detected and their potency, if we assume synergistic effects. It is possible that the effects may be greater than additive (geometric) or antagonistic. It is possible that, in combination, less potent compounds may bind the receptors and prevent the more potent compounds from docking.
  • When exploring antimuscarinic effects, in vitro tests should be conducted, and other endpoints investigated to check if TAs are all antimuscarinic. If it is assumed that all TAs are antimuscarinic, then presumably combinations of TAs will have an additive effect. The potencies of these compounds at muscarinic receptors are known. However, there are some limitations, such as the distinction between receptor–ligand binding and receptor–ligand responses. It would be worth exploring different HTS methods for TAs (binding assays), then using expert opinion to rank the data.
  •  It is important to consider whether TAs all have the same toxicokinetics. It would be desirable to measure bioavailability by looking at the metabolism and pharmacokinetics of TAs of known potency and then ranking potency levels of TAs and look into exposure of these chemicals. The structures could then be run through a QSAR programme to see if data gaps can be filled. It would be useful to look critically at the structures, such as substituents on the molecule and the variety of sidechains, for changes in the receptor. Questions arose such as:

     o   Is there a way that the potency of TAs can be ranked using QSAR?

     o   Could we use the acute reference dose (ARfD)?

  • Structural differences in TAs could have different effects on a receptor. If the TA is structurally different, it may hit a different site of the same receptor and modulate other TAs, which may lead to competition. Read-across may still be the best estimate, but there is always uncertainty because the substances are not the same. The limited data on TAs reduces the reliability of read-across. It was noted that there are structural alerts present for genotoxicity in some TAs. Therefore, one would characterise using genotoxicity and then TTC, giving the worst-case scenario. No exposure data are provided and there is no information on LOQ or LOD, but as there are alerts for genotoxicity this would suggest that any exposure is unacceptable. It would need to be investigated whether there are any common chemical groups throughout the TA structures which trigger the antimuscarinic effect. Is a QSAR method able to differentiate between different effects? It is likely that a tiered approach will be required.
  •  Finally, it was noted that there has only been detection of 24 TAs in the cereal-based samples because these are what the samples were analysed for. However, there are more than 500 TAs, any of which could also be present. It was suggested that better agricultural processes could be used to mitigate and reduce the risk by reducing the presence of TAs in cereals. Additionally, analytical methodology could be applied to detect more TAs.
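The relative-potency approach discussed above (expressing co-occurring TAs in equivalents of a reference compound, with equipotency as the conservative default where data are missing) can be sketched as dose addition. The relative potency factors and concentrations below are hypothetical illustrations, not measured values:

```python
# Dose-addition sketch: total exposure expressed in
# (-)-hyoscyamine equivalents using relative potency factors (RPFs).
# RPFs and concentrations are hypothetical, for illustration only.

rpf = {                      # potency relative to (-)-hyoscyamine (RPF = 1.0)
    "hyoscyamine": 1.0,
    "scopolamine": 1.0,      # often treated as equipotent
    "unknown_TA_1": 1.0,     # conservative default: assume equipotent
    "unknown_TA_2": 1.0,
}

conc_ug_per_kg = {           # hypothetical occurrence in a cereal sample
    "hyoscyamine": 0.5,
    "scopolamine": 0.3,
    "unknown_TA_1": 1.2,
    "unknown_TA_2": 0.8,
}

total_equivalents = sum(rpf[ta] * c for ta, c in conc_ug_per_kg.items())
print(f"Total (-)-hyoscyamine equivalents: {total_equivalents:.1f} µg/kg")
```

If receptor-binding data later showed an unknown TA to be less potent, its RPF could be reduced below 1.0; the equipotency default above is the worst-case assumption described in the discussion, and it does not capture antagonistic or synergistic interactions.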

Polymers/Mixtures (man-made/ environmental): Plastic particles

Plastic particles (micro/nano plastics) are intentionally added to products (e.g., in cosmetics as exfoliants) or result from fragmentation of macroplastics into smaller sizes by natural processes (e.g., weathering, corrosion etc.). These particles come in different sizes: nano (1 – ≤100 nm), micro (>100 nm – 5 mm), and macro (> 5 mm). The occurrence of microplastics has been reported in seafood, honey, beer and salt, with most of the data being on occurrence in seafood. A full risk assessment on the potential toxic effects of micro and/or nanoplastics could not be carried out due to the lack of comparative data available for baseline levels of both particle classes.

Furthermore, there is no established NOAEL for each polymer type. The European Food Safety Authority (EFSA) Scientific Panel on Contaminants in the Food chain (CONTAM) concluded that the risks of toxicity from micro and nanoplastics themselves, from oral exposure could not be assessed due to the lack of data, especially with regards to metabolism and excretion (EFSA, 2016). The COT is currently reviewing the potential risk of microplastics in food.

Attendees were asked to consider the following:

  • Do you envision the AOP methodology to be able to assist in prioritising the potencies of the different types of plastic particles? If yes, how so?
  • Do you agree with the read-across of plastic particles to tyre and road wear particles?
Discussion output points:
AOP methodology:
  • AOP methodology would assist in prioritising the potencies of plastic particles but it is not ready yet.
  • There is still a need for internal and external exposure data.
  • AOP needs a single chemical, but plastic particles may well have mixtures of chemicals.
  • Read-across might be challenging between different plastics as the composition of plastics will differ. For most particles it would depend on what the particles are made of in order to determine what effects they might have. 
  • There should be a criterion for inclusion of a certain adverse effect/pathway.
  • There should also be standardisation for the data used in read-across.
  • Testing against key events would tell us what the chemical does but not what it is, i.e., do we even know a key event that actually takes place at this stage?
  • Different exposure routes will lead to a wide variety of adverse effects. The route of exposure currently includes inhalation, dermal and ingestion which will then have different effects on internal dose.
  • When considering how to use the AOP diagram it needs to be borne in mind that there is a battery of processes to go through some of which are known, whilst others are unknown. It needs to be considered whether an OECD approach for AOPs should be followed. 
Tyre and road wear particles:
  • Read-across from other particles is very limited. Read-across will therefore be challenging as there are limited or no data on plastic particles. The use of read-across of tyre and road wear to plastic particles was not currently considered useful.
  • With regard to fibrotic response to accumulation it will be challenging to pinpoint e.g., if it is adverse or is nano-clumping occurring?
  • It was noted that there is limited analytical methodology available for microplastics, and even more so for nanoplastics, which affects measurement of particulate matter (PM10) (for both tyre wear and atmospheric fibres). This is further complicated by the organic sample matrix (food/tissue). However, it is possible that migration data could be obtained from manufacturers, and a risk assessment potentially performed on the leachates.

Other discussion points:

  •  Particle morphology (size and shape) plays an important role on the toxicity profiles. This should be considered.
  •  Formation of protein coronas.
  • What different types of polymers are we exposed to?
  • What toxicology has been done to date? Any pointers for potential hazards?
  • Do polymers have systemic access?
  • There is information available on particulate matter (PM2.5, PM10, etc.). Can this be extrapolated?
  •  There are currently analytical and sampling challenges with measuring plastic particles such as how to analyse them in food/tissues. Consideration needs to be given to what chemicals are potentially stuck on the surface. The analysis is technically very challenging, and it is currently not possible to detect plastic particles below 1 µm in complex samples. The sampling size/method would be different for the environment/food and the different particles. Do we have sufficient particles in samples to analyse for the particle chemical effect?
  • Persistent organic pollutants or weathered particles may lose some inherent characteristics.
  •  The potential presence of biofilms needs to be considered as do microbial effects.
  •  The physical aspects of the particles are responsible for the effects. How do the particles break down and is the size we see in the food/environment the starting size or subsequent from break down?
  •  Certain polymer particles may be converted to Environmentally-Persistent Free Radicals (EPFRs) following UV photolysis.
  •  There is uncertainty around particle composition/size. There are various distributions.
  •   Analytical methods are needed to extract particles from the environment.
  •   It is not certain how reliable older data are. There are not many labs which have the technology/possibility to generate the data required.
  • There is a need to consider the possibility of microplastics accumulating other toxic chemicals within themselves.
  • It needs to be determined whether plastic should be analysed in its original form or whether the polymer should be considered; some of the components would have been assessed toxicologically but only for the chemicals and not for micro particles. However, this would still only provide a snapshot of that time/place.
  • Animal/toxicology studies are carried out on the pure plastic not on weathered particles which are what the population are generally exposed to.
  •  Nano-particles and micro-particles will behave differently, therefore having different effects.
  •  More clarity is required on the routes of exposure to plastic particles.
  • Limited human data have demonstrated that (micron-sized) particles are able to pass through the gut. However, it has been demonstrated that particles in the nano range (nanoplastics) are sufficiently small to cross and interact with biological components, i.e., the nano–bio interface.
  •  The model would need to take into account the implications/long-term health effect of particles being retained in the lung/gut.
  •  How would the AOP pathway take into account chemicals that are stuck to a particle surface and released?

Food supplement (man-made): Selective Androgen Receptor Modulators (SARMs)

These can be found in bodybuilding/gym-based supplements and are designed to have a similar effect to anabolic steroids, but without many of the unwanted side effects. Toxicological information for SARMs is scarce and, where available, the dosage used in supplements is usually at higher doses than was tested in clinical trials. Since the mechanisms by which tissue selectivity is achieved have not been clearly elucidated, there is poor understanding of the potential side effects associated with exposure to SARMs through supplements. Moreover, structural modifications could affect the binding affinity, specificity and potentially affect the potency of different SARMs. Understanding of the structure-activity relationships (SAR), molecular pathways involved as well as the potency of the various molecules is needed for the development of a risk assessment strategy.

Attendees were asked to consider the following:

  1.   (Q1) What criteria could be used for the development of AOP methodologies for the risk assessment of SARMs?
  2. (Q2) Could read-across be used for risk assessment of SARMs with limited toxicological information? If yes, what criteria should be used and are there any classes of chemicals that are appropriate for read-across based on the information provided?
  3. (Q3) Is it prudent to attempt to extrapolate from the levels used in clinical trials to the levels used in supplements?
  4. (Q4) Could PBPK be used for understanding the distribution of SARMs in the body, and would this approach be appropriate for determining potential side effects?
Discussion output points:

Q1. What criteria could be used for the development of AOP methodologies for the risk assessment of SARMs?

  • Biologically relevant key events i.e., anabolic effects or tissue specific effects, antagonistic or agonistic effects should be used. Searches could be undertaken for tissue specific effects and androgenic effects.
  • The criteria used should be biologically relevant and of key events leading to a specific outcome. Example: trying to build an AOP on suitable skeletal muscle system in vitro.
  • Utilisation of in vitro assays to screen the responses, using the chemical structure as a starting point. However, it must be noted that the AOP is not chemical specific, which could be a limitation. Potencies should be tested for androgenic effects. The criteria need to be biologically relevant and related to specific key events leading to adverse outcomes, i.e., use tissue-relevant in vitro assays (for example, a skeletal muscle system in vitro) to aid development of an AOP for SARMs. AOP would work with SARMs as a classic mechanistic intervention event; the MOA will be the key interaction.
  •  When looking at structures, the read-across will be challenging. Look at analogues within the groups rather than across groups. Use parent compounds to scope out how compounds act and compare to other compounds. SARMs have small structural changes.
  • It is important to note that the amounts of SARMs used in supplements are higher than the clinical dose; therefore, the levels are not comparable. It should also be noted that the toxicity might be an extension of the pharmacology. Comparisons should be made with other compounds in the androgen receptor (AR) space, and compounds may be tested at higher doses. It should be determined whether levels can be extrapolated. The pharmaceutical industry selects compounds for tissue specificity.
  • It would be useful to assess potency first, such as the biospider approach, using the androgen receptor model system and classic initiating event.
  • A bespoke strategy might be needed, depending on definition, initiating effect and mechanism.
  • Transcriptomics could be used and an AOP would be written for androgen receptors.

Q2. Could read-across be used for risk assessment of SARMs with limited toxicological information? If yes, what criteria should be used and are there any classes of chemicals that are appropriate for read-across based on the information provided?

  • Using read-across for risk assessment of SARMs may not yet be possible, although AOPs could be used for similar compounds to allow the possibility of read-across. Read-across is unlikely to be useful in this instance, as small structural changes will potentially lead to large conformational ones. Read-across would be limited to binding, gene activity and transcription. However, it may be possible to use in vitro data and structure via read-across. Read-across could only be used if the new compound was similar in structure and endpoints to chemicals already considered, i.e., if it causes a similar biological effect and has a related structure.
  •  It is possible to do a risk assessment for androgenic effects, and that may raise a concern. If not, that doesn’t necessarily mean that there aren’t other effects, i.e., read-across from other substances affecting the androgen receptor is useful if it indicates a concern, perhaps less so if it doesn’t. The challenge is that there is no database of toxicological data, so the focus is on the androgen receptor. Do we know enough about AR-mediated effects?

Q3. Is it prudent to attempt to extrapolate from the levels used in clinical trials to the levels used in supplements?

  •  Benzimidazoles are from multiple origins and from different sources in the food chain. It becomes a risk-benefit equation and a co-intake issue.
  •  The higher doses being taken are not comparable to those tested in clinical trials; at high doses, receptors may be saturated, etc. There are limits in doses in phase 2 trials.
  • It is not considered prudent to extrapolate from the levels used in clinical trials to supplement use, as levels in supplements are higher than those used in clinical trials, although the dose-level selection in clinical trials may indicate what a suitable risk/benefit ratio is.
  • Things to consider:

1.    Increased concentration via nanoencapsulation.

2.    Co-intake/poly-supplement use.

3.    Key ingredients have multiple origins and,

4.    Clinical data may indicate a risk/benefit ratio, but it is not prudent to extrapolate from this for supplement use.

Q4. Could PBPK be used for understanding the distribution of SARMs in the body and would this approach be appropriate for determining potential side effects?

  • PBPK could be used for internal dose, but risk assessment approaches use external doses. However, this may not help, as the tissue distribution is only a hypothesis, and in order to run a PBPK model the tissue concentration is needed but is currently unknown. The effect of high doses on the pharmacokinetics is unknown. Once a PBPK calculation of intracellular concentrations is achieved, it may make a decision easier. PBPK modelling might be possible with clinical trial data but may need more than one model. Therefore, PBPK modelling would be a good start but is unlikely to be sufficient by itself.
  • Is there a consistent chemical communality between the different SARMs? What does the structure do to the toxicity? The diverse chemistry may affect read-across.
  • Enough is known about the effects of other androgens to perhaps predict what PODs we might expect for androgenic effects. Therefore, a risk assessment can be done for the androgen part, however we don’t know what other effects could arise from exposure as it is currently unknown. Computationally, it could be anticipated what the adverse effects would be. AOPs do exist for effects on the AR. However, other potential aspects/effects are unknown.
  • It may provide insight if it was known how the pharmaceutical industry selects SARMs for tissue selectivity, and whether there is a specific method. It would be useful to know what reasons they have for selecting certain SARMs and not others, and why not all SARMs go on to phase 2 in clinical trials. PBPK modelling might be possible if the clinical trial data were made available; however, more than one model may be needed, e.g., transport-specific information and structural similarity might also be useful.
Other points raised to consider:
  • Internal doses of supplements should be considered and compared to medicines. PBPK modelling could be used for this. 
  • There are currently no biomarkers and there is only an idea about the variability as there are only small numbers of volunteers in the studies.
  • Regulatory assessment tends to model the hazard so historical data could be used.
  • Are supplements really foods? It would be useful to revisit the definition of foods.
  •  What goes into supplements? Is the labelling correct?
  •  Comparisons have to be done carefully for selectivity activity across different targets/off-target effects.
  • Different mechanisms will result in different side effects.
  •  Biomonitoring can be useful but is unlikely to be available.
  •  The habits of consumers should be considered:

     o   Do people take supplements separately or in combination?

     o   Phase 1 trials mostly involve men therefore, the reported effects are in men. However, women take these supplements as well. What is known about the effects in women?

     o   Are they being used by men and women? The general consensus was that they were more targeted towards men.

     o   Do users take combinations?

     o   Do they cycle through different SARMs?

Food Contact Material (man-made): Vinyl Acetate Monomer (VAM)

Vinyl acetate monomer (VAM) is solely used as an intermediate in the chemical industry for manufacturing (polymerisation) of vinyl acetate (co)polymers. Hence it is concluded that the entire production volume of VAM is used up for the manufacture of various (co)polymers, mainly polyvinyl acetate. Polymers manufactured from VAM are used in a broad spectrum of products, including adhesives (e.g., film and surface adhesives) for packaging products and contain traces of vinyl acetate as a residual monomer. Human data on the acute toxicity of vinyl acetate are not available, however there are some rat studies. Therefore, by applying PBPK modelling various risk assessments have been proposed and this could potentially be used in future.

Discussion output points
  • Supplementary analysis (uncertainty and sensitivity analyses) should be conducted as part of the model building phase (and not afterwards, as implied in the guidance from WHO 2010).
  •  It was noted that although guidance from the WHO states that “the plausibility of a particular dose metric (that is to be simulated) is determined by its consistency with available information on the chemical’s MOA as well as dose-response information for the toxicological endpoint of concern”, there is no dose-response information in the case of vinyl acetate, only information on MOA (i.e., only one side of the equation). Therefore, there was disagreement with the “medium” level of confidence placed on the model for vinyl acetate by the WHO.
  • A delegate is involved in preparing OECD guidance on the validation of human PBPK models without human pharmacokinetic data. It was noted that with lipophilic chemicals there is increased potential for lymphatic uptake from the GI tract, an absorption pathway that is not always included in PBPK models. There is a need to establish computer modelling processes, including read-across, to predict this uptake from logP values. Furthermore, there is a need for regulators to perform read-across.
  • In the case of paraquat (herbicide), there is significant binding of this chemical to cartilage. This is an example of where the underlying biological interactions need to be understood before a PBPK model can be built to accurately reflect these exemplar mechanisms.
  • Read-across may be used to predict physico-chemical properties but accurate prediction of the pharmacokinetics is more challenging. 
  • It was agreed that the values of the PBPK parameters would change between a microdose and a larger occupational or domestic exposure dose. The extent of the change depends on the pharmacology of the molecule in question. The use of microdose data is only valid for linear behaviour and consequently a narrow range of exposures and applicability. They may not therefore relate to higher levels of occupational exposure where saturation effects may occur; this has certainly been the case in the pharmaceutical industry. There are also human ethical considerations that remain with the use of microdosing. Furthermore, the radiolabel may change the in vivo behaviour of the chemical.
  • Use and test known case studies as if the known is unknown.
  •  For a conservative approach: look at Monte Carlo simulation and Bayesian methods and see if they match. It is possible that you could apply this methodology to PBPK, select a concentration range and use distribution around vulnerable groups.
  • Animal to human PBPK prediction is possible. Inhalation/deposition pre-systemic exposure could be modelled, although the anatomy is different, so it would only work in limited circumstances.
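The suggestion above of combining Monte Carlo simulation with a selected concentration range and a distribution around vulnerable groups can be sketched as follows. The PK model (one-compartment, constant intake at steady state) and the clearance distribution are hypothetical simplifications for illustration, not a validated vinyl acetate model:

```python
import numpy as np

# Conservative Monte Carlo sketch: propagate an uncertain clearance
# through a one-compartment steady-state model and report an upper
# percentile of predicted concentration. All values are hypothetical.

rng = np.random.default_rng(0)
n = 50_000

dose_rate = 1.0  # mg/h, hypothetical constant intake

# Lognormal clearance (L/h); a wider sigma could represent a
# vulnerable subgroup with more variable elimination.
clearance = rng.lognormal(mean=np.log(5.0), sigma=0.4, size=n)

# Steady-state concentration for constant intake: Css = dose rate / CL
css = dose_rate / clearance

p50, p95 = np.percentile(css, [50, 95])
print(f"Median Css: {p50:.3f} mg/L; conservative 95th percentile: {p95:.3f} mg/L")
```

Reporting an upper percentile of the simulated population rather than the median is one way of building the conservatism discussed into the output; a Bayesian treatment of the same parameters could then be compared against the Monte Carlo result, as suggested.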