The toxicity of iron(III) in fresh waters has been observed at concentrations above the iron solubility limit, indicating that colloidal and particulate forms must be involved. Current water quality guideline values for iron(III) in fresh water are based on analytical determinations of filterable or total iron. Such methods, however, are prone to underestimating the colloidal fraction or to recovering fractions of low bioavailability from suspended solids (e.g. mineralised iron oxides and oxyhydroxides) that are naturally abundant in many surface waters. Consequently, there is a need for an analytical method that permits the determination of a bioavailable iron fraction while avoiding both false negative and false positive results. Ideally, the measurement technique should be readily applicable by commercial laboratories and field sampling personnel, and capable of integration into established regulatory schemes. The current study investigated the performance of pH 2 and pH 4 extractions for estimating a bioavailable iron(III) fraction in synthetic water samples containing different iron phases. The effect of ageing on fresh precipitates was studied over incubation periods of up to 14 days. The results showed that the total recoverable, 0.45 µm filtered, and pH 4 acid-soluble fractions did not discriminate adequately between iron phases. In contrast, the pH 2 extraction showed specificity towards iron phases, particularly at extraction times of 0.5 and 2 hours. Extraction times from 4 up to 16 hours recovered >90% of the spiked iron, regardless of its age. Most importantly, the pH 2 extraction recovered less than 1% of the well-mineralised iron phase. This study shows that a pH 2 dilute-acid extraction is a suitable method for determining a bioavailable iron fraction while avoiding false negative and false positive results. The advantages of an extraction time of 0.5 hours versus 4-16 hours are discussed.