IODP Proceedings


Analytical methods

Pore fluids were collected from whole-round cores that were cut on the catwalk immediately after recovery, capped, and taken to the laboratory for processing using a titanium squeezer, modified after the stainless-steel squeezer of Manheim and Sayles (1974). Gauge pressures up to 30 MPa were applied using a laboratory hydraulic press to extract pore water. Interstitial water was passed through a prewashed Whatman No. 1 filter fitted above a titanium screen, filtered through a 0.2 µm Gelman polysulfone disposable filter, and subsequently extruded into a precleaned (10% HCl) 60 mL plastic syringe attached to the bottom of the squeezer assembly. Details of the procedure are given in the “Methods” chapter (Expedition 334 Scientists, 2012b).

For the minor element concentration analyses (B, Ba, Fe, Li, Mn, and Sr), the interstitial water sample aliquot was diluted by a factor of 10 (54 of the Hole U1379C samples; 0.25 mL sample added to 2.25 mL 1% nitric acid) or 15 (all other samples; 0.2 mL sample added to 2.8 mL 1% nitric acid). Only B, Ba, and Li data were collected from the 54 Hole U1379C samples that were diluted 10-fold; these samples were further diluted 20-fold (0.2 mL diluted sample added to 3.8 mL 1% nitric acid, for a total dilution factor of 200) before acquiring Sr data. Iron and Mn data were not acquired for these samples. Because of the high concentration of matrix salts in the interstitial water samples at the 10- and 15-fold run dilutions, matrix matching of the calibration standards was necessary to achieve accurate results by inductively coupled plasma–optical emission spectrometry. To this end, a matrix solution was prepared from trace-metal-pure salts (NaCl, CaCl2, and MgSO4; Sigma Aldrich, USA). These ultrapure salts contributed a small but measurable amount of Ba (~13 nM) to the solution, and this contribution was accounted for when we determined the Ba concentrations in the calibration standards.
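The dilution bookkeeping above can be sketched as follows; the instrument reading is a hypothetical value for illustration only, but the volumes are those stated in the text:

```python
# Sketch of the dilution arithmetic described above.
# Dilution factor = (sample volume + diluent volume) / sample volume.

def dilution_factor(sample_ml, diluent_ml):
    return (sample_ml + diluent_ml) / sample_ml

df_10 = dilution_factor(0.25, 2.25)          # 54 Hole U1379C samples: 10-fold
df_15 = dilution_factor(0.2, 2.8)            # all other samples: 15-fold
df_sr = df_10 * dilution_factor(0.2, 3.8)    # sequential dilution for Sr: 200-fold

# Back-correcting a measured concentration to the undiluted pore water
# (the 1.2 uM reading is invented for illustration):
measured_uM = 1.2
in_sample_uM = measured_uM * df_15
```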

The calibration standards used for acquiring B, Ba, and Li data from the 10-fold diluted samples were made in 1% nitric acid from 1000 µg/mL primary standards (Ultra Scientific, USA) and matrix-matched using our matrix solution. Calibration standard dilutions were done by weight, allowing for a concentration to be calculated in each solution. Standards ranged from 0.178 to 18.2 µM Li, 11.3 to 1155 µM B, and 10.3 to 917 nM Ba (accounting for Ba contributed from the matrix solution). Calibration standards used to calculate Sr in these samples were diluted by weight with 1% nitric acid from an in-house stock solution containing 1 µg/mL Sr. These standards were not matrix-matched to the samples, as we assumed the 200-fold dilution would dilute the sample matrix sufficiently. The standards ranged from 0.0441 to 2.04 µM Sr.

The stock standard solution used while running all other samples was prepared from 1000 µg/mL primary standards (Ultra Scientific and BDH Chemical) in a 1% nitric acid solution. This solution contained the following concentrations of elements: B = 108.0 µM, Ba = 1.233 µM, Fe = 5.381 µM, Li = 43.99 µM, Mn = 5.571 µM, and Sr = 27.42 µM. This standard solution was diluted by weight with 1% nitric acid, with dilution factors of 3, 10, 30, 100, and 300, and each dilution was matrix-matched using our matrix solution.
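Because the dilutions were done by weight, the concentration in each calibration standard follows directly from the measured masses. A minimal sketch, using the stock concentrations from the text but invented masses (and assuming stock and diluent densities are effectively equal, so mass fractions approximate volume fractions):

```python
# Stock standard concentrations stated in the text (uM).
stock_uM = {"B": 108.0, "Ba": 1.233, "Fe": 5.381,
            "Li": 43.99, "Mn": 5.571, "Sr": 27.42}

def diluted_conc(stock, m_stock_g, m_total_g):
    """Concentration in a dilution-by-weight: stock conc. scaled by mass fraction."""
    f = m_stock_g / m_total_g
    return {el: c * f for el, c in stock.items()}

# e.g. a nominal 10-fold dilution: 1.000 g stock brought to 10.000 g total
# (masses are hypothetical)
std_10x = diluted_conc(stock_uM, 1.000, 10.000)
```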

Calibration standards were run multiple times each run day, and the sample concentrations were calculated using a regression of all calibration data points. The daily limit of detection was determined through error analysis of these regressions. The error due to the regression was calculated for each individual standard run, and a second-order polynomial was fit to these error values. The limits of detection reported here correspond to the point on this curve at which the concentration equals three times its associated error value. Because this limit was calculated each day, it varies between runs. Detection limits, corrected for sample dilution, range from 4.3 to 6.4 µM B, 70 to 84 nM Ba, 0.25 to 0.34 µM Fe, 0.82 to 2.4 µM Li, 0.19 to 0.28 µM Mn, and 0.67 to 1.1 µM Sr.
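The detection-limit criterion (concentration equal to three times its regression error) can be illustrated with a short numerical sketch. The standard concentrations and error values below are invented for illustration; the actual calibration data differ:

```python
import numpy as np

# Invented illustration of the detection-limit procedure: known standard
# concentrations and the regression error computed at each standard.
conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])      # standard concentrations (uM)
err  = np.array([0.12, 0.13, 0.16, 0.25, 0.45])  # regression error at each standard (uM)

# Fit a second-order polynomial to the error-vs-concentration values.
a, b, d = np.polyfit(conc, err, 2)

# Detection limit: concentration c where c = 3 * err(c), i.e. the smallest
# positive root of 3a*c^2 + (3b - 1)*c + 3d = 0.
roots = np.roots([3 * a, 3 * b - 1, 3 * d])
lod = min(r.real for r in roots if np.isreal(r) and r.real > 0)
```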

Analytical accuracy and precision were estimated by repeated analysis of IAPSO standard seawater (Ocean Scientific International Ltd., United Kingdom), with a salinity of 34.993. Values with 2σ errors (n = 26) were B = 452 ± 14 µM, Ba = 631 ± 31 nM, Li = 25.8 ± 0.9 µM, and Sr = 88.0 ± 1.3 µM. Iron and Mn were always below the detection limit for each day’s run. For these elements, the average analytical precision was estimated from the repeated runs of the standard curves and was <5% for Fe and ≤2% for Mn.
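The mean ± 2σ summaries above follow from the replicate analyses in the usual way; a minimal sketch, using invented Li replicate values (the reported IAPSO results come from n = 26 real analyses):

```python
import statistics

# Hypothetical replicate Li analyses of a reference standard (uM).
li_uM = [25.6, 26.1, 25.9, 25.4, 26.0, 25.8]

mean = statistics.mean(li_uM)
two_sigma = 2 * statistics.stdev(li_uM)          # 2x sample standard deviation
rel_precision_pct = 100 * two_sigma / mean       # relative precision, %
```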

Dissolved silica concentrations were determined on shore with a Thermo Scientific Genesys 10s Vis spectrophotometer at the University of Washington using the colorimetric method described in Gieskes et al. (1991). Briefly, the method is based on the production of a yellow silicomolybdate complex and the subsequent reduction of this complex to yield a blue color. Standards and samples were prepared for analysis by diluting them 20-fold and then adding the molybdate solution to the vials. Approximately 15 min was allowed for complexation with the molybdate solution before reduction with a solution consisting of metol sulfite, oxalic acid, and sulfuric acid. After addition of the reducing solution, the samples were allowed to sit for at least 3 h to let the blue color develop before the absorbance was read on the spectrophotometer at a wavelength of 812 nm.

Calibration, check, and drift standards were made from a 3000 µM stock solution prepared by dissolving 0.5642 g Na2SiF6 in 1000 mL of artificial seawater (NaCl solution with an ionic strength of 0.7). Calibration was achieved with the following standards: 30, 60, 120, 240, 360, 480, 600, 900, and 1200 µM H4SiO4. Two check standards (300 and 500 µM), which were prepared from the stock solution but were not part of the calibration curve, were analyzed during each batch of analyses to monitor analytical accuracy. A 480 µM drift standard was analyzed every 5 samples to monitor analytical precision. The average precision of the dissolved silica analyses based on repeated measurement of the 480, 300, and 500 µM standards over a 1 month period was <1%, and the average accuracy based on repeated analysis of the 300 and 500 µM check standards was <1.5%.
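The stated stock concentration can be checked from the dissolved mass. A quick sketch, assuming the standard reference molar mass of Na2SiF6 (~188.06 g/mol; the mass and volume are from the text):

```python
# Check of the silica stock-solution molarity.
MW_NA2SIF6 = 188.06    # g/mol, standard reference value (assumption, not from text)
mass_g = 0.5642        # dissolved mass from the text
volume_L = 1.000       # 1000 mL from the text

molarity_uM = mass_g / MW_NA2SIF6 / volume_L * 1e6   # ~3000 uM, as stated
```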