Tuesday, March 29, 2011

core selection

If you have participated in an oceanographic coring expedition, you know we recover many cores of the seafloor. I generally take more than we need. Sometimes a core does not penetrate all the way and falls over, sometimes the sediment is too sandy and “washes out” past the core catcher, and sometimes a location turns out to have a lower rate of sedimentation than desired. Probably 80% of our cores are useful for their intended purpose, but still we have extras. That’s a good thing because, for example, if a site is double cored we can learn a lot from the first core, but it may soon become oversampled. This usually happens after a paper is published and other researchers request samples from the same core. A second core means new questions can be asked by later researchers without the expense of going to sea again. This is why WHOI and other major institutions have core libraries. I usually do not double core a site, but prefer instead to take a second core in the region. Then, if we have a way to compare cores within a region, we would expect a common climate signal to emerge and we can choose the one with the best signal to begin the research.
Fortunately, we have a convenient way to select among similar cores the one that looks “best” without the work of splitting and sampling all of them. If you have sailed on a long coring expedition, you have probably seen a core scanner in operation. A scanner, or logger, collects three kinds of data: sediment sound velocity, sediment density, and magnetic susceptibility by drawing the ~150 cm core section past the sensors. We don’t generally collect the sonic velocity data. The sediment density data is useful for calculating the rate at which sediment accumulates, and along with the magnetic susceptibility it provides stratigraphy (a way to correlate layers). Geology is 90% stratigraphy. Without stratigraphy we cannot get to chronology, and of course the study of time is what it’s all about.
So, usually beginning at sea, someone is plotting magnetic susceptibility data. Magnetic susceptibility is a dimensionless number: the ratio of the magnetization induced in the core to the magnetic field applied to it (by the core logger). Generally it is high where there are lots of iron oxides and other minerals, and low where there is more shelly material from the plankton. Magnetic susceptibility is useful for correlating locally to regionally, wherever sedimentary processes are the same. One of the first things I do after a cruise is to choose which cores to split based on their susceptibility records.
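To make the correlation idea concrete, here is a minimal sketch of how two susceptibility logs might be aligned by cross-correlation. Everything here is invented for illustration: the function name, the synthetic Gaussian “hump”, and the 30 cm offset are assumptions, not data from these cores.

```python
# Hypothetical sketch: estimate the depth offset between two cores'
# magnetic-susceptibility logs by cross-correlation. All values invented.
import numpy as np

def best_depth_offset(chi_a, chi_b):
    """Return the lag (in samples) at which record B best matches record A."""
    a = (chi_a - chi_a.mean()) / chi_a.std()
    b = (chi_b - chi_b.mean()) / chi_b.std()
    corr = np.correlate(a, b, mode="full")
    # Lag of the correlation peak; positive means the same feature
    # sits deeper in record A than in record B.
    return int(np.argmax(corr)) - (len(b) - 1)

# Synthetic example: one susceptibility "hump" at 200 cm in core A
# and the same hump at 170 cm in core B (1 sample = 1 cm here).
depth = np.arange(500)
chi_a = np.exp(-0.5 * ((depth - 200) / 10) ** 2)
chi_b = np.exp(-0.5 * ((depth - 170) / 10) ** 2)
print(best_depth_offset(chi_a, chi_b))  # → 30
```

Real logs would first need resampling onto a common depth spacing, and a single constant offset is the simplest case; in practice correlation between cores is usually done feature by feature, since sedimentation rates differ down-core.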
Laurentian Fan provides a good example. Our Laurentian Fan core site (22 on the Canadian Margin map) is very close to a site we cored with our giant gravity corer (GGC) on voyage 326 of R/V Oceanus in 1998. Oceanus is not large enough to take long piston cores, but my purpose on OCE326 was mostly to study the past 11,000 years (the Holocene). The OCE326 cores revealed an exceptional record of deglaciation, but only for the time since around 17,000 years ago (see mag sus at LF cores.doc).

The rate of sedimentation was too high to reach the last glacial maximum (LGM) with a 4 to 5 m GGC. At the top of the figure is the mag sus record from OCE326 26GGC. The most prominent feature is a hump at about 360 cm; the very narrow drops in susceptibility often occur at core section breaks (vertical lines in the figure). The radiocarbon date on planktonic forams there shows the conventional 14C age to be 9.33 kyr. Looking at the bottom panel, you can see we got a similar age on forams from the Holocene hump in 14GGC, but in that core the peak is at about 130 cm. Thus, bearing in mind that the sediment physical properties are about the same between the two OCE326 sites, core 26 has nearly triple the accumulation rate of core 14.
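The “nearly triple” comparison follows directly from dividing the depth of the dated hump by its age. A quick check with the numbers above (a rough, back-of-envelope calculation: it uses the conventional 14C age without calendar calibration and ignores compaction):

```python
# Back-of-envelope sedimentation rates from the hump depths and the
# conventional 14C age quoted in the text.
age_kyr = 9.33             # conventional radiocarbon age of the hump

rate_26 = 360 / age_kyr    # 26GGC: hump at ~360 cm → ~38.6 cm/kyr
rate_14 = 130 / age_kyr    # 14GGC: hump at ~130 cm → ~13.9 cm/kyr

print(round(rate_26, 1), round(rate_14, 1), round(rate_26 / rate_14, 2))
# → 38.6 13.9 2.77  (core 26 accumulated nearly three times faster)
```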
The rate of sedimentation in 14GGC before the Holocene is very high, so in a 2005 paper I was able to patch cores 14 and 26 to make a composite climate record of the past 17,000 yrs in high resolution. But it is still important to know something about the last glacial maximum (LGM) and earlier at this location, which is why we cored there with our new long coring system on R/V KNORR voyage 197/10 in July 2010. We took 41GGC at the same location as piston core 42CDH (28 m long and not shown here), and you can see (middle panel) it has the same magnetic susceptibility features as the earlier cores (the y-axes differ because the sensors were different). Through Steve Swift’s skillful surveying we found a site that seems to have a sedimentation rate intermediate between those of the OCE326 cores. Isabelle Gil has been studying the history of diatom changes at this site using 14GGC sediments, but until now she has not been able to sample the LGM. She wants to know if the region was as fertile during glaciation as it is today, how the nearby ice sheet affected the surface ocean, etc. In 42CDH it looks like the LGM is a 2.5 m interval with only small changes in magnetic susceptibility. We haven’t split the core yet, but from other work on the Canadian margin I bet the mud is typically fine-grained and “brick red” as at the bottom of 14GGC.
In general there is a quest for high sedimentation rate cores in the community of paleoceanographers. We want high rates of sedimentation because the higher the rate, the shorter the time interval between successive samples. If you are old enough to remember tape recorders: the faster you run the tape, the higher the frequency you can record. We want to record high frequency signals in part because the climate change we are experiencing this century is high frequency in the geologic sense. How will human-induced climate change be affected by the underlying climate processes that we don’t understand yet?
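The tape-recorder analogy in numbers: the time between successive samples is the sample spacing divided by the sedimentation rate. In this sketch the 1 cm sampling interval is an assumption (a common choice, but not stated above), and the rates are the ones implied by the 26GGC and 14GGC hump depths and the 9.33 kyr date:

```python
# Time resolution per sample = sample spacing / sedimentation rate.
# The 1 cm spacing is an assumed, typical sampling interval.
def years_per_sample(spacing_cm, rate_cm_per_kyr):
    return spacing_cm / rate_cm_per_kyr * 1000.0

print(round(years_per_sample(1.0, 38.6)))  # → 26 yr/sample at 26GGC's rate
print(round(years_per_sample(1.0, 13.9)))  # → 72 yr/sample at 14GGC's rate
```

Same corer, same sample spacing, yet the faster-accumulating site resolves climate events almost three times more finely.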
(n.b. I thought I would begin by getting right into data because that is what motivates me. The blog should be about generating research results and how we think about them, so please let me know if this discussion is at the right level. Frank Scofield indicated that you might want to know about how we sample the cores, so I will cover that in the next post.)