Calculating receiver functions to study the transition zone in West Antarctica. The transition zone, which is bounded by two globally recognized discontinuities (the 410 and the 660 km), is a feature of the deep earth that can be used to identify hydrated regions and mantle upwellings/plumes that tie into the tectonics of Antarctica.
Just an update on pretty much what’s been going on in Socorro. The past two weeks have been more of the same, although every day I show up to work on my project, there’s always some aspect of either scripts not working, or my computer not having the correct tools in order to get my job done. On my last day here, I’m just expecting my operating system to somehow delete all my data, and then my external hard drive would explode. It would only make sense at this point.
At the moment, I’ve been trying to run a CCP (Common Conversion Point) stacking script with all the data in the figure below. Just like the figure from my last post, this figure shows where the conversion points of the 410 (the upper boundary of the transition zone) are located in Antarctica. Another figure I'm making will use this data to show the topography of the 410 by plotting the depth to all of these points. But that'll be for my next blog post.
The idea here is to stack data from one station where it overlaps with another station to give us a better picture of the transition zone where we otherwise wouldn’t have data. The current problem is that we have so much data that the scripts we’re using won’t utilize it all. It’s not that the script won’t run (which would actually be better); it finishes but silently excludes data, giving us less desirable results without letting us know. So, I “fixed” it to exit the program and warn the user if there is too much data for the grid being built for the CCP stack.
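Conceptually, the fix amounts to counting what’s headed into the grid and refusing to run if it won’t fit. Here’s a minimal bash sketch of that kind of guard; the function name, variable names, and the point limit are all made up for illustration, not taken from the actual CCP script:

```shell
#!/bin/bash
# Hypothetical sketch only: check_grid_capacity, its variables, and the
# 50000-point limit are illustrative; the real script's internals differ.
check_grid_capacity() {
    local npoints=$1
    local max=$2
    if [ "$npoints" -gt "$max" ]; then
        echo "WARNING: $npoints conversion points exceed the grid capacity of $max." >&2
        echo "Split the input or enlarge the grid before stacking." >&2
        return 1
    fi
}

# Usage sketch, assuming one conversion point per line of an input file:
# npoints=$(wc -l < ccp_points.txt)
# check_grid_capacity "$npoints" 50000 || exit 1
```

The point is just that the script now fails loudly before stacking instead of quietly dropping data after.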
No joke, this is probably the work I’m most proud of. It may even go on my poster for AGU.
At PASSCAL, I finished testing the RT125A (Texan) dataloggers and moved on to something a bit newer: the CTR4 (Centaur). I was a little more excited about these because they’re similar to DASs I’ve seen in the field. Testing the Centaurs went pretty fast, so I got to move on to the apparently infamous Q330s. PASSCAL says the Q330s are their best and most requested systems, so I doubt it’ll be the last I see of them. Unfortunately, because of liability and GPS issues, I wasn’t able to get as hands-on with the Q330s as I was with the Texans and Centaurs.
Anyway, that’s it. Only one week left!
I would say the last couple of weeks have been mostly mundane with exciting bits in between. The most exciting part was probably running off a mountain after hearing some thunder crash. As far as my project goes, the most exciting thing for me was finishing QC’ing all the data I’m going to use. No more waveforms for me! Since then I’ve been focusing on GMT and trying to make figures, which is slowly moving along, but that’s really it as far as my project goes.
Figure 1: This is a picture showing where we have conversion points mapped (where a signal converts at the transition zone, represented with orange dots) from only a single station (black triangles). This figure still needs a lot of work aside from the obvious misplaced labeling: the plan is to compile all of our stations onto one map, so there will eventually be many more little orange dots, and we also need to do this for common conversion point (CCP) stacks.
I also started to volunteer at PASSCAL a couple of weeks ago. I’ve been helping test and repair their RT125A data acquisition systems (more commonly referred to as Texans). Here’s my workspace over at PASSCAL:
What everyone at PASSCAL refers to as my fort. It's pretty exclusive.
So that’s pretty much it. Most of my time is split between these two projects and probably will be for my last 3 weeks here.
The paper most central to my project is “The mantle transition zone beneath West Antarctica: Seismic evidence for hydration and thermal upwellings” (E.L. Emry et al.). This paper, written by my advisor, is pretty much my project. In a nutshell, she is studying the mantle transition zone (often referred to as the 410 and the 660, after the typical depths in kilometers to its boundaries) and whether changes in its thickness or in the depths to its boundaries can be tied to hydration or thermal anomalies.
So far, all the methods and purpose of the study from the paper are exactly the same as my current project. The only difference between my project and the project from the paper is that I’m adding in as much seismic data as possible to fill in gaps since the paper was written. It may seem like I’m not doing anything new, which I’m really not, but data coverage in Antarctica is sparse so every little bit counts.
Figure 1: Single-station stack of the 1D.PIG1 seismic station in the Amundsen Sea region of West Antarctica. These three figures are made by stacking (averaging lots of seismic data for a station to amplify seismic signatures and cancel noise), migrating (converting time to depth), then applying Gaussian filters of 0.5, 0.75, and 1.0. (A Gaussian filter corresponds to the frequencies allowed in a specific bin: the higher the Gaussian filter, the higher the allowed frequency for the bin used in the stacking process. Different filters are used because some frequencies may be better at showing boundaries/signatures than others.) The figure shows two distinctive peaks (more or less) around 430 km and 680 km, which are assumed to be the boundaries of the mantle transition zone. While its thickness doesn’t differ from the global average, the transition zone is depressed, which can still be tied to a thermal anomaly in the mantle.
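For anyone curious what “stacking” means mechanically, it’s just a sample-by-sample average across many traces. A toy sketch with awk, where each row is one time sample and each column one trace (the numbers are invented, not real receiver-function amplitudes):

```shell
# Toy stack: two time samples, three traces; values are made up.
printf '0.1 0.3 0.2\n0.5 0.7 0.6\n' > traces.txt

# Stack = average across each row (one stacked amplitude per sample):
awk '{ s = 0; for (i = 1; i <= NF; i++) s += $i; print s / NF }' traces.txt
# -> 0.2
#    0.6

rm traces.txt
```

Coherent arrivals like the 410 and 660 conversions survive this averaging, while random noise tends to cancel out.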
My project at NMT involves no fieldwork, but after my persistent reminders to my advisor that we don’t go outside enough, she asked around to see if I could help anyone out with some fieldwork. So, during the week I had the opportunity to work with representatives of the Bureau of Economic Geology of Texas and New Mexico, who had worked out a deal to install a couple of permanent seismic stations in the Permian Basin. I’ve never installed a permanent station before, so a lot of parts of this were new to me, which was cool. What wasn’t cool was digging through a foot of bedrock (breaking their pickax twice) in ~110˚ heat. But overall it was still a lot of fun and I really can’t complain. The Texas people had a great sense of humor, so they were fun to be around, and they eventually offered me a job. And because of the long car rides across New Mexico, I got to bombard the New Mexico representative of the Bureau of Geology (Mairi Linterland) with all kinds of questions, which I’m sure she grew tired of fast.
I’ve also never been through an oil field quite like the Permian Basin before, so that was also a different experience for me—cowboy hats and F-150s are definitely in style down there. One eye-opening aspect was seeing a landscape littered with oil rigs, and then roughly half a mile from any rig is the WIPP headquarters, where the US stores its nuclear waste. Apparently, this is OK and nothing bad is ever going to happen.
My week 4:
Looking at years of waveform data. That's it.
I know I bragged in the past about not having to look at waveforms, but karma is a… well, you know the expression. All week I’ve been getting rid of bad waveforms from my data set. That’s it.
The most depressing thing I’ve done so far is “delete” about 3 hours of work. I didn’t actually delete anything, but I accidentally merged a folder of good data with bad data, and now I have to redo the QC’ing process for that data set. Not the end of the world, but still unfortunate. For anyone reading, the bash wildcard (*) with nothing typed next to it selects every file in the current directory. Don’t be like me; check to make sure you’ve typed something next to it.
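Here’s a small sandbox showing the hazard and the dry-run habit that would have saved me; the file and directory names are made up for illustration:

```shell
#!/bin/bash
# Sandbox demo; pig1.sac, pig2.sac, notes.txt, and good_data/ are invented.
mkdir -p glob_demo && cd glob_demo
touch pig1.sac pig2.sac notes.txt

# A bare * (nothing typed next to it) expands to every file here:
echo *            # -> notes.txt pig1.sac pig2.sac

# The habit that would have saved me: preview the glob with echo
# (a dry run) before handing it to mv or rm:
echo mv pig*.sac ../good_data/   # prints the command without running it

cd .. && rm -r glob_demo
```

Putting `echo` in front of a destructive command costs one word and shows you exactly which files the glob grabbed.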
As far as successes go, every day is a success when I can get bash to do what I want it to do. I guess what I’m most proud of are the scripts I’ve written that have fixed a lot of issues in retrieving data; I’ve also made some scripts that have sped up the whole data-processing step. I’m still behind, though.
Also, I can’t make a map of the area I’m working on because I don’t have a lot of software downloaded onto the computers I’m working on; I was never able to get VirtualBox to work on my computer or the computer at my institution. For the record, I did complete the tutorials, but I used whatever programs I could find at my workspace to do it. I guess that was one other struggle I’ve had to overcome.
If you haven’t already read the week 3 blogs from the other interns, we’ve all been tasked with describing how we deal with “The Elevator Speech,” which is basically your initial pitch: a way to introduce yourself and what you do, to one person or many, in a very short and concise way.
I would think most of us have already gone through many elevator speeches just by grinding through school. The intro we constantly have to come up with when we meet every new person every quarter was always a work in progress. I also have family and friends (some of whom I even went through lower division with) who have no grasp of what geology is (sometimes it’s not even worth explaining that geophysics is different), so pitching to them what I’ve been doing for the past five years has been constant practice. I find it a bit odd that we were asked to really think about the elevator speech, considering that scientists (or students trying to be scientists) give elevator speeches constantly and already have a lot of practice refining them. I’m guessing the reason we should think about this a bit more isn’t that we haven’t practiced our intro speeches enough, but that we often fall short when it comes to knowing our audience. I think the hardest part of a great elevator speech is hitting the sweet spot of neither overestimating nor underestimating what your audience knows.
My formula whenever anyone asks about me (for anything, not just science) has always been: what I do (often coupled with what I know), why it’s important, then maybe something more specific to see if it sparks their curiosity at all. When I’m allowed to be the most broad—when I talk about what I do with people who don’t know geology at all, with geologists, and sometimes with geophysicists—my formula for a quick introduction works pretty well, but it still changes a lot depending on their reaction to that very quick intro. When I’m dealing with someone I can be more specific with (maybe another seismologist), my formula falls apart: the conversation becomes a lot less one-sided and I often become the person asking questions. That’s fine in a conversation, but if I have to write a paper or give a presentation to a group that knows most of the specifics of what I’m discussing, and I have to keep going without any input, the whole discussion becomes very tricky.
Overall, I find introductions typically pretty easy. I’m a pretty simple person who doesn’t remember things if I can’t connect them to what I already know, so I try to internalize that in most discussions I find myself in. That’s not anything new, though; I think it’s kind of the mantra of science: “keep things simple until you have to complicate them.” When delving into specifics with others, the conversation comes down to trying to make sense of something you don’t know on the spot, or finally deciding to say, “I don’t know.” Personally, I’ve found that deciding when to finally say I don’t know really depends on how many beers I’ve had.
Anywho, my project is moving along. I’m starting week 4 and just now looking at waveforms and still don’t have any coherent figures yet. Writing my abstract is going to be a bit tricky without any clear idea of how 10 years of new data might change our perspective on what is happening underneath West Antarctica.
I also drove out to Santa Fe over the weekend to meet up with the Geophysics camp SAGE which was a lot of fun as well.
I've finished getting all the data I'm going to need to finish this project using the IRIS metadata aggregator. Sometime in the middle of last week I started using a script that calculates the receiver functions; it's embedded with all kinds of procedures and algorithms that ultimately compute the waveforms, check whether the deconvolution process was effective in removing the station signal, then pick first arrivals for me (I've done this the long way before, so I'm not sad at all about this. Sorry David). I've also started using a QC'ing script that throws out waveforms that don't represent the bulk of the data and then gets a plot ready to look at.
I've gotten accustomed to fixing, and now writing, scripts, so all last week every new task just got tacked onto the end of the scripts I was making. All I really do at the moment is press the Enter key on my computer and let the scripts do what they do. While it may seem mundane, it gives me extra time to read some papers, and because my computer screens are littered with terminal windows running numbers, it looks like I’m hacking into the Matrix all day, which is cool.
Aside from working, I wandered around the desert for a good portion of my Saturday and got a little sunburnt—it kind of feels nice after being inside for so long. I think I’ve got some cool pictures for anyone interested.
Driving out to New Mexico fresh from the clutches of my last quarter gave me some of the best peace of mind I've had in a long time. It's been a while since I've set out on a road trip, and this one was especially beautiful, driving through the fog-shrouded mountains of Tonto National Forest.
I've gotten to know my advisor, Erica Emry, pretty well over the last two days and have already developed a pretty good rapport with her. Everyone in the department at New Mexico Tech has been very friendly--someone in the department has already bought me coffee, which is the opposite of how I thought internships worked! Many people in the department are excited to talk about their research and what they specialize in, which is oddly not something I'm used to, but is great.
So far, our short term plan seems pretty direct in that we need to collect more data to bolster the idea of a mantle plume underneath West Antarctica possibly responsible for extension in the region.
The first few weeks will be spent just collecting data and using SAC to calculate receiver functions, which I feel like I already have a bit of a grip on, so it shouldn't be too rough. Interpreting the data is something I'll be bugging Erica a lot more about.
I'd like to have the abstract for AGU ready by July 23rd.
Very Short Term (as in now)
Start getting back into shape! School has taken a toll on me over the last year, and I’ve become probably the unhealthiest I’ve ever felt in recent memory. That being said, with the reign of upper-division classes over:
Run/exercise 4 times a week.
Short Term (First 2-4 Weeks)
Learn Unix, SAC, and some other common tools in seismology that have come up way too often for me to not understand: convolution/deconvolution, migration, Gaussian filter off the top of my head.
Make a trip out to Santa Fe to meet up with people doing the SAGE internship, which I was a part of last year.
Meet up with Tim Parker, who’s definitely helped my career by sharing a lot of his geophysics knowledge.
Middle Term? (Doubt that’s how I’m supposed to say that, but after 4 weeks and on)
Since the IRIS/PASSCAL Instrument Center is right up the street from me, I’d like to help them service their equipment and brush up on my electronics repair skills.
Long Term (by my week 10)
Make a template for grad school applications and essays as well as have all the professors I’d like to work with for grad school ready to be contacted.