Today I was finally able to submit my abstract!
This has been a wonderful experience working here. I can say without a doubt that I have never learned so much, so quickly in a field that I knew nothing about going in.
The thing I worried about most going in (programming) became my favorite thing to do. I learned a lot about the bonds grad students form and how much easier it is to get to know an advisor than a Colonel.
I will now be seriously considering a graduate school experience after I leave the Air Force.
The biggest challenge was simply getting used to the environment: high-paced at times and extremely relaxed at others. But in the end, what matters is getting results, and I have tremendous respect for that.
I was able to find reflectors using methods that had never been published, and to top it off they aligned almost perfectly. This was a tremendous boost to my morale, and even though I understand that in seismology you sometimes don't get what you think you get, I am comforted knowing that I at least contributed to this field.
I have definitely learned to communicate seismology in a more professional way. When I first got here I remember staring blankly at those I asked for help.
I will look into joining the geophysics community, though definitely not in Memphis (because of the location). That said, CERI is one of the most fascinating places to work.
This summer has been a blast, and I learned things that I know will stay with me for the rest of my life.
In two days I get to perform field work with two of the grad students in Arkansas and Missouri. I can't wait to learn more about data collection on this mini-journey.
I finally finished collecting, autocorrelating, and then proofreading all of the data last week.
I then printed out all of the stacks and GUI screenshots (~150 pages) and marked my picks for the estimated reflectors.
This week, Dr. Langston gave me MATLAB code that applies automatic gain control and converts from time to depth using a velocity model for the region.
I did this for a few stations and will present the results to him tomorrow. We will look for similarities between the reflectors on the BHN and BHE channels, then compare the reflectors at one station with those at nearby stations.
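For anyone curious what these two steps amount to, here is a minimal sketch in Python with NumPy (not Dr. Langston's MATLAB code; the function names and window length are my own illustrative choices): AGC divides each sample by the RMS amplitude in a sliding window around it, and, with a single average velocity, time-to-depth conversion of a two-way travel time reduces to depth = v·t/2.

```python
import numpy as np

def agc(trace, window_len):
    """Automatic gain control sketch: normalize each sample by the
    RMS amplitude in a sliding window centered on it."""
    trace = np.asarray(trace, dtype=float)
    half = window_len // 2
    out = np.empty_like(trace)
    for i in range(len(trace)):
        lo, hi = max(0, i - half), min(len(trace), i + half + 1)
        rms = np.sqrt(np.mean(trace[lo:hi] ** 2))
        out[i] = trace[i] / rms if rms > 0 else 0.0
    return out

def time_to_depth(t, v):
    """Convert a two-way travel time t (s) to reflector depth,
    assuming a single average velocity v (km/s)."""
    return v * t / 2.0
```

The real code uses a layered velocity model rather than a single average velocity, so this is only the simplest version of the idea.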
Next week I will go to Arkansas to gather data from 14 NELE stations. I am really excited and have designated myself the dog/horse whisperer, which means I will keep these animals at bay while the others collect the data.
This will be a great experience, as I will see the whole process of gathering data: working with the landowner, maintaining the sensor, and collecting the data itself. It really puts the huge amount of data you have at your disposal into perspective!
Also I got a free NELE hat and T-shirt so I'm content.
For this week I will be posting an image of a GUI I have recently been using to look at stacks of autocorrelations of my data.
This GUI is very useful for identifying outliers and seeing whether the nine-month stack of data correlates with the majority of my autocorrelations.
Each waveform is 10 hours of data, with each window containing 25 waveforms.
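As a rough sketch of what that outlier check amounts to (my own illustration in Python, not the GUI's actual code; the threshold value is arbitrary): compute the correlation coefficient of each autocorrelation window against the long-term stack, and flag the windows that fall below a cutoff.

```python
import numpy as np

def flag_outliers(autocorrs, stack, threshold=0.5):
    """Flag autocorrelation windows whose correlation coefficient
    with the long-term stack falls below a threshold."""
    flags = []
    for ac in autocorrs:
        cc = np.corrcoef(ac, stack)[0, 1]  # Pearson correlation with the stack
        flags.append(bool(cc < threshold))
    return flags
```

A window that matches the stack well gets a coefficient near 1 and is kept; a noisy or polarity-flipped window scores low and gets flagged for inspection.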
Unfortunately I cannot post an image to this blog at this time but I will try to post it in the comments section or in another blog post.
I have finally completed all of the programs I need to gather station data, autocorrelate, and stack.
In my arsenal: MATLAB code to build URLs for downloading station data, SAC macros to convert from velocity to displacement, MATLAB code to autocorrelate, and shell scripts to organize the data, stack it, and then delete the raw data.
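The URL-building step can be sketched in a few lines. This is a hypothetical Python stand-in for the MATLAB version, assuming the IRIS FDSN dataselect web service; the exact parameters my code uses may differ.

```python
from urllib.parse import urlencode

def build_dataselect_url(net, sta, cha, start, end, loc="--"):
    """Build a request URL for the IRIS FDSN dataselect service
    (illustrative sketch; parameter choices are examples)."""
    base = "https://service.iris.edu/fdsnws/dataselect/1/query"
    params = {"net": net, "sta": sta, "loc": loc, "cha": cha,
              "starttime": start, "endtime": end}
    return base + "?" + urlencode(params)
```

Looping a function like this over a station list and a range of days is what turns one download into an automated pipeline.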
I will be processing 150 TA stations for around two years of data. Each data set is around 10 GB and takes around 19 hours on the CERI Macs. After a first dry run gathering station R43A, I will hopefully use the HPC (high-performance computer) to run the rest at a much quicker rate.
Unfortunately I cannot use more than 50 GB on the CERI Macs so I will need to quickly delete the data when I am done processing it.
While I wait the 19 hours for the first data set to come in, I will work on making my code more efficient and automating more of the process.
I was able to get the autocorrelation MATLAB code working with minimal problems. Now I am stacking the data.
First I stacked a few days of data, then a week's worth. Both showed some signs of reflectors, but the process takes about 5 hours to gather, process, and stack only a month's worth of data.
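For a sense of what the autocorrelate-and-stack step does, here is a minimal NumPy sketch (not the actual MATLAB code; the normalization and maximum lag are illustrative choices of mine): each demeaned trace is correlated with itself, kept out to some maximum lag, and the daily results are averaged into a linear stack.

```python
import numpy as np

def autocorrelate(trace, max_lag):
    """One-sided autocorrelation of a demeaned trace out to max_lag samples,
    normalized by the zero-lag value."""
    x = np.asarray(trace, dtype=float)
    x = x - x.mean()
    full = np.correlate(x, x, mode="full")
    mid = len(x) - 1                      # index of zero lag
    ac = full[mid:mid + max_lag + 1]
    return ac / ac[0] if ac[0] != 0 else ac

def stack(autocorrs):
    """Linear stack: average the autocorrelations sample by sample."""
    return np.mean(np.asarray(autocorrs), axis=0)
```

Stacking more days averages down the incoherent noise while the coherent reflector arrivals reinforce, which is why longer stacks show the reflectors more clearly.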
For week 5, I will be writing macros, awk commands, and more MATLAB code to streamline this process.
The GUI code is not working at all, so I will simply plot the stacks in SAC and take screenshots to get the images.
Most of my work last week was on trying autocorrelation through SAC with different parameters. I tried multiple ranges for the first and second bandpass, different frequency limits for removing the instrument response, and different highpass filters.
After running the seismograms by my graduate advisor Blaine, we concluded that it would be better to perfect a code that another grad student put together years back to identify reflectors in large volumes of data through autocorrelation.
I now have two sets of MATLAB code: one handles data acquisition and autocorrelation, and the other plots the figures through a GUI.
I have spent two days so far on the first set, and I can comfortably say that this student was not fond of commenting his code or writing it efficiently.
I have annotated approximately 90% of the first set; later today I will finish the other 10% and work out the looping pattern with pen and paper. After that I will reorganize the programs and start using them to autocorrelate the data.
For anyone interested, I will be opening a forum for MATLAB help and suggestions. I can also send anyone the annotated code I am working on if they want a good reference for seismic analysis code in MATLAB.
For my project, I will be using the NELE data and some of the TA data. I am currently using JWEED to access the data from IRIS.
I will be working with approximately two years of raw data at nearly 80 stations. Many faculty at CERI use the NELE data, but as far as I know, I will be the first to deal with such a large quantity of ambient noise data.
I will also be using SAC to clean up, correlate, and stack the data. I will do most of this on the high-performance computer (HPC) at CERI.
To plot and display the data, I will be using both MATLAB and GMT.
The main goal of this week was to familiarize myself with CERI, obtain a University username, and establish a CERI account for the computer labs. I just accomplished the final task of getting my CERI account and was given a quick tutorial on how the Macs work.

I spent most of this week studying what will be accomplished during my project and some of the methods for doing so. While I am still struggling with how a lot of the processing works, I better understand the basics of seismic processing and what the information in a seismogram gives me. I also became more familiar with MATLAB and SAC.

I met with Dr. Langston today, and we outlined the structure of my project in four parts: 1) correlate to get the source-receiver waveforms through cross-correlation and autocorrelation; 2) make dispersion measurements to find group and phase velocities; 3) construct a tomography of the northern Mississippi Embayment; 4) use structural inversion to get a map of the shear velocities at varying depths. Knowing these tangible goals, I am much more confident in my ability to accomplish each of them.

I also got a look at what the programming will look like and what style of data I will be dealing with. The first goal of correlating and organizing the data from all of the stations should take roughly a month, but could take longer, since a huge part of my project is locating the reflectors from the autocorrelations. Dr. Langston set July 7th as a rough target for accomplishing this first part.
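A bare-bones illustration of the first step (again a Python sketch of my own, not the actual processing code): cross-correlating long ambient-noise records from two stations, then stacking those correlations over many days, approximates the waveform one station would record if the other were a source.

```python
import numpy as np

def noise_crosscorrelation(tr_a, tr_b, max_lag):
    """Cross-correlate two equal-length, demeaned ambient-noise traces;
    the stacked result approximates the inter-station Green's function.
    Returns lags from -max_lag to +max_lag samples."""
    a = np.asarray(tr_a, dtype=float)
    b = np.asarray(tr_b, dtype=float)
    a = a - a.mean()
    b = b - b.mean()
    full = np.correlate(a, b, mode="full")
    mid = len(a) - 1                      # index of zero lag
    return full[mid - max_lag: mid + max_lag + 1]
```

Autocorrelation is just the special case where a station is correlated with itself, which is what makes the reflector search possible with single-station data.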