Gabriel Ferragut is a student at North Dakota State University currently completing his research at Stanford University under Dr. Simon Klemperer.
The goal of this project is to compare Moho depth results from the relatively new method of Virtual Deep Seismic Sounding (VDSS) with those from the traditional P-wave receiver function method, using records from the Yellowknife Array in Canada. The VDSS method uses teleseismic S-waves that convert to P-waves when they encounter the free surface in the region of the array. These converted P-waves travel back downward, reflect up again from the Moho, and are then detected at the surface. They therefore look like they came from a P-wave source at the surface, but actually originate thousands of kilometers away, hence the name "virtual." These two methods can produce different measurements of Moho topography, and the aim of this research is to understand where the discrepancy comes from.
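To give a feel for how VDSS turns a travel time into a depth: the delay between the direct Ss arrival and the Moho-reflected SsPmp arrival depends on crustal thickness H, average crustal P velocity Vp, and the ray parameter p, via delay = 2H·sqrt(1/Vp² − p²). Here's a tiny Python sketch of that relation; the numbers plugged in are illustrative placeholders, not measurements from this study.

```python
import math

def moho_depth_from_delay(delay_s, vp_crust_km_s, ray_param_s_km):
    """Estimate crustal thickness H (km) from the SsPmp-Ss delay time.

    Uses the standard VDSS travel-time relation
        delay = 2 * H * sqrt(1/Vp^2 - p^2),
    where Vp is the average crustal P velocity (km/s) and p is the
    ray parameter of the incident wave (s/km).
    """
    vertical_slowness = math.sqrt(1.0 / vp_crust_km_s**2 - ray_param_s_km**2)
    return delay_s / (2.0 * vertical_slowness)

# Placeholder numbers for illustration only:
H = moho_depth_from_delay(delay_s=7.0, vp_crust_km_s=6.3, ray_param_s_km=0.13)
print(round(H, 1))  # crustal thickness in km, here about 38.4
```

Note the trade-off visible in the formula: the same delay can be explained by a thicker crust with a faster Vp, which is part of why comparing VDSS against receiver functions is interesting.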
At this point my weeks and blog posts are beginning to get a bit jumbled, but this is effectively "Week 5," covering the week following the 4th of July. The actual week of the 4th was a bit off because of the holiday, but by the next week I was able to get further into producing record sections as well as write some scripts for data processing and event picking. I added some normalization conditions to the graphing process for my hand-selected "best events" and it made the record sections look much better: well on the way to something, dare I say, professional.
Perhaps the most frustrating thing during this week was just debugging code for extended periods of time. Tianze suggested that I create some sort of automated selection process in order to make the event picking more objective than somewhat arbitrarily deciding which events were "good" and which were "bad." I combined some earlier shell scripts into one sweep that runs through the data, appends information like arrival times and ray parameters to the SAC headers, and rotates the data to get the R and T components. From there, the files were read into MATLAB and each was converted to a structure that could be referenced and used. Our primary concern was the vertical and radial components, as they carry the motion relevant to the SsPmp phase conversion, so the MATLAB code reads those components. Once a trace is read in, its positive envelope function is calculated and graphed. The envelope helps immensely in determining the strength of the S and converted P phases: we can search near the anticipated S arrival for a spike, take the mean envelope value in that narrow window, compare it to the mean over the whole waveform, and use the ratio as a condition for an acceptable signal-to-noise level. Looping through all the events, this effectively picks out the events with strong S arrivals and, often but not always, strong SsPmp arrivals. The code initially performed quite poorly because I had overlooked the normalization. After adding that, however, my little routine picked out the best events quite easily!
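In spirit, the picking step works something like the following. This is a simplified Python sketch of the idea, not the actual MATLAB routine; the window length and SNR threshold here are made-up placeholder values.

```python
import numpy as np
from scipy.signal import hilbert

def envelope_snr(trace, dt, s_arrival, signal_win=2.0, threshold=3.0):
    """Flag a trace whose envelope spikes near the predicted S arrival.

    trace     : 1-D array of samples (vertical or radial component)
    dt        : sample interval in seconds
    s_arrival : predicted S arrival time in seconds from trace start
    """
    # Normalize, then take the positive envelope via the analytic signal.
    trace = trace / np.max(np.abs(trace))
    env = np.abs(hilbert(trace))

    # Mean envelope in a narrow window around the predicted S arrival...
    i0 = max(int((s_arrival - signal_win) / dt), 0)
    i1 = int((s_arrival + signal_win) / dt)
    signal_level = env[i0:i1].mean()

    # ...compared against the mean envelope of the whole waveform.
    return signal_level / env.mean() >= threshold

# Synthetic demo: low-level noise with a strong burst near t = 30 s.
dt = 0.1
t = np.arange(0, 60, dt)
rng = np.random.default_rng(0)
trace = 0.05 * rng.standard_normal(t.size)
trace[295:305] += 2.0 * np.sin(2 * np.pi * 1.0 * t[295:305])
print(envelope_snr(trace, dt, s_arrival=30.0))  # the burst passes the check
```

Running this over every event structure in a loop gives the automated, objective version of the hand picking described above.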
Here's a link to one of the record sections!
Time for a belated 4th of July blog post! The fourth week of my internship here at Stanford has been happily productive, and I finally was able to take some of the choicer seismograms from my data and create my first (albeit rather rough) record sections using MATLAB. I got a chance to discuss these graphics with Simon and Tianze later in the week and was given some solid advice on what to change, as well as the different variables I could use to organize the record section (ray parameter, distance, magnitude, depth, etc.). The result was a myriad of record sections displaying the information in completely different ways. From here I need to improve the overall layout of the record section as well as automate a method for picking seismograms with strong S-waves, to maintain objectivity. This week also held the last meeting I will have with Simon for a number of weeks because he is currently in Tibet on a research trip. For now it's just me and Tianze working on refining the project and getting ready for the AGU abstract submission deadline in early August.
The weekend before the 4th and the strongly encouraged lack of working on Monday/Tuesday meant that I was able to do some regional explorations a bit farther away. Another IRIS intern in the area (Lexie) and I hit the road and went up to Lake Sonoma and then over to Chico. The number of vineyards in Sonoma County is completely overwhelming, thirst-inducing, and awe-inspiring, but unfortunately we didn't do any wine tastings; that's for another weekend! Sticking with the fermented beverage theme, we were also able to take a tour of the Sierra Nevada brewery in Chico that highlighted their sustainability practices and commitment to reducing their impact during production. The following day we drove into the hills and I had a chance to fly fish at the beautiful Deer Creek, which is carved through deep lahar deposits from ancient volcanism in the area. The more I explore this state, the more it grows on me, and I'm excited to continue the work and travel in the ~6 weeks that are left!
The elevator pitch was certainly an interesting exercise, but it has quite a lot of evolving to do over the rest of the summer. I took a bit to think through the project and what to talk about, but once I started writing it actually wasn't too difficult. It's about the right length, but I think there is still some substance missing. With a subject like biology, as in the ThePostdocWay article we were assigned as reading, there is a level of relevancy for anybody who hears the elevator pitch. It's easy to pull people in because you can tie your work into health issues that could conceivably impact your listener directly. With seismology and geophysics, particularly with research not involving earthquake hazards, developing a sense of interest and personal relevance is a lot more difficult. I think as this pitch evolves that needs to be my focus. Why should someone care about whether or not we can understand the Earth's crust/mantle boundary, and what has it got to do with them? That's the challenge of describing your work succinctly to somebody else, not just lay people but scientists from other disciplines or sub-disciplines as well. I can't say that I've got a gripping and personally relevant way to describe this quite yet, but that's the whole point of having the pitch be dynamic and evolving throughout the summer's research. Adding a bit of catastrophe and doom might do the trick though, as in: "The more we know about Earth's crustal/mantle structure, the more we can explain and investigate the processes that occur there, such as mantle upwellings and the causes of large-scale volcanism. This information could be useful in determining the evolution of supervolcanoes such as Yellowstone, which could cause widespread damage if an eruption occurs." Do I have a completely detailed way to connect those topics? Not necessarily, but that's OK because it's a realistic possible byproduct of the research, not a certainty of this specific work.
It definitely serves to introduce some personal relevance to the topic, though. After all, the point of the elevator pitch isn't to exhaustively explain your work to someone until they find it as interesting as you do; the point is to get someone immediately interested so that you can discuss the work further at a later time.
Apparently the average speaking rate is somewhere around 110-150 words per minute. Here I have 114. So I can either do a very quick 30 second pitch, or a moderate speed 60 second pitch with room for a bit more of the personal relevance material to be added. Off to a decent start!
"It's possible to get measurements of the Moho, the boundary between the Earth's crust and mantle, by measuring the timing of seismic waves created by earthquakes. Recently a new approach has been developed that uses waves from very distant earthquakes that reflect off the Earth's surface and act as a virtual source. By analyzing numerous earthquakes it is possible to solve for both the thickness of the Earth's crust and the speed of the waves we detect. Since it is thought that the Moho is not an instantaneous boundary, solving for the velocity at different depths allows us to understand the velocity structure, and therefore material properties, of this crust-mantle boundary."
After completing the second week of the summer's work I am beginning to feel like I am actually working on something and not just trying to catch up and learn everything as quickly as possible. In particular, the meetings I had with my faculty and graduate mentors were productive and clarifying, and I ended up learning a lot. As I mentioned before, my work this summer will focus on teleseismic data from the Yellowknife Array north of Great Slave Lake in the Northwest Territories of Canada. This array was originally set up in the early 1960s to monitor nuclear weapons testing across the globe; it currently serves both as a major source of seismic research in Canada and, still, as a nuclear weapons test monitoring tool in accordance with the Comprehensive Nuclear-Test-Ban Treaty (CTBT) of the 1990s.
These later-era records are where my data will come from. I'm looking at the broadband stations of the array, from the longest archived period (~20 years) to the shortest (~2 years), to try to find suitable records of SsPmp wave trains from events 30-50° away. Yellowknife is one of the foremost arrays for seismology in Canada, partly due to its position on old cratonic crust and its remote location, which reduces anthropogenic sources of noise. As such, these records have been examined for years and are nothing new, but as far as I know, virtual deep seismic sounding has not been the main focus of any previous work at Yellowknife. The data come straight from the IRIS servers, acquired with a SOD recipe I wrote, and as such need a little handling. In the recipe it's possible to process the seismograms before they are ever stored as SAC files on your computer. So far I've been removing the mean, trends, and instrument response, and applying a Butterworth filter to get seismograms with a suitable signal-to-noise ratio. From here on I'll have to pick events that have clear S and SsPmp arrivals in order to use them later.
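The actual processing happens inside the SOD recipe before the SAC files are written, but the same basic steps can be sketched in Python with NumPy/SciPy. This is only an illustration: the corner frequencies are placeholders, and instrument-response removal is omitted because it needs the station's response metadata.

```python
import numpy as np
from scipy.signal import butter, detrend, filtfilt

def preprocess(trace, fs, fmin=0.05, fmax=1.0, order=4):
    """Mimic the SOD-recipe steps on a raw trace:
    remove the mean and linear trend, then apply a zero-phase
    Butterworth band-pass. (Instrument-response removal is omitted;
    it requires the instrument's response metadata.)
    """
    trace = trace - trace.mean()          # remove mean
    trace = detrend(trace)                # remove linear trend
    b, a = butter(order, [fmin, fmax], btype="bandpass", fs=fs)
    return filtfilt(b, a, trace)          # zero-phase filtering

# Demo on a synthetic trace: a 0.2 Hz signal plus offset and slow drift.
fs = 20.0                                 # samples per second
t = np.arange(0, 120, 1.0 / fs)
raw = np.sin(2 * np.pi * 0.2 * t) + 0.01 * t + 3.0
clean = preprocess(raw, fs)               # drift and offset removed
```

The zero-phase `filtfilt` step matters here: an ordinary causal filter would shift the apparent arrival times, which would corrupt the S and SsPmp picks made afterward.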
In the last week I also discussed my self-reflection rubric with my mentor Simon. Overall I think it's a pretty useful exercise and definitely helps put out there what level of complexity someone believes they can handle. Some other interns I talked to had expressed concern that their self-ratings were so low. I think this is pretty typical of successful students, though. Maintaining a healthy sense of uncertainty inspires you to pursue questions farther and deeper, and ultimately leads to a richer learning experience. The topic from the self-reflection rubric I'd like to focus on in this blog post is the ability to critique both your own work and the work of other scientists in a field. This is something I could definitely improve on, at least with respect to the literature surrounding a research topic. I suspect many who've spent the last years learning and studying, myself included, tend to take the words of scientific publications as canon, when really this is the opposite of what we should be doing. Reading something and absorbing its conclusions isn't science at all. The whole point of science is that it's a fundamental process for understanding the world, and that it involves skepticism at every turn. Of course you can't start from scratch each time (thank you, peer review), but to get the most out of this process and one's own literature research it's important to be skeptical and question what is being reported. This insulates us from mistakes and drives science farther forward. It also helps to supplement the peer review process itself, which is really an idealized process. To do this proficiently, I think one needs to go into a piece of literature with an open mind, while still focusing on any assumptions or jumps in reasoning that may be present, and then investigate those further.
My first week in the Geophysics Department here at Stanford has come to a close and I'm starting to feel a bit more settled in. The initial shock of everything lasted a few days, and while I certainly felt overwhelmed by it all, it was a challenge I'm happy I had to go through. The most intense part was jumping right into writing code with tools I had quite limited experience with, i.e. GMT (Generic Mapping Tools) and SOD (Standing Order for Data). Luckily, the graduate student I'm working under, Tianze Liu, is extremely knowledgeable and willing to help. There's a minor sense of dread at the prospect of the unknown while trying to learn something in a kind of hierarchical arrangement of knowledge and responsibilities. While it's impossible to expect someone to pick something up right away, I was somewhat reluctant to bring problems to those higher up in the food chain, so to speak. I didn't want to bother Tianze or Simon with problems in the code that I was stuck on, both because I generally get very stubborn with these sorts of problems and want to figure them out on my own, but also because I was worried my very basic questions would either be a bother or somehow reflect poorly on me. I don't actually believe that, but that general anxiety was in the back of my head (likely due to the overall impression Stanford has). At some point I just gave in and started asking questions more regularly, and it's worked out very well. Questions are good! I was able to work my way through some initial plotting and syntax learning in GMT and SOD with Tianze and I'm feeling pretty good about it now. Asking questions is definitely one of my weaknesses, and it is part of the reason I've chosen to pursue a topic like seismology that I was hitherto mostly unfamiliar with, and to do so at an institution like Stanford. Being flung into that kind of situation forces you to reach out for help, and organizing your questions often helps solidify your learning further.
Improving my ability to ask questions and reach out is one of the first goals I thought of for this summer, and I believe (as well as hope) that this goal will be continually developed throughout the whole summer.
My next goal sort of ties in with the first, and it concerns coding/programming in general. Knowledge of coding languages, practices, and uses is an extremely valuable skill, and while I am certainly not new to programming, I know that working here will substantially improve the skills I already have. Now, I absolutely love field work and that desire is always floating around my head, but I chose this project in hopes that I would garner broad computational/programming experience through data plotting, analysis, and acquisition. I can definitely say that there will be no shortage of this and I'm really excited to continue learning. On the data acquisition side, using SOD for the first time was initially very frustrating. I later found out that this was mostly because the SOD distribution on the IRIS website is actually outdated and won't run properly to retrieve .sac files. After I fixed that, though, things went very smoothly! I was able to pick out certain earthquakes that I had been looking for and plot up their seismograms in SAC. The next step is to write a larger-scale data request with filtering built into the code. The data plotting in GMT was initially fairly frustrating as well, but this was mostly due to all the different map projections possible and the process of trying to figure out which one best represents what I was looking for. Eventually I was able to get the right projection (with help from Tianze, of course) and produced a couple of very simple maps that plotted earthquakes at various depths and at certain distances from the Yellowknife Array we would be working with. I was also able to get a bit more experience using AWK to help me process the data I initially downloaded and to produce different plots. I definitely want to learn more about AWK based on how useful it's been already.
These goals are not necessarily quantifiable, but I would be satisfied by the end of the summer if I could reasonably tackle whatever problem came into my head using these different tools. For example, if I needed to download data from some source, process it accordingly, plot it, and carry out any subsequent analysis on it, my ability to do so would be sufficient criterion for this goal's success.
- Gabriel
The first week of the summer here at orientation in Socorro, NM has been wonderful so far. The atmosphere amongst the interns is already building into one of collaboration and understanding. Almost everyone was initially reserved, but in a short time (and through Michael's group activities) the ice has broken away and interactions have become more fluid. It's a comforting realization that everyone here is in the same scenario and that most are introverted. I think it allows a deeper sense of camaraderie to build within our group, and I am sure this sense of community will continue through the summer and especially into AGU in the fall! Overall, it's been great so far and I'm looking forward to what else is in store.