Brandon Rasmussen is a student at the University of Nevada, Reno, currently completing his research at Colorado State University under Dr. Derek Schutt.
The Mackenzie Mountains Earthscope Project is a five-year initiative by CSU and other partners to investigate the origins of the Mackenzie Mountains, a mountain belt far from the Pacific plate subduction zone. We will be contributing to the project by performing surface wave tomography with data from existing stations in the area using the dual plane-wave (or two-plane-wave) method, a relatively new technique developed within the past 15 years. This process will yield both estimated seismic velocities and estimated anisotropy down to great depths, and will help us understand lower crustal and upper mantle processes and their influence on the mountain range. In addition, we will be deploying roughly 30 seismic stations in the Yukon in July and August to increase station density and data availability for seismologists.
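As I understand it so far (roughly following the two-plane-wave formulation behind Donald Forsyth's code, which we will be using), the method models the incoming Rayleigh wavefield from each distant earthquake as the interference of two plane waves with their own amplitudes and propagation directions, something like:

```latex
u_z(\omega, \mathbf{x}) \approx A_1(\omega)\, e^{-i\,\mathbf{k}_1(\omega)\cdot\mathbf{x}}
                              + A_2(\omega)\, e^{-i\,\mathbf{k}_2(\omega)\cdot\mathbf{x}}
```

The second plane wave soaks up the multipathing and scattering that a single-plane-wave assumption can't handle, and the leftover phase and amplitude variations across the array are what get inverted for phase velocity and its azimuthal dependence (the anisotropy).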
Fieldwork in the Yukon was quite the experience. The landscape was beautiful, but the weather and bugs were not on our side. Like the logistical nightmare of planning for the trip, fieldwork hit many hiccups but still turned out to be a successful endeavor. All planned stations were installed except for a few (less important) ones to the southwest, and those remaining few will be put in once permitting has gone through in BC. Though we donated a hefty amount of blood to needy mosquitoes (I was definitely singled out and targeted) and were soaked with rain every day in the field, the group remained positive and it was a great time.
I chose to dig holes instead of dealing with instrumentation, though I helped with a good deal of troubleshooting and completed a few installs and services toward the end, as well as a good chunk of the initial huddle test setup. We spent only seven consecutive nights in the field to complete the entire Canol Road up to Macmillan Pass, plus three other full field days to take care of things within a day's driving range of Whitehorse. This was despite multiple forgotten tools and hardware components, impassable road washouts, GPS problems, and slow-setting concrete. We finished so quickly that I ended up having to change my flight to a full week earlier.
The trip was a great break from the computer screen, and it gave me a healthy respect for the seismology community as a whole. The quantity of effort that goes into getting data from remote places is surprising. To think that people are doing it all over the world just for the sake of collecting some squiggly lines to look at is amazing to me. We even had one member drive himself up from Arizona to voluntarily be eaten alive while being soaked in rain and performing manual labor when he was in no way affiliated with the project. Crazy, right?
This summer has definitely taught me a lot about seismology as a field, one populated with computer nerds who also love exploring the outdoors (at least the entirety of the subset of the community that I spent my time with). It has also shown me that I feel right at home among them! While I'm still unsure of whether or not seismology will be my target for grad school, I am certain that I would have an awesome graduate experience if I were to go for it. I feel like I accomplished all of my initial goals to a certain degree and learned a lot about my quirks as a researcher. My new goals are to learn to plan out and weigh all options before investing time in one route of a workflow, and to keep a better record of what I'm doing.
The road we drove up to deploy along was a World War II-era project that sought to bring oil from the interior Northwest Territories to the coast for shipment. There were many dozens of trucks like this abandoned on the side of the road.
I was one of only four who got to go all the way past Macmillan Pass on our last field day for servicing and scouting for next year! This was the four of us after six nights in the wilderness.
My last week before heading to the Yukon involved rushing to try and get the inversion code working while simultaneously organizing a 4 week remote international trip. Unfortunately, while we got close, I was unable to finish the inversion before I left. This means my AGU abstract was very vague and presented no real results. While I waited for Derek to debug the last couple of steps in the inversion, I continued adding onto my crustal model which now includes gridded topography and water depth, sediment thicknesses, and moho depths. I also used python basemap to create a figure which showed my event azimuthal coverage by drawing great circles for each station-event pair. Unfortunately, I'm currently in Whitehorse and can't post it! I'll be out of internet access for the next few weeks. Wish me luck in the middle of nowhere!
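For anyone curious, the gist of that figure script was something like this (a simplified sketch; the station and event coordinates below are made up, not the real Yukon stations):

```python
# Sketch of the azimuthal-coverage figure: one great circle per station-event pair.
# Coordinates are placeholders, not the actual stations or earthquakes.
from mpl_toolkits.basemap import Basemap
import matplotlib.pyplot as plt

stations = [(-135.1, 60.7), (-132.4, 62.1)]   # (lon, lat) of stations
events = [(142.4, 38.3), (-70.8, -19.6)]      # (lon, lat) of earthquakes

m = Basemap(projection='robin', lon_0=-135, resolution='c')
m.drawcoastlines(linewidth=0.3)
m.fillcontinents(color='0.9')

for slon, slat in stations:
    for elon, elat in events:
        # drawgreatcircle handles the projection for each source-receiver path
        m.drawgreatcircle(elon, elat, slon, slat, linewidth=0.4, color='b')
    m.plot(*m(slon, slat), marker='^', color='r')  # station marker

plt.title('Station-event great-circle coverage')
plt.savefig('azimuthal_coverage.png', dpi=200)
```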
This week I began trying to understand and compile all of the two plane wave Fortran code. Both of these processes were difficult, with weird environment variable problems and the challenge of learning another new (and very unintuitive) language. I also began writing Python code for creating a grid-based crustal and upper mantle model incorporating data from multiple sources.

I also had my girlfriend visit for five days. We went to the Denver Zoo (I listened to the AAAS webinar while walking around), went on two great hikes, visited her potential future vet school, explored Old Town, and saw a drive-in movie.

Something really new to me on this project is my inability to be independent. The first few steps of the inversion code to generate predicted phase velocities jump through over 20 different Fortran programs, each with extremely picky input formatting requirements (some of which are so old they still have input lines for punch cards). With my limited ability in Fortran, I simply can't debug the problems that arise without investing unrealistic amounts of time. My mentor has been a huge help through this part of the process. I unfortunately don't have pictures of my work environment to share, but CSU has a beautiful campus and my work environment is similar to what I have at UNR.
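Back to the crustal model for a second: the core of that script is just regridding several published datasets onto one common latitude/longitude grid. Here is a rough sketch of the idea (the file names, grid spacing, and region bounds are placeholders, not my actual inputs):

```python
# Rough sketch of the gridded crustal model: interpolate each dataset
# (topography, sediment thickness, Moho depth) onto one common lat/lon grid.
# File names and the 0.5-degree spacing are placeholders.
import numpy as np
from scipy.interpolate import griddata

# Common grid over an example study region
lons = np.arange(-145.0, -122.0, 0.5)
lats = np.arange(58.0, 68.0, 0.5)
glon, glat = np.meshgrid(lons, lats)

def regrid(fname):
    """Read lon, lat, value columns and interpolate onto the common grid."""
    lon, lat, val = np.loadtxt(fname, unpack=True)
    return griddata((lon, lat), val, (glon, glat), method='linear')

topo = regrid('topography.xyz')         # elevation / water depth
sed = regrid('sediment_thickness.xyz')
moho = regrid('moho_depth.xyz')

# Stack the interfaces into one array, one layer per grid node
model = np.dstack([topo, sed, moho])
np.save('crustal_model_grid.npy', model)
```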
This week was initially spent using my cutting and bandpassing code to create invertible data for use in the Fortran two plane wave code from Donald Forsyth. Putting it into practice was enjoyable because I made exactly the mistakes I had prepared my code to handle, and it handled them well. I now have 244 usable events, each with 5-107 traces, and each trace has been copied and bandpassed at 15 different frequencies. I then headed back down to good old Socorro for deployment training specific to our Yukon deployment. This was invaluable, as I will likely be part of the technical team for the deployment.
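The bandpassing step itself is conceptually simple once the windows are cut; it boils down to a loop like this (a stripped-down sketch with made-up center periods and file names, not my actual parameter list):

```python
# Stripped-down version of the multi-band filtering step: copy each cut trace
# and bandpass it around a set of center periods. Values here are placeholders.
from obspy import read

center_periods = [20, 25, 32, 40, 50, 60, 80, 100]   # seconds (example values)

st = read('event_windowed.sac')   # already-cut Rayleigh wave windows
for tr in st:
    for T in center_periods:
        band = tr.copy()          # keep the original trace untouched
        band.filter('bandpass',
                    freqmin=1.0 / (T * 1.1),
                    freqmax=1.0 / (T * 0.9),
                    corners=4, zerophase=True)
        band.write('%s_%03ds.sac' % (tr.id, T), format='SAC')
```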
The elevator speech wasn't very challenging, but I can definitely see it being very valuable. At my sister's wedding following orientation, I had to interact with a bunch of non-scientist family members whom I hadn't seen in 15 years and who were very interested in what I was doing with my life. While I could explain the general fields I might pursue in my career and why seismology is useful and interesting, I could not effectively explain on the spot what I was spending my entire summer researching. Putting research into simple terms that highlight its goals and connect those goals with understandable problems is valuable for explanations both inside and outside your field, especially if your work is highly specialized. My speech (at 60 seconds) would need to be tailored for different crowds: simpler for non-scientists and more technical for scientists. These changes are as simple as a few small modifications or omissions, though.
The NRAO guesthouse at New Mexico Tech. A slight upgrade from Baca Hall!
In addition to writing multiple shell scripts for further data manipulation, this week was spent making my codes for cross-correlation and bandpassing more flexible and functional. My second data collection resulted in over 1,800 events and nearly 60,000 individual traces of several minutes of data spanning 1990-2016. Because of low station density in the Yukon, I wrote a script to remove events recorded by fewer than 5 stations (barely enough to get worthwhile tomographic information). One day of using my visual filtering code brought the number down to 507 events and showed me the lack of coherence and abundance of noise in many events. Afterwards, I used my correlation coefficient code and looked at multiple "good events" to find a realistic threshold value, which I then applied to all of my data. This brought the number down to 244 events and fewer than ten thousand traces. All of my data currently comes from the IRIS DMC, collected through Standing Order for Data (SOD), though I hope to include data from the CNSN and a couple of other groups in my final AGU product.
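For a rough idea of what these filters look like, here is a minimal sketch (the stack-based reference trace, the 0.90 threshold, and the directory layout are assumptions for illustration, not my actual setup):

```python
# Sketch of the event-level quality filters: drop events recorded by too few
# stations, then flag traces that correlate poorly with a reference stack.
# Directory layout and the 0.90 threshold are illustrative only.
import glob
import numpy as np
from obspy import read

MIN_STATIONS = 5
CC_THRESHOLD = 0.90

for event_dir in glob.glob('events/*'):
    st = read(event_dir + '/*.sac')
    if len(st) < MIN_STATIONS:
        continue                      # not enough stations for useful tomography

    # Trim to a common length and build a simple stack as the reference
    npts = min(tr.stats.npts for tr in st)
    data = np.array([tr.data[:npts] for tr in st])
    stack = data.mean(axis=0)

    keep = []
    for tr, row in zip(st, data):
        cc = np.corrcoef(row, stack)[0, 1]
        if cc >= CC_THRESHOLD:
            keep.append(tr)
    print(event_dir, 'kept', len(keep), 'of', len(st), 'traces')
```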
After all of this work and manipulation, multiple new potential datasets were brought to my attention, and I realized (this already being my second full collection) that at some point a decision has to be made to push forward with what you have. If I were to restart the process every time a new piece of data became available, there is no way I could have meaningful results before my deadline.
The skill I'd like to develop further is understanding the next step in a research project, and I've figured out that this can take so many different forms. When one path is blocked, progress needs to be made somewhere else, and often this requires looking far ahead or even behind. This was hard for me to figure out and frustrating at first, but there is always something meaningful to be done. Luckily, coding has let me produce meaningful work even when multiple roadblocks have been hit (waiting for others' code, internal network problems, waiting on decisions from others, etc.). But often progress can be made through things like making logistical decisions, helping others whose work influences yours, or even taking a break to make your mind functional again.
During my free time this week I enjoyed the breweries of Fort Collins, talked with some old friends, and went on a couple of great hikes with my roommate and Turtle the incredible bouldering mountain-dog. If you come to Fort Collins, definitely check out Horsetooth Rock. It will give you a new perspective on exactly how flat most of Colorado really is!
I started at CSU in mid-May taking it easy by learning the basics of Linux and reading up on the theory behind the dual plane wave method of surface wave tomography. In addition to a few meetings about the logistical nightmare of a large seismic deployment in the Yukon, going over introductory labs and reading papers gave me a good idea of what I was getting myself into for the summer. With the second week, though, the smoothness of the first week became history.
Network ID setup, disk quota, root permission, and SOD recipe problems took up the bulk of my time for most of the week. But the problems really just ended up accelerating my learning of Linux by forcing me to learn clever workarounds through shell scripts and extensive command line use. By the end of the week, though, I had a really good idea of the common problems that arise when getting seismic data and manipulating it for the true processing. I wrote multiple shell scripts and learned the basics of SAC macros for downsampling, writing, and file manipulation. I then began the task of looking at hundreds of sets of squiggly lines to identify good vertical-component Rayleigh wave arrivals, only to be sent off to Rocky Mountain National Park for a seismic deployment before orientation, in the midst of the roughly 1,800 events I was dealing with.
After returning this week, we decided to do a recollection of data and added a few components to the recipe, which unfortunately led to more network problems when we tried to update SOD. While these problems were fixed internally, I got ObsPy set up on my machine and began teaching myself its unique quirks. As an example, it took me about an hour to find out that what I was trying to access in a dictionary (distance to source) was actually in another dictionary nested within that dictionary.
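To give a concrete picture of that quirk: for SAC files, header values like the source-receiver distance live in their own dictionary nested inside each trace's stats, assuming the header was filled in when the file was written (the file name below is made up):

```python
# The quirk in question: tr.stats is a dictionary-like object, and SAC header
# fields live in a second dictionary nested inside it. File name is made up.
from obspy import read

tr = read('example_trace.sac')[0]

print(tr.stats.station)      # top-level metadata, e.g. the station code
# print(tr.stats.dist)       # AttributeError: the distance is not here...
print(tr.stats.sac.dist)     # ...it hides in the nested SAC header dict (km)
print(tr.stats.sac.gcarc)    # great-circle distance in degrees, same story
```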
After getting used to its strange setup and structure, I wrote a script for linearly interpolating between arrival times from a single user input, cutting the Rayleigh waves, and then passing the newly windowed arrivals through multiple bandpass filters. This was jumping the gun (it is actually one of the last steps before passing the data through an inversion program for the final product we are looking for), but it was a great introduction to Python for seismology. After resorting to using the old version of SOD, I am now back to the step of looking at thousands upon thousands of squiggly lines to find good data, using a script I wrote to make the process faster. It's still very slow.
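The windowing part is the only piece with real logic: from one pair of picked arrival times, it interpolates a predicted Rayleigh arrival for every other distance and trims around it before filtering. Here is a simplified sketch (the picks, window lengths, and file name are placeholders, and it assumes the traces start at the event origin time):

```python
# Simplified sketch of the windowing step: pick arrival times at two distances
# once, linearly interpolate a predicted arrival for every other trace from its
# distance, and trim around that prediction. All values are placeholders.
from obspy import read

st = read('event_traces.sac')

# One-time user input: arrival times (s after origin) at two reference distances (km)
d1, t1 = 2000.0, 560.0
d2, t2 = 6000.0, 1680.0
slope = (t2 - t1) / (d2 - d1)          # roughly 1 / Rayleigh group velocity

PRE, POST = 100.0, 600.0               # seconds before/after the predicted arrival

for tr in st:
    dist = tr.stats.sac.dist           # epicentral distance (km) from the SAC header
    t_pred = t1 + slope * (dist - d1)  # linear interpolation of the arrival time
    origin = tr.stats.starttime        # assuming traces start at the event origin
    tr.trim(origin + t_pred - PRE, origin + t_pred + POST)
```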
Four seismic stations, several miles, a quick hail storm, close wildlife encounters, and lots of deep snow to trudge through. Also a lot of fun, and the prime reason why I walked like an old man for the first two days of orientation.
Goals for Summer:
1st third: Become better at scripting, especially debugging, and become quicker and more accurate at picking good data for inversion. Prepare thousands of seismograms for phase velocity and anisotropy analysis.
2nd third: Invert the data, find the inevitable data which screws up the inversion, and rerun the inversion. Learn to truly understand seismic characteristics and what they mean in the context of complex tectonics and mantle interactions. Write the AGU abstract, keeping it broad enough to accommodate future findings.
3rd 3rd: Explore the Yukon.
4th 3rd?: Play more with the data while staying in contact with Derek and Rick during school to see if we can draw anything more useful from it before AGU. If the data turns out to be interesting and we can draw some cool conclusions, hopefully turn the work into a paper.