Friday, May 13, 2016

Activity 11: Navigation with GPS



Introduction:
Figure 1: Study area at the Priory in Eau Claire, WI

This is a follow-up to a previous field activity in which the class created navigation maps. The class was split into groups of two, and each group member created two navigation maps. The better map was then chosen and used for this activity by comparing the amount of clutter on each map and the coordinate system used. The best map had minimal clutter and used a UTM coordinate system. The UTM map was chosen over the decimal degree map because it is far easier and more accurate to navigate using a linear unit such as feet or meters rather than decimal degrees. The objective of this lab was to navigate through the Priory in Eau Claire, Wisconsin (Figure 1) using the navigation map and an Etrex GPS unit (Figure 2).
Figure 2: Etrex GPS unit


Methods:
Figure 3: Coordinates of marked locations

Each group had five locations that had previously been marked with pink ribbons labeled with a site number by Dr. Hupy (Figure 3). The GPS unit was used to help navigate through the study area, to log our path while navigating, and to mark the locations of the course. As Dr. Hupy passed out the paper maps, it was hard not to notice some errors and mistakes on the part of students. Many of the groups compared maps and noticed they were in different coordinate systems. The group's map also had the latitude numbers cut off along the side, so they had to be filled in with the help of other groups' maps.

Next, the groups had to mark the five locations on the map. This was done by plotting the coordinates given for each site (Figure 3). The plotted points were a bit off from their true locations on the map, so the GPS unit helped find the true locations in the field.

The final step was to go out into the field and navigate to the points by using the given coordinate systems, paper map, and GPS unit (Figure 4).
Figure 4: The track log from the GPS unit while navigating the Priory

Discussion:

I noticed it was difficult at times to navigate using the GPS. The group had to backtrack a few times because it was sometimes hard to get our bearings. It was also difficult to track both the northing and the easting while walking. The terrain varied greatly, from one area with thick underbrush to another with easily accessible deer paths or a large mowed path, and one area that was an open pine tree farm.

Activity 10: Processing UAS Data in Pix4D

Introduction:

This activity is in preparation for an Unmanned Aerial Systems (UAS) lab at the end of the semester. Dr. Hupy gave the class two separate folders (baseball field and track field) containing UAS images within the city of Eau Claire, Wisconsin taken by a previous class. These images would be used in Pix4D to create a georeferenced mosaic of imagery.

Description:


  • What is the overlap needed for Pix4D to process imagery? 
  • What if the user is flying over sand/snow, or uniform fields?
Pix4D is a program that processes images to create a Digital Elevation Model (DEM) as well as a high-resolution 3D model of the terrain. It works by automatically finding thousands of common points between images; the more common points there are, the more accurately the 3D points can be computed. So the main rule is to maintain a high overlap between the images, meaning at least 75% frontal and 60% side overlap in general cases. There are special precautions to take when imaging certain features. Snow and sand, for example, have little visual content due to large uniform areas, so it is important to use a higher overlap of at least 85% frontal and at least 70% side overlap. It is also important to set the exposure settings accordingly to get as much contrast as possible in each image.
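As a rough sketch of how those overlap percentages translate into flight planning, the snippet below computes the spacing between exposures and between flight lines; the image footprint numbers are made up for illustration.

```python
# Hypothetical ground footprint of one image at the chosen flight altitude.
footprint_along = 100.0   # meters covered along the flight direction
footprint_across = 75.0   # meters covered across the flight direction

frontal_overlap = 0.75    # general-case recommendation (use 0.85 over snow/sand)
side_overlap = 0.60       # general-case recommendation (use 0.70 over snow/sand)

# Spacing that achieves the requested overlap.
photo_spacing = footprint_along * (1 - frontal_overlap)   # 25 m between exposures
line_spacing = footprint_across * (1 - side_overlap)      # 30 m between flight lines

print(photo_spacing, line_spacing)
```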

  • What is Rapid Check?
There are a couple of processing templates that can be used. Full processing provides the best resolution possible, but it takes a significant amount of time, so this method is best used back in the office. Rapid Check reduces the resolution of the original images, which lowers accuracy, but it is significantly faster than full processing, so it is recommended for use while still in the field.

  • Can Pix4D process multiple flights? What does the pilot need to maintain if so?
Yes, it is possible to process each flight separately and then merge the several subprojects. The pilot needs to keep the same horizontal and vertical coordinate system for each flight. The GCPs also have to be in the same horizontal and vertical coordinate systems. And the horizontal and vertical coordinate systems for the output must be the same. It is also important to fly at the same altitude for each flight to keep the resolutions as similar as possible.

  • Can Pix4D process oblique images? What type of data do you need if so?
Yes, Pix4D can process oblique images. The user would need to know at what angle off nadir the images were collected.

  • Are GCPs necessary for Pix4D? When are they highly recommended?
They are not necessary, but they are highly recommended when processing a project with no image geolocation. GCPs give scale, orientation, and absolute position information. 

  • What is the quality report?
It provides all of the metadata behind the processing, such as the amount of overlap between images, the camera used, the coordinate systems,  the image positions, etc. 

Methods:

Figure 2: Selecting  the type of output image
Figure 1: Adding images to Pix4D
Pix4DMapper Pro was opened to a new project, and a setup window appeared. Every image in the track field folder was added (Figure 1). The project was saved to a working folder, and from there a 3D model map view was chosen (Figure 2).




Figure 3: The initial window of the data in mapview
After the parameters were set in the setup menu, a view appeared showing the locations where the images were tied down to the earth (Figure 3). From here, the processing tab on the bottom left was clicked and a processing window (Figure 4) appeared. This step mosaics the images based on points, vertices, and geotags associated with each image. Three processes take place to produce the desired output. The first is the initial processing, a preliminary step to check whether the images can provide an accurate mosaicked output. The next is the point cloud, which provides a 3D look at the objects in the images. The final step is the DSM and orthomosaic. Each of these outputs was saved to the designated working folder, and once the initial processing finished, a quality report was created to help show how well the images overlapped, along with a variety of other important information (see figures below).
Figure 4: Processing Window









Quality report images


Sunday, April 24, 2016

Activity 9: Topographic Survey with a Total Station



Introduction:

This week was more of a comparative study between the topographic survey done with the distance/azimuth technique two weeks prior (lower grade technology) and a survey-grade total station (higher grade technology). The class was broken up into groups of two in order to become familiar with using a total station, together with a GPS unit, to collect various points with attached elevation data near Little Niagara Creek by Phillips Hall (Figure 1). This data would then be used to create a digital elevation model, or DEM, of the study area.
Figure 1: Study area of topographic survey with Total Station

Methods:

Equipment

Figure 2: TopCon Total Station
The setup included a MiFi portable hotspot and TopCon Tesla on a tripod as well as a TopCon Total Station (Figure 2) which was also situated on a separate tripod stand. There was also a Prism which is held by a user.

Procedure

The class was broken down into groups of two with additional help from Dr. Hupy for this activity, with one group of three. The total station is used most effectively and efficiently with the help of three people: one to shoot the Total Station at the Prism, one to hold the Prism over an area to collect a point, and one to collect the points on the Tesla unit.

Prior to the activity, a couple of locations were selected as backsights and were marked with orange flags. The orientation angle is then calculated from the coordinates of the total station and those of the backsights by measuring the angle and the distance between them with the help of the stadia rod.
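For what it's worth, the orientation azimuth from the occupied point to a backsight can be computed directly from their coordinates; the sketch below uses made-up UTM coordinates just to show the arithmetic.

```python
import math

# Hypothetical UTM coordinates (meters) for the occupied point and one backsight flag.
station_e, station_n = 618250.0, 4957100.0
backsight_e, backsight_n = 618310.0, 4957180.0

# Azimuth is measured clockwise from grid north, so the easting difference goes in
# the "x" slot of atan2 and the northing difference in the "y" slot.
d_e = backsight_e - station_e
d_n = backsight_n - station_n
azimuth = math.degrees(math.atan2(d_e, d_n)) % 360.0
distance = math.hypot(d_e, d_n)

print("azimuth: %.2f degrees, distance: %.2f m" % (azimuth, distance))
```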

Next, the total station was leveled on the tripod stand. This was done by swiveling the total station in three directions. When facing a given direction, a circular knob at the base was twisted until the unit was level. So there were three directions and three knobs in total. Only one knob was twisted each time the unit was redirected as to avoid interference with previous levelings.

The Total Station must remain in a single location during the entire survey in order to avoid data discrepancies. This is known as a static, or occupied, point. Sixty points are collected and averaged to provide a highly accurate position of the Total Station. The height of the station and the height of the stadia rod's prism above the ground must be measured and recorded on the GPS before beginning. A surveyor can also change the height of the stadia rod's prism in the middle of field work, but the new height must then be recorded. An example of needing to change the prism's height would be when there is a drastic drop in elevation and the prism cannot be seen from the total station; the stadia rod would then be raised so the user at the total station is able to see the prism.

When ready to record a point, one group member would walk into the area being surveyed with the stadia rod. At each point, the prism on top of the stadia rod must face directly towards the total station. Next, the rod must be leveled using the plumb line on the stadia rod. After those two criteria are met, the group member working the total station must focus the lens over the center of the prism. The top of the total station is on a swivel and can be turned right-left and up-down, and it has a coarse magnifier and a fine magnifier in order to focus on the prism. Once the total station has been centered on the prism, the third group member uses the GPS and records the point. Each group member took turns using the prism, total station, and GPS in order to get a feel for the different parts of the process of collecting topographic points.
Figure 3: Portion of the normalized notepad text file

After each group had a turn in the field and all of the points had been taken, Dr. Hupy sent the data to the class as a Notepad text file (Figure 3). The text file could be directly imported into ArcMap, where it was converted into XY data to visualize the points on a map. Kriging, Natural Neighbor, Spline, IDW, and TIN tools were run on all of the points to see which model seemed to represent the data most accurately. The Kriging model seemed to be the best fit (Figure 4).
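The same import-and-interpolate workflow could also be scripted rather than done through the ArcMap menus. This is only a sketch: the file name, field names, workspace path, and cell size are assumptions, not the actual class file.

```python
import arcpy
from arcpy.sa import Kriging, KrigingModelOrdinary

arcpy.env.workspace = r"C:\fieldmethods\activity9"   # hypothetical folder
arcpy.CheckOutExtension("Spatial")

# Turn the normalized text file into point features (field names assumed).
arcpy.MakeXYEventLayer_management("total_station_points.txt", "Easting", "Northing",
                                  "pts_lyr", arcpy.SpatialReference(26915),  # NAD83 / UTM 15N
                                  "Elev")
arcpy.CopyFeatures_management("pts_lyr", "station_pts.shp")

# Interpolate a continuous elevation surface from the surveyed points.
surface = Kriging("station_pts.shp", "Elev", KrigingModelOrdinary("SPHERICAL"), 0.5)
surface.save("kriging_dem.tif")
```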

Figure 4: Kriging model of topographic points

Discussion: 

This activity showed a rather stark contrast with the distance/azimuth surveying technique. This surveying method is efficient, easy to use, and very accurate. It is a pain, however, that a surveyor can only take data for a single point at a time and must walk to each of those points and level and position the prism. It would also be a pain to have to pick up and move the Total Station to another area when surveying, although I am not sure how common that problem is. Still, it was really interesting to learn how to use this instrument.

Saturday, April 16, 2016

Activity 8: Dual-Frequency GPS Unit

Introduction:

This lab was aimed at familiarizing students with surveying various objects using a high-precision GPS unit. Topographic surveying can be done in many different ways, as is evident from the difference between this surveying technique and the distance/azimuth technique conducted last week. There were seven groups of two students gathering topographic point data for roughly five features per group. I personally surveyed an emergency-call telephone, a tree, a fire hydrant, a light post, and two signs. There was one extra attribute recorded, which was the diameter of the tree that was surveyed.

Study Area:

  • Date: April 12, 2016
  • Location: University of Wisconsin-Eau Claire behind the Davies and Phillips buildings (Figure 1)
  • Conditions: Cloudy with some wind; temperature of 44 degrees Fahrenheit
Figure 1: Study area using the Dual-Frequency GPS unit


Methods:

Equipment
There were four components used for surveying with the Dual-Frequency GPS unit (Figure 2). The first is the TopCon HiPer S4, the GPS receiver attached to the top of the unit. Next is the TopCon Tesla, a screen monitor that creates files and records the data as it is taken. A UTM Zone 15N projected coordinate system was used, with the features recorded in meters. A MiFi portable hotspot provides a personal wifi connection wherever the unit goes. And lastly, a tripod stand kept the unit stable and allowed the other three components to be attached as a single unit.

The class was broken down into groups of two, and each group then took turns collecting point features with the GPS unit. At each location a feature was recorded: the northing and easting were recorded with the TopCon HiPer S4, as well as the elevation above sea level. Each time a feature or object was selected to be recorded, that feature first had to be selected in a drop-down menu previously created on the GPS unit itself. At each location where a feature needed to be recorded, the tripod was simply leveled and positioned as close to the feature as possible to ensure reliable positional accuracy. Once the user is ready to collect the data, the "collect" button is pressed on the TopCon Tesla screen monitor. The TopCon is extremely precise, being able to tie a feature down to within millimeters of the unit's actual location. The GPS unit collects and averages roughly 20 points after this button is pressed, so it is important not to move the unit during this process. There is also a more accurate method of collecting data, which simply collects a minimum of 60 points. However, for a project like this it is perfectly fine to use the quicker method that does not take quite so many points, because the project does not call for highly accurate or precise surveying; it is only a teaching tutorial. The Dual-Frequency GPS unit does not need to be used only to gather data for discrete features. It could also be used simply to take elevation and location data across a sloping study area, for example.

Figure 3: Normalized text file 
Next, the data was exported as a text file into a class folder. The text file had already been normalized so it could be transferred directly to ArcMap. To normalize a table exported from this GPS unit, the headings need to be formatted; the final normalized product the class received had the headings Name, Northing, Easting, Elev, Ellipsoid, Codes, and Shape (Figure 3). The data was then imported into ArcMap, where it was displayed as points by clicking the 'Display XY Data' button. A topographic basemap was added to provide a relative backdrop for the locations of the features.
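If an export ever came back without the expected headings, a small script could rewrite them before bringing the file into ArcMap. This is a hypothetical sketch; it assumes a comma-delimited export and simply swaps in the headings listed above.

```python
import csv

raw_path = "hiper_s4_export.txt"          # hypothetical raw export from the Tesla
out_path = "features_normalized.txt"
headings = ["Name", "Northing", "Easting", "Elev", "Ellipsoid", "Codes"]

with open(raw_path, newline="") as src, open(out_path, "w", newline="") as dst:
    reader = csv.reader(src)
    writer = csv.writer(dst)
    next(reader)               # drop the instrument-specific heading row
    writer.writerow(headings)  # write the headings ArcMap expects
    for row in reader:
        writer.writerow(row[:len(headings)])
```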

Results:

As a class, there were a total of 33 feature points collected: 14 trees, 11 lamp posts, 2 fire hydrants, 2 campus signs, 2 garbage bins, 1 emergency telephone, and 1 mailbox (Figure 4).
Figure 4: The final map created from the point features data

Discussion:

This method of collecting topographic data with a Dual-Frequency GPS unit seemed to be rather efficient and highly accurate. However, when personally collecting data in the field there were a couple of problems. Right behind Phillips Science Hall, the GPS unit was not able to gather points effectively. It was a 'dead zone' where the unit could not get a fix because of radio interference, which could be caused by electrical wires. This problem can be bypassed simply by changing the settings so the unit does not collect data at quite so high an accuracy. And if worst comes to worst, the user can simply turn off the unit, and it should be able to get a fix after it is turned back on. It also appeared that the tree diameter data were not transferred into the table, and I wonder why that happened. It would have been interesting to look at the differences in tree diameters and be able to show that on the map.

Monday, April 11, 2016

Activity 7: Distance/Azimuth Survey

Introduction

This week the focus turned to learning how to conduct field work without relying on technology. Technology is generally very useful and helps speed up the process of performing field work, but it might not always be reliable or available. So it is important to learn techniques for data gathering for when technology is not practical or when it fails. Technology could fail for a number of reasons, such as extreme weather conditions, a device freezing up, running out of battery, etc. Access to a given study area may only be permitted for a short period of time, so it is important to be aware of different survey techniques that could be used if technology were to fail. For this field activity, two different types of rangefinders were used to map out various trees on campus. The first surveying method required two separate instruments to find the distance and the azimuth: a rangefinder (figure 1) and a compass (figure 2). The second surveying method utilized an instrument that could measure both the distance and the azimuth (figure 3).
figure 1: This image shows a Vector Optics laser rangefinder device that was used in the field to collect distance data. This device requires two users where one user holds a device at a desired location and the second user points the unit at the window of the device to determine the horizontal distance between the two devices. 

figure 2: Here is a Suunto compass which was used to find the azimuth by looking through the hole and pointing it at a desired object. 

figure 3: The TruPulse laser shown above was used to collect both the horizontal distance and the azimuth in the field. The user points the unit at a desired object through the eye piece and then fires a laser in order to acquire the corresponding data. This unit is handy because a user is allowed to find both the horizontal distance and the azimuth by using a single device. On top of that, the TruPulse can also be used to determine the height of an object as well as other pertinent information.

Methods

As a class we went near the side of Phillips, an academic building on campus, to conduct the survey using the two methods stated above. A corner of the sidewalk was designated as the point where each of us would stand to collect the distance and azimuth data for the two devices, which was recorded as the x,y location. The class had gathered distance and azimuth data for 17 trees in total along the Little Niagara creek near the side of the building. The tree species and DBH (Diameter at Breast Height) were also recorded. The data for each tree was compiled in a table in each of our notebooks during this field activity, to get accustomed to not using any technology. This data was then transferred into an excel spreadsheet (figure 4).

figure 4: Excel table of data gathered in the field.



Before we could transfer the data into Excel, we had to convert our point of origin from degrees, minutes, and seconds into decimal degrees in order to properly and accurately represent the data in ArcMap. To do this, we divided the minutes by 60 (and any seconds by 3,600) and added the result to the whole degrees to get a precise decimal value. It was also important to classify the X and Y fields, as well as the other fields, as "numeric" in Excel rather than just "general" so the values could be properly represented in ArcMap.
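The conversion is simple enough to write as a small function; the coordinates below are hypothetical values near campus, with the western longitude made negative as noted in the discussion.

```python
def dms_to_dd(degrees, minutes, seconds=0.0):
    """Convert degrees/minutes/seconds to decimal degrees."""
    return degrees + minutes / 60.0 + seconds / 3600.0

lat = dms_to_dd(44, 47, 52)     # 44 deg 47' 52" N  -> ~44.7978
lon = -dms_to_dd(91, 30, 13)    # 91 deg 30' 13" W  -> ~-91.5036 (west is negative)
print(lat, lon)
```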

Once the Excel file was imported into ArcMap, the "Bearing Distance to Line" tool was used to display the distance and azimuth data from the table as lines radiating from the point of origin (figure 5).
figure 5: The end result from running the "bearing distance to line" tool on ArcMap.
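For reference, the same step can be scripted with arcpy; this is a sketch only, and the table name, field names, and paths are assumptions rather than the actual class files.

```python
import arcpy

arcpy.env.workspace = r"C:\fieldmethods\activity7"   # hypothetical folder

# Build lines from the origin point using the distance and azimuth columns
# (field names are assumed; ours came from the Excel table in figure 4).
arcpy.BearingDistanceToLine_table(
    "tree_survey.csv", "survey_lines.shp",
    x_field="X", y_field="Y",
    distance_field="Distance", distance_units="METERS",
    bearing_field="Azimuth", bearing_units="DEGREES",
    spatial_reference=arcpy.SpatialReference(4326))   # WGS84 decimal degrees
```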


The next step was to convert the data into points by using the "Feature Vertices to Points" tool. This tool places a point at the end of each line, which helps one visualize where each line ends, i.e. the surveyed tree location, on the map (figure 6).
figure 6: The end product from running the "feature vertices to points" tool.
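Continuing the same sketch, the endpoints can be extracted from those lines with a single call; the "END" option keeps only the far end of each line, which is where each tree sits.

```python
import arcpy

arcpy.env.workspace = r"C:\fieldmethods\activity7"   # hypothetical folder

# Place a point at the far end of each survey line (the tree locations).
arcpy.FeatureVerticesToPoints_management("survey_lines.shp", "tree_points.shp", "END")
```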


After the endpoints were added, a basemap could be used to show the accuracy of these surveying methods that did not require the use of technology (figure 7).
figure 7: The final result with the use of a basemap to show the accuracy of this method.


Discussion

This lab was very useful because it gave us a base of knowledge and understanding on ways to conduct field research if our technology were to fail. It was interesting to see how the horizontal distance was consistently different between the two units we used and to see how accuracy is the limiting factor for the survey. A survey such as the one our class conducted does not need to be very accurate, however. A surveyor mainly needs to gather a liberal amount of data points that provide a general understanding of a study area. For instance, a tree on a survey map does not need to be in the exact location that tree occupies in the real world; it just needs to be in the general vicinity so that one can interpret the study area. What is more important are the data and fields collected. The number of bird nests or the species of trees in the general location, for example, are more crucial than the exact location of those features.

To avoid discrepancies, though, it is important for a user or multiple users to follow the same data collection techniques. It would not be in one's best interest if a tall user held the rangefinder right against their chest while a shorter user held it lower to the ground and away from their chest. Uniform collection techniques help avoid as many discrepancies in the data as possible. Because the weather was not ideal, it is also important to consider the way a user takes field notes. A pencil is crucial in rainy or wet conditions, since pens tend to bleed and run when exposed to water, and a small field notebook with waterproof paper would also be useful to have. Being aware of simple considerations like these can save a lot of time and headache while working in the field. So as always, it is important to be well prepared for the task at hand before going out into the field.

As a class we had mixed up our X and Y data, so the data was not properly represented in ArcMap at first. In order to continue, I needed to switch the X and Y values accordingly and also add a negative sign to the X values based on the longitude of Eau Claire, since western longitudes are negative.

Conclusion

This lab was very informative and helpful. Being able to use this new method of locating and plotting points will be extremely beneficial in the future when technology lets me down. I looked online at the rangefinders and noticed that they were quite expensive, so this method could also be applied using a measuring tape and compass to find the distance and azimuth of features. The TruPulse, Vector Optics, and Suunto devices are more tools I am now familiar with, and the continued use of Excel and ArcMap is greatly helping me understand and become comfortable with the two platforms.


Saturday, March 19, 2016

Parcel Mapping Forum

I first attended the session from 12:45 to 1:45pm, where I learned about the work Jason Poser and Frank Conkling did on ‘Transforming a County Land Information Program – Starting with the Parcel Fabric’ for Buffalo County. I had heard of parcel mapping before but never knew quite what it meant, so it was nice to hear about a real-world project involving it. The two men had originally used CAD for parcel mapping, but they worked with employees who were not well trained and they did not receive enough funding, both of which made the mapping very difficult. They realized that if they continued using CAD they would keep running into a large number of problems, so they decided to switch over to the parcel fabric. They were able to map parcels in the parcel fabric without the need for the PLSS, which could allow for easy property changes. Over the course of eight years using CAD they were only able to map about 40% of the county, but after just a few months of using the parcel fabric they had mapped roughly 30%. The parcel fabric allows one to create a polygon, keep the metadata associated with that parcel, and then accurately move the parcel to its correct location at the end, which was interesting to learn. Using the parcel fabric for this project is a fairly manual process of unjoining and moving parcels without much math involved (i.e. RMSE), and they have an open data policy!!

I then learned about a project that involved mapping, or monumenting, corners. Barron County has its corners monumented with the help of tie sheets and is in the process of restoring all of the corners in the county. I didn’t quite learn what corners are used for or why they are important, but it did seem rather interesting and a really tedious process. The speaker talked about how many of the people he works with have been there for a while and that there is very little turnover, which would be nice because everybody would be well educated and I bet there would be some nice chemistry in place to help complete tasks more easily. It was interesting to see that many monuments have been trees in the past and have been replaced by ‘markers’ through the years. Many of these markers have been buried under roads, which I never would have guessed. He made it a point to build and keep good relations with law officers and the community, because they can either help with a project or cause a lot of pain and headache if you do not have a good relationship with them. It is also important to keep your firmware up to date, which makes sense, but I probably would not have thought about that on my own. And he also made it a point to note that redundancy is important!!!
Next, Brett Budrow talked about the PLSS and parcels of St. Croix County. He used a multi-purpose cadastre built with the help of the PLSS and other means, which helped to map the parcels. He mentioned that they used a bounty system, which I had never heard of until that presentation. It was interesting to learn that the original survey was done in 1847-1849 and they still go back to those surveys and have built upon them over the years, yet the first full-time surveyor was only hired in 1989. Again, he used tie sheets for monumentation just like the first speakers, and they tied the PLSS to parcel mapping in the late 90’s. He had to work with surveys that lacked geodetic control, so his first step was to geodetically ‘fix’ the section and quarter-quarter section corners. The department had a parcel conversion project in 2001, and continual maintenance and spatial improvement are needed for the tax parcels, so funding is very important for a project like this.


The last section of the forum involved a discussion about various aspects of parcel mapping. It was interesting to hear that the different groups have similar ideas but the way they presented those ideas had quite different implications behind them. Some of the important highlights I gleaned from the discussion included the fact that everyone should be on the same level of accuracy. The county level is the best place to determine needs for the county, which is a simple and common sense approach but it is invaluable to do. Parcels should be mapped according to their value, meaning that prioritization of parcels is key. Communication is very important to determine the user’s needs, educate the public about parcel mapping and the importance behind it, strengthen collaboration between parcel mapping professionals as well as collaboration between the professionals and the public, etc. Parcel mapping is always a work in progress and one needs to provide a product for the user but at the same time continue towards an ultimate end game. It is important to aim for perfect and complete, but to accept incremental improvements and local prioritization matters. It is crucial to mandate parcel mapping and mutual respect between WLIP and PLSS is beneficial. It is good to have a professional surveyor. Funding is very limited so educating the public as well as elected officials is very important in order to convince people to invest money into these efforts. PLSS is the foundation of determining property taxes. At the end of the discussion we were asked to answer a question. What is the most important step we can take to improve parcel mapping? There were quite a few different answers and I enjoyed hearing other people’s thoughts on the matter. I personally thought the most important step is to educate the public and government on this application and its importance. This in turn will help the projects receive more funding as well as to help facilitate collaboration and ultimately ease the process of parcel mapping and create a better product in the end.

Monday, February 29, 2016

Activity 4: Geodatabases, Attributes, and Domains

Introduction

In this week's assignment, I was to create a geodatabase that would in turn be deployed in ArcPad. This geodatabase will be used to create a microclimate map, which will then be part of an activity down the road. The geodatabase needs to contain various data for each point taken in the field. These attributes include: temperature, dew point, wind speed, wind direction (cardinal and azimuth), group number, relative humidity, and, most importantly, an attribute for note taking. It is important to create a good geodatabase with valuable domains before heading out into the field. Having domains allows for easier and quicker data collection and reduces the potential for incorrectly entering values for different attributes. This activity also provided practice using ArcPad to collect different data on the microclimate around campus and importing that data into ArcMap. It is important to properly create the geodatabase because an activity such as collecting microclimate data should take as little time as possible in order to avoid discrepancies in the data due to temporal variation. It is also important to practice good normalization to avoid problems and headaches when manipulating data back in the lab.

Part 1

The first step is to get ready to create a microclimate geodatabase. In order to do so, one needs to plan ahead and figure out which attributes are required or important before heading into the field. It is crucial to have the proper tools, and a Trimble Juno GPS unit (figure 1) is one of the most important tools to have for a project such as this. It allows a geodatabase to be uploaded to the mobile device so data can be collected and surveyed on the fly in a more or less mobile version of ArcMap. As mentioned before, it is important to plan ahead and figure out the different attributes that are important to collect in order to get a good understanding of different microclimates. For this activity, it is important to understand what a microclimate is: a small, restricted area whose climate (temperature, humidity, wind speed, etc.) differs from that of nearby areas.


figure 1: The Trimble Juno GPS unit that will be used for data collection later on. 

Part 2

Step 1 - Construction of a Geodatabase

This is probably one of the fastest steps of the whole process. I first went into ArcCatalog and created a new file geodatabase in a folder I had made. With that, my geodatabase was created (figure 2).

figure 2: The new geodatabase named 'mc_hagenjc.gdb' that will be used to collect and analyze the data from this project
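The same step can be done in one line of arcpy; the folder path here is a placeholder, and the geodatabase name matches the one in figure 2.

```python
import arcpy

# Create the empty file geodatabase (folder path is hypothetical).
arcpy.CreateFileGDB_management(r"C:\fieldmethods\activity4", "mc_hagenjc.gdb")
```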

Steps 2 & 3 - Development of Geodatabase Domains/Development of Domain Ranges

This step is very important because, as mentioned before, it helps save time and reduces the chance of user errors when inputting values. By looking at the data that will be collected, one can determine what the domains (a set of rules assigned to a given attribute) and their ranges should be (figure 3). For example, temperature was given a range of 15-60 degrees Fahrenheit, so values that lie outside this range will not be recorded. Coded values were also used for wind direction, so recording an N would imply that the wind direction was north and SW would imply southwest. It is good to consider the attributes you decide to collect as well as when and where they will be collected. It would be cumbersome and possibly problematic to set a temperature range of -50 to 150 for spring, because the temperature will almost certainly not come near -50 or 150; that is why I chose a range of 15-60 for the time of year I would be collecting data. When creating domains, it is important to include a description so it is clear exactly what the domain is, especially if you decide to abbreviate domain names. On top of that, the data types must be understood, for instance what implications short integer, long integer, float, and text have for the data being collected.

figure 3: Example of a domain name, description, field and domain type, and domain range
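As a sketch of how the range and coded-value domains in figure 3 could be built with arcpy (the domain names and the exact codes here are assumptions):

```python
import arcpy

gdb = r"C:\fieldmethods\activity4\mc_hagenjc.gdb"   # hypothetical path

# Range domain: temperatures outside 15-60 deg F are rejected in the field.
arcpy.CreateDomain_management(gdb, "TempF", "Air temperature (deg F)", "SHORT", "RANGE")
arcpy.SetValueForRangeDomain_management(gdb, "TempF", 15, 60)

# Coded-value domain for cardinal wind direction (N, NE, E, ...).
arcpy.CreateDomain_management(gdb, "WindDir", "Cardinal wind direction", "TEXT", "CODED")
for code in ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]:
    arcpy.AddCodedValueToDomain_management(gdb, "WindDir", code, code)
```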
Step 4 - Construction of a Feature Class for Later Deployment in ArcPad

After the domains have been created, it is time to create a feature class. A feature class is a collection of features that all have the same spatial representation, such as a point, line, or polygon, and a common set of attribute columns. For this activity, a point feature class was created with a UTM Zone 15 projection. It is important to note that only one feature class should be created; otherwise you would have to filter through numerous feature classes in the field for each attribute.
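A minimal sketch of the feature class step, assuming the geodatabase and domains from above (field names and lengths are placeholders):

```python
import arcpy

gdb = r"C:\fieldmethods\activity4\mc_hagenjc.gdb"   # hypothetical path
sr = arcpy.SpatialReference(26915)                  # NAD83 / UTM zone 15N

# One point feature class holds every microclimate observation.
fc = arcpy.CreateFeatureclass_management(gdb, "microclimate_pts", "POINT",
                                         spatial_reference=sr)

arcpy.AddField_management(fc, "Temp", "SHORT")
arcpy.AddField_management(fc, "WindDir", "TEXT", field_length=2)
arcpy.AddField_management(fc, "Notes", "TEXT", field_length=100)

# Attach the domains so ArcPad only offers valid values in the field.
arcpy.AssignDomainToField_management(fc, "Temp", "TempF")
arcpy.AssignDomainToField_management(fc, "WindDir", "WindDir")
```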

Step 5

I decided to add another step by importing an aerial basemap of the study area. The basemap should be zoomed in to the desired area to avoid pixelating the image as well as to prevent the Trimble from taking a long time to buffer the image. I figured having a basemap would help with navigating around campus and with making sure I was in an accurate location on the map compared to the field.

Discussion

It can be easy to create a geodatabase if you know what you are doing, but it appears you can potentially run into a plethora of problems, such as using a short integer when a long integer is required or not properly setting a domain range. These types of errors can easily be avoided by planning ahead for the project and determining best practices ahead of time. On top of that, human errors can also occur in the field, which is one reason to set domains to combat those user errors. It can be hard to remember to use best practices that make data interpretation easy not only for yourself, but also for other people who might use the data you have collected. So I have been trying to keep in mind to write out detailed descriptions and to make sure that another person could understand my data and use it without running into problems that could easily have been avoided if I were more careful and conscientious in pre-planning.

Saturday, February 20, 2016

Activity 3: Development of a Field Navigation Map

Introduction

The purpose of this lab was to become familiar with producing an aesthetically pleasing and practical map which will later be used to navigate a course set up at the end of April. The course is located at the Priory, a parcel of land owned by the University of Wisconsin - Eau Claire on the outskirts of the city. The map needed enough information for navigation purposes, but at the same time it could not be so cluttered or contain so much information that navigating would become difficult or cumbersome. One of the objectives of this lab was to create two different navigation maps to learn first hand which type of map is easier to use. The first map is based on the Universal Transverse Mercator (UTM) Zone 15 North coordinate system (figure 2), and the second is based on a Geographic Coordinate System (GCS) (figure 3).

Methods

There were two activities for this week's lab assignment. The first was to determine our personal pace counts. To do this, we counted the number of steps we took with our right foot across a distance of 100 meters. My pace count was 59 and this information will be used in the actual activity of navigating through the study area.
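Put another way, 100 m divided by 59 paces works out to roughly 1.7 m per right-foot pace, so a hypothetical 250 m leg of the course would take about 250 / 1.7 ≈ 147 paces.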

The second activity was to design two navigation maps; one with a 50 meter UTM grid and the other with geographic coordinates in decimal degrees. These were the only requirements given, so the map design was entirely up to our own discretion.

I began by creating the UTM map. To design the map, I imported a basemap containing background aerial imagery provided by Dr. Hupy. I then pinpointed the location of the Priory and fit it to the data frame. Next, I changed the coordinate system to NAD_1983_UTM_Zone_15N and added a UTM grid set at 50 meters. After the grid was set, I looked at some data provided by our professor. There was a 2-foot contour interval and a 5-meter contour interval; I initially tested the 2 ft interval and noticed there was too much clutter to be able to use the map successfully, so I decided on the 5 m interval. I played around with colors, grid number sizes, etc. in order to minimize the clutter while still displaying the map properly. A DEM was also provided by the professor, however I felt more comfortable using an aerial image for navigation rather than a DEM. The aerial image seemed too bold, so I experimented with the transparency, contrast, and brightness and found a happy medium using just a 40% transparency. I also played around with how much area to show outside of the Priory and how to position my legend, and decided on leaving little to no extra space around the Priory and putting the legend outside the map instead of overlapping it. After this I noticed I had no way of knowing the elevation of the contour lines, so I added labels based on the contour intervals to hopefully make the map easier to interpret.

I followed this same process to create a Geographic Coordinate System map, with a few slight adjustments to account for differences between GCS and UTM coordinates.


Figure 2: The Priory with a UTM 50 meter grid
Figure 3: The Priory with a geographic coordinate system measured in decimal degrees
Discussion

I was surprised at how much difference there was in the map's shape based on the coordinate system used. The UTM map has more of a square shape to it but also appears more 'skewed' or tilted in the data frame, whereas the GCS map is more of a rectangle. At the same time, the GCS map appears to be more distorted than the UTM map. Based on that observation, as well as the fact that UTM maps provide measurements in meters rather than degrees (which allows for easier ground distance calculations), the UTM map will most likely be easier to use and more reliable in the navigation exercise. However, Eau Claire sits right near the boundary of UTM zones 15 and 16, which means there will be somewhat more distortion than if it were located near the middle of a zone. Even with this issue, I still feel that the UTM map will be more reliable than the GCS map.

Conclusion

Both maps have their pros and cons, so it is important to weigh those positives and negatives and make an educated decision on which map to use for a given activity. It is also important to always keep in mind that more is not better when it comes to maps; oftentimes simpler maps are the better choice. This activity was really helpful and I learned quite a bit from it. I did not realize how many considerations there are in creating a map that will be used in the field, compared to a map that is merely a visualization tool with no real field use. It was interesting to learn how to create a grid system for a map; I feel that will be very useful in the future. One aspect I feel will be interesting to figure out is how to interpret our pace count with elevation changes, and how to translate that pace count, which carries both vertical and horizontal information, onto a map that merely shows horizontal information with contour lines.

Friday, February 12, 2016

Activity 2: Visualizing and Refining Our Terrain

Introduction

This was a continuation of the survey conducted from the previous week with a resurvey of our data to better and more accurately represent the features. Through the process of resurveying, it was crucial to normalize our data in order to minimize data redundancy through the organization of attributes and tables to ultimately better represent the study area. After data normalization was complete, the terrain needed to be visually represented through ArcMap and ArcScene with the use of various data interpolation methods including inverse distance weighted interpolation (IDW), natural neighbors, kriging, spline, and triangular irregular network (TIN).


Methods

The first step was to import the xyz data into Excel which would be the table I would import into ArcMap through ArcCatalog. This table is used to create a feature class based upon the measurements taken in the field. In order to import the table, the Excel file needed to be normalized to create a proper format that ArcMap would use (Fig. 1).

Fig. 1: The normalized excel table with the x, y, and z data that would in turn be uploaded to ArcMap
Fig. 2: This is the feature class that was created through the use of an excel file. This shows how each point is evenly distributed along an x and y axis, but it is also important to note that each point also contains a z (elevation) value
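The import could also be scripted; this sketch assumes the normalized sheet has X, Y, and Z columns and uses placeholder file and workspace names.

```python
import arcpy

arcpy.env.workspace = r"C:\fieldmethods\activity2\terrain.gdb"   # hypothetical geodatabase

# Bring the normalized x,y,z sheet in as a table, then display it as points
# that carry the elevation value.
arcpy.ExcelToTable_conversion(r"C:\fieldmethods\activity2\snow_terrain.xlsx",
                              "terrain_xyz", "Sheet1")
arcpy.MakeXYEventLayer_management("terrain_xyz", "X", "Y", "terrain_lyr", in_z_field="Z")
arcpy.CopyFeatures_management("terrain_lyr", "terrain_points")
```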

The next step was to use the various interpolation techniques to visually represent the surveyed terrain and ultimately decide the model that best represents the collected data.


Deterministic Interpolation Methods:
These methods calculate the smoothness of a surface based directly on the surrounding values or mathematical formulas. 

IDW:
Inverse distance weighted interpolation determines cell values using a weighted combination of sample points. This method more or less takes the values of nearby sample points and computes a weighted average for a given cell. With this method, sample values closer to the cell being estimated have more influence than values farther away.
Fig. 3: IDW Interpolation
Natural Neighbors:
Natural neighbors interpolation utilizes an algorithm that determines the closest set of sample points to a point of interest and applies weights to the closest points based on the proportion of the area to then calculate an output value. This method does not create trends and does not produce peaks, pits, valleys, etc. that are not already represented in the input data. This method provides a smooth surface everywhere except at locations of the input data. This method is known to work well with regularly and irregularly distributed data. 
Fig. 4: Natural Neighbors Interpolation

Spline:
Spline estimates values using mathematical functions to reduce the overall curvature of the surface which results in a smooth surface that passes through all data points. 
Fig. 5: Spline Interpolation

Geostatistical Interpolation Methods:
These methods use statistical models to find relationships between data points. They can create a prediction surface as well as provide a measure of the accuracy of the predicted surface.

Kriging:
Kriging explains the spatial variation and statistical relationships among measured points based on statistical models. It assumes that the distance or direction between sample points reflects a spatial correlation that can help explain surface variations, and it fits a mathematical function to the points within a specified neighborhood, which helps ensure the accuracy of the created surface.
Fig. 6: Kriging Interpolation

TIN:
A TIN is a vector-based digital means of representing surface morphology. It is constructed from a set of vertices connected by edges to form a network of non-overlapping triangles that represent a continuous surface. TINs work well with steep features such as ridgelines, where the TIN tends to produce a higher resolution.
Fig. 7: TIN
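For completeness, here is a sketch of running the four raster interpolators and the TIN from the point feature class; the paths, the Z field name, and the cell size are assumptions, and the Spatial Analyst and 3D Analyst extensions are required.

```python
import arcpy
from arcpy.sa import Idw, NaturalNeighbor, Spline, Kriging, KrigingModelOrdinary

arcpy.env.workspace = r"C:\fieldmethods\activity2\terrain.gdb"   # hypothetical geodatabase
arcpy.CheckOutExtension("Spatial")
arcpy.CheckOutExtension("3D")

pts, z, cell = "terrain_points", "Z", 2.5   # survey points, elevation field, cell size (cm)

Idw(pts, z, cell).save("idw_surface")
NaturalNeighbor(pts, z, cell).save("natneigh_surface")
Spline(pts, z, cell).save("spline_surface")
Kriging(pts, z, KrigingModelOrdinary("SPHERICAL"), cell).save("kriging_surface")

# A TIN is vector-based, so it comes from the 3D Analyst Create TIN tool and is
# written to a folder rather than the geodatabase.
arcpy.CreateTin_3d(r"C:\fieldmethods\activity2\terrain_tin", "",
                   "{0} {1} Mass_Points <None>".format(pts, z))
```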


These different methods each modeled the terrain in a unique way, some good and some bad. 


Discussion:

Many interpolation methods were learned through this assignment and, as with everything, each one has its pros and cons. The Kriging model appeared to be the least accurate representation of our landscape. It seemed to make too many generalizations and oversimplify the terrain: it cut down drastically on the hills and ridges while extending the valley quite a bit.

The IDW interpolation also did not give a desired result. The surface appeared blotchy and spiky, very unrealistic compared to our terrain to say the least. This model seemed to make more peaks or variations in elevation than the terrain had. 

Natural neighbors seemed like it modeled the terrain well at first glance, but on further observation the model seemed to oversimplify the terrain in certain areas.

The TIN looked to be a good match for the elevation; however, it does exaggerate parts of the terrain. If we had used a different sampling method, such as taking more sample points in areas with greater elevation change and fewer points in areas with little change, the TIN would likely have represented the surface better.

Spline (fig. 8) seemed to be the best fit for our data. This method is more detailed than the Kriging model and is also smoother than the IDW model. And although spline looked to work the best, it still did not represent the landscape well enough. The areas with the least accuracy were the ridge and the valley. These features could be a little better represented by taking more coordinate points in those areas. 
Fig. 8: Spline Interpolation for Resurveyed Terrain


We then went back to recreate the terrain as best we could to match the terrain from the previous activity to improve the accuracy of our output model. While we were recreating the terrain we decided it would be a good idea to smooth out the surfaces since we agreed we would use the spline interpolation method for our final output. On top of this, we added more sample points so the terrain would be displayed in greater detail.


Conclusion:

After working on this lab for two weeks I have learned many ways to collect geospatial data and saw what works well with different interpolation methods. It was nice to be able to go back over this project and fix mistakes we noticed. I often times want to go back over labs and learn from previous mistakes, so it was nice to actually be able to do that with my teammates. It was extremely interesting to be able to relate a lab to the real world. I was able to see how a plan did not go as expected through the initial run and learned how to glean potential solutions from encountered issues. It is important not to have a single sampling technique. One should change and alter a sampling technique to best represent different terrains. For instance, it is better to take more sample points in areas with greater changes in elevation and to take fewer points in areas without much change in elevation. This saves both time and money in the real world. It was a breath of fresh air to have to think for ourselves on how to best increase the accuracy of the study area. We needed to learn to use our time wisely, be patient, work well with others, be flexible, and learn to troubleshoot. It is of utmost importance to plan ahead; this may seem cumbersome in the beginning but it saves so much time and headache in the end. 

Tuesday, January 26, 2016

Activity 1: Creation of a Coordinate System & Digital Elevation Surface

Introduction

The first lab assignment required the creation of a Digital Elevation Model of various features created in a wooden flower box with the use of snow as the medium. Very little guidance was given in order to create an atmosphere of critical thinking and hands on learning through trial and error. To create an elevation model, certain parameters needed to be met, such as the creation of a ridge, valley, hill, depression, and plain in the snow, as well as determining what coordinate system to use. The use of the coordinate system allowed for the collection of precise X and Y locations that contained a Z (elevation) value within the study area, which would then be used to replicate the landscape features in a 3D model in the second part of this lab activity.






Methods

The first step was to determine at what point on the box we would institute sea level. The final consensus was to build up and down from the midway point of the box with the sea level being the top of the box which would in turn make all of the collected elevation points negative values. Next, the actual features (ridge, valley, hill, depression, and plain) needed to be constructed in the snow.

Once the features were created, a coordinate system was created using string that ran across 100cm lengthwise and widthwise of the box. The string created a square grid with an x and y point at every 10cm interval. The elevation was taken at each point where the strings intersected.


Each measurement (x,y,z) was recorded in a notebook and then transferred into excel (Table 1).


Y\X      10      20      30      40      50      60      70      80      90     100
100   15.00   14.00   15.00   15.00   14.00   13.00   14.00   13.00   12.00   10.00
 90   16.00   15.00   10.00   11.00   10.00    7.00    9.00    7.00   10.00    7.00
 80   17.00   15.00   20.00   23.00   22.00   22.00   15.00   14.00   19.00   15.00
 70    3.00   23.00    7.00   10.00   13.00   12.00   10.00   18.00   12.00    7.00
 60   12.00    5.00   13.00    5.00   13.00   12.00   15.00   12.00   11.00   12.00
 50   15.00   20.00   15.00    4.00   12.00   13.00   15.00   14.00   10.00   10.00
 40   15.00   15.00   14.00    3.00   11.00   12.00   13.00    7.00    3.00    1.00
 30   15.00   15.00   15.00    3.00   11.00   10.00   10.00    5.00    0.00    0.00
 20   15.00   15.00   13.00    2.00   12.00   10.00   10.00    5.00    0.00    0.00
 10   15.00   13.00   12.00    1.00    9.00   10.00    9.00    3.00    0.00    0.00

 Table 1. Elevation measurements (cm) taken across the study area; row labels are Y coordinates (cm) and column labels are X coordinates (cm)






Discussion

This field lab assignment required the use of critical thinking skills as well as group collaboration. The project went well overall. We were able to successfully construct our own coordinate system and determine where to place our zero values. We did run into a few roadblocks along the way, most notably deciding where our sea level (z-value of 0 cm) should be. Attempts were made at filling the box with snow and building up/down from the top of the box, but it was difficult to keep the string straight at the 10 cm increments over features that rose above the top of the box. We then tried removing half of the snow and making a level surface at the point where the two boards met (two boards were placed on top of each other to give the flower box its height), which was roughly 13.5 cm. We were originally going to make that halfway point our sea level but then decided to make the top of the box our sea level, with each value essentially being negative. It would not be difficult to later make the halfway point the sea level by adding 13.5 cm to each recorded z-value. We also had trouble deciding how large to make the grid squares. We ended up using 10 cm by 10 cm squares, which, looking back on the project, provided a consistent scale but not an accurate enough one. Using smaller increments would make a more detailed and accurate depiction of the study area.





Conclusion



This lab assignment did in fact help to develop a better understanding on how to create coordinate systems and elevation models. We were left to figure out how to lay out our study area and how to collect data, with very little guidance or background on the matter. This helped us strengthen our critical thinking skills, ability to work in groups, and ability to creatively problem solve to successfully complete a new task.