Monday, October 20, 2014

Data Gathering and Preparation for Frac Sand Mining Project

Goals and Objectives

The goal of this assignment was to become familiar with the process of downloading data from different sources on the internet. After downloading the data, the next step was to import it into ArcGIS, join the data, and project the data from the different online sources into one coordinate system. Once that was accomplished, the last task was to create and design a geodatabase to store the data.

General Methods

The first part of this lab consisted of downloading data from online sources. Before any downloading could be done, it was first important to create a temporary folder on the Q drive. This allowed large zip files to be stored only temporarily so they would not take up space on the server. From the temp folder, the zip files were extracted into a working folder where they could be manipulated and later placed into a geodatabase. Figure 1 below lists the sources the data was gathered from, as well as what was downloaded from each.
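The extract step of that workflow can be sketched with the Python standard library. This is only an illustration of the temp-folder-to-working-folder idea; the folder paths are placeholders, not the actual Q-drive paths used in the lab:

```python
import os
import zipfile

def extract_to_working(zip_path, working_dir):
    """Extract a downloaded zip archive into the working folder.

    The zip itself stays in the temp folder, so it can be deleted
    after extraction and not take up space on the server.
    """
    os.makedirs(working_dir, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(working_dir)
    # Return the extracted file names so the result can be checked
    return sorted(os.listdir(working_dir))
```

Once the archive is extracted, the copy in the temp folder can be removed and the working-folder files brought into ArcCatalog.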

Figure 1

When looking at the NRCS Soil Survey data alone, it isn't as beneficial as one would like it to be. To enhance its usability, the SSURGO tabular data had to be imported and joined with the soils data, along with the drainage index and productivity index. Some of this required Microsoft Access, ArcMap, and ArcCatalog. Lastly, railroad data from the DOT had to be brought in and clipped to the Trempealeau County border.
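Conceptually, the attribute join performed in ArcMap/Access is a key-based lookup: each soil polygon record picks up the tabular fields that share its map-unit key. A minimal pure-Python sketch of that idea (the `mukey` field is the real SSURGO map-unit key, but the sample field names and values below are invented for illustration):

```python
def join_tables(spatial_rows, attribute_rows, key):
    """Join attribute records onto spatial records by a shared key,
    mimicking the tabular join done on the SSURGO data."""
    # Build a lookup from key value to its attribute record
    lookup = {row[key]: row for row in attribute_rows}
    joined = []
    for rec in spatial_rows:
        attrs = lookup.get(rec[key], {})
        # Attribute fields are appended to the feature record;
        # features with no matching key keep only their own fields
        joined.append({**rec, **attrs})
    return joined
```

In ArcGIS this same operation happens behind the scenes when joining the SSURGO tables to the soils feature class on `mukey`.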

After all of the data was downloaded and the geodatabase was organized, a Python script was written, which can be seen in my second blog post, labeled "Python Scripting". From that script the following maps were created:

Figure 2: Trempealeau County


Figure 3: Digital Elevation Model (DEM) of Trempealeau County

Figure 4: NASS Cropland Data for Trempealeau County

Figure 5: Land Cover Data for Trempealeau County


Data Accuracy

By looking at the metadata, one can learn more about the data and also about its limitations. Figure 6 lists all of the available metadata for the sources used. If a piece of information could not be found, it is marked with N/A.


Conclusion

Being able to download data from outside sources, join it, organize it in a geodatabase, and then manipulate it in ArcGIS is a valuable skill. This exercise provided good practice in doing all of those things and also taught us how frustrating dealing with downloaded data can be. Preparing the data for use in ArcGIS took a lot of time and effort because the people who created it didn't have a set metadata template to follow. Using Python to set up our map layers was also frustrating, but it left one with a feeling of accomplishment when the code finally ran without errors.





