Saturday, June 24, 2017
Applications in GIS (GIS5100): Lab 5
In this week's lab, we learned about spatial accessibility and how to apply network analysis tools to measure the distance and travel-time components of spatial access. This spatial accessibility model shows a before/after of the ACC Cypress Creek Campus closure and its impact on the accessibility of the ACC campus system. The Service Area analysis was used to create the drive-time ranges for both maps. The Closest Facility analysis was also applied to the data, which gave us a before/after comparison of the students' drive times. With the Cypress Creek Campus closed, the students who lived near that campus would be relocated to the Northridge Campus, and their driving times would increase, now ranging between 18 and 50 minutes.
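As a rough illustration only, the same two analyses could be scripted with the arcpy.na module. This is a hedged sketch under assumed names: the network dataset (Streets_ND), the impedance attribute (DriveTime), the break values, and the campus/student shapefiles are placeholders, not the data used in the lab.

import arcpy

arcpy.CheckOutExtension("Network")
arcpy.env.workspace = r"C:\GIS5100\Lab5\Data"  # assumed path
nd = "Streets_ND"  # hypothetical network dataset

# Service Area: drive-time ranges around the ACC campuses
sa = arcpy.na.MakeServiceAreaLayer(nd, "CampusDriveTimes", "DriveTime",
                                   "TRAVEL_FROM", "5 10 15").getOutput(0)
arcpy.na.AddLocations(sa, "Facilities", "ACC_campuses.shp")
arcpy.na.Solve(sa)

# Closest Facility: each student's drive time to the nearest open campus
cf = arcpy.na.MakeClosestFacilityLayer(nd, "NearestCampus", "DriveTime",
                                       "TRAVEL_TO", "", 1).getOutput(0)
arcpy.na.AddLocations(cf, "Facilities", "ACC_campuses.shp")
arcpy.na.AddLocations(cf, "Incidents", "students.shp")
arcpy.na.Solve(cf)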
Thursday, June 22, 2017
GIS Programming (GIS5103): Module 6
In this week's module, we learned how to use Python to execute geoprocessing tools in ArcMap. We had to create a script that performed three different geoprocessing functions: adding XY coordinates to the hospital points, creating a buffer around them, and dissolving the hospital buffers into individual features. This assignment was relatively easy to complete, and I only had a little trouble getting the Dissolve tool to work.
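For context, a minimal sketch of what such a script could look like is below; the workspace path, shapefile names, and the 1,000-meter buffer distance are assumptions for illustration, not the actual assignment parameters.

import arcpy

arcpy.env.workspace = r"C:\GISProgramming\Module6\Data"  # assumed path
arcpy.env.overwriteOutput = True

# 1) Add X and Y coordinate fields to the hospital points
arcpy.AddXY_management("hospitals.shp")

# 2) Buffer the hospitals (distance is an example value)
arcpy.Buffer_analysis("hospitals.shp", "hospital_buffer.shp", "1000 meters")

# 3) Dissolve the overlapping buffers
arcpy.Dissolve_management("hospital_buffer.shp", "hospital_buffer_dissolve.shp")

print(arcpy.GetMessages())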
Monday, June 19, 2017
GIS Programming (GIS5103): Module 5
In this week's module, we learned about various geoprocessing tools through Python and ModelBuilder. The screenshot above shows the basin without the 'Not prime farmland' areas (the soil_Select.shp output), leaving only the areas of the basin that are prime farmland. Overall, this module wasn't very difficult to complete, and it was quite helpful for understanding the batching aspect of geoprocessing tools. (A scripted sketch of the same clip/select/erase workflow follows the notes below.)
Notes:
1. Right-click the newly created toolbox and add ‘New Model’. Drag the needed shapefiles into the ModelBuilder window.
2. Drag the Clip tool, Select tool, and Erase tool from ArcToolbox into ModelBuilder.
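As referenced above, here is a hedged sketch of the same workflow written directly in arcpy; the shapefile names, the FARMLNDCL field, and the workspace path are assumptions about the data, not values confirmed in the post.

import arcpy

arcpy.env.workspace = r"C:\GISProgramming\Module5\Data"  # assumed path
arcpy.env.overwriteOutput = True

# Clip the soils to the basin boundary
arcpy.Clip_analysis("soils.shp", "basin.shp", "soils_Clip.shp")

# Select the soils classified as not prime farmland (field name is an assumption)
arcpy.Select_analysis("soils_Clip.shp", "soil_Select.shp",
                      "FARMLNDCL = 'Not prime farmland'")

# Erase those areas, leaving only prime farmland inside the basin
arcpy.Erase_analysis("soils_Clip.shp", "soil_Select.shp", "basin_prime_farmland.shp")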
Applications in GIS (GIS5100): Lab 4
In this week's lab, we learned how to utilize visibility analysis. For Camera 2, I placed it on the other side of the finish line, on the right corner of a building, so we can see the front and back of the runners as they finish the marathon. Camera 2's offset was set at 100 and its viewing angle was set between 250 and 340 degrees. For Camera 3, I placed the camera on the left side of the finish line, in a tower of a building, which allows the viewer to see which runner crossed the finish line first and in what order the runners finished. Camera 3's offset was also set at 100 and its viewing angle was set between 140 and 250 degrees.
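These camera parameters feed into an ArcGIS viewshed through the observer point fields (OFFSETA for the camera height, AZIMUTH1/AZIMUTH2 for the viewing angle). Below is a rough sketch of that idea in arcpy; the shapefile, elevation raster, and NAME field are assumed names, and the values simply mirror the settings described above.

import arcpy
from arcpy.sa import Viewshed

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\GIS5100\Lab4\Data"  # assumed path
arcpy.env.overwriteOutput = True

# Write the camera height and viewing-angle limits into the observer fields
cameras = "camera_points.shp"
with arcpy.da.UpdateCursor(cameras, ["NAME", "OFFSETA", "AZIMUTH1", "AZIMUTH2"]) as rows:
    for row in rows:
        if row[0] == "Camera 2":
            row[1], row[2], row[3] = 100, 250, 340
        elif row[0] == "Camera 3":
            row[1], row[2], row[3] = 100, 140, 250
        rows.updateRow(row)

# Run the viewshed against the elevation surface
viewshed = Viewshed("elevation", cameras)
viewshed.save("camera_viewshed")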
Wednesday, June 14, 2017
GIS Programming (GIS5103): Module 4
In this week's module, we learned about debugging and error handling in scripts. Script 1 was quite easy to fix since we were only fixing two exceptions. Script 2 was a little tricky when I got to the df.panToExtent part; I was able to fix the error by passing the defined extent into the parentheses of df.panToExtent(). Script 3 took a little more time since we had to apply the try-except statement to the script. Pages 242-243 were somewhat helpful for getting a general idea of where the try-except statement should be located within the script. (A minimal example of the try-except pattern follows the notes below.)
Notes:
1. Delete the ‘mapdoc.save()’ and ‘del mapdoc’ lines and the lyrlist script so that the try-except statement can run correctly.
2. Make sure to indent the if statement after the for statement, because a syntax error will appear saying ‘expected an indented block’.
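As mentioned above, here is a minimal sketch of the try-except pattern used in Script 3; the map document path and the statements inside the try block are assumptions for illustration, not the graded script.

import arcpy

try:
    # Anything that might raise an error goes inside the try block
    mxd = arcpy.mapping.MapDocument(r"C:\GISProgramming\Module4\Map.mxd")  # assumed path
    df = arcpy.mapping.ListDataFrames(mxd)[0]
    for lyr in arcpy.mapping.ListLayers(mxd, "", df):
        print(lyr.name)
except Exception as e:
    # Report the problem instead of letting the script crash with a traceback
    print("Error: " + str(e))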
Applications in GIS (GIS5100): Lab 3
In this week's assignment, we learned how to perform a watershed analysis in ArcMap. The DEM had to be run through the Fill tool so the stream segments would not be disconnected when the Flow Direction and Flow Accumulation tools were run. After running the Flow Direction and Flow Accumulation tools, the Con tool was used to define the streams as cells with an accumulation value of 200 or greater. Once the streams were defined, they were converted into polyline features, and the Stream Link tool was used to visually distinguish each stream segment. The Stream Link output was then run through the Stream Order tool, which put the streams into a ranking system. The Stream Link output was also applied to the Watershed tool, which allowed us to see all of the watersheds on Kauai.
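A condensed sketch of that workflow in arcpy's Spatial Analyst module is below; the DEM name, output names, and workspace are assumptions, while the 200-cell threshold comes from the description above.

import arcpy
from arcpy.sa import (Fill, FlowDirection, FlowAccumulation, Con,
                      StreamToFeature, StreamLink, StreamOrder, Watershed)

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\GIS5100\Lab3\Data"  # assumed path
arcpy.env.overwriteOutput = True

filled = Fill("kauai_dem")                    # fill sinks so streams stay connected
fdir = FlowDirection(filled)                  # direction of flow out of each cell
facc = FlowAccumulation(fdir)                 # number of cells draining into each cell
streams = Con(facc >= 200, 1)                 # define streams at 200 cells or greater
StreamToFeature(streams, fdir, "streams.shp") # convert the streams to polylines
slink = StreamLink(streams, fdir)             # unique ID for each stream segment
sorder = StreamOrder(streams, fdir)           # rank the streams (Strahler by default)
wsheds = Watershed(fdir, slink)               # one watershed per stream link

for name, ras in [("filled", filled), ("fdir", fdir), ("slink", slink),
                  ("sorder", sorder), ("wsheds", wsheds)]:
    ras.save(name)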
I decided to focus on the Hanapepe River watershed for my map. I had to create a new point shapefile in ArcCatalog, which I used as my Pour Point layer. The pour point was used to create the modeled watershed, which was somewhat different from the observed watershed. To compare the two watersheds, the unit of measurement I decided to use is acres. Using acre measurements lets readers know how much land each watershed covers without the numbers being overwhelming to the individuals viewing the data. The values were determined by adding a new field to each attribute table and using the field calculator to calculate the area of both polygon features. wshed_pour14 covers 17,226 acres, while Hanapepe_wshed covers 17,163 acres. So wshed_pour14 covers more land than Hanapepe_wshed, although from a visual standpoint this difference is not readily apparent in the map of Kauai.
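The same acreage comparison can be scripted. This is a hedged sketch assuming the two shapefile names from the map, a projected coordinate system (the shape.area@acres token needs one), and an arbitrarily chosen field name.

import arcpy

for fc in ["wshed_pour14.shp", "Hanapepe_wshed.shp"]:
    arcpy.AddField_management(fc, "Acres", "DOUBLE")
    # Same calculation the Field Calculator performs: polygon area in acres
    arcpy.CalculateField_management(fc, "Acres", "!shape.area@acres!", "PYTHON_9.3")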
Sunday, June 11, 2017
Peer Review Assignment #1
Geoprocessing tool to model beach erosion due to storms: application to Faro beach (Portugal).
By: ALMEIDA, L.P., FERREIRA, Ó. and TABORDA, R.
URL: https://hostsited2l.uwf.edu/content/enforced/998343-50402GIS5103201705/SupplementalReadings/PotentialsForPeerReview/Almeida_et_al_2011.pdf?_&d2lSessionVal=toxLBNl0eaL1nMS7E8yhUvnlj&ou=998343
This article discusses the application of GIS programming to create a geoprocessing tool called GEOSTORM, which models beach erosion caused by coastal storms. GEOSTORM was applied to a case study of Faro Beach, Portugal. The tool cuts down on the number of processing steps needed to get a model representation of beach erosion. GEOSTORM is comprised of two modules that are connected to a graphical user interface (GUI).
The first module of this geoprocessing tool is transect preparation. This module allows the user to create a group of cross-shore transects with uniform lateral spacing along the coastal area of interest. The inputs needed before using Module 1 are a digital terrain model (DTM) and a baseline shapefile. Python was used to create this module, with the scripts acting as executable files that are called from the VBA code when the GUI pushbutton objects are acted upon. The by-products of this module are XYZ coordinates for each transect (Almeida et al. 2011:1831).
The second module uses these XYZ coordinates to give a profile erosion estimation for each transect by applying the Kriebel and Dean storm erosion convolution model. The by-products of this module are the retreat line and the eroded volume. The Kriebel and Dean model allows the determination of maximum retreat and eroded volume for four types of schematic beach profiles (Almeida et al. 2011:1831). Another piece of software, a Matlab GUI, was used to connect Module 1 to Module 2. The Matlab GUI serves as the interface between the Kriebel and Dean model and the user. The beach profiles created by Module 1 are loaded into the Matlab GUI, where the user sets the hydrodynamic conditions. Once the conditions are set, the simulation starts, but the user has to continue inputting some morphological parameters to help define the profile type. The final results of the second module are converted into ASCII files, which are then converted into a shapefile using Python and finally imported into ArcMap using VBA (Almeida et al. 2011:1831-1832).
GEOSTORM was applied to the case study area of Faro Beach, Portugal, using a storm with a 25-year return period for its wave characteristics. The results show that there is not enough protection from erosion, given the predicted post-storm morphology of the beach coastline. This lack of protection means that the storm's erosion will affect areas occupied by humans, such as houses, roads, parking areas, and seawalls (Almeida et al. 2011:1832-1834).
Overall, this article was very much to the point about how this new geoprocessing tool can be used for the real-life issues many coastal communities face after a storm passes through and erodes the coastline. The article was well written and easily understandable for a person who has just begun learning about GIS and programming applications. The only weakness I found was that it did not elaborate enough about Matlab, which I would have liked to learn more about.
Almeida, L.P., Ferreira, Ó., and Taborda, R. (2011). Geoprocessing tool to model beach erosion due to storms: application to Faro beach (Portugal). Journal of Coastal Research, 1830-1834.
Friday, June 9, 2017
Applications in GIS (GIS5100): Lab 2
In this week's lab assignment, we had to create a corridor for black bears moving between two sections of the Coronado National Forest. I applied the Euclidean Distance tool to Road.shp, and after getting the resulting distance raster, I used the Reclassify tool to narrow the distances down to fit my suitability criteria. I then reclassified the elevation and landcover raster datasets to the suitability values for acceptable black bear habitat in the Coronado National Forest. With the three new reclassified files, I combined them into one dataset using the Weighted Overlay tool. The new weighted overlay layer's values were then inverted using the Raster Calculator tool (10 - "Weighted Overlay layer"). I used the inverted weighted overlay layer to calculate the cost distances from the Coronado1 and Coronado2 shapefiles. Once the cost distances were created, I used them both in the Corridor tool, which created possible passages for black bears to migrate safely. I narrowed the values down into 4 classes and made any value over 58,075.3 have no color.
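A rough arcpy sketch of that chain of tools is shown below. The raster names, reclassification breaks, and weights are all placeholders, and the Weighted Overlay step is approximated here with a weighted sum in map algebra rather than the Weighted Overlay tool itself.

import arcpy
from arcpy.sa import (EuclideanDistance, Reclassify, RemapRange, RemapValue,
                      CostDistance, Corridor)

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\GIS5100\Lab2\Data"  # assumed path
arcpy.env.overwriteOutput = True

# Distance from roads, reclassified to a 1-10 suitability scale (breaks are examples)
road_dist = EuclideanDistance("Road.shp")
road_suit = Reclassify(road_dist, "VALUE",
                       RemapRange([[0, 500, 3], [500, 1000, 7], [1000, 1000000, 10]]))

# Elevation and landcover reclassified to the same 1-10 scale (values are examples)
elev_suit = Reclassify("elevation", "VALUE",
                       RemapRange([[0, 1200, 4], [1200, 2400, 10], [2400, 4000, 6]]))
land_suit = Reclassify("landcover", "VALUE",
                       RemapValue([[41, 10], [42, 10], [43, 10], [52, 8], [21, 1], [22, 1]]))

# The lab used the Weighted Overlay tool; a weighted sum gives the same general idea
suitability = 0.4 * land_suit + 0.4 * elev_suit + 0.2 * road_suit

# Invert suitability into a cost surface, as described above
cost = 10 - suitability

# Cost distance from each forest section, then the corridor between them
cd1 = CostDistance("coronado1.shp", cost)
cd2 = CostDistance("coronado2.shp", cost)
corridor = Corridor(cd1, cd2)
corridor.save("bear_corridor")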