In this week's lab, we had to do a damage assessment on areas in Ocean County, NJ that were affected by Hurricane Sandy. We had to prepare the data for the assessment, use before-and-after storm pictures to assess the structural damage to each parcel, determine the distance from the coastline where most buildings were impacted by the storm, and combine the parcel data with the damage assessment layer.
I mostly relied on the structure damage scale to help me assess the damage to the building structures on the parcels. Buildings that were completely torn down I classified as Destroyed, buildings missing major parts of their structure as Major Damage, buildings with minor parts of their structure damaged as Minor Damage, buildings with debris across the parcel as Affected, and buildings that seemed unaffected as No Damage. The difficult decision for me was determining what structure type the buildings were. If I had had information about what each parcel was used for, it would have helped me tremendously with identifying the structure types. Also, if the post-Hurricane Sandy imagery had been taken at the same time of day and with similar lighting, it would have helped with making better deductions about the parcel damage.
From the findings, I was able to note that the worst damage to the building structures occurred within 100 meters of the coastline, and the farther the buildings were from the coastline, the less damage they had. 100% of the buildings within 100 meters of the coastline were damaged, and 95% of buildings within 100 to 200 meters of the coastline were damaged. 19% of buildings within 200 to 300 meters of the coastline were damaged. The findings seem accurate enough to be used in other areas near the study location.
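The damage-by-distance-band summary above can be sketched as a small tabulation. The band labels and percentages match the findings; the building counts below are hypothetical placeholders, not the actual parcel data.

```python
# Sketch: percent of damaged structures per coastline distance band.
# Counts are illustrative stand-ins chosen to reproduce the reported percentages.

def percent_damaged(bands):
    """Return {band: percent of structures with any damage category}."""
    return {
        band: round(100.0 * damaged / total, 1)
        for band, (damaged, total) in bands.items()
    }

# (damaged_count, total_count) per distance band -- hypothetical numbers only
counts = {
    "0-100 m":   (24, 24),
    "100-200 m": (19, 20),
    "200-300 m": (4, 21),
}

print(percent_damaged(counts))
# {'0-100 m': 100.0, '100-200 m': 95.0, '200-300 m': 19.0}
```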
Saturday, July 22, 2017
Applications in GIS (GIS5100): Lab 8
Wednesday, July 19, 2017
Applications in GIS (GIS 5100): Lab 7
In this week's lab, we had to use a DEM to determine which areas would be impacted by sea level rise and to overlay this data with census information to estimate the number of people that would be impacted by it.
To obtain the two sea level rise scenarios of 3 feet and 6 feet, I had to limit the raster processing to Honolulu, since this is the study area. I went into the Environment settings and selected Raster Analysis, which gave me the option to set the mask to Honolulu. After setting the mask, I created two different flood zone scenarios using the Less Than tool. The tool created a grid whose cells show flooded areas with a value of 1 and non-flooded areas with a value of 0. Once the rasters were made, I was able to determine the total area of each flood zone by multiplying the flood zone cell count by the cell size. The flood elevation was created using two different tools. The first was Extract by Attributes, which took the Lidar DEM as one input and the SQL expression "VALUE" < 2.33 as the other. This tool gives elevation to the flood zone. To get the depth of the flooded area, the Minus tool was applied; the first input was Flood Zone 1, which had the sea level of 3 feet, and the second input was the altered Lidar DEM of the flood zone elevation. The results let us see what a sea level rise of 6 feet would look like in the flooded areas of Honolulu. If my calculations are correct, the first scenario of a 3-foot sea level rise has a total area of 5,710,950 m² and the second scenario of a 6-foot rise has 23,793,075 m².
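The area calculation (cell count × cell size) can be sketched as follows. The 5 m cell size is my assumption (the lab does not state it), and the flooded-cell counts are back-calculated from the reported areas purely for illustration.

```python
# Sketch: flood-zone area from a binary (0/1) flood raster.
# CELL_SIZE_M is an assumed square cell edge length; the cell counts are
# hypothetical values chosen to reproduce the areas reported above.

CELL_SIZE_M = 5.0  # assumption, not from the lab

def flood_area_m2(flooded_cell_count, cell_size=CELL_SIZE_M):
    """Area = number of flooded cells * area of one cell."""
    return flooded_cell_count * cell_size ** 2

print(flood_area_m2(228_438))  # 3 ft scenario -> 5710950.0 m^2
print(flood_area_m2(951_723))  # 6 ft scenario -> 23793075.0 m^2
```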
Wednesday, July 12, 2017
Applications in GIS (GIS5100): Lab 6 Crime Analysis
In this week's lab, we learned about different hotspot mapping techniques that could be applied to finding where certain crimes happen most often. The three techniques were grid-based mapping, kernel density, and Local Moran's I.
When gathering the data for the top 20% of burglaries within the grid-based thematic map, I had 507 grid cells with different burglary counts. I divided the 507 cells by 5, which gave me 101.4. I rounded down to 101, which became my selection of the top 20% of cells with the highest burglary counts. The next map that I created to pinpoint high-burglary areas was a kernel density map. The parameters I used for the kernel density analysis were 100 ft for the output cell size and a search radius of 2,640 ft. After running the kernel density tool, I reclassified the data, excluding any zero values. The mean value was 35.16, which I multiplied by 3, resulting in 105.48. Any value below 105.48 I excluded from the map, and I then converted the result from raster to polygon. I had to use Select By Attributes to select features with a grid code of 1 to get the final hotspot map. For the final hotspot map, Local Moran's I was used. Before running the tool, Blockgroups2009Fixed.shp and the 2007 burglaries file had to be spatially joined and a crime rate field had to be added to the data. The crime rate was defined as the number of burglaries per 1,000 housing units. After the Local Moran's I tool ran, Select By Attributes helped select the clusters with high crime rates.
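The two threshold calculations above (the top-20% cell count and the 3×-mean kernel density cutoff) can be sketched directly; the 507, 35.16, and ×3 values are from the lab.

```python
import math

# Sketch of the two threshold calculations described above.

# Grid-based map: top 20% of 507 grid cells with burglaries.
n_cells = 507
top_20 = math.floor(n_cells / 5)  # 507 / 5 = 101.4, rounded down
print(top_20)  # 101

# Kernel density map: exclude cells below 3x the mean (zeros already removed).
mean_density = 35.16
threshold = round(mean_density * 3, 2)
print(threshold)  # 105.48
```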
Saturday, June 24, 2017
Applications In GIS (GIS5100): Lab 5
In this week's lab, we learned about spatial accessibility and applied the network analysis tools to measure the distance and travel time of spatial access. This spatial accessibility model shows a before/after of the ACC Cypress Creek Campus closure and its impact on the accessibility of the ACC campus system. The Service Area analysis was used to create the drive time data ranges for both maps. The Closest Facility analysis was also applied to the data, from which we were able to obtain a before/after of the students' drive times. With the Cypress Creek Campus closed, the students who lived near it see an increase in driving time ranging from 18 to 50 minutes, and they would be reassigned to the Northridge Campus.
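The before/after drive-time comparison could be summarized per student as a simple difference. The student records and travel times below are hypothetical placeholders; the real values came from the Closest Facility solver.

```python
# Sketch: before/after drive-time comparison for students reassigned after a
# campus closure. All times (in minutes) are hypothetical, chosen only to
# illustrate an 18-to-50-minute increase like the one reported above.

before = {"student_a": 12.0, "student_b": 9.5, "student_c": 20.0}
after  = {"student_a": 30.0, "student_b": 59.5, "student_c": 38.0}

increase = {s: after[s] - before[s] for s in before}
print(increase)
print(min(increase.values()), max(increase.values()))  # 18.0 50.0
```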
Monday, June 19, 2017
Application in GIS (GIS5100): Lab 4
In this week's lab, we learned how to utilize visibility analysis. I placed Camera 2 on the other side of the finish line, on the right corner of a building, so we can see in front of and behind the runners as they finish the marathon. Camera 2's offset was set at 100 and its viewing angle was set between 250 and 340 degrees. I placed Camera 3 on the left side of the finish line, in a tower of a building, which allows the viewer to see which runner crossed the finish line first and the order in which the runners finished. Camera 3's offset was also set at 100 and its viewing angle was set between 140 and 250 degrees.
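The viewing-angle parameters above can be sanity-checked with a small helper that tests whether a target azimuth falls inside a camera's horizontal field of view. The angle ranges are the lab's camera settings; the helper itself is illustrative, not an ArcGIS tool.

```python
# Sketch: is a target azimuth inside a camera's horizontal field of view?
# The start/end angles match the viewshed parameters described above.

def in_view(azimuth_deg, start_deg, end_deg):
    """True if azimuth lies in [start, end]; handles sectors that cross north."""
    az = azimuth_deg % 360
    if start_deg <= end_deg:
        return start_deg <= az <= end_deg
    return az >= start_deg or az <= end_deg  # sector wraps past 360

print(in_view(300, 250, 340))  # Camera 2 (250-340 deg) -> True
print(in_view(200, 140, 250))  # Camera 3 (140-250 deg) -> True
print(in_view(10, 250, 340))   # outside Camera 2's sector -> False
```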
Wednesday, June 14, 2017
Applications in GIS (GIS 5100): Lab 3
In this week's assignment, we learned how to perform a watershed analysis in ArcMap. The DEM had to be run through the Fill tool so the stream segments would not be disconnected when the Flow Direction and Flow Accumulation tools were run. After running those two tools, the Con tool was used to define the streams as cells with a value of 200 or greater. Once the streams were defined, they were converted into polyline features, and the Stream Link tool was used to visually distinguish each stream segment. The Stream Order tool was then applied to the Stream Link output to put the streams into a ranking system. The Stream Link output was also fed into the Watershed tool, which allowed us to see all of the watersheds on Kauai.
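The Con-tool step can be sketched as a simple cell-by-cell threshold: flow accumulation values of 200 or greater become stream cells. The tiny grid below is a hypothetical stand-in for the real flow accumulation raster; the 200 threshold is from the lab.

```python
# Sketch of the Con-tool logic: cells with flow accumulation >= 200 become
# stream cells (1), everything else 0. A list-of-lists grid stands in for
# a real raster; the values are made up.

THRESHOLD = 200  # stream-definition threshold used in the lab

def define_streams(flow_accumulation):
    return [[1 if cell >= THRESHOLD else 0 for cell in row]
            for row in flow_accumulation]

flow_acc = [
    [  5, 180, 420],
    [ 60, 250,  90],
    [310,  15, 205],
]
print(define_streams(flow_acc))
# [[0, 0, 1], [0, 1, 0], [1, 0, 1]]
```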
I decided to focus on the Hanapepe River watershed for my map. I had to create a new point shapefile in ArcCatalog, which I used as my Pour Point layer. The pour point was used to create the modeled watershed, which was somewhat different from the observed watershed. To compare the two watersheds, the unit of measurement that I decided to use was acres. Using acre measurements lets individuals know how much land each watershed covers, and the numbers are not so overwhelming to those viewing the data. The values were determined by adding a new field to the attribute table and using the Field Calculator to calculate the area of both polygon features. wshed_pour14 covers 17,226 acres, and Hanapepe_wshed covers 17,163 acres. wshed_pour14 covers more land than Hanapepe_wshed, although from a visual standpoint this was not obviously apparent in the map of Kauai.
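The modeled-versus-observed comparison above amounts to a difference of 63 acres, under half a percent. The acreages are from the lab; the percent-difference helper is illustrative.

```python
# Sketch: compare the modeled and observed watershed areas reported above.

modeled_acres = 17226   # wshed_pour14 (modeled from the pour point)
observed_acres = 17163  # Hanapepe_wshed (observed)

diff = modeled_acres - observed_acres
pct = 100.0 * diff / observed_acres
print(diff)           # 63 acres
print(round(pct, 2))  # 0.37 percent larger
```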
Friday, June 9, 2017
Applications in GIS (GIS5100): Lab 2
In this week's lab assignment, we had to create a corridor for black bears moving between two sections of the Coronado National Forest. I applied the Euclidean Distance tool to Road.shp, and after getting the resulting calculation, I used the Reclassify tool to narrow down the distances to fit my suitability criteria. I then reclassified the elevation and landcover raster datasets to the suitability values for acceptable black bear habitat in the Coronado National Forest. With the three new reclassified files, I combined them into one dataset using the Weighted Overlay tool. The new weighted overlay layer's values were then inverted using the Raster Calculator tool (10 − "Weighted Overlay layer"). I used the new inverted weighted overlay layer to calculate the cost distances for the Coronado1 and Coronado2 shapefiles. Once the cost distances were created, I used them both in the Corridor tool, which created possible passages for black bears to migrate safely. I narrowed the values down into 4 classes and made any value over 58,075.3 have no color value.
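Two of the raster steps above are simple cell-by-cell arithmetic: the inversion turns the 1–10 suitability surface into a cost surface (10 − value), and the Corridor tool sums the two cost-distance grids at each cell. A minimal sketch, using tiny hypothetical grids rather than the lab data:

```python
# Sketch: invert a 1-10 suitability surface (10 - value) and sum two
# cost-distance grids cell-by-cell, as the Raster Calculator and Corridor
# steps described above do. Grid values are made up for illustration.

def invert(suitability):
    """Higher suitability -> lower cost."""
    return [[10 - v for v in row] for row in suitability]

def corridor(cost1, cost2):
    """Corridor surface = sum of the two accumulated-cost grids."""
    return [[a + b for a, b in zip(r1, r2)] for r1, r2 in zip(cost1, cost2)]

suit = [[9, 3], [5, 10]]
print(invert(suit))      # [[1, 7], [5, 0]]

c1 = [[100.0, 250.0], [400.0, 50.0]]
c2 = [[300.0, 120.0], [80.0, 500.0]]
print(corridor(c1, c2))  # [[400.0, 370.0], [480.0, 550.0]]
```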