Sunday, July 8, 2018

Homeland Security - Preparing MEDS Datasets

This week we looked at how to prepare a set of data for use as Homeland Security's MEDS, or Minimum Essential Data Sets.  The purpose of this lab was to take base data and create a tailored MEDS package for the Boston Metropolitan Statistical Area (BMSA).  The goal is to use the data package we created to conduct a pre-Boston Marathon Bombing assessment of the BMSA and determine which roadways, buildings, and infrastructure are classified as critical in the event of an emergency.

The MEDS dataset I created contains different layer groups to help organize the data: Boundaries, Transportation, Hydrography, Land Cover, Orthoimagery, Elevation, and Geographic Names.  Each set of data was tailored to the BMSA.  To build the MEDS package, data was moved from a provided geodatabase into the proper layer groups listed above.  Some datasets, such as the roadways, required additional manipulation to produce the final MEDS dataset.  Each group layer also had specific symbology and labeling applied to it.

In the end, every group layer was saved as a layer file to preserve these settings.  This lets future users of the BMSA MEDS package get right into data analysis without having to format each layer themselves, and it ensures that the formatting is standardized across all users.
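For anyone who would rather script that last step than save each layer file by hand in ArcMap, here is a minimal sketch using arcpy.mapping; the map document path and output folder are hypothetical stand-ins:

    import arcpy

    # Reference the ArcMap document that holds the finished MEDS group layers
    mxd = arcpy.mapping.MapDocument(r"C:\GIS\BMSA_MEDS.mxd")  # hypothetical path
    df = arcpy.mapping.ListDataFrames(mxd)[0]

    # Save each group layer, along with its symbology and labeling, to a .lyr file
    for lyr in arcpy.mapping.ListLayers(mxd, "", df):
        if lyr.isGroupLayer:
            lyr.saveACopy(r"C:\GIS\MEDS\{}.lyr".format(lyr.name))

The saveACopy call preserves the layer's symbology and label settings, which is exactly what makes the layer files useful to the next analyst.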

Below is a screenshot of my final BMSA MEDS package.  If you have any questions at all, leave a comment below.


Friday, June 29, 2018

Geoprocessing with Python

This week we focused on executing geoprocessing steps outside of ArcGIS using Python scripting.  While the task for this week's assignment seemed daunting at first, once I broke it down and used the ArcGIS help pages to obtain the correct syntax, all the pieces fell into place rather simply and without much need for incessant troubleshooting.  The following steps were used to create the script:

  1. Import arcpy and the env class
  2. Obtain the proper syntax for AddXY and add that tool to the script, export messages upon completion
  3. Obtain the proper syntax for Buffer and add that tool to the script, export messages upon completion
  4. Obtain the proper syntax for Dissolve and add that tool to the script, export messages upon completion
  5. End script.
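For reference, here is a minimal sketch of what a script like that could look like; the workspace path and shapefile names are hypothetical stand-ins, and only required parameters are passed to each tool:

    import arcpy
    from arcpy import env

    # Step 1: set up the environment so outputs land in one workspace
    env.workspace = r"C:\GIS\Module6\Data"  # hypothetical workspace
    env.overwriteOutput = True

    # Step 2: add X/Y coordinate fields to the point features
    arcpy.AddXY_management("hospitals.shp")  # hypothetical input
    print(arcpy.GetMessages())

    # Step 3: buffer the points by a fixed distance
    arcpy.Buffer_analysis("hospitals.shp", "hospitals_buffer.shp", "1000 Meters")
    print(arcpy.GetMessages())

    # Step 4: dissolve the buffers into a single feature
    arcpy.Dissolve_management("hospitals_buffer.shp", "hospitals_dissolve.shp")
    print(arcpy.GetMessages())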
The script really was that easy to create.  Only required parameters were used, which made it that much easier to write.  Once the script was run, I opened ArcMap and verified that each step had completed correctly.  Here is a screenshot of the interactive window from PythonWin showing all of the tool messages after I ran the script:


Let me know if you have any questions by using the comment section below!

Sunday, June 24, 2018

Crime Analysis in the DC Metro Area

This week focused on using different techniques within ArcGIS to analyze crime statistics and then create maps based on our analysis.

The first map used statistical crime data to determine where there may be gaps in police coverage that could be filled by a new police station or substation.  This map also analyzed the most common forms of crime during 2011.  As you can see, almost 75% of all crimes happened within 1 mile of a police station.  Despite this, there were a couple of potential gaps where I proposed new stations.  The first, in the southeast, would cover a large area outside of the 2-mile coverage and reduce some of the burden on the 7th District station.  The second proposed station, in the northwest, would fill a small coverage gap, but more importantly, it could take some of the burden off the 2nd and 4th District stations.
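To give a sense of how a coverage number like that 75% figure can be computed, here is a sketch using a select-by-location; the shapefile paths and layer name are hypothetical:

    import arcpy

    # Make an in-memory layer from the 2011 crime points (hypothetical path)
    arcpy.MakeFeatureLayer_management(r"C:\GIS\DC\crimes_2011.shp", "crimes_lyr")

    # Select every crime within 1 mile of a police station (hypothetical path)
    arcpy.SelectLayerByLocation_management(
        "crimes_lyr", "WITHIN_A_DISTANCE", r"C:\GIS\DC\police_stations.shp",
        "1 Miles")
    selected = int(arcpy.GetCount_management("crimes_lyr")[0])

    # Clear the selection so the full set of crimes can be counted
    arcpy.SelectLayerByAttribute_management("crimes_lyr", "CLEAR_SELECTION")
    total = int(arcpy.GetCount_management("crimes_lyr")[0])

    print("{:.1f}% of crimes fall within 1 mile of a station".format(
        100.0 * selected / total))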



The second map I created focused on three types of crime: burglary, homicide, and sex abuse.  The three map panels show the different frequencies of each crime, and that information is overlaid on top of a population density sublayer.  While it is a little difficult to see the population density clearly, you can gather from the dark undertones where there are more people and see whether population is more of a factor for certain crimes.
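For anyone curious how per-crime frequency surfaces like these panels can be generated, one common approach is a kernel density run per offense type.  This is only a sketch of that idea, assuming the Spatial Analyst extension is licensed; the shapefile, field name, and offense values are hypothetical:

    import arcpy
    from arcpy.sa import KernelDensity

    arcpy.CheckOutExtension("Spatial")
    arcpy.env.workspace = r"C:\GIS\DC"  # hypothetical workspace
    arcpy.env.overwriteOutput = True

    # Build one density surface per offense type (field and values hypothetical)
    for offense in ("BURGLARY", "HOMICIDE", "SEX ABUSE"):
        layer = "crimes_" + offense.replace(" ", "_")
        arcpy.MakeFeatureLayer_management(
            "crimes_2011.shp", layer, "OFFENSE = '{}'".format(offense))
        density = KernelDensity(layer, "NONE")
        density.save(layer + "_density.tif")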


That is all for now.  Let me know what you think and happy mapping.  


Wednesday, June 20, 2018

Geoprocessing in ArcGIS

This week's lab work focused on introducing us to creating toolboxes within ArcGIS that can hold custom models (built with the ArcGIS ModelBuilder) as well as script tools that, in this case, were written in the PythonWin IDE.

The first step in the lab was to create a model tool using ModelBuilder within ArcGIS.  The goal of the model was to take two shapefiles, clip one using the other as a mask, then select certain features within that clip and output them to another shapefile.  The final step of the model was to erase the selected features from the original clipped shapefile.  What I ended up with is something like this:


The original clip was solid, while the final output above shows all the areas within it that were removed by running the model.

The rest of the lab consisted of taking the model I created, exporting it to a Python script, validating that the script worked and overwrote the existing output, and then finally turning that Python script into a script tool within ArcGIS.
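The exported script ends up being little more than the model's three tools run in sequence.  Here is a sketch of that shape; the shapefile names and the selection clause are hypothetical stand-ins for the lab data:

    import arcpy

    arcpy.env.workspace = r"C:\GIS\Module7"  # hypothetical workspace
    arcpy.env.overwriteOutput = True  # lets re-runs replace the existing outputs

    # Clip the first shapefile using the second as the mask
    arcpy.Clip_analysis("soils.shp", "basin.shp", "soils_clip.shp")

    # Select the features to remove (field and value are hypothetical)
    arcpy.Select_analysis("soils_clip.shp", "soils_select.shp",
                          "FARMLNDCL = 'Not prime farmland'")

    # Erase the selected features from the original clip
    arcpy.Erase_analysis("soils_clip.shp", "soils_select.shp", "soils_final.shp")

One thing worth noting is that the Erase tool requires an Advanced license, which is the kind of constraint that is easy to miss until a script fails outside of ArcMap.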

Overall it was really neat to work through the process and see how all these tools can be interconnected, with varying degrees of power and flexibility.  The processes we are working with right now are relatively simple and mundane, and one might wonder why you would even script something so simple.  But it is plain to see that as you get into more complex and repetitive geoprocessing, scripting and model tools can be a huge help in reducing man-hours and tedious manual processing.

Sunday, June 17, 2018

Tracking Hurricanes and Assessing Damage

This week in my Applications in GIS class we looked at how to apply GIS to hurricane tracking and damage assessments.  The scenario we applied these techniques to involved Hurricane Sandy, which caused unprecedented damage for a Category 1 storm.

The first map I made was a tracking map for the storm from start to finish.  The track shows what category the storm was at each point as it moved north through the Caribbean Sea and the Atlantic Ocean.  The points along the track also show the wind speeds and barometric pressures.  Finally, the states affected by the hurricane are highlighted on the map.
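If you wanted to script the track itself, connecting the observation points into a line is a one-tool job.  A minimal sketch, assuming a point shapefile with a date/time sort field (both names hypothetical):

    import arcpy

    arcpy.env.workspace = r"C:\GIS\Sandy"  # hypothetical workspace

    # Connect the track points into a continuous line, ordered by observation time
    arcpy.PointsToLine_management("sandy_points.shp", "sandy_track.shp",
                                  "", "DATETIME")

The final product is below: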


The second map I created took a more involved look at assessing the damage caused in a New Jersey neighborhood when Sandy made landfall.  I analyzed pictures of a section of beachfront property taken before and after the hurricane blew through the northeastern United States.  The intent of the analysis was to determine what kind of damage occurred to the homes near the coast.  The types of damage we looked at were structural damage, wind damage, and storm surge inundation.  In the section I examined, every house appeared to have suffered storm surge inundation.  What was really difficult to determine was the extent of wind and structural damage, aside from those houses that were completely wiped out.  The parcels were marked with their level of structural damage on the map, and a table was created with tallies of the properties and the type of damage to each.  Finally, both the pre- and post-storm imagery was used to show the extent of the damage.
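A tally table like that can also be built with a quick cursor pass over the parcels; this is just a sketch, with the shapefile and damage field names as hypothetical stand-ins:

    import arcpy
    from collections import Counter

    # Count how many parcels fall into each structural damage category
    tallies = Counter()
    with arcpy.da.SearchCursor(r"C:\GIS\Sandy\parcels.shp",
                               ["STRUC_DMG"]) as cursor:
        for (damage,) in cursor:
            tallies[damage] += 1

    for category, count in sorted(tallies.items()):
        print("{}: {} parcels".format(category, count))

The map that was created is below.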


Let me know what you think!

Thursday, June 14, 2018

Script Debugging

This week's focus was on debugging, which is an integral part of script writing.  No matter how careful you are when writing a program, errors are bound to happen.  Whether they are syntax errors, exceptions, or even logic errors, they can be frustrating to locate and weed out as you create your scripts.  For the lab this week, our task was to review and debug three different provided scripts.

The first script contained two errors, both simple typographic errors in different variable names.  Once those typos were corrected, the script ran correctly, as you can see below.


The second script was more complex and contained 8 errors.  I had a lot of trouble getting past the first error, but once I realized that the problem was not only the file name but also the file location, I was able to step through the remaining errors somewhat efficiently.  The types of errors I found in the second script included file name and location errors, file path syntax errors, typographic errors in function and method names, and missing or extra parameters for functions.  Once I corrected the 8 errors, I got a successful run of the script, and here is how it turned out:


The last script had us take a different approach.  Instead of correcting the errors that were found, the goal was to implement a try-except statement to catch and report errors within a section of code while still successfully running the rest of the script without erroring out.  The script had already been divided into two parts.  Part B was good to go with no errors, but there was an error in Part A that needed to be isolated and reported using Python's try-except handling.  Once implemented successfully, I was able to get an error report for Part A and still complete Part B of the script.
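For anyone unfamiliar with the pattern, here is a minimal self-contained sketch of the idea; the failing tool call is just a stand-in for whatever the actual Part A error was:

    import arcpy

    try:
        # Part A: this call fails because the input does not exist, but the
        # error is caught and reported instead of halting the whole script
        arcpy.Buffer_analysis("no_such_file.shp", "out.shp", "1000 Meters")
    except Exception as e:
        print("Part A raised an error: {}".format(e))

    # Part B still runs regardless of what happened in Part A
    print("Part B completed without issue.")

The output from my run of the lab script is below: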


So that is it.  No real flow chart of how I go through my debugging process.  I am sure debugging gets easier over time as one becomes more familiar with how different classes are used and called in Python and what they need in order to run successfully.  As always, leave any comments below on how you approach debugging.

Cheers!

Sunday, June 10, 2018

Tsunami Analysis and the Fukushima Disaster

This week's focus for Applications in GIS was tsunami analysis.  While the analysis we conducted was based on the Fukushima tsunami disaster in 2011 and wasn't exactly preventative planning, the results could certainly be used for future planning.  In preparation for this lab, we also did a module on creating feature datasets in ArcGIS and went a bit deeper into adding feature classes and rasters to geodatabases.

For the lab analysis of the Fukushima disaster, we conducted two separate analyses.  The first focused on the radiation spread from the Fukushima reactor meltdown: how distance affected how at risk people were and where they really needed to evacuate from.  The second analysis focused on the tsunami itself.  Using raster imagery, masks were made from the coast to 10,000 m inland.  Using these masks and the known land area that was affected, an evacuation zone plan was made to lay out where safe land could be found in the event of another tsunami.

The evacuation zones for radiation were created manually, using ring buffers centered on the Fukushima II power plant.  For the evacuation zones covering the tsunami-affected lands, however, more automation was used by way of the ArcGIS ModelBuilder.  Once these zones were created, the final map was put together and can be found below.  If anyone has any issues with my color choices, please comment below with suggestions; being color blind does not always translate well to full-color mapping.
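For what it is worth, the manual ring-buffer step could also be scripted with the Multiple Ring Buffer tool.  Here is a sketch of that, with hypothetical file names and with the ring distances standing in for whatever the lab actually specified:

    import arcpy

    arcpy.env.workspace = r"C:\GIS\Fukushima"  # hypothetical workspace
    arcpy.env.overwriteOutput = True

    # Concentric evacuation rings around the plant (distances hypothetical)
    arcpy.MultipleRingBuffer_analysis("fukushima_plant.shp", "evac_zones.shp",
                                      [3, 7, 15, 30, 40, 50], "Kilometers",
                                      "DIST_KM", "ALL")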

Enjoy!