Friday, June 29, 2018

Geoprocessing with Python

This week we focused on executing geoprocessing steps outside of ArcGIS using Python scripting.  While the task for this week's assignment seemed daunting at first, once I broke it down and used the ArcGIS help pages to obtain the correct syntax, all the pieces fell into place rather simply and without much incessant troubleshooting.  The following steps were used to create the script (sketched in code after the list):

  1. Import arcpy and the env tools
  2. Obtain the proper syntax for AddXY and add that tool to the script, export messages upon completion
  3. Obtain the proper syntax for Buffer and add that tool to the script, export messages upon completion
  4. Obtain the proper syntax for Dissolve and add that tool to the script, export messages upon completion
  5. End script.
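Putting those steps together, a minimal sketch of what the script looked like is below.  The workspace, file names, and buffer distance are assumptions for illustration, since the actual lab data isn't shown in this post.

    import arcpy
    from arcpy import env

    # Assumed workspace and file names -- placeholders for the real lab data
    env.workspace = r"C:\Data\Module6"
    env.overwriteOutput = True

    # Step 2: add X/Y coordinate fields to the point feature class
    arcpy.AddXY_management("hospitals.shp")
    print(arcpy.GetMessages())

    # Step 3: buffer the points, using only the required parameters
    arcpy.Buffer_analysis("hospitals.shp", "hospitals_buffer.shp", "1000 Meters")
    print(arcpy.GetMessages())

    # Step 4: dissolve the buffers into a single feature
    arcpy.Dissolve_management("hospitals_buffer.shp", "hospitals_dissolve.shp")
    print(arcpy.GetMessages())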
The script really was that easy to create.  Only required parameters were used, which made it that much easier to write.  Once the script was run, I opened ArcMap and verified that the steps were completed correctly.  Here is a screenshot of the interactive window from PythonWin showing all of the tool messages after I ran the script:


Let me know if you have any questions by using the comment section below!

Sunday, June 24, 2018

Crime Analysis in the DC Metro Area

This week focused on using different techniques within ArcGIS to analyze crime statistics and then create maps based on that analysis. 

The first map used statistical crime data to determine where there may be gaps in police coverage that could be filled by creating a new police station or substation.  This map also analyzed the most common forms of crime during 2011.  As you can see, almost 75% of all crimes happened within 1 mile of a police station.  Despite this, there were a couple of potential gaps that I proposed new stations for.  The first, in the southeast, would cover a large area outside the 2-mile coverage and reduce some of the burden on the 7th District station.  The second proposed station, in the northwest, would fill a small coverage gap, but more importantly it could take some of the burden off the 2nd and 4th District stations. 
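For anyone curious how a figure like that 75% could be computed, here is one way to do the check in arcpy.  The layer names here are assumptions, not the actual lab data:

    import arcpy
    arcpy.env.overwriteOutput = True

    # Buffer the stations, then select the crimes that fall inside (names assumed)
    arcpy.Buffer_analysis("police_stations", "stations_1mi", "1 Mile",
                          dissolve_option="ALL")
    arcpy.MakeFeatureLayer_management("crimes_2011", "crimes_lyr")
    arcpy.SelectLayerByLocation_management("crimes_lyr", "WITHIN", "stations_1mi")

    # GetCount on a layer returns the selected count; compare it to the total
    selected = int(arcpy.GetCount_management("crimes_lyr").getOutput(0))
    total = int(arcpy.GetCount_management("crimes_2011").getOutput(0))
    print("{0:.1f}% of crimes within 1 mile".format(100.0 * selected / total))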



The second map I created focused on three types of crime: burglary, homicide, and sex abuse.  The three map panels show the frequency of each crime overlaid on a population density sublayer.  While the population density is a little difficult to see clearly, the darker undertones show where there are more people, so you can judge whether population is a factor for certain crimes.  


That is all for now.  Let me know what you think and happy mapping.  


Wednesday, June 20, 2018

Geoprocessing in ArcGIS

This week's lab work focused on introducing us to creating toolboxes within ArcGIS that can hold custom models (built with the ArcGIS ModelBuilder) as well as script tools that, in this case, were written in the PythonWin IDE. 

The first step in the lab was to create a model tool using ModelBuilder within ArcGIS.  The goal of the model was to take two shapefiles, clip one using the other as a mask, then select certain features within that clip and create another output shapefile of the selected features.  The final step of the model was to erase the selected features from the original clipped shapefile.  What I ended up with is something like this:


The original clip was solid, while the final output above shows all the areas within it that were removed by running the model. 

The rest of the lab consisted of taking the model I created, exporting it to a Python script, validating that the script worked and overwrote the existing output, and then turning the Python script into a script tool within ArcGIS. 
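The exported script boils down to a handful of geoprocessing calls.  Here is a rough sketch, with assumed file names and an assumed selection query standing in for the real ones:

    import arcpy

    arcpy.env.workspace = r"C:\Data\Module7"   # assumed workspace
    arcpy.env.overwriteOutput = True           # lets reruns overwrite the outputs

    # Clip one shapefile using the other as a mask (names assumed)
    arcpy.Clip_analysis("soils.shp", "basin.shp", "soils_clip.shp")

    # Select certain features from the clip into their own shapefile
    arcpy.Select_analysis("soils_clip.shp", "soils_select.shp",
                          "\"FARMLNDCL\" = 'Not prime farmland'")

    # Erase (delete) the selected features from the original clip
    arcpy.Erase_analysis("soils_clip.shp", "soils_select.shp", "soils_final.shp")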

Overall it was really neat to work through the process and see how all these tools can be interconnected with varying degrees of power and flexibility.  The processes we are working with right now are relatively simple and mundane, and one might wonder why you would even script something so simple.  But it is plain to see that as you get into more complex and repetitive geoprocessing, scripting and model tools could be a huge help in reducing man-hours and tedious manual processing.  

Sunday, June 17, 2018

Tracking Hurricanes and Assessing Damage

This week in my Applications in GIS class we looked at how to apply GIS to hurricane tracking and damage assessments.  The scenario we applied these techniques to involved Hurricane Sandy, which caused unprecedented damage for a Category 1 storm. 

The first map I made was a tracking map for the storm from start to finish.  The track shows what category the storm was as it moved north through the Caribbean Sea and Atlantic Ocean.  The points along the track also show the wind speeds and barometric pressures.  Finally, the states affected by the hurricane are highlighted on the map.  The final product is below:


The second map I created took a more involved look at assessing the damage caused in a New Jersey neighborhood when Sandy made landfall.  I analyzed pictures of a section of beachfront property taken before and after the hurricane blew through the northeastern United States.  The intent of the analysis was to determine what kind of damage occurred to the homes near the coast.  The types of damage we looked at were structural, wind, and storm surge inundation.  Every house in the section I examined appeared to suffer storm surge inundation.  What was really difficult to determine were the effects of wind and structural damage, aside from those houses that were completely wiped out.  The parcels were annotated on the map with their level of structural damage.  A table was also created tallying the properties by damage type.  Finally, both the pre- and post-storm imagery was used to show the extent of the damage.  The map that was created is below. 


Let me know what you think!

Thursday, June 14, 2018

Script Debugging

This week's focus was on debugging, which is an integral part of script writing.  No matter how particular you are in your program writing, errors are bound to happen.  Whether they are syntax errors, exceptions, or even logic errors, they can be frustrating to locate and weed out when you are creating your scripts.  For the lab this week, our task was to review and debug three different provided scripts. 

The first script contained two errors, and both were simply typographical errors in different variables.  Once those typos were corrected, the script ran correctly, as you can see below. 


The second script was more complex and contained eight errors.  I had a lot of trouble getting past the first error, but once I realized that the problem was not only the file name but also the file location, I was able to step through the remaining errors fairly efficiently.  The types of errors I found in the second script included file name and location errors, file path syntax, typographical errors in function and method names, and missing or extra function parameters.  Once I corrected the eight errors, I got a successful run of the script, and here is how it turned out:


The last script had us take a different approach.  Instead of correcting the errors, the goal was to implement a try-except statement to identify errors within a section of code and still successfully run the rest of the script without erroring out.  The script had already been divided into two parts.  Part B was good to go with no errors.  There was an error in Part A, however, that needed to be isolated and reported using Python's try-except handling.  Once implemented successfully, I was able to get an error output for Part A and then still complete Part B of the script. 
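Structurally, the fix looked something like the sketch below.  The contents of the try block are placeholders, since the point is just that the except block reports the error and the script carries on to Part B:

    import arcpy

    try:
        # Part A: the section of code containing the error
        fc = "parks.shp"   # placeholder feature class name
        count = arcpy.GetCount_management(fc)
        print("Part A feature count: {0}".format(count))
    except Exception as e:
        # Report the error instead of letting it kill the script
        print("Error in Part A: " + str(e))

    # Part B: runs regardless of what happened in Part A
    print("Part B ran and completed successfully.")

The output of my actual script is below: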


So that is it.  There is no real flow chart for how I go through my debugging process.  I am sure debugging gets easier over time as one becomes more familiar with how different classes are used and called in Python and what they need to run successfully.  As always, leave any comments below on how you approach debugging. 

Cheers!

Sunday, June 10, 2018

Tsunami Analysis and the Fukushima Disaster

This week's focus for Applications in GIS was tsunami analysis.  While the analysis we conducted was based on the Fukushima tsunami disaster in 2011 and wasn't exactly preventative planning, the results of the analysis could be used for future planning.  In preparation for this lab, we also did a module focusing on how to create feature datasets in ArcGIS, as well as going a bit deeper into adding feature classes and rasters to geodatabases. 

For the lab analysis of the Fukushima disaster, we conducted two separate analyses.  One focused on the radiation spread from the Fukushima reactor meltdown, how distance affected how at risk one was, and where people really needed to evacuate from.  The second analysis focused on the tsunami itself.  Using raster imagery, masks were made from the coast to 10,000 m inland.  Using these masks and the already known land area that was affected, an evacuation zone plan was made to provide a layout for where safe land could be found in the event of another tsunami. 

The evacuation zones for the radiation were created manually using ring buffers centered on the Fukushima II power plant.  For the tsunami-affected lands, however, more automation was used by implementing a tool within ArcGIS called ModelBuilder.  Once these zones were created, the final map was produced and can be found below.  If anyone has any issues with my color choices, please comment below with suggestions.  Being a colorblind individual does not always translate well to full-color mapping. 
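As a side note, the manual ring buffers could also be scripted with the Multiple Ring Buffer tool.  A quick sketch, with the input feature name and distances assumed for illustration:

    import arcpy

    # Ring buffers around the plant (input name and distances are assumptions)
    arcpy.MultipleRingBuffer_analysis("fukushima_ii_plant", "radiation_zones",
                                      [3, 7, 15, 30, 50], "Kilometers")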

Enjoy!

Wednesday, June 6, 2018

Python Fundamentals II - Getting Acquainted with Modules and Conditionals

This week we got a bit deeper into programming with Python.  We delved deeper into using and importing modules and then were introduced to conditional statements like if/else and while/for loops.  The particular script I worked on had three parts.  The first part was simply finding the errors in a section of code and fixing them.  There were two, and the output you will see below shows that the errors were fixed.  This particular section of output shows 7 named players and whether their randomly assigned number, based on twice the length of their name, is a winning number or not. 

The second part of the script was to create a list of 20 random numbers, each between 0 and 10.  Once the list was generated, it was printed to the screen.  In the output picture below, this list is the first one you see. 
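Without giving away my full script before the due date, a generic sketch of this part looks like:

    import random

    # Build a list of 20 random integers, each between 0 and 10 inclusive
    numbers = [random.randint(0, 10) for _ in range(20)]
    print(numbers)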

The third part of the script was the more challenging part.  It used a combination of conditional statements to determine whether a predefined unlucky number was in the list.  If that number was not in the list, a statement was printed and the unmodified list was reprinted.  If the unlucky number was in the list, the number of times it occurred was counted, a statement was printed giving that count, the unlucky number was removed, and the modified list was printed to the screen. 
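Continuing with the numbers list from the sketch above, the general shape of that logic looks like this (the unlucky number here is an arbitrary placeholder):

    unlucky = 8   # placeholder; the lab predefines its own unlucky number

    if unlucky not in numbers:
        print("The unlucky number {0} is not in the list.".format(unlucky))
    else:
        count = numbers.count(unlucky)
        print("The number {0} appears {1} times; removing it.".format(unlucky, count))
        while unlucky in numbers:
            numbers.remove(unlucky)
    print(numbers)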

Here is a flow chart of my thought process for this last part:


And here is my output for the entirety of the script:


If you are curious how I achieved the outputs above, leave a comment.  I can share how I finalized my script after the due date.  Let me know if you have any questions, and happy scripting to all!

Sunday, June 3, 2018

Lahars and Mt. Hood

This week in Applications in GIS we focused on applying our GIS skills to determining how a lahar would affect a population base, using different tools within ArcGIS.  For those who don't know, a lahar is a destructive mudflow triggered during a volcanic eruption.  When a volcano erupts, the intense heat melts snowpack and glaciers on the volcano's slopes, and the resulting flows pick up mud and debris that are then carried downslope, destroying everything in their path.  Typically lahars follow existing stream and river beds, as they are the paths of least resistance.  Two somewhat recent lahar flows were those resulting from the Mt. St. Helens eruption in 1980 and the 1985 eruption of Nevado del Ruiz in Colombia, whose lahar killed over 20,000 people in the town of Armero. 

Our lab this week had us replicate a study of the Mt. Hood stratovolcano and how and where potential lahar flows would affect the surrounding areas.  The first step of the lab was to obtain the geodatabase we would be working from.  The data provided within this geodatabase would serve as the foundation for all the processes we would complete in this lab.  A main point of using this geodatabase was keeping a naming convention for newly created elements that made sense, and ensuring that we didn't keep useless files within the geodatabase.  The picture below illustrates how my geodatabase ended up at the end of the lab. 


The following steps were used within ArcGIS to create the basis for the map you will see below.  First, a study area was created around Mt. Hood that encompassed Multnomah, Wasco, Clackamas, and Hood River counties.  This study area would serve as a clip feature to remove unnecessary features later on in the map creation.  

The next step was to create a mosaic raster out of the provided raster files in the geodatabase.  This made analysis easier by letting us work with one raster instead of completing the analysis on multiple rasters.  Once the raster mosaic was created, the following tools from the Spatial Analyst toolbox were used in order: the Fill tool, the Flow Direction tool, and the Flow Accumulation tool, with appropriate file names for the resulting outputs.  So what did these do?  They basically identified the likely areas on Mt. Hood where liquid material will flow.  In essence, this amounted to a stream network flowing from the peak to the base and surrounding areas of the mountain.  
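In arcpy terms, that sequence looks roughly like the following sketch; the mosaic name is an assumption:

    import arcpy
    from arcpy.sa import Fill, FlowDirection, FlowAccumulation

    arcpy.CheckOutExtension("Spatial")

    # Fill sinks in the mosaicked DEM, then derive flow direction and accumulation
    filled = Fill("mthood_mosaic")           # assumed mosaic raster name
    flow_dir = FlowDirection(filled)
    flow_acc = FlowAccumulation(flow_dir)
    flow_acc.save("mthood_flowacc")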

Next I used the Int tool from the math toolset to convert the values from the previous steps to integers, since the pixels originally had floating-point values.  We then determined the value of 1% of the total number of pixels, and that threshold was used in the Con tool to extract what was more likely to be the true stream network.  This output was then converted to a geodatabase feature using the Stream to Feature tool.  
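Continuing the sketch above, with the 1% threshold computed from the raster's cell count:

    from arcpy.sa import Int, Con, StreamToFeature

    # Convert the floating-point accumulation values to integers
    flow_int = Int(flow_acc)

    # Threshold = 1% of the total number of cells in the raster
    rows = int(arcpy.GetRasterProperties_management(flow_int, "ROWCOUNT").getOutput(0))
    cols = int(arcpy.GetRasterProperties_management(flow_int, "COLUMNCOUNT").getOutput(0))
    threshold = int(rows * cols * 0.01)

    # Keep only the cells above the threshold -- the likelier true stream network
    streams = Con(flow_int > threshold, 1)
    StreamToFeature(streams, flow_dir, "stream_network")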

The remaining steps involved conducting the actual hazard analysis by creating a 1/2 mile buffer around our stream feature and determining which population blocks and schools would be affected by the 1/2 mile lahar buffer (sketched in the P.S. below).  The results were then mapped, and the output map is below.  Please let me know if this map works for you.  As a colorblind mapper, I always welcome comments and suggestions to make things better.  Cheers!
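P.S. For completeness, those closing steps in arcpy terms, with assumed layer names, would look something like:

    # Half-mile hazard buffer around the streams, dissolved into one zone
    arcpy.Buffer_analysis("stream_network", "lahar_zone", "0.5 Miles",
                          dissolve_option="ALL")

    # Find the census blocks and schools that intersect the hazard zone
    for fc in ("census_blocks", "schools"):
        lyr = fc + "_lyr"
        arcpy.MakeFeatureLayer_management(fc, lyr)
        arcpy.SelectLayerByLocation_management(lyr, "INTERSECT", "lahar_zone")
        count = int(arcpy.GetCount_management(lyr).getOutput(0))
        print("{0}: {1} features within the lahar zone".format(fc, count))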