Sunday, July 8, 2018

Homeland Security - Preparing MEDS Datasets

This week we looked at how to prepare a set of data for use as Homeland Security's MEDS, or Minimum Essential Data Sets.  The purpose of this lab was to take base data and create a tailored MEDS package for the Boston Metropolitan Statistical Area (BMSA).  The goal is to use the data package that we created to conduct a pre-Boston Marathon Bombing assessment of the BMSA and determine what roadways, buildings, and infrastructure are classified as critical in the event of an emergency.

The MEDS dataset that I created contains different layer groups to help organize the data.  The groups that were created were Boundaries, Transportation, Hydrography, Land Cover, Orthoimagery, Elevation, and Geographic Names.  Each set of data was tailored for the BMSA.  To build the MEDS package, data was moved from a provided geodatabase into the proper layer groups listed above.  Manipulation of the data was necessary for certain datasets, such as roadways, to create the final MEDS dataset.  These group layers also had specific symbology and labeling applied to them. 

In the end, every group layer was saved as a layer file to preserve these settings.  This enables future users of the BMSA MEDS package to get right into doing data analysis without having to format a bunch of layers of data.  This also ensures that the formatting is standardized across all users. 
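For what it's worth, that layer-file export step can also be scripted.  Below is a minimal sketch using arcpy.mapping (ArcGIS 10.x); the map document path and output folder are assumptions, not the actual lab files:

```python
import arcpy
import os

# Hypothetical paths - the real MEDS map document and output folder will differ
mxd = arcpy.mapping.MapDocument(r"C:\MEDS\BMSA_MEDS.mxd")
out_dir = r"C:\MEDS\LayerFiles"

# Save each group layer (Boundaries, Transportation, Hydrography, etc.) as a
# .lyr file so its symbology and labeling travel with the data package
for lyr in arcpy.mapping.ListLayers(mxd):
    if lyr.isGroupLayer:
        lyr.saveACopy(os.path.join(out_dir, lyr.name + ".lyr"))

del mxd
```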

Below is a screen shot of my final BMSA MEDS package.  If you have any questions at all, leave a comment below. 


Friday, June 29, 2018

Geoprocessing with Python

This week we focused on executing geoprocessing steps outside of ArcGIS using Python scripting.  While the task for this week's assignment seemed daunting at first, once I broke it down and used the ArcGIS help pages to obtain the correct syntax, all the pieces fell into place rather simply and without much need for incessant troubleshooting.  The following steps were used to create the script, and a rough sketch of the resulting code follows the list:

  1. Import arcpy and the env tools
  2. Obtain the proper syntax for AddXY and add that tool to the script, export messages upon completion
  3. Obtain the proper syntax for Buffer and add that tool to the script, export messages upon completion
  4. Obtain the proper syntax for Dissolve and add that tool to the script, export messages upon completion
  5. End script.
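For anyone curious, a bare-bones sketch of a script following those steps might look like this (the workspace path and shapefile names are placeholders, not the actual lab data):

```python
import arcpy
from arcpy import env

# Placeholder workspace - point this at your own data folder
env.workspace = r"C:\GISProgramming\Module6\Data"
env.overwriteOutput = True

# AddXY: add X and Y coordinate fields to the point feature class
arcpy.AddXY_management("hospitals.shp")
print(arcpy.GetMessages())

# Buffer: create a 1000 meter buffer around each point
arcpy.Buffer_analysis("hospitals.shp", "hospitals_buffer.shp", "1000 Meters")
print(arcpy.GetMessages())

# Dissolve: merge the overlapping buffers into a single feature
arcpy.Dissolve_analysis("hospitals_buffer.shp", "hospitals_dissolve.shp")
print(arcpy.GetMessages())
```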
The script was literally that easy to create.  Only required parameters were used for this script as well, which made it that much easier to write.  Once the script was run, I opened ArcMap and verified that the steps were completed correctly.  Here is a screenshot of the interactive window from PythonWin showing all of the tool messages after I ran the script:


Let me know if you have any questions by using the comment section below!

Sunday, June 24, 2018

Crime Analysis in the DC Metro Area

This week focused on using different techniques within ArcGIS to analyze crime statistics and then create maps based on our analysis. 

The first map used statistical crime data to determine where there may be gaps in police coverage that could be filled by creating a new police station or substation.  This map also analyzed the most common forms of crime during 2011.  As you can see, almost 75% of all crimes happened within 1 mile of a police station.  Despite this, there were a couple of potential gaps that I proposed new stations for.  The first one, in the southeast, would cover a large area that was outside of the 2-mile coverage and reduce some of the burden on the 7th District station.  The second proposed station is in the northwest and would fill a small coverage gap, but more importantly, it could take some of the burden from the 2nd and 4th District stations. 
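If you wanted to reproduce the "crimes within 1 mile of a station" figure with geoprocessing rather than reading it off the map, one possible sketch is below; the geodatabase and layer names are made up for illustration:

```python
import arcpy

# Hypothetical workspace and feature class names
arcpy.env.workspace = r"C:\DC_Crime\crime.gdb"
arcpy.env.overwriteOutput = True

# Buffer the police stations at 1 mile (dissolved into a single polygon)
arcpy.Buffer_analysis("police_stations", "stations_1mi", "1 Miles",
                      dissolve_option="ALL")

# Select the 2011 crimes that fall inside the buffer and count them
arcpy.MakeFeatureLayer_management("crimes_2011", "crimes_lyr")
arcpy.SelectLayerByLocation_management("crimes_lyr", "WITHIN", "stations_1mi")

within = int(arcpy.GetCount_management("crimes_lyr").getOutput(0))
total = int(arcpy.GetCount_management("crimes_2011").getOutput(0))
print("{0:.1f}% of 2011 crimes fall within 1 mile of a station".format(
    100.0 * within / total))
```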



The second map I created focused on three types of crime: burglary, homicide, and sex abuse.  The three map panels show the different frequencies of these crimes, and that information is overlaid on top of a population density sublayer.  While it is a little difficult to see the population density clearly, you can gather by the dark undertones where there are more people and see whether population is more of a factor for certain crimes.  


That is all for now.  Let me know what you think and happy mapping.  


Wednesday, June 20, 2018

Geoprocessing in ArcGIS

This week's lab work focused on introducing us to creating toolboxes within ArcGIS that can hold custom-created models (built from the ArcGIS model builder) as well as script tools that, in this case, were written in the PythonWin IDE. 

The first step in the lab was to create a model tool using the model builder within ArcGIS.  The goal of the model was to take two shapefiles, create a clip of one using the other as a mask, then select certain features within that mask and create another output shapefile of the selected features.  The final step of the model was to delete the selected features from the original clipped shapefile.  What I ended up with is something like this:


The original clip was solid, while the final output above shows all the areas within it that were removed by running the model. 

The rest of the lab consisted of taking the model I created, exporting it to a python script, validating that the script worked and overwrote the existing output, and then finally taking the python script and creating a script tool within ArcGIS. 
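In spirit, the exported script boiled down to three geoprocessing calls.  Here is a rough sketch; the paths, field name, and selection expression are placeholders rather than the actual lab values, and using Erase for the "delete the selected features" step is my assumption:

```python
import arcpy

arcpy.env.overwriteOutput = True  # lets the script overwrite the existing outputs

# Placeholder input shapefiles and results folder
soils = r"C:\Module7\Data\soils.shp"
basin = r"C:\Module7\Data\basin.shp"
results = r"C:\Module7\Results"

# 1. Clip the soils layer to the basin boundary
arcpy.Clip_analysis(soils, basin, results + r"\soils_clip.shp")

# 2. Select the unwanted soil class from the clipped layer
arcpy.Select_analysis(results + r"\soils_clip.shp",
                      results + r"\soils_select.shp",
                      "\"FARMLNDCL\" = 'Not prime farmland'")

# 3. Erase the selected features from the clipped layer
arcpy.Erase_analysis(results + r"\soils_clip.shp",
                     results + r"\soils_select.shp",
                     results + r"\soils_final.shp")
```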

Overall it was really neat to work through the process and see how all these tools can be interconnected and have varying degrees of power and flexibility.  While the processes we are working with right now are relatively simple and mundane, and one might wonder why even script something so simple, it is plain to see that as you get into more complex and repetitive geoprocessing, scripting and model tools can be a huge help in reducing man-hours and tedious manual processing.  

Sunday, June 17, 2018

Tracking Hurricanes and Assessing Damage

This week in my Applications in GIS class we looked at how to apply GIS to hurricane tracking and damage assessments.  The scenario we applied these techniques to involved Hurricane Sandy, which caused unprecedented damage for a category 1 storm. 

The first map I made was a tracking map for the storm from start to finish.  The track shows what category the storm was as it moved north through the Caribbean Sea and Atlantic Ocean.  The points along the track also showed the wind speeds and barometric pressures.  Finally, the states that were affected by the hurricane were also highlighted on the map.  The final product is below:


The second map I created took a more involved look at assessing the damage that was caused in a New Jersey neighborhood when Sandy made landfall.  I analyzed pictures of a section of beachfront property that were taken before and after the hurricane blew through the northeastern United States.  The intent of the analysis was to determine what kind of damage occurred to the homes near the coast.  The types of damage that we looked at were structural, wind, and storm surge inundation.  The section I looked at appeared to have every house suffer storm surge inundation.  What was really difficult to determine were the effects of wind and structural damage aside from those houses that were completely wiped out.  The parcels were noted with the level of structural damage on the map.  A table was also created with the tallies of properties and the type of damage on them.  Finally, both the pre- and post-storm imagery were used to show the extent of the damage.  The map that was created is below. 


Let me know what you think!

Thursday, June 14, 2018

Script Debugging

This week's focus was on debugging, an integral part of script writing.  No matter how particular you are in your program writing, errors are bound to happen.  Whether they are syntax errors, exception errors, or even logic errors, they can be frustrating to locate and weed out when you are creating your scripts.  For the lab this week, it was our task to review and debug three different provided scripts. 

The first script contained two errors, and both were simply typos in different variable names.  Once those typos were corrected, the script ran correctly as you can see below. 


The second script was more complex and contained 8 errors within it.  I had a lot of trouble getting past the first error, but once I realized that the problem was not only the file name but also the file location, I was able to step through the remaining errors somewhat efficiently.  Some types of errors that I found within the second script included file name and location errors, file path syntax, typos in the names of functions/methods used, and missing or extra parameters for functions.  Once I corrected the 8 errors that I found, I was able to get a successful run of the script and here is how it turned out:


The last script had us take a different approach.  Instead of correcting the errors that were found, the goal was to implement a try-except block to identify errors within a section of code and still successfully run the entirety of the script without erroring out.  The script had already been divided into two parts.  Part B was good to go with no errors.  There was an error in Part A, however, that needed to be isolated and reported using Python's try-except statement.  Once implemented successfully, I was able to get an error output for Part A and then still complete Part B of the script.  The output is below:
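And for anyone who hasn't used it before, the try-except pattern itself looks roughly like this (a stripped-down sketch; the function and file names are made up for illustration):

```python
try:
    # Part A: this block contains the error we want to trap
    fc = "parks.shp"
    count = getCount(fc)          # NameError: getCount was never defined
    print("Part A completed")
except Exception as e:
    # Report the problem instead of letting the whole script die
    print("Part A had an error: " + str(e))

# Part B still runs even though Part A raised an exception
print("Part B is running...")
print("Part B completed")
```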


So that is it.  No real flow charting on how I go through my debugging process.  I am sure debugging gets easier over time as one becomes more familiar with how different classes are used/called upon in Python and what is needed for those to run successfully.  As always, leave any comments below on how you approach debugging. 

Cheers!

Sunday, June 10, 2018

Tsunami Analysis and the Fukushima Disaster

This week's focus for Applications in GIS was tsunami analysis.  While the analysis we conducted was based on the Fukushima tsunami incident in 2011 and wasn't exactly for preventative planning, the results of the analysis could be used for future planning.  In preparation for this lab, we also did a module focusing on how to create feature datasets in ArcGIS as well as going a bit deeper into adding feature classes and rasters into geodatabases. 

For the lab analysis of the Fukushima disaster, we conducted two separate analyses.  One focused on the radiation spread from the Fukushima reactor meltdown, how distance affected how at risk one was, and where people really needed to evacuate from.  The second analysis focused on the tsunami itself.  Using raster imagery, masks were made from the coast to 10,000 km inland.  Using these masks and the already known land area that was affected, an evacuation zone plan was made to provide a layout for where safe land could be found in the event of another tsunami. 

The radiation evacuation zones were created manually using ring buffers centered on the Fukushima II power plant.  For the evacuation zones covering the tsunami-affected lands, however, more automation was used by implementing a tool within ArcGIS called model builder.  Again, once these zones were created, the final map was created and can be found below.  If anyone has any issues with my color choices, please comment below with suggestions.  Living as a color blind individual does not always translate well to full color mapping. 
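As a side note, those concentric rings don't have to be drawn by hand either.  Below is a small sketch using the Multiple Ring Buffer tool; the geodatabase, feature class names, and ring distances are assumptions, not the lab's actual values:

```python
import arcpy

# Hypothetical workspace and feature class names
arcpy.env.workspace = r"C:\Fukushima\Fukushima.gdb"
arcpy.env.overwriteOutput = True

# Concentric evacuation rings around the power plant point feature
arcpy.MultipleRingBuffer_analysis("FukushimaII_plant",
                                  "radiation_evac_zones",
                                  [3, 7, 15, 30, 50],
                                  "Kilometers")
```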

Enjoy!

Wednesday, June 6, 2018

Python Fundamentals II - Getting Acquainted with Modules and Conditionals

This week we got a bit deeper into programming with Python.  We delved further into using/importing modules and then were introduced to conditional statements like if/else and while/for loops.  The particular script I worked on had three parts to it.  The first part was simply finding the errors in a section of code and fixing them.  There were two, and the output you will see below shows that the errors were fixed.  This particular section of output shows 7 named players and whether their randomly assigned number, based on twice the length of their name, is a winning number or not. 

The second part of the script was to create a list of 20 random numbers.  Each number had to be between 0 and 10.  Once the list was generated it was printed to the screen.  In the output picture below, this list will be the first one you see. 

The third part of the script was the more challenging part.  This used a combination of different conditional statements to determine if a predefined unlucky number was in the list.  If that number was not in the list, a statement was returned and the unmodified list was reprinted.  If the unlucky number was in the list, the number of times it occurred was counted, a statement was returned stating how many times the number was in the list, the occurrences of the unlucky number were removed, and then the modified list was printed to the screen. 
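Here is a minimal sketch of the logic for parts two and three; the unlucky number used here is just an example:

```python
import random

unlucky = 8   # example unlucky number

# Part 2: build a list of 20 random integers between 0 and 10
numbers = []
for i in range(20):
    numbers.append(random.randint(0, 10))
print(numbers)

# Part 3: report and remove the unlucky number if it shows up
if unlucky not in numbers:
    print("The unlucky number {0} is not in the list.".format(unlucky))
    print(numbers)
else:
    count = numbers.count(unlucky)
    print("The unlucky number {0} appears {1} times; removing it.".format(unlucky, count))
    while unlucky in numbers:
        numbers.remove(unlucky)
    print(numbers)
```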

Here is a flow chart for how my thought process worked for this last part:


And here is my output for the entirety of the script:


If you are curious as to how I achieved the outputs above, leave a comment.  I can give examples after the due date as to how I finalized my script.  Let me know if you have any questions and happy scripting to all!

Sunday, June 3, 2018

Lahars and Mt. Hood

This week in Applications in GIS we focused on applying our GIS skills to determining how a lahar would affect a population base, using different tools within ArcGIS.  For those who don't know what a lahar is, it is a destructive mudflow that is triggered during a volcanic eruption.  When a volcano erupts, the intense heat melts snowpack and glaciers on the volcano's slopes, and the resulting flows pick up mud and debris that are then carried downslope, destroying everything in their path.  Typically lahars will follow existing stream and river beds as they are the paths of least resistance.  Two somewhat recent lahar flows were those resulting from the Mt. St. Helens eruption in 1980 and the 1985 eruption of Nevado del Ruiz in Colombia, whose lahar flow killed over 20,000 people in the town of Armero. 

Our lab this week had us replicate a study of the Mt. Hood stratovolcano and how/where the potential lahar flows would affect the surrounding areas.  The first step of this lab was to obtain the geodatabase that we would be working from.  The data provided within this geodatabase would serve as the foundation for all the processes we would complete in this lab.  A main point of using this geodatabase was keeping a naming convention for newly created elements that made sense and ensuring that we didn't keep useless files within the geodatabase.  The picture below illustrates how my geodatabase ended up at the end of the lab. 


The following steps were used within ArcGIS to create the basis for the map that you will see below.  First, a study area was created around the Mt. Hood area that encompassed Multnomah, Wasco, Clackamas, and Hood River counties.  This study area would serve as a clip feature to remove unnecessary features later on in the map creation.  

The next step was to create a mosaic raster out of the provided raster files in the geodatabase.  This was done to make analysis easier by analyzing only one raster vice having to complete the analysis on multiple rasters.  Once the raster mosaic was created, the following tools from the Spatial Analyst toolset in ArcGIS were used in the following order: the Fill tool, the Flow Direction tool, and the Flow Accumulation tool.  Appropriate file names were used for the resulting outputs.  So what did these do?  They basically identified the likely areas where liquid materials will flow to on Mt. Hood.  In essence, this amounted to a stream network flowing from the peak to the base/surrounding areas of the mountain.  

Next I used the Int math tool to convert the values from the previous steps to integers, as the pixels originally had floating-point values.  We then determined the value corresponding to 1% of the total number of pixels and used that as the threshold in the Con tool to extract what was more likely to be the true stream network.  This output was then converted to a geodatabase feature using the Stream to Feature tool.  
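Put together as a script, the stream-network portion of the workflow might look something like the sketch below; the geodatabase path, mosaic name, and output names are placeholders for the lab data:

```python
import arcpy
from arcpy.sa import (Fill, FlowDirection, FlowAccumulation, Int, Con,
                      StreamToFeature)

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\MtHood\Hood.gdb"   # placeholder geodatabase
arcpy.env.overwriteOutput = True

# Fill sinks, then derive flow direction and flow accumulation
filled = Fill("elevation_mosaic")
flow_dir = FlowDirection(filled)
flow_acc = FlowAccumulation(flow_dir)

# Convert to integer values, then keep only cells above the threshold
acc_int = Int(flow_acc)
threshold = int(acc_int.width * acc_int.height * 0.01)   # 1% of the total pixel count
streams = Con(acc_int > threshold, 1)

# Turn the raster stream network into a feature class for the buffer analysis
StreamToFeature(streams, flow_dir, "stream_network", "NO_SIMPLIFY")
```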

The remaining steps involved conducting the actual hazard analysis by creating a 1/2 mile buffer around our stream feature and determining which population blocks and schools would be affected by the 1/2 mile lahar buffer.  The results were then mapped and the output map is below.  Please let me know if this map works for you.  As a colorblind mapper, I always welcome comments and suggestions to make things better.  Cheers!



Wednesday, May 30, 2018

Python Fundamentals Part 1

This week was a shallow dive into the basics of Python scripting.  We learned about basic variables, objects, functions, and methods, and how to use these elements to create a basic script.  The script itself only had one input at the beginning: a user's name.  From there the script would take the variable holding the name, split it into a list of the individual names separated by spaces in the initial string, and then print the last name to the screen.  From there the length of the last name was determined and then tripled.  The final part of the script was to print the tripled character count of the last name to the screen.  Here is a screenshot of how mine turned out. 


The script was tested using a number of different name strings as the initial variable, and the final outcome always printed the last name and triple the number of characters in that last name.  Again, this was all basic scripting to allow us to scratch the surface of script programming and get a feel for the basic elements of Python. 
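For anyone who wants to try something similar, here is a minimal sketch of that kind of script (the sample name is arbitrary):

```python
# Sample input string; in the lab this came from a user-supplied variable
full_name = "John Jacob Smith"

# Split the string into a list of names and pull out the last one
name_list = full_name.split(" ")
last_name = name_list[-1]
print(last_name)

# Triple the number of characters in the last name and print it
tripled = len(last_name) * 3
print(tripled)
```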

Let me know what you think in the comments below!  I can also answer any questions on how to achieve this outcome efficiently as well.  

Wednesday, May 23, 2018

Introduction to Python

Here we are in our first week of GIS Programming.  This week focused on setting up our folders in our S:/ drive and getting a basic understanding of Python scripting and editors.  The editors we looked at were IDLE, PythonWin, and the Python window in ArcMap 10.5.1.  Within these editors, a simple script was run to display the text "Hello World".  Nothing too fancy, just enough to show how different elements are highlighted and displayed in the different editors.  For the most part we will be using PythonWin for this course. 

To set up our folders for the course, we used a Python script vice manually creating 3 folders for each of the 12 modules in the course.  The screenshot below shows the outcome of this script.
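A folder-creation script along those lines might look roughly like this; the root path, module count, and subfolder names are assumptions rather than the course's exact layout:

```python
import os

root = r"S:\GISProgramming"              # hypothetical course root folder
subfolders = ["Data", "Scripts", "Results"]

for module in range(1, 13):              # 12 modules
    for sub in subfolders:
        path = os.path.join(root, "Module" + str(module), sub)
        if not os.path.exists(path):
            os.makedirs(path)            # creates intermediate folders as needed
            print("Created " + path)
```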


The overall process summary felt like an introduction to completing process summaries.  This one focused on showing where you store your data, asked some questions from the reading, and then had us explain how a certain step was completed in our own words.  This introduction to process summaries would be beneficial to first time students or those who have been away for a couple of semesters. 

Other than that, this was probably the only simple week I can hope to have during this course.  My coding/scripting skills are extremely rusty, but I am hoping it will all come back quickly and overall I am looking forward to this course.  

Thursday, May 3, 2018

Computer Cartography - Finally Done!


This final project was designed to utilize the cartographic skills, methods, and principles that were learned throughout this course.  To illustrate mastery of these skills, a complex map depicting multiple layers of data was created.  The data used to create this map were the 2014 nationwide average SAT scores for each state as well as each state's student participation percentage in taking the SAT.  Mapping these two pieces of data together on the same map could show a correlation between SAT scores and how many students actually participated in the testing. 
To create a map that depicts two different sets of data, two thematic methods were chosen to provide enough distinction between the data sets.  The two thematic methods used were choropleth mapping using graduated colors for the average composite SAT scores in each state and proportional symbol mapping using graduated symbols for the participation percentage. 
The data for the state average SAT scores needed to be combined into a composite score, as it was provided as the three separate SAT section scores.  No normalization of the composite scores was applied to the data.  Once these scores were combined for each state, the resulting composite scores were used to create a choropleth map of the United States.  Graduated color symbology was used in ArcMap to achieve this.  The data was also divided into five groups, or score ranges, using the quantile breaks method.  This was used to ensure an equal number of states was in each group, providing some color differentiation and an even balance to the overall map.  The color ramp for this data series ran from red (lower SAT scores) to green (higher SAT scores).
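To make the quantile idea concrete, here is a toy sketch of equal-count classification; the scores below are invented, not the 2014 SAT data:

```python
# Invented composite scores for illustration only
scores = [1345, 1398, 1402, 1456, 1471, 1490, 1502, 1533, 1550, 1604,
          1623, 1651, 1677, 1690, 1712, 1721, 1745, 1766, 1790, 1802]

def quantile_breaks(values, classes=5):
    """Return the upper limit of each class so every class holds an equal count."""
    ordered = sorted(values)
    size = len(ordered) / float(classes)
    return [ordered[int(round(size * (i + 1))) - 1] for i in range(classes)]

print(quantile_breaks(scores))   # five upper class limits, four values per class
```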
The participation percentage data was taken directly from the source document, and no manipulation or normalization was required for mapping.  To map this data, proportional symbology, in the form of graduated circles, was used to map the participation percentage on top of the existing choropleth map of average scores.  A light blue circle with a black border was used to contrast with the color scheme of the base map.  The actual percentage was also placed inside each circle to clearly show the data on the map.  For this symbology the data was again divided into five groups, this time using natural breaks instead of quantile.  
As one looks at this map, you can quickly see that the states with the higher average composite SAT scores also happen to be the states reporting the lower participation percentages.  One could assume that only smarter students are encouraged to take the SAT in these states, whereas the states with the lower averages and higher participation rates have their average scores brought down by more average students.  One could also assume that if the states with lower participation rates were to encourage more students to take the SAT, their average composite scores would come down and fall more in line with the higher-participating states.  
So this map brings this course to an end.  I will say that of all the GIS coursework I have done up till now, this has been one of the more frustrating courses for me.  I think that is because a lot of the evaluation has been subjective in interpretation.  That, combined with being colorblind, has made applying certain principles quite challenging.  This definitely hasn't been as easy as just obtaining data and creating a map from it.  That being said, I have learned a lot, and I guess the Marine in me gets enjoyment and fulfillment from challenging and frustrating situations.  


Sunday, April 15, 2018

Mapping with Google Earth

Google Earth (GE) is a simple interface that users of all skills can use to view map layers.  GE provides the base with 3D structure layers, aerial landscape photography, borders, rivers, lakes, oceans, and just about anything you can think to want in a map.  What's more is that GE can import additional layers that you map in other programs like ArcGIS. 

This week we used Google Earth to revisit our dot density population map of southern Florida.  The dot density layers that were created in ArcGIS were converted to the .kmz format that GE uses.  The two layers were: 1) the entire map containing the dots depicting population, with the water features included as well, and 2) the layer the dot density was derived from, which showed different information in GE.
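The conversion itself can be handled by the Layer To KML geoprocessing tool.  A quick sketch is below; the layer file names and output paths are placeholders:

```python
import arcpy

arcpy.env.overwriteOutput = True

# Convert each ArcGIS layer file to a .kmz that Google Earth can open
arcpy.LayerToKML_conversion(r"C:\SouthFlorida\dot_density.lyr",
                            r"C:\SouthFlorida\dot_density.kmz")
arcpy.LayerToKML_conversion(r"C:\SouthFlorida\county_population.lyr",
                            r"C:\SouthFlorida\county_population.kmz")
```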

The second layer did not show the dot densities in GE.  Instead, when a county was clicked on, like Miami-Dade County in the screenshot below, it would display the information for that county. 

County Information - Google Earth

The next task we completed in this lab was to create a tour using our existing dot density layers and visiting different points around southern Florida.  The map itself with the dot density layer looked like this:


Additional points were added in GE for reference when creating the tour.  The tour starts from the view in the screenshot above, depicting the population dot density for the region as well as highlighting the different water features.  It then hops from Miami to Fort Lauderdale and then moves west across the state to St. Petersburg and Tampa.  I tried to give different perspectives throughout the tour and, as hard as I tried, things weren't exactly smooth.  Once the tour was completed, it was added to the overall .kmz package in GE to complete the project.  I was even able to convert/export the movie to an .mp4 format, and you all get the exclusive viewing; just click to watch below.  





Sunday, April 8, 2018

3D Mapping

This week's lab took us into 3D mapping and its various aspects.  ESRI offers a great online course for 3D visualization using ArcScene and ArcGlobe.  If you have not taken this course before and 3D mapping is something you are interested in, then I would highly recommend taking it.  It gives the quick, down-and-dirty version of how to accomplish these tasks using the ArcGIS suite. 

In this lab we focused on 5 sections using ArcScene.  The first section taught us how to set base heights for raster and feature data.  This was done using Crater Lake as a backdrop and adding the lake, rivers, watch towers, and land use data over the top of elevation data.  Here is how mine turned out:

The second module taught us how to apply vertical exaggeration to a map.  Minnesota is a fairly flat state, especially in the section we worked with.  This is what made it such a great candidate to apply vertical exaggeration to so we could see the features.  The picture below is Minnesota with about 23 times exaggeration to make the features stand out. 

 

The third module showed us how to use illumination to highlight certain features.  Santa Barbara Island was the backdrop and I set the sun to be coming from the south and not much over the horizon, about 9 degrees.  You can see how this highlighted the cliff face in this picture:

The next two exercises focused on using extrusions to create 3D buildings.  The first picture shows buildings and wells.  You could actually see the depth of the wells if you looked underneath the map.  How cool is that?  The second picture uses the dollar value of the specific parcels to illustrate which ones were worth more and whether they were commercial, residential, or industrial lands.


After completing the ESRI lesson, we moved on to making 3D buildings for Boston.  This started off in ArcGIS, creating the data we would need to make the building extrusions in ArcScene.  Once that data was created, we moved to ArcScene and made the buildings there using the extrusion process.  That data was then saved as a .kml file to later be imported into Google Earth.  The picture below shows the buildings that were created in ArcGIS/ArcScene and then imported into Google Earth. 

Overall this was an interesting lab.  All of the work we have done over the semester has been in 2D, and it was somewhat refreshing to take a different look at it this week.  I can definitely see some pros to 3D mapping in the fields of flood modeling and utilities planning.  The ability to see how different water levels may affect a 3D modeled city, or how/where different utilities are placed in a 3D environment, is definitely useful.  A couple of things people need to be aware of, though: 1. Be aware of where you are facing in your 3D maps, and 2. Know that sometimes 3D maps are distorted to show you their message or enhance features.  The data doesn't change, only the perception of what it is.  

Sunday, April 1, 2018

Dot Dot Goose! Dot Mapping Southern Florida

Despite what you may be thinking, dot mapping is not a game of connect the dots to make a map.  Dot mapping, in fact, is a way to display conceptual data that is not uniform throughout a given area.  It allows us to visualize patterns that may occur in the data.  So how does one come up with a schema for using the dots?  Well, in the map below we use dots to represent a set number of people; in our case, each dot equals 10,000 people on the map.  I also bound the dots to only occur within urbanized zones because it doesn't make much sense to see a dot of 10k people in the middle of Lake Okeechobee, does it?  So here is the map, and I will later discuss how I made it:


So the map in general was pretty easy to make....once I got past ArcGIS constantly crashing.  So here are the steps I took to make it:
  1. The majority of the work took place in ArcGIS with some final polishing in Adobe Illustrator.  
  2. First I added the south Florida shapefile to my TOC.
  3. Next the population data spreadsheet was added to the TOC and then joined to the Florida shapefile.  This allowed me to access the population data I needed to create the dot map you see above.
  4. The next step was to create the dot symbology for the map.  This was done in the symbology tab for the Florida shapefile's property window.  Under the Quantities section, Dot density was selected and the Population field was used to create the Dot Map.
  5. The Dot Value was set to 10,000 and the Dot Size was set to 2.6.  This took many iterations to figure out what seemed to look best.  
  6. Next, other map layers (water and urban) were added along with the essential map elements.  
  7. The water layer ended up causing multiple crashes of ArcGIS.  To combat this I turned the Dot layer into its own map file and kept the rest on the original map file.  
  8. With both map files ready, I exported them both to a .AI (Adobe Illustrator) format.  Later I would merge these two in Illustrator to finalize the map.  
  9. From here work shifted to Illustrator.  Both maps were merged by copying the dot layer and then using a paste-in-place function on the other layers.  This worked out great as you can see above.  
  10. The final pieces to this map were creating the legend and adding a simple drop shadow to the map.  
And there you have it.  I hope you enjoyed the map.  What do you think and how else could I have applied the dot size and values to make this map even better?  Let me know in the comments below!

Sunday, March 25, 2018

Flow Mapping

Hey there Mappers!

This week's focus is on Flow Mapping.  In particular, the map we created was a distributive flow map.  For those who don't know what a distributive flow map is, it depicts the movement of commodities, people, or ideas between geographic areas.  The map I created this week depicts the number of people who legally immigrated to the United States in 2007.  So how did I make this map?  Glad you asked, because I am going to give you a basic rundown of how I did this using Adobe Illustrator.

The first challenge was determining which provided template to use and how I would lay everything out.  Ultimately I chose Layout A, which had the world continents and an inset showing the immigration rate for each state.  From here I separated each continent (conveniently grouped in the template) and placed them around the artboard in Illustrator.  I then enlarged the "lower 48" states and moved Alaska and Hawaii closer to mainland USA.  One thing to note is that the scale bar was enlarged with the mainland to maintain proportions.  Next was fine tuning the positions of the continents around the United States.  With the main layout complete there was just one last piece that had to be added, that was for the "Unknown" immigrants.  For this I simply made a circle, placed it appropriately on the map and added a question mark to show that it was unknown.

The next big step that I worked on was placing the flow lines on the map using Illustrator's pen tool.  This allowed for nice arcs to depict the movement of people from a continent to the US.  I placed this layer below the continents layer, giving each line a clean start point and an end right before the US.  While the pen tool is a little wonky to get used to, it really can make nice flow lines on the map.  The next part for the flow lines was assigning the correct proportional width: using the data provided in Excel, each continent's line width was scaled relative to the largest flow, with a maximum width of 16 pt (a small sketch of that arithmetic follows the list).  Here is how my proportions came out:
  • Africa – 8pt
  • Asia – 16pt
  • Europe – 9pt
  • North America – 15pt
  • Oceania – 2pt
  • South America – 8pt
  • Unknown – 1pt 
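Here is a small sketch of that proportional-width arithmetic.  The immigrant counts below are invented placeholders chosen only to illustrate the scaling, and linear scaling against the largest flow is my assumption:

```python
# Invented counts for illustration; only the ratios matter here
immigrants = {"Asia": 400000, "North America": 375000, "Europe": 225000,
              "Africa": 200000, "South America": 200000, "Oceania": 50000}
max_width = 16.0                      # widest flow line, in points

largest = max(immigrants.values())
for region in sorted(immigrants):
    width = max_width * immigrants[region] / largest   # scale linearly to the largest flow
    print("{0}: {1:.0f} pt".format(region, width))
```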
With the majority of the map laid out, all that was left was to add the essential map elements and stylize the map.  I created a legend for the Percent of Total Immigration per State by creating boxes and then using Illustrator's eyedropper tool to match the colors.  Then I added the appropriate labels throughout the map along with the essential text elements in the lower right corner.  Here are the styling elements that I used on my map:
  • Dark Background – I felt this contrasted best with the lighter colors of the provided continents
  • Outer Glow – I applied this to the continents and immigration layers.  I felt this helped those elements pop out from the dark background.  One thing to note is that I had to duplicate my continents layer to apply the outer glow and place it lower on the precedence table as the glow was fading out my flow arrows in a way I did not like.  
  • Drop Shadows – These were applied to the continents, the immigration map, title and flow arrows.  For the continents, there was a softer drop shadow which gives the overall map a more 3D look with the glow and dark background.  The continent shadow even covered the starting segments of the flow arrows which gave them a nice effect.  The title has a nice subtle drop shadow that doesn’t extend too much from it.  For the flow arrows I used a hard drop shadow which I felt helped to emphasize the point of the map, that being immigration to the United States.    
  • Continent Colors – For the most part, these stayed stock to the provided .ai file.  However, I did recolor Africa as the colors for Asia, Africa, and Oceania were all similar and I wanted there to be some separation of continent colors.  
  • Text Color – This was fairly simple.  I used black coloring against the lighter colors and white against the darker colors.
After all was said and done, I was left with the map below.  Let me know what you think!




Sunday, March 11, 2018

Isarithmic Mapping

This week was all about Isarithmic Mapping and with that we focused on continuous tones and hypsometric tints.  We even looked at how to add contour lines to our maps, with a focus on adding them to the hypsometric tints map. 

Continuous tone maps use unclassified data and plot it across a color spectrum (grey-scale can be used as well).  This helps show what each data point is in each location and is primarily useful in creating elevation, population, heat, and other maps that display similar types of data. 

Hypsometric maps, on the other hand, use data that has been classified into different groups.  This is also the type of map that people would more easily recognize, as they see them frequently in weather reports.  That's right, the kind of data that works best is different kinds of weather phenomena, including temperatures and precipitation rates. 

So that takes us to the map I created this week.  What you see is a map of Washington State and the average annual amount of precipitation, using hypsometric mapping with contour lines added as well.  The data that is being displayed was derived from Oregon State University's PRISM model, which in this case was used by the US Department of Agriculture to display these precipitation totals.  What is most interesting about the PRISM model is that it better takes into account mountainous terrain where other models didn't.  Because of this, the PRISM model has become widely used by thousands of agencies, universities, and companies worldwide. 

Without further ado, here is this week's map:


Sunday, March 4, 2018

Choroplething...not Chloroplething

This week we focused on choropleth mapping and using proportional and graduated symbols.  The purpose of choropleth maps is to display information that applies to a single entity like a country, state, or county.  To properly do this, though, one has to standardize the data to the entity it is linked to.  In our map this week, we used Europe as the backdrop and our entities were the countries within Europe.  The specific data we displayed on the map was population density, or population per square kilometer.  The color scale I chose for the population density was a green color ramp.  I chose this for a couple of reasons:

  1. This is a coloration of landmass depicting life (population density) and I felt this was appropriate.  
  2. Indirectly, this map will also depict wine consumption in a different symbology and this to me equates to the environment and relaxation, which green adequately represents. 
I will admit that I had to have my wife help pick out the green tones as I am red/green colorblind (amongst other colors) and I had trouble selecting the right one.

The other part of this map shows how much wine people in each country drink.  We had to decide whether to use proportional or graduated symbology.  I chose graduated symbology, as I felt the circles were easier to read and they also didn't take up as much space or overlap like the proportional ones did.  I used a simple blue color for these as I felt it contrasted well with the choropleth map colors.

Some stylized features I added to the map include:

  • Blue background to double as water as most of Europe is surrounded by water
  • Blue, italicized text for labeling water features
  • Drop shadow of the continent adds a little 3D element to the map
  • Fancy title text that I felt wine drinkers might identify with, also added a drop shadow
So my map is here below.  Please feel free to add any comments or suggestions.  


Sunday, February 25, 2018

Data Classification

This week we looked at how different classification methods present the same data set.  We also looked at how these same classifications presented a second, similar set of data.  The map I will share focuses on the percent of the population over the age of 65 residing in Miami-Dade County. 


I chose to share this map because it actually shows the greatest variation in classification representation across all the maps.  I feel that this is due to the values of the datasets from the two different maps.  The two data sets were the percent of citizens over 65 and the total number of citizens over 65 normalized per square mile.  So you can see that the class ranges for the percent values went from 0 to just under 80%.  In the other map, the values went from 0 to 13,190.  This wider range of values led to more washing out of the census tracts on that map when compared to the percent-over-65 map. 

For fun, here is the total population over 65 map as well.  Enjoy!


Sunday, February 18, 2018

Spatial Statistics

This week we learned about how to apply spatial statistics to data inside a map.  The training provided by ESRI covered different aspects of geostatistics.  The first things learned in the training were how to apply a mean center, median center, and directional distribution to a map in ArcGIS, and what these actually mean in the context of your data.  These tools basically show you where your data lies directionally and where it is centered.  You will see these applied to the map below. 
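These three measures can also be generated with the Spatial Statistics tools in a few lines of Python.  Below is a small sketch; the geodatabase and feature class names are hypothetical:

```python
import arcpy

# Hypothetical workspace and feature class names
arcpy.env.workspace = r"C:\Europe\weather.gdb"
arcpy.env.overwriteOutput = True

# Mean center, median center, and directional distribution (standard
# deviational ellipse) of the weather station points
arcpy.MeanCenter_stats("weather_stations", "stations_mean_center")
arcpy.MedianCenter_stats("weather_stations", "stations_median_center")
arcpy.DirectionalDistribution_stats("weather_stations", "stations_ellipse",
                                    "1_STANDARD_DEVIATION")
```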

Next the training focused on histograms and QQ plots and how to interpret what those graphs display.  These are good tools to determine if your data is normally distributed or not and they are also good for finding outliers in your data. 

Finally the training focused on identifying outliers and conducting trend analysis.  Some tools to assist in identifying outliers included Voronoi maps and semivariogram clouds, in addition to the histograms and QQ plots covered earlier in the training.  Trend analysis was also conducted in ArcGIS using the Explore Data > Trend Analysis tool on the Geostatistical Analyst toolbar. 

The map I created below shows the weather monitoring stations in western Europe.  This map highlights the mean center and median center location as well as the general directional distribution of all the points.  A full explanation is included in the map itself. 


Sunday, February 11, 2018

Cartographic Design - Ward 7 Public Schools

The focus of this lab was to apply Gestalt principles to creating a map of all the public schools in Ward 7 in Washington, D.C.  These principles are visual hierarchy, contrast, figure-ground relationship, and balance.  Here is how I achieved each principle:
  • Visual Hierarchy - The most important items should stand out and that is the case here with Ward 7 itself, the schools we are identifying, the titles, and finally the legend.  All other elements are lower on the scale of importance and are therefore subdued or de-emphasized so they don't draw the reader's attention away prematurely.
  • Contrast - Contrast was mostly applied to the schools.  The red color stands out against a white and grey background.  The school symbols have contrast within themselves as well as their different sizes indicate what type of school they are.  The call out box in the locator map is also a form of contrast as it is bright yellow and that stands out from its surroundings.  
  • Figure-Ground Relationship - For this a couple of techniques were used.  1.  Ward 7 is brighter than the surrounding areas.  2.  The roads are mostly subdued (grey in color) to allow the school icons to stand out better. 
  • Balance - Balance was achieved by placing Ward 7 in a location that best filled the map space.  Then the locator was placed in the empty area in the top left of the map space.  The legend, scale bar, north arrow, and additional information were placed in the bottom right empty space.  For the top right empty space there wasn't much left to add to the map; balance was already achieved from the top left and bottom right corners, but the space felt empty.  So I decided to label that colored space (matching the locator) as still indeed being Maryland.  
Overall I think the map came out ok and some new skills were learned in Illustrator.  One thing I would like to learn in the future is how to make the roads that have dead ends round out vice remain open as they are in my map.  I did learn the neat trick of using multiple strokes in Illustrator to achieve the hollow road appearance, so I was happy with that.  I do hope that my color choices are acceptable as I always caveat that I am color blind, so these seem ok to me.  Leave your thoughts, comments, and suggestions below as they are always welcome.  

Cheers!


Sunday, February 4, 2018

Learning the Typ-to-the-Ography

This week's lab focused on typography and reinforcing essential map elements.  Our instruction centered on creating a map, which you will see below, of Marathon, Florida, located in the Florida Keys. 

An initial base map was created in ArcGIS, which was then exported as an Adobe Illustrator (AI) file.  AI was where the majority of the work was done on this map.  The focus of the work was to create a map of Marathon, identifying key (no pun intended) features using typographic principles, and ensuring that we properly included the essential map elements.  An inset map, indicating where Marathon is in relation to southern Florida and the rest of the Keys, was included as a reference. 

My map, again below, focused on 4 labeling standards.  All cities are Arial 12pt.  All keys (islands) are Arial 10pt.  All water features are Arial Italic, 60% grey in color, and sized to fit the feature.  I also used symbols, identified in the legend, to identify various features and to reduce word clutter on the map. 

Some custom features to this map include drop shadows, path text for the title/subtitle, and a unique border. 

Without further ado, here is my map.  As always, comments and suggestions for improvement are welcome below. 


Sunday, January 28, 2018

Getting My Fingers Wet in Adobe Illustrator

This week's module had us get into Adobe Illustrator.  This was a first for me and took some time getting around all the ins and outs of the program.  I have limited Photoshop experience from long ago, so that at least helped me navigate some of the basics, in particular that of managing layers. 

One thing I learned about was the power of scripting to make mass changes easy.  Take this for example from my process summary:

"For this lab I did a google search to find a symbol changing script.  I found one created by an Adobe user named Jet and it was linked on another user’s webpage.  I downloaded the script, selected the symbols I wanted to change, and then ran the script.  After the script was completed, I had to move the changed objects back to their appropriate sub-layers.  The file for the script was saved for later use."

My personal design preferences may not be the best.  I have never claimed to be artistically competent in any fashion.  To complicate things, being color blind doesn't help much.  So with that stated, I do welcome any suggestions that anyone cares to offer as a comment to this post.  

Below is a copy of the map I created for this module.  Hope you enjoy it!


Saturday, January 20, 2018

This week we learned about the early history of mapping as well as key map design principles.  Based on these principles, we were tasked with finding what we thought was a well-designed map as well as a poorly designed map.  Below are my examples along with my brief reviews of the maps. 

Well-Designed Map


Overall, I feel this is a well-designed map.  The key Map Design Principles (20 Tufteisms) I feel this map most meets are:
  • 5. Graphical excellence requires telling the truth about the data: There is no further truth on this map than depicting the correct locations of the archaeological sites on the map.
  • 7. Clear, detailed, and thorough labeling should be used to defeat graphical distortion and ambiguity: Labels are used throughout the map; where labeling could become ambiguous, lines are drawn from label to point, defeating any uncertainty about what is being labeled.
  • 13. Above all else, show the data: Clearly the data is shown in this map.  All archaeological sites are clearly depicted on the map.

For my own preferences and as a tourist to this island, I like how this map shows where key points are, how I could possibly get to them using the island’s road infrastructure, and despite being colorblind, I am able to distinguish the elevation levels on the map which would tell me how hard of a hike I would have if I chose to walk to some of these sites.

Poorly-Designed Map


Overall, I feel this is a poorly-designed map.  The key Map Design Principles (20 Tufteisms) I feel this map most falls short on are:
  • 2. Graphical excellence consists of complex ideas communicated with clarity, precision, and efficiency: This map was single-minded and did not clearly convey its message. 
  • 3. Graphical excellence is that which gives to the viewer the greatest number of ideas in the shortest time with the least ink in the smallest space: In my opinion, this map failed to do this.  The symbology was grossly inappropriate and didn't tell the correct story.  Much time had to be invested to determine where even the capital cities belonged, and if one was not familiar with their states/capitals, it would be difficult to determine what is what on this map.
  • 9. Show data variation, not design variation: Here is where I don't feel that the designer effectively displayed the variation in data in a way that makes it easy to interpret the map, let alone gather what message they were trying to convey.  What does it matter if a capital city has a large population or not?  What are you comparing that to? 

My critique of this map and suggestions for improvement would involve mainly redesigning how you display your capital city population densities.  I would suggest using color variations vice size of the symbology to convey this message.  Then, the symbology could be placed correctly at their appropriate locations on the map.  Next, I would find the “so what?” to compare it to.  What does a state capital’s population density mean compared to “X”?  Finally, more map elements: scale bars, more effective legend, maybe a text box explanation of what is going on.  There is much to improve upon in regards to basic map elements.  

Friday, January 12, 2018

Hello Fellow Computer Cartographers! I'm Jason!

Hello Everybody!

As the title says, I'm Jason and I will be joining you all in this fun mapping experience we are about to embark upon.  A little about myself: I am 38 years old, married, and have two kids (ages 14 & 10).  I have a BS in Computer Science and I am working on a second BS in Environmental Management.  This second degree is just buying time as I work through the application process for the Master's program in GIS Administration.  During my 20+ years in the work force I have worked in metal injection molding (we made small metal parts), I have been an officer in the Marine Corps (recently left the reserves as a Major), and since I left the active duty Marine Corps I have been working as a consultant with Booz Allen Hamilton where I have been on contract with the Navy for the last 4 years doing cyber security. 

These are the third and fourth GIS-related courses that I am taking here at UWF.  Last semester I completed Intro to GIS and Photo Interpretation/Remote Sensing.  This semester I am of course in this cartography course and am also taking GIS Management (an approved Master's level course). 

My main goal right now is to execute a career transition from cyber security to that of forest management with the US Forest Service.  This is the main reason as to why I am back in school while still working a full time job to support my family. 

Adjectives that describe me: stubborn, determined, goal minded.

Below is my Story Map.  You will see that I have been a few places in the world, some are pictured in the map, some are not.  I have also been all over the USA.  If you like what you see or have any questions about me or where I have been, comment below!


Story Map: http://arcg.is/2r3nwTG