---
title: Explore and Download
keywords: fastai
sidebar: home_sidebar
summary: "In this tutorial, the basics of Colabs are introduced and ACS data is downloaded."
---
This Coding Notebook is the first in a series.
An interactive version can be found here.
This colab and more can be found at https://github.com/BNIA/colabs
Content covered in previous tutorials will be used in later tutorials.
New code and/or information will have explanations and/or descriptions attached.
Concepts or code covered in previous tutorials will be used without being explained in their entirety.
If content cannot be found in the current tutorial and is not covered in previous tutorials, please let me know.
Disclaimer
Views Expressed: All views expressed in this tutorial are the author's own and do not represent the opinions of any entity whatsoever with which they have been, are now, or will be affiliated.
Responsibility, Errors and Omissions: The author makes no assurance about the reliability of the information. The author takes no responsibility for updating the tutorial or maintaining its performant status. Under no circumstances shall the author or their affiliates be liable for any indirect, incidental, consequential, special, or exemplary damages arising out of or in connection with this tutorial. Information is provided 'as is' with the distinct possibility of errors and omissions. Information found within the contents is attached with an MIT license. Please refer to the License for more information.
Use at Risk: Any action you take upon the information in this tutorial is strictly at your own risk, and the author will not be liable for any losses or damages in connection with the use of this tutorial and subsequent products.
Fair Use: This site contains copyrighted material the use of which has not always been specifically authorized by the copyright owner. While no intention is made to unlawfully use copyrighted work, circumstances may arise in which such material is made available in an effort to advance scientific literacy. We believe this constitutes a 'fair use' of any such copyrighted material as provided for in Section 107 of the US Copyright Law. In accordance with Title 17 U.S.C. Section 108, the material in this tutorial is distributed without profit to those who have expressed a prior interest in receiving the included information for research and education purposes.
For more information, go to: http://www.law.cornell.edu/uscode/17/107.shtml. If you wish to use copyrighted material from this site for purposes of your own that go beyond 'fair use', you must obtain permission from the copyright owner.
License
Copyright © 2019 BNIA-JFI
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
In this tutorial, the basics of Colabs are introduced.
By the end of this tutorial, users should have an understanding of how to run Colab notebooks and how to explore and download ACS data.
Instructions: Read all text and execute all code in order.
How to execute code:
If you would like to see the code you are executing, double-click the label 'Run:'. Code is accompanied by brief inline descriptions.
Try it! Go ahead and run the cell below. The result is a flow chart of how this tutorial may be used.
#@title Run: View User Path
# This box uses HTML magic '%%html' to denote
# that anything after that tag will be in HTML not Python.
# Within the HTML, a JavaScript library (Mermaid)
# renders a graph from the markup inside the HTML div.
# The graph maps out the possible paths
# you may take when using this notebook.
%%html
<script src="https://code.jquery.com/jquery-1.10.2.js"></script>
<script src="https://unpkg.com/mermaid@7.1.0/dist/mermaid.min.js"> </script>
<script> window.mermaid.init() </script>
<div class="mermaid">
graph LR
User>User] --> ExploreAcsTables
ExploreAcsTables --> DownloadAcsTable
DownloadAcsTable -- Repeat --> DownloadAcsTable
DownloadAcsTable --> DecideWhatToDo{DecideWhatToDo}
DecideWhatToDo --> RunAnotherNotebook{RunAnotherNotebook}
DecideWhatToDo --> FinishTheNotebook</div>
Census data comes in two flavors:
1) American Community Survey (ACS)
2) Decennial Census
Census data can come at a variety of levels.
These levels define the specificity of the data.
I.e., whether the data reports on individual communities or entire cities is contingent on its granularity.
The data we will be downloading in this tutorial, ACS data, can be found at the Tract level and no closer.
Aggregating Tracts is how BNIA calculates some of its yearly community indicators!
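The idea of aggregating tract counts up to a community is a plain group-and-sum. Here is a minimal sketch with pandas; the tract numbers, community labels, and column names are made up for illustration and are not real BNIA data.

```python
# Sketch: rolling tract-level counts up to community-level indicators.
# All values and the CSA assignments below are hypothetical.
import pandas as pd

tracts = pd.DataFrame({
    "tract": ["030100", "030200", "040100"],
    "csa":   ["CSA-A",  "CSA-A",  "CSA-B"],   # hypothetical tract-to-community mapping
    "households": [1200, 800, 1500],
})

# Summing tract counts within each community yields a community-level total.
by_csa = tracts.groupby("csa")["households"].sum()
print(by_csa)
```

This only works cleanly for additive measures like counts; rates and medians need a different aggregation strategy.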
Each of the bolded words in the content below is a level identifiable through a 'Geographic Reference Code'.
For more information on Geographic Reference Codes, refer to the table of contents for the section on that matter.
Run the following code to see how these different levels nest into each other!
#@title Run: Census Granularities
%%html
<script src="https://code.jquery.com/jquery-1.10.2.js"></script>
<script src="https://unpkg.com/mermaid@7.1.0/dist/mermaid.min.js"> </script>
<script> window.mermaid.init() </script>
<div class="mermaid">
graph LR
Census_Block --create--> block_group
block_group --> tract_level
tract_level --> csa_level
csa_level --> jurisdiction_level
jurisdiction_level --> state_level
state_level
</div>
State, County, and Tract IDs are called Geographic Reference Codes.
This information is crucial to know when accessing data.
In order to successfully pull data, Census State and County codes must be provided.
The code herein is configured by default to pull data on Baltimore City, MD and its constituent Tracts.
In order to find your State and County code:
Either
A) Click the link: https://geocoding.geo.census.gov/geocoder/geographies/address — upon entering a unique address, you can locate state and county codes under the associated values 'Counties' and 'State'
OR
B) Alternatively, click https://www.census.gov/geographies/reference-files/time-series/geo/tallies.html
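If you prefer to query the geocoder linked in option A programmatically, a request URL can be assembled as below. The `benchmark` and `vintage` parameter values are assumptions based on the geocoder's documented defaults; verify them against the Census geocoder documentation before relying on them.

```python
# Sketch: building a request URL for the Census geocoder's
# one-line-address geographies lookup. Parameter values for
# 'benchmark' and 'vintage' are assumptions to verify.
from urllib.parse import urlencode

def build_geocoder_url(address):
    base = "https://geocoding.geo.census.gov/geocoder/geographies/onelineaddress"
    params = {
        "address": address,
        "benchmark": "Public_AR_Current",  # assumed benchmark name
        "vintage": "Current_Current",      # assumed vintage name
        "format": "json",
    }
    return base + "?" + urlencode(params)

url = build_geocoder_url("417 E Fayette St, Baltimore, MD")
print(url)
```

The JSON response nests the state and county FIPS codes inside its geographies section, mirroring the 'Counties' and 'State' values you would read off the web form.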
Searching for a dataset is the first step in the data processing pipeline.
In this tutorial we plan on processing ACS data in a programmatic fashion.
This tutorial will not just allow you to search/explore ACS tables and inspect their contents (attributes), but also to download, format, and clean them!
Although a table explorer section is provided, it is suggested that you instead explore available data tables and retrieve their IDs using the dedicated websites provided below:
American Fact Finder may assist you in your data locating and download needs: https://factfinder.census.gov/faces/nav/jsf/pages/index.xhtml Fact Finder provides a nice interface to explore available datasets. From Fact Finder you can grab a table's ID and continue the tutorial. Alternately, you can download the data for your community directly via the Fact Finder interface. From there, you may continue the tutorial by loading the downloaded dataset as an external resource; instructions on how to do this are provided further below in this tutorial.
Update : 12/18/2019 " American FactFinder (AFF) will remain as an "archive" system for accessing historical data until spring 2020. " - American Fact Finder Website
The New American Fact Finder : https://data.census.gov/cedsci/
This new website is provided by the Census Bureau. Its 'Advanced Search' feature has all the filtering abilities of the older, deprecated (soon discontinued) American Fact Finder website. It is still a bit buggy to date and may not apply all filters. Filters include years (only one year may be picked at a time), geography (state, county, tract), topic, surveys, and Table ID. The filters you apply are shown at the bottom of the query, and submitting the search will yield data tables ready for download as well as table IDs that you may grab for use in this tutorial.
From ME:
These tables are created by the census and are pre-compiled views of the data.
From the ACS Website:
Detailed Tables contain the most detailed cross-tabulations, many of which are published down to block groups. The data are population counts. There are over 20,000 variables in this dataset.
Subject Tables provide an overview of the estimates available in a particular topic. The data are presented as population counts and percentages. There are over 18,000 variables in this dataset.
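The table family can be read straight off the table ID: Detailed Tables start with 'B' and Subject Tables with 'S', and that letter decides which API path the data comes from. The sketch below mirrors the layout of the Census API for ACS 5-year data, but treat the exact paths as assumptions to check against the API documentation.

```python
# Sketch: classifying an ACS table by the first letter of its ID and
# picking the corresponding API endpoint. The URL template is an
# assumption modeled on the Census API's ACS 5-year layout.
def acs_table_kind(table_id):
    if table_id.startswith("S"):
        return "subject"
    if table_id.startswith("B"):
        return "detailed"
    return "other"

def acs_endpoint(table_id, year="2017"):
    base = f"https://api.census.gov/data/{year}/acs/acs5"
    # Subject tables live under a '/subject' sub-path.
    return base + "/subject" if acs_table_kind(table_id) == "subject" else base

print(acs_table_kind("B19001"), acs_endpoint("B19001"))
print(acs_table_kind("S1701"), acs_endpoint("S1701"))
```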
For more information (via the API), please visit:
You will need to run this next box first in order for anything following it to work
%matplotlib inline
!jupyter nbextension enable --py widgetsnbextension
Access Google Drive directories:
You can also import files directly into a temporary folder from a public folder
Now let's explore the file system using the built-in terminal:
By default you are positioned in the /content/ folder.
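The same exploration can be done from Python rather than the terminal; a small sketch using the standard library:

```python
# Sketch: inspecting the current working directory programmatically
# instead of through the notebook terminal.
import os

print(os.getcwd())              # in Colab this is typically /content
for entry in sorted(os.listdir(".")):
    print(entry)
```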
Please note: The following section details a programmatic way to access and explore the census data catalogs. Rather than use this portion of the tutorial, it is advised that you read the section 'Searching For Data' --> 'Search Advice' above, which provides links to dedicated websites hosted by the Census Bureau explicitly for your data exploration needs!
Retrieve and search available ACS datasets through the ACS's table directory.
The table directory contains a TableId and Description for each data table the ACS provides.
By running the next cell, an interactive searchbox will filter the directory for keywords within the description.
Be sure to grab the TableId once you find a table with a description of interest.
Once a table has been picked from the explorer, you can inspect its column names in the next part.
This will help ensure it has the data you need!
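The keyword filtering that the interactive searchbox performs boils down to a case-insensitive substring match over the description column. A minimal sketch, with a made-up three-row directory standing in for the real one:

```python
# Sketch: filtering a table directory by keyword, as the interactive
# searchbox does. The directory contents here are fabricated examples.
import pandas as pd

directory = pd.DataFrame({
    "TableId": ["B19001", "B25003", "S1701"],
    "Description": [
        "Household Income in the Past 12 Months",
        "Tenure (Owner/Renter Occupied)",
        "Poverty Status in the Past 12 Months",
    ],
})

def search_tables(df, keyword):
    # Case-insensitive substring match against the description column.
    mask = df["Description"].str.contains(keyword, case=False)
    return df[mask]

print(search_tables(directory, "income"))
```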
The data structure we receive is different from the prior table.
Intake and processing differ as a result.
Now let's explore what we got, just like before.
The only difference is that the column names are automatically included in this query.
Intro
Hopefully, by now you know which datatable you would like to download!
The following Python function will do that for you.
Description: This function returns ACS data given appropriate params.
Purpose: Retrieves ACS data from the web
Services
Input:
Output:
How it works
Before our program retrieves the actual data, it will want the table's metadata.
The function changes the URL it requests data from depending on whether the user has requested an S- or B-type table.
Multiple calls for data must be made, as a single table may have several hundred columns in it.
Our program pulls not just tract-level data but also the aggregate for the county.
Finally, we will download the data in two different formats if desired.
If we choose to save the data, we save it once with the TableId + column names and once without the TableIds.
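The steps above can be partly illustrated in code. The reason for the multiple calls is that the Census API caps the number of variables per request (50 at the time of writing), so a wide table's columns are fetched in batches and joined afterwards. The column names and batch size below are illustrative assumptions.

```python
# Sketch: splitting a long variable list into API-sized batches.
# The 50-variable limit reflects the Census API's per-request cap;
# verify the current limit in the API documentation.
def chunk_variables(variables, batch_size=50):
    # Yield successive batches of at most batch_size variable names.
    for start in range(0, len(variables), batch_size):
        yield variables[start:start + batch_size]

# 117 hypothetical column names in the Census 'B19001_001E' style.
columns = [f"B19001_{i:03d}E" for i in range(1, 118)]
batches = list(chunk_variables(columns))
print(len(batches))  # 117 columns -> batches of 50, 50, and 17
```

Each batch becomes one request; the resulting frames are then joined on the geography columns to rebuild the full table.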
#@title Run: Class Diagram retrieve_acs_data()
%%html
<script src="https://code.jquery.com/jquery-1.10.2.js"></script>
<script src="https://unpkg.com/mermaid@7.1.0/dist/mermaid.min.js"> </script>
<script> window.mermaid.init() </script>
<div class="mermaid" style="height: 300px;">
classDiagram
RetrieveAcsData <|-- RetrieveAcsData
RetrieveAcsData : + String: state => required
RetrieveAcsData : + String: county => required
RetrieveAcsData : + String: tract => required
RetrieveAcsData : + String: tableId => required
RetrieveAcsData : + String: saveAcs => required
RetrieveAcsData : + String: includeCountyAgg => required
RetrieveAcsData : + Int: year => required
RetrieveAcsData: + getParams(keys)
RetrieveAcsData: + getBCityParams(keys)
RetrieveAcsData: + readIn( url )
RetrieveAcsData: + addKeys( table, params)
#@title Run: retrieve_acs_data Flow Chart
%%html
<script src="https://code.jquery.com/jquery-1.10.2.js"></script>
<script src="https://unpkg.com/mermaid@7.1.0/dist/mermaid.min.js"> </script>
<script> window.mermaid.init() </script>
<div class="mermaid">
graph LR
retrieve_acs_data>retrieve_acs_data] --> getMetaData
getMetaData --> useSubjectTables{useSubjectTables?}
useSubjectTables --> getTractData
useSubjectTables --> getCountyData
getTractData --> RenameCols
getCountyData --> RenameCols
RenameCols --> Save{Save}
#@title Run: Gantt Chart retrieve_acs_data()
%%html
<script src="https://code.jquery.com/jquery-1.10.2.js"></script>
<script src="https://unpkg.com/mermaid@7.1.0/dist/mermaid.min.js"> </script>
<script> window.mermaid.init() </script>
<div class="mermaid">
gantt
title retrieve_acs_data
dateFormat DD
section getMetaData
getMetaData :a1, 01, 1d
section getTractData
getTractData :a2, after a1 , 1d
section getCountyData
getCountyData :after a1 , 1d
section Clean
Clean :a3, after a2 , 1d
section Save
Save :after a3 , 1d
</div>
#@title Run: Sequence Diagram retrieve_acs_data()
%%html
<script src="https://code.jquery.com/jquery-1.10.2.js"></script>
<script src="https://unpkg.com/mermaid@7.1.0/dist/mermaid.min.js"> </script>
<script> window.mermaid.init() </script>
<div class="mermaid">
sequenceDiagram
retrieve_acs_data->>+getMetaData: URL
getMetaData-->>-retrieve_acs_data: MetaData
retrieve_acs_data->>+getData: URL
getData-->>-retrieve_acs_data: Data
retrieve_acs_data-->>+Prettify: Data
Prettify-->>+Save: Data
</div>
Now use this function to Download the Data!
# Our download function will use Baltimore City's tract, county and state as internal parameters.
# Changing the values in the cell below to different geographic reference codes will change those parameters.
tract = '*'
county = '510'
state = '24'
# Specify the download parameters the function will receive here
tableId = 'B19001'
year = '17'
saveAcs = True
df = retrieve_acs_data(state, county, tract, tableId, year, saveAcs)
df.head()