=Notes=
The [https://www.census.gov/programs-surveys/acs/news/data-releases/2019/release-schedule.html ACS releases datasets] from September through December of the following year. It has the following datasets:
*ACS5: 5-year estimates for a broad list of variables, from 2009-2020, covering 29,573 places.
*ACS1: 1-year estimates from 2005-2020 for select variables, covering 630 places with populations of 65,000 or more
*ACS1s: 1-year supplemental estimates covering 2,323 places with populations of 20,000 or more
*Some historical datasets (like ACS3), which are not currently supported
ACS provides information [https://www.census.gov/programs-surveys/acs/geography-acs/areas-published.html on a broad range of geographies, including states, places, and ZCTAs].
*PUMS data for 2017: https://www2.census.gov/programs-surveys/acs/data/pums/2017/1-Year/
*Population and other data are available from: http://census.ire.org/data/bulkdata.html
Note: PUMS data are only available for [https://en.wikipedia.org/wiki/Public_Use_Microdata_Area Public Use Microdata Areas], each of which has a population of 100,000 or more.
=Using api.census.gov=
==Working with the API==
===Getting a Key===
The API doesn't seem to require a key, but you can request one from https://api.census.gov/.
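As a minimal sketch (the 2019 ACS5 endpoint, the B01001_001E variable, and the placeholder key below are illustrative assumptions, not part of the notes above), a key is passed as a key parameter on the query string:
<syntaxhighlight lang="python">
import urllib.request

API_KEY = "YOUR_KEY_HERE"  # placeholder; requests appear to work without it

# Example: total population (B01001_001E) for every state from the 2019 ACS5
url = ("https://api.census.gov/data/2019/acs/acs5"
       "?get=NAME,B01001_001E&for=state:*"
       "&key=" + API_KEY)

with urllib.request.urlopen(url) as response:
    print(response.read().decode("utf-8")[:200])  # peek at the raw payload
</syntaxhighlight>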
===Quoting, Nulls, and Error Codes===
Every field is string quoted (even if it's a number), including error codes; the exception is nulls. The data contains null values as well as values that indicate error codes (-999999999, -888888888, -666666666, -555555555, -222222222, and others; see https://www.census.gov/data/developers/data-sets/acs-1year/notes-on-acs-estimate-and-annotation-values.html). Generally, it appears that all negative values can be stripped.
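A sketch of one way to normalize returned values, assuming (as above) that all negative numbers are error or annotation codes that can be dropped:
<syntaxhighlight lang="python">
def clean_value(raw):
    """Convert a string-quoted API value to a number, mapping nulls and
    negative sentinel codes (-999999999, -888888888, ...) to None."""
    if raw is None:           # nulls arrive as real JSON nulls
        return None
    try:
        value = float(raw)    # everything else is string quoted
    except ValueError:
        return raw            # non-numeric fields (e.g. NAME) pass through
    return None if value < 0 else value
</syntaxhighlight>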
===Improper Encoding===
The API uses improper URL encoding (or more accurately, doesn't use URL encoding), so most methods for auto-populating URL parameters in scripting languages will not function correctly. Build URLs manually.
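For example (a sketch using the 2019 ACS5 endpoint; a library like Python's requests would percent-encode values such as place:* and state:06 if they were passed as parameters, so plain string formatting is used instead):
<syntaxhighlight lang="python">
# Build the query string by hand rather than letting a library encode it.
base = "https://api.census.gov/data/2019/acs/acs5"
fields = "NAME,B06011_001E"
url = f"{base}?get={fields}&for=place:*&in=state:06"  # no urlencode() step
</syntaxhighlight>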
The API returns improper JSON. Essentially, it just encodes a flat data structure into JSON-like syntax one line at a time and ignores the structure of the data (losing much of the point of encoding it in JSON). However, the first (zeroth) column is the requested geography name, so multiple pulls can be joined on it, and the first (zeroth) row is a valid set of column headers. The nulls are also valid JSON encodings. This, together with the consistent string quoting, makes using a JSON handler likely worthwhile. An alternative is to strip the JSON tags and treat the data as a badly encoded CSV, which it is.
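A sketch of the JSON-handler approach (URLs are assumed to be built manually as described above): parse the response, use row zero as column headers, and join pulls on the geography name in column zero.
<syntaxhighlight lang="python">
import json
import urllib.request

def fetch_table(url):
    """Return {geography_name: {column: value}} from one API pull."""
    with urllib.request.urlopen(url) as response:
        rows = json.loads(response.read().decode("utf-8"))
    header, data = rows[0], rows[1:]   # row zero is the column headers
    return {row[0]: dict(zip(header, row)) for row in data}

def join_pulls(tables):
    """Merge several pulls keyed on the shared geography name (column zero)."""
    merged = {}
    for table in tables:
        for name, row in table.items():
            merged.setdefault(name, {}).update(row)
    return merged
</syntaxhighlight>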
===Field and Group Names===
To request a group, ask for '''group(B06009)''' where you would normally request a field like '''B06011_001E'''. It looks like programmers can rely on groups and fields for estimates being identifiable from the "_" and the string length (\w{1}\d{5}\w{0,1}_\d{3}\w{1}), where the first letter is B or C and the last letter is E. Attribute columns for estimates can instead end in one letter (like M for Margin of error) or two letters (like EA for Annotation of Estimate and MA for Annotation of Margin). Note that requests for individual estimates don't provide the attributes, but requests for groups do.
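A sketch of that naming convention (the pattern and suffix set are taken from the notes above and may not be exhaustive):
<syntaxhighlight lang="python">
import re

ESTIMATE_RE  = re.compile(r"^[BC]\d{5}\w?_\d{3}E$")          # e.g. B06011_001E
ATTRIBUTE_RE = re.compile(r"^[BC]\d{5}\w?_\d{3}(M|EA|MA)$")  # margin / annotations

def classify(column):
    """Rough classification of a column name returned by a group request."""
    if ESTIMATE_RE.match(column):
        return "estimate"
    if ATTRIBUTE_RE.match(column):
        return "attribute"
    return "other"

# Requesting a whole group instead of a single field:
group_query = "get=group(B06009)&for=state:*"   # vs. get=B06011_001E
</syntaxhighlight>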
=Old Approach using AFF=
