Intermediate Data Visualization Techniques in Tableau

August 25-September 3, 2020

Virtual Training

AGENDA

A picture is worth a thousand words. Use data to state your case with easy-to-understand data visualization tools. Give your audience the freedom to explore your data in new ways through interactive dashboards that answer immediate questions and uncover new insights. Data visualization tools can help you communicate better, both internally and with your partners.

Tableau can help you produce more intuitive data visualizations, and we can show you how. In this course, you will build your skills in making appropriate graphics, but you will also incorporate complex calculations in ways that improve insights, make charts more relevant, and create the most impactful dashboard graphics.

Learn how to clean, shape, aggregate, and merge frequently used public data in Tableau Prep. Then, organize your visualizations into sleek dashboards in Tableau Desktop. We will provide helpful tips on how to analyze, design, and communicate these data in ways that will wow your supervisor and organization’s customers.

Training Prerequisites:

Skills: Participants must have a basic understanding of how Tableau works before attending this class, including knowledge of Tableau terminology, uploading data, editing data sources, and creating basic charts. Attendees should be familiar with all materials presented in the Pre-Session Videos: Overview of Charts and Calculated Fields.
Tools: Laptop, wired mouse, Tableau Desktop (personal, professional, or public version), and Tableau Prep.
• The public version of Tableau Desktop is available at:
https://public.tableau.com/s/download
• Tableau Prep Software can be downloaded here:
https://www.tableau.com/products/prep/download

**Zoom will be required for this training. If you have Zoom restrictions on a work laptop, we recommend using a personal laptop or desktop. We do not recommend using an iPad for this training.
Pricing
APDU, C2ER, LMI Institute Premium Organizational Members: $495
APDU, C2ER, LMI Institute Individual & Organizational Members: $575
Non-Members: $715

CANCELLATION POLICY: APDU must confirm cancellation before 5:00 PM (Eastern Standard Time) on August 14, 2020, after which a $135 cancellation fee will apply. Substitute registrations will be accepted.

APDU Member Blog Post: It’s not too late to rebuild data-user trust in Census 2020 data products

By: Jan Vink, Cornell Program on Applied Demographics
Vice chair of the Federal State Cooperative on Population Estimates Steering Committee
Twitter: @JanVink18
Opinions are my own

The Census Bureau is rethinking the way it will produce the data published from the 2020 Census. It argues that the old approach is no longer adequate in this day and age because, with enough computing power, someone could learn too many details about the respondents.

There are two separate but related aspects to this rethinking:

  1. The table shells: what tabulations to publish and what not to publish
  2. Disclosure Avoidance Systems (DAS) that add noise to the data before creating these tables

Both aspects have huge consequences for data users. A good place to start reading about this rethinking is the 2020 Census Data Products pages at the Census Bureau.

The Census Bureau is aware of this impact and has asked the data-user community for input throughout the decision process. There were Federal Register notices asking for use cases related to the 2010 tables and a request for feedback on a proposed set of tables. There were also publications applying a DAS to 1940 Census data, to PL94-171 data from the 2018 test, and in the 2010 Demonstration Products. Currently, the Census Bureau is asking for feedback on how to measure the progress of the DAS implementation it plans to use for the first set of products coming out of the Census.

The intentions behind this stakeholder involvement were good, but it did not lead to buy-in from those stakeholders. Many fear that the quantity and quality of the published data will severely impact the ability to make sound decisions and do sound research based on the 2020 Census and on products directly or indirectly derived from that data. Adding to this anxiety are the very difficult, unexpected circumstances the Census Bureau has to deal with while collecting the data.

From my perspective as one of those stakeholders wary about the quantity and quality of the data, there are a few things that could have gone better:

  • The need for rethinking has not been communicated clearly. For example, I cannot find a Census Bureau publication that plainly describes the re-identification process; all I can find are a few slides in a presentation. A layman’s explanation of the legal underpinning would also be helpful, as some argue that there has been a drastic reinterpretation.
  • The requests for feedback were all very complicated and time consuming, and they reached only a small group of very dedicated data users who felt tasked to respond on behalf of many and stuck to the low-hanging fruit.
  • It is not clear what the Census Bureau did with the responses.
  • The quality of the 2010 Demonstration Products was very low and would have severely impacted my use of the data, as well as the uses of many others.
  • Most Census Bureau communications about this rethinking consisted of a mention of a trade-off between privacy and accuracy, followed by a slew of arguments about the importance of privacy and hardly any mention of how important accuracy is to the mission of the Census Bureau. Many stakeholders walked away with the feeling that the Bureau feels responsible for privacy protection, but not as much for accuracy.

There is a hard deadline for the production of the PL94-171 data, although Congress has the power to extend that date because of the COVID-19 pandemic. Working back from that, I am afraid that decision time is not far away. The Census Bureau is developing the DAS using an agile process with about eight weeks between ‘sprints’. The Bureau published updated metrics from sprint II at the end of May but had already started sprint IV at that time. If the eight-week cadence holds, my estimate is that there is room in the schedule for two or three more sprints, and very little time to rebuild trust with the data-user community.

Examples of actions that would help rebuild some trust are:

  • Appointing someone who is responsible for stakeholder interaction. So far, my impression is that there is no big-picture communication plan, and two-way communication depends too much on whom you happen to know within the Census Bureau. Otherwise, communication is impersonal and slow, often without any possibility for back-and-forth. This person should also have the seniority to fast-track the publication review process so stakeholders are not constantly two steps behind.
  • Plan B. A chart often presented to us is a line showing the trade-off between privacy and accuracy. The exact location of that line depends on the privacy budget and the implementation of the DAS, and the Census Bureau seems to hold the position that it can implement a DAS with a sweet spot between accuracy and privacy that would be an acceptable compromise. But what if there is no differential-privacy-based DAS implementation (yet?) that can satisfy a minimal required accuracy and a maximal allowed disclosure risk simultaneously? So far it is an unproven technique for such a complex application. It would be good to hear that the Census Bureau has a plan B and a set of criteria that would lead to a decision to go with plan B.
  • Promise another set of 2010 data similar to the 2010 demonstration products so data users can re-evaluate the implications of the DAS. This should be done in a time frame that allows for tweaks to the DAS. Results of these evaluations could be part of the decision whether to move to plan B.
  • Have a public quality assurance plan. The mission of the Census Bureau is to be the publisher of quality data, but I could not find anything on the Census Bureau website that indicates what is meant by data quality or what quality standards are used. Nor could I find who in the Census Bureau oversees and is responsible for data quality. For example: does the Bureau see accuracy and fitness for use as the same concept? Others disagree. And what about consistency? Can inconsistent census data still be of high quality? Being open about data quality and having a clear set of quality standards would help show that quality is as high a priority as privacy.
  • Publish a timeline, with goals and decision points.
  • Feedback on the feedback: what did the Bureau do with the feedback it received? What criteria were used to implement some feedback but not the rest?

Time is short and the stakes are high, but I think there are still openings to regain the trust of the data community and to produce Census data products that are of provably high quality while protecting the privacy of respondents at the same time.

Intermediate Application of Data Sets: BLS Unemployment Data

Did you know that there are at least three sources of unemployment statistics in the United States? In this APDU webinar you’ll learn about the three primary data sources—Current Population Survey (CPS), Local Area Unemployment Statistics (LAUS), and American Community Survey (ACS)—and how they differ. Then we’ll explore how to access the official national and state unemployment statistics, based on CPS.

Presenter:
Garrett Schmitt, Senior Economist, Bureau of Labor Statistics

Intermediate Application of Data Sets: CDC Mortality Data

Mortality data are in the news on a daily basis. Accurate data is key to tracking the spread of COVID-19. However, there are important nuances that data users need to know:

  • How are mortality data collected?
  • When are data released?
  • Where can you access the data?
  • What are the differences between provisional and final mortality data?

Register for this APDU webinar today to learn more about mortality data from the CDC.

Presenter:
Robert N. Anderson, Ph.D., Chief, Mortality Statistics Branch, National Center for Health Statistics

Intermediate Application of Data Sets: New Coronavirus Household Pulse Survey from the Census Bureau

The coronavirus pandemic is unprecedented in recent history. To meet this challenge, the U.S. Census Bureau, in collaboration with other federal statistical agencies, has developed two new experimental surveys: a Household Pulse Survey and a Small Business Pulse Survey. This webinar will provide an overview of the household survey, including the following:

  • What type of information is collected in the household survey?
  • What is the data collection method?
  • What levels of geography are being reported?
  • Where can someone find the data?

Presenter:
Jason Fields, Senior Researcher for Demographic Programs, U.S. Census Bureau

New Microdata Access Tool for ACS and CPS

Have you ever looked for data from ACS or CPS, but published tables did not have exactly what you needed? Have you ever wished you could create quick crosstabs without writing code? The Census Bureau is developing a new microdata access feature on data.census.gov that allows users to create custom crosstabulations from ACS and CPS. Register for this webinar today to learn more about this feature and to submit feedback to the Census Bureau about how microdata access could be improved.

Presenter:
Tyson Weister, Survey Statistician, U.S. Census Bureau

APDU Launches 2020 Annual Conference

APDU is opening registration for the 2020 Annual Conference, set to be held at the Key Bridge Marriott in Arlington, VA on July 29-30, 2020. Trending issues in the world of data – issues of privacy, accuracy, and access – are profoundly changing how we think about the collection, production, sharing, and use of data. Register for the APDU Annual Conference today to learn how the coronavirus is impacting public data and evidence-based policymaking. Attendees will also hear about outcomes from the decennial census and the privacy and public health issues that are impacting it in 2020.

We recognize the tentative nature of in-person events in these uncertain times, but we will continue to plan for the conference in the hope of a return to normal. We are evaluating plans for a hybrid virtual conference to ensure that the conference will be delivered either live or online. Please know that cancellation fees will not apply to those who register early. If the conference cannot be held in person and transitions online, your registration will automatically transfer to the online content. If you don’t wish to attend online, we will provide a full refund on request. We will monitor these issues closely and be responsive to our members and partners.

APDU Member Post: Assessing the Use of Differential Privacy for the 2020 Census: Summary of What We Learned from the CNSTAT Workshop

By:

Joseph Hotz, Duke University

Joseph Salvo, New York City Department of City Planning

Background

The mission of the Census Bureau is to provide data that can be used to draw a picture of the nation, from the smallest towns and villages to the neighborhoods of the largest cities. Advances in computer science, better record linkage technology, and the proliferation of large public data sets have increased the risk of disclosing information about individuals in the census.

To assess these threats, the Census Bureau conducted a simulated attack, reconstructing person-level records from published 2010 Census tabulations using its previous Disclosure Avoidance System (DAS) that was based in large part on swapping data records across households and localities. When combined with information in commercial and publicly available databases, these reconstructed data suggested that 18 percent of the U.S. population could be identified with a high level of certainty. The Census Bureau concluded that, if adopted for 2020, the 2010 confidentiality measures would lead to a high risk of disclosing individual responses violating Title 13 of the U.S. Code, the law that prohibits such disclosures.

Thus, the Census Bureau was compelled to devise new methods to protect individual responses from disclosure. Nonetheless, such efforts – however well-intentioned – may pose a threat to the content, quality and usefulness of the very data that defines the Census Bureau’s mission and that demographers and statisticians rely on to draw a portrait of the nation’s communities.

The Census Bureau’s solution to protecting privacy is a new DAS based on a methodology referred to as Differential Privacy (DP). In brief, it functions by leveraging the same database reconstruction techniques that were used to diagnose the problem in the previous system: the 2020 DAS synthesizes a complete set of person- and household-level data records based on an extensive set of tabulations to which statistical noise has been added. Viewed as a continuum between total noise and total disclosure, the core of this method involves determining the amount of privacy loss, or ε, that can be accepted without compromising data privacy while still ensuring the utility of the data. The key then becomes “where to set the dial”: set ε too low and privacy is ensured at the cost of utility, but set ε too high and utility is ensured while privacy is compromised. In addition to the overall level of ε, its allocation over the content and detail of the census tabulations for 2020 is important. For example, specific block-level tabulations needed for redistricting may require a substantial allocation of the privacy-loss budget to achieve acceptable accuracy for this key use, but the cost is that the accuracy of other important data (including block-level data, such as persons per household) will likely be compromised. Finding ways to resolve these difficult tradeoffs represents a serious challenge for the Census Bureau and users of its data.
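To see how ε works as a dial, consider the classic Laplace mechanism, the textbook form of differential privacy. (This is an illustration only; the Bureau's actual TopDown algorithm is far more elaborate. The counts and settings below are hypothetical.) Noise is drawn with scale inversely proportional to ε, so shrinking ε widens the noise:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from a Laplace(0, scale) distribution."""
    u = random.uniform(-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def noisy_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Return a count with Laplace noise calibrated to epsilon.

    Smaller epsilon -> larger noise scale -> more privacy, less accuracy.
    """
    return true_count + laplace_noise(sensitivity / epsilon)

random.seed(0)
# A hypothetical block population of 1,000 under two dial settings:
print(noisy_count(1000, epsilon=1.0))    # noise scale 1: close to the truth
print(noisy_count(1000, epsilon=0.01))   # noise scale 100: heavily perturbed
```

A single tabulation is just one such noisy query; the privacy-loss budget is what gets divided up when many tables at many geographic levels all draw on the same ε.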

The CNSTAT Workshop

In order to test how well this methodology worked in terms of the accuracy of noise-infused data, the Census Bureau issued special 2010 Census files subject to the 2020 DAS. The demonstration files applied the 2020 Census DAS to the 2010 Census confidential data — that is, the unprotected data from the 2010 Census that are not publicly available. The demonstration data permit scientific inquiry into the impact of DP. In addition, the Census commissioned the Committee on National Statistics (CNSTAT) of the National Academies of Sciences, Engineering and Medicine to host a 2-day Workshop on 2020 Census Data Products: Data Needs and Privacy Considerations, held in Washington, DC, on December 11-12, 2019. The two-fold purpose of the workshop was:

  • To assess the utility of the tabulations in the 2010 Demonstration Product for specific use cases and real-life data applications.
  • To generate constructive feedback for the Census Bureau that will be useful in setting the ultimate privacy-loss budget and in allocating shares of that budget over the broad array of possible tables and geographic levels.

We both served as the co-chairs of the Committee that planned the Workshop. The Workshop brought together a diverse group of researchers who presented findings for a wide range of use cases that relied on data from past censuses.

These presentations, and the discussions surrounding them, provided a new set of evidence-based findings on the potential consequences of the Census Bureau’s new DAS. In what follows, we summarize “what we heard” or learned from the Workshop. This summary is ours alone; we do not speak for the Workshop’s Planning Committee, CNSTAT, or the Census Bureau. Nonetheless, we hope that the summary below provides the broader community of users of decennial census data with a better understanding of some of the potential consequences of the new DAS for the utility of the 2020 Census data products. Moreover, we hope it fosters an ongoing dialogue between the user community and the Census Bureau on ways to help ensure that data from the 2020 Census are of high quality, while still safeguarding the privacy and confidentiality of individual responses.

What We Heard

  • Population counts for some geographic units and demographic characteristics were not adversely affected by Differential Privacy (DP). Based on results presented at the Workshop, it appears that there were not, in general, differences in population counts between the 2010 demonstration file and the published 2010 data at some levels of geography. For the nation as a whole and for individual states, the Census’s algorithm ensured that counts were exact, i.e., counts at these levels were held invariant by design. Furthermore, the evidence presented also indicated that the counts in the demonstration products and those in the actual 2010 data were not very different for geographic areas that received direct allocations of the privacy budget, including most counties, metro areas (aggregates of counties), and census tracts. Finally, for these geographic areas, the population counts by age in the demonstration products were fairly accurate when using broader age groupings (5-10 year groupings or broader), as well as for some demographic characteristics (e.g., for non-Hispanic whites, and sometimes for Hispanics).
  • Concerns with data for small geographic areas and units and certain population groups. At the same time, evidence presented at the Workshop indicated that most data for small geographic areas – especially census blocks – are not usable given the privacy-loss level used to produce the demonstration file. With some exceptions, applications demonstrated that the variability of small-area data (i.e., blocks, block groups, census tracts) compromised existing analyses. Many Workshop participants indicated that a larger privacy loss budget will be needed for the 2020 Census products to attain a minimum threshold of utility for small-area data. Alternatively, compromises in the content of the publicly-released products will be required to ensure greater accuracy for small areas.

The Census did not include a direct allocation of the privacy-loss budget in the 2010 demonstration file to all geographic areas, such as places and county subdivisions, or to detailed race groups, such as American Indians. As noted by numerous presenters, these units and groups are very important for many use cases, as they are the basis for political, legal, and administrative decision-making. Many of these cases involve small populations where local officials rely on the census as a key benchmark; in many cases, it defines who they are.

  • Problems for temporal consistency of population counts. Several presentations highlighted the problem of temporal inconsistency of counts, i.e., from one census to the next using DP. The analyses presented at the Workshop suggested that comparisons of 2010 Census data under the old DAS to 2020 Census data under DP may well show inexplicable trends, up or down, for small geographic areas and population groups. (And comparisons of 2030 data under DP with 2020 data under DP may also show inconsistencies over time). For example, when using counts as denominators to monitor disease rates or mortality at finer levels of geography by race, by old vs young, etc., the concern is that it will be difficult to determine real changes in population counts, and, thus, real trends in disease or mortality rates, versus the impact of using DP.
  • Unexpected issues with the post-processing of the proposed DAS. The Top-Down algorithm (TDA) employed by the Census Bureau in constructing the 2010 demonstration data produced histograms at different levels of geography that are, by design, unbiased, but they are not integers and include negative counts. The post-processing required to produce a microdata file capable of generating tabulations of persons and housing units with non-negative integer counts produced biases that are responsible for many anomalies observed in the tabulations. These are both systematic and problematic for many use cases. Additional complications arise from the need to hold some data cells invariant to change (e.g., total population at the state level) and from the separate processing of person and housing unit tabulations.

The application of DP to raw census data (the Census Edited File [CEF]) produces estimates that can be used to model error, but the post-processing adds a layer of complexity that may be very difficult to model, making the creation of “confidence intervals” problematic.
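The source of this post-processing bias can be seen in a toy simulation (hypothetical numbers, and simple clamping rather than the Bureau's actual constrained-optimization step): noise added to a count is unbiased on average, but forcing the noisy value to be non-negative pushes small counts upward:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from a Laplace(0, scale) distribution."""
    u = random.uniform(-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

random.seed(1)
true_count = 2      # a small, block-like count (hypothetical)
scale = 5.0         # noise scale (hypothetical privacy-loss setting)

raw = [true_count + laplace_noise(scale) for _ in range(100_000)]
clamped = [max(0.0, x) for x in raw]        # crude stand-in for non-negativity post-processing

mean_raw = sum(raw) / len(raw)              # close to the true count: noise alone is unbiased
mean_clamped = sum(clamped) / len(clamped)  # systematically above the true count
print(round(mean_raw, 2), round(mean_clamped, 2))
```

The actual TDA resolves non-negativity, integrality, invariants, and cross-level consistency jointly rather than by clamping, but the qualitative effect — systematic bias concentrated in small counts — is the phenomenon presenters reported, and it is this extra layer that makes the error hard to model.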

  • Implications for other Census Bureau data products. Important parts of the planned 2020 Census data products cannot be handled by the current 2020 DAS and TDA approach. They will be handled using different but as-yet-unspecified methods that will need to be consistent with the global privacy-loss budget for the 2020 Census. These products were not included in the demonstration files and were out of scope for the Workshop. Nonetheless, as noted by several presenters and participants in the Workshop, these decisions raise important issues for many users and use cases going forward. To what extent will content for detailed race/Hispanic/nationality groups be available, especially for American Indian and Alaska Native populations? To what degree will data on household-person combinations and within-household composition be possible under DAS?

For example, while the Census Bureau has stated that 2025 will be the target date for the possible application of DP to the ACS, it indicated that the population estimates program will be subject to DP immediately following 2020. These estimates would then be used for weighting and post-stratification adjustments to the ACS.

  • Need for a plan to educate and provide guidance for users of the 2020 Census products. Regardless of what the Census Bureau decides with respect to ε and how it is allocated across tables, the Workshop participants made clear that a major re-education plan for data users needs to be put in place, with a focus on how best to describe key data and the shortcomings imposed by privacy considerations and error in general. Furthermore, as many at the Workshop voiced, such plans must be in place when the 2020 Census products are released to minimize major disruptions to, and problems with, the myriad uses made of these data and the decisions based on them.
  • Challenging privacy concerns and their potential consequences for the success of the 2020 Census. Finally, the Workshop included a panel of experts on privacy. These experts highlighted the disclosure risks associated with advances in linking information in public data sources, like the decennial census, with commercial databases containing information on bankruptcies and credit card debt, driver licenses, and federal, state, and local government databases on criminal offenses, public housing, and even citizenship status. While there are federal and state laws in place to protect against the misuse of these governmental databases as well as the census (i.e., Title 13), their adequacy is challenged by advances in data linkage technologies and algorithms. And, as several panelists noted, these potential disclosure risks may well undercut the willingness of members of various groups – including immigrants (whether citizens or not), individuals violating public housing codes, or those at risk of domestic violence – to participate in the 2020 Census.

The Census Bureau has recently stated that it plans to have CNSTAT organize a follow-up set of expert meetings to “document improvements and overcome remaining challenges in the 2020 DAS.” In our view, such efforts, however they are organized, need to ensure meaningful involvement and feedback from the user community. Many within that community remain skeptical of the Bureau’s adoption of Differential Privacy and its consequences for their use cases. So, not only is it important that Census try to address the various problems identified by Workshop presenters and others who evaluated the 2010 demonstration products, it also is essential that follow-up activities are designed to involve a broader base of user communities in a meaningful way.

We encourage members of the census data user community to become engaged in this evaluation process, agreeing, if asked, to become involved in these follow-up efforts. Such efforts will be essential to help ensure that the Census Bureau meets its dual mandate of being the nation’s leading provider of quality information about its people and economy while safeguarding the privacy of those who provide this information.

2020 APDU Conference Call for Proposals

#Trending in 2020: Data Privacy, Accuracy, and Access

APDU is welcoming proposals on any topic related to the privacy, accuracy, and access of public data. Proposals can be for a single presentation or panel, whether based on a particular project, data practice, or formal paper. In keeping with the theme of the conference, our interest is in highlighting the breadth of public data to both producers and consumers of public data. Some example topics might include:

  • Privacy
    • Differential privacy and tiered data
    • State/local data privacy issues
    • Data Suppression
    • Corporate data privacy (ex. Facebook’s use of differential privacy)
  • Accuracy
    • Machine learning and the use of programming languages
    • How data accuracy will affect redistricting or federal allocations
    • Federal agencies’ data protection actions and their impact on other agency data
    • Synthetic or administrative data
    • Decennial Census
      • Citizenship question
      • Complete Count Committee
  • Access
    • Future public data and policy developments
    • Current availability of public data (health, education, the economy, energy, the environment, climate, and other areas)
    • Federal statistical microdata such as ResearchDataGov
    • Federal Data Strategy updates and advocacy

Proposal Deadline: February 28, 2020.

You may submit ideas for a single presentation or a full panel (three presenters, plus a moderator). However, it is possible that we will accept portions of panel submissions to combine with other presenters. Submissions will be evaluated on the quality of work, relevance to APDU Conference attendees, uniqueness of topic and presenter, and thematic fit.

Please submit your proposal using the Survey Monkey collection window below.  Proposals will need to be submitted by members of APDU, and all presenters in a panel must register for the conference (full conference registration comes with a free APDU membership).  Proposers will be notified of our decision by March 13, 2020.

About APDU

The Association of Public Data Users (APDU) is a national network that links users, producers, and disseminators of government statistical data. APDU members share a vital concern about the collection, dissemination, preservation, and interpretation of public data.  The conference is in Arlington, VA on July 29-30, 2020, and brings together data users and data producers for conversations and presentations on a wide variety of data and statistical topics.


The 2020 Census is Here and Businesses can Help

Companies make strategic decisions every day that rely on accurate data about customers, employees and markets. In the United States, the information gleaned from the decennial population census is an important ingredient in much of the data that companies use to make a range of decisions such as where to locate new stores/facilities, how to market products, and what services to offer customers. The federal government also uses census information to distribute more than $1.5 trillion for programs like roads, education and workforce development that help to strengthen the economy.

The next nationwide count starts in most of the country this March, and companies can help ensure its accuracy by encouraging employees and customers to participate.

Below are a series of resources from the US Census Bureau and ReadyNation that businesses and business membership organizations may find helpful when developing plans to support the count:

  • Newsletter language: Templates for (i) business organizations to engage their membership and (ii) companies to engage their employees.