Nancy Potok, formerly deputy director and chief operating officer of the U.S. Census Bureau, has been appointed chief statistician of the United States. Located within the Office of Management and Budget, Potok's new position is responsible for providing coordination, guidance, and oversight for U.S. federal statistical agencies and activities. Potok fills the position most recently occupied by Katherine Wallman, who retired after holding it for 24 years.
CREC is happy to announce the release of the first State Data Sharing Initiative report, "Improved State Administrative Data Sharing: A Strategy to Promote Evidence-Based Policymaking for Economic and Workforce Development." The research findings provide policymakers with insights about the parameters used to manage intrastate administrative data sharing, especially for corporate income tax and unemployment insurance wage records, the most valuable data resources available for economic and workforce development program evaluation and policy analysis. For the full report, click here.
News
Commission on Evidence-Based Policymaking Hearing
The Commission on Evidence-Based Policymaking (CEP) will hold its third hearing to gather input from stakeholders on February 9, 2017 in San Francisco, CA. The hearing provides participants with the opportunity to present their views on issues relevant to the Commission's charge as outlined in Public Law 114-140. Speakers will have 5-7 minutes to provide a brief oral statement and may submit written statements for the record. CEP encourages interested participants to submit a two- to three-sentence abstract of their statement, along with their name and professional affiliation (if applicable), to Input@cep.gov no later than January 24.
Self-Funding Our Statistical Agencies
It's impossible to evaluate the effectiveness of public policies without quality data. It's impossible to accurately market goods and services without geographic and demographic inputs. It's impossible to track global warming and even to tap the benefits of Waze and Google Maps without inputs from government surveys. And yet, the ability of our statistical agencies to consistently provide top quality data is subject to the whims of congressional appropriations.
Open Data in the Age of Trump
The federal government's open data transformation can, and should, continue. Here's how.
Eight Key Steps to Implement the DATA Act
Carol Fortine Ochoa, Inspector General at GSA, discusses the agency's progress in meeting the upcoming DATA Act deadlines.
Preserving Agency Data During the Presidential Transition
Hudson Hollister, executive director of the Data Coalition, discusses how agencies can ensure their data is preserved during the presidential transition, and the upcoming DATA Act deadline.
Governments large and small spend considerable amounts of public money to pay for health facilities, public safety, social aid and public works, and capital improvement projects. This money usually comes from taxes that Congress allocates to federal programs, but it can also come directly from agency fines, fees, or settlement collections. This variety of funding sources makes reporting federal public spending data, including agency financial information, genuinely difficult.
Why Data Will Rule Higher Education
At the annual EDUCAUSE conference this past fall, Jeffrey J. Selingo sat down with F. King Alexander, president of Louisiana State University (LSU), for a conversation on bridging the gap between technology solutions and business problems in higher education. King is a big fan of data's ability to help educators drive success for their institutions and students.
The "democratizing" of data promotes transparency and knowledge building. The Office of Policy Development and Research (PD&R) at the Department of Housing and Urban Development (HUD) has made great strides in this area; this article highlights some of their most recent accomplishments.
New & Updated Data Sources
Visualization of the Week
Pairing Wine and Cheese with Data Science
University of Toronto computational biology professor Gary Bader has created an interactive data visualization that allows users to identify complementary wine and cheese pairings based on different factors, including a wine's country of origin and a cheese's moisture level. The visualization uses software called Cytoscape that Bader and other researchers initially developed for complex genetic and molecular analysis, such as mapping the relationship between different genes and autism or cancer. Users can search for approximately 1,000 ideal pairings between 100 different red and white wines and 270 cheeses.
Did you work on a great report that you want your colleagues to know about? Just email us and we'll include it here.
Federal Rulemaking and Calls for Comment
APDU maintains a list of open calls for comment on proposed federal data collections. We periodically alert APDU members to newly added calls for comment. Over the last several weeks, calls for comment on the following proposed data collections were added: