Software programmers, librarians, and others are archiving scientific data from government websites, prompted by the Trump administration's proposed deep budget cuts to agencies that support significant research and by reports of government-sanctioned deletion of important data.
"We're most concerned that data might be taken offline and public accessibility will be gone and it'll only be available as [Freedom of Information Act] requests," says University of Pennsylvania librarian Margaret Janz. "Our goal is to make trustworthy copies of data so it will be available to the public and suitable for research."
Janz helps organize archiving events via the DataRefuge program, whose volunteers copy only data in the public domain. The process starts with the nomination of URLs for storage in the nonprofit public Internet Archive, with more complex data "harvested" by participants using scripts and tools written in the R or Python programming languages, along the lines of the sketch below.
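The article does not reproduce the harvesting scripts themselves, but a minimal Python sketch of the kind of workflow described might look like the following: download a dataset from a nominated URL and record a SHA-256 checksum so the copy can later be verified as trustworthy, in the sense Janz describes. The URL, file paths, and function names here are hypothetical, not taken from DataRefuge's actual tools.

    import hashlib
    from pathlib import Path

    import requests

    # Hypothetical example: a nominated dataset URL and a local destination.
    DATASET_URL = "https://example.gov/data/climate_readings.csv"  # placeholder, not a real endpoint
    DEST = Path("archive/climate_readings.csv")

    def harvest(url: str, dest: Path) -> str:
        """Download a dataset and return its SHA-256 checksum for later verification."""
        dest.parent.mkdir(parents=True, exist_ok=True)
        digest = hashlib.sha256()
        # Stream the response so large datasets are not held entirely in memory.
        with requests.get(url, stream=True, timeout=30) as resp:
            resp.raise_for_status()
            with dest.open("wb") as fh:
                for chunk in resp.iter_content(chunk_size=8192):
                    fh.write(chunk)
                    digest.update(chunk)
        return digest.hexdigest()

    if __name__ == "__main__":
        checksum = harvest(DATASET_URL, DEST)
        print(f"Saved {DEST} (sha256: {checksum})")

Recording a checksum alongside each copy is one common way archiving projects make downloads verifiable, though the specific checks DataRefuge applies are not detailed in the article.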
From Computerworld