Data Seal of Approval: 16 guidelines in 16 slides • Dr. Henk Harmsen
16 guidelines • 3 for data providers: quality of content • 10 for data repositories: quality of dissemination and preservation • 3 for data users: quality of usage Implemented by the data repository
one simple assessment process • Self-assessment • External review of the assessment • If necessary, a second review • After approval, the assessment with the review commentary is placed on the repository website • Five guidelines can be outsourced • The DSA is granted for a period of one year
Self-assessment (1) • Every guideline has a minimum number of required dots • The Board determines the number of required dots per guideline every year
Self-assessment (2) • Every guideline comes with a series of attention points • Follow the attention points and tell your story • Allocate your dots
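To make the dot mechanism on the two slides above concrete, here is a minimal sketch of how a repository's self-assessment could be represented and checked against the Board's yearly minimums. The data model, the dot scale and the example minimums (GuidelineAssessment, GUIDELINE_MINIMUM_DOTS, MAX_DOTS) are assumptions made for this sketch only; the DSA guidelines themselves do not prescribe an implementation.

```python
from dataclasses import dataclass

# Hypothetical data model for a DSA self-assessment; the guideline numbers are
# real (1..16) but the minimums and the dot scale below are example values only.
GUIDELINE_MINIMUM_DOTS = {4: 2, 9: 3}   # per-guideline minimums, set yearly by the Board
MAX_DOTS = 4                            # assumed upper end of the dot scale

@dataclass
class GuidelineAssessment:
    guideline: int      # guideline number, 1..16
    dots: int           # dots the repository allocates to itself
    narrative: str      # the "story" told along the attention points

def meets_minimum(item: GuidelineAssessment) -> bool:
    """Check one guideline against the Board's minimum for the current year."""
    required = GUIDELINE_MINIMUM_DOTS.get(item.guideline, 1)
    return 0 <= item.dots <= MAX_DOTS and item.dots >= required

def ready_for_review(items: list[GuidelineAssessment]) -> bool:
    """A self-assessment is ready for external review once all 16 guidelines
    are answered and each reaches its required number of dots."""
    covered = {i.guideline for i in items}
    return covered == set(range(1, 17)) and all(meets_minimum(i) for i in items)
```

The point is only that "allocate your dots" reduces to a per-guideline score that must clear a yearly threshold before the external review step.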
Example: 4. The data repository has an explicit mission in the area of digital archiving and promulgates it • This guideline relates to the level of authority which the repository has • Does the repository have a Mission Statement? Does it clearly reference a commissioning authority? • Does the repository have a document which outlines the way in which the mission statement is implemented? • Does the repository carry out promotional activities? • Does the repository have succession planning in place for its digital assets? If so, please describe the plan. • Or, if applicable, to which TDR (Trusted Digital Repository) has this been outsourced?
The mission of DANS is: "DANS is the national organization for storage and durable accessibility of research data in the social sciences and humanities." DANS propagates this mission, among other things, as follows:
• Activities oriented toward making research data accessible: archiving research data and making them accessible by means of the online archiving system EASY; making agreements with organizations that finance, commission or carry out research, with the purpose of making data available to others; targeted acquisition of research data; participating in research into the need for data archiving in certain disciplines and thus developing contacts among representatives of those disciplines; granting subsidies to small-scale data-archiving projects (KDP); issuing a quarterly, e-Data&Research, in conjunction with other institutes, with a circulation of 5,000; developing an activity-based cost model; selecting academic heritage and making it available by means of the ADA approach.
• Activities oriented toward keeping research data usable: development of preservation strategies; converting research data into other formats.
• Activities oriented toward international standardization and cooperation: participating in the development of data infrastructures, for example CESSDA PPP, DARIAH, the DANS Text&Taal investigation and other explorations; developing and establishing a data seal of approval; developing and operating a Persistent Identifier infrastructure; contributing to the DDI Tools Foundation.
• Activities oriented toward reuse of research data: giving scientific credits to researchers who make their data available to others by means of registration in Metis; putting persons or organizations that encourage data sharing in the spotlight by means of a data prize; coupling of publications, data and research information; creation of a demonstrator for enriched publications; linking of data sets to articles in journals (JALC); linking of components of data sets to articles in journals (DataPlus); organization of symposia around certain data sets.
the assessment process • Assessments are filled in and stored in a single database • For the self-assessment, an easy web-based tool is being developed • A tool is also being developed for granting the DSA and making the logo visible on the repository website • It will be possible to update only specific guidelines (versioning)
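Since the slide above describes the storage side only in outline, here is a minimal sketch of what "one database, with per-guideline versioning" could look like. The table layout, column names and the choice of SQLite are assumptions made for illustration; this is not the actual tooling, which the slide notes was still under development.

```python
import sqlite3

# Illustrative schema only: the real DSA tools were still being developed, so
# this layout (table and column names, use of SQLite) is an assumption.
SCHEMA = """
CREATE TABLE IF NOT EXISTS assessment (
    repository     TEXT    NOT NULL,   -- name of the data repository
    guideline      INTEGER NOT NULL,   -- guideline number, 1..16
    version        INTEGER NOT NULL,   -- bumped when only this guideline is updated
    dots           INTEGER NOT NULL,   -- dots allocated in the self-assessment
    narrative      TEXT,               -- self-assessment text along the attention points
    review_comment TEXT,               -- external reviewer's commentary
    PRIMARY KEY (repository, guideline, version)
);
"""

def update_guideline(db: sqlite3.Connection, repo: str, guideline: int,
                     dots: int, narrative: str) -> None:
    """Store a new version of a single guideline without touching the other fifteen."""
    (latest,) = db.execute(
        "SELECT COALESCE(MAX(version), 0) FROM assessment "
        "WHERE repository = ? AND guideline = ?", (repo, guideline)).fetchone()
    db.execute(
        "INSERT INTO assessment (repository, guideline, version, dots, narrative) "
        "VALUES (?, ?, ?, ?, ?)", (repo, guideline, latest + 1, dots, narrative))
    db.commit()

db = sqlite3.connect(":memory:")
db.executescript(SCHEMA)
update_guideline(db, "EASY", 4, 3, "Mission statement published on the repository website.")
```

A composite key of repository, guideline and version is one simple way to let a repository revise a single guideline in a later year while keeping the earlier, reviewed versions on record.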
The current DSA-BOARD • Different countries: • NL, UK, F, D, USA • Different scientific fields • Linguistics, Social sciences, Life sciences • Different functions: • Archives, research centers, ICT centers
External references • Criteria for CLARIN centers: Centers need to have a proper and clearly specified repository system and participate in a quality self assessment procedure as proposed by the Data Seal of Approval approach. • DARIAH VCC’s: Repositories should have the DSA • CESSDA call together with statistics: Repositories should have the DSA • KNAW institutes should have DSA
Unique selling points • DSA can be applied to any archive • DSA not only pays attention to the archiving institution, but also to the data producer and the data consumer (shared responsibility) • DSA is not in conflict with, for example, RAC (repository audit certification) or DIN, but is rather a step toward them • DSA does not choose standardization but opts for 'trust' (like the custom of peer review in the scientific world) • DSA also focuses on smaller organizations • DSA is relatively light and therefore easy to implement • Openness, dynamics and speed are possible in the actual DSA implementation • DSA is formulated as points of attention, not as solutions • DSA offers possibilities for subcontracting archiving while still meeting the requirements of the DSA
TDR framework • Gold: full audit based on ISO 16363 or DIN 31644 • Silver: DSA + public self-audit based on ISO 16363 or DIN 31644 • Bronze: DSA
Dynamics • Dynamic process: the required dots are pinpointed yearly • A new DSA assessment every year, which could focus on updating one or more guidelines • Self-assessment takes no more than one day • Review takes no more than half a day • The guidelines are still 'liquid' and can be changed continuously, which however would not affect the process
Keywords • Raising awareness • Built on trust • Easy implementation
Road map • September 2010 • Board fully assessed • Digital assessment tool ready • DSA granting tool ready • June 2011 • 50 new assessments ready • 2012 • Community takes over DSA