
Measuring Program Results

 

Organization Name: Indiana University

Date: March 17, 2006

 

Activity 4

Framing the Evaluation

 

What is the program’s name?

Variations3: An Integrated Digital Library and Learning System for the Music Community

What partners are involved?

Test Sites: New England Conservatory, The Ohio State University, The Tri-College Consortium (Bryn Mawr, Haverford, Swarthmore); Content Partner: New World Records

Who are the program’s stakeholders? What do they want to know?

IMLS
  • Have we achieved the outcomes?
  • To what extent have we spent the money?
  • How many institutions adopt Variations3?

Indiana University
  • Can we sustain the system?
  • Will other institutions contribute to this effort (system and metadata model)?
  • Can we still get new functionality?

Librarians/Libraries (test sites)
  • How much work and money will it take to implement?
  • How does this compare to other solutions?
  • Will this work for our institution?
  • How does this help us advance?
  • What happens after the grant ends?
  • Will IU deliver what is promised in the grant?

Faculty
  • Will this save me time?
  • Will it help my students?
  • Will it improve my teaching?
  • Is it worth the effort?
  • Will it improve my research/performance?
  • Can I have the features I need?

Students
  • Will this save me time?
  • Can I put it on my iPod?
  • Can I have access after I graduate?
  • Can I get new features?

Content providers/publishers/record companies
  • What are you doing with my intellectual property?
  • What is the impact on my revenue?
  • How does this affect us?
  • Can we adopt these tools in our own services (tech transfer)?


Program Purpose Statement

We do what?
Provide a sustainable digital music library system that supports discovery, delivery, and use of music in various formats.

For whom?
  • Academic music libraries
  • Cataloging strategists

For what outcome/benefit?
  • Libraries will successfully adopt Variations3 to provide access to collections for research, teaching, and study.
  • Libraries will contribute to the sustainability of Variations3 (ongoing development and support).
  • Cataloging strategists will improve their understanding of the costs and benefits of metadata models similar to Variations3.

 

 


 

 

Inputs
  • Project staff
  • Advisory Board
  • IMLS grant and guidance
  • IU funding match
  • Consultants (metadata)
  • Variations2 software
  • Test sites
  • IU Digital Library Program

Outputs
  • Number of adopters
  • Number of metadata records created
  • Number of scores and recordings served
  • Number of scores and recordings added
  • Number of sessions
  • Conference presentations
  • Articles

Activities
  • Gather requirements (test sites)
  • Recruit Advisory Board members
  • Hire staff
  • Conduct usability tests
  • Field studies of use of software
  • Analyze usage logs
  • Improve metadata model
  • Develop and implement cooperative cataloging model
  • Experiment with other metadata streamlining strategies
  • IMLS reporting
  • Grant administration
  • PR
  • Dissemination of results
  • Develop sustainability model

Services
  • Deliver software
  • Release metadata model specification
  • Support partners


Outcomes

Outcome 1: Libraries that are test sites will successfully adopt Variations3 to provide access to collections for research, teaching, and study.
  • Indicator: The # and % of institutions participating as test sites that commit to using the system past the period funded by this grant.
  • Data Source: Structured interviews
  • Applied to: Test sites
  • Data Interval: End of grant
  • Target: 3
 

Outcome 2: Libraries that were not test sites will successfully adopt Variations3 to provide access to collections for research, teaching, and study.
  • Indicator (during grant period): The # and % of non-test-site institutions that express interest in learning more about Variations3.
  • Indicator (after grant): The # and % of non-test-site institutions that adopt Variations3.
  • Data Source: Contacts (during grant); survey or structured interview (after grant)
  • Applied to: Music libraries worldwide (excluding test sites)
  • Data Interval: Continually (during grant); annually (after grant)
  • Target: 5-10 annually (during grant); 3-5 annually (after grant)

Outcome 3: Libraries will contribute to the sustainability of Variations3 (ongoing development and support).
  • Indicator: The # and % of institutions deciding to contribute to the sustainability of Variations3.
  • Data Source: Depends on the model selected.
  • Applied to: ?
  • Data Interval: ?
  • Target: 100% of adopters


Outcome 4: Cataloging strategists will improve their understanding of the costs and benefits of metadata models similar to Variations3.
  • Indicator: The # and % of cataloging strategists who are aware of the Variations3 metadata model and report that it was useful in making decisions about metadata models.
  • Data Source: Survey
  • Applied to: Cataloging strategists
  • Data Interval: Yearly
  • Target: ?

 

 

 

 

 

 

 

Outcome 5: Libraries will contribute metadata records to the shared Variations3 database.
  • Indicator: The # and % of Variations3 implementers that contribute n records to the shared Variations3 database.
  • Data Source: System logs and statistics
  • Applied to: Variations3 implementers
  • Data Interval: Yearly
  • Target: 50%
 


Outcome 6:
  • Indicator:
  • Data Source:
  • Applied to:
  • Data Interval:
  • Target: