
Measuring Program Results

 

Organization Name: Indiana University

Date: March 17, 2006

 

Activity 4

Framing the Evaluation

 

What is the program’s name?

Variations3: An Integrated Digital Library and Learning System for the Music Community

What partners are involved?

Test Sites: New England Conservatory, The Ohio State University, The Tri-College Consortium (Bryn Mawr, Haverford, Swarthmore); Content Partner: New World Records

Who are the program’s stakeholders, and what do they want to know?

IMLS
  • Have we achieved the intended outcomes?
  • To what extent has the grant money been spent?
  • How many institutions adopt Variations3?

Indiana University
  • Can we sustain the system?
  • Will other institutions contribute to this effort (system and metadata model)?
  • Can we continue to add new functionality?

Librarians/Libraries (test sites)
  • How much work and money will it take to implement?
  • How does this compare to other solutions?
  • Will this work for our institution?
  • How does this help us advance?
  • What happens after the grant ends?
  • Will IU deliver what is promised in the grant?

Faculty
  • Will this save me time?
  • Will it help my students?
  • Will it improve my teaching?
  • Is it worth the effort?
  • Will it improve my research/performance?
  • Can I have the features I need?

Students
  • Will this save me time?
  • Can I put it on my iPod?
  • Can I have access after I graduate?
  • Can I get new features?

Content providers/publishers/record companies
  • What are you doing with my intellectual property?
  • What is the impact on my revenue?
  • How does this affect us?
  • Can we adopt these tools in our own services (tech transfer)?

 


Program Purpose Statement

We do what?

Provide a sustainable digital music library system that supports discovery, delivery, and use of music in various formats.

For whom?

Academic music libraries

Cataloging strategists

For what outcome/benefit?

Libraries will successfully adopt Variations3 to provide access to collections for research, teaching, and study.

Libraries will contribute to the sustainability of Variations3 (ongoing development and support).

Cataloging strategists will improve their understanding of the costs and benefits of metadata models similar to Variations3.

 


 

 

Inputs
  • Project staff
  • Advisory Board
  • IMLS grant and guidance
  • IU funding match
  • Consultants (metadata)
  • Variations2 software
  • Test sites
  • IU Digital Library Program

Outputs
  • 1 report
  • Variations3 system
  • # of website hits

Activities
  • Gather requirements (test sites)
  • Recruit Advisory Board members
  • Hire staff
  • Conduct usability tests
  • Conduct field studies of software use
  • Analyze usage logs

Services
  • (none listed)


Outcomes

Outcome 1: Libraries that are test sites (the Applied To group) will successfully adopt Variations3 to provide access to collections for research, teaching, and study.

Reviewer comment: Why not keep this outcome combined with the second? Use the Applied To field as the means of separating non-test sites from test sites.

Indicator: The # and % of institutions participating as test sites that commit to using the system beyond the period funded by this grant.
Data Source: Structured interviews
Applied To: Test sites
Data Interval: End of grant
Target: 3

 

Outcome 2: Libraries that were not test sites (the Applied To group) will successfully adopt Variations3 to provide access to collections for research, teaching, and study.

Indicators:
  During grant period: The # and % of non-test-site institutions that express interest in learning more about Variations3.
  Reviewer comment: Strengthen this indicator to something like: the #/% of libraries that respond via email, telephone, etc. within 3 months of the beginning of marketing (a computation sketch follows this outcome).
  After grant: The # and % of non-test-site institutions that adopt Variations3 within 1 year of the grant.

Data Source: During grant period, contacts; after grant, survey or structured interview
Applied To: Music libraries worldwide (excluding test sites)
Data Interval: During grant, continually; after grant, annually
Target: During grant, 5-10 annually; after grant, 3-5 annually
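The reviewer’s suggested during-grant indicator is straightforward to compute once contacts are logged with dates. Below is a minimal sketch of that calculation, assuming a simple contact log; the field names and sample entries are hypothetical and not drawn from the grant documents.

    from datetime import date, timedelta

    # Hypothetical contact log: one entry per non-test-site library reached
    # during marketing; "responded" is None when no response was received.
    contacts = [
        {"institution": "Library A", "contacted": date(2006, 4, 1), "responded": date(2006, 5, 15)},
        {"institution": "Library B", "contacted": date(2006, 4, 1), "responded": None},
        {"institution": "Library C", "contacted": date(2006, 5, 1), "responded": date(2006, 9, 1)},
    ]

    WINDOW = timedelta(days=90)  # "within 3 months" approximated as 90 days

    responders = [
        c for c in contacts
        if c["responded"] is not None and c["responded"] - c["contacted"] <= WINDOW
    ]
    percent = 100 * len(responders) / len(contacts)
    print(f"{len(responders)} of {len(contacts)} libraries responded within 3 months ({percent:.0f}%)")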

 

Outcome 3: Libraries will contribute to the sustainability of Variations3 (ongoing development and support).

Indicator: The # and % of institutions deciding to contribute to the sustainability of Variations3.
  Reviewer comment: Consider the #/% of institutions that utilize/incorporate the Variations3 model within 1 year of the project’s beginning. How are you measuring the contribution? Is it through an institution providing financial support or collaborative physical support?
  Response: Depends on the model selected.

Data Source: ?
Applied To: Test sites and non-test sites
Data Interval: Once per year
Target: 100% of adopters


Outcome 4: Cataloging strategists will improve their understanding of the costs and benefits of metadata models similar to Variations3.
  Revised wording: Cataloging strategists know about Variations3 (or similar models) and find it useful.

Indicators: The # and % of cataloging strategists aware of the Variations3 metadata model, and the #/% of strategists that report it was useful in making decisions about metadata models.
  Reviewer comment: Strengthen the second indicator to the #/% of strategists that report the model is useful to very useful on a 5-point Likert scale (see the sketch below). What type of decisions would be made based on this model?

Data Source: Survey
Applied To: Cataloging strategists
Data Interval: Yearly
Target: ?
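If the Likert-scale version of the indicator is adopted, the reported figure is simply the share of respondents choosing 4 ("useful") or 5 ("very useful"). A minimal sketch, using illustrative ratings rather than real survey data:

    # One rating per surveyed cataloging strategist on a 5-point Likert scale
    # (1 = not at all useful ... 5 = very useful). Values are illustrative only.
    ratings = [5, 4, 3, 4, 2, 5, 4]

    useful = sum(1 for r in ratings if r >= 4)  # counts "useful" and "very useful"
    percent = 100 * useful / len(ratings)
    print(f"{useful} of {len(ratings)} strategists rated the model useful or very useful ({percent:.0f}%)")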

 

 

 

 

 

 

 

Outcome 5: Libraries will contribute metadata records to the shared Variations3 database.

Indicator: The # and % of Variations3 implementers that contribute n records to the shared Variations3 database. (Determine a suitable number of records, n; a tally sketch follows.)
Data Source: System logs and statistics
Applied To: Variations3 implementers
Data Interval: Yearly
Target: 50%
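Because this indicator comes from system logs and statistics, the tally can be automated. A minimal sketch follows, assuming one log entry per contributed record; the roster, log format, and threshold are assumptions, since the worksheet leaves a suitable n to be determined.

    from collections import Counter

    N_RECORDS = 2  # toy threshold for illustration; the worksheet leaves the real n open

    # Hypothetical roster of Variations3 implementers and a hypothetical log
    # with one (institution, record_id) pair per contributed record.
    implementers = ["IU", "OSU", "NEC"]
    log = [("IU", "rec-001"), ("OSU", "rec-002"), ("IU", "rec-003"), ("NEC", "rec-004")]

    records_per_site = Counter(institution for institution, _ in log)
    meeting_target = [s for s in implementers if records_per_site[s] >= N_RECORDS]
    percent = 100 * len(meeting_target) / len(implementers)
    print(f"{len(meeting_target)} of {len(implementers)} implementers contributed >= {N_RECORDS} records ({percent:.0f}%)")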