Posts Tagged ‘ assessment ’

Webscale Collection Analysis and Development (Intota) #erl13

Marist College is one of the development partners for Intota. Kathryn (Katie) Silberger gave an overview of assessment efforts at Marist and how webscale (360COUNTER, Google, Summon) has helped them.

Using 360COUNTER provides multi-year comparisons and centralized gathering and storage, while still offering robust reports in Excel format. With cost data in the system, renewal analysis and decisions take about 10 minutes. You even get stats for products that don’t provide reports: for example, click-throughs for open access titles (useful for raising faculty awareness), referrer reports showing where people are starting, and a report of widget usage in LibGuides (which LibGuides itself can’t provide). Other DIY assessment approaches they have tried include using Google Forms for reference question analysis, tied directly to collection decisions, and reviewing discovery logs to post the top ten or more questions to internal staff.
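As a rough illustration of what that ten-minute renewal review can look like once usage and cost live in one place, here is a minimal sketch in Python (not Marist’s or Serials Solutions’ actual tooling; the file and column names are hypothetical) that joins exported usage totals with invoice costs to compute cost per use:

```python
import csv
from collections import defaultdict

# Hypothetical exports: annual usage totals per database and invoice costs.
# Column names are illustrative, not an actual 360COUNTER report layout.
def load_totals(path, key_col, value_col):
    totals = defaultdict(float)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row[key_col]] += float(row[value_col])
    return totals

usage = load_totals("usage_2012.csv", "database", "full_text_requests")
cost = load_totals("invoices_2012.csv", "database", "amount_paid")

# Cost per use: the quick renewal-review number described above.
for db in sorted(cost, key=lambda d: cost[d] / max(usage.get(d, 0), 1), reverse=True):
    uses = usage.get(db, 0)
    cpu = cost[db] / uses if uses else float("inf")
    print(f"{db}: ${cost[db]:,.2f} / {uses:.0f} uses = ${cpu:,.2f} per use")
```

The point is not the code but the shape of the question: once usage and cost sit together, “which databases cost the most per use?” becomes a quick query rather than a spreadsheet project.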

Like many libraries, their assessment environment means dealing with data in multiple systems and a proliferation of spreadsheets. Besides consolidating these into one system, another reason for going webscale is that e-stats and p-stats are different: current p(hysical)-stats, like circulation statistics reports, don’t account for highly circulating items such as laptops and study rooms. Also, when assessment takes too much time and effort, you often can’t ask for things “out of curiosity”. Webscale means less time manipulating data and more time for analysis.

Mark Tullos (ProQuest) discussed how to bring all of this together in one place with Intota Assessment. Intota Assessment has been rolled out this year in advance of the full Intota webscale system. The claim is that Intota offers “a total picture of holdings, usage and overlap across all formats.”

This Spring they are beta testing with current partners (and possibly adding other partners). After this process they will be recommending best practices.

Question — How do you deal with the fact that ingesting data into 360COUNTER or homegrown solutions is problematic and requires a lot of normalization?
Answer — 360’s Data Retrieval Service (DRS) has helped by using authority control, much like what the questioner described doing manually. The problem with normalizing COUNTER data is often the report header, so DRS replaces it. DRS doesn’t require SUSHI compliance.

Question — What about non-COUNTER data?
Answer — For resources with no stats at all, use the click-throughs; for non-COUNTER stats, normalize them to make them COUNTER-ish and load them.
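To make “normalize them to make them COUNTER-ish” a little more concrete, here is a minimal sketch assuming a hypothetical vendor CSV with its own home-grown column names; it simply maps those columns onto COUNTER-style fields so the file can be loaded alongside real COUNTER reports. This illustrates the idea only and is not the DRS or 360COUNTER loader:

```python
import csv

# Hypothetical mapping from one vendor's home-grown column names to
# COUNTER-ish fields; every name here is illustrative.
COLUMN_MAP = {
    "Journal": "Title",
    "PubID": "Print_ISSN",
    "Downloads": "Reporting_Period_Total",
}

def make_counterish(in_path, out_path, platform="ExampleVendor"):
    with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(
            dst, fieldnames=["Title", "Platform", "Print_ISSN", "Reporting_Period_Total"]
        )
        writer.writeheader()
        for row in reader:
            out = {new: row.get(old, "") for old, new in COLUMN_MAP.items()}
            out["Platform"] = platform  # COUNTER-style reports carry a platform column
            writer.writerow(out)

make_counterish("vendor_stats.csv", "vendor_stats_counterish.csv")
```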

Question — How are you loading cost?
Answer — By hand. The form has not been easy enough to hand off to clerical staff, given that invoicing is so varied across products. Once the cost data is in, though, the renewal analysis described above is fast.

Question — If Banner could interact with Serials Solutions, it seems this process would be easier. Are you planning for this?
Answer — Serials Solutions is following Banner and other payment systems, anticipating something to assist when Intota webscale rolls out.

Troubleshooting and Tracking #erl13

Nathan Hosburgh (Montana State University) and Katie Gohn (University of Tennessee) spoke to a packed crowd about troubleshooting and tracking e-resource access problems by reviewing the various approaches, tools, and information resources used.

Outlining approaches to troubleshooting through the lens of “psychology and philosophy” seemed to speak more to the fundamental skills and talents effective troubleshooters have: remaining calm, being high tech with a human touch, thinking logically and analytically, keeping a can-do attitude, and not assuming operator error.

Knowing your users is foremost, and this includes both internal and external users. Your internal users (ILL, reference, collection development, systems) provide valuable feedback from varied points of access and patterns of use. Knowing specifics about your external users, who will have different enrollment statuses, needs, and devices, will inform the approach for solving problems.

How problems are reported and solved varies widely: email, a link to a problem report form, an internal error log, a ticketing system, and AskaLibrarian. The lengths people go to in solving problems range from simple to complex guides for users to more detailed internal documentation.

The question is, how are you evaluating the effectiveness of these methods?

Katie Gohn shared observations of the widely varying sources for reporting e-resource troubles — anywhere from water cooler talks to direct emails. But her portion of the presentation focused primarily on the e-resource tracking system Footprints. Her library had this system set up as an instance of the wider University IT’s version.

Their web-based “Report IT” form populates the system from a user-selected category assignment of the problem and a general comment box. On the back end, the form also gathers computer type, IP address, and referring URL.
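As a sketch of how a form like this can gather those back-end details automatically, here is a minimal example using Flask (my assumption; UT’s actual form feeds Footprints, and the route and field names below are hypothetical):

```python
from flask import Flask, request

app = Flask(__name__)

@app.route("/report-it", methods=["POST"])
def report_it():
    # User-supplied fields from the form (category dropdown + comment box).
    ticket = {
        "category": request.form.get("category", "other"),
        "comment": request.form.get("comment", ""),
        # Gathered automatically on the back end, as described above.
        "ip": request.remote_addr,
        "user_agent": request.headers.get("User-Agent", ""),
        "referrer": request.form.get("referrer") or request.referrer or "",
    }
    # A real system would push this into the ticketing system; here we just log it.
    app.logger.info("e-resource problem report: %r", ticket)
    return "Thanks! Your report has been logged.", 201

if __name__ == "__main__":
    app.run(debug=True)
```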

What are the key features that a tracking system provides that email or other existing methods don’t?
1) the ability to see status and who’s responsible
2) the ability to communicate centrally in a system that is easily searchable
3) the ability to categorize problems, which allows you to assess needs from vendors or identify internal training needs
4) numbers to demonstrate staffing needs in this area

Six people are assigned to these troubleshooting teams for a 12,000-FTE organization, and they are hoping to justify hiring one more. Basic troubleshooting training is important, and this tool will help shape it.

Listening to [Google Generation] Users #erl13

The ER&L 2013 conference began with a great intro by ER&L founder Bonnie Tijerina, who provided her personal theme for this year: bridging our community with other communities and cross-pollinating ideas. It seemed an apt theme for introducing the keynote speaker, as we aim to continually bridge the ER community with our user community.

Michael Eisenberg (University of Washington Information School) provided an overview of the Google Generation (1993-2013) and their information-seeking habits, informed by the results of Project Information Literacy. Reminding us of the “stack of needles” information environment in which we and our users find ourselves, Eisenberg offered insights and possible responses to the information needs of this generation. He also offered some interesting projections for the ??? (to be named) Generation of 2013+, like GoogleMS (a Google-Microsoft merger) and brain-controlled environments (Google Glass, as a start).

Project Information Literacy (PIL) asks how people find information, what they do with it, and what problems they encounter along the way. Eisenberg presented the findings of the study and provided an excellent worksheet with one column outlining all the results and another (blank) column for the takeaway lessons for libraries.

The results may not surprise librarians or teaching faculty. These users expect perfection, yet believe the best approach is Google. Their course research sources are limited, but do include course readings, databases, instructors, and, yes, Wikipedia (ignoring faculty recommendations to avoid it and simply not citing it). Their personal research, however, was quite different: here users start with Google and Wikipedia first. Don’t you? Takeaway: Librarians should consider Wikipedia another social media venue for being where their users are. Begin reviewing, updating, and writing content there, where users can find it!

Other results of the study emphasize that there are legitimate reasons for the all-too-familiar user procrastination, including multiple jobs, studying, and extracurricular activities. Takeaway: Have we changed our thinking, staffing, and services to accommodate this, or are we just judging? Their needs change across the academic year, and opening later hours at crunch time is insufficient to address that.

The study also shows these students are in fact applying evaluative criteria to online resources and are asking for help, but they are still not asking librarians. These users consider librarians assistants with resources. Given the stack of needles, they don’t need help with resources. So what do they need?

What the Google Generation needs goes back to LIS research from a generation ago: formulating research questions, understanding the research process, and the ability to assess themselves throughout it! Carol Kuhlthau, anyone? While K-12 prepares students in writing techniques, it is lacking in the steps of the research process. Helping users understand the development of research ideas and manage the process/project of research is a critical need, and Eisenberg challenged the audience of mostly e-resources librarians to think of solutions beyond one-off instruction.

Another interesting portion of the PIL study results was the handout (faculty syllabi) assessment. [Six out of ten – not sure of this stat, but most!] handouts refer students to print resources and almost none to relevant databases. Takeaway: Librarians should offer faculty help with updating these. At my institution, I attended a new faculty luncheon in which “front loading” course design/content was recommended again and again, especially for new faculty, so that you “do it right the first time” and recycle/tweak the content in later years. This and these PIL results continue to make me wonder how the library can help the front-loading process, both from a workload-distribution perspective and a “getting it right” perspective. This is a big opportunity area for the library to play the role it talks about playing in outreach, course design, copyright, information literacy, etc.

Another surprising finding was that the library desktop/laptop was seen as a valuable tool because it helped avoid distractions in ways users’ own technology might not. This reinforces a more tried-and-true notion of the nature of focus and an emphasis on the library as place (they liken it to a monastery). Users used terms like “IT fasting” and noted that it requires planning ahead (with parents, with peer expectations, etc.). Takeaway: All of this would make for good marketing and outreach ideas.

What was great about this keynote, besides the useful data shared, was Eisenberg’s approach of putting it back on librarians. This included actual audience participation in completing the sections of the PIL results/Library Lessons handout. My group had the result: “Defining the task and assessing the process are harder than finding the resources.” One tool my library uses to help with this is an assignment planner tool (which could use a new name, maybe). Another interesting suggestion was to use information literacy language that makes sense to users, like saying “credit” instead of “cite,” or calling it an article search engine rather than a database.

All of this also supports the oft-repeated concept that we are transitioning the librarian/library from content to service, a shift also highlighted in the closing session, The Courage of our Connections: Thoughts on Professional Identities, Organizational Affiliations and Common Communities by Rachel Frick (well-played, ER&L, well-played). So, have we told our users about all this? Have we trained our librarians? Have we adjusted our library spaces? Takeaway: What are we going to do about it? I would encourage you to find out more about the Project Information Literacy research and share with your communities what you’re doing about it!