Posts Tagged ‘ workflow ’

Intota Implementation: User Experiences #PQsummit

Presenter: Sandra Morden, Head, Discovery and Technology Services, Queen's University, Ontario, Canada

Presenter: Michael Vandenburg, Associate University Librarian, Queen's University

Presenter: Ashley Zmau, E-Resources Librarian, University of Texas at Arlington Libraries

Michael Vandenburg and Sandra Morden from Queen's University started off this two-part session. Their presentation, "There is no Manual: Challenging traditional workflow processes and developing problem-solving skills," described their migration path to Intota, which occurred during a period of significant change at their institution. A primary structural result of that change was increased support for e-resources, achieved by combining the e-resources and collection development teams. Other workflow tactics used along with this restructuring included consolidating subscription vendors and reducing shadow systems. Intota went live for them in August 2014 without an RFP, more as an upgrade of their existing Serials Solutions system.

Morden picked up the presentation at this point, describing the Intota interface implementation. The Intota interface is different, but the underlying features are the same. [And in this reporter's humble opinion, that is precisely the problem!] Morden had hoped the implementation would provide a more directed process; instead, they found they were already using the product mostly as expected and made only a few minor adjustments. She suggested that role-based task lists would be more helpful for connecting staff to the system and to electronic resource workflows. Lacking this, they needed a different approach to the staff training program, one focused on understanding the larger concepts and on building problem-solving skills. They built on previous training in the use of their existing ticketing system, and also built personnel skills in independent problem solving and accountability, with a greater understanding of referral. The training involved a series of workshops with practical exercises. Overall they were left with questions like: Is this all there is? Are there not better ways? They are looking forward to collaborative features in development that will help with consortial partnerships. Morden expressed optimism for what could be learned from the next presenter, Ashley Zmau, E-Resources Librarian, University of Texas at Arlington (UTA) Libraries.

Zmau presented "An Overview of How Intota Can Meet the Needs of Your Library Now," which started with an immediate recommendation: begin using a team email account with subscription agents, vendors, and publishers in order to better manage communication and workflows. This recommendation, like the structure of all her recommendations, followed what Zmau called operating by the "hit by a bus" model.

Highlighting Renewal Details customization, she showed how UTA uses only a handful of the many fields available, and takes advantage of certain fields to code key information (like the Renewal Note field for the PO#). The Renewal Checklist is a favorite feature, which UTA uses in three ways: for Renewal, License, and New Order. These checklists allow her to delegate renewal tasks to other staff through the use of checks, with the text fields associated with those checks holding staff initials and dates. This helps in troubleshooting problems, since multiple people may be updating the checklist at any given time. [But this reporter has oft complained that this feature still does not include an update trigger that would email the next person to act.]
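The kind of hand-off trigger this reporter keeps asking for is simple to sketch. Below is a minimal illustration of the idea only — this is not Intota functionality, and every class, field, and email address in it is invented:

```python
# Hypothetical sketch: when a checklist step is marked done, notify whoever
# owns the next open step. Neither this data model nor any email hook exists
# in Intota; all names here are made up for illustration.

from dataclasses import dataclass

@dataclass
class Step:
    name: str
    assignee: str        # email of the staff member responsible
    done: bool = False

def complete_step(checklist: list[Step], name: str, notify) -> None:
    """Mark a step done and alert the owner of the next incomplete step."""
    for i, step in enumerate(checklist):
        if step.name == name:
            step.done = True
            # hand-off: find the next open step and notify its owner
            for nxt in checklist[i + 1:]:
                if not nxt.done:
                    notify(nxt.assignee, f"'{name}' is done; '{nxt.name}' is next.")
                    return
            return

renewal = [Step("Confirm price quote", "acq@lib.example.edu"),
           Step("Issue PO", "orders@lib.example.edu"),
           Step("Verify access", "eres@lib.example.edu")]

complete_step(renewal, "Confirm price quote",
              notify=lambda to, msg: print(to, "->", msg))
# prints: acq's step is done, so orders@lib.example.edu gets the hand-off
```

The point of the sketch is only that the checklist itself already holds everything needed to route the notification — the missing piece is the trigger.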

The use of Collections was another valuable feature. UTA uses it for resources for which they partner with their business school. This was important for tracking the various people involved in the renewal, as well as the fact that this particular renewal must be renegotiated annually.

Zmau offered key tips for developing documentation, including these three necessary components: 1) what, 2) why, and 3) where to look. Having these components in all documentation helps develop higher-level thinking and independent decision making among staff. Zmau also recommended including staff in the development of menus, menu definitions, and other documentation.

The My Intota feature is another favorite, especially the My Databases page. This page includes at-a-glance renewal dates and quick-access icons that save time by requiring fewer clicks through buried screens. They allow title lists, renewal checklists, contacts, and license data associated with a database to be readily available. Reporting features allow her to see the larger overview of where workflows stand at any given point. Management Reports show a list of databases with key information.

Zmau noted a number of alerts that can be assigned to individuals and prepopulated with notification email text. [But this reporter has oft complained that this feature doesn't allow you to select from a preexisting contact list, or to assign to a different contact based on, say, the resource (not the alert), all of which is necessary for these team-based workflows!]

Questions from the audience: 

Q:  What kinds of staff resistance was encountered and how did you address?

A: Less resistance than expected, since staff had already been using Client Center — joked that the biggest difference between the two is that one is blue and one is green. The beta partnership also helped in being able to say what is coming down the development track.

Q: What would be the ideal “manual” if there was one?

A: Not step by step; more along the lines of best practices. Remove dependence upon the step-by-step and screenshot approach in order to get to more self-directed learning expectations [and to keep up with the rate of change!]

A: The ideal system too would enable staff to see what the next logical step is. 

A: Important for any manual to capture local and historic decision context for each resource.

Q: Say more about the Drupal database use and its connection to the ERM?

A: A "database of databases" is used, but it is connected to the knowledge base manually, because its information is more public-facing than administrative, and the list predated the licensing of Client Center/Intota.

A:  First step at UTA was to match these lists against each other. 



Summary: Trends, Ideas, Looking Ahead #erl14


ER&L 2014 did not disappoint. The three great keynote speakers offered a good frame for describing the breadth of topics the conference typically offers. Opening keynote speaker Barbara Fister reminded us that where the issue of the 90s was ownership vs. access, today the issue is toll access vs. open access. Fister approached her topic by challenging the passive language that predominates in library missions and our somewhat hypocritical promotion of "lifelong learning" when it comes to providing access.
Fister encouraged us to find more activist methods that connect us and our patrons to open access and scholarly publishing issues, including devoting portions of budget and staff time to OA projects. (Check.) Expanding our neighborhood. (Check.) And beyond that, finding and offering solutions to problems: "Do more than negotiate favorable terms; provide alternatives to the market-driven economy that is eroding our mission." Sarah Durant shared her research and consulting practice in resilience, addressing the negative biological effects of constant disruptive change and the potential solutions that personal practices of resilience can offer. Soundbites included: "Embrace vulnerability, failure, and resilience through connection. Pay attention to 'being' in addition to 'doing' in our work" (Durant, Red Sage Consulting). I may begin exploring possibilities for bringing her in for future organizational development programming in my library.
Finally, Brent Hecht shared some brilliant applications of data mined from open information sources, primarily Wikipedia. With this data he showed how English-language bias could be found in Wikipedia, and how that led to better shared-knowledge applications using alternative data visualization models. You might check out some of the other wiki applications in the Resources at the end, as well as a great summary of this closing keynote by the eclectic librarian, Anna Creech.
The concepts the keynote speakers offered echoed across multiple presentations I attended, revealing several trends in each of these areas and leading to some key ideas for action, areas to keep in mind looking ahead, and useful resources to refer back to.


Pulling these ideas into areas specific to e-resources, one constant refrain was how to maintain agility and resilience when e-resources continue to consume an increasing portion of the budget but only a small portion of organizational staffing resources. While there is a justified need to increase staffing, or to redirect staffing toward e-resources, it remains perhaps most problematic that the majority of library workflows remain predominantly centered on print — not just technical services workflows, but also content development and access services. (ALL SESSIONS, but #erl14humanterms specifically addressing collection development, #nexuserm specifically Access Services)
    How organizations understand and begin to address this revealed an interesting interplay, a debate maybe, between e-resources="someone(s)" vs. e-resources="everyone". There were many different approaches to workflow and reorganization based on how you conceive of e-resources management in these two ways. Those who divide by format, aka the e-resources=someone(s) camp, see it as a way to address the problem that the continuously changing nature of e-resources requires staff to devote focused time to e, not divided time between p and e (MIT). Alternatively, the everyone-does-e-resources model argues that the work can't possibly be focused or siloed in this way and requires ongoing communication, coordination, check-ins, training, and evaluation. The question I was left with was: "Which one best supports your organizational or staffing strengths?" (ALL SESSIONS, #erl14humanterms specifically "e should be our core", #nexuserm)
    Both TERMS and the NASIG Core Competencies for Electronic Resources Librarians popped up in various contexts, including addressing organizational analyses of e-resources workflow interdependencies (#nexuserm, TERMS workshop). Both were also mentioned as ways to advocate for staffing and to frame team development and training (#erl14humanterms). This led me to the idea of using TERMS as a workflow checklist, or as a documentation tool in my department. But perhaps more broadly, and following the "e-resources everyone" model, why not make a survey where people can identify whether they feel certain activities/workflows (TERMS) and competencies/skills (NASIG Core Competencies) fall within their responsibility?
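That survey idea can be made concrete with a small sketch: staff mark which workflow stages they feel fall within their responsibility, and we look for gaps and overlaps. The stage names below follow the six TERMS stages as I understand them; the staff and their responses are invented examples:

```python
# Sketch of a responsibility survey following the "e-resources everyone"
# model. Stage names approximate TERMS; respondents are hypothetical.

TERMS_STAGES = [
    "Investigating new content",
    "Acquiring new content",
    "Implementation",
    "Ongoing evaluation and access",
    "Annual review",
    "Cancellation and replacement review",
]

# each person's self-reported set of responsibilities
responses = {
    "Alice": {"Acquiring new content", "Annual review"},
    "Bob": {"Implementation", "Ongoing evaluation and access"},
    "Carol": {"Annual review"},
}

# who claims each stage?
coverage = {stage: [who for who, stages in responses.items() if stage in stages]
            for stage in TERMS_STAGES}

# stages nobody claims are the organizational blind spots
gaps = [stage for stage, who in coverage.items() if not who]
print("Unclaimed stages:", gaps)
```

The same matrix could be extended with a second axis for the NASIG competencies; the interesting output is the unclaimed (and over-claimed) cells.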
    Workflow analysis and restructuring were prevalent, and approaches had some commonalities, such as positions and workflows re-aligning with libraries' strategic plans, including many libraries creating digitization programs to manage OA resources and born-digital assets. Key points repeated about these workflow analysis efforts emphasize:
  1. it will take time (years!)
  2. it will be painful
  3. it will require concerted attention to information management.
    Information management also stood out as a critically important goal and ongoing activity in its own right, with repeated emphasis on visualization/process maps, and with common sets of success measures, including:
  1. reduce reliance upon email and human memory,
  2. automate hand-offs and notifications,
  3. promote ease of access to existing documentation,
  4. improve visibility of (and to those responsible within) the entire life cycle. (Duke, MIT, TERMS).
    Related to both information management and shared/open knowledge, using the wiki as a conceptual model, specifically for workflow and procedures documentation, was mentioned frequently, as were various perspectives on the readiness (or lack thereof) of new ILS systems to address our key information management needs. I still agree with the vendor who said at ALA Midwinter, and whom I quoted in a session at ER&L: "You can't tell [vendors] soon enough that you are considering ILS migration". However, given all this, I began to come a little bit closer to accepting (kind of) the point that these new ILS systems are not quite ready for what we really need. But what are we supposed to do in the meantime that is NOT EMAIL?!
    Other bits here and there related to nagging e-resources needs include: usability, navigation, mobile access, DRM and licensing, and e-books (#nexuserm). Perpetual access problems to solve include providing proof of payment, deciding whether license language should be specific or vague, and the fact that even new ILS systems still rely on outdated DLF standards, which do not cover all the fields that are needed.


In addition to a few ideas in workflow and information management, I jotted down some other, perhaps less thought-out, ideas to consider working on here at home.
  • Working with external vendors and user services office (in our case the Centers?) to establish training and promotion of e-resources.
  • Establishing paid fellowships/apprenticeships to deal with staffing issues and practical learning opportunities for graduate students. (#erl14humanterms)
  • Standards vs. APIs and open source: we should move toward outcomes-based partnerships and work. (Playing Nicely)
  • How can we apply “dogfooding” in the library organization: internal customer service as you would external customer service. (Playing Nicely)
  • Access Services is demand driven, E-resources Management is workflow based, challenge or opportunity? (#nexuserm)
  • E-resources troubleshooting as Access Services function, could benefit from merged service desk, merged tracking tools. (#nexuserm)
  • Information Mgmt: consolidate storage places for title list spreadsheets with the licenses (Duke)
    Looking ahead to some specific e-resources trends on a more immediate horizon, I noted some takeaways from the presentation "Streaming Video is an E-resource" — covering both commercial content and digitization of local assets. I also paid attention to a bit I overheard from publishers that the short-term loan model for demand-driven acquisition is problematic and unsustainable (#niso #dda).
   Also, on the hazier horizon, the concept of how we support OA resources management in our organizations came up, as this is a strategic priority in my library. But we still don't exactly have clear answers. Jill Emery & Graham Stone, who lead the TERMS project for e-resources management, are building on that approach for a new project, Open Access Workflows in Academic Libraries (OAWAL), to gather collective techniques and workflow approaches for open access resources management. Other OA projects mentioned to keep on the lookout for include Bluejar (like Knowledge Unlatched, crowd-sourced funding for making books open access) and Pivots (not monographs, but shorter e-bits of content — of interest for online learning).

RESOURCES to Read, Explore

– (Lightning Talk)
– OpenStreetMap, Omnipedia, Atlasisfy (Closing Keynote)
– Catalog 2.0 by Sally Chambers (2013) recommended reading for thinking of transitioning ILS. (Playing Nicely)

Core Competencies for Electronic Resources Librarians #alamw14

How I missed adding Dr. Sarah Sutton's presentation on the Electronic Resources Core Competencies to my scheduler, I'll never know. But thanks to Twitter, I got there in time to catch the key points, just after Sutton's overview of the competencies themselves.

Sutton gave high praise to many of the unique ways (mostly academic) libraries are already putting the competencies into practice. Most are using them to analyze, restructure, and define workflows and staffing, either at the department or unit level, and even across the entire library. The latter speaks to a significant takeaway of the competencies, that in most cases “one person can’t possibly do all of this”.  The competencies document emphasizes how they are not a set of competencies for an e-resources librarian, but a focus on the collaborative nature of managing these resources throughout the organization.  Other applications Sutton shared include informing MLS course programming and continuing education opportunities for both professionals and paraprofessionals, and creating job descriptions and hiring advertisements.  The audience provided additional applications, such as assessing and targeting specific areas of strengths and weaknesses.

Sutton plans to continue her research by investigating how the competencies shape student learning outcomes in MLS programs.  For myself, I see connections to my research interest in organizational communication, as well as pursuing the question of how you develop training in these competencies, especially in such amorphous concepts as “tolerance for ambiguity and complexity”.  How do you practice that, and how do you measure it?!

There was an important final question from the floor that spoke to how these competencies relate to Emery & Stone's Techniques for E-resources Management (TERMS). Sutton aptly addressed the similarities between the two, while noting their differing approaches — TERMS being more practical in nature and the Core Competencies more conceptual, addressing the knowledge, skills, and abilities of the people doing the work of e-resources management. I shared my agreement with others in the audience that the two are complementary, pointing out that I posed a very similar question for the TERMS project — imagining how techniques mapped to the e-resources life cycle could extend to mapping improved workflows and organizational communication.

The nature of e-resources evokes themes of constant change and adaptability. As such, the process for updating these competencies, according to Sutton, will be ongoing, and the opportunities for training and other applications of the competencies will continue to evolve. It will be interesting to see how the programming takes shape for the upcoming ER&L, and especially NASIG's Annual Conference in Fort Worth, as its call for proposals was modeled on these Core Competencies.

ER&L Conference Summary #erl13

I recently attended the 2013 Electronic Resources & Libraries (ER&L) conference in Austin, TX. As before, the conference reinforced and solidified ideas among my fellow electronic resources librarians (ERLs), re-energized my research agenda, and reminded me that I am not alone in making wider connections out from my work as an ERL.

In fact, the “we don’t usually have a theme” theme of the conference was bridging communities and cross pollinating ideas — which led me to ask myself, Self: How do I communicate and bridge ideas across the world of ERL and the larger library mission in practice? But another subtler theme that I picked up on throughout the conference turns out to be a very reality-based response to my own question.

The keynote opening and closing speakers, as well as many presenters throughout the conference, challenged all of us to move beyond research results or the identification of problems in our communities (content) and become involved in myriad ways with solving problems and building bridges (service). Even more than my little parenthetical emphasis on service over content — this was a call to individual action.

“What are you going to do with what you now know about Google Generation users?” asked Michael Eisenberg (session notes) — Opening keynote: Listening to Users: What the “Google Generation” Says About Using Library & Information Collections, Services, and Systems in the Digital Age

“You are the Digital Library Federation,” chided Rachel Frick — Closing keynote: The Courage of our Connections: Thoughts on Professional Identities, Organizational Affiliations and Common Communities

“Are you disgruntled? Support these start-ups, your fellow Disgrunterati who are making things happen!” coined Jason Price — Lightning Talks

I attended sessions mostly focused on my passion areas, the places where I am most action-oriented — workflow and communication. I felt particularly energized by presentations from early adopters of webscale systems like Intota (session notes) and Alma (session notes). Unlike years past, when new ERM systems were adopted and met with fairly wide-scale disappointment, these adopters spoke specifically to how these new ILS systems are helping them manage the complex nature of our work across the library (e and p, content and service). And they seemed so happy! They clearly demonstrated how the ability of these systems to centralize and structure key data, and to bridge that data across all library service workflows, enabled them to more quickly take action to address internal and external users' needs.

I was also very pleased with the Project Management in Libraries (session notes) post-conference workshop led by the most excellent Jennifer Vinopal (NYU). To energize my research agenda there was a welcome talk on the importance of both Internal and External Customer Service (session notes), especially as it relates to various organizational restructuring. Timely! These sessions helped me see where I can act, both by confirming current thinking and by offering new ideas to help me move forward.

Some others included Jill Emery's and Graham Stone's TERMS (session notes) project. I would love to become involved in extending areas of TERMS that relate to communication and information management, as well as key troubleshooting best practices. Another was Extreme E-resources Endeavors… (session notes), which included a mix of things we have already acted on (PDA, E-reserves) and things we are hoping to (renewal calendars, POOF!). Feeding one of my passions (and past professions), Instructing Future ERLs (session notes) was another inspiring call to act, although maybe further down the road with this one.

Now, strangely, and despite all of Dan Tonkery's advice to keep emotion out of it (Improving Communication & Relationships Between Librarians & Publishers session notes), my initial overall response to the conference (after a great closing keynote) was not resolve and energy, but reservedness, fear, frustration, and, believe it or not, tears! I reasoned that it was frustration with wanting to act but not being able to, due to lack of resources or, possibly, as Frick suggests, the "courage of my connections". But I also think changes going on back at my organization may have played some subconscious role in that too — the sense of uncertainty about where these idea and action bridges will be built.

You should also probably know that I was reading Susan Cain’s Quiet: The Power of Introverts in a World That Can’t Stop Talking during the trip. I claim to be an ambivert, but I was operating strongly with my “I” throughout the conference. I didn’t do a lot of networking, even though I had many opportunities, and it has taken me much longer to recover my energy post-conference (also classic introvert behavior). Thankfully, some of my first tasks back were sharing project management approaches, discussing ideas for development programs, and spending most of today cleaning up my notes and summarizing my experience.

And look what I found on facebook!


Reflecting now upon these bold calls to action and individual responsibility, I'm reminded that I begin acting within my circle of concern. My strengths in learning, strategy, analysis, and taking action with others are what help me be effective in my circle. These same strengths also enable me to see and act beyond this area by sharing ideas and bridging communities. I have always thought of myself as a bridge-builder of both ideas and communities. This conference is always a great reminder of how what I do as an ERL prepares me in all sorts of ways to be a greater and broader leader in librarianship as a whole.

Project Management in Libraries #erl13

First of all, I only realized when sitting down to this post-conference workshop that it was being led by Jennifer Vinopal, author of the wonderful article on project and portfolio management that I'd shared within my organization this past year. So I was very energized, which was good, since I was otherwise totally exhausted coming off the end of the conference as a whole.

Vinopal did not disappoint.  She is an excellent teacher and clearly knows her subject and how to present it to librarians. The context, outline, timing/pace, and the activities (a mix of alone work, pair and share, and open discussion) were very helpful for building a greater understanding of project management in libraries.

The session outline basically followed a “talk, do, discuss approach” around the following:

  1. project manag(er/ment) – what is it?
  2. project charter – documentation of the scope agreement (i.e. collaborative), which includes scope, goals, and deliverables
  3. project plan
  4. project execution
  5. [if time…portfolio management]

An important distinction about project management in libraries is to remember that library services are not projects that never end. A never-ending project is both an ill-defined project and an ill-defined understanding of a service.

Vinopal's overview of the reasons projects fail (there were 8) is a good way to reveal why project management is valuable, and offers an approach for gaining organizational buy-in. She observed that, as librarians, we all likely got where we are because we were good doers who are able to plan quickly. Project management, however, requires slowing down and building consensus, which are two different skills. There is an emphasis on facilitating both the work and the workers involved in the project. This requires knowing your workers and what they need in all areas of project management, including (a lightbulb moment for me) — communication. Her advice: Don't force tools that don't work. Use communication and project tracking tools that will enable your workers to work.

One of the great skills Vinopal demonstrated in her presentation was translating project manag-ese into terms that are meaningful for libraries. We started by going around the room introducing ourselves and our planned project examples, which allowed us to identify commonalities and possible partners for our upcoming activities. Some of the project examples included:

  • understanding the transition from project to process
  • e-books and various related implementation projects
  • a cancellation project
  • a communication audit (evaluation)
  • transitioning an ERMS
  • a digitization project

Then, we got started on creating the project charter, aka project one-pager, project home page.  Project charters need:

  • a name = this may not be as simple as it seems, especially when dealing with multiple products
  • description (goal)
  • success criteria (assessment)
  • the requirements (deliverables, optionals, and out of scope)
  • who is on the project team (including roles and contact info)*
  • milestones/schedule (high-level proposed dates)

*note especially the role difference b/w sponsor (finance, support, giving you authority) and stakeholder (advocate, ally)

After describing the requirements of the charter, Vinopal challenged us to think about where this information would come from. It seems like a no-brainer, but this is often where people get stuck and activate their doer over their planner. Some information resources for the charter may include:

  • the sponsor and stakeholder (without promising at this stage)
  • past projects
  • surveys or environmental scans
  • any and all correspondence and documentation (email, grant documents)

As we worked through and discussed our activities, I jotted down some additional highlights (below).

Writing the charter. Consider the audience in the language of the charter. Your charter may include a communication plan, depending on the audience or project. Another consideration is a risk management plan, which will also vary depending on the audience.

"Running meeting notes" are an easy way to keep notes and actions by building from the bottom up. Create a home page linking to this additional documentation.

Communication plan can be an avoidance-of-risk plan and can be as simple as identifying how you communicate within the roles section of your charter.

All that matters is that you do and use what works for your organization.  Microsoft Project is often overkill both for you as the planner and the audience who must follow it.
Creating tasks and setting timelines always takes longer than estimated. "Make your best estimate and adjust up." You have to talk with others in this step to determine how long certain things take. Planning Poker can be a fun facilitation technique for involving your team in estimating time.

Workflow design and redesign may be necessary within project planning.
Handoffs and triggers need to be part of the workflows. At the very least, add meeting agenda items to address handoffs and what's in the pipeline.

Responsibility without authority is ugly.  View project manager as facilitator vs. task master. Organizational buy-in needed especially to the language and approach.  It is helpful if someone can “be-knight you” as the project manager.

Project and service portfolio management (PPM). There can be portfolios within the organization as a whole or within just a small subset. PPM can be as basic as a list (inventory) of the projects and services ongoing or on tap in the organization. PPM originated in business and IT; libraries may be doing it, but they aren't writing about it. It requires a good amount of buy-in from the top and a governance structure beneath. Vinopal recommends having a Project Portfolio Manager for the whole organization.

PPM can also track just who is doing things, rather than how long each is estimated to take. This may be less scary for individuals while still giving the manager the ability to see realities.

Vinopal emailed us her complete presentation with her notes and encouraged us to contribute our projects and ideas about project management to the Crowdsourced PM Toolkit.

Unified Resource Management (Alma) #erl13

Jimmy Ghaphery of Virginia Commonwealth University (VCU) shared the public side of the unified resource management system, Alma. As an early adopter, VCU migrated to Alma between April and October 2012 from their existing array of systems: SFX, ARC, Aleph, and Verde. January 2013 included further migration to Alma OpenURL and Alma Course Reserves.

Ghaphery emphasized the benefit of Alma as the back-end system upon which other layers can be added. Alma at VCU means no other catalog for users; staff search as the public do, and there has been no huge uproar. Browse functionality is still necessary, though, and this need appears to be a bigger issue in humanities research.

Ghaphery noted the OpenURL interface as one of those most used by their public, and found Alma provided better visibility, especially of print holdings. There is still a need for better support for custom parsers in order to include collections not indexed by Primo Central.

The details of the back-end system were presented by Erika Johnson (Boston College), who emphasized the dashboard and task list interface and the system requirement to set up workflows. Staff training in this was done in-house, not by Ex Libris, and within a sandbox. The benefit of the system is that it knows the next step in a workflow; you don't have to tell it, or track an event and then manually push it to its next step. For example, if you create an order and then mark it "sent to vendor," it automatically moves to the "activation" task list. Alma's task list and workflow design, and its centralized structure, simplify the renewal process into one system.
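The auto-advance behavior described above can be sketched as a tiny state machine: once a step is completed, the item moves itself to the next task list, so nobody pushes it along manually. This is an illustration of the concept only — the stage names and class are invented, not Alma's actual data model:

```python
# Toy sketch of workflow auto-advance. Stages are illustrative.

WORKFLOW = ["order created", "sent to vendor", "activation", "done"]

class OrderItem:
    def __init__(self, title):
        self.title = title
        self.stage = WORKFLOW[0]

    def complete_current_step(self):
        """Advance to the next stage automatically; stop at the end."""
        i = WORKFLOW.index(self.stage)
        if i < len(WORKFLOW) - 1:
            self.stage = WORKFLOW[i + 1]
        return self.stage

item = OrderItem("Journal of Example Studies")
item.complete_current_step()          # advances to "sent to vendor"
print(item.complete_current_step())   # prints: activation
```

The contrast with older ERM systems is that there, the list of stages lived in someone's head (or inbox), and every transition was a manual hand-off.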

Johnson also talked a bit about Alma Analytics, which offers a number of widgets to produce budget, task, and cost-per-use analysis reports. In the April release, this cost-per-use tool will be visible from the search interface, and potentially coming into the Primo interface as well. Johnson noted that their prior reorganization, which created a continuing/e-resources and metadata unit, worked well for the Alma implementation.

Susan Stearns (Ex Libris) finished up with additional Alma updates and summarized the four major areas of evaluation on which they have worked closely with partners:
1) Streamlining workflows
2) Increased visibility through Analytics
3) Creating an environment for collaboration (ARL community facilitation, Orbis Cascade award, infrastructure to support sharing resources in both collections and technical services)
4) Becoming agile — agile development and a different (agile) mindset required for dashboard workflow interface


Webscale Collection Analysis and Development (Intota) #erl13

Marist College is one of the development partners for Intota. Kathryn (Katie) Silberger gave an overview of assessment efforts at Marist and how webscale (360COUNTER, Google, Summon) has helped them.

Using 360COUNTER provides multi-year comparisons and centralized gathering and storage, while still offering robust reports in Excel format. With cost data in the system, renewal analysis and decisions take 10 minutes.  You even get stats for products that do not provide reports: for example, click-throughs for open access resources (to raise faculty awareness); referrer reports showing where people are starting; and a report of widget usage in LibGuides (which LibGuides itself can’t provide).  Other assessment approaches they have tried include using Google Forms (DIY) for reference question analysis with a direct connection to collection decisions, and mining discovery logs to post the top 10 or so questions to internal staff.
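As a rough illustration of why renewal analysis becomes fast once cost and usage live in one system, here is a minimal cost-per-use sketch. The record fields (`title`, `cost`, `uses`) are hypothetical placeholders, not 360COUNTER's actual schema.

```python
def cost_per_use(records):
    """Yield (title, cost-per-use) rows; zero-use titles get None as a flag."""
    for rec in records:
        uses = rec["uses"]
        yield rec["title"], round(rec["cost"] / uses, 2) if uses else None

# Toy data: one well-used title, one with no recorded usage.
subscriptions = [
    {"title": "Journal A", "cost": 1200.00, "uses": 480},
    {"title": "Journal B", "cost": 900.00, "uses": 0},  # flagged for review
]

for title, cpu in cost_per_use(subscriptions):
    print(title, cpu)
# Journal A 2.5
# Journal B None
```

With cost and usage joined like this, a renewal decision reduces to scanning one sorted column rather than reconciling spreadsheets.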

Like many libraries, their assessment environment means dealing with data in multiple systems and a proliferation of spreadsheets.   Besides collecting these into one system, another reason for going webscale is that e-stats and p-stats are different. Current p(hysical)-stats, like circulation reports, don’t account for highly circulating items such as laptops and study rooms. Also, when assessment takes too much time and effort, you often can’t ask for things “out of curiosity.”  Webscale means less time manipulating data and more time for analysis.

Mark Tullos (ProQuest) discussed how to bring all of this together in one place with Intota Assessment, which has been rolled out this year in advance of the full Intota webscale system. The claim is that Intota offers “a total picture of holdings, usage and overlap across all formats.”

This Spring they are beta testing with current partners (and possibly adding other partners). After this process they will be recommending best practices.

Question — How do you deal with the fact that ingesting data into 360COUNTER or homegrown solutions is problematic and requires a lot of normalization?
Answer — 360’s Data Retrieval Service (DRS) has helped with this by using authority control, much like the questioner said they were doing manually. The problem with normalizing COUNTER data is often in the header, so the DRS replaces it. The DRS doesn’t require SUSHI compliance.

Question — What about non-COUNTER data?
Answer — For products with no stats, use click-throughs; for non-COUNTER stats, normalize them to make them COUNTER-ish and load them.
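The "make them COUNTER-ish" step can be sketched as a small column-mapping pass over a vendor's ad-hoc export. The vendor column names (`resource_name`, `hits`) and the COUNTER-style target fields below are assumptions for illustration, not any vendor's or 360COUNTER's real layout.

```python
import csv
import io

def normalize(raw_csv: str, platform: str) -> list:
    """Rename ad-hoc vendor columns to COUNTER-style fields, coerce counts to ints."""
    rows = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        rows.append({
            "Title": row["resource_name"].strip(),
            "Platform": platform,
            "Metric_Type": "Total_Item_Requests",   # assumed label for raw hits
            "Reporting_Period_Total": int(row["hits"]),
        })
    return rows

# A made-up vendor export with messy whitespace in the title column.
vendor_export = "resource_name,hits\nSome Database ,42\n"
print(normalize(vendor_export, "VendorX"))
```

Once every source is mapped to the same columns, the COUNTER-ish rows can be loaded alongside genuine COUNTER reports and compared in one place.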

Question — How are you loading cost?
Answer — By hand. The form has not been easy enough to hand off to clerical staff, given that invoicing is so varied across products. Once you have it in,

Question — If Banner could interact with Serials Solutions, it seems this would make the process easier. Are you planning for this?
Answer — Serials Solutions is following this and other payment systems, anticipating something to assist when rolling out Intota Webscale.
