
Archive for 2011

Throwing Down the Gauntlet: Moving from Ideas to Action

In Innovation, Society for Scholarly Publishing on May 27, 2011 at 7:00 am

Image by Robyn Jay on flickr

Everyone wants to find new ways to compete while also keeping the machine oiled and running. The planners of the 2011 SSP IN Meeting have been wrangling with this duality between big ideas and practical requirements for weeks.

A recent story in Information Week pinpoints the need for executives to have the skills to evaluate, prioritize, and sell ideas in order to take them from the drawing board to the market.

Companies are attempting to codify the processes through which innovation can be nurtured. More important than ideas, which quite frankly are cheap, is the ability to pick which concepts are worthy of the heavy investment of time, money, and corporate mindshare required to take them to productization. — Alexander Wolfe in “Top 5 Tech Trends for 2011”

But keeping things running is what’s keeping people up at night as well, as a colleague found out when she spoke with a handful of society publishing directors recently (under conditions of anonymity). Their concerns are:

  • Competition on price and availability: New ways to deliver content, which are taking shape on the web, threaten publishers because they compete, not on quality, but on price. Our journals are costly to produce, the subscription model is threatened, and it is getting more challenging to compete and retain market share against more inexpensive, “good enough” content.
  • Pressure from consolidation: Users are getting irritated with all the interfaces. Suffering from information overload, our users say they prefer fewer, better, go-to resources. How will smaller publishers compete with the behemoth databases, especially in an environment of acquisition and consolidation? Is PubMed Central the model for the future? No journals, no branding, just data-based information?
  • Workload and resource demands: The workload of accepted papers is increasing, but publications revenue cannot keep pace with the growing demands for services and programs from other areas of the society that either do not produce revenue or are not self-supporting.
  • Changing membership: Making our publications program more relevant to new members, who may not have terminal degrees, without devaluing journals for academic authors and reviewers.
  • Journal cuts: Our discipline is facing challenges. Many departments have been closed or merged with departments in other disciplines. In some places, the subject is being taught by people without a discipline-specific background. There are scientists doing pharmacological research who identify with larger practice areas, not sub-specialties. With fewer people to speak up in defense of journals, it has become easier for them to go on the chopping block. There is also the question of how to compete with big package deals: our journals are significantly less expensive than their commercially published competitors but are easier to cut than larger packages. Most, if not all, consortia will not bother with small numbers of journals, so we get squeezed out of that market.
  • Keeping pace with technology: Semantic tagging is important, but I have neither the money nor the time to implement it. My editorial boards are seeing new technologies before I do, and they want to know when I’m going to adopt them. The pace of change seems to be quickening. Staying informed is a challenge, and finding ways to implement new technologies is even more difficult. At the same time, my resources are shrinking.
  • The squeeze: What if subscription sales decline and scientists’ research grants can’t support publication charges?
  • The bottom line: What keeps me up at night? The need for more sustainable business models.

SSP members were also recently asked to vote for three strategic issues (of eight identified by the SSP Board of Directors) that they felt would most significantly impact them and about which the society is positioned to take constructive action. What they flagged:

  1. User expectations that they can get information in variable shapes, sizes, and prices (especially free) challenge existing publisher/librarian roles and business models (128 votes)
  2. New products and technology require new skill sets from employees, straining traditional career progression and job descriptions, and requiring constant revamping and retraining (123 votes)
  3. Publishers’ increasing reliance on multiple, unstable revenue streams places a premium on business agility, adaptability, and collaborative partnerships (104 votes)

IN Meeting organizers have incorporated feedback from SSP members about their strategic priorities throughout the planning process. The dual purpose of this 2.5-day meeting is to give attendees new ideas and experiences and to help them translate what they’ve heard into practical, needs-focused actions — as moderator Mary Waltham has put it, steps they can take “within the first 10 days back in the office.”

Continue reading on The Scholarly Kitchen.

Top-Down and Bottom-Up: The Squeeze That Can Revolutionize (and Save) American Education

In Education on April 12, 2011 at 9:34 pm

Source: seantoyer on flickr

Last week I attended the Education Innovation Summit at Arizona State University SkySong, organized for the second year by Michael Moe and Deborah Quazzo. The conference was unusual for its intimate size and for the access it offered to the “top-down” influencers and “bottom-up” innovators that Moe and Quazzo brought together for this two-and-a-half-day meeting: technologists, educators, authors, CEOs, and politicos.

For growth companies and funders, the meeting was a mechanism for speeding the capital process and gaining traction for new ideas. Those committed to system reform heard from speakers, including F. Philip Handy, a member of The Aspen Institute’s Commission on No Child Left Behind, who addressed the very significant political and infrastructure challenges that continue to obstruct more progressive technology adoption in mainstream education. Everyone grappled with what needs to happen next, the common refrain being that incrementalism will not be enough to get our students and our system from here to where they need to be, and quickly.

In a candid and entertaining closing session, Marguerite Kondracke interviewed Joel Klein about his departure from the New York City Department of Education, where he was Chancellor until 2010, and his plans and priorities as Executive Vice President of Rupert Murdoch’s News Corporation.

Of particular interest to the publishing sector, Klein zeroed in on five key drivers transforming learning media, which Tom Vander Ark summarizes in EdReformer:

  1. The shift from print to digital: dynamic and interactive instructional content is coming fast.
  2. Data-driven systems: with digital learning and more instant feedback, we can try a dozen lessons and see what works best, testing empirically whether fractions should come before decimals or whether it matters if physics comes before biology. Klein thought Wireless Generation (a News Corp company) was well positioned in this regard.
  3. The shift from classroom-centric to device-centric learning unbound by time and place.
  4. Customization by level and approach.
  5. Human capital: the ability to focus on the value-added and really inspirational parts of learning, rather than asking everyone to do the same stuff (like build lesson plans).

In a keynote a day earlier, Michael Crow, President of Arizona State University, described the dramatic changes he is making to reinvent the state’s higher education system. “Tradition is the enemy,” said Crow: it threatens our educational outcomes, knowledge base, and global competitiveness. The system as it exists today centers on faculty, not students, and this needs to change dramatically, not incrementally. Why hold on to constructs and systems that are no longer practical or relevant and that gate progress, performance, and success?

Crow’s vision calls for a scalable, student-at-the-center system with top-level researchers investing more time in the classroom as master teachers. His goal is not only to overturn the status quo but to transform Scottsdale and the surrounding region into a hub for business and research innovation in the model of Silicon Valley.

Continue reading on The Scholarly Kitchen.

Innovation and Longevity in Digital Publishing: Surfing the S-Curve

In Innovation on March 22, 2011 at 12:37 pm

Source: Cogdogblog on flickr

Some scholars — including Clay Christensen, author of “The Innovator’s Dilemma” and “The Innovator’s Solution” — argue persuasively that the disruption necessary to create viable innovations must come from outside an industry’s traditional ecosystem. This was elaborated upon recently in an interview with Soomo Publishing’s CEO David Lindrum:

Christensen helped us understand why, in 15 years of trying, we failed to get traditional publishers to build these [new] kinds of resources. . . . Everything in traditional publishing is built around the book, from how the market is analyzed, to the range of features considered and the process of product creation all the way down to how the rep learns a product and makes a call. Every process, metric, and assumption is built around print. . . . [I]f Christensen’s model from Innovator’s Dilemma holds up in this market, the new products must come from outsider organizations and will flourish first in fields that traditional publishers see as low-margin and undesirable.

Publishers, who often struggle with innovation and experimentation, might benefit from roadmaps for publishing innovation that encompass concept development, business modeling, market readiness, and audience targeting. For example, David Wojick and I recently collaborated on an article, “Reference Content for Mobile Devices: Free the Facts from the Format,” that steps through the initial challenges of transitioning content from websites to mobile devices.

Once theory has been translated into viable business models, the next elephant in the room is consumer readiness. A body of literature about the technology adoption curve has been accumulating since the 1950s. The diffusion of innovations theory, summarized below, was published by Everett Rogers in 1962. From Wikipedia:

Technology adoption typically occurs in an S curve . . . [d]iffusion of innovations theory, pioneered by Everett Rogers, posits that people have different levels of readiness for adopting innovations and that the characteristics of a product affect overall adoption. Rogers classified individuals into five groups: innovators, early adopters, early majority, late majority, and laggards. In terms of the S curve, innovators occupy 2.5%, early adopters 13.5%, early majority 34%, late majority 34%, and laggards 16%.

Consider the recent tipping point for e-books. According to the diffusion of innovations theory, the market for e-books is transitioning from early adopters to early majority. How long have we been waiting for this to hit?
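To make the Rogers segments quoted above concrete, here is a rough back-of-the-envelope sketch in Python (mine, not from the post or from Rogers) that simply accumulates the standard category shares. The tipping point described above, where a product passes from the early adopters to the early majority, falls at roughly 16% cumulative adoption.

    # A rough illustration: Rogers' standard adopter shares, accumulated
    # into the cumulative adoption levels along the S-curve.
    segments = [
        ("innovators", 2.5),
        ("early adopters", 13.5),
        ("early majority", 34.0),
        ("late majority", 34.0),
        ("laggards", 16.0),
    ]

    cumulative = 0.0
    for name, share in segments:
        cumulative += share
        print(f"{name:>15}: +{share:4.1f}% -> {cumulative:5.1f}% of the eventual market")

    # The hand-off from early adopters to the early majority sits at 16.0%,
    # the threshold the e-book market appears to have just crossed.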

Many in our industry recall visiting the well-funded NetLibrary campus in Denver during the late 1990s, followed by the company’s crash and subsequent reboots under new owners. Competitors like ebrary, EBook Library, and Knovel entered the market around 2004 and engaged the innovators and early adopters in our community. But it wasn’t until 2009–2010 that e-books gained a significant measure of commercial traction. We’re now seeing the acquisitions and consolidations that denote a maturing market, ushered in by Amazon, Apple, and Google.

Hybrid cars were viable before consumers were ready for them. Text messaging emerged in the early 1990s and has taken nearly two decades to become “state of the art.” Patron-driven acquisition (PDA) models have been available for several years but have only now entered our mainstream conversation.

Trends and ideas spark around us all the time. Some gain early acceptance, seemingly level off, and then burst onto the mainstream scene years later. Having insight into innovation adoption theories will help us gauge how best to work market levers in order to establish new products and win over larger markets.

Randy Elrod, an artist and author, writes about three types of audience influencers and their impact on innovation adoption in “How to Diffuse Ideas and Influence People:”

(1) Opinion leadership is the degree to which an individual is able to informally influence other individuals’ attitudes or overt behavior in a desired way with relative frequency. (2) A change agent is an individual who attempts to influence clients’ innovation-decisions in a direction that is deemed desirable by a change agency. (3) An aide is a less than fully professional change agent who intensively contacts clients to influence their innovation-decisions.

Generally, the fastest rate of adoption of an innovation results from influencing the decisions of these influencers. When Don Henley of the Eagles was asked in an interview how it feels to be so famous and to have his songs permeate society, he replied, “It’s not the fame, it’s the ripple effect I’m hoping for.”

It’s precisely this ripple effect we are seeking through our efforts to reinvent the digital publishing business. We need to achieve substantial commercial success to offset the loss of formerly stable revenue streams. We can give ourselves advantages by acquiring new skills that help us plan for distinct scenarios, take the temperature of the marketplace, renovate our ecosystems, and prime our audiences for new offerings.

Continue reading on The Scholarly Kitchen.

Reference Content for Mobile Devices: Free the Facts from the Format

In Technology on March 15, 2011 at 5:10 pm

Source: William Hook on flickr

An excerpt from the article that David Wojick and I have written for E-Reference Context and Discoverability in Libraries: Issues and Concepts, which will be published by IGI Global and edited by Sue Polanka, head of reference and instruction at the Wright State University Libraries:

The rapid rise of mobile devices presents reference content providers with a grand challenge. Traditional content designs, especially web pages, simply do not work on the tiny screens of mobile devices. The typical computer screen is 50 or more times larger than the typical mobile device screen: 200 to 300 square inches for the computer, compared to just four to six square inches for the mobile device. As a result, traditional web-based content designs are virtually unreadable on the mobile screen. The solution is to radically restructure content, presenting it in a way that breaks it down into tiny pieces and frees the facts from the format. But the organization will not appear on the screen as format, as it typically does with web pages. Instead, the key to effective presentation of factual material will be in the linkages among the tiny pages, of which a great many will be required.
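To make the idea of linked “tiny pages” a bit more tangible, here is a minimal, hypothetical sketch in Python. It is not drawn from the article; the page names, fields, and sample facts are invented purely for illustration. Each fact lives on its own small page, and navigation is carried by the links between pages rather than by on-screen layout.

    from dataclasses import dataclass, field

    @dataclass
    class FactPage:
        title: str                                 # short enough for a small screen
        body: str                                  # a single fact, a sentence or two
        links: dict = field(default_factory=dict)  # link label -> target page id

    # A tiny corpus of two linked fact pages (contents are illustrative only).
    pages = {
        "halogens": FactPage(
            title="Halogens",
            body="Group 17 elements: F, Cl, Br, I, At.",
            links={"Fluorine": "fluorine"},
        ),
        "fluorine": FactPage(
            title="Fluorine",
            body="Atomic number 9; the most electronegative element.",
            links={"Back to halogens": "halogens"},
        ),
    }

    def render(page_id: str) -> str:
        # Render one tiny page: the fact itself plus its outbound links.
        page = pages[page_id]
        link_lines = "\n".join(f"  -> {label}" for label in page.links)
        return f"{page.title}\n{page.body}\n{link_lines}"

    print(render("halogens"))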

An abridged version of the article appears in the first issue of the Advances in Library and Information Science (ALIS) Newsletter.

[About the authors: Alix Vance owns Architrave Consulting and is Chief Operating Officer at The Center for Education Reform. David E. Wojick is Senior Consultant for Innovation at the Office of Scientific and Technical Information (OSTI) of the U.S. Department of Energy. OSTI operates several of the world’s largest technical reference portals, including www.science.gov and www.worldwidescience.org.]

Smarter Metadata — Aiding Discovery in Next Generation E-book and E-journal Gateways

In Linked Data, Technology on March 8, 2011 at 3:13 pm

Source: Andrew Mason on flickr

From my February post on The Scholarly Kitchen —

With the recent surge in library e-book sales, serials aggregators are racing to add e-books to their platforms. ProQuest’s recent acquisition of ebrary and JSTOR’s expansion into current journals and e-books signal a shift from standalone e-book and e-journal aggregator platforms to mixed content gateways, with e-books and e-journals living cheek by jowl in the same aggregation.

Meanwhile, researchers have become accustomed to the big search engines, and have shifted from reading to skimming. As the authors of an article in the January issue of Learned Publishing, “E-journals, researchers – and the new librarians,” summarize:

Gateway services are the new librarians. . . . Reading should not be associated with the consumption of a full-text article. In fact, almost 40% of researchers said they had not read in full the last important article they consulted. . . . ‘Power browsing’ is in fact the consumption method of choice.

These changes in behavior mean that gateway vendors have to develop more sophisticated tools for organizing and surfacing content. ProQuest, OCLC, EBSCO, and others have responded by creating new tools and systems. But is it enough?

Publishers often discuss distinctions between e-book and e-journal business and access models, but the truly complex differences between e-books and e-journals reside beneath the surface, in the metadata layer. Understanding and compensating for these differences is essential for interoperable content discovery and navigation when mixed e-book and e-journal content is delivered in large-scale databases, which is increasingly the norm.
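As a hypothetical illustration of that metadata mismatch (not drawn from the post, and not tied to any particular standard such as ONIX, JATS, or MARC), consider how differently a journal article and an e-book chapter describe themselves. All field names and values below are invented:

    # Two invented records: the identifiers and container fields
    # do not line up at all out of the box.
    journal_article = {
        "issn": "1234-5678",             # serial-level identifier
        "journal_title": "Example Studies",
        "volume": "12",
        "issue": "3",
        "article_title": "An Example Article",
        "doi": "10.1000/example.123",    # article-level identifier
    }

    ebook_chapter = {
        "isbn": "978-0-00-000000-0",     # book-level identifier
        "book_title": "Example Handbook",
        "chapter": "7",
        "chapter_title": "An Example Chapter",
    }

    # Fields the two records share without any normalization work:
    print(sorted(set(journal_article) & set(ebook_chapter)))   # -> [] (none)

    # One possible approach a mixed-content gateway might take: map both
    # record types onto a small common schema before indexing.
    def normalize(record: dict) -> dict:
        return {
            "container": record.get("journal_title") or record.get("book_title"),
            "unit": record.get("article_title") or record.get("chapter_title"),
            "identifier": record.get("doi") or record.get("issn") or record.get("isbn"),
        }

    print(normalize(journal_article))
    print(normalize(ebook_chapter))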

Continue reading on TSK.

Low-Hanging Fruit and the Re-Ordering of the Value Chain

In Internet Business Models on January 18, 2011 at 12:43 am

Source: Jamescapp02 on flickr

Outtakes from my post yesterday on The Scholarly Kitchen:

The machines of scholarly research and content dissemination require monetary input at some point(s) in the process in order to run. Despite this reality, many mission-driven organizations are uncomfortable acting as (or, in some cases, are ill-suited to be) commercial survivors.

Quoting Joe Esposito’s comment to Phil Davis’ recent post on OA competition between PLOS and Nature:

What fascinates me is that the governing boards of prestigious journals are interfering with the necessary moves to counter these developments. Author-pays open access is growing in strength but conservative boards do not always understand the competitive circumstances that their operating staff bring to their attention.

In open access journal publishing, more submissions + more acceptances + lower or different forms of payment + increased use + varying levels of peer review translate (one hopes) into a high impact factor, which equals success. Volume (of content and use) is a key driver of success, but money is not absent from the process. Ultimately, it cannot be.

Another excerpt, further along:

Kent [Anderson] recently re-posted a round-up of “10 business models that rocked 2010”. Most of the models involved first building audiences and then working levers to generate financial wins (a re-ordering of old-school models in which a product was produced and marketed to audiences that were built over time, through sales).

The mechanisms for commoditization in these new-school cases included:

  • Selling customer data
  • Crowdsourcing ideas
  • Upselling high-volume audiences
  • New storefronts and cloud-based points of purchase

What strikes me as remarkable about these examples is how unremarkable the core transactions (what is being exchanged by whom) really are. The tools and re-ordering are new, but the commercial exchange of goods is very basic, ancient even. The more things change, the more they remain the same.

One of the big overarching differences is the reversal of the ordering sequence. Rather than “if you build it, they will come,” these models conform to “if you get them, you can build it (and ultimately sell it).”

Read the entire post on TSK.