Thomson Reuters in the Clouds

Almost 10 years ago, Thomson Reuters embarked on the development of their Elektron Data Platform, selling it as a way to become a faster data provider and get closer to their customers.  As part of this effort they built data centers around the world to support those objectives, and the effort was highly successful.  That said, the pace at which technology now advances is encouraging the company to place their financial data in the cloud, hosted not on their own infrastructure but on Amazon Web Services (AWS).

Financial data is heavily transacted, and Thomson Reuters counts as clients most financial companies of any consequence.   As a result of this move to AWS, those clients will gain greater flexibility in how they use the data and in how they develop new products based on it.   Not only are clients freed from building and maintaining their own data centers (though they can if they want to), but they can build their own applications on AWS, which in turn allows them to be more flexible in how they service their customers.   The model Thomson Reuters is establishing could be revolutionary in the way users manage financial data and serve their customers.

“The enhancement to the Elektron Data Platform will initially provide access to real-time data on the secure and scalable Amazon Web Service (AWS) Cloud in North America, with plans to expand to Europe and Asia later this year. With the cloud API, data can be consumed natively on AWS, directed to applications based in other cloud environments, or to an on-premise environment.

As a simplified, conflated real-time service, the real-time in the cloud service can power up to three client applications at three updates per second across 50,000 instruments at the same time, which can be selected from the full universe of over 70 million instruments covered by the Elektron Data Platform.”
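For illustration, here is a minimal sketch of what consuming such a feed might look like from the client side, assuming a hypothetical WebSocket-style JSON endpoint. The URL, subscription message and field names are placeholders of my own, not the actual Elektron or AWS interface, which also requires authentication:

```python
# A minimal sketch of subscribing to a conflated real-time feed over a
# WebSocket-style JSON API. Endpoint, message shapes and instrument
# codes are hypothetical placeholders, not the actual Elektron/AWS API.
import asyncio
import json

import websockets  # pip install websockets

FEED_URL = "wss://feed.example.com/streaming"   # hypothetical endpoint
INSTRUMENTS = ["EUR=", "JPY=", "GBP="]          # RIC-style codes, illustrative

async def consume_feed() -> None:
    async with websockets.connect(FEED_URL) as ws:
        # Subscribe to a small subset of the 70-million-instrument universe.
        await ws.send(json.dumps({"action": "subscribe",
                                  "instruments": INSTRUMENTS}))
        # A conflated feed delivers periodic snapshots (here, up to three
        # updates per second) rather than every individual tick.
        async for message in ws:
            update = json.loads(message)
            print(update.get("instrument"), update.get("fields"))

asyncio.run(consume_feed())
```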

No less important in this announcement is that, by using AWS, Thomson Reuters is buying into a set of standards and protocols that will encourage application development, experimentation and, likely, broader usage.  This will lower the barriers to entry for many existing and new customers.

As the sheer amount of data increases and complexity grows, Thomson Reuters has taken the view that making data accessible can reduce that complexity and help companies focus more on delivering analytics, machine-learning applications and other innovations.   Enabling this without a cumbersome back-end technical architecture is a strategy all data managers will begin to execute.

(Press Release)

Where Art Thou Growth?

A panel of experts provides some insight on how publishers can find growth opportunities to expand their businesses.

As a panelist in the final session at the Publishers’ Forum in Berlin stated, “it is easy to go mad if you allow yourself to be consumed by all the threats and opportunities to publishing and it is far better to focus on the opportunities.”  And for these panelists, speaking on the challenge growth poses, there are plenty of opportunities to contemplate.  The predominant theme of the conference’s discussions was that ‘change is here’ and publishers can no longer choose to ignore it.  This panel on innovation and growth was moderated by David Worlock, with panelists Fionnuala Duggan from Informa plc, Joerg Rheinboldt from Axel Springer and Joseph Evans from Enders Analysis.

As publishers strategize their growth plans, Duggan noted, they are beginning to experiment with new models and market-entry opportunities.  One company discussed at the forum was the education company Alison, which brought a ‘freemium’ model to the self-directed, life-long learning market.  Alison’s model requires great scale, but it is a global business and the company is moving into areas with large populations of ambitious, underserved people, such as Nigeria.   Cengage has also challenged traditional pricing for educational materials with the launch of its content subscription model, which has upended textbook pricing and may well become an industry standard.  In seeking growth, business-model innovation can have a profound impact on your business.

Read the entire post here.

Corporate Data Strategy and The Chief Data Officer

Are you managing your data as a corporate asset? Is data – customer, product, user/transaction – even acknowledged by senior management? Responsibility for data within an organization reflects its importance; so, who manages your data?

Few companies recognize the tangible value of the data their organizations produce. Some data, such as product metadata, are seen as problematic necessities that support the sale of the company’s products; but management of much of the rest (such as the information generated as a customer passes through the operations of the business) is often ad hoc and creates operational headaches rather than usable business intelligence. Yet a few data-aware companies are starting to understand the value of the data their businesses generate and are creating specific strategies to manage it.

Establishing an environment in which a corporate data strategy can flourish is not an inconsequential task. It requires strong, active senior-level sponsorship, a financial commitment and adoption of change-management principles to rethink how business operations manage and control internal data.   Without CEO-level support, a uniform data-strategy program will never take off because inertia, internal politics and/or self-interest will conspire to undermine any effort. Which raises a question: “Why adopt a corporate data strategy program?”

In simple terms, managing proprietary data more effectively can help a company grow revenue, reduce expenses and improve operational activities (such as customer support). In years past, company data may have seemed meaningless insofar as businesses did not, or could not, collect business information in an organized or coordinated manner. Corporate data warehouses, data stores and similar infrastructure are now commonplace and, coupled with access to much more transaction information (from web traffic to consumer purchase data), these technological improvements have created environments where the benefits of data become tangible. In data-aware businesses, employees know where to look for the right data, are able to source and search it effectively and are often compensated for managing it well.

Recognizing the potential value of data is a critical first step in establishing a data strategy, and an increasing number of companies are building on this recognition to create a corporate data strategy function.

Businesses embarking on a data-asset program will only do so successfully if the CEO assigns responsibility and accountability to a Chief Data Officer. This is a new management role, not an addition to an existing manager’s responsibilities (such as the head of marketing or information technology). To be successful, the position carries responsibility for organizing, aggregating and managing the organization’s corporate data so as to improve communications with supply-chain partners, customers and internal data users.

Impediments to implementing a corporate data strategy may include internal politics, inertia and a lack of commitment, all of which must be overcome by unequivocal support from the CEO. Business fundamentals should drive the initiative so that its expected benefits are captured explicitly. Those metrics might include revenue goals, expense savings, return on investment and other, narrower measures. In addition, operating procedures that define data policies and responsibilities should be established early in the project so that corporate ‘behavior’ can be articulated without room for misinterpretation or self-interested interpretation.

Formulating a three-year strategic plan in support of this initiative should be considered a basic requirement that will establish clear objectives and goals. In addition, managing expectations for what is likely to be a complex initiative will be vital. Planning and then delivering will enable the program to build on iterative successes. Included in this plan will be a cohesive communication program to ensure the organization is routinely made aware of objectives, timing and achievements.

In general terms, there are likely to be four significant elements to this plan: (1) the identification and description of the existing data sources within the organization; (2) the development of data models supporting both the individual businesses and the corporate entity; (3) the sourcing of the technology and tools needed to execute the program to best effect; and, finally, (4) a progressive plan to consolidate data and responsibility into a single entity. Surrounding this effort would be the implementation of policies and procedures governing how each stakeholder in the process interacts with the others.
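As a minimal sketch of element (1), a data-source inventory can start as simply as one structured catalog record per source. The field names below are illustrative assumptions, not a prescribed standard:

```python
# A minimal sketch of a data-source catalog record for element (1).
# Field names are illustrative assumptions, not a prescribed standard.
from dataclasses import dataclass, field

@dataclass
class DataSource:
    name: str                    # e.g. "order_history"
    owner: str                   # accountable role or business unit
    system: str                  # where the data lives (ERP, CRM, web logs)
    description: str             # what the data represents
    consumers: list[str] = field(default_factory=list)  # who uses it today

catalog = [
    DataSource(
        name="order_history",
        owner="Head of Operations",
        system="ERP",
        description="Customer orders, line items and fulfillment status",
        consumers=["finance", "customer support"],
    ),
]

# Even this much makes ad-hoc data discoverable and assigns accountability.
for source in catalog:
    print(f"{source.name}: owned by {source.owner}, held in {source.system}")
```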

While this effort may appear to have more relevance for very large companies, all companies should be able to generate value from the data their businesses produce. At larger companies the problems will be more complex and challenging but, in smaller companies, the opportunities may be more immediate and the implementation challenges more manageable. Importantly, as more of our business relationships assume a data component, data becomes integral to the way business itself is conducted. Big or small, establishing a data strategy with CEO-level sponsorship should become an important element of corporate strategy.

The following are the next articles in the series:

  1. Setting the Data Strategy Agenda
  2. Corporate Data Program: Where to Start?
  3. Strategically Managing Data for Long Term Benefit

Predictions for 2016: Education, China, Platforms and Blockchain

For many years now I’ve been putting my thoughts about the future of media and publishing in writing.  Here are my predictions for the coming year.

2016 Predictions:

Education publishing may well see a lot of turmoil during 2016.   At Houghton Mifflin, CEO Linda Zecher has continued to make changes to her organization and executive team, while at Cengage Michael Hansen’s team is now well bedded in.  In both cases, the companies are focused on investing in digital products and distribution, which they couldn’t do while their businesses were under considerable financial constraint prior to refinancing.   Where change will really be evident is at Pearson, Wiley, Scholastic and Macmillan.   Given the share-price slides at both Wiley and Pearson, I expect some restructuring is inevitable at both companies.   Pearson has already announced significant headcount reductions and has sold off most of its ‘non-core’ operations.  Pearson’s share price is at a ten-year low, and any long-term shareholder must be wondering what happened to the return on the asset sales and education-company purchases made over the past 10 years.   At the current price, the company must be a target for private equity.  Perhaps even Bertelsmann will take a close look at the company in collaboration with a PE firm.

Similarly, at Wiley there is an argument that the education division is not big enough to be a “real” player against the larger companies.   That may have been fine when the business as a whole was running well; now, however, the business is fighting a general market slowdown and internal operational issues, all of which are reflected in its results.   Look for an announcement in 2016 that Wiley is exploring ‘strategic options’ for parts of its business.   It is also possible that Scholastic will consider similar options for its education business, and Macmillan could look to pick up more assets to build the scale of its education textbook business.

The expansion of China.  In years past I’ve predicted that a Chinese publisher would make a significant purchase of a US or European academic/professional publisher, but that has yet to happen.  Still, there have been modest investments by Chinese publishers over the past few years, and the Chinese publishing industry has begun to raise its international profile at BookExpo, LBF and elsewhere.  I think this shows increasing confidence (which may have been lacking five years ago), and that makes expansion into western markets a probability.  In addition, there is a recognition that the domestic Chinese publishing market is significant, both in size and reputation, and that it presents Chinese publishers with international expansion opportunities that were not appreciated five years ago.  This developing strength will also help propel Chinese publishers toward global expansion.

And, just this week, a Chinese consortium announced it was bidding for Opera, the Norwegian web-browser company.  While this deal is not directly in our market, it is indicative of the intention of Chinese investors to expand into the media market in a big way.  (Opera actually plays a larger role in content distribution than may be apparent.)

Purposely open platforms will become a strategic imperative for all CTOs looking at new content management options in the coming years.  The launch of Facebook, Apple News and other large distribution networks will actually convince more content owners that their content repositories and distribution networks need to be built with open-source, non-proprietary tools and retain open APIs, so that linking and third-party application development can be encouraged and fostered.   While the entry of the larger players is important, it will not diminish the need for individual publishers (and/or aggregators) to maintain their own market presence.  What becomes more important is that the platforms on which these are built are true platforms, which the developer can upgrade frequently without disruption or added cost.  In addition, development and third-party app “tiers” sit on top of this base platform to enable extensions and ‘bespoke’ applications; these can be built by the software provider, the client publisher or third-party developers.  The third-party development capability will become a marketplace for applications, similar to the developer community salesforce.com has established.   These criteria will become critical entry points for any technology provider presenting a solution to education, academic and scholarly publishers from this point forward (if they aren’t already).
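As a sketch of that tiered idea, assuming a hypothetical registry-style API (no vendor’s actual platform is implied), the base platform exposes one stable extension point and third-party applications register against it without touching core code:

```python
# A minimal sketch of a platform with a third-party app "tier": the base
# platform exposes one stable extension point and apps register against
# it without modifying the core. All names are illustrative assumptions.
from typing import Callable

class ContentPlatform:
    def __init__(self) -> None:
        self._apps: dict[str, Callable[[dict], dict]] = {}

    def register_app(self, name: str, handler: Callable[[dict], dict]) -> None:
        """Stable API through which third parties extend the platform."""
        self._apps[name] = handler

    def publish(self, content: dict) -> dict:
        # Core behavior is untouched; registered apps enrich the content.
        for handler in self._apps.values():
            content = handler(content)
        return content

platform = ContentPlatform()
platform.register_app(
    "related-links",
    lambda c: {**c, "links": ["https://example.org/related"]},
)
print(platform.publish({"title": "Sample article"}))
```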

The growth of corporate communication platforms is another prediction I’ve made in years past.  It hasn’t yet become prevalent; however, I believe virtually all corporations and businesses are becoming publishers to some degree.   Accelerating this are the availability of the necessary tools and the business imperative for companies to manage their internal and external content more effectively.   I recently met an ex-colleague who has developed a content tool that enables a company to host its HR and policies-and-procedures manuals as a central service.  The platform offers editing features, so not only is the content updated daily, but employees are empowered to suggest improvements to procedures and safety practices, which can then be rolled out immediately to other offices.  A global retailer is now testing this tool across its business.   Similarly, many businesses could significantly improve communication with external constituencies by adopting the same practices publishers have long employed with their subscribers, such as content platforms and access and control features.

Growth of licensing revenues:  CCC has been accelerating its expansion of overseas activities, which underscores the opportunities for publishers outside the US marketplace.   Most publishers are still focused on the form of their content but, increasingly, form will matter less and less (the aforementioned Facebook and Apple News sites are instructive on this point).  Publishers that provide flexible content and make it available through as many channels as possible will increasingly see that flexibility drive their revenues.   Licensing fees are becoming a very important source of revenue for publishers, and if your revenues in this area haven’t increased more than 20% over the past three years, you may want to rethink your policies.   Undoubtedly, licensed content will become one of a publisher’s main sources of revenue in the coming years.  This will have implications across businesses, especially for systems and accounting processes.

Application of Blockchain: And, speaking of copyright, expect to see Blockchain applied to intellectual property rights.  As you know, Blockchain is the underlying foundation of Bitcoin and, as such, its application to the protection and distribution of intellectual property will be another very interesting use.   Each step in a Blockchain transaction is protected by tamper-evident cryptography, which is what supports Bitcoin as a legitimate financial transaction service.   Blockchain is being considered for several other applications, and media is one of them.

Blockchain can be used to facilitate the transfer of intellectual property from one owner to another.  Bitcoins are ‘tokens’ that represent money and are exchanged on the Blockchain network, but there is no reason a ‘token’ couldn’t represent some other specific item of value, such as a book, an article or a business case.  Once a transaction occurs, the user is supplied with a unique key for accessing the content.  If the user subsequently wants to sell or lend the item, they pass their unique key to the next person.  This process eliminates the ‘residual copy’ issue that arises when someone tries to sell a second-hand e-file.
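As a toy illustration of that token idea, and emphatically not Bitcoin’s actual protocol or any real rights system, the sketch below hash-chains ownership transfers so that any tampering with the history is detectable:

```python
# A toy sketch of a hash-chained ownership ledger for a content "token".
# This only illustrates the idea; it is not Bitcoin's actual protocol
# or any real rights-management system.
import hashlib
import json

def transfer(ledger: list[dict], token_id: str, new_owner: str) -> None:
    # Each record points at the hash of the previous record in the chain.
    prev_hash = ledger[-1]["hash"] if ledger else "genesis"
    record = {"token": token_id, "owner": new_owner, "prev": prev_hash}
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    ledger.append(record)

def verify(ledger: list[dict]) -> bool:
    """Recompute every hash; any altered record breaks the chain."""
    prev_hash = "genesis"
    for record in ledger:
        body = {k: record[k] for k in ("token", "owner", "prev")}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if record["prev"] != prev_hash or record["hash"] != expected:
            return False
        prev_hash = record["hash"]
    return True

ledger: list[dict] = []
transfer(ledger, "isbn:978-0-00-000000-0", "alice")  # first sale
transfer(ledger, "isbn:978-0-00-000000-0", "bob")    # resale: key passes on
print(verify(ledger))  # True; editing any earlier record returns False
```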

Ultimately, a network of “bitRights” ™ could represent a universal content repository or marketplace where rights and content could be exchanged, bought, traded and sold.  In addition, this aggregation would generate significant user data and analytics to inform future pricing, content and topic areas, distribution models and a host of other decisions that currently get lost in today’s very inefficient rights and copyright clearance process.   Recently, Ascribe received $2 million in seed capital to build a Blockchain product for artwork.

Open access for federally funded research will clear Congress in 2016.   In recent years, the Fair Access to Science & Technology Research Act (FASTR) has failed to pass Congress due to opposition from publishers and others.  FASTR would require any federal agency that provides more than $100 million in grants (which, let’s face it, is a huge hurdle) to adopt an open-access policy.   Coupled with this will be more excitement and activity around the Obama Administration’s open data initiative.  Either way, there will be much more happening in 2016 with open access to government information.   App developers and non-profit foundations are working together to drive better access to this type of information; I recently saw a demo from CivicHall, which is doing just that for several cities already.

As always, I expect the coming year will be another exciting one with, I hope, the above trends occurring but almost certainly many other new and interesting things as well.

Michael Cairns has served as CEO and President of several technology and content-centric businesses supporting global media publishers, retailers and service providers.  He can be reached at michael.cairns@outlook.com and is interested in discussing new business opportunities for executive management and/or board and advisory positions.

The Curator and the Docent

Walking around a vast museum can be interesting and sometimes serendipitous, but it is often an incomplete experience. Items are organized in specific groups, yet not always in a manner that encourages exploration of the most important pieces. Presented with a gallery full of amphorae, a visitor on their own, without a guide, can find it difficult to recognize the single important item. Surfing the web for information and knowledge offers a similar experience: access and proximity are no guarantee you will happen upon relevance.

Museums and libraries are good proxies for the concept of “curation,” which we’re hearing a lot about at the moment. Private equity (for one) has found its next buzzword, and funding vultures are lacing their presentations with references to ‘curation’ in an effort to win financial support for their new business ideas. But curation is an old concept: television networks, newspapers, magazines, journals and other media have all practiced a form of content curation for as long as they have existed. We’ve only recently latched onto the idea of curation as though it were something new. The need for curation in the old media world wasn’t as obvious as it is on the internet because, on the web, ‘everything carries the same weight’ and the average user has difficulty discerning good content from bad. Indeed, as content on the web exploded over the past fifteen years, users accepted the “good enough” concept – free content was plentiful – and were content to ‘satisfice’, either knowingly or obliviously. User behavior and expectations are changing, and investors are now chasing businesses that profess to actively curate content and communities of interest.

In recent years, content curation has emerged out of the wild, wild west of ‘mere’ content. Sites such as The Huffington Post, Red State and Politico all represent new attempts to build audiences around curated content. While they appear to be successful, other sites (such as Associated Content and Demand Media) are contributing to the morass of filler content that can plague the web user’s experience. The buzzword ‘curation’ does carry some logic: as the sheer amount of information and content grows, consumers seek help parsing the good from the bad. And that’s where curation comes in.

The amount of content available to consumers – much of it free of charge, but scattered across thousands of websites – is growing rapidly every day. At the same time, consumers are increasingly doing independent research, attempting to source on their own the important information that supports their increasingly complicated lives. Healthcare, finances, education and leisure activities represent a small sample of the topics on which consumers look for accuracy and relevance, yet encounter an immense sea of specious or outdated content. In many ways the web, in its entirety, is the new dictionary, directory or reference encyclopedia, but users with specific interests are beginning to understand they need to spend as much time validating what they find as they do consuming it. In the old days, it was as simple as pulling the volume off the shelf; while the web offers a depth and breadth of content that far outstrips those old references, finding content of similar veracity can be a challenge.

For the past two years, I worked on a project with Louis Borders at Mywire.com to build a curated news and information service we called Week’sBest. For a variety of reasons we put the project on hold in February, but the concept was simple: identify experts who can curate content on a range of specific topics, and build a community of interested subscribers around that content. Our model was to find expert ‘content producers’ with unique knowledge and understanding of a specific topic who would filter content from across the web relevant to their area of expertise. Mywire.com built an editorial tool to make this process almost routine by pre-selecting topic-specific content from both brand-name sources and the wider web. Our experts – the content producers – logged on each day and selected from this pre-sorted list only the items they considered the best content. Consumers interested in each topic subscribed to a free weekly email digest of the selected material. Our revenue model was based on converting a subset of our free email subscribers into paid subscribers who would gain access to high-quality content, such as content from Oxford University Press.

While we were unable to execute as we expected, we did gain validation of the concept from both the publishing and private equity communities. Publishers, whom we were courting to be our ‘content experts’, liked the low cost of entry for their participation and the editorial platform we had invested in. The equity community liked the ‘curation’ model, the people involved in the project and the investment Mywire had made in the platform. However, we suffered from the ‘prove it’ syndrome: both publishers and equity partners wanted to see the model work before they committed, and we ran out of time and resources. Mywire.com continues to invest in other curation-type models.

I remain convinced that applying technology to the selection of useful, valid and appropriate content is only part of the solution. At Mywire, we used a text-mining tool as part of the editorial process, and for simple news items – which are increasingly generic – placing content into subject and topic groupings was relatively easy. The process wasn’t perfect and required frequent fine-tuning; the tools are improving, but human intervention is still required. Earlier this month we learned that even Google is applying some human filtering to its news site.
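As a rough sketch of that kind of automated pre-sorting, here is a generic TF-IDF plus nearest-centroid classifier; Mywire’s actual text-mining tool was proprietary, so this only illustrates the general approach, with made-up example data:

```python
# A rough sketch of pre-sorting incoming items into topic groups with a
# generic TF-IDF + nearest-centroid classifier. This only illustrates
# the approach; Mywire's actual text-mining tool was proprietary.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import NearestCentroid
from sklearn.pipeline import make_pipeline

# A tiny labeled seed set; a real system needs far more examples.
train_texts = [
    "new sunscreen guidelines for skin protection",
    "retirement savings and contribution limits",
    "testing standards debate in public schools",
]
train_topics = ["health", "finance", "education"]

classifier = make_pipeline(TfidfVectorizer(), NearestCentroid())
classifier.fit(train_texts, train_topics)

# New items get a suggested topic; a human editor then makes the final
# "best of" selection from each pre-sorted list.
print(classifier.predict(["debate over testing in public schools"]))
# -> ['education']
```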

There is a real debate over whether consumers will pay for genuine expertise and knowledge. I believe they will, just as they paid for specialist magazines, journals, cable channels and similar media in the analog era. The atomization of content has complicated matters by taking the proverbial covers off the traditional magazine and its print limitations. While a reader or subscriber will buy into the expertise of ‘Glamour’ or ‘Men’s Health’, they now expect all the important and relevant content, not just the content prepared by the magazine’s writers. After all, users face a low hurdle in searching for content on their own, and it is silly to ignore that ability. Acting as a ‘content producer’, the editors of ‘Glamour’ should be able to provide their paying subscribers with a collective representation of all the content that is important and relevant to their readers, even content produced by Glamour’s competitors. This is an important service, and it doesn’t limit Glamour’s ability to produce its own content; rather, it enhances it, because the editors can see in detail what interests their subscribers and produce applicable content to match.

In the above example, generic news is never going to be the basis for paid subscriptions. Take, for instance, a hyped news story claiming that suntan lotion causes skin cancer. In the Glamour example, that news story would always remain in the free section of the site; available to subscribers, however, would be a curated selection of in-depth content, including reference material added to over time, with commentary and discussion from the magazine’s ‘expert’ editors and advisors about the real issues around sun protection products. With a brand such as Glamour, the number of expert-curated topics made available to subscribers could easily exceed fifty and would be likely to grow over time. Strongly associated with this approach would be the development of communities around each topic, leading in turn to additional business opportunities such as ad programs, events and special publishing programs.

Consumer interest across a wide variety of subjects and topics continues unabated, and the internet has only facilitated that interest, even though undifferentiated content has reduced or marginalized our expectations. Consumers are increasingly smart about the content they consume, and they continue to impress with their ability to seek out and absorb what, in the analog world, was considered too “advanced” for their understanding. There was always an arbitrary wall between “professional or academic” content and consumer content; increasingly, consumers are making it clear that they want to decide for themselves whether particular content is too advanced for their comprehension or enjoyment.

Recently, as I wandered around a museum of overwhelming breadth and depth, I was lucky to be guided in my travels by a professional. When she introduced herself, she used the term ‘docent’ to describe her function. A docent is a ‘knowledgeable guide’, and the function seems to me to perfectly complement the process of curation. In an online world where more and more content appears to “carry the same weight,” we will look to – and pay for – the combination of curator and docent, sometimes the same person or entity, who can organize and manage a range of content and also engage with the user so they gain insight and meaning from the material. At Mywire.com, we intentionally approached branded media companies because they were recognized as experts in their segments. These are the companies that should be able to build revenue models around the curation of content, offering subscribers a materially different experience from a Google search that delivers generic news and semi-relevant content.

Michael Cairns has served as CEO and President of several technology and content-centric businesses supporting global media publishers, retailers and service providers.  He can be reached at michael.cairns@outlook.com and is interested in discussing new business opportunities for executive management and/or board and advisory positions. He blogs at personanondata.com.