Autonomy Interwoven Spices Up DAM With Virage MediaBin 7

By David Roe

Autonomy Interwoven Releases Virage MediaBin 7 DAM System

It's been a busy few months for Autonomy Interwoven. Last week, it unveiled a new solution that provides analytics for social media.

This week, it's MediaBin's turn -- the company's Digital Asset Management (DAM) solution -- to get a once-over, in the form of the meaning-based Virage MediaBin 7, a solution that automatically retrieves, analyzes and processes audio, video and graphic content.

As an upgrade to the existing DAM solution, this version of MediaBin works with the original ‘master’ assets to generate and deliver variations and derivatives, making them available to geographically dispersed teams.

It does this using a high-volume media processing engine that can produce transformed versions of assets on demand almost immediately.

Automatic Extraction of Metadata

The difference between Virage MediaBin 7 and other DAM systems is that the system itself derives an understanding of audio, video and images by leveraging Autonomy’s Meaning Based Computing (MBC) technology.

By automatically identifying and extracting key concepts in the rich media assets themselves, it can deliver -- amongst other functions -- advanced analytics, automatic categorization, dynamic content associations and business processes.

As a result, assets can now be viewed in context with other content under management -- within the extended enterprise, across social networks and in internet search.


Deep Meaning In Digital Assets

Essentially, what MediaBin 7 does is find meaning deep within video and image files and extract highly detailed metadata from them using Autonomy's Deep Video Indexing technology.

This is done during the ingestion process and dispenses with costly manual meta-tagging, which, apart from the cost, cannot provide as much detail as automated indexing can.

As a result, a central library of approved digital assets is made available to users who can access it at will and initiate searches through rich media files that produce results with pinpoint accuracy.

Highlights of the new solution include:

  • A single environment that enables automatic and user-defined standards-based tagging that dramatically improves the speed, quality and storage of rich media content.
  • Use of Autonomy’s Meaning Based Computing to automate time-consuming manual processes like metadata tagging and categorization.
  • Use of next-generation video and speech analytics technology that enables cross-referencing with other kinds of assets.
  • Easy content re-use with secure access to content, enabling enterprise-wide collaboration on- and off-site.
  • Real-time transformation of assets, which effectively does away with the need for video or image versioning.
  • Easy workflow management with extensive drag-and-drop capabilities for processes like ingestion, retrieval, emailing or reviewing of assets.

With the steady increase in the use of rich media, coupled with the proliferation of social media, legacy solutions have been finding it increasingly difficult to manage the growth of unstructured information.

While this solution may, as Autonomy says, be the first meaning-based rich media management/DAM system, it undoubtedly will not be the last as competition increases and companies search for a bigger bang for their buck.