News & Insights

Searching for Search Innovation

It’s been 20 years since Internet search was “born,” and searching the Web is still pretty much the same as it was in 1995. Dubious sources of information are not weighted as “worse” than accurate ones; if they’re “popular,” Google even weights them more favorably. Meanwhile, meaningful search within image and video content is virtually nonexistent, making the proliferation of content in those media a growing issue for the fundamental utility of the Internet.

The root of the problem is the need for metadata tagging of media files. The traditional guardians of knowledge—librarians and publishers—can’t bear the expense of doing this work because they’re focused on economic survival, not optimization. The only players in the information curation world with cash are Google and Microsoft, and they seem to be investing as little as possible to improve their core search experience.

So, who will rescue us from this situation? I’m hoping a combination of public and private initiatives led by librarians, archivists, and information professionals might hold the answer, and that the “fuel” driving the endeavor will be crowdsourcing.

I see this playing out as follows:

  • Information curators will tap into the volunteer “crowd” and emerging technology to do the essential metadata tagging required for effective search.
  • Amara and Clarify can generate keywords and transcripts for audio-video files.
  • WorkFusion has world-class crowd-based image annotation tools.
  • Prior volunteer crowdsourcing efforts have shown that millions of willing, able volunteers are available to do the work.
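To make the idea above concrete, here is a minimal sketch of what crowd-sourced metadata might look like once attached to media files, and how even a naive keyword search becomes possible over it. The field names, file names, and data are hypothetical illustrations, not any real service’s schema.

```python
# Hypothetical index of media files with crowd-contributed metadata:
# keyword tags plus a transcript excerpt for audio-video content.
media_index = [
    {"file": "lecture_042.mp4",
     "tags": ["linguistics", "phonetics", "lecture"],
     "transcript_excerpt": "today we cover vowel formants"},
    {"file": "archive_photo_17.jpg",
     "tags": ["bridge", "1930s", "construction"],
     "transcript_excerpt": ""},
]

def search(query, index):
    """Return files whose tags or transcript mention the query term."""
    q = query.lower()
    return [item["file"] for item in index
            if q in item["tags"] or q in item["transcript_excerpt"]]

print(search("phonetics", media_index))  # -> ['lecture_042.mp4']
```

Without the volunteer-supplied tags and transcripts, the video and photo above would be opaque blobs to a search engine; with them, ordinary text matching suddenly works.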

The private firms powering search can help improve their algorithms by using the appended tags and weighting sources according to expert-based ranking by accuracy and accountability. (This fits with the oft-rumored move by Wikipedia to use recognized experts in given fields as curators.)
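As a rough sketch of the re-weighting idea, a ranking function could blend an expert-assigned accuracy score into the popularity signal rather than ranking on popularity alone. The weights and scores below are invented purely for illustration.

```python
# Hypothetical search results carrying two signals:
# raw popularity and an expert-assessed accuracy score (both 0..1).
results = [
    {"url": "viral-rumor.example",   "popularity": 0.9, "accuracy": 0.20},
    {"url": "peer-reviewed.example", "popularity": 0.4, "accuracy": 0.95},
]

def rank(results, accuracy_weight=0.7):
    """Order results by a weighted mix of accuracy and popularity."""
    def score(r):
        return (accuracy_weight * r["accuracy"]
                + (1 - accuracy_weight) * r["popularity"])
    return sorted(results, key=score, reverse=True)

print(rank(results)[0]["url"])  # -> peer-reviewed.example
```

With accuracy weighted heavily, the accurate-but-less-popular source outranks the popular-but-dubious one, which is exactly the inversion the article argues today’s engines fail to make.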

This combination of manual tagging and skilled curation could be the key to delivering granular, accurate results when searching all types of media. When it comes, it will be long overdue.


Keep on top of the information industry with our ‘Data Content Best Practices’ newsletter: