Posts Tagged ‘semantic discovery’

NASA Turns to Charles River Analytics to Detect Volcanic Eruptions, Storms

Sara Castellanos of Biz Journals reports, "Charles River Analytics on Thursday announced a contract to develop technology for NASA that detects volcanic eruptions, storms and algae blooms from satellite imagery. The Cambridge, Mass.-based firm develops computational intelligence technology, which is used to interpret data for the purpose of improving decision-making in real-time. The NASA contract is for a system called DIPSARS, or the Discovery of Interesting Patterns and Semantic Analysis in Remote Space. The contract is valued at $125,000." Read more

Amazon Gets An Emmy, But Semantic Discovery Wins Too

Yosi Glick, co-founder and CEO of semantic taste engine Jinni, recently wrote a post about the technology and engineering Emmy award that is to be given to Amazon's Instant Video for its personalized recommendation algorithms.

The basis for awarding the honor, he writes, lies with Amazon's early item-to-item collaborative filtering (CF) algorithms, which analyze consumer data to find statistical connections between items and then use those connections as the basis for recommendations. But, says Glick, the company may soon be heading toward a fundamentally different approach.
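The item-to-item approach described above can be illustrated with a minimal sketch: count how often pairs of items co-occur in users' histories, normalize the counts into a cosine similarity, and recommend the nearest neighbors of a given item. The data and names here are illustrative assumptions, not Amazon's actual algorithm or catalog.

```python
from collections import defaultdict
from math import sqrt

# Toy viewing/purchase history: user -> set of items (illustrative only).
history = {
    "u1": {"A", "B", "C"},
    "u2": {"A", "B"},
    "u3": {"B", "C"},
    "u4": {"A", "D"},
}

def item_similarity(history):
    """Cosine similarity between items, based on co-occurrence across users."""
    count = defaultdict(int)   # how many users have each item
    co = defaultdict(int)      # how many users have each item pair
    for items in history.values():
        for i in items:
            count[i] += 1
        for i in items:
            for j in items:
                if i < j:
                    co[(i, j)] += 1
    sim = defaultdict(dict)
    for (i, j), c in co.items():
        s = c / sqrt(count[i] * count[j])
        sim[i][j] = sim[j][i] = s
    return sim

def recommend(item, sim, k=2):
    """The k items most statistically connected to `item`
    ("people who watched X also watched ...")."""
    neighbors = sim.get(item, {})
    return sorted(neighbors, key=neighbors.get, reverse=True)[:k]

sim = item_similarity(history)
print(recommend("A", sim))  # -> ['B', 'D']
```

Note that nothing here looks at what the items *are*; the recommendation is driven purely by behavioral co-occurrence, which is exactly the property Glick contrasts with the semantic approach below.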

“Amazon,” Glick explains, “is using the Emmy award to flaunt its latest Video Finder service, that seems to leave CF behind and embrace a new semantic approach to recommendation.”

Amazon is embracing semantics for its video content because it realizes that video is different from regular consumer items. TV and movies are "entertainment that is consumed based on personal tastes and our particular mood at the moment. The types of content each of us enjoy is not based on what 'other people have also watched', rather it has to do with the plots, moods, style and pace," he writes. "So content has to be described and discovered the same way we choose and experience it."

Categories in Amazon's Video Finder service include classifications that describe the mood, plot, style and pace of titles — meaningful classifications that Glick says are the basis for semantic discovery. You can read the entire piece here.

PureDiscovery Combines Forces with LexisNexis LAW PreDiscovery

PureDiscovery has announced “that it is combining the power of its semantic discovery technology with LexisNexis® LAW PreDiscovery™, the premier solution for electronic discovery processing and imaging. The integration between the two products will help litigation discovery professionals find the most relevant documents faster than ever before by integrating PureDiscovery’s LegalSuite product with LAW PreDiscovery. PureDiscovery’s semantic technology utilizes the company’s innovations in machine learning to produce highly relevant semantic matches.” Read more

Metadata and the Sensor Web

A new article has been published in the International Journal of Digital Earth entitled Metadata Requirements Analysis for the Emerging Sensor Web. According to the abstract, “The Sensor Web has emerged from Earth Science research with the development of Web technology, to achieve process automation, sensor interoperation, and service synergy. These promises require the discovery of the right sensor at the right time and the right location with the right quality. Metadata, for sensor, platform, and data, are crucial for achieving such goals. However, analysis and practical use of these metadata reveals that the metadata and their associations are not applicable or suitable for the Sensor Web.” Read more

Jinni Offers Taste & Mood Based Discovery Engine for Film & TV

A recent article reports, “mgMEDIA, developers of multi-device digital content distribution platform Open4Content and Jinni, innovators of the only taste-and-mood based discovery engine for movies and TV shows, today announced a new strategic partnership to deliver a fun, intuitive and highly accurate Jinni semantic discovery experience. The Jinni discovery engine will be built in to the core of mgMEDIA’s over-the-top platform, Open4Content. Users will enjoy finding and viewing high-quality entertainment that suits their personal tastes and moods anytime, anywhere.” Read more