Posts Tagged ‘semantic discovery’

Semantic Tech Lends A Hand To Thanksgiving Holiday Sales

Photo courtesy: https://www.flickr.com/photos/119886413@N05/


Retailers are pushing holiday shopping deals earlier and earlier each year, but for many consumers the Thanksgiving weekend still signals the official start of the gift-buying season. With that in mind, we present some thoughts on how the use of semantic technology may impact your holiday shopping this year.

  • Pinterest has gained a reputation as the go-to social network for online retailers that want to drive traffic and sales. Shoppers get an advantage, too, as more e-tailers deploy Rich Pins, a feature made generally available late last year, for their products, using either schema.org or Open Graph markup. Product Rich Pins, which are refreshed daily, now surface extra information such as real-time pricing, availability and where-to-buy details right on the Pin itself. And anyone who has pinned a product of interest will get a notification when its price drops. Overstock, Target, and Shopify shops are just some of the sites that take advantage of the feature. Given that 75 percent of Pinterest’s traffic comes from mobile devices, it’s nice that a recent update to its iPhone app – with Android and iPad versions on the way – also makes Pin information and images bigger on small screens.

 

  • Best Buy was one of the earliest retailers to look to semantic web technologies to help out shoppers (and its business), adding meaning to product data via RDFa and leveraging ontologies such as GoodRelations, FOAF and GEO. Today, the company’s web properties use microdata and schema.org, continually deepening shopper engagement with additional data elements, such as in-stock status and store location information for products in search results, as shown in a presentation given this summer by Jay Myers, Best Buy’s Emerging Digital Platforms Product Manager, at Search Marketing Expo.

 

  • Retailers such as Urban Decay, Crate&Barrel, Golfsmith and Kate Somerville are using Edgecase’s Adaptive Experience platform, generating user-friendly taxonomies from the data they already have to drive a better customer navigation and discovery experience. The system relies on both machine learning and human curation to let online buyers shop on their terms, using the natural language they want to employ (see our story here for more details).

 

  • Walmart, through its Walmart Labs arm, has been steadily driving semantic technology deeper into its customer shopping experience. Last year, for example, Walmart Labs senior director Abhishek Gattani discussed at the Semantic Technology and Business conference capabilities it has developed, such as semantic algorithms for color detection, which let the company rank apparel by the color a shopper is looking for and show items in colors close to red when red itself is not available, as well as query categorization that directs people to the department really most interesting to them. This year, Walmart Labs added talent in semantic search and data analytics by acquiring Adchemy, along with Luvocracy, an online community that enables the social shopping experience, from discovery of products recommended by people a user trusts through to commerce itself. Search and product discovery are also at the heart of new features it’s rolling out to improve the in-store experience, via mobile apps such as Search My Store, which shows shoppers exactly where the items on their list are located in a given store.
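The structured product markup the Pinterest and Best Buy items above rely on can be sketched in a few lines. Below is a minimal, hypothetical example of building a schema.org Product description as JSON-LD; the product name, price and URL are invented, but Product, Offer, price, priceCurrency and availability are real schema.org vocabulary of the kind Rich Pins read from a retailer’s page.

```python
import json

def product_jsonld(name, price, currency, availability, url):
    """Build a minimal schema.org Product description as JSON-LD.

    Markup like this, embedded in a product page (as JSON-LD, microdata,
    or equivalent Open Graph meta tags), is what lets services such as
    Pinterest's Rich Pins pull price and availability straight from the
    retailer's own page.
    """
    return {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "url": url,
        "offers": {
            "@type": "Offer",
            "price": price,
            "priceCurrency": currency,
            "availability": f"https://schema.org/{availability}",
        },
    }

markup = product_jsonld("Cast Iron Skillet", "29.99", "USD", "InStock",
                        "https://example.com/products/skillet")
print(json.dumps(markup, indent=2))
```

A retailer would serve this inside a `<script type="application/ld+json">` tag; crawlers then pick up the price and in-stock status without scraping the visible page.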

What’s your favorite semantically enhanced shopping experience? Share it with our readers below to streamline their holiday shopping!

 

Rovi Acquires Content Discovery and Navigation Innovator Fanhattan

November 3, 2014 – Santa Clara, Calif. — Rovi Corporation, a leading provider of advanced entertainment discovery, data analytics, and monetization solutions, announced today that it has acquired Fanhattan, a venture-backed startup, in an all-cash transaction. Fanhattan pioneers innovative ways to discover media and entertainment on any screen, from any source, through its cloud-based Fan TV branded products. Read more

NASA Turns to Charles River Analytics to Detect Volcanic Eruptions, Storms


Sara Castellanos of Biz Journals reports, “Charles River Analytics on Thursday announced a contract to develop technology for NASA that detects volcanic eruptions, storms and algae blooms from satellite imagery. The Cambridge, Mass.-based firm develops computational intelligence technology, which is used to interpret data for the purpose of improving decision-making in real-time. The NASA contract is for a system called DIPSARS, or the Discovery of Interesting Patterns and Semantic Analysis in Remote Space. The contract is valued at $125,000.” Read more

Amazon Gets An Emmy, But Semantic Discovery Wins Too

Yosi Glick, co-founder and CEO of semantic taste engine Jinni, recently wrote a post about the Technology and Engineering Emmy award that is to be given to Amazon Instant Video for its personalized recommendation algorithms.

The basis for awarding the honor, he writes, lies with Amazon’s early item-to-item collaborative filtering (CF) algorithms, which analyze consumer data to find statistical connections between items and then use those connections as the basis for recommendations. But, says Glick, the company may soon be heading toward a fundamentally different approach.
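The item-to-item CF idea described above can be sketched very simply: represent each item by the set of users who bought it, then recommend items whose buyer sets overlap most. This is a toy illustration, not Amazon’s actual algorithm; the users and products are invented.

```python
from collections import defaultdict
from math import sqrt

# Toy purchase history: user -> set of items bought (hypothetical data).
purchases = {
    "alice": {"kindle", "echo", "fire_tv"},
    "bob":   {"kindle", "echo"},
    "carol": {"kindle", "fire_tv"},
    "dave":  {"echo", "fire_tv"},
}

def item_vectors(purchases):
    """Invert the history into item -> set of users who bought it."""
    vecs = defaultdict(set)
    for user, items in purchases.items():
        for item in items:
            vecs[item].add(user)
    return vecs

def cosine(a, b):
    """Cosine similarity between two buyer sets (binary vectors)."""
    return len(a & b) / sqrt(len(a) * len(b)) if a and b else 0.0

def related_items(item, purchases, top_n=2):
    """Rank other items by how similar their buyer sets are to `item`'s."""
    vecs = item_vectors(purchases)
    scores = [(other, cosine(vecs[item], vecs[other]))
              for other in vecs if other != item]
    return sorted(scores, key=lambda s: -s[1])[:top_n]

print(related_items("kindle", purchases))
```

The statistical connections are purely behavioral (“people who bought X also bought Y”), which is exactly the property Glick argues breaks down for movies and TV.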

“Amazon,” Glick explains, “is using the Emmy award to flaunt its latest Video Finder service, that seems to leave CF behind and embrace a new semantic approach to recommendation.”

Amazon is embracing semantics for its video content because it realizes that video is different from regular consumer items. TV and movies are “entertainment that is consumed based on personal tastes and our particular mood at the moment. The types of content each of us enjoy is not based on what ‘other people have also watched’, rather it has to do with the plots, moods, style and pace,” he writes. “So content has to be described and discovered the same way we choose and experience it.”

Categories in Amazon’s Video Finder service include classifications that describe the mood, plot, style and pace of titles — meaningful classifications that Glick says are the basis for semantic discovery. You can read the entire piece here.

 

 

PureDiscovery Combines Forces with LexisNexis LAW PreDiscovery

PureDiscovery has announced “that it is combining the power of its semantic discovery technology with LexisNexis® LAW PreDiscovery™, the premier solution for electronic discovery processing and imaging. The integration between the two products will help litigation discovery professionals find the most relevant documents faster than ever before by integrating PureDiscovery’s LegalSuite product with LAW PreDiscovery. PureDiscovery’s semantic technology utilizes the company’s innovations in machine learning to produce highly relevant semantic matches.” Read more

Metadata and the Sensor Web

A new article has been published in the International Journal of Digital Earth entitled Metadata Requirements Analysis for the Emerging Sensor Web. According to the abstract, “The Sensor Web has emerged from Earth Science research with the development of Web technology, to achieve process automation, sensor interoperation, and service synergy. These promises require the discovery of the right sensor at the right time and the right location with the right quality. Metadata, for sensor, platform, and data, are crucial for achieving such goals. However, analysis and practical use of these metadata reveals that the metadata and their associations are not applicable or suitable for the Sensor Web.” Read more

Jinni Offers Taste & Mood Based Discovery Engine for Film & TV

A recent article reports, “mgMEDIA, developers of multi-device digital content distribution platform Open4Content and Jinni, innovators of the only taste-and-mood based discovery engine for movies and TV shows, today announced a new strategic partnership to deliver a fun, intuitive and highly accurate Jinni semantic discovery experience. The Jinni discovery engine will be built in to the core of mgMEDIA’s over-the-top platform, Open4Content. Users will enjoy finding and viewing high-quality entertainment that suits their personal tastes and moods anytime, anywhere.” Read more