Jennifer Zaino

Semantic Tech Lends A Hand To Thanksgiving Holiday Sales

Photo courtesy: https://www.flickr.com/photos/119886413@N05/

Retailers are pushing holiday shopping deals earlier and earlier each year, but for many consumers the Thanksgiving weekend still signals the official start of the gift-buying season. With that in mind, we present some thoughts on how the use of semantic technology may impact your holiday shopping this year.

  • Pinterest has gained a reputation as the go-to social network for online retailers that want to drive traffic and sales. Shoppers get an advantage, too, as more e-tailers deploy Rich Pins – a feature made available for general use late last year – for their products, using either schema.org or Open Graph markup (a minimal sketch of this kind of product markup appears after this list). Product Rich Pins, updated daily, now include extra information such as real-time pricing, availability and where to buy, right on the Pin itself. And anyone who’s pinned a product of interest will get a notification when the price drops. Overstock, Target, and Shopify shops are just some of the sites that take advantage of the feature. Given that 75 percent of Pinterest’s traffic comes from mobile devices, it’s also welcome that a recent update to its iPhone app – with Android and iPad versions on the way – makes Pin information and images bigger on small screens.

  • Best Buy was one of the earliest retailers to look to semantic web technologies to help out shoppers (and its business), adding meaning to product data via RDFa and leveraging ontologies such as GoodRelations, FOAF and GEO. Today, the company’s web properties use microdata and schema.org, and it keeps deepening shopper engagement with new data elements, such as in-stock status and store location information for products in search results, as Jay Myers, Best Buy’s Emerging Digital Platforms Product Manager, showed in a presentation at Search Marketing Expo this summer.

  • Retailers such as Urban Decay, Crate and Barrel, Golfsmith and Kate Somerville are using Edgecase’s Adaptive Experience platform, which generates user-friendly taxonomies from the data they already have to drive a better customer navigation and discovery experience. The system relies on both machine learning and human curation to let online buyers shop on their own terms, using the natural language they want to employ (see our story here for more details).

  • Walmart, through its Walmart Labs unit, has been steadily driving semantic technology further into the customer shopping experience. Last year, for example, Walmart Labs senior director Abhishek Gattani discussed at the Semantic Technology and Business conference capabilities it has developed, such as semantic algorithms for color detection – so that it can rank apparel by the color a shopper is looking for and show items in colors close to red when red itself is not available – and query categorization that directs people to the department most relevant to them. This year, Walmart Labs added talent from Adchemy when it acquired the company, bringing further expertise in semantic search and data analytics to its team, along with Luvocracy, an online community that enables social shopping – from discovery of products recommended by people a user trusts through to the purchase itself. Search and product discovery are also at the heart of new features it’s rolling out to improve the in-store experience, via mobile tools such as Search My Store, which shows shoppers exactly where the items on their list are located in a particular store.
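
To make the product markup behind several of the examples above concrete, here is a minimal sketch – not any retailer’s actual code – of schema.org Product data expressed as JSON-LD, written in TypeScript. The product, price and URLs are invented for illustration; Pinterest, Best Buy and others read or publish equivalent markup as microdata, RDFa or Open Graph tags.

```typescript
// Minimal sketch: schema.org Product metadata as JSON-LD.
// The product, price and URLs below are made up for illustration; real
// product pages embed markup like this (or microdata/RDFa/Open Graph
// equivalents) so crawlers can read price and availability directly.

interface ProductOffer {
  "@type": "Offer";
  price: string;
  priceCurrency: string;
  availability: string; // a schema.org ItemAvailability URL
  url: string;
}

interface ProductMarkup {
  "@context": "https://schema.org";
  "@type": "Product";
  name: string;
  description: string;
  offers: ProductOffer;
}

const product: ProductMarkup = {
  "@context": "https://schema.org",
  "@type": "Product",
  name: "Stainless Steel Espresso Maker", // hypothetical product
  description: "6-cup stovetop espresso maker.",
  offers: {
    "@type": "Offer",
    price: "29.99",
    priceCurrency: "USD",
    availability: "https://schema.org/InStock",
    url: "https://shop.example.com/espresso-maker",
  },
};

// On a real product page this JSON would sit inside a
// <script type="application/ld+json"> element.
console.log(JSON.stringify(product, null, 2));
```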

What’s your favorite semantically enhanced shopping experience? Share it with our readers below to streamline their holiday shopping!

Cognitum Ontology Editor Amps Up Analytical Computations and Collaborative Knowledge Editing

The end of the month should see the release of an update to Cognitum’s Fluent Editor 2014 ontology editor, which will bring with it new capabilities to further drive its usage not only in academia but also in industrial segments such as energy, pharmaceuticals and more.

Among the additions is the ability to run analytical computations over ontologies by combining the open source statistical language R with Cognitum’s Controlled Natural Language. What has been lacking when it comes to performing computations over Big Data sets, says CEO Paweł Zarzycki, is a shortcut for easily combining the semantic and numerical worlds. R is great for doing statistical analysis over huge sets of numerical data, he says, but more knowledge opens up when Cognitum’s language is brought in for qualitative analysis.

Read more

Web Components: Even Better With Semantic Markup

The W3C’s Web Components model is positioned to solve many of the problems that beset web developers today. “Developers are longing for the ability to have reusable, declarative, expressive components,” says Brian Sletten, president of software consultancy Bosatsu Consulting, Inc., and a specialist in semantic web and next-generation technologies, software architecture, API design, software development and security, and data science.

Web Components should fulfill that longing: with the Templates, Custom Elements, Shadow DOM and Imports specifications (all still drafts, and thus subject to change), developers get a way to build their web applications and elements as sets of reusable components. While most browsers don’t yet support these specifications, projects like Polymer let developers who want to take advantage of these capabilities right away build web objects and applications atop the specs today.

“With this kind of structure in place, now there is a market for people to create components that can be reused across any HTML-based application or document,” Sletten says. “There will be an explosion of people building reusable components so that you and I can use those elements and don’t have to write a ton of obnoxious JavaScript to do certain things.”

That in itself is exciting, Sletten says, but what excites him even more is the connection he made: semantic markup can be added to any web component.
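
As a rough illustration of that idea – not drawn from Sletten’s own code – here is a minimal TypeScript sketch of a hypothetical custom element whose rendered markup carries schema.org microdata, so the reusable component doubles as machine-readable data. It uses today’s Custom Elements registration API, which has evolved since the draft specifications discussed here.

```typescript
// Hypothetical <product-card> element: a reusable web component whose
// rendered output includes schema.org microdata, so the same component
// is both a UI widget and machine-readable product data.
class ProductCard extends HTMLElement {
  connectedCallback(): void {
    const name = this.getAttribute("name") ?? "Unknown product";
    const price = this.getAttribute("price") ?? "";

    // The semantic markup is rendered into the regular (light) DOM so
    // crawlers can see it; Shadow DOM, mentioned above, could be used
    // separately to encapsulate presentation.
    this.innerHTML = `
      <div itemscope itemtype="https://schema.org/Product">
        <span itemprop="name">${name}</span>
        <span itemprop="offers" itemscope itemtype="https://schema.org/Offer">
          $<span itemprop="price">${price}</span>
        </span>
      </div>
    `;
  }
}

// Registration via the current Custom Elements API.
customElements.define("product-card", ProductCard);

// Usage in a page:
//   <product-card name="Espresso Maker" price="29.99"></product-card>
```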

Read more

Google Researchers Use End-to-End Neural Network To Caption Pictures

Google researchers have announced the development of a machine-learning system that can automatically produce captions to accurately describe images in properly formed sentences the first time it sees them.

“This kind of system could eventually help visually impaired people understand pictures, provide alternate text for images in parts of the world where mobile connections are slow, and make it easier for everyone to search on Google for images,” report research scientists Oriol Vinyals, Alexander Toshev, Samy Bengio, and Dumitru Erhan in a blog post about how they’re building a neural image caption generator.

Getting there, the researchers say, involved merging recent computer vision and language models into a single jointly trained system that can directly produce a human-readable sequence of words to describe a given image. The task is no easy one, they point out: unlike image classification or object recognition on its own, their work has to account not only for the objects contained in the image, but also for how those objects relate to each other, their attributes, and the activities they are involved in.

The approach leverages an end-to-end neural network that can automatically view an image and generate a plain English description of it.

Read more

How To Improve Self-Service Support

StepOne’s first product, Contextual Care, is a contextually based recommendation engine designed to help a company’s customers serve themselves online. It leads them to the information they need in real time by applying what a company knows about a customer to predict the question he might ask, matching content to that question with machine learning algorithms, and getting smarter at both by continuously learning from customer interactions.
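
StepOne hasn’t published Contextual Care’s internals, but as a purely illustrative sketch of the “match content to the predicted question” step, the TypeScript below ranks hypothetical help articles by how many words they share with a predicted question. A real system would use learned models plus account and context signals rather than simple word overlap.

```typescript
// Illustrative only: not StepOne's algorithm. A toy "match content to
// the predicted question" step that ranks help articles by how many
// words they share with the question a customer is predicted to ask.

function tokenize(text: string): Set<string> {
  return new Set(text.toLowerCase().split(/\W+/).filter(w => w.length > 2));
}

function overlapScore(question: string, article: string): number {
  const q = tokenize(question);
  const a = tokenize(article);
  let shared = 0;
  for (const word of q) {
    if (a.has(word)) shared++;
  }
  return q.size === 0 ? 0 : shared / q.size;
}

// Hypothetical predicted question and self-service articles.
const predictedQuestion = "Why is my bill higher this month?";
const articles = [
  { title: "Understanding charges on your bill", body: "Your monthly bill can be higher because of one-time charges, plan changes, or overage fees." },
  { title: "Resetting your password", body: "Follow these steps to reset the password on your account." },
];

// Rank articles by overlap with the predicted question.
const ranked = articles
  .map(a => ({ title: a.title, score: overlapScore(predictedQuestion, a.title + " " + a.body) }))
  .sort((x, y) => y.score - x.score);

console.log(ranked);
```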

It became clear to StepOne as it deployed this product with its clients that in many cases, the clients first needed a higher-definition view of their self-service content to make the best calls for their customers. Now, it’s debuting Care Profiler, a diagnostic tool that helps clients assess the quality of their self-service support libraries and improve them to better address customer issues.

Read more

How To Really Hear The Voice of the Customer

Image courtesy: https://www.flickr.com/photos/carbonnyc/

There’s a whole lot of customer information out there, including the verbatim comments companies record as part of customer call center surveys and other voice-based interactions. At Verizon Wireless, for example, more than 190 million customers call in daily, weekly and monthly, and the sound bites they leave during after-call surveys, each a few seconds long, added up to a ton of data that wasn’t being factored into its customer analytics efforts.

“We had the information, the WAV files, but we couldn’t analyze them with the same lens and same tools” that Verizon was bringing to text commentary from its customers – emails, social media, surveys, and so on – according to Lorraine Schumacher, Director of Operations Customer Business Intelligence at Verizon, speaking during a recent webinar hosted by customer experience management vendor Clarabridge. Verizon had been using Clarabridge’s technology to monitor its various listening posts and drive strategic business decisions based on analyzing text and sentiment in social media and other sources.

Now, it saw an opportunity to transcribe those WAV files of direct customer feedback so that the information in them could be processed and analyzed to the same ends. Speech recognition and analytics vendor Voci Technologies partnered with Clarabridge to support the effort.

Read more

Smart Glasses Don’t Have Consumer Vote Yet

Got your Smart Glasses on today? If not, you’re very much not alone. According to a report published this month by Juniper Research, Smart Glasses: Consumer, Enterprise and Healthcare Strategies and Forecasts 2014-2019, smart glass shipments are “unlikely to exceed 10 million per annum until 2018.”

What’s holding back one of the early entrants in the wearables sector? The report cites “emerging privacy concerns, dismissal of the initial devices as ugly and, most importantly, questions about exactly how useful the devices are in day-to-day life. While there is an active development community for smart glasses, no developers have precise answers as to how the devices will improve the lives of consumers.”

There is enterprise interest, it notes, but because businesses are more likely to share devices among users rather than buy them in bulk for everyone, “this will result in high investment but low shipment volumes to the enterprise for the next five years.”

Read more

Cleveland Clinic And IBM Watson To Tackle DNA Mutations

Late last month, IBM expanded its existing engagement with the Cleveland Clinic around the deployment of IBM Watson technology to cover new domains. The vendor has already worked with faculty, physicians and students at the Cleveland Clinic Lerner College of Medicine of Case Western Reserve University on a project to develop Watson-related cognitive technologies that help physicians make more informed and accurate decisions faster and cull new insights from electronic medical records. Now, the Lerner Research Institute’s Genomic Medicine Institute at Cleveland Clinic will evaluate Watson’s ability to help oncologists deliver more personalized care to patients for a variety of cancers.

Watson is being leveraged by other institutions in the field of cancer care, including Memorial Sloan Kettering and MD Anderson. The new venture with Cleveland Clinic is focused on identifying patterns in genome sequencing and medical data to unlock insights that will help clinicians bring the promise of genomic medicine to their patients, using Watson’s cognitive system, deep computational biology models and IBM’s public cloud infrastructure SoftLayer, IBM says.

“There is a lot of work going on in the cancer area,” says Steve Harvey, IBM VP of Watson Cancer Genomics. This latest partnership aims to identify drugs that might be relevant to a particular patient’s condition, working from the understanding that cancer is a disease of DNA and leveraging the fact that the cost of reading DNA has dropped drastically. Today, it’s possible to look at the DNA in a patient’s normal cells and compare it to the DNA in cancer cells; the differences – the mutations – can point medical professionals toward what is actually causing the tumor.
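
As a toy illustration of that comparison – nothing like a real variant-calling pipeline, which aligns millions of sequencing reads and handles insertions, deletions and noise – the TypeScript sketch below reports the positions where a tumor-cell sequence differs from a normal-cell reference.

```typescript
// Toy illustration of the idea described above: compare a reference
// (normal-cell) DNA sequence with a tumor-cell sequence and report the
// point mutations.

interface Mutation {
  position: number;   // 0-based index into the sequence
  reference: string;  // base in the normal cell
  variant: string;    // base in the tumor cell
}

function findPointMutations(normal: string, tumor: string): Mutation[] {
  const mutations: Mutation[] = [];
  const length = Math.min(normal.length, tumor.length);
  for (let i = 0; i < length; i++) {
    if (normal[i] !== tumor[i]) {
      mutations.push({ position: i, reference: normal[i], variant: tumor[i] });
    }
  }
  return mutations;
}

// Made-up example sequences.
console.log(findPointMutations("ACGTTGCA", "ACGATGCA"));
// -> [ { position: 3, reference: "T", variant: "A" } ]
```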

Read more

Factiva Teams With Evernote To Redefine Productivity

Starting in December, Factiva will integrate with Evernote’s Context, a recently added capability from the productivity platform’s new Augmented Intelligence team that surfaces content relevant to the information users are writing about or collecting. And starting today, all one million Factiva users can add Factiva articles right to their Evernote notebooks.

Making its information easily available to users, including across devices such as smartphones and tablets, is key for Factiva. “The notions of productivity and mobility are very important to us,” says Frank Filippo, VP and GM of Corporate Products at Dow Jones and head of Factiva, which enriches the content it aggregates with capabilities such as company, industry, region, and subject taxonomies. Mobile considerations will become an increasing focus as Factiva seeks to expand beyond its core audience of information pros and researchers to a wider, perhaps less deskbound audience that needs to access quality data efficiently, such as users charged with competitive intelligence or mergers and acquisitions strategies.

Read more

Lexalytics’ Semantria Accommodates Text Analytics Abroad

International expansion has been a focus for cloud-based text and sentiment analytics vendor Semantria since its acquisition by text mining vendor Lexalytics over the summer. This week, the company is addressing it by adding enterprise text analytics servers in Europe – for compliance with EU privacy laws around the location of personal data – and by making its services available in Arabic, Russian, Japanese and Malay.

Lexalytics’ Semantria SaaS and Excel text-mining platform has a few clients in Europe so far, among them several large social media monitoring and voice-of-the-customer clients signed up in the last quarter, according to Seth Redmore, VP of Product & Marketing at Lexalytics. eDigitalResearch in the UK is one of them. English, French, German, Spanish, Portuguese and Italian are already among the platform’s supported languages, and Dutch should be next on board.

Read more
