Nick Stockton of Quartz reports, “Computers stole your job; now they know your pain. Using a combination of facial recognition software and machine learning algorithms, researchers have trained computers to be dramatically better than humans at reading pained facial expressions. And they’re working on new programs to help clue you into what your friend, coworker, or client is feeling. In a study released Friday (paywall) in the journal Current Biology, researchers asked 170 subjects whether the expressions of pain shown on faces in a series of videos were real or faked. They found that the humans’ collective empathetic ability was about the same as a coin flip—they read the expressions correctly only 50% of the time. Even after researchers trained the subjects to read the subtle, involuntary muscle triggers that experts use to tell when an emotion is being faked, they were only right 55% of the time.” Read more
PALO ALTO, Calif., March 25, 2014 /PRNewswire-iReach/ — EngageClick (http://www.engageclick.com), the predictive and personalized multi-screen advertising platform that delivers superior ad engagement and performance, emerged out of stealth mode today. The EngageClick ad platform differentiates itself by applying data-driven technology that uses cognitive science, machine-learning technology and big data analytics to perform predictive segmentation, and subsequently delivers dynamic smart ads with automatic and incremental optimization across multiple screens. EngageClick maximizes advertiser ROI and increases media yield, helping ad agencies and brands increase consumer engagement and ad performance, at scale. Read more
BLOOMINGTON, Ind. — By understanding, managing and inferring patterns from data, machine learning has brought us self-driving vehicles, spam filters and smartphone personal assistants. Now an Indiana University Bloomington computer scientist has received $1.4 million to give machine learning more muscle by making it applicable to greater amounts of more diverse data.
Chung-chieh “Ken” Shan, an assistant professor in the School of Informatics and Computing, will receive the funding from the U.S. Defense Department’s Defense Advanced Research Projects Agency over 46 months. Read more
Supply chain and product standards organization GS1 – which this week joined the World Wide Web Consortium (W3C) to contribute to work on improving global commerce and logistics – has now also released the GTIN (Global Trade Item Number) Validation Guide. In the United States the GTIN, which is the GS1-developed numbering sequence within bar codes for identifying products at point of sale, is known as the Universal Product Code (UPC).
The guide is part of the organization’s effort to drive awareness of “the business importance of having accurate product information on the web,” says Bernie Hogan, Senior Vice President, Emerging Capabilities and Industries. The guide has the endorsement of players including Google, eBay and Walmart, which are among the retailers that require suppliers to use GTINs during onboarding and that support GTIN’s extension further into the online space, to help ensure more accurate and consistent product descriptions that link to images and promotions and help customers better find, compare and buy products.
“This is an effort to help clean up the data and get it more accurate,” he says. “That’s so foundational to any kind of commerce, because if it’s not the right number, you can have the best product data and images and the consumer still won’t find it.” The search hook, indeed, is the link between the work GS1 is doing to encourage using its standards online for improved product identification data and semantic web efforts such as schema.org, which The Semantic Web Blog discussed with Hogan here.
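The full Validation Guide covers far more, but the arithmetic that makes a GTIN self-validating is GS1's public mod-10 check digit: working from the rightmost digit before the check digit, digits are alternately weighted 3 and 1, summed, and the check digit brings the total to the next multiple of 10. A minimal sketch (the function names here are ours, not GS1's):

```python
def gtin_check_digit(body: str) -> int:
    """Compute the GS1 mod-10 check digit for the digits that precede it.
    Weights alternate 3, 1, 3, ... starting from the rightmost body digit."""
    total = sum(int(d) * (3 if i % 2 == 0 else 1)
                for i, d in enumerate(reversed(body)))
    return (10 - total % 10) % 10

def is_valid_gtin(gtin: str) -> bool:
    """Validate a GTIN-8/12/13/14 string; a UPC-A code is a GTIN-12."""
    if not gtin.isdigit() or len(gtin) not in (8, 12, 13, 14):
        return False
    return gtin_check_digit(gtin[:-1]) == int(gtin[-1])
```

For example, `is_valid_gtin("4006381333931")` passes because the weighted sum of the first twelve digits is 79, so the check digit must be 1.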
According to a new article out of Mobile Marketing and Technology, there has been a breakthrough in the area of image search. The author states, “British company Cortexica has developed the first software in the world that will help consumers to purchase the perfect pair of shoes. Launching on March 19th, ‘FindSimilar™ for Shoes’ takes a photo of any sort of footwear and then analyses it against a database of images. Working like a ‘visual search engine’ it displays a range of shoes with similar characteristics such as shape, colour and design and allows the consumer to choose from a tailored selection. The technology works by mimicking the way the brain processes images and finds similarities.” Read more
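Cortexica's engine is proprietary, so the code below is not its method; it is only a toy illustration of the general idea behind a "visual search engine": summarize each catalog image by a feature descriptor (here, a quantized color histogram) and rank items by how closely their descriptors overlap with the query's.

```python
from collections import Counter

def color_histogram(pixels, bins=4):
    """Quantize (r, g, b) pixel tuples into bins**3 color buckets and
    return a normalized histogram as a dict of bucket -> frequency."""
    hist = Counter(
        (r * bins // 256, g * bins // 256, b * bins // 256)
        for r, g, b in pixels
    )
    n = len(pixels)
    return {bucket: count / n for bucket, count in hist.items()}

def similarity(h1, h2):
    """Histogram intersection: 1.0 for identical color distributions."""
    return sum(min(v, h2.get(k, 0)) for k, v in h1.items())

def find_similar(query_hist, catalog, top_n=3):
    """Rank catalog items (name -> histogram) by similarity to the query."""
    ranked = sorted(catalog.items(),
                    key=lambda kv: similarity(query_hist, kv[1]),
                    reverse=True)
    return [name for name, _ in ranked[:top_n]]
```

A production system would use far richer descriptors (shape, texture, learned features) than raw color, but the retrieve-and-rank structure is the same.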
Apparel.com reports, “Reflektion, a retail analytics firm, has raised $8 million in a Series B funding round led by Intel Capital and including NIKE, Inc. as well as several private investors. Reflektion provides a predictive analytics platform for retailers and brands through an easily deployable Software-as-a-Service (SaaS) model. The company plans to use the funding to support its customers — including Converse, Inc., which is currently piloting the technology, O’Neill, A.M. Leonard and RealTruck — and to launch broadly its ecommerce and business intelligence solutions in the market. The resources will also be used for additional product development, plus expanded sales and marketing activities.” Read more
Anthony Clark of Gainesville.com reports, “A Gainesville startup company received a $1.1 million federal grant to develop a Web portal for chemists to better share information over the next generation of the World Wide Web. Neil Ostlund, CEO of Chemical Semantics, said he learned of the grant from the Department of Energy on Friday. Chemical Semantics is developing a portal and software for computational chemists to publish and find data over the semantic web, also referred to as Web 3.0 or the web of data.”
Clark continues, “Chemical Semantics has created the semantic web vocabulary — or ontology — for computational chemistry called the Gainesville Core.” Read more
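The Gainesville Core vocabulary itself is not reproduced here, but the basic mechanics of what such an ontology enables — publishing facts as subject–predicate–object triples and querying them by pattern — can be sketched with a toy in-memory triple store. The predicate and resource names below are hypothetical illustrations, not actual Gainesville Core terms:

```python
# Hypothetical namespace standing in for a computational-chemistry vocabulary.
GC = "http://example.org/gainesville-core#"

# Triples describing one (invented) calculation and its molecule.
triples = [
    ("calc:001",  GC + "hasMethod",   "B3LYP"),
    ("calc:001",  GC + "hasBasisSet", "6-31G*"),
    ("calc:001",  GC + "hasMolecule", "mol:water"),
    ("mol:water", GC + "hasFormula",  "H2O"),
]

def query(subject=None, predicate=None, obj=None):
    """Return all triples matching the pattern (None is a wildcard),
    analogous to a single SPARQL triple pattern."""
    return [t for t in triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)]
```

Real deployments would use RDF serializations and SPARQL endpoints rather than Python lists, which is what makes the published data findable by anyone's tools, not just the publisher's.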
Scott Brinker, whom we have covered many times in the past because of his insights into semantic technology and marketing, has written a new short book about modern marketing trends. The author presents “seven transformative meta-trends in modern marketing.” In the foreword, he identifies these trends as “…wield[ing] tremendous influence on the current evolution of marketing strategy and management.” The trends Brinker identifies are:
- From traditional to digital
- From media silos to converged media
- From outbound to inbound
- From communications to experiences
- From art and copy to code and data
- From rigid plans to agile iterations
- From agencies to in-house marketing
While he does not mention semantics explicitly in the book, knowing Scott as we do, we were curious about his thoughts on the subject. We caught up with him to ask, “So, how does this fit in with Semantic Web Technologies?”
Brinker responded, “Semantic web technologies are a great example of how technology is continuously changing what’s possible in marketing and business. But in the absence of ‘marketing technologists’ — these hybrid professionals who can translate technology capabilities to marketing opportunities, and vice versa — much of that potential remains untapped.”
“Structured and linked data can have such a tremendous impact on shaping customer experiences in a digital world. While not every marketer needs to understand the technical layer of how to make that happen, they need to have a sense of what’s possible — and they need to be able to work with more technical talent, as part of the modern marketing team, to make it happen.”
Brinker, who coined the term “Chief Marketing Technologist,” is offering the 40-pager as a free download on his website.
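As one concrete instance of the structured and linked data Brinker describes, a retailer might publish schema.org Product markup as JSON-LD so search engines can understand the page. A minimal sketch using Python's standard library (the product values are invented for illustration):

```python
import json

# A minimal schema.org Product description as JSON-LD. All values are
# hypothetical; "gtin13" ties the page's product data to its GS1 identifier.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Trail Running Shoe",
    "gtin13": "4006381333931",
    "offers": {
        "@type": "Offer",
        "price": "89.99",
        "priceCurrency": "USD",
    },
}

# This string would be embedded in the page inside a
# <script type="application/ld+json"> element.
markup = json.dumps(product, indent=2)
```

Emitting this markup is exactly the kind of task that sits between marketing and engineering, which is Brinker's point about needing "marketing technologists" on the team.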
Ontologies are getting a thumbs up to serve as the basis for the Office of Financial Research’s Instruments database. Last week, the Data & Technology Subcommittee of the OFR Financial Research Advisory Committee (FRAC) recommended that the OFR “adopt the goal of developing and validating a comprehensive ontology for financial instruments as part of its overall effort to meet its statutory requirement to ‘prepare and publish’ a financial instrument reference database.”
The Instruments database will define the official meaning of financial instruments for the financial system — derivatives, securities, and so on. The recommendation by the subcommittee is that the OFR conduct its own evaluation of private sector initiatives in this area, including the Financial Industry Business Ontology (FIBO), to assess whether and how ontology can support transparency and financial stability analysis.
FIBO, which The Semantic Web Blog discussed in detail most recently here, is designed to improve visibility to the financial industry and the regulatory community by standardizing the language used to precisely define the terms, conditions, and characteristics of financial instruments; the legal and relationship structure of business entities; the content and time dimensions of market data; and more. The effort is spearheaded by the Object Management Group and the Enterprise Data Management (EDM) Council.