iAlbums Alpha has launched utilizing a proprietary semantic curation engine. The article states, “iAlbums is the first major development in on-device music players since the launch of the iPod. A free application in the iPhone app store, iAlbums’ patent pending semantic engine analyzes the music already existing on one’s iPhone or iPod touch and gathers the most relevant and exciting information from over 20 different sources across the web. This content (videos, artwork, photos, artist quotes, lyrics, tweets and more) streams seamlessly into an easy to use feed on the user’s player, providing an all encompassing music experience in an interactive app that enhances the existing music files in their mobile libraries.”
Posts Tagged ‘iPhone’
The Open Graph protocol continues to progress: Earlier this week Facebook’s Director of Developer Relations Douglas Purdy talked about its intersection with the mobile web.
According to Purdy, more people are accessing Facebook on the mobile web than from its top native apps combined, and the game is on to help developers conquer the challenges of building for that community. One of those challenges is app discovery. At the Mobile World Congress on Monday, the company announced that it’s continuing to address that challenge with plans to extend to native Android apps the ability for Facebook’s 425 million mobile app users to discover them through Open Graph connections.
Editor’s Note: Here at the Semantic Web Blog we’ve done a lot of coverage of the personalized news mag app space. That includes some in-depth looks into Zite, acquired by CNN in August, such as this article. Most recently, we brought you news of Zite’s iPhone app.
Today, over at Zite’s blog, the company is running a piece entitled Zite: Under the Hood. It should be of interest to anyone who wants more details about how its technology operates. It goes like this:
Zite: Under the Hood
If you’re already a Zite user, you’ve experienced the delivery of personalized content that is updated every time you open the app. Making that transparent and easy for you takes a lot of effort. The Zite team brings together decades of software development in artificial intelligence, machine learning and natural language technologies, and more than six years of product development, to blend and tune the experience for you. In short, Zite works by:
- mining content from your social web
- modeling that content
- modeling the community that interacts with it
- modeling your interests
- matching your interests to the content and your community, to help you discover content you’ll want to see.
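Zite has not published its algorithms, but the matching step in the list above can be illustrated with a minimal sketch: model the reader's interests as topic frequencies from past reading, score each candidate article by how well its topics match that profile, and rank the feed. All names and the scoring rule here are hypothetical stand-ins, not Zite's actual model.

```python
from collections import Counter

def interest_profile(read_articles):
    """Model the reader's interests as topic frequencies across
    articles they have engaged with (a toy stand-in for a learned model)."""
    profile = Counter()
    for topics in read_articles:
        profile.update(topics)
    return profile

def score(profile, article_topics):
    """Score a candidate article: sum the reader's affinity
    for each topic the article covers."""
    return sum(profile[t] for t in article_topics)

def rank_feed(profile, candidates):
    """Order candidate articles so the best matches surface first."""
    return sorted(candidates,
                  key=lambda a: score(profile, a["topics"]),
                  reverse=True)
```

Real systems layer community signals and text classification on top of this kind of matching, but the shape of the loop — profile, score, rank — is the same.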
Zite, the personalized news magazine app for the iPad, adds an iPhone version of its application to the lineup today. The company, which we wrote about here and which was acquired this summer by CNN, has focused on making the semantically-intelligent app fit the smaller-size format of the smartphone, with one-thumb navigation, vertical story and left-to-right category view flow, and a focus on the facts of story name, title and source, rather than snippets, as starter views.
CEO Mark Johnson says a prerequisite for the iPhone app was its release of Sybil technology in late October, which allowed Zite to support multiple profiles that adapt to each reader’s preferences. This made it possible to share the Zite app on a family’s sole iPad without messing up individuals’ preferences. It comes in handy for the new smartphone app because, “if you did all this work on your iPad training this very intelligent AI, you don’t want to lose that when you go to the iPhone,” Johnson says.
Johnson expects the iPhone app to appeal to existing iPad users. “Personalization is really addictive. Once you have it one place, you want it everywhere,” he says.
The popular news app News360 has released a new version for iPhone and Android smartphones. The version 2.0 release “introduces News360’s sophisticated content personalization technology to the smartphone experience. With permission, News360 analyzes a user’s activity across social and Web services like Facebook, Twitter, Google+, Evernote and Google Reader to build a unique interest graph and uncover persistent reading interests and underlying topic areas.”
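News360 hasn’t detailed how its interest graph is built, but the general idea — aggregate a user’s items across services and record which topics co-occur — can be sketched in a few lines. The data shapes and counting scheme below are illustrative assumptions, not News360’s implementation.

```python
from collections import defaultdict
from itertools import combinations

def build_interest_graph(activity):
    """Toy interest graph: nodes count how often a topic appears in a
    user's items; edge weights count topic co-occurrence within an item.
    `activity` maps a service name to a list of items, each item a set
    of topics (hypothetical stand-in for analyzed user activity)."""
    nodes = defaultdict(int)
    edges = defaultdict(int)
    for service, items in activity.items():
        for topics in items:
            for t in topics:
                nodes[t] += 1
            # record co-occurring topic pairs in a canonical order
            for a, b in combinations(sorted(topics), 2):
                edges[(a, b)] += 1
    return dict(nodes), dict(edges)
```

Persistent interests then fall out as the heaviest nodes, and related topic areas as the heaviest edges.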
After a very, very long appetizer course, Apple got down to the main entrée with today’s long-expected announcement of the Siri assistant. Well over an hour into an event that trod over some well-covered ground from the Mac to the iPod Touch, the audience got its look at Siri on the new iPhone 4S.
By now you’ve heard the rumors and the reality: Apple’s acquisition of Siri a couple of years ago has worked itself into the latest iPhone as a voice-activated “humble personal assistant” that handles requests ranging from pulling up the Wikipedia article about Neil Armstrong, to calculating the number of days until Christmas, to telling you what time it is in Paris, to reminding you to call your spouse before you leave work, to text-messaging a lunch appointment request to a person you name in the message, without requiring you to confirm the recipient before transmission.
Its natural language expertise and other semantic underpinnings, and some help from functionality like GPS, also mean that it knows to provide you with a map and route when you ask how to get home, or that you want to see things like the NASDAQ composite when you ask how NASDAQ is doing today.
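Apple hasn’t published Siri’s internals, but the behavior described above amounts to pairing intent detection with a handler that gathers whatever context (GPS, the clock, your contacts) the action needs. The keyword-matching router below is a deliberately crude sketch of that idea, nothing like Siri’s actual natural-language stack.

```python
def route(utterance):
    """Toy intent router: map a natural-language request to an
    (action, context) pair. A real assistant would use statistical
    language understanding rather than substring checks."""
    text = utterance.lower()
    if "how do i get home" in text:
        return ("map", {"destination": "home", "uses": "gps"})
    if "nasdaq" in text:
        return ("stocks", {"symbol": "^IXIC"})
    if "what time is it in" in text:
        city = text.rsplit("in ", 1)[1].rstrip("?")
        return ("world_clock", {"city": city})
    # fall back to a generic search when no intent matches
    return ("web_search", {"query": utterance})
```

The point of the sketch is the separation of concerns: understanding the request is one problem, and wiring each understood intent to device capabilities like GPS is another.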
A new article reports, “Perhaps the biggest announcement at Apple’s iPhone event (about one hour from this posting) will be Assistant, Apple’s evolution of the Siri Personal Assistant Software. Siri, you’ll remember, is the company Apple picked up for a rumored $200 million in April of last year for, in Steve Jobs’ words, its “Artificial Intelligence”, not search or speech recognition.”
Before Apple bought the company, Siri described itself as a Virtual Personal Assistant.
A few short years ago, a group of semantic technology companies rode a wave of venture capital and inflated expectation. They were going to change the world. They were going to bring semantic technologies to the mainstream. They were going to make people very rich. They were the must-have keynotes of the conference circuit. And then, one by one, they disappeared. Powerset vanished inside Microsoft, to do something for Bing. Twine vanished inside Evri, amid rumours of a fire sale and investors covering their backs. Freebase vanished inside Google, and bits of Freebase DNA routinely pop up across Google’s sprawling empire. And Siri vanished inside Apple, as we scrambled to understand whether the Cupertino money machine was after semantic smarts or ‘just’ speech recognition technology. Now, though, the rumours suggest that Siri may be back, and that it’s going to be the thing that makes the next iPhone a compelling buy.
As The Semantic Web Blog mentioned here, there is speculation that the Siri intelligent personal digital assistant technology may come to light in Apple’s fall iOS 5 release. That is all well and good for users on Apple’s mobile platforms, but myBantu founder and vice president of products and strategy Bharath Yadla sees an opportunity to bring personalized entertainment recommendations to a wider swath of the populace. The iPhone and Android versions of its application, joining its web and Facebook applications, officially launch next week.
“We see absolutely a great opportunity with other platforms from a mobile standpoint,” he says. “And when you leverage the application on Facebook, that is a predominant presence.”
What users are leveraging in this intelligent assistant is its ActiveRelevance platform for providing relevant and personalized recommendations based on their profiles and queries. The platform leverages both artificial intelligence and social relevance, assessing some nine dimensions, including personal interests, proximity of choices, popularity, and peer influence, in order to deliver a handful or two of results. The social relevance comes by way of crowd-sourcing recommendations from sites such as OpenTable and Rotten Tomatoes, friends on social networking platforms, and the output of search as well as semantic engines. Users can enter their requirements in natural language, such as “top romantic restaurant nearby,” for the ActiveRelevance engine to parse intent and come to some conclusions.
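myBantu has not published the ActiveRelevance formula, but blending multiple relevance dimensions into one ranking is commonly done as a weighted sum. The sketch below uses four of the nine dimensions the article names; the weights, field names, and scores are all hypothetical.

```python
def relevance(candidate, weights):
    """Blend per-dimension scores (assumed normalized to 0..1)
    into a single relevance value using hypothetical weights."""
    return sum(weights[dim] * candidate[dim] for dim in weights)

def recommend(candidates, weights, k=5):
    """Return the top-k candidates by blended relevance,
    i.e. the 'handful or two of results' the article describes."""
    return sorted(candidates,
                  key=lambda c: relevance(c, weights),
                  reverse=True)[:k]
```

For a query like “top romantic restaurant nearby,” a parsed intent would first select the candidate pool and could shift the weights — e.g. boosting proximity — before the blend is applied.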
The two have teamed up with the goal of mobilizing NetSeer’s concept-based advertising, which uses algorithms to pinpoint relevant and related concepts based on the subject matter in a particular article, in the service of contextual ad delivery. On a mobile site for investing, NetSeer’s ConceptLinks might display related topics like “Exchange Traded Funds” and “Portfolio Management,” which are monetized links for the publisher, the company says. NetSeer says it already has several hundred publishers using its platform on the web.
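NetSeer’s algorithms are proprietary, but the ConceptLinks behavior described above — expand an article’s subject matter into related concepts, then keep the ones a publisher can monetize — can be sketched as a lookup in a concept-relatedness graph. The graph contents and function names here are illustrative assumptions.

```python
def concept_links(article_concepts, concept_graph, monetized):
    """Toy concept-link selection: expand the article's concepts
    through a relatedness graph, then keep only concepts with a
    monetized link available to the publisher."""
    related = set()
    for concept in article_concepts:
        related.update(concept_graph.get(concept, []))
    return sorted(related & monetized)
```

On the investing-site example from the article, an “investing” page would surface related, monetizable concepts such as “Exchange Traded Funds” and “Portfolio Management” while dropping related concepts no advertiser has bid on.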
Mobile Theory does the work of making an ad unit appear and function properly in a mobile environment. CEO Scott Swanson says that takes some work. On mobile platforms, smaller screen sizes, different screen ratios, users’ ability to change orientations, and those users’ desire not to click on an ad and land somewhere else all must be considered. “We had to develop an ad unit that works completely different in mobile, to be faster, more simple, and more immediate than we do in online,” he says.