Posts Tagged ‘bing’

Mark Albertson of the Examiner recently wrote, “It was an unusual sight to be sure. Standing on a convention center stage together were computer engineers from the four largest search providers in the world (Google, Yahoo, Microsoft Bing, and Yandex). Normally, this group couldn’t even agree on where to go for dinner, but this week in San Jose, California they were united by a common cause: the Semantic Web… At the Semantic Technology and Business Conference in San Jose this week, researchers from around the world gathered to discuss how far they have come and the mountain of work still ahead of them.” Read more
These vistas will be explored in a session hosted by Kevin Ford, digital project coordinator at the Library of Congress, at next week’s Semantic Technology & Business conference in San Jose. The door is being opened by the Bibliographic Framework Initiative (BIBFRAME), which the LOC launched a few years ago. Libraries will be moving from the MARC standards, their lingua franca for representing and communicating bibliographic and related information in machine-readable form, to BIBFRAME, which models bibliographic data in RDF using semantic technologies.
Amy Gesenhues of Search Engine Land reports, “To celebrate its 5th birthday today, Bing has posted a retrospective of the last five years and is offering Bing Reward credit perks for any users who search on the site between now and June 9. Going all the way back to 2009, Microsoft’s search engine outlined its initial goals of leveraging semantic search. Bing noted how it introduced specific verticals (“like Health and Travel”), and offered “left rail categories” to help users drill down into the information they wanted: ‘Searching for Chicago would show you both categories for the city and the band, and only return results that made sense for that category. Later in the year, we built additional vertical experiences that cleanly segregated the mass of web content into understandable and logical experiences, like TV entertainment, Shopping, electronics, and more’.” Read more
Derrick Harris of GigaOM recently wrote, “With all the money being spent on, and all the futuristic talk about, big data, machine learning, artificial intelligence and all things in between, it’s easy to forget that Microsoft and Google — two of the companies leading research in these technologies — still have large businesses in web search. So as cool and potentially life-altering as AI might be in fields such as medicine, we’ll probably continue to see the signs of things to come in search engines first. It’s big business and a great testing ground. Take, for example, Microsoft Bing’s new predictions feature that tries to predict the outcomes of popular fan-voting shows such as The Voice, American Idol and Dancing with the Stars. Bing does this by analyzing a number of signals, including searches, Twitter and Facebook data, and, presumably, the outcomes of previous episodes.” Read more
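Bing has not published how its predictions feature actually works, but the idea of combining several popularity signals into one score can be sketched as a simple weighted sum. The signal names, weights, and contestant data below are entirely illustrative assumptions, not Bing’s model:

```python
# Hypothetical sketch: combine multiple popularity signals into one prediction
# score via a weighted sum. Signal names and weights are made up for illustration.
SIGNAL_WEIGHTS = {"search_volume": 0.5, "twitter_mentions": 0.3, "facebook_shares": 0.2}

def predict_winner(contestants):
    """Return the contestant whose weighted signal score is highest."""
    def score(signals):
        return sum(SIGNAL_WEIGHTS[name] * value for name, value in signals.items())
    return max(contestants, key=lambda name: score(contestants[name]))

# Toy data: normalized signal values per contestant (invented numbers).
contestants = {
    "Contestant A": {"search_volume": 0.9, "twitter_mentions": 0.7, "facebook_shares": 0.4},
    "Contestant B": {"search_volume": 0.6, "twitter_mentions": 0.8, "facebook_shares": 0.9},
}
print(predict_winner(contestants))  # A scores 0.74 vs. B's 0.72, so "Contestant A"
```

A production system would of course learn the weights from historical outcomes rather than hard-code them.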
As The Semantic Web Blog discussed yesterday here, the Virtual Personal Assistant is getting more personal. Microsoft officially unveiled Cortana as part of the Windows Phone 8.1 smartphone software at its Build event yesterday, and the service effectively replaces the search function on Windows smartphones, both for the Internet and locally.
This statement from corporate vice president and manager Joe Belfiore served as the theme: “Cortana is the first truly personal digital assistant who learns about me and about the things that matter to me most and the people that matter to me most, that understands the Internet and is great at helping me get things done.”
The Bing-powered Cortana is launching in beta mode, and was still subject to a few hiccups during the presentation. For example, when Belfiore asked Cortana to give him the weather in Las Vegas, it reported the information in degrees, and was able to respond to his request to provide the same information in Celsius. He couldn’t, however, get her to make the conversion to Kelvin. Still, he promised attendees, “Try it yourself because she is smart enough to tell you the answer in Kelvin.”
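The conversions Cortana was asked to perform are straightforward arithmetic: Celsius is (°F − 32) × 5⁄9, and Kelvin is Celsius plus 273.15. The 61 °F starting figure below is an illustrative assumption, not the number from the demo:

```python
# Fahrenheit -> Celsius -> Kelvin, the chain of conversions from the Cortana demo.
def f_to_c(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

def c_to_k(c):
    """Convert degrees Celsius to Kelvin."""
    return c + 273.15

vegas_f = 61.0                 # hypothetical Las Vegas reading
vegas_c = f_to_c(vegas_f)      # ~16.1 C
vegas_k = c_to_k(vegas_c)      # ~289.3 K
print(round(vegas_c, 1), round(vegas_k, 1))
```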
Microsoft – as you’ve no doubt heard by now – has a new CEO. Satya Nadella, who succeeds Steve Ballmer, was most recently Microsoft’s executive VP of the cloud and enterprise group. Before that he was senior VP of R&D for online services, and before that, senior VP of the search, portal and advertising group. Nadella has been at the company since 1992.
The man who succeeds Steve Ballmer has been referred to as the King of Bing, rebranding the search service from Live Search to Bing and getting kudos for making technical fixes. Announcing his promotion to president of Microsoft’s Server and Tools Business in 2011, Ballmer wrote in a memo that Nadella “led the overall R&D efforts for some of the largest online services and drove the technical vision and strategy for several important milestones, including the critical launch of Bing, new releases of MSN, Yahoo! integration across Bing and adCenter, and much more.”
Bing is looking for a Senior Software Development Engineer in Bellevue, WA. The post states, “The Bing Local Search Relevance Platform team is responsible for building the world’s best relevance platform to serve the user’s search intent regarding location, business, and address on web, mobile and map entry points, and to make market expansion for Bing local search faster and cheaper across the world. Location and local query understanding is an important part of the local search platform, and we are looking for talent interested in solving hard relevance problems in a scalable way: mining large volumes of data from the web and internal logs; building scalable solutions to construct machine-learned query intent classifiers, query parsers, and other ranking features based on query and query context; and handling requests from markets about the localization experience and designing platform features to support search relevance improvements for different cultures.” Read more
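The “query intent classifier” the posting mentions can be illustrated with a deliberately tiny sketch: label a query as local or general-web intent based on per-word statistics learned from labeled examples. Real systems train models over massive query logs; the training data and labels here are invented for illustration:

```python
# Toy query intent classifier: learns per-word label counts from a tiny
# hand-made training set, then classifies by majority vote over words.
from collections import Counter, defaultdict

def train(examples):
    """examples: list of (query, label). Returns word -> Counter of labels."""
    word_counts = defaultdict(Counter)
    for query, label in examples:
        for word in query.lower().split():
            word_counts[word][label] += 1
    return word_counts

def classify(model, query, default="web"):
    """Sum label counts for each word in the query; return the top label."""
    votes = Counter()
    for word in query.lower().split():
        votes.update(model.get(word, Counter()))
    return votes.most_common(1)[0][0] if votes else default

training = [
    ("pizza near me", "local"),
    ("coffee shop downtown", "local"),
    ("plumber 98004", "local"),
    ("python tutorial", "web"),
    ("news headlines", "web"),
]
model = train(training)
print(classify(model, "sushi near me"))  # "near"/"me" appear only in local queries
```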
Zach Walton of Web Pro News recently wrote, “Image search is a cornerstone of any search engine. That’s why both Google and Bing are doing everything they can to improve image search to bring up the most relevant images for any search imaginable. While some may argue that recent changes made to Google image search make it worse, Bing is moving ahead with a new strategy that involves deep learning. So, what is deep learning? In short, it’s a type of machine learning that uses artificial neural networks to learn about and understand multiple concepts, including the abstract. In the past, computer systems had to be manually ‘trained’ to recognize patterns or specific images. With machine learning, these systems can now learn to recognize these patterns on their own. When it comes to image search quality, Bing found that integrating deep learning into its systems greatly increased the quality.” Read more
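One way to see why deep learning helps image search: a trained network maps each image to a feature vector (an embedding), and relevance ranking becomes vector similarity. The network itself is omitted in this sketch, and the vectors below are made-up stand-ins for what such a model would produce:

```python
# Rank candidate images against a query image by cosine similarity of
# their (hypothetical) deep-learned embeddings.
import math

def cosine_similarity(a, b):
    """Cosine of the angle between vectors a and b."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Invented embeddings for a query image and two candidate images.
query = [0.9, 0.1, 0.4]
candidates = {"cat_photo.jpg": [0.8, 0.2, 0.5], "chart.png": [0.1, 0.9, 0.2]}

ranked = sorted(candidates,
                key=lambda name: cosine_similarity(query, candidates[name]),
                reverse=True)
print(ranked[0])  # the candidate whose embedding is closest to the query
```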
Interested in how schema.org has trended in the couple of years since its birth? If you were at the International Semantic Web Conference in Sydney a couple of weeks back, you may have caught Google Fellow Ramanathan V. Guha — the mind behind schema.org — presenting a keynote address about the initiative.
Of course, Australia is a long way to go for a lot of people, so The Semantic Web Blog is happy to catch everyone up on Guha’s thoughts on the topic.
We caught up with him when he was back stateside:
The Semantic Web Blog: Tell us a little bit about the main focus of your keynote.
Guha: The basic discussion was a progress report on schema.org – its history and why it came about a couple of years ago. Other than a couple of panels at SemTech we’ve maintained a rather low profile and figured it might be a good time to talk more about it, and to a crowd that is different from the SemTech crowd.
The short version is that the goal, of course, is to make it easier for mainstream webmasters to add structured data markup to web pages, so that they wouldn’t have to track down many different vocabularies, or think about what Yahoo or Microsoft or Google understands. Before, webmasters had to champion internally which vocabularies to use and how to mark up a site; we have reduced that burden, and it’s also no longer an issue of which search engine to cater to.
It’s now a little over two years since launch and we are seeing adoption way beyond what we expected. In aggregate, the search engines see schema.org markup on about 15 percent of the pages we crawl. This is the first time we have seen markup on approximately the order of the scale of the web… Now over 5 million sites are using it. That’s helped by mainstream platforms like Drupal and WordPress adopting it, so that it becomes part of the regular workflow. Read more
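For readers who haven’t seen schema.org markup in practice, here is a small sketch of a JSON-LD block describing an article, built and serialized in Python. The vocabulary terms (`@context`, `@type`, `headline`, `author`, `datePublished`) are standard schema.org; the field values are invented for illustration:

```python
# Build a schema.org Article description as JSON-LD. A webmaster would embed
# the serialized output in a <script type="application/ld+json"> tag on the page.
import json

article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Schema.org Adoption Two Years In",
    "author": {"@type": "Person", "name": "Jane Webmaster"},
    "datePublished": "2013-11-12",
}

markup = json.dumps(article, indent=2)
print(markup)
```

JSON-LD is one of the syntaxes search engines accept for schema.org data, alongside microdata and RDFa embedded directly in the HTML.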