Posts Tagged ‘Hummingbird’

Bringing Speed & Precision to Search with Semantics

Google Hummingbird

Dave Lloyd of ClickZ.com recently wrote, “People are becoming more sophisticated in their searching, using longer queries, more precise terms, and more contextual info in their queries. Clearly, there’s exponentially more content on the Web than there was even five years ago, and this means the needle-in-a-haystack science of algorithms must become more sophisticated in finding the most effective answers for queries. The expanding use of mobile and voice technologies is also changing how we search. We’ve arrived at a place where literal matching by itself isn’t good enough. In response, we’re moving toward a new normal: semantic search. It’s an idea that’s been in the works for a long time and was described by the Web’s creator Tim Berners-Lee in 2001 but is only recently going live in a way that affects regular users.” Read more

Hello 2014 (Part 2)


Courtesy: Flickr/faul

Picking up from where we left off yesterday, we continue exploring where 2014 may take us in the world of semantics, Linked and Smart Data, content analytics, and so much more.

Marco Neumann, CEO and co-founder, KONA and director, Lotico: On the technology side I am personally looking forward to making use of the new RDF 1.1 implementations and the new SPARQL endpoint deployment solutions in 2014. The Semantic Web idea is here to stay, though you might call it by a different name (again) in 2014.

Bill Roberts, CEO, Swirrl: Looking forward to 2014, I see a growing use of Linked Data in open data ‘production’ systems, as opposed to proofs of concept, pilots and test systems. I expect good progress on taking Linked Data out of the hands of specialists to be used by a broader group of data users.

Read more

Hello 2014


Courtesy: Flickr/Wonderlane

Yesterday we said a fond farewell to 2013. Today, we look ahead to the New Year, with the help, once again, of our panel of experts:

Phil Archer, Data Activity Lead, W3C:

For me the new Working Groups (WGs) are the focus. I think the CSV on the Web WG is going to be an important step in making more data interoperable with the Semantic Web.

I’d also like to draw attention to the upcoming Linking Geospatial Data workshop in London in March. There have been lots of attempts to use geospatial data with Linked Data, notably GeoSPARQL of course. But it’s not always easy. We need to make it easier to publish and use data that includes geocoding in some fashion, along with the power and functionality of geospatial information systems. The workshop brings together W3C, OGC, the UK government [Linked Data Working Group], Ordnance Survey and the geospatial department at Google. It’s going to be big!

[And about] JSON-LD: It’s JSON, so Web developers love it, and it’s RDF. I am hopeful that more and more JSON will actually be JSON-LD. Then everyone should be happy.
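
To make the “it’s JSON and it’s RDF” point concrete, here is a minimal sketch. The document, the example.org identifiers, and the use of Python with the rdflib library (version 6 or later, which bundles a JSON-LD parser) are illustrative assumptions, not anything from Archer’s remarks: the same text reads as ordinary JSON to a Web developer and parses directly into RDF triples.

```python
# Hypothetical illustration of JSON-LD being both plain JSON and RDF.
# Assumes Python with rdflib >= 6.0 installed (it includes a JSON-LD parser);
# the document and identifiers below are invented for the example.
import json
from rdflib import Graph

jsonld_doc = """
{
  "@context": {
    "headline": "http://schema.org/headline",
    "author": "http://schema.org/author",
    "name": "http://schema.org/name"
  },
  "@id": "http://example.org/posts/hello-2014",
  "headline": "Hello 2014",
  "author": { "name": "A. Blogger" }
}
"""

# To a Web developer it is ordinary JSON...
print(json.loads(jsonld_doc)["headline"])

# ...and to an RDF toolkit it is a graph of triples.
g = Graph()
g.parse(data=jsonld_doc, format="json-ld")
print(len(g), "triples")
print(g.serialize(format="turtle"))
```

If more of the JSON published on the Web carried an @context like this, the same data would be usable both ways, which is the outcome Archer is hoping for.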

Read more

Users Are Expecting More From Search Thanks to Semantic Tech


Colin Jeavons of Search Engine Journal recently wrote, “Millions of people are already using semantic search and they don’t know it. Some of the world’s most popular search engines and social networking sites are using the technology to make it easier to make connections, learn, and explore interests. It’s quietly become a part of our lives, and innovative companies are pushing the technology and industry toward new horizons. So, what changed? Actually, it was users. Last year, 20 percent of Google searches were new, due to the fact that people started typing sentences and paragraphs into search engines, expecting keyword searches to operate like natural language. Today, user demands for answers to their questions are satiated with innovations like Google’s Hummingbird. People are now searching for ‘cheap flights to Miami on January 7th’ rather than just ‘cheap flights.’ This change in consumer behavior is a significant milestone.” Read more

Where Schema.org Is At: A Chat With Google’s R.V. Guha

Interested in how schema.org has trended in the last couple of years since its birth? If you were at the International Semantic Web Conference in Sydney a couple of weeks back, you may have caught Google Fellow Ramanathan V. Guha — the mind behind schema.org — present a keynote address about the initiative.

Of course, Australia’s a far way to go for a lot of people, so The Semantic Web Blog is happy to catch everyone up on Guha’s thoughts on the topic.

We caught up with him when he was back stateside:

The Semantic Web Blog: Tell us a little bit about the main focus of your keynote.

Guha: The basic discussion was a progress report on schema.org – its history and why it came about a couple of years ago. Other than a couple of panels at SemTech, we’ve maintained a rather low profile and figured it might be a good time to talk more about it, and to a crowd that is different from the SemTech crowd.

The short version is that the goal, of course, is to make it easier for mainstream webmasters to add structured data markup to web pages, so that they wouldn’t have to track down many different vocabularies, or think about what Yahoo or Microsoft or Google understands. Before, webmasters had to champion internally which vocabularies to use and how to mark up a site, but we have reduced that, and now it’s not an issue of which search engine to cater to.

It’s now a little over two years since launch and we are seeing adoption way beyond what we expected. In aggregate, the search engines see schema.org markup on about 15 percent of the pages we crawl. This is the first time we have seen markup at approximately the scale of the web…. Now over 5 million sites are using it. That’s helped by mainstream platforms like Drupal and WordPress adopting it so that it becomes part of the regular workflow. Read more
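
For readers curious what that markup looks like in practice, here is a hedged sketch. The article fields below are invented, and JSON-LD in a script tag is only one of the syntaxes schema.org supports (Microdata and RDFa attributes are alternatives); the point is simply that a webmaster describes the page once in schema.org terms, without worrying about which search engine will read it.

```python
# Hypothetical example of schema.org structured data for a web page.
# The article details are invented; JSON-LD in a <script> tag is one accepted
# syntax (Microdata and RDFa attributes are alternatives).
import json

article = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Where Schema.org Is At",
    "author": {"@type": "Person", "name": "J. Webmaster"},
    "datePublished": "2013-11-12",
}

# A webmaster embeds this block in the page's HTML; any crawler that
# understands schema.org can pick up the same structured description.
snippet = '<script type="application/ld+json">\n{}\n</script>'.format(
    json.dumps(article, indent=2)
)
print(snippet)
```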

Unlocking the Benefits of Semantic Search: Barbara Starr’s 5 Ways


Barbara Starr of Search Engine Land reports, “Search is changing. It is now more personal, more engaging, more interactive and more predictive. SERPs no longer display just 10 blue links — they have become more useful and more visually appealing across all device types. Semantic search is at the forefront of these changes, as evidenced most recently by the launch of Google’s new Hummingbird algorithm. Beginning with user intent and interpretation of the query itself, semantic technology is used to refine the query, extract entities as answers, personalize search results, predict search queries and more — providing a more interactive, conversational or dialogue-based search result.” Read more

Google Brings Semantics to Image Search

Google Images

David Amerland of Social Media Today reports, “Google is systematically removing all the tools that traditional search engine optimizers and marketers had at their disposal that allowed them to reverse engineer search and create a set of metrics that could be used to gauge progress in search rankings for their clients… Hummingbird is clear evidence that Google search is getting smarter, drawing from a much more enriched set of entities in its semantic index and a much deeper Knowledge Graph. An Entity is a concept, divorced from its linguistic counterpart and defined by the relational amount of data that form its attributes, properties and uses. And Entities are now powering Google’s Image Search, helping the search engine recognise images in pictures the same way you and I would.” Read more

Why Google’s Hummingbird is a Big Deal


Gerry Brown of Business Insider recently wrote, “Last week, Google announced a brand new algorithm for its search engine, called Hummingbird. Although Google often produces updates and enhancements (such as the ‘Caffeine Update’ in 2010, and ‘Penguin’ and ‘Panda’ since), the last time Google introduced a brand new algorithm was 2001, so it is a big change. Although Google has not given away many details, it said that Hummingbird is focused on ranking information based on a more intelligent understanding of search requests. As Internet data volumes explode, we increasingly have to type more and more words into Google Search to gain greater accuracy of results. Often we need to conduct multiple searches to find the information we are looking for, which is frustrating and time-consuming.” Read more

Google Changes Search Algorithm to Handle More Complex Queries


Claire Cain Miller of The New York Times reports, “Google on Thursday announced one of the biggest changes to its search engine, a rewriting of its algorithm to handle more complex queries that affects 90 percent of all searches. The change, which represents a new approach to search for Google, required the biggest changes to the company’s search algorithm since 2000. Now, Google, the world’s most popular search engine, will focus more on trying to understand the meanings of and relationships among things, as opposed to its original strategy of matching keywords.” Read more