Posts Tagged ‘seo’
Dave Lloyd of ClickZ.com recently wrote, “People are becoming more sophisticated in their searching, using longer queries, more precise terms, and more contextual info in their queries. Clearly, there’s exponentially more content on the Web than there was even five years ago, and this means the needle-in-a-haystack science of algorithms must become more sophisticated in finding the most effective answers for queries. The expanding use of mobile and voice technologies is also changing how we search. We’ve arrived at a place where literal matching by itself isn’t good enough. In response, we’re moving toward a new normal: semantic search. It’s an idea that’s been in the works for a long time and was described by the Web’s creator Tim Berners-Lee in 2001 but is only recently going live in a way that affects regular users.”
Where is SEO going? A panel hosted by Aaron Bradley, Internet marketing manager at InfoMine, Inc., at this week’s Semantic Technology & Business Conference in NYC took on the issue head-on. The session, featuring Bing senior product manager Duane Forrester, semantic web strategist and independent consultant Barbara H. Starr, SwellPath SEO Team Manager Mike Arnesen, and author and analyst David Amerland (see our Q&A with him here), provided some insight into why it’s an exciting time to be working in both semantic technology and search – and why that’s also a scary proposition for some in the SEO set who’ve lived by keywords and links.
On the exciting side of things, Arnesen pointed out that it was always a somewhat unnatural process to have to advise clients to craft content so that it can match to specific keywords to get traction. “Now we can tell them to just write good content, put what you need to put on the web and it will be easier to find because of semantic markup and semantic search,” he said.
[UPDATE: This panel has a new panelist! Mike Arnesen, SEO Team Manager of SwellPath will participate in New York.]
On October 3 at the New York Semantic Technology & Business Conference (#SemTechBiz), a panel of experts will tackle the issue of how Semantic Web technologies are rapidly changing the landscape of Search Engine Optimization. The panel, titled “The Semantic Web Has Killed SEO. Long Live SEO.,” is made up of Aaron Bradley, David Amerland, Barbara Starr, Duane Forrester, and Mike Arnesen.
The session will address numerous issues at the intersection of Semantic Web and SEO. As the description reads, “From rich snippets to the Google Knowledge Graph to Bing Snapshots, semantic technology has transformed the look, feel and functionality of search engines.”
Have these changes undermined the ways in which websites are optimized for search, effectively “killing” SEO? Or are tried-and-true SEO tactics still effective? And what does the future hold for SEO in a semantic world?
Paul Bruemmer of Search Engine Land recently wrote, “Imagine the future of SEO — a future in which you forget about using keywords or their synonyms multiple times on a page. In the future, this will be obsolete. Search engines of the future will provide users with answers to their queries by internally verifying validated data that link to trusted documents. To optimize websites for search in the future, SEOs will need to create relevant, machine-recognizable ‘entities’ on webpages that answer well-refined, focused or narrowed queries. To create these entities, SEOs will use semantic Web technology and structured data. This allows search engines to better understand the page content and thus display valid search results/answers for each query.”
Marketers, SEO experts and businesses not yet on board with retooling their approaches to the new world of semantic technology and semantic search need to seriously rethink their positions.
Why? Check out the Q&A below with writer, speaker and analyst David Amerland, author of the new book Google™ Semantic Search: Search Engine Optimization (SEO) Techniques That Get Your Company More Traffic, Increase Brand Impact, and Amplify Your Online Presence. Amerland also will be participating in the session “The Semantic Web Has Killed SEO. Long Live SEO.” at the Semantic Technology & Business Conference in New York City in October.
Semantic Web Blog: What was your motivation for writing Google Semantic Search?
Amerland: After working as a chemical engineer who wrote pieces for newspapers, and in cultural and business journalism, I became a communications director for a U.K. blue chip company, and part of my role was overseeing the change of taking a massive company from the 19th century, where it was stuck, into the 21st century. Part of that was to create a web presence. And in different capacities I’ve guided other web companies. So I have seen the things I talk about around marketing in action.
I want to demystify SEO. I hate things to be cloaked in mystique. When there’s a mystique around things, you do away with everything from comparison metrics to the opportunity to have best practices. That’s really bad for business. So that’s my motivation. Just as I used to demystify science in my early days as a journalist, I’m trying to open up SEO as it is today as much as possible.
David Amerland, author of Google Semantic Search and speaker at the upcoming Semantic Technology & Business Conference in New York, has given his take on Microsoft’s acquisition of Nokia’s Devices & Services division. In his analysis, he talks about the Semantic Web and how the company that stands to lose the most in this deal is neither Microsoft nor Nokia, but Yahoo!
Amerland posits, “In the semantic web there are specific vectors driving growth that revolve around relevance and the end-user experience. In order to guarantee both you need a means to constantly acquire quality data and control the environment. Apple gets this to the extent that it locks out virtually all third-party providers from its iPhones and iOS, Facebook got it, which is why it launched its own app designed to help it take over users’ phones, and Google gets it, having recently launched Moto X, in addition to the Android environment being present in many third-party phones.”
Common Crawl, the non-profit organization creating a repository of openly and freely accessible web crawl data, is getting a present from search engine provider blekko, which is donating its metadata on search engine ranking for 140 million websites and 22 billion webpages to Common Crawl.
“The blekko data donation is a huge benefit to Common Crawl,” Common Crawl director Lisa Green told The Semantic Web Blog. “Knowing what the blekko team is crawling and how they rate those pages allows us to improve our crawler and enrich our corpus for high-value webpages.”
Structured data makes the Web go around. Search engines love it when webmasters mark up page content. Google’s rich snippets, for instance, leverage sites’ use of microdata (the preferred format), RDFa, or microformats: This makes it possible to highlight in a few lines specific types of content in search results, to give users some insight about what’s on the page and its relationship to their queries – prep time for a recipe, for instance.
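To give a rough sense of what that markup looks like, here is a sketch of a hypothetical recipe page using schema.org microdata. The property names (itemtype, name, prepTime, aggregateRating) are real schema.org vocabulary of the kind rich snippets draw on; the page content itself is invented for illustration.

```html
<!-- Hypothetical recipe page marked up with schema.org microdata.
     A crawler that understands microdata can extract the name,
     prep time, and rating for display in a rich snippet. -->
<div itemscope itemtype="http://schema.org/Recipe">
  <h1 itemprop="name">Weeknight Tomato Soup</h1>
  <!-- prepTime uses an ISO 8601 duration: PT15M means 15 minutes -->
  <meta itemprop="prepTime" content="PT15M">Prep time: 15 minutes
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span>/5
    by <span itemprop="ratingCount">38</span> reviewers
  </div>
</div>
```

The visible text is unchanged for human readers; the itemprop attributes are what lets a search engine tie each snippet of text to a machine-readable property.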
Plenty of websites generated from structured data haven’t added HTML markup to their pages, though, so they aren’t getting the benefits that come with search engines understanding the information on those web pages.
Maybe that will change, now that Google has introduced Data Highlighter, an easy way for webmasters to tell its search engine about the structured data behind their web pages. A video posted by Google product management director Jack Menzel gives the snapshot: “Data Highlighter is a point-and-click tool that allows any webmaster to show Google the patterns of structured data on their pages without modifying the pages themselves,” he says.
With Thanksgiving Day, Black Friday and Small Business Saturday behind us, and Cyber Monday right in front of us, it is clear the holiday season is in full swing. Apparently, retailers – both online and real-world – are doing pretty well as a group when it comes to the sales they’ve racked up.
Reports have it that e-commerce topped the $1 billion mark for Black Friday in the U.S. for the first time this year, with Amazon, Walmart, Best Buy, Target and Apple taking honors as the most visited online stores, according to comScore. Consumers spent $11.2 billion at stores across the U.S. on Black Friday, said ShopperTrak – down from last year, a dip probably driven by more people heading out to more stores for deals that began on Thursday night. The National Retail Federation put total spending over the four-day weekend at a record $59.1 billion, up 13 percent from $52.4 billion last year.
Not surprisingly, semantic technology wants in on the shopping action. Social intelligence vendor NetBase, for instance, just launched a new online tool that analyzes the web for mentions of the 10 top retailers to show the mood of shoppers flocking to those sources. The Mood Meter, which media outlets and others can embed in their sites, ranks the 10 brands based on sentiment unearthed with the help of its natural language processing technology.