We’ve seen the big three players in the search engine space honing their semantic edge, and we may soon see one of them deploying semantic technology to sharpen image searches, too. nachofoto says it is in discussions with one of the giants (which it declines to name for now) that could result in a licensing deal to bring its ‘semantic, time-based vertical image search engine,’ currently in beta, to the big time.

CTO Anuj Agarwal and CEO Vineet Agarwal, the co-founding brothers behind nachofoto and its focus on delivering the most recent image results, decline to say which major search engine is involved.

It would, of course, be pure speculation to draw any conclusions from the fact that the Agarwals both consider the best analogy for what they’re doing to be “Powerset for image search” (Microsoft acquired that semantic search engine in 2008, and it’s believed to power Bing’s Wikipedia results).

Or from Yahoo Consumer Products’ Search VP Larry Cornett recently joining nachofoto’s advisory board. That board, by the way, also includes LinkedIn co-founder Konstantin Guericke. “When we had our first conference call with [the advisory board] to explain our technology, they were very excited about it and its application within existing search engines,” says Vineet Agarwal.

Why build when you can buy? “Google of course can do anything,” says Anuj Agarwal. “But we have a solution that is working now and that could be used by Google, Yahoo and Bing. Even if Google tried, there is no guarantee they’d get the same result. The algorithms we use are very hard to reverse engineer,” he says of nachofoto’s user-intent-focused image search, which has two U.S. patents pending and is the outcome of three years of research. The Agarwals say getting this far has meant overcoming a host of practical problems, from ensuring that the image included with a piece of content actually shows the subject in question to handling images that are tagged incorrectly to begin with.

Further applications, the Agarwals say, would be in verticals like Facebook or New York Times search, where fresh image results would be top of mind for users. When a user searches nachofoto for, oh, say, Chris Brown, the image results that appear in the main window are time-stamped and of recent vintage, within the last 24 hours in this instance. The related searches at left, which break things down by topic, time or event, represent the core product and let users drill down with even more accuracy; meltdown site Good Morning America sits right at the top of that list for Chris Brown.

Users can also navigate a timeline to view images by the date they prefer: oldest, newest, or the month with the most images. In a just-for-fun comparison, the first image in the lineup that a Google image search returns is a smiling Chris with a puppy on his shoulder.

“Existing image search engines use search patterns to find suggestions,” Anuj Agarwal explains. “What we do is find suggestions based on fresh content on the web. This is the basic difference between our technology and that of existing image search engines.”

nachofoto crawls the web in real time, collecting data from recent news and blog postings that then informs its image search results. Its algorithms track both content and update frequency across these sites to determine which sources are freshest, and parse out the meaning of the content; that meaning is stored, sentence by sentence, in its semantic index, Anuj Agarwal says.
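As comes up below, nachofoto’s stack is built on Apache Lucene. Purely as an illustration, and not nachofoto’s actual schema, here is a minimal sketch of what indexing a single freshly crawled page into Lucene could look like, with the crawl timestamp kept alongside the text so recency can later drive ranking (the field names body, imageUrl and crawledAt are invented for the example):

```java
// Hypothetical sketch: index one freshly crawled page so recency can drive ranking.
// Field names and schema are invented for illustration, not nachofoto's design.
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.document.*;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.IndexWriterConfig;
import org.apache.lucene.store.FSDirectory;

import java.nio.file.Paths;

public class FreshImageIndexer {
    public static void main(String[] args) throws Exception {
        try (IndexWriter writer = new IndexWriter(
                FSDirectory.open(Paths.get("image-index")),
                new IndexWriterConfig(new StandardAnalyzer()))) {

            Document doc = new Document();
            // Full text of the article or blog post the image appeared in
            doc.add(new TextField("body",
                    "Chris Brown storms off the Good Morning America set", Field.Store.YES));
            // URL of the associated image, stored for retrieval but not searched
            doc.add(new StoredField("imageUrl", "http://example.com/photos/cb.jpg"));
            // Crawl time, indexed for range queries and stored as doc values for sorting
            long crawledAt = System.currentTimeMillis();
            doc.add(new LongPoint("crawledAt", crawledAt));
            doc.add(new NumericDocValuesField("crawledAt", crawledAt));

            writer.addDocument(doc);
        }
    }
}
```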

When a user makes a query, “we first find out the meaning of what the user is looking for, then we query our semantic index to get the freshest and most relevant results,” he says. “We’re finding the meaning of the user query and finding the meaning of available content on the web and storing it in the database,” which is now approaching a terabyte. nachofoto uses Apache Lucene for searching and indexing, with a proprietary layer built on top of it to support its particular requirements.
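The query side, again sketched as plain Lucene rather than the proprietary semantic layer the Agarwals describe, would then parse the user’s query, filter to content crawled in the last 24 hours and sort newest first, using the same invented field names as above:

```java
// Hypothetical sketch: query the index for fresh matches, newest first.
import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.apache.lucene.document.LongPoint;
import org.apache.lucene.index.DirectoryReader;
import org.apache.lucene.queryparser.classic.QueryParser;
import org.apache.lucene.search.*;
import org.apache.lucene.store.FSDirectory;

import java.nio.file.Paths;

public class FreshImageSearcher {
    public static void main(String[] args) throws Exception {
        try (DirectoryReader reader = DirectoryReader.open(
                FSDirectory.open(Paths.get("image-index")))) {

            IndexSearcher searcher = new IndexSearcher(reader);

            long now = System.currentTimeMillis();
            long dayAgo = now - 24L * 60 * 60 * 1000;

            // Combine the text query with a "last 24 hours" freshness filter
            Query text = new QueryParser("body", new StandardAnalyzer()).parse("chris brown");
            Query fresh = LongPoint.newRangeQuery("crawledAt", dayAgo, now);
            Query query = new BooleanQuery.Builder()
                    .add(text, BooleanClause.Occur.MUST)
                    .add(fresh, BooleanClause.Occur.FILTER)
                    .build();

            // Sort by crawl time, most recent documents first
            Sort byRecency = new Sort(new SortField("crawledAt", SortField.Type.LONG, true));
            TopDocs hits = searcher.search(query, 10, byRecency);

            for (ScoreDoc hit : hits.scoreDocs) {
                System.out.println(searcher.doc(hit.doc).get("imageUrl"));
            }
        }
    }
}
```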

Vineet Agarwal says nachofoto serves both the users who know they want an image in a particular context (Justin Bieber’s wax figure at Madame Tussauds, anyone?) and the casual browsers. “For these kinds of users, if a search engine can point them to browse content in a particular manner, for example organized on a timeline, there’s a high probability that a user who gets access to organized content will become a returning user.” And both present a profitable opportunity for the major search engines to increase page views, and revenue, from users clicking on these image search links.

So, he says, “right now we are pursuing opportunities with existing search engines and we’re fully focused on getting that nailed down.”