Nathan Robertson of Business2Community recently wrote, “Now that business owners are starting to take social media seriously, they are finding that understanding the information that is coming to them at lightning speed has a way of getting away from them before they can make any sense of it. This new form of big data means the creation of a new way of aggregating and breaking down data so businesses can understand what consumers and potential consumers are saying about them, who is buying from them, who isn’t—and why not.”
The e-discovery market is a growing one: Gartner this year estimated that electronic data discovery software sales will reach $2.9 billion by 2017. The same research firm recently positioned where vendors fall in that space with the release of its Magic Quadrant for E-Discovery Software.
As recounted here, semantic technology is becoming an increasingly important part of the package; semantic inventory tools, for example, allow for a better understanding of the vocabulary of litigation and support the iterative process of searching for relevant evidence.
Facebook IPO not panning out for you? Well, there are other opportunities out there where you can get in on the ground floor for a lot less.
Take a Kickstarter project dubbed Gooey Search – it’s trying to raise at least $125,000 by June 8 for its consumer-facing technology, based on latent semantic analysis (LSA). Its goal is to deliver the best and most accurate Google search results in what it calls a Gooey Graph, a real-time diagram of discovered network concepts, while keeping user privacy intact.
With the recent announcement of Google’s Knowledge Graph, do we need another way to probe the leading search engine? Ed Heinbockel, founder, president and CEO of Visual Purple, is betting we do. “To me [what Google’s done] validates the approach we’ve gone down in terms of visualizing and letting you navigate search. It’s the same direction. The question is can we provide value to that equation,” he says.
Where he sees the opportunity: “What drives us a lot is that we’re very concerned about the direction that privacy and the Internet are going,” he says. “And we think the quality of the results they give back to users isn’t in the order it should be.”
Military personnel are likely familiar with The Millennium Cohort study, which began in the late 1990s to evaluate the effect of service on long-term health. In addition to the service that thousands of men and women in uniform already have given their country, many of those who participated in the 2001-2003 and 2004-2006 survey cycles also may contribute to advancing the understanding of qualitative survey results that may further epidemiological research.
Researchers have released the results of their application of latent semantic analysis to an open-ended question found on The Millennium Cohort study. The question asked respondents to discuss, in as much detail as they liked, any additional health concerns not otherwise covered by the survey. In October the researchers published the report, Application of Latent Semantic Analysis for Open-Ended Responses in a Large, Epidemiologic Study, which found significantly lower self-reported general health among the group of almost 28,000 Millennium Cohort respondents who answered the open-ended question, compared to the nearly 80,000 participants who did not.
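For readers unfamiliar with the technique, latent semantic analysis works by taking a truncated singular value decomposition of a term-document matrix, so that free-text responses about the same underlying topic land close together in a low-dimensional “concept” space even when they use different words. The following is a minimal sketch using NumPy; the toy responses are invented for illustration and are not drawn from the Millennium Cohort data.

```python
import numpy as np

# Hypothetical short free-text "responses" (illustrative only).
docs = [
    "back pain and joint pain",
    "chronic joint pain in the back",
    "trouble sleeping and fatigue",
    "fatigue and poor sleep",
]

# Build a simple term-document count matrix A (terms x documents).
vocab = sorted({w for d in docs for w in d.split()})
index = {w: i for i, w in enumerate(vocab)}
A = np.zeros((len(vocab), len(docs)))
for j, d in enumerate(docs):
    for w in d.split():
        A[index[w], j] += 1

# LSA: truncated SVD of the term-document matrix.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2  # number of latent "concepts" to keep
doc_vecs = (np.diag(s[:k]) @ Vt[:k]).T  # documents projected into concept space

def cos(a, b):
    """Cosine similarity between two concept-space vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Responses about the same concern score as highly similar,
# even though their exact wording differs.
print(cos(doc_vecs[0], doc_vecs[1]))  # two pain responses: high similarity
print(cos(doc_vecs[0], doc_vecs[2]))  # pain vs. sleep: low similarity
```

In a study-scale application the same idea applies, only with tens of thousands of responses, a weighted (e.g. tf-idf) matrix, and more retained dimensions; the concept-space vectors can then be clustered or correlated with other survey measures.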