Posts Tagged ‘research’
Ben Woods of The Next Web reports, “Google has joined forces with the University of Oxford in the UK in order to better study the potential of artificial intelligence (AI) in the areas of image recognition and natural language processing. The hope is that by partnering with an esteemed academic institution, research on its DeepMind project will progress more rapidly than it would by going it alone. In total, Google has hired seven individuals (who also happen to be world experts in deep learning for natural language understanding), three of whom will remain professors holding joint appointments at Oxford University.” Read more
Derrick Harris of GigaOM reports, “Researchers from the University of California, Irvine, have published a paper demonstrating the effectiveness of deep learning in helping discover exotic particles such as Higgs bosons and supersymmetric particles. The research, which was published in Nature Communications, found that modern approaches to deep neural networks might be significantly more accurate than the types of machine learning scientists traditionally use for particle discovery and might also save scientists a lot of work. To get a sense of how challenging particle discovery is, consider that a collider can produce 100 billion collisions per hour and only about 300 will produce a Higgs boson. Because the particles decay almost immediately, scientists can’t expressly identify them, but instead must analyze (and sometimes infer) the products of their decay.” Read more
Greg Goth of Health Data Management reports, “Researchers at the University of Washington and Microsoft have developed technology that uses natural language processing and machine learning to speed up the diagnosis of pneumonia in ICU patients. Bioinformatics professor Meliha Yetisgen and her colleagues at the university teamed up with Microsoft researcher Lucy Vanderwende on the project, called deCIPHER, using the Microsoft Research Statistical Parsing and Linguistic Analysis Toolkit (Splat).” Read more
Elizabeth Harrington of Free Beacon reports, “The federal government is studying how to use Twitter for surveillance on depressed people. The University of California, San Diego (UCSD) began a study financed by the National Institutes of Health last month that will provide ‘population level depression monitoring’ through the social media site. The project, ‘Utilizing Social Media as a Resource for Mental Health Surveillance,’ is costing taxpayers $82,800.” Read more
Three fully funded PhD studentships are being offered by the Web and Internet Science (WAIS) Research Group at the University of Southampton, scheduled to begin in October 2013. The posting states, “The PhD Studentships will be aligned with the EPSRC-funded collaborative interdisciplinary project SOCIAM: The Theory and Practice of Social Machines (http://sociam.org/). The Project’s core objective is to establish the research, methods, tools, networks and collaborations to allow us to understand social machines (computational entities governed by both computational and social processes) in order that they can be designed and deployed by the full range of potential beneficiaries. The work of the Project is highly integrative and students will be expected to take a holistic approach to their studies.”
The successful candidate is likely to have “A 1st or high 2:1 degree in a relevant discipline and/or upper second degree with a related Masters [and] experience and an interest in multi- and interdisciplinary research.”
Alex Armstrong of I Programmer reports, “Google has awarded over $1.2 million to support research in several areas of natural language understanding that relate to Google’s concept of the Knowledge Graph. Google has been investing heavily in machine learning and deep neural networks to improve web search. Supporting natural language understanding is also motivated by the need to further search technology. In the announcement of the awards Google Research Blog explains how natural language processing is integral to its Knowledge Graph technology that represents a shift ‘from strings to things’.” Read more
Research Data Alliance Sees Semantics As Key To Helping Research Communities Get The Most From Their Data
The Research Data Alliance (RDA) was recently formed with the goal of accelerating international data-driven innovation and discovery. It aims to get there by facilitating the sharing and exchange of research data, its use and re-use, standards harmonization, and discoverability.
Funded by the Australian Commonwealth Government through the Australian National Data Service, by the European Commission through the iCordi project under the 7th Framework Programme, and by the U.S. National Science Foundation through the RDA/US activity, the RDA began its work last August by establishing an international steering group. In March, it held its first plenary meeting and its official launch.
Dr. Francine Berman, Professor of Computer Science, Rensselaer Polytechnic Institute, is Chair of the Research Data Alliance/U.S. The Semantic Web Blog recently conducted a Q&A with her to learn more about RDA plans: