Posts Tagged ‘Fujitsu’

Fujitsu Supports Artificial Brain Project with Prep School Test Challenge

Tokyo, Nov 25, 2013 – (JCN Newswire) – Fujitsu Laboratories Ltd. and Japan’s National Institute of Informatics (NII) announced today that their entry in NII’s “artificial brain” project, known for short as the “Todai Robot Project” (“Can a Robot Pass the University of Tokyo (Todai) Entrance Exam?”), has taken a practice exam held by Yoyogi Seminar – Education Research Institute, a leading Japanese preparatory school.

Under the Todai Robot Project, Fujitsu Laboratories has been conducting joint research and participating as a core member of the mathematics team. The overall project, led by NII professor Noriko Arai, commenced in 2011 with the goal of enabling an artificial brain to score high marks on the test administered by the National Center for University Entrance Examinations (the “Center Test”) by 2016, and to cross the threshold required for admission to the University of Tokyo by 2021. Read more

fluidOps Announces Data Management Solution for Fujitsu vShape Reference Architectures

Walldorf, Germany (PRWEB) October 11, 2013 — fluid Operations, a leading provider of cloud and data management solutions based on semantic technologies, and Fujitsu, the leading Japanese information and communication technology (ICT) company, are collaborating to develop a joint solution that allows enterprises to transform legacy, silo-based IT environments into agile, automated infrastructures. The solution aims to provide unified application and service delivery and to enable agile responses to rapidly changing business demands. Read more

Fujitsu Looks to Ireland for the Future of Linked Data, Semantic Web

Carmel Doyle of Silicon Republic reports, “Fujitsu Laboratories is set to engage in a series of collaborative Irish research projects over the next three years to test technologies in order to steer the ICT company’s future strategy. Fujitsu Ireland CEO Regina Moran said collaborative R&D has the scope to deliver a seven-fold return on an initial investment. Moran was speaking at an innovation conference organised by Fujitsu that kicked off in Croke Park in Dublin this morning. Industry strategists and academics are convening at the one-day event to thrash out ideas on ways of maximising R&D collaboration in order to translate research activity into commercial outputs.” Read more

It’s Time To Get Formal With Linked Data

It’s time to get real with Linked Data. The World Wide Web Consortium’s Linked Data Platform Working Group, convened almost a year ago, is on the case: it expects to publish a Last Call working draft of the specification by June and to reach a final Recommendation, the last stage of the W3C’s standards process, by early next year.

“The Linked Data Platform is expanding on the concept [originally] put forward by Tim Berners-Lee on his web site, to turn it into a specification,” says Arnaud J. Le Hors, co-chair of the working group and IBM’s Linked Data Standards Lead. He will address the work at this session during next month’s SemTechBiz conference in San Francisco.

Why the need to formalize Linked Data? While there is a fairly significant list of W3C standards around the Semantic Web, the more loosely defined Linked Data has led to an environment where interoperability suffers. That’s because people are left to solve the same problems, such as those around publishing and retrieving data, over and over again, and they take different paths to get there, Le Hors says. The guides that are out there are just that, guides, which people are free to use or ignore, if they can even find them – something that in itself isn’t easy for those who aren’t well-informed members of the community, he says.

The Linked Data Platform extends the model to provide the industry with a formal definition for read-write access to Linked Data; it mandates publishing data in a standard format, RDF, and using a standard protocol, HTTP, “which is completely symmetrical with the way the web works today, with HTML and HTTP,” Le Hors says.
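To make the read-write model concrete, here is a minimal sketch of what such an interaction could look like: publishing and retrieving RDF (Turtle) over plain HTTP, in the spirit of the draft specification. The container URL, resource data, and vocabulary below are hypothetical examples, not part of the specification itself, and the sketch uses Python’s requests library rather than any particular Linked Data toolkit.

```python
import requests

# Hypothetical Linked Data Platform container exposed by some server.
CONTAINER = "http://example.org/ldp/people/"

# Create a resource by POSTing RDF (Turtle) to the container.
new_person = """
@prefix foaf: <http://xmlns.com/foaf/0.1/> .
<> a foaf:Person ;
   foaf:name "Ada Lovelace" .
"""
resp = requests.post(
    CONTAINER,
    data=new_person,
    headers={"Content-Type": "text/turtle", "Slug": "ada"},
)
resource_url = resp.headers["Location"]  # the server assigns the new resource's URI

# Read it back with an ordinary HTTP GET, asking for RDF.
print(requests.get(resource_url, headers={"Accept": "text/turtle"}).text)

# Updates and deletes use the same HTTP verbs the rest of the web already uses.
requests.put(resource_url, data=new_person, headers={"Content-Type": "text/turtle"})
requests.delete(resource_url)
```

The point of a formal platform is that any conforming client and server could interoperate on exchanges like this without bespoke agreements.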

Read more

Fujitsu, DERI Set to Unveil Details of Linked Data Interface

Carmel Doyle of Silicon Republic recently covered some exciting developments that have arisen from the recent collaboration between Fujitsu Labs and DERI: “This week, Fujitsu is presenting the first results from its research collaboration with DERI at the XBRL26 conference taking place in Dublin. Speaking this afternoon, Fujitsu Ireland’s head of research Anthony McCauley said the team has been pioneering an interface that sits on linked data. ’We’ve been looking at that not just from a research perspective but also in terms of the real commercial opportunities that linked data can provide,’ he said.”

Doyle continues, “The big challenge at the moment for data miners is that data sets are dispersed in different locations.” Read more

Addressing Price-Performance And Curation Issues For Big Data Work In The Cloud

The cloud’s role in processing big semantic data sets was highlighted in early April, when DERI and Fujitsu Laboratories announced a new technology for storing and querying Linked Open Data on a cloud-based platform (see our story here).

The cloud conversation, with storage as one key discussion point, will continue to be an active one in Big Data circles, whether users are working with massive, connected Linked Data sets or trying to run NLP across the Twitter firehose. CloudSigma, for example, recently disclosed that it is using an all solid-state drive (SSD) setup for its public cloud offering, which lets users purchase CPU, RAM, storage and bandwidth independently. SSDs, says CEO Robert Jenkins, avoid the problem spinning disks have with the randomized, multi-tenant access patterns of a public cloud, which lead to storage bottlenecks and curb performance.

That, combined with the company’s approach of letting customers size virtual machine resources as they like and tune exposed advanced hypervisor settings for their particular applications, brings public cloud infrastructure closer to what companies can get out of private cloud environments, he says, and at a price-performance advantage.

Read more

DERI and Fujitsu Form Research Alliance in Ireland

Tech Central reports, “Fujitsu Ireland and local scientists have formed a research alliance to explore the commercial opportunities that can be exploited through the semantic web. The semantic web Big Data project is a collaborative movement led by the World Wide Web Consortium (W3C) that promotes common formats for data on the web. By encouraging the inclusion of semantic content in web pages, the semantic web aims to convert the current web of unstructured documents into a ‘web of data’.” Read more

DERI and Fujitsu Team On Research Program

The Digital Enterprise Research Institute (DERI) is kicking off a project with Fujitsu Laboratories Ltd. in Japan to build a large-scale RDF store in the cloud capable of processing hundreds of billions of triples. The idea, says DERI research fellow Dr. Michael Hausenblas, “is to build up a platform that allows you to process and convert any kind of data” — from relational databases to LDAP record-based, directory-like data, but also streaming sources of data, such as sensors and even the Twitter firehose.
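As a rough illustration of the kind of conversion such a platform would automate, the sketch below maps a single relational-style record to RDF triples using Python’s rdflib; the table, vocabulary, and URIs are invented for the example and are not DERI’s or Fujitsu’s actual pipeline.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import FOAF, XSD

# A hypothetical row from a relational "employees" table.
row = {"id": 42, "name": "Jane Doe", "dept": "Research"}

EX = Namespace("http://example.org/company/")

g = Graph()
subject = EX[f"employee/{row['id']}"]

# Each column value becomes a triple about the row's subject.
g.add((subject, FOAF.name, Literal(row["name"])))
g.add((subject, EX.department, Literal(row["dept"])))
g.add((subject, EX.employeeId, Literal(row["id"], datatype=XSD.integer)))

# Serialize as Turtle; a store of the scale described would ingest
# billions of such triples and answer queries over them.
print(g.serialize(format="turtle"))
```

Streaming sources such as sensor feeds or the Twitter firehose would follow the same pattern, with incoming records mapped to triples as they arrive.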

The project has defined eight different potential enterprise use cases for such a platform, ranging from knowledge-sharing in health care and life science to dashboards in financial services informed by XBRL data. “Once the platform is there we will implement at least a couple of these use cases on business requirements, and essentially we are going to see which are the most promising for business units,” Hausenblas says.

Read more