Posts Tagged ‘data management’

Michael C. Daconta of GCN recently wrote, “Recent articles about Pandora’s and Netflix’s use of big data illustrate why government IT managers should not just focus on data management, data collection and even big data processing. They need to shift the focus from the data producer to the data consumer… In both these cases, we see big data is the stepping stone for consumer-centric information production. The Netflix micro-genres are not the trove of big data on movie viewing, or the movie data itself. Instead, it is useful information mined from that data. Likewise, the data containing Pandora users’ demographics and preferences create a way for advertisers to target buyers.” Read more
Industry leaders in sectors including banking and financial services appear to have high hopes for semantic technology. They’re thinking about FIBO (the Financial Industry Business Ontology) and leveraging semantic technology for more traditional data integration and analytics projects. At Cognizant, Thomas Kelly, a director in its Enterprise Information Management practice – and the author of this white paper on How Semantic Technology Drives Agile Business – sees it as a positive development that Fortune 500 clients like these “are maturing in their use of semantic technology, from a project focus to more enterprise initiatives.”
The interest in FIBO, he says, is representative of a broader interest across industries in leveraging industry ontologies as mechanisms to help companies better standardize, align and learn from the output of industry-wide efforts. The attention that industry analysts, including Gartner, have paid to the semantic web in the last year – not to mention regulators beginning to consider its use in sharing information on a regulatory basis – has helped increase interest among commercial organizations, Kelly notes. That’s also evident in the life sciences sector, as another example, with the efforts of the FDA/PhUSE Semantic Technology Working Group Project to represent a draft set of existing CDISC standards in RDF.
The pickup in attention to many things semantic ties to the different perspectives that organizations need to manage about their data, which include “how they currently think of their data, how it is currently perceived in managing business operations; and where they are looking to go in the future that makes it more inclusive of what’s going on in the world outside their walls – that is, how the rest of the industry looks at this data and uses it to support their business processes,” he says.
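What makes RDF attractive for representing industry standards like CDISC or FIBO is the uniformity of its subject–predicate–object triple model: every fact, from any standard, is queryable the same way. As a toy illustration only – using plain Python tuples rather than an RDF library, and hypothetical example URIs rather than the working group’s actual vocabulary – a few standard terms might be modeled like this:

```python
# Toy sketch of RDF's subject-predicate-object triple model.
# The CDISC namespace URI below is a hypothetical stand-in,
# not the FDA/PhUSE working group's actual vocabulary.
CDISC = "http://example.org/cdisc#"                 # assumed namespace
RDFS = "http://www.w3.org/2000/01/rdf-schema#"      # real RDFS namespace

triples = {
    (CDISC + "SystolicBP", RDFS + "label", "Systolic Blood Pressure"),
    (CDISC + "SystolicBP", RDFS + "subClassOf", CDISC + "VitalSign"),
    (CDISC + "DiastolicBP", RDFS + "subClassOf", CDISC + "VitalSign"),
}

def objects(subject, predicate):
    """Return all objects for a given subject and predicate."""
    return {o for s, p, o in triples if s == subject and p == predicate}

# Any triple from any source can be queried with the same pattern --
# that uniformity is what enables cross-standard alignment.
print(objects(CDISC + "SystolicBP", RDFS + "label"))
```

Because triples from different standards share one data model, aligning two industry vocabularies reduces to adding more triples that link their terms.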
Jeni Tennison recently wrote a clever article for the Open Data Institute on the five stages of data grief. She writes, “As organisations come to recognise how important and useful data could be, they start to think about using the data that they have been collecting in new ways. Often data has been collected over many years as a matter of routine, to drive specific processes or sometimes just for the sake of it. Suddenly that data is repurposed. It is probed, analysed and visualised in ways that haven’t been tried before. Data analysts have a maxim: ‘If you don’t think you have a quality problem with your data, you haven’t looked at it yet.’ …In our last ODI Board meeting, Sir Tim Berners-Lee suggested that the data curators need to go through something like the five stages of grief described by the Kübler-Ross model. So here is an outline of what that looks like.” Read more
Medallia Expands Offering for Business-to-Business Companies With 360-Degree Account Management Reporting
PALO ALTO, California, November 12, 2013 — Medallia®, the global leader in SaaS Customer Experience Management (CEM) solutions, today announced a new release of its Business-to-Business (B2B) offering. The release is designed to increase key stakeholders’ visibility into account feedback across B2B organizations so they can improve the customer experience.
Bob Emmerson recently wrote an article on NoJitter.com in which he stated, “Sky-high guesstimates for the Internet of Things–10B devices by 2016 (Cisco), 50B by 2020 (Ericsson)–can overshadow the importance of this concept in corporate environments. In this context, the Internet of Corporate Things is a more meaningful term. General Electric calls it the Industrial Internet. But whatever term you employ, a serious, significant development is taking shape.” Read more
Walldorf, Germany (PRWEB) October 11, 2013 — fluid Operations, a leading provider of cloud and data management solutions based on semantic technologies, and Fujitsu, the leading Japanese information and communication technology (ICT) company, are collaborating to develop a joint solution that allows enterprises to transform legacy, silo-based IT environments into agile, automated infrastructures. The solution aims to provide unified application and service delivery and to allow agile responses to rapidly changing business demands. Read more
Gary Hamilton of GovHealthIT recently wrote, “Today, the acquisition of patient information for population health management is typically done through Continuity of Care Documents (CCDs). Although the exchange of health information is possible via CCDs, the amount of information they contain can be overwhelming. As such, poring over CCDs to find information relevant to patient populations can be unwieldy and time consuming. With providers challenged to manage information in just one CCD, how can they hope to use these documents to effectively influence care at the population level? The key is to look for ways to use technology to target specific patient information, pinpoint new and relevant information and alert both patients and providers when updated information is available.” Read more
Hadoop is on almost every enterprise’s radar – even if they’re not yet actively engaged with the platform and its advantages for Big Data efforts. Analyst firm IDC earlier this year said the market for software related to the Hadoop and MapReduce programming frameworks for large-scale data analysis will have a compound annual growth rate of more than sixty percent between 2011 and 2016, rising from $77 million to more than $812 million.
Yet challenges remain in leveraging the full possibilities of Hadoop, an Apache Software Foundation open source project, especially as it relates to empowering the data scientist. Hadoop is composed of two sub-projects: HDFS, a distributed file system built on a cluster of commodity hardware so that data stored on any node can be shared across all the servers, and the MapReduce framework for processing the data stored in those files.
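The division of labor described above – HDFS storing the data, MapReduce processing it – can be illustrated with the canonical MapReduce example, word counting. The following is a minimal single-process sketch in plain Python, not an actual Hadoop job: a real job distributes the map and reduce tasks across the cluster and reads its input from HDFS rather than an in-memory list.

```python
from collections import defaultdict

def map_phase(documents):
    """Map: emit a (word, 1) pair for every word in every document.
    In Hadoop, many mapper tasks run this step in parallel."""
    for doc in documents:
        for word in doc.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    """Shuffle/sort by key, then reduce: sum the counts per word.
    In Hadoop, pairs with the same key are routed to the same reducer."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

docs = ["big data big plans", "data drives plans"]
print(reduce_phase(map_phase(docs)))
# -> {'big': 2, 'data': 2, 'plans': 2, 'drives': 1}
```

The model’s appeal is that the same two functions scale from this toy to terabytes: the framework, not the programmer, handles partitioning, distribution and fault tolerance.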
Semantic technology can help solve many of the challenges, Michael A. Lang Jr., VP, Director of Ontology Engineering Services at Revelytix, Inc., told an audience gathered at the Semantic Technology & Business Conference in New York City yesterday.
Philip Connolly of the Daily Business Post recently profiled Galway-based semantic start-up SindiceTech. According to Connolly, “While many people never look underneath the bonnet of the internet, web technology never stands still. Many people see the semantic web as the next step, a technology that allows machines to understand the meaning of information on the web. Most of us online will probably not notice semantic web technologies running in the background, but the technologies could lead to an improvement in the relevance of the data returned through search engines for both individuals and enterprises using large amounts of data.” Read more