Rick Delgado recently wrote on Mike2.0, “When you get right down to it, computer security is all about being able to analyze the data. A company’s security is largely dependent on the amount of data analysis they’re capable of, along with the quality of that data. A company that can collect a lot of data at once but doesn’t have the means to analyze it properly for threats won’t get very far. The same goes for a business with excellent analytic tools but without the resources to gather and store that information. These facts are very important because without a lot of data, machine learning simply can’t be as effective.” Read more
Andrew Osborne, CTO of GS1 UK recently shared an overview of how the non-profit is leveraging the Semantic Web to improve customer experiences. He writes, “For those of you unfamiliar with what GS1 actually does, we are a not-for-profit standards development organisation. Put simply, our role is to define data structures and how these are used to identify things, a role we have been performing since the 1970s. We provide a series of ‘keys’ for industry which identify various types of entity (products, locations, assets and so on) and which have highly developed allocation rules. We have also defined product attributes for bar coding (the application identifier standards), have over 1,000 product attributes defined for synchronisation in the Global Data Synchronisation Network and an extensive Global Product Classification that is used to categorise products. For visibility systems we have a standard ‘Core Business Vocabulary.’” Read more
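Those “highly developed allocation rules” include a standard mod-10 check digit that every GS1 key (GTIN, SSCC, GLN and so on) carries as its final digit. As a small sketch of how that validation works, here is the published GS1 check-digit algorithm in Python; the sample GTIN-13 below is a commonly used illustrative number, not an assigned product code:

```python
def gs1_check_digit(data_digits: str) -> int:
    """Compute the GS1 mod-10 check digit for the data portion of a key
    (e.g. the first 12 digits of a GTIN-13)."""
    total = 0
    # Weights alternate 3, 1, 3, 1, ... starting from the rightmost data digit.
    for position, digit in enumerate(reversed(data_digits)):
        weight = 3 if position % 2 == 0 else 1
        total += int(digit) * weight
    return (10 - total % 10) % 10

def is_valid_gs1_key(key: str) -> bool:
    """Validate a full numeric GS1 key: the last digit must equal the
    check digit computed over the preceding digits."""
    return key.isdigit() and gs1_check_digit(key[:-1]) == int(key[-1])
```

For example, `is_valid_gs1_key("4006381333931")` returns `True`, while changing the final digit makes it fail. Because the weighting runs from the right, the same function covers GTIN-8, GTIN-12, GTIN-13, GTIN-14 and SSCC keys without special cases.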
SAN DIEGO — Teradata (NYSE: TDC), the analytic data platforms, marketing applications, and services company, today announced two acquisitions that accelerate the growth of its big data capabilities.
On July 16th, Teradata acquired assets of Revelytix, a leader in information management products for big data with unique metadata management technology and deep expertise in integrating information across the enterprise. On July 17th, Teradata acquired assets of Hadapt, including experienced big data technologists and intellectual property. Read more
Seth Grimes, president and principal consultant of Alta Plana Corp. and founding chair of the Sentiment Analysis Symposium, has put together a thorough new report, Text Analytics 2014: User Perspectives on Solutions and Providers. Among the interesting findings of the report is that “growth in text analytics, as a vendor market category, has slackened, even while adoption of text analytics, as a technique, has continued to expand rapidly.”
Grimes explains that in a fragmented market, spanning everything from text analytics services to solution-embedded technologies, the opportunities for users to practice text analytics are strong, but that increasingly text analytics is not the focal point of the solutions being leveraged.
Reflecting the diversity of options, respondents listed among their providers a number of open-source offerings such as Apache OpenNLP and GATE, API services such as AlchemyAPI and Semantria, and enterprise software solution and business suite providers like SAP. The word cloud above was generated by Alta Plana at Wordle.net to show how users responded to the question of companies they know provide text/content analytics functionality. Nearly 50 percent of users are likely to recommend their most important provider.
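At the simplest end of the spectrum those techniques need very little machinery. As a purely illustrative sketch of lexicon-based sentiment scoring, one of the basic text analytics techniques the report covers, here is a toy classifier; the word lists and scoring rule are invented for this example and do not represent any vendor named above:

```python
# Toy lexicon-based sentiment scorer. The lexicons are invented for
# illustration; real tools use far larger lexicons plus handling for
# negation, intensifiers, and context.
POSITIVE = {"great", "excellent", "love", "recommend", "fast"}
NEGATIVE = {"poor", "slow", "broken", "hate", "refund"}

def sentiment(text: str) -> str:
    """Label text by counting positive vs. negative lexicon hits."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"
```

Calling `sentiment("I love this excellent tool")` returns `"positive"`. Production systems such as those surveyed in the report layer linguistic processing (tokenization, negation scope, entity resolution) on top of this basic idea.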
Data-Intensive Businesses Turn to Infobright to Transform Machine-Generated Data into Business Improvements
TORONTO–(BUSINESS WIRE)–Infobright, the database analytics platform for the Internet of Things (IoT), today announced that Rez-1, Smart 421, and PayPoint, three data-heavy businesses in the financial services, logistics and transportation industries, selected Infobright’s Enterprise Edition to transform their machine-generated data into business insights. The Infobright analytic database platform enables customers to interrogate machine and IoT generated data to quickly identify the patterns that drive smarter business decisions. Read more
OpenText yesterday made its secure file sharing and synchronization product, Tempo Box, available for free to customers using its OpenText Content Suite enterprise information management tool.
“A lot of our customers have major concerns about employees sharing documents with cloud tools like Dropbox,” says Lubor Ptacek, VP of strategic marketing. They want documents to be available, synched, and sharable across all their devices, but using such services can create security and compliance problems. By deploying Tempo Box on top of their existing infrastructure, at no charge for all internal employees and any external parties they need to share content with, companies get a seamless, cost-effective way to share files in the cloud without compromising security, records management requirements, or storage optimization, he says – “the things that enterprise customers care about, especially those operating in regulated environments.”
Among those capabilities is applying automatic content classification, which is usually required for records management reasons – for example, helping companies determine if a document is an employee record they must keep for five years or a tax record they have to hold for seven years. That under-the-hood classification engine is an outgrowth of OpenText’s acquisition a few years back of text mining, analytics and search company Nstein. Since the acquisition, says Ptacek, the company has been looking at ways to apply the technology to specific business problems and make it part of its applications.
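The five- and seven-year retention periods above come from the article; the classification step itself can be sketched as a simple rule-based lookup. The keyword rules below are hypothetical, invented for illustration – a production engine like the Nstein-derived one would use statistical text mining rather than keyword matching:

```python
# Hypothetical keyword-rule classifier mapping a document to a records
# retention period in years. Rules and categories are illustrative only.
RETENTION_YEARS = {"employee_record": 5, "tax_record": 7}

RULES = {
    "employee_record": {"employee", "payroll", "hire"},
    "tax_record": {"tax", "vat", "withholding"},
}

def classify(text):
    """Return the first category whose keyword set overlaps the text,
    or None if no rule matches."""
    words = set(text.lower().split())
    for label, keywords in RULES.items():
        if words & keywords:
            return label
    return None

def retention_years(text):
    """Years a document must be kept; 0 when unclassified."""
    return RETENTION_YEARS.get(classify(text), 0)
```

So `retention_years("annual tax withholding statement")` returns `7`, matching the tax-record example in the article.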
Jorge Garcia of Wired recently wrote, “IBM’s recent announcements of three new services based in Watson technology make it clear that there is pressure in the enterprise software space to incorporate new technologies, both in hardware and software, in order to keep pace with modern business. It seems we are approaching another turning point in technology where many concepts that were previously limited to academic research or very narrow industry niches are now being considered for mainstream enterprise software applications. Machine learning, along with many other disciplines within the field of artificial intelligence and cognitive systems, is gaining popularity, and it may in the not so distant future have a colossal impact on the software industry. This first part of my series on machine learning explores some basic concepts of the discipline and its potential for transforming the business intelligence and analytics space.” Read more
Data scientists can add another tool to their toolset today: GraphLab has launched GraphLab Create 1.0, which bundles up everything starting from tools for data cleaning and engineering through to state-of-the-art machine learning and predictive analytics capabilities.
Think of it, company execs say, as a single platform that data scientists or engineers can leverage to unleash their creativity in building new data products, enabling them to write code at scale on their own laptops. The driving concept behind the solution, they say, is to make large-scale machine learning and predictive analytics easy enough that companies won’t have to hire huge teams of data scientists and engineers or build the big hardware infrastructures that lie behind many of today’s Big Data-intensive products. And the data scientists and engineers who do use it won’t need to be experts in machine-learning algorithms – just experienced enough to write Python code.
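GraphLab Create’s own API is not shown here; as a generic illustration of the “few lines of Python” predictive workflow the company describes, here is a dependency-free one-variable least-squares fit with invented data:

```python
# Minimal one-variable least-squares regression in pure Python -- a
# stand-in for the "few lines of code" predictive workflow described
# above, not GraphLab Create's actual API.
def fit_line(xs, ys):
    """Return (slope, intercept) minimizing squared error."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

def predict(model, x):
    """Apply a fitted (slope, intercept) model to a new input."""
    slope, intercept = model
    return slope * x + intercept
```

Fitting `fit_line([1, 2, 3, 4], [2, 4, 6, 8])` yields a slope of 2 and intercept of 0, so `predict(model, 5)` returns `10.0`. Platforms like GraphLab Create wrap this kind of fit-then-predict loop behind scalable, production-oriented APIs.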
Kevin Casey of InformationWeek recently wrote, “Old-school organizations will fuel the next swell of data-driven initiatives in IT. So what’s in store for the early movers and, specifically, their big-data professionals? How will the data scientist and similar roles evolve? ‘The role is becoming bigger,’ said Olly Downs, chief scientist at big-data analytics firm Globys, in a recent interview. By bigger, he means in every way — what was once a niche is now, at least in some companies, a driving force.” Read more
July 7, 2014 – Wolters Kluwer Health, a leading global provider of information for healthcare professionals and students, announced today that Holy Name Medical Center (HNMC) has selected Health Language® to improve problem and diagnosis searches within its electronic health record (EHR). HNMC will use the Health Language Workflow-Enhancing Search solution to support encoding its problem lists in SNOMED-CT® for Stage 2 Meaningful Use and the transition to ICD-10. Read more
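Encoding a clinician-entered problem list in SNOMED CT amounts to normalizing free-text terms to concept codes. The sketch below is a hypothetical synonym-expanding lookup, not Health Language’s product; the two SNOMED CT codes shown are widely cited concept identifiers, but the synonym table is invented for illustration:

```python
# Hypothetical lookup from clinician-entered problem terms to SNOMED CT
# concept codes. The synonym table is invented for illustration; a real
# terminology service resolves far more variants and validates against
# the full SNOMED CT release.
SYNONYMS = {
    "htn": "hypertension",
    "high blood pressure": "hypertension",
    "sugar diabetes": "diabetes mellitus",
}

CONCEPTS = {
    "hypertension": "38341003",       # SNOMED CT: hypertensive disorder
    "diabetes mellitus": "73211009",  # SNOMED CT: diabetes mellitus
}

def encode_problem(term):
    """Map a free-text problem term to a SNOMED CT code, or None."""
    normalized = term.lower().strip()
    normalized = SYNONYMS.get(normalized, normalized)
    return CONCEPTS.get(normalized)
```

Here `encode_problem("HTN")` and `encode_problem("High Blood Pressure")` both resolve to the same concept, which is the kind of search-time normalization a workflow-enhancing terminology layer provides.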