Colorado Learns about The Internet of Things

Still image from the video about the Internet of Things (source: New York Times)

The Denver Post recently reported that, “After a strong earthquake rattled Napa Valley early Sunday, California device maker Jawbone found out how many of its UP wristband users were shaken from their sleep and stayed up. About 93 percent of its customers within 15 miles of Napa, Calif., didn’t go back to sleep after the 6.0 quake struck at 3:20 a.m., said Andrew Rosenthal, senior product engineer for wellness at Jawbone. But what use could come from that information? ‘Why not tell people to go to work at 11 a.m. on Monday,’ he said. The anecdote represents just one example of information being generated by what technologists call ‘The Internet of Things,’ a topic Rosenthal and other panelists discussed Tuesday at the Colorado Innovation Network Summit in Denver. The summit continues Wednesday at the Denver Performing Arts Complex.”

The article also states, “As recently as 2005, most households had “The Internet of Thing” — a desktop or laptop computer connected to the Internet, said Eric Schaefer, general manager of communications, data and mobility for Comcast Communications.

By 2010, “The Internet of Wireless Things” started to appear with the rising popularity of smartphones and tablets. The next phase is what Schaefer called “The Internet of Disjointed Things.” Schaefer described one co-worker who has 25 applications to run items in his home, many on different platforms. He predicts that those systems, by 2020, will communicate and operate with one another and be everywhere, a trend that ever-increasing broadband capacity will allow.”

Read more here.

XSB and SemanticWeb.Com Partner In App Developer Challenge To Help Build The Industrial Semantic Web

Semantic Web Developer Challenge - sponsored by XSB and SemanticWeb.com

An invitation was issued to developers at last week’s Semantic Technology and Business Conference: XSB and SemanticWeb.com have joined to sponsor the Semantic Web Developer Challenge, which asks participants to build sourcing and product life cycle management applications leveraging XSB’s PartLink Data Model.

XSB is developing PartLink as a project for the Department of Defense Rapid Innovation Fund. It uses semantic web technology to create a coherent Linked Data model for all part information in the Department of Defense’s supply chain – some 40 million parts strong.

“XSB recognized the opportunity to standardize and link together information about the parts, manufacturers, suppliers, materials, [and] technical characteristics using semantic technologies. The parts ontology is deep and detailed, with 10,000 parts categories and 1,000 standard attributes defined,” says Alberto Cassola, VP of sales and marketing at XSB, a leading provider of master data management solutions to large commercial and government entities. PartLink’s Linked Data model, he says, “will serve as the foundation for building the industrial semantic web.”
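The core idea of a Linked Data model for parts can be sketched with plain subject–predicate–object triples. The sketch below is purely illustrative: the URIs, part number, and property names are invented for this example and are not actual PartLink identifiers.

```python
# Hypothetical namespace prefixes (not real PartLink URIs).
PART = "http://example.org/part/"
PROP = "http://example.org/prop/"

# Part data expressed as (subject, predicate, object) triples, the basic
# unit of any Linked Data model.
triples = [
    (PART + "MS35338-44", PROP + "category", "lock washer"),
    (PART + "MS35338-44", PROP + "material", "steel"),
    (PART + "MS35338-44", PROP + "suppliedBy", PART + "supplier/ACME"),
]

def objects(subject, predicate):
    """Return every object linked from `subject` via `predicate`."""
    return [o for s, p, o in triples if s == subject and p == predicate]

print(objects(PART + "MS35338-44", PROP + "material"))  # ['steel']
```

Because every subject and predicate is a URI, triples about the same part published by different sources can be merged simply by concatenating the lists, which is what makes the linked model suitable for stitching together supply-chain data.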

Read more

Semantic Technology Job: Natural Language Processing Expert

Morfologica, Inc. is looking for a Natural Language Processing expert. The job description states: “Morfologica Inc. is a small business that provides consulting and engineering services in the fields of Natural Language Processing (NLP) and Computational Linguistics to academic, business and government organizations. We are a growing company with lots of opportunities and great benefits for qualified candidates with a passion for this field. We are looking to support a customer in the Fort Meade area by adding experienced NLP Experts, Computer Scientists, Computational Linguists, Theoretical or General Linguists, and Knowledge Engineers to our team. Qualified candidates with a strong background in Artificial Intelligence, Cognitive Science or Library Science are also encouraged to apply. Interested candidates will be working at a customer site supporting ongoing research efforts for NSA. The work being done will include development and testing of parsers, part-of-speech taggers, WordNet applications, and processing of multi-lingual data. The ideal candidate will have a strong understanding of one or more natural language processing techniques, knowledge-based or rule-based development, and programming experience in NLP-related technologies. The candidate will also have ongoing experience providing consulting and support in fields related to NLP. Individuals with continued experience with multiple NLP tools and techniques are preferred.”
Read more

Britain Capitalizes on Internet of Things?

Photo of the Union Jack courtesy flickr / defenceimages

Gerard Grech recently wrote, “When you hear the term ‘the internet of things’, what immediately comes to mind? If you’ve been following recent stories in the media, you might have it pegged as the Big New Thing to revolutionise all our lives any minute now. Equally, you might be forgiven for thinking it’s a lot of hype generated by overexcited tech types and inflated billion-dollar deals. The truth, as ever, lies somewhere in the middle. The combination of connected products, together with intelligent data analysis, has the potential to transform the way we produce goods, run machinery, manage our cities and improve our lives. The internet of things is a real phenomenon and will take off in much the same way as the worldwide web did back in the 1990s.”

Grech continued, “And, just like the web, the full deployment of IoT across industries will take time, talent and persistence. The question is not whether it is going to happen, but what part the UK will play in it all. Consumers stand to benefit from electricity meters that talk to the grid to get the best deals, and health monitors that provide minute-by-minute data on people’s heart rates. With Google’s acquisition of Nest Labs and Samsung’s recent purchase of SmartThings, we will soon have access to a suite of clever gadgets that will create our future “smart” home. It’s a beguiling vision, albeit one with alarm bells (privacy and security obviously need resolving). But the real power of the internet of things lies beyond eye-catching consumer goods.”

Read more here.


W3C Publishes Linked Data Platform Best Practices and Guidelines

Photo of Arnaud Le Hors presenting the LDP at SemTechBiz 2014

The W3C’s Linked Data Platform (LDP) Working Group has published a document outlining best practices and guidelines for implementing Linked Data Platform servers and clients. The document was edited by Cody Burleson, Base22, and Miguel Esteban Gutiérrez and Nandana Mihindukulasooriya of the Ontology Engineering Group, Universidad Politécnica de Madrid.

For those new to LDP, SemanticWeb.com has recently published the following materials:

WEBINAR: “Getting Started with the Linked Data Platform (LDP)” with LDP Working Group Chair, Arnaud Le Hors, IBM (pictured above presenting LDP work at the SemTechBiz conference last week).

ARTICLE: “Introduction to: Linked Data Platform” by Cody Burleson, Base 22

Those ready to dive into the nuts and bolts of the document will find detailed guidance on topics such as:

  • Predicate URIs
  • Use of relative URIs
  • Hierarchy and container URIs
  • Working with fragments
  • Working with standard datatypes
  • Representing relationships between resources
  • Finding established vocabularies

…and much more. See the full document at http://www.w3.org/TR/ldp-bp/
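As a small taste of the relative-URI topic, standard URI resolution against a container URI can be demonstrated with Python’s standard library. The container URI below is a made-up example, not one drawn from the W3C document.

```python
from urllib.parse import urljoin

# A hypothetical LDP-style container URI. The trailing slash matters:
# relative references resolve *inside* the container's path.
container = "http://example.org/container1/"

# A relative URI resolves against the container's base path...
print(urljoin(container, "member1"))   # http://example.org/container1/member1

# ...and a fragment-only reference stays within the same resource.
print(urljoin(container, "#section"))  # http://example.org/container1/#section
```

Resolution follows RFC 3986, which is why publishing members with relative URIs keeps a container portable: moving the container to a new base URI moves all of its members with it.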

SemanticWeb.com congratulates the Working Group on this step and looks forward to reporting on use cases and implementations of LDP.

Prelert’s Elasticsearch Equipped with Anomaly Detection

Daniel Gutierrez reported, “Prelert, the anomaly detection company, today announced the release of an Elasticsearch Connector to help developers quickly and easily deploy its machine learning-based Anomaly Detective® engine on their Elasticsearch ELK (Elasticsearch, Logstash, Kibana) stack. Earlier this year, Prelert released its Engine API enabling developers and power users to leverage its advanced analytics algorithms in their operations monitoring and security architectures. By offering an Elasticsearch Connector, the company further strengthens its commitment to democratizing the use of machine learning technology, providing tools that make it even easier to identify threats and opportunities hidden within massive data sets. Written in Python, the Prelert Elasticsearch Connector source is available on GitHub. This enables developers to apply Prelert’s advanced, machine learning-based analytics to fit the big data needs within their unique environment.”

The article continues with, “Prelert’s Anomaly Detective processes huge volumes of streaming data, automatically learns normal behavior patterns represented by the data and identifies and cross-correlates any anomalies. It routinely processes millions of data points in real-time and identifies performance, security and operational anomalies so they can be acted on before they impact business. The Elasticsearch Connector is the first connector to be officially released by Prelert. Additional connectors to several of the most popular technologies used with big data will be released throughout the coming months.”
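The general idea of flagging anomalies in a metric stream can be sketched in a few lines. The toy z-score filter below is purely illustrative and bears no relation to Prelert’s actual machine-learning algorithms; the latency values are invented.

```python
from statistics import mean, stdev

def anomalies(values, threshold=2.0):
    """Flag values more than `threshold` standard deviations from the mean."""
    mu, sigma = mean(values), stdev(values)
    return [v for v in values if abs(v - mu) > threshold * sigma]

# A made-up stream of response latencies (ms) with one obvious outlier.
latencies = [12, 11, 13, 12, 14, 11, 12, 13, 95, 12, 11]
print(anomalies(latencies))  # [95]
```

Real systems like the one described above go much further, learning normal behavior patterns over time and cross-correlating anomalies across metrics rather than applying a fixed statistical threshold.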

Read more here.

Image courtesy Prelert.

IBM Taps Global Network of Innovation Centers to Fuel Linux on Power Systems for Big Data and Cloud Computing

CHICAGO, Aug. 22, 2014 /PRNewswire/ — At the LinuxCon North America conference last week, IBM (NYSE: IBM) announced it is tapping into its global network of over 50 IBM Innovation Centers and IBM Client Centers to help IBM Business Partners, IT professionals, academics, and entrepreneurs develop and deliver new Big Data and cloud computing software applications for clients using Linux on IBM Power Systems servers. Read more

Gartner Reports ‘Internet of Things’ Tops the Technology Hype Cycle

Gil Press of Forbes reports, “Gartner released last week its latest Hype Cycle for Emerging Technologies. Last year, big data reigned supreme, at what Gartner calls the ‘peak of inflated expectations.’ But now big data has moved down to the ‘trough of disillusionment,’ replaced by the Internet of Things at the top of the hype cycle. In 2012 and in 2013, Gartner’s analysts thought that the Internet of Things had more than 10 years to reach the ‘plateau of productivity,’ but this year they give it five to ten years to reach this final stage of maturity. The Internet of Things, says Gartner, ‘is becoming a vibrant part of our, our customers’ and our partners’ business and IT landscape’.” Read more

Combining Connectivity with Cognitive Computing

Photo of David Loshin

A recent report from David Loshin states, “As our world becomes more attuned to the generation, and more importantly, the use of massive amounts of data, information technology (IT) professionals are increasingly looking to new technologies to help focus on deriving value from the velocity of data streaming from a wide variety of data sources. The breadth of the internet and its connective capabilities has enabled the evolution of the internet of things (IoT), a dynamic ecosystem that facilitates the exchange of information among a cohort of devices organized to meet specific business needs. It does this through a growing, yet intricate interconnection of uniquely identifiable computing resources, using the internet’s infrastructure and employing internet protocols. Extending beyond the traditional system-to-system networks, these connected devices span the architectural palette, from traditional computing systems, to specialty embedded computer modules, down to tiny micro-sensors with mobile-networking capabilities.”

Loshin added, “In this paper, geared to the needs of the C-suite, we’ll explore the future of predictive analytics by looking at some potential use cases in which multiple data sets from different types of devices contribute to evolving models that provide value and benefits to hierarchies of vested stakeholders. We’ll also introduce the concept of the “insightful fog,” in which storage models and computing demands are distributed among interconnected devices, facilitating business discoveries that influence improved operations and decisions. We’ll then summarize the key aspects of the intelligent systems that would be able to deliver on the promise of this vision.”

The full report, “How IT can blend massive connectivity with cognitive computing to enable insights,” is available for download for a fee.

Read more here.

Google Releases Linguistic Data based on NY Times Annotated Corpus

Photo of New York Times Building in New York City

Dan Gillick and Dave Orr recently wrote, “Language understanding systems are largely trained on freely available data, such as the Penn Treebank, perhaps the most widely used linguistic resource ever created. We have previously released lots of linguistic data ourselves, to contribute to the language understanding community as well as encourage further research into these areas. Now, we’re releasing a new dataset, based on another great resource: the New York Times Annotated Corpus, a set of 1.8 million articles spanning 20 years. 600,000 articles in the NYTimes Corpus have hand-written summaries, and more than 1.5 million of them are tagged with people, places, and organizations mentioned in the article. The Times encourages use of the metadata for all kinds of things, and has set up a forum to discuss related research.”

The blog continues with, “We recently used this corpus to study a topic called “entity salience”. To understand salience, consider: how do you know what a news article or a web page is about? Reading comes pretty easily to people — we can quickly identify the places or things or people most central to a piece of text. But how might we teach a machine to perform this same task? This problem is a key step towards being able to read and understand an article. One way to approach the problem is to look for words that appear more often than their ordinary rates.”
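A crude version of that last idea can be sketched by comparing a word’s rate in a document to its rate in background text. The tiny background corpus and example sentence below are invented for illustration; Google’s actual salience work is far more sophisticated and operates on entities, not raw words.

```python
from collections import Counter

# A toy "background corpus" establishing ordinary word rates.
background = Counter("the cat sat on the mat the dog ran".split())
bg_total = sum(background.values())

def salience(doc_words):
    """Score each word by how much its in-document rate exceeds its
    background rate (with add-one smoothing for unseen words)."""
    counts = Counter(doc_words)
    total = len(doc_words)
    return {w: (c / total) / ((background[w] + 1) / (bg_total + 1))
            for w, c in counts.items()}

doc = "the earthquake shook the city and the earthquake damaged homes".split()
scores = salience(doc)
print(max(scores, key=scores.get))  # earthquake
```

Common function words like “the” score low because they are frequent everywhere, while “earthquake” scores high because it appears far more often here than its ordinary rate would predict, which is exactly the intuition the blog describes.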

Read more here.

Photo credit: Eric Franzon
