Jelani Harper

Semantic Web Job: Software Engineer

Elevada is looking for a software engineer. The job description states: “Elevada is a data management company seeking a skilled Software Engineer with 4+ years of professional development experience. This is an opportunity to get in early (employee number < 5) at a real company with a strong product vision + enterprise customers, real revenue, and a strong sales pipeline. Compensation will be a mix of cash and equity at the end of a trial contract. Below are parameters for the position. We will tailor responsibilities to suit the individual who best fits our culture and goals. Candidate responsibilities:

  • Web-based front-end development in Java+GWT, transitioning to a JavaScript framework in the future.
  • Develop/package/wrap JavaScript libraries for GWT integration via JSNI (and eventually into the new framework).
  • Server-side development using Java, Spring Framework, JPA, Hibernate.
  • Help maintain development and deployment infrastructure in Linux environments.”

Skills requirements include:

Read more

Natural Language Processing Market Booms

A recent press release indicates that, “The Natural Language Processing (NLP) market is estimated to grow from $3,787.3 million in 2013 to $9,858.4 million in 2018. This represents a Compounded Annual Growth Rate (CAGR) of 21.1% from 2013 to 2018. In the current scenario, the web and e-commerce, healthcare, and IT and telecommunication verticals continue to grow and are the largest contributors to the Natural Language Processing (NLP) software market. In terms of regional growth, North America is expected to be the biggest market in terms of revenue contribution. The European and APAC regions are expected to experience increased market traction, due to increasing adoption across various verticals and investment support in research projects from regional governments.”

The release also states, “The major forces driving the natural language processing (NLP) market are the growing demand for enhanced customer experience, the increase in adoption of smartphones, the leveraging of big data and the growth of machine-to-machine (M2M) technologies. Furthermore, industries such as healthcare, BFSI, social websites and e-commerce channels have witnessed an exponential rise in real-time customer data and transaction information. NLP technology can leverage this unstructured data for analyzing customer needs and expectations, and for enhancing customer experience by deploying cost-effective lingual response systems in organizational processes. By using NLP software solutions, organizations can gain better insight into customers’ perceptions, optimize business processes and reduce operational costs.”

Read more here.

Colorado Learns about The Internet of Things

The Denver Post recently reported that, “After a strong earthquake rattled Napa Valley early Sunday, California device maker Jawbone found out how many of its UP wristband users were shaken from their sleep and stayed up. About 93 percent of its customers within 15 miles of Napa, Calif., didn’t go back to sleep after the 6.0 quake struck at 3:20 a.m., said Andrew Rosenthal, senior product engineer for wellness at Jawbone. But what use could come from that information? ‘Why not tell people to go to work at 11 a.m. on Monday,’ he said. The anecdote represents just one example of information being generated by what technologists call ‘The Internet of Things,’ a topic Rosenthal and other panelists discussed Tuesday at the Colorado Innovation Network Summit in Denver. The summit continues Wednesday at the Denver Performing Arts Complex.”

The article also states, “As recently as 2005, most households had “The Internet of Thing” — a desktop or laptop computer connected to the Internet, said Eric Schaefer, general manager of communications, data and mobility for Comcast Communications.

By 2010, “The Internet of Wireless Things” started to appear with the rising popularity of smartphones and tablets. The next phase is what Schaefer called “The Internet of Disjointed Things.” Schaefer described one co-worker who has 25 applications to run items in his home, many on different platforms. He predicts that those systems, by 2020, will communicate and operate with one another and be everywhere, a trend that ever-increasing broadband capacity will allow.”

Read more here.

Britain Capitalizes on Internet of Things?

Gerard Grech recently wrote, “When you hear the term “the internet of things”, what immediately comes to mind? If you’ve been following recent stories in the media, you might have it pegged as the Big New Thing to revolutionise all our lives any minute now. Equally, you might be forgiven for thinking it’s a lot of hype generated by overexcited tech types and inflated billion-dollar deals. The truth, as ever, lies somewhere in the middle. The combination of connected products, together with intelligent data analysis, has the potential to transform the way we produce goods, run machinery, manage our cities and improve our lives. The internet of things is a real phenomenon and will take off in much the same way as the worldwide web did back in the 1990s.”

Grech continued, “And, just like the web, the full deployment of IoT across industries will take time, talent and persistence. The question is not whether it is going to happen, but what part the UK will play in it all. Consumers stand to benefit from electricity meters that talk to the grid to get the best deals, and health monitors that provide minute-by-minute data on people’s heart rates. With Google’s acquisition of Nest Labs and Samsung’s recent purchase of SmartThings, we will soon have access to a suite of clever gadgets that will create our future “smart” home. It’s a beguiling vision, albeit one with alarm bells (privacy and security obviously need resolving). But the real power of the internet of things lies beyond eye-catching consumer goods.”

Read more here.

Image courtesy flickr / defenceimages

Prelert’s Elasticsearch Equipped with Anomaly Detection

Daniel Gutierrez reported, “Prelert, the anomaly detection company, today announced the release of an Elasticsearch Connector to help developers quickly and easily deploy its machine learning-based Anomaly Detective® engine on their Elasticsearch ELK (Elasticsearch, Logstash, Kibana) stack. Earlier this year, Prelert released its Engine API enabling developers and power users to leverage its advanced analytics algorithms in their operations monitoring and security architectures. By offering an Elasticsearch Connector, the company further strengthens its commitment to democratizing the use of machine learning technology, providing tools that make it even easier to identify threats and opportunities hidden within massive data sets. Written in Python, the Prelert Elasticsearch Connector source is available on GitHub. This enables developers to apply Prelert’s advanced, machine learning-based analytics to fit the big data needs within their unique environment.”

The article continues with, “Prelert’s Anomaly Detective processes huge volumes of streaming data, automatically learns normal behavior patterns represented by the data and identifies and cross-correlates any anomalies. It routinely processes millions of data points in real-time and identifies performance, security and operational anomalies so they can be acted on before they impact business. The Elasticsearch Connector is the first connector to be officially released by Prelert. Additional connectors to several of the most popular technologies used with big data will be released throughout the coming months.”
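Prelert’s engine itself is proprietary, but the core idea the release describes — learning a baseline of normal behavior from streaming metrics and flagging sharp deviations — can be sketched in a few lines of Python. The class below is a toy illustration, not Prelert’s API; the window size and z-score threshold are arbitrary assumptions:

```python
from collections import deque
import math


class StreamingAnomalyDetector:
    """Toy baseline-and-deviation detector: keep a sliding window of
    recent metric values and flag points that fall far outside it."""

    def __init__(self, window=100, threshold=3.0):
        self.window = deque(maxlen=window)   # recent "normal" history
        self.threshold = threshold           # z-score cutoff

    def observe(self, value):
        """Return True if `value` is anomalous relative to recent history."""
        flag = False
        if len(self.window) >= 10:  # wait for a minimal baseline
            mean = sum(self.window) / len(self.window)
            var = sum((x - mean) ** 2 for x in self.window) / len(self.window)
            std = math.sqrt(var)
            flag = std > 0 and abs(value - mean) / std > self.threshold
        self.window.append(value)
        return flag


# A steady signal alternating around 10.2, then a sudden spike:
detector = StreamingAnomalyDetector()
flags = [detector.observe(v) for v in [10.0, 10.4] * 25 + [60.0]]
```

A real system like Anomaly Detective models many metrics at once, cross-correlates them, and adapts to trends and seasonality; the sketch only shows the simplest form of the learn-normal-then-flag-deviations idea.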

Read more here.

Image courtesy Prelert.

Combining Connectivity with Cognitive Computing

A recent report from David Loshin states, “As our world becomes more attuned to the generation, and more importantly, the use of massive amounts of data, information technology (IT) professionals are increasingly looking to new technologies to help focus on deriving value from the velocity of data streaming from a wide variety of data sources. The breadth of the internet and its connective capabilities has enabled the evolution of the internet of things (IoT), a dynamic ecosystem that facilitates the exchange of information among a cohort of devices organized to meet specific business needs. It does this through a growing, yet intricate interconnection of uniquely identifiable computing resources, using the internet’s infrastructure and employing internet protocols. Extending beyond the traditional system-to-system networks, these connected devices span the architectural palette, from traditional computing systems, to specialty embedded computer modules, down to tiny micro-sensors with mobile-networking capabilities.”

Loshin added, “In this paper, geared to the needs of the C-suite, we’ll explore the future of predictive analytics by looking at some potential use cases in which multiple data sets from different types of devices contribute to evolving models that provide value and benefits to hierarchies of vested stakeholders. We’ll also introduce the concept of the “insightful fog,” in which storage models and computing demands are distributed among interconnected devices, facilitating business discoveries that influence improved operations and decisions. We’ll then summarize the key aspects of the intelligent systems that would be able to deliver on the promise of this vision.”

The full report, “How IT can blend massive connectivity with cognitive computing to enable insights,” is available for download for a fee.

Read more here.

Google Releases Linguistic Data based on NY Times Annotated Corpus

Photo of New York Times Building in New York City

Dan Gillick and Dave Orr recently wrote, “Language understanding systems are largely trained on freely available data, such as the Penn Treebank, perhaps the most widely used linguistic resource ever created. We have previously released lots of linguistic data ourselves, to contribute to the language understanding community as well as encourage further research into these areas. Now, we’re releasing a new dataset, based on another great resource: the New York Times Annotated Corpus, a set of 1.8 million articles spanning 20 years. 600,000 articles in the NYTimes Corpus have hand-written summaries, and more than 1.5 million of them are tagged with people, places, and organizations mentioned in the article. The Times encourages use of the metadata for all kinds of things, and has set up a forum to discuss related research.”

The blog continues with, “We recently used this corpus to study a topic called “entity salience”. To understand salience, consider: how do you know what a news article or a web page is about? Reading comes pretty easily to people — we can quickly identify the places or things or people most central to a piece of text. But how might we teach a machine to perform this same task? This problem is a key step towards being able to read and understand an article. One way to approach the problem is to look for words that appear more often than their ordinary rates.”
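The heuristic in that last sentence — look for words that appear more often than their ordinary rates — is easy to prototype. The sketch below scores each word in a document by comparing its in-document rate to its rate in a background corpus; the function name, the add-one smoothing, and the demo counts are illustrative assumptions, and Google’s actual salience work uses trained models with far richer features:

```python
import math
import re
from collections import Counter


def salient_terms(document, background_counts, background_total, top_n=5):
    """Rank words by how much more often they appear in `document`
    than their background rate would predict (a pointwise KL-style
    score). Words unseen in the background get a smoothed count of 1."""
    words = re.findall(r"[a-z']+", document.lower())
    counts = Counter(words)
    total = len(words)
    scores = {}
    for word, count in counts.items():
        doc_rate = count / total
        bg_rate = (background_counts.get(word, 0) + 1) / (background_total + 1)
        scores[word] = doc_rate * math.log(doc_rate / bg_rate)
    return sorted(scores, key=scores.get, reverse=True)[:top_n]


# Tiny demo: "the" is common everywhere, so despite being the most
# frequent word it scores below the repeated, background-rare "earthquake".
background = Counter({"the": 1000, "and": 500, "valley": 2, "shook": 1})
top = salient_terms(
    "the earthquake shook the valley and the earthquake woke residents",
    background, background_total=2000)
```

Frequent function words score low (or negative) because their document rate barely exceeds their background rate, while topical words repeated even twice stand out sharply.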

Read more here.

Photo credit: Eric Franzon

Getty Releases More Linked Open Data: Thesaurus of Geographic Names

Last winter, SemanticWeb reported that the Getty Research Institute had released the first of four Getty vocabularies as Linked Open Data. Recently, the Getty announced the second. James Cuno wrote, “We’re delighted to announce that the Getty Research Institute has released the Getty Thesaurus of Geographic Names (TGN)® as Linked Open Data. This represents an important step in the Getty’s ongoing work to make our knowledge resources freely available to all. Following the release of the Art & Architecture Thesaurus (AAT)® in February, TGN is now the second of the four Getty vocabularies to be made entirely free to download, share, and modify. Both data sets are available for download at vocab.getty.edu under an Open Data Commons Attribution License (ODC BY 1.0).”
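The Getty publishes its vocabularies behind a public SPARQL endpoint at vocab.getty.edu. The snippet below sketches how one might look up a TGN place by label; the exact endpoint path, the `format` parameter, and the assumption that labels are exposed as `skos:prefLabel` should all be checked against the Getty’s current documentation before use:

```python
import urllib.parse

# Endpoint path and parameters are assumptions for illustration;
# consult vocab.getty.edu's documentation for the current interface.
SPARQL_ENDPOINT = "http://vocab.getty.edu/sparql"


def tgn_label_query(place_name):
    """Build a SPARQL query matching SKOS-style preferred labels."""
    query = """
PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
SELECT ?subject ?label WHERE {
  ?subject skos:prefLabel ?label .
  FILTER(lcase(str(?label)) = "%s")
}
LIMIT 10
""" % place_name.lower()
    return query.strip()


def request_url(place_name):
    """URL-encode the query as a GET request against the endpoint."""
    params = urllib.parse.urlencode({"query": tgn_label_query(place_name),
                                     "format": "json"})
    return SPARQL_ENDPOINT + "?" + params


url = request_url("Amsterdam")
```

An HTTP GET against the returned URL (with an appropriate Accept header for SPARQL JSON results) would fetch matching records; the network call itself is omitted to keep the sketch self-contained.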

Read more

Semantic Web Job: Big Data Architect

New York’s Tektree Systems is in need of a Big Data Architect. The job description states, “Hadoop Data Architect with both hands-on Big Data and relational experience and deep knowledge of physical data modeling, data organization and storage technology, experienced with high volumes and able to architect and implement multi-tier solutions using the right technology in each tier, based on fit. Required Skills and Qualifications:

  • Design and development of data models for a new HDFS Master Data Reservoir and one or more relational or object Current Data environments
  • Design of optimum storage allocation for the data stores in the architecture
  • Development of data frameworks for code implementation and testing across the program
  • Knowledge and experience with RDF and other Semantic technologies
  • Participation in code reviews to assure that developed and tested code conforms with the design and architecture principles
  • QA and testing of modules/applications/interfaces
  • End-to-end project experience through to completion, and supervision of turnover to Operations staff
  • Preparation of documentation of data architecture, designs and implemented code.”

Read more

Microsoft Assists Internet of Things

Phil Goldstein of FierceWireless reported that, “Microsoft (NASDAQ: MSFT) joined the AllSeen Alliance, an open-source project founded on Qualcomm technology and aimed at coming up with a standard to connect devices and have them interact as part of the Internet of Things. The software giant’s participation in the group adds heft to its membership, which has been largely dominated by consumer electronics and home appliance makers. The AllSeen Alliance’s leading members include Haier, LG Electronics, Panasonic and Sharp, and in total the group now has 51 members. Adding Microsoft could ensure that future Windows devices interact with other connected gadgets via the AllSeen Alliance’s specifications.”

Read more
