Aaron Bradley recently posted a roundtable discussion about JSON-LD which includes: “JSON-LD is everywhere. Okay, perhaps not everywhere, but JSON-LD loomed large at the 2014 Semantic Web Technology and Business Conference in San Jose, where it was on many speakers’ lips, and could be seen in the code examples of many presentations. I’ve read much about the format – and have even provided a thumbnail definition of JSON-LD in these pages – but I wanted to take advantage of the conference to learn more about JSON-LD, and to better understand why this very recently-developed standard has been such a runaway hit with developers. In this quest I could not have been more fortunate than to sit down with Gregg Kellogg, one of the editors of the W3C Recommendation for JSON-LD, to learn more about the format, its promise as a developmental tool, and – particularly important to me as a search marketer – its role in the evolution of schema.org.”
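For readers who haven't seen the format in action: JSON-LD expresses linked data as ordinary JSON, where an `@context` maps keys to a vocabulary (here schema.org, the vocabulary discussed in the interview) and `@type` identifies what kind of thing is being described. A minimal, hypothetical example of the kind of markup a search marketer might embed in a page (all names and URLs below are invented for illustration):

```json
{
  "@context": "https://schema.org",
  "@type": "Person",
  "name": "Jane Doe",
  "jobTitle": "Search Marketer",
  "url": "https://example.com/jane"
}
```

Because this is plain JSON, it can be dropped into a page in a `<script type="application/ld+json">` tag without touching the visible HTML, which is a large part of its appeal to developers.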
Big Data has been getting its fair share of commentary over the past couple of months. Surveys from multiple sources have commented on trends and expectations. The Semantic Web Blog provides some highlights here:
- From Accenture Analytics’s new Big Success With Big Data report: There remain some gaps in what constitutes Big Data for respondents to its survey: Just 43 percent, for instance, classify unstructured data as part of the package. That option included open text, video and voice. Those are gaps that could be filled leveraging technologies such as machine learning, speech recognition and natural language understanding, but they won’t be unless executives make these sources a focus of Big Data initiatives to start with.
- From Teradata’s new survey on Big Data Analytics in the UK, France and Germany: Close to 50 percent of respondents in the latter two countries are using three or more data types (from sources ranging from social media, to video, to web blogs, to call center notes, to audio files and the Internet of Things) in their efforts, compared to just 20 percent in the UK. A much higher percentage of UK businesses (51 percent) are currently using just a single type of new data, such as video data, compared with France and Germany, where only 21 percent are limiting themselves to one type of new data, it notes. Forty-four percent of execs in Germany and 35 percent in France point to social media as the source of the new data. About one-third of respondents in each of those countries are investigating video, as well.
KForce (a recruiter) is looking for a director of taxonomy and information architecture. The job duties include:
- “Oversee ongoing enhancement of our information architecture and the schema, including taxonomy / ontology development.
- Manage, train, mentor, and recruit a growing team of analysts and QA specialists.
- Manage multi-stage QA process for verifying the integrity of data scraped and then coded from the internet.
- Monitor labor market trends, and develop rules to account for the emerging jobs, skills and credentials within our taxonomies.
- Leverage published data series and other third party information to inform and validate data curation activities.
- Develop and implement coding enhancement initiatives based on their utility to products, research efforts and clients.
- Implement automated data coding and quality control procedures.
- Work with a team of software developers to automate data coding and quality control procedures, to embed data innovations effectively within products, and to support the planning and implementation of a robust data warehouse infrastructure.”
Kevin Fitchard of Gigaom recently posted that, “Thanks to the popularity of its FireChat hyperlocal messaging app, Open Garden’s networking software has been downloaded onto more than 5 million mobile devices around the world. Open Garden believes it now has enough users out there to execute the next stage of its plan: it wants to use all of these smartphone nodes to create a new network for the internet of things. This concept probably requires some explaining as it doesn’t fit into any of the other IoT networking schemes we’ve written about in the past. Unlike, say, your smart home, which uses a hub to aggregate a bunch of Zigbee or Wi-Fi connections, or a connected vehicle fleet, which taps into the cellular network, Open Garden’s IoT network would be created through millions of shared connections owned by you, me or anyone else with one of its apps on their smartphones, tablets or PCs.”
A recent announcement on EurekAlert! states: “Researchers from North Carolina State University have developed artificial intelligence (AI) software that is significantly better than any previous technology at predicting what goal a player is trying to achieve in a video game. The advance holds promise for helping game developers design new ways of improving the gameplay experience for players. ‘We developed this software for use in educational gaming, but it has applications for all video game developers,’ says Dr. James Lester, a professor of computer science at NC State and senior author of a paper on the work. ‘This is a key step in developing player-adaptive games that can respond to player actions to improve the gaming experience, either for entertainment or – in our case – for education.’ The researchers used ‘deep learning’ to develop the AI software. Deep learning describes a family of machine learning techniques that can extrapolate patterns from large collections of data and make predictions. Deep learning has been actively investigated in various research domains such as computer vision and natural language processing in both academia and industry.”
Net Consultants is looking to recruit a Java JEE/C++ Developer with experience in RDF. The position description states, “Contract for US Citizen working at Government contractor in Rancho Bernardo on an archived image library system.
Technical Qualifications/Experience Required:
- C++ Developer
- 3+ years Java JEE and C++ development
- SQL/Oracle Development on Linux/Unix environment
- Writing SW for archiving, dissemination of large amounts of data. Understanding the topology for accessing, moving, storing data and generating discrepancy reports
- Some experience writing RESTful web services
- CORBA would be a plus (original application written in CORBA)”
Daniel Sparks of The Motley Fool reported, “‘Chinese companies are starting to dream,’ said Jixun Foo, an early investor in Baidu (NASDAQ: BIDU) and managing partner at GGV Capital. Foo’s proclamation was made in an in-depth article by MIT Technology Review, which examined the Chinese search giant’s new effort to change the world with artificial intelligence. The company’s new AI lab does, indeed, accompany some lofty aspirations — ones big enough to hopefully help Baidu become a global Internet powerhouse and to compete with the likes of Google in increasingly important emerging markets where the default search engine hasn’t yet taken the throne. But what are the implications for investors? Fortunately, Baidu’s growing infatuation with AI looks like it could give birth to winning strategies that could build sustainable value over the long haul.”
Apigee wants the development community to be able to seamlessly take advantage of predictive analytics in their applications.
“One of the biggest things we want to ensure is that the development community gets comfortable with powering their apps with data and insights,” says Anant Jhingran, Apigee’s VP of products and formerly CTO for IBM’s information management and data division. “That is the next wave that we see.”
Apigee wants to help them ride that wave, enabling their business to better deal with customer issues, from engagement to churn, in more personal and contextual ways. “We are in the business of helping customers take their digital assets and expose them through clean APIs so that these APIs can power the next generation of applications,” says Jhingran. But in thinking through that core business, “we realized the interactions happening through the APIs represent very powerful signals. Those signals, when combined with other contextual data that may be in the enterprise, enable some very deep insights into what is really happening in these channels.”
With today’s announcement of a new version of its Apigee Insights big data platform, all those signals generated – through API and web channels, call centers, and more – can come together in the service of predictive analytics for developers to leverage.
Jeppesen is seeking an information solution architect. The job description states: “This person will serve as an Information Architect for development of a large, geospatial data management system. Specifically, this person will specify and implement approaches to ensure efficient access, editing, and transaction management for geospatial data and work in the database and access layer of the system. This involves tuning the Physical Data Model and optimization for performance with Oracle 12c Spatial and Graph, Oracle Workspace Manager, and custom developed data access frameworks and services in Java. This person will interact with data modeling, database administration, data center/IT, and software engineering teams.”
Belkin International announced, “Belkin International, a leading Internet of Things company, and OSRAM SYLVANIA, a leading global lighting manufacturer, today announced that the two companies have entered into a strategic partnership to collaborate on residential solutions with the OSRAM LIGHTIFY™ smart connected lighting ecosystem and Belkin’s WeMo® home automation ecosystem. OSRAM SYLVANIA will first add WeMo compatibility to the SYLVANIA ULTRA iQ™ BR30 LED light bulb, followed by a broader portfolio of connective lighting products for the home shortly after the launch of OSRAM LIGHTIFY in Europe this fall.”