Posts Tagged ‘Apple’
Apple is looking for a Siri Speech Engineer in Santa Clara, CA. According to the post, “The Siri Speech team is looking for a full-stack engineer to work on automatic speech recognition for Siri. To succeed in this role, you must be a strong programmer and a creative problem solver who thrives in a fast-paced environment, working across teams and organizations. You love building distributed systems at massive scale, tackling impossible problems, and you have a passion for customer experience. You enjoy learning new things and creating life-changing products.” Read more
Apple is looking for a Siri Software Engineer, Data Services in Santa Clara, CA. According to the post, “The Data Services team is looking for outstanding engineers to build the next-generation of search and data infrastructure at Apple, to power features including Siri and Spotlight’s smart suggestions… Excited about shipping products used by hundreds of millions of customers? Applying cutting-edge technologies to create services that just work? If so, this might just be the place for you. Come work in an environment that will challenge you to push your limits, and train you to succeed as part of a team. Candidates must be willing to take on demanding projects in the areas of semantic search, structured data, knowledge representation, and distributed architecture. Key to success will be the ability to deliver results, a passion for perfection, and a willingness to keep learning and improving oneself.” Read more
Apple is looking for a Software Engineer, NLP in Santa Clara, CA. According to the post, “Do you want to be part of the team that delivers the best text input experience on iOS and OS X? Do you want to develop the best keyboards and input methods for customers worldwide? Are you interested in providing the best statistical language models, auto-correction and spellchecking experience to customers? The Natural Language Processing team at Apple is hiring a Software Engineer to develop algorithms and data in these areas.” Read more
Wired’s Robert McMillan recently wrote, “…neural network algorithms are hitting the mainstream, making computers smarter in new and exciting ways. Google has used them to beef up Android’s voice recognition. IBM uses them. And, most remarkably, Microsoft uses neural networks as part of the Star-Trek-like Skype Translate, which translates what you say into another language almost instantly. People “were very skeptical at first,” Hinton says, “but our approach has now taken over.” One big-name company, however, hasn’t made the jump: Apple, whose Siri software is due for an upgrade. Though Apple is famously secretive about its internal operations, and did not provide comment for this article, it seems that the company previously licensed voice recognition technology from Nuance, perhaps the best-known speech recognition vendor. But those in the tight-knit community of artificial intelligence researchers believe this is about to change. It’s clear, they say, that Apple has formed its own speech recognition team and that a neural-net-boosted Siri is on the way.”
Apple Insider recently reported, “According to Apple’s ‘Jobs at Apple’ website, the company is seeking ‘Siri Language Engineers’ fluent in Arabic, Brazilian Portuguese, Danish, Dutch, Norwegian, Swedish, Thai, Turkish and Russian, all of which are currently unsupported by the voice recognizing digital assistant. The job postings were first uncovered by MacRumors. Along with the nine new languages, Apple is looking to enhance Siri’s existing lexicon with hires fluent in Australian and British English, Cantonese and Japanese. All listings ask not only for fluency, but for native speakers to handle colloquialisms locals may use when speaking to Siri. Apple also strives to make Siri’s own speech as natural as possible, meaning the potential hires will likely be working on responses to user queries.” Read more
Apple’s announcements at its Worldwide Developers Conference today had the crowd responding enthusiastically (of course, it was an Apple developers conference, so that just comes with the territory).
Much of the applause came in response to the new iOS 8 and its enhanced capabilities. As had been expected, as part of this, virtual assistant Siri got a bit of a facelift.
In the new iOS 8 for iPhones and iPads, due in the fall, there’s no need to touch your iPhone if it’s plugged in and you’ve got a question that needs answering and no hands free to touch the mike. Apple also has partnered with music recognition service Shazam, so Siri now can recognize songs playing nearby and let you purchase them too; Shazam creates digital fingerprints of the audio it hears and matches them against its database of millions of tracks. Siri’s natural language processing is fluent in 22 languages now, and streaming voice recognition means you can see what you’re saying as you are saying it.
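To give a flavor of the fingerprint-and-match idea described above, here is a minimal toy sketch in Python. It is not Shazam’s actual algorithm (which pairs time-frequency peaks far more robustly); this simplified version just hashes pairs of dominant spectral peaks per frame, indexes those hashes for known tracks, and identifies a snippet by vote count. All names (`fingerprint`, `build_index`, `identify`) and the toy sine-wave “tracks” are illustrative assumptions, not anything from Apple or Shazam.

```python
import numpy as np

def fingerprint(signal, frame_size=1024, hop=512):
    """Hash pairs of dominant spectral peaks from consecutive frames."""
    peaks = []
    for start in range(0, len(signal) - frame_size, hop):
        frame = signal[start:start + frame_size]
        spectrum = np.abs(np.fft.rfft(frame * np.hanning(frame_size)))
        peaks.append(int(np.argmax(spectrum)))  # dominant frequency bin
    # Pairing each peak with the next makes the hash slightly more distinctive
    return [hash((peaks[i], peaks[i + 1])) for i in range(len(peaks) - 1)]

def build_index(tracks):
    """Map each fingerprint hash to the set of track ids containing it."""
    index = {}
    for track_id, signal in tracks.items():
        for h in fingerprint(signal):
            index.setdefault(h, set()).add(track_id)
    return index

def identify(index, snippet):
    """Return the track whose hashes collect the most matches, if any."""
    votes = {}
    for h in fingerprint(snippet):
        for track_id in index.get(h, ()):
            votes[track_id] = votes.get(track_id, 0) + 1
    return max(votes, key=votes.get) if votes else None

# Toy "tracks": pure tones at different frequencies, 2 s at 44.1 kHz
t = np.linspace(0, 2, 88200, endpoint=False)
tracks = {
    "song_a": np.sin(2 * np.pi * 440 * t),
    "song_b": np.sin(2 * np.pi * 880 * t),
}
index = build_index(tracks)
print(identify(index, tracks["song_a"][10000:40000]))  # matches "song_a"
```

A real system has to survive background noise, compression, and partial recordings, which is why production fingerprints anchor on constellations of spectral peaks rather than a single dominant bin per frame.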
As Apple’s Worldwide Developers Conference gets underway this week, speculation continues about whether we’ll see a preview of the long-awaited iWatch smart watch, along with more expected developments such as an update to the OS X operating system to bring it closer to resembling Apple’s mobile operating system experience. Rumors tout the iWatch as a device that will run iOS and include biometrics and health and fitness capabilities.
But even if the watch doesn’t appear until later this year, Apple’s timing is still on the right track – as is Microsoft’s fitness-focused, heart-rate monitoring smartwatch that is expected to debut this summer.
A new report from technology research firm ON World that surveyed 1,000 U.S. consumers finds that “wristworn devices are preferred by the majority of consumers who are most interested in a general purpose smart watch rather than dedicated fitness devices such as activity trackers and heart rate monitors.” One in five consumers either have or are planning to purchase a wearable technology product by next year and close to one-third are likely to purchase a wearable technology within two years, it finds.
Apple is looking for a Maps Search Engineer in Santa Clara, CA. According to the post, “Apple’s Local Search engineering team is looking for key players to build the foundation architecture of search.” The position will involve building “high quality search systems for products including Maps search, Siri local search, and other features focused on understanding what’s interesting in the world around a given location.” Read more
Apple is looking for a Software Engineer, Natural Language Processing in Santa Clara, CA. The post states, “The Natural Languages Group is looking for an engineer to develop and apply algorithms and data in these areas. This involves application of bleeding edge machine learning and statistical pattern recognition on large text corpora. The position will involve all aspects of the use of natural-language processing in software, including functionality, algorithms, correctness, user experience, and performance, on both iOS and OS X.”
Qualifications for the position include: “Knowledge of natural-language processing techniques. MSc or PhD in Computer Science.” Read more
Aaron Taube of Business Insider recently wrote in the SF Gate, “Apple is thinking about how it can figure out exactly how you feel at any given moment in order to show you the most relevant advertisements. In a patent application the company filed Thursday, Apple describes a hypothetical system that would analyze and define people’s moods based on a variety of clues including facial expressions, perspiration rates, and vocal patterns. To be clear, Apple patents just about everything it does, with most applications never amounting to anything with regard to the actual products Apple releases. Still, it’s interesting to see how Apple is thinking about predictive, contextual advertising at such a granular level, especially in light of its battle with companies like Google and Facebook to offer search products (Siri, the App Store) that know precisely what a user is looking for, even if the user has not expressly communicated his or her desire.” Read more