Top 10 Companies Looking for People with Hadoop Skills


Apache Hadoop has risen to the number-seven spot on Indeed’s top 10 list of job skills in peak demand. It’s a jump that shows why big data is expected to be one of the most discussed topics in the year ahead.

The job market for people with Hadoop skills is revealing. Amazon is the top job poster with 110 listings on Indeed; a distant second is eBay with 53. Across the top 10, there are 400 jobs posted for people with Hadoop skills. Here is the full list, ranked by number of jobs posted to the site:

  • Amazon (110)
  • eBay (53)
  • Yahoo! Inc. (37)
  • Hortonworks (36)
  • Facebook (33)
  • Apple (28)
  • General Dynamics – IT (28)
  • EMC Corporation (27)
  • Northrop Grumman (25)
  • Twitter (23)

Amazon is looking for software engineers, database administrators and developers: engineers to lead teams building distributed storage systems, software development engineers in test, senior developers to build new APIs, and a host of other roles across the company. Yesterday, my colleague Klint Finley posted about an Amazon job listing for a new big-data stream-processing service the company plans to launch.

eBay is known for its deep commitment to Hadoop. It recently rebuilt its search engine with a Hadoop foundation. In its most recent postings, the company is looking for software engineers and people with operations skills.

Yahoo! may be up for sale, but it is still one of the most active recruiters of people with Hadoop experience. According to Indeed, Yahoo is looking for people with Hadoop skills to support search, its content platform group and overall application development. Here’s its ad for a solutions architect:

The Solutions and Services team in the Cloud Platform Group at Yahoo! is looking for an architect to provide custom solutions for a content delivery system running over a distributed/cloud computing environment made up of thousands of Hadoop servers. In this capacity you will work with all internal Yahoo! applications that use the infrastructure. You will review application architectures, perform feasibility studies and provide the best tactical and strategic solutions. You will become a subject matter expert in Hadoop and related technologies such as Pig, Oozie and Hive, as well as all the components making up the content delivery system. You will collaborate with other architects and set the long-term vision for the technical roadmap for these components.

The Solutions team is a major contributor to several open-source projects, including Hadoop. As a lead member of the Solutions team, you will influence and shape the future distributed computing platform used by some of the largest software and Internet companies in the world.

Hortonworks is a challenger to Cloudera. The Yahoo spinoff has a few telling requirements for prospective employees: it is looking for people with experience on AWS or OpenStack, and it wants people with knowledge of web search, high-performance graphics, algorithmic trading and real-time operating systems. Like Cloudera, Hortonworks is making a play in the software space with its own distribution. Hortonworks is also behind Apache Ambari, which is designed to deploy, configure, manage and monitor Hadoop clusters.

Facebook is looking for data scientists, someone to own email on Facebook and a person to manage growth internationally. Many of the jobs Facebook posts are for analysts, with Hadoop as one skill among many in the requirements. For example, Facebook is looking for a product manager who would focus on project areas “like underoptimized friending/signup/contact importing flows, optimize our mobile applications (esp on feature phones or in developing markets), or help new users ramp up to engaged users.” The boilerplate “Hadoop” requirement for these jobs: “Leverage tools like Hadoop/Hive, Oracle, ETL, R, PHP, Python, Excel, and MicroStrategy to drive efficient analytics and reporting.” That in itself shows a shift in the skills that analysts and product managers will increasingly need.
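Stripped of the tool names, that boilerplate mostly describes grouped aggregation over event logs — the funnel analysis the product-manager posting alludes to. Here is a toy sketch in Python; the event names and numbers are invented for illustration, and a Facebook analyst would run the equivalent as a Hive query over far larger logs:

```python
# Hypothetical event log of (user_id, event) pairs, as an analyst
# might pull from Hive. All names and figures here are made up.
events = [
    (1, "signup"), (1, "friend_added"),
    (2, "signup"),
    (3, "signup"), (3, "friend_added"),
]

# Group distinct users by the funnel step they reached.
step_users = {}
for user, event in events:
    step_users.setdefault(event, set()).add(user)

# Conversion rate from signing up to adding a friend.
conversion = len(step_users["friend_added"]) / len(step_users["signup"])
print(f"signup -> friend conversion: {conversion:.0%}")  # 67%
```

The point is not the five-line loop but the shape of the question: at Facebook’s scale, the same group-by runs as a distributed job, which is why “Hadoop/Hive” shows up in analyst requirements at all.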

Apple is investing heavily in Hadoop and related technologies such as NoSQL. Yesterday it posted a job listing for an iOS Hadoop engineer to build a platform for the next generation of Apple Cloud services:

Explore the far reach of possibilities by joining the team building the future of Cloud services at Apple!

Would you consider joining a small operations team writing code and designing Cloud infrastructure using Hadoop? We are looking for an extremely capable engineer who has a strong background in developing and maintaining speech servers, and who has built high-performance, highly scalable and extensible systems architecture.

Within this highly visible position with the iOS organization, a successful candidate will collaborate with cross-functional engineering teams to define and implement some of the core backend platform frameworks and systems that will power next generation Apple Cloud services.

Apple is also looking for people to build the next generation of advertising, field diagnostics, iTunes and more. Watch Apple and what it does with its new products. You’ll know Hadoop had an influence.

Wow. Every position General Dynamics has posted requires a polygraph test. Oh, my. General Dynamics provides information technology, systems engineering and professional services to customers in the defense, intelligence, homeland security, federal civil and commercial sectors. Its place in the top 10 shows the deep interest defense contractors have in big data. Here’s a job posting for a senior cloud application developer to help build a new platform-as-a-service:

General Dynamics-AIS is looking for a Cloud Application Developer to work with the task lead on an integrated team to help migrate an existing system to a cloud using map/reduce. This effort is focused on using analytics in the cloud and Platform as a Service (PaaS). Additionally, the Cloud Application Developer will contribute to further development and maintenance of the system.
Must have at least 10 years software development experience, at least 5 years of Java experience, and at least 1 year of experience using cloud algorithms such as map/reduce or Hadoop.
Must have at least a TS/SCI with CI polygraph clearance to start, but will need to take a Full Scope polygraph.
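The “map/reduce” that ad keeps naming is Hadoop’s core programming model: a map phase emits key/value pairs from raw input, and a reduce phase aggregates all values sharing a key. Here is a minimal local sketch of that shape in Python, using the classic word-count example — no Hadoop cluster involved, just the model the posting expects candidates to know:

```python
from collections import defaultdict

def map_phase(lines):
    """Map: emit a (word, 1) pair for every word in the input lines."""
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    """Reduce: sum the counts for each distinct key (word)."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["big data big jobs", "hadoop jobs"]
print(reduce_phase(map_phase(lines)))
# {'big': 2, 'data': 1, 'jobs': 2, 'hadoop': 1}
```

In real Hadoop the same two functions would be written against the MapReduce API and run in parallel across the cluster, with the framework shuffling each key’s pairs to the right reducer; the “cloud algorithms” experience the ad asks for is fluency in decomposing problems into this map/shuffle/reduce shape.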

EMC is looking for more data scientists than any other employer in the top 10. EMC Greenplum is building out an analytics platform for data scientists, and the job descriptions also fit with EMC’s focus on professional services. An excerpt from one of the listings:

In some cases, our customers are taking the initial steps beyond BI, and in others, they are building Next Generation Analytics Platforms supporting hundreds or thousands of Data Scientists. The Analytics Architect will draw on 10+ years of experience building similar analytics-centric technology stacks to design customized architectures based on Greenplum’s Unified Analytics Platform and partner technology ecosystem.

The Analytics Architect will partner closely with Greenplum Data Scientists to build new sources of data-driven innovation and value for our customers and prospects while proving the analytical capabilities of Greenplum’s technology stack. Together, they will help customers respond to their top-priority business goals by introducing new ways to tap their exploding data assets, combining a robust integrated technology stack and advanced forms of machine learning and data mining.

Northrop Grumman is the second defense contractor in the top 10. I like this cloak-and-dagger posting:

Do you have an analytical mind towards information security and the desire to make the world a safer place? Do you enjoy learning about new technologies and how they can be manipulated for nefarious purposes? Do you like exploring and testing out new technologies and applying them to solve real-world problems? If so, then look to join the Northrop Grumman team as a Software Engineer in support of a Computer Security Incident Response Center (CSIRC) team protecting a large Federal Government network.

Twitter rounds out the top 10. It follows a tradition we see at Web-oriented companies: it offers employees the opportunity to work with open-source technologies and the freedom to contribute their work upstream to the open-source community. Twitter is seeking people with Hadoop skills to advance its recommendation systems, discovery, spam filtering, personalization and its user interface. It is also looking for people with Hadoop experience to help with storage architecture, operations, test and development, and other needs throughout the company.

The listings on Indeed are a window into the huge demand for people with Hadoop skills. But how can all these positions be filled? The need for such deep technical talent demonstrates why Hadoop is still viewed as a technology primarily for engineer-centric organizations.