Hadoop Engineer/DevOps Engineer
Dallas
Job Description
CLIENT
As the market leader in decision management software, our client helps the world's largest companies digitally transform their integrated business planning, revenue management, and supply chain management functions. Its platform puts the right information in front of the right people at the right time, so that everyone in a company can make smarter decisions, faster. The client offers a cloud-based platform that connects the supply chain end-to-end through the use of technologies such as AI/ML and NLP. With a global presence across North America, Europe, and Asia-Pacific, the company serves multiple industries and some of the biggest brands in the world. It runs as a flat organization with a very strong entrepreneurial culture (and no corporate politics).
POSITION SUMMARY
ROLE & RESPONSIBILITIES
Work with customers and technical consultants to devise and recommend big data solution architectures based on requirements
Analyze complex distributed production deployments and make recommendations to optimize performance
Document and present complex architectures for the customer's technical teams
Work closely with o9 Dev, DevOps, and project teams at all levels to help ensure the success of projects
Help design and implement Hadoop architectures and configurations for customers, working with cloud deployments
Write and produce technical documentation and manuals to be provided to customers
Keep current with technologies in the Hadoop big data ecosystem
EXPERIENCE, EXPERTISE & EDUCATION REQUIREMENTS
More than three years of DevOps experience architecting large-scale storage, data center, and/or globally distributed solutions
Experience designing and deploying large-scale production Hadoop solutions
Ability to understand and translate customer requirements into technical requirements
Experience designing data queries against data in a Hadoop environment using tools such as Apache Hive, Apache Druid, or Apache Phoenix
Experience installing, administering, and tuning multi-node Hadoop clusters
Strong experience implementing software and/or solutions in the cloud (Azure, AWS, GCP)
Strong understanding of enterprise security solutions such as LDAP and/or Kerberos
Strong understanding of network configuration, devices, protocols, speeds, and optimizations
Strong understanding of Java development, debugging, and profiling
Significant prior experience writing to network-based APIs, preferably REST/JSON or XML/SOAP
Experience implementing MapReduce jobs
Solid background in database administration and design, along with data modeling
Experience architecting data center solutions, properly selecting server and storage hardware based on performance, availability, and ROI requirements
Demonstrated experience implementing big data use cases and an understanding of standard design patterns commonly used in Hadoop-based deployments
Excellent verbal and written communication skills
Demonstrable experience using R and its statistical/machine-learning algorithms