
Hadoop - Technical Architect

  • Hyderabad, Telangana, India
  • Permanent
  • Software Engineering & Technology
  • 10 - 15 years
Job Title:

Hadoop - Technical Architect

Job Description:

Primary Skills: Hadoop MapReduce, Spark, Druid, Hive, YARN, Oozie, Scala/Java, Web Services, MySQL, Ansible scripts
• Experience designing and building Big Data applications in on-premise, hosted, and public cloud (e.g., AWS) environments operating under stringent SLAs
• Experience designing (cluster topology, technology stack) and defining (sizing, tuning, non-functional goals) large-scale, multi-tenant Hadoop-as-a-Service clusters, preferably on Hortonworks Data Platform
• Deep architecture knowledge and hands-on experience productionizing Hadoop-based products using technologies such as Kafka, Spark, HDFS, Hive, YARN, Druid, Kerberos, Oozie, SQL, and web services (must have); Superset, Kubernetes, and Docker (good to have)
• Strong performance engineering focus on building high-performing Big Data applications with scalable designs; establish performance engineering best practices at both the infrastructure and application levels, and carry out application-level tuning when required
• Implement system management strategies for monitoring, optimization, rapid feedback, and high availability of the Big Data platform; conduct capacity planning periodically and forecast capacity demands
• Collaborate with the engineering, DevOps, and Hadoop admin teams to ensure adherence to optimal design practices, and troubleshoot and resolve issues in dev, test, and production environments so that the infrastructure is highly available and performing as expected
• Collaborate with Hadoop administrators to deploy and upgrade Hadoop clusters; manage nodes, services, and users; tune clusters and jobs; and implement non-functional features such as high availability, security, backup and disaster recovery, and monitoring
• Expert-level knowledge of a JVM-based language such as Java or Scala; well versed in JVM design and tuning
• Good knowledge of Linux-based environments

Experience Range:

10 - 15 years

Educational Qualifications:

Any graduate, postgraduate, or doctoral degree

Skills Required:

Hadoop, Microservices, Big Data Analytics, AWS, Spark

Job Code: KL-X386FED3
About Us
Headquartered in Dallas, [x]cube LABS helps enterprises globally with digital transformation and digital technology adoption. We take a highly collaborative approach and help solve problems across the entire digital transformation journey, from ideation and strategy to experimentation and execution. We specialize in helping enterprises transform their customer experience and, in the process, leverage new, digitally driven business models.

[x]cube LABS helps enterprises innovate and disrupt markets by leveraging digital as a strategy. The [x]cube team offers deep expertise in all things digital: CX strategy and transformation, digital innovation, augmented reality, virtual reality, blockchain, social, mobile, analytics, cloud, IoT, and more. We have delivered over 800 solutions across industries, won 25+ international awards, worked with 500+ clients, and created over US $2.0 billion in value for our clients. [x]cube is also one of the first 12 agencies globally to be approved by Google as a Certified Developer Partner.
Why Us?
We are one of the first 12 agencies globally to be approved by Google as a Certified Developer Partner and one of the few companies to receive the AWS IoT Service Delivery Designation. At [x]cube LABS, innovation is the mantra at every desk, and each project offers unique opportunities to learn, lead, and achieve as a team.

• Always at the cutting edge of innovation, we present an incredible opportunity to learn and grow.
• Expertise in new-age technologies: learn every tool and tech that will shape our tomorrow.
• We believe in being creative and versatile: work on a range of digital products crafted to engage and entertain.
• [x]cube's clientele features some of the top global Fortune 1000 enterprises: make your mark on products that benefit billions of lives.
• Access to our state-of-the-art research labs: formulate and contribute to game-changing visions.
• Competitive packages, attractive benefits, and rewards.
• A fun, informal working environment.

[x]cube LABS, YesGnome & Upshot are divisions of PurpleTalk.

Have a query? Email us at alphateam@xcubelabs.com.