Big Data / Hadoop Developer

  • location: Durham, NC
  • type: Contract
job description

job summary:

Our Client is currently seeking a Big Data / Hadoop Developer to work in Durham, NC at Client Investments.

Overview

We enable business partners to win in their respective marketplaces by crafting, building and maintaining the technology platforms and products of Client Institutional, Personal Investing and Workplace Investing.

We are hiring Technologists to build and support various BI applications as part of the Consolidated Data Platform initiative.

Role Description

The group is looking for a Principal/Lead Big Data Application Developer. You will be responsible for partnering with project sponsors, analysts, and architecture teams to design and deliver best-in-class business intelligence applications.

The expertise we're looking for

  • Bachelor's degree or higher in a technology-related field (e.g., Engineering, Computer Science) required; Master's degree a plus
  • 10+ years of experience architecting and crafting highly scalable distributed data processing systems
  • Experience implementing batch and real-time Big Data integration frameworks and/or applications
  • Hands-on experience in one or more modern Object Oriented Programming languages (Java, Scala, Python)
  • Experience with DevOps, Continuous Integration and Continuous Delivery (Jenkins, Stash).
  • Experience executing projects in Agile environments (Kanban and Scrum).
  • Experience with Snowflake will be a plus.

Candidate Description

  • In-depth understanding of the Hadoop ecosystem and the HDFS file system
  • Experience with Hive, Spark, and Python (minimum three years of hands-on experience required)
  • Experience with Oracle databases; PL/SQL programming is a must
  • Experience with data ingestion tools such as Sqoop
  • Working experience in Java
  • 2+ years of experience with DevOps, Continuous Integration and Continuous Delivery (Jenkins, Stash).
  • Experience and comfort executing projects in Agile environments (Kanban and Scrum).
  • You enjoy analyzing data, identifying gaps, issues, patterns and trends
  • You can analyze application dependencies and conduct impact assessment of changes
  • You have a good understanding of database design concepts - Transactional / Data mart / Data warehouse etc.
  • Working experience with Informatica ETL and Spark SQL
  • Experience with Spark, including an understanding of its architecture and internals
  • Experience with the Cloudera distribution of Hadoop
  • Exposure to/knowledge of streaming tools such as Kafka and/or Flume is an advantage
  • Experience with Snowflake, shell scripting, and Control-M
  • Willing to participate in a weekend on-call rotation

Behavioural Attributes

  • Ability to work with multi-functional teams located across geographies
  • Good interpersonal and client-handling skills with the ability to balance expectations and explain technical detail.
  • Ability to multitask, prioritize tasks, and adjust quickly in a rapidly changing environment.
  • Work effectively with geographically dispersed teams.
  • Balance complex information with accuracy and attention to detail.
  • A focused, investigative, and inquisitive mind, together with creative abilities.
  • Excellent conflict management and negotiation skills
  • Eager to learn and continuously develop personal and technical capabilities.
  • High level of dedication, initiative, vision and enthusiasm
  • Professional approach to time, costs and deadlines.

Education and Experience

  • Bachelor's degree in Computer Science or Information Technology
  • 5+ years of experience crafting and developing large-scale Big Data engineering applications
  • 10+ years of hands-on experience in architecting, designing and developing highly scalable distributed data processing systems
 
location: Durham, North Carolina
job type: Contract
work hours: 9am to 5pm
education: Bachelor's
 

Equal Opportunity Employer: Race, Color, Religion, Sex, Sexual Orientation, Gender Identity, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status.
