Hadoop Developer

  • location: Charlotte, NC
  • type: Contract
  • salary: $56.24 - $61.24 per hour

job description

job summary:
Randstad Technologies is hiring and we're looking for someone like YOU to join our team! If you are seeking a new opportunity, looking to grow in your career, or you know someone who is - we want to hear from you! Take a look at the below opportunity, or feel free to visit RandstadUSA.com to view and apply to any of our open roles.

location: Charlotte, North Carolina
job type: Contract
salary: $56.24 - $61.24 per hour
work hours: 8am to 5pm
education: Bachelor's

responsibilities:
Design and build a high-performing, scalable data pipeline platform using Hadoop, Apache Spark, MongoDB, and object storage architecture.

Design and build data services on container-based architectures such as Kubernetes and Docker.

Partner with enterprise data teams such as Data Management & Insights and Enterprise Data Environment (Data Lake) to identify the best place to source the data.

Work with business analysts, development teams, and project managers on requirements and business rules.

Collaborate with source system and approved provisioning point (APP) teams, architects, data analysts, and modelers to build scalable, performant data solutions.

Work effectively in a hybrid environment where legacy ETL and data warehouse applications co-exist with new big-data applications.

Work with infrastructure engineers and system administrators as appropriate in designing the big-data infrastructure.

Work with DBAs in the Enterprise Database Management group to troubleshoot problems and optimize performance.

Support ongoing data management efforts for the Development, QA, and Production environments.

Utilize a thorough understanding of available technology, tools, and existing designs.

Leverage knowledge of industry trends to build best-in-class technology that provides a competitive advantage.

Act as an expert technical resource to programming staff in the program development, testing, and implementation process.

qualifications:

- Software engineering experience

- ETL (Extract, Transform, Load) Programming experience

- Agile experience

- Hadoop experience

- Java or Python experience

- Design and development experience with columnar databases using Parquet or ORC file formats on Hadoop

- Apache Spark design and development experience using Scala, Java, Python, or DataFrames with Resilient Distributed Datasets (RDDs)

- Experience integrating with IBM BPM RESTful API

- Nice to have - operational risk, conduct risk or compliance domain experience

- Nice to have - Experience with containers: Docker, Kubernetes

- Nice to have - Experience with Amazon Web Services (AWS), Azure, Object Storage, Elastic Compute Cloud, OnDemand compute, etc.
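As a rough illustration of the extract-transform-load work this role centers on, here is a minimal, framework-free sketch in plain Python. On the job this logic would run on Spark/Hadoop against Parquet or ORC data; the record layout and function names below are hypothetical, chosen only to show the three ETL stages.

```python
# Minimal extract-transform-load sketch in plain Python.
# In practice this would run on Spark/Hadoop; the record layout
# and function names here are hypothetical illustrations.
import csv
import io

RAW = """id,amount,region
1,100.50,east
2,,west
3,250.00,east
"""

def extract(text):
    """Extract: parse raw CSV text into a list of dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows with missing amounts and normalize types."""
    return [
        {"id": int(r["id"]), "amount": float(r["amount"]), "region": r["region"]}
        for r in rows
        if r["amount"]
    ]

def load(rows):
    """Load: aggregate by region -- a stand-in for writing to a warehouse table."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

print(load(transform(extract(RAW))))  # {'east': 350.5}
```

The same extract/transform/load split carries over directly to Spark, where `extract` becomes a DataFrame read, `transform` a chain of filters and casts, and `load` a grouped aggregation written out to storage.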

  • Experience level: Experienced
  • Minimum 7 years of experience
  • Education: Bachelor's
  • Hadoop
  • Apache Spark

Equal Opportunity Employer: Race, Color, Religion, Sex, Sexual Orientation, Gender Identity, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status.
