Data Analyst SME (AI Engineer-DevOps)

  • location: Charlotte, NC
  • type: Contract

job description


job summary:

  • The AI Engineering team at the client is a tight-knit, startup-oriented, fast-paced, growing team working with various lines of business.
  • With a huge scope covering recommendation engines, forecasting, routing/workforce optimization, image and video analytics (drones, satellites, LiDAR), robotics, anomaly detection, and much more, the opportunities are ever-growing.
  • As an AI Engineer, you will help create innovative solutions to business problems by operationalizing models developed by data scientists in a private cloud, including Pivotal Cloud Foundry/Kubernetes, and in AWS/Azure down the line.
Note: Candidates must be able to convert to FTE.

 
location: Charlotte, North Carolina
job type: Contract
work hours: 9am to 5pm
education: Bachelor's
 
qualifications:
Required Qualifications

  • Bachelor's degree in Computer Science, Information Systems, or a related discipline, or 6 years of equivalent work experience in lieu of a degree
  • Solid Linux administration skills
  • Strong knowledge of public/private cloud infrastructure and services (PCF, AWS, Kubernetes)
  • Experience with continuous integration suites (Jenkins, Bamboo)
  • Experience with web servers (Apache, NGINX)
  • Working knowledge of Git, GitHub, and Bitbucket
  • Proficiency in a programming/scripting language (Python, Ruby, Bash, PHP, Go, Node.js)
  • Application containerization (Docker, Kubernetes, Rancher, ECS)
  • Working knowledge of monitoring tools (Nagios, ELK, New Relic, Graphite)
  • Working knowledge of provisioning tools (SaltStack, Puppet, Chef, Vagrant, Terraform)
  • Knowledge of Java development
  • Experience deploying data science models in Python on PCF/Kubernetes
  • Experience with networking and VPCs
  • Experience with database technologies (queries and connections) is a plus
  • Able to pick up requests for container provisioning and model operationalization and act on them quickly
  • Able to work with a startup mentality in a fast-paced, constantly evolving team; self-motivated
  • Leadership skills: proactive, takes ownership, drives projects, interfaces with the business, represents the team, and shares knowledge
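In practice, the model-operationalization requirement above usually amounts to wrapping a trained model behind a small HTTP endpoint and packaging that service as a container for PCF or Kubernetes. A minimal sketch, assuming a hypothetical `/predict` route and a stand-in linear model (the route, port, and model are illustrative, not from this posting):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def predict(features):
    """Stand-in for a data scientist's model: a toy linear score.
    A real service would load a serialized model artifact instead."""
    weights = [0.4, 0.6]  # hypothetical coefficients for illustration
    return sum(w * x for w, x in zip(weights, features))

class PredictHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/predict":
            self.send_error(404)
            return
        # Read the JSON request body: {"features": [x1, x2]}
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        body = json.dumps({"score": predict(payload["features"])}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Listen on 8080, the port a Kubernetes Service or PCF route would target.
    HTTPServer(("0.0.0.0", 8080), PredictHandler).serve_forever()
```

From here, the script would be baked into a container image (Docker) and deployed behind a Kubernetes Service or a PCF route, which is the kind of request this role is expected to pick up and turn around quickly.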
Additional Info:

  • Assesses complex data systems and programs in support of ad-hoc and standing management or customer requests.
  • Creates programs, methodologies, and files for analyzing and presenting data. Examines data quality, applications, and functions.
  • Produces output and sustains operation.
  • Researches new data sources and analytical tools.
  • Contributes to new product development and improvement in product delivery and presentation.
  • Develops awareness of and familiarity with issues and events affecting organization, department, and/or customer.
  • Uses and supports database applications and analytical tools. Solicits timely and appropriate participation of users/customers in data collection and query systems.
  • 10+ years of experience.
 

Equal Opportunity Employer: Race, Color, Religion, Sex, Sexual Orientation, Gender Identity, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status.


