Machine Learning Data Analyst

  • location: Waterloo, IA
  • type: Contract

job description

Machine Learning Data Analyst

job summary:
As a Machine Health Data Analyst, you will work on a cross-functional team focused on using telematics data to deliver mission-critical solutions, and you'll do it with cutting-edge big data technologies. You will develop methods to analyze data from Agricultural, Construction, and Forestry machines, enabling automated solutions that deliver machine uptime to our customers. The focus is on telematics data mining, manipulation, analytics, visualization, and automated solution development. Our team's entrepreneurial attitude and collaborative spirit set us apart and will keep you impassioned, driven, and fulfilled. Responsibilities include the development, maintenance, and support of methods to deliver automated solutions to dealers.

location: Waterloo, Iowa
job type: Contract
work hours: 8am to 5pm
education: Bachelor's
You will work on projects to:

  • Connect, flow, store, transform, analyze, and visualize data from Client engines and machines
  • Collaborate with engineering and reliability subject-matter experts on our team to develop automated solutions using SQL and R
  • Develop solutions for identifying patterns or trends to troubleshoot Engine-related problems, using Telematics data and/or Warranty data

Requirements:

  • At least 2 years of experience with writing scripts in SQL
  • At least 2 years of experience with R or Python
  • At least 2 years of experience with Tableau
  • Any experience with AWS (Amazon Web Services) Compute, Analytics, and/or Storage products is strongly preferred.
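As a rough illustration of the telematics pattern-finding this role describes (the fault codes, machine IDs, and data below are hypothetical, and in practice this analysis would run in SQL or R against real telematics stores), a minimal sketch in Python:

```python
from collections import Counter

# Hypothetical telematics fault records: (machine_id, fault_code).
# Real data would come from engine telematics or warranty systems.
records = [
    ("M1", "E101"), ("M2", "E101"), ("M3", "E204"),
    ("M1", "E101"), ("M2", "E307"),
]

# Count occurrences of each fault code to surface recurring engine issues
fault_counts = Counter(code for _, code in records)
top_fault, top_count = fault_counts.most_common(1)[0]
print(top_fault, top_count)  # the most frequent fault code and its count
```

The same grouping-and-counting idea scales up as a `GROUP BY ... ORDER BY COUNT(*)` query in SQL or a `dplyr::count()` call in R.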
skills:

  • Strong communication skills, both technical and interpersonal, to integrate and facilitate technical communication and alignment with our worldwide team
  • Bachelor's Degree in Mathematics, Statistics, Computer Science, Computer Engineering, or Data & Analytics
Ideal Candidate:

  • At least 2 years professional experience with statistical analysis methods
  • Tableau Desktop certified
  • At least 1 year experience with Apache Spark
  • At least 1 year experience with Scala
  • At least 1 year experience with R Markdown/R Notebooks/R Shiny/R Connect
  • At least 1 year experience with creating and using AWS Lambda functions, SNS topics, and S3 Buckets
  • At least 1 year experience with the Hadoop framework, ecosystem, and components
Tools: You will use and gain experience with a wide range of tools and components as you develop cutting-edge ways to better support our customers and dealers.

Such as: Aginity, RStudio, R Shiny/R Connect, Python, Hadoop, Hive, HBase, NiFi, Hue, AWS, Netezza, Tableau, Sqoop, Oozie, Spark, HDFS, Grafana, Databricks

Equal Opportunity Employer: Race, Color, Religion, Sex, Sexual Orientation, Gender Identity, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status.
