Summary:
- DevOps/SRE role supporting NextGen Platforms built around Big Data technologies (Docker containers, Jupyter Notebook, Hadoop, Spark, Kafka, Impala, HBase, Ansible, and many more). Requires experience in cluster management of vendor-based Hadoop and Data Science (AI/ML) products such as Cloudera, DataRobot, C3, Panopticon, Talend, Trifacta, Selerity, ELK, and KPMG Ignite. The analyst is involved in the full life cycle of an application and is part of an agile development process.
- The role requires the ability to interact, develop, engineer, and communicate collaboratively at the highest technical levels with clients, development teams, vendors, and other partners. The following section is intended to serve as a general guideline for the dimensions of project complexity, responsibility, and education/experience within this role.
- Works on complex, major, or highly visible tasks in support of multiple projects that require multiple areas of expertise
- The team member will be expected to provide subject matter expertise in managing Hadoop and Data Science platform operations, with a focus on OpenShift, Docker containers, Cloudera Hadoop, and Jupyter Notebook cluster management and administration
- Integrates solutions with other applications and platforms outside the framework
- The team member will be responsible for managing platform operations across all environments, including upgrades, bug fixes, deployments, metrics/monitoring for resolution and forecasting, disaster recovery, and incident/problem/capacity management
- Serves as a liaison between client partners and vendors in coordination with project managers to provide technical solutions that address user needs
- Examples: Java; CCAR; OTC; Agile; SQL; knowledge of Docker containers, OpenShift, Kubernetes, Jupyter Notebook, and the Cloudera Big Data stack
- Technical knowledge: Unix/Linux; databases (Sybase/SQL/Oracle); Java, Python, Perl, shell scripting; infrastructure
- Experience with monitoring/alerting and job scheduling systems
- Comfort with frequent, incremental code testing and deployment
- Solid grasp of automation/DevOps tools: Ansible, Jenkins, SVN, Bitbucket
- Red Hat OpenShift Admin/Dev certification
- Knowledge of the Cloudera Big Data stack (Cloudera Admin): Kafka, Spark, Impala, Hive, HBase, etc.
location: Phoenix, Arizona
job type: Contract
salary: $60 - $70 per hour
work hours: 8am to 5pm
education: Bachelor's degree
responsibilities:
- DevOps/SRE role supporting NextGen Platforms built around Big Data technologies (Docker containers, Jupyter Notebook, Hadoop, Spark, Kafka, Impala, HBase, Ansible, and many more). Requires experience in cluster management of vendor-based Hadoop and Data Science (AI/ML) products such as Cloudera, DataRobot, C3, Panopticon, Talend, Trifacta, Selerity, ELK, and KPMG Ignite. The analyst is involved in the full life cycle of an application and is part of an agile development process.
- The role requires the ability to interact, develop, engineer, and communicate collaboratively at the highest technical levels with clients, development teams, vendors, and other partners. The following section is intended to serve as a general guideline for the dimensions of project complexity, responsibility, and education/experience within this role.
- Works on complex, major, or highly visible tasks in support of multiple projects that require multiple areas of expertise
qualifications:
- Experience level: Experienced
- Minimum 3 years of experience
- Education: Bachelor's degree
skills:
Equal Opportunity Employer: Race, Color, Religion, Sex, Sexual Orientation, Gender Identity, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status.
For certain assignments, Covid-19 vaccination and/or testing may be required by Randstad's client or applicable federal mandate, subject to approved medical or religious accommodations. Carefully review the job posting for details on vaccine/testing requirements or ask your Randstad representative for more information.