job summary:

Duties and Responsibilities:
- Provides intermediate-level system analysis, design, development, and implementation of Big Data analytic applications on AWS. Integrates third-party products.
- Translates technical specifications and/or logical and physical designs into code for new or enhancement projects for internal clients. Develops well-structured code and test artifacts that reuse subroutines or objects, are backed by automated tests, include sufficient comments, and are easy to maintain. Writes programs, appropriate test artifacts, ad hoc queries, and reports. Employs contemporary software development techniques to ensure tests are implemented in a way that supports automation.
- Promotes code into the development, test, and production environments on schedule. Provides follow-up production support. Submits change control requests and documentation.
- Follows software development methodology and architecture standards.
- Participates in design, code, and test inspections throughout the life cycle to identify issues. Participates in other meetings, such as those for use case creation.
- Participates in systems analysis activities, including system requirements analysis and definition (e.g., prototyping, experimentation).
- Understands client business functions and technology needs.
- Participates in special projects as assigned.

Education & Experience:
- Undergraduate degree in a related field, or the equivalent combination of training and experience
- Minimum 3 years developer experience
- Strong written and oral communication skills
- Strong analysis and problem solving skills
- Strong planning and organizational skills
- Good understanding of lean principles, including shift-left testing, DevOps, CloudOps, and continuous delivery; must have experience with Maven, Cucumber, Bamboo, Git, etc.
- Experience with Big Data and Hadoop-related technologies such as Hive, Impala, Oozie, YARN, and MapReduce. Good knowledge of data streaming services such as Flume, Kafka, etc.
- Familiarity with data-related strategies and architecture, including offensive uses (data prep/transformation, enrichment, data analytics, modeling, visualization, etc.) and defensive uses (data ingestion, operationalization/standardization, storage, security, etc.)
- Good understanding of cloud architectures such as AWS, Microsoft Azure, etc., and related Big Data technologies (AWS EMR, S3, Kinesis, CloudFormation templates, CloudWatch Logs, Splunk, Ansible, etc.)
- Familiarity with Machine Learning (ML) and AI concepts would be a plus, especially the use and application of such technologies to support our marketing and business efforts to deepen our understanding of external clients and investors.
- Familiarity with scripting languages and platforms such as Linux, Python, R, SQL, Scala, etc.
- Maintains flexibility in meeting deadlines and expectations in the face of shifting priorities.
location: Charlotte, North Carolina
job type: Permanent
salary: $80,000 - $120,000 per year
Equal Opportunity Employer: Race, Color, Religion, Sex, Sexual Orientation, Gender Identity, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status.