job summary:
Overview:
The candidate will develop software and provide direction to agile squads that design and build highly scalable, distributed systems used to ingest and process structured and unstructured information from many external sources. The teams will build systems that enrich, master, and deliver this content to consumers.
As a content technology developer, you will work closely with design and product teams from around the globe, as well as multiple content technology development teams, to develop and deliver enterprise applications and digital products for Clarivate Content Technology. This is an exciting opportunity to apply your development experience to create high-impact data products. These products will be used by internal and external personnel to curate Clarivate's rich content and provide value-added content for our customers. In this role, you will participate in everything from reviewing business requirements and analysis through the final delivery of enterprise software.
Duties and responsibilities:
Work on the acquisition, transformation, and ingestion of data packages
Support the offshore team in the design, development, and maintenance of new and existing jobs and data packages
Work with cross-functional teams to deliver solutions efficiently
Review business requirements, perform a technical analysis of those requirements, and participate in the estimation process
Participate in architectural discussions for new and existing data applications
Contribute ideas for making the data applications more efficient and faster at processing large amounts of data
Specific technologies used in the Content Technology team:
A minimum of 5-7 years of experience
Extensive experience with ETL development using Talend or similar tools
Experience in an AWS environment
Hands-on experience with XSLT, Unicode, XQuery, XML Schema, XPath, DTDs, ETL development tools, large-scale XML processing, and data integration using AWS technologies
Experience developing jobs involving XML, JSON, Java, XSLT, and AWS services (S3, ECS, SQS) in an Agile environment
Hands-on knowledge of Git, Jenkins, cloud technologies (AWS preferred), and RDBMSs (PostgreSQL preferred)
Knowledge of Java programming, shell scripting, and the Unix environment
Prior experience with Apache Airflow is a plus
Considerable experience with Unix (Linux) and common Unix tools, e.g., pattern matching, regular expressions, and shell programming
Knowledge, Skills, and Abilities Required:
Proven record of delivering work individually in a fast-paced environment with minimal guidance
Good communication skills; results-oriented
Experience working in a global team
Deep knowledge of ETL and data-related technologies
location: Philadelphia, Pennsylvania
job type: Permanent
salary: $120,000 - 140,000 per year
work hours: 8am to 4pm
education: Bachelors
responsibilities:
Work on the acquisition, transformation, and ingestion of data packages
Support the offshore team in the design, development, and maintenance of new and existing jobs and data packages
Work with cross-functional teams to deliver solutions efficiently
Review business requirements, perform a technical analysis of those requirements, and participate in the estimation process
Participate in architectural discussions for new and existing data applications
Contribute ideas for making the data applications more efficient and faster at processing large amounts of data
qualifications:
- Experience level: 5-7 years
- Education: Bachelors (required)
skills:
- ETL
- Content
- AWS
- Java programming
Equal Opportunity Employer: Race, Color, Religion, Sex, Sexual Orientation, Gender Identity, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status.