An employer in San Diego is looking for a Lead Data Integration Engineer. This person will be based out of Mexico and will join the Enterprise Data Architecture and Platform organization. This team is the gateway for all data coming into the organization and is essential in defining and delivering the data and analytics insights leveraged by business teams throughout the retail organization. This person will work closely with Scrum Masters, Data Architects, QA Engineers, and DevOps Engineers to engineer data and ETL pipelines that bring new data into a data warehouse. The Data Integration Engineer will primarily work on API and data integrations, and must be able to understand custom data pipelines, modern ways of automating data pipelines using cloud-based and on-premises technologies, and end-to-end data integration requirements. This person will be responsible for developing scripts to extract, load, and transform data; writing SQL queries; integrating on-premises infrastructure with cloud infrastructure; and actively testing and clearly documenting implementations. This person can sit in Queretaro, MX, or must be willing to relocate there.
Required Skills & Experience
– 7+ years of experience writing complex SQL queries and building custom data pipelines within a large enterprise environment (preferably at terabyte scale)
– Strong background in API data integration, with hands-on experience with RESTful APIs and GraphQL
– 5+ years of experience developing ETL, ELT, and data warehousing solutions
– Strong understanding of the business side of data, and experience preparing ETL output for the reporting layer
– 1+ years of data warehousing experience with Snowflake
– 5+ years of experience with Python-based data engineering and CI/CD pipelines
– 2+ years of experience within a cloud environment: AWS, Azure, or Google Cloud
– Ability to engineer solutions using AWS products such as EC2, S3, Lambda, SQS, DynamoDB, etc.
– Experience automating DevOps builds using GitHub/GitLab, Jenkins, or Maven
– Strong understanding of various data formats such as CSV, XML, JSON, etc.
– Strong understanding of Agile processes and tracking work using Jira and Confluence
– Outstanding collaboration and communication skills across the team; able to provide guidance, take feedback, and look at complex problems from multiple perspectives
– BS in Computer Science or a related field
Nice to Have Skills & Experience
– Enterprise retail experience
– Reporting experience with MicroStrategy or Looker
– Experience with dbt for transformations
– Experience with Airflow for orchestration
Benefit packages for this role begin on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options and 401(k) retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.