Posted 26 Jun 2024, 6:00 pm
Senior Data Engineer at Interwell Health
Interwell Health’s Senior Data Engineer role is responsible for corporate claims data acquisition, file loading and structure development. The ideal candidate will have a strong background in data engineering.
This position requires using current data warehouse technologies to design, plan, implement, test, and maintain EDW databases.
The work you will do:
- Design and develop database systems, processes, and documentation based on the requirements of the company, including collecting data, analyzing data, designing algorithms, drawing flowcharts, and implementing code.
- Collaborate effectively with the development groups to deliver projects to the satisfaction of the client in a timely fashion.
- Maintain current knowledge of emerging technologies relevant to the Data Architecture group and apply them within the department as appropriate. Identify and research enhancement options and process improvements, making suggestions to senior managers as needed to drive continued improvement and innovation.
- Assist with schema design, code review, and SQL query tuning.
- Write and deploy SQL code.
- Guide and mentor junior teammates, acting as a resource on development processes, methodologies, frameworks, and other technical aspects of projects, and provide direction and assistance on working with users and on issues encountered during user interaction.
- Become proficient with the architecture and technology supporting the data warehouse, and design and develop accordingly.
- Share knowledge of the processes being designed and built within the team so that assigned jobs are completed accurately and all deadlines are met.
- Produce sufficient documentation for a smooth hand-over of projects.
- Participate in the formulation and design of methodologies.
- Train in the development tools available with the data warehouse and lakehouse infrastructure and help with development where appropriate.
- Monitor and troubleshoot data pipelines to identify and resolve performance issues, bottlenecks, and data anomalies.
The skills and qualifications you need:
- 5-8 years of related experience in data engineering with a bachelor’s degree, or 3 years of experience with a master’s degree.
- Ability to architect and design both simple and complex solutions for integration with other applications.
- Excellent communication skills, both verbal and written.
- Strong presentation skills, with the ability to present formally to users and customers during requirements gathering, workshops, and feedback sessions.
- Experience working under tight deadlines while maintaining high product relevance and quality.
- Azure/SQL focus
- Strong understanding of data lake and lakehouse architecture principles, data modeling techniques, and data warehousing concepts.
- 5-8 years’ experience using Azure Data Factory pipelines and Databricks.
- 5 years’ experience working with Azure databases.
- Extensive experience with the MS SQL stack (SSIS and ADF).
- Extensive experience with Azure cloud migration.
- Nice to have: Python experience.
Source: Remote Ok