Principal Data Engineer
Mass General Brigham
Mass General Brigham relies on a wide range of professionals, including doctors, nurses, business people, tech experts, researchers, and systems analysts to advance our mission. As a not-for-profit, we support patient care, research, teaching, and community service, striving to provide exceptional care. We believe that high-performing teams drive groundbreaking medical discoveries and invite all applicants to join us and experience what it means to be part of Mass General Brigham.
Job Summary
Responsible for overseeing the design, development, implementation, and maintenance of data solutions within the organization. Supports a team of data engineers, collaborating with cross-functional teams, data scientists, and business stakeholders to ensure the efficient and reliable management of data.
Essential Functions
- Develop ETL/ELT architectural patterns, guidelines, and methods to manage data assets across the data life cycle, from creation or acquisition through archival and purge.
- Design and implement ETL/ELT architectural standards for multiple data platforms and technologies.
- Drive delivery of highly optimized, scalable, and resource-efficient ETL systems.
- Evaluate and recommend new ETL/ELT features, products, and technologies.
- Act as an escalation point to troubleshoot and solve technical challenges.
- Provide leadership in ETL/ELT designs, architectural decisions, and technical solutions.
- Develop, test, and implement new and revised features and functionality as needed.
- Lead code review and knowledge transfer sessions for project ETL deliverables.
- Develop the Data Engineering Quality Circle agenda and drive implementation of standards and best practices.
- Mentor and support junior and senior data engineers.
Qualifications
- Bachelor's Degree in Computer Science or a related field of study required
- MGB can review and consider experience in lieu of a degree
- 8-10+ years of experience in professional information technology positions required
- 5-7 years of data warehousing development in large reporting environment(s) required
- Experience working with data integration tools, ETL frameworks, and workflow management systems (e.g., Apache Airflow) required
Knowledge, Skills and Abilities
- Strong expertise in data engineering principles, data modeling, ETL development, and data warehousing.
- Strong ability to build data pipelines using Snowflake features (Snowpipe, SnowSQL, Snowsight, Streams).
- Proficiency in working with relational databases (e.g., SQL Server, Oracle, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra).
- Familiarity with cloud-based data solutions, preferably with AWS, Azure, or Google Cloud Platform.
Additional Job Details (if applicable)
Working Conditions
- Onsite flexible working model required
- In office onsite weekly as planned for business needs
- M-F Eastern business hours required; remote working days require a stable, secure, quiet working area
Mass General Brigham Competency Framework
At Mass General Brigham, our competency framework defines what effective leadership “looks like” by specifying which behaviors are most critical for successful performance at each job level. The framework comprises ten competencies (half People-Focused, half Performance-Focused), each defined by observable and measurable skills and behaviors that contribute to workplace effectiveness and career success. These competencies are used to evaluate performance, make hiring decisions, identify development needs, mobilize employees across our system, and establish a strong talent pipeline.