We are looking for a highly skilled Senior Data Engineer to take a lead role in developing a state-of-the-art data platform. You will join a dynamic team dedicated to building a foundational Enterprise Data Lakehouse that will drive analytics and insights for critical business operations within the energy sector. This is a chance to make a significant impact by architecting and implementing data solutions from the ground up.
Key Responsibilities
- Design and implement robust, scalable data pipelines that integrate disparate data sources into a unified Data Lakehouse.
- Develop and implement data quality frameworks to ensure data correctness and build trusted datasets for analysis.
- Collaborate with business experts to design and support a Data Lakehouse solution that accurately reflects complex business operations.
Required Qualifications
- 5+ years of experience in data engineering, with a focus on designing and maintaining data pipeline architectures.
- Strong programming skills in Python (including pandas, NumPy, PyArrow) and SQL (ANSI, PL/SQL, T-SQL).
- Proven experience implementing a Data Lakehouse using technologies such as Apache Iceberg or Delta Lake.
- Deep understanding of data integration patterns, including ETL, ELT, Pub/Sub, and Change Data Capture (CDC).
- Experience with modern software development practices, including CI/CD, automated testing, and version control.
This is a unique opportunity for a seasoned data professional to build a critical enterprise data platform and shape the future of analytics in a fast-paced industry.
#11029