CHS Corporate
Senior Data Engineer
Full Time

Summary:
As a Senior Data Engineer, you will be instrumental in designing, developing, and maintaining our data pipelines and enterprise data warehouse (EDW) infrastructure on Google Cloud Platform (GCP). This role requires a strong blend of technical expertise, problem-solving ability, and excellent communication skills, with an emphasis on leadership and mentoring within the team.
Essential Duties and Responsibilities:
- Design, build, and maintain scalable and efficient data pipelines using various GCP services.
- Develop and optimize complex SQL queries for data extraction, transformation, and loading (ETL/ELT) processes.
- Write clean, maintainable, and efficient Python code for data processing, automation, and API integrations.
- Collaborate with data architects, analysts, and business stakeholders to understand data requirements and translate them into technical solutions.
- Ensure data quality, integrity, and security across all data platforms.
- Troubleshoot and resolve data-related issues, performing root cause analysis and implementing corrective actions.
- Contribute to the continuous improvement of our data engineering practices, tools, and methodologies.
- Mentor junior team members and provide technical guidance.
- Lead small to medium-sized data initiatives from conception to deployment.
Required Education and Experience:
- Bachelor's Degree or 4 years of equivalent professional experience.
- 5-7 years of professional experience in data engineering or a similar role.
- Proficiency in SQL, with a deep understanding of relational databases and data warehousing concepts.
- Expertise in Python for data manipulation, scripting, and automation.
- Demonstrable experience with Google Cloud Platform (GCP) services related to data engineering (e.g., BigQuery, Dataflow, Composer, Cloud Storage, Pub/Sub).
- Strong understanding of ETL/ELT processes, data modeling, and data architecture principles.
- Excellent problem-solving skills and a strong analytical mindset.
- Ability to work independently and as part of a collaborative team.
- Strong communication and interpersonal skills, with the ability to explain complex technical concepts to non-technical stakeholders.
- Proven leadership potential and a willingness to take ownership of projects.
- Experience with Agile development methodologies.
- Proficiency with version control systems (e.g., Git, GitHub).
- Experience with other programming languages or data processing frameworks.
- Familiarity with data visualization tools (e.g., Looker, Tableau, Power BI).
- Experience with data integration and migration within Oracle Cloud Infrastructure (OCI).
- Familiarity with data structures and extraction methodologies for Oracle Cloud ERP applications (e.g., Financials, HCM, SCM).
- Strong oral and written communication skills, including presentation skills (MS Visio, MS PowerPoint).
- Strong problem-solving and troubleshooting skills with the ability to exercise mature judgment.