About Low Carbon
Low Carbon is a purpose-driven company that creates large-scale renewable energy in the global fight against climate change. Our mission is to protect the planet for future generations while delivering positive returns for our communities and investors. Our goal is to produce as much new renewable energy as possible while limiting our own environmental impact. We do this by developing, investing in, and operating large-scale renewable energy projects across the globe.
Our people and culture are really important to us. We are friendly, approachable, and professional. We value enthusiasm, entrepreneurialism, clear communication, and drive. This, combined with our purpose and passion for tackling climate change, is why our colleagues recommend Low Carbon as a great place to work.
Role Description
We are looking for an Intern Data Engineer to join our Technology Department for a period of 3 months. Working within the Digital and Data Team, you'll support the Technical Lead in building and delivering our next-generation data platform, a key enabler of Low Carbon's ambition to build renewable energy generation at scale. The role spans the modern analytics stack: implementing new back-end solutions for operational data, contributing to the transition from legacy systems to a modern Azure-based platform, and creating front-end dashboards.
This opportunity will provide the successful applicant with:
- Hands-on experience building scalable, reliable data solutions for a fast-growing renewable energy business
- Growth of technical skills in cloud architecture, data engineering, and analytics
- Experience within an Agile framework, supporting the full technology lifecycle from ideation to deployment
- The chance to collaborate with a diverse team and contribute to impactful projects driving sustainability
Key Responsibilities
- Design and implement scalable data pipelines to ingest, transform, and store operational data from various sources, including SCADA and DCDA platforms
- Develop and maintain data models and Lakehouses using Microsoft Fabric, aligned with medallion architecture principles (an illustrative sketch of this kind of work follows this list)
- Optimise performance of data flows and ingestion pipelines, troubleshooting issues as they arise
- Ensure data quality, integrity, and security across all stages of the pipeline
- Support dashboard and reporting tool development for executive audiences, enabling analysis and visualisation of operational and financial performance
- Document data processes and contribute to best practices within the team
- Collaborate with technical and business stakeholders (Product & Design, Enterprise & Security Architecture, Portfolio Management) to understand needs and deliver solutions
- Adhere to technology governance and compliance standards, ensuring solutions are maintainable and secure
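To give a flavour of the day-to-day work, below is a minimal, hypothetical sketch of a medallion-style transform in a PySpark notebook: raw SCADA readings are promoted from a bronze table to a cleaned, deduplicated silver table. The table and column names are illustrative assumptions, not details of Low Carbon's actual platform.

```python
from pyspark.sql import SparkSession, functions as F

# In a Fabric notebook a Spark session is provided automatically;
# getOrCreate() keeps the sketch self-contained elsewhere.
spark = SparkSession.builder.getOrCreate()

# Bronze layer: raw readings landed as-is from the SCADA source (assumed table name).
bronze = spark.read.table("bronze_scada_readings")

# Silver layer: typed, quality-filtered, and deduplicated records.
silver = (
    bronze
    .withColumn("reading_ts", F.to_timestamp("reading_ts"))
    .withColumn("power_kw", F.col("power_kw").cast("double"))
    .filter(F.col("power_kw").isNotNull())
    .dropDuplicates(["asset_id", "reading_ts"])
)

# Persist the curated table for downstream models and dashboards.
silver.write.mode("overwrite").saveAsTable("silver_scada_readings")
```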
Skills & Experience
- A qualification in Computer Science or Data Engineering (gained through university study, online courses, apprenticeships, or equivalent routes)
- Strong Python programming skills and familiarity with a notebook-based development environment
- Fluency in SQL and a good understanding of relational databases
- Demonstrable capability in Data Engineering or Analytics (formal training or self-taught)
- Excellent problem-solving skills and attention to detail
- Strong communication and collaboration skills
- Interest in renewable energy technologies
Desirable
- Experience working with solar and battery operational data
- Familiarity with Microsoft Fabric and/or GitHub Copilot
- Familiarity with Azure DevOps
- Experience with Power BI
- Exposure to cloud architecture solutions (Azure preferred)
Our Compensation & Benefits
- 26 days holiday plus your birthday off
- Contributory Pension
- Pluxee for commercial discounts and perks
- 3 additional days for volunteering to support causes of your choice