About Gridcog:
Gridcog provides advanced software to plan, simulate and optimise decentralised energy projects. Our software is used by large energy suppliers, energy project developers, technology providers and large energy users. We have a loyal and growing customer base across Australia, and are expanding to the UK and Europe.
We believe the future of energy is distributed, smart and clean, and that we can use software to accelerate decarbonisation and to help tackle climate change.
We're a fast-growing technology startup and we want to have a global impact. We're looking for smart, savvy and curious learners to join our team and help invent new technology to lead the world into a decentralised energy future.
About the role:
We’re looking for a Software Engineer with Python and AWS data processing experience to join our team. A key component of the Gridcog platform is the ingestion and processing of a wide variety of data related to energy generation, usage and prices, sourced from energy regulators, energy suppliers, and customer assets such as solar and wind farms and large-scale batteries. You will help evolve and maintain our data processing infrastructure, ensuring reliable delivery of the data that helps customers design and deliver the most efficient and effective energy transition outcomes.
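To give a flavour of this kind of work, here is a minimal, hypothetical sketch of normalising interval meter readings with pandas. The CSV layout, column names and units are illustrative assumptions, not Gridcog's actual schema.

```python
import io
import pandas as pd

# Hypothetical example: half-hourly meter readings with spot prices.
# Column names and values are illustrative only.
RAW_CSV = """meter_id,interval_start,kwh,price_aud_per_mwh
M001,2024-07-01T00:00:00+08:00,12.4,85.0
M001,2024-07-01T00:30:00+08:00,11.9,92.5
"""

def normalise_intervals(raw: str) -> pd.DataFrame:
    """Parse raw interval data into a tidy, time-indexed frame."""
    df = pd.read_csv(io.StringIO(raw), parse_dates=["interval_start"])
    # Convert energy to MWh and compute the cost of each interval.
    df["mwh"] = df["kwh"] / 1000.0
    df["cost_aud"] = df["mwh"] * df["price_aud_per_mwh"]
    return df.set_index("interval_start").sort_index()

if __name__ == "__main__":
    print(normalise_intervals(RAW_CSV))
```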
We're a remote-first team that values some in-person time - our largest cluster is in Perth, followed by Melbourne - but anywhere in Australia could work for the right candidate.
Requirements
Our ideal candidate has:
- Proven experience as a software engineer or data engineer
- Experience designing and building data integrations and APIs with Python
- Experience with data engineering tools and data processing libraries such as pandas and numpy
- Experience with AWS, and familiarity with serverless and event-driven architectures
- Experience with ETL/ELT pipelines and both structured and unstructured data stores (a sketch of a typical pipeline step follows this list)
- Solid algorithm development skills and a good understanding of data structures
- System design skills: the ability to design robust, reliable and highly available online services
- Ability to communicate technical concepts clearly to technical and non-technical team members
- Experience with API design, database schema design, and automated testing
- Experience with CI/CD and with modern monitoring and observability techniques
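As a rough illustration of the serverless, event-driven ETL experience described above, the sketch below shows an AWS Lambda handler that reacts to an S3 "object created" event, aggregates the new file with pandas, and writes a curated Parquet output. The bucket name, key layout and column names are hypothetical, and this is not Gridcog's actual pipeline code.

```python
import io
import urllib.parse

import boto3
import pandas as pd

s3 = boto3.client("s3")

# Hypothetical destination bucket; not a real Gridcog resource.
CURATED_BUCKET = "example-curated-data"

def handler(event, context):
    """Lambda entry point for an S3 'object created' notification."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

        # Extract: pull the raw CSV that triggered the event.
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        df = pd.read_csv(io.BytesIO(body), parse_dates=["interval_start"])

        # Transform: aggregate half-hourly readings to daily totals per meter.
        daily = (
            df.set_index("interval_start")
              .groupby("meter_id")
              .resample("1D")["kwh"]
              .sum()
              .reset_index()
        )

        # Load: write the curated output as Parquet (requires pyarrow).
        buf = io.BytesIO()
        daily.to_parquet(buf, index=False)
        s3.put_object(
            Bucket=CURATED_BUCKET,
            Key=f"daily/{key}.parquet",
            Body=buf.getvalue(),
        )
```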
What you’ll do:
- Build and take ownership of key services for our SaaS product, with a focus on backend services and data flows.
- Utilise your in-depth knowledge of AWS services to build scalable, reliable, and highly available cloud solutions.
- Work on data ingestion, processing, aggregation, and data pipeline components to enable seamless data transformation.
- Design and implement APIs and events to enable integration with other applications (an illustrative event-publishing sketch follows this list).
- Optimise software components for performance and scalability to handle large data volumes efficiently.
- Create and maintain clear, comprehensive documentation for software architecture and code.
- Collaborate with product managers, data engineers, and data scientists to understand and address customer requirements.
- Troubleshoot and resolve software issues, including bug fixes, performance improvements, and enhancements.
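As a sketch of the kind of event-based integration referred to above, the snippet below publishes a small domain event to Amazon EventBridge with boto3 so that other services can subscribe to it. The bus name, event source and payload fields are invented for illustration.

```python
import json
from datetime import datetime, timezone

import boto3

events = boto3.client("events")

# Hypothetical event bus name; purely illustrative.
EVENT_BUS = "example-platform-bus"

def publish_ingestion_completed(source_key: str, row_count: int) -> None:
    """Publish a small domain event so downstream services can react to it."""
    events.put_events(
        Entries=[
            {
                "EventBusName": EVENT_BUS,
                "Source": "example.data-pipeline",
                "DetailType": "IngestionCompleted",
                "Detail": json.dumps(
                    {
                        "sourceKey": source_key,
                        "rowCount": row_count,
                        "completedAt": datetime.now(timezone.utc).isoformat(),
                    }
                ),
            }
        ]
    )
```

Publishing events like this keeps the ingestion service decoupled from its consumers: downstream services subscribe with EventBridge rules rather than being called directly.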
Benefits
- Competitive salary package aligned with experience and skills.
- Opportunity to work in a remote-first business with flexible working arrangements.
- Weekly opportunities for in-person collaboration at co-working spaces and an annual whole company retreat.
- Join a high-performing team of unapologetic energy and tech nerds to tackle significant challenges.
- Engage in a high-trust distributed team environment that values innovation and creative problem-solving.
- Contribute to the decarbonisation of the world's energy system.
- Time and budget support for ongoing professional and personal development.
- Opportunity for ESOP participation.
Tags
api
AWS
backend
cloud
python