Prodigy Finance – who are we?
Prodigy Finance is a platform that delivers socially responsible financial services, making it possible for students from more than 150 countries to fulfil their dream of studying at the world’s top universities and schools by financing their international postgraduate studies, whilst delivering competitive financial and social returns to alumni, institutional and private investors.
This borderless and innovative model enables education loan financing for students from across the globe, using predicted post-degree affordability rather than present-day salary. Since 2007, Prodigy Finance has extended over US$1 billion through the platform to fund more than 19,000 students.
We are funded by some of the world’s pre-eminent institutions, including Index Ventures, Balderton Capital, RMIH, Credit Suisse and Deutsche Bank.
What will you do in the role?
In this role, you will be part of a data and analytics team building a state-of-the-art data platform. As an integral and trusted member of the data engineering team, you will advocate for and establish good working relationships across the tech team and with data consumers, so that we better understand the data being created in the organisation and how it is being used. You will help mould our data practices and standards to enable more effective and efficient use of our data.
What are the responsibilities of this role?
Setting up real-time and batch data/machine-learning pipelines, configuring databases, and serving data to business stakeholders
Mentoring new team members
Hands-on coding and implementation to enable data to flow into our data systems from both internal and external sources
Taking responsibility for the data and systems you both inherit and produce; being passionate about data engineering and making our data solutions better
Researching and staying abreast of key technical developments and trends
Integrating data sources into our data and analytics environment
Delivering quality data to our business teams on time
Delivering quality software and adhering to coding best practices
Maintaining the quality of the data we store within our data and analytics environment
Adhering to target architecture guidelines, or influencing how we architect for data integration, transformation and access
Helping our data engineering team focus on value, so that it remains a valuable asset to the business
What would the ideal candidate be great at?
Technical competence; love coding, able to learn new paradigms quickly and look to continuously improve and find better ways of doing things
Excellent critical judgement; able to make good decisions, be trusted, respected and dependable, be proactive and responsive, ask the right questions, raise flags at the right time, and prioritise and plan your own tasks
Mindfulness; be considerate of the implications of your work, really care about what you are doing and the impact of your contribution
Teamwork and team spirit; we all contribute to the same platform, so you need to be not only a great individual contributor but also motivated by the achievements of the whole team – we only win if the team wins. See the impact of your own work and positively influence and help the work of others
Getting up to the front of the bus; get stuck in, execute, generate ideas, have an impact, don’t just sit back and be a passenger
Qualifications and experience
3+ years’ experience in software engineering
Python and SQL experience
Experience with relational database administration
Experience working with data pipelines and messaging systems (e.g. RabbitMQ, SQS)
Experience working with cloud platforms such as AWS or Azure and related tools (e.g. Redshift, Kinesis, ECS, S3, Lambda)
Experience with software application development in a production environment
Understanding of software design principles and best practices (TDD, continuous deployment, etc.)
Experience working in a data engineering team
Experience with data warehousing and ETL
Experience working with Postgres
Solid foundational knowledge of Data Science and Business Intelligence