Data Engineer
Chennai, TN, IN
About IDP
IDP is the global leader in international education services, delivering global success to students, test takers and our partners, through trusted human relationships, digital technology and customer research. An Australian-listed company, we operate in more than 50 countries around the world.
Our team comprises over 7,000 people of various nationalities, ages and cultural backgrounds. Proudly customer-first, our expert people are powered by global technology. Together, we offer unmatched services, helping local dreams become realities, all over the world.
Learn more at www.careers.idp.com
Role purpose
While a significant part of your role will be writing code in Python and working with AWS services, the real value you'll produce will come from truly understanding the outcome the business requires and "making it so". We're looking for everyone in our team to demonstrate leadership: working together to continuously improve the way we work, working smarter rather than harder, and continuously increasing the value we provide to the business. You'll be part of this.
- You love technology and are continuously learning, extending your knowledge of best practice and of the business value of technology innovations.
Required experience
- Bachelor's or Master's degree in Software Engineering, Computer Science or another relevant discipline
- Strong knowledge of SQL and Python 3.x
- Can analyse new source data, model entities, and write complex SQL
- Minimum of 7 years of overall development experience, including at least 4 years of SQL coding and 2 years of Python
- Experience with AWS Lambda and microservices development is desirable (a brief illustrative sketch follows this list).
- Advanced skills in SQL, Python, AWS CDK, Terraform, Lambda and job-scheduling software.
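To give a flavour of the kind of work involved, here is a minimal, illustrative sketch of a Python AWS Lambda handler of the sort this role might write. The event shape assumes an S3 put-notification trigger, and the bucket, key and response fields are hypothetical placeholders rather than a description of IDP's actual systems.

```python
# Illustrative only: a minimal AWS Lambda handler reacting to a file landing in S3.
# Bucket and key names come from the (assumed) S3 event; nothing here is IDP-specific.
import json

import boto3

s3 = boto3.client("s3")  # created once so warm invocations reuse the client


def handler(event, context):
    """Read a small CSV that landed in S3 and return a basic row count."""
    record = event["Records"][0]["s3"]          # assumes an S3 put-notification event
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    rows = [line for line in body.splitlines() if line.strip()]

    # A real pipeline would validate, transform and load the data onward;
    # this sketch just returns a summary so it stays self-contained.
    return {
        "statusCode": 200,
        "body": json.dumps({
            "source": f"s3://{bucket}/{key}",
            "data_rows": max(len(rows) - 1, 0),  # minus one assumed header row
        }),
    }
```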
Key accountabilities
- Uses reporting tools to produce information from established data sets. Tests changes to reports and data flows
- Performs and documents data processing tasks.
- Builds out new analyses and handles other data-related tasks
- Acquires new data and builds pipelines.
- Combines multiple sources of data, allows for data quality, establishes scheduling and integration, automates environment builds, and runs deployments (a brief illustrative sketch follows this list)
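As an illustration of the "combine multiple sources and allow for data quality" accountability above, below is a small, self-contained Python sketch using the standard library's sqlite3 module. All table names, columns and data are invented for the example; a production pipeline would read from real sources and load into a warehouse on a schedule.

```python
# Illustrative only: joining two hypothetical sources with a simple
# data-quality gate before the combined result is passed downstream.
import sqlite3

# Two toy "sources": one from an application database, one from a CSV export.
students = [(1, "Asha"), (2, "Ravi"), (3, "Mei")]
test_scores = [(1, 78), (2, None), (3, 91)]  # None simulates a quality issue

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE students (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("CREATE TABLE scores (student_id INTEGER, score INTEGER)")
conn.executemany("INSERT INTO students VALUES (?, ?)", students)
conn.executemany("INSERT INTO scores VALUES (?, ?)", test_scores)

# Data-quality check: flag missing scores and exclude them from the load.
missing = conn.execute(
    "SELECT COUNT(*) FROM scores WHERE score IS NULL"
).fetchone()[0]
if missing:
    print(f"Data-quality warning: {missing} score(s) missing; excluded from load")

# Combine the sources; in production this query would feed a scheduled load step.
rows = conn.execute(
    "SELECT s.name, sc.score FROM students s "
    "JOIN scores sc ON sc.student_id = s.id WHERE sc.score IS NOT NULL"
).fetchall()
print(rows)  # [('Asha', 78), ('Mei', 91)]
```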