SAS

ETL Developer

Job Locations: IN-Pune
Requisition ID: 20069365
Job Category: Information Technology
Travel Requirements: None

Job Description:

We are seeking a skilled Data Engineer to design, build, and maintain scalable data pipelines and data warehouse solutions. The ideal candidate will work closely with data analysts, data scientists, and business stakeholders to deliver reliable, high-quality, and accessible data for analytics.

Key Responsibilities:

Design, develop, and maintain ETL/ELT pipelines.

Build and optimize data models for data warehouses and data marts.

Develop scalable batch and real-time data processing systems.

Integrate data from multiple sources (APIs, databases, files, streaming systems).

Ensure adherence to data quality, governance, and security standards.

Optimize SQL queries and database performance.

Collaborate with BI and analytics teams on reporting requirements.

Implement monitoring and logging for data workflows.

 

Required Technical Skills:

Experience with ETL tools

Strong SQL skills and database knowledge (Snowflake, SQL Server, Oracle, etc.)

Programming skills in Python / Scala

Experience with cloud platforms: AWS / Azure / GCP

Knowledge of data warehousing concepts (Star Schema, Snowflake Schema, Data Vault)

Familiarity with orchestration tools (Flow Manager, Control-M, etc.)

 

Qualifications:

Bachelor’s degree in Computer Science, IT, Engineering, or a related field

Strong problem-solving and analytical skills

 

Nice to Have:

Certification in Python

Certification in AWS/Azure/GCP

Knowledge of data governance frameworks

 

 
