Job Profile: Big Data ETL Engineer
Work Location: Pune
Degree: Bachelor's Degree or higher in Information Systems, Computer Science, or a related field (or equivalent experience)
Job Role: Sr. Software Engineer
Experience: 2-5 Yrs
Apply knowledge of application development procedures and concepts, along with basic knowledge of other technical areas, to identify and define necessary system enhancements, including using scripting tools and analyzing/interpreting code.
Consult with users, clients, and other technology groups on issues; recommend programming solutions; and install and support customer exposure systems.
Apply fundamental knowledge of programming languages to design specifications.
Well versed in Spark, ETL, and the data warehousing ecosystem
Experience developing data pipelines that ingest source data and transform and load it into data warehouses/lakes, using Spark (preferably PySpark) and Python scripts
Experience with an ETL tool (Informatica, Talend, etc.) to create packages for new data integrations and to maintain/enhance existing packages
Worked with Cloud Data Warehouses such as AWS Redshift, Snowflake, Google BigQuery
Worked with file formats such as Parquet, ORC
Worked with SQL query engines such as Presto, Impala
Python and shell scripting knowledge to create scripts that maintain and monitor data solutions
Knowledge of schedulers such as Oozie, Azkaban, Airflow
Knowledge of AWS data services such as EMR, S3, Athena, Glue, etc.
API development in Python (Django/Flask)
Python data packages such as pandas, NumPy
Experience with BI platforms such as Tableau, Periscope, Looker, etc., to create dashboards and reports