Job Title: Data Engineer
Location: DHA Phase-8, Lahore

Job Responsibilities:
1. Design, build, and maintain efficient and scalable ETL/ELT data pipelines using tools like Azure Data Factory or Apache Airflow
2. Develop, test, and deploy Python-based data processing solutions
3. Work with Snowflake, Databricks, and other data warehousing technologies to support data analytics and reporting needs
4. Write complex and optimized SQL queries for data transformation and extraction
5. Ensure data quality, accuracy, and compliance with internal and external standards
6. Collaborate with data engineers, analysts, and stakeholders to define and implement data workflows
7. Utilize version control (Git) and CI/CD pipelines for code deployment and process automation
8. (Optional/Preferred) Support backend services using Django and task queues with Celery

Qualifications Needed:
1. Proficient in Python for scripting and data processing.
2. Skilled in writing and optimizing SQL queries.
3. Experience working with the Snowflake cloud data platform.
4. Hands-on experience with Databricks for big data and analytics.
5. Familiar with cloud platforms such as Azure or AWS.
6. Strong understanding of ETL/ELT processes and tools.
7. Experience using Azure Data Factory (ADF) or Apache Airflow for orchestration.
8. Knowledge of data warehousing concepts and architecture.
9. Proficient in using Git for version control.
10. Understanding of CI/CD pipelines for deployment automation.
11. Experience with the Django web framework (preferred).
12. Familiarity with Celery for task scheduling and background jobs (preferred).