Sr. Data Engineer

Posted 1 week ago • 5+ years' experience • Data Engineering

About the job

Summary

Ness Digital Engineering seeks a Senior Data Engineer with 5+ years' experience in ELT processes and data pipeline development using Snowflake. Responsibilities include designing, building, and maintaining robust ELT pipelines; leading offshore development teams; collaborating with stakeholders, project managers, data architects, and business analysts on data integration; optimizing pipeline performance and scalability; and ensuring data quality. The role requires proficiency in SQL and Snowflake, plus experience managing offshore teams. Compliance with data governance and financial services regulations is essential.
Must have:
  • 5+ years data engineering experience
  • Snowflake expertise (ELT pipelines)
  • Proficiency in SQL
  • Offshore team management
  • Data quality assurance
  • Data integration expertise
Good to have:
  • Snowflake certifications
  • Financial services experience
  • Cloud platform (AWS, Azure) experience
  • Data orchestration tools (Airflow)
  • Python/JavaScript scripting
  • Data visualization tools (Tableau, Power BI)
Description

Position at Ness Digital Engineering (India) Private Limited

Sr. Data Engineer
Job Overview:
We are seeking an experienced Technical Data Engineer / ELT Developer with expertise in data integration and pipeline development using Snowflake. The ideal candidate will manage data ingestion and transformation, working with an offshore team to deliver efficient and scalable data solutions that align with business requirements and performance standards.
Key Responsibilities:
  • Data Pipeline Development: Design, build, and maintain robust ELT (Extract, Load, Transform) pipelines using Snowflake to support data ingestion, integration, and transformation.
  • Technical Leadership: Lead offshore development teams in implementing best practices for data engineering and ELT development.
  • Data Integration: Collaborate with stakeholders to understand data sources and integration requirements, ensuring seamless connectivity and data flow between systems.
  • Performance Optimization: Optimize data pipelines for performance, scalability, and reliability, including query tuning and resource management within Snowflake.
  • Data Quality Assurance: Implement and monitor data validation procedures to ensure data accuracy and consistency across systems.
  • Collaboration and Communication: Work closely with project managers, data architects, and business analysts to align project milestones and deliverables with business goals.
  • Documentation: Create and maintain detailed documentation of data pipelines, data flow diagrams, and transformation logic.
  • Issue Resolution: Troubleshoot and resolve issues related to data pipelines, including job failures and performance bottlenecks.
  • Compliance and Security: Ensure all data management processes comply with data governance policies and regulatory requirements in financial services.
Required Qualifications:
  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • 5+ years of experience in data engineering with a strong focus on ELT processes and data pipeline development.
  • Hands-on experience with Snowflake cloud data platform, including data sharing, secure views, and performance optimization.
  • Proficiency in SQL and familiarity with data integration and ETL/ELT tools.
  • Experience managing and collaborating with offshore development teams.
  • Strong problem-solving skills and the ability to work independently to meet deadlines.
  • Excellent communication skills for effectively interacting with technical and non-technical stakeholders.
Preferred Qualifications:
  • Certifications in Snowflake or relevant data technologies.
  • Experience in the financial services sector with an understanding of data security and compliance requirements.
  • Familiarity with cloud platforms (e.g., AWS, Azure) and data orchestration tools (e.g., Apache Airflow).
  • Experience with scripting languages such as Python or JavaScript for data transformation.
  • Knowledge of data visualization tools (e.g., Tableau, Power BI).
