Lead the design and delivery of analytics and AI solutions by turning business needs into scalable data pipelines, models, and production-ready technical solutions.
Day-to-Day Responsibilities:
Partner with business stakeholders to gather requirements and translate problems into AI and analytics solutions
Design and build scalable data pipelines, ETL/ELT workflows, and data platform components
Write hands-on code in Python and SQL to develop pipelines, models, and supporting solutions
Build and optimize solutions in Databricks for analytics, reporting, and AI use cases
Create documentation and technical requirements for offshore development teams
Guide offshore execution and ensure solutions are delivered accurately and efficiently
Improve data quality, performance, monitoring, and reliability across pipelines and platforms
Support analytics, business intelligence, reporting, and AI initiatives with scalable data architecture
Requirements:
Must-Haves:
6-10 years of experience in data engineering, analytics engineering, or related roles
Strong hands-on Python and SQL skills
Experience building and maintaining ETL/ELT pipelines and data workflows
Strong background in data engineering, data pipelines, and data warehousing
Databricks experience
Experience designing and delivering data and AI solutions
Ability to translate business needs into technical solutions
Strong communication skills and comfort in a client-facing environment
Bachelor's degree in Computer Science, Engineering, Information Systems, or related field
Nice-to-Haves:
AWS experience
Experience with AI/ML tools or AI solution development
Experience with workflow orchestration tools such as Airflow
Experience with API-based ingestion and modern cloud data platforms
Experience supporting near real-time analytics or reporting environments