
Senior Data & AI Engineer

remoterocketship
1 hour ago
Full-time
On-site
Oregon, United States
Job Description:

- Design and implement medallion architecture (bronze/silver/gold) using Delta Lake as the foundational layer across client environments.
- Build and optimize scalable data pipelines using Apache Spark and Lakeflow (Declarative Pipelines, Jobs, and Connect for ingestion).
- Architect and implement Unity Catalog as the backbone that governs data lineage, fine-grained access control (including row/column-level security), Lakehouse Federation, and Delta Sharing.
- Drive best practices for performance tuning, cost governance, and compute optimization on Databricks.
- Implement and optimize Databricks SQL for enterprise analytics, including AI/BI Dashboards and BI tool integration (Tableau, Power BI, Looker).
- Design data models that serve both operational reporting and strategic analytics use cases.
- Partner with business stakeholders to translate complex data requirements into reliable, performant solutions.
- Build production AI solutions on the Databricks platform using tools including RAG pipelines with Mosaic Vector Search, agent development with the Mosaic AI Agent Framework, Model Serving endpoints, and external model APIs (OpenAI, Anthropic, Gemini, and others).
- Leverage Mosaic AI, AI/BI Genie, and Databricks Apps to develop intelligent, data-driven applications that deliver real client value.
- Stay current on Databricks AI capabilities and proactively bring emerging patterns to the practice.
- Serve as a foundational technical resource for V2's Databricks practice, helping shape standards, methodologies, and delivery frameworks.
- Operate across client engagements as a hands-on practitioner and technical advisor, translating business problems into platform solutions.
- Support pre-sales and solutioning efforts: scoping engagements, contributing to proposals, and demonstrating credibility in client conversations.
- Supercharge the broader technical team by leading code reviews, conducting formal and informal knowledge transfer, and elevating the people around you.

Requirements:

- Platform mastery: Databricks depth that is real and demonstrable, with proficiency across Delta Lake, Unity Catalog, DLT, Databricks SQL, and Spark. Databricks certifications (especially Data Engineer Professional and ML Engineer Professional) are strongly preferred.
- End-to-end engineering: You've built pipelines, governed data platforms, and designed Lakehouse architectures in production environments.
- AI integration experience: Hands-on experience integrating LLMs, building RAG pipelines, and deploying AI-powered solutions using Databricks.
- Consulting or professional services instincts: You understand how consulting engagements work and have experience with scoping, client communication, delivery under pressure, and building trust with senior stakeholders.
- Broad data platform fluency: Experience with cloud platforms (Azure, AWS, or GCP) and familiarity with adjacent data technologies, such as Snowflake, Salesforce Data Cloud, or dbt, is a differentiator.
- Cultural alignment: You operate with speed and agility and enjoy fast-paced, entrepreneurial environments.

Benefits:

- Health insurance
- Retirement plans
- Paid time off
- Flexible work arrangements
- Professional development