
Senior Data Engineer

Ciklum
Full-time
Remote
United States
Description

Ciklum is looking for a Senior Data Engineer to join our team full-time in the US.

We are a custom product engineering company that supports both multinational organizations and scaling startups to solve their most complex business challenges. With a global team of over 4,000 highly skilled developers, consultants, analysts and product owners, we engineer technology that redefines industries and shapes the way people live.

About the role:

As a Senior Data Engineer, you will join a cross-functional development team engineering the experiences of tomorrow.

We are seeking a highly motivated, hands-on Full Stack Data Engineer with strong experience in Microsoft Fabric and modern Azure-based data platforms. The ideal candidate can work across the full stack of data engineering: from ingestion and transformation to Gold-layer curation, analytics enablement, API integration, and AI-assisted application workflows.

This role requires engineers who can work independently, leverage AI-assisted development for rapid delivery, and collaborate across data, cloud, and lightweight application layers. Exposure to MCP (Model Context Protocol), ReactJS integration, and modern AI-enabled engineering practices is highly preferred.

Responsibilities:

  • Fabric Ecosystem & Full Stack Data Engineering
    • Manage OneLake structures and shortcuts for scalable enterprise data access
    • Design scalable Lakehouse solutions using Medallion Architecture (Bronze, Silver, Gold)
    • Build and optimize Delta Lake tables for analytics, reporting, and AI workloads
    • Develop data pipelines using Fabric Data Factory, Spark, and Notebooks
    • Create ingestion and transformation workflows for structured and semi-structured data
    • Implement orchestration, scheduling, monitoring, and recovery for enterprise pipelines
    • Design dimensional models (Star/Snowflake schemas) for BI and semantic layers
    • Build curated Gold-layer datasets for analytics and AI consumption
    • Support integration with Power BI semantic models and reporting platforms
  • Azure, Integration & Full Stack Development
    • Develop batch and incremental pipelines across Azure and external systems
    • Orchestrate ETL/ELT workflows using Fabric Pipelines and Azure Data Factory
    • Integrate Fabric platforms with APIs, AI services, and enterprise applications
    • Support MCP integration, AI workflows, and rapid prototyping initiatives
    • Collaborate on ReactJS-based apps, dashboards, and AI-driven user experiences
    • Automate workflows using Azure Functions, Logic Apps, Git, CI/CD, and Azure DevOps
  • Data Processing, Optimization & AI-Assisted Development
    • Develop and optimize PySpark notebooks for transformation, cleansing, and enrichment
    • Build efficient SQL queries, views, and stored procedures in Fabric Warehouse / Azure SQL
    • Implement optimization techniques including partitioning, caching, and query tuning
    • Monitor pipeline performance, troubleshoot failures, and improve system reliability
    • Implement logging, alerting, and operational best practices
    • Utilize AI-assisted development tools such as GitHub Copilot and modern AI coding assistants
    • Rapidly prototype and deliver scalable engineering solutions with minimal guidance
  • Governance, Security & Collaboration
    • Implement RBAC and secure data access across Fabric workspaces and Azure environments
    • Apply data quality validations and governance best practices
    • Support metadata management and lineage using Microsoft Purview
    • Collaborate with Data Architects, Analysts, BI Developers, Product Teams, and Business Stakeholders
    • Translate business requirements into scalable data and application solutions
    • Participate in Agile delivery processes, code reviews, and pull request workflows
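To give a flavor of the Medallion (Bronze/Silver/Gold) responsibilities above, here is a deliberately stack-agnostic sketch. In Fabric this logic would typically live in PySpark notebooks writing Delta Lake tables; plain Python lists stand in for tables here, and the field names (`order_id`, `region`, `amount`) are illustrative assumptions, not part of the role description.

```python
# Illustrative Medallion flow: raw Bronze records are cleansed and
# deduplicated into Silver, then aggregated into a curated Gold dataset.
# In Microsoft Fabric this would be PySpark over Delta tables.

def to_silver(bronze_rows):
    """Cleanse and deduplicate raw (Bronze) records into Silver."""
    seen = set()
    silver = []
    for row in bronze_rows:
        # Drop malformed records and duplicates on the business key.
        if row.get("order_id") is None or row.get("amount") is None:
            continue
        if row["order_id"] in seen:
            continue
        seen.add(row["order_id"])
        silver.append({
            "order_id": row["order_id"],
            "region": row.get("region", "unknown").strip().lower(),
            "amount": float(row["amount"]),
        })
    return silver

def to_gold(silver_rows):
    """Aggregate Silver into a curated Gold dataset (revenue per region)."""
    gold = {}
    for row in silver_rows:
        gold[row["region"]] = gold.get(row["region"], 0.0) + row["amount"]
    return gold

bronze = [
    {"order_id": 1, "region": " EU ", "amount": "10.5"},
    {"order_id": 1, "region": "EU", "amount": "10.5"},    # duplicate key
    {"order_id": 2, "region": "US", "amount": "7.0"},
    {"order_id": None, "region": "US", "amount": "3.0"},  # malformed record
]
print(to_gold(to_silver(bronze)))  # {'eu': 10.5, 'us': 7.0}
```

The same shape scales to the real stack: Bronze lands as-is, Silver applies conformance and deduplication, and Gold serves BI semantic models and AI workloads.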

Requirements:

  • 6 years of experience in Data Engineering
  • Hands-on exposure to Microsoft Fabric (preferred) or strong Azure Data Engineering background with willingness to learn Fabric
  • Microsoft Fabric (Data Factory, OneLake, Synapse Data Engineering – basics)
  • Azure Data Services: ADLS Gen2, Azure SQL, Blob Storage
  • Strong SQL (joins, aggregations, performance tuning basics)
  • Python (PySpark) for data transformation
  • Azure Data Factory / Fabric Pipelines
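As a rough sense of the SQL fluency listed above (joins, aggregations), here is a minimal generic example using Python's built-in sqlite3 module. Fabric Warehouse and Azure SQL use T-SQL rather than SQLite's dialect, and the tables and columns here are invented for illustration, but the join-and-aggregate pattern is the same.

```python
import sqlite3

# Generic join + aggregation example; the dialect in Fabric Warehouse /
# Azure SQL is T-SQL, but the relational pattern is identical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY,
                         customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'EU'), (2, 'US');
    INSERT INTO orders VALUES (10, 1, 10.5), (11, 1, 4.5), (12, 2, 7.0);
""")
rows = conn.execute("""
    SELECT c.region, COUNT(*) AS n_orders, SUM(o.amount) AS revenue
    FROM orders o
    JOIN customers c ON c.id = o.customer_id
    GROUP BY c.region
    ORDER BY c.region
""").fetchall()
print(rows)  # [('EU', 2, 15.0), ('US', 1, 7.0)]
conn.close()
```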

Desirable:

  • Azure Functions / Logic Apps
  • Event Hubs / streaming concepts
  • Microsoft Purview (basic exposure)
  • Cosmos DB

What's in it for you?

  • Strong community: Work alongside top professionals in a friendly, open-door environment
  • Growth focus: Take on large-scale projects with a global impact and expand your expertise
  • Tailored learning: Boost your skills with internal events (meetups, conferences, workshops), Udemy access, language courses, and company-paid certifications
  • Endless opportunities: Explore diverse domains through internal mobility, finding the best fit to gain hands-on experience with cutting-edge technologies
  • Care: Healthcare, Basic Life Insurance, Short and Long-term disability insurance according to the Company’s Benefit Plans

About us:

At Ciklum, we are always exploring innovations, empowering each other to achieve more, and engineering solutions that matter. With us, you’ll work with cutting-edge technologies, contribute to impactful projects, and be part of a One Team culture that values collaboration and progress. In the US, Ciklum is growing fast—inviting experienced professionals to lead digital transformation alongside Fortune 500 clients. Be part of a company where innovation and impact go hand in hand.

Want to learn more about us? Follow us on Instagram, Facebook, LinkedIn.

Explore, empower, engineer with Ciklum!

Interested already? We would love to get to know you! Submit your application. We can’t wait to see you at Ciklum.

#LI-AV3