Director, Data Integration and Interoperability

Company: University of Maryland Global Campus

Job Location: Adelphi, MD 20783

Category: IT Manager/Director

Type: Full-Time

This position requires the selected candidate to be on-site 2-3 days a week.

The Director, Data Integration and Interoperability will lead enterprise-wide data ingestion, transformation, and delivery initiatives in a Databricks-centered environment. This role is critical in designing and operating robust data integration pipelines across internal and external sources, and in enabling access via APIs, lake federation, and other mechanisms that ensure data is FAIR (Findable, Accessible, Interoperable, Reusable).

This position will design and oversee ELT and lake federation processes, as well as interoperability with MDM, data governance, DevOps, MLOps, and AIOps environments. In addition, the Director will be responsible for enabling dimensional data modeling to support scalable analytics and business intelligence solutions. The role requires both technical leadership and strong collaboration with engineering, data governance, analytics, and business teams.

Duties and Responsibilities:

Data Ingestion and Integration

  • Design and oversee scalable ingestion pipelines from diverse internal and external data sources using Databricks Lakeflow Declarative Pipelines and Delta Lake.
  • Utilize tools such as Databricks Auto Loader, Python, Kafka, REST APIs, SQL, and cloud-native connectors for real-time and batch data flows (a brief illustrative sketch follows this list).
  • Establish standardized ingestion and orchestration patterns based on data provider contracts, ensuring SLAs, quality, observability, and operational reliability.
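As a brief, hedged illustration of the ingestion pattern described above, the following Python sketch uses Databricks Auto Loader to incrementally load JSON files from cloud storage into a Delta table. All paths and the target table name are hypothetical placeholders, not details taken from the posting.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

    # Auto Loader ("cloudFiles") incrementally discovers new files in the
    # landing directory and tracks schema evolution in schemaLocation.
    raw = (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .option("cloudFiles.schemaLocation", "/tmp/schemas/events")  # hypothetical path
        .load("/tmp/landing/events")                                 # hypothetical path
    )

    # Write to a Delta table; the checkpoint gives exactly-once progress
    # tracking, and availableNow processes the current backlog, then stops.
    (
        raw.writeStream
        .option("checkpointLocation", "/tmp/checkpoints/events")
        .trigger(availableNow=True)
        .toTable("main.bronze.events")  # hypothetical Unity Catalog table
    )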

Data Delivery and Federation

  • Ensure delivery of data to consumer applications and systems while maintaining the SLAs specified in data contracts.
  • Implement lake federation strategies to unify access across cloud platforms and on-premises systems (see the sketch after this list).
  • Lead implementation and operation of the API store, managing delivery of curated and governed data sets via REST APIs, SQL endpoints, and federated access layers with fine-grained access control.
  • Ensure scalable, secure, and performant data access for downstream analytics, reporting, and machine learning.
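Lakehouse federation, referenced above, lets the platform query external systems in place rather than copying data in. The sketch below shows one plausible setup against a Postgres source; the connection name, host, secret scope, and catalog name are all assumptions for illustration.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Register the external database as a Unity Catalog connection.
    # secret() pulls credentials from a (hypothetical) secret scope.
    spark.sql("""
        CREATE CONNECTION IF NOT EXISTS pg_conn TYPE postgresql
        OPTIONS (
          host 'pg.example.internal',
          port '5432',
          user secret('federation', 'pg_user'),
          password secret('federation', 'pg_password')
        )
    """)

    # Expose the database as a foreign catalog; its tables can then be
    # queried alongside native Delta tables with fine-grained access control.
    spark.sql("""
        CREATE FOREIGN CATALOG IF NOT EXISTS sis_pg
        USING CONNECTION pg_conn
        OPTIONS (database 'student_records')
    """)

    df = spark.sql("SELECT * FROM sis_pg.public.enrollments LIMIT 10")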

Interoperability with Enterprise Data Platform Components

  • Implement data interoperability between the data lakehouse and the Master Data Management (MDM) system (Profisee) in support of federated data stewardship and quality processes.
  • Collaborate with data governance and security teams to ensure proper metadata management, lineage tracking, and compliance with data access, archival, and regulatory policies (e.g., FERPA, GDPR, PCA) managed in a Purview environment.
  • Enable interoperability with MLOps and AIOps platforms to streamline model deployment, monitoring, and lifecycle management (a brief MLflow sketch follows this list).
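For the MLOps interoperability mentioned in the last bullet, a common Databricks pattern is registering models through MLflow into Unity Catalog. This is a minimal sketch under that assumption; the model name and toy data are purely illustrative.

    import mlflow
    import mlflow.sklearn
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Toy training data; in practice, features would come from lakehouse tables.
    X = np.random.rand(100, 3)
    y = (X[:, 0] > 0.5).astype(int)

    mlflow.set_registry_uri("databricks-uc")  # register models in Unity Catalog

    with mlflow.start_run():
        model = LogisticRegression().fit(X, y)
        mlflow.sklearn.log_model(
            model,
            "model",
            registered_model_name="main.models.retention_risk",  # hypothetical name
        )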

Dimensional Modeling and Analytics Support

  • Guide data engineering teams in designing and implementing dimensional models to support all enterprise reporting and analytics needs, including AI/BI, BI, reverse ETL, and lake federation (a brief illustrative sketch follows this list).
  • Ensure data models align with business definitions, support high-performance queries, and integrate cleanly with semantic layers, data quality tooling, and data consumer tools and processes.
  • Partner with analytics teams to identify modeling needs and implement scalable, reusable data structures.
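To illustrate the kind of dimensional model referred to above, here is a small Kimball-style star-schema fragment with a date dimension and an enrollment fact table. All table and column names are invented for the sketch.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Conformed date dimension with a surrogate key (e.g., 20240115).
    spark.sql("""
        CREATE TABLE IF NOT EXISTS main.gold.dim_date (
          date_key      INT NOT NULL,
          calendar_date DATE,
          term          STRING,
          fiscal_year   INT
        )
    """)

    # Fact table at the student-course-term grain; credits and tuition are
    # additive measures, and the *_key columns join to dimension tables.
    spark.sql("""
        CREATE TABLE IF NOT EXISTS main.gold.fact_enrollment (
          date_key    INT,
          student_key BIGINT,
          course_key  BIGINT,
          credits     DECIMAL(4,1),
          tuition_usd DECIMAL(10,2)
        )
    """)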

Metadata Management and Governance

  • Lead the implementation and automation of metadata management practices across all data pipelines and assets.
  • Collaborate on development of metadata stores in Unity Catalog to ensure consistent documentation, lineage tracking, and impact analysis (a brief sketch follows this list).
  • Ensure technical, business, and operational metadata are captured and maintained for all datasets and records.
  • Collaborate with data governance teams and stakeholders to enforce metadata standards, naming conventions, and classification policies within the enterprise data platform and across enterprise systems.
  • Support discovery, reuse, and transparency of data assets through well-governed metadata practices.
  • Lead implementation of observability tools and reports.
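As a sketch of the metadata practices described in this list, the snippet below attaches a description and governance tags to a table in Unity Catalog and inspects its lineage via system tables (which must be enabled on the workspace). Table, tag, and owner names are assumptions.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Business metadata: a human-readable description for discovery.
    spark.sql("""
        COMMENT ON TABLE main.gold.fact_enrollment IS
        'One row per student-course-term enrollment; sourced from the SIS feed'
    """)

    # Governance metadata: classification tags usable by access policies.
    spark.sql("""
        ALTER TABLE main.gold.fact_enrollment
        SET TAGS ('data_owner' = 'registrar', 'classification' = 'ferpa-protected')
    """)

    # Unity Catalog records lineage automatically; query it from system tables.
    lineage = spark.sql("""
        SELECT source_table_full_name, target_table_full_name
        FROM system.access.table_lineage
        WHERE target_table_full_name = 'main.gold.fact_enrollment'
        LIMIT 10
    """)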

Leadership and Strategy

  • Build and lead a team of data integration engineers and architects; provide mentoring, technical guidance, and career development.
  • Define roadmaps and execution plans for data interoperability and integration capabilities.
  • Manage project delivery timelines, budgets, and cross-functional dependencies.
  • Engage with business and technical stakeholders across the institution as needed.

Skills:

Technical Skills

  • Expertise in Databricks, Apache Spark, Delta Lake, SQL, Python, and Erwin Data Modeler.
  • Strong understanding of cloud data platforms, preferably Azure and AWS.
  • Experience with dimensional modeling (Kimball); Inmon and Linstedt Data Vault modeling a plus.
  • Proficiency in API development, real-time streaming, and batch data processing.
  • Deep Databricks experience with emphasis on Unity Catalog, Lakeflow, Asset Bundles, Auto Loader, Pipelines, REST APIs, and lake federation.
  • Working knowledge of MDM platforms, data cataloging (e.g., Alation, Collibra), data lineage, and governance tools.
  • Integration experience with MLflow.
  • Integration experience with Neo4j.

Leadership and Communication Skills

  • Strong leadership and team management capabilities.
  • Excellent verbal and written communication skills.
  • Strategic thinking with a bias toward execution.
  • Ability to manage stakeholder expectations and align technical execution with business objectives.

Education & Experience Requirements:

Education:

  • Bachelor's degree in Information Science, Computer Science, Data Engineering, Information Systems, or related technical discipline.

Experience:

  • 10+ years of experience in data engineering, data integration and interoperability, or related roles.
  • 3+ years in leadership positions leading data engineering and integration teams.
  • Demonstrated experience leading implementation and operations in a Databricks environment with high hands-on involvement.
  • Proven track record of developing and implementing dimensional models in data lakehouse environments.
  • Experience integrating data platforms with enterprise MDM, governance, MLOps, and AIOps tools.
  • Experience working in regulated, high-volume, multi-tenant environments.

All submissions should include a cover letter and resume.

Hiring Range: $186,000.00 - $211,000.00
