Data Engineer (Hybrid)
University of California, Merced

About the Job

Are you an experienced data engineer interested in enabling a data-driven culture? Are you a continuous learner driven to both sharpen and broaden your technical skills? Are you a critical thinker who is solutions-oriented and wants to see your team succeed in complex technical deployments? If so, we invite you to apply for our Data Engineer position and join our Data Services team!

Reporting to the Director of Data Services, this position is responsible for the seamless management of data extraction, loading, and transformation processes, as well as the architectural design and optimization of our operational data store and our cloud-based enterprise data warehouse. The position applies the skills of a seasoned, experienced data engineering professional with a full understanding of industry best practices, campus policies, and data management procedures. Under general supervision by the Director and the AVC of Data, Analytics, and Campus Business Processes, the position designs, builds, maintains, and optimizes data pipelines and integration workflows using modern solutions, serving both analytical platforms and destination applications.

The position performs a variety of tasks related to ETL/ELT, data quality, data governance, and technical documentation. It collaborates with Database Administrators, Analytics Engineers, and BI/Reporting Analysts to ensure timely, accurate, and secure data delivery in support of enterprise reporting, research, and operational needs, and participates in technical design sessions and data governance activities to improve institutional data capabilities.
The successful candidate will have:

  • Advanced skills in ETL/ELT, Snowflake, AWS, Linux environments, and API implementation
  • Strong technical project implementation skills
  • The ability to apply software development lifecycle best practices to the data modeling world
  • Experience designing, building, maintaining, and enhancing dimensional data models
  • A proven track record of finding and deploying efficiency improvements
  • Effective communication skills that span both technical and non-technical audiences
  • A commitment to continuous learning and an interest in staying current with modern approaches and solutions

This position is eligible for a hybrid work arrangement. A fully remote work arrangement may be considered for highly qualified candidates.

Key Responsibilities:

  1. ETL/ELT Development: Design, develop, and maintain ETL/ELT processes to efficiently extract data from various sources, transform it to meet business requirements, and load it into data warehouses or data stores. Collaborate with data analysts, business users, and other stakeholders to understand data integration needs and requirements. Optimize ETL/ELT workflows to improve data quality, accuracy, and efficiency. Monitor and troubleshoot ETL/ELT jobs to ensure data integrity and consistency. Act as a subject matter expert in ETL/ELT and database architecture, offering guidance and mentorship to junior team members.
  2. Database Architecture: Design, implement, and maintain database architectures that align with the organization's data strategy and business goals. Perform capacity planning, scalability assessments, and database performance tuning to ensure optimal database operations. Evaluate and recommend database technologies, tools, and platforms to support evolving data needs. Develop and maintain data models, schema designs, and data dictionaries for various databases. Document database architecture, ETL/ELT processes, and data flow diagrams for knowledge sharing and compliance purposes.
  3. Data Integration and Reverse ETL: Develops and supports data integrations between on-premises and cloud-based data platforms to destination applications. Ensures that data is delivered accurately and timely to downstream platforms supporting institutional operations. Develops and maintains moderately complex data models, schema designs, and data dictionaries for shared data models and integrations. Documents design, mappings, and data flow diagrams for knowledge sharing and compliance purposes.
  4. Cross-functional Collaboration: Partners with DBAs for schema design, indexing, and database performance optimization. Collaborates with Analytics Engineers to prepare analysis-ready datasets. Works with BI Analysts to ensure data availability for dashboards and operational reporting. Coordinates with external partners and functional units to articulate technical requirements.
  5. Data Privacy and Security: Ensures compliance with data privacy regulations (e.g., FERPA, IS-3) by developing and enforcing data protection policies. Collaborates with legal and compliance teams to manage data access permissions, masking, and encryption. Establishes and administers on-premises and cloud-based database security and role-based access control (RBAC), managing load operations, performance, computing and storage management, data replication, and data sharing.
  6. Data Governance and Documentation: Establishes and maintains automated data quality checks, metadata capture, and data catalog contributions. Documents data pipelines, mappings, and workflow logic. Participates in campuswide data governance and stewardship efforts.
  7. Platform Innovation and Continuous Improvement: Explores new technologies, proposes automation opportunities, and contributes to architectural decisions to modernize the university's data ecosystem.
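To make the ETL/ELT responsibility above concrete, here is a purely illustrative sketch of an extract-transform-load step. The table, columns, and sample data are hypothetical and do not reflect any actual UC Merced system; SQLite stands in for the operational data store.

```python
# Illustrative ETL sketch (hypothetical data, not an actual campus pipeline):
# extract rows from raw CSV text, apply a simple transformation, and load
# them into a SQLite table standing in for an operational data store.
import csv
import io
import sqlite3

def extract(csv_text):
    """Extract: parse raw CSV text into a list of dicts."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: normalize department names and cast headcounts to int."""
    return [
        {"dept": r["dept"].strip().upper(), "enrolled": int(r["enrolled"])}
        for r in rows
    ]

def load(rows, conn):
    """Load: upsert the cleaned rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS enrollment "
        "(dept TEXT PRIMARY KEY, enrolled INTEGER)"
    )
    conn.executemany(
        "INSERT INTO enrollment (dept, enrolled) VALUES (:dept, :enrolled) "
        "ON CONFLICT(dept) DO UPDATE SET enrolled = excluded.enrolled",
        rows,
    )
    conn.commit()

raw = "dept,enrolled\n cs ,120\nmath,95\n"
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
print(conn.execute("SELECT dept, enrolled FROM enrollment ORDER BY dept").fetchall())
# [('CS', 120), ('MATH', 95)]
```

The upsert in the load step mirrors the "monitor and troubleshoot for data integrity" concern: re-running the job is idempotent rather than producing duplicate rows.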

Qualifications

  • Bachelor's degree in Computer Science, Computer Engineering, Information Technology, Information Systems, or a related field from an accredited institution; and/or
  • Advanced degree in a related field from an accredited institution (preferred); and
  • Five years of progressively increasing responsibility and technical ability (required).
  • Solid working knowledge of advanced analytics and business intelligence concepts such as data analysis, ETL, data warehousing, data lake, enterprise big data platforms, reporting, visualization and dashboards.
  • Proven experience as an ETL/ELT Developer and Database Architect with a track record of successful project implementations. Strong SQL skills and experience working with structured and semi-structured data formats (e.g., JSON, Parquet, CSV).
  • Proficiency in modern data ingestion tools such as Informatica, Fivetran, or similar tools. Proficiency with Python or similar languages to develop custom ingestion pipelines.
  • Proficiency in modern ETL/ELT tools such as Informatica, DataStage, Databricks, dbt, or similar tools.
  • Proficiency in modern data integration tools and/or Reverse ETL tools such as Mulesoft, Census, Hightouch, or similar tools. Proficiency in API implementation and management.
  • Familiarity with data modeling concepts including dimensional modeling, star/snowflake schemas, and normalization.
  • Familiarity with data quality best practices, documentation standards, and data governance frameworks.
  • Demonstrated ability to work with others from diverse backgrounds. Demonstrated effective communication and interpersonal skills. Demonstrated service orientation skills.
  • Strong communication, documentation and interpersonal skills including demonstrated ability to communicate technical information to technical and non-technical personnel at various levels in the organization.
  • Excellent problem-solving skills and the ability to troubleshoot complex data issues.
  • Strong analytical and design skills, including the ability to abstract information requirements from real-world processes to understand information flows in computer systems.
  • Knowledge of data warehousing concepts and best practices.
  • Thorough knowledge of data warehouse design, including managing schema objects (tables, indexes, and materialized views), creating reports based on the data in the data warehouse, monitoring the data warehouse's performance and taking preventive or corrective action as required.
  • Demonstrated ability to build and manage data pipelines across on-prem and cloud environments, integrating data from multiple source systems.
  • Partnering with all levels of an organization, including senior leadership, management, technical IT staff, business leaders, data stewards, and end users.
  • At least five years' experience in data modeling, ELT/ETL, and cloud data platforms.
  • Demonstrated Linux skills including security management, user management, process management, troubleshooting and debugging. Proficient in developing and managing automation, data processing, scripting, and navigating the file system.
  • Working knowledge of Oracle Data Integrator (ODI), including agent management, repository upgrades, job design, and error resolution.
  • Relevant certifications (e.g., AWS Certified Data Analytics, Microsoft Certified: Azure Data Engineer, Snowflake SnowPro Core Certification) are advantageous.
  • Previous experience with higher education and ERP software systems such as Banner, PeopleSoft, or similar ERP systems.
  • Proficient with PL/SQL including writing efficient queries, designing database structures, and implementing stored procedures, functions, and triggers. Proficient in optimizing performance and error handling.
  • Experience with DevOps or CI/CD tools for data pipeline development and deployment.
  • Experience developing custom AWS Lambda functions for ingesting data.
  • Working Condition: Exercise the utmost discretion in managing sensitive information learned in the course of performing duties. Sensitive information includes, but is not limited to, employee and student records, health and patient records, financial data, strategic plans, proprietary information, and any other sensitive or non-public information learned during the course and scope of employment. Share sensitive information only on a limited basis and actively take steps to restrict access to individuals who have a legitimate business need to know. Ensure that sensitive information is properly safeguarded, and follow all organizational policies and laws on data protection and privacy, including secure handling of physical and digital records and proper usage of IT systems to prevent data leaks. The unauthorized or improper disclosure of confidential work-related information obtained from any source on any work-related matter is a violation of these expectations.
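The dimensional-modeling qualification above (star/snowflake schemas) can be sketched in a few lines. This is a hypothetical star schema, one fact table joined to two dimensions; the table and column names are illustrative inventions, not any real campus data model, with SQLite again standing in for the warehouse.

```python
# Hypothetical star-schema sketch: a fact table keyed to two dimension tables.
# All names are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_term (term_key INTEGER PRIMARY KEY, term_name TEXT);
CREATE TABLE dim_course (course_key INTEGER PRIMARY KEY, subject TEXT, number TEXT);
CREATE TABLE fact_enrollment (
    term_key   INTEGER REFERENCES dim_term(term_key),
    course_key INTEGER REFERENCES dim_course(course_key),
    headcount  INTEGER
);
INSERT INTO dim_term VALUES (1, 'Fall 2025');
INSERT INTO dim_course VALUES (10, 'CSE', '100');
INSERT INTO fact_enrollment VALUES (1, 10, 42);
""")

# A typical star join: slice the fact table by dimension attributes.
row = conn.execute("""
    SELECT t.term_name, c.subject || ' ' || c.number, f.headcount
    FROM fact_enrollment f
    JOIN dim_term t ON t.term_key = f.term_key
    JOIN dim_course c ON c.course_key = f.course_key
""").fetchone()
print(row)  # ('Fall 2025', 'CSE 100', 42)
```

The design choice the qualification alludes to: descriptive attributes live once in the dimension tables, while the fact table stays narrow (keys plus measures), which keeps warehouse queries simple star joins like the one above.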

Salary

The salary range the University reasonably expects to pay for this position is $87,000 - $115,000.

How to Apply

An online application is required for each position to apply. To apply, visit https://careerspub.universityofcalifornia.edu/psc/ucm/EMPLOYEE/HRMS/c/HRS_HRAM_FL.HRS_CG_SEARCH_FL.GBL?Page=HRS_APP_JBPST_FL&JobOpeningId=79591&PostingSeq=1&SiteId=25&languageCd=ENG&FOCUS=Applicant

For applicants with disabilities who need additional assistance using TAM, or reasonable accommodations during the interview or search process, please contact ucmjobs@ucmerced.edu.
