Wednesday, September 17, 2025

Jeddah - Data Architect, Cloudera Architect, Informatica + Cloudera (Developer), Data Analyst – Business & Data Mapping (Teradata FSLDM), Teradata Developer

Client: Leading software company
Permanent positions
Job Location: Jeddah, Saudi Arabia
Salary: 40k SAR and above (based on experience) for Architect roles; 25k SAR and above (based on experience) for Developer roles.
Follow us for more Indian & international job posts: https://whatsapp.com/channel/0029Vb6V9i18aKvQRFVE4i24

Mandatory Requirement: Valid Iqama

Only locally available candidates who are immediately available within Saudi Arabia and hold a valid, transferable Iqama for immediate onboarding will be considered.

Position                                                 | Positions | Location | Joining Timeline
Data Architect                                           | 2         | Jeddah   | Immediate
Cloudera Architect                                       | 2         | Jeddah   | Immediate
Informatica + Cloudera (Developer)                       | 3         | Jeddah   | Immediate
Data Analyst – Business & Data Mapping (Teradata FSLDM)  | 2         | Jeddah   | Immediate
Teradata Developer                                       | 3         | Jeddah   | Immediate

Teradata Developer – 3 Positions

Responsibilities:

· Demonstrated strong expertise in Teradata ETL architecture and advanced SQL development, with the ability to handle large-scale data processing and transformation tasks efficiently.
· Independently analyze, design, develop, test, and execute assigned modules, taking full ownership and ensuring timely delivery with a proactive attitude.
· Design and maintain Teradata-based data warehousing solutions, including the development of complex SQL queries, performance tuning, and schema design for large datasets.
· Actively engage in performance optimization and query tuning to ensure efficient data retrieval and processing in Teradata environments (a minimal query-plan sketch follows this list).
· Work collaboratively with developers, business analysts, QA teams, and stakeholders to deliver reliable and scalable data integration solutions.
· Participate in various levels of testing, including unit testing, system integration testing (SIT), and regression testing, ensuring solution quality and stability.
· Hands-on experience working with the Teradata Financial Services Logical Data Model (FSLDM) and creating source-to-target mapping documents to align with enterprise data standards.
· Continuously enhance technical skills by staying updated with the latest Teradata platform capabilities and related data warehousing technologies.
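
For context on the query-tuning point above, the sketch below is a minimal, illustrative example (not part of the role's deliverables) of pulling the optimizer's EXPLAIN plan with the open-source teradatasql Python driver; the host, credentials, and staging table are hypothetical placeholders.

```python
# Illustrative only: inspect a Teradata query plan with the teradatasql driver.
# Host, credentials, and the edw_stage.sales_txn table are hypothetical placeholders.
import teradatasql

QUERY = """
SELECT txn_date, SUM(txn_amount) AS total_amount
FROM edw_stage.sales_txn
WHERE txn_date BETWEEN DATE '2025-01-01' AND DATE '2025-03-31'
GROUP BY txn_date
"""

con = teradatasql.connect(host="td-host.example.com", user="etl_user", password="***")
try:
    cur = con.cursor()
    # EXPLAIN returns the optimizer plan as rows of text; scanning it for
    # full-table scans or product joins is a common first tuning step.
    cur.execute("EXPLAIN " + QUERY)
    for row in cur.fetchall():
        print(row[0])
    cur.close()
finally:
    con.close()
```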

Data Analyst – Business & Data Mapping (Teradata FSLDM) – 2 Positions

Role Summary

This role combines data analysis, business requirement gathering, and data mapping expertise for projects built on the Teradata Financial Services Logical Data Model (FSLDM). You will work with business stakeholders, data architects, and ETL teams to translate financial services requirements into accurate source-to-target mappings and analytics-ready datasets.

Key Responsibilities

Business Analysis & Stakeholder Engagement

· Collaborate with business teams to gather, analyse, and document requirements for reporting, analytics, and regulatory needs.
· Conduct workshops to understand banking/financial services processes and map them to FSLDM subject areas.
· Create functional specifications, process flows, and business glossaries.

Data Analysis & Profiling

· Perform data profiling to assess source data quality, completeness, and integrity (a small profiling sketch follows this list).
· Identify data gaps and recommend remediation strategies.
· Analyse existing FSLDM structures to determine their fit for business requirements.
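
As a rough illustration of the profiling activity above, the following pandas sketch computes null rates, distinct counts, and duplicate business keys for a source extract; the file name and columns are hypothetical.

```python
# Illustrative data-profiling sketch: null rate, distinct count, and inferred
# type per column for a source extract. File and column names are hypothetical.
import pandas as pd

df = pd.read_csv("los_customer_extract.csv")   # hypothetical LOS source extract

profile = pd.DataFrame({
    "null_pct": (df.isna().mean() * 100).round(2),   # completeness
    "distinct": df.nunique(),                        # cardinality
    "dtype": df.dtypes.astype(str),                  # inferred type
})
print(profile.sort_values("null_pct", ascending=False))

# Simple integrity check: flag duplicate business keys before mapping to FSLDM.
dupes = df[df.duplicated(subset=["customer_id"], keep=False)]
print(f"{len(dupes)} rows share a duplicate customer_id")
```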

Data Mapping & Transformation Rules

· Develop Source-to-Target Mapping (STTM) documents aligning source systems (e.g., BANCs, LOS, CLOS, MUREX) with FSLDM entities and attributes (an illustrative STTM record appears after this list).
· Define data transformation logic, lookup references, and business rules for ETL developers.
· Ensure mappings comply with data governance, regulatory, and naming standards.
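
For illustration only, a single STTM entry could be captured as a structured record like the sketch below before being formalised in the project's mapping template; the source fields, FSLDM attributes, and rules shown are hypothetical examples, not actual mappings.

```python
# Hypothetical illustration of how source-to-target mapping (STTM) rows might
# be captured before being formalised in the project's mapping template.
from dataclasses import dataclass

@dataclass
class SttmRow:
    source_system: str
    source_table: str
    source_column: str
    target_entity: str         # FSLDM entity (illustrative)
    target_attribute: str
    transformation_rule: str

rows = [
    SttmRow("LOS", "loan_application", "appl_status_cd",
            "AGREEMENT", "Agreement_Status_Type_Cd",
            "Lookup against reference table REF_APPL_STATUS; default 'UNK' when null"),
    SttmRow("MUREX", "trade_header", "trade_dt",
            "AGREEMENT", "Agreement_Start_Dt",
            "Cast VARCHAR 'YYYYMMDD' to DATE"),
]

for r in rows:
    print(f"{r.source_system}.{r.source_table}.{r.source_column} -> "
          f"{r.target_entity}.{r.target_attribute} :: {r.transformation_rule}")
```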

FSLDM Model Understanding

· Leverage FSLDM’s logical/physical model to align business needs with enterprise data architecture.
· Identify required extensions or customizations to FSLDM.
· Support integration with analytical layers such as FSAS.

Testing & Validation

· Support SIT/UAT by validating data outputs against business rules and source data.
· Coordinate defect triage and resolution with development teams.

Governance & Documentation

· Maintain mapping inventories, data dictionaries, and lineage documentation.
· Follow governance standards for metadata, change control, and versioning.

Category            | Details
Domain Expertise    | 5–10 years in data/business analysis within banking or financial services; exposure to regulatory reporting (Basel, BCBS 239).
FSLDM Knowledge     | Understanding of Teradata FSLDM or other financial services data models.
Data Mapping Skills | Strong in creating STTMs, transformation rules, and mapping documentation.
Data Analysis       | Proficient in SQL for data extraction, profiling, and validation (Teradata, Oracle, or similar).
Business Analysis   | Experience in requirement gathering, stakeholder management, and functional documentation.
ETL Collaboration   | Ability to work with developers using Informatica, DataStage, or Talend.
Tools               | ERwin (or similar modeling tools), Excel, PowerPoint, Confluence, Jira.
Soft Skills         | Strong communication, presentation, and analytical thinking skills.

Preferred Qualifications

· Teradata Vantage experience.
· Experience with metadata tools (IBM Information Governance Catalog, Informatica Enterprise Data Catalog (EDC), or erwin Data Intelligence (DI)).
· Exposure to data governance and quality frameworks.
· Knowledge of multiple source systems in banking (payments, trade finance, credit, GL).

Job Description – Informatica Developer with Cloudera – 3 Positions

Role: Informatica Developer with Cloudera Ecosystem

Experience: 7–12 years

Employment Type: Full-time

Role Overview

We are looking for an experienced Informatica Developer with Cloudera expertise to design, develop, and optimize enterprise-level ETL processes and big data solutions. The ideal candidate will have strong hands-on experience with Informatica PowerCenter for data integration along with Cloudera Data Platform (CDP) for large-scale data lakehouse implementations. The role requires expertise in building secure, scalable, and high-performance ETL workflows and infrastructure within regulated industries such as banking or financial services.

Key Responsibilities

· Design and implement ETL pipelines using Informatica PowerCenter to support enterprise data warehousing and analytics initiatives.
· Develop robust ETL mappings and workflows to extract, transform, and load data from diverse structured and unstructured sources into Cloudera-based data lakehouse environments.
· Architect and configure Cloudera platform infrastructure to support ingestion, processing, and storage of large data volumes (structured & unstructured).
· Lead the setup and optimization of HDFS, Hive, Spark, Impala, Ranger, Atlas, and Iceberg-based tables within Cloudera (see the PySpark/Iceberg sketch after this list).
· Troubleshoot, monitor, and optimize ETL pipelines and Cloudera clusters for performance, scalability, and reliability, ensuring adherence to SLAs and data quality standards.
· Collaborate with data architects, business analysts, and cross-functional stakeholders to align solutions with business needs.
· Work with security teams to implement Kerberos, Ranger policies, encryption, and secure access protocols.
· Ensure high availability, scalability, and disaster recovery setup for DLH environments.
· Apply infrastructure-as-code practices using Ansible, Terraform, or similar tools.
· Provide production support, perform root cause analysis, and coordinate with Cloudera vendor teams when required.
· Document architecture, ETL processes, and best practices for knowledge sharing and compliance.
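
As a rough illustration of the Iceberg-related responsibility above, the PySpark sketch below lands a raw extract into an Iceberg table through a Hive-compatible catalog; the catalog name, HDFS path, and table names are assumptions, and the exact configuration depends on the CDP and Iceberg versions in use.

```python
# Illustrative PySpark sketch: load a raw HDFS extract into an Iceberg table.
# Catalog name, HDFS path, and table names are hypothetical placeholders;
# actual catalog configuration depends on the Cloudera/Iceberg versions in use.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("ingest_payments_to_iceberg")
    # Assumed Iceberg catalog settings; values vary per CDP deployment.
    .config("spark.sql.catalog.dlh", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.dlh.type", "hive")
    .getOrCreate()
)

raw = (
    spark.read.option("header", True)
    .csv("hdfs:///data/landing/payments/2025-09-17/")   # hypothetical landing path
    .withColumn("load_ts", F.current_timestamp())
)

# DataFrameWriterV2: create or replace the Iceberg table in the bronze layer.
raw.writeTo("dlh.bronze.payments_raw").using("iceberg").createOrReplace()
```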

Required Skills & Experience

· Strong hands-on experience in Informatica PowerCenter (ETL design, development, performance tuning).
· 5+ years of experience with the Cloudera Data Platform (CDH/CDP) ecosystem, including HDFS, Hive, Spark, Impala, Ranger, Atlas, Kafka, etc.
· Expertise in designing and supporting large-scale, high-throughput data platforms (preferably in banking/financial services).
· Strong SQL/PL-SQL skills for data analysis, debugging, and validation.
· Experience in platform architecture, cluster configuration, capacity planning, and performance optimization.
· Knowledge of Linux system administration, shell scripting, and configuration management tools.
· Familiarity with cloud-native Cloudera deployments (AWS/Azure/GCP) is an added advantage.
· Excellent documentation, collaboration, and communication skills.

Preferred Attributes

· Experience with Data Lakehouse architectures on Cloudera within BFSI or regulated domains.
· Exposure to streaming pipelines, data governance frameworks, and production support workflows.
· Strong background in security implementation (Kerberos, encryption, Ranger policies).
· Ability to mentor teams and guide disaster recovery testing and failover design.

Data Architect – Data Lakehouse Transformation – 2 Positions

Key Responsibilities:

· Design and define the end-to-end architecture for the Data Lakehouse solution covering Bronze, Silver, and Gold layers, metadata management, and data governance (a small bronze-to-silver curation sketch follows this list).
· Lead data platform modernization initiatives involving migration from legacy DWH to a modern Cloudera-based architecture.
· Translate business and functional requirements into scalable data architecture solutions.
· Collaborate with engineering, platform, analytics, and business teams to define data flows, ingestion strategies, transformation logic, and consumption patterns.
· Ensure architectural alignment with enterprise data standards, security guidelines, and regulatory requirements.
· Define data modeling standards and oversee data modeling efforts across layers (relational and big data).
· Partner with the implementation oversight partner to review and validate logical and physical data models.
· Drive architecture reviews, performance tuning, and capacity planning for the data ecosystem.
· Guide and mentor data engineering teams on architectural best practices.
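
To give a concrete (and purely illustrative) flavour of the layered-architecture work above, the PySpark sketch below shows a hypothetical bronze-to-silver curation step, deduplication on a business key plus basic conformance; the table names and rules are placeholders, not the programme's actual design.

```python
# Hypothetical bronze-to-silver curation step: deduplicate on the business key
# and conform a couple of columns. Table names and rules are illustrative only.
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bronze_to_silver_accounts").getOrCreate()

bronze = spark.table("dlh.bronze.accounts_raw")   # assumed bronze-layer table

latest = Window.partitionBy("account_id").orderBy(F.col("load_ts").desc())

silver = (
    bronze
    .withColumn("rn", F.row_number().over(latest))
    .filter(F.col("rn") == 1)                      # keep latest record per account
    .drop("rn")
    .withColumn("currency_cd", F.upper(F.trim("currency_cd")))   # basic conformance
    .withColumn("open_dt", F.to_date("open_dt", "yyyy-MM-dd"))
)

silver.write.mode("overwrite").saveAsTable("dlh.silver.accounts")
```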

Required Skills and Experience:

· 12+ years of experience in data architecture, data platform design, or enterprise architecture roles.

· Strong hands-on experience in Cloudera (Hadoop ecosystem, Hive, HDFS, Spark), Teradata, Informatica PowerCenter/IDQ, and SQL-based platforms.
· Deep understanding of data ingestion, curation, transformation, and consumption in both batch and near real-time.
· Banking industry experience with familiarity across domains such as retail, corporate banking, credit risk, finance, and regulatory reporting.
· Proficiency in designing for scalability, performance optimization, and data security/compliance.
· Solid experience with data lakehouse concepts, open table formats (Iceberg/Delta), and layered architectures.
· Experience integrating BI/reporting platforms (e.g., Power BI, Cognos) and downstream data products.

Preferred Attributes:

· Experience with Kafka/NiFi for streaming ingestion and orchestration tools like Control-M or Airflow.

· Knowledge of metadata, lineage, and data catalog tools.

· Familiarity with hybrid deployment models (on-prem and cloud) and DevOps/DataOps pipelines.

· TOGAF, CDMP, or DAMA certification is a plus.

Cloudera Architect – 2 Positions

Role Overview:

We are looking for a highly experienced Cloudera Infrastructure Architect to design, configure, and oversee the infrastructure setup and optimization for a large-scale Data Lakehouse (DLH) program. The ideal candidate will have extensive experience with Cloudera Data Platform (CDP) on-premises and cloud deployments, strong infrastructure and security skills, and a background in implementing high-performance, secure, and scalable data platforms within the banking or financial services domain.

Key Responsibilities:

· Architect and design the Cloudera platform infrastructure to support ingestion, processing, and storage of large volumes of structured and unstructured data.
· Lead the setup of Bronze and Silver layers on Cloudera, including HDFS, Hive, Spark, Ranger, Atlas, Impala, and Iceberg-based tables.
· Define infrastructure topology, sizing, and cluster configuration based on workload and performance benchmarks.
· Collaborate with platform teams to ensure high availability, scalability, data security, and disaster recovery setup for the DLH environment.
· Work closely with security teams to implement Kerberos, Ranger policies, encryption, and secure access protocols.
· Provide technical leadership for installation, configuration, tuning, and troubleshooting of CDP components across development, UAT, and production environments.
· Define and enforce infrastructure-as-code practices using Ansible, Terraform, or similar tools.
· Act as the primary point of contact for Cloudera platform-related issues, performance tuning, and vendor coordination.
· Support monitoring, logging, and alerting setup using Cloudera Manager and other tools for proactive environment health tracking (a minimal health-poll sketch follows this list).
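
Relating to the monitoring item above, a lightweight health poll against the Cloudera Manager REST API might look like the sketch below; the host, credentials, cluster name, and API version segment are assumptions and should be taken from the actual CM deployment.

```python
# Illustrative health poll against the Cloudera Manager REST API.
# Host, credentials, cluster name, and the API version segment are assumptions;
# check the deployed CM version for the correct /api/vNN path and field names.
import requests

CM_BASE = "https://cm-host.example.com:7183/api/v41"   # hypothetical endpoint
AUTH = ("monitor_user", "***")

resp = requests.get(f"{CM_BASE}/clusters/DLH-Prod/services",
                    auth=AUTH, verify=False, timeout=30)
resp.raise_for_status()

for svc in resp.json().get("items", []):
    # healthSummary is typically GOOD / CONCERNING / BAD on service objects.
    print(f"{svc.get('name', ''):<20} {svc.get('healthSummary')}")
```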

Required Skills and Experience:

· 10+ years of experience in platform infrastructure and architecture roles, including 5+ years with Cloudera (CDH/CDP).
· Deep understanding of Cloudera components such as HDFS, Hive, Spark, Impala, Iceberg, Ozone, Ranger, Atlas, Knox, Kafka, etc.
· Experience designing and supporting large-scale, high-throughput data platforms for banking or other regulated industries.
· Strong experience in security implementation (Kerberos, Ranger), performance optimization, and capacity planning.
· Experience working with multi-node, on-premises or hybrid Cloudera clusters with integration to external data sources.
· Proficiency with Linux system administration, shell scripting, and configuration management tools.
· Familiarity with cloud services (Azure/AWS/GCP) and Cloudera’s cloud-native options is a strong plus.
· Strong documentation, vendor coordination, and infrastructure planning skills.

Preferred Attributes:

· Experience implementing Cloudera-based Data Lakehouse architecture in banking or capital markets.
· Understanding of integration with data ingestion pipelines, streaming platforms, and data governance tools.
· Exposure to performance benchmarking tools, synthetic data management, and production support workflows.
· Ability to guide and support platform teams in disaster recovery testing and failover design.


Important Instructions to candidates:

Please fill out the details below and e-mail them to jobs@compassclock.in along with your updated resume to proceed with your application.

Name :
Email :
Mobile no:
Current CTC:
Expected CTC:
Notice Period:
DOB:
Gender:
Current Company:
Total years of experience:
Current Industry:
Designation:
Current State & City:
LinkedIn ID:

Reach out to Compass Clock Consultancy, the best job placement consultant in India, at jobs@compassclock.in or via WhatsApp.
