Technical Architect/Tech Lead

ANTAES ASIA PTE. LTD. · Singapore · Permanent

Job Description:

- Contribute to IT projects in the banking industry for Antaes clients

- Provide the architectural view of the solution and be responsible for the technical documentation necessary for validation by the technical committees (e.g. PPC, ARC, S-CAM, CAM)

- Responsible for the TAD (Technical Architecture Document)

- Responsible for ensuring the security requirements adhere to group and project standards.

- Ensure the solution is in line with the TP2024 vision.

- Provide analysis/evaluation of technical solutions recommended by the group to facilitate management decisions.

- Provide architectural roadmap (planning) for software/platform upgrades (due to obsolescence or for migration to the target platform).

- Engage other technical stakeholders within the project (e.g. Domain Architect, Conformity Cell, Security Team, Application Integration and Production teams) in the course of the work.

- Responsible for implementing and enforcing project governance as the development framework to optimize IT activities & processes, assisting the organization in attaining the level of quality determined by IT Management.

- Contribute to the promotion of Antaes services in addition to the assistance provided to clients

Job Requirements:

- Bachelor’s Degree, with 12 years of total IT experience

- 3 years in a Senior Architect role on a data-intensive platform

- 5 years of hands-on experience working as a Technical Lead and/or Senior Data Engineer

Strong Architecture & Design Experience in:

- Data Platform - Data Lake, Data Warehouse, Data Management, BI Analytics & Data Science Platform

- Data Engineering Workloads - Sourcing, Ingesting, Distributed Layered Storage, History, Warehousing, Data Mart

- Data Sourcing Approaches - Batch, Real-time Streams, Change Data Capture, Bulk REST API

- Data Modelling design covering Landing-Base-Quality-Transformed layers, Dimensions (SCD), Facts, Star Schema

- Data Storage approaches covering Interim, Base, History, Archive, Formats, Schema Registry, Schema Evolution

- Designing interactive, intuitive & fast user consumption & semantic layers using Data Hub & BI concepts

- In-depth understanding of low-level OS/DB/server security, plus data security at rest, in motion & in use

- System/tool integration experience with Data Lake, DWH, DM, BI & DS platforms following enterprise guidelines

- In-depth internal working knowledge of clustered distributed architectures covering resource management, data partitioning, fault tolerance, etc.

- Data Platform management concepts - scalability, HA, failover, DR, monitoring, tuning, workload, performance, troubleshooting, release

- Implementation experience overseeing setup of platform tools covering security hardening, multi-tenancy, and HA/DR clusters

- Deployment architecture implementation experience in hybrid systems mixing on-premise (VM/PM) and private/public cloud

- Designing Data Operations workloads covering orchestration, containers (Kubernetes, Docker & OpenShift) & CI/CD pipelines

- Technical roadmap detailing for the Data Platform covering obsolescence, upgrades (major/minor), re-architecture, re-platforming

As Technical Lead:

- Data project implementation platform/tools experience as a Technical Lead for Data Engineers

- Big Data Hadoop - Hortonworks HDP 3.1.x & core components covering the data engineering, management and operations stack

- Big Data Hadoop - detailed knowledge of HDP/CDH migration to the new Cloudera CDP platform

- Data Storage - HDFS (file formats - Parquet, ORC, Avro, JSON), Hive (schema, partitioning), Data Lake (object store), NoSQL (MongoDB, HBase)

- Access & Data Security - AD-LDAP-SAML-Kerberos-2FA IDP AuthN, plus data security through encryption, masking, filtering, anonymization

- Access & Data Security - Tableau (Row-Level Security, Access Management), Apache Ranger (Access Policy, Masking, Audit), OS level (RHEL & Centrify)

- Strong hands-on experience in Processing Frameworks - Spark 2.x/3.x (Core, Structured APIs, Streaming, MLlib)

- Languages & Packages - Python (scripting & PySpark), Scala using Spark APIs, Unix Shell, SQL (basic & advanced)

- Data Integration tools - Informatica BDM/DEI 10.4.x, Apache NiFi (API, Kafka, JDBC-based ingestion)

- Data Governance - Informatica EDC, BDQ & Collibra covering Data Scanner, Catalog, Lineage, Relationship, Dictionary (tech-biz)

- Streaming Platforms - Apache Kafka, Apache NiFi, Spark Streaming, Flink, Storm

- BI Analytics - Tableau Server & Creator, QlikView, Power BI, BO

- Advanced Analytics - RStudio, Zeppelin, Jupyter (R & Python skills preferred)

- Cloud - migration to public/private cloud providers such as IBM and AWS

- OS & DB - RHEL, Centrify, Oracle, PostgreSQL, MongoDB, SQL Server

- DevOps/DataOps - Bitbucket, Jenkins, Docker, Kubernetes, AutoSys

- Strong experience leading & guiding developers in the standards, processes & tools used in design, development, testing & deployment


Recommended Skills

  • API
  • Access Controls
  • Apache Flink
  • Apache HBase
  • Apache Hadoop
  • Apache Hive

Job ID: 1_c1745a74fdf484770d879dd56cc9c3
