Requirements:
- Bachelor’s or Master’s degree in Computer Science, Software Engineering, Machine Learning, Statistics, Data Science or another related technical discipline
- 8 to 12 years of relevant experience designing and developing complex, high-quality big-data platforms that serve analytical solutions, including 3+ years in a big-data architecture role defining technology architectures, detailed workflows, best practices and governance guidelines
- Experience in Agile software development, with a strong understanding of Agile principles, practices and Scrum methodologies
- Proven track record of delivering scalable, resource-intensive, integrated and operational big-data and analytical solutions; experience creating points of view, roadmaps and process diagrams
- Hands-on experience with open-source big-data/data-lake technologies such as Hadoop, Spark, Kafka, SQL/NoSQL (key-value, columnar, graph) databases and ESBs, and with analytical technologies such as R/Python, TensorFlow and/or BI tools such as Tableau/Power BI; cloud analytics platform experience preferred
- English proficiency requirements are pursuant to Techcombank’s policy
Utilises new and advanced statistical models to enable the production of business insights and intelligence.
Computation Modelling - Level 2
- Explain the features and applicability of various data models to junior team members.
- Conduct statistical modelling of data to derive patterns / solutions.
- Run linear regression models and other types of algorithms for analysis.
Leads hypothesis testing based on available data, derives inferences, patterns or solutions to provide a point of view, and identifies process improvement opportunities.
Data Analysis - Level 4
- Evaluate prospective analytical tools and platforms for their functional capabilities and their ability to meet the requirements of the analytics environment.
- Build algorithms and work with machine learning/deep learning tools to deliver advanced analytics solutions.
- Differentiate among data analytics approaches (descriptive / diagnostic / predictive / prescriptive analytics).
- Lead the identification and interpretation of meaningful and actionable insights from large data and metadata sources.
- Demonstrate hypothesis testing and explain the statistical significance.
Data Management - Level 4
Shares the latest developments in data management and translates complex data extraction issues into workable solutions for the team.
- Review processes and tools designed to monitor and analyze model performance and data accuracy.
- Collaborate with other Data Scientists and Engineers to build complex, technical algorithms in data analytics software applications to improve work efficiency.
- Oversee the consolidation and extraction of unstructured and diverse Big Data sources.
Reviews the policies and processes to integrate, access, share and link information across the organization to drive operational effectiveness.
- Lead discussions to understand data requirements and create re-usable data assets for faster deployment of machine learning models.
- Design, build and maintain optimized data pipelines and ETL solutions that support business analysis and real-time analytics platforms.
Metadata Management - Level 3
Implements the policies and processes to integrate, access, share and link information across the organization.
- Obtain and integrate data and information from various sources into the firm’s platforms, solutions and statistical models.
- Ensure that data assets are organized and stored in an efficient way so that information is easy to access and retrieve.