Responsibilities:
Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
Work with data and analytics experts to strive for greater functionality in our data systems.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS Big Data technologies.
Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
Assemble large, complex data sets that meet functional and non-functional business requirements.
Create and maintain optimal data pipeline architecture.
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, redesigning infrastructure for greater scalability, etc.
Requirements:
Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
Experience with object-oriented or functional scripting languages: Python or Java.
Knowledge of the financial market or the stock market is a plus (but don’t worry if you don’t have it yet; you can learn all about becoming a successful stock investor after joining us!).
Experience with big data tools: Hadoop, Spark, Hive, etc.
Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
Advanced working knowledge of SQL, including experience with relational databases and query authoring, as well as working familiarity with a variety of databases.
Strong analytic skills related to working with unstructured datasets.
A successful history of manipulating, processing and extracting value from large disconnected datasets.
Experience with relational SQL and NoSQL databases including Postgres, SQL Server, Oracle, MySQL, Solr, Elasticsearch, Cassandra.
Experience with data pipeline and workflow management tools: Luigi, Airflow, etc.
Experience building processes supporting data transformation, data structures, metadata, dependency and workload management.
Experience with stream-processing systems: Storm, Spark Streaming, Kafka, etc.
Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
Working location:
VCCI Tower, 9 Dao Duy Anh Street, Dong Da, Ha Noi
Benefits:
Development opportunities
Work in an environment of talented young engineers experienced in developing financial and securities products.
Receive guidance on a roadmap for developing your capabilities and your career at the company.
Remuneration
Annual salary review.
Competitive income (including 13th-month salary, holiday and Tet bonuses, performance bonus, and annual business bonus).
Receive 100% of salary during probation if you pass within the first 2 months.
Work environment
Work in a Grade A office.
Free access to a wide range of technology books available in the work area.
Microwave ovens, refrigerators, and coffee machines are always available.
High-spec computers provided for work.
Health care
Annual health check-up at a prestigious hospital.
Social insurance, health insurance, and unemployment insurance in accordance with labor law.
Culture
Teambuilding, Dclub, Volunteering
Community and cultural programs, courses, and knowledge-sharing initiatives.
Individual intelligence, collective effort, and creative ideas are always valued in developing the team and the company.
Dynamic working environment where we grow together.
Contact:
Send your CV to tam.phamminh@vndirect.com.vn with the subject line "Data Engineer_Name", or contact us via Skype (phamminhtam.hr) to find out more about the vacancy.