JOB DESCRIPTION
• Design and configuration of data movement, streaming and transformation (ETL) technologies such as Azure Data Factory, HDF, Nifi, Kafka, Storm, Sqoop, SSIS, LogicApps, Signiant, Aspera, Alteryx, Pentaho, Alooma, Airflow.
• Participate in deep architectural discussions to build confidence and ensure customer success when building new solutions and migrating existing data applications to the AWS, Azure or GCP platforms
• Implement product features and refine specifications with our Data Scientists, Data Analysts and Data Architects
• Highly technical and analytical, possessing 10 or more years of data warehouse and/or analytics systems development and deployment experience, along with IT systems engineering, security and compliance experience.
• Build cloud data solutions and provide domain perspective on storage, big data platform services, serverless architectures, Hadoop ecosystem, RDBMS, DW/DM, NoSQL databases, analytics services on public cloud platforms and security.
• Work with customers and the wider Rackspace organization on rethinking and redesigning IT data landscapes using cloud-native technologies
• Minimum of 6 years of experience architecting modern data warehousing platforms using big data and cloud technologies
• Design and write excellent, fully tested code to build ETL/ELT data pipelines and streaming solutions on a public cloud platform.
• Evangelize the cloud-native paradigm through the delivery of blogs, customer presentations and public speaking engagements
• Lead and deliver workshops, ideation and strategy sessions for customers working to solve their business use cases with innovative technology solutions
• Mentor and train other architects and engineers within the wider Rackspace community on modern cloud-native data technologies
• Architecture and implementation experience in data ingestion and processing technologies (such as Kafka, Informatica, Data Factory, Alooma, Airflow, Data Quality and Metadata Management).
• Excellent technical architecture skills; design and deliver innovative proofs of concept for customers.
• Creation of descriptive, predictive and prescriptive analytics solutions using Azure Stream Analytics, Azure Analysis Services, Data Lake Analytics, HDInsight, HDP, Spark, Databricks, MapReduce, Pig, Hive, Tez, SSAS.
• Large-scale design, implementation and operations of OLTP, OLAP, DW and NoSQL data storage technologies such as SQL Server, Azure SQL, Azure SQL DW, PostgreSQL, CosmosDB, RedisCache, Azure Data Lake Store, Hadoop, Hive, MySQL, Neo4j, Cassandra, HBase.