Job Details
Annual bonus
Social insurance and housing fund
Flexible working hours
Paid annual leave
Team dinners
Holiday gifts
Regular health checkups
What you’ll do:
Work with the solution architect, technical lead and development team to design, build and run data platforms on cloud and on-premises.
Select and integrate the Big Data tools and frameworks required to provide the requested platform capabilities.
Develop and manage integration tools to support effective platform operations across other disciplines of production service management.
Provide proactive support to platform tenants.
What you will need to succeed in the role:
A university degree in IT or equivalent experience.
A minimum of 2 years of hands-on experience in software requirements, design, development, testing and support.
Experience with cloud technologies, including but not limited to Docker and Kubernetes. Project experience on Google Cloud Platform is a plus.
Deep understanding of cloud infrastructure architecture, including networking, operating systems, permission control, etc.
Strong experience in Python/Java/Shell development. Full-stack data engineering skills are preferred.
Strong experience designing and implementing DevOps Continuous Integration / Continuous Delivery (CI/CD) pipelines with tooling such as Jenkins, Git, Ansible, etc.
Familiarity with open-source stacks such as Airflow, Spark, Flink, Apache Beam, Kafka, Trino, etc.
Experience with the Hadoop ecosystem and a deep understanding of Big Data architecture is a plus.
Good command of spoken and written English as the working language.
Ability to work closely with both local and global teams on project delivery in an agile/DevOps manner.
A good team player with strong analytical and troubleshooting skills; self-motivated.
Additional Information
Language requirement: English