1. Understood customers' business, identified their needs, and participated in the delivery of international big data projects.
2. Designed data models and developed data ETL pipelines based on MaxCompute and Hadoop.
3. Participated in the design and optimization of the system architecture and independently wrote the system analysis documents.
1. Led the design, development, and support of database, big data, and data warehouse solutions.
2. Developed conceptual and physical data models.
3. Understood and worked with multiple data sources to meet business rules and support data scientists' analytical needs.
4. Developed code for big data analytics and large-scale data processing using Python, Hadoop, and Hive.
5. Developed solutions with big data ecosystem components such as HBase, HDFS, ZooKeeper, Hive, and MapReduce.
6. Developed big data solutions with NoSQL databases such as HBase.
7. Designed and modeled data warehouse and BI applications.
8. Developed ETL processes, ETL control tables, error logging, auditing, data quality checks, etc.
9. Experience in business system microservice design based on Spring Boot/Spring Cloud technology.
10. Experience with message middleware and distributed transaction management in Internet-scale systems.
- 699 Wangshang Road, Binjiang District, Hangzhou, Zhejiang, China