Hi! I'm Peter Lee
My experience includes data infrastructure, data science, and backend development.
I not only work on the public cloud, but I also like to design and deploy pipelines on my own local server. Recently, I deployed a Kubernetes cluster running the services I use when building pipelines, such as Apache Airflow, GitLab CI/CD, a private Docker registry, and Apache Superset. This process taught me the details of these services.
With a passion for software development, I like to share my projects with the open source community.
Design and deploy our pipeline on an on-premises server. (Hive, Spark, Airflow)
Build data warehouse infrastructure and schema. (ClickHouse Cluster)
On the data team, I am responsible for building data infrastructure and designing pipelines. I have a huge passion for learning new things and sharing them with team members. In addition, I am an expert in optimizing system performance by introducing new technologies.
Data Collection and Exchange:
1. Design different algorithms and pipelines for automatic data exchange with business partners.
2. Design the backend data collection system for the Data SDK. The system reduces the time to update data from a day to an hour and cuts the cost of ETL machines by 40%.
3. Deploy Elastic Stack to validate and visualize data from each new client in real time.
1. Introduce Kubernetes Engine to automate deployment and monitoring, reducing costs by 75% and saving 20 hours of personnel time.
2. Provide advice on architecture design and deployment of Kubernetes with Istio, Cloud Functions, Apache Airflow, Compute Engine, serverless services, and more.
3. Develop an internal bot notification system that provides instant error notifications for ETL states and the Health Check Monitor.
1. Build a platform for highly customized tagging services. The platform assigns each individual ID different kinds of tags, such as personal interests, language, age, and gender, and reports the tagging rate of each tag.
2. Optimize the data update pipeline to generate files and reports directly, reducing the personnel time required from 2 weeks to less than 1 hour.
3. Replace the previous VM/EMR ETL environment with Apache Airflow and Kubernetes CronJobs to make the ETL pipeline more flexible and cost-effective. I also designed a new tool for the team that shows the cost (data size) of every query.
Continuous Integration and Deployment:
1. Design the complete CI/CD pipeline, including building Docker images, pushing them to the Container Registry, deploying images to the Kubernetes cluster every two hours, and deploying images to the application environment after unit tests pass.
2. Build Jenkins on GKE and design the related architecture to automatically run ETL pipelines every day.
Migrate existing projects and the ETL pipeline from AWS to GCP: transfer data from Hive and S3 to GCP, set up a GitLab server on GCP, and build ETL pipelines for effective and efficient data migration, which is essential for team members to access the latest data on GCP.
PDIS is a government organization in Taiwan, also known as Digital Minister Audrey Tang's (唐鳳) office under the Executive Yuan. She leads the PDIS team to help our government: we incubate and facilitate public digital innovation for the government.
1: As a software developer: I use the LINE Bot API to connect our internal systems into an easy-to-use app that saves our colleagues time. I also designed API interfaces that allow other internal systems to hook into our bot system. So far, it already bridges our meeting reservation system and our electronic bulletin board system.
2: On productivity and quality: I designed software that displays real-time subtitles during streaming. To meet the schedule, it took me only one week to go from idea to prototype. This system now works perfectly in all of our video conferences.
3: As a quick learner and contributor: I am involved in many open source projects, such as Sandstorm (an open source private cloud built on containers) and Rocket.Chat (an open source chat app similar to Slack). We have fixed many issues, added lots of features, and given back to the community.
4: Eager to learn everything: in my free time, I like to think about how to make systems more efficient, and I keep finding ways to improve the current ones. For example, in the subtitle display system, I would like to make parts of it work automatically, letting the computer label people with their names in real time and display that information in the subtitles. Along the way, I fixed various issues and bugs in TensorFlow, and NVIDIA gave me an award as an outstanding community developer this April.
I was an iOS Engineer intern when I was an undergraduate student. I developed some interesting projects, such as a geofence app, an LBS service framework, and customized UI components.
Google Cloud Platform
IEICE Technical Report, vol. 117, no. 184, SC2017-13, pp. 1-6, August 2017.
SC2017-13 2017-08-18 (SWIM, SC)
April 2018 - Outstanding Jetson Developer Community Contributions, for porting TensorFlow to NVIDIA Jetson.
2016 March - 2017 March
Master's Degree in Computer Science.
Double Degree Program.
Bachelor of Engineering (B.E.) in Computer Science.
Embedded Systems Lab
Master's Degree in Computer Science.
Cloud Computing Lab