OKX will be prioritising applicants who currently have the right to work in Singapore and do not require visa sponsorship from OKX.
 

Who We Are

 
At OKX, we believe that the future will be reshaped by crypto, ultimately contributing to every individual's freedom.
 
OKX is a leading crypto exchange, and the developer of OKX Wallet, giving millions access to crypto trading and decentralized crypto applications (dApps). OKX is also a trusted brand by hundreds of large institutions seeking access to crypto markets. We are safe and reliable, backed by our Proof of Reserves.
 
Across our multiple offices globally, we are united by our core principles: We Before Me, Do the Right Thing, and Get Things Done. These shared values drive our culture, shape our processes, and foster a friendly, rewarding, and diverse environment for every OK-er.
 
OKX is part of OKG, a group that brings the value of blockchain to users around the world through our leading products OKX, OKX Wallet, OKLink, and more.
 
 

About the team:

The OKX data team is responsible for the entire data scope of OKG, from technical selection, architecture design, data ingestion, data storage, and ETL to data visualization, business intelligence, and data science. We are data engineers, data analysts, and data scientists. The team has end-to-end ownership of most of the data at OKX throughout the whole data lifecycle, including data ingestion, ETL, data warehousing, and data services. As a data engineer on the team, you will work with the team to leverage data technologies to empower evidence-based decision-making and improve the quality of the company's products and services.

 

Responsibilities:

  • Design and build resilient and efficient data pipelines for both batch and real-time streaming data
  • Architect and design data infrastructure on cloud using industry standard tools
  • Execute projects with an Agile mindset
  • Build software frameworks to solve data problems at scale
  • Collaborate with product managers, software engineers, data analysts and data scientists to build scalable and data-driven platforms and tools
  • Ensure data integrity and scalability by enforcing data standards. Improve data validation and monitoring processes to proactively prevent issues, identify them quickly, and drive their resolution.
  • Define, understand, and test external/internal opportunities to improve our products and services.

 

Requirements:

  • Bachelor's degree in Computer Science or equivalent professional experience
  • Solid experience with data processing tools such as Spark and Flink
  • Solid experience implementing batch and streaming data pipelines
  • Solid experience in Python, Go, Scala, or Java
  • In-depth knowledge of both SQL and NoSQL databases, including performance tuning and troubleshooting
  • Familiar with DevOps tools such as Git, Docker, and Kubernetes (k8s)
  • Experience with the cloud (e.g. AWS, Ali Cloud, GCP, Azure)
  • Proficient in SQL, including advanced features such as window functions, aggregate functions, and creating scalar/user-defined functions
  • Proven track record of delivering full end-to-end data solutions covering data ingestion, persistence, extraction, and analysis
  • Self-driven, innovative, collaborative, with good communication and presentation skills
  • Fluent in English, both written and spoken.

 

Preferred Qualifications:

  • Experience in FinTech, eCommerce, SaaS, AdTech, or digital wallet industries
  • Experience working with teams across offices and time zones is a plus
  • Experience with big data tools such as Amplitude, Tableau, QlikView, Ali Cloud DataWorks, MaxCompute, Hadoop, Hive, Spark, and HBase is a big plus

There's more we'd love to tell you along the way!