About The Position
Freightos is the world’s largest international freight marketplace, with over $94 million raised from investors like GE Ventures, the Singapore Exchange and Aleph VC. It also has an incredible team of leaders with proven wins, and it already serves the world’s largest importers, freight companies, and carriers. It’s a big deal.
You know what’s more fun than bringing freight online? Surfing in the Maldives.
You know what a close second is? Helping companies around the world with actionable knowledge of freight demand, supply, and pricing. Which is exactly what we’re doing with Freightos Data, a Freightos initiative to bring business insight to international trade. Better data means better business decisions, agility, and efficiency, which means that the laptop you’re reading this on will be cheaper. The Freightos Baltic Index - the world’s only daily freight index - is one of our data products (press mentions here, here, and here).
So our data isn’t just about big things (containers); it’s also big data sets. It crosses air and ocean, searches, price points, and bookings. Not to get too serious, but you’re going to feel like that scene in The Matrix where Neo sees the numbers that make up the world. But no, we don’t use green fonts.
We are looking for an experienced data engineer to join our team. You’ll transform raw data into useful data systems using algorithms, statistical analysis, and other Big Data-y concepts. Bottom line, you’ll turn huge data assets into huge business value.
To succeed in this position, you should have strong analytical skills and the ability to combine data from different sources. Data-relevant development skills, like familiarity with several programming languages and knowledge of machine learning methods, are helpful.
You’ll work with stakeholders (yeah, we use big words like that) across the entire company to make industry transparency a reality, by:
- Analyzing and organizing raw data (container prices in, global trade trends out).
- Building data systems and pipelines (messy container prices in, clean container trends out).
- Interpreting trends and patterns (like that scene in A Beautiful Mind but with less background music).
- Conducting complex data analysis and reporting on results.
- Combining raw information from different sources (is there a correlation between ocean freight prices and the price of advertising on Facebook? No.)
- Exploring ways to enhance data quality and reliability.
- Identifying opportunities for data acquisition to help make our data set even better.
- Developing analytical tools and programs.
Requirements:
- Degree in Computer Science, IT, or an equivalent combination of education, training, and experience.
- Experience with:
- Data management in databases such as MySQL, MSSQL, BigQuery.
- SQL and Python.
- Great numerical and analytical skills.
- Business Intelligence / Data Visualization platforms (e.g. Power BI, Tableau, Data Studio).
- Excellent English communication skills (written and spoken).
- Data engineering certification (e.g. Google Professional Data Engineer).
- Experience with ETL tools/data orchestration platforms (e.g. Airflow, Luigi).
- Basic familiarity with supply chains.
- Experience with Pandas, NumPy, TensorFlow.