Truelogic is a leading provider of nearshore staff augmentation services, headquartered in New York. Our team of 500 tech professionals drives digital disruption from Latin America for top projects at U.S. companies. Truelogic helps companies of all sizes achieve their digital transformation goals.
Would you like to make innovation happen? Have you ever dreamed of building products that impact millions of users? Great! Then we have a seat for you on our team!
What are you going to do?
You will have the opportunity to work in a forward-thinking and growth-oriented environment, at a company that ignites opportunity by setting the world in motion. Take on big problems to help drivers, riders, delivery partners, and eaters get moving in more than 10,000 cities around the world.
You will occupy a unique position in the market on the Corp Data team, which ensures data-informed decision-making throughout the People & Places organization. The team provides critical data and insights to better attract, develop, motivate, and retain the company's most important asset, its people, across a broad spectrum of people issues: recruiting, learning and development, organizational health, employee engagement, diversity, total rewards, retention, measurement of program effectiveness, and everything else in the worker life cycle related to People & Places.
- You are a self-starter with industrial experience in SQL, Data Modeling, and ETL pipeline design.
- You have experience implementing ETL pipelines in Hive, Spark, or another MPP database architecture.
- You are comfortable with Spark and Presto and have used one or both frequently to process very large volumes of data.
- You are comfortable coding in Python, Java, or Scala.
- You have demonstrated competency in reliably operating hundreds of ETL pipelines with adherence to strict SLAs and quickly root-causing and correcting complex data problems.
- Detail orientation, thoroughly tested code, and great documentation are the hallmarks of your work, but you excel equally at explaining concepts in "big picture" terms to a less technical audience.
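To give candidates a flavor of the ETL work described above, here is a minimal extract-transform-load sketch in plain Python. All names and data are hypothetical; production pipelines on this team would run in Spark or Hive per the requirements.

```python
# Minimal ETL sketch (hypothetical names and data; real pipelines
# would use Spark/Hive as listed in the requirements above).
import csv
import io


def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw_csv)))


def transform(rows: list[dict]) -> list[dict]:
    """Transform: normalize city names and cast trip counts to int."""
    return [
        {"city": r["city"].strip().title(), "trips": int(r["trips"])}
        for r in rows
    ]


def load(rows: list[dict], warehouse: dict) -> None:
    """Load: accumulate trip counts into an in-memory 'warehouse' keyed by city."""
    for r in rows:
        warehouse[r["city"]] = warehouse.get(r["city"], 0) + r["trips"]


raw = "city,trips\n new york ,120\nbogota,85\nNew York,30\n"
warehouse: dict[str, int] = {}
load(transform(extract(raw)), warehouse)
print(warehouse)  # {'New York': 150, 'Bogota': 85}
```

The transform step deduplicates inconsistent spellings before loading, a common data-quality concern when operating many pipelines against strict SLAs.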
What will help you succeed
- 3+ years of data engineering experience – designing, developing, delivering, and maintaining data infrastructure.
- SQL specialist – strong, seasoned experience with SQL queries (outer joins, aggregations, unions, and window functions).
- Data Modeling Experience.
- Languages: At least one of Python, Java, Scala, Go.
- Solid ETL experience.
- Able to troubleshoot data issues in dashboarding tools and suggest solutions.
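As an illustration of the SQL depth listed above (aggregations plus window functions), here is a small exercise run against an in-memory SQLite database. The table and column names are hypothetical.

```python
# Illustrative SQL: rank drivers by total fares within each city using
# an aggregation and a window function (hypothetical table/column names).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE rides (city TEXT, driver TEXT, fare REAL);
    INSERT INTO rides VALUES
        ('NYC', 'ana', 30.0), ('NYC', 'ben', 45.0),
        ('LIM', 'carla', 20.0), ('LIM', 'dan', 10.0);
""")

# RANK() OVER a per-city partition, applied on top of a GROUP BY subquery.
rows = conn.execute("""
    SELECT city, driver, total,
           RANK() OVER (PARTITION BY city ORDER BY total DESC) AS rnk
    FROM (
        SELECT city, driver, SUM(fare) AS total
        FROM rides
        GROUP BY city, driver
    )
    ORDER BY city, rnk
""").fetchall()

for r in rows:
    print(r)
# ('LIM', 'carla', 20.0, 1)
# ('LIM', 'dan', 10.0, 2)
# ('NYC', 'ben', 45.0, 1)
# ('NYC', 'ana', 30.0, 2)
```

Being able to reason about a query like this, and to explain why the window function runs over the already-aggregated subquery, is the level of fluency the role expects.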