Senior Data Engineer
Job Description
As a Big Data developer, you are responsible for building software that enables our clients and business partners to make efficient decisions with ease. You are able to design robust solutions on state-of-the-art compute engines such as Databricks and Presto, and on modern data warehousing platforms (Synapse, BigQuery, Redshift, or Snowflake).
Qualifications
You are proficient in SQL and at least one programming language (Python, Java, or Scala)
Experience with the AWS or Azure data stack
You have a background in data pipelining, distributed data processing (e.g., Spark), software engineering, and data modeling concepts
Demonstrated analytical and problem-solving skills, particularly those that apply to a big data environment
Experience with T-SQL or PL/SQL
Experience with Python (including the NumPy and Pandas libraries)
Experience with Databricks, Spark
Strong experience with relational database design techniques
Strong experience with concepts such as Data Marts, Data Warehouses, and Data Lakes
Very good English (verbal and written)
Experience in using Agile methodologies (Scrum, Kanban, etc.)
Ideally, you’ll also have:
Knowledge of data formats (Parquet, ORC, Avro) as well as emerging table formats such as Delta Lake, Apache Hudi, and Apache Iceberg
Containerisation (Docker, Kubernetes) and DevOps principles
Experience with cloud platforms (Azure, AWS, or GCP)
About The Global Professional Services Firm
A global delivery services organization providing offshoring support in areas such as assurance, tax, consulting, and technology. It enables businesses to enhance efficiency, access specialized talent, and drive growth through scalable, high-quality professional services. With a strong focus on collaboration and innovation, it supports complex business needs across multiple industries. The organization operates through offices across the globe, allowing it to deliver around-the-clock support and local expertise at scale.