Your Tasks
- Collaborate with other teams to build data services for ingesting, processing, and visualizing data product insights.
- Integrate cloud providers and third-party tools to give teams a holistic overview of their cloud costs, code quality, and software security.
- Provide essential platform services such as billing data ingestion, end-user configuration and management portals, data contract management, and data pipelines between services.
- Design, develop, and implement data integration solutions supporting batch/ETL and API-led integrations that deliver tangible business value.
- Proactively assess the current technology landscape, identify gaps and overlaps, and capture the future-state technology vision in actionable, context-specific roadmaps.
- Develop policies around data quality, data security, data retention, and data stewardship, and proactively identify and address project impacts.
- Serve as an expert-level technical resource across multiple initiatives.
- Work in a team-based environment that includes a global workforce, vendors, and third-party contractors.
- Translate high-level business requirements into detailed technical specifications.
- Collaborate closely with peers, offering mentorship and fostering a knowledge-sharing environment.
- Continuously evaluate and advocate for advanced tools, technologies, and processes that drive industry best practices.
- Actively participate in our Agile development processes, contributing to team meetings and delivering incremental improvements.
- Collaborate with cross-functional teams to understand data requirements and deliver reliable data solutions.
- Monitor data pipeline performance, troubleshoot issues, and implement optimizations to improve efficiency.
- Assist in the design and development of APIs for seamless data integration across platforms.
Your Profile
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- 5+ years of data engineering experience designing and building complex data pipelines that support data analytics and data warehousing.
- 4+ years working with data and relevant computation frameworks and systems.
- 4+ years using the Python programming language.
- 5+ years of experience writing SQL and optimizing queries.
- 3+ years of experience working with columnar databases such as AWS Redshift, Snowflake, Databricks, or Vertica.
- Nice to have: Master’s degree in Computer Science or Engineering.
- Nice to have: Cloud Technologies: Proven experience in a cloud computing environment, preferably GCP, Azure, AWS, or similar.
- Nice to have: Python: Practical development and data analysis experience using Python and/or PySpark.
- Nice to have: Advanced Data Processing: Experience with data processing technologies such as Apache Spark, Flink, or Kafka.
- Nice to have: Workflow Management: Familiarity with orchestration and scheduling tools like Apache Airflow.
- Nice to have: Experience with data reporting and visualization tools (e.g., PowerBI, MicroStrategy, Tableau or similar).
- Nice to have: Experience with Agile data engineering principles and methodologies.
- Nice to have: Exceptional problem-solving skills and willingness to learn new concepts, methods, and technologies.
- Nice to have: Strong understanding of ELT methodologies and tools.
- Nice to have: Experience in data warehousing and familiarity with its concepts and terminology.
- Nice to have: Ability to troubleshoot and conduct root cause analysis to resolve data issues effectively.
- Nice to have: Experience analyzing and developing physical database designs, data models, and metadata.
- Nice to have: Communication: Outstanding communication skills, coupled with strong problem-solving, organizational, and analytical abilities.