Senior Data Engineer, Remote (Warszawa)

Requirements

What's important to us:



  • A minimum of 5 years' experience as a full-stack or data engineer in data warehousing, data monitoring, and building and maintaining ETL pipelines

  • Deep experience with data pipeline and workflow management tools (e.g. Airflow)

  • Solid knowledge and experience with database design, setup, and maintenance

  • Proven ability to operate in highly dynamic environments with high product velocity

  • Strong mastery of Python and SQL

  • Strong communication skills, both oral and written

  • ETL experience with Google Cloud Platform and Snowflake

  • Experience integrating financial data (both importing and exporting)

  • Experience with Affinity CRM

  • Experience with BI tooling, specifically ThoughtSpot

  • Bachelor's or master's degree in Computer Science, Database Management, etc.


Nice to have:



  • Experience with BigQuery, SIGMA

  • Experience in venture capital data operations

  • Familiarity with data visualization tools (e.g. Looker, Tableau, PowerBI, or similar), CRM (Salesforce), automation tools (Zapier)

Advance your career with Sunscrapers, a leading force in software development, now expanding its presence in a data-centric environment. Join us in our mission to help clients grow and innovate through a comprehensive tech stack and robust data-related projects. Enjoy a competitive compensation package that reflects your skills and expertise while working in a company that values ambition, technical excellence, trust-based partnerships, and actively supports contributions to R&D initiatives.


The project:


As a Senior Data Engineer, you will report to the Head of Data & Analytics and help build the entire data stack and infrastructure that support operations.


Responsibilities:



  • Design, build, and maintain the data infrastructure necessary for optimal extraction, transformation, and loading of data from a variety of data sources using SQL, NoSQL, and big data technologies

  • Develop and implement data collection systems that integrate a variety of sources, such as proprietary company data and third-party data sources

  • Build out an automated process to collect and visualize user engagement data from CRM/UI


Importantly, the project is carried out in cooperation with a client on the US West Coast (Pacific Time); in the first weeks of cooperation, availability is expected from 5:00 p.m. to 8:00 p.m.



Additionally:

  • Sport subscription

  • Private healthcare

  • International projects

  • Free coffee

  • Playroom

  • Free snacks

  • Free beverages

  • In-house trainings

  • Modern office

  • No dress code

Publication date: 2024-04-26