A few words about the project

We’re looking for a Senior Data Engineer to join our team in Warsaw or remotely.

We are carrying out a project for our client, an American private equity and investment management fund based in New York and listed on the Forbes 500. We support them in the areas of data pipelines, infrastructure, and data engineering. They operate broadly across finance, lending, investments, and real estate. As a Senior Data Engineer you’ll design and implement the core systems that enable data science and data visualization at companies that use data-driven decision-making to create a competitive advantage. You’ll build data lakes, data warehouses, and data pipelines.

What's important to us?

At least 5 years of professional experience in a data-related role

Undergraduate or graduate degree in Computer Science, Engineering, Mathematics, or similar

Expertise in Python and SQL

Experience in designing data warehouses (Snowflake or Redshift)

Experience with different types of database technologies (RDBMS, NoSQL, etc.)

Experience in building ETL processes and data pipelines with platforms like Airflow or Luigi

Expertise in the AWS stack and services

Proficiency in using Docker

Great analytical skills and attention to detail - asking questions and proactively searching for answers

Excellent command of spoken and written English (C1 or above)

Creative problem-solving skills

Excellent technical documentation and writing skills

Ability to work with both Windows and Unix-like operating systems as primary work environments

Nice to have - Experience with infrastructure-as-code tools, like Terraform

Nice to have - Familiarity with data visualization in Python using either Matplotlib, Seaborn or Bokeh

Nice to have - Proficiency in statistics and machine learning, as well as Python libraries like Pandas, NumPy, Matplotlib, Seaborn, scikit-learn, etc.

Nice to have - Knowledge of any Python web framework, like Django or Flask with SQLAlchemy

Nice to have - Experience in operating within a secure networking environment, like a corporate proxy

Nice to have - Experience working with a repository manager, for example JFrog Artifactory

You will be responsible for...

Developing data technology stacks including data lakes, data warehouses and ETL pipelines

Building data flows for data fetching, aggregation, and modeling using batch and streaming pipelines (see the sketch after this list)

Developing reusable library code, infrastructure and toolsets for data scientists

Designing datasets and schemas for consistency and easy access

Creating solutions that enable data scientists and business analysts to be as self-sufficient as possible

Documenting design decisions before implementation
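To give a flavour of the pipeline work listed above, here is a minimal sketch of a daily batch ETL flow written as an Airflow DAG. The bucket, paths, and task names are hypothetical and only illustrate the fetch-aggregate-load pattern; they are not the client's actual systems.

# Minimal sketch of a daily batch ETL pipeline as an Airflow DAG.
# All names (bucket, paths, metrics) are hypothetical and for illustration only.
import pendulum
from airflow.decorators import dag, task


@dag(
    schedule="@daily",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
    tags=["example"],
)
def daily_loan_metrics():
    @task
    def fetch_raw(ds=None) -> str:
        # Pull one day's worth of source data and stage it (e.g. to S3).
        # Returns the staging path so downstream tasks can pick it up.
        path = f"s3://example-bucket/raw/loans/{ds}.parquet"
        # ... extract from the source system and write to `path` ...
        return path

    @task
    def aggregate(path: str) -> str:
        # Aggregate the staged file into daily metrics (e.g. with Pandas).
        out_path = path.replace("/raw/", "/aggregated/")
        # ... read `path`, group and aggregate, write to `out_path` ...
        return out_path

    @task
    def load_to_warehouse(path: str) -> None:
        # Load the aggregated file into the warehouse (e.g. Snowflake COPY INTO).
        print(f"Loading {path} into the warehouse")

    load_to_warehouse(aggregate(fetch_raw()))


daily_loan_metrics()

In practice each step would call into the actual source systems, a processing library, and a warehouse loader; the sketch only shows how Airflow wires the fetch, aggregate, and load steps together.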

Benefits

Renowned global clients

Work with exciting scale-ups and enterprises from the USA, Western Europe, and Scandinavia.

Culture of feedback

We foster open communication based on frequent mutual feedback.

Flexibility

Work remotely or from the office with a flexible work schedule.

Personal development

Engineering open spaces with our CTO will help you develop your skills.

Medicover & Multisport

Stay healthy and fit with a private healthcare package and access to 4,000+ sports facilities.

Join our team!

We're always on the lookout for talented teammates. Send us your CV and we'll get in touch once we have the right project for you.
