Vacancy: Data Engineer — Azure / Data Factory / Azure Functions / Snowflake | Nature-based Solutions (ГрінМ), Cherkasy — Jobs.ua



As of 12.01.2026, our database contains: 322 vacancies, 2317 resumes

Vacancy: Data Engineer — Azure / Data Factory / Azure Functions / Snowflake | Nature-based Solutions (in Cherkasy)

  • Date added:
  • Salary: 163000 UAH
  • City: Lviv
  • Desired education: completed higher education
  • Work experience: required
  • Work schedule: full-time

General information about the vacancy


About the Client & Mission

Our client is the world’s largest environmental nonprofit focused on reforestation and sustainable development (Nature-based Solutions). We are building a modern cloud data platform on Azure and Snowflake that will serve as a single source of truth and enable faster, data-driven decision-making.

About the Initiative

This role supports a Data Warehouse initiative focused on tangible delivery impact: trusted data, clear and scalable models, and fast release cycles (1–3 months) with well-defined SLAs. You’ll work in a collaborative setup across Data Engineering ↔ BI ↔ Product, often handling 1–2 parallel workstreams with proactive risk and dependency management.

Core Stack

  • ELT/DWH: Azure Data Factory + Azure Functions (Python) → Snowflake
  • CI/CD: Azure DevOps pipelines + DL Sync (Snowflake objects and pipeline deployments)
  • Primary data sources: CRM/ERP (Dynamics 365, Salesforce), MS SQL, API-based ingestion, CDC concepts
  • Data formats: JSON, Parquet

Team (our side)

Lead Data Engineer, PM, DevOps, QA.

Your Responsibilities

  • Design, build, and maintain incremental and full-refresh ELT pipelines (ADF + Azure Functions → Snowflake).
  • Develop and optimize Snowflake SQL for the DWH and data marts (Star Schema, incremental patterns, basic SCD2).
  • Build production-grade Python code in Azure Functions for ingestion, orchestration, and lightweight pre-processing.
  • Implement and maintain data quality controls (freshness, completeness, duplicates, late-arriving data).
  • Support CI/CD delivery for Snowflake objects and pipelines across dev/test/prod (Azure DevOps + DL Sync).
  • Contribute to documentation, best practices, and operational standards for the platform.
  • Communicate clearly and proactively: status → risk → options → next step, ensuring predictable delivery.

Requirements (Must-have)

  • 4+ years in Data Engineering or related roles.
  • Strong Snowflake SQL (CTEs, window functions, COPY INTO, MERGE).
  • Hands-on experience with incremental loading (watermarks, merge patterns) and basic SCD2 (effective dating / current flag).
  • Strong Python (production-ready code), including API integration (pagination, retries, error handling), logging, configuration, and secrets handling.
  • Solid experience with Azure Data Factory (pipelines, parameters, triggers) and Azure Functions (HTTP/Timer triggers, idempotency, retries).
  • Understanding of ELT/DWH modeling (Star Schema, fact/dimension design, performance implications of joins).
  • CI/CD familiarity: Azure DevOps and automated deployment practices for data platforms (DL Sync for Snowflake is a strong plus).
  • Strong communication skills and a proactive, accountable approach to teamwork.

Nice to Have

  • PySpark (DataFrame API, joins, aggregations; general understanding of distributed processing).
  • Experience with D365 / Salesforce, MS SQL sources, API-based ingestion, and CDC patterns.
  • Basics of data governance/security, Agile/Scrum, and exposure to broader analytics tooling.

Selection Process (Transparent & Practical)

Stage 1 — Intro + TA + Short Tech Screen (40–60 min, Zoom):
  • project context (multi-project setup, 1–3 month delivery cycles), must-haves for Azure/ELT, a short SQL/Python scenario;
  • soft skills & culture-match discussion covering: proactive communication & stakeholders, critical thinking & judgment, problem solving & systems thinking, ownership & maturity.

Stage 2 — Deep-Dive Technical Interview (75–90 min, with 2 engineers):
Live SQL (CTE/window + incremental load / SCD2 approach), PySpark mini-exercises, an Azure lakehouse architecture discussion, plus a mini-case based on a real delivery situation.
No take-home task — we simulate day-to-day work during the session.

What We Offer

  • Competitive compensation.
  • Learning and growth alongside strong leaders, deepening expertise in Snowflake / Azure / DWH.
  • Opportunity to expand your expertise over time across diverse, mission-driven & AI projects.
  • Flexible work setup: remote / abroad / office (optional), gig contract (with an option to transition if needed).
  • Equipment and home-office support.
  • 36 paid days off per year: 20 vacation days + UA public holidays (and related days off, as applicable).
  • Monthly cafeteria benefit: $25 to support your personal needs (learning, mental health support, etc.).
  • Performance reviews: ongoing feedback, compensation review after 12 months, then annually.
  • Paid sabbatical after 5 years with the company.

P.S. Dear fellow Ukrainians,
we kindly ask you to apply for this role in a professional and well-reasoned manner, clearly highlighting the experience that is most relevant to the position.

If you are unsure whether your background fully matches the requirements, please feel free to mention this openly in your application. This will not reduce your chances of being considered; it helps us review your profile fairly and prioritize candidates based on overall fit for the role.
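The Python must-have above (API integration with pagination, retries, error handling) can be sketched as a minimal loop. This is an illustrative sketch only, not the client's actual code: `fetch_page` is a hypothetical callable standing in for a real API client that returns a page of records plus the next cursor.

```python
import time

def fetch_all_pages(fetch_page, max_retries=3, backoff_s=1.0):
    """Collect every record from a cursor-paginated API.

    `fetch_page(cursor)` (hypothetical) returns (records, next_cursor);
    next_cursor is None on the last page. Transient network errors are
    retried with exponential backoff, then re-raised.
    """
    records, cursor = [], None
    while True:
        for attempt in range(max_retries):
            try:
                page, cursor = fetch_page(cursor)
                break
            except ConnectionError:
                if attempt == max_retries - 1:
                    raise  # give up after the last retry
                time.sleep(backoff_s * (2 ** attempt))
        records.extend(page)
        if cursor is None:
            return records
```

In an Azure Function, the same loop would typically run inside an HTTP- or timer-triggered handler, with logging and secrets pulled from configuration rather than hard-coded.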
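The basic SCD2 pattern the requirements name (effective dating / current flag) can be illustrated in pure Python. In a Snowflake DWH this would normally be a MERGE statement; the row shape used here (`key`/`attr`/`valid_from`/`valid_to`/`is_current`) is an assumed example, not the client's schema.

```python
def apply_scd2(dim_rows, incoming, load_date):
    """Return dim_rows with a basic SCD2 update applied: the current
    version of any changed key is closed (valid_to set, is_current
    cleared) and a new current row is appended; new keys are appended.
    """
    out = [dict(r) for r in dim_rows]           # don't mutate the input
    current = {r["key"]: r for r in out if r["is_current"]}
    for rec in incoming:
        cur = current.get(rec["key"])
        if cur is not None and cur["attr"] == rec["attr"]:
            continue                            # unchanged: nothing to do
        if cur is not None:
            cur["valid_to"] = load_date         # close the old version
            cur["is_current"] = False
        out.append({"key": rec["key"], "attr": rec["attr"],
                    "valid_from": load_date, "valid_to": None,
                    "is_current": True})
    return out
```

The same close-then-insert logic maps onto an incremental load: a watermark selects only changed source rows, and the merge closes superseded versions while inserting new current ones.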








