Senior Data Engineer

We build the tech that moves industries forward. We have our eyes set on AI, energy, logistics, sports, and other complex and exciting segments. We believe in an innovative approach to solving deep problems and encourage our people to find their own solutions, constantly rethinking processes, business models, architecture, and tech stacks. We foster curiosity, experimentation, and passion beyond code. With us, you can easily deepen your knowledge in any field you're curious about, and because we work across many industries, you'll gain experience others can only dream of.

At the forefront of reimagining how industries operate, we are a team of builders and thinkers reshaping e-commerce, ticketing, and logistics from first principles. Our work is grounded in curiosity, experimentation, and a drive for real business impact. We eliminate inefficiencies, not just in code but in legacy models and outdated assumptions. For those who want to solve complex problems and mentor others along the way, this is a place to thrive.

We are looking for a Senior Data Engineer who combines deep technical expertise with a strategic mindset. In this role, you'll architect and build intelligent data ecosystems that power autonomous workflows, integrating Generative and Agentic AI to help businesses move faster, think smarter, and operate more efficiently. Equal parts architect and builder, you'll be instrumental in delivering high-impact, AI-powered solutions across diverse industries.

In this role, you will
  • Analyze and optimize business processes by collaborating with stakeholders to uncover inefficiencies and define data requirements for automation
  • Design scalable, modular data architectures that integrate with Generative AI and Agentic AI systems to support real-time decision-making
  • Engineer robust ETL/ELT pipelines using Python, cloud-native services, and orchestration tools, supporting both batch and streaming data needs
  • Architect RAG and vector database solutions using semantic search to enable LLMs to retrieve curated, context-rich business data
  • Build intelligent data products, from predictive models and decision engines to AI-driven insights platforms
  • Implement data quality, validation, and governance frameworks to ensure data integrity, lineage, and compliance across systems
  • Lead technical discovery sessions with clients to transform complex business challenges into AI and data-driven opportunities
  • Mentor team members on best practices in data engineering, AI integration, and modern cloud architectures
What you will bring
  • Expert-level Python proficiency for data engineering, including API integrations, data transformations (Pandas, PySpark), and automation
  • Proven experience designing and deploying large-scale data platforms on AWS, GCP, or Azure
  • Strong foundation in building production-grade ETL/ELT pipelines using Apache Airflow, Kafka, Spark, or cloud-native tools
  • Hands-on experience with vector databases (e.g., Pinecone, Weaviate, Chroma, Milvus) and implementing semantic search
  • Demonstrated knowledge of Generative AI and LLMs, with practical experience in RAG architectures and prompt engineering
  • Deep understanding of data governance, quality, and documentation, with a focus on lineage, metadata, and compliance
  • Familiarity with cloud services including serverless computing, managed databases, and data warehouses such as BigQuery, Redshift, or Snowflake
  • Experience working with complex real-world data environments, including legacy systems, SaaS integrations, APIs, and databases
  • Fluency in Lithuanian and English, both written and spoken
What we offer
  • A high-performing, ambitious, collaborative, and fun working culture
  • Health insurance and a yearly training budget (local and international conferences, language courses), employee-led workshops
  • Flexible working hours
  • Unlimited WFH (work from home) policy
  • Extra vacation days: 2 after working at NFQ for two years and 4 after four years on our team
  • Bonus for referrals
  • For those who dream of traveling: WFA (work from anywhere) possibilities in NFQ-approved countries
  • Office perks and team activities
  • Salary range:
    17 215 - 27 540 PLN + VAT (B2B)
    14 270 - 22 200 PLN gross (permanent contract)

    If you have any questions, please contact me at jowita.dudek@nfq.com or via LinkedIn.
