About MI-C3
Established in 2014 by CEO and founder Glen Scott, MI-C3 International Limited is a Malta-based company specializing in trusted software solutions for mission-critical environments. Our flagship product, AFFECTLI, empowers organizations to make informed, data-driven decisions by providing a consolidated, real-time view of complex operations. We pride ourselves on fostering a collaborative, agile work environment that celebrates diversity.
Data Integration Engineer (NiFi)
We are seeking a mid-level Data Integration Engineer with hands-on experience in Apache NiFi to join our dynamic team. In this role, you will design, implement, and maintain real-time data integration pipelines, handling data from diverse sources such as IoT/IIoT devices, third-party APIs, and raw files. Your primary focus will be on processing streaming data to provide valuable insights that drive informed decision-making within our organization.
As MI-C3 transitions towards Fluvio and Rust, experience with these technologies will be advantageous but is not mandatory. The ideal candidate will possess a deep understanding of data pipelines, real-time event streaming, and ETL workflows, coupled with a passion for exploring and implementing new technologies.
Key Responsibilities
- Collaborate with cross-functional teams to design and implement scalable, real-time data streaming solutions using Apache NiFi.
- Ingest and process data from various sources, including IoT/IIoT protocols (e.g., MQTT, SNMP, CoAP, TCP, WebSockets) and third-party APIs.
- Develop and maintain robust ETL pipelines, ensuring data is transformed and loaded efficiently for analysis and storage.
- Continuously monitor and optimize data workflows to maintain low-latency, high-throughput processing capabilities.
- Configure and manage message brokers such as Kafka and RabbitMQ (AMQP) to facilitate efficient data exchange and support event-driven architectures.
- Implement validation checks and quality measures to ensure the accuracy, reliability, and integrity of integrated data.
- Proactively identify, diagnose, and resolve issues related to data ingestion, transformation, and streaming processes to ensure uninterrupted data flow.
Technical Requirements
- Demonstrated experience in designing and implementing data integration solutions using Apache NiFi for real-time streaming data.
- Strong skills in Java and Python for developing custom data processing components and applications.
- Familiarity with tools such as Apache Spark and Kafka for building scalable data integration solutions.
- Experience configuring and managing message brokers such as RabbitMQ and Kafka, including AMQP-based messaging, to enable efficient data exchange.
- Hands-on experience with protocols such as MQTT, SNMP, CoAP, TCP, and WebSockets for data capture from edge devices and industrial systems.
- Knowledge of data validation techniques and quality assurance practices to ensure reliable data integration.
- Strong analytical and problem-solving abilities, with a keen attention to detail.
- Excellent communication and teamwork skills to effectively collaborate with cross-functional teams.
- A proactive mindset with a willingness to learn and work with new tools and technologies, including Fluvio and Rust.
Preferred Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Familiarity with Fluvio and Rust is a plus.
- Experience with cloud-based platforms and distributed systems is advantageous.
- Understanding of embedded systems, requirements engineering, and systems integration is beneficial.
What We Offer
- The opportunity to be part of a forward-thinking company that values innovation and continuous improvement.
- Opportunities for professional development and career advancement within a growing organization.
- A supportive and inclusive work environment that values diversity and collaboration.
- A comprehensive compensation package commensurate with experience and qualifications.