Title: Data Integration Engineer
Duration: 1+ Year Contract
Location: 100% REMOTE Work
Rate: $Open/hr.
Responsibilities:
- Develop, test, document, and maintain scalable data pipelines.
- Build out new data integrations including APIs to support continuing increases in data volume and complexity.
- Establish and follow data governance processes and guidelines to ensure data availability, usability, consistency, integrity, and security.
- Build and implement scalable solutions that align with our data governance standards and architectural roadmap for data integrations, data storage, reporting, and analytics solutions.
- Collaborate with analytics and business teams to improve data models that feed business intelligence tools, increasing data accessibility and fostering data-driven decision making across the organization.
- Design and develop data integrations and a data quality framework. Write unit, integration, and functional tests and document your work.
- Design, implement, and automate deployment of our distributed system for collecting and processing streaming events from multiple sources.
- Perform the data analysis required to troubleshoot data-related issues and assist in their resolution.
- Guide and mentor less experienced engineers on coding best practices and optimization.
Qualifications:
- Education: 4-year college degree or equivalent combination of education and experience; an academic background in Computer Science, Mathematics, Statistics, or a related technical field is preferred.
- 7 years of relevant work experience in analytics, data engineering, business intelligence, or a related field.
- Skilled in object-oriented programming (Python in particular).
- Experience developing integrations across multiple systems and APIs.
- Experience with or knowledge of Agile software development methodologies.
- Experience with cloud-based databases, specifically AWS technologies (e.g., Redshift, RDS, S3, EMR, EC2, Kinesis).
- Experience writing and optimizing SQL queries in a business environment with large-scale, complex datasets.
- Experience with data warehouse technologies and with creating ETL and/or ELT jobs.
- Excellent problem-solving and troubleshooting skills.
- Process-oriented with strong documentation skills.
- Experience developing in a Linux environment.
- Experience designing data schemas and operating SQL/NoSQL database systems is a plus.
- Experience with Big Data tools like Spark, Hadoop, Kafka, etc. is a plus.
Note: If interested, please send your updated resume to [email protected], along with your rate requirement, your contact details, and a suitable time when we can reach you. If you know anyone among your contacts who would be a good match for this job, we would appreciate it if you forwarded this posting to them with a copy to us.