Why We Work at Dun & Bradstreet
Dun & Bradstreet unlocks the power of data through analytics, creating a better tomorrow. Each day, we are finding new ways to strengthen our award-winning culture and accelerate creativity, innovation and growth. Our 6,000+ global team members are passionate about what we do. We are dedicated to helping clients turn uncertainty into confidence, risk into opportunity and potential into prosperity. Bold and diverse thinkers are always welcome. Come join us! Learn more at dnb.com/careers.
You will be part of the team responsible for measuring the quality of our Global Inventory data. As a Data Quality Analyst, you will take a lead role in implementing data quality standards, developing data quality strategies, and collaborating closely with cross-functional teams to ensure the accuracy, consistency, and reliability of our data assets. Your technical skill set will be instrumental in managing the team's pipelines for data quality monitoring.
Key Responsibilities:
- Execute a comprehensive data quality monitoring strategy that aligns with the organization's Data Quality Standards and business objectives.
- Develop a strong understanding of Dun & Bradstreet’s inventory data.
- Perform baseline data quality monitoring to proactively identify data quality issues and track related metrics.
- Employ advanced data analysis and profiling techniques.
- Liaise with business stakeholders to ensure requirements are clear and documented.
- Automate data quality monitoring solutions and internal processes.
- Create or update data models to ensure that data is stored in an organized structure.
- Utilize Power BI and/or Looker to design, create, connect, and administer dashboards that derive insights from data quality monitoring results.
- Implement a robust data validation framework with automated testing processes.
- Communicate with globally distributed stakeholders using JIRA and Confluence.
- Capture requirements accurately and develop a strong understanding of use cases.
- Recommend improvements to the data quality team's internal processes.
- Generate regular reports on data quality metrics.
- Review data to identify patterns or trends that may indicate errors in processing.
- Develop comprehensive documentation of data quality processes, procedures, and findings, and ensure junior members document their work.
- Comply with data governance policies and procedures.
- Stay current with industry best practices and technologies related to data quality.
- Provide guidance and mentorship to junior data quality engineers, fostering their growth and development.
Required Traits Include:
- Bachelor's degree in Business Analytics, Computer Science, Information Technology, or a related field.
- 5+ years of experience and demonstrated in-depth knowledge of data analysis, querying languages, data modelling, and the software development life cycle.
- Expertise in SQL (preferably BigQuery).
- Proficiency in Python.
- Familiarity with Airflow, GCP Composer, and Terraform.
- Agile mindset and a deep understanding of agile project management (Scrum/Kanban).
- Experience in database design, modelling, and best practices.
- Experience with cloud computing technologies (preferably GCP).
- Experience with Power BI, Looker, or a similar data visualization tool.
- Ability to mentor & provide guidance to less experienced members of the team.
- Analytical, process-improvement, and problem-solving skills.
- Strong communication skills and the ability to articulate data issues and solutions.
- Commitment to meet deadlines and uphold the release schedule.
- Experience collaborating across time zones as part of a global team.
- Experience with Microsoft applications, including Excel, Word, Outlook, and Teams.