About IPinfo
IPinfo is a leading provider of IP address data, including geolocation, VPN and residential proxy detection, mobile carrier data, and over 20 other context tags. Our API handles over 120 billion requests per month, and we also license our data for use in many products and services you’ve likely interacted with.
We’re a fast-growing, bootstrapped company with a globally distributed team of around 60 people. Our data powers customers such as Cloudflare, T-Mobile, SpaceX, DemandBase, and Clearbit, among many others. We also sponsor and contribute to academic conferences like ACM IMC and ACM CoNEXT, staying closely connected with the internet measurement and research community.
About the Privacy Team
The Privacy team delivers accurate, verifiable insight into how IP addresses are used across the internet. Instead of relying on opaque risk scores or ambiguous classifications, we focus on defensible, fact-based signals derived from observable network behavior.
Our work centers on identifying and analyzing the applications, protocols, and infrastructure used to anonymize or tunnel internet traffic—such as VPNs and residential proxy networks. We combine large-scale internet measurement with careful analysis to understand how these technologies are deployed and how they behave in real-world environments.
We collaborate closely with internet infrastructure providers, including major CDNs and network operators/ISPs, as well as customers across sectors such as banking, ad tech, fraud prevention, and security. Our work helps these partners understand traffic quality, abuse patterns, and privacy tooling with clarity and confidence.
Team members are expected not only to build and maintain systems, but also to challenge assumptions, validate signals, and publish findings that can withstand external scrutiny.
How We Work
We’re an ambitious, fully remote team spread across the globe. We sync up on a monthly all-hands Zoom call, and most teams meet every 1–2 weeks. Everything else happens asynchronously using Slack, GitHub, Linear, and Notion.
This setup allows you to choose the hours that work best for you, while still collaborating closely with teammates. Autonomy, ownership, and clear communication are essential to how we operate.
Responsibilities
- Design, build, and operate data collection and analysis pipelines to detect proxy usage, VPN applications, and related spoofing or evasion behavior
- Work with large-scale internet measurement data (we collect 75+ TB per week, including BGP, DNS, ping, and traceroute data from 1200+ global vantage points)
- Research, apply, and implement techniques from cutting-edge internet measurement and network security research
- Maintain a high bar for signal quality and defensibility, prioritizing observable network behavior over heuristics or guesswork
- Communicate findings clearly by contributing to blog posts, technical documentation, and research publications, both internally and externally
Skills and Experience
Required
- Background in one or more of:
  - Internet measurement or network telemetry
  - Data engineering for large-scale, high-volume, or real-time systems
  - Network engineering or ISP/CDN operations with experience analyzing traffic behavior
  - Security research or applied network security
  - Threat intelligence or abuse/fraud analysis
- Deep understanding of networking protocols and networked applications (e.g. VPNs, proxies, tunneling, routing behavior)
- Proficiency in Bash scripting, Python, Go, or similar languages for building data collection and analysis systems
- Experience working with cloud platforms such as Google Cloud Platform, Amazon Web Services, or Microsoft Azure
- Strong working knowledge of Linux-based systems, including operating, debugging, and optimizing services on production servers
- Experience building, deploying, and operating containerized workloads using Docker or similar technologies
- Proficiency with Git and collaborative development workflows (code reviews, pull requests, CI)
- Strong analytical skills and attention to detail; ability to distinguish signal from noise
- Excellent communication skills and ability to clearly explain complex technical findings
- Curiosity and commitment to continuous improvement; belief that systems, signals, and processes can always be improved
Nice to Have
- Experience building or operating real-time or high-volume data pipelines, or working with large-scale internet measurement datasets
- Experience with workflow orchestration and scheduling systems such as Airflow
- Familiarity with ad tech, fraud detection, abuse prevention, or cybersecurity use cases
- Experience publishing or presenting technical work, including blog posts, academic papers, whitepapers, or conference talks
What We Offer
- Build at a bootstrapped, independent company with no board or outside investors — we optimize for long-term product quality, not short-term growth targets
- Real ownership and autonomy: you'll shape systems, signals, and direction, not just implement tickets
- 100% remote, globally distributed team
- Flexible working hours designed for deep focus and a sustainable pace
- Competitive salary, adjusted for experience and local market
- Flexible vacation policy built on trust and personal responsibility
- Solve hard, real-world problems at internet scale, using data most companies never see
- At least one annual company-wide gathering to reconnect and reset in person