Staff Engineer


About Netskope

Today, there are more data and users outside the enterprise than inside, causing the network perimeter as we know it to dissolve. We realized a new perimeter was needed, one built in the cloud that follows and protects data wherever it goes, so we started Netskope to redefine Cloud, Network, and Data Security.

Since 2012, we have built the market-leading cloud security company and an award-winning culture powered by hundreds of employees spread across offices in Santa Clara, St. Louis, Bangalore, London, Paris, Melbourne, Taipei, and Tokyo. Our core values are openness, honesty, and transparency, and we purposely developed our open desk layouts and large meeting spaces to support and promote partnerships, collaboration, and teamwork. From catered lunches and office celebrations to employee recognition events and social professional groups such as the Awesome Women of Netskope (AWON), we strive to keep work fun, supportive, and interactive. Visit us at Netskope Careers. Please follow us on LinkedIn and Twitter @Netskope.

About the role

Please note, this team is hiring across all levels and candidates are individually assessed and appropriately leveled based upon their skills and experience.

By bringing together zero trust security with network optimization, Netskope Borderless WAN allows customers to confidently provide secure, high-performance access to every remote user, device, site, and cloud. It also simplifies the onboarding of traffic to the Netskope Security Cloud and NewEdge network, so customers can onboard more quickly and efficiently.

What's in it for you

The BWAN Data Engineering team builds and optimizes systems spanning data ingestion, processing, storage optimization, and more. We work closely with engineers and the product team to build highly scalable systems that tackle real-world data problems and provide our customers with accurate, real-time, fault-tolerant solutions to their ever-growing data needs.

We are looking for skilled engineers experienced in building and optimizing cloud-scale distributed systems to develop our next-generation ingestion, processing, and storage solutions. This is a hands-on, impactful role in which you will help lead the development, validation, publishing, and maintenance of logical and physical data models that support various analytics and OLTP environments.

What you will be doing

  • Design and develop streaming services that will ingest millions of data points per minute.
  • Understand customers and their reporting requirements to ensure that our solutions deliver.
  • Make important technical decisions on behalf of the engineering team. Evaluate open source technologies to find the best fit for our needs, and contribute to them where they fall short of our unique requirements, helping the community in the process.
  • Coordinate with other service development teams, product management and support teams to ensure scalability, supportability and availability for owned services and dependent services.

Required skills and experience

  • 5+ years of hands-on experience in the architecture, design, or development of enterprise data solutions, applications, and integrations
  • Excellent algorithms, data structures, and coding skills in Go (Golang), with the ability to develop and deliver idiomatic, performant code
  • Proficiency in SQL and/or other database technologies.
  • Experience building products using some of the following distributed technologies:
    • Distributed queues (e.g. Apache Kafka, AWS Kinesis, or GCP Pub/Sub)
    • Columnar or NoSQL stores (e.g. BigQuery or ClickHouse)
    • Relational stores (e.g. Postgres, MySQL, or Oracle)
  • Experience with standard software engineering practices (e.g. unit testing, code reviews, design documents)
  • Excellent written and verbal communication skills

Good to know

  • Read/write-intensive databases (e.g. Bigtable, DynamoDB, HBase, Cassandra)
  • A keen interest in applying data governance principles
  • Exposure to a Cloud provider (GCP, AWS, Oracle), plus Terraform
  • Distributed processing engines (e.g. Apache Spark, Apache Flink, or Apache Beam)
  • Experience with Architecture Decision Records (ADRs) and other product processes
  • Prior experience working in a distributed (remote) team environment

Education

  • BS or MS in Computer Science or equivalent training/knowledge

#LI-NN1

Netskope is committed to implementing equal employment opportunities for all employees and applicants for employment. Netskope does not discriminate in employment opportunities or practices based on religion, race, color, sex, marital or veteran status, age, national origin, ancestry, physical or mental disability, medical condition, sexual orientation, gender identity/expression, genetic information, pregnancy (including childbirth, lactation and related medical conditions), or any other characteristic protected by the laws or regulations of any jurisdiction in which we operate.

Netskope respects your privacy and is committed to protecting the personal information you share with us. Please refer to Netskope's Privacy Policy for more details.

Netskope, a global cybersecurity leader, is redefining cloud, data, and network security to help organizations apply zero trust principles to protect data.
