Software Engineer – Data Pipeline – Censys, Ann Arbor, MI, USA

Listing Description

Software Engineer – Data Pipeline

Engineering

Ann Arbor, MI, USA

We're looking for a Data Engineer to help grow our data processing pipeline! We perform billions of network handshakes and DNS lookups per hour and consume external data feeds to maintain an up-to-date view of all hosts and networks on the Internet. You will help build and maintain the processing pipeline that consumes these inbound data feeds to produce a consistent view of Internet hosts. We leverage the Google Cloud Platform (including Google Dataflow, Bigtable, and BigQuery) to process data, and we also build our own analysis tools. Your responsibilities will include exploring new ways of processing and analyzing incoming network data and building out our data processing pipeline.
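
To give a concrete sense of the shape of this work, here is a minimal sketch of an Apache Beam pipeline (Python SDK) that reads inbound scan records, keys them by host, and merges them into a single per-host view. It is illustrative only: the record format, bucket paths, and the parse/merge helpers are hypothetical assumptions, not Censys code.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def parse_record(line):
        # Hypothetical: each inbound feed line is a JSON scan result with an "ip" field.
        record = json.loads(line)
        return record["ip"], record


    def merge_records(records):
        # Hypothetical merge policy: later observations overwrite earlier fields.
        merged = {}
        for record in sorted(records, key=lambda r: r.get("observed_at", "")):
            merged.update(record)
        return merged


    def run():
        # Pass --runner=DataflowRunner (plus project/region/temp_location) to run on Google Dataflow.
        options = PipelineOptions()
        with beam.Pipeline(options=options) as p:
            (
                p
                | "ReadFeed" >> beam.io.ReadFromText("gs://example-bucket/feeds/*.json")
                | "KeyByHost" >> beam.Map(parse_record)
                | "GroupByHost" >> beam.GroupByKey()
                | "MergeHostView" >> beam.MapTuple(
                    lambda ip, recs: json.dumps({"ip": ip, **merge_records(recs)}))
                | "WriteView" >> beam.io.WriteToText("gs://example-bucket/host-view/part")
            )


    if __name__ == "__main__":
        run()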

The types of things you’ll do:

Work with Apache Beam, Airflow, Google Dataflow, BigTable, and BigQuery to build the next generation of the Censys data processing pipeline

Design automated solutions for building, testing, monitoring, and deploying ETL pipelines in a continuous integration environment (see the scheduling sketch after this list)

Work with application engineers to develop internal APIs and data solutions to power Censys product offerings

Coordinate with the backend engineering team to analyze data and improve its quality and consistency
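
As a rough illustration of the ETL automation mentioned above, here is a minimal Apache Airflow 2.x DAG that schedules a pipeline run followed by a validation step. The censys_pipeline module, task names, and schedule are hypothetical placeholders, not an actual Censys workflow.

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="host_view_etl",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@hourly",
        catchup=False,
        default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
    ) as dag:
        # Launch the (hypothetical) Beam/Dataflow job that merges inbound feeds.
        run_pipeline = BashOperator(
            task_id="run_dataflow_pipeline",
            bash_command="python -m censys_pipeline.run --runner=DataflowRunner",
        )

        # Sanity-check the output (e.g., row counts in BigQuery) before downstream use.
        validate_output = BashOperator(
            task_id="validate_output",
            bash_command="python -m censys_pipeline.validate",
        )

        run_pipeline >> validate_output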

Desired Qualifications

Bachelor's degree in Computer Science or related field, or equivalent experience

3+ years of full-time, industry experience

Deep understanding of relational and NoSQL data stores and processing approaches (e.g., Snowflake, Redshift, Bigtable, Spark)

Hands-on experience building data processing pipelines (e.g., Storm, Beam)

Proficiency with object-oriented and/or functional languages (e.g., Java, Scala, Go)

Strong scripting ability in Python, Ruby, or Bash

We celebrate diversity and are committed to creating an inclusive environment for all employees. Censys is an equal opportunity employer.


Listing Details

  • Citizenship: No Requirements
  • Incentives: Stock Options
  • Education: No Requirements
  • Travel: No Travel
  • Telework: No Telecommute



About Us

NinjaJobs is a community-run job platform developed by information security professionals. Our unique approach of focusing strictly on cybersecurity positions allows us to personalize the user experience.

Our Contacts

1765 Greensboro Station Pl.
Suite 900
Tysons Corner, VA 22102

(703) 594-7765