Data Engineer Job at Zettalogix Inc

Zettalogix Inc Plano, TX 75075

Job Role: Data Engineer

Location: Plano, TX (Local Preferred)

Duration: Long-term Contract

Visa: Any (Except OPT and CPT)

10+ years of experience

Key skills: Kafka Streams, Redshift or Snowflake, Java or Python, Scala

Roles & Responsibilities:

  • Apply data and systems engineering principles to develop code spanning the data lifecycle (ingest, transform, consume), end to end from source to consumption, for operational and analytical workloads, minimizing complexity and maximizing business value.
  • Work as part of an agile scrum team to deliver business value.
  • Participate in design sessions to understand customers' functional needs.
  • Work with the solution architect and development team to build quick prototypes leveraging existing or new architecture.
  • Provide an end-to-end flow for a data process and map technical solutions to that process.
  • Develop and deploy code in continuous development pipelines leveraging off-the-shelf and open-source components of Enterprise Data Warehouse, ETL, and Data Management processes adhering to the solution architecture.
  • Perform software analysis, code analysis, requirements analysis, release analysis and deployment.

Experience & Qualifications:

  • Hands-on development experience in distributed, analytical, cloud-based, and/or open-source technologies.
  • At least 10 years of professional experience building software for data ingestion/data movement (ETL) pipelines for operational and/or analytical systems.
  • Expertise with coding and implementing data pipelines in cloud-based data infrastructure, analytical, and NoSQL databases (e.g., AWS, Snowflake, MongoDB, Postgres).
  • Hands-on programming experience in Python, Java, and/or SnowSQL.
  • Experience leveraging build and deploy tools (e.g., GitHub, Gradle, Maven, Jenkins).
  • Ability to travel up to 10% of the time.
  • Bachelor’s degree (or higher)
  • Experience implementing software leveraging flow-based pipelines such as NiFi or Airflow, and streaming services such as Kafka.
  • Experience building data pipeline frameworks.

Job Type: Contract

Salary: $65.00 - $70.00 per hour

Experience level:

  • 10 years

Schedule:

  • 8 hour shift
  • Day shift
  • Monday to Friday

Experience:

  • Data Engineer: 10 years (Preferred)
  • Kafka Streams: 6 years (Preferred)
  • Redshift or Snowflake: 5 years (Preferred)
  • Python or Java: 5 years (Preferred)
  • AWS, Snowflake, MongoDB: 5 years (Preferred)

Work Location: One location



